NASA Astrophysics Data System (ADS)
Luis, Josep M.; Duran, Miquel; Andrés, José L.
1997-08-01
An analytic method to evaluate nuclear contributions to electrical properties of polyatomic molecules is presented. Such contributions control changes induced by an electric field on the equilibrium geometry (nuclear relaxation contribution) and vibrational motion (vibrational contribution) of a molecular system. Expressions to compute the nuclear contributions have been derived from a power series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity, or both. Only one calculation of such derivatives at the field-free equilibrium geometry is required. To demonstrate the efficiency of the analytical evaluation of electrical properties (the so-called AEEP method), results of calculations on water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results obtained are compared with previous theoretical calculations and with experimental values.
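As a hedged illustration of the kind of expression such a power-series expansion yields (this is the standard double-harmonic term, not necessarily the exact working equation of the AEEP method), the nuclear relaxation contribution to the static polarizability can be written from dipole derivatives with respect to normal coordinates Q_i and harmonic frequencies ω_i, all evaluated once at the field-free equilibrium geometry:

```latex
\alpha^{\mathrm{nr}}_{\alpha\beta}
  = \sum_{i}
    \frac{1}{\omega_i^{2}}
    \left(\frac{\partial \mu_{\alpha}}{\partial Q_{i}}\right)_{\!0}
    \left(\frac{\partial \mu_{\beta}}{\partial Q_{i}}\right)_{\!0}
```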
Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas
2014-09-30
A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated information and, therefore, more reliable estimated breeding values were obtained. The proposed unified method integrated and blended several sources of information well into a genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. The unified method can also be extended to other types of situations such as single-step genomic or multi-trait evaluations, combining information across different traits.
Dental students' evaluations of an interactive histology software.
Rosas, Cristian; Rubí, Rafael; Donoso, Manuel; Uribe, Sergio
2012-11-01
This study assessed dental students' evaluations of a new Interactive Histology Software (IHS) developed by the authors and compared students' assessment of the extent to which this new software, as well as other histology teaching methods, supported their learning. The IHS is a computer-based tool for histology learning that presents high-resolution images of histology basics as well as specific oral histologies at different magnifications and with text labels. Survey data were collected from 204 first-year dental students at the Universidad Austral de Chile. The survey consisted of questions for the respondents to evaluate the characteristics of the IHS and the contribution of various teaching methods to their histology learning. The response rate was 85 percent. Student evaluations were positive for the design, usability, and theoretical-practical integration of the IHS, and the students reported they would recommend the method to future students. The students continued to value traditional teaching methods for histological lab work and did not think this new technology would replace traditional methods. With respect to the contribution of each teaching method to students' learning, no statistically significant differences (p>0.05) were found for an evaluation of IHS, light microscopy, and slide presentations. However, these student assessments were significantly more positive than the evaluations of other digital or printed materials. Overall, the students evaluated the IHS very positively in terms of method quality and contribution to their learning; they also evaluated use of light microscopy and teacher slide presentations positively.
Ethnographic methods for process evaluations of complex health behaviour interventions.
Morgan-Trimmer, Sarah; Wood, Fiona
2016-05-04
This article outlines the contribution that ethnography could make to process evaluations for trials of complex health-behaviour interventions. Process evaluations are increasingly used to examine how health-behaviour interventions operate to produce outcomes and often employ qualitative methods to do this. Ethnography shares commonalities with the qualitative methods currently used in health-behaviour evaluations but has a distinctive approach over and above these methods. It is an overlooked methodology in trials of complex health-behaviour interventions that has much to contribute to the understanding of how interventions work. These benefits are discussed here with respect to three strengths of ethnographic methodology: (1) producing valid data, (2) understanding data within social contexts, and (3) building theory productively. The limitations of ethnography within the context of process evaluations are also discussed.
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of "well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories. The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
Approximating genomic reliabilities for national genomic evaluation
USDA-ARS?s Scientific Manuscript database
With the introduction of standard methods for approximating effective daughter/data contribution by Interbull in 2001, conventional EDC or reliabilities contributed by daughter phenotypes are directly comparable across countries and used in routine conventional evaluations. In order to make publishe...
Empirical entropic contributions in computational docking: evaluation in APS reductase complexes.
Chang, Max W; Belew, Richard K; Carroll, Kate S; Olson, Arthur J; Goodsell, David S
2008-08-01
The results from reiterated docking experiments may be used to evaluate an empirical vibrational entropy of binding in ligand-protein complexes. We have tested several methods for evaluating the vibrational contribution to binding of 22 nucleotide analogues to the enzyme APS reductase. These include two cluster size methods that measure the probability of finding a particular conformation, a method that estimates the extent of the local energetic well by looking at the scatter of conformations within clustered results, and an RMSD-based method that uses the overall scatter and clustering of all conformations. We have also directly characterized the local energy landscape by randomly sampling around docked conformations. The simple cluster size method shows the best performance, improving the identification of correct conformations in multiple docking experiments.
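A minimal sketch of the simple cluster-size idea, assuming the empirical entropic term is taken as -kT ln of the cluster occupancy observed over reiterated dockings (the function name, input format, and constant below are illustrative, not the paper's exact formulation):

```python
import math

def rescore_with_cluster_entropy(clusters, kT=0.593):
    """clusters: list of (mean_docked_energy, n_members); kT in kcal/mol at 298 K.
    Larger clusters receive a smaller entropic penalty, i.e. a more favorable score."""
    total = sum(n for _, n in clusters)
    return [(e - kT * math.log(n / total), n) for e, n in clusters]
```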
ERIC Educational Resources Information Center
Daigneault, Pierre-Marc; Jacob, Steve
2014-01-01
Although combining methods is nothing new, more contributions about why and how to mix methods for validation purposes are needed. This article presents a case of validating the inferences drawn from the Participatory Evaluation Measurement Instrument, an instrument that purports to measure stakeholder participation in evaluation. Although the…
Creating Alternative Methods for Educational Evaluation.
ERIC Educational Resources Information Center
Smith, Nick L.
1981-01-01
A project supported by the National Institute of Education is adapting evaluation procedures from such areas as philosophy, geography, operations research, journalism, and film criticism. The need for such methods is reviewed, as is the context in which they function, and their contributions to evaluation methodology. (Author/GK)
Efficient Credit Assignment through Evaluation Function Decomposition
NASA Technical Reports Server (NTRS)
Agogino, Adrian; Tumer, Kagan; Miikkulainen, Risto
2005-01-01
Evolutionary methods are powerful tools in discovering solutions for difficult continuous tasks. When such a solution is encoded over multiple genes, a genetic algorithm faces the difficult credit assignment problem of evaluating how a single gene in a chromosome contributes to the full solution. Typically a single evaluation function is used for the entire chromosome, implicitly giving each gene in the chromosome the same evaluation. This method is inefficient because a gene will get credit for the contribution of all the other genes as well. Accurately measuring the fitness of individual genes in such a large search space requires many trials. This paper instead proposes turning this single complex search problem into a multi-agent search problem, where each agent has the simpler task of discovering a suitable gene. Gene-specific evaluation functions can then be created that have better theoretical properties than a single evaluation function over all genes. This method is tested in the difficult double-pole balancing problem, showing that agents using gene-specific evaluation functions can create a successful control policy in 20 percent fewer trials than the best existing genetic algorithms. The method is extended to more distributed problems, achieving 95 percent performance gains over traditional methods in the multi-rover domain.
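A hedged sketch of one gene-specific evaluation in this spirit: each gene is scored by the change in the global evaluation when that gene is replaced by a fixed default, so a gene is credited only for its own contribution. The default-replacement scheme here is an assumption for illustration, not the paper's exact evaluation function:

```python
def gene_evaluations(genome, global_eval, default=0.0):
    """Score each gene by its marginal effect on the global evaluation."""
    g = global_eval(genome)
    scores = []
    for i in range(len(genome)):
        counterfactual = list(genome)
        counterfactual[i] = default  # neutralize gene i (illustrative choice)
        scores.append(g - global_eval(counterfactual))
    return scores
```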
ERIC Educational Resources Information Center
Fellenz, Martin R.
2006-01-01
A key challenge for management instructors using graded groupwork with students is to find ways to maximize student learning from group projects while ensuring fair and accurate assessment methods. This article presents the Groupwork Peer-Evaluation Protocol (GPEP) that enables the assessment of individual contributions to graded student…
Characterization of target camouflage structures by means of different microwave imaging procedures
NASA Astrophysics Data System (ADS)
Inaebnit, Christian; John, Marc-Andre; Aulenbacher, Uwe; Akyol, Zeynep; Hueppi, Rudolf; Wellig, Peter
2009-05-01
This paper presents two different test methods for camouflage layers (CL) such as nets or foam-based structures. The effectiveness of a CL in preventing radar detection and recognition of targets depends on the interaction of CL properties such as absorption and diffuse scattering with target-specific scattering properties. This fact is taken into account by representing target backscattering as an interference of different types of GTD contributions and evaluating the impact of the CL on these individual contributions separately. The first method investigates how a CL under test alters these individual scattering contributions and which "new" contributions are produced by "self-scattering" at the CL. This information is gained by applying the ISAR imaging technique to a test structure with different types of scattering contributions. The second test method aims at separating the effects of absorption and "diffuse scattering" in the case of a planar metallic plate covered by a CL. For this, the equivalent source distribution in the plane of the CL is reconstructed from bistatic scattering data. Both test methods were verified by experimental results obtained from X-band measurements of different CLs and proved to be well suited for an application-specific evaluation of camouflage structures from different manufacturers.
An evaluation method for nanoscale wrinkle
NASA Astrophysics Data System (ADS)
Liu, Y. P.; Wang, C. G.; Zhang, L. M.; Tan, H. F.
2016-06-01
In this paper, a spectrum-based wrinkling analysis method via two-dimensional Fourier transformation is proposed to address the difficulty of nanoscale wrinkle evaluation. It evaluates the wrinkle characteristics, including wrinkling wavelength and direction, using only a single wrinkling image. Based on this method, the evaluation results for nanoscale wrinkle characteristics agree with published experimental results within an error of 6%. The method is also verified to be appropriate for macro-scale wrinkle evaluation, without scale limitations. The spectrum-based wrinkling analysis is an effective method for nanoscale evaluation, which contributes to revealing the mechanism of nanoscale wrinkling.
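A minimal sketch of such a spectrum-based evaluation, assuming the dominant off-center peak of the 2-D power spectrum encodes the wrinkle wavelength and direction (the function and argument names are illustrative, not the authors' implementation):

```python
import numpy as np

def wrinkle_wavelength_direction(img, pixel_size):
    """img: 2-D grayscale array; pixel_size: physical size of one pixel."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
    ny, nx = img.shape
    cy, cx = ny // 2, nx // 2
    spec[cy, cx] = 0.0                          # suppress the DC term
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    fy = (py - cy) / (ny * pixel_size)          # spatial frequencies (cycles/length)
    fx = (px - cx) / (nx * pixel_size)
    wavelength = 1.0 / np.hypot(fx, fy)         # dominant wrinkling wavelength
    direction = np.degrees(np.arctan2(fy, fx))  # normal to the wrinkle crests
    return wavelength, direction
```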
USDA-ARS?s Scientific Manuscript database
Mixing models have been used to predict sediment source contributions. The inherent problem of the mixing models limited the number of sediment sources. The objective of this study is to develop and evaluate a new method using Discriminant Function Analysis (DFA) to fingerprint sediment source contr...
The main methodological contribution lies in the demonstration and evaluation of these two BT methods, which are not in common use, in a new application (valuing beach closures). Another important contribution is the use of the NSRE data for rigorous nonmarket valuation ap...
Using Contribution Analysis to Evaluate the Impacts of Research on Policy: Getting to "Good Enough"
ERIC Educational Resources Information Center
Riley, Barbara L.; Kernoghan, Alison; Stockton, Lisa; Montague, Steve; Yessis, Jennifer; Willis, Cameron D.
2018-01-01
Assessing societal impacts of research is more difficult than assessing advances in knowledge. Methods to evaluate research impact on policy processes and outcomes are especially underdeveloped, and are needed to optimize the influence of research on policy for addressing complex issues such as chronic diseases. Contribution analysis (CA), a…
Boland, Mary Regina; Rusanov, Alexander; So, Yat; Lopez-Jimenez, Carlos; Busacca, Linda; Steinman, Richard C; Bakken, Suzanne; Bigger, J Thomas; Weng, Chunhua
2014-12-01
Underspecified user needs and frequent lack of a gold standard reference are typical barriers to technology evaluation. To address this problem, this paper presents a two-phase evaluation framework involving usability experts (phase 1) and end-users (phase 2). In phase 1, a cross-system functionality alignment between expert-derived user needs and system functions was performed to inform the choice of "the best available" comparison system to enable a cognitive walkthrough in phase 1 and a comparative effectiveness evaluation in phase 2. During phase 2, five quantitative and qualitative evaluation methods are mixed to assess usability: time-motion analysis, software log, questionnaires - the System Usability Scale and the Unified Theory of Acceptance and Use of Technology, think-aloud protocols, and unstructured interviews. Each method contributes data for a unique measure (e.g., time-motion analysis contributes task completion time; software log contributes action transition frequency). The measures are triangulated to yield complementary insights regarding user-perceived ease-of-use, functionality integration, anxiety during use, and workflow impact. To illustrate its use, we applied this framework in a formative evaluation of a software system called Integrated Model for Patient Care and Clinical Trials (IMPACT). We conclude that this mixed-methods evaluation framework enables an integrated assessment of user needs satisfaction and user-perceived usefulness and usability of a novel design. This evaluation framework effectively bridges the gap between co-evolving user needs and technology designs during iterative prototyping and is particularly useful when it is difficult for users to articulate their needs for technology support due to the lack of a baseline.
An organic tracer method, recently proposed for estimating individual contributions of toluene and α-pinene to secondary organic aerosol (SOA) formation, was evaluated by conducting a laboratory study where a binary hydrocarbon mixture, containing the anthropogenic aromatic hydro...
Evaluation of Contribution for Voltage Control Ancillary Services Based on Social Surplus
NASA Astrophysics Data System (ADS)
Ueki, Yuji; Hara, Ryoichi; Kita, Hiroyuki; Hasegawa, Jun
Reactive power supply plays an important role in delivering active power at adequate system voltages. Various pricing mechanisms for reactive power supply have been developed, and some of them have been adopted in power systems; however, they remain at a trial stage. The authors likewise focus on the development of a pricing method for reactive power ancillary services. This problem involves two technical issues: rational estimation of the cost associated with reactive power supply, and fair and transparent allocation of the estimated cost among the market participants. This paper proposes methods for evaluating the contribution of generators and demands.
ERIC Educational Resources Information Center
Raven, Neil
2016-01-01
Whilst published data sources exist for evaluating interventions aimed at widening higher education access, there is value for practitioners in conducting their own research. However, recognition of the contribution afforded by generating new data raises questions over which research methods to utilise. One method relatively new to widening…
Implementation and evaluation of PM2.5 source contribution analysis in a photochemical model
Source culpability assessments are useful for developing effective emissions control programs. The Integrated Source Apportionment Method (ISAM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to track contributions from source groups and regions to ambi...
The present study investigated whether combining of targeted analytical chemistry methods with unsupervised, data-rich methodologies (i.e. transcriptomics) can be utilized to evaluate relative contributions of wastewater treatment plant (WWTP) effluents to biological effects. The...
ERIC Educational Resources Information Center
Neal, Jennifer Watling; Neal, Zachary P.; VanDyke, Erika; Kornbluh, Mariah
2015-01-01
Qualitative data offer advantages to evaluators, including rich information about stakeholders' perspectives and experiences. However, qualitative data analysis is labor-intensive and slow, conflicting with evaluators' needs to provide punctual feedback to their clients. In this method note, we contribute to the literature on rapid evaluation and…
ERIC Educational Resources Information Center
Sprang, G.; Clark, J.J.; Bass, S.
2005-01-01
Objectives:: This study used data gathered during evaluations conducted by the Comprehensive Assessment and Training Services (CATS) Project to determine the relative contribution of four primary domains (demographic, adult characteristics, child characteristics, relational characteristics) to variation in the severity of child maltreatment, and…
The Evaluation of a Training and Employment Program: Discussion on Design
ERIC Educational Resources Information Center
Bustos, Antonio; Arostegui, Jose Luis
2012-01-01
Universities in Europe have been playing an increasingly important role in the institutional evaluation of political and social systems for the last thirty years. Their major contribution to those processes of accountability has been to add methods and safeguards of evaluative research. In this paper we report an illustration of how evaluative…
USDA-ARS?s Scientific Manuscript database
A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...
Two-Photon Transitions in Hydrogen-Like Atoms
NASA Astrophysics Data System (ADS)
Martinis, Mladen; Stojić, Marko
Different methods for evaluating two-photon transition amplitudes in hydrogen-like atoms are compared with the improved method of direct summation. Three separate contributions to the two-photon transition probabilities in hydrogen-like atoms are calculated. The first, coming from the summation over discrete intermediate states, is performed up to nc(max) = 35. The second contribution, from the integration over the continuum states, is evaluated numerically. The third contribution, coming from the summation from nc(max) to infinity, is calculated approximately using the mean level energy for this region. It is found that the choice of nc(max) controls the numerical error in the calculations and can be used to increase the accuracy of the results much more efficiently than in other methods.
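A hedged, purely schematic sketch of this summation strategy: a direct sum over discrete intermediate states up to nc(max), plus a lumped tail evaluated at a mean level energy. The callables and the energy denominator below are placeholders, not the actual matrix elements or the continuum integral:

```python
def two_photon_amplitude(term, nc_max, tail_strength, mean_tail_energy, omega):
    """term(n): contribution of discrete intermediate state n (schematic).
    The nc_max..infinity remainder is lumped at an assumed mean level energy."""
    amp = sum(term(n) for n in range(2, nc_max + 1))    # discrete intermediate states
    amp += tail_strength / (mean_tail_energy - omega)   # approximate high-n tail
    return amp
```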
Contribution mapping: a method for mapping the contribution of research to enhance its impact
2012-01-01
Background At a time of growing emphasis on both the use of research and accountability, it is important for research funders, researchers and other stakeholders to monitor and evaluate the extent to which research contributes to better action for health, and find ways to enhance the likelihood that beneficial contributions are realized. Past attempts to assess research 'impact' struggle with operationalizing 'impact', identifying the users of research and attributing impact to research projects as source. In this article we describe Contribution Mapping, a novel approach to research monitoring and evaluation that aims to assess contributions instead of impacts. The approach focuses on processes and actors and systematically assesses anticipatory efforts that aim to enhance contributions, so-called alignment efforts. The approach is designed to be useful for both accountability purposes and for assisting in better employing research to contribute to better action for health. Methods Contribution Mapping is inspired by a perspective from social studies of science on how research and knowledge utilization processes evolve. For each research project that is assessed, a three-phase process map is developed that includes the main actors, activities and alignment efforts during research formulation, production and knowledge extension (e.g. dissemination and utilization). The approach focuses on the actors involved in, or interacting with, a research project (the linked actors) and the most likely influential users, who are referred to as potential key users. In the first stage, the investigators of the assessed project are interviewed to develop a preliminary version of the process map and first estimation of research-related contributions. In the second stage, potential key-users and other informants are interviewed to trace, explore and triangulate possible contributions. In the third stage, the presence and role of alignment efforts is analyzed and the preliminary results are shared with relevant stakeholders for feedback and validation. After inconsistencies are clarified or described, the results are shared with stakeholders for learning, improvement and accountability purposes. Conclusion Contribution Mapping provides an interesting alternative to existing methods that aim to assess research impact. The method is expected to be useful for research monitoring, single case studies, comparing multiple cases and indicating how research can better be employed to contribute to better action for health. PMID:22748169
Electronic contributions to the sigma(p) parameter of the Hammett equation.
Domingo, Luis R; Pérez, Patricia; Contreras, Renato
2003-07-25
A statistical procedure to obtain the intrinsic electronic contributions to the Hammett substituent constant sigma(p) is reported. The method is based on the comparison between the experimental sigma(p) values and the electronic electrophilicity index omega evaluated for a series of 42 functional groups commonly present in organic compounds.
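A minimal sketch of the comparison step, assuming it reduces to a linear regression of experimental sigma(p) on the computed electrophilicity omega, with the residuals carrying the non-electronic remainder (names and the plain least-squares fit are illustrative assumptions):

```python
import numpy as np

def electronic_sigma_p(omega, sigma_p_exp):
    """omega, sigma_p_exp: arrays over the series of functional groups."""
    slope, intercept = np.polyfit(omega, sigma_p_exp, 1)
    sigma_electronic = slope * omega + intercept  # intrinsic electronic estimate
    residual = sigma_p_exp - sigma_electronic     # non-electronic remainder
    return sigma_electronic, residual
```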
Can Value Added Add Value to Teacher Evaluation?
ERIC Educational Resources Information Center
Darling-Hammond, Linda
2015-01-01
The five thoughtful papers included in this issue of "Educational Researcher" ("ER") raise new questions about the use of value-added methods (VAMs) to estimate teachers' contributions to students' learning as part of personnel evaluation. The papers address both technical and implementation concerns, considering potential…
NASA Astrophysics Data System (ADS)
Sizova, Evgeniya; Zhutaeva, Evgeniya; Chugunov, Andrei
2018-03-01
The article highlights features of processes of urban territory renovation from the perspective of a commercial entity participating in the implementation of a project. The requirements of high-rise construction projects to the entities, that carry out them, are considered. The advantages of large enterprises as participants in renovation projects are systematized, contributing to their most efficient implementation. The factors, which influence the success of the renovation projects, are presented. A method for selecting projects for implementation based on criteria grouped by qualitative characteristics and contributing to the most complete and comprehensive evaluation of the project is suggested. Patterns to prioritize and harmonize renovation projects in terms of multi-project activity of the enterprise are considered.
Evaluation of scaling invariance embedded in short time series.
Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping
2014-01-01
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximately oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.
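For context, a bare-bones sketch of diffusion entropy analysis: windows of length t are summed into diffusion displacements, the Shannon entropy S(t) of their distribution is estimated from a histogram, and the scaling exponent is the slope of S(t) versus ln t. The paper's correlation-dependent balanced estimator refines this plain histogram estimate, which is shown here only as a baseline:

```python
import numpy as np

def diffusion_entropy_exponent(x, t_values, bins=30):
    """x: 1-D NumPy array; t_values: window lengths; returns the scaling exponent."""
    entropies = []
    for t in t_values:
        # overlapping windows of length t -> diffusion displacements
        disp = np.array([x[i:i + t].sum() for i in range(len(x) - t)])
        hist, edges = np.histogram(disp, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = hist[hist > 0]
        entropies.append(-np.sum(p * np.log(p)) * dx)  # histogram Shannon entropy
    slope, _ = np.polyfit(np.log(t_values), entropies, 1)
    return slope  # scaling exponent delta
```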
Speech Emotion Feature Selection Method Based on Contribution Analysis Algorithm of Neural Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Xiaojia; Mao Qirong; Zhan Yongzhao
There are many emotion features. If all these features are employed to recognize emotions, redundant features may exist. Furthermore, recognition results are unsatisfactory and the cost of feature extraction is high. In this paper, a method to select speech emotion features based on the contribution analysis algorithm of a neural network (NN) is presented. The emotion features are selected by using the contribution analysis algorithm of the NN from the 95 extracted features. Cluster analysis is applied to analyze the effectiveness of the selected features, and the time of feature extraction is evaluated. Finally, the 24 selected emotion features are used to recognize six speech emotions. The experiments show that this method can improve the recognition rate and reduce the time of feature extraction.
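A hedged sketch of one common weight-based variant of contribution analysis: rank features by the product of absolute input-to-hidden and hidden-to-output weights of a trained one-hidden-layer network and keep the top k. This mirrors the spirit of the approach; the paper's exact algorithm may differ, and the hidden-layer size is an assumption:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def select_features_by_contribution(X, y, k=24):
    """X: (samples, features); y: emotion labels; returns top-k feature indices."""
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(X, y)
    w_in, w_out = np.abs(net.coefs_[0]), np.abs(net.coefs_[1])
    contribution = (w_in @ w_out).sum(axis=1)  # one score per input feature
    top = np.argsort(contribution)[::-1][:k]
    return top, contribution
```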
ERIC Educational Resources Information Center
Barkley, Russell A.; Fischer, Mariellen
2010-01-01
Objective: Emotional impulsiveness (EI) may be a central feature of attention-deficit/hyperactivity disorder (ADHD) contributing impairment beyond the two ADHD dimensions of inattention and hyperactivity-impulsivity. Method: We evaluated EI in hyperactive (N = 135) and control (N = 75) children followed to adulthood (mean age 27 years). The…
Factors Contributing to Perceived Stress among Doctor of Pharmacy (PharmD) Students
ERIC Educational Resources Information Center
Ford, Kentya C.; Olotu, Busuyi S.; Thach, Andrew V.; Roberts, Rochelle; Davis, Patrick
2014-01-01
Objective: The purpose of this study was to report on perceived stress levels, identify its contributing factors, and evaluate the association between perceived stress and usage of university resources to cope with stress among a cross-section of Doctor of Pharmacy (PharmD) students. Methods: Perceived stress was measured via a web-based survey of…
QED contributions to electron g-2
NASA Astrophysics Data System (ADS)
Laporta, Stefano
2018-05-01
In this paper I briefly describe the results of the numerical evaluation of the mass-independent 4-loop contribution to the electron g-2 in QED with 1100 digits of precision. In particular I also show the semi-analytical fit to the numerical value, which contains harmonic polylogarithms of e^(iπ/3), e^(2iπ/3) and e^(iπ/2), one-dimensional integrals of products of complete elliptic integrals, and six finite parts of master integrals, evaluated up to 4800 digits. I also give some information about the methods and the program used.
ERIC Educational Resources Information Center
Valle, Victor M.
Intended as a contribution to a workshop discussion on program evaluation in higher education, the paper covers five major evaluation issues. First, it deals with evaluation concepts, explaining the purposes of evaluation; pertinent terms; and the sources of evaluation in public health procedures, the scientific method, the systems approach, and…
Structural safety evaluation of Gerber Arch Dam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrie, R.E.
1995-12-31
Gerber Dam, a variable radius arch structure, has experienced seepage and extensive freeze-thaw damage since its construction. A construction key was found cracked at its crest. A finite element investigation was made to evaluate the safety of the arch structure. Design methods and assumptions are evaluated. Historical performance is used in the evaluation. Stress levels, patterns, and distributions were evaluated for loads the structure has experienced to determine behavior contributing to seepage and cracking.
Assessing Faculty Performance: A Test of Method.
ERIC Educational Resources Information Center
Clark, Mary Jo; Blackburn, Robert T.
A methodology for evaluating faculty work performance was discussed, using data obtained from a typical liberal arts college faculty. Separate evaluations of teaching effectiveness and of overall contributions to the college for 45 full-time faculty (85% response rate) were collected from administrators, faculty colleagues, students, and from the…
NASA Astrophysics Data System (ADS)
Chen, Dan; Luo, Zhaohui; Webber, Michael; Chen, Jing; Wang, Weiguang
2014-09-01
Emergy theory and method are used to evaluate the contribution of irrigation water, and the process of its utilization, in three agricultural systems. The agricultural systems evaluated in this study were rice, wheat, and oilseed rape productions in an irrigation pumping district of China. A corresponding framework for emergy evaluation and sensitivity analysis methods was proposed. Two new indices, the fraction of irrigation water (FIW) and the irrigation intensity of agriculture (IIA), were developed to depict the contribution of irrigation water. The calculated FIW indicated that irrigation water used for the rice production system (34.7%) contributed more than irrigation water used for the wheat (5.3%) and oilseed rape (11.2%) production systems in a typical dry year. The wheat production, with an IIA of 19.0, had the highest net benefit from irrigation compared to the rice (2.9) and oilseed rape (8.9) productions. The transformities of the systems' products represented different energy efficiencies for the rice (2.50E+05 sej·J-1), wheat (1.66E+05 sej·J-1) and oilseed rape (2.14E+05 sej·J-1) production systems. According to several emergy indices, of the three systems evaluated, the rice system had the greatest level of sustainability. However, all of them were less sustainable than ecological agricultural systems. A sensitivity analysis showed that the emergy inputs of irrigation water and nitrogenous fertilizer were the highest-sensitivity factors influencing the emergy ratios. Best Management Practices, and other agroecological strategies, could be implemented to make further improvements in the sustainability of the three systems.
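A hedged reading of the quantities involved; the exact definitions live in the paper, and the formulas below only record the interpretation suggested by the abstract (FIW as the emergy share of irrigation water in total emergy input, IIA as a benefit-per-irrigation-emergy ratio, and the standard transformity):

```python
def fraction_of_irrigation_water(em_irrigation, em_total_inputs):
    """FIW: share of irrigation-water emergy in total emergy input (e.g. 0.347 for rice)."""
    return em_irrigation / em_total_inputs

def irrigation_intensity_of_agriculture(em_output_gain, em_irrigation):
    """IIA: emergy benefit attributable to irrigation per unit irrigation emergy (assumed)."""
    return em_output_gain / em_irrigation

def transformity(em_total_inputs, product_energy):
    """Transformity (sej/J): total emergy divided by the energy of the product."""
    return em_total_inputs / product_energy
```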
The Formative Evaluation of a Web-based Course-Management System within a University Setting.
ERIC Educational Resources Information Center
Maslowski, Ralf; Visscher, Adrie J.; Collis, Betty; Bloemen, Paul P. M.
2000-01-01
Discussion of Web-based course management systems (W-CMSs) in higher education focuses on formative evaluation and its contribution in the design and development of high-quality W-CMSs. Reviews methods and techniques that can be applied in formative evaluation and examines TeLeTOP, a W-CMS produced at the University of Twente (Netherlands). (LRW)
Methods to approximate reliabilities in single-step genomic evaluation
USDA-ARS?s Scientific Manuscript database
Reliability of predictions from single-step genomic BLUP (ssGBLUP) can be calculated by inversion, but that is not feasible for large data sets. Two methods of approximating reliability were developed based on decomposition of a function of reliability into contributions from records, pedigrees, and...
ERIC Educational Resources Information Center
Elfer, Peter
2017-01-01
Nursery experience is now common for young children and their families. Questions of quality have focussed mainly on safety and early learning. The roles of subtle emotional processes in daily pedagogic interactions have received surprisingly little attention. This paper discusses the Tavistock Observation Method (TOM), a naturalistic method of…
An extension of the finite cell method using boolean operations
NASA Astrophysics Data System (ADS)
Abedian, Alireza; Düster, Alexander
2017-05-01
In the finite cell method, the fictitious domain approach is combined with high-order finite elements. The geometry of the problem is taken into account by integrating the finite cell formulation over the physical domain to obtain the corresponding stiffness matrix and load vector. In this contribution, an extension of the FCM is presented wherein both the physical and fictitious domain of an element are simultaneously evaluated during the integration. In the proposed extension of the finite cell method, the contribution of the stiffness matrix over the fictitious domain is subtracted from the cell, resulting in the desired stiffness matrix which reflects the contribution of the physical domain only. This method results in an exponential rate of convergence for porous domain problems with a smooth solution and accurate integration. In addition, it reduces the computational cost, especially when applying adaptive integration schemes based on the quadtree/octree. Based on 2D and 3D problems of linear elastostatics, numerical examples serve to demonstrate the efficiency and accuracy of the proposed method.
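A schematic sketch of the subtraction idea under simple numerical quadrature, where `inside(x)` is an assumed point-membership test for the physical domain and `B`/`D` are placeholder strain-displacement and material operators (a sketch of the principle, not the authors' quadtree/octree implementation):

```python
import numpy as np

def physical_stiffness(points, weights, B, D, inside):
    """Integrate over the whole cell, then subtract the fictitious part."""
    K_cell = sum(w * B(x).T @ D @ B(x) for x, w in zip(points, weights))
    K_fict = sum(w * B(x).T @ D @ B(x)
                 for x, w in zip(points, weights) if not inside(x))
    return K_cell - K_fict  # contribution of the physical domain only
```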
NASA Astrophysics Data System (ADS)
Song, Xiaoning; Feng, Zhen-Hua; Hu, Guosheng; Yang, Xibei; Yang, Jingyu; Qi, Yunsong
2015-09-01
This paper proposes a progressive sparse representation-based classification algorithm using local discrete cosine transform (DCT) evaluation to perform face recognition. Specifically, the sum of the contributions of all training samples of each subject is first taken as the contribution of this subject, then the redundant subject with the smallest contribution to the test sample is iteratively eliminated. Second, the progressive method aims at representing the test sample as a linear combination of all the remaining training samples, by which the representation capability of each training sample is exploited to determine the optimal "nearest neighbors" for the test sample. Third, the transformed DCT evaluation is constructed to measure the similarity between the test sample and each local training sample using cosine distance metrics in the DCT domain. The final goal of the proposed method is to determine an optimal weighted sum of nearest neighbors that are obtained under the local correlative degree evaluation, which is approximately equal to the test sample, and we can use this weighted linear combination to perform robust classification. Experimental results conducted on the ORL database of faces (created by the Olivetti Research Laboratory in Cambridge), the FERET face database (managed by the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology), AR face database (created by Aleix Martinez and Robert Benavente in the Computer Vision Center at U.A.B), and USPS handwritten digit database (gathered at the Center of Excellence in Document Analysis and Recognition at SUNY Buffalo) demonstrate the effectiveness of the proposed method.
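A rough sketch of the progressive elimination loop, with ordinary least squares standing in for the paper's sparse solver and the DCT-domain cosine scoring omitted for brevity; the `keep` parameter and the norm-based subject score are illustrative assumptions:

```python
import numpy as np

def progressive_elimination(X, labels, y, keep=5):
    """X: (dim, n) training matrix; labels: (n,) subject ids; y: (dim,) test sample."""
    labels = np.asarray(labels)
    idx = np.arange(X.shape[1])
    while len(set(labels[idx])) > keep:
        a, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
        score = {s: np.linalg.norm(X[:, idx[labels[idx] == s]] @ a[labels[idx] == s])
                 for s in set(labels[idx])}
        weakest = min(score, key=score.get)  # subject with smallest contribution
        idx = idx[labels[idx] != weakest]    # eliminate that subject's samples
    return idx  # retained "nearest neighbor" training samples
```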
On illicit drug policies; methods of evaluation and comments on recent practices.
Trovato, Giovanni; Vezzani, Antonio
2013-06-01
This contribution provides an overview of different approaches used to analyse drug policies within and across countries. Besides the great number of cost-of-illness studies which have contributed to the assessment of health harms and risks associated with drug use, most of the recent efforts have focused on the creation of synthetic indices to classify countries around the world or to evaluate particular law enforcement policies in some countries. This is probably due to a general lack of comparable data across countries. The wide variety of budgetary practices in the drugs field in Europe contributes to the problems that exist in estimating drug-related public expenditure. These heterogeneous accounting practices, together with the complexity of the drug phenomenon and the multiplicity of perspectives on the issue, strongly constrain the possibility of economically evaluating and comparing drug laws across countries.
Therapeutic methods for psychosomatic disorders in oto-rhino-laryngology
Decot, Elke
2005-01-01
Psychosomatic disorders such as tinnitus, acute hearing loss, attacks of dizziness, globus syndrome, dysphagias, voice disorders and many more are quite common in ear, nose and throat medicine. They are mostly caused by a number of factors, with the bio-psycho-social model playing an important role. Initial contact with a psychosomatically ill patient and compiling a first case history are important steps toward psychosomatically oriented therapy. This contribution sums up the most important otorhinolaryngological diseases with psychosomatic comorbidity and scientifically evaluated methods of treatment. The contribution also introduces the reader to important psychosomatic treatment methods, from psychotherapeutic relaxation techniques to talk therapy. To conclude, the contribution discusses the criteria for outpatient as well as inpatient treatment and looks at the advantages of psychosomatically oriented therapy, both for the patient and for the doctor. PMID:22073069
Evaluating Blended and Flipped Instruction in Numerical Methods at Multiple Engineering Schools
ERIC Educational Resources Information Center
Clark, Renee; Kaw, Autar; Lou, Yingyan; Scott, Andrew; Besterfield-Sacre, Mary
2018-01-01
With the literature calling for comparisons among technology-enhanced or active-learning pedagogies, a blended versus flipped instructional comparison was made for numerical methods coursework using three engineering schools with diverse student demographics. This study contributes to needed comparisons of enhanced instructional approaches in STEM…
In 2008, the USEPA, NHDES and US Geological Survey initiated a data collection effort to evaluate borehole characterization methods for identifying natural contaminant flow into bedrock water-supply wells. The investigation: 1) tests methods at a variety of bedrock supply well sy...
Evaluation of background radiation dose contributions in the United Arab Emirates.
Goddard, Braden; Bosc, Emmanuel; Al Hasani, Sarra; Lloyd, Cody
2018-09-01
The natural background radiation consists of three main components: cosmic, terrestrial, and skyshine. Although there are currently methods available to measure the total dose rate from background radiation, no established methods exist that allow for the measurement of each component of the background radiation. This analysis consists of a unique methodology in which the dose rate contribution from each component of the natural background radiation is measured and calculated. This project evaluates the natural background dose rate in the Abu Dhabi City region from all three of these components using the developed methodology. Evaluating and understanding the different components of background radiation provides a baseline allowing for the detection, and possibly attribution, of elevated radiation levels. Measurements using a high-pressure ion chamber with different shielding configurations and two offshore measurements provided dose rate information that was attributed to the different components of the background radiation. Additional spectral information was obtained using an HPGe detector to verify and quantify the presence of terrestrial radionuclides. By evaluating the dose rates of the different shielding configurations, the cosmic, terrestrial, and skyshine contributions in the Abu Dhabi City region were determined to be 33.0 ± 1.7, 15.7 ± 2.5, and 2.4 ± 2.1 nSv/h, respectively.
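Once suitably shielded measurements exist, the decomposition itself is simple arithmetic. A hedged sketch, assuming (hypothetically) an open-field reading, a reading shielded against ground radiation, and an offshore reading that sees only the cosmic component; the actual shielding configurations in the study differ in detail:

```python
def background_components(open_field, ground_shielded, offshore):
    """All inputs in nSv/h; returns (cosmic, terrestrial, skyshine)."""
    cosmic = offshore                           # water shields terrestrial + skyshine
    skyshine = ground_shielded - cosmic         # air-scattered component
    terrestrial = open_field - ground_shielded  # ground component
    return cosmic, terrestrial, skyshine
```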
ERIC Educational Resources Information Center
Burmeister, Mareike; Eilks, Ingo
2012-01-01
This paper describes the development and evaluation of a secondary school lesson plan for chemistry education on the topic Education for Sustainable Development (ESD). The lessons focus both on the chemistry of plastics and on learning about the societal evaluation of competing, chemistry-based industrial products. A specific teaching method was…
Forensic Child Sexual Abuse Evaluations: Assessing Subjectivity and Bias in Professional Judgements
ERIC Educational Resources Information Center
Everson, Mark D.; Sandoval, Jose Miguel
2011-01-01
Objectives: Evaluators examining the same evidence often arrive at substantially different conclusions in forensic assessments of child sexual abuse (CSA). This study attempts to identify and quantify subjective factors that contribute to such disagreements so that interventions can be devised to improve the reliability of case decisions. Methods:…
How to Assign Individualized Scores on a Group Project: An Empirical Evaluation
ERIC Educational Resources Information Center
Zhang, Bo; Ohland, Matthew W.
2009-01-01
One major challenge in using group projects to assess student learning is accounting for the differences of contribution among group members so that the mark assigned to each individual actually reflects their performance. This research addresses the validity of grading group projects by evaluating different methods that derive individualized…
ERIC Educational Resources Information Center
Tsompanoudi, Despina; Satratzemi, Maya; Xinogalos, Stelios
2016-01-01
The results presented in this paper contribute to research on two different areas of teaching methods: distributed pair programming (DPP) and computer-supported collaborative learning (CSCL). An evaluation study of a DPP system that supports collaboration scripts was conducted over one semester of a computer science course. Seventy-four students…
Evaluating and Improving the Mathematics Teaching-Learning Process through Metacognition
ERIC Educational Resources Information Center
Desoete, Annemie
2007-01-01
Introduction: Despite all the emphasis on metacognition, researchers currently use different techniques to assess metacognition. The purpose of this contribution is to help to clarify some of the paradigms on the evaluation of metacognition. In addition the paper reviews studies aiming to improve the learning process through metacognition. Method:…
Funane, Tsukasa; Atsumori, Hirokazu; Katura, Takusige; Obata, Akiko N; Sato, Hiroki; Tanikawa, Yukari; Okada, Eiji; Kiguchi, Masashi
2014-01-15
To quantify the effect of absorption changes in the deep tissue (cerebral) and shallow tissue (scalp, skin) layers on functional near-infrared spectroscopy (fNIRS) signals, a method using multi-distance (MD) optodes and independent component analysis (ICA), referred to as the MD-ICA method, is proposed. In previous studies, when the signal from the shallow tissue layer (shallow signal) needs to be eliminated, it was often assumed that the shallow signal had no correlation with the signal from the deep tissue layer (deep signal). In this study, no relationship between the waveforms of deep and shallow signals is assumed, and instead, it is assumed that both signals are linear combinations of multiple signal sources, which allows the inclusion of a "shared component" (such as systemic signals) that is contained in both layers. The method also assumes that the partial optical path length of the shallow layer does not change, whereas that of the deep layer linearly increases along with the increase of the source-detector (S-D) distance. Deep- and shallow-layer contribution ratios of each independent component (IC) are calculated using the dependence of the weight of each IC on the S-D distance. Reconstruction of deep- and shallow-layer signals is performed by the sum of ICs weighted by the deep and shallow contribution ratio. Experimental validation of the principle of this technique was conducted using a dynamic phantom with two absorbing layers. Results showed that our method is effective for evaluating deep-layer contributions even if there are high correlations between deep and shallow signals. Next, we applied the method to fNIRS signals obtained on a human head with 5-, 15-, and 30-mm S-D distances during a verbal fluency task, a verbal working memory task (prefrontal area), a finger tapping task (motor area), and a tetrametric visual checkerboard task (occipital area) and then estimated the deep-layer contribution ratio. To evaluate the signal separation performance of our method, we used the correlation coefficients of a laser-Doppler flowmetry (LDF) signal and a nearest 5-mm S-D distance channel signal with the shallow signal. We demonstrated that the shallow signals have a higher temporal correlation with the LDF signals and with the 5-mm S-D distance channel than the deep signals. These results show the MD-ICA method can discriminate between deep and shallow signals.
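A compressed sketch of the decomposition as described: ICA across multi-distance channels, a linear fit of each component's channel weights against S-D distance (constant part → shallow, distance-proportional part → deep), and reconstruction from the weighted components. Array shapes and the use of scikit-learn's FastICA are assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from sklearn.decomposition import FastICA

def md_ica(signals, distances):
    """signals: (time, channels) array; distances: (channels,) S-D distances."""
    ica = FastICA(n_components=signals.shape[1], random_state=0)
    S = ica.fit_transform(signals)   # independent components, (time, k)
    A = ica.mixing_                  # channel weights, (channels, k)
    deep = np.zeros_like(signals)
    shallow = np.zeros_like(signals)
    for k in range(A.shape[1]):
        slope, intercept = np.polyfit(distances, A[:, k], 1)
        deep += np.outer(S[:, k], slope * distances)  # path length grows with distance
        shallow += np.outer(S[:, k], np.full_like(distances, intercept))
    return deep, shallow
```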
Influence Map Methodology for Evaluating Systemic Safety Issues
NASA Technical Reports Server (NTRS)
2008-01-01
"Raising the bar" in safety performance is a critical challenge for many organizations, including Kennedy Space Center. Contributing-factor taxonomies organize information about the reasons accidents occur and therefore are essential elements of accident investigations and safety reporting systems. Organizations must balance efforts to identify causes of specific accidents with efforts to evaluate systemic safety issues in order to become more proactive about improving safety. This project successfully addressed the following two problems: (1) methods and metrics to support the design of effective taxonomies are limited and (2) influence relationships among contributing factors are not explicitly modeled within a taxonomy.
A method for estimating both the solubility parameters and molar volumes of liquids
NASA Technical Reports Server (NTRS)
Fedors, R. F.
1974-01-01
Development of an indirect method of estimating the solubility parameter of high molecular weight polymers. The proposed method of estimating the solubility parameter, like Small's method, is based on group additive constants, but is believed to be superior to Small's method for two reasons: (1) the contributions of a much larger number of functional groups have been evaluated, and (2) the method requires only a knowledge of the structural formula of the compound.
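A hedged sketch of how such a group-additivity estimate is applied: sum the group cohesive-energy and molar-volume contributions over the structural formula and take delta = (sum e_i / sum v_i)^(1/2). The two group values below are illustrative placeholders, not the paper's tabulated constants:

```python
import math

# (cohesive energy e_i in cal/mol, molar volume v_i in cm^3/mol) -- illustrative values
GROUPS = {"CH2": (1180.0, 16.1), "OH": (7120.0, 10.0)}

def solubility_parameter(group_counts):
    """delta in (cal/cm^3)^0.5 from counts of functional groups in the formula."""
    e = sum(n * GROUPS[g][0] for g, n in group_counts.items())
    v = sum(n * GROUPS[g][1] for g, n in group_counts.items())
    return math.sqrt(e / v)

# e.g. a crude polyethylene-like estimate: solubility_parameter({"CH2": 1})
```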
Evaluation of Rock Joint Coefficients
NASA Astrophysics Data System (ADS)
Audy, Ondřej; Ficker, Tomáš
2017-10-01
A computer method for evaluation of rock joint coefficients is described and several applications are presented. The method is based on two absolute numerical indicators that are formed by means of the Fourier replicas of rock joint profiles. The first indicator quantifies the vertical depth of profiles and the second indicator classifies wavy character of profiles. The absolute indicators have replaced the formerly used relative indicators that showed some artificial behavior in some cases. This contribution is focused on practical computations testing the functionality of the newly introduced indicators.
NASA Astrophysics Data System (ADS)
Tiecher, Tales; Caner, Laurent; Gomes Minella, Jean Paolo; Henrique Ciotti, Lucas; Antônio Bender, Marcos; dos Santos Rheinheimer, Danilo
2014-05-01
Conventional fingerprinting methods based on geochemical composition still require time-consuming and critical preliminary sample preparation. Thus, fingerprinting characteristics that can be measured rapidly and cheaply with minimal sample preparation, such as spectroscopic methods, should be used. The present study aimed to evaluate sediment source contributions in a rural catchment by using a conventional method based on geochemical composition and an alternative method based on near-infrared spectroscopy. The study was carried out in a rural catchment with an area of 1.19 km² located in southern Brazil. The sediment sources evaluated were crop fields (n=20), unpaved roads (n=10) and stream channels (n=10). Thirty suspended sediment samples were collected from eight significant storm runoff events between 2009 and 2011. Source and sediment samples were dried at 50 °C and sieved at 63 µm. The total concentrations of Ag, As, B, Ba, Be, Ca, Cd, Co, Cr, Cu, Fe, K, La, Li, Mg, Mn, Mo, Na, Ni, P, Pb, Sb, Se, Sr, Ti, Tl, V and Zn were estimated by ICP-OES after microwave-assisted digestion with concentrated HNO3 and HCl. Total organic carbon (TOC) was estimated by wet oxidation with K2Cr2O7 and H2SO4. The near-infrared spectra were scanned from 4000 to 10000 cm-1 at a resolution of 2 cm-1, with 100 co-added scans per spectrum. The steps used in the conventional method were: i) tracer selection based on the Kruskal-Wallis test, ii) selection of the best set of tracers using discriminant analysis, and finally iii) the use of a mixed linear model to calculate the sediment source contributions. The steps used in the alternative method were: i) principal component analysis to reduce the number of variables, ii) discriminant analysis to determine the tracer potential of the near-infrared spectroscopy, and finally iii) the use of partial least squares regression, calibrated on 48 mixtures of the sediment sources in various weight proportions, to calculate the sediment source contributions. Both the conventional and alternative methods were capable of discriminating 100% of the sediment sources. The conventional fingerprinting method provided sediment source contributions of 33±19% from crop fields, 25±13% from unpaved roads and 42±19% from stream channels. The contributions obtained by the alternative fingerprinting method using near-infrared spectroscopy were 71±22% from crop fields, 21±12% from unpaved roads and 14±19% from stream channels. No correlation was observed between the source contributions assessed by the two methods. Notwithstanding, the average contribution of the unpaved roads was similar for both methods. The largest difference in the average contributions of crop fields and stream channels estimated by the two methods was due to the similar organic matter content of these two sediment sources, which hampers their discrimination from the near-infrared spectra, where many of the bands are highly correlated with TOC levels. Efforts should be made to combine the geochemical composition and near-infrared spectroscopy information into a single estimate of the sediment source contributions.
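For context, the final un-mixing step common to such approaches can be posed as a constrained least-squares problem: find non-negative source proportions, summing to one, whose mixture of mean tracer signatures best matches the sediment sample. A minimal sketch with a softly enforced sum-to-one constraint; the paper's mixed linear model and PLS calibration are more elaborate:

```python
import numpy as np
from scipy.optimize import nnls

def source_contributions(source_means, sediment, weight=1e3):
    """source_means: (n_tracers, n_sources); sediment: (n_tracers,)."""
    A = np.vstack([source_means, weight * np.ones(source_means.shape[1])])
    b = np.append(sediment, weight)  # augmented row enforces sum-to-one softly
    p, _ = nnls(A, b)                # non-negative proportions
    return p / p.sum()
```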
Matula, Svatopluk; Báťková, Kamila; Legese, Wossenu Lemma
2016-11-15
Non-destructive soil water content determination is a fundamental component for many agricultural and environmental applications. The accuracy and costs of the sensors define the measurement scheme and the ability to fit the natural heterogeneous conditions. The aim of this study was to evaluate five commercially available and relatively cheap sensors usually grouped with impedance and FDR sensors. ThetaProbe ML2x (impedance) and ECH₂O EC-10, ECH₂O EC-20, ECH₂O EC-5, and ECH₂O TE (all FDR) were tested on silica sand and loess of defined characteristics under controlled laboratory conditions. The calibrations were carried out in nine consecutive soil water contents from dry to saturated conditions (pure water and saline water). The gravimetric method was used as a reference method for the statistical evaluation (ANOVA with significance level 0.05). Generally, the results showed that our own calibrations led to more accurate soil moisture estimates. Variance component analysis arranged the factors contributing to the total variation as follows: calibration (contributed 42%), sensor type (contributed 29%), material (contributed 18%), and dry bulk density (contributed 11%). All the tested sensors performed very well within the whole range of water content, especially the sensors ECH₂O EC-5 and ECH₂O TE, which also performed surprisingly well in saline conditions.
Online Statistics Labs in MSW Research Methods Courses: Reducing Reluctance toward Statistics
ERIC Educational Resources Information Center
Elliott, William; Choi, Eunhee; Friedline, Terri
2013-01-01
This article presents results from an evaluation of an online statistics lab as part of a foundations research methods course for master's-level social work students. The article discusses factors that contribute to an environment in social work that fosters attitudes of reluctance toward learning and teaching statistics in research methods…
Design of a Workstation by a Cognitive Approach
Jaspers, MWM; Steen, T.; Geelen, M.; van den Bos, C.
2001-01-01
Ensuring the ultimate acceptance of computer systems that are easy to use, provide the desired functionality, and fit into users' work practices requires improved methods for system design and evaluation. Designing and evaluating workstations that link up smoothly with the daily routine of physicians' work requires a thorough understanding of their working practices. The application of methods from cognitive science may contribute to such an understanding of the activities involved in medical information processing. We used cognitive task analysis in designing a physicians' workstation; it seems a promising method to ensure that the system meets user needs.
Good Practices for Learning to Recognize Actions Using FV and VLAD.
Wu, Jianxin; Zhang, Yu; Lin, Weiyao
2016-12-01
High dimensional representations such as Fisher vectors (FV) and vectors of locally aggregated descriptors (VLAD) have shown state-of-the-art accuracy for action recognition in videos. The high dimensionality, on the other hand, also causes computational difficulties when scaling up to large-scale video data. This paper makes three lines of contributions to learning to recognize actions using high dimensional representations. First, we reviewed several existing techniques that improve upon FV or VLAD in image classification, and performed extensive empirical evaluations to assess their applicability for action recognition. Our analyses of these empirical results show that normality and bimodality are essential to achieve high accuracy. Second, we proposed a new pooling strategy for VLAD and three simple, efficient, and effective transformations for both FV and VLAD. Both proposed methods have shown higher accuracy than the original FV/VLAD method in extensive evaluations. Third, we proposed and evaluated new feature selection and compression methods for the FV and VLAD representations. This strategy uses only 4% of the storage of the original representation, but achieves comparable or even higher accuracy. Based on these contributions, we recommend a set of good practices for action recognition in videos for practitioners in this field.
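For orientation, a minimal numpy sketch of the baseline VLAD encoding that the proposed pooling strategy and transformations build upon. The descriptors and cluster centres are random placeholders, and the signed-square-root plus L2 normalisation shown is one common variant, not necessarily the exact pipeline of the paper.

    import numpy as np

    def vlad(descriptors, centres):
        # Assign each local descriptor to its nearest cluster centre.
        d2 = ((descriptors[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(1)
        K, D = centres.shape
        v = np.zeros((K, D))
        for k in range(K):
            residuals = descriptors[assign == k] - centres[k]
            if len(residuals):
                v[k] = residuals.sum(0)   # aggregate residuals per centre
        # Power ("signed square root") and global L2 normalisation.
        v = np.sign(v) * np.sqrt(np.abs(v))
        return (v / (np.linalg.norm(v) + 1e-12)).ravel()

    rng = np.random.default_rng(0)
    code = vlad(rng.normal(size=(500, 64)), rng.normal(size=(16, 64)))
    print(code.shape)  # (16 * 64,) = (1024,)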
Simulation of the XV-15 tilt rotor research aircraft
NASA Technical Reports Server (NTRS)
Churchill, G. B.; Dugan, D. C.
1982-01-01
The effective use of simulation from issuance of the request for proposal through conduct of a flight test program for the XV-15 Tilt Rotor Research Aircraft is discussed. From program inception, simulation complemented all phases of XV-15 development. The initial simulation evaluations during the source evaluation board proceedings contributed significantly to performance and stability and control evaluations. Eight subsequent simulation periods provided major contributions in the areas of control concepts; cockpit configuration; handling qualities; pilot workload; failure effects and recovery procedures; and flight boundary problems and recovery procedures. The fidelity of the simulation also made it a valuable pilot training aid, as well as a suitable tool for military and civil mission evaluations. Simulation also provided valuable design data for refinement of automatic flight control systems. Throughout the program, fidelity was a prime issue and resulted in unique data and methods for fidelity evaluation which are presented and discussed.
Nagasaka, Kei; Mizuno, Koji; Ito, Daisuke; Saida, Naoya
2017-05-29
In car crashes, the passenger compartment deceleration significantly influences the occupant loading. Hence, it is important to consider how each structural component deforms in order to control the passenger compartment deceleration. In frontal impact tests, the passenger compartment deceleration depends on the energy absorption properties of the front structures. To date, however, few papers describe the quantitative contributions of individual components to the passenger compartment deceleration. Generally, the cross-sectional force is used to examine each component's contribution to passenger compartment deceleration. However, it is difficult to determine each component's contribution from cross-sectional forces, especially within segments of an individual member such as a front rail, because force is transmitted continuously and the cross-sectional force remains the same along the component. The deceleration of a particle can instead be determined from the derivative of its kinetic energy. Using this energy-derivative method, the contribution of each component to the passenger compartment deceleration can be determined. Using finite element (FE) car models, the method was applied to full-width and offset impact tests, and also to evaluate the deceleration of the powertrain. The finite impulse response (FIR) coefficients relating the vehicle deceleration (input) to the driver chest deceleration (output) were calculated from Japan New Car Assessment Program (JNCAP) tests. These were applied to the components' contributions to vehicle deceleration in the FE analysis, and each component's contribution to the deceleration of the driver's chest was determined. The sum of the component contributions coincided with the passenger compartment deceleration in all impact types, confirming the validity of the method. In the full-width impact, the contribution of the crush box was large in the initial phase, and the contribution of the passenger compartment was large in the final phase. For the powertrain deceleration, the crush box had a positive contribution and the passenger compartment had a negative contribution. In the offset test, the contributions of the honeycomb and the passenger compartment deformation to the passenger compartment deceleration were large. Based on the FIR analysis, the passenger compartment deformation contributed the most to the chest deceleration of the driver dummy in the full-width impact. With the energy-derivative method, the contribution of component deformation to the deceleration of the passenger compartment can be calculated for various crash configurations more easily, directly, and quantitatively than with conventional methods. In addition, by combining the energy-derivative method and FIR, each structure's contribution to the occupant deceleration can be obtained. The energy-derivative method is useful for investigating how deceleration develops from component deformations and for designing deceleration curves for various impact configurations.
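For illustration, a minimal numerical sketch of the energy-derivative idea: each component's share of the deceleration is approximated as the time derivative of the energy it absorbs, divided by vehicle mass times velocity, a_i = (dE_i/dt)/(m v). The mass, velocity, and energy histories below are synthetic placeholders, not FE results.

    import numpy as np

    m = 1500.0                          # vehicle mass, kg (assumed)
    t = np.linspace(0.0, 0.1, 101)      # time, s
    v = 15.6 - 120.0 * t                # synthetic compartment velocity, m/s

    # Synthetic absorbed-energy histories per component, J.
    E_crush_box = 4.0e4 * (1 - np.exp(-60 * t))
    E_front_rail = 8.0e4 * t / t[-1]

    def contribution(E, v, t, m):
        # a_i = (dE_i/dt) / (m * v): component share of deceleration, m/s^2.
        return np.gradient(E, t) / (m * v)

    a_cb = contribution(E_crush_box, v, t, m)
    a_fr = contribution(E_front_rail, v, t, m)
    # Summing over all absorbing components would recover the compartment
    # deceleration; here only two synthetic components are included.
    print((a_cb + a_fr)[:3])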
Sheehan, Mary C; Lam, Juleen; Navas-Acien, Ana; Chang, Howard H
2016-01-01
Systematic review and meta-analysis (SRMA) are increasingly employed in environmental health (EH) epidemiology and, provided methods and reporting are sound, contribute to translating science evidence to policy. Ambient air pollution (AAP) is both among the leading environmental causes of mortality and morbidity worldwide, and of growing policy relevance due to the health co-benefits associated with greenhouse gas emission reductions. We reviewed the published AAP SRMA literature (2009 to mid-2015) and evaluated the consistency of methods, reporting and evidence evaluation using a 22-point questionnaire developed from available best-practice consensus guidelines and emerging recommendations for EH. Our goal was to contribute to enhancing the utility of AAP SRMAs to EH policy. We identified 43 studies that used both SR and MA techniques to examine associations between the AAPs PM2.5, PM10, NO2, SO2, CO and O3, and various health outcomes. On average, AAP SRMAs partially or thoroughly addressed 16 of 22 questions (range 10-21), and thoroughly addressed 13 of 22 (range 5-19). We found evidence of an improving trend over the period. However, we observed some weaknesses, particularly infrequent formal review of underlying study quality and risk of bias, which correlated with less thorough evaluation of key study quality parameters. Several other areas for enhanced reporting are highlighted. The AAP SRMA literature, in particular more recent studies, indicates broad concordance with current and emerging best-practice guidance. Development of an EH-specific SRMA consensus statement, including a risk-of-bias evaluation tool, would contribute to enhanced reliability and robustness as well as policy utility.
ERIC Educational Resources Information Center
Callinan, Carol J.; van der Zee, Emile; Wilson, Garry
2018-01-01
Social cognitive learning theory has shown that observational learning positively influences essay writing development in high-school students, and that self-efficacy impacts on motivation. This study investigated the relative contribution of model observation, model evaluation, post-submission feedback, and factors relating to self-efficacy, as…
ERIC Educational Resources Information Center
Jensen, Eric Allen
2016-01-01
This article addresses some of the challenges faced when attempting to evaluate the long-term impact of informal science learning interventions. To contribute to the methodological development of informal science learning research, we critically examine Falk and Needham's (2011, Journal of Research in Science Teaching, 48: 1-12) study…
ERIC Educational Resources Information Center
Hogan, Sarah; Stokes, Jacqueline; White, Catherine; Tyszkiewicz, Elizabeth; Woolgar, Alexandra
2008-01-01
Providing unbiased data concerning the outcomes of particular intervention methods is imperative if professionals and parents are to assimilate information which could contribute to an "informed choice". An evaluation of Auditory Verbal Therapy (AVT) was conducted using a formal assessment of spoken language as an outcome measure. Spoken…
ERIC Educational Resources Information Center
Torgerson, Carole J.
2009-01-01
The randomised controlled trial (RCT) is an evaluative method used by social scientists in order to establish whether or not an intervention is effective. This contribution discusses the fundamental aspects of good RCT design. These are illustrated through the use of a recently completed RCT which evaluated an information and communication…
Advancing the research agenda for diagnostic error reduction.
Zwaan, Laura; Schiff, Gordon D; Singh, Hardeep
2013-10-01
Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on the epidemiology, contributing factors and interventions related to diagnostic error, and outline directions for future research. Research methods that have been used to study the epidemiology of diagnostic error provide some estimates of diagnostic error rates. However, there is large variability in the reported rates due to the heterogeneity of definitions and study methods used. Thus, future methods should focus on obtaining more precise estimates in different settings of care. This would lay the foundation for measuring error rates over time to evaluate improvements. Research methods have studied contributing factors for diagnostic error in both naturalistic and experimental settings. Both approaches have revealed important and complementary information. Newer conceptual models from outside healthcare are needed to advance the depth and rigour of analysis of systems and cognitive insights of causes of error. While the literature has suggested many potentially fruitful interventions for reducing diagnostic errors, most have not been systematically evaluated and/or widely implemented in practice. Research is needed to study promising intervention areas such as enhanced patient involvement in diagnosis, improving diagnosis through the use of electronic tools and identification and reduction of specific diagnostic process 'pitfalls' (eg, failure to conduct appropriate diagnostic evaluation of a breast lump after a 'normal' mammogram). The last decade of research on diagnostic error has made promising steps and laid a foundation for more rigorous methods to advance the field.
Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty
NASA Technical Reports Server (NTRS)
Mather, Janice L.; Taylor, Shawn C.
2015-01-01
In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
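A minimal sketch of the root-sum-square combination described above; the uncertainty components and their values are placeholders, whereas a real analysis would first separate the bias and precision terms for the calibration standard, K-factor, and detector unit.

    import math

    # Illustrative relative uncertainty components (1-sigma, fractional).
    components = {
        "resolution":    0.02,
        "repeatability": 0.035,
        "hysteresis":    0.015,
        "drift":         0.025,
        "cal_standard":  0.04,   # calibration standard (bias + precision)
    }

    # Root-sum-square combination of the individual contributions.
    u_total = math.sqrt(sum(u ** 2 for u in components.values()))
    leak_rate = 1.2e-6           # measured leak rate, e.g. std cc/s helium
    print(f"{leak_rate:.2e} +/- {u_total * leak_rate:.2e} (RSS, k=1)")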
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development.
Yang, Y.; Mahler, B.J.; Van Metre, P.C.; Ligouis, B.; Werth, C.J.
2010-01-01
Measurements of black carbon (BC) using either chemical or thermal oxidation methods are generally thought to indicate the amount of char and/or soot present in a sample. In urban environments, however, asphalt and coal-tar particles worn from pavement are ubiquitous and, because of their pyrogenic origin, could contribute to measurements of BC. Here we explored the effect of the presence of asphalt and coal-tar particles on the quantification of BC in a range of urban environmental sample types, and evaluated biases in the different methods used for quantifying BC. Samples evaluated were pavement dust, residential and commercial area soils, lake sediments from a small urban watershed, and reference materials of asphalt and coal tar. Total BC was quantified using chemical treatment through acid dichromate (Cr2O7) oxidation and chemo-thermal oxidation at 375 °C (CTO-375). BC species, including soot and char/charcoal, asphalt, and coal tar, were quantified with organic petrographic analysis. Comparison of results by the two oxidation methods and organic petrography indicates that both coal tar and asphalt contribute to BC quantified by Cr2O7 oxidation, and that coal tar contributes to BC quantified by CTO-375. These results are supported by treatment of asphalt and coal-tar reference samples with Cr2O7 oxidation and CTO-375. The reference asphalt is resistant to Cr2O7 oxidation but not to CTO-375, and the reference coal tar is resistant to both Cr2O7 oxidation and CTO-375. These results indicate that coal tar and/or asphalt can contribute to BC measurements in samples from urban areas using Cr2O7 oxidation or CTO-375, and caution is advised when interpreting BC measurements made with these methods.
Offenberg, John H; Lewis, Charles W; Lewandowski, Michael; Jaoui, Mohammed; Kleindienst, Tadeusz E; Edney, Edward O
2007-06-01
An organic tracer method, recently proposed for estimating individual contributions of toluene and alpha-pinene to secondary organic aerosol (SOA) formation, was evaluated by conducting a laboratory study where a binary hydrocarbon mixture, containing the anthropogenic aromatic hydrocarbon, toluene, and the biogenic monoterpene, alpha-pinene, was irradiated in air in the presence of NO(x) to form SOA. The contributions of toluene and alpha-pinene to the total SOA concentration, calculated using the organic tracer method, were compared with those obtained with a more direct 14C content method. In the study, SOA to SOC ratios of 2.07 +/- 0.08 and 1.41 +/- 0.04 were measured for toluene and alpha-pinene SOA, respectively. The individual tracer-based SOA contributions of 156 microg m(-3) for toluene and 198 microg m(-3) for alpha-pinene, which together accounted for 82% of the gravimetrically determined total SOA concentration, compared well with the 14C values of 182 and 230 microg m(-3) measured for the respective SOA precursors. While there are uncertainties associated with the organic tracer method, largely due to the chemical complexity of SOA forming chemical mechanisms, the results of this study suggest the organic tracer method may serve as a useful tool for determining whether a precursor hydrocarbon is a major SOA contributor.
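A minimal sketch of the tracer arithmetic behind the method: the tracer concentration divided by a source-specific tracer-to-SOC mass fraction gives SOC, which the measured SOA/SOC ratio scales to SOA. The SOA/SOC ratios are from the study; the tracer concentrations and mass fractions are placeholders chosen so the outputs land near the reported 156 and 198 microg m(-3).

    # Organic tracer method: SOC_i = [tracer_i] / f_soc_i; SOA_i = ratio_i * SOC_i
    tracer_conc = {"toluene": 0.60, "alpha-pinene": 1.90}   # ug/m3, placeholders
    f_soc = {"toluene": 0.0080, "alpha-pinene": 0.0135}     # tracer/SOC, placeholders
    soa_soc = {"toluene": 2.07, "alpha-pinene": 1.41}       # ratios from the study

    for hc in tracer_conc:
        soc = tracer_conc[hc] / f_soc[hc]    # ug carbon per m3
        soa = soa_soc[hc] * soc              # ug SOA per m3
        print(hc, round(soa, 1))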
Balezentiene, Ligita; Kusta, Albinas
2012-01-01
N2O, CH4, and CO2 are potent greenhouse gases (GHG) contributing to climate change; therefore, solutions have to be sought to reduce their emission from agriculture. This work evaluates GHG emissions from grasslands subjected to different mineral fertilizers during the vegetation period (June-September) at two experimental sites, namely, a seminatural grassland (8 mineral fertilizer treatments) and a cultural pasture (intensively managed) at the Training Farm of the Lithuanian University of Agriculture. The chamber method was applied for the evaluation of GHG emissions at the field scale. Soil chemical composition, compactness, temperature, and gravimetric moisture, as well as fresh and dry biomass yield and botanical composition, were assessed during the research. Furthermore, a simulation of multi-criteria assessment of sustainable fertilizer management was carried out on the basis of the ARAS method. The multi-criteria analysis of the different fertilizing regimes was based on a system of environmental and productivity indices. Consequently, the agroecosystems of the cultural pasture (N180P120K150) and the seminatural grassland fertilizing rates N180P120K150 and N60P40K50 were evaluated as the most sustainable alternatives, leading to a reduction of emissions between biosphere and atmosphere and of human-induced biogenic pollution in grassland ecosystems, thus contributing to improvement of the countryside environment.
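A minimal sketch of the ARAS (Additive Ratio Assessment) ranking step of the multi-criteria part: the decision matrix is normalised against an added optimal alternative, weighted, and each alternative is scored relative to the optimum. The matrix, weights, and the benefit/cost split of criteria are illustrative.

    import numpy as np

    # Rows: alternatives (fertilizing regimes); columns: criteria. Illustrative.
    X = np.array([[5.2, 3.1, 7.0],
                  [6.8, 2.4, 6.1],
                  [4.9, 4.0, 6.6]])
    benefit = np.array([True, False, True])   # False = cost criterion (e.g. GHG)
    w = np.array([0.4, 0.35, 0.25])           # criteria weights, sum to 1

    # Prepend the optimal alternative: max for benefit, min for cost criteria.
    x0 = np.where(benefit, X.max(0), X.min(0))
    M = np.vstack([x0, X]).astype(float)
    M[:, ~benefit] = 1.0 / M[:, ~benefit]     # invert cost criteria
    M /= M.sum(0)                             # column normalisation
    S = (M * w).sum(1)                        # weighted sums
    K = S[1:] / S[0]                          # utility relative to the optimum
    print(K.round(3))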
Critical evaluation of lung scintigraphy in cystic fibrosis: study of 113 patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piepsz, A.; Wetzburger, C.; Spehl, M.
1980-10-01
A long-term study has been performed on 285 lung perfusion scintigrams obtained from 113 patients with cystic fibrosis. Transverse and longitudinal comparisons with clinical and radiological scores, as well as retrospective analysis of the deceased patients, were the methods used in order to evaluate the importance of the scintigraphic images. It appears that lung scintigraphy is the best index of the regional lung impairment, and contributes, as does a chest radiograph, to the early detection of lung lesions, the two methods being complementary.
Basic methods for measuring the reflectance color of iron oxides
NASA Astrophysics Data System (ADS)
Pospisil, Jaroslav; Hrdy, Jan; Hrdy Jan, Jr.
2007-06-01
The main contribution of the present article is a coherent description and interpretation of the principles of the basic measuring methods and colorimeters used for the color classification and evaluation of light-reflecting samples containing iron oxides. The relevant theoretical background is based on the CIE tristimulus colorimetric system (X,Y,Z), the CIE colorimetric system (L*,a*,b*) and the Munsell colorimetric system (H,V,C). As an example of color identification and evaluation, some specific mathematical and graphical relationships between the soil redness rate and the corresponding hematite content are shown.
A revision of the gamma-evaluation concept for the comparison of dose distributions.
Bakai, Annemarie; Alber, Markus; Nüsslin, Fridtjof
2003-11-07
A method for the quantitative four-dimensional (4D) evaluation of discrete dose data based on gradient-dependent local acceptance thresholds is presented. The method takes into account the local dose gradients of a reference distribution for critical appraisal of misalignment and collimation errors. These contribute to the maximum tolerable dose error at each evaluation point to which the local dose differences between comparison and reference data are compared. As shown, the presented concept is analogous to the gamma-concept of Low et al (1998a Med. Phys. 25 656-61) if extended to (3+1) dimensions. The pointwise dose comparisons of the reformulated concept are easier to perform and speed up the evaluation process considerably, especially for fine-grid evaluations of 3D dose distributions. The occurrences of false negative indications due to the discrete nature of the data are reduced with the method. The presented method was applied to film-measured, clinical data and compared with gamma-evaluations. 4D and 3D evaluations were performed. Comparisons prove that 4D evaluations have to be given priority, especially if complex treatment situations are verified, e.g., non-coplanar beam configurations.
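A minimal one-dimensional sketch of the pointwise test described above, assuming the usual 3%/3 mm tolerances: the maximum tolerable dose error at each point combines a base dose tolerance with the local reference gradient multiplied by the spatial tolerance, and the measured-minus-reference difference is compared against it. The profiles and parameters are illustrative.

    import numpy as np

    x = np.linspace(0, 50, 501)                      # position, mm
    ref = 100 / (1 + np.exp((x - 25) / 2.0))         # reference dose profile
    meas = 100 / (1 + np.exp((x - 25.8) / 2.0))      # measured: 0.8 mm shift

    delta_D = 3.0        # base dose tolerance (% of max), assumed
    delta_r = 3.0        # spatial tolerance, mm, assumed

    grad = np.abs(np.gradient(ref, x))               # local reference gradient
    tol = np.sqrt(delta_D**2 + (grad * delta_r)**2)  # gradient-dependent tolerance
    chi = (meas - ref) / tol                         # pass where |chi| <= 1
    print(f"pass rate: {np.mean(np.abs(chi) <= 1):.1%}")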
Kim, Yong-Sung; Kim, Yong-Suk
2015-01-01
There are several methods available for measuring food taste. Sensory evaluation, for instance, is a typical method in which panels taste food and recognize smell with the nose, rating taste characteristics, intensity, and pleasantness. The traditional sensory evaluation method entails many issues, such as forming a panel and the cost of evaluation; moreover, it can only be conducted in particular locations. Accordingly, this paper selected foods from one particular area, compared and reviewed sensory evaluations against measurements from a taste biosensor, presented an analysis of brainwaves using EEG, and finally proposed a new method for sensory evaluation. The researchers conducted a sensory evaluation on a nine-point scale of eight purchased types of rice wine (Makgeolli). These eight types of Makgeolli were characterized by generating multidimensional data with the TS-5000z taste-sensing system, and the resulting mapping points were learned and scaled. The contribution of this paper, therefore, is to overcome the disadvantages of sensory evaluation through the use of the suggested taste biosensor system. PMID:26247031
Gan, Ruijing; Chen, Xiaojun; Yan, Yu; Huang, Daizheng
2015-01-01
Accurate incidence forecasting of infectious disease provides potentially valuable insights in its own right. It is critical for early prevention and may contribute to health services management and syndromic surveillance. This study investigates the use of a hybrid algorithm combining a grey model (GM) and back-propagation artificial neural networks (BP-ANN) to forecast hepatitis B in China based on yearly numbers of hepatitis B cases, and evaluates the method's feasibility. The results showed that the proposed method has advantages over GM(1,1) and GM(2,1) on all evaluation indexes.
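A minimal sketch of the GM(1,1) component of such a hybrid (the BP-ANN part would then typically be trained on the residuals of this grey forecast). The yearly case counts are placeholders.

    import numpy as np

    def gm11(x0, horizon=3):
        """GM(1,1) grey forecast: fit dx1/dt + a*x1 = b on the cumulative
        series, then difference the fitted series back to yearly values."""
        x0 = np.asarray(x0, float)
        x1 = np.cumsum(x0)
        z1 = 0.5 * (x1[1:] + x1[:-1])                 # background values
        B = np.column_stack([-z1, np.ones_like(z1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + horizon)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        return np.concatenate([[x0[0]], np.diff(x1_hat)])

    cases = [9715, 9863, 10012, 10234, 10098, 9950]   # placeholder yearly counts
    print(gm11(cases, horizon=2)[-2:])                # two-year-ahead forecast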
Identification and stabilization methods for problematic silt soils : technical summary.
DOT National Transportation Integrated Search
2002-05-01
The objectives of this research are to (1) identify the soil properties and characteristics that contribute to a pumping condition, (2) evaluate the effectiveness of selected chemical stabilization techniques, and (3) provide a recommendation for alte...
Recommendations for the treatment of aging in standard technical specifications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, R.D.; Allen, R.P.
1995-09-01
As part of the US Nuclear Regulatory Commission's Nuclear Plant Aging Research Program, Pacific Northwest Laboratory (PNL) evaluated the standard technical specifications for nuclear power plants to determine whether the current surveillance requirements (SRs) were effective in detecting age-related degradation. Nuclear Plant Aging Research findings for selected systems and components were reviewed to identify the stressors and operative aging mechanisms and to evaluate the methods available to detect, differentiate, and trend the resulting aging degradation. Current surveillance and testing requirements for these systems and components were reviewed for their effectiveness in detecting degraded conditions and for potential contributions to premature degradation. When the current surveillance and testing requirements appeared ineffective in detecting aging degradation or potentially could contribute to premature degradation, a possible deficiency in the SRs was identified that could result in undetected degradation. Based on this evaluation, PNL developed recommendations for inspection, surveillance, trending, and condition monitoring methods to be incorporated in the SRs to better detect age-related degradation of these selected systems and components.
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Katz, R.; Wilson, J. W.
1998-01-01
An analytic method is described for evaluating the average radial electron spectrum and the radial and total frequency-event spectrum for high-energy ions. For high-energy ions, indirect events make important contributions to frequency-event spectra. The method used for evaluating indirect events is to fold the radial electron spectrum with the measured frequency-event spectrum for photons or electrons. The contribution from direct events is treated using a spatially restricted linear energy transfer (LET). We find that high-energy heavy ions have a significantly reduced frequency-averaged lineal energy (yF) compared to LET, while relativistic protons have a significantly increased yF and dose-averaged lineal energy (yD) for typical site sizes used in tissue-equivalent proportional counters. Such differences represent important factors in evaluating event spectra with laboratory beams, in spaceflight, or in atmospheric radiation studies and in the validation of radiation transport codes. The inadequacy of LET as a descriptor, owing to deviations in physical quantities such as track width, secondary electron spectrum, and yD for ions of identical LET, is also discussed.
NASA Astrophysics Data System (ADS)
Bergese, P.; Bontempi, E.; Depero, L. E.
2006-10-01
X-ray reflectivity (XRR) is a non-destructive, accurate and fast technique for evaluating film density. However, sample-goniometer alignment is a critical experimental factor and the overriding error source in XRR density determination. With commercial single-wavelength X-ray reflectometers, alignment is difficult to control and strongly depends on the operator. In the present work, the contribution of misalignment to the density evaluation error is discussed, and a novel procedure (named the XRR-density evaluation, or XRR-DE, method) to minimize the problem is presented. The method allows the alignment step to be bypassed by extrapolating the correct density value from appropriate non-specular XRR data sets. The procedure is operator independent and suitable for commercial single-wavelength X-ray reflectometers. To test the XRR-DE method, single crystals of TiO2 and SrTiO3 were used. In both cases the determined densities differed from the nominal ones by less than 5.5%. Thus, the XRR-DE method can be successfully applied to evaluate the density of thin films for which only optical reflectivity is currently used. The advantage is that this method can be considered thickness independent.
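For orientation, a minimal sketch of the standard relation on which XRR density determination rests: the critical angle fixes the dispersion term delta, hence the electron density, hence the mass density once the stoichiometry (Z/A) is known. The wavelength assumes a Cu K-alpha source and the numbers are illustrative; this shows the underlying physics, not the XRR-DE extrapolation procedure itself.

    import numpy as np

    r_e = 2.818e-15            # classical electron radius, m
    lam = 1.5406e-10           # Cu K-alpha wavelength, m (assumed source)
    N_A = 6.022e23             # Avogadro's number, 1/mol

    theta_c = np.deg2rad(0.297)              # measured critical angle (illustrative)
    delta = theta_c**2 / 2                   # from theta_c = sqrt(2 * delta)
    rho_e = 2 * np.pi * delta / (r_e * lam**2)   # electron density, e/m^3

    Z_over_A = 0.495                         # electrons per g/mol, TiO2 (assumed)
    rho = rho_e / (N_A * Z_over_A) / 1e6     # mass density, g/cm^3
    print(round(rho, 2))                     # ~4.2 for TiO2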
Comparative analysis of quantitative efficiency evaluation methods for transportation networks.
He, Yuxin; Qin, Jin; Hong, Jian
2017-01-01
An effective evaluation of transportation network efficiency could offer guidance for the optimal control of urban traffic. Based on the introduction and related mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information measured by them, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. By analyzing different transportation network examples, it is shown that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, which indicates a better evaluation of the real operating condition of a transportation network. From the analysis of the network efficiency calculated by the Q-H method, it can also be concluded that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure contributing to the largest value of transportation network efficiency can be identified.
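For orientation, one widely used measure of this kind is the Nagurney-Qiang network efficiency, the average ratio of origin-destination demand to equilibrium travel cost; whether this is exactly the Q-H formulation referenced above is an assumption here, and the demands and costs are placeholders.

    import numpy as np

    # Origin-destination demands and equilibrium travel costs (placeholders).
    d = np.array([100.0, 80.0, 60.0])      # demand per O-D pair
    lam = np.array([20.0, 25.0, 15.0])     # equilibrium travel cost per pair

    efficiency = np.mean(d / lam)          # E(G, d) = (1/n_W) * sum(d_w / lam_w)
    print(round(efficiency, 2))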
Risser, Dennis W.; Madden, Thomas M.
1994-01-01
Valley-fill aquifers in Pennsylvania are the source of drinking water for many wells in the glaciated parts of the State and along major river valleys. These aquifers are subject to contamination because of their shallow water-table depth and highly transmissive sediments. The possibility of contamination of water-supply wells in valley-fill aquifers can be minimized by excluding potentially contaminating activities from the areas that contribute water to supply wells. An area that contributes water to a well is identified in this report as an area of diversion, a time-of-travel area, or a contributing area. The area of diversion is the projection to land surface of the valley-fill aquifer volume through which water is diverted to a well, and the time-of-travel area is the fraction of the area of diversion through which water moves to the well in a specified time. The contributing area, the largest of the three areas, includes the area of diversion but also incorporates bedrock uplands and other areas that contribute water. Methods for delineating areas of diversion and contributing areas in valley-fill aquifers, described and compared in order of increasing complexity, include fixed-radius, uniform-flow, analytical, semianalytical, and numerical modeling methods. Delineated areas are considered approximations because the hydraulic properties and boundary conditions of the real ground-water system are simplified even in the most complex numerical methods. Successful application of any of these methods depends on the investigator's understanding of the hydrologic system in and near the well field, and of the limitations of the method. The hydrologic system includes not only the valley-fill aquifer but also the regional surface-water and ground-water flow systems within which the valley is situated. As shown by numerical flow simulations of a well field in the valley-fill aquifer along Marsh Creek Valley near Asaph, Pa., water from upland bedrock sources can provide nearly all of the water contributed to the well.
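A minimal sketch of the simplest listed technique, the calculated fixed-radius method: the radius encloses the cylinder of aquifer pore space needed to supply the well over the chosen time of travel. The pumping rate, porosity, thickness, and the 10-year criterion are placeholders.

    import math

    Q = 2000.0       # pumping rate, m^3/day (placeholder)
    t = 365.0 * 10   # time of travel, days (10-year criterion, assumed)
    n = 0.25         # effective porosity of valley-fill sediments (placeholder)
    b = 30.0         # saturated thickness, m (placeholder)

    # Volume pumped = volume of the cylinder of pore space: Q*t = pi*r^2*n*b
    r = math.sqrt(Q * t / (math.pi * n * b))
    print(round(r), "m")   # radius of the calculated fixed-radius area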
Heat release from wood wall assemblies using oxygen consumption method
Hao C. Tran; Robert E. White
1990-01-01
The concept of heat release rate is gaining acceptance in the evaluation of fire performance of materials and assemblies. However, this concept has not been incorporated into fire endurance testing such as the ASTM E-119 test method. Heat release rate of assemblies can be useful in determining the time at which the assemblies start to contribute to the controlled fire...
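A minimal sketch of the oxygen consumption calculation itself, assuming the simplest form in which heat release is proportional to the mass of oxygen consumed (roughly 13.1 MJ per kg of O2, Huggett's constant); full implementations include corrections for CO2, CO, and water vapour. The flow and concentration values are placeholders.

    E_O2 = 13.1e3           # kJ per kg of oxygen consumed (Huggett's constant)
    m_dot = 0.50            # exhaust duct mass flow, kg/s (placeholder)
    X_O2_in, X_O2_out = 0.2095, 0.1980   # O2 mole fractions, ambient vs exhaust

    # Simplified: O2 mass consumption rate from the concentration deficit,
    # converting mole fraction to mass fraction via the molar-mass ratio.
    w_O2 = 32.0 / 28.97
    m_O2 = m_dot * w_O2 * (X_O2_in - X_O2_out)
    q = E_O2 * m_O2         # heat release rate, kW
    print(round(q, 1), "kW")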
An Intelligent Hierarchical Decision Architecture for Operational Test and Evaluation
1996-05-01
[Table-of-contents and acronym-list extract, garbled in source: Results; 3.4 Contribution. Acronyms: FCM, Fuzzy Cognitive Map; FMEA, Failure Modes and Effects Analysis; HWIL, Hardware-in-the-Loop; IBL, Increase in Break Locks; IDA, Institute for Defense... Figure 8, PROD-ALL COMMFFY Compositional Method; Figure 9, PROD-MAX COMMFFY Compositional Method.]
A method for the in vivo measurement of americium-241 at long times post-exposure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neton, J.W.
1988-01-01
This study investigated an improved method for the quantitative measurement, calibration and calculation of 241Am organ burdens in humans. The techniques developed correct for cross-talk, i.e., count-rate contributions from surrounding and adjacent organ burdens, and ensure the proper assignment of activity to the lungs, liver and skeleton. In order to predict the net count rates for the measurement geometries of the skull, liver and lung, a background prediction method was developed. This method utilizes data obtained from the measurement of a group of control subjects. Based on these data, a linear prediction equation was developed for each measurement geometry. In order to correct for the cross-contributions among the various deposition loci, a series of surrogate human phantom structures were measured. The results of measurements of 241Am depositions in six exposure cases were evaluated using these new techniques and indicated that lung burden estimates could be in error by as much as 100 percent when corrections are not made for contributions to the count rate from other organs.
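A minimal sketch of the cross-talk correction idea: net count rates in the skull, liver, and lung counting geometries are modelled as a linear mix of the organ burdens through phantom-derived detection efficiencies, and the burdens are recovered by solving the linear system. The efficiency matrix and count rates are invented for illustration.

    import numpy as np

    # Rows: counting geometries (skull, liver, lung); columns: source organs
    # (skeleton, liver, lungs). Entries: counts/s per kBq, phantom-derived
    # efficiencies -- illustrative values only.
    E = np.array([[0.80, 0.10, 0.05],
                  [0.08, 0.70, 0.15],
                  [0.05, 0.20, 0.60]])

    net_rates = np.array([1.9, 2.6, 1.7])    # background-corrected counts/s

    burdens = np.linalg.solve(E, net_rates)  # kBq per organ, cross-talk corrected
    print(burdens.round(2))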
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanemoto, S.; Andoh, Y.; Sandoz, S.A.
1984-10-01
A method for evaluating reactor stability in boiling water reactors has been developed. The method is based on multivariate autoregressive (M-AR) modeling of steady-state neutron and process noise signals. In this method, two kinds of power spectral densities (PSDs) for the measured neutron signal and the corresponding noise source signal are separately identified by the M-AR modeling. The closed- and open-loop stability parameters are evaluated from these PSDs. The method is applied to actual plant noise data that were measured together with artificial perturbation test data. Stability parameters identified from noise data are compared to those from perturbation test data, and it is shown that both results are in good agreement. In addition to these stability estimations, driving noise sources for the neutron signal are evaluated by the M-AR modeling. Contributions from void, core flow, and pressure noise sources are quantitatively evaluated, and the void noise source is shown to be the most dominant.
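A minimal sketch of the core step: fitting a multivariate AR model to noise signals by least squares and reading a stability index (the decay ratio of the dominant oscillatory pole) from its companion matrix. The data are synthetic, and the method's separation of closed- and open-loop PSDs is not reproduced here.

    import numpy as np

    def mar_decay_ratio(Y, p=10):
        """Fit a multivariate AR(p) model Y[t] = sum_k A_k Y[t-k] + e[t] by
        least squares and return the decay ratio of its dominant pole."""
        T, m = Y.shape
        X = np.hstack([Y[p - k:T - k] for k in range(1, p + 1)])
        C = np.linalg.lstsq(X, Y[p:], rcond=None)[0]       # (m*p, m)
        comp = np.zeros((m * p, m * p))                    # companion matrix
        for k in range(p):
            comp[:m, k * m:(k + 1) * m] = C[k * m:(k + 1) * m].T
        comp[m:, :-m] = np.eye(m * (p - 1))
        eig = np.linalg.eigvals(comp)
        osc = eig[np.abs(eig.imag) > 1e-6]                 # oscillatory poles
        lam = osc[np.argmax(np.abs(osc))]                  # dominant pole
        r, theta = np.abs(lam), np.abs(np.angle(lam))
        return r ** (2 * np.pi / theta)                    # decay ratio

    rng = np.random.default_rng(1)
    noise = rng.normal(size=(5000, 2))   # stand-in for neutron/process signals
    print(mar_decay_ratio(noise))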
Wang, Lin-Yan; Tang, Yu-Ping; Liu, Xin; Ge, Ya-Hui; Li, Shu-Jiao; Shang, Er-Xin; Duan, Jin-Ao
2014-04-01
To establish a method for studying the efficacious materials of traditional Chinese medicines from an overall perspective, Carthamus tinctorius was taken as the example. Its major components were depleted by preparative liquid chromatography. Afterwards, the samples with major components depleted were evaluated for their antioxidant effect, in order to compare and analyze the major efficacious materials of C. tinctorius with antioxidant activity and their contributions. Seven major components were depleted from the C. tinctorius samples, and six of them were identified from MS data and comparison with standards. After all of the samples, including the component-depleted ones, were compared and evaluated for their antioxidant effect, the findings showed that hydroxysafflor yellow A, anhydrosafflor yellow B and 6-hydroxykaempferol-3,6-di-O-glucoside-7-O-glucuronide were the major efficacious materials. This study explored a novel and effective method for studying the efficacious materials of traditional Chinese medicines. Through this method, the direct and indirect contributions of different components to the efficacy of traditional Chinese medicines can be explained, making the expression of their efficacious materials clearer.
Gürses, İlke Ali; Coşkun, Osman; Gürtekin, Başak; Kale, Ayşin
2016-12-01
Appreciating the contribution of donor cadavers to medical education is a well-observed practice among anatomists. However, the appreciation of their contribution in research and scientific articles remains inconsistent. We aimed to evaluate how much data anatomists provide about the specimens they have used and how frequently anatomists acknowledge their cadavers in published articles. We evaluated all articles based on human cadaveric specimens published in Clinical Anatomy and Surgical and Radiologic Anatomy between January 2011 and December 2015, assessing how much information on the demographics, preservation method(s), source, and ethical/legal permissions regarding the cadavers was provided. We also counted the articles that acknowledged donor cadavers. The majority of articles provided demographic data (age and sex) and the preservation method used. The source of the specimens was not mentioned in 45.6% of the articles. Only 26.2% of the articles reported some degree of consent, and only 32.4% reported some form of ethical approval for the study. The cadavers and their families were acknowledged in 17.7% of the articles. We observed that no standard method for reporting such data has been established. Anatomists should collaborate to create awareness in the scientific community of the need to provide adequate information regarding donor cadavers, including source and consent. Acknowledging donor cadavers and/or their families should also be promoted. Scientific articles should be used to create a transparent relationship of trust between anatomists and their society.
NASA Technical Reports Server (NTRS)
1972-01-01
A survey of nondestructive evaluation (NDE) technology, which is discussed in terms of popular demands for a greater degree of quality, reliability, and safety in industrial products, is presented as an overview of the NDE field to serve the needs of middle management. Three NDE methods are presented: acoustic emission, the use of coherent (laser) light, and ultrasonic holography.
Turley, James P; Johnson, Todd R; Smith, Danielle Paige; Zhang, Jiajie; Brixey, Juliana J
2006-04-01
Use of medical devices often directly contributes to medical errors. Because it is difficult or impossible to change the design of existing devices, the best opportunity for improving medical device safety is during the purchasing process. However, most hospital personnel are not familiar with the usability evaluation methods designed to identify aspects of a user interface that do not support intuitive and safe use. A review of medical device operating manuals is proposed as a more practical method of usability evaluation. Operating manuals for five volumetric infusion pumps from three manufacturers were selected for this study (January-April 2003). Each manual's safety message content was evaluated to determine whether the message indicated a device design characteristic that violated known usability principles (heuristics) or indicated a violation of an affordance of the device. "Minimize memory load," with 65 violations, was the heuristic violated most frequently across pumps. Variations between pumps, including the frequency and severity of violations for each, were noted. Results suggest that manual review can provide a proxy for heuristic evaluation of the actual medical device. This method, intended to be a component of prepurchasing evaluation, can complement more formal usability evaluation methods and be used to select a subset of devices for more extensive and formal testing.
[Endoscopic contribution in the dilatation of caustic esophagus stenosis].
Seydou, Togo; Abdoulaye, Ouattara Moussa; Xing, Li; Zi, Sanogo Zimogo; Sekou, Koumaré; Wen, Yang Shang; Ibrahim, Sankare; Sekou, Toure Cheik Ahmed; Boubacar, Maiga Ibrahim; Saye, Jacque; Jerome, Dakouo Dodino; Dantoumé, Toure Ousmane; Sadio, Yena
2016-01-01
The aim of this work was to present the contribution of endoscopy to the management of esophageal dilatation for caustic esophageal stenosis (CES). This was a descriptive and prospective study in the thoracic surgery department of the Hospital of Mali. A total of 46 cases of CES were recorded and divided into 4 groups according to the topography of the esophageal lesions. For the different methods of dilatation, the number of endoscopically assisted procedures was determined in order to assess the contribution of endoscopic means to the success of dilatation for CES. The outcomes, complications and mortality of the two methods were compared. Fibroscopy was used in 41.30% of patients dilated with Savary-Gilliard dilators and in 47.82% of patients dilated with Lerut dilators. Video laryngoscopy was used in 58.69% of patients who underwent dilatation with Lerut dilators. The passage of the guide wire was performed under video laryngoscopy in 39.13% of cases and under fibroscopy in 58.68%. Comparison of the two methods showed significant differences in the occurrence of complications (p=0.04075), general anesthesia (p=0.02287), accessibility (p=0.04805) and mortality (p=0.00402). CES is a serious and under-evaluated disease in Mali. Endoscopy contributes significantly to the success of esophageal dilatation for caustic stenosis with the different methods we used.
Nakamura, Ryo; Nakano, Kumiko; Tamura, Hiroyasu; Mizunuma, Masaki; Fushiki, Tohru; Hirata, Dai
2017-08-01
Many factors contribute to palatability. In order to evaluate the palatability of Japanese sake paired with certain dishes by integrating multiple factors, we applied an evaluation method previously reported for the palatability of cheese: multiple regression analysis based on 3 subdomain factors (rewarding, cultural, and informational). We asked 94 Japanese participants to evaluate the palatability of sake (1st evaluation/E1 for the first cup, 2nd/E2 and 3rd/E3 for the palatability with the aftertaste/afterglow of certain dishes) and to respond to a questionnaire related to the 3 subdomains. In E1, 3 factors were extracted by factor analysis, and the subsequent multiple regression analyses indicated that the palatability of sake was explained mainly by the rewarding factor. Further, the results of attribution dissections in E1 indicated that 2 factors (rewarding and informational) contributed to the palatability. Finally, our results indicated that the palatability of sake was influenced by the dish eaten just before drinking.
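A minimal sketch of the subdomain-based multiple regression used in such evaluations: palatability ratings regressed on the three factor scores. All data and effect sizes below are random placeholders.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 94                                   # number of participants
    X = rng.normal(size=(n, 3))              # factor scores: rewarding,
                                             # cultural, informational
    beta_true = np.array([0.9, 0.2, 0.4])    # assumed effect sizes
    y = X @ beta_true + rng.normal(scale=0.5, size=n)   # palatability rating

    # Ordinary least squares with an intercept term.
    Xd = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    print(coef.round(2))   # intercept + per-subdomain contributions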
Assessment of effect of Yb3+ ion pairs on a highly Yb-doped double-clad fibre laser
NASA Astrophysics Data System (ADS)
Vallés, J. A.; Martín, J. C.; Berdejo, V.; Cases, R.; Álvarez, J. M.; Rebolledo, M. Á.
2018-03-01
Using a previously validated characterization method based on careful measurement of the characteristic parameters and fluorescence emission spectra of a highly Yb-doped double-clad fibre, we evaluate the contribution of ion-pair-induced processes to the output power of a double-clad Yb-doped fibre ring laser. This contribution is shown to be insignificant, contrary to the analyses of other authors, who overestimate the role of ion pairs.
Buccino, Carla; Ferrara, Carmen; Malvano, Carmela; De Feo, Giovanni
2017-11-07
This study presents an evaluation of the environmental performance of an ice cream cup made of polyethylene (PE)/paper laminate, using a life cycle assessment approach 'from cradle to grave'. Two opposite disposal scenarios, as well as their intermediate combinations, were considered: 100% incineration and 100% landfilling. The environmental impacts were calculated using the EPD 2013 evaluation method, since the study was developed from an Environmental Product Declaration perspective, as well as the ReCiPe 2008 H method at the endpoint level. PE/paper laminate production was the most impactful process, providing the highest contribution to total impacts in four of the six impact categories considered. Ice cream cup production was the second most impactful process. The 100% incineration scenario provided a negligible end-of-life contribution to the life cycle total impact for all impact categories, whereas under the landfilling scenario the percentage contributions of the end-of-life phase increased considerably, to the point of being comparable to the contributions of the production of the PE/paper laminate and the ice cream cup. The results highlight that different disposal scenarios can significantly affect the conclusions of a study. At the endpoint level, incineration was more environmentally sound than landfilling for all the ReCiPe damage categories.
Yokotani, Kaori; Umegaki, Keizo
2017-02-01
The contribution of (-)-epigallocatechin gallate (EGCg) intake to in vivo antioxidant activity is unclear, even with respect to plasma. In this study, we examined how administration of EGCg contributes to plasma antioxidant activity, relative to its concentration, endogenous antioxidants, and assay methods, namely oxygen radical absorbance capacity (ORAC) and ferric reducing/antioxidant power (FRAP). Administration of EGCg (500 mg/kg) to rats increased plasma EGCg (4 μmol/L as the free form) and ascorbic acid (1.7-fold), as well as ORAC (1.2-fold) and FRAP (3-fold) values. The increase in plasma ascorbic acid following EGCg administration was accompanied by its relocation from the adrenal glands and lymphocytes into plasma, and was related to the increase in FRAP. Plasma deproteinization and assays in plasma model solutions revealed that protein levels contributed significantly to ORAC values, whereas <3 μmol/L EGCg in the presence of protein exhibited minimal antioxidant activity, as measured by both FRAP and ORAC. As the concentration of plasma ascorbic acid was not influenced by deproteinization, differences in FRAP values with and without deproteinization were used to estimate the contribution of the enhanced ascorbic acid attributable to EGCg administration. These results help clarify the points that should be considered when evaluating EGCg antioxidant activity in plasma.
Strategies for Evaluating Complex Environmental Education Programs
NASA Astrophysics Data System (ADS)
Williams, V.
2011-12-01
Evidence for the effectiveness of environmental education programs has been difficult to establish for many reasons. Chief among them are the lack of clear program objectives and an inability to conceptualize how environmental education programs work. Both can lead to evaluations that make claims that are difficult to substantiate, such as significant changes in student achievement levels or behavioral changes based on acquisition of knowledge. Many of these challenges can be addressed by establishing the program theory and developing a logic model. However, claims of impact on larger societal outcomes are difficult to attribute solely to program activities. Contribution analysis may offer a promising method for addressing this challenge. Rather than attempt to definitively and causally link a program's activities to desired results, contribution analysis seeks to provide plausible evidence that can reduce uncertainty regarding the 'difference' a program is making to observed outcomes. It sets out to verify the theory of change behind a program and, at the same time, takes into consideration other influencing factors. Contribution analysis is useful in situations where the program is not experimental (there is little or no scope for varying how the program is implemented) and where the program has been funded on the basis of a theory of change. In this paper, the author reviews the feasibility of using contribution analysis as a way of evaluating the impact of the GLOBE program, an environmental science and education program. The program, initially conceptualized by Al Gore in 1995, bases its implementation model on worldwide environmental monitoring by students and scientists around the globe. This paper makes a significant and timely contribution to the field of evaluation, and specifically to environmental education evaluation, by examining the usefulness of this analysis for developing evidence to assess the impact of environmental education programs.
Tai, Joanna Hong-Meng; Canny, Benedict J; Haines, Terry P; Molloy, Elizabeth K
2016-08-01
This study explored the contribution of peer-assisted learning (PAL) in the development of evaluative judgement capacity; the ability to understand work quality and apply those standards to appraising performance. The study employed a mixed methods approach, collecting self-reported survey data, observations of, and reflective interviews with, the medical students observed. Participants were in their first year of clinical placements. Data were thematically analysed. Students indicated that PAL contributed to both the comprehension of notions of quality, and the practice of making comparisons between a given performance and the standards. Emergent themes included peer story-telling, direct observation of performance, and peer-based feedback, all of which helped students to define 'work quality'. By participating in PAL, students were required to make comparisons, therefore using the standards of practice and gaining a deeper understanding of them. The data revealed tensions in that peers were seen as less threatening than supervisors with the advantage of increasing learners' appetites for thoughtful 'intellectual risk taking'. Despite this reported advantage of peer engagement, learners still expressed a preference for feedback from senior teachers as more trusted sources of clinical knowledge. While this study suggests that PAL already contributes to the development of evaluative judgement, further steps could be taken to formalise PAL in clinical placements to improve learners' capacity to make accurate judgements on the performance of self and others. Further experimental studies are necessary to confirm the best methods of using PAL to develop evaluative judgement. This may include both students and educators as instigators of PAL in the workplace.
Jackson, Bret; Coffey, Dane; Thorson, Lauren; Schroeder, David; Ellingson, Arin M; Nuckley, David J; Keefe, Daniel F
2012-10-01
In this position paper we discuss successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach that we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, then these may become one of the most effective future strategies for both formative and summative evaluations.
NASA Astrophysics Data System (ADS)
Milic, Vladimir; Kasac, Josip; Novakovic, Branko
2015-10-01
This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, the multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.
Shen, Yufeng; Tolić, Nikola; Xie, Fang; Zhao, Rui; Purvine, Samuel O.; Schepmoes, Athena A.; Moore, Ronald J.; Anderson, Gordon A.; Smith, Richard D.
2011-01-01
We report on the effectiveness of CID, HCD, and ETD for LC-FT MS/MS analysis of peptides using a tandem linear ion trap-Orbitrap mass spectrometer. A range of software tools and analysis parameters were employed to explore the use of CID, HCD, and ETD to identify peptides isolated from human blood plasma without the use of specific “enzyme rules”. In the evaluation of an FDR-controlled SEQUEST scoring method, the use of accurate masses for fragments increased the number of identified peptides (by ~50%) compared to the use of conventional low-accuracy fragment mass information, and CID provided the largest contribution to the identified peptide datasets compared to HCD and ETD. The FDR-controlled Mascot scoring method provided significantly fewer peptide identifications than SEQUEST (by 1.3–2.3 fold) at the same confidence levels, and CID, HCD, and ETD provided similar contributions to identified peptides. Evaluation of de novo sequencing and the UStags method for more intense fragment ions revealed that HCD afforded more consecutive sequence residues (e.g., ≥7 amino acids) than either CID or ETD. Both the FDR-controlled SEQUEST and Mascot scoring methods provided peptide datasets that were affected by the decoy database and mass tolerances applied (e.g., the overlap of identical peptides between datasets could be limited to ~70%), while the UStags method provided the most consistent peptide datasets (>90% overlap) with extremely low (near zero) numbers of false positive identifications. The m/z ranges in which CID, HCD, and ETD contributed the largest number of peptide identifications were substantially overlapping. This work suggests that the three peptide ion fragmentation methods are complementary, and that maximizing the number of peptide identifications benefits significantly from a careful match with the informatics tools and methods applied. These results also suggest that the decoy strategy may inaccurately estimate identification FDRs. PMID:21678914
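The FDR control referred to here rests on the standard target-decoy estimate: the number of decoy matches passing a score threshold approximates the number of false target matches at that threshold. A minimal sketch (the scores below are hypothetical, not from the paper's pipeline):

    def decoy_fdr(target_scores, decoy_scores, threshold):
        # Assumes a decoy database of the same size as the target database,
        # so the decoy count estimates the number of false target matches.
        targets = sum(1 for s in target_scores if s >= threshold)
        decoys = sum(1 for s in decoy_scores if s >= threshold)
        return decoys / targets if targets else 0.0

    # Hypothetical SEQUEST-like scores: scan thresholds for the lowest one
    # whose estimated FDR stays below the desired confidence level.
    targets = [4.1, 3.8, 3.2, 2.9, 2.5, 2.1]
    decoys = [2.8, 2.2, 1.9]
    print(decoy_fdr(targets, decoys, 3.0))  # 0.0 at this threshold

As the abstract notes, the estimate depends on the decoy database and the mass tolerances applied, which is why different tools report different peptide sets at the same nominal FDR.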
Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan
2017-05-01
This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention, with the goal of building up a context-sensitive minimal clinical ethics structure in an oncology department that had no prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services (CESS) for different reasons over several decades. However, while much has been said about the concepts and methodological challenges of evaluating CESS, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. We further discuss our own approach in contrast to rather traditional consult or committee models. © 2017 John Wiley & Sons Ltd.
Evaluation of algorithm methods for fluorescence spectra of cancerous and normal human tissues
NASA Astrophysics Data System (ADS)
Pu, Yang; Wang, Wubao; Alfano, Robert R.
2016-03-01
This paper focuses on algorithms that unravel fluorescence spectra by unmixing methods to distinguish cancerous from normal human tissues based on measured fluorescence spectroscopy. The biochemical or morphologic changes that cause fluorescence spectral variations appear earlier than changes detectable by the histological approach; therefore, fluorescence spectroscopy holds great promise as a clinical tool for diagnosing early-stage carcinomas and other diseases in vivo. The method can further identify tissue biomarkers by decomposing the spectral contributions of different fluorescent molecules of interest. In this work, we investigate the performance of blind source unmixing methods (backward model) and spectral fitting approaches (forward model) in decomposing the contributions of key fluorescent molecules from the tissue mixture background when a selected excitation wavelength is applied. Pairs of adenocarcinoma and normal tissue samples, confirmed by a pathologist, were excited at a selected wavelength of 340 nm. The emission spectra of resected fresh tissue were used to evaluate the relative changes of collagen, reduced nicotinamide adenine dinucleotide (NADH), and flavin by various spectral unmixing methods. Two categories of algorithms are introduced and evaluated: forward methods and blind source separation methods [such as principal component analysis (PCA), independent component analysis (ICA), and nonnegative matrix factorization (NMF)]. The purpose of the spectral analysis is to discard the redundant information that conceals the difference between these two types of tissues while keeping the diagnostically significant information. The predictions of the different methods were compared to the gold standard of histopathology. The results indicate that the key fluorophores within tissue, e.g., tryptophan, collagen, NADH, and flavin, show differences in relative content among different types of human cancerous and normal tissues. Sensitivity, specificity, and the receiver operating characteristic (ROC) are employed as the criteria to evaluate the efficacy of these methods in cancer detection. The underlying physical and biological basis for these optical approaches is discussed with examples. This ex vivo preliminary trial demonstrates that the different criteria from the different methods can distinguish carcinoma from normal tissues with good sensitivity and specificity; among them, ICA appears to be the superior method in prediction accuracy.
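As a hedged illustration of the blind source separation branch described above (not the authors' code; the component count, wavelength grid, and data below are hypothetical), non-negative matrix factorization can decompose a set of measured emission spectra into non-negative component spectra and their per-sample contributions:

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    # Hypothetical data: 20 tissue spectra sampled at 200 emission wavelengths,
    # mixed from 4 underlying fluorophores (e.g., collagen, NADH, flavin, tryptophan).
    true_spectra = np.abs(rng.normal(size=(4, 200)))   # component emission spectra
    abundances = np.abs(rng.normal(size=(20, 4)))      # per-sample contributions
    X = abundances @ true_spectra + 0.01 * rng.random((20, 200))

    model = NMF(n_components=4, init="nndsvd", max_iter=1000, random_state=0)
    W = model.fit_transform(X)   # estimated contributions (20 x 4)
    H = model.components_        # estimated component spectra (4 x 200)
    print(W.shape, H.shape)

The recovered contribution columns of W would then serve as the features whose differences between cancerous and normal samples are tested against histopathology.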
Tremblay, Marie-Claude; Brousselle, Astrid; Richard, Lucie; Beaudet, Nicole
2013-10-01
Program designers and evaluators should make a point of testing the validity of a program's intervention theory before investing either in implementation or in any type of evaluation. In this context, logic analysis can be a particularly useful option, since it can be used to test the plausibility of a program's intervention theory using scientific knowledge. Professional development in public health is one field among several that would truly benefit from logic analysis, as it appears to be generally lacking in theorization and evaluation. This article presents the application of this analysis method to an innovative public health professional development program, the Health Promotion Laboratory. More specifically, this paper aims to (1) define the logic analysis approach and differentiate it from similar evaluative methods; (2) illustrate the application of this method by a concrete example (logic analysis of a professional development program); and (3) reflect on the requirements of each phase of logic analysis, as well as on the advantages and disadvantages of such an evaluation method. Using logic analysis to evaluate the Health Promotion Laboratory showed that, generally speaking, the program's intervention theory appeared to have been well designed. By testing and critically discussing logic analysis, this article also contributes to further improving and clarifying the method. Copyright © 2013 Elsevier Ltd. All rights reserved.
Puras-Gil, A M; López-Cousillas, A
1999-01-01
It is obvious that technology has contributed throughout history to the development of the different sciences. In this article, we define the concept of Pathology as a medical speciality, and we explain its influence in a hospital, considering very different fields such as education, research, quality control, hospital information, and patient care. This speciality has undergone a considerable evolution, to which technological innovation has undoubtedly contributed. As a basic discipline, it is of great importance in pre- and postgraduate training, in medical education at the hospital or outside it, and in the fields previously mentioned. Its relation with other disciplines such as Chemistry (fixation and staining), Physics (mechanical devices), Mathematics (algorithms, morphometry, statistics...) and Telecommunications (telepathology, image analysis...) is examined and their contribution to Pathology is evaluated. We are also aware of contributions made by Pathology to technological innovation in the evaluation of different diagnostic methods or in recent therapeutic technologies based on Radiotherapy, Hyperthermia, laser, prostheses, etc., where histological examination provides accurate information about the therapeutic capacity, side-effects, or rejection reactions caused, aiding research to obtain adequate results.
Tepper, Ronnie
2017-01-01
Background Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow's graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking, and objective quantitative research comparing advanced training methods with DL is sparse. Objectives This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students' progress and provided Web-based immediate feedback. Methods A low-cost, globally accessible, telemedicine simulator, developed at the Technion-Israel Institute of Technology, Haifa, Israel, was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator for enhanced learning via knowledge exams, presented promising results, with average scores rising from 54% before training to 94% after training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of OZ) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students' knowledge before and after each interactive instructor-led session as well as after fully automated e-learning in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions This study evaluated the contribution of an integrated teaching environment using a computerized interactive simulator, with an instructor providing immediate Web-based feedback to trainees. Involvement of an instructor in the simulation-based training process provided better learning outcomes; varying training content and trainee populations did not affect the overall learning gains. PMID:28432039
NASA Astrophysics Data System (ADS)
Chakraborty, Bipasha; Davies, C. T. H.; Koponen, J.; Lepage, G. P.; Peardon, M. J.; Ryan, S. M.
2016-04-01
The quark-line disconnected diagram is a potentially important ingredient in lattice QCD calculations of the hadronic vacuum polarization contribution to the anomalous magnetic moment of the muon. It is also a notoriously difficult one to evaluate. Here, for the first time, we give an estimate of this contribution based on lattice QCD results that have a statistically significant signal, albeit at one value of the lattice spacing and an unphysically heavy value of the u/d quark mass. We use HPQCD's method of determining the anomalous magnetic moment by reconstructing the Adler function from time moments of the current-current correlator at zero spatial momentum. Our results lead to a total (including u, d and s quarks) quark-line disconnected contribution to aμ of -0.15% of the u/d hadronic vacuum polarization contribution with an uncertainty which is 1% of that contribution.
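For context, the time-moment construction named above is commonly written as follows (a schematic with one common normalization; conventions vary between papers, so treat this as background rather than the paper's exact definitions):

    G_{2n} = a \sum_{t} t^{2n}\, G(t), \qquad
    \hat{\Pi}(q^2) = \sum_{j \ge 1} q^{2j}\, \Pi_j, \qquad
    \Pi_j = (-1)^{j+1} \frac{G_{2j+2}}{(2j+2)!},

    a_\mu^{\mathrm{HVP}} = \frac{\alpha}{\pi} \int_0^\infty dq^2\, f(q^2)\, \hat{\Pi}(q^2),

where G(t) is the zero-spatial-momentum current-current correlator, the Taylor coefficients \Pi_j reconstruct the subtracted vacuum polarization (and hence the Adler function), and f(q^2) is the known QED kernel.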
Review of water footprint components of grain
NASA Astrophysics Data System (ADS)
Ahmad, Wan Amiza Amneera Wan; Meriam Nik Sulaiman, Nik; Zalina Mahmood, Noor
2017-06-01
A burgeoning global population, economic development, agriculture and prevailing climate patterns are among the factors contributing to water scarcity. In low and middle income countries, agriculture takes the highest share among water-using sectors. Demand for grain is widespread all over the globe. Hence, this study reviews published papers on the quantification of the water footprint of grain. The review shows that there are various methods for quantifying water footprint. In ascertaining the water footprint, either three components (green, blue, grey) or two (green, blue) are involved; however, one study introduced a new term in evaluating water footprint, the white water footprint. A weakness of these varying methods is the difficulty of making comparisons among water footprint studies. The dominant component contributing to a high water footprint also varies: in some studies the green water footprint plays the major role, while conversely a few studies found the blue water footprint to be the most important contributing component. This fluctuating pattern is influenced by various aspects, namely regional climatic characteristics, crop yield and crop type.
DOT National Transportation Integrated Search
2014-05-01
At its most basic, an asphalt mixture is asphalt binder and crushed stone aggregate. This seemingly simple mixture is very complex; method of preparation and application, additives, and aggregate type all influence the quality and durabilit...
[Evaluation of the first training on clinical research methodology in Chile].
Espinoza, Manuel; Cabieses, Báltica; Pedreros, César; Zitko, Pedro
2011-03-01
This paper describes the evaluation of the first training on clinical research methodology in Chile (EMIC-Chile) 12 months after its completion. An online survey of the students was conducted, and the Delphi method was used with the teaching team. Among the students, the majority reported that the program had contributed to their professional development and that they had shared some of the knowledge acquired with colleagues in their workplace. Forty-one percent submitted a project to obtain research funding through a competitive grants process once they had completed the course. Among the teachers, the areas of greatest interest were the communication strategy, teaching methods, the characteristics of the teaching team, and potential strategies for making EMIC-Chile permanent in the future. This experience could contribute to future research training initiatives for health professionals. Recognized challenges are the involvement of nonmedical professions in clinical research, the complexities associated with the distance learning methodology, and the continued presence of initiatives of this importance at the national and regional level.
Bassi da Silva, Jéssica; Ferreira, Sabrina Barbosa de Souza; de Freitas, Osvaldo; Bruschi, Marcos Luciano
2017-07-01
Mucoadhesion is a useful strategy for drug delivery systems, such as tablets, patches, gels, liposomes, micro/nanoparticles, nanosuspensions, microemulsions and colloidal dispersions. Moreover, it provides many benefits, such as increased residence time at application sites, drug protection, increased drug permeation and improved drug availability. In this context, investigation of the mucoadhesive properties of pharmaceutical dosage forms is fundamental in order to characterize, understand and simulate the in vivo interaction between the formulation and the biological substrate, contributing to the development of new mucoadhesive systems with effectiveness, safety and quality. There are many in vivo, in vitro and ex vivo methods for the evaluation of the mucoadhesive properties of drug delivery systems. However, there is also a lack of standardization of these techniques, which makes comparison between results difficult. Therefore, this work aims to give an overview of the most commonly employed methods for mucoadhesion evaluation, relating them to the different proposed systems and to the use of artificial or natural mucosa from humans and animals.
Evaluating care from a care ethical perspective: A pilot study.
Kuis, Esther E; Goossensen, Anne
2017-08-01
Care ethical theories provide an excellent opening for the evaluation of healthcare practices, since searching for (moments of) good care from a moral perspective is central to care ethics. However, a fruitful way to translate care ethical insights into measurable criteria, and how to measure those criteria, has as yet been unexplored: this study describes one of the first attempts. To investigate whether the emotional touchpoint method is suitable for evaluating care from a care ethical perspective. An adapted version of the emotional touchpoint interview method was used. Touchpoints represent the key moments in the experience of receiving care, where the patient recalls being touched emotionally or cognitively. Participants and research context: Interviews were conducted at three different care settings: a hospital, a mental healthcare institution and a care facility for older people. A total of 31 participants (29 patients and 2 relatives) took part in the study. Ethical considerations: The research was found not to be subject to the (Dutch) Medical Research Involving Human Subjects Act. A three-step care ethical evaluation model was developed and described using two touchpoints as examples. A focus group meeting showed that the method was considered of great value by the participating institutions in comparison with existing methods. Reflection and discussion: Considering existing methods for evaluating quality of care, the touchpoint method belongs to the category of instruments that evaluate the patient experience. The touchpoint method distinguishes itself because no pre-defined categories are used; rather, the values of patients are followed, which is an essential issue from a care ethical perspective. The method portrays the insider perspective of patients and thereby contributes to humanizing care. The touchpoint method is a valuable instrument for evaluating care; it generates evaluation data about the core care ethical principle of responsiveness.
Evaluation of sites for the location of WEEE recycling plants in Spain.
Queiruga, Dolores; Walther, Grit; González-Benito, Javier; Spengler, Thomas
2008-01-01
As a consequence of new European legal regulations for the treatment of waste electrical and electronic equipment (WEEE), recycling plants have to be installed in Spain. In this context, this contribution describes a method for ranking Spanish municipalities according to their appropriateness for the installation of these plants. In order to rank the alternatives, the discrete multi-criteria decision method PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations), combined with a survey of experts, is applied. As existing plants are located in North and East Spain, a significant concentration of top-ranking municipalities can be observed in South and Central Spain. The method does not produce an optimal structure for the future recycling system, but provides a selection of good alternatives for potential locations of recycling plants.
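A minimal sketch of the PROMETHEE II net-flow ranking used in such studies (the criteria values, weights, and the simple 'usual' preference function below are illustrative assumptions, not the paper's expert-survey data):

    import numpy as np

    # Rows: candidate municipalities; columns: criteria (all maximized here).
    F = np.array([[7.0, 3.0, 5.0],
                  [5.0, 6.0, 4.0],
                  [6.0, 5.0, 7.0]])
    w = np.array([0.5, 0.3, 0.2])          # criteria weights, summing to 1

    n = F.shape[0]
    pi = np.zeros((n, n))                  # aggregated preference of a over b
    for a in range(n):
        for b in range(n):
            if a != b:
                # 'usual' preference function: 1 if a beats b on criterion j
                pi[a, b] = np.sum(w * (F[a] > F[b]))

    phi_plus = pi.sum(axis=1) / (n - 1)    # positive outranking flow
    phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
    phi = phi_plus - phi_minus             # net flow; higher ranks better
    print(np.argsort(-phi))                # ranking of the alternatives

In practice PROMETHEE offers several preference function shapes (linear, level, Gaussian, etc.) with thresholds elicited from the experts; the 'usual' function is just the simplest choice.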
Development of the Ion Exchange-Gravimetric Method for Sodium in Serum as a Definitive Method
Moody, John R.; Vetter, Thomas W.
1996-01-01
An ion exchange-gravimetric method, previously developed as a National Committee for Clinical Laboratory Standards (NCCLS) reference method for the determination of sodium in human serum, has been re-evaluated and improved. Sources of analytical error in this method have been examined more critically and the overall uncertainties decreased. Additionally, greater accuracy and repeatability have been achieved by the application of this definitive method to a sodium chloride reference material. In this method sodium in serum is ion-exchanged, selectively eluted and converted to a weighable precipitate as Na2SO4. Traces of sodium eluting before or after the main fraction, and precipitate contaminants are determined instrumentally. Co-precipitating contaminants contribute less than 0.1 % while the analyte lost to other eluted ion-exchange fractions contributes less than 0.02 % to the total precipitate mass. With improvements, the relative expanded uncertainty (k = 2) of the method, as applied to serum, is 0.3 % to 0.4 % and is less than 0.1 % when applied to a sodium chloride reference material. PMID:27805122
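As illustrative arithmetic for the gravimetric step (not taken from the paper), the sodium mass follows from the weighed Na2SO4 precipitate via the standard gravimetric factor:

    m_{\mathrm{Na}} = m_{\mathrm{Na_2SO_4}} \cdot \frac{2 M_{\mathrm{Na}}}{M_{\mathrm{Na_2SO_4}}}
                    = m_{\mathrm{Na_2SO_4}} \cdot \frac{2 \times 22.990}{142.04}
                    \approx 0.3237\, m_{\mathrm{Na_2SO_4}},

so the sub-0.1% corrections for co-precipitating contaminants and lost eluate fractions quoted above enter directly as small additive corrections to the weighed precipitate mass.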
Kamneva, Olga K; Rosenberg, Noah A
2017-01-01
Hybridization events generate reticulate species relationships, giving rise to species networks rather than species trees. We report a comparative study of consensus, maximum parsimony, and maximum likelihood methods of species network reconstruction using gene trees simulated assuming a known species history. We evaluate the role of the divergence time between species involved in a hybridization event, the relative contributions of the hybridizing species, and the error in gene tree estimation. When gene tree discordance is mostly due to hybridization and not due to incomplete lineage sorting (ILS), most of the methods can detect even highly skewed hybridization events between highly divergent species. For recent divergences between hybridizing species, when the influence of ILS is sufficiently high, likelihood methods outperform parsimony and consensus methods, which erroneously identify extra hybridizations. The more sophisticated likelihood methods, however, are affected by gene tree errors to a greater extent than are consensus and parsimony. PMID:28469378
Alternative method for evaluating the pair energy of nucleons in nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nurmukhamedov, A. M., E-mail: fattah52@mail.ru
2015-12-15
An alternative method for determining the odd–even effect parameter related to special features of the Casimir operator in Wigner’s mass formula for nuclei is proposed. A procedure for calculating this parameter is presented. The proposed method relies on a geometric interpretation of the Casimir operator, experimental data concerning the contribution of spin–orbit interaction to the nuclear mass for even–even and odd–odd nuclei, and systematics of energy gaps in the spectra of excited states of even–even nuclei.
NASA Technical Reports Server (NTRS)
Rodriguez, Ernesto; Kim, Yunjin; Durden, Stephen L.
1992-01-01
A numerical evaluation is presented of the regime of validity for various rough surface scattering theories against numerical results obtained by employing the method of moments. The contribution of each theory is considered up to second order in the perturbation expansion for the surface current. Considering both vertical and horizontal polarizations, the unified perturbation method provides the best results among all the theories considered.
Evaluation of genetic divergence among clones of conilon coffee after scheduled cycle pruning.
Dalcomo, J M; Vieira, H D; Ferreira, A; Lima, W L; Ferrão, R G; Fonseca, A F A; Ferrão, M A G; Partelli, F L
2015-11-30
Coffea canephora genotypes from the breeding program of the Instituto Capixaba de Pesquisa e Extensão Rural were evaluated, and genetic diversity was estimated with the aim of informing future improvement strategies. From an initial group of 55 genotypes, 18 from the region of Castelo, ES, were selected, along with three clones of the cultivars "Vitória" and "robusta tropical." Upon completion of the scheduled cycle pruning, 17 morphoagronomic traits were measured in the 22 selected genotypes. The principal components method was used to evaluate the relative contributions of the traits. The genetic dissimilarity matrix was obtained through the Mahalanobis generalized distance, and genotypes were grouped using the hierarchical method based on the mean of the distances. The most promising clones of Avaliação Castelo were AC02, AC03, AC12, AC13, AC22, AC24, AC26, AC27, AC28, AC29, AC30, AC35, AC36, AC37, AC39, AC40, AC43, and AC46. These methods detected high genetic variability, grouping the genotypes by similarity into five groups. The trait that contributed the least to genetic divergence was the number of leaves on plagiotropic branches; however, it was not eliminated, because discarding it altered the groups. There are superior genotypes with potential for use in the next stages of the breeding program, aimed both at the composition of a clonal variety and at hybridizations.
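A hedged sketch of the dissimilarity-and-grouping step described above (the trait matrix is synthetic; the paper's data are not reproduced): Mahalanobis distances between genotypes followed by average-linkage clustering into five groups:

    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    X = rng.normal(size=(22, 17))          # 22 genotypes x 17 morphoagronomic traits

    VI = np.linalg.inv(np.cov(X.T))        # inverse covariance for Mahalanobis
    D = pdist(X, metric="mahalanobis", VI=VI)

    Z = linkage(D, method="average")       # grouping on the mean of the distances
    groups = fcluster(Z, t=5, criterion="maxclust")
    print(groups)                          # cluster label (1..5) per genotype

Average linkage is assumed here as the reading of "the hierarchical method based on the mean of the distances"; other linkage choices would change the grouping.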
ERIC Educational Resources Information Center
Svetina, Dubravka
2013-01-01
The purpose of this study was to investigate the effect of complex structure on dimensionality assessment in noncompensatory multidimensional item response models using dimensionality assessment procedures based on DETECT (dimensionality evaluation to enumerate contributing traits) and NOHARM (normal ogive harmonic analysis robust method). Five…
The substrate of fluvial systems is regularly characterized as part of a larger physical habitat assessment. Beyond contributing to a basic scientific understanding of fluvial systems, these measures are instrumental in meeting the regulatory responsibilities of bioassessment and...
Asada, Toshio; Ando, Kanta; Bandyopadhyay, Pradipta; Koseki, Shiro
2016-09-08
A widely applicable free energy contribution analysis (FECA) method based on the quantum mechanical/molecular mechanical (QM/MM) approximation using response kernel approaches has been proposed to investigate the influences of environmental residues and/or atoms in the QM region on the free energy profile. This method can evaluate atomic contributions to the free energy along the reaction path including polarization effects on the QM region within a dramatically reduced computational time. The rate-limiting step in the deactivation of the β-lactam antibiotic cefalotin (CLS) by β-lactamase was studied using this method. The experimentally observed activation barrier was successfully reproduced by free energy perturbation calculations along the optimized reaction path that involved activation by the carboxylate moiety in CLS. It was found that the free energy profile in the QM region was slightly higher than the isolated energy and that two residues, Lys67 and Lys315, as well as water molecules deeply influenced the QM atoms associated with the bond alternation reaction in the acyl-enzyme intermediate. These facts suggested that the surrounding residues are favorable for the reactant complex and prevent the intermediate from being too stabilized to proceed to the following deacylation reaction. We have demonstrated that the free energy contribution analysis should be a useful method to investigate enzyme catalysis and to facilitate intelligent molecular design.
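The free energy perturbation step mentioned here is conventionally based on Zwanzig's exponential formula, shown below as background (the paper's exact estimator and windowing along the path are not given in the abstract):

    \Delta A_{i \rightarrow i+1} = -k_{B}T \,\ln \left\langle \exp\!\left[-\frac{E_{i+1}(\mathbf{r}) - E_{i}(\mathbf{r})}{k_{B}T}\right] \right\rangle_{i},

with the profile along the reaction path accumulated as A(\xi_n) = \sum_{i<n} \Delta A_{i \to i+1}; a per-residue contribution analysis then decomposes the environmental part of E into atomic or residue terms, which is the role the FECA method plays above.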
Climate-dependence of ecosystem services in a nature reserve in northern China.
Fang, Jiaohui; Song, Huali; Zhang, Yiran; Li, Yanran; Liu, Jian
2018-01-01
Evaluation of ecosystem services has become a hotspot in terms of research focus, but uncertainties over appropriate methods remain. Evaluation can be based on the unit price of services (services value method) or the unit price of the area (area value method). The former takes meteorological factors into account, while the latter does not. This study uses Kunyu Mountain Nature Reserve as a study site at which to test the effects of climate on the ecosystem services. Measured data and remote sensing imagery processed in a geographic information system were combined to evaluate gas regulation and soil conservation, and the influence of meteorological factors on ecosystem services. Results were used to analyze the appropriateness of the area value method. Our results show that the value of ecosystem services is significantly affected by meteorological factors, especially precipitation. Use of the area value method (which ignores the impacts of meteorological factors) could considerably impede the accuracy of ecosystem services evaluation. Results were also compared with the valuation obtained using the modified equivalent value factor (MEVF) method, which is a modified area value method that considers changes in meteorological conditions. We found that MEVF still underestimates the value of ecosystem services, although it can reflect to some extent the annual variation in meteorological factors. Our findings contribute to increasing the accuracy of evaluation of ecosystem services.
Effect of genotyped cows in the reference population on the genomic evaluation of Holstein cattle.
Uemoto, Y; Osawa, T; Saburi, J
2017-03-01
This study evaluated the dependence of reliability and prediction bias on the prediction method, the contribution of including animals (bulls or cows), and the genetic relatedness, when including genotyped cows in the progeny-tested bull reference population. We performed genomic evaluation using a Japanese Holstein population, and assessed the accuracy of genomic enhanced breeding value (GEBV) for three production traits and 13 linear conformation traits. A total of 4564 animals for production traits and 4172 animals for conformation traits were genotyped using Illumina BovineSNP50 array. Single- and multi-step methods were compared for predicting GEBV in genotyped bull-only and genotyped bull-cow reference populations. No large differences in realized reliability and regression coefficient were found between the two reference populations; however, a slight difference was found between the two methods for production traits. The accuracy of GEBV determined by single-step method increased slightly when genotyped cows were included in the bull reference population, but decreased slightly by multi-step method. A validation study was used to evaluate the accuracy of GEBV when 800 additional genotyped bulls (POPbull) or cows (POPcow) were included in the base reference population composed of 2000 genotyped bulls. The realized reliabilities of POPbull were higher than those of POPcow for all traits. For the gain of realized reliability over the base reference population, the average ratios of POPbull gain to POPcow gain for production traits and conformation traits were 2.6 and 7.2, respectively, and the ratios depended on heritabilities of the traits. For regression coefficient, no large differences were found between the results for POPbull and POPcow. Another validation study was performed to investigate the effect of genetic relatedness between cows and bulls in the reference and test populations. The effect of genetic relationship among bulls in the reference population was also assessed. The results showed that it is important to account for relatedness among bulls in the reference population. Our studies indicate that the prediction method, the contribution ratio of including animals, and genetic relatedness could affect the prediction accuracy in genomic evaluation of Holstein cattle, when including genotyped cows in the reference population.
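For background on the single-step method compared here, single-step GBLUP combines pedigree and genomic relationships through the inverse of a joint relationship matrix (a textbook form, e.g. Aguilar et al.; the abstract itself gives no formulas):

    \mathbf{H}^{-1} = \mathbf{A}^{-1} +
    \begin{pmatrix} \mathbf{0} & \mathbf{0} \\ \mathbf{0} & \mathbf{G}^{-1} - \mathbf{A}_{22}^{-1} \end{pmatrix},

where A is the pedigree relationship matrix, G the genomic relationship matrix, and A_22 the pedigree block for genotyped animals (here, the genotyped bulls and cows). Adding genotyped cows enlarges the G and A_22 blocks, which is how they influence the evaluation without progeny records of their own.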
Corrêa, A M; Pereira, M I S; de Abreu, H K A; Sharon, T; de Melo, C L P; Ito, M A; Teodoro, P E; Bhering, L L
2016-10-17
The common bean, Phaseolus vulgaris, is predominantly grown on small farms and lacks accurate genotype recommendations for specific micro-regions in Brazil. This contributes to a low national average yield. The aim of this study was to use the methods of the harmonic mean of the relative performance of genetic values (HMRPGV) and the centroid, for selecting common bean genotypes with high yield, adaptability, and stability for the Cerrado/Pantanal ecotone region in Brazil. We evaluated 11 common bean genotypes in three trials carried out in the dry season in Aquidauana in 2013, 2014, and 2015. A likelihood ratio test detected a significant interaction between genotype x year, contributing 54% to the total phenotypic variation in grain yield. The three genotypes selected by the joint analysis of genotypic values in all years (Carioca Precoce, BRS Notável, and CNFC 15875) were the same as those recommended by the HMRPGV method. Using the centroid method, genotypes BRS Notável and CNFC 15875 were considered ideal genotypes based on their high stability to unfavorable environments and high responsiveness to environmental improvement. We identified a high association between the methods of adaptability and stability used in this study. However, the use of centroid method provided a more accurate and precise recommendation of the behavior of the evaluated genotypes.
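A minimal sketch of the HMRPGV statistic as it is commonly defined in this literature (following Resende's method; the genotypic values below are hypothetical):

    import numpy as np

    # Hypothetical genotypic values: 11 genotypes x 3 years (environments).
    rng = np.random.default_rng(2)
    GV = rng.uniform(1.5, 3.5, size=(11, 3))

    # Relative performance: genotypic value scaled by the environment mean.
    RPGV = GV / GV.mean(axis=0)

    # Harmonic mean over environments rewards both high mean and stability,
    # since a single poor year drags the harmonic mean down sharply.
    n_env = RPGV.shape[1]
    HMRPGV = n_env / (1.0 / RPGV).sum(axis=1)

    print(np.argsort(-HMRPGV))  # genotypes ranked for yield, adaptability, stability

This is why the statistic selects simultaneously for yield, adaptability, and stability, in agreement with the centroid classification reported above.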
Suspended-sediment sources in an urban watershed, Northeast Branch Anacostia River, Maryland
Devereux, Olivia H.; Prestegaard, Karen L.; Needelman, Brian A.; Gellis, Allen C.
2010-01-01
Fine sediment sources were characterized by chemical composition in an urban watershed, the Northeast Branch Anacostia River, which drains to the Chesapeake Bay. Concentrations of 63 elements and two radionuclides were measured in possible land-based sediment sources and in suspended sediment collected from the water column at the watershed outlet during storm events. These tracer concentrations were used to determine the relative quantity of suspended sediment contributed by each source. Although this is an urbanized watershed, there was not a distinct urban signature that could be evaluated, except for the contributions from road surfaces. We identified the sources of fine sediment both by physiographic province (Piedmont and Coastal Plain) and by source locale (streambanks, upland and street residue) by using different sets of elemental tracers. The Piedmont contributed the majority of the fine sediment for seven of the eight measured storms. The streambanks contributed the greatest quantity of fine sediment when evaluated by source locale. Street residue contributed 13% of the total suspended sediment on average and was the source most concentrated in anthropogenically enriched elements. Combining results from the source locale and physiographic province analyses, most fine sediment in the Northeast Branch watershed is derived from streambanks that contain sediment eroded from the Piedmont physiographic province of the watershed. Sediment fingerprinting analyses are most useful when longer term evaluations of sediment erosion and storage are also available from streambank-erosion measurements, sediment budgets and other methods.
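Sediment fingerprinting of this kind typically solves a constrained mixing model for the source proportions; a hedged sketch (the tracer values below are made up, and the study's tracer set is far larger):

    import numpy as np
    from scipy.optimize import minimize

    # Mean tracer concentrations: 3 sources x 4 tracers (hypothetical values).
    S = np.array([[12.0, 3.1, 40.0, 0.8],    # streambanks
                  [18.0, 2.2, 55.0, 1.4],    # upland
                  [25.0, 6.5, 30.0, 2.9]])   # street residue
    c = np.array([16.0, 3.0, 46.0, 1.3])     # suspended-sediment mixture

    def loss(p):
        return np.sum((c - p @ S) ** 2)      # squared tracer misfit

    cons = ({"type": "eq", "fun": lambda p: p.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * 3
    res = minimize(loss, x0=np.full(3, 1 / 3), bounds=bounds, constraints=cons)
    print(res.x)                              # estimated source proportions

Real applications usually weight each tracer by its within-source variance and run the optimization many times with resampled source data to get uncertainty bounds on the proportions.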
NASA Astrophysics Data System (ADS)
Cherecheş, Ioan-Aurel; Borzan, Adela-Ioana; Băldean, Doru-Laurean
2017-10-01
Studying the construction and wear processes of piston rings and other significant components of internal combustion engines yields creative and useful optimization ideas in both the design and manufacturing phases. The main objective of the present paper is to carry out interdisciplinary research using advanced methods for piston-ring evaluation on a common road vehicle, the Ford Focus FYDD. Specific objectives are a theoretical study of an advanced analysis method for piston-ring evaluation and applied research carried out at the Technical University of Cluj-Napoca on a vehicle undergoing repair.
Contribution to the rheological testing of pharmaceutical semisolids.
Siska, B; Snejdrova, E; Machac, I; Dolecek, P; Martiska, J
2018-01-22
The rheological behaviour of pharmaceutical semisolid preparations significantly affects the manufacturing process, administration, stability, homogeneity of the incorporated drug, accuracy of dosing, adhesion at the place of application, drug release, and the resulting therapeutic effect of the product. We performed consistency tests by penetrometry; rotational, oscillation and creep tests; and squeeze and tack tests on model samples to introduce methods suitable for the characterization and comparison of semisolids in practice. Penetrometry is a simple method that allows sorting semisolids into low and high stress-resistant materials, but it is insufficient for the rheological characterization of semisolids. The value of the yield stress, generally considered an appropriate characteristic of semisolids, is significantly influenced by the method of testing and the way of evaluation. The hysteresis loops of the model semisolids revealed incomplete thixotropy; therefore, a three-step thixotropy test was employed. The semisolids showed a nonlinear response in the creep phase of the tests and partial recovery of structure by storing energy in the recovery phase. Squeeze and tack tests seem to be convenient ways of comparing semisolids. Our study can contribute to a better understanding of the different flow behaviours of semisolids given by the different physicochemical properties of excipients and can bring useful approaches to the evaluation and comparison of semisolids in practice.
NASA Astrophysics Data System (ADS)
Fatrias, D.; Kamil, I.; Meilani, D.
2018-03-01
Coordinating business operations with suppliers has become increasingly important for surviving and prospering in a dynamic business environment. A good partnership with suppliers not only increases efficiency, but also strengthens corporate competitiveness. In view of this, this study develops a practical approach to multi-criteria supplier evaluation using the combined methods of the Taguchi loss function (TLF), the best-worst method (BWM) and VIseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). A new integrative framework adopting these methods is our main contribution to the supplier evaluation literature. In this integrated approach, a compromise supplier ranking list based on the loss scores of suppliers is obtained using the efficient steps of a pairwise-comparison-based decision making process. Implementation on a case problem with real data from the crumb rubber industry shows the usefulness of the proposed approach. Finally, a suitable managerial implication is presented.
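A minimal sketch of the VIKOR compromise-ranking step (the supplier scores and weights below are illustrative assumptions; in the paper the weights come from BWM and the scores from Taguchi losses):

    import numpy as np

    # Supplier scores on 4 criteria, here already oriented so higher is better
    # (e.g., Taguchi losses converted to a benefit scale).
    F = np.array([[0.7, 0.5, 0.9, 0.6],
                  [0.6, 0.8, 0.7, 0.5],
                  [0.9, 0.4, 0.6, 0.8]])
    w = np.array([0.4, 0.3, 0.2, 0.1])        # criteria weights (e.g., from BWM)

    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    norm = (f_best - F) / (f_best - f_worst)  # normalized regret per criterion

    S = (w * norm).sum(axis=1)                # group utility
    R = (w * norm).max(axis=1)                # maximal individual regret
    v = 0.5                                   # weight of the majority strategy
    Q = v * (S - S.min()) / (S.max() - S.min()) \
        + (1 - v) * (R - R.min()) / (R.max() - R.min())
    print(np.argsort(Q))                      # lower Q ranks better

VIKOR additionally checks "acceptable advantage" and "acceptable stability" conditions before declaring a single compromise winner; those checks are omitted from this sketch.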
Bamberger, Michael; Tarsilla, Michele; Hesse-Biber, Sharlene
2016-04-01
Many widely-used impact evaluation designs, including randomized control trials (RCTs) and quasi-experimental designs (QEDs), frequently fail to detect what are often quite serious unintended consequences (UCs) of development programs. This seems surprising, as experienced planners and evaluators are well aware that unintended consequences frequently occur. Most evaluation designs are intended to determine whether there is credible evidence (statistical, theory-based or narrative) that programs have achieved their intended objectives, and the logic of many evaluation designs, even those that are considered the most "rigorous," does not permit the identification of outcomes that were not specified in the program design. We take the example of RCTs, as they are considered by many to be the most rigorous evaluation designs. We present a number of cases to illustrate how infusing RCTs with a mixed-methods approach (sometimes called an "RCT+" design) can strengthen the credibility of these designs and can also capture important unintended consequences. We provide a Mixed Methods Evaluation Framework that identifies 9 ways in which UCs can occur, and we apply this framework to two of the case studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Preliminary Evaluation of Method to Monitor Landfills Resilience against Methane Emission
NASA Astrophysics Data System (ADS)
Chusna, Noor Amalia; Maryono, Maryono
2018-02-01
Methane emissions from landfill sites contribute to global warming, and improper methane treatment can pose an explosion hazard. Stakeholders and city governments in Indonesia have found it significantly difficult to monitor the resilience of landfills against methane emission. Moreover, the management of methane gas has long been a challenging issue for waste management services and operations. Landfills are a significant contributor to anthropogenic methane emissions. This study conducted a preliminary evaluation of methods to manage methane gas emissions by assessing the LandGEM and IPCC methods. From this preliminary evaluation, the study found that the IPCC method is based on the availability of current and historical country-specific data on the waste disposed of in landfills, while LandGEM is an automated tool for estimating emission rates for total landfill gas, accounting for methane, carbon dioxide and other gases. The LandGEM method can be used either with site-specific data to estimate emissions or with default parameters if no site-specific data are available. Both methods could be utilized to monitor methane emissions from landfill sites in the cities of Central Java.
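For reference, LandGEM estimates methane generation with a first-order decay model; a hedged sketch using EPA-style defaults (the k and L0 values are assumed defaults, and the decile indexing follows the published equation only approximately):

    import math

    def landgem_ch4(masses, k=0.05, L0=170.0):
        # First-order decay estimate of annual CH4 generation (m^3/yr):
        #   Q = sum_i sum_j k * L0 * (M_i / 10) * exp(-k * t_ij)
        # masses: waste accepted per year (Mg/yr) for years 1..n;
        # k: decay rate (1/yr); L0: methane generation potential (m^3/Mg).
        # Returns CH4 generated in the year after the last acceptance year.
        n = len(masses)
        q = 0.0
        for i, m in enumerate(masses, start=1):
            for j in range(1, 11):            # tenths of a year, j/10 = 0.1..1.0
                age = (n - i) + j / 10.0      # age of the j-th waste section
                q += k * L0 * (m / 10.0) * math.exp(-k * age)
        return q

    print(round(landgem_ch4([50000.0] * 10)))  # ten years at 50,000 Mg/yr

The IPCC first-order decay approach is structurally similar but is driven by country-specific waste composition and disposal statistics rather than per-site acceptance records, which is the practical difference noted above.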
Liu, Rui-Sang; Jin, Guang-Huai; Xiao, Deng-Rong; Li, Hong-Mei; Bai, Feng-Wu; Tang, Ya-Jie
2015-01-01
Aroma results from the interplay of volatile organic compounds (VOCs), and the attributes of microbially produced aromas are significantly affected by fermentation conditions. Among the VOCs, only a few contribute to the aroma. Thus, screening and identification of the key VOCs is critical for microbial aroma production. The traditional method is based on gas chromatography-olfactometry (GC-O), which is time-consuming and laborious. Taking the Tuber melanosporum fermentation system as an example, a new method to screen and identify the key VOCs by combining an aroma evaluation method with principal component analysis (PCA) was developed in this work. First, an aroma sensory evaluation method was developed to screen 34 potential favorite aroma samples from 504 fermentation samples. Second, PCA was employed to screen nine common key VOCs from these 34 samples. Third, seven key VOCs were identified by the traditional method. Finally, all seven key VOCs identified by the traditional method were also identified, along with four others, by the new strategy. These results indicate the reliability of the new method and demonstrate it to be a viable alternative to the traditional method. PMID:26655663
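A hedged sketch of the PCA screening step described above (the VOC matrix, scaling choice, and number of retained components are hypothetical, not the paper's settings):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    X = rng.lognormal(size=(34, 60))   # 34 favorite-aroma samples x 60 VOC peak areas

    Z = StandardScaler().fit_transform(X)
    pca = PCA(n_components=2).fit(Z)

    # Rank VOCs by their largest absolute loading on the first two components;
    # the top-ranked compounds are candidate key VOCs for GC-O confirmation.
    loadings = np.abs(pca.components_).max(axis=0)
    key_vocs = np.argsort(-loadings)[:9]
    print(key_vocs)

The idea is that VOCs with large loadings drive the variance separating the favorite-aroma samples, so they are the most plausible aroma contributors to verify against the GC-O reference method.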
Marwani, Hadi M; Lowry, Mark; Keating, Patrick; Warner, Isiah M; Cook, Robert L
2007-11-01
This study introduces a newly developed frequency segmentation and recombination method for frequency-domain fluorescence lifetime measurements to address the effects of changing fractional contributions over time and minimize the effects of photobleaching within multi-component systems. Frequency segmentation and recombination experiments were evaluated using a two component system consisting of fluorescein and rhodamine B. Comparison of experimental data collected in traditional and segmented fashion with simulated data, generated using different changing fractional contributions, demonstrated the validity of the technique. Frequency segmentation and recombination was also applied to a more complex system consisting of pyrene with Suwannee River fulvic acid reference and was shown to improve recovered lifetimes and fractional intensity contributions. It was observed that photobleaching in both systems led to errors in recovered lifetimes which can complicate the interpretation of lifetime results. Results showed clear evidence that the frequency segmentation and recombination method reduced errors resulting from a changing fractional contribution in a multi-component system, and allowed photobleaching issues to be addressed by commercially available instrumentation.
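For background, single-frequency phase and modulation data relate to apparent lifetimes through the standard frequency-domain relations (a sketch with hypothetical values; multi-component resolution as in the paper requires a nonlinear fit across many modulation frequencies):

    import math

    def apparent_lifetimes(phase_deg, modulation, freq_hz):
        # Apparent phase and modulation lifetimes from one modulation frequency:
        #   tau_phase = tan(phi) / omega
        #   tau_mod   = sqrt(1/M^2 - 1) / omega
        w = 2 * math.pi * freq_hz
        tau_phase = math.tan(math.radians(phase_deg)) / w
        tau_mod = math.sqrt(1.0 / modulation**2 - 1.0) / w
        return tau_phase, tau_mod

    # Example: 45 degrees phase and M = 0.5 at 80 MHz (hypothetical values)
    print(apparent_lifetimes(45.0, 0.5, 80e6))

For a single-exponential decay the two apparent lifetimes agree; their divergence signals a multi-component system, which is where the segmented acquisition above helps keep the fractional contributions consistent across the frequency sweep.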
Buciński, Adam; Marszałł, Michał Piotr; Krysiński, Jerzy; Lemieszek, Andrzej; Załuski, Jerzy
2010-07-01
Hodgkin's lymphoma is one of the most curable malignancies and most patients achieve a lasting complete remission. In this study, artificial neural network (ANN) analysis was shown to provide significant factors with regard to 5-year recurrence after lymphoma treatment. Data from 114 patients treated for Hodgkin's disease were available for evaluation and comparison. A total of 31 variables were subjected to ANN analysis. The ANN approach as an advanced multivariate data processing method was shown to provide objective prognostic data. Some of these prognostic factors are consistent or even identical to the factors evaluated earlier by other statistical methods.
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
USDA-ARS?s Scientific Manuscript database
While T cell contribution to IAV immunity is appreciated, data comparing methods to evaluate IFN-gamma production by IAV-specific T cells elicited following vaccination is limited. To understand the differential immunogenicity between live-attenuated influenza virus (LAIV) and whole-inactivated viru...
Peer Instruction: An Evaluation of Its Theory, Application, and Contribution
ERIC Educational Resources Information Center
Gok, Tolga; Gok, Ozge
2017-01-01
Many qualitative and quantitative studies performed on peer instruction based on interactive engagement method used in many different disciplines and courses were reviewed in the present study. The researchers examined the effects of peer instruction on students' cognitive skills (conceptual learning, problem solving, reasoning ability, etc.) and…
Team Testing for Individual Success
ERIC Educational Resources Information Center
Hurren, B. Lee; Rutledge, Matt; Garvin, Amanda Burcham
2006-01-01
Why do creative teachers who want to help all their students learn in meaningful ways have to use high-pressure testing methods that work against that goal? The authors propose a system of testing that serves the need for evaluation while contributing to students' intellectual and social growth. (Contains 7 endnotes.)
The contribution of fecal pollution from dogs in urbanized areas can be significant and is an often underestimated problem. Microbial source tracking methods (MST) utilizing quantitative PCR of dog-associated gene sequences encoding 16S rRNA of Bacteroidales are a useful tool to ...
The Windsor, Ontario Exposure Assessment Study evaluated the contribution of ambient air pollutants to personal and indoor exposures of adults and asthmatic children living in Windsor, Ontario, Canada. In addition, the role of personal, indoor, and outdoor air pollution exposures...
A model-based approach for the evaluation of vagal and sympathetic activities in a newborn lamb.
Le Rolle, Virginie; Ojeda, David; Beuchée, Alain; Praud, Jean-Paul; Pladys, Patrick; Hernández, Alfredo I
2013-01-01
This paper proposes a baroreflex model and a recursive identification method to estimate the time-varying vagal and sympathetic contributions to heart rate variability during autonomic maneuvers. The baroreflex model includes baroreceptors, the cardiovascular control center, and the parasympathetic and sympathetic pathways. The gains of the global afferent sympathetic and vagal pathways are identified recursively. The method has been validated on data from newborn lambs, acquired during the application of an autonomic maneuver, both without medication and under beta-blockers. Results show a close match between experimental and simulated signals under both conditions. The vagal and sympathetic contributions have been simulated and, as expected, it is possible to observe different baroreflex responses under beta-blockers compared to baseline conditions.
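Recursive identification of time-varying gains of this kind can be sketched with standard recursive least squares with a forgetting factor (the paper's exact estimator and regressors are not specified in the abstract; this is a generic illustration with made-up data):

    import numpy as np

    def rls_step(theta, P, x, y, lam=0.98):
        # One recursive least-squares update with forgetting factor lam.
        # theta: current parameter estimates (e.g., vagal and sympathetic gains);
        # x: regressor vector (e.g., model-predicted pathway outputs);
        # y: measured output (e.g., beat-to-beat RR interval).
        Px = P @ x
        k = Px / (lam + x @ Px)             # gain vector
        theta = theta + k * (y - x @ theta)
        P = (P - np.outer(k, Px)) / lam     # covariance update
        return theta, P

    theta = np.zeros(2)                     # [vagal gain, sympathetic gain]
    P = np.eye(2) * 100.0
    rng = np.random.default_rng(4)
    for _ in range(200):                    # stream of beat-by-beat data
        x = rng.normal(size=2)
        y = x @ np.array([1.5, -0.7]) + 0.05 * rng.normal()
        theta, P = rls_step(theta, P, x, y)
    print(theta)                            # converges near [1.5, -0.7]

The forgetting factor lam < 1 discounts old beats, which is what lets the estimated gains track the time-varying autonomic balance during a maneuver.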
Burstyn, Igor; De Roos, Anneclaire J
2016-12-22
We address a methodological issue of the evaluation of the difference in effects in epidemiological studies that may arise, for example, from stratum-specific analyses or differences in analytical decisions during data analysis. We propose a new simulation-based method to quantify the plausible extent of such heterogeneity, rather than testing a hypothesis about its existence. We examine the contribution of the method to the debate surrounding risk of multiple myeloma and glyphosate use and propose that its application contributes to a more balanced weighting of evidence.
Ortíz, Miguel A; Felizzola, Heriberto A; Nieto Isaza, Santiago
2015-01-01
The project selection process is a crucial step for healthcare organizations when implementing six sigma programs in both administrative and care processes. However, six sigma project selection is often defined as a decision-making process with interaction and feedback between criteria, so it is necessary to explore different methods to help healthcare companies determine the six sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network Process) and DEMATEL (Decision Making Trial and Evaluation Laboratory)-ANP in a public medical centre to establish the most suitable six sigma project; finally, these methods were compared to evaluate their performance in the decision-making process. ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed of 3 strategies, 4 criteria and 15 sub-criteria. Judgement matrixes were completed by the six sigma team, whose participants worked in different departments of the medical centre. The improvement of care opportunity in obstetric outpatients was selected as the most suitable six sigma project, with a score of 0.117 as its contribution to the organization's goals. DEMATEL-ANP performed better in the decision-making process since it reduced the error probability due to interactions and feedback. ANP and DEMATEL-ANP effectively supported six sigma project selection, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL-ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in the healthcare sector, accompanied by adequate identification of the evaluation criteria that support the decision-making model. Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately in six sigma project selection processes in healthcare.
Ajslev, Jeppe; Brandt, Mikkel; Møller, Jeppe Lykke; Skals, Sebastian; Vinstrup, Jonas; Jakobsen, Markus Due; Sundstrup, Emil; Madeleine, Pascal; Andersen, Lars Louis
2016-05-26
Previous research has shown that reducing physical workload among workers in the construction industry is complicated. In order to address this issue, we developed a process evaluation in a formative mixed-methods design, drawing on existing knowledge of the potential barriers for implementation. We present the design of a mixed-methods process evaluation of the organizational, social, and subjective practices that play roles in the intervention study, integrating technical measurements to detect excessive physical exertion measured with electromyography and accelerometers, video documentation of working tasks, and a 3-phased workshop program. The evaluation is designed in an adapted process evaluation framework, addressing recruitment, reach, fidelity, satisfaction, intervention delivery, intervention received, and context of the intervention companies. Observational studies, interviews, and questionnaires among 80 construction workers organized in 20 work gangs, as well as health and safety staff, contribute to the creation of knowledge about these phenomena. At the time of publication, the process of participant recruitment is underway. Intervention studies are challenging to conduct and evaluate in the construction industry, often because of narrow time frames and ever-changing contexts. The mixed-methods design presents opportunities for obtaining detailed knowledge of the practices intra-acting with the intervention, while offering the opportunity to customize parts of the intervention.
NASA Astrophysics Data System (ADS)
Alexandrou, Constantia; Constantinou, Martha; Hadjiyiannakou, Kyriakos; Jansen, Karl; Kallidonis, Christos; Koutsou, Giannis; Vaquero Avilés-Casco, Alejandro
2018-03-01
We present results on the isovector and isoscalar nucleon axial form factors including disconnected contributions, using an ensemble of Nf = 2 twisted mass clover-improved Wilson fermions simulated with approximately the physical value of the pion mass. The light disconnected quark loops are computed using exact deflation, while the strange and the charm quark loops are evaluated using the truncated solver method. Techniques such as the summation and the two-state fits have been employed to assess ground-state dominance.
[Medicine in Brazil today: education and practice].
Gonçalves, E L
1990-01-01
The present situation of medical education and medical practice in Brazil is analyzed, and the scientific-technological impact on medical practice is studied in both its diagnostic and therapeutic aspects. The influence of scientific methods on medical education, specifically Flexner's contribution, is evaluated. In recent years, Flexner's propositions have been called into question, particularly because of important contributions from psychology, anthropology and sociology to a better knowledge of human nature. Therefore, many curricular alternatives have been proposed, aiming at a medical education that would favor the formation of a well-poised personality and a critical intelligence.
Aguilar-Arredondo, Andrea; Arias, Clorinda; Zepeda, Angélica
2015-01-01
Hippocampal neurogenesis occurs in the adult brain in various species, including humans. A compelling question that arose when neurogenesis was accepted to occur in the adult dentate gyrus (DG) is whether new neurons become functionally relevant over time, which is key for interpreting their potential contributions to synaptic circuitry. The functional state of adult-born neurons has been evaluated using various methodological approaches, which have, in turn, yielded seemingly conflicting results regarding the timing of maturation and functional integration. Here, we review the contributions of different methodological approaches to addressing the maturation process of adult-born neurons and their functional state, discussing the contributions and limitations of each method. We aim to provide a framework for interpreting results based on the approaches currently used in neuroscience for evaluating functional integration. As shown by the experimental evidence, adult-born neurons are prone to respond from early stages, even when they are not yet fully integrated into circuits. The ongoing integration process for the newborn neurons is characterised by different features. However, they may contribute differently to the network depending on their maturation stage. When combined, the strategies used to date convey a comprehensive view of the functional development of newly born neurons while providing a framework for approaching the critical time at which new neurons become functionally integrated and influence brain function.
Evaluation of nearest-neighbor methods for detection of chimeric small-subunit rRNA sequences
NASA Technical Reports Server (NTRS)
Robison-Cox, J. F.; Bateson, M. M.; Ward, D. M.
1995-01-01
Detection of chimeric artifacts formed when PCR is used to retrieve naturally occurring small-subunit (SSU) rRNA sequences may rely on demonstrating that different sequence domains have different phylogenetic affiliations. We evaluated the CHECK_CHIMERA method of the Ribosomal Database Project and another method which we developed, both based on determining nearest neighbors of different sequence domains, for their ability to discern artificially generated SSU rRNA chimeras from authentic Ribosomal Database Project sequences. The reliability of both methods decreases when the parental sequences which contribute to chimera formation are more than 82 to 84% similar. Detection is also complicated by the occurrence of authentic SSU rRNA sequences that behave like chimeras. We developed a naive statistical test based on CHECK_CHIMERA output and used it to evaluate previously reported SSU rRNA chimeras. Application of this test also suggests that chimeras might be formed by retrieving SSU rRNAs as cDNA. The amount of uncertainty associated with nearest-neighbor analyses indicates that such tests alone are insufficient and that better methods are needed.
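As a loose illustration of the nearest-neighbor idea, the following hedged sketch splits a query sequence into 5' and 3' fragments, assigns each fragment its closest reference by shared k-mer fraction, and flags the query when the two neighbors disagree. This is not the CHECK_CHIMERA code; the k-mer scoring, toy reference set, and function names are assumptions for illustration.

```python
# Hedged sketch of a nearest-neighbor chimera screen (not the CHECK_CHIMERA code).

def kmers(seq, k=8):
    """Return the set of k-mers in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def nearest_reference(fragment, references, k=8):
    """Name of the reference sharing the largest k-mer fraction with fragment."""
    frag = kmers(fragment, k)
    def score(ref_seq):
        ref = kmers(ref_seq, k)
        return len(frag & ref) / max(len(frag), 1)
    return max(references, key=lambda name: score(references[name]))

def looks_chimeric(query, references, k=8):
    """Flag a query whose 5' and 3' halves have different nearest neighbors."""
    mid = len(query) // 2
    left = nearest_reference(query[:mid], references, k)
    right = nearest_reference(query[mid:], references, k)
    return left != right, (left, right)

refs = {"A": "ACGT" * 60, "B": "TTGACCA" * 35}   # toy reference 'database'
query = refs["A"][:120] + refs["B"][120:240]     # artificial chimera
print(looks_chimeric(query, refs))               # (True, ('A', 'B'))
```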
The execution of systematic measurements on plane cascades
NASA Technical Reports Server (NTRS)
Scholz, N.
1978-01-01
The present state of development of the experimental technique regarding flow through cascades, and several points to be specially observed in the design of cascade wind tunnels, were discussed. The equations required for the evaluation of momentum measurements in two-dimensional flow through cascades were developed. Regarding the effect of the jet contraction due to the boundary layer along the side walls, a simple correction method was also given in order to obtain two-dimensional flow characteristics. Also given were the equations for the evaluation of the pressure distribution measurements. Another contribution was made regarding the presentation of the test results in the form of nondimensional quantities. The results of systematic measurements of cascades with symmetrical aerofoils were reported, and the suggested method was applied to the evaluation of the measurements.
Laner, David; Rechberger, Helmut
2009-02-01
Waste prevention is a principal means of achieving the goals of waste management and a key element for developing sustainable economies. Small and medium-sized enterprises (SMEs) contribute substantially to environmental degradation, often without even being aware of their environmental effects. Therefore, several initiatives have been launched in Austria aimed at supporting waste prevention measures at the level of SMEs. To promote the most efficient projects, they have to be evaluated with respect to their contribution to the goals of waste management. The aim of this paper is to develop a methodology for evaluating waste prevention measures in SMEs based on their goal orientation. First, conceptual problems of defining and delineating waste prevention activities are briefly discussed. Then an approach to evaluating waste prevention activities with respect to their environmental performance is presented, and benchmarks which allow for an efficient use of the available funds are developed. Finally, the evaluation method is applied to a number of former projects, and the calculated results are analysed with respect to shortcomings and limitations of the model. It is found that the developed methodology can provide a tool for a more objective and comprehensible evaluation of waste prevention measures.
Application of modified extended method in CREAM for safety inspector in coal mines
NASA Astrophysics Data System (ADS)
Wang, Jinhe; Zhang, Xiaohong; Zeng, Jianchao
2018-01-01
Safety inspectors often perform their duties in circumstances that contribute to the occurrence of human failures. Therefore, this paper aims at quantifying the human failure probability (HFP) of safety inspectors during coal mine operation with the cognitive reliability and error analysis method (CREAM). However, this approach has some shortcomings: it lacks consideration of the applicability of the common performance conditions (CPCs), and the subjectivity of evaluating CPC levels weakens the accuracy of the quantitative prediction results. A modified extended method in CREAM which is able to address these difficulties with a CPC framework table is proposed, and the proposed methodology is demonstrated by virtue of a coal-mine accident example. The results are expected to be useful in predicting the HFP of safety inspectors and to contribute to the enhancement of coal mine safety.
A new method to evaluate image quality of CBCT images quantitatively without observers
Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori
2017-01-01
Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying the just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes of different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratios (CNRs) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
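A hedged sketch of the CNR computation and the ≥1.5 detectability cut described above; the ROI geometry, noise levels, and array names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def cnr(image, hole_mask, background_mask):
    """Contrast-to-noise ratio of a hole ROI against the background."""
    signal = image[hole_mask].mean()
    background = image[background_mask].mean()
    noise = image[background_mask].std()
    return abs(signal - background) / noise

rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (64, 64))          # synthetic background
img[20:30, 20:30] += 12.0                       # simulated hole
hole = np.zeros_like(img, dtype=bool); hole[20:30, 20:30] = True
bg = np.zeros_like(img, dtype=bool); bg[40:60, 40:60] = True

value = cnr(img, hole, bg)
print(f"CNR = {value:.2f}, detectable: {value >= 1.5}")
```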
Methods for Evaluating the Performance and Human Stress-Factors of Percussive Riveting
NASA Astrophysics Data System (ADS)
Ahn, Jonathan Y.
The aerospace industry automates portions of its manufacturing and assembly processes. However, mechanics remain vital to production, especially in areas where automated machines cannot fit or have yet to match the quality of human craftsmanship. One such task is percussive riveting. Because percussive riveting is associated with a high risk of injury, these tools must be certified prior to release. The major contribution of this thesis is to develop a test bench capable of percussive riveting for ergonomic evaluation purposes. The major issues investigated are: (i) automating the tool evaluation method so that it is repeatable; (ii) demonstrating the use of displacement and force sensors; and (iii) correlating the performance and risk exposure of percussive tools. A test bench equipped with servomotors and pneumatic cylinders to control the xyz-positions of a rivet gun and a bucking bar simultaneously is used to explore this evaluation approach.
Nelson, Scott D; Parker, Jaqui; Lario, Robert; Winnenburg, Rainer; Erlbaum, Mark S.; Lincoln, Michael J.; Bodenreider, Olivier
2018-01-01
Interoperability among medication classification systems is known to be limited. We investigated the mapping of the Established Pharmacologic Classes (EPCs) to SNOMED CT. We compared lexical and instance-based methods to an expert-reviewed reference standard to evaluate contributions of these methods. Of the 543 EPCs, 284 had an equivalent SNOMED CT class, 205 were more specific, and 54 could not be mapped. Precision, recall, and F1 score were 0.416, 0.620, and 0.498 for lexical mapping and 0.616, 0.504, and 0.554 for instance-based mapping. Each automatic method has strengths, weaknesses, and unique contributions in mapping between medication classification systems. In our experience, it was beneficial to consider the mapping provided by both automated methods for identifying potential matches, gaps, inconsistencies, and opportunities for quality improvement between classifications. However, manual review by subject matter experts is still needed to select the most relevant mappings. PMID:29295234
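The precision/recall/F1 comparison is straightforward to reproduce; a minimal sketch, assuming each automatic method and the expert-reviewed reference standard are represented as sets of (EPC, SNOMED CT) pairs (the identifiers below are fabricated, not real codes):

```python
def prf1(proposed, reference):
    """Precision, recall, and F1 of a proposed mapping against a gold standard."""
    tp = len(proposed & reference)
    precision = tp / len(proposed) if proposed else 0.0
    recall = tp / len(reference) if reference else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

lexical = {("EPC:1", "SCT:10"), ("EPC:2", "SCT:20"), ("EPC:3", "SCT:99")}
gold = {("EPC:1", "SCT:10"), ("EPC:2", "SCT:20"), ("EPC:4", "SCT:40")}
print("P=%.3f R=%.3f F1=%.3f" % prf1(lexical, gold))
```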
Cooper, Chris; Lovell, Rebecca; Husk, Kerryn; Booth, Andrew; Garside, Ruth
2018-06-01
We undertook a systematic review to evaluate the health benefits of environmental enhancement and conservation activities. We were concerned that a conventional process of study identification, focusing on exhaustive searches of bibliographic databases as the primary search method, would be ineffective, offering limited value. The focus of this study is comparing study identification methods. We compare (1) an approach led by searches of bibliographic databases with (2) an approach led by supplementary search methods. We retrospectively assessed the effectiveness and value of both approaches. Effectiveness was determined by comparing (1) the total number of studies identified and screened and (2) the number of includable studies uniquely identified by each approach. Value was determined by comparing included study quality and by using qualitative sensitivity analysis to explore the contribution of studies to the synthesis. The bibliographic databases approach identified 21 409 studies to screen and 2 included qualitative studies were uniquely identified. Study quality was moderate, and contribution to the synthesis was minimal. The supplementary search approach identified 453 studies to screen and 9 included studies were uniquely identified. Four quantitative studies were poor quality but made a substantive contribution to the synthesis; 5 studies were qualitative: 3 studies were good quality, one was moderate quality, and 1 study was excluded from the synthesis due to poor quality. All 4 included qualitative studies made significant contributions to the synthesis. This case study found value in aligning primary methods of study identification to maximise location of relevant evidence. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Squizzato, Stefania; Masiol, Mauro
2015-10-01
The air quality is influenced by the potential effects of meteorology at meso- and synoptic scales. While local weather and mixing layer dynamics mainly drive the dispersion of sources at small scales, long-range transports affect the movements of air masses over regional, transboundary and even continental scales. Long-range transport may advect polluted air masses from hot-spots by increasing the levels of pollution at nearby or remote locations or may further raise air pollution levels where external air masses originate from other hot-spots. Therefore, the knowledge of ground-wind circulation and potential long-range transports is fundamental not only to evaluate how local or external sources may affect the air quality at a receptor site but also to quantify it. This review is focussed on establishing the relationships among PM2.5 sources, meteorological conditions and air mass origin in the Po Valley, which is one of the most polluted areas in Europe. We have chosen the results from a recent study carried out in Venice (Eastern Po Valley) and have analysed them using different statistical approaches to understand the influence of external and local contributions of PM2.5 sources. External contributions were evaluated by applying Trajectory Statistical Methods (TSMs) based on back-trajectory analysis including (i) back-trajectories cluster analysis, (ii) potential source contribution function (PSCF) and (iii) concentration weighted trajectory (CWT). Furthermore, the relationships between the source contributions and ground-wind circulation patterns were investigated by using (iv) cluster analysis on wind data and (v) conditional probability function (CPF). Finally, local source contributions have been estimated by applying the Lenschow approach. In summary, the integrated approach of different techniques has successfully identified both local and external sources of particulate matter pollution in a European hot-spot affected by the worst air quality.
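As an illustration of one of the trajectory statistical methods named above, here is a minimal PSCF sketch: for each grid cell, the fraction of back-trajectory endpoints that belong to "polluted" receptor days (e.g. concentration above the 75th percentile). The grid resolution, pollution criterion, and synthetic endpoints are assumptions, not the Venice data:

```python
import numpy as np

def pscf(lats, lons, polluted, grid=1.0):
    """lats/lons: endpoint coordinates; polluted: bool flag per endpoint."""
    cells_all, cells_pol = {}, {}
    for lat, lon, p in zip(lats, lons, polluted):
        key = (round(lat / grid), round(lon / grid))   # grid cell index
        cells_all[key] = cells_all.get(key, 0) + 1
        if p:
            cells_pol[key] = cells_pol.get(key, 0) + 1
    return {k: cells_pol.get(k, 0) / n for k, n in cells_all.items()}

rng = np.random.default_rng(1)
lat = rng.uniform(44, 47, 1000)
lon = rng.uniform(10, 13, 1000)
flag = rng.random(1000) < 0.25                    # 25% 'polluted' endpoints
ratios = pscf(lat, lon, flag)
print(max(ratios.values()))                       # highest cell ratio
```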
Cvetkovic, Dean
2013-01-01
The Cooperative Learning in Engineering Design curriculum can be enhanced with structured and timely self and peer assessment teaching methodologies, which can easily be applied to any Biomedical Engineering curriculum. A study was designed and implemented to evaluate the effectiveness of this structured and timely self and peer assessment on student team-based projects. In comparing the 'peer-blind' and 'face-to-face' Fair Contribution Scoring (FCS) methods, both had advantages and disadvantages. The 'peer-blind' self and peer assessment method caused high discrepancy between self and team ratings. The 'face-to-face' method, on the other hand, did not have the discrepancy issue and proved to be more accurate and effective, indicating team cohesiveness and good cooperative learning.
Evaluating Graduate Education and Transcending Biases in Music Teachers' Professional Development
ERIC Educational Resources Information Center
Laor, Lia
2015-01-01
Research concerning professional development and its contribution to the formation of professional identity is prevalent in both general and music education. However, its implications for music educators in the context of graduate programs for music education are seldom discussed. This mixed-methods case study examined experienced music teachers'…
ERIC Educational Resources Information Center
Suter, Larry E.
2017-01-01
The international comparative studies in 1959 were conducted by International Association for the Evaluation of Educational Achievement (IEA) researchers who recognized that differences in student achievement measures in mathematics across countries could be caused by differences in curricula. The measurements of opportunity to learn (OTL) grew…
Water quality and nitrogen mass loss from anaerobic lagoon columns receiving pretreated influent
USDA-ARS?s Scientific Manuscript database
Control methods are needed to abate ammonia losses from swine anaerobic lagoons to reduce contribution of confined swine operations to air pollution. In a 15-month meso-scale column study, we evaluated the effect of manure pretreatment on water quality, reduction of N losses, and sludge accumulation...
The Role of Education and Rehabilitation Specialists in the Comprehensive Low Vision Care Process.
ERIC Educational Resources Information Center
Lueck, A. H.
1997-01-01
Outlines the contributions of education and rehabilitation specialists in maximizing specific skills, self-esteem, and quality of life of individuals with low vision. The role of these specialists in evaluating functional vision, teaching methods to compensate for impaired vision, and addressing psychosocial concerns are discussed. (Author/CR)
ERIC Educational Resources Information Center
Smith, Kimberly G.; Fogerty, Daniel
2015-01-01
Purpose: This study evaluated the extent to which partial spoken or written information facilitates sentence recognition under degraded unimodal and multimodal conditions. Method: Twenty young adults with typical hearing completed sentence recognition tasks in unimodal and multimodal conditions across 3 proportions of preservation. In the unimodal…
ERIC Educational Resources Information Center
Holt, Rachael Frush; Beer, Jessica; Kronenberger, William G.; Pisoni, David B.; Lalonde, Kaylah
2012-01-01
Purpose: To evaluate the family environments of children with cochlear implants and to examine relationships between family environment and postimplant language development and executive function. Method: Forty-five families of children with cochlear implants completed a self-report family environment questionnaire (Family Environment Scale-Fourth…
ERIC Educational Resources Information Center
Craig, Holly K.; Zhang, Lingling; Hensel, Stephanie L.; Quinn, Erin J.
2009-01-01
Purpose: In this study, the authors evaluated the contribution made by dialect shifting to reading achievement test scores of African American English (AAE)-speaking students when controlling for the effects of socioeconomic status (SES), general oral language abilities, and writing skills. Method: Participants were 165 typically developing…
Polycyclic aromatic hydrocarbons (PAHs) are a class of ubiquitous, anthropogenic chemicals found in the environment. In the present study, computational methods are used to evaluate their potential estrogenicity and the contribution chemicals in this class make to environmental e...
Relation of Executive Functioning to Pragmatic Outcome following Severe Traumatic Brain Injury
ERIC Educational Resources Information Center
Douglas, Jacinta M.
2010-01-01
Purpose: This study was designed to explore the behavioral nature of pragmatic impairment following severe traumatic brain injury (TBI) and to evaluate the contribution of executive skills to the experience of pragmatic difficulties after TBI. Method: Participants were grouped into 43 TBI dyads (TBI adults and close relatives) and 43 control…
A framework for evaluating disciplinary contributions to river restoration
G. E. Grant
2008-01-01
As river restoration has matured into a global-scale intervention in rivers, a broader range of technical disciplines are informing restoration goals, strategies, approaches, and methods. The ecological, geomorphological, hydrological, and engineering sciences each bring a distinct focus and set of perspectives and tools, and are themselves embedded in a larger context...
The Relationship between Emotional Intelligence and Student Teacher Performance
ERIC Educational Resources Information Center
Drew, Todd L.
2006-01-01
The purpose of this mixed methods study (N = 40) was to determine whether Student Teacher Performance (STP), as measured by a behavior-based performance evaluation process, is associated with Emotional Intelligence (EI), as measured by a personality assessment instrument. The study is an important contribution to the literature in that it appears…
Getting Good Results from Survey Research: Part III
ERIC Educational Resources Information Center
McNamara, James F.
2004-01-01
This article is the third contribution to a research methods series dedicated to getting good results from survey research. In this series, "good results" is a stenographic term used to define surveys that yield accurate and meaningful information that decision makers can use with confidence when conducting program evaluation and policy assessment…
Contributions of cultural services to the ecosystem services agenda
Daniel, Terry C.; Muhar, Andreas; Arnberger, Arne; Aznar, Olivier; Boyd, James W.; Chan, Kai M. A.; Costanza, Robert; Elmqvist, Thomas; Flint, Courtney G.; Gobster, Paul H.; Grêt-Regamey, Adrienne; Lave, Rebecca; Muhar, Susanne; Penker, Marianne; Ribe, Robert G.; Schauppenlehner, Thomas; Sikor, Thomas; Soloviy, Ihor; Spierenburg, Marja; Taczanowska, Karolina; Tam, Jordan; von der Dunk, Andreas
2012-01-01
Cultural ecosystem services (ES) are consistently recognized but not yet adequately defined or integrated within the ES framework. A substantial body of models, methods, and data relevant to cultural services has been developed within the social and behavioral sciences before and outside of the ES approach. A selective review of work in landscape aesthetics, cultural heritage, outdoor recreation, and spiritual significance demonstrates opportunities for operationally defining cultural services in terms of socioecological models, consistent with the larger set of ES. Such models explicitly link ecological structures and functions with cultural values and benefits, facilitating communication between scientists and stakeholders and enabling economic, multicriterion, deliberative evaluation and other methods that can clarify tradeoffs and synergies involving cultural ES. Based on this approach, a common representation is offered that frames cultural services, along with all ES, by the relative contribution of relevant ecological structures and functions and by applicable social evaluation approaches. This perspective provides a foundation for merging ecological and social science epistemologies to define and integrate cultural services better within the broader ES framework. PMID:22615401
Taguchi Method Applied in Optimization of Shipley SJR 5740 Positive Resist Deposition
NASA Technical Reports Server (NTRS)
Hui, A.; Blosiu, J. O.; Wiberg, D. V.
1998-01-01
Taguchi Methods of Robust Design present a way to optimize output process performance through an organized set of experiments using orthogonal arrays. Analysis of variance and signal-to-noise ratios are used to evaluate the contribution of each of the controllable process parameters to the realization of the process optimization. In the photoresist deposition process, there are numerous controllable parameters that can affect the surface quality and thickness of the final photoresist layer.
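A minimal sketch of the Taguchi workflow named above: larger-is-better signal-to-noise ratios over an L4 orthogonal array, and each factor's percent contribution from its sum of squares. The array assignment, factor count, and responses are toy values, not the Shipley SJR 5740 experiments:

```python
import numpy as np

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 0/1).
L4 = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]])
y = np.array([[8.1, 8.3], [7.2, 7.0], [9.0, 9.2], [6.5, 6.4]])  # 2 replicates/run

sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))      # larger-is-better S/N per run
grand = sn.mean()
ss = []
for f in range(L4.shape[1]):
    means = [sn[L4[:, f] == lvl].mean() for lvl in (0, 1)]
    ss.append(sum(2 * (m - grand) ** 2 for m in means))  # 2 runs per level
contrib = 100 * np.array(ss) / sum(ss)                   # % of (saturated) total SS
print({f"factor{f+1}": f"{c:.1f}%" for f, c in enumerate(contrib)})
```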
Evaluation of forest decontamination using radiometric measurements.
Cresswell, Alan J; Kato, Hiroaki; Onda, Yuichi; Nanba, Kenji
2016-11-01
An experiment has been conducted to evaluate the additional dose reduction by clear felling contaminated forestry in Fukushima Prefecture, Japan, and using the timber to cover the areas with wood chips. A portable gamma spectrometry system, comprising a backpack containing a 3 × 3″ NaI(Tl) detector with digital spectrometer and GPS receiver, has been used to map dose rate and radionuclide activity concentrations before, after and at stages during this experiment. The data show the effect of the different stages of the experiment on dose rate at different locations around the site. The spectrometric data have allowed the assessment of the contributions of natural and anthropogenic radionuclides to the dose rate at different parts of the site before and after the experiment. This has clearly demonstrated the value of radiometric methods in evaluating remediation, and the effect of other environmental processes. The value of spectrometric methods which directly measure radionuclide concentrations has also been shown, especially through the identification of the contribution of natural and anthropogenic activity to the measured dose rate. The experiment has shown that clearing trees and applying wood chips can reduce dose rates by 10-15% beyond that achieved by just clearing the forest litter and natural redistribution of radiocaesium. Copyright © 2016 Elsevier Ltd. All rights reserved.
Langston, Anne; Weiss, Jennifer; Landegger, Justine; Pullum, Thomas; Morrow, Melanie; Kabadege, Melene; Mugeni, Catherine; Sarriot, Eric
2014-01-01
Background: The Kabeho Mwana project (2006–2011) supported the Rwanda Ministry of Health (MOH) in scaling up integrated community case management (iCCM) of childhood illness in 6 of Rwanda's 30 districts. The project trained and equipped community health workers (CHWs) according to national guidelines. In project districts, Kabeho Mwana staff also trained CHWs to conduct household-level health promotion and established supervision and reporting mechanisms through CHW peer support groups (PSGs) and quality improvement systems. Methods: The 2005 and 2010 Demographic and Health Surveys were re-analyzed to evaluate how project and non-project districts differed in terms of care-seeking for fever, diarrhea, and acute respiratory infection symptoms and related indicators. We developed a logit regression model, controlling for the timing of the first CHW training, with the district included as a fixed categorical effect. We also analyzed qualitative data from the final evaluation to examine factors that may have contributed to improved outcomes. Results: While there was notable improvement in care-seeking across all districts, care-seeking from any provider for each of the 3 conditions, and for all 3 combined, increased significantly more in the project districts. CHWs contributed a larger percentage of consultations in project districts (27%) than in non-project districts (12%). Qualitative data suggested that the PSG model was a valuable sub-level of CHW organization associated with improved CHW performance, supervision, and social capital. Conclusions: The iCCM model implemented by Kabeho Mwana resulted in greater improvements in care-seeking than those seen in the rest of the country. Intensive monitoring, collaborative supervision, community mobilization, and CHW PSGs contributed to this success. The PSGs were a unique contribution of the project, playing a critical role in improving care-seeking in project districts. Effective implementation of iCCM should therefore include CHW management and social support mechanisms. Finally, re-analysis of national survey data improved evaluation findings by providing impact estimates. PMID:25276593
NASA Astrophysics Data System (ADS)
Leuchter, S.; Reinert, F.; Müller, W.
2014-06-01
Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution, an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
Development and Evaluation of an Ontology for Guiding Appropriate Antibiotic Prescribing
Furuya, E. Yoko; Kuperman, Gilad J.; Cimino, James J.; Bakken, Suzanne
2011-01-01
Objectives To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. Methods We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. Results The ontology includes 199 classes, 10 properties, and 1,636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: 1) antibiotic-microorganism mismatch alert; 2) medication-allergy alert; and 3) non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. Conclusions This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component—a need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. PMID:22019377
Software Reuse Methods to Improve Technological Infrastructure for e-Science
NASA Technical Reports Server (NTRS)
Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.
2011-01-01
Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.
Social and spatial processes associated with childhood diarrheal disease in Matlab, Bangladesh.
Perez-Heydrich, Carolina; Furgurson, Jill M; Giebultowicz, Sophia; Winston, Jennifer J; Yunus, Mohammad; Streatfield, Peter Kim; Emch, Michael
2013-01-01
We develop novel methods for conceptualizing geographic space and social networks to evaluate their respective and combined contributions to childhood diarrheal incidence. After defining maternal networks according to direct familial linkages between females, and road networks using satellite imagery of the study area, we use a spatial econometrics model to evaluate the significance of correlation terms relating childhood diarrheal incidence to the incidence observed within respective networks. Disease was significantly clustered within road networks across time, but only inconsistently correlated within maternal networks. These methods could be widely applied to systems in which both social and spatial processes jointly influence health outcomes. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Saltos, Andrea
In efforts to perform accurate dosimetry, Oakes et al. [Nucl. Instrum. Methods (2013)] introduced a new portable solid state neutron rem meter based on an adaptation of the Bonner sphere and the position-sensitive long counter. The system utilizes high thermal efficiency neutron detectors to generate a linear combination of measurement signals that are used to estimate the incident neutron spectra. The inversion problem of deducing dose from the counts in individual detector elements is addressed by applying a cross-correlation method, which allows estimation of dose with average errors of less than 15%. In this work, the evaluation of the performance of this system was extended to take into account new correlation techniques and the neutron scattering contribution. To test the effectiveness of the correlations, the distance correlation, the Pearson product-moment correlation, and their weighted versions were computed between measured spatial detector responses obtained from nine different test spectra and the spatial responses of library functions generated by MCNPX. Results indicate that there is no advantage to using the distance correlation over the Pearson correlation, and that weighted versions of these correlations do not improve their performance in evaluating dose. Both correlations were shown to work well even at low integrated doses measured over short periods of time. To evaluate the contribution of room-return neutrons to the dosimeter response, MCNPX was used to simulate dosimeter responses for five isotropic neutron sources placed inside rectangular concrete rooms of different sizes. Results show that the contribution of scattered neutrons to the response of the dosimeter can be significant, so that for most cases the dose is overpredicted, with errors as large as 500%. A possible method to correct for the contribution of room-return neutrons is also assessed and can be used as a good initial estimate of how to approach the problem.
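A hedged sketch of the correlation step: compare a measured spatial detector response against a library of simulated responses using the Pearson correlation and adopt the dose of the best match. The library entries, response vectors, and dose values below are fabricated for illustration, not MCNPX output:

```python
import numpy as np

def best_match(measured, library):
    """Return the library key whose response correlates best with 'measured'."""
    scores = {name: np.corrcoef(measured, resp)[0, 1]
              for name, (resp, _dose) in library.items()}
    return max(scores, key=scores.get), scores

library = {
    "Cf-252": (np.array([1.0, 0.8, 0.5, 0.3, 0.1]), 2.4),   # (response, dose)
    "AmBe":   (np.array([0.6, 0.9, 1.0, 0.7, 0.4]), 3.1),
}
measured = np.array([0.58, 0.91, 0.97, 0.72, 0.41])
name, scores = best_match(measured, library)
print(name, library[name][1], scores)   # best spectrum, its dose, all scores
```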
[Industry regulation and its relationship to the rapid marketing of medical devices].
Matsuoka, Atsuko
2012-01-01
In the market of medical devices, non-Japanese products hold a large part even in Japan. To overcome this situation, the Japanese government has been announcing policies to encourage the medical devices industry, such as the 5-year strategy for medical innovation (June 6, 2012). The Division of Medical Devices has been contributing to rapid marketing of medical devices by working out the standards for approval review and accreditation of medical devices, guidances on evaluation of medical devices with emerging technology, and test methods for biological safety evaluation of medical devices, as a part of practice in the field of regulatory science. The recent outcomes are 822 standards of accreditation for Class II medical devices, 14 guidances on safety evaluation of medical devices with emerging technology, and the revised test methods for biological safety evaluation (MHLW Notification by Director, OMDE, Yakushokuki-hatsu 0301 No. 20 "Basic Principles of Biological Safety Evaluation Required for Application for Approval to Market Medical Devices").
NASA Astrophysics Data System (ADS)
Allard, Alexandre; Fischer, Nicolas
2018-06-01
Sensitivity analysis associated with the evaluation of measurement uncertainty is a very important tool for the metrologist, enabling them to provide an uncertainty budget and to gain a better understanding of the measurand and the underlying measurement process. Using the GUM uncertainty framework, the contribution of an input quantity to the variance of the output quantity is obtained through so-called ‘sensitivity coefficients’. In contrast, such coefficients are no longer computed in cases where a Monte-Carlo method is used. In such a case, supplement 1 to the GUM suggests varying the input quantities one at a time, which is not an efficient method and may provide incorrect contributions to the variance in cases where significant interactions arise. This paper proposes different methods for the elaboration of the uncertainty budget associated with a Monte Carlo method. An application to the mass calibration example described in supplement 1 to the GUM is performed with the corresponding R code for implementation. Finally, guidance is given for choosing a method, including suggestions for a future revision of supplement 1 to the GUM.
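One simple way to recover an uncertainty budget from a Monte Carlo run, sketched below under the assumption of independent inputs and a near-linear model: estimate each input's share of the output variance from its squared correlation with the output. This is an illustrative stand-in, not necessarily one of the methods the paper proposes, and the measurement model here is a toy:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
x1 = rng.normal(10.0, 0.10, n)      # input quantity 1
x2 = rng.normal(5.0, 0.05, n)       # input quantity 2
x3 = rng.uniform(-0.02, 0.02, n)    # input quantity 3 (e.g. a small correction)

y = x1 * x2 + x3                    # toy measurement model
u_y = y.std(ddof=1)
print(f"y = {y.mean():.4f}, u(y) = {u_y:.4f}")
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r = np.corrcoef(x, y)[0, 1]
    # squared correlation ~ fractional variance contribution (linear, independent)
    print(f"{name}: ~{100 * r**2:.1f}% of Var(y)")
```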
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods
2010-01-01
Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202
2013-01-01
Background Qualitative research methods are increasingly used within clinical trials to address broader research questions than can be addressed by quantitative methods alone. These methods enable health professionals, service users, and other stakeholders to contribute their views and experiences to evaluation of healthcare treatments, interventions, or policies, and influence the design of trials. Qualitative data often contribute information that is better able to reform policy or influence design. Methods Health services researchers, including trialists, clinicians, and qualitative researchers, worked collaboratively to develop a comprehensive portfolio of standard operating procedures (SOPs) for the West Wales Organisation for Rigorous Trials in Health (WWORTH), a clinical trials unit (CTU) at Swansea University, which has recently achieved registration with the UK Clinical Research Collaboration (UKCRC). Although the UKCRC requires a total of 25 SOPs from registered CTUs, WWORTH chose to add an additional qualitative-methods SOP (QM-SOP). Results The qualitative methods SOP (QM-SOP) defines good practice in designing and implementing qualitative components of trials, while allowing flexibility of approach and method. Its basic principles are that: qualitative researchers should be contributors from the start of trials with qualitative potential; the qualitative component should have clear aims; and the main study publication should report on the qualitative component. Conclusions We recommend that CTUs consider developing a QM-SOP to enhance the conduct of quantitative trials by adding qualitative data and analysis. We judge that this improves the value of quantitative trials, and contributes to the future development of multi-method trials. PMID:23433341
NASA Astrophysics Data System (ADS)
Zhang, Chunwei; Zhao, Hong; Zhu, Qian; Zhou, Changquan; Qiao, Jiacheng; Zhang, Lu
2018-06-01
Phase-shifting fringe projection profilometry (PSFPP) is a three-dimensional (3D) measurement technique widely adopted in industry measurement. It recovers the 3D profile of measured objects with the aid of the fringe phase. The phase accuracy is among the dominant factors that determine the 3D measurement accuracy. Evaluation of the phase accuracy helps refine adjustable measurement parameters, contributes to evaluating the 3D measurement accuracy, and facilitates improvement of the measurement accuracy. Although PSFPP has been deeply researched, an effective, easy-to-use phase accuracy evaluation method remains to be explored. In this paper, methods based on the uniform-phase coded image (UCI) are presented to accomplish phase accuracy evaluation for PSFPP. These methods work on the principle that the phase value of a UCI can be manually set to any value, and once the phase value of a UCI pixel is the same as that of a pixel of a corresponding sinusoidal fringe pattern, their phase accuracies are approximately equal. The proposed methods provide feasible approaches to evaluating the phase accuracy for PSFPP. Furthermore, they can be used to experimentally investigate the properties of the random and gamma phase errors in PSFPP without the aid of a mathematical model expressing the random phase error or a large-step phase-shifting algorithm. In this paper, some novel and interesting phenomena are experimentally uncovered with the aid of the proposed methods.
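For context, the fringe phase in PSFPP is typically recovered with an N-step phase-shifting algorithm; a minimal sketch follows (the general demodulation step, not the authors' UCI evaluation code), verified on synthetic four-step fringes:

```python
import numpy as np

def n_step_phase(images):
    """images: array of shape (N, H, W) with equal phase shifts of 2*pi/N;
    returns the wrapped phase."""
    n = len(images)
    deltas = 2 * np.pi * np.arange(n) / n
    num = sum(I * np.sin(d) for I, d in zip(images, deltas))
    den = sum(I * np.cos(d) for I, d in zip(images, deltas))
    return -np.arctan2(num, den)

# Synthetic check: 4-step fringes with a known phase ramp.
h, w, n = 4, 256, 4
true_phase = np.tile(np.linspace(-np.pi, np.pi, w, endpoint=False), (h, 1))
imgs = np.stack([100 + 50 * np.cos(true_phase + 2 * np.pi * k / n)
                 for k in range(n)])
err = np.angle(np.exp(1j * (n_step_phase(imgs) - true_phase)))
print(np.abs(err).max())   # ~0 up to floating-point noise
```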
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guang, Lu, E-mail: lu_g@163.com; Hui, Wang; Xuejun, Zou
2016-07-15
A group of BiOCl photocatalysts with different drying temperatures was prepared by a soft chemical method. The effects of drying temperature on the crystalline phase, morphology, surface area and optical properties of the as-prepared samples were investigated in detail by XRD, SEM, N₂ absorption–desorption and DRS. Moreover, their photocatalytic activities for the degradation of rhodamine B were evaluated under visible light irradiation. It was found that the sample dried at 120 °C had the best photocatalytic activity, which was mainly attributed to the highest exposed proportion of {001} facets of BiOCl, the largest BET surface area and the minimum band gap. The degradation mechanism was explored: holes, hydroxyl radicals and superoxide radicals all contribute to the degradation process but take different degradation pathways. Superoxide radicals mainly contribute to the degradation of the chromophore, whereas holes and hydroxyl radicals mainly contribute to the photodegradation and dominate the degradation of RhB. - Highlights: • BiOCl nanosheets were prepared by a soft chemical method. • Effect of drying temperatures on as-prepared BiOCl samples was studied. • The highest removal efficiency of RhB was obtained over the sample dried at 120 °C.
Sharif, K M; Rahman, M M; Azmir, J; Khatib, A; Sabina, E; Shamsudin, S H; Zaidul, I S M
2015-12-01
Multivariate analysis of thin-layer chromatography (TLC) images was modeled to predict the antioxidant activity of Pereskia bleo leaves and to identify the compounds contributing to that activity. TLC was developed in an optimized mobile phase using the 'PRISMA' optimization method, and the image was then converted to wavelet signals and imported for multivariate analysis. An orthogonal partial least squares (OPLS) model was developed consisting of the wavelet-converted TLC image and the 2,2-diphenyl-1-picrylhydrazyl (DPPH) free radical scavenging activity of 24 different preparations of P. bleo as the x- and y-variables, respectively. The quality of the constructed OPLS model (1 + 1 + 0), with one predictive and one orthogonal component, was evaluated by internal and external validity tests. The validated model was then used to identify the contributing spot on the TLC plate, which was then analyzed by GC-MS after trimethylsilyl derivatization. Glycerol and amine compounds were found to contribute mainly to the antioxidant activity of the sample. An alternative method to predict the antioxidant activity of a new sample of P. bleo leaves has been developed. Copyright © 2015 John Wiley & Sons, Ltd.
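A hedged sketch of the modelling step, with scikit-learn's plain PLS regression standing in for OPLS (which scikit-learn does not provide), and simulated wavelet features and activity values in place of the real TLC data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(24, 128))              # 24 preparations x 128 wavelet coeffs
true_w = np.zeros(128)
true_w[10:15] = 1.0                         # a few informative features ('spot')
y = X @ true_w + rng.normal(0.0, 0.1, 24)   # simulated DPPH activity

pls = PLSRegression(n_components=2)
print(cross_val_score(pls, X, y, cv=4, scoring="r2"))   # internal validity check
pls.fit(X, y)
top = np.argsort(np.abs(pls.coef_.ravel()))[-5:]
print("most contributing features:", sorted(top))       # candidate 'spot' coeffs
```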
Mueller, David S.
2017-01-01
This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
Grant, Aileen; Dreischulte, Tobias; Treweek, Shaun; Guthrie, Bruce
2012-08-28
Trials of complex interventions are criticized for being 'black box', so the UK Medical Research Council recommends carrying out a process evaluation to explain the trial findings. We believe it is good practice to pre-specify and publish process evaluation protocols to set standards and minimize bias. Unlike protocols for trials, little guidance or standards exist for the reporting of process evaluations. This paper presents the mixed-method process evaluation protocol of a cluster randomized trial, drawing on a framework designed by the authors. This mixed-method evaluation is based on four research questions and maps data collection to a logic model of how the data-driven quality improvement in primary care (DQIP) intervention is expected to work. Data collection will be predominately by qualitative case studies in eight to ten of the trial practices, focus groups with patients affected by the intervention and quantitative analysis of routine practice data, trial outcome and questionnaire data and data from the DQIP intervention. We believe that pre-specifying the intentions of a process evaluation can help to minimize bias arising from potentially misleading post-hoc analysis. We recognize it is also important to retain flexibility to examine the unexpected and the unintended. From that perspective, a mixed-methods evaluation allows the combination of exploratory and flexible qualitative work, and more pre-specified quantitative analysis, with each method contributing to the design, implementation and interpretation of the other. As well as strengthening the study, the authors hope to stimulate discussion among their academic colleagues about publishing protocols for evaluations of randomized trials of complex interventions. Trial registration (Data-driven Quality Improvement in Primary Care): ClinicalTrials.gov NCT01425502.
Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.
2012-01-01
Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel's zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the state space and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
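A minimal sketch of generating zero-order phenotypes from a normative action sequence, covering omissions, repetitions, and intrusions (jumps are analogous); the plan and action names are invented, and this is not the authors' task-model formalism:

```python
import itertools

def omissions(plan):
    """Every variant with one action dropped."""
    return [plan[:i] + plan[i + 1:] for i in range(len(plan))]

def repetitions(plan):
    """Every variant with one action performed twice."""
    return [plan[:i + 1] + plan[i:] for i in range(len(plan))]

def intrusions(plan, foreign):
    """Every variant with one foreign action inserted."""
    return [plan[:i] + [a] + plan[i:]
            for i in range(len(plan) + 1) for a in foreign]

plan = ["select_mode", "enter_dose", "confirm", "fire_beam"]
variants = list(itertools.chain(omissions(plan), repetitions(plan),
                                intrusions(plan, ["override"])))
print(len(variants))   # candidate erroneous sequences to feed a model checker
```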
DOE Office of Scientific and Technical Information (OSTI.GOV)
This case history discusses the exploration methods used at the Momotombo Geothermal Field in western Nicaragua, and evaluates their contributions to the development of the geothermal field models. Subsequent reservoir engineering has not been synthesized or evaluated. A geothermal exploration program was started in Nicaragua in 1966 to discover and delineate potential geothermal reservoirs in western Nicaragua. Exploration began at the Momotombo field in 1970 using geological, geochemical, and geophysical methods. A regional study of thermal manifestations was undertaken and the area on the southern flank of Volcan Momotombo was chosen for more detailed investigation. Subsequent exploration by various consultants produced a number of geotechnical reports on the geology, geophysics, and geochemistry of the field, as well as describing production well drilling. Geological investigations at Momotombo included photogeology, field mapping, binocular microscope examination of cuttings, and drillhole correlations. Among the geophysical techniques used to investigate the field sub-structure were: Schlumberger and electromagnetic soundings, dipole mapping and audio-magnetotelluric surveys, gravity and magnetic measurements, frequency domain soundings, self-potential surveys, and subsurface temperature determinations. The geochemical program analyzed the thermal fluids at the surface and in the wells. This report presents the description and results of exploration methods used during the investigative stages of the Momotombo Geothermal Field. A conceptual model of the geothermal field was drawn from the information available at each exploration phase. The exploration methods have been evaluated with respect to their contributions to the understanding of the field and their utilization in planning further development. Our principal finding is that data developed at each stage were not sufficiently integrated to guide further work at the field, causing inefficient use of resources.
Stolp, Sean; Bottorff, Joan L; Seaton, Cherisse L; Jones-Bricker, Margaret; Oliffe, John L; Johnson, Steven T; Errey, Sally; Medhurst, Kerensa; Lamont, Sonia
2017-04-01
The purpose of this scoping review was to identify promising factors that underpin effective health promotion collaborations, measurement approaches, and evaluation practices. Measurement approaches and evaluation practices employed in 14 English-language articles published between January 2001 and October 2015 were considered. Data extraction included research design, health focus of the collaboration, factors being evaluated, how factors were conceptualized and measured, and outcome measures. Studies were methodologically diverse, employing quantitative methods (n=9), mixed methods (n=4), or qualitative methods (n=1). In total, these 14 studies examined 113 factors, 88 of which were measured only once. Leadership was the most commonly studied factor but was conceptualized differently across studies. Six factors were significantly associated with outcome measures across studies: leadership (n=3), gender (n=2), trust (n=2), length of the collaboration (n=2), budget (n=2), and changes in organizational model (n=2). Since factors were often conceptualized differently, drawing conclusions about their impact on collaborative functioning remains difficult. The use of reliable and validated tools would strengthen the evaluation of health promotion collaborations and would support and enhance the effectiveness of collaboration. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Neural networks and fault probability evaluation for diagnosis issues.
Kourd, Yahia; Lefebvre, Dimitri; Guersi, Noureddine
2014-01-01
This paper presents a new FDI technique for fault detection and isolation in unknown nonlinear systems. The objective of the research is to construct and analyze residuals by means of artificial intelligence and probabilistic methods. Artificial neural networks are first used for modeling issues: neural network models are designed to learn the fault-free and the faulty behaviors of the considered systems. Once the residuals are generated, probabilistic criteria are applied to them to determine the most likely fault among a set of candidate faults. The study also includes a comparison between the contributions of these tools and their limitations, particularly through the establishment of quantitative indicators to assess their performance. Through the computation of a confidence factor, the proposed method is suitable for evaluating the reliability of the FDI decision. The approach is applied to detect and isolate 19 fault candidates in the DAMADICS benchmark. The results obtained with the proposed scheme are compared with those obtained using a standard thresholding method.
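A minimal sketch of the residual-evaluation step described above, assuming Gaussian residual statistics; the fault signatures and numbers are hypothetical placeholders rather than DAMADICS values, and the trained neural-network model is stood in for by a zero prediction:

```python
import numpy as np

# Sketch of residual-based fault isolation under a Gaussian assumption.
# A trained neural network of the fault-free system would supply the
# prediction; here the prediction is a placeholder.

def residuals(y_measured, y_predicted):
    """Residual = measured output minus model prediction."""
    return y_measured - y_predicted

def most_likely_fault(r, fault_signatures, sigma=1.0):
    """Score each candidate fault by a Gaussian log-likelihood of the
    residual around its signature; return the best-scoring fault."""
    scores = {}
    for name, signature in fault_signatures.items():
        scores[name] = -np.sum((r - signature) ** 2) / (2 * sigma**2)
    return max(scores, key=scores.get), scores

# Toy usage: three candidates with hypothetical mean residual signatures.
signatures = {
    "fault_1": np.array([0.8, 0.0, 0.0]),
    "fault_2": np.array([0.0, 0.9, 0.1]),
    "no_fault": np.array([0.0, 0.0, 0.0]),
}
r = residuals(np.array([0.75, 0.05, -0.02]), np.zeros(3))
best, scores = most_likely_fault(r, signatures)
print(best)  # -> "fault_1"
```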
Psychometric properties of an instrument to measure nursing students' quality of life.
Chu, Yanxiang; Xu, Min; Li, Xiuyun
2015-07-01
It is important for clinical nursing teachers and managers to recognize the importance of nursing students' quality of life (QOL), since these students are the source of future nurses. As yet, there is no quality of life evaluation scale (QOLES) specific to them. This study designed a quantitative instrument for evaluating the QOL of nursing students. The study design was a descriptive survey with mixed methods, including literature review, panel discussion, the Delphi method, and statistical analysis. The data were collected from 880 nursing students from four teaching hospitals in Wuhan, China. The reliability and validity of the scale were tested through completion of the QOLES in a cluster sampling method. The total scale included 18 items in three domains: physical, psychological, and social functioning. The cumulative contribution rate of the three common factors was 65.23%. Cronbach's alpha coefficient of the scale was 0.82. This scale had good reliability and validity for evaluating nursing students' QOL. Copyright © 2015 Elsevier Ltd. All rights reserved.
Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu
2006-11-01
Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
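As a small illustration of the "sampling-based" screening step described above, the sketch below ranks inputs of a toy model by Pearson and Spearman correlation with the output; it is not the SHEDS testbed, and the input names and model are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 1000
X = {
    "exposure_duration": rng.uniform(0.5, 8.0, n),   # hypothetical inputs
    "contact_rate": rng.lognormal(0.0, 0.5, n),
    "residue_level": rng.uniform(0.0, 1.0, n),
}
# Toy nonlinear model with an interaction, mimicking the testbed's features.
y = X["contact_rate"] * X["residue_level"] + 0.1 * X["exposure_duration"]

for name, x in X.items():
    r_p, _ = stats.pearsonr(x, y)
    r_s, _ = stats.spearmanr(x, y)
    print(f"{name:20s} Pearson={r_p:+.2f} Spearman={r_s:+.2f}")
# Inputs with near-zero correlations are candidates to drop before running
# the more expensive variance-based methods (FAST, Sobol's method).
```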
ERIC Educational Resources Information Center
Onyia, Okey Peter
2014-01-01
This paper is a sequel to an earlier one that examines "the efficacy of two innovative peer-assessment templates ("PET" and "PACT") introduced to enable students provide evidence of their fairness in evaluating peer contributions to group project work" (Onyia, O. P. and Allen, S., 2012). In the present paper, three…
ERIC Educational Resources Information Center
Brandriet, Alexandra; Rupp, Charlie A.; Lazenby, Katherine; Becker, Nicole M.
2018-01-01
Analyzing and interpreting data is an important science practice that contributes toward the construction of models from data; yet, there is evidence that students may struggle with making meaning of data. The study reported here focused on characterizing students' approaches to analyzing rate and concentration data in the context of method of…
Perception of Consonants in Reverberation and Noise by Adults Fitted with Bimodal Devices
ERIC Educational Resources Information Center
Mason, Michelle; Kokkinakis, Kostas
2014-01-01
Purpose: The purpose of this study was to evaluate the contribution of a contralateral hearing aid to the perception of consonants, in terms of voicing, manner, and place-of-articulation cues in reverberation and noise by adult cochlear implantees aided by bimodal fittings. Method: Eight postlingually deafened adult cochlear implant (CI) listeners…
Is Telephone Review Feasible and Potentially Effective in Low Vision Services?
ERIC Educational Resources Information Center
Parkes, Claire; Lennon, Julie; Harper, Robert
2013-01-01
Purpose: Demographic transformations within the UK population combine to contribute to a substantial increase in demand for low vision (LV) services, creating a pressing need to reconsider the appropriate methods for service provision. In this study, we evaluate the feasibility of using telephone triage to assess the need for, and timing of, LV…
ERIC Educational Resources Information Center
Palmer, Phyllis M.; Jaffe, Debra M.; McCulloch, Timothy M.; Finnegan, Eileen M.; Van Daele, Douglas J.; Luschei, Erich S.
2008-01-01
Purpose: The purpose of this investigation was to evaluate the relationship between tongue-to-palate pressure and the electromyography (EMG) measured from the mylohyoid, anterior belly of the digastric, geniohyoid, medial pterygoid, velum, genioglossus, and intrinsic tongue muscles. Methods: Seven healthy adults performed tongue-to-palate pressure…
Bialas, Andrzej
2010-01-01
The paper discusses the security issues of intelligent sensors that are able to measure and process data and communicate with other information technology (IT) devices or systems. Such sensors are often used in high-risk applications. To improve their robustness, the sensor systems should be developed in a restricted way that provides them with assurance. One such assurance creation methodology is the Common Criteria (ISO/IEC 15408), used for IT products and systems. The contribution of the paper is a Common Criteria-compliant, pattern-based method for the security development of intelligent sensors. The paper concisely presents this method and its evaluation for a sensor detecting methane in a mine, focusing on the definition and solution of the intelligent sensor's security problem. The aim of the validation is to evaluate and improve the introduced method.
Strange nucleon electromagnetic form factors from lattice QCD
NASA Astrophysics Data System (ADS)
Alexandrou, C.; Constantinou, M.; Hadjiyiannakou, K.; Jansen, K.; Kallidonis, C.; Koutsou, G.; Avilés-Casco, A. Vaquero
2018-05-01
We evaluate the strange nucleon electromagnetic form factors using an ensemble of gauge configurations generated with two degenerate maximally twisted mass clover-improved fermions with mass tuned to approximately reproduce the physical pion mass. In addition, we present results for the disconnected light quark contributions to the nucleon electromagnetic form factors. Improved stochastic methods are employed, leading to high-precision results. The momentum dependence of the disconnected contributions is fitted using the model-independent z-expansion. We extract the magnetic moment and the electric and magnetic radii of the proton and neutron by including both connected and disconnected contributions. We find that the disconnected light quark contributions to both electric and magnetic form factors are nonzero and at the few percent level as compared to the connected contributions. The strange form factors are also at the percent level but more noisy, yielding statistical errors that are typically within one standard deviation from a zero value.
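The z-expansion fit mentioned above becomes a linear least-squares problem once Q^2 is mapped to the conformal variable z. The sketch below illustrates this with synthetic dipole-shaped data; the choice t_cut = 4*m_pi^2 and all numbers are illustrative assumptions, not the paper's lattice inputs:

```python
import numpy as np

m_pi = 0.135  # GeV, physical pion mass (illustrative)
t_cut = 4.0 * m_pi**2

def z_of_Q2(Q2, t_cut=t_cut, t0=0.0):
    """Map Q^2 to the variable z with |z| < 1."""
    a = np.sqrt(t_cut + Q2)
    b = np.sqrt(t_cut - t0)
    return (a - b) / (a + b)

def fit_z_expansion(Q2, G, kmax=3):
    """Linear least-squares fit of G(Q^2) = sum_k a_k z^k."""
    z = z_of_Q2(np.asarray(Q2))
    A = np.vander(z, kmax + 1, increasing=True)  # columns: z^0 ... z^kmax
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(G), rcond=None)
    return coeffs

# Synthetic "form factor" data for demonstration only.
Q2 = np.linspace(0.05, 1.0, 10)
G = 1.0 / (1.0 + Q2 / 0.71) ** 2  # dipole shape as a stand-in
a = fit_z_expansion(Q2, G)
print(a)  # a[1] controls the slope at Q^2 = 0, hence the radius
```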
Evaluating the importance of faecal sources in human-impacted waters.
Schoen, Mary E; Soller, Jeffrey A; Ashbolt, Nicholas J
2011-04-01
Quantitative microbial risk assessment (QMRA) was used to evaluate the relative contribution of faecal indicators and pathogens when a mixture of human sources impacts a recreational waterbody. The waterbody was assumed to be impacted with a mixture of secondary-treated disinfected municipal wastewater and untreated (or poorly treated) sewage, using Norovirus as the reference pathogen and enterococci as the reference faecal indicator. The contribution made by each source to the total waterbody volume, indicator density, pathogen density, and illness risk was estimated for a number of scenarios that accounted for pathogen and indicator inactivation based on the age of the effluent (source-to-receptor), possible sedimentation of microorganisms, and the addition of a non-pathogenic source of faecal indicators (such as old sediments or an animal population with low occurrence of human-infectious pathogens). The waterbody indicator density was held constant at 35 CFU of enterococci per 100 mL to compare results across scenarios. For the combinations evaluated, either the untreated sewage or the non-pathogenic source of faecal indicators dominated the recreational waterbody enterococci density assuming a culture method. In contrast, indicator density assayed by qPCR, pathogen density, and bather gastrointestinal illness risks were largely dominated by secondary disinfected municipal wastewater, with untreated sewage being increasingly less important as the faecal indicator load increased from a non-pathogenic source. The results support the use of a calibrated qPCR total enterococci indicator, compared to a culture-based assay, to index infectious human enteric viruses released in treated human wastewater, and illustrate that the source contributing the majority of risk in a mixture may be overlooked when only assessing faecal indicators by a culture-based method. Published by Elsevier Ltd.
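A minimal sketch of the source-apportionment arithmetic described above: given assumed volume fractions and source densities, compute each source's share of the mixed indicator and pathogen densities. All densities are hypothetical placeholders, and inactivation and sedimentation are ignored:

```python
import numpy as np

sources = ["treated wastewater", "untreated sewage", "non-pathogenic"]
vol_frac = np.array([1e-3, 1e-6, 5e-4])   # fraction of waterbody volume
entero = np.array([1e4, 1e7, 5e4])        # indicator CFU per 100 mL in source
noro = np.array([1e3, 1e6, 0.0])          # pathogen density in source

mix_entero = vol_frac * entero            # each source's indicator contribution
mix_noro = vol_frac * noro                # each source's pathogen contribution
for name, e, p in zip(sources, mix_entero, mix_noro):
    print(f"{name:20s} indicator share={e/mix_entero.sum():5.1%} "
          f"pathogen share={p/mix_noro.sum():5.1%}")
# Illustrates the paper's point: the source dominating the indicator signal
# need not be the source dominating the pathogen (risk) signal.
```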
NASA Astrophysics Data System (ADS)
Griffies, Stephen M.; Danabasoglu, Gokhan; Durack, Paul J.; Adcroft, Alistair J.; Balaji, V.; Böning, Claus W.; Chassignet, Eric P.; Curchitser, Enrique; Deshayes, Julie; Drange, Helge; Fox-Kemper, Baylor; Gleckler, Peter J.; Gregory, Jonathan M.; Haak, Helmuth; Hallberg, Robert W.; Heimbach, Patrick; Hewitt, Helene T.; Holland, David M.; Ilyina, Tatiana; Jungclaus, Johann H.; Komuro, Yoshiki; Krasting, John P.; Large, William G.; Marsland, Simon J.; Masina, Simona; McDougall, Trevor J.; Nurser, A. J. George; Orr, James C.; Pirani, Anna; Qiao, Fangli; Stouffer, Ronald J.; Taylor, Karl E.; Treguier, Anne Marie; Tsujino, Hiroyuki; Uotila, Petteri; Valdivieso, Maria; Wang, Qiang; Winton, Michael; Yeager, Stephen G.
2016-09-01
The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
Luo, Jiaqiang; Cai, Weixi; Wu, Tong; Xu, Baojun
2016-06-15
Total saponin content, total phenolic content, total flavonoid content, and condensed tannin content in the hull, cotyledon, and whole grain of both adzuki bean and mung bean were determined by colorimetric methods. Vitexin and isovitexin contents in mung bean were determined by HPLC. Antioxidant effects were evaluated with a DPPH scavenging activity assay and a ferric reducing antioxidant power assay. In vitro anti-inflammatory and anti-diabetic effects of the beans were evaluated by protease and aldose reductase inhibitory assays, respectively. The results indicated that the bean hulls were the most abundant in phytochemicals and contributed most of the antioxidant, anti-inflammatory, and anti-diabetic activities of the whole grains. Mung bean hull was the most abundant in vitexin (37.43 mg/g) and isovitexin (47.18 mg/g). Most of the phytochemicals and bioactivities were predominantly contributed by the bean hulls, with the exception of the condensed tannin of mung bean, which was more abundant in the cotyledon than in the hull. Copyright © 2016 Elsevier Ltd. All rights reserved.
Stein, Mark A.
2008-01-01
Purpose We sought to determine whether maternal depression contributed to the use of corporal punishment in children with attention-deficit/hyperactivity disorder (ADHD). Patients and Methods The data were gathered through chart review of clinic-referred children with ADHD and their mothers who were evaluated at a psychiatric clinic located in a large academic medical center in Seoul, Korea. Daily records kept by parents and 13 items from the Physical Assault subscale of the Parent-Child Conflict Tactics Scales (CTSPC) were used to assess corporal punishment. Ninety-one children with ADHD and their mothers were included in this study. Results Mothers who used corporal punishment showed significantly higher scores on the Beck Depression Inventory (t = -2.952, df = 89, p < 0.01) than mothers who did not. Moreover, maternal depression contributed to the use of corporal punishment in children with ADHD (Nagelkerke R^2 = 0.102, p < 0.05). Conclusion Maternal depression contributes to the use of corporal punishment with children with ADHD. Assessment and management of maternal depression should be an important focus of the evaluation of children with ADHD. PMID:18729299
2012-01-01
Background Many methods for the genetic analysis of mastitis use a cross-sectional approach, which omits information on, e.g., repeated mastitis cases during lactation, somatic cell count fluctuations, and the recovery process. Acknowledging the dynamic behavior of mastitis during lactation and taking into account that there is more than one binary response variable to consider can enhance the genetic evaluation of mastitis. Methods Genetic evaluation of mastitis was carried out by modeling the dynamic nature of somatic cell count (SCC) within the lactation. The SCC patterns were captured by modeling transition probabilities between assumed states of mastitis and non-mastitis. A widely dispersed SCC pattern generates high transition probabilities between states and vice versa. This method can model transitions to and from states of infection simultaneously, i.e. both the mastitis liability and the recovery process are considered. A multilevel discrete time survival model was applied to estimate breeding values on simulated data with different dataset sizes, mastitis frequencies, and genetic correlations. Results Correlations between estimated and simulated breeding values showed that the estimated accuracies for mastitis liability were similar to those from previously tested methods that used data of confirmed mastitis cases, while our results were based on SCC as an indicator of mastitis. In addition, unlike the other methods, our method also generates breeding values for the recovery process. Conclusions The developed method provides an effective tool for the genetic evaluation of mastitis when considering the whole disease course and will contribute to improving the genetic evaluation of udder health. PMID:22475575
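A minimal sketch of the transition-probability idea described above: classify test-day SCC records into assumed healthy/mastitic states and estimate the two-state transition matrix. The threshold and data are hypothetical, and the study's multilevel discrete time survival model is not reproduced here:

```python
import numpy as np

def scc_to_states(scc, threshold=200_000):
    """Classify each test-day SCC (cells/mL) into 0 = healthy, 1 = mastitic."""
    return (np.asarray(scc) >= threshold).astype(int)

def transition_probabilities(states):
    """Count observed transitions and normalize rows to probabilities.
    P[0,1] reflects liability to infection, P[1,0] the recovery process."""
    counts = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

scc = [90e3, 120e3, 450e3, 600e3, 180e3, 110e3, 95e3, 700e3]  # toy records
P = transition_probabilities(scc_to_states(scc))
print(P)  # row 0: healthy -> {healthy, mastitic}; row 1: mastitic -> {...}
```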
Methods for semi-automated indexing for high precision information retrieval.
Berrios, Daniel C; Cucina, Russell J; Fagan, Lawrence M
2002-01-01
To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65%, with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in the three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.
Normative Data for an Instrumental Assessment of the Upper-Limb Functionality.
Caimmi, Marco; Guanziroli, Eleonora; Malosio, Matteo; Pedrocchi, Nicola; Vicentini, Federico; Molinari Tosatti, Lorenzo; Molteni, Franco
2015-01-01
Upper-limb movement analysis is important for objectively monitoring rehabilitation interventions, contributing to improving overall treatment outcomes. Simple, fast, easy-to-use, and applicable methods are required to allow routine functional evaluation of patients with different pathologies and clinical conditions. This paper describes the Reaching and Hand-to-Mouth Evaluation Method, a fast procedure to assess upper-limb motor control and functional ability, providing a set of normative data from 42 healthy subjects of different ages, evaluated for both the dominant and the nondominant limb motor performance. Sixteen of them were reevaluated after two weeks to perform test-retest reliability analysis. Data were clustered into three subgroups of different ages to test the method's sensitivity to motor control differences. Experimental data show notable test-retest reliability in all tasks. Data from older and younger subjects show significant differences in the measures related to the ability for coordination, thus showing the high sensitivity of the method to motor control differences. The presented method, provided with control data from healthy subjects, appears to be a suitable and reliable tool for upper-limb functional assessment in the clinical environment. PMID:26539500
Evaluation as institution: a contractarian argument for needs-based economic evaluation.
Rogowski, Wolf H
2018-06-13
There is a gap between health economic evaluation methods and the value judgments of coverage decision makers, at least in Germany. Measuring preference satisfaction has been claimed to be inappropriate for allocating health care resources, e.g., because it disregards medical need. The existing methods oriented at medical need have been claimed to disregard non-consequentialist fairness concerns. The aim of this article is to propose a new, contractarian argument for justifying needs-based economic evaluation. It is based on consent rather than on the maximization of some impersonal unit of value, in order to accommodate the fairness concerns. This conceptual paper draws upon contractarian ethics and constitutional economics to show how economic evaluation can be viewed as an institution to overcome societal conflicts in the allocation of scarce health care resources. For this, the problem of allocating scarce health care resources in a society is reconstructed as a social dilemma. Both disadvantaged patients and affluent healthy individuals can be argued to share an interest in a societal contract to provide technologies which ameliorate medical need, based on progressive funding. The use of needs-based economic evaluation methods for coverage determination can be interpreted as an institution for conflict resolution insofar as such methods use consented criteria to ensure the social contract's sustainability and avoid implicit rationing or unaffordable contribution rates. This justifies the use of needs-based evaluation methods by Pareto-superiority and consent (rather than by some needs-based value function per se). The view of economic evaluation presented here may help account for fairness concerns in the further development of evaluation methods, because it directs attention away from determining some unit of value to be maximized and towards identifying those persons who are most likely not to consent and meeting their concerns. Following this direction in methods development is likely to increase the acceptability of health economic evaluation to decision makers.
Ganta, Shravani; Nagaraj, Anup; Pareek, Sonia; Sidiq, Mohsin; Singh, Kushpal; Vishnani, Preeti
2015-01-01
Background Fluoride in drinking water is known for both beneficial and detrimental effects on health. The principal sources of fluoride include water, some species of vegetation, certain edible marine animals, dust, and industrial processes. The purpose of this study was to evaluate the fluoride retention of the most commonly consumed estuarine fishes among the fish-consuming population of Andhra Pradesh. Materials and Methods A cross-sectional study was conducted to evaluate the amount of fluoride retained in the ten most commonly consumed estuarine fishes, as a contributing factor to fluorosis, by the SPADNS spectrophotometric method. The presence and severity of dental fluorosis among the fish-consuming population was recorded using the Community Fluorosis Index. Statistical analysis was done using MedCalc v12.2.1.0 software. Results For seawater fishes, the fluoride levels in bone were highest in Indian Sardine (4.22 ppm). Among the river water fishes, the fluoride levels in bone were highest in Catla (1.51 ppm). Also, the mean total fluoride concentrations of all the river fishes in skin, muscle, and bone were lower (0.86 ppm) than those of the seawater fishes (2.59 ppm), showing that sea fishes accumulate relatively large amounts of fluoride compared with river water fishes. The mean Community Fluorosis Index (CFI) was found to be 1.06 in the sampled fish-consuming population, suggesting that fluorosis is of medium public health importance. Conclusion The analysis showed that bone tends to accumulate the largest amount of fluoride, followed by muscle and skin, which might be due to the increased permeability and chemical trapping of fluoride inside the tissues. The amount of fluoride present in the fishes is directly related to the severity of fluorosis in the fish-consuming population, suggesting fishes are a contributing factor to fluorosis depending upon dietary consumption. PMID:26266208
Evaluating gull diets: A comparison of conventional methods and stable isotope analysis
Weiser, Emily L.; Powell, Abby N.
2011-01-01
Samples such as regurgitated pellets and food remains have traditionally been used in studies of bird diets, but these can produce biased estimates depending on the digestibility of different foods. Stable isotope analysis has been developed as a method for assessing bird diets that is not biased by digestibility. These two methods may provide complementary or conflicting information on diets of birds, but are rarely compared directly. We analyzed carbon and nitrogen stable isotope ratios of feathers of Glaucous Gull (Larus hyperboreus) chicks from eight breeding colonies in northern Alaska, and used a Bayesian mixing model to generate a probability distribution for the contribution of each food group to diets. We compared these model results with probability distributions from conventional diet samples (pellets and food remains) from the same colonies and time periods. Relative to the stable isotope estimates, conventional analysis often overestimated the contributions of birds and small mammals to gull diets and often underestimated the contributions of fish and zooplankton. Both methods gave similar estimates for the contributions of scavenged caribou, miscellaneous marine foods, and garbage to diets. Pellets and food remains therefore may be useful for assessing the importance of garbage relative to certain other foods in diets of gulls and similar birds, but are clearly inappropriate for estimating the potential impact of gulls on birds, small mammals, or fish. However, conventional samples provide more species-level information than stable isotope analysis, so a combined approach would be most useful for diet analysis and assessing a predator's impact on particular prey groups.
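The deterministic core of an isotope mixing model like the one described above can be written as a small linear system: two tracer balances plus a mass balance determine three source proportions. The Bayesian model in the study additionally handles uncertainty and more food groups; all delta values below are hypothetical:

```python
import numpy as np

sources = ["fish", "birds", "garbage"]
d13C = np.array([-19.0, -24.0, -16.0])  # per mil, hypothetical source values
d15N = np.array([15.0, 9.0, 7.0])
mix = np.array([-18.5, 11.0])           # consumer (feather) values

# Rows: delta 13C balance, delta 15N balance, sum of fractions = 1.
A = np.vstack([d13C, d15N, np.ones(3)])
b = np.append(mix, 1.0)
f = np.linalg.solve(A, b)               # source proportions in the diet
for name, frac in zip(sources, f):
    print(f"{name}: {frac:.2f}")
```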
Dzekem, Bonaventure Suiru; Kacou, Jean Baptiste; Abanda, Martin; Kramoh, Euloge; Yapobi, Yves; Kingue, Samuel; Kengne, Andre Pascal; Dzudie, Anastase
2017-01-01
Africa bears a quarter of the global burden of disease but contributes less than 2% of the global research publications on health, partially due to a lack of expertise and skills to carry out scientific research. We report on a short course on research methods organised by the Clinical Research Education Networking and Consultancy (CRENC) during the third international congress of the Ivorian Cardiac Society (SICARD) in Abidjan, Cote d'Ivoire. Results from the pre- and post-test evaluation during this course showed that African researchers could contribute more to scientific research and publications, provided adequate support and investment is geared towards the identification and training of motivated early-career scientists. PMID:29144534
O'Cathain, Alicia; Murphy, Elizabeth; Nicholl, Jon
2007-01-01
Background Recently, there has been a surge of international interest in combining qualitative and quantitative methods in a single study – often called mixed methods research. It is timely to consider why and how mixed methods research is used in health services research (HSR). Methods Documentary analysis of proposals and reports of 75 mixed methods studies funded by a research commissioner of HSR in England between 1994 and 2004. Face-to-face semi-structured interviews with 20 researchers sampled from these studies. Results 18% (119/647) of HSR studies were classified as mixed methods research. In the documentation, comprehensiveness was the main driver for using mixed methods research, with researchers wanting to address a wider range of questions than quantitative methods alone would allow. Interviewees elaborated on this, identifying the need for qualitative research to engage with the complexity of health, health care interventions, and the environment in which studies took place. Motivations for adopting a mixed methods approach were not always based on the intrinsic value of mixed methods research for addressing the research question; they could be strategic, for example, to obtain funding. Mixed methods research was used in the context of evaluation, including randomised and non-randomised designs; survey and fieldwork exploratory studies; and instrument development. Studies drew on a limited number of methods – particularly surveys and individual interviews – but used methods in a wide range of roles. Conclusion Mixed methods research is common in HSR in the UK. Its use is driven by pragmatism rather than principle, motivated by the perceived deficit of quantitative methods alone to address the complexity of research in health care, as well as other more strategic gains. Methods are combined in a range of contexts, yet the emerging methodological contributions from HSR to the field of mixed methods research are currently limited to the single context of combining qualitative methods and randomised controlled trials. Health services researchers could further contribute to the development of mixed methods research in the contexts of instrument development, survey and fieldwork, and non-randomised evaluations. PMID:17570838
Zong, Fan; Wang, Lifang
2017-01-01
University scientific research ability is an important indicator of the strength of a university. In this paper, the evaluation of university scientific research ability is investigated based on the output of sci-tech papers. Four university alliances, from North America, the UK, Australia, and China, are selected as the case study for the evaluation. Data from Thomson Reuters InCites are collected to support the evaluation. The work contributes a new framework for the evaluation of university scientific research ability. First, we establish a hierarchical structure of the factors that impact the evaluation. Then, a new MCDM method called the D-AHP model is used to implement the evaluation and ranking of the different university alliances, in which a data-driven approach is proposed to automatically generate the D numbers preference relations. Next, a sensitivity analysis is given to show the impact of the weights of factors and sub-factors on the evaluation result. Finally, the results obtained using different methods are compared and discussed to verify the effectiveness and reasonability of this study, and some suggestions are given to promote China's scientific research ability.
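The classical AHP step underlying the D-AHP model mentioned above derives weights as the principal eigenvector of a pairwise comparison matrix; the sketch below shows only that step (the D-numbers extension for uncertain preferences is omitted), with a hypothetical comparison matrix:

```python
import numpy as np

# Hypothetical pairwise comparisons of three evaluation factors
# (e.g., papers vs. citations vs. collaboration); A[i,j] = 1 / A[j,i].
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])  # principal eigenvector
w = w / w.sum()                                       # normalize to weights
print(np.round(w, 3))  # priority weights, summing to 1
```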
NASA Astrophysics Data System (ADS)
He, Yan-Chun; Tjiputra, Jerry; Langehaug, Helene R.; Jeansson, Emil; Gao, Yongqi; Schwinger, Jörg; Olsen, Are
2018-03-01
The Inverse Gaussian approximation of the transit time distribution method (IG-TTD) is widely used to infer the anthropogenic carbon (Cant) concentration in the ocean from measurements of transient tracers such as chlorofluorocarbons (CFCs) and sulfur hexafluoride (SF6). Its accuracy relies on the validity of several assumptions, notably (i) a steady state ocean circulation, (ii) a prescribed age tracer saturation history, e.g., a constant 100% saturation, (iii) a prescribed constant degree of mixing in the ocean, (iv) a constant surface ocean air-sea CO2 disequilibrium with time, and (v) that preformed alkalinity can be sufficiently estimated by salinity or by salinity and temperature. Here, these assumptions are evaluated using simulated "model-truth" of Cant. The results give the IG-TTD method a range of uncertainty from 7.8% to 13.6% (11.4 Pg C to 19.8 Pg C) due to the above assumptions, which is about half of the uncertainty derived in previous model studies. Assumptions (ii), (iv) and (iii) are the three largest sources of uncertainty, accounting for 5.5%, 3.8% and 3.0%, respectively, while assumptions (i) and (v) only contribute about 0.6% and 0.7%. Regionally, the Southern Ocean contributes the largest uncertainty, of 7.8%, while the North Atlantic contributes about 1.3%. Our findings demonstrate that the spatial dependency of Δ/Γ and temporal changes in tracer saturation and air-sea CO2 disequilibrium have a strong compensating effect on the estimated Cant. The values of these parameters should be quantified to reduce the uncertainty of the IG-TTD; this is increasingly important under a changing ocean climate.
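For concreteness, the sketch below shows the basic IG-TTD propagation step: interior Cant as the convolution of a surface history with an inverse Gaussian transit time distribution of mean age Γ and width Δ (the assumed Δ/Γ ratio corresponds to assumption (iii) above). The surface history and parameter values are crude stand-ins, not the study's model fields:

```python
import numpy as np

def ig_ttd(t, gamma, delta):
    """Inverse Gaussian transit time distribution G(t)."""
    t = np.asarray(t, dtype=float)
    return (np.sqrt(gamma**3 / (4.0 * np.pi * delta**2 * t**3))
            * np.exp(-gamma * (t - gamma) ** 2 / (4.0 * delta**2 * t)))

def cant_interior(t_obs, gamma, delta, dt=0.25):
    ages = np.arange(dt, 300.0, dt)        # transit times (years)
    G = ig_ttd(ages, gamma, delta)
    # Hypothetical surface Cant history: exponential growth since ~1850.
    surface = 60.0 * np.exp((t_obs - ages - 2010.0) / 50.0)
    surface[t_obs - ages < 1850.0] = 0.0
    return np.sum(surface * G) * dt        # umol/kg, illustrative only

print(cant_interior(2010.0, gamma=50.0, delta=50.0))  # Delta/Gamma = 1
```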
Stadler, David; Sulyok, Michael; Schuhmacher, Rainer; Berthiller, Franz; Krska, Rudolf
2018-05-01
Multi-mycotoxin determination by LC-MS is commonly based on external solvent-based or matrix-matched calibration and, if necessary, correction for the method bias. In everyday practice, the method bias (expressed as the apparent recovery R_A), which may be caused by losses during the recovery process and/or signal suppression/enhancement, is evaluated by replicate analysis of a single spiked lot of a matrix. However, R_A may vary between different lots of the same matrix, i.e., lot-to-lot variation, which can result in a higher relative expanded measurement uncertainty (U_r). We applied a straightforward procedure for the calculation of U_r from the within-laboratory reproducibility, also called intermediate precision, and the uncertainty of R_A (u_r,RA). To estimate the contribution of the lot-to-lot variation to U_r, the measurement results of one replicate each of seven different lots of figs and maize, and of seven replicates of a single lot of these matrices, respectively, were used to calculate U_r. The lot-to-lot variation contributed to u_r,RA, and thus to U_r, for the majority of the 66 evaluated analytes in both figs and maize. The major contributions of the lot-to-lot variation to u_r,RA were differences in analyte recovery in figs and relative matrix effects in maize. U_r estimated from long-term participation in proficiency test schemes was 58%. Provided proper validation, a fit-for-purpose U_r of 50% was proposed for measurement results obtained by an LC-MS-based multi-mycotoxin assay, independent of the concentration of the analytes.
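A minimal sketch of the uncertainty budget described above, with illustrative numbers: the lot-to-lot spread of R_A is added in quadrature to the single-lot recovery uncertainty, and U_r combines the result with the intermediate precision using a coverage factor of 2:

```python
import math

u_r_IP = 0.12         # relative intermediate precision (illustrative)
u_r_RA_single = 0.08  # recovery uncertainty from one spiked lot
u_r_lot = 0.15        # extra spread of R_A across different lots

# Lot-to-lot variation inflates the recovery uncertainty in quadrature.
u_r_RA = math.sqrt(u_r_RA_single**2 + u_r_lot**2)
U_r = 2.0 * math.sqrt(u_r_IP**2 + u_r_RA**2)  # coverage factor k = 2

print(f"u_r,RA = {u_r_RA:.3f}, U_r = {U_r:.1%}")
```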
Comparison of field swept ferromagnetic resonance methods - A case study using Ni-Mn-Sn films
NASA Astrophysics Data System (ADS)
Modak, R.; Samantaray, B.; Mandal, P.; Srinivasu, V. V.; Srinivasan, A.
2018-05-01
Ferromagnetic resonance (FMR) spectroscopy is used to understand the magnetic behavior of a Ni-Mn-Sn Heusler alloy film. Two popular experimental methods for recording FMR spectra are presented here. The in-plane angular (φ_H) variation of magnetic relaxation is used to evaluate the in-plane anisotropy (K_u) of the film. The out-of-plane (θ_H) variation of the FMR spectra has been numerically analyzed to extract the Gilbert damping coefficient, the effective magnetization, and the perpendicular magnetic anisotropy (K_1). The magnetic homogeneity of the film has also been evaluated in terms of the 2-magnon contribution to the FMR linewidth. The advantages and limitations of these two popular FMR techniques are discussed on the basis of the results obtained in this comparative study.
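As an illustration of the in-plane angular analysis mentioned above, the sketch below fits a simple phenomenological uniaxial form H_r(φ_H) = H0 - Hu·cos 2(φ_H - φ_u) to synthetic resonance-field data; the paper's full numerical treatment of the resonance condition is not reproduced, and all values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def Hr_model(phi_deg, H0, Hu, phi_u_deg):
    """Resonance field vs. in-plane field angle, simple uniaxial form."""
    phi = np.radians(phi_deg - phi_u_deg)
    return H0 - Hu * np.cos(2.0 * phi)

phi = np.arange(0, 360, 15)                   # in-plane field angles (deg)
Hr = Hr_model(phi, 1200.0, 40.0, 20.0)        # synthetic "data" (Oe)
Hr += np.random.default_rng(0).normal(0, 2, phi.size)

popt, _ = curve_fit(Hr_model, phi, Hr, p0=[1000.0, 10.0, 0.0])
H0, Hu, phi_u = popt
print(f"H0={H0:.0f} Oe, Hu={Hu:.1f} Oe, easy-axis angle={phi_u:.1f} deg")
# Ku would follow from Hu via Ku = Hu * Ms / 2 once Ms is known (assumption).
```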
de Lara-Castells, María Pilar; Stoll, Hermann; Mitrushchenkov, Alexander O
2014-08-21
As a prototypical dispersion-dominated physisorption problem, we analyze here the performance of dispersionless and dispersion-accounting methodologies on the helium interaction with cluster models of the TiO2(110) surface. A special focus has been given to the dispersionless density functional dlDF and the dlDF+Das construction for the total interaction energy (K. Pernal, R. Podeszwa, K. Patkowski, and K. Szalewicz, Phys. Rev. Lett. 2009, 103, 263201), where Das is an effective interatomic pairwise functional form for the dispersion. Likewise, the performance of the symmetry-adapted perturbation theory (SAPT) method is evaluated, where the interacting monomers are described by density functional theory (DFT) with the dlDF, PBE, and PBE0 functionals. Our benchmarks include CCSD(T)-F12b calculations and a comparative analysis of the nuclear bound states supported by the He-cluster potentials. Moreover, intra- and intermonomer correlation contributions to the physisorption interaction are analyzed through the method of increments (H. Stoll, J. Chem. Phys. 1992, 97, 8449) at the CCSD(T) level of theory. This method is further applied in conjunction with a partitioning of the Hartree-Fock interaction energy to estimate individual interaction energy components, comparing them with those obtained using the different SAPT(DFT) approaches. The cluster size evolution of dispersionless and dispersion-accounting energy components is then discussed, revealing the reduced role of the dispersionless interaction and intramonomer correlation when the extended nature of the surface is better accounted for. On the contrary, both post-Hartree-Fock and SAPT(DFT) results clearly demonstrate the highly transferable character of the effective pairwise dispersion interaction whatever the cluster model is. Our contribution also illustrates how the method of increments can be used as a valuable tool not only to achieve the accuracy of CCSD(T) calculations using large cluster models but also to evaluate the performance of SAPT(DFT) methods for the physically well-defined contributions to the total interaction energy. Overall, our work indicates the excellent performance of a dlDF+Das approach in which the parameters are optimized using the smallest cluster model of the target surface to treat van der Waals adsorbate-surface interactions.
A systematic and comprehensive approach to teaching and evaluating interpersonal skills.
Grayson, M; Nugent, C; Oken, S L
1977-11-01
This study addressed one problem with current methods for teaching and evaluating interpersonal skills: the failure to include the wide range of behaviors reported in the literature as contributing to patient dissatisfaction and noncompliance. To address this concern, the authors developed a comprehensive interpersonal skills training program and a pretest-posttest evaluation. The tests were administered to two student groups, one of which received the interpersonal skills instruction. The student group exposed to the training exhibited a significant positive change from pretest to posttest. Additionally, the change for this group was significantly greater than the change for the group not exposed to interpersonal skills instruction.
Active thermography in qualitative evaluation of protective materials.
Gralewicz, Grzegorz; Wiecek, Bogusław
2009-01-01
This is a study of the possibilities of qualitative evaluation of protective materials with active thermography. It presents a simulation of a periodic excitation of a multilayer composite material. Tests were conducted with lock-in thermography on a Kevlar composite consisting of 16 layers of Kevlar fabric reinforced with formaldehyde resin, with implanted delamination defects. Lock-in thermography is a versatile tool for nondestructive evaluation: it is a fast, remote, and nondestructive procedure. Hence, it was used to detect delaminations in the composite structure of materials used in the production of components designed for personal protection. This method directly contributes to an improvement in safety.
Litaor, M.I.; Thurman, E.M.
1988-01-01
Soil interstitial waters in the Green Lakes Valley, Front Range, Colorado were studied to evaluate the capacity of the soil system to buffer acid deposition. In order to determine the contribution of humic substances to the buffering capacity of a given soil, dissolved organic carbon (DOC) and pH of the soil solutions were measured. The concentration of the organic anion, A_i^-, derived from DOC at sample pH and the concentration of the organic anion, A_x^-, at the equivalence point were calculated using carboxyl contents from isolated and purified humic material from soil solutions. Subtracting A_x^- from A_i^- yields the contribution of humic substances to the buffering capacity (A_equiv^-). Using this method, one can evaluate the relative contribution of inorganic and organic constituents to the acid neutralizing capacity (ANC) of the soil solutions. The relative contribution of organic acids to the overall ANC was found to be extremely important in the alpine wetland (52%) and the forest-tundra ecotone (40%), and somewhat less important in the alpine tundra sites (20%). A failure to recognize the importance of organic acids in soil solutions to the ANC will result in erroneous estimates of the buffering capacity in the alpine environment of the Front Range, Colorado. © 1988.
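A minimal sketch of the organic-anion accounting described above, with hypothetical numbers and a single effective pKa as a simplifying assumption; the study instead used measured carboxyl contents of isolated and purified humic material:

```python
def organic_anion(doc_mg_L, carboxyl_ueq_per_mgC, pH, pKa=4.0):
    """Organic anion concentration (ueq/L): DOC times carboxyl content
    times the fraction dissociated at the given pH (one effective pKa)."""
    dissociated = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return doc_mg_L * carboxyl_ueq_per_mgC * dissociated

# A_i at sample pH, A_x at the (assumed) equivalence-point pH.
A_i = organic_anion(doc_mg_L=15.0, carboxyl_ueq_per_mgC=10.0, pH=5.5)
A_x = organic_anion(doc_mg_L=15.0, carboxyl_ueq_per_mgC=10.0, pH=4.5)
A_equiv = A_i - A_x  # humic contribution to the buffering capacity
print(f"A_i={A_i:.0f}, A_x={A_x:.0f}, A_equiv={A_equiv:.0f} ueq/L")
```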
Small meteoroids' major contribution to Mercury's exosphere
NASA Astrophysics Data System (ADS)
Grotheer, E. B.; Livi, S. A.
2014-01-01
The contribution of the meteoroid population to the generation of Mercury's exosphere is analyzed to determine which segment contributes most greatly to exospheric refilling via the process of meteoritic impact vaporization. For the meteoroid data, a differential mass distribution based on work by Grün et al. (Grün, E., Zook, H.A., Fechtig, H., Giese, R.H. [1985]. Icarus 62(2), 244-272) and a differential velocity distribution based on the work of Zook (Zook, H.A. [1975]. In: 6th Lunar Science Conference, vol. 2. Pergamon Press, Inc., Houston, TX, pp. 1653-1672) are used. These distributions are then evaluated using the method employed by Cintala (Cintala, M.J. [1992]. J. Geophys. Res. 97(E1), 947-974) to determine impact rates for selected mass and velocity segments of the meteoroid population.
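A rough sketch of the bookkeeping implied above: tabulate vapor production for each (mass, velocity) segment as flux times vapor yield and locate the dominant segment. The power-law flux slope and yield scaling below are placeholders, not the Grün et al. or Cintala expressions:

```python
import numpy as np

masses = np.logspace(-12, -3, 10)       # grams (segment centers)
velocities = np.linspace(10, 70, 7)     # km/s

def dN_dm(m):
    # Placeholder differential flux slope (steep enough that small
    # particles dominate, qualitatively matching the title's finding).
    return m ** -2.5

def vapor_yield(m, v):
    # Placeholder vaporized mass per impactor, growing with speed.
    return 0.5 * m * (v / 20.0) ** 2

# Per-log-mass contribution: flux density times m (log binning) times yield.
contrib = np.array([[dN_dm(m) * m * vapor_yield(m, v)
                     for v in velocities] for m in masses])
i, j = np.unravel_index(np.argmax(contrib), contrib.shape)
print(f"dominant segment: m ~ {masses[i]:.1e} g, v ~ {velocities[j]:.0f} km/s")
```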
Shiraogawa, Takafumi; Ehara, Masahiro; Jurinovich, Sandro; Cupellini, Lorenzo; Mennucci, Benedetta
2018-06-15
Recently, a method to calculate absorption and circular dichroism (CD) spectra based on exciton coupling has been developed. In this work, the method was applied to decompose the CD and circularly polarized luminescence (CPL) spectra of a multichromophoric system into chromophore contributions for recently developed through-space conjugated oligomers. The method, which is implemented using the rotatory strength in the velocity form and is therefore gauge-invariant, enables us to evaluate the contribution of each chromophoric unit and locally excited state to the CD and CPL spectra of the total system. The excitonic calculations suitably reproduce the full calculations for the system, as well as the experimental results. We demonstrate that the interactions between the electric transition dipole moments of adjacent chromophoric units are crucial in the CD and CPL spectra of multichromophoric systems, while the interactions between electric and magnetic transition dipole moments are not negligible. © 2018 Wiley Periodicals, Inc.
Clinical and videofluoroscopic diagnosis of dysphagia in chronic encephalopathy of childhood
Araújo, Brenda Carla Lima; Motta, Maria Eugênia Almeida; de Castro, Adriana Guerra; de Araújo, Claudia Marina Tavares
2014-01-01
Objective To evaluate the contribution of deglutition videofluoroscopy to the clinical diagnosis of dysphagia in chronic encephalopathy of childhood. Materials and Methods The study sample consisted of 93 children diagnosed with chronic encephalopathy, aged between two and five years, selected by convenience sampling among patients referred to the authors' institution by speech therapists, neurologists, and gastroenterologists in the period from March 2010 to September 2011. Data were collected at two different time points by investigators who were blinded to each other's findings. Results The method presented low sensitivity for detecting aspiration with puree consistency (p = 0.04). Specificity and negative predictive value were high for the clinical diagnosis of dysphagia with puree consistency. Conclusion In the present study, the low sensitivity of the clinical diagnosis of dysphagia demonstrates that this procedure may fail to detect changes in the swallowing process, regardless of the food consistency used during the investigation. Thus, the addition of the videofluoroscopic method can significantly contribute to the diagnosis of dysphagia. PMID:25741054
A Comparison of seismic instrument noise coherence analysis techniques
Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.
2011-01-01
The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
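A minimal sketch of the three-channel incoherent-noise estimate referenced above (one common form of the Sleeman et al., 2006, technique), using synthetic co-located records; the processing conventions below are a standard textbook form, not necessarily the study's exact pipeline:

```python
import numpy as np
from scipy.signal import csd

# Three co-located sensors see the same ground motion plus independent
# instrument noise; the noise PSD of channel 1 is then estimated as
# N11 = P11 - P21 * P13 / P23, with Pij the cross-power spectra.
fs, n = 100.0, 2**16
rng = np.random.default_rng(2)
ground = np.cumsum(rng.normal(size=n))       # common coherent signal
x1 = ground + 0.10 * rng.normal(size=n)
x2 = ground + 0.10 * rng.normal(size=n)
x3 = ground + 0.10 * rng.normal(size=n)

kw = dict(fs=fs, nperseg=4096)
f, P11 = csd(x1, x1, **kw)
_, P21 = csd(x2, x1, **kw)
_, P13 = csd(x1, x3, **kw)
_, P23 = csd(x2, x3, **kw)

N11 = P11 - P21 * P13 / P23                  # incoherent (self-)noise estimate
print(np.real(N11[:5]))
```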
Computing black hole partition functions from quasinormal modes
Arnold, Peter; Szepietowski, Phillip; Vaman, Diana
2016-07-07
We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. We then discuss the application of such techniques to more complicated spacetimes.
Development and evaluation of an ontology for guiding appropriate antibiotic prescribing.
Bright, Tiffani J; Yoko Furuya, E; Kuperman, Gilad J; Cimino, James J; Bakken, Suzanne
2012-02-01
To develop and apply formal ontology creation methods to the domain of antimicrobial prescribing, and to formally evaluate the resulting ontology through intrinsic and extrinsic evaluation studies. We extended existing ontology development methods to create the ontology and implemented the ontology using Protégé-OWL. Correctness of the ontology was assessed using a set of ontology design principles and domain expert review via the laddering technique. We created three artifacts to support the extrinsic evaluation (a set of prescribing rules, alerts and an ontology-driven alert module, and a patient database) and evaluated the usefulness of the ontology for performing knowledge management tasks to maintain the ontology and for generating alerts to guide antibiotic prescribing. The ontology includes 199 classes, 10 properties, and 1636 description logic restrictions. Twenty-three Semantic Web Rule Language rules were written to generate three prescribing alerts: (1) an antibiotic-microorganism mismatch alert; (2) a medication-allergy alert; and (3) a non-recommended empiric antibiotic therapy alert. The evaluation studies confirmed the correctness of the ontology, the usefulness of the ontology for representing and maintaining antimicrobial treatment knowledge rules, and the usefulness of the ontology for generating alerts to provide feedback to clinicians during antibiotic prescribing. This study contributes to the understanding of ontology development and evaluation methods and addresses one knowledge gap related to using ontologies as a clinical decision support system component: the need for formal ontology evaluation methods to measure their quality from the perspective of their intrinsic characteristics and their usefulness for specific tasks. Copyright © 2011 Elsevier Inc. All rights reserved.
Hill, Ryan C; Oman, Trent J; Wang, Xiujuan; Shan, Guomin; Schafer, Barry; Herman, Rod A; Tobias, Rowel; Shippar, Jeff; Malayappan, Bhaskar; Sheng, Li; Xu, Austin; Bradshaw, Jason
2017-07-12
As part of the regulatory approval process in Europe, comparison of endogenous soybean allergen levels between genetically engineered (GE) and non-GE plants has been requested. A quantitative multiplex analytical method using tandem mass spectrometry was developed and validated to measure 10 potential soybean allergens in soybean seed. The analytical method was implemented at six laboratories to demonstrate its robustness and was further applied to three soybean field studies across multiple growing seasons (including 21 non-GE soybean varieties) to assess the natural variation of allergen levels. The results show that environmental factors contribute more than genetic factors to the large variation in allergen abundance (2- to 50-fold between environmental replicates), as well as a large contribution of Gly m 5 and Gly m 6 to the total allergen profile, calling into question the scientific rationale for measuring endogenous allergen levels between GE and non-GE varieties in the safety assessment.
NASA Astrophysics Data System (ADS)
Bai, X. T.; Wu, Y. H.; Zhang, K.; Chen, C. Z.; Yan, H. P.
2017-12-01
This paper focuses on the calculation and analysis of the radiation noise of the angular contact ball bearing used in a ceramic motorized spindle. A dynamic model containing the main working conditions and structural parameters is established based on the dynamic theory of rolling bearings. A sub-source decomposition method is introduced for calculating the radiation noise of the bearing, and a comparative experiment is used to check the precision of the method. The contributions of the different components are then compared in the frequency domain based on the sub-source decomposition method. The radiation-noise spectra of the components under various rotation speeds are used to assess the contribution of different eigenfrequencies to the radiation noise of each component, and the proportions of friction noise and impact noise are evaluated as well. The results provide a theoretical basis for the calculation of bearing noise and a reference for assessing the impact of different components on the radiation noise of the bearing at different rotation speeds.
Advances in the analytical methods for determining the antioxidant properties of honey: a review.
Moniruzzaman, M; Khalil, M I; Sulaiman, S A; Gan, S H
2012-01-01
Free radicals and reactive oxygen species (ROS) have been implicated in the processes of aging and disease. In an effort to combat free radical activity, scientists are studying the effects of increasing individuals' antioxidant levels through diet and dietary supplements. Honey appears to act as an antioxidant in more ways than one. In the body, honey can mop up free radicals and contribute to better health. Various methods have been used to measure and compare the antioxidant activity of honey. In recent years, the DPPH (2,2-diphenyl-1-picrylhydrazyl), FRAP (ferric reducing antioxidant power), ORAC (oxygen radical absorbance capacity), ABTS [2,2'-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) diammonium salt] and TEAC [6-hydroxy-2,5,7,8-tetramethylchroman-2-carboxylic acid (Trolox)-equivalent antioxidant capacity] assays have been used to evaluate the antioxidant activity of honey. The antioxidant activity of honey is also measured by ascorbic acid content and by enzyme assays such as catalase (CAT), glutathione peroxidase (GPO) and superoxide dismutase (SOD). Among the methods available, those that have been validated, standardized and widely reported are recommended.
Xing, Fuguo; Zhang, Wei; Selvaraj, Jonathan Nimal; Liu, Yang
2015-05-01
Food processing methods contribute to DNA degradation, thereby affecting genetically modified organism detection and quantification. This study evaluated the effect of food processing methods on the relative transgenic content of genetically modified rice with Cry1Ab. In steamed rice and rice noodles, the levels of Cry1Ab were ⩾ 100% and <83%, respectively. Frying and baking in rice crackers contributed to a reduction in Pubi and Cry1Ab, while microwaving caused a decrease in Pubi and an increase in Cry1Ab. The processing methods of sweet rice wine had the most severe degradation effects on Pubi and Cry1Ab. In steamed rice and rice noodles, Cry1Ab was the most stable, followed by SPS and Pubi. However, in rice crackers and sweet rice wine, SPS was the most stable, followed by Cry1Ab and Pubi. Therefore, Cry1Ab is a better representative of transgenic components than is Pubi because the levels of Cry1Ab were less affected compared to Pubi. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tran, H. N. Q.; Tran, T. T.; Mansfield, M. L.; Lyman, S. N.
2014-12-01
Contributions of emissions from oil and gas activities to elevated ozone concentrations in the Uintah Basin, Utah, were evaluated using the CMAQ Integrated Source Apportionment Method (CMAQ-ISAM) and compared with the results of a traditional budgeting method. Unlike the traditional budgeting method, which quantifies a source's impact by comparing simulations with and without that source's emissions, the CMAQ-ISAM technique assigns tags to the emissions of each source and tracks their evolution through physical and chemical processes to quantify the final ozone yield from the source. Model simulations were performed for two episodes of low and high ozone in winter 2013 to better understand source contributions under different weather conditions. Because ozone chemistry is highly nonlinear, results obtained from the two methods differed significantly. The growing oil and gas industry in the Uintah Basin is the largest contributor to the elevated ozone (>75 ppb) observed in the Basin. This study therefore provides insight into the impact of the oil and gas industry on the ozone issue and helps in determining effective control strategies.
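To make the contrast concrete, here is a toy numeric sketch of the zero-out (budgeting) estimate; all values are invented, not taken from the study.

```python
# Hypothetical numbers illustrating the zero-out (budgeting) estimate versus a
# tagged (ISAM-style) apportionment; values are made up, not from the study.
ozone_all_sources = 82.0   # ppb, simulation with all emissions
ozone_without_og = 61.0    # ppb, simulation with oil/gas emissions removed
budgeting_contribution = ozone_all_sources - ozone_without_og  # 21 ppb

# A tagged method instead tracks source-labeled mass through the chemistry, so
# its apportioned contribution need not equal this zero-out difference when
# the chemistry is nonlinear -- the discrepancy the abstract reports.
print(budgeting_contribution)
```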
Sewer infiltration/inflow: long-term monitoring based on diurnal variation of pollutant mass flux.
Bares, V; Stránský, D; Sýkora, P
2009-01-01
The paper deals with a method for quantifying infiltrating groundwater based on the diurnal variation of pollutant load and continuous water quality and quantity monitoring. Although the method has the potential to separate particular components of the wastewater hydrograph, several aspects of it deserve discussion. The paper therefore investigates its cost-effectiveness, the relevance of pollutant loads from surface waters (groundwater), and the influence of the measurement time step. These aspects were studied in an experimental catchment of the Prague sewer system, Czech Republic, over a three-month period. The results indicate a high contribution of parasitic water to the night minimum discharge. Taking into account the uncertainty of the results and the time-consuming maintenance of the sensor, the principal advantages of the method are evaluated. The study shows the promising potential of the discussed measuring concept for quantifying groundwater infiltrating into the sewer system. It is shown that the conventional approach is sufficient and cost-effective even in catchments where a significant contribution of foul sewage to the night minima would have been assumed.
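The underlying separation can be written as a generic two-component mass balance; this form is assumed here for illustration rather than quoted from the paper:

```latex
Q_{\mathrm{tot}} = Q_{\mathrm{foul}} + Q_{\mathrm{inf}}, \qquad
C_{\mathrm{meas}}\, Q_{\mathrm{tot}} = C_{\mathrm{foul}}\, Q_{\mathrm{foul}} + C_{\mathrm{inf}}\, Q_{\mathrm{inf}}
\;\Rightarrow\;
Q_{\mathrm{inf}} = Q_{\mathrm{tot}}\, \frac{C_{\mathrm{foul}} - C_{\mathrm{meas}}}{C_{\mathrm{foul}} - C_{\mathrm{inf}}} ,
```

where $C$ denotes a pollutant concentration and $Q$ a discharge; clean infiltration water dilutes the foul-sewage pollutant load, which is what makes the diurnal load variation informative.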
Goode, N; Salmon, P M; Taylor, N Z; Lenné, M G; Finch, C F
2017-10-01
One factor potentially limiting the uptake of Rasmussen's (1997) Accimap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of Accimap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M T1 = 68.8%; M T2 = 73.9%), and were poor at the descriptor level (M T1 = 58.5%; M T2 = 64.1%). Mean criterion referenced validity scores at the system level were acceptable (M T1 = 73.9%; M T2 = 75.3%). However, they were not consistently acceptable at the descriptor level (M T1 = 67.6%; M T2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factors classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Shi, Chaoyang; Kojima, Masahiro; Tercero, Carlos; Najdovski, Zoran; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto
2014-12-01
There are several complications associated with Stent-assisted Coil Embolization (SACE) in cerebral aneurysm treatments, due to damaging operations by surgeons and undesirable mechanical properties of stents. Therefore, it is necessary to develop an in vitro simulator that provides both training and research for evaluating the mechanical properties of stents. A new in vitro simulator for three-dimensional digital subtraction angiography was constructed, followed by aneurysm models fabricated with new materials. Next, this platform was used to provide training and to conduct photoelastic stress analysis to evaluate the SACE technique. The average interaction stress increasingly varied for the two different stents. Improvements for the Maximum-Likelihood Expectation-Maximization method were developed to reconstruct cross-sections with both thickness and stress information. The technique presented can improve a surgeon's skills and quantify the performance of stents to improve mechanical design and classification. This method can contribute to three-dimensional stress and volume variation evaluation and assess a surgeon's skills. Copyright © 2013 John Wiley & Sons, Ltd.
Dobes, Petr; Otyepka, Michal; Strnad, Miroslav; Hobza, Pavel
2006-05-24
The interaction between roscovitine and cyclin-dependent kinase 2 (cdk2) was investigated by performing correlated ab initio quantum-chemical calculations. The whole protein was fragmented into smaller systems consisting of one or a few amino acids, and the interaction energies of these fragments with roscovitine were determined by using the MP2 method with the extended aug-cc-pVDZ basis set. For selected complexes, the complete basis set limit MP2 interaction energies, as well as the coupled-cluster corrections with inclusion of single, double and noniterative triple contributions [CCSD(T)], were also evaluated. The energies of interaction between roscovitine and small fragments and between roscovitine and substantial sections of protein (722 atoms) were also computed by using density-functional tight-binding methods covering dispersion energy (DFTB-D) and the Cornell empirical potential. Total stabilisation energy originates predominantly from dispersion energy and methods that do not account for the dispersion energy cannot, therefore, be recommended for the study of protein-inhibitor interactions. The Cornell empirical potential describes reasonably well the interaction between roscovitine and protein; therefore, this method can be applied in future thermodynamic calculations. A limited number of amino acid residues contribute significantly to the binding of roscovitine and cdk2, whereas a rather large number of amino acids make a negligible contribution.
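The fragment interaction energies referred to here follow the standard supermolecular definition (textbook form, not a formula quoted from the paper):

```latex
\Delta E_{\mathrm{int}} = E_{AB} - E_A - E_B ,
```

with the counterpoise correction for basis-set superposition error obtained by evaluating the monomer energies $E_A$ and $E_B$ in the full dimer basis.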
The girlpower project--recreation, BC health goals and social capital.
Higgins, J W; Reed, N
2001-01-01
'GirlPower,' a participatory action research project, explored how participation in recreation might contribute to the achievement of BC's Health Goals and nurture social capital. After identifying their health issues, up to 43 young women participated in recreational activities for 10 months, gradually taking responsibility for the planning of the weekly sessions. Data collection methods included weekly participation rates, two surveys to measure self-perceptions and health habits, focus groups with participants to assess needs and as a process evaluation tool, a qualitative summative evaluation with participants, key informant interviews with staff, a journal kept by the project leader and fieldnotes of researchers' observations. Quantitative findings did not support the propositions that the project contributed to the health of participants. However, analysis of the qualitative data suggests that GirlPower participants emerged from the project with a better sense of control over their lives and feeling more connected to their community.
Evaluation of the sustainability of contrasted pig farming systems: economy.
Ilari-Antoine, E; Bonneau, M; Klauke, T N; Gonzàlez, J; Dourmad, J Y; De Greef, K; Houwers, H W J; Fabrega, E; Zimmer, C; Hviid, M; Van der Oever, B; Edwards, S A
2014-12-01
The aim of this paper is to present an efficient tool for evaluating the economic dimension of the sustainability of pig farming systems. The selected tool, IDEA, was tested on a sample of farms from 15 contrasted systems in Europe. A statistical analysis was carried out to check the capacity of the indicators to illustrate the variability of the population and to analyze which of them contributed the most to it. The scores obtained for the farms were consistent with the reality of pig production, and the variable distribution showed considerable variability across the sample. Principal component analysis and cluster analysis separated the sample into five subgroups in which the six main indicators differed significantly, which underlines the robustness of the tool. The IDEA method proved easily comprehensible, requiring few initial variables, and provides an efficient benchmarking system; all six indicators contributed to fully describing a varied and contrasted population.
Multivariate models for prediction of human skin sensitization hazard.
Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole
2017-03-01
One of the Interagency Coordinating Committee on the Validation of Alternative Methods' (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays - the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens™ assay - six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy 88%), any of the alternative methods alone (accuracy 63-79%) or test batteries combining data from the individual methods (accuracy 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. Published 2016. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
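A minimal sketch of the two model families evaluated, using scikit-learn; the feature values, labels and train/test split below are synthetic stand-ins, not the ICCVAM dataset.

```python
# Logistic regression and SVM on synthetic assay-like features (stand-in data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
X = rng.standard_normal((96, 5))          # e.g. DPRA, h-CLAT, KeratinoSens, log P, read-across
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # synthetic sensitizer / non-sensitizer labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=72, random_state=0)
for model in (LogisticRegression(max_iter=1000), SVC(kernel="rbf")):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, accuracy_score(y_te, model.predict(X_te)))
```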
An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.
Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D
2016-05-01
Over the past decades, many studies have been published on the extraction of the non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totalling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database, focusing on FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to depend heavily on the extraction technique and signal-to-noise ratio. In particular, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on PhysioNet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
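FQRS detection scoring of this kind typically matches detections to reference beats within a tolerance window. The sketch below shows generic matching logic; the fecgsyn toolbox's own scoring routines should be consulted for its exact rules.

```python
# Generic tolerance-window matching of detected vs. reference FQRS locations.
def detection_stats(ref_ms, det_ms, tol_ms=50):
    """Count detections falling within +/- tol_ms of an unmatched reference beat."""
    det = sorted(det_ms)
    used = [False] * len(det)
    tp = 0
    for r in ref_ms:
        for i, d in enumerate(det):
            if not used[i] and abs(d - r) <= tol_ms:
                used[i] = True
                tp += 1
                break
    fn = len(ref_ms) - tp        # missed reference beats
    fp = len(det) - tp           # spurious detections
    f1 = 2 * tp / (2 * tp + fp + fn) if tp else 0.0
    return tp, fp, fn, f1

print(detection_stats([400, 820, 1250], [410, 1260, 1900]))  # (2, 1, 1, 0.667)
```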
A Framework for Human Microbiome Research
2012-06-14
determined that many components of data production and processing can contribute errors and artefacts. We investigated methods that avoid these errors and … protocol that ensured consistency in the high-throughput production. To maximize accuracy and consistency, protocols were evaluated primarily using a … future benefits, this resource may promote the development of novel prophylactic strategies such as the application of prebiotics and probiotics to …
ERIC Educational Resources Information Center
Bridgeman, Brent; Pollack, Judith; Burton, Nancy
2008-01-01
Two methods of showing the ability of high school grades (high school grade point averages) and SAT scores to predict cumulative grades in different types of college courses were evaluated in a sample of 26 colleges. Each college contributed data from three cohorts of entering freshmen, and each cohort was followed for at least four years.…
A Memory-Process Model of Symbolic Assimilation
1974-04-01
Systems: Final Report of a Study Group, published for Artificial Intelligence by North-Holland/American … contribution of the methods is answered by evaluating the same program in the context of the field of artificial intelligence. The remainder of the … been widely demonstrated on a diversity of tasks in the history of artificial intelligence. See [r.71], chapter 2. Given a particular task to be …
ERIC Educational Resources Information Center
Štofková, Katarína; Strícek, Ivan; Štofková, Jana
2014-01-01
The paper aims to evaluate the possibility of applying new methods and tools for more effective educational processes, with an emphasis on increasing their quality, especially at secondary schools and universities. There are some contributions from practice for the effective implementation of time management, such…
ERIC Educational Resources Information Center
McMillan, Whitney; Stice, Eric; Rohde, Paul
2011-01-01
Objective: As cognitive dissonance is theorized to contribute to the effects of dissonance-based eating disorder prevention programs, we evaluated a high-dissonance version of this program against a low-dissonance version and a wait-list control condition to provide an experimental test of the mechanism of intervention effects. Method: Female…
ERIC Educational Resources Information Center
Most, Tova; Harel, Tamar; Shpak, Talma; Luntz, Michal
2011-01-01
Purpose: The purpose of the study was to evaluate the contribution of acoustic hearing to the perception of suprasegmental features by adults who use a cochlear implant (CI) and a hearing aid (HA) in opposite ears. Method: 23 adults participated in this study. Perception of suprasegmental features--intonation, syllable stress, and word…
ERIC Educational Resources Information Center
Russell, Tammy L.
2016-01-01
Many student affairs departments struggle to contribute to an institution's evidence base of student learning. In part, this results from student affairs personnel not having adequate training in how to assess learning outside the classroom. This is a particular challenge for small community colleges, in which individual units (e.g., admissions or…
The Contribution of Children's Advocacy Centers to Felony Prosecutions of Child Sexual Abuse
ERIC Educational Resources Information Center
Miller, Aaron; Rubin, David
2009-01-01
Objective: To describe trends of felony sexual abuse prosecutions between 1992 and 2002 for two districts of a large urban city that differed primarily in their use of children's advocacy centers (CACs) for sexual abuse evaluations in children. Methods: Aggregate data for two districts of a large urban city were provided from 1992 to 2002 from the…
NASA Astrophysics Data System (ADS)
Ramli, Rohaini; Kasim, Maznah Mat; Ramli, Razamin; Kayat, Kalsom; Razak, Rafidah Abd
2014-12-01
The Ministry of Tourism and Culture Malaysia has long run homestay programs across the country to enhance the quality of life of people, especially those living in rural areas. This type of program is classified as community-based tourism (CBT), as it is expected to improve livelihoods economically through cultural and community-associated activities. It is the ministry's aspiration that the income imbalance between people in rural and urban areas be reduced, which would contribute towards creating more developed states of Malaysia. Since the 1970s, 154 homestay programs have been registered with the ministry. However, the performance and sustainability of the programs are still unsatisfactory; only a few homestay programs perform well and are able to sustain themselves. Thus, the aim of this paper is to identify relevant criteria contributing to the sustainability of a homestay program. The criteria are evaluated for their levels of importance via a modified pairwise method and analyzed for other potentials. The findings will help homestay operators to focus on the necessary criteria and thus perform effectively as a CBT business initiative.
Measuring physical activity environments: a brief history.
Sallis, James F
2009-04-01
Physical activity is usually done in specific types of places, referred to as physical activity environments. These often include parks, trails, fitness centers, schools, and streets. In recent years, scientific interest has increased notably in measuring physical activity environments. The present paper provides an historical overview of the contributions of the health, planning, and leisure studies fields to the development of contemporary measures. The emphasis is on attributes of the built environment that can be affected by policies to contribute to the promotion of physical activity. Researchers from health fields assessed a wide variety of built environment variables expected to be related to recreational physical activity. Settings of interest were schools, workplaces, and recreation facilities, and most early measures used direct observation methods with demonstrated inter-observer reliability. Investigators from the city planning field evaluated aspects of community design expected to be related to people's ability to walk from homes to destinations. GIS was used to assess walkability defined by the 3Ds of residential density, land-use diversity, and pedestrian-oriented designs. Evaluating measures for reliability or validity was rarely done in the planning-related fields. Researchers in the leisure studies and recreation fields studied mainly people's use of leisure time rather than physical characteristics of parks and other recreation facilities. Although few measures of physical activity environments were developed, measures of aesthetic qualities are available. Each of these fields made unique contributions to the contemporary methods used to assess physical activity environments.
Nonmechanistic forecasts of seasonal influenza with iterative one-week-ahead distributions.
Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni
2018-06-15
Accurate and reliable forecasts of seasonal epidemics of infectious disease can assist in the design of countermeasures and increase public awareness and preparedness. This article describes two main contributions we made recently toward this goal: a novel approach to probabilistic modeling of surveillance time series based on "delta densities", and an optimization scheme for combining output from multiple forecasting methods into an adaptively weighted ensemble. Delta densities describe the probability distribution of the change between one observation and the next, conditioned on available data; chaining together nonparametric estimates of these distributions yields a model for an entire trajectory. Corresponding distributional forecasts cover more observed events than alternatives that treat the whole season as a unit, and improve upon multiple evaluation metrics when extracting key targets of interest to public health officials. Adaptively weighted ensembles integrate the results of multiple forecasting methods, such as delta density, using weights that can change from situation to situation. We treat selection of optimal weightings across forecasting methods as a separate estimation task, and describe an estimation procedure based on optimizing cross-validation performance. We consider some details of the data generation process, including data revisions and holiday effects, both in the construction of these forecasting methods and when performing retrospective evaluation. The delta density method and an adaptively weighted ensemble of other forecasting methods each improve significantly on the next best ensemble component when applied separately, and achieve even better cross-validated performance when used in conjunction. We submitted real-time forecasts based on these contributions as part of CDC's 2015/2016 FluSight Collaborative Comparison. Among the fourteen submissions that season, this system was ranked by CDC as the most accurate.
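In outline, the delta-density factorization chains one-step-ahead conditionals into a distribution over whole trajectories (notation ours, consistent with the description above rather than copied from the paper):

```latex
p\!\left(y_{1:T}\right) \;=\; p\!\left(y_1\right) \prod_{t=1}^{T-1}
p\!\left(\Delta_t \,\middle|\, y_{1:t}\right),
\qquad \Delta_t = y_{t+1} - y_t ,
```

with each conditional delta density estimated nonparametrically from historical surveillance seasons.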
Methodology for the evaluation of the Stephanie Alexander Kitchen Garden program.
Gibbs, L; Staiger, P K; Townsend, M; Macfarlane, S; Gold, L; Block, K; Johnson, B; Kulas, J; Waters, E
2013-04-01
Community and school cooking and gardening programs have recently increased internationally. However, despite promising indications, there is limited evidence of their effectiveness. This paper presents the evaluation framework and methods negotiated and developed to meet the information needs of all stakeholders for the Stephanie Alexander Kitchen Garden (SAKG) program, a combined cooking and gardening program implemented in selectively funded primary schools across Australia. The evaluation used multiple aligned theoretical frameworks and models, including a public health ecological approach, principles of effective health promotion and models of experiential learning. The evaluation is a non-randomised comparison of six schools receiving the program (intervention) and six comparison schools (all government-funded primary schools) in urban and rural areas of Victoria, Australia. A mixed-methods approach was used, relying on qualitative measures to understand changes in school cultures and the experiential impacts on children, families, teachers, parents and volunteers, and quantitative measures at baseline and 1 year follow up to provide supporting information regarding patterns of change. The evaluation study design addressed the limitations of many existing evaluation studies of cooking or garden programs. The multistrand approach to the mixed methodology maintained the rigour of the respective methods and provided an opportunity to explore complexity in the findings. Limited sensitivity of some of the quantitative measures was identified, as well as the potential for bias in the coding of the open-ended questions. The SAKG evaluation methodology will address the need for appropriate evaluation approaches for school-based kitchen garden programs. It demonstrates the feasibility of a meaningful, comprehensive evaluation of school-based programs and also demonstrates the central role qualitative methods can have in a mixed-method evaluation. So what? This paper contributes to debate about appropriate evaluation approaches to meet the information needs of all stakeholders and will support the sharing of measures and potential comparisons between program outcomes for comparable population groups and settings.
Benchmarking routine psychological services: a discussion of challenges and methods.
Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick
2014-01-01
Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
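One common definition of a pre-post effect size consistent with this benchmarking use is the standardized mean change (an assumption; the paper's exact estimator may differ):

```latex
ES = \frac{\bar{x}_{\mathrm{pre}} - \bar{x}_{\mathrm{post}}}{SD_{\mathrm{pre}}} ,
```

so a service's routine outcomes on the PHQ-9 or GAD-7 can be placed directly against the high (0.91/1.02), average (0.73/0.78) and poor (0.46/0.52) benchmarks above.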
Fage-Butler, Antoinette
2013-01-01
The purpose of this paper is to present an evaluative model of patient-centredness for text and to illustrate how this can be applied to patient information leaflets (PILs) that accompany medication in the European Union. Patients have criticized PILs for sidelining their experiences, knowledge and affective needs, and denying their individuality. The health communication paradigm of patient-centredness provides valuable purchase on these issues, taking its starting point in the dignity and integrity of the patient as a person. Employing this evaluative model involves two stages. First, a Foucauldian Discourse Analysis is performed of sender and receiver and of the main discourses in PILs. These aspects are then evaluated using the perspectives of patient-centredness theory relating to the medical practitioner, patient and content. The evaluative model is illustrated via a PIL for medication for depression and panic attacks. Evaluation reveals a preponderance of biomedical statements, with a cluster of patient-centred statements primarily relating to the construction of the patient. The paper contributes a new method and evaluative approach to PIL and qualitative health research, as well as outlining a method that facilitates the investigation of interdiscursivity, a recent focus of critical genre analysis.
Seed germination test for toxicity evaluation of compost: Its roles, problems and prospects.
Luo, Yuan; Liang, Jie; Zeng, Guangming; Chen, Ming; Mo, Dan; Li, Guoxue; Zhang, Difang
2018-01-01
Compost is commonly used for the growth of plants and the remediation of environmental pollution. It is important to evaluate the quality of compost, and the seed germination test is a powerful tool for examining the toxicity of compost, the most important aspect of that quality. The test is now widely adopted, but its main problem is that the results vary with the method and seed species used, which limits its development and application. Standardizing the methods and using model seed species can contribute to solving this problem. Additionally, according to the probabilistic theory of seed germination, the error caused by the methods used to analyze and judge the test results can be reduced. Here, we review the roles, problems and prospects of the seed germination test in studies of compost. Copyright © 2017 Elsevier Ltd. All rights reserved.
Performance index and meta-optimization of a direct search optimization method
NASA Astrophysics Data System (ADS)
Krus, P.; Ölvander, J.
2013-10-01
Design optimization is becoming an increasingly important tool for design, often using simulation as part of the evaluation of the objective function. A measure of the efficiency of an optimization algorithm is of great importance when comparing methods. The main contribution of this article is the introduction of a singular performance criterion, the entropy rate index based on Shannon's information theory, taking both reliability and rate of convergence into account. It can also be used to characterize the difficulty of different optimization problems. Such a performance criterion can further be used for optimization of the optimization algorithm itself. In this article the Complex-RF optimization method is described and its performance evaluated and optimized using the established performance criterion. Finally, in order to be able to predict the resources needed for optimization, an objective function temperament factor is defined that indicates the degree of difficulty of the objective function.
Bialas, Andrzej
2010-01-01
The paper discusses the security issues of intelligent sensors that are able to measure and process data and communicate with other information technology (IT) devices or systems. Such sensors are often used in high-risk applications. To improve their robustness, sensor systems should be developed in a restricted way that provides them with assurance. One such assurance methodology is the Common Criteria (ISO/IEC 15408), used for IT products and systems. The contribution of the paper is a Common Criteria-compliant and pattern-based method for intelligent sensor security development. The paper concisely presents this method and its evaluation for a sensor detecting methane in a mine, focusing on the definition and solution of the intelligent sensor security problem. The aim of the validation is to evaluate and improve the introduced method.
Cognitive Factors Contributing to Spelling Performance in Children with Prenatal Alcohol Exposure
Glass, Leila; Graham, Diana M.; Akshoomoff, Natacha; Mattson, Sarah N.
2015-01-01
Objective Heavy prenatal alcohol exposure is associated with impaired school functioning. Spelling performance has not been comprehensively evaluated. We examined whether children with heavy prenatal alcohol exposure demonstrate deficits in spelling and related abilities, including reading, and tested whether there are unique underlying mechanisms for observed deficits in this population. Method Ninety-six school-age children comprised two groups: children with heavy prenatal alcohol exposure (AE, n=49) and control children (CON, n=47). Children completed select subtests from the WIAT-II and NEPSY-II. Group differences and relations between spelling and theoretically-related cognitive variables were evaluated using MANOVA and Pearson correlations. Hierarchical regression analyses were utilized to assess contributions of group membership and cognitive variables to spelling performance. The specificity of these deficits and underlying mechanisms was tested by examining the relations between reading ability, group membership, and cognitive variables. Results Groups differed significantly on all variables. Group membership and phonological processing significantly contributed to spelling performance. In addition, a significant group*working memory interaction revealed that working memory independently contributed significantly to spelling only for the AE group. All cognitive variables contributed to reading across groups and a group*working memory interaction revealed that working memory contributed independently to reading only for alcohol-exposed children. Conclusion Alcohol-exposed children demonstrated a unique pattern of spelling deficits. The relation of working memory to spelling and reading was specific to the AE group, suggesting that if prenatal alcohol exposure is known or suspected, working memory ability should be considered in the development and implementation of explicit instruction.
[A future image of clinical inspection from health economics].
Kakihara, Hiroaki
2006-06-01
Should medical costs be allowed to increase in proportion to the growth rate of GDP? That is the thinking of the Council on Economic and Fiscal Policy. Should some care be excluded from public medical insurance? The absolute sum is not the problem if the spending is effective. If there were no insurance and individuals paid the total amount themselves there would be no problem, but that is impossible; economic development would cease without insurance. As medical personnel, we should offer good medical care at an appropriate cost, and an appeal to the nation is necessary. Economic evaluation can identify an inexpensive method for each clinical inspection. Does medical insurance run a deficit? I. The Japanese health insurance system. (1) Health insurance unions: contribution money of 2,479,800,000,000 yen, amounting to 45% of premium income. (2) Government-managed health insurance: contribution money of 2,163,300,000,000 yen, amounting to 36% of premium income. (1) + (2): employment-based insurance total. (3) Mutual aid associations. (4) National Health Insurance. II. Clinical economic methods. III. The expense of medical care and its effect. A. Expense. B. Medical economic evaluation methods: 1. cost-effectiveness analysis (CEA); 2. cost-utility analysis (CUA); 3. cost-benefit analysis (CBA); 4. cost-minimization analysis.
NASA Astrophysics Data System (ADS)
Sugimoto, Masataka; Hasegawa, Hideyuki; Kanai, Hiroshi
2005-08-01
Endothelial dysfunction is considered to be an initial step of arteriosclerosis [R. Ross: N. Engl. J. Med. 340 (1999) 115]. For the assessment of endothelial function, brachial artery flow-mediated dilation (FMD) caused by increased blood flow has been evaluated with ultrasonic diagnostic equipment. In conventional methods, the change in artery diameter caused by FMD is measured [M. Hashimoto et al.: Circulation 92 (1995) 3431]. Although the arterial wall has a layered structure (intima, media, and adventitia), this structure is not taken into account in conventional methods because the change in diameter depends on the characteristics of the entire wall. However, the smooth muscle present only in the media contributes to FMD, whereas the collagen-rich hard adventitia does not. In this study, we measure the change in elasticity of only the intima-media region, which includes the smooth muscle, using the phased tracking method [H. Kanai et al.: IEEE Trans. Ultrason. Ferroelectr. Freq. Control 43 (1996) 791]. From the change in elasticity, FMD measured only for the intima-media region by our proposed method was found to be more sensitive than that measured for the entire wall by the conventional method.
NASA Astrophysics Data System (ADS)
Rahmani, K.; Mayer, H.
2018-05-01
In this paper we present a pipeline for high-quality semantic segmentation of building facades using a Structured Random Forest (SRF), a Region Proposal Network (RPN) based on a Convolutional Neural Network (CNN), as well as rectangular fitting optimization. Our main contribution is that we employ features created by the RPN as channels in the SRF. We empirically show that this is very effective, especially for doors and windows. Our pipeline is evaluated on two datasets where we outperform current state-of-the-art methods. Additionally, we quantify the contribution of the RPN and the rectangular fitting optimization to the accuracy of the result.
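The core idea of feeding network features to the forest amounts to stacking feature maps as extra input channels. The toy sketch below illustrates this with random arrays; the shapes and the per-pixel layout are illustrative, not the authors' SRF code.

```python
# Toy illustration: stack CNN/RPN feature maps as extra channels next to the
# image for a per-pixel classifier (synthetic shapes and data).
import numpy as np

h, w = 64, 64
image = np.random.rand(h, w, 3)          # RGB facade image (synthetic)
rpn_features = np.random.rand(h, w, 2)   # e.g. window/door objectness maps

stacked = np.concatenate([image, rpn_features], axis=-1)  # (64, 64, 5)
pixels = stacked.reshape(-1, stacked.shape[-1])           # one row per pixel
print(pixels.shape)                                       # (4096, 5)
```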
NASA Technical Reports Server (NTRS)
Wolf, M.
1979-01-01
To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.
Semiclassical evaluation of quantum fidelity
NASA Astrophysics Data System (ADS)
Vanicek, Jiri
2004-03-01
We present a numerically feasible semiclassical method to evaluate quantum fidelity (the Loschmidt echo) in a classically chaotic system. It was thought that such evaluation would be intractable, but we show instead that a uniform semiclassical expression not only is tractable but gives remarkably accurate numerical results for the standard map in both the Fermi-golden-rule and Lyapunov regimes. Because it allows a Monte Carlo evaluation, this uniform expression is accurate at times when there are 10^70 semiclassical contributions. Remarkably, the method also explicitly contains the "building blocks" of analytical theories in the recent literature, and thus permits a direct test of approximations made by other authors in these regimes, rather than an a posteriori comparison with numerical results. We explain in more detail the extended validity of the classical perturbation approximation and thus provide a "defense" of linear response theory against the famous Van Kampen objection. We point out the potential use of our uniform expression in other areas because it gives a most direct link between the quantum Feynman propagator based on the path integral and the semiclassical Van Vleck propagator based on the sum over classical trajectories. Finally, we test the applicability of our method in integrable and mixed systems.
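The quantity being evaluated is the standard fidelity/Loschmidt echo (standard definition, not a formula quoted from the abstract):

```latex
M(t) = \left| \left\langle \psi_0 \right| e^{\,i H' t/\hbar}\, e^{-\,i H t/\hbar} \left| \psi_0 \right\rangle \right|^2 ,
```

the squared overlap between a state propagated forward with the Hamiltonian $H$ and propagated back with a perturbed Hamiltonian $H'$.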
NASA Astrophysics Data System (ADS)
Shahid, Muhammad; Cong, Zhentao; Zhang, Danwu
2017-09-01
Climate change and land use change are the two main factors that can alter catchment hydrological processes. The objective of this study is to evaluate the relative contributions of climate change and land use change to the runoff change of the Soan River basin. The Mann-Kendall and Pettitt tests are used to find trends and change points in hydroclimatic variables during the period 1983-2012. Two different approaches, the abcd hydrological model and the Budyko framework, are then used to quantify the impact of climate change and land use change on streamflow. The results from both methods are consistent and show that annual runoff has significantly decreased, with a change point around 1997. The decrease in precipitation and the increase in potential evapotranspiration contribute 68% of the detected change, while the rest is due to land use change. The land use change acquired from Landsat shows that during the post-change period agriculture has increased in the Soan basin, which is in line with the positive contribution of land use change to the runoff decrease. This study concludes that the aforementioned methods performed well in quantifying the relative contributions of land use change and climate change to runoff change.
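A common Budyko-based attribution consistent with this description uses climate elasticities of runoff (a generic form, assumed here rather than quoted from the study):

```latex
\Delta Q_{\mathrm{clim}} = \varepsilon_P \frac{\bar{Q}}{\bar{P}}\, \Delta P
 + \varepsilon_{E_0} \frac{\bar{Q}}{\bar{E}_0}\, \Delta E_0,
\qquad
\Delta Q_{\mathrm{LUCC}} = \Delta Q_{\mathrm{obs}} - \Delta Q_{\mathrm{clim}} ,
```

where $\varepsilon_P$ and $\varepsilon_{E_0}$ are the elasticities of runoff to precipitation and potential evapotranspiration, and the land-use contribution is obtained as the residual of the observed change.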
NASA Astrophysics Data System (ADS)
Mizamzul Mehat, Nik; Syuhada Zakarria, Noor; Kamaruddin, Shahrul
2018-03-01
The increase in demand for industrial gears has resulted in increased usage of plastic-matrix composites, particularly glass fibre-reinforced plastics, as gear materials. These synthetic fibres are used to enhance the mechanical strength and thermal resistance of plastic gears. Nevertheless, the production of large quantities of these synthetic fibre-reinforced composites poses a serious threat to the ecosystem. In view of this, the present work investigated the effects of incorporating recycled glass fibre-reinforced plastics in various compositions, particularly on the dimensional stability and mechanical properties of gears produced under diverse injection moulding processing parameter settings. Grey relational analysis (GRA) was integrated with the Taguchi method to evaluate the influence of the recycled glass fibre-reinforced plastics and the variation in processing parameters on gear quality. From the experimental results, the blending ratio was found to be the most influential parameter, with a 56.0% contribution to both improving tensile properties and minimizing shrinkage, followed by mould temperature with a 24.1% contribution and cooling time with a 10.6% contribution. The results are expected to contribute to assessing the feasibility of using recycled glass fibre-reinforced plastics, especially for gear applications.
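For readers unfamiliar with GRA, a minimal generic computation of the grey relational grade is sketched below; the response values are hypothetical, not the experimental data of the study, and real applications mix larger-is-better and smaller-is-better criteria.

```python
# Minimal generic grey relational analysis (GRA) sketch with made-up data.
import numpy as np

def grey_relational_grade(responses, larger_is_better=True, zeta=0.5):
    """responses: (runs, criteria) array; returns one grade per run."""
    x = np.asarray(responses, dtype=float)
    # Normalize each criterion to [0, 1].
    if larger_is_better:
        norm = (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))
    else:
        norm = (x.max(axis=0) - x) / (x.max(axis=0) - x.min(axis=0))
    delta = np.abs(1.0 - norm)                    # distance to the ideal sequence
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coef.mean(axis=1)                      # grey relational grade per run

print(grey_relational_grade([[30.1, 0.8], [28.4, 1.1], [31.0, 0.6]]))
```

The run with the highest grade is the one closest to the ideal across all criteria, which is how GRA condenses a multi-response Taguchi experiment into a single ranking.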
Optic cup segmentation from fundus images for glaucoma diagnosis.
Hu, Man; Zhu, Chenghao; Li, Xiaoxing; Xu, Yongli
2017-01-02
Glaucoma is a serious disease that can cause complete, permanent blindness, and its early diagnosis is very difficult. In recent years, computer-aided screening and diagnosis of glaucoma has made considerable progress. The optic cup segmentation from fundus images is an extremely important part for the computer-aided screening and diagnosis of glaucoma. This paper presented an automatic optic cup segmentation method that used both color difference information and vessel bends information from fundus images to determine the optic cup boundary. During the implementation of this algorithm, not only were the locations of the 2 types of information points used, but also the confidences of the information points were evaluated. In this way, the information points with higher confidence levels contributed more to the determination of the final cup boundary. The proposed method was evaluated using a public database for fundus images. The experimental results demonstrated that the cup boundaries obtained by the proposed method were more consistent than existing methods with the results obtained by ophthalmologists.
Droplet Digital PCR-Based Chimerism Analysis for Primary Immunodeficiency Diseases.
Okano, Tsubasa; Tsujita, Yuki; Kanegane, Hirokazu; Mitsui-Sekinaka, Kanako; Tanita, Kay; Miyamoto, Satoshi; Yeh, Tzu-Wen; Yamashita, Motoi; Terada, Naomi; Ogura, Yumi; Takagi, Masatoshi; Imai, Kohsuke; Nonoyama, Shigeaki; Morio, Tomohiro
2018-04-01
In the current study, we aimed to accurately evaluate donor/recipient or male/female chimerism in samples from patients who underwent hematopoietic stem cell transplantation (HSCT). We designed the droplet digital polymerase chain reaction (ddPCR) for SRY and RPP30 to detect the male/female chimerism. We also developed mutation-specific ddPCR for four primary immunodeficiency diseases. The accuracy of the male/female chimerism analysis using ddPCR was confirmed by comparing the results with those of conventional methods (fluorescence in situ hybridization and short tandem repeat-PCR) and evaluating dilution assays. In particular, we found that this method was useful for analyzing small samples. Thus, this method could be used with patient samples, especially to sorted leukocyte subpopulations, during the early post-transplant period. Four mutation-specific ddPCR accurately detected post-transplant chimerism. ddPCR-based male/female chimerism analysis and mutation-specific ddPCR were useful for all HSCT, and these simple methods contribute to following the post-transplant chimerism, especially in disease-specific small leukocyte fractions.
Pitkänen, Janne; Nieminen, Marko
2017-01-01
Participation of healthcare professionals in information technology development has emerged as an important challenge. As end-users, the professionals are willing to participate in development activities, but their experiences with the current methods of participation remain mostly negative. There is a lack of applicable methods that meet the needs of the agile development approach and scale up to the largest implementation projects, while maintaining the professional users' interest in participating in development activities and preserving their ability to continue working productively. In this paper, we describe Agile Instrumented Monitoring as a methodology, based on the methods of instrumented usability evaluation, for improving user experience in HealthIT development. The contribution of the proposed methodology is analyzed in relation to the activities of the whole iteration cycle and chosen usability evaluation methods, while the user experience of participation is addressed with regard to healthcare professionals. Prospective weak and strong market tests for AIM are discussed in the conclusions as future work.
Helou, Cynthia; Jacolot, Philippe; Niquet-Léridon, Céline; Gadonna-Widehem, Pascale; Tessier, Frédéric J
2016-01-01
The aim of this study was to test the methods currently in use and to develop a new protocol for the evaluation of melanoidins in bread. Markers of the early and advanced stages of the Maillard reaction were also followed in the crumb and the crust of bread throughout baking, and in a crust model system. The crumb of the bread contained N(ε)-fructoselysine and N(ε)-carboxymethyllysine but at levels 7 and 5 times lower than the crust, respectively. 5-Hydroxymethylfurfural was detected only in the crust and its model system. The available methods for the semi-quantification of melanoidins were found to be unsuitable for their analysis in bread. Our new method based on size exclusion chromatography and fluorescence measures soluble fluorescent melanoidins in bread. These melanoidin macromolecules (1.7-5.6 kDa) were detected intact in both crust and model system. They appear to contribute to the dietary fibre in bread. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wagner, Kay Cimpl; Byrd, Gary D.
2004-01-01
Objective: This study was undertaken to determine if a systematic review of the evidence from thirty years of literature evaluating clinical medical librarian (CML) programs could help clarify the effectiveness of this outreach service model. Methods: A descriptive review of the CML literature describes the general characteristics of these services as they have been implemented, primarily in teaching-hospital settings. Comprehensive searches for CML studies using quantitative or qualitative evaluation methods were conducted in the medical, allied health, librarianship, and social sciences literature. Findings: Thirty-five studies published between 1974 and 2001 met the review criteria. Most (30) evaluated single, active programs and used descriptive research methods (e.g., use statistics or surveys/questionnaires). A weighted average of 89% of users in twelve studies found CML services useful and of high quality, and 65% of users in another overlapping, but not identical, twelve studies said these services contributed to improved patient care. Conclusions: The total amount of research evidence for CML program effectiveness is not great and most of it is descriptive rather than comparative or analytically qualitative. Standards are needed to consistently evaluate CML or informationist programs in the future. A carefully structured multiprogram study including three to five of the best current programs is needed to define the true value of these services.
Springfield, Emily; Gwozdek, Anne E; Peet, Melissa; Kerschbaum, Wendy E
2012-04-01
Program evaluation is a necessary component of curricular change and innovation. It ascertains whether an innovation has met benchmarks and contributes to the body of knowledge about educational methodologies and supports the use of evidence-based practice in teaching. Education researchers argue that rigorous program evaluation should utilize a mixed-method approach, triangulating both qualitative and quantitative methods to understand program effectiveness. This approach was used to evaluate the University of Michigan Dental Hygiene Degree Completion E-Learning (online) Program. Quantitative data included time spent on coursework, grades, publications, course evaluation results, and survey responses. Qualitative data included student and faculty responses in focus groups and on surveys as well as students' portfolio reflections. The results showed the program was academically rigorous, fostering students' ability to connect theory with practice and apply evidence-based practice principles. These results also demonstrated that the students had learned to critically reflect on their practice and develop expanded professional identities; going beyond the role of clinician, they began to see themselves as educators, advocates, and researchers. This evaluation model is easily adaptable and is applicable to any health science or other professional degree program. This study also raised important questions regarding the effect of meta-reflection on student confidence and professional behavior.
Methods for semi-automated indexing for high precision information retrieval
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.
2002-01-01
OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.
Methods for Determining Spontaneous Mutation Rates
Foster, Patricia L.
2007-01-01
Spontaneous mutations arise as a result of cellular processes that act upon or damage DNA. Accurate determination of spontaneous mutation rates can contribute to our understanding of these processes and the enzymatic pathways that deal with them. The methods that are used to calculate mutation rates are based on the model for the expansion of mutant clones originally described by Luria and Delbrück and extended by Lea and Coulson. The accurate determination of mutation rates depends on understanding the strengths and limitations of these methods and how to optimize a fluctuation assay for a given method. This chapter describes the proper design of a fluctuation assay, several of the methods used to calculate mutation rates, and ways to evaluate the results statistically. PMID:16793403
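To make the fluctuation-assay arithmetic concrete, here is a minimal Python sketch of two classical estimators from the Luria-Delbrück/Lea-Coulson framework named in the abstract: the P0 method and the Lea-Coulson median method. The function names and the assay numbers are our own illustrative assumptions; the chapter itself covers more estimators and their statistical evaluation.

    import math

    def mutation_rate_p0(cultures_without_mutants, total_cultures, cells_per_culture):
        # P0 method: the fraction of cultures with no mutants estimates exp(-m),
        # so m = -ln(P0); the rate is m per cell, i.e. m / (cells per culture).
        p0 = cultures_without_mutants / total_cultures
        return -math.log(p0) / cells_per_culture

    def mutation_rate_lea_coulson(median_mutants, cells_per_culture):
        # Lea-Coulson median estimator: solve r/m - ln(m) = 1.24 for m, where r
        # is the median mutant count. The left side decreases in m, so a
        # log-scale bisection converges.
        lo, hi = 1e-12, 1e12
        for _ in range(200):
            mid = math.sqrt(lo * hi)
            if median_mutants / mid - math.log(mid) > 1.24:
                lo = mid
            else:
                hi = mid
        return lo / cells_per_culture

    # Hypothetical assay: 11 of 20 cultures without mutants, a median of 4
    # mutants per culture, and 2e8 cells per culture.
    print(mutation_rate_p0(11, 20, 2e8))          # ~3.0e-9 per cell per generation
    print(mutation_rate_lea_coulson(4.0, 2e8))    # ~1.0e-8 per cell per generation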
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.
Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier
2010-01-28
Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.
Citizen observations contributing to flood modelling: opportunities and challenges
NASA Astrophysics Data System (ADS)
Assumpção, Thaine H.; Popescu, Ioana; Jonoski, Andreja; Solomatine, Dimitri P.
2018-02-01
Citizen contributions to science have been successfully implemented in many fields, and water resources is one of them. Through citizens, it is possible to collect data and obtain a more integrated decision-making process. Specifically, data scarcity has always been an issue in flood modelling, which has been addressed in the last decades by remote sensing and is already being discussed in the citizen science context. With this in mind, this article aims to review the literature on the topic and analyse the opportunities and challenges that lie ahead. The literature on monitoring, mapping and modelling was evaluated according to the flood-related variable citizens contributed to. Pros and cons of the collection/analysis methods were summarised. Then, pertinent publications were mapped into the flood modelling cycle, considering how citizen data properties (spatial and temporal coverage, uncertainty and volume) are related to their integration into modelling. It was clear that the number of studies in the area is rising. There are positive experiences reported in collection and analysis methods, for instance with velocity and land cover, and also where modelling is concerned, for example by using social media mining. However, matching the data properties necessary for each part of the modelling cycle with citizen-generated data is still challenging. Nevertheless, the concept that citizen contributions can be used for simulation and forecasting has been proven, and further work lies in continuing to develop and improve not only methods for collection and analysis, but certainly for integration into models as well. Finally, in view of recent automated sensors and satellite technologies, it is through studies such as the ones analysed in this article that the value of citizen contributions, complementing such technologies, is demonstrated.
NASA Astrophysics Data System (ADS)
Omrani, Mehrazin; Ruban, Véronique; Ruban, Gwenaël; Lamprea, Katerine
2017-11-01
Bulk Atmospheric Deposition (BAD), Wet Atmospheric Deposition (WAD) and Dry Atmospheric Deposition (DAD) were all measured within an urban residential area in Nantes (France) over a 9-month period (27 February - 10 December 2014). The objectives of this study were to compare two methods for measuring dry and wet atmospheric depositions in the urban environment (DAD and WAD: direct method; BAD and WAD: indirect method), and to characterize the variations and relative contributions of these depositions. Trace metals (As, Cd, Cr, Cu, Ni, Pt and V) were used to carry out this comparison and quantification. BAD was collected with two open polyethylene containers (72 × 54 × 21 cm), while WAD was collected by means of an automated rainwater collector and DAD was determined from both air measurements (recorded by an air sampler) and 7Be deposition velocities. The comparison, based on a detailed evaluation of uncertainties, showed a significant difference between the direct and indirect methods. Dry and wet depositions varied widely from one month to the next. Zn and Cu were the most abundant elements in both dry and wet depositions. The mean contribution of DAD to the bulk atmospheric deposition during this 9-month study was significant for Zn, Cu and V (about 25%) as well as for Pb (approx. 60%). For this relatively unpolluted urban residential catchment, the contribution of atmospheric deposition to the global load at the catchment outlet was low: between 10% and 20% for Zn, Cu, V and Pb, 25% for Cr and about 30% for Ni. For other urban sites exhibiting high atmospheric pollution, however, the atmospheric contribution to the global pollution load could be much greater. An accurate and representative estimation of DAD thus proves critical.
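As a rough illustration of how a dry deposition flux follows from air measurements and a deposition velocity, the sketch below multiplies a hypothetical airborne concentration by a hypothetical 7Be-derived deposition velocity and compares the result with a wet flux; all numbers are invented, and the study itself propagates detailed uncertainties that this ignores.

    # Dry deposition flux: F_dry = C_air * v_d (then integrate over time).
    c_air = 25.0       # ng/m^3, hypothetical airborne Zn concentration
    v_d = 0.5e-2       # m/s, hypothetical 7Be-derived deposition velocity
    seconds_per_month = 30 * 24 * 3600
    f_dry = c_air * v_d * seconds_per_month       # ng/m^2 per month
    f_wet = 1.0e6                                 # ng/m^2 per month, hypothetical WAD
    dad_share = f_dry / (f_dry + f_wet)           # DAD share of bulk deposition
    print(f"DAD = {f_dry:.3g} ng/m2/month, share = {dad_share:.1%}")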
Cardoso-Toset, Fernando; Luque, Inmaculada; Carrasco, Librado; Jurado-Martos, Francisco; Risalde, María Ángeles; Venteo, Ángel; Infantes-Lorenzo, José A; Bezos, Javier; Rueda, Paloma; Tapia, Istar; Gortázar, Christian; Domínguez, Lucas; Domínguez, Mercedes; Gomez-Laguna, Jaime
2017-02-01
In countries where bovine tuberculosis (bTB) is still prevalent, the contact among different animal species in extensive systems contributes to the circulation of Mycobacterium bovis (M. bovis) and other members of the Mycobacterium tuberculosis complex (MTC). Thus, free-range pigs can develop subclinical infections and may contribute to disease spread to cattle and wildlife. Serodiagnosis has been proposed as a screening tool for detecting infected pig herds; however, the value of this method for obtaining an accurate diagnosis in this species is still not clear. In this study, sensitivity (Se) and specificity (Sp) estimates of four ELISAs and a lateral flow immunochromatographic antibody assay based on different M. bovis antigens, including MPB70 and MPB83 proteins, were evaluated in naturally infected domestic free-range pigs. For this purpose, submandibular lymph nodes and blood samples from 217 pigs from both TB-infected and historically negative farms were sampled at the slaughterhouse and analysed by gross examination, histopathology, bacteriological culture and qPCR. Se and Sp estimates of the five evaluated assays ranged from 66.1% to 78% (95% CI: 52.6 to 87.7%) and from 98.9% to 100% (95% CI: 93.8 to 100%), respectively. The results of our study suggest that all the evaluated assays could be used as a first screening tool to conduct bTB surveillance in domestic pigs at the population level; however, animals from seropositive herds should later be surveyed by other methods in order to reduce false negative results. Copyright © 2016 Elsevier B.V. All rights reserved.
Suman, Arnela; Schaafsma, Frederieke G; Buchbinder, Rachelle; van Tulder, Maurits W; Anema, Johannes R
2017-09-01
Background To reduce the burden of low back pain (LBP) in the Netherlands, a multidisciplinary guideline for LBP has been implemented in Dutch primary care using a multifaceted implementation strategy targeted at health care professionals (HCPs) and patients. The current paper describes the process evaluation of the implementation among HCPs. Methods The strategy aimed to improve multidisciplinary collaboration and communication, and consisted of 7 components. This process evaluation was performed using the Linnan and Steckler framework. Data were collected using a mixed-methods approach combining quantitative and qualitative data. Results 128 HCPs participated in the implementation study, of which 96 participated in the quantitative and 21 in the qualitative evaluation. The overall dose delivered for this study was 89%, and the participants were satisfied with the strategy, particularly with the multidisciplinary approach, which contributed to the mutual understanding of each other's disciplines and perspectives. While the training sessions did not yield any new information, the strategy created awareness of the guideline and its recommendations, contributing to positively changing attitudes and aiding in improving guideline-adherent behaviour. However, many barriers to implementation still exist, including personal and practical factors, confidence, dependence and distrust issues among the HCPs, as well as policy factors (e.g. reimbursement systems). Conclusions The data presented in this paper show that the strategy used to implement the guideline in a Dutch primary care setting was feasible, especially when using a multidisciplinary approach. However, the barriers to implementation that were identified should be addressed in future implementation efforts.
Evaluation of atmospheric correction algorithms for processing SeaWiFS data
NASA Astrophysics Data System (ADS)
Ransibrahmanakul, Varis; Stumpf, Richard; Ramachandran, Sathyadev; Hughes, Kent
2005-08-01
To enable the production of the best chlorophyll products from SeaWiFS data, NOAA (Coastwatch and NOS) evaluated several atmospheric correction algorithms by comparing the satellite-derived water reflectance from each algorithm with in situ data. Gordon and Wang (1994) introduced a method to correct for Rayleigh and aerosol scattering in the atmosphere so that water reflectance may be derived from the radiance measured at the top of the atmosphere. However, since the correction assumed near-infrared scattering by the water to be negligible, an assumption that is invalid in coastal waters, the method overestimates the atmospheric contribution and consequently underestimates water reflectance in the lower wavelength bands upon extrapolation. Several improved methods to estimate the near-infrared correction exist: Siegel et al. (2000); Ruddick et al. (2000); Stumpf et al. (2002); and Stumpf et al. (2003), where an absorbing aerosol correction is also applied along with an additional 1.01% calibration adjustment for the 412 nm band. The evaluation shows that the near-infrared correction developed by Stumpf et al. (2003) results in an overall minimum error for U.S. waters. As of July 2004, NASA (SEADAS) has selected this as the default method for the atmospheric correction used to produce chlorophyll products.
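For readers unfamiliar with the scheme, the sketch below conveys the general Gordon and Wang (1994) idea in a simplified single-scattering form: estimate the aerosol reflectance in two near-infrared bands (where the water signal is assumed negligible) and extrapolate it exponentially to the visible bands, after which the water reflectance follows by subtraction. The function, its exponential form, and the numbers are illustrative assumptions, not the operational SeaWiFS processing.

    import math

    def aerosol_extrapolation(rho_a_765, rho_a_865, wavelengths_nm):
        # Spectral slope from the two NIR bands, then exponential extrapolation.
        eps = rho_a_765 / rho_a_865                  # single-scattering epsilon
        c = math.log(eps) / (865.0 - 765.0)          # per-nm slope
        return {lam: rho_a_865 * math.exp(c * (865.0 - lam)) for lam in wavelengths_nm}

    # rho_w = rho_toa - rho_rayleigh - rho_aerosol (glint and whitecaps omitted)
    rho_a = aerosol_extrapolation(0.010, 0.008, [412, 443, 490, 510, 555])
    print(rho_a)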
Non-destructive evaluation of UV pulse laser-induced damage performance of fused silica optics.
Huang, Jin; Wang, Fengrui; Liu, Hongjie; Geng, Feng; Jiang, Xiaodong; Sun, Laixi; Ye, Xin; Li, Qingzhi; Wu, Weidong; Zheng, Wanguo; Sun, Dunlu
2017-11-24
The surface laser damage performance of fused silica optics is related to the distribution of surface defects. In this study, we used chemical etching assisted by ultrasound and magnetorheological finishing to modify defect distribution in a fused silica surface, resulting in fused silica samples with different laser damage performance. Non-destructive test methods such as UV laser-induced fluorescence imaging and photo-thermal deflection were used to characterize the surface defects that contribute to the absorption of UV laser radiation. Our results indicate that the two methods can quantitatively distinguish differences in the distribution of absorptive defects in fused silica samples subjected to different post-processing steps. The percentage of fluorescence defects and the weak absorption coefficient were strongly related to the damage threshold and damage density of fused silica optics, as confirmed by the correlation curves built from statistical analysis of experimental data. The results show that non-destructive evaluation methods such as laser-induced fluorescence and photo-thermal absorption can be effectively applied to estimate the damage performance of fused silica optics at 351 nm pulse laser radiation. This indirect evaluation method is effective for laser damage performance assessment of fused silica optics prior to utilization.
Rolland, Y; Bézy-Wendling, J; Duvauferrier, R; Coatrieux, J L
1999-03-01
The aim was to demonstrate the usefulness of a model of parenchymal vascularization for evaluating texture analysis methods. Slices with thickness varying from 1 to 4 mm were reformatted from a 3D vascular model corresponding to either normal tissue perfusion or local hypervascularization. Parameters of statistical methods were measured on 16 regions of interest of 128 × 128 pixels, and mean values and standard deviations were calculated. For each parameter, the performances (discriminating power and stability) were evaluated. Among the 11 calculated statistical parameters, three (homogeneity, entropy, mean of gradients) were found to have a good discriminating power to differentiate normal perfusion from hypervascularization, but only the gradient mean was found to have good stability with respect to slice thickness. Five parameters (run percentage, run length distribution, long run emphasis, contrast, and gray level distribution) gave intermediate results. Of the remaining three, kurtosis and correlation were found to have little discriminating power, and skewness none. This 3D vascular model, which allows the generation of various examples of vascular textures, is a powerful tool for assessing the performance of texture analysis methods. It improves our knowledge of the methods and should contribute to their a priori choice when designing clinical studies.
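As an indication of how such statistical parameters are computed on a region of interest, this small sketch evaluates first-order analogues (histogram entropy, mean gradient magnitude, skewness) on a synthetic 128 × 128 ROI with NumPy; the paper's full set also includes co-occurrence and run-length measures that are not reproduced here.

    import numpy as np

    def texture_parameters(roi):
        # Histogram entropy of the gray levels.
        hist, _ = np.histogram(roi, bins=64)
        p = hist / hist.sum()
        p = p[p > 0]
        entropy = float(-np.sum(p * np.log2(p)))
        # Mean gradient magnitude.
        gy, gx = np.gradient(roi.astype(float))
        gradient_mean = float(np.mean(np.hypot(gx, gy)))
        # Skewness of the gray-level distribution.
        skewness = float(((roi - roi.mean()) ** 3).mean() / roi.std() ** 3)
        return {"entropy": entropy, "gradient_mean": gradient_mean, "skewness": skewness}

    rng = np.random.default_rng(0)
    print(texture_parameters(rng.normal(100, 10, size=(128, 128))))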
Chan, Linda; Mackintosh, Jeannie
2017-01-01
Background The National Collaborating Centre for Methods and Tools (NCCMT) offers workshops and webinars to build public health capacity for evidence-informed decision-making. Despite positive feedback for NCCMT workshops and resources, NCCMT users found key terms used in research papers difficult to understand. The Understanding Research Evidence (URE) videos use plain language, cartoon visuals, and public health examples to explain complex research concepts. The videos are posted on the NCCMT website and YouTube channel. Objective The first four videos in the URE web-based video series, which explained odds ratios (ORs), confidence intervals (CIs), clinical significance, and forest plots, were evaluated. The evaluation examined how the videos affected public health professionals’ practice. A mixed-methods approach was used to examine the delivery mode and the content of the videos. Specifically, the evaluation explored (1) whether the videos were effective at increasing knowledge on the four video topics, (2) whether public health professionals were satisfied with the videos, and (3) how public health professionals applied the knowledge gained from the videos in their work. Methods A three-part evaluation was conducted to determine the effectiveness of the first four URE videos. The evaluation included a Web-based survey, telephone interviews, and pretests and posttests, which evaluated public health professionals’ experience with the videos and how the videos affected their public health work. Participants were invited to participate in this evaluation through various open access, public health email lists, through informational flyers and posters at the Canadian Public Health Association (CPHA) conference, and through targeted recruitment to NCCMT’s network. Results In the Web-based surveys (n=46), participants achieved higher scores on the knowledge assessment questions from watching the OR (P=.04), CI (P=.04), and clinical significance (P=.05) videos but not the forest plot (P=.12) video, as compared with participants who had not watched the videos. The pretest and posttest (n=124) demonstrated that participants had a better understanding of forest plots (P<.001) and CIs (P<.001) after watching the videos. Due to small sample sizes, there were insufficient pretest and posttest data to conduct meaningful analyses on the clinical significance and OR videos. Telephone interview participants (n=18) thought the videos’ use of animation, narration, and plain language was appropriate for people with different levels of understanding and learning styles. Participants felt that by increasing their understanding of research evidence, they could develop better interventions and design evaluations to measure the impact of public health initiatives. Conclusions Overall, the results of the evaluation showed that watching the videos resulted in an increase in knowledge, and participants had an overall positive experience with the URE videos. With increased competence in using the best available evidence, professionals are empowered to contribute to decisions that can improve health outcomes of communities. PMID:28958986
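For reference, two of the video topics reduce to short calculations; this minimal sketch computes an odds ratio with its 95% confidence interval (Woolf logit method) on a hypothetical 2 × 2 table, and is not material from the videos themselves.

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        # 2x2 table: a, b = exposed cases/non-cases; c, d = unexposed cases/non-cases.
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log)
        upper = math.exp(math.log(or_) + z * se_log)
        return or_, (lower, upper)

    print(odds_ratio_ci(20, 80, 10, 90))   # OR = 2.25, 95% CI roughly (0.99, 5.09)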
Effects of Mobile Learning in Medical Education: A Counterfactual Evaluation.
Briz-Ponce, Laura; Juanes-Méndez, Juan Antonio; García-Peñalvo, Francisco José; Pereira, Anabela
2016-06-01
The aim of this research is to contribute to the general education system by providing new insights and resources. This study performs a quasi-experimental comparison at the University of Salamanca with 30 students, contrasting learning with an anatomic app against the formal traditional method conducted by a teacher. The findings of the investigation suggest that the performance of learners using mobile apps is statistically better than that of students using the traditional method. However, mobile devices should be considered an additional tool to complement the teachers' explanations, and it is necessary to overcome various barriers and challenges to adopt these pedagogical methods at the university level.
An innovative method to involve community health workers as partners in evaluation research.
Peacock, Nadine; Issel, L Michele; Townsell, Stephanie J; Chapple-McGruder, Theresa; Handler, Arden
2011-12-01
We developed a process through which community outreach workers, whose role is not typically that of a trained researcher, could actively participate in collection of qualitative evaluation data. Outreach workers for a community-based intervention project received training in qualitative research methodology and certification in research ethics. They used a Voice over Internet Protocol phone-in system to provide narrative reports about challenges faced by women they encountered in their outreach activities as well as their own experiences as outreach workers. Qualitative data contributed by outreach workers provided insights not otherwise available to the evaluation team, including details about the complex lives of underserved women at risk for poor pregnancy outcomes and the challenges and rewards of the outreach worker role. Lay health workers can be a valuable asset as part of a research team. Training in research ethics and methods can be tailored to their educational level and preferences, and their insights provide important information and perspectives that may not be accessible via other data collection methods. Challenges encountered in the dual roles of researcher and lay health worker can be addressed in training.
A Novel Computational Method to Reduce Leaky Reaction in DNA Strand Displacement.
Li, Xin; Wang, Xun; Song, Tao; Lu, Wei; Chen, Zhihua; Shi, Xiaolong
2015-01-01
The DNA strand displacement technique is widely used in DNA programming, DNA biosensors, and gene analysis. In DNA strand displacement, leaky reactions can cause DNA signals to decay and DNA signal detection to fail. The most commonly used method to avoid leakage is cleaning up after upstream leaky reactions, and it remains a challenge to develop a reliable DNA strand displacement technique with low leakage. In this work, we address the challenge by experimentally evaluating the effect of basic factors, including reaction time, ratio of reactants, and ion concentration, on the leakage in DNA strand displacement. Specifically, fluorescent probes and a hairpin-structure reporting DNA strand are designed to detect the output of DNA strand displacement, and thus can evaluate the leakage of DNA strand displacement reactions with different reaction times, ratios of reactants, and ion concentrations. From the obtained data, mathematical models for evaluating leakage are obtained by curve fitting. As a result, it is found that long incubation times, a high concentration of fuel strand, and a suitable ion concentration can weaken leaky reactions. This contributes a method to set proper reaction conditions to reduce leakage in DNA strand displacement.
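The abstract does not specify the fitted functional form, so purely as an illustration, the sketch below fits hypothetical leakage-versus-time data with a saturating exponential using SciPy; both the data and the model form are assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical leakage signal (fraction of reporter triggered) over time.
    t = np.array([0, 1, 2, 4, 8, 12, 24], dtype=float)            # hours
    leak = np.array([0.00, 0.02, 0.035, 0.06, 0.09, 0.11, 0.14])

    def saturating(t, a, k):
        # Leakage rising toward a plateau a with rate constant k.
        return a * (1.0 - np.exp(-k * t))

    (a, k), _ = curve_fit(saturating, t, leak, p0=[0.15, 0.1])
    print(f"plateau a = {a:.3f}, rate k = {k:.3f} per hour")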
Internet addiction assessment tools: dimensional structure and methodological status.
Lortie, Catherine L; Guitton, Matthieu J
2013-07-01
Excessive internet use is becoming a concern, and some have proposed that it may involve addiction. We evaluated the dimensions assessed by, and psychometric properties of, a range of questionnaires purporting to assess internet addiction. Fourteen questionnaires purporting to assess internet addiction among adolescents and adults, published between January 1993 and October 2011, were identified. Their reported dimensional structure, construct, discriminant and convergent validity and reliability were assessed, as well as the methods used to derive these. Methods used to evaluate internet addiction questionnaires varied considerably. Three dimensions of addiction predominated: compulsive use (79%), negative outcomes (86%) and salience (71%). Less common were escapism (21%), withdrawal symptoms (36%) and other dimensions. Measures of validity and reliability were found to be within normally acceptable limits. There is broad convergence among questionnaires purporting to assess internet addiction, suggesting that compulsive use, negative outcomes and salience should be covered, and the questionnaires show adequate psychometric properties. However, the methods used to evaluate the questionnaires vary widely, and possible factors contributing to excessive use, such as social motivation, do not appear to be covered. © 2013 Society for the Study of Addiction.
Grouin, Cyril; Moriceau, Véronique; Zweigenbaum, Pierre
2015-12-01
The determination of risk factors and their temporal relations in natural language patient records is a complex task which has been addressed in the i2b2/UTHealth 2014 shared task. In this context, in most systems it was broadly decomposed into two sub-tasks implemented by two components: entity detection and temporal relation determination. Task-level ("black box") evaluation is relevant for the final clinical application, whereas component-level evaluation ("glass box") is important for system development and progress monitoring. Unfortunately, because of the interaction between entity representation and temporal relation representation, glass box and black box evaluation cannot be managed straightforwardly at the same time in the setting of the i2b2/UTHealth 2014 task, making it difficult to assess reliably the relative performance and contribution of the individual components to the overall task. Our aim was to identify obstacles and propose methods to cope with this difficulty, and to illustrate them through experiments on the i2b2/UTHealth 2014 dataset. We outline several solutions to this problem and examine their requirements in terms of adequacy for component-level and task-level evaluation and of changes to the task framework. We select the solution which requires the fewest modifications to the i2b2 evaluation framework and illustrate it with our system. This system identifies risk factor mentions with a CRF system complemented by hand-designed patterns, identifies and normalizes temporal expressions through a tailored version of the Heideltime tool, and determines temporal relations of each risk factor with a One Rule classifier. Giving a fixed value to the temporal attribute in risk factor identification proved to be the simplest way to evaluate the risk factor detection component independently. This evaluation method enabled us to identify the risk factor detection component as most contributing to the false negatives and false positives of the global system. This led us to redirect further effort to this component, focusing on medication detection, with gains of 7 to 20 recall points and of 3 to 6 F-measure points depending on the corpus and evaluation. We proposed a method to achieve a clearer glass box evaluation of risk factor detection and temporal relation detection in clinical texts, which can provide an example to help system development in similar tasks. This glass box evaluation was instrumental in refocusing our efforts and obtaining substantial improvements in risk factor detection. Copyright © 2015 Elsevier Inc. All rights reserved.
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
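The agreement statistics used here are standard; as a pointer, this is a minimal sketch of a Bland-Altman computation (bias and 95% limits of agreement) on hypothetical paired lobe-perfusion percentages from two observers, not the study's data.

    import numpy as np

    def bland_altman(x, y):
        # Bias and 95% limits of agreement between two raters' measurements.
        diff = np.asarray(x, float) - np.asarray(y, float)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    obs1 = [18.2, 22.5, 19.9, 21.0, 17.4]   # observer 1, % perfusion of one lobe
    obs2 = [18.5, 22.1, 20.3, 20.8, 17.9]   # observer 2
    print(bland_altman(obs1, obs2))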
Shuttle Main Propulsion System LH2 Feed Line and Inducer Simulations
NASA Technical Reports Server (NTRS)
Dorney, Daniel J.; Rothermel, Jeffry
2002-01-01
This viewgraph presentation includes simulations of the unsteady flow field in the LH2 feed line, flow liner, backing cavity, and inducer of Shuttle engine #1. It also evaluates aerodynamic forcing functions which may contribute to the formation of the cracks observed on the flow liner slots. The presentation lists the numerical methods used, and profiles a benchmark test case.
ERIC Educational Resources Information Center
Gessner, Susann
2017-01-01
Purpose: The article enquires about how young migrants perceive and evaluate civic education in school and what expectations they have of the subject. Method: The article is based on a qualitative-oriented research work based on the Grounded Theory; surveys were made by interviews with students. Findings: The article emphasises that educational…
Modeling Wind Wave Evolution from Deep to Shallow Water
2011-09-30
validation and calibration of new model developments. WORK COMPLETED: Development of a Lumped Quadruplet Approximation (LQA). To make evaluation of the... interactions based on the WRT method. This Lumped Quadruplet Approximation (LQA) clusters (lumps) contributions to the integrations over the... total transfer rate. A procedure has been developed to test the implementation (of LQA and other reduced versions of the WRT) where 1) the non
NASA Astrophysics Data System (ADS)
Wan, X. Y.
2017-12-01
The extensive construction of reservoirs changes the hydrologic characteristics of the associated watersheds, which considerably increases the complexity of watershed flood control decisions. By evaluating the impacts of the multi-reservoir system on the flood hydrograph, it becomes possible to improve the effectiveness of flood control decisions. In this paper we compare the non-reservoir flood hydrograph with the actual observed flood hydrograph, using the Lutaizi section upstream of the Huai River in East China as a representative case, where 20 large reservoirs have been built. Based on the total impact of the multi-reservoir system, a novel strategy, namely the reservoir successively added (RSA) method, is presented to evaluate the contribution of each reservoir to the total impact. According to each reservoir's contribution, the "highly effective" reservoirs for watershed flood control are identified via hierarchical clustering. Moreover, we further estimate the degree of impact of the reservoirs' current operation rules on the flood hydrograph, beyond the impact of the dams themselves. As a result, we find that the RSA method provides a useful approach for the analysis of multi-reservoir systems by partitioning the contribution of each reservoir to the total impact on flooding at the downstream section. For all the historical large floods examined, the multi-reservoir system in the Huai River watershed has a significant impact on flooding at the downstream Lutaizi section, on average reducing the flood volume and peak discharge by 13.92 × 10⁸ m³ and 18.7%, respectively. It is more informative to evaluate the maximum impact of each reservoir (on flooding at the downstream section) than to examine the average impact. Each reservoir has a different impact on the flood hydrograph at the Lutaizi section. In particular, the Meishan, Xianghongdian, Suyahu, Nanwan, Nianyushan and Foziling reservoirs exert a strong influence on the flood hydrograph, and are therefore important for flood control on the Huai River. Under the current operation rules, the volume and peak discharge of flooding at the Lutaizi section are reduced by 13.69 × 10⁸ m³ and 1429 m³/s respectively, accounting for 98% and 80.5% of the real reduction respectively.
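A schematic of the RSA idea, assuming the watershed routing model is available as a callable, might look like the following; the toy routing function and its peak-clipping values are invented for illustration and stand in for the paper's hydrologic model.

    # Reservoir-successively-added (RSA) sketch: add reservoirs one at a time and
    # attribute the incremental peak reduction at the downstream section to the
    # reservoir just added. `route(active)` stands in for the routing model and
    # returns the peak discharge at the downstream section (m^3/s).
    def rsa_contributions(reservoirs, route):
        contributions = {}
        active = []
        previous_peak = route(active)        # non-reservoir flood peak
        for r in reservoirs:                 # the addition order matters
            active.append(r)
            peak = route(active)
            contributions[r] = previous_peak - peak
            previous_peak = peak
        return contributions

    # Toy routing: each reservoir clips a fixed share of the peak (illustrative only).
    clip = {"Meishan": 400.0, "Xianghongdian": 350.0, "Suyahu": 300.0}
    toy_route = lambda active: 10000.0 - sum(clip[r] for r in active)
    print(rsa_contributions(list(clip), toy_route))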
Research of Hubs Location Method for Weighted Brain Network Based on NoS-FA.
Weng, Zhengkui; Wang, Bin; Xue, Jie; Yang, Baojie; Liu, Hui; Xiong, Xin
2017-01-01
The human brain is a complex network of many interlinked regions, and some central hub regions play key roles in the structural brain network reconstructed from T1 and diffusion tensor imaging (DTI) data. Since most studies of hub location methods in the whole human brain network are mainly concerned with the local properties of each single node rather than the global properties of all directly connected nodes, a novel hub location method based on a global importance contribution evaluation index is proposed in this study. The number of streamlines (NoS) is fused with normalized fractional anisotropy (FA) for more comprehensive brain bioinformation. A brain region importance contribution matrix and an information transfer efficiency value are constructed, and by combining these two factors we calculate the importance value of each node and locate the hubs. Benefiting from both local and global features of the nodes and the multi-information fusion of human brain biosignals, the experimental results show that this method can detect brain hubs more accurately and reasonably than other methods. Furthermore, the proposed location method is applied to the analysis of impaired hub connectivity in schizophrenia patients, and the results are in agreement with previous studies.
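The exact index is defined in the paper; the sketch below only conveys the underlying idea of scoring each node from its own weighted strength together with the strengths of its directly connected neighbours, on a connectivity matrix that could hold NoS fused with normalized FA. The weighting and the toy matrix are illustrative assumptions.

    import numpy as np

    def hub_scores(W, alpha=0.5):
        # W: symmetric weighted connectivity matrix (e.g., NoS fused with FA).
        strength = W.sum(axis=1)                                  # local property
        neighbour = W @ strength / np.maximum(strength, 1e-12)    # neighbours' pull
        score = alpha * strength + (1 - alpha) * neighbour
        return score / score.max()                                # normalized importance

    W = np.array([[0, 2, 1],
                  [2, 0, 3],
                  [1, 3, 0]], dtype=float)
    print(hub_scores(W))   # the highest score marks the strongest hub candidate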
Conversational evidence in therapeutic dialogue.
Strong, Tom; Busch, Robbie; Couture, Shari
2008-07-01
Family therapists' participation in therapeutic dialogue with clients is typically informed by evidence of how such dialogue is developing. In this article, we propose that conversational evidence, the kind that can be empirically analyzed using discourse analyses, be considered a contribution to widening psychotherapy's evidence base. After some preliminaries about what we mean by conversational evidence, we provide a genealogy of evaluative practice in psychotherapy, and examine qualitative evaluation methods for their theoretical compatibilities with social constructionist approaches to family therapy. We then move on to examine the notion of accomplishment in therapeutic dialogue given how such accomplishments can be evaluated using conversation analysis. We conclude by considering a number of research and pedagogical implications we associate with conversational evidence.
Quantitative methods in assessment of neurologic function.
Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J
1981-01-01
Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.
Practice-centred evaluation and the privileging of care in health information technology evaluation
2014-01-01
Background Electronic Patient Records (EPRs) and telemedicine are positioned by policymakers as health information technologies that are integral to achieving improved clinical outcomes and efficiency savings. However, evaluating the extent to which these aims are met poses distinct evaluation challenges, particularly where clinical and cost outcomes form the sole focus of evaluation design. We propose that a practice-centred approach to evaluation, in which those whose day-to-day care practice is altered (or not) by the introduction of new technologies are placed at the centre of evaluation efforts, can complement, and in some instances offer advantages over, outcome-centric evaluation models. Methods We carried out a regional programme of innovation in renal services where a participative approach was taken to the introduction of new technologies, including a regional EPR system and a system to support video clinics. An ‘action learning’ approach was taken to procurement, pre-implementation planning, implementation, ongoing development and evaluation. Participants included clinicians, technology specialists, patients and external academic researchers. Whilst undergoing these activities we asked: how can a practice-centred approach be embedded into evaluation of health information technologies? Discussion Organising EPR and telemedicine evaluation around predetermined outcome measures alone can be impractical given the complex and contingent nature of such projects. It also limits the extent to which unforeseen outcomes and new capabilities are recognised. Such evaluations often fail to improve understanding of ‘when’ and ‘under what conditions’ technology-enabled service improvements are realised, and crucially, how such innovation improves care. Summary Our contribution, drawn from our experience of the case study provided, is a protocol for practice-centred, participative evaluation of technology in the clinical setting that privileges care. In this context ‘practice-centred’ evaluation acts as a scalable, coordinating framework for evaluation that recognises health information technology-supported care as an achievement that is contingent and ongoing. We argue that if complex programmes of technology-enabled service innovation are understood in terms of their contribution to patient care and supported by participative, capability-building evaluation methodologies, conditions are created for practitioners and patients to realise the potential of technologies and make substantive contributions to the evidence base underpinning health innovation programmes. PMID:24903604
Wang, Lifang
2017-01-01
University scientific research ability is an important indicator of the strength of universities. In this paper, the evaluation of university scientific research ability is investigated based on the output of sci-tech papers. Four university alliances, from North America, the UK, Australia, and China, are selected as the case study for university scientific research evaluation. Data from Thomson Reuters InCites were collected to support the evaluation. The work contributes a new framework for the evaluation of university scientific research ability. First, we established a hierarchical structure to show the factors that impact the evaluation of university scientific research ability. Then, a new MCDM method called the D-AHP model is used to implement the evaluation and ranking of the different university alliances, in which a data-driven approach is proposed to automatically generate the D numbers preference relations. Next, a sensitivity analysis is given to show the impact of the weights of factors and sub-factors on the evaluation result. At last, the results obtained by different methods are compared and discussed to verify the effectiveness and reasonability of this study, and some suggestions are given to promote China’s scientific research ability. PMID:28212446
Health system reform and the role of field sites based upon demographic and health surveillance.
Tollman, S. M.; Zwi, A. B.
2000-01-01
Field sites for demographic and health surveillance have made well-recognized contributions to the evaluation of new or untested interventions, largely through efficacy trials involving new technologies or the delivery of selected services, e.g. vaccines, oral rehydration therapy and alternative contraceptive methods. Their role in health system reform, whether national or international, has, however, proved considerably more limited. The present article explores the characteristics and defining features of such field sites in low-income and middle-income countries and argues that many currently active sites have a largely untapped potential for contributing substantially to national and subnational health development. Since the populations covered by these sites often correspond with the boundaries of districts or subdistricts, the strategic use of information generated by demographic surveillance can inform the decentralization efforts of national and provincial health authorities. Among the areas of particular importance are the following: making population-based information available and providing an information resource; evaluating programmes and interventions; and developing applications to policy and practice. The question is posed as to whether their potential contribution to health system reform justifies arguing for adaptations to these field sites and expanded investment in them. PMID:10686747
Spatial Map of Synthesized Criteria for the Redundancy Resolution of Human Arm Movements.
Li, Zhi; Milutinovic, Dejan; Rosen, Jacob
2015-11-01
The kinematic redundancy of the human arm enables the elbow position to rotate about the axis going through the shoulder and wrist, which results in infinite possible arm postures when the arm reaches to a target in a 3-D workspace. To infer the control strategy the human motor system uses to resolve redundancy in reaching movements, this paper compares five redundancy resolution criteria and evaluates their arm posture prediction performance using data on healthy human motion. Two synthesized criteria are developed to provide better real-time arm posture prediction than the five individual criteria. Of these two, the criterion synthesized using an exponential method predicts the arm posture more accurately than that using a least squares approach, and therefore is preferable for inferring the contributions of the individual criteria to motor control during reaching movements. As a methodology contribution, this paper proposes a framework to compare and evaluate redundancy resolution criteria for arm motion control. A cluster analysis which associates criterion contributions with regions of the workspace provides a guideline for designing a real-time motion control system applicable to upper-limb exoskeletons for stroke rehabilitation.
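The precise synthesized criteria are specified in the paper; as an illustration of a generic exponential-weighting combination, a hedged sketch might merge per-criterion posture costs as follows, with the weights and cost values invented for the example.

    import numpy as np

    def synthesized_posture_cost(costs, weights):
        # costs: (n_criteria, n_candidate_postures) normalized cost array.
        # Exponential weighting turns the weighted sum into a multiplicative
        # penalty; the posture minimizing the combined cost is preferred.
        costs = np.asarray(costs, dtype=float)
        w = np.asarray(weights, dtype=float)
        combined = np.exp(costs * w[:, None]).prod(axis=0)
        return int(np.argmin(combined)), combined

    costs = np.array([[0.2, 0.5, 0.9],    # criterion 1 across 3 elbow swivel angles
                      [0.7, 0.3, 0.4]])   # criterion 2
    best, _ = synthesized_posture_cost(costs, [1.0, 2.0])
    print("preferred posture index:", best)   # -> 1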
Asymptotic modal analysis and statistical energy analysis
NASA Technical Reports Server (NTRS)
Dowell, Earl H.
1992-01-01
Asymptotic Modal Analysis (AMA) is a method which is used to model linear dynamical systems with many participating modes. The AMA method was originally developed to show the relationship between statistical energy analysis (SEA) and classical modal analysis (CMA). In the limit of a large number of modes of a vibrating system, the classical modal analysis result can be shown to be equivalent to the statistical energy analysis result. As the CMA result evolves into the SEA result, a number of systematic assumptions are made. Most of these assumptions are based upon the supposition that the number of modes approaches infinity. It is for this reason that the term 'asymptotic' is used. AMA is the asymptotic result of taking the limit of CMA as the number of modes approaches infinity. AMA refers to any of the intermediate results between CMA and SEA, as well as the SEA result which is derived from CMA. The main advantage of the AMA method is that individual modal characteristics are not required in the model or computations. By contrast, CMA requires that each modal parameter be evaluated at each frequency. In the latter, contributions from each mode are computed and the final answer is obtained by summing over all the modes in the particular band of interest. AMA evaluates modal parameters only at their center frequency and does not sum the individual contributions from each mode in order to obtain a final result. The method is similar to SEA in this respect. However, SEA is only capable of obtaining spatial averages or means, as it is a statistical method. Since AMA is systematically derived from CMA, it can obtain local spatial information as well.
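Schematically, and in assumed notation (modal amplitudes \(q_n\), mode shapes \(\phi_n(x)\), \(N\) participating modes), the transition the abstract describes can be written as

    \[
    \langle y^2(x) \rangle \;=\; \sum_{n=1}^{N} |q_n|^2\,\phi_n^2(x)
    \quad\longrightarrow\quad
    \langle y^2(x) \rangle \;\approx\; N\,\overline{|q|^2}\,\overline{\phi^2}(x)
    \qquad (N \to \infty),
    \]

where the left-hand sum is the CMA result requiring every modal parameter, and the right-hand estimate is the AMA form built from parameters evaluated at the band's center frequency; keeping a local factor such as \(\overline{\phi^2}(x)\) is what lets AMA retain spatial information that SEA's averages discard.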
Analyzing the contributions of a government-commissioned research project: a case study
2014-01-01
Background It often remains unclear to investigators how their research contributes to the work of the commissioner. We initiated the ‘Risk Model’ case study to gain insight into how a Dutch National Institute for Public Health and the Environment (RIVM) project and its knowledge products contribute to the commissioner’s work, the commissioner being the Health Care Inspectorate. We aimed to identify the alignment efforts that influenced the research project contributions. Based on the literature, we expected interaction between investigators and key users to be the most determining factor for the contributions of a research project. Methods In this qualitative case study, we analyzed the alignment efforts and contributions in the Risk Model project by means of document analysis and interviews according to the evaluation method Contribution Mapping. Furthermore, a map of the research process was drafted and a feedback session was organized. After the feedback session with stakeholders discussing the findings, we completed the case study report. Results Both organizations had divergent views on the ownership of the research product and the relationship between RIVM and the Inspectorate, which resulted in different expectations. The RIVM considered the use of the risk models to be problematic, but the inspectors had a positive opinion about its contributions. Investigators, inspectors, and managers were not aware of these remarkably different perceptions. In this research project, we identified six relevant categories of both horizontal alignment efforts (between investigators and key users) as well as vertical alignment efforts (within own organization) that influenced the contributions to the Inspectorate’s work. Conclusions Relevant alignment efforts influencing the contributions of the project became manifest at three levels: the first level directly relates to the project, the second to the organizational environment, and the third to the formal and historical relationship between the organizations. Both external and internal alignments influence the contributions of a research project. Based on the findings, we recommend that research institutes invest in a reflective attitude towards the social aspects of research projects at all levels of the organization and develop alignment strategies to enhance the contributions of research. PMID:24498894
Thermal conductivity of an imperfect anharmonic crystal
NASA Astrophysics Data System (ADS)
Sahu, D. N.; Sharma, P. K.
1983-09-01
The thermal conductivity of an anharmonic crystal containing randomly distributed substitutional defects due to impurity-phonon scattering is theoretically investigated with the use of the method of double-time thermal Green's functions and the Kubo formalism considering all the terms, i.e., diagonal, nondiagonal, cubic anharmonic, and imperfection terms in the energy-flux operator as propounded by Hardy. The study uses cubic, quartic anharmonic, and defect terms in the Hamiltonian. Mass changes as well as force-constant changes between impurity and host-lattice atoms are taken into account explicitly. It is shown that the total conductivity can be written as a sum of contributions, namely diagonal, nondiagonal, anharmonic, and imperfection contributions. For phonons of small halfwidth, the diagonal contribution has precisely the same form which is obtained from Boltzmann's transport equation for impurity scattering in the relaxation-time approximation. The present study shows that there is a finite contribution of the nondiagonal term, cubic anharmonic term, and the term due to lattice imperfections in the energy-flux operator to the thermal conductivity although the contribution is small compared with that from the diagonal part. We have also discussed the feasibility of numerical evaluation of the various contributions to the thermal conductivity.
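In compact form, and with assumed subscripts for the four contributions named in the abstract, the decomposition of the total lattice thermal conductivity reads

    \[
    \kappa \;=\; \kappa_{\mathrm{diag}} \;+\; \kappa_{\mathrm{nondiag}} \;+\; \kappa_{\mathrm{anh}} \;+\; \kappa_{\mathrm{imp}},
    \]

with the diagonal term dominant and the nondiagonal, cubic anharmonic, and imperfection terms finite but small.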
Solav, Dana; Meric, Henri; Rubin, M B; Pradon, Didier; Lofaso, Frédéric; Wolf, Alon
2017-08-01
Optoelectronic plethysmography (OEP) is a noninvasive method for assessing lung volume variations and the contributions of different anatomical compartments of the chest wall (CW) through measurements of the motion of markers attached to the CW surface. The present study proposes a new method for analyzing the local CW kinematics from OEP measurements based on the kinematics of triangular Cosserat point elements (TCPEs). 52 reflective markers were placed on the anterior CW to create a mesh of 78 triangles according to an anatomical model. Each triangle was characterized by a TCPE and its kinematics was described using four time-variant scalar TCPE parameters. The total CW volume and the contributions of its six compartments were also estimated, using the same markers. The method was evaluated using measurements of ten healthy subjects, nine patients with Pompe disease, and ten patients with Duchenne muscular dystrophy (DMD), during spontaneous breathing (SB) and vital capacity maneuvers (VC) in the supine position. TCPE parameters and compartmental volumes were compared with the total CW volume by computing the phase angles between them (for SB) and the correlation r between them (for VC). Analysis of the phase angle and correlation of the outward translation parameter of each TCPE revealed that for healthy subjects it provided results similar to those obtained from compartmental volumes, whereas for the neuromuscular patients the TCPE method was capable of detecting local asynchronous and paradoxical movements even in cases where they were not distinguishable from volumes. Therefore, the TCPE approach provides additional information to OEP that may enhance its clinical evaluation capabilities.
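As a geometric illustration of one such scalar parameter, the sketch below computes the normal-direction (outward) translation of a marker triangle between a reference configuration and a later frame; this is a simplified reading of the idea, not the published TCPE formulation, and the marker coordinates are invented.

    import numpy as np

    def outward_translation(tri_ref, tri_now):
        # Unit normal of the reference triangle...
        tri_ref = np.asarray(tri_ref, dtype=float)
        tri_now = np.asarray(tri_now, dtype=float)
        n = np.cross(tri_ref[1] - tri_ref[0], tri_ref[2] - tri_ref[0])
        n /= np.linalg.norm(n)
        # ...and the normal component of the centroid displacement.
        d = tri_now.mean(axis=0) - tri_ref.mean(axis=0)
        return float(d @ n)

    ref = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]           # reference marker triangle
    now = [[0, 0, 0.40], [1, 0, 0.50], [0, 1, 0.45]]  # same triangle at time t
    print(outward_translation(ref, now))              # 0.45: outward chest motion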
Evaluation of a scattering correction method for high energy tomography
NASA Astrophysics Data System (ADS)
Tisseur, David; Bhatia, Navnina; Estre, Nicolas; Berge, Léonie; Eck, Daniel; Payan, Emmanuel
2018-01-01
One of the main drawbacks of Cone Beam Computed Tomography (CBCT) is the contribution of photons scattered by the object and the detector. Scattered photons are deflected from their original path after interacting with the object. This additional contribution results in increased measured intensities, since the scattered intensity simply adds to the transmitted intensity. The effect appears as an overestimation of the measured intensity, and thus an underestimation of absorption, which produces artifacts such as cupping, shading, and streaks in the reconstructed images. Moreover, the scattered radiation biases quantitative tomographic reconstruction (for example, atomic number and mass density measurements with the dual-energy technique). The effect can be significant, and difficult to correct, in the MeV energy range for large objects, owing to the higher Scatter-to-Primary Ratio (SPR). Additionally, incident high-energy photons scattered by the Compton effect are more forward-directed and hence more likely to reach the detector, and in the MeV range the contribution of photons produced by pair production and the Bremsstrahlung process also becomes important. We propose an evaluation of a scattering correction technique based on the Scatter Kernel Superposition (SKS) method. The algorithm uses continuously thickness-adapted kernels: analytical parameterizations of the scatter kernels are derived in terms of material thickness, to form continuously thickness-adapted kernel maps used to correct the projections. This approach has proved efficient in producing better sampling of the kernels with respect to object thickness and is applicable over a wide range of imaging conditions. Moreover, since no extra hardware is required, the approach is particularly advantageous where experimental complexity must be avoided. It has previously been tested successfully in the energy range of 100 keV - 6 MeV. In this paper, the kernels are simulated using MCNP in order to take into account both photon and electron processes in the scattered-radiation contribution. We present scatter correction results on a large object scanned with a 9 MeV linear accelerator.
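To make the kernel-superposition idea concrete, here is a minimal single-pass sketch in Python: a thickness map is estimated from each projection, pixels are binned by thickness, and a per-bin kernel is convolved and superposed to form the scatter estimate that is subtracted from the projection. The Gaussian kernel shape, the attenuation coefficient mu, and the amplitude/width parameterizations are illustrative assumptions, not the analytical parameterizations derived by the authors.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def estimate_thickness(projection, mu=0.04):
    # Crude thickness estimate from the attenuation line integral (illustrative).
    return -np.log(np.clip(projection, 1e-6, None)) / mu

def sks_correct(projection, mu=0.04, n_bins=8, amp=0.05, width=5.0):
    """One pass of a thickness-adapted scatter-kernel correction (sketch).

    Each thickness bin gets its own kernel amplitude and width; the scatter
    estimates from all bins are superposed and subtracted from the projection.
    """
    t = estimate_thickness(projection, mu)
    bins = np.linspace(t.min(), t.max(), n_bins + 1)
    scatter = np.zeros_like(projection)
    for i in range(n_bins):
        mask = (t >= bins[i]) & (t <= bins[i + 1])
        t_mid = 0.5 * (bins[i] + bins[i + 1])
        # Hypothetical monotone parameterization: thicker material scatters more.
        kernel_amp = amp * t_mid
        kernel_sigma = width * (1.0 + 0.1 * t_mid)
        scatter += kernel_amp * gaussian_filter(projection * mask, kernel_sigma)
    return projection - scatter  # scatter-corrected primary estimate
```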
White, Paul A; Johnson, George E
2016-05-01
Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the relationships between genetic damage and disease, and the concomitant ability to use genetic toxicity results per se. © Her Majesty the Queen in Right of Canada 2016. Reproduced with the permission of the Minister of Health.
Dufour, Suzie; Atchia, Yaaseen; Gad, Raanan; Ringuette, Dene; Sigal, Iliya; Levi, Ofer
2013-01-01
Loss of integrity of the blood-brain barrier (BBB) can contribute to the development of many brain disorders. We evaluate laser speckle contrast imaging (LSCI) as an intrinsic modality for monitoring BBB disruptions, using simultaneous fluorescence imaging and LSCI with vertical-cavity surface-emitting lasers (VCSELs). We demonstrated that drug-induced BBB opening was associated with a relative change in arterial and venous blood velocities. The cross-sectional flow velocity ratio (veins/arteries) decreased significantly in rats treated with BBB-opening drugs, to ≤0.81 of initial values. PMID:24156049
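The LSCI signal itself is straightforward to compute: the local speckle contrast is the ratio of the standard deviation to the mean intensity within a small sliding window, and 1/K^2 is a commonly used relative flow index (lower contrast means more motion blur of the speckle pattern, i.e., faster flow). The sketch below is a generic spatial-contrast implementation, not the authors' processing pipeline.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    """Spatial speckle contrast K = sigma/mean over a sliding window."""
    img = raw.astype(float)
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img ** 2, window)
    var = np.clip(mean_sq - mean ** 2, 0, None)
    return np.sqrt(var) / np.clip(mean, 1e-9, None)

# Example on a synthetic frame: 1/K^2 serves as a relative flow index.
frame = np.random.poisson(100, size=(256, 256))
K = speckle_contrast(frame)
flow_index = 1.0 / np.clip(K, 1e-9, None) ** 2
```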
[Development of a program theory as a basis for the evaluation of a dementia special care unit].
Adlbrecht, Laura; Bartholomeyczik, Sabine; Mayer, Hanna
2018-06-01
Background: An existing dementia special care unit was to be evaluated. In order to build a sound foundation for the evaluation, a deep theoretical understanding of the implemented intervention is needed; this understanding had not yet been made explicit. One possibility to achieve this is the development of a program theory. Aim: The aim is to present a method for developing a program theory for the existing living and care concept of the dementia special care unit, which is used in a larger project for a theory-driven evaluation of the concept. Method: The evaluation is embedded in the framework of van Belle et al. (2010), and an action model and a change model (Chen, 2015) are created. For the specification of the change model, contribution analysis (Mayne, 2011) is applied. Data were collected in workshops with the developers and the nurses of the dementia special care unit, and a literature search concerning interventions and outcomes was carried out. The results were synthesized in a consensus workshop. Results: The action model describes the interventions of the dementia special care unit, the implementers, the organization, and the context. The change model comprises the mechanisms through which the interventions achieve outcomes. Conclusions: The results of the program theory can be employed to choose data collection methods and instruments for the evaluation. On the basis of the results of the evaluation, the program theory can be refined and adapted.
Perrino, Cinzia; Marcovecchio, Francesca
2016-02-01
Primary Biologic Atmospheric Particles (PBAPs) constitute an interesting and poorly investigated component of the atmospheric aerosol. We have developed and validated a method for evaluating the contribution of overall PBAPs to the mass concentration of atmospheric particulate matter (PM). The method is based on PM sampling on polycarbonate filters, staining of the collected particles with propidium iodide, observation under an epifluorescence microscope, and calculation of the bioaerosol mass using digital image analysis software. The method has also been adapted to the observation and quantification of size-segregated aerosol samples collected by multi-stage impactors. Each step of the procedure has been individually validated. The relative repeatability of the method, calculated on 10 pairs of atmospheric PM samples collected side-by-side, was 16%. The method has been applied to real atmospheric samples collected in the vicinity of Rome, Italy. Size distribution measurements revealed that PBAP mass was mainly in the coarse fraction of PM, with maxima in the range 5.6-10 μm. 24-h samples collected during different periods of the year showed that the concentration of bioaerosol was in the range 0.18-5.3 μg m⁻³ (N=20), with a contribution to the organic matter in PM10 in the range 0.5-31% and to the total mass concentration of PM10 in the range 0.3-18%. The possibility of determining the concentration of total PBAPs in PM opens up interesting perspectives for studying the health effects of these components and for increasing our knowledge of the composition of the organic fraction of the atmospheric aerosol. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Martin, A. M.; Barnes, M. H.; Chambers, L. H.; Pippin, M. R.
2011-12-01
As part of NASA's Minority University Research and Education Program (MUREP), the NASA Innovations in Climate Education (NICE) project at Langley Research Center has funded 71 climate education initiatives since 2008. The funded initiatives span the nation and contribute to the development of a climate-literate public and the preparation of a climate-related STEM workforce through research experiences, professional development opportunities, development of data access and modeling tools, and educational opportunities in both K-12 and higher education. Each of the funded projects proposes and carries out its own evaluation plan, in collaboration with external or internal evaluation experts. Using this portfolio as an exemplar case, NICE has undertaken a systematic meta-evaluation of these plans, focused primarily on evaluation questions, approaches, and methods. This meta-evaluation study seeks to understand the range of evaluations represented in the NICE portfolio, including descriptive information (what evaluations, questions, designs, approaches, and methods are applied?) and questions of value (do these evaluations meet the needs of projects and their staff, and of NASA/NICE?). In the current climate, as federal funders of climate change and STEM education projects seek to better understand and incorporate evaluation into their decisions, evaluators and project leaders are also seeking to build a robust understanding of program effectiveness. Meta-evaluations like this provide some baseline understanding of the status quo and the kinds of evaluations carried out within such funding portfolios. These explorations are needed to understand the common ground between evaluative best practices, limited resources, and agencies' desires, capacity, and requirements. When NASA asks for evaluation of funded projects, what happens? Which questions are asked and answered, using which tools? To what extent do the evaluations meet the needs of projects and program officers? How do they contribute to best practices in climate science education? These questions are important to ask about STEM and climate literacy work more generally; the NICE portfolio provides a broad test case for thinking strategically, critically, and progressively about evaluation in our community. Our findings can inform the STEM education, communication, and public outreach communities, and prompt us to consider a broad range of informative evaluation options. During this presentation, we will consider the breadth, depth and utility of evaluations conducted through a NASA climate education funding opportunity. We will examine the relationship between what we want to know about education programs, what we want to achieve with our interventions, and what we ask in our evaluations.
Schwarzman, Megan R.; Ackerman, Janet M.; Dairkee, Shanaz H.; Fenton, Suzanne E.; Johnson, Dale; Navarro, Kathleen M.; Osborne, Gwendolyn; Rudel, Ruthann A.; Solomon, Gina M.; Zeise, Lauren; Janssen, Sarah
2015-01-01
Background Current approaches to chemical screening, prioritization, and assessment are being reenvisioned, driven by innovations in chemical safety testing, new chemical regulations, and demand for information on human and environmental impacts of chemicals. To conceptualize these changes through the lens of a prevalent disease, the Breast Cancer and Chemicals Policy project convened an interdisciplinary expert panel to investigate methods for identifying chemicals that may increase breast cancer risk. Methods Based on a review of current evidence, the panel identified key biological processes whose perturbation may alter breast cancer risk. We identified corresponding assays to develop the Hazard Identification Approach for Breast Carcinogens (HIA-BC), a method for detecting chemicals that may raise breast cancer risk. Finally, we conducted a literature-based pilot test of the HIA-BC. Results The HIA-BC identifies assays capable of detecting alterations to biological processes relevant to breast cancer, including cellular and molecular events, tissue changes, and factors that alter susceptibility. In the pilot test of the HIA-BC, chemicals associated with breast cancer all demonstrated genotoxic or endocrine activity, but not necessarily both. Significant data gaps persist. Conclusions This approach could inform the development of toxicity testing that targets mechanisms relevant to breast cancer, providing a basis for identifying safer chemicals. The study identified important end points not currently evaluated by federal testing programs, including altered mammary gland development, Her2 activation, progesterone receptor activity, prolactin effects, and aspects of estrogen receptor β activity. This approach could be extended to identify the biological processes and screening methods relevant for other common diseases. Citation Schwarzman MR, Ackerman JM, Dairkee SH, Fenton SE, Johnson D, Navarro KM, Osborne G, Rudel RA, Solomon GM, Zeise L, Janssen S. 2015. Screening for chemical contributions to breast cancer risk: a case study for chemical safety evaluation. Environ Health Perspect 123:1255–1264; http://dx.doi.org/10.1289/ehp.1408337 PMID:26032647
Assessment of masticatory performance by means of a color-changeable chewing gum.
Tarkowska, Agnieszka; Katzer, Lukasz; Ahlers, Marcus Oliver
2017-01-01
Previous research has established the relevance of masticatory performance to nutritional status, cognitive function, and stress management. In addition, the measurement of masticatory efficiency contributes to the evaluation of therapeutic success within the stomatognathic system. However, the question remains as to what extent modern techniques are able to reproduce the subtle differences in masticatory efficiency among various patient groups. The purpose of this review is to provide an extensive summary of the evaluation of masticatory performance by means of a color-changeable chewing gum with regard to its clinical relevance and applicability. A general overview describing the various methods available for this task has already been published. This review focuses in depth on the research findings available on the technique of measuring masticatory performance by means of color-changeable chewing gum. Described are the mechanism and differentiability of the color change and the methods used to evaluate the color changes. Subsequently, research on masticatory performance is reviewed with regard to patient age groups, the impact of general diseases, and the effects of prosthetic and surgical treatment. The studies indicate that color-changeable chewing gum is a valid and reliable method for the evaluation of masticatory function. Alongside other methods, in clinical practice this technique can enhance dental diagnostics as well as the assessment of therapy outcomes. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Liu, Zhe; Geng, Yong; Zhang, Pan; Dong, Huijuan; Liu, Zuoxi
2014-09-01
In China, local governments in many areas give priority to the development of heavy industrial clusters in pursuit of high gross domestic product (GDP) growth as a political achievement, which usually comes at a high cost in ecological degradation and environmental pollution. Effective methods and a reasonable evaluation system are therefore urgently needed to evaluate the overall efficiency of industrial clusters. Emergy analysis links economic and ecological systems, making it possible to evaluate the contribution of ecological products and services as well as the load placed on environmental systems. The method has been successfully applied in many case studies of ecosystems but seldom to industrial clusters. This study applied emergy analysis to assess the efficiency of industrial clusters through a series of emergy-based indices as well as newly proposed indicators. A case study of the Shenyang Economic Technological Development Area (SETDA) was carried out to show the emergy method's practical potential for evaluating industrial clusters and informing environmental policy making. The results showed that the industrial cluster of electric equipment and electronic manufacturing produced the most economic value and had the highest efficiency of energy utilization among the four industrial clusters. However, the sustainability index of the industrial cluster of food and beverage processing was better than those of the other industrial clusters.
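As a concrete illustration of emergy-based indices, the sketch below computes three ratios that are standard in the emergy literature: the emergy yield ratio, the environmental loading ratio, and their quotient, the sustainability index. The abstract does not list the exact indices or values used in the study, so the formulas and example inputs shown here are assumptions drawn from common emergy practice.

```python
def emergy_indices(R, N, F):
    """Classical emergy indices (sketch; inputs in solar emjoules, seJ).

    R: renewable local emergy, N: non-renewable local emergy,
    F: purchased (feedback) emergy from the economy.
    """
    Y = R + N + F        # total emergy yield of the system
    EYR = Y / F          # emergy yield ratio
    ELR = (N + F) / R    # environmental loading ratio
    ESI = EYR / ELR      # emergy sustainability index
    return {"EYR": EYR, "ELR": ELR, "ESI": ESI}

# Illustrative inputs only, e.g.:
print(emergy_indices(R=2.0e20, N=5.0e20, F=3.0e20))
```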
Quantifying the mass loss of peripheral Greenland glaciers and ice caps (1958-2014).
NASA Astrophysics Data System (ADS)
Noël, Brice; van de Berg, Willem Jan; Machguth, Horst; van den Broeke, Michiel
2016-04-01
Since the 2000s, mass loss from Greenland peripheral glaciers and ice caps (GICs) has accelerated, making them an important contributor to sea level rise. Under continued warming throughout the 21st century, GICs might contribute a further 7.5 to 11 mm to sea level rise, with surface runoff increasingly dominating at the expense of ice discharge. However, despite multiple observation campaigns, little is known about the contribution of GICs to total Greenland mass loss. Furthermore, the relatively coarse resolution of regional climate models (5 km to 20 km) fails to represent the small-scale patterns of surface mass balance (SMB) components over these topographically complex regions, which include narrow valley glaciers. Here, we present a novel approach to quantify the contribution of GICs to surface melt and runoff, based on an elevation-dependent downscaling method. Daily GIC SMB components at 1 km resolution are obtained by statistically downscaling the outputs of RACMO2.3 at 11 km resolution to a down-sampled version of the GIMP DEM for the period 1958-2014. This method has recently been successfully validated over the Greenland ice sheet and is now applied to GICs. In this study, we first evaluate the 1 km daily downscaled GIC SMB against a newly available and comprehensive dataset of ablation stake measurements. We then investigate present-day trends of meltwater production and SMB for different regions and estimate the GIC contribution to total Greenland mass loss. These data are considered valuable for model evaluation and prediction of future sea level rise.
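The core of an elevation-dependent statistical downscaling can be sketched as a local regression of SMB on model elevation within each block of coarse cells, re-evaluated at the 1 km DEM elevations. The block-wise linear fit below is an illustrative simplification under that assumption, not the RACMO2.3 procedure itself.

```python
import numpy as np

def downscale_smb(smb_coarse, elev_coarse, elev_fine, blocks):
    """Elevation-dependent statistical downscaling (illustrative sketch).

    For each block of coarse cells, regress SMB on the model elevation,
    then evaluate the fitted line at the high-resolution DEM elevations.
    blocks: list of (coarse_index_array, fine_index_array) pairs mapping
    each group of coarse cells to the fine cells it covers (hypothetical).
    """
    smb_fine = np.full(elev_fine.shape, np.nan)
    for coarse_idx, fine_idx in blocks:
        # np.polyfit with degree 1 returns [slope, intercept].
        slope, intercept = np.polyfit(elev_coarse[coarse_idx],
                                      smb_coarse[coarse_idx], 1)
        smb_fine[fine_idx] = intercept + slope * elev_fine[fine_idx]
    return smb_fine
```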
NASA Astrophysics Data System (ADS)
Bruckner, B.; Roth, D.; Goebl, D.; Bauer, P.; Primetzhofer, D.
2018-05-01
Electronic stopping measurements in chemically reactive targets, e.g., transition and rare earth metals, are challenging. These metals often contain low-Z impurities, which contribute to electronic stopping. In this article, we present two ways to correct for the presence of impurities in the evaluation of proton and He stopping in Ni for primary energies between 1 and 100 keV, either considering or ignoring the contribution of the low-Z impurities to multiple scattering. We find that for protons either method leads to concordant results, but for heavier projectiles, e.g., He ions, the contribution of the impurities to multiple scattering must not be neglected.
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; to describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and to give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. Systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, as this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis, including approaches that incorporate theory and/or users' perspectives. Copyright © 2013 Elsevier Inc. All rights reserved.
Dal'Maso, Vinícius Buaes; Mallmann, Lucas; Siebert, Marina; Simon, Laura; Saraiva-Pereira, Maria Luiza; Dalcin, Paulo de Tarso Roth
2013-01-01
OBJECTIVE: To evaluate the diagnostic contribution of molecular analysis of the cystic fibrosis transmembrane conductance regulator (CFTR) gene in patients suspected of having mild or atypical cystic fibrosis (CF). METHODS: This was a cross-sectional study involving adolescents and adults aged ≥ 14 years. Volunteers underwent clinical, laboratory, and radiological evaluation, as well as spirometry, sputum microbiology, liver ultrasound, sweat tests, and molecular analysis of the CFTR gene. We then divided the patients into three groups by the number of mutations identified (none, one, and two or more) and compared those groups in terms of their characteristics. RESULTS: We evaluated 37 patients with phenotypic findings of CF, with or without sweat test confirmation. The mean age of the patients was 32.5 ± 13.6 years, and females predominated (75.7%). The molecular analysis contributed to the definitive diagnosis of CF in 3 patients (8.1%), all of whom had at least two mutations. There were 7 patients (18.9%) with only one mutation and 26 patients (70.3%) with no mutations. None of the clinical characteristics evaluated was found to be associated with the genetic diagnosis. The most common mutation was p.F508del, which was found in 5 patients. The combination of p.V232D and p.F508del was found in 2 patients. Other mutations identified were p.A559T, p.D1152H, p.T1057A, p.I148T, p.V754M, p.P1290P, p.R1066H, and p.T351S. CONCLUSIONS: The molecular analysis of the CFTR gene coding region showed a limited contribution to the diagnostic investigation of patients suspected of having mild or atypical CF. In addition, there were no associations between the clinical characteristics and the genetic diagnosis. PMID:23670503
Langston, Anne; Weiss, Jennifer; Landegger, Justine; Pullum, Thomas; Morrow, Melanie; Kabadege, Melene; Mugeni, Catherine; Sarriot, Eric
2014-08-01
The Kabeho Mwana project (2006-2011) supported the Rwanda Ministry of Health (MOH) in scaling up integrated community case management (iCCM) of childhood illness in 6 of Rwanda's 30 districts. The project trained and equipped community health workers (CHWs) according to national guidelines. In project districts, Kabeho Mwana staff also trained CHWs to conduct household-level health promotion and established supervision and reporting mechanisms through CHW peer support groups (PSGs) and quality improvement systems. The 2005 and 2010 Demographic and Health Surveys were re-analyzed to evaluate how project and non-project districts differed in terms of care-seeking for fever, diarrhea, and acute respiratory infection symptoms and related indicators. We developed a logit regression model, controlling for the timing of the first CHW training, with the district included as a fixed categorical effect. We also analyzed qualitative data from the final evaluation to examine factors that may have contributed to improved outcomes. While there was notable improvement in care-seeking across all districts, care-seeking from any provider for each of the 3 conditions, and for all 3 combined, increased significantly more in the project districts. CHWs contributed a larger percentage of consultations in project districts (27%) than in non-project districts (12%). Qualitative data suggested that the PSG model was a valuable sub-level of CHW organization associated with improved CHW performance, supervision, and social capital. The iCCM model implemented by Kabeho Mwana resulted in greater improvements in care-seeking than those seen in the rest of the country. Intensive monitoring, collaborative supervision, community mobilization, and CHW PSGs contributed to this success. The PSGs were a unique contribution of the project, playing a critical role in improving care-seeking in project districts. Effective implementation of iCCM should therefore include CHW management and social support mechanisms. Finally, re-analysis of national survey data improved evaluation findings by providing impact estimates.
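A regression of the kind described above can be sketched with statsmodels; the column names, input file, and exact specification below are hypothetical, not the study's model. Because project status is constant within a district, district fixed effects absorb the project main effect, so this sketch identifies the project contrast through a survey-round interaction.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per child from the pooled 2005/2010 DHS,
# with columns sought_care (0/1), post (1 = 2010 survey), project
# (1 = project district), months_since_training, and district.
df = pd.read_csv("dhs_pooled.csv")  # hypothetical input file

# District enters as a fixed categorical effect; the post:project
# interaction then carries the project-versus-non-project contrast.
model = smf.logit(
    "sought_care ~ C(district) + post + post:project + months_since_training",
    data=df,
).fit()
print(model.summary())
```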
Neutron dose estimation in a zero power nuclear reactor
NASA Astrophysics Data System (ADS)
Triviño, S.; Vedelago, J.; Cantargi, F.; Keil, W.; Figueroa, R.; Mattea, F.; Chautemps, A.; Santibañez, M.; Valente, M.
2016-10-01
This work characterizes the neutron and gamma components of the mixed field in a zero power nuclear reactor and their contributions to the absorbed dose. A dosimetric method based on Fricke gel was implemented to evaluate the separation between dose components in the mixed field. The proposed method was validated by direct measurements of the neutron flux at different positions using Au and Mg-Ni activation foils. In addition, Monte Carlo simulations were performed using the MCNP main code with a dedicated subroutine to incorporate the exact, complete geometry of the nuclear reactor facility. Once the nuclear fuel elements were defined, the simulations computed the different contributions to the absorbed dose at specific positions inside the core. Thermal/epithermal contributions to the absorbed dose were assessed by means of Fricke gel dosimetry using different isotopic compositions aimed at modifying the sensitivity of the dosimeter to specific dose components. Clear distinctions between gamma and neutron capture dose were obtained. Both the Monte Carlo simulations and the experimental results provided reliable estimates of the neutron flux rate as well as the dose rate during reactor operation. Simulations and experimental results are in good agreement at every position measured and simulated in the core.
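Separating dose components from dosimeters with different sensitivities reduces, in the simplest reading, to solving a small linear system: each formulation's reading is a sensitivity-weighted sum of the gamma and neutron doses. The sensitivity coefficients and readings below are invented for illustration and are not the study's calibration values.

```python
import numpy as np

# Readings from two hypothetical Fricke gel formulations with different
# sensitivities to the gamma and neutron components:
#   R_i = s_gamma_i * D_gamma + s_neutron_i * D_neutron
S = np.array([[1.00, 0.30],    # standard formulation
              [1.00, 0.85]])   # formulation with enhanced neutron response
R = np.array([2.1, 3.4])       # measured signals (illustrative numbers)

D_gamma, D_neutron = np.linalg.solve(S, R)
print(f"gamma dose ~ {D_gamma:.2f} Gy, neutron dose ~ {D_neutron:.2f} Gy")
```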
Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias
2011-11-01
Breast cancer is a major global threat to women's health. Screening and adequate follow-up can significantly reduce mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems acting as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to the type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, and no CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000, and different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as a data source for the evaluation of mammography CAD systems and the application of statistical evaluation methods were found to be highly diverse, so results reported from different studies are hardly comparable. Drawbacks of the DDSM (e.g., varying quality of lesion annotations) may be partly responsible, but greater bias appears to stem from the authors' own study-design decisions. RECOMMENDATIONS/CONCLUSION: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
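The three statistical schemes named above differ in how they split the data, which is one reason accuracies from different studies are hard to compare. The scikit-learn sketch below contrasts a single train/test split, 10-fold cross-validation, and leave-one-out on a synthetic dataset; the dataset and classifier are illustrative stand-ins, not a CAD system.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (
    KFold, LeaveOneOut, cross_val_score, train_test_split)

X, y = make_classification(n_samples=200, n_features=20, random_state=0)
clf = LogisticRegression(max_iter=1000)

# Single train/test split: fast, but a high-variance estimate.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
holdout = clf.fit(X_tr, y_tr).score(X_te, y_te)

# N-fold cross-validation and leave-one-out: same data, but different
# estimators of generalization accuracy, so numbers are not comparable
# across studies that choose different schemes.
kfold = cross_val_score(clf, X, y, cv=KFold(10, shuffle=True, random_state=0))
loo = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(holdout, kfold.mean(), loo.mean())
```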
Extracting metalworking fluid aerosol samples in cassettes by provisional ASTM and NIOSH methods.
Harper, Martin
2002-01-01
Recent provisional methods for the determination of metalworking fluid aerosol in workplace air involve a solvent extraction procedure to separate the nonvolatile fraction of the fluid from insoluble material such as metal turnings and dirt. The procedure calls for preweighing a filter (W1), assembling it into a cassette, and taking a sample. In the laboratory the filter is removed from the cassette, desiccated to remove any collected water or other volatile substances, and weighed again (W2). The filter is then extracted in an organic solvent blend, allowed to dry, and weighed a final time (W3). The total weight collected by the filter is given by (W2-W1), and the weight of (nonvolatile) metalworking fluid collected is given by (W2-W3). The extraction step can take place within the cassette housing if it is relatively inert to the solvent blend used. The extraction of four metalworking fluids (straight oil, soluble oil, synthetic, and semisynthetic) within disposable polypropylene cassettes was investigated using the same protocol used to evaluate the original method. For all fluids the extraction efficiency was greater than 95%, with a precision better than 5%. The mean blank contribution to the extraction step was 16 micrograms. Blanks were also evaluated after storage, and after transport and storage. A small additional blank contribution could be removed by desiccation. The limits of detection and quantitation of the extraction step were calculated to be 28 and 94 micrograms, respectively.
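The gravimetric bookkeeping of the method is simple arithmetic on the three weighings; the short sketch below (weights in mg, illustrative values) makes the three derived quantities explicit.

```python
def mwf_weights(w1_mg, w2_mg, w3_mg):
    """Gravimetric bookkeeping for the extraction method (all weights in mg).

    w1: clean filter, w2: after sampling and desiccation,
    w3: after solvent extraction and drying.
    """
    total = w2_mg - w1_mg       # total collected aerosol (W2-W1)
    mwf = w2_mg - w3_mg         # nonvolatile metalworking fluid (W2-W3)
    insoluble = w3_mg - w1_mg   # metal fines, dirt, and other residue
    return total, mwf, insoluble

# Illustrative weighings; for reference, the extraction step's LOD and LOQ
# of 28 and 94 micrograms correspond to 0.028 and 0.094 mg.
print(mwf_weights(10.000, 10.850, 10.120))  # -> (0.85, 0.73, 0.12)
```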
Estimation of Additive, Dominance, and Imprinting Genetic Variance Using Genomic Data
Lopes, Marcos S.; Bastiaansen, John W. M.; Janss, Luc; Knol, Egbert F.; Bovenhuis, Henk
2015-01-01
Traditionally, exploration of genetic variance in humans, plants, and livestock species has been limited mostly to the use of additive effects estimated using pedigree data. However, with the development of dense panels of single-nucleotide polymorphisms (SNPs), the exploration of genetic variation of complex traits is moving from quantifying the resemblance between family members to the dissection of genetic variation at individual loci. With SNPs, we were able to quantify the contribution of additive, dominance, and imprinting variance to the total genetic variance by using a SNP regression method. The method was validated in simulated data and applied to three traits (number of teats, backfat, and lifetime daily gain) in three purebred pig populations. In simulated data, the estimates of additive, dominance, and imprinting variance were very close to the simulated values. In real data, dominance effects accounted for a substantial proportion of the total genetic variance (up to 44%) for these traits in these populations. The contribution of imprinting to the total phenotypic variance of the evaluated traits was relatively small (1–3%). Our results indicate a strong relationship between the additive variance explained per chromosome and chromosome length, which has been described previously for other traits in other species. We also show that a similar linear relationship exists for dominance and imprinting variance. These novel results improve our understanding of the genetic architecture of the evaluated traits and show promise for applying the SNP regression method to other traits and species, including human diseases. PMID:26438289
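The covariate coding that lets a SNP regression separate the three variance components can be sketched as follows, assuming the parental origin of alleles is known (which the imprinting term requires). The coding shown, additive -1/0/1, a heterozygote indicator for dominance, and a paternal-minus-maternal contrast for imprinting, is one standard choice offered as an assumption, not the authors' exact parameterization.

```python
import numpy as np

def snp_design(paternal, maternal):
    """Additive, dominance, and imprinting covariates for one SNP (sketch).

    paternal, maternal: arrays of allele counts (0/1) inherited from
    the sire and the dam, respectively.
    """
    g = paternal + maternal             # genotype coded 0/1/2
    additive = g - 1.0                  # -1, 0, +1 coding
    dominance = (g == 1).astype(float)  # heterozygote indicator
    imprinting = paternal - maternal    # +1 paternal het., -1 maternal het.
    return additive, dominance, imprinting

# Phenotypes can then be regressed on these covariates across all SNPs
# (e.g., in a Bayesian SNP regression) to partition the genetic variance.
```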
A Benchmark and Comparative Study of Video-Based Face Recognition on COX Face Database.
Huang, Zhiwu; Shan, Shiguang; Wang, Ruiping; Zhang, Haihong; Lao, Shihong; Kuerban, Alifu; Chen, Xilin
2015-12-01
Face recognition with still face images has been widely studied, while research on video-based face recognition remains relatively scarce, especially in terms of benchmark datasets and comparisons. Real-world video-based face recognition applications require techniques for three distinct scenarios: 1) Video-to-Still (V2S); 2) Still-to-Video (S2V); and 3) Video-to-Video (V2V), respectively taking video or still images as query or target. To the best of our knowledge, few datasets and evaluation protocols have been established for all three scenarios. In order to facilitate the study of this specific topic, this paper contributes a benchmarking and comparative study based on a newly collected still/video face database, named COX Face DB. Specifically, we make three contributions. First, we collect and release a large-scale still/video face database to simulate video surveillance with three different video-based face recognition scenarios (i.e., V2S, S2V, and V2V). Second, for benchmarking the three scenarios designed on our database, we review and experimentally compare a number of existing set-based methods. Third, we further propose a novel Point-to-Set Correlation Learning (PSCL) method, and experimentally show that it can be used as a promising baseline method for V2S/S2V face recognition on COX Face DB. Extensive experimental results clearly demonstrate that video-based face recognition needs more effort, and our COX Face DB is a good benchmark database for evaluation.
Petters, M. D.; Kreidenweis, S. M.; Ziemann, P. J.
2016-01-19
A wealth of recent laboratory and field experiments demonstrate that organic aerosol composition evolves with time in the atmosphere, leading to changes in the influence of the organic fraction on cloud condensation nuclei (CCN) spectra. There is a need for tools that can realistically represent the evolution of CCN activity to better predict indirect effects of organic aerosol on clouds and climate. This work describes a model to predict the CCN activity of organic compounds from functional group composition. Following previous methods in the literature, we test the ability of semi-empirical group contribution methods in Köhler theory to predict the effective hygroscopicity parameter, kappa. However, in our approach we also account for liquid–liquid phase boundaries to simulate phase-limited activation behavior. Model evaluation against a selected database of published laboratory measurements demonstrates that kappa can be predicted within a factor of 2. Simulation of homologous series is used to identify the relative effectiveness of different functional groups in increasing the CCN activity of weakly functionalized organic compounds. Hydroxyl, carboxyl, aldehyde, hydroperoxide, carbonyl, and ether moieties promote CCN activity while methylene and nitrate moieties inhibit CCN activity. Furthermore, the model can be incorporated into scale-bridging test beds such as the Generator of Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) to evaluate the evolution of kappa for a complex mix of organic compounds and to develop suitable parameterizations of CCN evolution for larger-scale models.
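To connect a predicted kappa to observable CCN activity, the published kappa-Köhler approximation (Petters and Kreidenweis, 2007) gives the critical supersaturation for a given dry diameter. The sketch below implements that standard relation with water properties near 298 K; it is a downstream use of kappa, not the group-contribution model described in the abstract. It also shows what a factor-of-2 uncertainty in kappa means for the critical supersaturation, which scales as kappa to the power -1/2.

```python
import numpy as np

def critical_supersaturation(kappa, d_dry_m, T=298.15):
    """Critical supersaturation (%) from kappa-Koehler theory.

    Uses the analytic approximation s_c = exp(sqrt(4 A^3 / (27 kappa d^3))) - 1,
    valid for kappa >~ 0.01 (Petters & Kreidenweis, 2007).
    """
    sigma_w = 0.072   # surface tension of water, J m^-2
    M_w = 0.018015    # molar mass of water, kg mol^-1
    rho_w = 997.0     # density of water, kg m^-3
    R = 8.314         # gas constant, J mol^-1 K^-1
    A = 4.0 * sigma_w * M_w / (R * T * rho_w)   # Kelvin term coefficient, m
    s_c = np.exp(np.sqrt(4.0 * A**3 / (27.0 * kappa * d_dry_m**3))) - 1.0
    return 100.0 * s_c

# A factor-of-2 error in kappa shifts s_c by ~sqrt(2) at fixed dry size:
print(critical_supersaturation(0.1, 100e-9))  # ~0.37 %
print(critical_supersaturation(0.2, 100e-9))  # ~0.26 %
```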
Simões, Carla L; Xará, Susana M; Bernardo, C A
2011-10-01
Recent legislation has stressed the need to decide the best end-of-life (EoL) option for post-consumer products considering their full life cycle and the corresponding overall environmental impacts. The life cycle assessment (LCA) technique has become a common tool to evaluate those impacts. The present study aimed to contribute to a better understanding of the application of this technique by evaluating how the choice of life cycle impact assessment (LCIA) method influences its results and conclusions. A specific case study was chosen, using previous information on anti-glare lamellae (AGL) for highway use made with virgin and recycled high-density polyethylene (HDPE). Five distinct LCIA methods were used: Eco-indicator 99, CML 2 (2000), EPS 2000, Eco-indicator 95, and EDIP 97. Consistent results across these methods were obtained for the Climate change, Ozone layer depletion, Acidification, and Eutrophication environmental indicators. Conversely, the Summer smog indicator showed large discrepancies between impact assessment methods. The work sheds light on the advantages of using several LCIA methods when carrying out the LCA of a specific product, thus providing complementary analytical perspectives.
Han, KA; Patel, Y; Lteif, AA; Chisholm, R; Mather, KJ
2011-01-01
Background The individual effects of hyperglycemia and obesity in impairing vascular health are recognized. However, the relative contributions of dysglycemia versus other obesity-related traits to vascular dysfunction have not been systematically evaluated. Methods We undertook a cross-sectional evaluation of factors contributing to vascular function in 271 consecutive subjects, categorized as non-obese normal glucose tolerant (n=115), non-obese dysglycemic (n=32), obese normal glucose tolerant (n=57), obese dysglycemic (n=38), or type 2 diabetic (n=29). Vascular function was measured invasively as leg blood flow responses to methacholine chloride, an endothelium-dependent vasodilator. Categorical and continuous analyses were used to assess the contributions of hyperglycemia to vascular dysfunction. Results Even among normoglycemic subjects, obese subjects had impaired vascular function compared with non-obese subjects (p=0.004). Vascular function was also impaired in non-obese dysglycemic subjects (p=0.04 versus non-obese normoglycemic subjects), to a level comparable to that of normoglycemic obese subjects. Within the obese groups, gradations of dysglycemia, including the presence of diabetes, were not associated with further worsening of these vascular responses beyond the effect of obesity alone (p=NS comparing all obese groups, p<0.001 versus lean normoglycemic subjects). In univariate and multivariable modeling analyses, the effects of glycemia on vascular dysfunction were less powerful than the effects of insulin resistance and obesity. Conclusions Dysglycemia contributes to impaired vascular function in non-obese subjects, but obesity and insulin resistance are more important determinants of vascular function in obese and diabetic subjects. PMID:21309061
Evaluation of quality improvement programmes
Ovretveit, J; Gustafson, D
2002-01-01
In response to increasing concerns about quality, many countries are carrying out large scale programmes which include national quality strategies, hospital programmes, and quality accreditation, assessment and review processes. Increasing amounts of resources are being devoted to these interventions, but do they ensure or improve quality of care? There is little research evidence as to their effectiveness or the conditions for maximum effectiveness. Reasons for the lack of evaluation research include the methodological challenges of measuring outcomes and attributing causality to these complex, changing, long-term social interventions in organisations or health systems, which are themselves complex and changing. However, methods are available which can be used to evaluate these programmes and which can provide decision makers with research based guidance on how to plan and implement them. This paper describes the research challenges, the methods which can be used, and gives examples and guidance for future research. It emphasises the important contribution which such research can make to improving the effectiveness of these programmes and to developing the science of quality improvement. PMID:12486994
Signal evaluation environment: a new method for the design of peripheral in-vehicle warning signals.
Werneke, Julia; Vollrath, Mark
2011-06-01
An evaluation method called the Signal Evaluation Environment (SEE) was developed for use in the early stages of the design process of peripheral warning signals while driving. Accident analyses have shown that with complex driving situations such as intersections, the visual scan strategies of the driver contribute to overlooking other road users who have the right of way. Salient peripheral warning signals could disrupt these strategies and direct drivers' attention towards these road users. To select effective warning signals, the SEE was developed as a laboratory task requiring visual-cognitive processes similar to those used at intersections. For validation of the SEE, four experiments were conducted using different stimulus characteristics (size, colour contrast, shape, flashing) that influence peripheral vision. The results confirm that the SEE is able to differentiate between the selected stimulus characteristics. The SEE is a useful initial tool for designing peripheral signals, allowing quick and efficient preselection of beneficial signals.
Multi-Model Comparison of Lateral Boundary Contributions to ...
As the National Ambient Air Quality Standards (NAAQS) for ozone become more stringent, there has been growing attention on characterizing the contributions and the uncertainties in ozone from outside the US to the ozone concentrations within the US. The third phase of the Air Quality Model Evaluation International Initiative (AQMEII3) provides an opportunity to investigate this issue through the combined efforts of multiple research groups in the US and Europe. The model results cover a range of representations of chemical and physical processes, vertical and horizontal resolutions, and meteorological fields to drive the regional chemical transport models (CTMs), all of which are important components of model uncertainty (Solazzo and Galmarini, 2016). In AQMEII3, all groups were asked to track the contribution of ozone from lateral boundary through the use of chemically inert tracers. Though the inert tracer method tends to overestimate the impact of ozone boundary conditions compared with other methods such as chemically reactive tracers and source apportionment (Baker et al., 2015), the method takes the least effort to implement in different models, and is thus useful in highlighting and understanding the process-level differences amongst the models. In this study, results from four models were included (CMAQ driven by WRF, CAMx driven by WRF, CMAQ driven by CCLM, DEHM driven by WRF). At each site, the distribution of daily maximum 8-hour ozone, and the corre
Zhu, Hong; Xu, Xiaohan; Ahn, Chul
2017-01-01
Paired experimental designs are widely used in clinical and health behavioral studies, where each study unit contributes a pair of observations. Investigators often encounter incomplete observations of paired outcomes in the data collected. Some study units contribute complete pairs of observations, while the others contribute either pre- or post-intervention observations. Statistical inference for paired experimental designs with incomplete observations of continuous outcomes has been extensively studied in the literature. However, sample size methods for such designs are scarce. We derive a closed-form sample size formula based on the generalized estimating equation (GEE) approach by treating the incomplete observations as missing data in a linear model. The proposed method properly accounts for the impact of the mixed structure of the observed data: a combination of paired and unpaired outcomes. The sample size formula is flexible enough to accommodate different missing patterns, magnitudes of missingness, and correlation parameter values. We demonstrate that under complete observations, the proposed GEE sample size estimate is the same as that based on the paired t-test. In the presence of missing data, the proposed method leads to a more accurate sample size estimate than the crude adjustment. Simulation studies are conducted to evaluate the finite-sample performance of the GEE sample size formula. A real application example is presented for illustration.
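For the complete-data special case mentioned above, the closed-form sample size is the familiar normal-approximation formula for a paired comparison. The sketch below implements only that special case with illustrative inputs; the full GEE formula for mixed paired/unpaired data is given in the paper.

```python
import numpy as np
from scipy.stats import norm

def paired_sample_size(delta, sd_diff, alpha=0.05, power=0.80):
    """Pairs needed for a paired mean comparison (normal approximation).

    This is the complete-data case to which the GEE-based formula reduces
    when every unit contributes both pre- and post-intervention outcomes.
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return int(np.ceil((z * sd_diff / delta) ** 2))

# Detect a mean pre/post change of 2 units when the SD of the paired
# differences is 5 (illustrative numbers):
print(paired_sample_size(delta=2.0, sd_diff=5.0))  # -> 50
```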
An efficient method to compute spurious end point contributions in PO solutions. [Physical Optics
NASA Technical Reports Server (NTRS)
Gupta, Inder J.; Burnside, Walter D.; Pistorius, Carl W. I.
1987-01-01
A method is given to compute the spurious endpoint contributions in the physical optics solution for electromagnetic scattering from conducting bodies. The method is applicable to general three-dimensional structures. The only information required to use the method is the radius of curvature of the body at the shadow boundary. Thus, the method is very efficient for numerical computations. As an illustration, the method is applied to several bodies of revolution to compute the endpoint contributions for backscattering in the case of axial incidence. It is shown that in high-frequency situations, the endpoint contributions obtained using the method are equal to the true endpoint contributions.
Srihari, Sriganesh; Yong, Chern Han; Patil, Ashwini; Wong, Limsoon
2015-09-14
Complexes of physically interacting proteins constitute fundamental functional units responsible for driving biological processes within cells. A faithful reconstruction of the entire set of complexes is therefore essential to understand the functional organisation of cells. In this review, we discuss the key contributions of computational methods developed to date (approximately 2003 to 2015) for identifying complexes from the network of interacting proteins (PPI network). We evaluate in depth the performance of these methods on PPI datasets from yeast, and highlight their limitations and challenges, in particular in detecting sparse, small, or sub-complexes and in discerning overlapping complexes. We describe methods for integrating diverse information, including expression profiles and 3D structures of proteins, with PPI networks to understand the dynamics of complex formation, for instance, the time-based assembly of complex subunits and the formation of fuzzy complexes from intrinsically disordered proteins. Finally, we discuss methods for identifying dysfunctional complexes in human diseases, an application that is proving invaluable for understanding disease mechanisms and discovering novel therapeutic targets. We hope this review aptly commemorates a decade of research on computational prediction of complexes and constitutes a valuable reference for further advancements in this exciting area. Copyright © 2015 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Zhao, Yan; Lu, Wenjing; Wang, Hongtao
2015-12-30
Odour pollution caused by municipal solid waste is a public concern. This study quantitatively evaluated the concentrations, environmental impacts, and olfactory impact of volatile trace compounds released from a waste transfer station. Seventy-six compounds were detected, and ethanol showed the highest release rate and ratio, at 14.76 kg/d and 12.30 g/t of waste, respectively. Life cycle assessment showed that trichlorofluoromethane and dichlorodifluoromethane accounted for more than 99% of the impact potential for global warming and approximately 70% for human toxicity (non-carcinogenic). The major contributor to both photochemical ozone formation and ecotoxicity was ethanol. A detection threshold method was also used to evaluate odour pollution. Five compounds with dilution multiples above one (methane thiol, hydrogen sulphide, ethanol, dimethyl disulphide, and dimethyl sulphide) were considered the critical compounds. Methane thiol contributed more than 90% of the odour pollution, as a consequence of its low threshold. Comparison of the contributions of the compounds to different environmental aspects indicated that the typical pollutants vary with the specific evaluation target and should therefore be considered comprehensively. This study provides important information and a scientific methodology for elucidating the impacts of odourant compounds on the environment and odour pollution. Copyright © 2015 Elsevier B.V. All rights reserved.
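The detection threshold method reduces to a simple ratio: the dilution multiple (often called the odour activity value) of a compound is its concentration divided by its human detection threshold. The concentrations and thresholds below are illustrative placeholders, not the study's values; they show why a low-threshold sulphur compound can dominate odour while a far more abundant compound contributes little.

```python
# Dilution multiple (odour activity value) = concentration / detection
# threshold. All numbers below are illustrative placeholders.
concentrations_ug_m3 = {"methane thiol": 2.0, "ethanol": 900.0}
thresholds_ug_m3 = {"methane thiol": 0.07, "ethanol": 1.0e5}

for compound, c in concentrations_ug_m3.items():
    oav = c / thresholds_ug_m3[compound]
    print(compound, round(oav, 3))
# methane thiol -> ~28.6 (dominates odour despite a tiny concentration)
# ethanol       -> ~0.009 (abundant, yet below its detection threshold)
```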
Whitmore, Susan C.; Grefsheim, Suzanne F.; Rankin, Jocelyn A.
2008-01-01
Background The informationist programme at the Library of the National Institutes of Health (NIH) in Bethesda, MD, USA has grown to 14 informationists working with 40 clinical and basic science research teams. Purpose This case report, intended to contribute to the literature on informationist programmes, describes the NIH informationist programme including implementation experiences, the informationists' training programme, their job responsibilities and programme outcomes. Brief description The NIH informationist programme was designed to enhance the library's service capacity. Over time, the steps for introducing the service to new groups were formalized to ensure support by leadership, the team being served and the library. Job responsibilities also evolved from traditional library roles to a wide range of knowledge management activities. The commitment by the informationist, the team and the library to continuous learning is critical to the programme's success. Results/outcomes NIH scientists reported that informationists saved them time and contributed to teamwork with expert searching and point-of-need instruction. Process evaluation helped refine the programme. Evaluation method High-level, preliminary outcomes were identified from a survey of scientists receiving informationist services, along with key informant interviews. Process evaluation examined service implementation, informationists' training, and service components. Anecdotal evidence has also indicated a favorable response to the programme. PMID:18494648
Off-lexicon online Arabic handwriting recognition using neural network
NASA Astrophysics Data System (ADS)
Yahia, Hamdi; Chaabouni, Aymen; Boubaker, Houcine; Alimi, Adel M.
2017-03-01
This paper presents a new method for online Arabic handwriting recognition based on grapheme segmentation. The main contribution of our work is to explore the utility of the Beta-elliptic model in segmentation and feature extraction for online handwriting recognition. Our method consists of decomposing the input signal into continuous parts called graphemes, based on the Beta-elliptic model, and classifying them according to their position in the pseudo-word. The segmented graphemes are then described by a combination of geometric features and trajectory shape modeling. The efficiency of the considered features has been evaluated using a feed-forward neural network classifier. Experimental results on the benchmark ADAB database demonstrate the performance of the proposed method.
Ye, Qiuping; Jin, Xinyi; Wei, Shiqin; Zheng, Gongyu; Li, Xinlei
2016-05-01
Subcritical fluid extraction (SFE), as a novel method, was applied to investigate the yield, quality, and sensory properties of headspace oil from Jasminum sambac (L.) Aiton, in comparison with petroleum ether extraction (PEE). The results indicated that the yield of headspace oil using SFE was significantly higher (P < 0.05) than that using PEE. SFE favoured the recovery of alcohols and ethers, prevented the thermal reaction of terpenes, and reduced α-caryophyllene and β-caryophyllene in the headspace oil. The contents of linalool (21.90%) and benzyl acetate (16.31%) were higher with SFE than with PEE. In addition, the sensory evaluation of the SFE oil was superior to that of the PEE oil, indicating a fresh, jasmine-like odor and green-yellow color. Thus, SFE is an improved method for obtaining natural headspace oil from jasmine flowers.
Dekker, Vera; Nauta, Maaike H; Mulder, Erik J; Sytema, Sjoerd; de Bildt, Annelies
2016-09-01
The Social skills Observation Measure (SOM) is a direct observation method for social skills used in naturalistic everyday situations in school. This study describes the development of the SOM and investigates its psychometric properties in 86 children with autism spectrum disorder, aged 9.8-13.1 years. The interrater reliability was found to be good to excellent. The convergent validity was low in relation to parent and teacher reports of social skills, and also to a parent interview on adaptive social functioning. This direct observation therefore seems to provide additional information on the frequency and quality of social behaviors in daily-life situations. As such, it complements parent and teacher information as a blind measure for evaluating social skills training.
Source apportionment and sensitivity analysis: two methodologies with two different purposes
NASA Astrophysics Data System (ADS)
Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe
2017-11-01
This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used interchangeably for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
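To make the linear/nonlinear distinction concrete, the following toy sketch (illustrative only; the quadratic interaction term and the equal split of that term between sources are assumptions, not the paper's model) shows how an impact and a contribution diverge once the emission-concentration relationship is nonlinear:

```python
# Toy illustration: for a nonlinear concentration model, the "impact" of
# removing a source (sensitivity analysis) differs from its "contribution"
# (source apportionment). The interaction term stands in for nonlinear
# chemistry and is hypothetical.

def concentration(e1, e2):
    # Assumed nonlinear relationship: linear terms plus an interaction.
    return 2.0 * e1 + 1.0 * e2 + 0.5 * e1 * e2

e1, e2 = 3.0, 4.0
c_base = concentration(e1, e2)

# Brute-force "impact": concentration change when source 1 is switched off.
impact_1 = c_base - concentration(0.0, e2)

# Tagging-style "contribution": mass attributed to source 1, here splitting
# the interaction term equally between the two sources (one convention).
contribution_1 = 2.0 * e1 + 0.25 * e1 * e2

print(f"baseline: {c_base:.2f}")            # 16.00
print(f"impact of source 1:       {impact_1:.2f}")        # 12.00
print(f"contribution of source 1: {contribution_1:.2f}")  # 9.00
# The two numbers coincide only when the interaction coefficient is zero,
# i.e. when the emission-concentration relationship is linear.
```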
Yu-Kang, Tu
2016-12-01
Network meta-analysis for multiple treatment comparisons has been a major development in evidence synthesis methodology. The validity of a network meta-analysis, however, can be threatened by inconsistency in evidence within the network. One particular issue is how to directly evaluate the inconsistency between direct and indirect evidence with regard to the difference in effects between two treatments. A Bayesian node-splitting model was first proposed, and a similar frequentist side-splitting model has been put forward recently. Yet, assigning the inconsistency parameter to one or the other of the two treatments, or splitting the parameter symmetrically between them, can yield different results when multi-arm trials are involved in the evaluation. We aimed to show that a side-splitting model can be viewed as a special case of the design-by-treatment interaction model, and that different parameterizations correspond to different design-by-treatment interactions. We demonstrated how to evaluate the side-splitting model using the arm-based generalized linear mixed model, and an example data set was used to compare results from the arm-based models with those from the contrast-based models. The three parameterizations of side-splitting make slightly different assumptions: the symmetrical method assumes that both treatments in a treatment contrast contribute to inconsistency between direct and indirect evidence, whereas the other two parameterizations assume that only one of the two treatments contributes to this inconsistency. With this understanding in mind, meta-analysts can then make a choice about how to implement the side-splitting method for their analysis. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Recruiting U.S. Chinese Elders into Clinical Research for Dementia
Li, Clara; Neugroschl, Judith; Umpierre, Mari; Martin, Jane; Huang, QiYing; Zeng, Xiaoyi; Cai, Dongming; Sano, Mary
2016-01-01
Purpose This study described and evaluated the rapid recruitment of elderly Chinese into clinical research at the Mount Sinai Alzheimer’s Disease Research Center (MSADRC). Design and Methods Methods of publicizing the study included lectures at local senior centers/churches and publications in local Chinese newspapers. The amount of time required and the success of these methods were evaluated. A “go to them” model of evaluation was employed to enable participants to complete the study visit at locations where they were comfortable. Results From January to December 2015, we recruited 98 participants aged ≥ 65 who primarily speak Mandarin/Cantonese and reside in New York. The mean age and years of education were 73.93±6.34 and 12.79±4.58, respectively. The majority of participants were female (65.3%) and primarily Mandarin speaking (53.1%). Of all enrollees, 54.1% were recruited from community lectures, 29.6% through newspapers, 10.2% through word of mouth, and 6.1% from our clinical services. Overall, 40.8% of participants underwent evaluations at the MSADRC, 44.9% at local senior centers/churches, and 14.3% at home. Implications Given that the majority of our participants had low English proficiency, the use of bilingual recruiters probably allowed us to overcome the language barrier, facilitating recruitment. Our “go to them” model of evaluation was another important factor contributing to our successful recruitment. PMID:27819841
A benchmark for comparison of dental radiography analysis algorithms.
Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia
2016-07-01
Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made to develop computerized dental X-ray image analysis systems for clinical use. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Evaluating interaction energies of weakly bonded systems using the Buckingham-Hirshfeld method
NASA Astrophysics Data System (ADS)
Krishtal, A.; Van Alsenoy, C.; Geerlings, P.
2014-05-01
We present the finalized Buckingham-Hirshfeld method (BHD-DFT) for the evaluation of interaction energies of non-bonded dimers with Density Functional Theory (DFT). In the method, dispersion energies are evaluated from static multipole polarizabilities, obtained on-the-fly from Coupled Perturbed Kohn-Sham calculations and partitioned into diatomic contributions using the iterative Hirshfeld partitioning method. The dispersion energy expression is distributed over four atoms and therefore has a more delocalized character than the standard pairwise expressions. Additionally, full multipolar polarizability tensors are used as opposed to effective polarizabilities, allowing the anisotropic character to be retained at no additional computational cost. A density-dependent damping function for the BLYP, PBE, BP86, B3LYP, and PBE0 functionals has been implemented, containing two global parameters which were fitted to interaction energies and geometries of a selected number of dimers using a bi-variate RMS fit. The method is benchmarked against the S22 and S66 data sets for equilibrium geometries and the S22x5 and S66x8 data sets for interaction energies around the equilibrium geometry. Best results are achieved using the B3LYP functional, with mean average deviation values of 0.30 and 0.24 kcal/mol for the S22 and S66 data sets, respectively. This situates the BHD-DFT method among the best performing dispersion-inclusive DFT methods. The effect of counterpoise correction on DFT energies is discussed.
Yamashita, Makiko; Kitano, Shigehisa; Aikawa, Hiroaki; Kuchiba, Aya; Hayashi, Mitsuhiro; Yamamoto, Noboru; Tamura, Kenji; Hamada, Akinobu
2016-01-01
Analyzing the cytotoxic functions of effector cells, such as NK cells, against target cancer cells is thought to be necessary for predicting the clinical efficacy of antibody-dependent cellular cytotoxicity (ADCC)-dependent antibody therapy. The 51Cr release assay has long been the most widely used method for quantification of ADCC activity. However, the reproducibility of these release assays is not adequate, and they do not allow evaluation of the lysis susceptibilities of distinct cell types within the target cell population. In this study, we established a novel method for evaluating cytotoxicity, which involves the detection and quantification of dead target cells using flow cytometry. CFSE (carboxyfluorescein succinimidyl ester) was used as a dye to specifically stain and thereby label the target cell population, allowing living and dead cells, as well as both target and effector cells, to be quantitatively distinguished. Furthermore, with our new approach, ADCC activity was detected more reproducibly, sensitively, and specifically than with the calcein-AM release assay, in both freshly isolated and frozen human peripheral blood mononuclear cells (PBMCs). This assay, validated herein, is expected to become a standard assay for evaluating ADCC activity, which will ultimately contribute to the clinical development of ADCC-dependent antibody therapies. PMID:26813960
Mahoney, Lisa; Rosen, Rachel
2017-01-01
Feeding difficulties such as dysphagia, coughing, choking, or vomiting during meals, slow eating, oral aversion, food refusal, and stressful mealtimes are common in children with repaired esophageal atresia (EA) and the reasons for this are often multifactorial. The aim of this review is to describe the possible underlying mechanisms contributing to feeding difficulties in patients with EA and approaches to management. Underlying mechanisms for these feeding difficulties include esophageal dysphagia, oropharyngeal dysphagia and aspiration, and aversions related to prolonged gastrostomy tube feeding. The initial diagnostic evaluation for feeding difficulties in a patient with EA may involve an esophagram, videofluoroscopic imaging or fiberoptic endoscopic evaluation during swallowing, upper endoscopy with biopsies, pH-impedance testing, and/or esophageal motility studies. The main goal of management is to reduce the factors contributing to feeding difficulties and may include reducing esophageal stasis, maximizing reflux therapies, treating underlying lung disease, dilating strictures, and altering feeding methods, routes, or schedules. PMID:28620597
Evaluating, Comparing, and Interpreting Protein Domain Hierarchies
2014-01-01
Arranging protein domain sequences hierarchically into evolutionarily divergent subgroups is important for investigating evolutionary history, for speeding up web-based similarity searches, for identifying sequence determinants of protein function, and for genome annotation. However, whether or not a particular hierarchy is optimal is often unclear, and independently constructed hierarchies for the same domain can often differ significantly. This article describes methods for statistically evaluating specific aspects of a hierarchy, for probing the criteria underlying its construction and for direct comparisons between hierarchies. Information theoretical notions are used to quantify the contributions of specific hierarchical features to the underlying statistical model. Such features include subhierarchies, sequence subgroups, individual sequences, and subgroup-associated signature patterns. Underlying properties are graphically displayed in plots of each specific feature's contributions, in heat maps of pattern residue conservation, in “contrast alignments,” and through cross-mapping of subgroups between hierarchies. Together, these approaches provide a deeper understanding of protein domain functional divergence, reveal uncertainties caused by inconsistent patterns of sequence conservation, and help resolve conflicts between competing hierarchies. PMID:24559108
Mathes, Tim; Walgenbach, Maren; Antoine, Sunya-Lee; Pieper, Dawid; Eikermann, Michaela
2014-10-01
The quality of systematic reviews of health economic evaluations (SR-HE) is often limited because of methodological shortcomings. One reason for this poor quality is that there are no established standards for the preparation of SR-HE. The objective of this study is to compare existing methods and suggest best practices for the preparation of SR-HE. To identify the relevant methodological literature on SR-HE, a systematic literature search was performed in Embase, Medline, the NHS Economic Evaluation Database, the Health Technology Assessment Database, and the Cochrane Methodology Register, and the webpages of international health technology assessment agencies were searched. The study selection was performed independently by 2 reviewers. Data were extracted by one reviewer and verified by a second reviewer. On the basis of the overlaps in the recommendations for the methods of SR-HE in the included papers, suggestions for best practices for the preparation of SR-HE were developed. Nineteen relevant publications were identified. The recommendations within them often differed. However, for most process steps there was some overlap between recommendations for the methods of preparation. These overlaps were taken as the basis on which to develop suggestions for the following process steps of preparation: defining the research question, developing eligibility criteria, conducting a literature search, selecting studies, assessing the methodological study quality, assessing transferability, and synthesizing data. The differences in the proposed recommendations are not always explainable by the focus on certain evaluation types, target audiences, or integration in the decision process. Currently, there seem to be no standard methods for the preparation of SR-HE. The suggestions presented here can contribute to the harmonization of methods for the preparation of SR-HE. © The Author(s) 2014.
Testing the prospective evaluation of a new healthcare system
Planitz, Birgit; Sanderson, Penelope; Freeman, Clinton; Xiao, Tania; Botea, Adi; Orihuela, Cristina Beltran
2012-01-01
Research into health ICT adoption suggests that the failure to understand the clinical workplace has been a major contributing factor to the failure of many computer-based clinical systems. We suggest that clinicians and administrators need methods for envisioning future use when adopting new ICT. This paper presents and evaluates a six-stage “prospective evaluation” model that clinicians can use when assessing the impact of a new electronic patient information system on a Specialist Outpatients Department (SOPD). The prospective evaluation model encompasses normative, descriptive, formative and projective approaches. We show that this combination helped health informaticians to make reasonably accurate predictions for technology adoption at the SOPD. We suggest some refinements, however, to improve the scope and accuracy of predictions. PMID:23304347
Guang, Huizhi; Cai, Chuangjian; Zuo, Simin; Cai, Wenjuan; Zhang, Jiulou; Luo, Jianwen
2017-03-01
Peripheral arterial disease (PAD) can cause lower limb ischemia. Quantitative evaluation of the vascular perfusion in the ischemic limb contributes to the diagnosis of PAD and the preclinical development of new drugs. In vivo time-series indocyanine green (ICG) fluorescence imaging can noninvasively monitor blood flow and has deep tissue penetration. The perfusion rate estimated from the time-series ICG images is not sufficient for the evaluation of hindlimb ischemia. Information relevant to the vascular density is also important, because angiogenesis is an essential mechanism for post-ischemic recovery. In this paper, a multiparametric evaluation method is proposed for simultaneous estimation of multiple vascular perfusion parameters, including not only the perfusion rate but also the vascular perfusion density and the time-varying ICG concentration in veins. The proposed method is based on a mathematical model of ICG pharmacokinetics in the mouse hindlimb. The regression analysis was performed on the time-series ICG images obtained from a dynamic reflectance fluorescence imaging system. The results demonstrate that the estimated parameters are effective for quantitatively evaluating the vascular perfusion and distinguishing hypo-perfused from well-perfused tissues in the mouse hindlimb. The proposed multiparametric evaluation method could be useful for PAD diagnosis. [Graphical abstract: estimated perfusion rate and vascular perfusion density maps (left) and the time-varying ICG concentration in veins of the ankle region (right) of the normal and ischemic hindlimbs.] © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Importance and usefulness of evaluating self-esteem in children.
Hosogi, Mizuho; Okada, Ayumi; Fujii, Chikako; Noguchi, Keizou; Watanabe, Kumi
2012-03-20
Self-esteem is the "feeling of self-appreciation" and is an indispensable emotion for people to adapt to society and live their lives. For children, in particular, the environment in which they are raised contributes profoundly to the development of their self-esteem, which in turn helps them to adapt better to society. Various psychologists have provided definitions of self-esteem and examined methods of objectively evaluating it. Questionnaire-style assessment methods for adults include the Rosenberg Self-Esteem Scale and the Janis-Field Feelings of Inadequacy Scale, and those for children include the Coopersmith Self-Esteem Inventory, Pope's 5-Scale Test of Self-Esteem for Children, and the Kid-KINDL®. Other methods include the Ziller Social Self-Esteem Scale and the Implicit Association Test. The development of children's self-esteem is heavily influenced by their environment, that is, their homes, neighborhoods, and schools. Children with damaged self-esteem are at risk of developing psychological and social problems, which hinders recovery from low self-esteem. Thus, to recover from low self-esteem, it is important for children to accumulate a series of successful experiences and so create a positive concept of self. Evaluating children's self-esteem can be an effective method for understanding their past and present circumstances, and is useful in the treatment of children with psychosomatic disorders.
Kratochvíla, Jiří; Jiřík, Radovan; Bartoš, Michal; Standara, Michal; Starčuk, Zenon; Taxt, Torfinn
2016-03-01
One of the main challenges in quantitative dynamic contrast-enhanced (DCE) MRI is estimation of the arterial input function (AIF). Usually, the signal from a single artery (ignoring contrast dispersion, partial volume effects and flow artifacts) or a population average of such signals (also ignoring variability between patients) is used. Multi-channel blind deconvolution is an alternative approach avoiding most of these problems: the AIF is estimated directly from the measured tracer concentration curves in several tissues. This contribution extends the published methods of multi-channel blind deconvolution by applying a more realistic model of the impulse residue function, the distributed capillary adiabatic tissue homogeneity model (DCATH). In addition, an alternative AIF model is used and several AIF-scaling methods are tested. The proposed method is evaluated on synthetic data with respect to the number of tissue regions and to the signal-to-noise ratio. An initial evaluation on clinical data (renal cell carcinoma patients before and after the beginning of treatment) gave consistent results and indicates more reliable and less noise-sensitive perfusion parameter estimates. Blind multi-channel deconvolution using the DCATH model might be a method of choice for AIF estimation in a clinical setup. © 2015 Wiley Periodicals, Inc.
Application of high resolution synchrotron micro-CT radiation in dental implant osseointegration.
Neldam, Camilla Albeck; Lauridsen, Torsten; Rack, Alexander; Lefolii, Tore Tranberg; Jørgensen, Niklas Rye; Feidenhans'l, Robert; Pinholt, Else Marie
2015-06-01
The purpose of this study was to describe a refined method using high-resolution synchrotron radiation microtomography (SRmicro-CT) to evaluate osseointegration and peri-implant bone volume fraction after titanium dental implant insertion. SRmicro-CT is considered the gold standard for evaluating bone microarchitecture. Its high resolution, high contrast, and excellent signal-to-noise ratio all contribute to the highest spatial resolutions achievable today. Using SRmicro-CT at a voxel size of 5 μm in an experimental goat mandible model, the peri-implant bone volume fraction was found to increase quickly to 50% as the radial distance from the implant surface increased, and levelled out at approximately 80% at a distance of 400 μm. This method has been successful in depicting the bone and cavities in three dimensions, thereby enabling a more precise estimate of the bone-to-implant contact fraction than previous methods. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Simple F Test Reveals Gene-Gene Interactions in Case-Control Studies
Chen, Guanjie; Yuan, Ao; Zhou, Jie; Bentley, Amy R.; Adeyemo, Adebowale; Rotimi, Charles N.
2012-01-01
Missing heritability is still a challenge for Genome Wide Association Studies (GWAS). Gene-gene interactions may partially explain this residual genetic influence and contribute broadly to complex disease. To analyze gene-gene interactions in case-control studies of complex disease, we propose a simple, non-parametric method that utilizes the F-statistic. This approach consists of three steps. First, we examine the joint distribution of a pair of SNPs in cases and controls separately. Second, an F-test is used to evaluate the ratio of dependence in cases to that in controls. Finally, results are adjusted for multiple tests. This method was used to evaluate gene-gene interactions associated with risk of Type 2 Diabetes among African Americans in the Howard University Family Study. We identified 18 gene-gene interactions (P < 0.0001). Compared with the commonly used logistic regression method, we demonstrate that the F-ratio test is an efficient approach to measuring gene-gene interactions, especially for studies with limited sample size. PMID:22837643
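The sketch below illustrates the three steps on simulated genotypes. Since the abstract does not spell out the exact dependence statistic, the residual-variance reading of "dependence" used here is an assumption, not the authors' implementation:

```python
# Sketch of an F-ratio screen for SNP-SNP dependence in cases vs controls.
# Dependence is read here as the residual variance of one SNP regressed on
# the other (smaller residual variance = stronger dependence).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def residual_variance(g1, g2):
    slope, intercept = np.polyfit(g1, g2, 1)
    resid = g2 - (slope * g1 + intercept)
    return resid.var(ddof=2)

# Simulated genotypes (0/1/2): an interaction induces SNP-SNP correlation
# in cases but not in controls.
n = 500
g1_ctrl, g2_ctrl = rng.integers(0, 3, n), rng.integers(0, 3, n)
g1_case = rng.integers(0, 3, n)
g2_case = np.clip(g1_case + rng.integers(-1, 2, n), 0, 2)

v_case = residual_variance(g1_case, g2_case)
v_ctrl = residual_variance(g1_ctrl, g2_ctrl)
f_stat = v_ctrl / v_case                  # > 1: stronger dependence in cases
p = stats.f.sf(f_stat, n - 2, n - 2)      # one-sided F-test p-value
print(f"F = {f_stat:.2f}, p = {p:.2e}")
# In a genome-wide screen this p-value would then be adjusted for multiple
# testing, as in step three of the method.
```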
Effect of sample stratification on dairy GWAS results
2012-01-01
Background Artificial insemination and genetic selection are major factors contributing to population stratification in dairy cattle. In this study, we analyzed the effect of sample stratification and the effect of stratification correction on results of a dairy genome-wide association study (GWAS). Three methods for stratification correction were used: the efficient mixed-model association expedited (EMMAX) method accounting for correlation among all individuals, a generalized least squares (GLS) method based on half-sib intraclass correlation, and a principal component analysis (PCA) approach. Results Historical pedigree data revealed that the 1,654 contemporary cows in the GWAS were all related when traced through approximately 10–15 generations of ancestors. Genome and phenotype stratifications had a striking overlap with the half-sib structure. A large elite half-sib family of cows contributed to the detection of favorable alleles that had low frequencies in the general population and high frequencies in the elite cows, and contributed to the detection of X chromosome effects. All three methods for stratification correction reduced the number of significant effects. The EMMAX method had the most severe reduction in the number of significant effects, while the PCA method using 20 principal components and GLS had similar significance levels. Removal of the elite cows from the analysis without using stratification correction removed many effects that were also removed by the three methods for stratification correction, indicating that stratification correction could have removed some true effects due to the elite cows. SNP effects with good consensus between different methods and effect size distributions from USDA’s Holstein genomic evaluation included the DGAT1-NIBP region of BTA14 for production traits, a SNP 45 kb upstream of PIGY on BTA6 and two SNPs in NIBP on BTA14 for protein percentage. However, most of these consensus effects had similar frequencies in the elite and average cows. Conclusions Genetic selection and extensive use of artificial insemination contributed to overlapping genome, pedigree and phenotype stratifications. The presence of an elite cluster of cows was related to the detection of rare favorable alleles that had high frequencies in the elite cluster and low frequencies in the remaining cows. Methods for stratification correction could have removed some true effects associated with genetic selection. PMID:23039970
Techniques of Acceleration for Association Rule Induction with Pseudo Artificial Life Algorithm
NASA Astrophysics Data System (ADS)
Kanakubo, Masaaki; Hagiwara, Masafumi
Frequent pattern mining is one of the important problems in data mining. Generally, the number of potential rules grows rapidly as the size of the database increases, making it hard for a user to extract the association rules. To avoid this difficulty, we propose a new method for association rule induction with a pseudo artificial life approach. The proposed method decides whether an item set containing N or more items exists in two transactions; if it does, the item sets contained in that part of the transactions are recorded. Iterating this step extracts the association rules, so it is not necessary to evaluate a huge number of candidate rules. In the evaluation test, we compared the association rules extracted by our method with the rules produced by other algorithms such as the Apriori algorithm. In an evaluation using large retail market-basket data, our method was approximately 10 to 20 times faster than the Apriori algorithm and many of its variants.
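A minimal sketch of the core step follows, with a hypothetical toy transaction list; the random-pairing loop and parameter names are illustrative stand-ins, not the authors' implementation:

```python
# Core step described above: repeatedly pick two transactions, intersect
# them, and record the common item set when it has >= N items. Frequently
# rediscovered intersections become the frequent-pattern candidates,
# without enumerating the full candidate-rule space.
import random

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread", "eggs"},
    {"beer", "chips"},
    {"milk", "bread", "butter", "eggs"},
]

N = 2            # minimum size of a candidate item set (assumed)
n_trials = 1000  # pseudo artificial-life agents sampling transaction pairs
candidates = {}

random.seed(42)
for _ in range(n_trials):
    t1, t2 = random.sample(transactions, 2)
    common = frozenset(t1 & t2)
    if len(common) >= N:
        candidates[common] = candidates.get(common, 0) + 1

for itemset, hits in sorted(candidates.items(), key=lambda kv: -kv[1]):
    print(set(itemset), hits)
```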
Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction
NASA Astrophysics Data System (ADS)
Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad
2018-03-01
In bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (acquisition cost) while the reliability of the bearing determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered in bearing procurement analysis. This paper presents a bearing evaluation method based on total cost of ownership analysis, which treats price and maintenance cost as decision criteria. Furthermore, since failure data are lacking at the bearing evaluation phase, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, while for short-term planning the cheaper but less reliable one is preferable. This context dependence can give rise to conflict between stakeholders; thus, the planning horizon needs to be agreed upon by all stakeholders before making a procurement decision.
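As a sketch of this trade-off, the code below combines the standard basic-rating-life prediction (L10 = (C/P)^3 million revolutions for ball bearings) with a simple total-cost-of-ownership comparison; all prices, loads, horizons and the replacement-cost model are made-up numbers, not the paper's data:

```python
# Total cost of ownership (TCO) over a planning horizon, with bearing life
# predicted from the dynamic load rating C and equivalent load P.

def l10_hours(C, P, rpm, exponent=3.0):
    # Basic rating life: L10 = (C/P)^p million revolutions (p = 3 for
    # ball bearings), converted to operating hours.
    return (C / P) ** exponent * 1e6 / (rpm * 60.0)

def tco(price, C, P, rpm, horizon_hours, replacement_labor):
    life = l10_hours(C, P, rpm)
    n_replacements = int(horizon_hours // life)
    return price + n_replacements * (price + replacement_labor)

load, speed = 2.0, 1500.0            # kN, rpm (assumed duty)
cheap  = dict(price=100.0, C=20.0)   # lower dynamic load rating
pricey = dict(price=180.0, C=30.0)   # higher rating -> longer life

for horizon in (5_000.0, 40_000.0):  # short- vs long-term planning
    c1 = tco(cheap["price"],  cheap["C"],  load, speed, horizon, 50.0)
    c2 = tco(pricey["price"], pricey["C"], load, speed, horizon, 50.0)
    print(f"horizon {horizon:>8.0f} h: cheap {c1:6.0f}  pricey {c2:6.0f}")
# With these numbers the cheap bearing wins on the short horizon, while
# the more reliable one wins once replacements accumulate.
```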
Kasesaz, Y; Khalafi, H; Rahmani, F
2013-12-01
Optimization of the Beam Shaping Assembly (BSA) has been performed using the MCNP4C Monte Carlo code to shape the 2.45 MeV neutrons produced in the D-D neutron generator. The optimal design of the BSA, chosen by considering in-air figures of merit (FOM), consists of 70 cm Fluental as a moderator, 30 cm Pb as a reflector, 2 mm (6)Li as a thermal neutron filter and 2 mm Pb as a gamma filter. The neutron beam can also be evaluated by in-phantom parameters, from which the therapeutic gain can be derived. Direct evaluation of both sets of FOMs (in-air and in-phantom) is very time consuming. In this paper a Response Matrix (RM) method is suggested to reduce the computing time. This method is based on considering the neutron spectrum at the beam exit and calculating the contribution of the various dose components in the phantom to construct the Response Matrix. Results show good agreement between direct calculation and the RM method. Copyright © 2013 Elsevier Ltd. All rights reserved.
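A minimal sketch of the Response Matrix idea follows; the matrix entries and spectrum are placeholders (in practice they come from MCNP transport runs), but it shows why the method is cheap: each new candidate beam is scored with a matrix-vector product instead of a fresh Monte Carlo simulation:

```python
# Precompute, once, the in-phantom response of each beam-exit energy
# group; then evaluate any candidate spectrum with a dot product.
import numpy as np

# R[g, d]: dose at phantom depth d per unit fluence in energy group g
# (placeholder values standing in for MCNP results).
R = np.array([
    [0.9, 0.5, 0.2, 0.1],   # thermal group
    [0.7, 0.6, 0.3, 0.1],   # epithermal group
    [0.5, 0.4, 0.2, 0.1],
    [0.3, 0.3, 0.2, 0.1],
    [0.2, 0.2, 0.2, 0.2],   # fast group
])

# Beam-exit group fluences of one candidate BSA configuration (assumed).
spectrum = np.array([0.1, 0.6, 0.2, 0.05, 0.05])

dose_vs_depth = spectrum @ R   # in-phantom FOM without a new MC run
print(dose_vs_depth)
```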
Li, Lian-Hui; Mo, Rong
2015-01-01
The production task queue has great significance for manufacturing resource allocation and scheduling decisions. Manual, qualitative queue optimization methods perform poorly and are difficult to apply. A production task queue optimization method is therefore proposed based on multi-attribute evaluation. According to the task attributes, a hierarchical multi-attribute model is established and the indicator quantization methods are given. To calculate the objective indicator weights, criteria importance through intercriteria correlation (CRITIC) is selected from three common methods. To calculate the subjective indicator weights, a BP neural network is used to determine the judge importance degree, and a trapezoid fuzzy scale-rough AHP considering the judge importance degree is then put forward. The balanced weight, which integrates the objective and subjective weights, is calculated based on a multi-weight contribution balance model. The technique for order preference by similarity to an ideal solution (TOPSIS), improved by replacing Euclidean distance with relative entropy distance, is used to sequence the tasks and optimize the queue by the weighted indicator values. A case study illustrates the method's correctness and feasibility.
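The final ranking step can be sketched as follows. The decision matrix, weights, and the exact divergence form are illustrative assumptions; only the overall structure (weighted normalization, ideal/anti-ideal solutions, relative-entropy separation, closeness ranking) follows the description above:

```python
# TOPSIS with a relative-entropy (KL-style) separation measure in place
# of Euclidean distance. All indicators are treated as benefit-type.
import numpy as np

X = np.array([            # rows: tasks, columns: indicators (assumed)
    [7.0, 0.6, 120.0],
    [5.0, 0.9, 100.0],
    [9.0, 0.4, 150.0],
])
w = np.array([0.5, 0.3, 0.2])    # balanced weights from the CRITIC/AHP step

V = X / X.sum(axis=0) * w        # weighted, normalized decision matrix
ideal, anti = V.max(axis=0), V.min(axis=0)

def rel_entropy(p, q, eps=1e-12):
    # Generalized KL divergence used as a separation measure.
    p, q = p + eps, q + eps
    return float(np.sum(p * np.log(p / q) - p + q))

d_pos = np.array([rel_entropy(ideal, v) for v in V])
d_neg = np.array([rel_entropy(anti, v) for v in V])
closeness = d_neg / (d_pos + d_neg)   # higher = closer to the ideal

order = np.argsort(-closeness)        # queue order, best task first
print(order, closeness.round(3))
```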
Ma, Ryewon; Jung, Dukyoo
2016-02-01
This study was done to develop a postural-stability patient transfer technique for care helpers in nursing homes and to evaluate its effectiveness. Four types of patient transfer (lifting towards the head board of the bed, turning to the lateral position, sitting upright on the bed, and transferring from wheelchair to bed) were practiced in accordance with three methods: transfer methods habitually used by care helpers (Method 1), patient transfer methods according to care helper standard textbooks (Method 2), and a postural-stability method developed by the author (Method 3). The care helpers' muscle activity and four joint angles were measured. The collected data were analyzed using SPSS Statistics 21.0. To compare muscle activity and joint angles, the Friedman test was executed and post-hoc analysis was conducted using the Wilcoxon signed-rank test. Muscle activity was significantly lower during Method 3 than during Methods 1 and 2. In addition, the knee and shoulder joint angles were significantly lower while performing Method 3 than Methods 1 and 2. The findings indicate that using postural-stability patient transfer techniques can contribute to the prevention of the musculoskeletal disease that care helpers suffer from due to physically demanding patient care in nursing homes.
Barros Silva, Gyl Eanes; Costa, Roberto Silva; Ravinal, Roberto Cuan; Saraiva e Silva, Jucélia; Dantas, Marcio; Coimbra, Terezila Machado
2010-03-01
To demonstrate that the evaluation of erythrocyte dysmorphism by light microscopy with lowering of the condenser lens (LMLC) is useful to identify patients with haematuria of glomerular or non-glomerular origin, a comparative double-blind study between phase contrast microscopy (PCM) and LMLC is reported, evaluating the efficacy of these techniques. Urine samples of 39 patients followed up for 9 months were analyzed and classified as glomerular or non-glomerular haematuria. The microscopic techniques were compared using receiver operating characteristic (ROC) curve analysis and the area under the curve (AUC); reproducibility was assessed by the coefficient of variation (CV). Specific cut-offs were set for each method according to their best rates of specificity and sensitivity: 30% for phase contrast microscopy and 40% for standard LMLC, yielding 95% sensitivity and 100% specificity for the first method and 90% sensitivity and 100% specificity for the second. In the ROC analysis, the AUC was 0.99 for PCM and 0.96 for LMLC. The CV was very similar in the glomerular haematuria group for PCM (35%) and LMLC (35.3%). LMLC proved effective in directing the investigation of haematuria toward a nephrological or urological origin, and can substitute for PCM when that equipment is not available.
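The cut-off selection can be sketched as follows on simulated data (not the study's patients; the distributions and threshold grid are assumptions):

```python
# Sweep thresholds on the percentage of dysmorphic erythrocytes and pick
# the cut-off maximizing sensitivity + specificity, as in ROC analysis.
import numpy as np

rng = np.random.default_rng(1)
pct_glom = rng.normal(60, 15, 20).clip(0, 100)     # glomerular cases
pct_nonglom = rng.normal(15, 10, 19).clip(0, 100)  # non-glomerular cases
scores = np.concatenate([pct_glom, pct_nonglom])
labels = np.array([1] * 20 + [0] * 19)             # 1 = glomerular

best = None
for cutoff in np.arange(0, 101, 5):
    pred = scores >= cutoff
    sens = (pred & (labels == 1)).sum() / (labels == 1).sum()
    spec = (~pred & (labels == 0)).sum() / (labels == 0).sum()
    if best is None or sens + spec > best[1] + best[2]:
        best = (cutoff, sens, spec)

print(f"best cutoff {best[0]}%: sensitivity {best[1]:.2f}, "
      f"specificity {best[2]:.2f}")
# The AUC (e.g. sklearn.metrics.roc_auc_score) would then compare PCM
# against LMLC, as in the study's ROC analysis.
```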
Curriculum renewal in child psychiatry.
Hanson, M; Tiberius, R; Charach, A; Ulzen, T; Sackin, D; Jain, U; Reiter, S; Shomair, G
1999-11-01
To ensure uniform design and evaluation of a clerkship curriculum in child and adolescent psychiatry that teaches common disorders and problems efficiently across 5 teaching sites, and to include structures for continuous improvement. The curriculum committee selected for course inclusion those disorders and problems of child psychiatry commonly encountered by primary care physicians. Instruction methods that encouraged active student learning were selected. Course coordination across sites was encouraged by several methods: involving faculty, adopting a centralized examination format, and aligning teaching methods with the examination format. Quantitative and qualitative methods were used to measure students' perceptions of the course's value. These evaluative results were reviewed, and course modifications were implemented and re-evaluated. The average adjusted student return rate for course evaluation questionnaires over the 3-year study period was 63%. Clerks' ratings of the course's learning value demonstrated that the course improved significantly and continually across all sites, according to a Scheffé post-hoc analysis. Analysis of student statements from focus-group transcripts contributed to course modifications, such as the Brief Focused Interview (BFI). Our curriculum in child psychiatry, which focused on common problems and used active learning methods, was viewed as a valuable learning experience by clinical clerks. Curriculum coordination across multiple teaching sites was accomplished by including faculty in the process and by using specific teaching and examination strategies. Structures for continuous course improvement were effective.
NASA Astrophysics Data System (ADS)
Develaki, Maria
2017-11-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework, teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how computer simulations can support students' model-based reasoning. We first provide a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-making abilities; it also explains how computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.
Details on suicide among US physicians: data from the National Violent Death Reporting System.
Gold, Katherine J; Sen, Ananda; Schwenk, Thomas L
2013-01-01
Physician suicide is an important public health problem as the rate of suicide is higher among physicians than the general population. Unfortunately, few studies have evaluated information about mental health comorbidities and psychosocial stressors which may contribute to physician suicide. We sought to evaluate these factors among physicians versus non-physician suicide victims. We used data from the United States National Violent Death Reporting System to evaluate demographics, mental health variables, recent stressors and suicide methods among physician versus non-physician suicide victims in 17 states. The data set included 31,636 suicide victims of whom 203 were identified as physicians. Multivariable logistic regression found that having a known mental health disorder or a job problem which contributed to the suicide significantly predicted being a physician. Physicians were significantly more likely than non-physicians to have antipsychotics, benzodiazepines and barbiturates present on toxicology testing but not antidepressants. Mental illness is an important comorbidity for physicians who complete a suicide but postmortem toxicology data shows low rates of medication treatment. Inadequate treatment and increased problems related to job stress may be potentially modifiable risk factors to reduce suicidal death among physicians. Copyright © 2013 Elsevier Inc. All rights reserved.
Capsicum--production, technology, chemistry, and quality. Part IV. Evaluation of quality.
Govindarajan, V S; Rajalakshmi, D; Chand, N
1987-01-01
Capsicum fruits are popular worldwide and are used in the cuisines of both the developing and the developed countries. With its different varieties, forms, and uses, the spice capsicum contributes to the entire gamut of sensory experience--color as finely ground paprika powder or extract in sausages, goulash, cheese, and snacks; both pungency and color as the many varieties of chillies used in Mexican, African, Indian, and southeast Asian cuisines; color, aroma, and mild pungency as the fresh green chillies used in many of the growing countries; and appearance, color, aroma, and texture as fresh fruit in salads and as a pickled and canned product. In three earlier parts in this series, the varieties, cultivation, and primary processing; the processed products, world production, and trade; and the chemistry of the color, aroma, and pungency stimuli have been reviewed. In this part, the evaluation of quality through instrumental determination of the causal components and the sensory evaluation of color, aroma, and pungency are discussed. Several methods for quantitative determination of the stimuli and the sensory evaluation of the responses to the stimuli are reviewed. The problems of sensory evaluation of color, aroma, and pungency, the dominant attributes for validation of the instrumentally determined values for carotenoids, volatiles, or particular fractions, and total and individual capsaicinoids are specifically discussed. Summarized details of selected instrumental methods for evaluating the stimuli, which are either validated by correlation to sensorily perceived responses or to adopted standards, are given along with representative data obtained for discussing the adequacy and reliability of the methods. Pungency as a specific gustatory perception and the many methods proposed to evaluate this quality are discussed. A recommended objective procedure for obtaining reproducible values is discussed, and a method for relating different panel results is shown. With such a method, highly significant correlations have been shown between estimated total capsaicinoids and the determined pungency. The estimation of total capsaicinoids by any simple, reliable method is shown to be adequate for quality control of pungency of Capsicum fruits.
A Novel Computational Method to Reduce Leaky Reaction in DNA Strand Displacement
Li, Xin; Wang, Xun; Song, Tao; Lu, Wei; Chen, Zhihua; Shi, Xiaolong
2015-01-01
The DNA strand displacement technique is widely used in DNA programming, DNA biosensors, and gene analysis. In DNA strand displacement, leaky reactions can cause DNA signals to decay and DNA signal detection to fail. The most commonly used method to avoid leakage is cleaning up after upstream leaky reactions, and it remains a challenge to develop reliable DNA strand displacement techniques with low leakage. In this work, we address the challenge by experimentally evaluating the basic factors contributing to leakage in DNA strand displacement, including reaction time, ratio of reactants, and ion concentration. Specifically, fluorescent probes and a hairpin-structure reporting DNA strand are designed to detect the output of DNA strand displacement, and thus to evaluate the leakage of DNA strand displacement reactions under different reaction times, ratios of reactants, and ion concentrations. From the obtained data, mathematical models for evaluating leakage are derived by curve fitting. As a result, it is found that long incubation times, high concentrations of fuel strand, and appropriately chosen ion concentrations can weaken leaky reactions. This contributes a method to set proper reaction conditions to reduce leakage in DNA strand displacement. PMID:26491602
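The model-building step can be sketched with scipy. The exponential form and the data points below are illustrative assumptions; the paper derives its own empirical models for time, reactant ratio, and ion concentration:

```python
# Fit an assumed leakage-versus-incubation-time curve to toy measurements.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 4, 8, 16, 24])                       # hours
leak = np.array([0.20, 0.18, 0.15, 0.11, 0.07, 0.04, 0.03])   # a.u.

def leak_model(t, a, k, c):
    # Hypothetical form: leakage decaying with incubation toward a floor c.
    return a * np.exp(-k * t) + c

(a, k, c), _ = curve_fit(leak_model, t, leak, p0=(0.2, 0.2, 0.02))
print(f"amplitude a = {a:.3f}, rate k = {k:.3f} 1/h, floor c = {c:.3f}")
# Fitted curves like this let one choose reaction conditions (incubation
# time, fuel ratio, ion concentration) that keep leakage low.
```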
Bau, Cho-Tsan; Huang, Chung-Yi
2014-01-01
Objective: To construct a clinical decision support system (CDSS) for hospitalized diabetic patients undergoing surgery, based on domain ontology and rule-based reasoning. Materials and Methods: The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé–Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. Results: The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. Conclusions: The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia. PMID:24730353
NASA Astrophysics Data System (ADS)
Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie
2014-12-01
To address the multicollinearity issue and the unequal contributions of vascular parameters in the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemia model mice. Taking vascular volume as the ground truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, indicating that these are the key vascular parameters for the quantification of angiogenesis. The proposed evaluation method was compared with both ABTs applied directly to the nine vascular parameters and Pearson correlation, and the results were consistent. In contrast to ABTs applied directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all of them are assembled into the sparse PCs with the highest relative importance.
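A rough sketch of the two-stage analysis follows, on simulated data; scikit-learn's SparsePCA and GradientBoostingRegressor stand in for the sparse-PC construction and the aggregated-boosted-trees implementation, which is an assumption about tooling, not the paper's code:

```python
# Stage 1: bundle correlated vascular parameters into sparse PCs.
# Stage 2: rank PC importance for predicting vascular volume with
# boosted trees, then read off which parameters load on important PCs.
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 200                                   # mice
base = rng.normal(size=(n, 3))            # latent vascular factors
# Nine collinear "vascular parameters" built from three latent factors.
X = np.hstack([base + 0.3 * rng.normal(size=(n, 3)) for _ in range(3)])
y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.2, size=n)  # "volume"

spca = SparsePCA(n_components=4, alpha=1.0, random_state=0)
Z = spca.fit_transform(X)                 # sparse PCs reduce collinearity

gbr = GradientBoostingRegressor(random_state=0).fit(Z, y)
for i, imp in enumerate(gbr.feature_importances_):
    members = np.nonzero(spca.components_[i])[0]
    print(f"PC{i}: importance {imp:.2f}, loads on parameters {members}")
# Parameters loading on high-importance PCs are the key descriptors,
# mirroring the paper's use of sparse PC loadings plus ABTs.
```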
The Contribution of Vascular Receptors to +Gz Tolerance
1975-11-30
the possible effects of fatigue and habituation of the receptor on the magnitude of the cardiovascular antigravity responses; and (4) contrast... responses obtained with and without an antigravity suit. B. Sixteen-Month Performance Program: Various factors precluded the opportunity to... antigravity suit was used on 2 dogs. METHODS: This investigation required the degree of impairment in +Gz tolerance of instrumented dogs to be evaluated
NASA Astrophysics Data System (ADS)
Torrungrueng, Danai; Johnson, Joel T.; Chou, Hsi-Tseng
2002-03-01
The novel spectral acceleration (NSA) algorithm has been shown to produce an $\mathcal{O}(N_{tot})$ efficient iterative method of moments for the computation of radiation/scattering from both one-dimensional (1-D) and two-dimensional large-scale quasi-planar structures, where $N_{tot}$ is the total number of unknowns to be solved. This method accelerates the matrix-vector multiplication in an iterative method of moments solution and divides contributions between points into "strong" (exact matrix elements) and "weak" (NSA algorithm) regions. The NSA method is based on a spectral representation of the electromagnetic Green's function and appropriate contour deformation, resulting in a fast multipole-like formulation in which contributions from large numbers of points to a single point are evaluated simultaneously. In the standard NSA algorithm the NSA parameters are derived on the basis of the assumption that the outermost possible saddle point, $\phi_{s,max}$, along the real axis in the complex angular domain is small. For given height variations of quasi-planar structures, this assumption can be satisfied by adjusting the size of the strong region $L_s$. However, for quasi-planar structures with large height variations, the adjusted size of the strong region is typically large, resulting in significant increases in computational time for the computation of the strong-region contribution and degrading the overall efficiency of the NSA algorithm. In addition, for the case of extremely large scale structures, studies based on the physical optics approximation and a flat surface assumption show that the given NSA parameters in the standard NSA algorithm may yield inaccurate results. In this paper, analytical formulas associated with the NSA parameters for an arbitrary value of $\phi_{s,max}$ are presented, resulting in more flexibility in selecting $L_s$ to compromise between the computation of the contributions of the strong and weak regions. In addition, a "multilevel" algorithm, decomposing 1-D extremely large scale quasi-planar structures into more than one weak region and appropriately choosing the NSA parameters for each weak region, is incorporated into the original NSA method to improve its accuracy.
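The strong/weak splitting can be caricatured as follows. This toy uses a 1/r-style kernel and a crude running-aggregate far field in place of the actual spectral (saddle-point) Green's-function evaluation, so it conveys only the cost structure, not the NSA mathematics:

```python
# Matrix-vector product split into a strong region (exact elements for
# points within Ls) and a weak region (distant sources lumped into a
# running total + centroid on each side, a level-0 multipole stand-in).
import numpy as np

N, Ls = 200, 10                      # unknowns, strong-region half-width
x = np.linspace(0.0, 1.0, N)         # points on a quasi-planar surface
I = np.random.default_rng(0).normal(size=N)   # source coefficients

def G(d):                            # toy kernel, not the EM Green's fn
    return 1.0 / (np.abs(d) + 1e-3)

y = np.zeros(N)
for i in range(N):                   # strong region: exact, O(Ls) per row
    near = np.arange(max(0, i - Ls), min(N, i + Ls + 1))
    y[i] = G(x[i] - x[near]) @ I[near]

# Weak region: one forward and one backward sweep keep the whole far-field
# part near O(N), mimicking many-to-one evaluation in a single pass.
for step, order in ((1, range(N)), (-1, range(N - 1, -1, -1))):
    s_tot, c_pos, cnt = 0.0, 0.0, 0
    for i in order:
        j = i - step * (Ls + 1)      # source index entering the far zone
        if 0 <= j < N:
            s_tot += I[j]; c_pos += x[j]; cnt += 1
        if cnt:
            y[i] += G(x[i] - c_pos / cnt) * s_tot
print(y[:3])
```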
Working towards the SDGs: measuring resilience from a practitioner's perspective
NASA Astrophysics Data System (ADS)
van Manen, S. M.; Both, M.
2015-12-01
The broad universal nature of the SDGs requires integrated approaches across development sectors and action at a variety of scales, from global to local. In humanitarian and development contexts, particularly at the local level, working towards these goals is increasingly approached through the concept of resilience. Resilience is broadly defined as the ability to minimise the impact of, cope with and recover from the consequences of shocks and stresses, both natural and manmade, without compromising long-term prospects. Key in this are the physical resources required and the ability to organise these prior to and during a crisis. However, despite the active debate on the theoretical foundations of resilience, measurement approaches remain comparatively underdeveloped, and the conceptual diversity of the few existing approaches further illustrates the complexity of operationalising the concept. Here we present a practical method to measure community resilience using a questionnaire composed of a generic set of household-level indicators. Rooted in the sustainable livelihoods approach, it considers six domains: human, social, natural, economic, physical and political, and evaluates both resources and socio-cognitive factors. It is intended to be combined with more specific intervention-based questionnaires to systematically assess, monitor and evaluate the resilience of a community and the contribution of specific activities to resilience. Its use is illustrated with a Haiti-based case study. The method presented supports knowledge-based decision making and impact monitoring. Furthermore, the evidence-based way of working contributes to accountability to a range of stakeholders and can be used for resource mobilisation. It should be noted, however, that due to its inherent complexity and comprehensive nature, no method or combination of methods and data types can fully capture resilience in and across all of its facets, scales and domains.
Electrophysiology of Cranial Nerve Testing: Cranial Nerves IX and X.
Martinez, Alberto R M; Martins, Melina P; Moreira, Ana Lucila; Martins, Carlos R; Kimaid, Paulo A T; França, Marcondes C
2018-01-01
The cranial nerves IX and X emerge from the medulla oblongata and have motor, sensory, and parasympathetic functions, some of which are amenable to neurophysiological assessment. It is often hard to separate the individual contribution of each nerve; in fact, some of the techniques are composite functional measures of both nerves. The main methods are evaluation of the swallowing function (combined IX and X), laryngeal electromyography (predominantly motor vagal function), and heart rate variability (predominantly parasympathetic vagal function). This review therefore describes the techniques that best evaluate the major symptoms of IX and X cranial nerve disturbance: dysphagia, dysphonia, and autonomic parasympathetic dysfunction.
Health technology assessment. Evaluation of biomedical innovative technologies.
Turchetti, Giuseppe; Spadoni, Enza; Geisler, Eliezer Elie
2010-01-01
This article describes health technology assessment (HTA) as an evaluation tool that applies systematic methods of inquiry to the generation and use of health technologies and new products. The focus of this article is on the contributions of HTA to the management of the new product development effort in the biomedical organization. Critical success factors (CSFs) are listed, and their role in assessing success is defined and explained. One of the conclusions of this article is that HTA is a powerful tool for managers in the biomedical sector, allowing them to better manage their innovation effort in their continuing struggle for competitiveness and survival.
Nodal weighting factor method for ex-core fast neutron fluence evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, R. T.
The nodal weighting factor method is developed for evaluating ex-core fast neutron flux in a nuclear reactor by utilizing adjoint neutron flux, a fictitious unit detector cross section for neutron energy above 1 or 0.1 MeV, the unit fission source, and relative assembly nodal powers. The method determines each nodal weighting factor for ex-core fast neutron flux evaluation by solving the steady-state adjoint neutron transport equation with the fictitious unit detector cross section as the adjoint source, by integrating the unit fission source, weighted with a typical fission spectrum, against the solved adjoint flux over all energies, all angles and the given nodal volume, and by dividing the result by the sum of all nodal weighting factors, which serves as a normalization factor. The fast neutron flux is then obtained by summing the relative nodal powers of the adjacent, significantly contributing peripheral assembly nodes, each multiplied by the corresponding nodal weighting factor and by a proper fast neutron attenuation coefficient, over an operating period. A generic set of nodal weighting factors can be used to evaluate neutron fluence at the same location for similar core designs and fuel cycles, but the set needs to be re-calibrated for a transition fuel cycle. This newly developed nodal weighting factor method should be a useful and simplified tool for evaluating fast neutron fluence at selected locations of interest in ex-core components of contemporary nuclear power reactors.
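To make the final combination step concrete, here is a minimal Python sketch (synthetic numbers, not reactor data) of how normalized weighting factors, relative nodal powers and an assumed attenuation coefficient would be combined into an ex-core fast flux estimate:

```python
import numpy as np

# Minimal sketch of the combination step of the nodal weighting factor
# method: once the weighting factors have been obtained from the adjoint
# transport solution (and normalized to sum to one), the ex-core fast flux
# is a power-weighted sum over the contributing peripheral assembly nodes.
# All names and values are illustrative.

def excore_fast_flux(relative_nodal_powers, weighting_factors, attenuation=1.0):
    """Relative fast flux at the ex-core location for one operating period."""
    w = np.asarray(weighting_factors, dtype=float)
    w = w / w.sum()                      # normalization mentioned in the abstract
    p = np.asarray(relative_nodal_powers, dtype=float)
    return attenuation * np.sum(p * w)

# Example: four peripheral nodes adjacent to the ex-core location.
flux = excore_fast_flux([1.05, 0.98, 1.10, 0.87], [0.4, 0.3, 0.2, 0.1],
                        attenuation=0.93)
print(f"relative ex-core fast flux: {flux:.3f}")
```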
Nübling, R; Kaluscha, R; Krischak, G; Kriz, D; Martin, H; Müller, G; Renzland, J; Reuss-Borst, M; Schmidt, J; Kaiser, U; Toepler, E
2017-02-01
Aim of the Study The outcome quality of medical rehabilitation is often evaluated by "Patient Reported Outcomes" (PROs). We examined to what extent these PROs correspond with "hard" or "objective" outcomes such as payments of contributions to social insurance. Methods The "rehabilitation QM outcome study" includes self-reports of patients as well as data from the Rehabilitation Statistics Database (RSD) of the German pension insurance Baden-Wurttemberg. The sample for the question posed includes N=2 947 insured persons who were treated in 2011 in 21 clinics of the "health quality network" and who were either employed or unemployed at the time of the rehabilitation application (i.e. the labour force group; response rate: 55%). The sample proved largely representative of the population of insured persons. Results PROs and payments of contributions to pension insurance clearly correspond. In the year after rehabilitation, improved and non-improved rehabilitees differed clearly with regard to their payments of contributions. Conclusions The results support the validity of PROs. For a comprehensive depiction of the outcome quality of rehabilitation, PROs and payments of contributions should be considered complementary. © Georg Thieme Verlag KG Stuttgart · New York.
van der Ham, Alida J; van Erp, Nicole; Broerse, Jacqueline E W
2016-04-01
The aim of this study was to gain better insight into the quality of patient participation in the development of clinical practice guidelines and to contribute to approaches for the monitoring and evaluation of such initiatives. In addition, we explore the potential of a dialogue-based approach for reconciling the preferences of patients and professionals in the guideline development process. The development of the Multidisciplinary Guideline for Employment and Severe Mental Illness in the Netherlands served as a case study. Methods for patient involvement in guideline development included the following: four patient representatives in the guideline development group (GDG) and advisory committee, two focus group discussions with patients, a dialogue session and eight case studies. To evaluate the quality of patient involvement, we developed a monitoring and evaluation framework including both process and outcome criteria. Data collection included observations, document analysis and semi-structured interviews (n = 26). The quality of patient involvement was enhanced by the use of different methods, reflection of patient input in the guideline text, a supportive attitude among professionals and attention to patient involvement throughout the process. The quality was lower with respect to representing the diversity of the target group, articulation of the patient perspective in the GDG, and clarity and transparency concerning methods of involvement. The monitoring and evaluation framework was useful in providing detailed insights into patient involvement in guideline development. Patient involvement was evaluated as being of good quality. The dialogue-based approach appears to be a promising method for obtaining integrated stakeholder input in a multidisciplinary setting. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Liu, Y. B.; Gebremeskel, S.; de Smedt, F.; Hoffmann, L.; Pfister, L.
2006-02-01
A method is presented to evaluate the storm runoff contributions from different land-use class areas within a river basin using the geographical information system-based hydrological model WetSpa. The modelling is based on division of the catchment into a grid mesh. Each cell has a unique response function, independent of the functioning of other cells. Summation of the flow responses from the cells with the same land-use type yields the storm runoff contribution from these areas. The model was applied to the Steinsel catchment in the Alzette river basin, Grand Duchy of Luxembourg, with 52 months of meteo-hydrological measurements. The simulation results show that direct runoff from urban areas dominates flood events compared with runoff from other land-use areas in this catchment, and that this dominance tends to increase for small floods and dry-season floods, whereas the interflow from forested, pasture and agricultural field areas contributes to recession flow. It is demonstrated that, relative to their percentage of land-use class areas within the study catchment, the contribution from urban areas decreases with flow coefficient, the contribution from cropland is nearly constant, and the contribution from grassland and woodland increases with flow coefficient.
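As an illustration of the aggregation idea, the following Python sketch (with synthetic unit responses and rainfall, not WetSpa output) sums the convolved responses of all cells sharing a land-use class:

```python
import numpy as np

# Every grid cell responds independently, so the storm-runoff contribution
# of a land-use class is the sum of the convolved responses of its cells.
# Cell response functions and rainfall excess below are synthetic stand-ins.

def class_contribution(rain_excess, cell_responses):
    """Sum of convolutions of rainfall excess with each cell's unit response."""
    total = np.zeros(len(rain_excess) + max(len(r) for r in cell_responses) - 1)
    for resp in cell_responses:
        q = np.convolve(rain_excess, resp)
        total[:len(q)] += q
    return total

rain = np.array([0.0, 4.0, 9.0, 3.0, 0.5])            # rainfall excess per step
urban = [np.array([0.6, 0.3, 0.1])] * 3               # fast cell responses
forest = [np.array([0.1, 0.2, 0.3, 0.25, 0.15])] * 5  # slow cell responses

q_urban = class_contribution(rain, urban)
q_forest = class_contribution(rain, forest)
print(f"peak urban runoff: {q_urban.max():.2f}, peak forest runoff: {q_forest.max():.2f}")
```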
Measurement equivalence: a glossary for comparative population health research.
Morris, Katherine Ann
2018-03-06
Comparative population health studies are becoming more common and are advancing solutions to crucial public health problems, but decades-old measurement equivalence issues remain without a common vocabulary to identify and address the biases that contribute to non-equivalence. This glossary defines sources of measurement non-equivalence. While drawing examples from both within-country and between-country studies, this glossary also defines methods of harmonisation and elucidates the unique opportunities in addition to the unique challenges of particular harmonisation methods. Its primary objective is to enable population health researchers to more clearly articulate their measurement assumptions and the implications of their findings for policy. It is also intended to provide scholars and policymakers across multiple areas of inquiry with tools to evaluate comparative research and thus contribute to urgent debates on how to ameliorate growing health disparities within and between countries. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Evaluation of tools for highly variable gene discovery from single-cell RNA-seq data.
Yip, Shun H; Sham, Pak Chung; Wang, Junwen
2018-02-21
Traditional RNA sequencing (RNA-seq) allows the detection of gene expression variations between two or more cell populations through differentially expressed gene (DEG) analysis. However, genes that contribute to cell-to-cell differences are not discoverable with RNA-seq because RNA-seq samples are obtained from a mixture of cells. Single-cell RNA-seq (scRNA-seq) allows the detection of gene expression in each cell. With scRNA-seq, highly variable gene (HVG) discovery allows the detection of genes that contribute strongly to cell-to-cell variation within a homogeneous cell population, such as a population of embryonic stem cells. This analysis is implemented in many software packages. In this study, we compare seven HVG methods from six software packages, including BASiCS, Brennecke, scLVM, scran, scVEGs and Seurat. Our results demonstrate that reproducibility in HVG analysis requires a larger sample size than DEG analysis. Discrepancies between methods and potential issues in these tools are discussed and recommendations are made.
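For readers unfamiliar with HVG analysis, the sketch below illustrates the generic idea behind such tools (it re-implements none of the six packages compared in the study): rank genes by how far their squared coefficient of variation lies above the mean-variance trend. All data are simulated.

```python
import numpy as np

# Generic HVG sketch: rank genes by how far their squared coefficient of
# variation (CV^2) sits above the CV^2-vs-mean trend across cells.

rng = np.random.default_rng(0)
counts = rng.poisson(lam=5.0, size=(200, 1000)).astype(float)   # cells x genes
counts[:, :20] *= rng.gamma(2.0, 1.0, size=(200, 1))            # inject 20 variable genes

mean = counts.mean(axis=0)
cv_sq = counts.var(axis=0) / mean ** 2

# Log-log linear fit of the mean-CV^2 trend; genes with the largest
# positive residuals are called highly variable.
coef = np.polyfit(np.log(mean), np.log(cv_sq), 1)
resid = np.log(cv_sq) - np.polyval(coef, np.log(mean))
hvg_idx = np.argsort(resid)[::-1][:50]
print("top-ranked HVG indices:", hvg_idx[:10])
```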
Wieser, A
2012-03-01
Electron paramagnetic resonance dosimetry with tooth enamel has proved to be a reliable method for retrospectively determining exposures from photon fields, with minimal detectable doses of 100 mGy or lower, which is below what is achievable with cytogenetic dose reconstruction methods. For risk assessment or for validating dosimetry systems for specific radiation incidents, the relevant dose from the incident has to be calculated from the total absorbed dose in enamel by subtracting the additional dose contributions from the radionuclide content in teeth, natural external background radiation and medical exposures. For calculating organ doses or evaluating dosimetry systems, the absorbed dose in enamel from a radiation incident has to be converted to air kerma using dose conversion factors that depend on the photon energy spectrum and the geometry of the exposure scenario. This paper outlines the approach to assess individual dose contributions to absorbed dose in enamel and to calculate the individual air kerma of a radiation incident from the absorbed dose in tooth enamel.
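The dose bookkeeping described above amounts to a subtraction followed by a unit conversion; a minimal sketch with illustrative (not calibration) numbers:

```python
# Minimal sketch of the dose bookkeeping; all numbers are illustrative.
# Doses in mGy; the enamel-to-air-kerma conversion factor is scenario
# dependent (photon spectrum and exposure geometry), assumed here.

total_enamel_dose = 350.0          # EPR-measured absorbed dose in enamel
radionuclide_dose = 10.0           # from radionuclide content in teeth
natural_background = 1.0 * 45      # annual external background x age (mGy)
medical_dose = 25.0                # dental/medical X-ray exposures

incident_enamel_dose = total_enamel_dose - (radionuclide_dose
                                            + natural_background
                                            + medical_dose)

enamel_to_air_kerma = 0.9          # assumed dose conversion factor
air_kerma = incident_enamel_dose / enamel_to_air_kerma
print(f"incident dose in enamel: {incident_enamel_dose:.0f} mGy, "
      f"air kerma: {air_kerma:.0f} mGy")
```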
Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; McLaughlin, Gregory; Lednev, Igor K
2013-09-01
Body fluid traces recovered at crime scenes are among the most common and important types of forensic evidence. However, the ability to characterize a biological stain at a crime scene nondestructively has not yet been demonstrated. Here, we expand the Raman spectroscopic approach for the identification of dry traces of pure body fluids to address the problem of heterogeneous contamination, which can impair the performance of conventional methods. The concept of multidimensional Raman signatures was utilized for the identification of blood in dry traces contaminated with sand, dust, and soil. Multiple Raman spectra were acquired from the samples via automatic scanning, and the contribution of blood was evaluated through the fitting quality using spectroscopic signature components. The spatial mapping technique allowed for detection of "hot spots" dominated by blood contribution. The proposed method has great potential for blood identification in highly contaminated samples. © 2013 American Academy of Forensic Sciences.
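The fitting step can be illustrated with a minimal non-negative least-squares sketch; the spectra and component shapes below are synthetic stand-ins for the multidimensional signatures used in the study:

```python
import numpy as np
from scipy.optimize import nnls

# Each measured spectrum is modeled as a non-negative combination of
# signature components (blood plus contaminants); the blood coefficient
# and fit residual indicate whether a mapped point is a blood "hot spot".

wavenumbers = np.linspace(400, 1800, 500)
blood_sig = np.exp(-((wavenumbers - 1545) / 25) ** 2)   # stand-in component
sand_sig = np.exp(-((wavenumbers - 465) / 40) ** 2)     # stand-in contaminant
components = np.column_stack([blood_sig, sand_sig])

noise = 0.01 * np.random.default_rng(1).normal(size=500)
measured = 0.7 * blood_sig + 0.3 * sand_sig + noise

coeffs, residual = nnls(components, measured)
print(f"blood contribution: {coeffs[0]:.2f}, fit residual: {residual:.3f}")
```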
Health Care Ergonomics: Contributions of Thomas Waters.
Poole Wilson, Tiffany; Davis, Kermit G
2016-08-01
The aim of this study was to assess the contributions of Thomas Waters's work in the field of health care ergonomics and beyond. Waters's research on safe patient handling, with a focus on reducing musculoskeletal disorders (MSDs) in health care workers, contributed to current studies and prevention strategies. He worked with several groups to share his research and assist in developing safe patient handling guidelines and curricula for nursing students and health care workers. The citations of articles that Waters published in health care ergonomics were evaluated for quality and for the themes of their conclusions. Quality was assessed using the Mixed Methods Appraisal Tool and a centrality-to-original-research rating. Themes were documented by the type of population the citing articles were investigating. In total, 266 articles that referenced the top seven cited articles were evaluated. More than 95% of them were rated either medium or high quality. The important themes of these citing articles were as follows: (a) Safe patient handling is effective in reducing MSDs in health care workers. (b) Shift work has a negative impact on nurses. (c) There is no safe way to manually lift a patient. (d) Nursing curricula should contain safe patient handling. The research of Waters has contributed significantly to health care ergonomics and beyond. His work, in combination with that of other pioneers in the field, has generated multiple initiatives, such as a standard safe patient-handling curriculum and safe patient-handling programs. © 2016, Human Factors and Ergonomics Society.
Epidemiology, Policy, and Racial/Ethnic Minority Health Disparities
Carter-Pokras, Olivia; Offutt-Powell, Tabatha; Kaufman, Jay S.; Giles, Wayne; Mays, Vickie
2013-01-01
Purpose Epidemiologists have long contributed to policy efforts to address health disparities. Three examples illustrate how epidemiologists have addressed health disparities in the U.S. and abroad through a “social determinants of health” lens. Methods To identify examples of how epidemiologic research has been applied to reduce health disparities, we queried epidemiologists engaged in disparities research in the U.S., Canada, and New Zealand, and drew upon the scientific literature. Results Resulting examples covered a wide range of topic areas. Three areas selected for their contributions to policy were: 1) epidemiology's role in definition and measurement, 2) the study of housing and asthma, and 3) the study of food policy strategies to reduce health disparities. While epidemiologic research has done much to define and quantify health inequalities, it has generally been less successful at producing evidence that would identify targets for health equity intervention. Epidemiologists have a role to play in measurement and basic surveillance, etiologic research, intervention research, and evaluation research. However, our training and funding sources generally place greatest emphasis on surveillance and etiologic research. Conclusions The complexity of health disparities requires better training for epidemiologists to effectively work in multidisciplinary teams. Together we can evaluate contextual and multilevel contributions to disease and study intervention programs in order to gain better insights into evidence-based health equity strategies. PMID:22626003
Female reproductive disorders: the roles of endocrine-disrupting compounds and developmental timing
Crain, D. Andrew; Janssen, Sarah J.; Edwards, Thea M.; Heindel, Jerrold; Ho, Shuk-mei; Hunt, Patricia; Iguchi, Taisen; Juul, Anders; McLachlan, John A.; Schwartz, Jackie; Skakkebaek, Niels; Soto, Ana M.; Swan, Shanna; Walker, Cheryl; Woodruff, Teresa K.; Woodruff, Tracey J.; Giudice, Linda C.; Guillette, Louis J.
2014-01-01
Objective To evaluate the possible role of endocrine-disrupting compounds (EDCs) on female reproductive disorders emphasizing developmental plasticity and the complexity of endocrine-dependent ontogeny of reproductive organs. Declining conception rates and the high incidence of female reproductive disruptions warrant evaluation of the impact of EDCs on female reproductive health. Design Publications related to the contribution of EDCs to disorders of the ovary (aneuploidy, polycystic ovary syndrome, and altered cyclicity), uterus (endometriosis, uterine fibroids, fetal growth restriction, and pregnancy loss), breast (breast cancer, reduced duration of lactation), and pubertal timing were identified, reviewed, and summarized at a workshop. Conclusion(s) The data reviewed illustrate that EDCs contribute to numerous human female reproductive disorders and emphasize the sensitivity of early life-stage exposures. Many research gaps are identified that limit full understanding of the contribution of EDCs to female reproductive problems. Moreover, there is an urgent need to reduce the incidence of these reproductive disorders, which can be addressed by correlative studies on early life exposure and adult reproductive dysfunction together with tools to assess the specific exposures and methods to block their effects. This review of the EDC literature as it relates to female health provides an important platform on which women’s health can be improved. PMID:18929049
Junior, E U Ramos; Brogin, R L; Godinho, V P C; Botelho, F J E; Tardin, F D; Teodoro, P E
2017-09-27
Biplot analysis has often been used to recommend genotypes from different crops in the presence of the genotype × environment (G×E) interaction. The objective of this study was to verify the association between the AMMI and GGE biplot methods and to select soybean genotypes that combine high grain yield and stability across the environments belonging to Edaphoclimatic Region 402 of Soybean Cultivation Region 4 (Mid-West), which comprises the Center North and West of Mato Grosso and the southern region of Rondônia. Grain yield of 12 soybean genotypes was evaluated in seven competition trials of soybean cultivars in the 2014/2015 harvest. A significant G×E interaction revealed the need to use methods for recommending genotypes with adaptability and yield stability. The methods were complementary regarding the recommendation of the best genotypes. The AMMI analysis recommended MG/BR46 (Conquista) (G10) widely for all environments evaluated, whereas BRY23-55012 (G9) and BRAS11-0149 (G2) were the genotypes most indicated by the GGE biplot method. However, the methods concurred that the Porto Velho (PV1) environment contributed least to the G×E interaction.
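A minimal sketch of the AMMI decomposition referred to above (synthetic yields, not the trial data): remove genotype and environment main effects, then apply an SVD to the interaction residuals to obtain the biplot scores.

```python
import numpy as np

# AMMI sketch: double-center the yield matrix to isolate the GxE residuals,
# then SVD them; the leading singular vectors give the IPCA biplot scores.

rng = np.random.default_rng(7)
yield_ge = rng.normal(3.0, 0.4, size=(12, 7))    # 12 genotypes x 7 environments

grand = yield_ge.mean()
g_eff = yield_ge.mean(axis=1, keepdims=True) - grand
e_eff = yield_ge.mean(axis=0, keepdims=True) - grand
interaction = yield_ge - grand - g_eff - e_eff   # GxE residual matrix

u, s, vt = np.linalg.svd(interaction, full_matrices=False)
g_scores = u[:, :2] * np.sqrt(s[:2])             # genotype IPCA1/IPCA2 scores
e_scores = vt[:2, :].T * np.sqrt(s[:2])          # environment scores
print("share of GxE captured by first two axes:", (s[:2] ** 2).sum() / (s ** 2).sum())
```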
Filter Design and Performance Evaluation for Fingerprint Image Segmentation
Thai, Duy Hoang; Huckemann, Stephan; Gottschlich, Carsten
2016-01-01
Fingerprint recognition plays an important role in many commercial applications and is used by millions of people every day, e.g. for unlocking mobile phones. Fingerprint image segmentation is typically the first processing step of most fingerprint algorithms and it divides an image into foreground, the region of interest, and background. Two types of error can occur during this step which both have a negative impact on the recognition performance: ‘true’ foreground can be labeled as background and features like minutiae can be lost, or conversely ‘true’ background can be misclassified as foreground and spurious features can be introduced. The contribution of this paper is threefold: firstly, we propose a novel factorized directional bandpass (FDB) segmentation method for texture extraction based on the directional Hilbert transform of a Butterworth bandpass (DHBB) filter interwoven with soft-thresholding. Secondly, we provide a manually marked ground truth segmentation for 10560 images as an evaluation benchmark. Thirdly, we conduct a systematic performance comparison between the FDB method and four of the most often cited fingerprint segmentation algorithms showing that the FDB segmentation method clearly outperforms these four widely used methods. The benchmark and the implementation of the FDB method are made publicly available. PMID:27171150
Attractive electron-electron interactions within robust local fitting approximations.
Merlot, Patrick; Kjærgaard, Thomas; Helgaker, Trygve; Lindh, Roland; Aquilante, Francesco; Reine, Simen; Pedersen, Thomas Bondo
2013-06-30
An analysis of Dunlap's robust fitting approach reveals that the resulting two-electron integral matrix is not manifestly positive semidefinite when local fitting domains or non-Coulomb fitting metrics are used. We present a highly local approximate method for evaluating four-center two-electron integrals based on the resolution-of-the-identity (RI) approximation and apply it to the construction of the Coulomb and exchange contributions to the Fock matrix. In this pair-atomic resolution-of-the-identity (PARI) approach, atomic-orbital (AO) products are expanded in auxiliary functions centered on the two atoms associated with each product. Numerical tests indicate that in 1% or less of all Hartree-Fock and Kohn-Sham calculations, the indefinite integral matrix causes nonconvergence in the self-consistent-field iterations. In these cases, the two-electron contribution to the total energy becomes negative, meaning that the electronic interaction is effectively attractive, and the total energy is dramatically lower than that obtained with exact integrals. In the vast majority of our test cases, however, the indefiniteness does not interfere with convergence. The total energy accuracy is comparable to that of the standard Coulomb-metric RI method. The speed-up compared with conventional algorithms is similar to the RI method for Coulomb contributions; exchange contributions are accelerated by a factor of up to eight with a triple-zeta quality basis set. A positive semidefinite integral matrix is recovered within PARI by introducing local auxiliary basis functions spanning the full AO product space, as may be achieved by using Cholesky-decomposition techniques. Local completion, however, slows down the algorithm to a level comparable with or below conventional calculations. Copyright © 2013 Wiley Periodicals, Inc.
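For context, the robust-fitting expression underlying this analysis can be stated compactly. With tildes denoting AO products expanded in the (here, pair-atomic) auxiliary basis, Dunlap's robust approximation to a four-center integral reads:

```latex
\[
  (ab\,|\,cd) \;\approx\; (\widetilde{ab}\,|\,cd) \;+\; (ab\,|\,\widetilde{cd})
  \;-\; (\widetilde{ab}\,|\,\widetilde{cd})
\]
```

The error of this expression is quadratic in the fitting error, which is the "robustness" at stake; as the abstract notes, positive semidefiniteness of the resulting integral matrix is nevertheless not guaranteed for local fitting domains or non-Coulomb metrics.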
NASA Astrophysics Data System (ADS)
Borkin, Michelle A.
Visualization is a powerful tool for data exploration and analysis. With data ever-increasing in quantity and becoming integrated into our daily lives, having effective visualizations is necessary. But how does one design an effective visualization? To answer this question, we need to understand how humans perceive, process, and understand visualizations. Through visualization evaluation studies we can gain deeper insight into the basic perception and cognition theory of visualizations, both through domain-specific case studies as well as generalized laboratory experiments. This dissertation presents the results of four evaluation studies, each of which contributes new knowledge to the theory of perception and cognition of visualizations. The results of these studies include a deeper, clearer understanding of how color, data representation dimensionality, spatial layout, and visual complexity affect a visualization's effectiveness, as well as how visualization types and visual attributes affect the memorability of a visualization. We first present the results of two domain-specific case study evaluations. The first study is in the field of biomedicine, in which we developed a new heart disease diagnostic tool and conducted a study to evaluate the effectiveness of 2D versus 3D data representations as well as color maps. In the second study, we developed a new visualization tool for filesystem provenance data with applications in computer science and the sciences more broadly. We additionally developed a new time-based hierarchical node grouping method. We then conducted a study to evaluate the effectiveness of the new tool, with its radial layout, versus the conventional node-link diagram, as well as the new node grouping method. Finally, we discuss the results of two generalized studies designed to understand what makes a visualization memorable. In the first evaluation we focused on visualization memorability and conducted an online study using Amazon's Mechanical Turk with hundreds of users and thousands of visualizations. For the second evaluation we designed an eye-tracking laboratory study to gain insight into precisely which elements of a visualization contribute to memorability as well as visualization recognition and recall.
Pressman, Alice R; Lo, Joan C; Chandra, Malini; Ettinger, Bruce
2011-01-01
Area under the receiver operating characteristic (AUROC) curve is often used to evaluate risk models. However, reclassification tests provide an alternative assessment of model performance. We performed both evaluations on results from FRAX (World Health Organization Collaborating Centre for Metabolic Bone Diseases, University of Sheffield, UK), a fracture risk tool, using Kaiser Permanente Northern California women older than 50yr with bone mineral density (BMD) measured during 1997-2003. We compared FRAX performance with and without BMD in the model. Among 94,489 women with mean follow-up of 6.6yr, 1579 (1.7%) sustained a hip fracture. Overall, AUROCs were 0.83 and 0.84 for FRAX without and with BMD, suggesting that BMD did not contribute to model performance. AUROC decreased with increasing age, and BMD contributed significantly to higher AUROC among those aged 70yr and older. Using an 81% sensitivity threshold (the optimum level from the receiver operating characteristic curve, corresponding to a 1.2% cutoff), 35% of those categorized above the threshold were reassigned below it when BMD was added. In contrast, only 10% of those categorized below were reassigned to the higher risk category when BMD was added. The net reclassification improvement was 5.5% (p<0.01). The two versions of this risk tool have similar AUROCs, but alternative assessments indicate that the addition of BMD improves performance. Multiple methods should be used to evaluate risk tool performance, with less reliance on AUROC alone. Copyright © 2011 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
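The reclassification test reported above can be sketched in a few lines: the net reclassification improvement (NRI) sums the net fraction of events moved up and non-events moved down across the risk threshold. The data below are simulated, so the value will not reproduce the study's 5.5%.

```python
import numpy as np

# NRI sketch: among events, upward reclassification counts +1 and downward
# -1; among non-events the signs flip. Threshold matches the 1.2% cutoff.

def nri(risk_without, risk_with, event, threshold=0.012):
    up = (risk_with >= threshold) & (risk_without < threshold)
    down = (risk_with < threshold) & (risk_without >= threshold)
    ev = event.astype(bool)
    nri_events = up[ev].mean() - down[ev].mean()
    nri_nonevents = down[~ev].mean() - up[~ev].mean()
    return nri_events + nri_nonevents

rng = np.random.default_rng(3)
base = rng.uniform(0, 0.05, 5000)                          # risk without BMD
with_bmd = np.clip(base + rng.normal(0, 0.01, 5000), 0, 1) # risk with BMD
events = rng.uniform(size=5000) < base * 2                 # simulated fractures
print(f"NRI: {nri(base, with_bmd, events):.3f}")
```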
Multiattribute evaluation in formulary decision making as applied to calcium-channel blockers.
Schumacher, G E
1991-02-01
The use of multiattribute utility theory (MAUT) to make a formulary decision involving calcium-channel blockers (CCBs) is described. The MAUT method is a procedure for identifying, characterizing, and comparing the many variables that may affect a decision. Although applications in pharmacy have been infrequent, MAUT should be particularly appealing to formulary committees. The steps of the MAUT method are (1) determine the viewpoint of the decision makers, (2) identify the decision alternatives, (3) identify the attributes to be evaluated, (4) identify the factors to be used in evaluating the attributes, (5) establish a utility scale for scoring each factor, (6) transform the values for each factor to its utility scale, (7) determine weights for each attribute and factor, (8) calculate the total utility score for each decision alternative, (9) determine which decision alternative has the greatest total score, and (10) perform a sensitivity analysis. The viewpoint of a formulary committee in a health maintenance organization was simulated to develop a model for using the MAUT method to compare CCBs for single-agent therapy of chronic stable angina in ambulatory patients for one year. The attributes chosen were effectiveness, safety, patient acceptance, and cost and weighted 36%, 29%, 21%, and 14%, respectively, as contributions to the evaluation. The rank order of the decision alternatives was (1) generic verapamil, (2) brand-name verapamil, (3) diltiazem, (4) nicardipine, and (5) nifedipine. The MAUT method provides a standardized yet flexible format for comparing and selecting among formulary alternatives.
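Steps 5 through 9 reduce to a weighted sum per alternative; a minimal sketch using the abstract's attribute weights but invented utility scores:

```python
# MAUT aggregation sketch: transform factor values to 0-100 utility scales,
# weight them, and rank the alternatives. Weights follow the abstract
# (36/29/21/14); the utility scores themselves are illustrative, not the
# study's values, so the resulting order need not match the published one.

weights = {"effectiveness": 0.36, "safety": 0.29,
           "acceptance": 0.21, "cost": 0.14}

alternatives = {
    "generic verapamil":    {"effectiveness": 80, "safety": 75, "acceptance": 70, "cost": 95},
    "brand-name verapamil": {"effectiveness": 80, "safety": 75, "acceptance": 70, "cost": 60},
    "diltiazem":            {"effectiveness": 78, "safety": 72, "acceptance": 68, "cost": 55},
}

totals = {name: sum(weights[attr] * u for attr, u in scores.items())
          for name, scores in alternatives.items()}
for name, score in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} {score:5.1f}")
```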
Evaluation and integration of existing methods for computational prediction of allergens.
Wang, Jing; Yu, Yabin; Zhao, Yunan; Zhang, Dabing; Li, Jing
2013-01-01
Allergy involves a series of complex reactions and factors that contribute to the development of the disease and the triggering of symptoms, including rhinitis, asthma, atopic eczema, skin sensitivity, and even acute and fatal anaphylactic shock. Prediction and evaluation of potential allergenicity are important for the safety evaluation of foods and other environmental factors. Although several computational approaches for assessing the potential allergenicity of proteins have been developed, their performance and relative merits and shortcomings have not been compared systematically. To evaluate and improve the existing methods for allergen prediction, we collected an up-to-date definitive dataset consisting of 989 known allergens and massive putative non-allergens. The three most widely used computational allergen prediction approaches, namely sequence-, motif- and SVM-based (Support Vector Machine) methods, were systematically compared using the defined parameters, and we found that the SVM-based method outperformed the other two methods with higher accuracy and specificity. The sequence-based method with the criteria defined by FAO/WHO (FAO: Food and Agriculture Organization of the United Nations; WHO: World Health Organization) has a sensitivity of over 98%, but a low specificity. The advantage of the motif-based method is the ability to visualize the key motif within the allergen. Notably, the performances of the FAO/WHO sequence-based method and the motif-eliciting strategy could be improved by optimization of parameters. To facilitate allergen prediction, we integrated these three methods in a web-based application, proAP, which provides a global search of known allergens and a powerful tool for allergen prediction. Flexible parameter setting and batch prediction were also implemented. proAP can be accessed at http://gmobl.sjtu.edu.cn/proAP/main.html. This study comprehensively evaluated sequence-, motif- and SVM-based computational prediction approaches for allergens and optimized their parameters to obtain better performance. These findings may provide helpful guidance for researchers in allergen prediction. Furthermore, we integrated these methods into the web application proAP, making it easy for users to perform customizable allergen searches and predictions.
NASA Astrophysics Data System (ADS)
Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme
2013-04-01
Estimating the probability of occurrence of extreme sea levels is a central issue for the protection of the coast. Return periods of sea level with wave set-up contribution are estimated here at one site: Cherbourg, France, in the English Channel. The methodology follows two steps: the first is computation of the joint probability of simultaneous wave height and still sea level; the second is interpretation of those joint probabilities to assess the sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: the first is a multivariate extreme value distribution of logistic type, in which all components of the variables become large simultaneously; the second is a conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte-Carlo simulation, in which the estimation is more accurate but needs more calculation time, and classical ocean engineering design contours of inverse-FORM type, in which the method is simpler and allows more complex estimation of the wave set-up part (wave propagation to the coast, for example). We compare the results from the two different approaches with the two different methods. To be able to use both the Monte-Carlo simulation and the design contours methods, the wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach compared to the multivariate extreme values approach when extreme sea levels occur when either the surge or the wave height is large. We discuss the validity of the ocean engineering design contours method, which is an alternative when the computation of sea levels is too complex to use the Monte-Carlo simulation method.
Development of in-house serological methods for diagnosis and surveillance of chikungunya
Galo, Saira Saborío; González, Karla; Téllez, Yolanda; García, Nadezna; Pérez, Leonel; Gresh, Lionel; Harris, Eva; Balmaseda, Ángel
2017-01-01
Objective To develop and evaluate serological methods for chikungunya diagnosis and research in Nicaragua. Methods Two IgM ELISA capture systems (MAC-ELISA) for the diagnosis of acute chikungunya virus (CHIKV) infections, and two inhibition ELISA methods (IEM) to measure total antibodies against CHIKV, were developed using monoclonal antibodies (mAbs) and hyperimmune serum at the National Virology Laboratory of Nicaragua in 2014–2015. The sensitivity, specificity, predictive values, and agreement of the MAC-ELISAs were obtained by comparing the results of 198 samples (116 positive; 82 negative) with the Centers for Disease Control and Prevention’s IgM ELISA (Atlanta, Georgia, United States; CDC-MAC-ELISA). For clinical evaluation of the four serological techniques, 260 paired acute- and convalescent-phase serum samples of suspected chikungunya cases were used. Results All four assays were standardized by determining the optimal concentrations of the different reagents. Processing times were substantially reduced compared to the CDC-MAC-ELISA. For the MAC-ELISA systems, a sensitivity of 96.6% and 97.4%, and a specificity of 98.8% and 91.5%, were obtained using mAb and hyperimmune serum, respectively, compared with the CDC method. Clinical evaluation of the four serological techniques versus the CDC real-time RT-PCR assay resulted in a sensitivity of 95.7% and a specificity of 88.8%–95.9%. Conclusion Two MAC-ELISA and two IEM systems were standardized, demonstrating very good performance for chikungunya diagnosis and research needs. This will enable more efficient epidemiological surveillance in Nicaragua, the first country in Central America to produce its own reagents for the serological diagnosis of CHIKV. The methods evaluated here can be applied in other countries and will contribute to sustainable diagnostic systems to combat the disease. PMID:28902269
van Asseldonk, Edwin H F; Buurke, Jaap H; Bloem, Bastiaan R; Renzenbrink, Gerbert J; Nene, Anand V; van der Helm, Frans C T; van der Kooij, Herman
2006-10-01
During stroke recovery, restoration of the paretic ankle and compensation in the non-paretic ankle may contribute to improved balance maintenance. We examine a new approach to disentangle these recovery mechanisms by objectively quantifying the contribution of each ankle to balance maintenance. Eight chronic hemiparetic patients were included. Balance responses were elicited by continuous random platform movements. We measured body sway and ground reaction forces below each foot to calculate corrective ankle torques in each leg. These measurements yielded the Frequency Response Function (FRF) of the stabilizing mechanisms, which expresses the amount and timing of the generated corrective torque in response to sway at the specified frequencies. The FRFs were used to calculate the relative contribution of the paretic and non-paretic ankle to the total amount of generated corrective torque to correct sway. All patients showed a clear asymmetry in the balance contribution in favor of the non-paretic ankle. Paretic balance contribution was significantly smaller than the contribution of the paretic leg to weight bearing, and did not show a clear relation with the contribution to weight bearing. In contrast, a group of healthy subjects instructed to distribute their weight asymmetrically showed a one-on-one relation between the contribution to weight bearing and to balance. We conclude that the presented approach objectively quantifies the contribution of each ankle to balance maintenance. Application of this method in longitudinal surveys of balance rehabilitation makes it possible to disentangle the different recovery mechanisms. Such insights will be critical for the development and evaluation of rehabilitation strategies.
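The FRF estimation at the core of this method can be sketched with standard spectral estimators; the input (sway) and outputs (per-ankle corrective torques) below are synthetic signals, not patient data:

```python
import numpy as np
from scipy.signal import csd, welch

# FRF sketch: with sway as input and each ankle's corrective torque as
# output, the FRF is the cross-spectrum over the input auto-spectrum; the
# relative contribution of one ankle is its FRF magnitude over the sum.

fs = 100.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(5)
sway = rng.normal(size=t.size)                           # stand-in for body sway
torque_np = 1.4 * sway + 0.2 * rng.normal(size=t.size)   # non-paretic ankle
torque_p = 0.6 * sway + 0.2 * rng.normal(size=t.size)    # paretic ankle

f, s_xx = welch(sway, fs=fs, nperseg=1024)
_, s_xy_np = csd(sway, torque_np, fs=fs, nperseg=1024)
_, s_xy_p = csd(sway, torque_p, fs=fs, nperseg=1024)

frf_np, frf_p = s_xy_np / s_xx, s_xy_p / s_xx
contribution_p = np.abs(frf_p) / (np.abs(frf_p) + np.abs(frf_np))
print(f"mean paretic contribution: {contribution_p.mean():.2f}")
```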
NASA Astrophysics Data System (ADS)
Pain, F.; Dhenain, M.; Gurden, H.; Routier, A. L.; Lefebvre, F.; Mastrippolito, R.; Lanièce, P.
2008-10-01
The β-microprobe is a simple and versatile technique complementary to small animal positron emission tomography (PET). It relies on local measurements of the concentration of positron-labeled molecules. So far, it has been successfully used in anesthetized rats for pharmacokinetics experiments and for the study of brain energetic metabolism. However, the ability of the technique to provide accurate quantitative measurements using 18F, 11C and 15O tracers is likely to suffer from the contribution of the 511 keV gamma-ray background to the signal and from the contribution of positrons from brain loci surrounding the locus of interest. The aim of the present paper is to provide a method of evaluating several parameters that are expected to affect the quantification of recordings performed in vivo with this methodology. We have developed realistic voxelized phantoms of the rat whole body and brain, and used them as input geometries for Monte Carlo simulations of previously reported β-microprobe experiments. In the context of realistic experiments (binding of 11C-Raclopride to D2 dopaminergic receptors in the striatum; local glucose metabolic rate measurement with 18F-FDG and H215O blood flow measurements in the somatosensory cortex), we have calculated the detection efficiencies and the corresponding contribution of 511 keV gammas from accumulation in peripheral organs. We confirmed that the 511 keV gamma background does not impair quantification. To evaluate the contribution of positrons from adjacent structures, we have developed β-Assistant, a program based on a rat brain voxelized atlas and matrices of local detection efficiencies calculated by Monte Carlo simulations for several probe geometries. This program was used to calculate the 'apparent sensitivity' of the probe for each brain structure included in the detection volume. For a given localization of a probe within the brain, this allows us to quantify the different sources of beta signal. Finally, since stereotaxic accuracy is crucial for quantification in most microprobe studies, the influence of stereotaxic positioning error was studied for several realistic experiments in favorable and unfavorable experimental situations (binding of 11C-Raclopride to D2 dopaminergic receptors in the striatum; binding of 18F-MPPF to 5HT1A receptors in the dorsal raphe nucleus).
Handwriting: Feature Correlation Analysis for Biometric Hashes
NASA Astrophysics Data System (ADS)
Vielhauer, Claus; Steinmetz, Ralf
2004-12-01
In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published algorithms of this category, based on behavioral biometrics of handwriting: the biometric hash. Our interest is to investigate to what degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. An evaluation of the technique is presented based on two data sets of different size. The method presented allows determination of the effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.
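The three components of the methodology can be sketched directly; the feature samples below are synthetic, and the binning choice for the entropy estimate is an assumption, not the paper's parameterization:

```python
import numpy as np

# Per-feature measures: intrapersonal scatter (deviation within a writer),
# interpersonal entropy (spread of the feature across writers), and the
# correlation between the two. samples[w] holds repeated feature vectors
# of writer w; all data are synthetic.

rng = np.random.default_rng(11)
n_writers, n_reps, n_feats = 30, 10, 8
writer_locs = rng.uniform(0, 10, (n_writers, 1, n_feats))
samples = rng.normal(loc=writer_locs, scale=0.5,
                     size=(n_writers, n_reps, n_feats))

intra_scatter = samples.std(axis=1).mean(axis=0)        # one value per feature

writer_means = samples.mean(axis=1)                     # writers x features
def entropy(values, bins=10):
    p, _ = np.histogram(values, bins=bins)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

inter_entropy = np.array([entropy(writer_means[:, j]) for j in range(n_feats)])
corr = np.corrcoef(intra_scatter, inter_entropy)[0, 1]
print("scatter:", intra_scatter.round(2))
print("entropy:", inter_entropy.round(2), f"correlation: {corr:.2f}")
```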
Novel strategy to implement active-space coupled-cluster methods
NASA Astrophysics Data System (ADS)
Rolik, Zoltán; Kállay, Mihály
2018-03-01
A new approach is presented for the efficient implementation of coupled-cluster (CC) methods including higher excitations based on a molecular orbital space partitioned into active and inactive orbitals. In the new framework, the string representation of amplitudes and intermediates is used as long as it is beneficial, but the contractions are evaluated as matrix products. Using a new diagrammatic technique, the CC equations are represented in a compact form due to the string notations we introduced. As an application of these ideas, a new automated implementation of the single-reference-based multi-reference CC equations is presented for arbitrary excitation levels. The new program can be considered as an improvement over the previous implementations in many respects; e.g., diagram contributions are evaluated by efficient vectorized subroutines. Timings for test calculations for various complete active-space problems are presented. As an application of the new code, the weak interactions in the Be dimer were studied.
Comparing Methods for Estimating Direct Costs of Adverse Drug Events.
Gyllensten, Hanna; Jönsson, Anna K; Hakkarainen, Katja M; Svensson, Staffan; Hägg, Staffan; Rehnberg, Clas
2017-12-01
To estimate how direct health care costs resulting from adverse drug events (ADEs) and cost distribution are affected by methodological decisions regarding identification of ADEs, assigning relevant resource use to ADEs, and estimating costs for the assigned resources. ADEs were identified from medical records and diagnostic codes for a random sample of 4970 Swedish adults during a 3-month study period in 2008 and were assessed for causality. Results were compared for five cost evaluation methods, including different methods for identifying ADEs, assigning resource use to ADEs, and for estimating costs for the assigned resources (resource use method, proportion of registered cost method, unit cost method, diagnostic code method, and main diagnosis method). Different levels of causality for ADEs and ADEs' contribution to health care resource use were considered. Using the five methods, the maximum estimated overall direct health care costs resulting from ADEs ranged from Sk10,000 (Sk = Swedish krona; ~€1,500 in 2016 values) using the diagnostic code method to more than Sk3,000,000 (~€414,000) using the unit cost method in our study population. The most conservative definitions for ADEs' contribution to health care resource use and the causality of ADEs resulted in average costs per patient ranging from Sk0 using the diagnostic code method to Sk4066 (~€500) using the unit cost method. The estimated costs resulting from ADEs varied considerably depending on the methodological choices. The results indicate that costs for ADEs need to be identified through medical record review and by using detailed unit cost data. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Aghajani Mir, M; Taherei Ghazvinei, P; Sulaiman, N M N; Basri, N E A; Saheri, S; Mahmood, N Z; Jahan, A; Begum, R A; Aghamohammadi, N
2016-01-15
Selecting a suitable Multi Criteria Decision Making (MCDM) method is a crucial stage in establishing a Solid Waste Management (SWM) system. The main objective of the current study is to demonstrate and evaluate a proposed method using multiple criteria decision making methods. An improved version of the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) was applied to obtain the best municipal solid waste management method by comparing and ranking the scenarios; applying this method to rank treatment methods is introduced as one contribution of the study. In addition, the Viekriterijumsko Kompromisno Rangiranje (VIKOR) compromise solution method was applied for sensitivity analyses. The proposed method can assist urban decision makers in prioritizing and selecting an optimized Municipal Solid Waste (MSW) treatment system, and a logical and systematic scientific method was proposed to guide appropriate decision-making. A modified TOPSIS methodology, presented as superior to existing methods, was applied for the first time to MSW problems. Next, 11 scenarios of MSW treatment methods are defined and compared environmentally and economically based on the waste management conditions. Results show that integrating a sanitary landfill (18.1%), RDF (3.1%), composting (2%), anaerobic digestion (40.4%), and recycling (36.4%) was an optimized model of integrated waste management. The applied decision-making structure provides the opportunity for optimum decision-making. Therefore, the mix of recycling and anaerobic digestion and a sanitary landfill with Electricity Production (EP) are the preferred options for MSW management. Copyright © 2015 Elsevier Ltd. All rights reserved.
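As a reference point for the ranking step, here is a plain (unmodified) TOPSIS sketch; the abstract does not specify the modification, so this shows only the standard algorithm with invented scenario scores:

```python
import numpy as np

# Standard TOPSIS: vector-normalize the decision matrix, weight it, measure
# distances to the ideal and anti-ideal solutions, rank by closeness.

def topsis(matrix, weights, benefit):
    m = matrix / np.linalg.norm(matrix, axis=0)         # vector normalization
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                      # closeness coefficient

scores = np.array([[7.0, 3.0],                          # scenarios x criteria
                   [5.0, 2.0],
                   [8.0, 6.0]])
weights = np.array([0.6, 0.4])       # e.g. environmental, economic
benefit = np.array([True, False])    # maximize benefit, minimize cost
print("closeness coefficients:", topsis(scores, weights, benefit).round(3))
```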
Haering, Diane; Huchez, Aurore; Barbier, Franck; Holvoët, Patrice; Begon, Mickaël
2017-01-01
Introduction Teaching acrobatic skills with a minimal amount of repetition is a major challenge for coaches. Biomechanical, statistical or computer simulation tools can help them identify the most determinant factors of performance. Release parameters, change in moment of inertia and segmental momentum transfers have been identified as predictors of success in acrobatics. The purpose of the present study was to evaluate the relative contribution of these parameters to performance through expertise- or optimisation-based improvements. The counter movement forward in flight (CMFIF) was chosen for the intrinsic dichotomy between the accessibility of its attempt and the complexity of its mastery. Methods Three repetitions of the CMFIF performed by eight novice and eight advanced female gymnasts were recorded using a motion capture system. Optimal aerial techniques that maximise rotation potential at regrasp were also computed. A 14-segment multibody model, defined through the Rigid Body Dynamics Library, was used to compute recorded and optimal kinematics and biomechanical parameters. A stepwise multiple linear regression was used to determine the relative contribution of these parameters in novice recorded, novice optimised, advanced recorded and advanced optimised trials. Finally, fixed effects of expertise and optimisation were tested through a mixed-effects analysis. Results and discussion Variation in release state contributed to performance only in novice recorded trials. The contribution of the moment of inertia to performance increased from novice recorded, to novice optimised, advanced recorded, and advanced optimised trials. The contribution of momentum transfer to the trunk during flight prevailed in all recorded trials. Although optimisation decreased this transfer contribution, momentum transfer to the arms appeared. Conclusion The findings suggest that novices should be coached on both contact and aerial technique. Conversely, improved aerial technique was the main factor that helped advanced gymnasts increase their performance. For both, reduction of the moment of inertia should be a focus. The method proposed in this article could be generalized to the investigation of learning of any aerial skill. PMID:28422954
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Eric C; Smith, Raymond; Ruiz-Mercado, Gerardo
This presentation examines different methods for analyzing manufacturing processes in the early stages of technical readiness. Before developers know much detail about their processes, it is valuable to apply various assessments to evaluate their performance. One type of assessment evaluates performance indicators to describe how closely processes approach desirable objectives. Another type of assessment determines the life cycle inventories (LCI) of inputs and outputs for processes, where, for a functional unit of product, the user evaluates the resources used and the releases to the environment. These results can be compared to similar processes or combined with the LCI of other processes to examine up- and down-stream chemicals. The inventory also provides a listing of the up-stream chemicals, which permits study of the whole life cycle. Performance indicators are evaluated in this presentation with the U.S. Environmental Protection Agency's GREENSCOPE (Gauging Reaction Effectiveness for ENvironmental Sustainability with a multi-Objective Process Evaluator) methodology, which evaluates processes in four areas: Environment, Energy, Economics, and Efficiency. The method develops relative scores for indicators that allow comparisons across various technologies. In this contribution, two conversion pathways for producing cellulosic ethanol from biomass, via thermochemical and biochemical routes, are studied. The information developed from the indicators and LCI can be used to inform the process design and the potential life cycle effects of up- and down-stream chemicals.
Tasato, Hiroshi; Kida, Noriyuki
2018-01-01
[Purpose] The purpose of this study was to investigate a measurement method and parameters to simply evaluate the condition of the knee, which is necessary for preventing locomotive syndrome as advocated by the Japan Orthopedic Association. [Subjects and Methods] Acceleration sensors were attached to the lateral condyles of the tibia, and acceleration and load were measured under the conditions of walking on flat ground and walking on stairs; the difference between the impulses of impact forces (acceleration × load) of the two knees was defined as a simple evaluation parameter. [Results] The simple evaluation parameter was not correlated with age during walking on flat ground, but during walking on stairs it was almost flat up to the age of 20–40 years; after the age of 49 years, a correlation of the simple evaluation parameter with age could be confirmed based on a quadratic curve approximation (R2=0.99). [Conclusion] The simple evaluation parameter during walking on stairs was highly correlated with age, suggesting a contribution to preventing locomotive syndrome once reliability is improved. In the future, we plan to improve reliability by increasing the data, and to establish the parameter as a simple evaluation tool that can be used for preventing locomotive syndrome in elderly people and those with KL classification grades 0–1. PMID:29706699
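The proposed parameter reduces to an integral of acceleration times load per knee; a minimal sketch with synthetic stance-phase signals:

```python
import numpy as np

# Sketch of the simple evaluation parameter: integrate acceleration x load
# over one stance phase for each knee and take the left-right difference.
# Signals are synthetic stand-ins for the sensor recordings.

fs = 200.0
t = np.arange(0, 0.5, 1 / fs)                  # one stance phase, seconds
rng = np.random.default_rng(2)

def impact(scale):
    """Gaussian impact peak near heel strike plus measurement noise."""
    return scale * np.exp(-((t - 0.1) / 0.03) ** 2) + 0.05 * rng.normal(size=t.size)

acc_left, load_left = impact(3.0), impact(600.0)
acc_right, load_right = impact(3.6), impact(620.0)

impulse_left = np.trapz(acc_left * load_left, t)
impulse_right = np.trapz(acc_right * load_right, t)
print(f"impulse difference (right - left): {impulse_right - impulse_left:.1f}")
```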
NASA Astrophysics Data System (ADS)
Rachakonda, Prem; Muralikrishnan, Bala; Cournoyer, Luc; Cheok, Geraldine; Lee, Vincent; Shilling, Meghan; Sawyer, Daniel
2017-10-01
The Dimensional Metrology Group at the National Institute of Standards and Technology is performing research to support the development of documentary standards within the ASTM E57 committee. This committee is addressing the point-to-point performance evaluation of a subclass of 3D imaging systems called terrestrial laser scanners (TLSs), which are laser-based and use a spherical coordinate system. This paper discusses the usage of sphere targets for this effort, and methods to minimize the errors due to the determination of their centers. The key contributions of this paper include methods to segment sphere data from a TLS point cloud, and the study of some of the factors that influence the determination of sphere centers.
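Sphere-center determination of the kind discussed here is commonly done with a least-squares fit; below is a minimal algebraic sketch (synthetic points; the segmentation step and the paper's specific fitting choices are not reproduced):

```python
import numpy as np

# Algebraic least-squares sphere fit: from |p - c|^2 = r^2 it follows that
# |p|^2 = 2 c.p + (r^2 - |c|^2), which is linear in c and (r^2 - |c|^2).

def fit_sphere(points):
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

rng = np.random.default_rng(4)
true_c, true_r = np.array([1.0, -2.0, 0.5]), 0.0762   # e.g. a 6 in diameter sphere
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = true_c + true_r * dirs + rng.normal(0, 1e-4, (500, 3))  # TLS-like noise

c, r = fit_sphere(pts)
print("center error (mm):", 1000 * np.linalg.norm(c - true_c), "radius (m):", r)
```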
Comparison of reversible methods for data compression
NASA Astrophysics Data System (ADS)
Heer, Volker K.; Reinfelder, Hans-Erich
1990-07-01
Widely differing methods for data compression described in the ACR-NEMA draft are used in medical imaging. In our contribution we briefly review various methods and discuss the relevant advantages and disadvantages. In detail, we evaluate 1st-order DPCM, pyramid transformation and S transformation. As coding algorithms, we compare both fixed and adaptive Huffman coding and Lempel-Ziv coding. Our comparison is performed on typical medical images from CT, MR, DSA and DLR (Digital Luminescence Radiography). Apart from the achieved compression factors, we take into account the CPU time required and the main memory requirement, both for compression and for decompression. For a realistic comparison we have implemented the mentioned algorithms in the C programming language on a MicroVAX II and a SPARCstation 1.
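A minimal sketch of the 1st-order DPCM stage (entropy coding, such as the Huffman or Lempel-Ziv variants compared in the paper, would follow):

```python
import numpy as np

# Lossless 1st-order DPCM along image rows: store the first pixel of each
# row plus horizontal differences; the residuals have lower entropy than
# the raw pixels, which is what the subsequent entropy coder exploits.

def dpcm_encode(img):
    img = img.astype(np.int32)
    diffs = np.diff(img, axis=1)
    return img[:, :1], diffs            # row seeds + prediction residuals

def dpcm_decode(seeds, diffs):
    return np.cumsum(np.hstack([seeds, diffs]), axis=1)

img = np.random.default_rng(0).integers(0, 4096, (4, 6))   # 12-bit pixels
seeds, diffs = dpcm_encode(img)
assert np.array_equal(dpcm_decode(seeds, diffs), img)      # fully reversible
```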
Goeminne, M; Demeulemeester, K; Viaene, N
2011-01-01
In order to make a cost-benefit analysis for the management of the potato cyst nematodes Globodera rostochiensis and G. pallida, we developed a method to estimate the relative importance of three basic distribution channels of potato cyst nematodes: seed potatoes, machinery and soil tare. The baseline is determined by the area planted with potatoes, the area infested with potato cysts, the proportion of resistant potato cultivars and the distribution of cysts through the different channels. This quantification forms a basis for the evaluation of the effects of different control measures for potato cyst nematodes on a national scale. The method can serve as an example for application in other countries.
Angular spectral framework to test full corrections of paraxial solutions.
Mahillo-Isla, R; González-Morales, M J
2015-07-01
Different correction methods for paraxial solutions have been used when such solutions extend beyond the paraxial regime. Authors have chosen correction methods guided either by their experience or by educated hypotheses pertinent to the particular problem they were tackling. This article provides a framework for classifying full-wave correction schemes, so that, for a given solution of the paraxial wave equation, the best of the available correction schemes can be selected. Some common correction methods are considered and evaluated under the proposed scope. A further contribution is the set of necessary conditions that two solutions of the Helmholtz equation must satisfy in order to admit a common solution of the parabolic wave equation as a paraxial approximation of both.
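For context, the standard relation between the two equations named above, sketched in common notation (ours, not the article's): a paraxial solution is the envelope A of a field u = A e^{ikz} for which the second z-derivative is neglected.

```latex
% Helmholtz equation and its paraxial (parabolic) reduction for u = A e^{ikz}:
\[
  \nabla^{2} u + k^{2} u = 0
  \quad\longrightarrow\quad
  2ik\,\frac{\partial A}{\partial z} + \nabla_{\perp}^{2} A = 0 ,
\]
% valid while |A_{zz}| << k |A_{z}|, i.e. inside the paraxial regime.
```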
CART V: recent advancements in computer-aided camouflage assessment
NASA Astrophysics Data System (ADS)
Müller, Thomas; Müller, Markus
2011-05-01
In order to facilitate systematic, computer-aided improvement of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that may show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
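Normalized cross-correlation is the usual core of template matching; the generic sketch below scores how strongly a background patch resembles the object template (the CART system's actual structural-inconspicuity score is more elaborate and is not reproduced here):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between an image patch and a template.

    Returns a value in [-1, 1]; values near 1 indicate that the local
    background structure closely matches the object's appearance."""
    p = patch.astype(float) - patch.mean()
    t = template.astype(float) - template.mean()
    denom = np.sqrt(np.sum(p ** 2) * np.sum(t ** 2))
    return float(np.sum(p * t) / denom) if denom > 0 else 0.0
```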
Alternative Significant Contribution Approaches Evaluated
This Technical Support Document (TSD) discusses alternative approaches that EPA evaluated for defining emissions that constitute each upwind state’s significant contribution to nonattainment and interference with maintenance downwind.
Paediatric health economic evaluations: a world view.
Ungar, Wendy J
2007-01-01
As economic evaluation methods evolve, their applicability to special populations, such as children, has received increased scrutiny. The objective was to review paediatric health economic evaluations published over the last quarter century, comment on trends, discuss gaps between developed and developing nations, and point to future directions for research. Data compiled for the Paediatric Economic Database Evaluation (PEDE) project to 2003 were used to describe temporal and geographic trends and to evaluate the frequency of intervention categories and conditions studied. The volume of paediatric health economic evaluations has risen rapidly since 1980. Studies of infective/parasitic diseases, congenital anomalies and complications of pregnancy accounted for the majority. Prevention rather than treatment was emphasized. Most evaluations performed since 1998 (78%) were cost-effectiveness analyses. Cost-utility analyses were rare. The US produced half of all publications, with the UK contributing 12%. Economic evaluations from developing countries were uncommon, despite an urgent need for evidence-based decision-making in these regions. The interventions studied reflected local health priorities; HIV and malaria prevention were more commonly studied in developing nations, whereas treatments for asthma and birth malformations were more often evaluated in developed nations. Despite global initiatives to combat disease, developing nations rely on foreign research to inform the implementation of local health programs. There is a need for better methods for data transfer and extrapolation. Future research must focus on paediatric models of costs and consequences and the development of tools to measure long-term effects.
Acute lymphoblastic leukemia: are Egyptian children adherent to maintenance therapy?
Khalek, Elhamy Rifky Abdel; Sherif, Laila M; Kamal, Naglaa Mohamed; Gharib, Amal F; Shawky, H M
2015-01-01
BACKGROUND, AIMS, SETTINGS AND DESIGN: Poor adherence to oral maintenance chemotherapy can cause relapse of acute lymphoblastic leukemia (ALL). This was a multicenter study evaluating adherence to oral 6-mercaptopurine (6-MP) maintenance chemotherapy for childhood ALL in Egypt, aiming to identify contributing factors and possible steps to promote adherence. The study included 129 children with ALL in complete remission receiving 6-MP as a single daily oral dose in the evening. Evaluation was done through specific questionnaires for the patients as well as serum 6-MP measurements. Nonadherence was detected in around 56% by questionnaire and around 50% by serum 6-MP level measurement. There was a highly significant correlation between nonadherence as found by the questionnaire and the 6-MP level (P = 0.001). Nonadherence was significantly associated with low socioeconomic standard, lack of education or low educational level, and large family size by both methods. High cost of travel for follow-up visits was significant by questionnaire but not by 6-MP measurement. Adolescent age, a higher number of siblings, lack of written instructions and long time spent per visit were all associated with higher rates of nonadherence, although none reached statistical significance. Nonadherence is a real problem in pediatric patients. Specific questionnaires can be an excellent, reliable method for the routine follow-up of these children, and drug level assay can be requested only for confirmation. This protocol is especially effective in developing countries where financial resources may be limited. Every effort should be made to uncover its true incidence, contributing factors, and the best methods of intervention.
[Evaluation of an educational website on First Aid].
Mori, Satomi; Whitaker, Iveth Yamaguchi; Marin, Heimar de Fátima
2013-08-01
The aim of this study was to evaluate the structure, quality of information and usability of a website on First Aid. The evaluation was performed by information technology (IT) and health care professionals and by students, using specific, validated instruments. The kappa method was used to evaluate the agreement of the answers, and Cronbach's α coefficient was used to assess the reliability of the instrument. There was no agreement (0.047) among the answers obtained from the IT professionals, indicating that the structure of the website must be reviewed. There was also no agreement in the evaluation by the health care professionals (-0.062); however, the overall positive scores suggest that the quality of the information on the website is adequate. The assessment of the reliability of the instrument used to evaluate navigability yielded a value of α=0.974. Although improvement of the website structure is recommended, the quality of the information is good, and its use has contributed to students' learning.
Preston, Nancy; Evans, Catherine J.; Grande, Gunn; Short, Vicky; Benalia, Hamid; Higginson, Irene J.; Todd, Chris; on behalf of MORECare
2013-01-01
Abstract Background: Complex interventions are common in palliative and end-of-life care. Mixed methods approaches sit well within the multiphase model of complex intervention development and evaluation. Generic mixed methods guidance is useful but additional challenges in the research design and operationalization within palliative and end-of-life care may have an impact on the use of mixed methods. Objective: The objective of the study was to develop guidance on the best methods for combining quantitative and qualitative methods for health and social care intervention development and evaluation in palliative and end-of-life care. Methods: A one-day workshop was held where experts participated in facilitated groups using Transparent Expert Consultation to generate items for potential recommendations. Agreement and consensus were then sought on nine draft recommendations (DRs) in a follow-up exercise. Results: There was at least moderate agreement with most of the DRs, although consensus was low. Strongest agreement was with DR1 (usefulness of mixed methods to palliative and end-of-life care) and DR5 (importance of attention to respondent burden), and least agreement was with DR2 (use of theoretical perspectives) and DR6 (therapeutic effects of research interviews). Narrative comments enabled recommendation refinement. Two fully endorsed, five partially endorsed, and two refined DRs emerged. The relationship of these nine to six key challenges of palliative and end-of-life care research was analyzed. Conclusions: There is a need for further discussion of these recommendations and their contribution to methodology. The recommendations should be considered when designing and operationalizing mixed methods studies of complex interventions in palliative care, and because they may have wider relevance, should be considered for other applications. PMID:24195755
Interpreting wireline measurements in coal beds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, D.J.
1991-06-01
When logging coal seams with wireline tools, the interpretation method needed to evaluate the coals is different from that used for conventional oil and gas reservoirs. Wireline logs identify coals easily. For an evaluation, the contribution of each coal component to the raw measurements must be considered. This paper will discuss how each log measurement is affected by each component. The components of a coal will be identified as the mineral matter, macerals, moisture content, rank, gas content, and cleat porosity. The measurements illustrated are from the resistivity, litho-density, neutron, sonic, dielectric, and geochemical tools. Once the coal component effects have been determined, an interpretation of the logs can be made. This paper will illustrate how to use these corrected logs in a coal evaluation.
Andor, Bogdan; Alexa, Ersilia; Hogea, Elena; Coricovac, Dorina; Pătrașcu, Jenel Marian; Mioc, Marius; Cristina, Romeo Teodor; Soica, Codruta; Dehelean, Cristina
2016-01-01
In recent years, nutraceuticals attracted a great amount of attention in the biomedical research due to their significant contribution as natural agents for prevention of various health issues. Ethanolic extracts from the ungerminated and germinated seeds of Lupinus albus L. and Lupinus angustifolius L. were analyzed for the content in isoflavones (genistein) and cinnamic acid derivatives. Additionally, the extracts were evaluated for antimicrobial, antiproliferative, and anti-inflammatory properties, using in vitro and in vivo tests. Germination proved to be a method of choice in increasing the amount of genistein and cinnamic acid derivatives in both Lupinus albus L. and Lupinus angustifolius L. seeds. Biological evaluation of all vegetal extracts revealed a weak therapeutic potential for both ungerminated and germinated seeds. PMID:28090213
NASA Technical Reports Server (NTRS)
Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.
1972-01-01
This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large-deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers to the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.
Schumacher, I; Zechmeister, I
2012-04-01
In Austria, research in Health Technology Assessment (HTA) has been conducted since the 1990s. HTA research aims at supporting an adequate and efficient use of health care resources in order to sustain a publicly financed and solidary health care system, and should ultimately result in better health of the population. Research results should provide independent information for decision makers. To legitimize further research resources, to prioritize future HTA research and to guarantee the value of future research, HTA research itself needs to undergo evaluation. The aim of the study is to design a conceptual framework for evaluating the impact of HTA research in Austria on the basis of the existing literature. An existing review presenting methods and concepts for evaluating HTA impact was updated by a systematic search covering the literature from 2004 to January 2010. Results were analysed with regard to 4 categories: definition of the term impact, target groups and system levels, operationalisation of indicators, and evaluation methods. Overall, 19 publications were included. Regarding the 4 categories, an explanation of impact has to take into account HTA's multidisciplinary setting and needs a context-related definition. Target groups, system levels, indicators and methods depend on the impact defined. Studies investigated direct and indirect impact and focused on different target groups such as physicians, nurses and decision makers on the micro and meso levels, as well as politicians and reimbursement institutions on the macro level. Except for one reference, all studies applied already known and mostly qualitative methods for measuring the impact of HTA research. Thus, an appropriate pool of instruments seems to be available, but there is a lack of information about the validity of the applied methods and indicators. By adapting adequate methods and concepts, a conceptual framework for the Austrian HTA impact evaluation has been designed. The paper presents an overview of existing methods for the evaluation of HTA research, which has been used to identify useful approaches for measuring HTA impact in Austria. By providing a context-sensitive framework for impact evaluation in Austria, Austrian HTA research contributes to the international trend of impact evaluation. © Georg Thieme Verlag KG Stuttgart · New York.
Chen, Shuang; Sha, Sha; Qian, Michael; Xu, Yan
2017-12-01
This study investigated the aroma contribution of volatile sulfur compounds (VSCs) in Moutai liquors. The VSCs were analyzed using headspace solid-phase microextraction-gas chromatography-pulsed flame photometric detection (HS-SPME-GC-PFPD). The influences of SPME fibers, ethanol content in the sample, pre-incubation time, and extraction temperature and time on the extraction of VSCs were optimized. The VSCs were optimally extracted using a divinylbenzene/carboxen/polydimethylsiloxane fiber, by incubating 10 mL diluted Chinese liquor (5% vol.) with 3 g NaCl at 30 °C for 15 min, followed by a subsequent extraction for 40 min at 30 °C. The optimized method was further validated. A total of 13 VSCs were identified and quantified in Moutai liquors. The aroma contribution of these VSCs was evaluated by their odor activity values (OAVs), with the result that 7 of the 13 VSCs had OAVs > 1. In particular, 2-furfurylthiol, methanethiol, dimethyl trisulfide, ethanethiol, and methional had relatively high OAVs and could be the key aroma contributors to Moutai liquors. In this study, a method for analyzing volatile sulfur compounds in Chinese liquors has been developed. This method will allow an in-depth study of the aroma contribution of volatile sulfur compounds in Chinese liquors. Seven volatile sulfur compounds were identified as potential key aroma contributors for Moutai liquors, which can support the quality control of Moutai liquors. © 2017 Institute of Food Technologists®.
Diky, Vladimir; Chirico, Robert D; Muzny, Chris D; Kazakov, Andrei F; Kroenlein, Kenneth; Magee, Joseph W; Abdulagatov, Ilmutdin; Frenkel, Michael
2013-12-23
ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported in this journal. The present article describes the background and implementation for new additions in latest release of TDE. Advances are in the areas of program architecture and quality improvement for automatic property evaluations, particularly for pure compounds. It is shown that selection of appropriate program architecture supports improvement of the quality of the on-demand property evaluations through application of a readily extensible collection of constraints. The basis and implementation for other enhancements to TDE are described briefly. Other enhancements include the following: (1) implementation of model-validity enforcement for specific equations that can provide unphysical results if unconstrained, (2) newly refined group-contribution parameters for estimation of enthalpies of formation for pure compounds containing carbon, hydrogen, and oxygen, (3) implementation of an enhanced group-contribution method (NIST-Modified UNIFAC) in TDE for improved estimation of phase-equilibrium properties for binary mixtures, (4) tools for mutual validation of ideal-gas properties derived through statistical calculations and those derived independently through combination of experimental thermodynamic results, (5) improvements in program reliability and function that stem directly from the recent redesign of the TRC-SOURCE Data Archival System for experimental property values, and (6) implementation of the Peng-Robinson equation of state for binary mixtures, which allows for critical evaluation of mixtures involving supercritical components. Planned future developments are summarized.
Gooding, Kate; Makwinja, Regina; Nyirenda, Deborah; Vincent, Robin; Sambakunsi, Rodrick
2018-01-01
Background: Evaluation of community and public engagement in research is important to deepen understanding of how engagement works and to enhance its effectiveness. Theories of change have been recommended for evaluating community engagement, for their ability to make explicit intended outcomes and understandings of how engagement activities contribute to these outcomes. However, there are few documented examples of using theories of change for evaluation of engagement. This article reports experience of using theories of change to develop a framework for evaluating community engagement in research at a clinical research organisation in Malawi. We describe the steps used to develop theories of change, and the way theories of change were used to design data collection plans. Based on our experience, we reflect on the advantages and challenges of the theory of change approach. Methods: The theories of change and evaluation framework were developed through a series of workshops and meetings between engagement practitioners, monitoring and evaluation staff, and researchers. We first identified goals for engagement, then used ‘so that’ chains to clarify pathways and intermediate outcomes between engagement activities and goals. Further meetings were held to refine initial theories of change, identify priority information needs, and define feasible evaluation methods. Results: The theory of change approach had several benefits. In particular, it helped to construct an evaluation framework focused on relevant outcomes and not just activities. The process of reflecting on intended goals and pathways also helped staff to review the design of engagement activities. Challenges included practical considerations around time to consider evaluation plans among practitioners (a challenge for evaluation more generally regardless of method), and more fundamental difficulties related to identifying feasible and agreed outcomes. Conclusions: These experiences from Malawi provide lessons for other research organisations considering use of theories of change to support evaluation of community engagement. PMID:29560418
What commodities and countries impact inequality in the global food system?
NASA Astrophysics Data System (ADS)
Carr, Joel A.; D'Odorico, Paolo; Suweis, Samir; Seekell, David A.
2016-09-01
The global distribution of food production is unequal relative to the distribution of human populations. International trade can increase or decrease inequality in food availability, but little is known about how specific countries and commodities contribute to this redistribution. We present a method based on the Gini coefficient for evaluating the contributions of country and commodity specific trade to inequality in the global food system. We applied the method to global food production and trade data for the years 1986-2011 to identify the specific countries and commodities that contribute to increasing and decreasing inequality in global food availability relative to food production. Overall, international trade reduced inequality in food availability by 25%-33% relative to the distribution of food production, depending on the year. Across all years, about 58% of the total trade links acted to reduce inequality, with ~4% of the links providing 95% of the reduction in inequality. Exports from the United States of America, Malaysia, Argentina, and Canada are particularly important in decreasing inequality. Specific commodities that reduce inequality when traded include cereals and vegetables. Some trade connections contribute to increasing inequality, but this effect is mostly concentrated within a small number of commodities including fruits, stimulants, and nuts. In terms of specific countries, exports from Slovenia, Oman, Singapore, and Germany act to increase overall inequality. Collectively, our analysis and results represent an opportunity for building an enhanced understanding of global-scale patterns in food availability.
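A minimal sketch of the kind of computation the abstract describes: a population-weighted Gini coefficient evaluated once on per-capita production and once on production plus net trade. The pairwise formula is standard, but its use here and all numbers are our assumptions, not the authors' data or code.

```python
import numpy as np

def gini(values, weights):
    """Weighted Gini coefficient via the pairwise mean-difference formula."""
    v = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    pairwise = np.abs(v[:, None] - v[None, :])
    num = np.sum(w[:, None] * w[None, :] * pairwise)
    return num / (2.0 * np.sum(w) ** 2 * np.average(v, weights=w))

production = np.array([3000.0, 2200.0, 1500.0])  # kcal/capita/day (illustrative)
population = np.array([50.0, 120.0, 30.0])       # millions (illustrative)
net_trade = np.array([-300.0, 150.0, 400.0])     # imports minus exports

g_production = gini(production, population)
g_availability = gini(production + net_trade, population)
reduction = 1.0 - g_availability / g_production  # trade's equalizing effect
```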
A contribution toward rational modeling of the pressure-strain-rate correlation
NASA Technical Reports Server (NTRS)
Lee, Moon Joo
1990-01-01
A novel method of obtaining an analytical expression of the 'linear part' of the pressure-strain-rate tensor in terms of the anisotropy tensor of the Reynolds stresses has been developed, where the coefficients of the seven independent tensor terms are functions of the invariants of the Reynolds-stress anisotropy. The coefficients are evaluated up to fourth order in the anisotropy of the Reynolds stresses to provide guidance for development of a turbulence model.
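The expansion rests on standard definitions, which the following sketch restates in common notation (ours, not necessarily the report's):

```latex
% Reynolds-stress anisotropy tensor and its invariants:
\[
  b_{ij} = \frac{\overline{u_i u_j}}{2k} - \frac{\delta_{ij}}{3},
  \qquad k = \tfrac{1}{2}\,\overline{u_k u_k},
  \qquad \mathrm{II} = b_{ij}b_{ji},
  \qquad \mathrm{III} = b_{ij}b_{jk}b_{ki} .
\]
% The linear part of the pressure-strain-rate tensor is then written as a sum
% over the seven independent tensor terms, with coefficients that are
% functions of the invariants:
\[
  \Pi_{ij} = \sum_{n=1}^{7} \alpha_n(\mathrm{II},\mathrm{III})\, T^{(n)}_{ij} .
\]
```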
Xu, Zhongnan; Joshi, Yogesh V; Raman, Sumathy; Kitchin, John R
2015-04-14
We validate the usage of the calculated, linear response Hubbard U for evaluating accurate electronic and chemical properties of bulk 3d transition metal oxides. We find calculated values of U lead to improved band gaps. For the evaluation of accurate reaction energies, we first identify and eliminate contributions to the reaction energies of bulk systems due only to changes in U and construct a thermodynamic cycle that references the total energies of unique U systems to a common point using a DFT + U(V) method, which we recast from a recently introduced DFT + U(R) method for molecular systems. We then introduce a semi-empirical method based on weighted DFT/DFT + U cohesive energies to calculate bulk oxidation energies of transition metal oxides using density functional theory and linear response calculated U values. We validate this method by calculating 14 reactions energies involving V, Cr, Mn, Fe, and Co oxides. We find up to an 85% reduction of the mean average error (MAE) compared to energies calculated with the Perdew-Burke-Ernzerhof functional. When our method is compared with DFT + U with empirically derived U values and the HSE06 hybrid functional, we find up to 65% and 39% reductions in the MAE, respectively.
Drumond, Nélio; Stegemann, Sven
2018-05-01
The oral cavity is frequently used to administer pharmaceutical drug products. This route of administration is seen as the most accessible for the majority of patients and supports an independent therapy management. For current oral dosage forms under development, the prediction of their unintended mucoadhesive properties and esophageal transit profiles would contribute for future administration safety, as concerns regarding unintended adhesion of solid oral dosage forms (SODF) during oro-esophageal transit still remain. Different in vitro methods that access mucoadhesion of polymers and pharmaceutical preparations have been proposed over the years. The same methods might be used to test non-adhesive systems and contribute for developing safe-to-swallow technologies. Previous works have already investigated the suitability of non-animal derived in vitro methods to assess such properties. The aim of this work was to review the in vitro methodology available in the scientific literature that used animal esophageal tissue to evaluate mucoadhesion and esophageal transit of pharmaceutical preparations. Furthermore, in vivo methodology is also discussed. Since none of the in vitro methods developed are able to mimic the complex swallowing process and oro-esophageal transit, in vivo studies in humans remain as the gold standard. Copyright © 2018 Elsevier B.V. All rights reserved.
McNamara, Martin S; Fealy, Gerard M; Casey, Mary; O'Connor, Tom; Patton, Declan; Doyle, Louise; Quinlan, Christina
2014-09-01
To evaluate mentoring, coaching and action learning interventions used to develop nurses' and midwives' clinical leadership competencies and to describe the programme participants' experiences of the interventions. Mentoring, coaching and action learning are effective interventions in clinical leadership development and were used in a new national clinical leadership development programme, introduced in Ireland in 2011. An evaluation of the programme focused on how participants experienced the interventions. A qualitative design, using multiple data sources and multiple data collection methods. Methods used to generate data on participant experiences of individual interventions included focus groups, individual interviews and nonparticipant observation. Seventy participants, including 50 programme participants and those providing the interventions, contributed to the data collection. Mentoring, coaching and action learning were positively experienced by participants and contributed to the development of clinical leadership competencies, as attested to by the programme participants and intervention facilitators. The use of interventions that are action-oriented and focused on service development, such as mentoring, coaching and action learning, should be supported in clinical leadership development programmes. Being quite different to short attendance courses, these interventions require longer-term commitment on the part of both individuals and their organisations. In using mentoring, coaching and action learning interventions, the focus should be on each participant's current role and everyday practice and on helping the participant to develop and demonstrate clinical leadership skills in these contexts. © 2014 John Wiley & Sons Ltd.
Evaluation and perceived results of moral case deliberation: A mixed methods study.
Janssens, Rien M J P A; van Zadelhoff, Ezra; van Loo, Ger; Widdershoven, Guy A M; Molewijk, Bert A C
2015-12-01
Moral case deliberation is increasingly becoming part of various Dutch healthcare organizations. Although some evaluation studies of moral case deliberation have been carried out, research into the results of moral case deliberation within aged care is scarce. How did participants evaluate moral case deliberation? What has moral case deliberation brought to them? What has moral case deliberation contributed to care practice? Should moral case deliberation be further implemented and, if so, how? Quantitative analysis of a questionnaire study among participants of moral case deliberation, both caregivers and team leaders. Qualitative analysis of written answers to open questions, interview study and focus group meetings among caregivers and team leaders. Caregivers and team leaders in a large organization for aged care in the Netherlands. A total of 61 moral case deliberation sessions, carried out on 16 care locations belonging to the organization, were evaluated and perceived results were assessed. Participants gave informed consent and anonymity was guaranteed. In the Netherlands, the law does not prescribe independent ethical review by an Institutional Review Board for this kind of research among healthcare professionals. Moral case deliberation was evaluated positively by the participants. Content and atmosphere of moral case deliberation received high scores, while organizational issues regarding the moral case deliberation sessions scored lower and merit further attention. Respondents indicated that moral case deliberation has the potential to contribute to care practice as relationships among team members improve, more openness is experienced and more understanding for different perspectives is fostered. If moral case deliberation is to be successfully implemented, top-down approaches should go hand in hand with bottom-up approaches. The relevance of moral case deliberation for care practice received wide acknowledgement from the respondents. It can contribute to the team's cohesion as mutual understanding for one another's views is fostered. If implemented well, moral case deliberation has the potential to improve care, according to the respondents. © The Author(s) 2014.
2010-01-01
Background There is a growing consensus that linear approaches to improving the performance of health workers and health care organisations may only obtain short-term results. An alternative approach premised on the principle of human resource management described as a form of 'High commitment management', builds upon a bundles of balanced practices. This has been shown to contribute to better organisational performance. This paper illustrates an intervention and outcome of high commitment management (HiCom) at an urban hospital in Ghana. Few studies have shown how HiCom management might contribute to better performance of health services and in particular of hospitals in low and middle-income settings. Methods A realist case study design was used to analyse how specific management practices might contribute to improving the performance of an urban district hospital in Ho, Volta Region, in Ghana. Mixed methods were used to collect data, including document review, in-depth interviews, group discussions, observations and a review of routine health information. Results At Ho Municipal Hospital, the management team dealt with the crisis engulfing the ailing urban district hospital by building an alliance between hospital staff to generate a sense of ownership with a focus around participative problem analysis. The creation of an alliance led to improving staff morale and attitude, and contributed also to improvements in the infrastructure and equipment. This in turn had a positive impact on the revenue generating capacity of the hospital. The quick turn around in the state of this hospital showed that change was indeed possible, a factor that greatly motivated the staff. In a second step, the management team initiated the development of a strategic plan for the hospital to maintain the dynamics of change. This was undertaken through participative methods and sustained earlier staff involvement, empowerment and feelings of reciprocity. We found that these factors acted as the core mechanisms underlying the changes taking place at Ho Municipal Hospital. Conclusions This study shows how a hospital management team in Ghana succeeded in resuscitating an ailing hospital. Their high commitment management approach led to the active involvement and empowerment of staff. It also showed how a realist evaluation approach such as this, could be used in the research of the management of health care organisations to explain how management interventions may or may not work. PMID:21184678
Personal sound zone reproduction with room reflections
NASA Astrophysics Data System (ADS)
Olik, Marek
Loudspeaker-based sound systems, capable of a convincing reproduction of different audio streams to listeners in the same acoustic enclosure, are a convenient alternative to headphones. Such systems aim to generate "sound zones" in which target sound programmes are to be reproduced with minimum interference from any alternative programmes. This can be achieved with appropriate filtering of the source (loudspeaker) signals, so that the target sound's energy is directed to the chosen zone while being attenuated elsewhere. The existing methods are unable to produce the required sound energy ratio (acoustic contrast) between the zones with a small number of sources when strong room reflections are present. Optimization of parameters is therefore required for systems with practical limitations to improve their performance in reflective acoustic environments. One important parameter is positioning of sources with respect to the zones and room boundaries. The first contribution of this thesis is a comparison of the key sound zoning methods implemented on compact and distributed geometrical source arrangements. The study presents previously unpublished detailed evaluation and ranking of such arrangements for systems with a limited number of sources in a reflective acoustic environment similar to a domestic room. Motivated by the requirement to investigate the relationship between source positioning and performance in detail, the central contribution of this thesis is a study on optimizing source arrangements when strong individual room reflections occur. Small sound zone systems are studied analytically and numerically to reveal relationships between the geometry of source arrays and performance in terms of acoustic contrast and array effort (related to system efficiency). Three novel source position optimization techniques are proposed to increase the contrast, and geometrical means of reducing the effort are determined. Contrary to previously published case studies, this work presents a systematic examination of the key problem of first order reflections and proposes general optimization techniques, thus forming an important contribution. The remaining contribution considers evaluation and comparison of the proposed techniques with two alternative approaches to sound zone generation under reflective conditions: acoustic contrast control (ACC) combined with anechoic source optimization and sound power minimization (SPM). The study provides a ranking of the examined approaches which could serve as a guideline for method selection for rooms with strong individual reflections.
Contribution of bacteria-like particles to PM2.5 aerosol in urban and rural environments
NASA Astrophysics Data System (ADS)
Wolf, R.; El-Haddad, I.; Slowik, J. G.; Dällenbach, K.; Bruns, E.; Vasilescu, J.; Baltensperger, U.; Prévôt, A. S. H.
2017-07-01
We report highly time-resolved estimates of airborne bacteria-like particle concentrations in ambient aerosol using an Aerodyne aerosol mass spectrometer (AMS). AMS measurements with a newly developed PM2.5 lens and the standard (PM1) aerodynamic lens were performed at an urban background site (Zurich) and at a rural site (Payerne) in Switzerland. Positive matrix factorization using the multilinear engine (ME-2) implementation was used to estimate the contribution of bacteria-like particles to non-refractory organic aerosol. The success of the method was evaluated by a size-resolved analysis of the organic mass and by analysis of single-particle mass spectra, which were detected with a light-scattering system integrated into the AMS. Use of the PM2.5 aerodynamic lens increased the measured bacteria-like concentrations, supporting the analysis method. However, at both sites the low concentrations of this component suggest that airborne bacteria constitute a minor fraction of the non-refractory PM2.5 organic aerosol mass. Estimated average mass concentrations were below 0.1 μg/m³ and relative contributions were lower than 2% at both sites. During rainfall periods, concentrations of the bacteria-like component increased considerably, reaching a short-time maximum of approximately 2 μg/m³ at the Payerne site in summer.
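For reference, the generic form of positive matrix factorization solved by the ME-2 engine, sketched in standard notation (the authors' specific constraints are not reproduced here):

```latex
% Measured spectra X are approximated by factor time series G and factor
% profiles F, minimizing uncertainty-weighted residuals under nonnegativity:
\[
  x_{ij} = \sum_{k=1}^{p} g_{ik} f_{kj} + e_{ij},
  \qquad
  Q = \sum_{i}\sum_{j}\left(\frac{e_{ij}}{\sigma_{ij}}\right)^{2} \to \min,
  \qquad g_{ik}\ge 0,\; f_{kj}\ge 0 .
\]
```

In the ME-2 implementation, one factor profile (here, the bacteria-like spectrum) can additionally be constrained toward a reference profile.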
Perrier, Charles; Normandeau, Éric; Dionne, Mélanie; Richard, Antoine; Bernatchez, Louis
2014-01-01
While nonanadromous males (stream-resident and/or mature male parr) contribute to reproduction in anadromous salmonids, little is known about their impacts on key population genetic parameters. Here, we evaluated the contribution of Atlantic salmon mature male parr to the effective number of breeders (Nb), using both demographic (variance in reproductive success) and genetic (linkage disequilibrium) methods, to the number of alleles, and to the relatedness among breeders. We used a recently published pedigree reconstruction of a wild anadromous Atlantic salmon population in which 2548 fry born in 2010 were assigned parentage to 144 anadromous females and 101 anadromous males that returned to the river to spawn in 2009, and to 462 mature male parr. Demographic and genetic methods revealed that mature male parr increased population Nb by 1.79 and 1.85 times, respectively. Moreover, mature male parr boosted the number of alleles found among progenies. Finally, mature male parr were on average less related to anadromous females than were anadromous males, likely because of asynchronous sexual maturation between mature male parr and anadromous fish of a given cohort. By increasing Nb and allelic richness, and by decreasing inbreeding, the reproductive contribution of mature male parr has important evolutionary and conservation implications for declining Atlantic salmon populations. PMID:25553070
The Correction of Myopia Evaluation Trial: lessons from the study design.
Hyman, L; Gwiazda, J
2004-01-01
The Correction of Myopia Evaluation Trial (COMET), a multicentre clinical trial based in 4 schools of optometry in the United States, evaluated the effect of progressive addition lenses versus single vision lenses on myopia progression in an ethnically diverse group of 469 myopic children aged 6 to 11 years. Completion of the clinical trial phase of the study provides an opportunity to evaluate aspects of the study design that contribute to its success. This article describes aspects of the study design that were influential in ensuring the smooth conduct of COMET. These include a dedicated team of investigators, an organisational structure with strong leadership and an independent Co-ordinating Centre, regular communication among investigators, flexible and creative approaches to recruitment and retention, sensitivity to concerns for child safety and child participation, and methods for enhancing and monitoring data reliability. The experience with COMET has provided a number of valuable lessons for all aspects of the study design that should benefit the development and implementation of future clinical trials, particularly those done in similar populations of children. The use of a carefully designed protocol using standard methods by dedicated members of the study team is essential in ensuring achievement of the study aims.
Zhang, Xiao-Li; Liu, Yu-Ling; Fan, Li-Jiao; Wang, Yue-Liang; Chen, Kai; Li, Hui
2016-05-01
Based on the DPPH method, the antioxidant activities of Shenqi Tongmai Yizhi particles obtained with different extraction processes were compared. The contribution of the different chemical constituents in the fingerprint to the in vitro antioxidant capacity was explored by means of grey relational analysis. The results showed that the IC₅₀ values of the water extract, the water extract after alcohol precipitation, the alcohol extract, and the alcohol-and-water extract were 0.8014, 0.8591, 0.7961 and 0.9180 g·L⁻¹, respectively; alcohol extraction was thus the best method for extracting the antioxidative components, giving the highest antioxidant activity and the lowest IC₅₀. When the mass concentration of the herbs reached a certain level, the free radical clearance rate was similar to that of the vitamin C control group. The order of the contributions of the different constituents in the fingerprint to the antioxidant activity was 4>3>33>53>9>10>11>34>15>59>8>61>52>20>42>18>29. This preliminary exploration of the spectrum-effect relationship provides a reference for studying the processing methods and the pharmacodynamic material basis of traditional Chinese medicine compounds. Copyright© by the Chinese Pharmaceutical Association.
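The underlying DPPH arithmetic is standard; a hedged sketch follows (the absorbance protocol, concentrations and rates are illustrative, not the study's data):

```python
import numpy as np

def clearance_rate(a_control, a_sample):
    """Percent DPPH radical clearance relative to the control absorbance."""
    return 100.0 * (a_control - a_sample) / a_control

# IC50: the concentration giving 50% clearance, read off the dose-response.
conc = np.array([0.2, 0.4, 0.8, 1.6])        # g/L of extract (illustrative)
rates = np.array([18.0, 33.0, 50.2, 71.0])   # % clearance (illustrative)
ic50 = np.interp(50.0, rates, conc)
```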
Evaluation of an improved finite-element thermal stress calculation technique
NASA Technical Reports Server (NTRS)
Camarda, C. J.
1982-01-01
A procedure for generating accurate thermal stresses with coarse finite element grids (Ojalvo's method) is described. The procedure is based on the observation that for linear thermoelastic problems, the thermal stresses may be envisioned as being composed of two contributions; the first due to the strains in the structure which depend on the integral of the temperature distribution over the finite element and the second due to the local variation of the temperature in the element. The first contribution can be accurately predicted with a coarse finite-element mesh. The resulting strain distribution can then be combined via the constitutive relations with detailed temperatures from a separate thermal analysis. The result is accurate thermal stresses from coarse finite element structural models even where the temperature distributions have sharp variations. The range of applicability of the method for various classes of thermostructural problems such as in-plane or bending type problems and the effect of the nature of the temperature distribution and edge constraints are addressed. Ojalvo's method is used in conjunction with the SPAR finite element program. Results are obtained for rods, membranes, a box beam and a stiffened panel.
Users' perception as a tool to improve urban beach planning and management.
Cervantes, Omar; Espejel, Ileana; Arellano, Evarista; Delhumeau, Sheila
2008-08-01
Four beaches that share physiographic characteristics (sandy, wide, and long) but differ in socioeconomic and cultural terms (three are located in northwestern Mexico and one in California, USA) were evaluated by beach users. Surveys (565) composed of 36 questions were handed out to beach users on weekends and holidays in 2005. The 25 questions that revealed the most information were selected by factor analysis and classified by cluster analysis. Beach users' preferences were assigned a value by comparing the present survey results with the characteristics of an "ideal" recreational urban beach. Cluster analysis separated three groups of questions: (a) services and infrastructure, (b) recreational activities, and (c) beach conditions. Cluster linkage distance (r=0.82, r=0.78, r=0.67) was used as a weight and multiplied by the value of beach descriptive factors. Mazatlán and Oceanside obtained the highest values because there are enough infrastructure and services; on the contrary, Ensenada and Rosarito were rated medium and low because infrastructure and services are lacking. The presently proposed method can contribute to improving current beach evaluations because the final score represents the beach users' evaluation of the quality of the beach. The weight considered in the present study marks the beach users' preferences among the studied beaches. Adding this weight to beach evaluation will contribute to more specific beach planning in which users' perception is considered.
Ohno, Yoshiyuki
2018-01-01
Drug-drug interactions (DDIs) can affect the clearance of various drugs from the body; however, these effects are difficult to sufficiently evaluate in clinical studies. This article outlines our approach to improving methods for evaluating and providing drug information relative to the effects of DDIs. In a previous study, total exposure changes to many substrate drugs of CYP caused by the co-administration of inhibitor or inducer drugs were successfully predicted using in vivo data. There are two parameters for the prediction: the contribution ratio of the enzyme to oral clearance for substrates (CR), and either the inhibition ratio for inhibitors (IR) or the increase in clearance of substrates produced by induction (IC). To apply these predictions in daily pharmacotherapy, the clinical significance of any pharmacokinetic changes must be carefully evaluated. We constructed a pharmacokinetic interaction significance classification system (PISCS) in which the clinical significance of DDIs was considered in a systematic manner, according to pharmacokinetic changes. The PISCS suggests that many current 'alert' classifications are potentially inappropriate, especially for drug combinations in which pharmacokinetics have not yet been evaluated. It is expected that PISCS would contribute to constructing a reliable system to alert pharmacists, physicians and consumers of a broad range of pharmacokinetic DDIs in order to more safely manage daily clinical practices.
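The prediction referred to above is commonly written in the following form in the related literature; treat this as a hedged sketch of the framework rather than a definitive statement of the authors' equations:

```latex
% Predicted change in total exposure (AUC) of a substrate whose fractional
% oral clearance through the affected enzyme is CR:
\[
  \frac{\mathrm{AUC}_{\mathrm{inhibited}}}{\mathrm{AUC}_{\mathrm{control}}}
  = \frac{1}{1 - \mathrm{CR}\cdot\mathrm{IR}},
  \qquad
  \frac{\mathrm{AUC}_{\mathrm{induced}}}{\mathrm{AUC}_{\mathrm{control}}}
  = \frac{1}{1 + \mathrm{CR}\cdot\mathrm{IC}},
\]
% where IR is the inhibitor's inhibition ratio and IC the induction-driven
% increase in substrate clearance.
```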
Evaluation of electrical impedance ratio measurements in accuracy of electronic apex locators.
Kim, Pil-Jong; Kim, Hong-Gee; Cho, Byeong-Hoon
2015-05-01
The aim of this paper was to evaluate, through a correlation analysis, the ratios of electrical impedance measurements reported in previous studies, in order to establish them as the factor contributing to the accuracy of electronic apex locators (EALs). The literature on electrical property measurements of EALs was screened using Medline and Embase. All acquired data were plotted to identify correlations between impedance and log-scaled frequency. The accuracy of the impedance-ratio method used to detect the apical constriction (APC) in most EALs was evaluated using linear ramp function fitting. Changes in the impedance ratios at various frequencies were evaluated for a variety of file positions. Among the ten papers selected in the search process, the first-order equations between log-scaled frequency and impedance had negative slopes. When the model for the ratios was assumed to be a linear ramp function, the ratio values decreased as the file went deeper, and the average ratio values of the left and right horizontal zones were significantly different in 8 out of 9 studies. The APC was located within the interval of linear relation between the left and right horizontal zones of the linear ramp model. Using the ratio method, the APC was located within a linear interval. Therefore, using the ratio between electrical impedance measurements at different frequencies is a robust method for detection of the APC.
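A hypothetical sketch of the linear-ramp model described above: a left plateau, a linear descent and a right plateau, fitted to the impedance ratio as a function of file depth. The parameterization, units and data are our assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_ramp(x, x1, x2, y_left, y_right):
    """Plateau y_left for x <= x1, plateau y_right for x >= x2, linear between."""
    return np.interp(x, [x1, x2], [y_left, y_right])

depth = np.linspace(-3.0, 3.0, 61)               # file depth (illustrative units)
ratio = linear_ramp(depth, -1.0, 1.0, 1.4, 0.8)  # synthetic "measurements"
ratio = ratio + np.random.default_rng(0).normal(0.0, 0.02, depth.size)

p0 = [-0.5, 0.5, 1.3, 0.9]                       # initial guess (x1 < x2 assumed)
popt, _ = curve_fit(linear_ramp, depth, ratio, p0=p0)
x1, x2, y_left, y_right = popt                   # the APC lies within [x1, x2]
```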
Costing evidence for health care decision-making in Austria: A systematic review
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata
2017-01-01
Background With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods both at international and national levels are imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It describes systematically the current economic evaluation and costing studies landscape focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impacts on evidence-based decision-making and potential suggestions for areas of development. Methods A systematic literature review of English and German language peer-reviewed as well as grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results A total of 93 studies were included. 87% were journal articles, 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on relevant standards the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% of the studies did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion There are substantial inconsistencies in the costing methods and reporting standards in economic analyses in Austria, which may contribute to a low acceptance and lack of interest in economic evaluation-informed decision making. To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728
Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments
NASA Astrophysics Data System (ADS)
Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel
2017-03-01
This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail, and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of a previous data set as the input parameters for the next data set, and the model-suitability crosscheck option of applying the procedure in ascending and descending directions through the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
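A schematic of the sequential idea in plain Python (a sketch of the workflow, not the authors' software): the best-fit parameters of each time spectrum seed the fit of the next one, and running the sequence in both directions cross-checks the model.

```python
def fit_series(spectra, initial_guess, fit_one):
    """fit_one(spectrum, guess) -> best-fit parameters for one time spectrum."""
    results, guess = [], initial_guess
    for spectrum in spectra:
        guess = fit_one(spectrum, guess)  # previous output seeds the next fit
        results.append(guess)
    return results

def crosscheck(spectra, initial_guess, fit_one):
    """Fit in ascending and then descending order; disagreement between the
    two passes flags an unsuitable model or unreliable parameters."""
    forward = fit_series(spectra, initial_guess, fit_one)
    backward = fit_series(list(reversed(spectra)), initial_guess, fit_one)
    return forward, list(reversed(backward))
```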
Semiclassical evaluation of quantum fidelity
NASA Astrophysics Data System (ADS)
Vaníček, Jiří; Heller, Eric J.
2003-11-01
We present a numerically feasible semiclassical (SC) method to evaluate quantum fidelity decay (Loschmidt echo) in a classically chaotic system. It was thought that such an evaluation would be intractable, but instead we show that a uniform SC expression not only is tractable but also gives remarkably accurate numerical results for the standard map in both the Fermi-golden-rule and Lyapunov regimes. Because it allows Monte Carlo evaluation, the uniform expression is accurate at times when there are 10⁷⁰ semiclassical contributions. Remarkably, it also explicitly contains the "building blocks" of analytical theories in the recent literature, and thus permits a direct test of the approximations made by other authors in these regimes, rather than an a posteriori comparison with numerical results. We explain in more detail the extended validity of the classical perturbation approximation and show that within this approximation, the so-called "diagonal approximation" is automatic and does not require ensemble averaging.
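For reference, the quantity being evaluated, in the standard notation of the fidelity/Loschmidt-echo literature (a sketch; H' = H + εV denotes the perturbed Hamiltonian):

```latex
\[
  M(t) \;=\;
  \bigl|\,\langle \psi \,|\, e^{\,iH't/\hbar}\, e^{-iHt/\hbar} \,|\, \psi \rangle \,\bigr|^{2},
\]
% which decays with the golden-rule width in the Fermi-golden-rule regime
% and at the rate of the Lyapunov exponent in the Lyapunov regime.
```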
Exercise intolerance in pulmonary hypertension: mechanism, evaluation and clinical implications.
Babu, Abraham Samuel; Arena, Ross; Myers, Jonathan; Padmakumar, Ramachandran; Maiya, Arun G; Cahalin, Lawrence P; Waxman, Aaron B; Lavie, Carl J
2016-09-01
Exercise intolerance in pulmonary hypertension (PH) is a major factor affecting activities of daily living and quality of life. Evaluation strategies (i.e., non-invasive and invasive tests) are integral to providing a comprehensive assessment of clinical and functional status. Despite a growing body of literature on the clinical consequences of PH, there are limited studies discussing the contribution of the various physiological systems to exercise intolerance in this patient population. This review, through a search of various databases, describes the physiological basis for exercise intolerance across the various PH etiologies, highlights the various exercise evaluation methods and discusses the rationale for exercise training among those diagnosed with PH. Expert commentary: With the growing importance of evaluating exercise capacity in PH (class 1, level C recommendation), understanding why exercise performance is altered in PH is crucial. Further study is required to produce better-quality evidence in this area.
A novel feature ranking algorithm for biometric recognition with PPG signals.
Reşit Kavsaoğlu, A; Polat, Kemal; Recep Bozkurt, M
2014-06-01
This study describes the application of the photoplethysmography (PPG) signal and the time-domain features acquired from its first and second derivatives to biometric identification. For this purpose, a total of 40 features were extracted and a feature-ranking algorithm is proposed. The proposed algorithm calculates the contribution of each feature to biometric recognition and ranks the features from the greatest contribution to the smallest. The Euclidean distance and absolute distance formulas are used to quantify each feature's contribution. The efficiency of the proposed algorithm is demonstrated by applying a k-NN (k-nearest neighbor) classifier to the ranked features. In the experiments, 15-period PPG signals recorded on two separate occasions from each of thirty healthy subjects were acquired with a PPG data acquisition card. The PPG signals recorded first were treated as the 1st configuration, the PPG signals recorded later at a different time as the 2nd configuration, and the combination of both as the 3rd configuration. With the k-NN classifier model built on the proposed algorithm, identification rates of 90.44% for the 1st configuration, 94.44% for the 2nd configuration, and 87.22% for the 3rd configuration were attained. These results show that both the proposed algorithm and the PPG-based biometric identification model built on it are very promising for contactless recognition of individuals. Copyright © 2014 Elsevier Ltd. All rights reserved.
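The following Python sketch illustrates the general shape of such a pipeline — a distance-based feature-contribution score followed by k-NN evaluation — under the assumption that `X` holds the 40 extracted features per recording and `y` the subject identities; the paper's exact scoring formulas may differ from this variant.

```python
# Illustrative sketch (assumptions: X is an array of shape [recordings, 40]
# holding the PPG-derived features, y the subject identities; the paper's
# exact Euclidean/absolute-distance scoring may differ from this variant).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def rank_features(X, y):
    """Score each feature by summed absolute separation of class centroids."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    scores = np.zeros(X.shape[1])
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            scores += np.abs(centroids[i] - centroids[j])
    return np.argsort(scores)[::-1]  # feature indices, greatest contribution first

# Evaluate identification accuracy with the top-ranked features and k-NN:
# order = rank_features(X, y)
# acc = cross_val_score(KNeighborsClassifier(n_neighbors=3), X[:, order[:10]], y).mean()
```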
An annular superposition integral for axisymmetric radiators.
Kelly, James F; McGough, Robert J
2007-02-01
A fast integral expression for computing the nearfield pressure is derived for axisymmetric radiators. This method replaces the sum of contributions from concentric annuli with an exact double integral that converges much faster than methods that evaluate the Rayleigh-Sommerfeld integral or the generalized King integral. Expressions are derived for plane circular pistons using both continuous wave and pulsed excitations. Several commonly used apodization schemes for the surface velocity distribution are considered, including polynomial functions and a "smooth piston" function. The effect of different apodization functions on the spectral content of the wave field is explored. Quantitative error and time comparisons between the new method, the Rayleigh-Sommerfeld integral, and the generalized King integral are discussed. At all error levels considered, the annular superposition method achieves a speed-up of at least a factor of 4 relative to the point-source method and a factor of 3 relative to the generalized King integral without increasing the computational complexity.
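For context, the baseline that the annular superposition method accelerates is the Rayleigh–Sommerfeld integral for a baffled planar source; written here (with the e^{+iωt} time convention, sign conventions vary across texts):

```latex
% Rayleigh-Sommerfeld integral for a baffled planar radiator (e^{+i\omega t}
% time convention): pressure at field point r from normal surface velocity
% u(r') on the source surface S, with wavenumber k and fluid density \rho.
p(\mathbf{r}) = \frac{i\rho\omega}{2\pi} \int_S u(\mathbf{r}')\,
    \frac{e^{-ik\,|\mathbf{r}-\mathbf{r}'|}}{|\mathbf{r}-\mathbf{r}'|}\; dS'
```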
Sunada, Keijiro; Yamamoto, Hironori; Kita, Hiroto; Yano, Tomonori; Sato, Hiroyuki; Hayashi, Yoshikazu; Miyata, Tomohiko; Sekine, Yutaka; Kuno, Akiko; Iwamoto, Michiko; Ohnishi, Hirohide; Ido, Kenichi; Sugano, Kentaro
2005-01-01
AIM: To evaluate the clinical outcome of enteroscopy, using the double-balloon method, focusing on the involvement of neoplasms in strictures of the small intestine. METHODS: Enteroscopy, using the double-balloon method, was performed between December 1999 and December 2002 at Jichi Medical School Hospital, Japan and strictures of the small intestine were found in 17 out of 62 patients. These 17 consecutive patients were subjected to analysis. RESULTS: The double-balloon enteroscopy contributed to the diagnosis of small intestinal neoplasms found in 3 out of 17 patients by direct observation of the strictures as well as biopsy sampling. Surgical procedures were chosen for these three patients, while balloon dilation was chosen for the strictures in four patients diagnosed with inflammation without involvement of neoplasm. CONCLUSION: Double-balloon enteroscopy is a useful method for the diagnosis and treatment of strictures in the small bowel. PMID:15742422
Description and evaluation of an interference assessment for a slotted-wall wind tunnel
NASA Technical Reports Server (NTRS)
Kemp, William B., Jr.
1991-01-01
A wind-tunnel interference assessment method applicable to test sections with discrete finite-length wall slots is described. The method is based on high order panel method technology and uses mixed boundary conditions to satisfy both the tunnel geometry and wall pressure distributions measured in the slotted-wall region. Both the test model and its sting support system are represented by distributed singularities. The method yields interference corrections to the model test data as well as surveys through the interference field at arbitrary locations. These results include the equivalent of tunnel Mach calibration, longitudinal pressure gradient, tunnel flow angularity, wall interference, and an inviscid form of sting interference. Alternative results which omit the direct contribution of the sting are also produced. The method was applied to the National Transonic Facility at NASA Langley Research Center for both tunnel calibration tests and tests of two models of subsonic transport configurations.
Path-integral method for the source apportionment of photochemical pollutants
NASA Astrophysics Data System (ADS)
Dunker, A. M.
2015-06-01
A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions (CAMx) is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using three or four points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.
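A minimal sketch of the quadrature step, assuming `sensitivity` stands in for a decoupled-direct-method first-order sensitivity evaluated at fraction s along the emissions path (s in [0, 1]):

```python
# Sketch of the quadrature step (assumption: `sensitivity` is a placeholder
# for a DDM first-order sensitivity of a species concentration to source j,
# evaluated at fraction s along the chosen emission-control path).
import numpy as np

def source_contribution(sensitivity, n_points=4):
    """Gauss-Legendre approximation of integral_0^1 sensitivity(s) ds."""
    nodes, weights = np.polynomial.legendre.leggauss(n_points)
    s = 0.5 * (nodes + 1.0)  # map nodes from [-1, 1] onto [0, 1]
    return 0.5 * sum(w * sensitivity(si) for w, si in zip(weights, s))

# Smooth test profile: exact value is (1 - exp(-2)) / 2 ~ 0.43233.
# print(source_contribution(lambda s: np.exp(-2.0 * s)))
```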
Path-integral method for the source apportionment of photochemical pollutants
NASA Astrophysics Data System (ADS)
Dunker, A. M.
2014-12-01
A new, path-integral method is presented for apportioning the concentrations of pollutants predicted by a photochemical model to emissions from different sources. A novel feature of the method is that it can apportion the difference in a species concentration between two simulations. For example, the anthropogenic ozone increment, which is the difference between a simulation with all emissions present and another simulation with only the background (e.g., biogenic) emissions included, can be allocated to the anthropogenic emission sources. The method is based on an existing, exact mathematical equation. This equation is applied to relate the concentration difference between simulations to line or path integrals of first-order sensitivity coefficients. The sensitivities describe the effects of changing the emissions and are accurately calculated by the decoupled direct method. The path represents a continuous variation of emissions between the two simulations, and each path can be viewed as a separate emission-control strategy. The method does not require auxiliary assumptions, e.g., whether ozone formation is limited by the availability of volatile organic compounds (VOCs) or nitrogen oxides (NOx), and can be used for all the species predicted by the model. A simplified configuration of the Comprehensive Air Quality Model with Extensions is used to evaluate the accuracy of different numerical integration procedures and the dependence of the source contributions on the path. A Gauss-Legendre formula using three or four points along the path gives good accuracy for apportioning the anthropogenic increments of ozone, nitrogen dioxide, formaldehyde, and nitric acid. Source contributions to these increments were obtained for paths representing proportional control of all anthropogenic emissions together, control of NOx emissions before VOC emissions, and control of VOC emissions before NOx emissions. There are similarities in the source contributions from the three paths but also differences due to the different chemical regimes resulting from the emission-control strategies.
Farquhar, Morag; Preston, Nancy; Evans, Catherine J; Grande, Gunn; Short, Vicky; Benalia, Hamid; Higginson, Irene J; Todd, Chris
2013-12-01
Complex interventions are common in palliative and end-of-life care. Mixed methods approaches sit well within the multiphase model of complex intervention development and evaluation. Generic mixed methods guidance is useful, but additional challenges in research design and operationalization within palliative and end-of-life care may have an impact on the use of mixed methods. The objective of the study was to develop guidance on the best methods for combining quantitative and qualitative methods for health and social care intervention development and evaluation in palliative and end-of-life care. A one-day workshop was held where experts participated in facilitated groups using Transparent Expert Consultation to generate items for potential recommendations. Agreement and consensus were then sought on nine draft recommendations (DRs) in a follow-up exercise. There was at least moderate agreement with most of the DRs, although consensus was low. Strongest agreement was with DR1 (usefulness of mixed methods to palliative and end-of-life care) and DR5 (importance of attention to respondent burden), and least agreement was with DR2 (use of theoretical perspectives) and DR6 (therapeutic effects of research interviews). Narrative comments enabled recommendation refinement. Two fully endorsed, five partially endorsed, and two refined DRs emerged. The relationship of these nine recommendations to six key challenges of palliative and end-of-life care research was analyzed. There is a need for further discussion of these recommendations and their contribution to methodology. The recommendations should be considered when designing and operationalizing mixed methods studies of complex interventions in palliative care, and because they may have wider relevance, should be considered for other applications.
Wu, Xiaozhe; Wang, Zhenling; Li, Xiaolu; Fan, Yingzi; He, Gu; Wan, Yang; Yu, Chaoheng; Tang, Jianying; Li, Meng; Zhang, Xian; Zhang, Hailong; Xiang, Rong; Pan, Ying; Liu, Yan; Lu, Lian
2014-01-01
To design and discover new antimicrobial peptides (AMPs) with high levels of antimicrobial activity, a number of machine-learning and prediction methods have been developed. Here, we present a new prediction method that can identify novel AMPs that are highly similar in sequence to known peptides but offer improved antimicrobial activity along with lower host cytotoxicity. Using previously generated AMP amino acid substitution data, we developed an amino acid activity contribution matrix that contained an activity contribution value for each amino acid in each position of the model peptide. A series of AMPs were designed with this method. After evaluating the antimicrobial activities of these novel AMPs against both Gram-positive and Gram-negative bacterial strains, DP7 was chosen for further analysis. Compared to the parent peptide HH2, this novel AMP showed broad-spectrum, improved antimicrobial activity, and in a cytotoxicity assay it showed lower toxicity against human cells. The in vivo antimicrobial activity of DP7 was tested in a Staphylococcus aureus infection murine model. When inoculated and treated via intraperitoneal injection, DP7 reduced the bacterial load in the peritoneal lavage solution. Electron microscope imaging results indicated disruption of the S. aureus outer membrane by DP7. Our new prediction method can therefore be employed to identify AMPs possessing minor amino acid differences with improved antimicrobial activities, potentially increasing the therapeutic agents available to combat multidrug-resistant infections. PMID:24982064
Evaluation of back scatter interferometry, a method for detecting protein binding in solution.
Jepsen, S T; Jørgensen, T M; Zong, W; Trydal, T; Kristensen, S R; Sørensen, H S
2015-02-07
Back Scatter Interferometry (BSI) has been proposed as a highly sensitive and versatile refractive index sensor usable for analytical detection of biomarker and protein interactions in solution. However, the existing literature on BSI lacks a physical explanation of why protein interactions in general should contribute to the BSI signal. We have established a BSI system to investigate this subject in further detail. We contribute a thorough analysis of the robustness of the sensor, including unwanted contributions to the interferometric signal caused by temperature variation and dissolved gasses. We report an effective minimum detectability of refractive index at the 10⁻⁷ level. Long-term stability was examined by simultaneously monitoring the temperature inside the capillary, revealing an average drift of 2.0 × 10⁻⁷ per hour. Finally, we show that measurements on protein A incubated with immunoglobulin G do not result in a signal that can be attributed to binding affinities, as otherwise claimed in the literature.
Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home
Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.
2010-01-01
PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398
Ethnographic process evaluation in primary care: explaining the complexity of implementation.
Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine
2014-12-05
The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.
Karim, Rashed; Bhagirath, Pranav; Claus, Piet; James Housden, R; Chen, Zhong; Karimaghaloo, Zahra; Sohn, Hyon-Mok; Lara Rodríguez, Laura; Vera, Sergio; Albà, Xènia; Hennemuth, Anja; Peitgen, Heinz-Otto; Arbel, Tal; Gonzàlez Ballester, Miguel A; Frangi, Alejandro F; Götte, Marco; Razavi, Reza; Schaeffter, Tobias; Rhode, Kawal
2016-05-01
Studies have demonstrated the feasibility of late Gadolinium enhancement (LGE) cardiovascular magnetic resonance (CMR) imaging for guiding the management of patients with sequelae to myocardial infarction, such as ventricular tachycardia and heart failure. Clinical implementation of these developments necessitates a reproducible and reliable segmentation of the infarcted regions. It is challenging to compare new algorithms for infarct segmentation in the left ventricle (LV) with existing algorithms. Benchmarking datasets with evaluation strategies are much needed to facilitate comparison. This manuscript presents a benchmarking evaluation framework for future algorithms that segment infarct from LGE CMR of the LV. The image database consists of 30 LGE CMR images of both humans and pigs that were acquired from two separate imaging centres. A consensus ground truth was obtained for all data using maximum likelihood estimation. Six widely-used fixed-thresholding methods and five recently developed algorithms are tested on the benchmarking framework. Results demonstrate that the algorithms have better overlap with the consensus ground truth than most of the n-SD fixed-thresholding methods, with the exception of the Full-Width-at-Half-Maximum (FWHM) fixed-thresholding method. Some of the pitfalls of fixed thresholding methods are demonstrated in this work. The benchmarking evaluation framework, which is a contribution of this work, can be used to test and benchmark future algorithms that detect and quantify infarct in LGE CMR images of the LV. The datasets, ground truth and evaluation code have been made publicly available through the website: https://www.cardiacatlas.org/web/guest/challenges. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
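For orientation, the fixed-thresholding baselines referenced above are simple intensity rules; a minimal sketch, assuming `lv` holds LV myocardium intensities and `remote` a remote healthy region (clinical implementations add masking and preprocessing not shown):

```python
# Minimal sketch of the fixed-thresholding baselines (assumptions: `lv` is an
# array of LV myocardium intensities, `remote` a remote healthy region;
# clinical implementations add masking and preprocessing not shown here).
import numpy as np

def nsd_mask(lv, remote, n=5):
    """n-SD rule: infarct = voxels brighter than remote mean + n * SD."""
    return lv > remote.mean() + n * remote.std()

def fwhm_mask(lv):
    """FWHM rule: infarct = voxels above half the maximum enhanced intensity."""
    return lv > 0.5 * lv.max()
```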
Potential improvements in turbofan engine fuel economy
NASA Technical Reports Server (NTRS)
Hines, R. W.; Gaffin, W. O.
1976-01-01
The method developed for initial evaluation of possible performance improvements in the NASA Aircraft Energy Efficiency Program, directed toward improving the fuel economy of turbofan engines, is outlined, and results of the evaluation of 100 candidate engine modifications are presented. The study indicates that fuel consumption improvements of as much as 5% may be possible in current JT3D, JT8D, and JT9D turbofan engines. Aerodynamic, thermodynamic, material, and structural advances are expected to yield fuel consumption improvements on the order of 10 to 15% in advanced turbofan engines, with the greatest improvement stemming from significantly higher cycle pressure ratios. Higher turbine temperature and fan bypass ratios are also expected to contribute to fuel conservation.
A Generalized Technique in Numerical Integration
NASA Astrophysics Data System (ADS)
Safouhi, Hassan
2018-02-01
Integration by parts is one of the most popular techniques in the analysis of integrals and is one of the simplest methods to generate asymptotic expansions of integral representations. The product of the technique is usually a divergent series formed from evaluating boundary terms; however, sometimes the remaining integral is also evaluated. Due to the successive differentiation and anti-differentiation required to form the series or the remaining integral, the technique is difficult to apply to problems more complicated than the simplest. In this contribution, we explore a generalized and formalized integration by parts to create equivalent representations to some challenging integrals. As a demonstrative archetype, we examine Bessel integrals, Fresnel integrals and Airy functions.
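Concretely, for an oscillatory integral each pass of integration by parts peels off a boundary term one order higher in 1/ω plus a remaining integral, generating the (often divergent) boundary-term series mentioned above:

```latex
% One pass of integration by parts on an oscillatory integral, and the
% boundary-term asymptotic series obtained by iterating it.
\int_a^b f(x)\, e^{i\omega x}\, dx
  = \left[ \frac{f(x)\, e^{i\omega x}}{i\omega} \right]_a^b
    - \frac{1}{i\omega} \int_a^b f'(x)\, e^{i\omega x}\, dx
  \;\sim\; \sum_{n=0}^{\infty} \frac{(-1)^n}{(i\omega)^{n+1}}
    \left[ f^{(n)}(x)\, e^{i\omega x} \right]_a^b ,
\qquad \omega \to \infty .
```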
Conceptual Knowledge Acquisition in Biomedicine: A Methodological Review
Payne, Philip R.O.; Mendonça, Eneida A.; Johnson, Stephen B.; Starren, Justin B.
2007-01-01
The use of conceptual knowledge collections or structures within the biomedical domain is pervasive, spanning a variety of applications including controlled terminologies, semantic networks, ontologies, and database schemas. A number of theoretical constructs and practical methods or techniques support the development and evaluation of conceptual knowledge collections. This review will provide an overview of the current state of knowledge concerning conceptual knowledge acquisition, drawing from multiple contributing academic disciplines such as biomedicine, computer science, cognitive science, education, linguistics, semiotics, and psychology. In addition, multiple taxonomic approaches to the description and selection of conceptual knowledge acquisition and evaluation techniques will be proposed in order to partially address the apparent fragmentation of the current literature concerning this domain. PMID:17482521
Mutations In Rare Ataxia Genes Are Uncommon Causes of Sporadic Cerebellar Ataxia
Fogel, Brent L.; Lee, Ji Yong; Lane, Jessica; Wahnich, Amanda; Chan, Sandy; Huang, Alden; Osborn, Greg E.; Klein, Eric; Mamah, Catherine; Perlman, Susan; Geschwind, Daniel H.; Coppola, Giovanni
2012-01-01
BACKGROUND Sporadic-onset ataxia is common in a tertiary care setting but a significant percentage remains unidentified despite extensive evaluation. Rare genetic ataxias, reported only in specific populations or families, may contribute to a percentage of sporadic ataxia. METHODS Patients with adult-onset sporadic ataxia, who tested negative for common genetic ataxias (SCA1, SCA2, SCA3, SCA6, SCA7, and/or Friedreich ataxia), were evaluated using a stratified screening approach for variants in seven rare ataxia genes. RESULTS We screened patients for published mutations in SYNE1 (n=80) and TGM6 (n=118), copy number variations in LMNB1 (n=40) and SETX (n=11), sequence variants in SACS (n=39) and PDYN (n=119), and the pentanucleotide insertion of spinocerebellar ataxia type 31 (n=101). Overall, we identified one patient with a LMNB1 duplication, one patient with a PDYN variant, and one compound SACS heterozygote, including a novel variant. CONCLUSIONS The rare genetic ataxias examined here do not significantly contribute to sporadic cerebellar ataxia in our tertiary care population. PMID:22287014
The threshold hypothesis: solving the equation of nurture vs nature in type 1 diabetes.
Wasserfall, C; Nead, K; Mathews, C; Atkinson, M A
2011-09-01
For more than 40 years, the contributions of nurture (i.e. the environment) and nature (i.e. genetics) have been touted for their aetiological importance in type 1 diabetes. Disappointingly, knowledge gains in these areas, while individually successful, have to a large extent occurred in isolation from each other. One reason underlying this divide is the lack of a testable model that simultaneously considers the contributions of genetic and environmental determinants in the formation of this and potentially other disorders that are subject to these variables. To address this void, we have designed a model based on the hypothesis that the aetiological influences of genetics and environment, when evaluated as intersecting and reciprocal trend lines based on odds ratios, result in a method of concurrently evaluating both facets and defining the attributable risk of clinical onset of type 1 diabetes. The model, which we have elected to term the 'threshold hypothesis', also provides a novel means of conceptualising the complex interactions of nurture with nature in type 1 diabetes across various geographical populations.
Wiegers, Susan E; Houser, Steven R; Pearson, Helen E; Untalan, Ann; Cheung, Joseph Y; Fisher, Susan G; Kaiser, Larry R; Feldman, Arthur M
2015-08-01
Academic medical centers are faced with increasing budgetary constraints due to a flat National Institutes of Health budget, lower reimbursements for clinical services, higher costs of technology including informatics and a changing competitive landscape. As such, institutional stakeholders are increasingly asking whether resources are allocated appropriately and whether there are objective methods for measuring faculty contributions and engagement. The complexities of translational research can be particularly challenging when trying to assess faculty contributions because of team science. For over a decade, we have used an objective scoring system called the Matrix to assess faculty productivity and engagement in four areas: research, education, scholarship, and administration or services. The Matrix was developed to be dynamic, quantitative, and able to insure that a fully engaged educator would have a Matrix score that was comparable to a fully engaged investigator. In this report, we present the Matrix in its current form in order to provide a well-tested objective system of performance evaluation for nonclinical faculty to help academic leaders in decision making. © 2015 Wiley Periodicals, Inc.
Valavanis, Ioannis K; Mougiakakou, Stavroula G; Grimaldi, Keith A; Nikita, Konstantina S
2010-09-08
Obesity is a multifactorial trait that constitutes an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and to identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject, a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used for the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify, among the initial 63 variables describing genetic variations, nutrition and gender, the most important factors for classifying a subject into one of the BMI-related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold cross-validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under the receiver operating characteristic curve were used to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy of 61.46% on the 3-CV testing sets. The ANN-based methods revealed factors that interactively contribute to the obesity trait and provided predictive models with a promising generalization ability. In general, the results showed that ANNs and their hybrids can provide useful tools for the study of complex traits in the context of nutrigenetics.
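A minimal sketch of the evaluation protocol — a feed-forward ANN assessed with 3-fold cross-validation — assuming `X` holds the 63 factors and `y` the BMI class; the paper's PDM and GA-based feature-selection hybrids are not reproduced here.

```python
# Sketch of the 3-fold cross-validated ANN evaluation (assumptions: X holds
# the 63 gender/genetic/nutrition factors, y the normal/overweight class;
# the paper's PDM and GA-ANN feature-selection hybrids are not reproduced).
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_validate

def evaluate_ann(X, y):
    ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    scores = cross_validate(ann, X, y, cv=3, scoring=("accuracy", "roc_auc"))
    return {k: v.mean() for k, v in scores.items() if k.startswith("test_")}
```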
Chan, Linda; Mackintosh, Jeannie; Dobbins, Maureen
2017-09-28
The National Collaborating Centre for Methods and Tools (NCCMT) offers workshops and webinars to build public health capacity for evidence-informed decision-making. Despite positive feedback for NCCMT workshops and resources, NCCMT users found key terms used in research papers difficult to understand. The Understanding Research Evidence (URE) videos use plain language, cartoon visuals, and public health examples to explain complex research concepts. The videos are posted on the NCCMT website and YouTube channel. The first four videos in the URE web-based video series, which explained odds ratios (ORs), confidence intervals (CIs), clinical significance, and forest plots, were evaluated. The evaluation examined how the videos affected public health professionals' practice. A mixed-methods approach was used to examine the delivery mode and the content of the videos. Specifically, the evaluation explored (1) whether the videos were effective at increasing knowledge on the four video topics, (2) whether public health professionals were satisfied with the videos, and (3) how public health professionals applied the knowledge gained from the videos in their work. A three-part evaluation was conducted to determine the effectiveness of the first four URE videos. The evaluation included a Web-based survey, telephone interviews, and pretests and posttests, which evaluated public health professionals' experience with the videos and how the videos affected their public health work. Participants were invited to participate in this evaluation through various open-access public health email lists, through informational flyers and posters at the Canadian Public Health Association (CPHA) conference, and through targeted recruitment to NCCMT's network. In the Web-based surveys (n=46), participants achieved higher scores on the knowledge assessment questions from watching the OR (P=.04), CI (P=.04), and clinical significance (P=.05) videos, but not the forest plot (P=.12) video, as compared with participants who had not watched the videos. The pretests and posttests (n=124) demonstrated that participants had a better understanding of forest plots (P<.001) and CIs (P<.001) after watching the videos. Due to small sample sizes, there were insufficient pretest and posttest data to conduct meaningful analyses on the clinical significance and OR videos. Telephone interview participants (n=18) thought the videos' use of animation, narration, and plain language was appropriate for people with different levels of understanding and learning styles. Participants felt that by increasing their understanding of research evidence, they could develop better interventions and design evaluations to measure the impact of public health initiatives. Overall, the results of the evaluation showed that watching the videos resulted in an increase in knowledge, and participants had an overall positive experience with the URE videos. With increased competence in using the best available evidence, professionals are empowered to contribute to decisions that can improve the health outcomes of communities. ©Linda Chan, Jeannie Mackintosh, Maureen Dobbins. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.09.2017.
Concept and numerical simulations of a reactive anti-fragment armour layer
NASA Astrophysics Data System (ADS)
Hušek, Martin; Kala, Jiří; Král, Petr; Hokeš, Filip
2017-07-01
The contribution describes the concept and numerical simulation of a ballistic protective layer which is able to actively resist projectiles or smaller colliding fragments flying at high speed. The principle of the layer was designed on the basis of the action/reaction system of reactive armour which is used for the protection of armoured vehicles. As the designed ballistic layer consists of steel plates simultaneously combined with explosive material - primary explosive and secondary explosive - the technique of coupling the Finite Element Method with Smoothed Particle Hydrodynamics was used for the simulations. Certain standard situations which the ballistic layer should resist were simulated. The contribution describes the principles for the successful execution of numerical simulations, their results, and an evaluation of the functionality of the ballistic layer.
Translational Educational Research
Issenberg, S. Barry; Cohen, Elaine R.; Barsuk, Jeffrey H.; Wayne, Diane B.
2012-01-01
Medical education research contributes to translational science (TS) when its outcomes not only impact educational settings, but also downstream results, including better patient-care practices and improved patient outcomes. Simulation-based medical education (SBME) has demonstrated its role in achieving such distal results. Effective TS also encompasses implementation science, the science of health-care delivery. Educational, clinical, quality, and safety goals can only be achieved by thematic, sustained, and cumulative research programs, not isolated studies. Components of an SBME TS research program include motivated learners, curriculum grounded in evidence-based learning theory, educational resources, evaluation of downstream results, a productive research team, rigorous research methods, research resources, and health-care system acceptance and implementation. National research priorities are served from translational educational research. National funding priorities should endorse the contribution and value of translational education research. PMID:23138127
Aygün, Nurcihan; Uludağ, Mehmet; İşgör, Adnan
2017-01-01
Objective We evaluated the contribution of intraoperative neuromonitoring to the visual and functional identification of the external branch of the superior laryngeal nerve. Material and Methods The prospectively collected data of patients who underwent thyroid surgery with intraoperative neuromonitoring for external branch of the superior laryngeal nerve exploration were assessed retrospectively. The surface endotracheal tube-based Medtronic NIM3 intraoperative neuromonitoring device was used. The function of the external branch of the superior laryngeal nerve was evaluated by the cricothyroid muscle twitch. In addition, the contribution of the external branch of the superior laryngeal nerve to vocal cord adduction was evaluated using electromyographic records. Results The study included data of 126 (female, 103; male, 23) patients undergoing thyroid surgery, with a mean age of 46.2±12.2 years (range, 18–75 years), and 215 neck sides were assessed. Two hundred and one (93.5%) of 215 external branches of the superior laryngeal nerve were identified, of which 60 (27.9%) were identified visually before being stimulated with a monopolar stimulator probe. Eighty-nine (41.4%) external branches of the superior laryngeal nerve were identified visually after being identified with a probe. Although 52 (24.1%) external branches of the superior laryngeal nerve were identified with a probe, they were not visualized. Intraoperative neuromonitoring provided a significant contribution to the visual (p<0.001) and functional (p<0.001) identification of the external branch of the superior laryngeal nerve. Additionally, positive electromyographic responses were recorded from 160 external branches of the superior laryngeal nerve (74.4%). Conclusion Intraoperative neuromonitoring provides an important contribution to the visual and functional identification of the external branch of the superior laryngeal nerve. We believe that whether the external branch of the superior laryngeal nerve is at risk cannot be predicted and the nerve is often invisible; thus, intraoperative neuromonitoring may routinely be used in superior pole dissection. The glottic electromyography response obtained via external branch of the superior laryngeal nerve stimulation provides quantifiable information in addition to simple visualization of the cricothyroid muscle twitch. PMID:28944328
NASA Astrophysics Data System (ADS)
Mazúr, P.; Mrlík, J.; Beneš, J.; Pocedič, J.; Vrána, J.; Dundálek, J.; Kosek, J.
2018-03-01
In our contribution we study the electrocatalytic effect of oxygen functionalization of thermally treated graphite felt on the kinetics of the electrode reactions of a vanadium redox flow battery. Chemical and morphological changes of the felts are analysed by standard physico-chemical characterization techniques. A complex four-point method is developed and employed for characterization of the felts in a laboratory single-cell. The method is based on electrochemical impedance spectroscopy and load curve measurements of the positive and negative half-cells using platinum wire pseudo-reference electrodes. The distribution of ohmic and faradaic losses within a single-cell is evaluated for both symmetric and asymmetric electrode set-ups with respect to the treatment conditions. A positive effect of oxygen functionalization is observed only for the negative electrode, whereas the kinetics of the positive electrode reaction is almost unaffected by the treatment. This is in contradiction with the results of the typically employed cyclovoltammetric characterization, which indicate that both electrodes are enhanced by the treatment to a similar extent. The developed four-point characterization method can further be used, e.g., for component screening and in-situ durability studies on single-cell scale redox flow batteries of various chemistries.
A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Lopes, Leonard V.
2017-01-01
More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.
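A toy illustration of the efficiency argument: a central-difference gradient costs two objective evaluations per design variable, whereas an analytic derivative is exact and costs roughly one pass. Here `objective` is only a stand-in for the community-noise metric, not the paper's model chain.

```python
# Toy comparison (assumption: `objective` merely stands in for the chain of
# certification-noise models; the real objective is far more involved).
import numpy as np

def objective(x):
    return np.sum(np.sin(x) * x**2)  # smooth toy objective

def analytic_grad(x):
    return np.cos(x) * x**2 + 2.0 * x * np.sin(x)  # exact derivative

def central_diff_grad(x, h=1e-6):
    """Two objective evaluations per design variable."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (objective(x + e) - objective(x - e)) / (2.0 * h)
    return g

x = np.linspace(0.1, 1.0, 5)
print(np.max(np.abs(central_diff_grad(x) - analytic_grad(x))))  # small FD error
```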
Governance for public health and health equity: The Trøndelag model for public health work.
Lillefjell, Monica; Magnus, Eva; Knudtsen, Margunn SkJei; Wist, Guri; Horghagen, Sissel; Espnes, Geir Arild; Maass, Ruca; Anthun, Kirsti Sarheim
2018-06-01
Multi-sectoral governance of population health is linked to the realization that health is the property of many societal systems. This study aims to contribute knowledge and methods that can strengthen the capacity of municipalities to work more systematically, knowledge-based and multi-sectorally in promoting health and health equity in the population. Process evaluation was conducted, applying a mixed-methods research design combining qualitative and quantitative data collection methods. Processes strengthening systematic and multi-sectoral development, implementation and evaluation of research-based measures to promote health, quality of life, and health equity in, for and with municipalities were revealed. A step-by-step model has been developed that emphasizes the promotion of knowledge-based, systematic, multi-sectoral public health work, as well as joint ownership of local resources, initiatives and policies. Implementation of systematic, knowledge-based and multi-sectoral governance of public health measures in municipalities demands a shared understanding of the challenges, an updated overview of population health and impact factors, anchoring in plans, new skills and methods for the selection and implementation of measures, as well as the development of trust, ownership, and shared ethics and goals among those involved.
Lower-upper-threshold correlation for underwater range-gated imaging self-adaptive enhancement.
Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; He, Jun; Fan, Songtao; Zhou, Yan; Liu, Yuliang
2016-10-10
In underwater range-gated imaging (URGI), enhancement of low-brightness and low-contrast images is critical for human observation. Traditional histogram equalizations over-enhance images, with the result that details are lost. To suppress over-enhancement, a lower-upper-threshold correlation method is proposed for underwater range-gated imaging self-adaptive enhancement based on double-plateau histogram equalization. The lower threshold determines image details and compresses over-enhancement, and it is correlated with the upper threshold. First, the upper threshold is updated by searching for the local maximum in real time, and then the lower threshold is calculated from the upper threshold and the number of nonzero units selected from a filtered histogram. With this method, the backgrounds of underwater images are constrained while details are enhanced. Finally, proof-of-concept experiments are performed. Peak signal-to-noise ratio, variance, contrast, and human visual properties are used to evaluate the objective quality of the global images and of regions of interest. The evaluation results demonstrate that the proposed method adaptively selects proper upper and lower thresholds under different conditions. The proposed method contributes to URGI with effective image enhancement for human eyes.
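A simplified sketch of double-plateau equalization with correlated thresholds for an 8-bit image; the paper's real-time local-maximum search and filtered-histogram details are replaced by simple proxies, so treat this as illustrative only.

```python
# Illustrative sketch for an 8-bit grayscale image (assumptions: the median of
# the nonzero bins is used as a proxy for the paper's real-time local-maximum
# upper-threshold search, and the lower threshold is tied to the upper
# threshold and the nonzero-bin count in the spirit of the method).
import numpy as np

def double_plateau_equalize(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    upper = np.median(hist[hist > 0])               # proxy for local-maximum search
    lower = upper * np.count_nonzero(hist) / 256.0  # correlated lower threshold
    clipped = np.where(hist > 0, np.clip(hist, lower, upper), 0.0)
    cdf = np.cumsum(clipped)
    lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]
```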
Application of the finite element method in orthopedic implant design.
Saha, Subrata; Roychowdhury, Amit
2009-01-01
The finite element method (FEM) was first introduced to the field of orthopedic biomechanics in the early 1970s to evaluate stresses in human bones. By the early 1980s, the method had become well established as a tool for basic research and design analysis. Since the late 1980s and early 1990s, FEM has also been used to study bone remodeling. Today, it is one of the most reliable simulation tools for evaluating wear, fatigue, crack propagation, and so forth, and is used in many types of preoperative testing. Since the introduction of FEM to orthopedic biomechanics, there have been rapid advances in computer processing speeds, the finite element and other numerical methods, understanding of mechanical properties of soft and hard tissues and their modeling, and image-processing techniques. In light of these advances, it is accepted today that FEM will continue to contribute significantly to further progress in the design and development of orthopedic implants, as well as in the understanding of other complex systems of the human body. In the following article, different main application areas of finite element simulation will be reviewed including total hip joint arthroplasty, followed by the knee, spine, shoulder, and elbow, respectively.
Method for the measurement of susceptibility to decubitus ulcer formation.
Meijer, J H; Schut, G L; Ribbe, M W; Goovaerts, H G; Nieuwenhuys, R; Reulen, J P; Schneider, H
1989-09-01
A method for measuring the susceptibility of a patient to develop decubitus ulcers is described and initially evaluated. It is based on an indirect, noninvasive measurement of the transient regional blood flow response after a test pressure load which simulates the external stimulus for pressure-sore formation. This method was developed to determine the individual risk of a patient and to study the subfactors which contribute to the susceptibility. This would also offer the possibility of evaluating the effect of preventive treatment aimed at reducing the susceptibility. The method was found to discriminate between preselected elderly patients at risk on the one hand, and non-risk patients and healthy young adults on the other hand. No differences in blood flow responses were found between the non-risk elderly patients and the healthy young adults. This suggests that age per se is not a factor in the formation of pressure sores. In the risk group the recovery time after pressure relief was found to be three times as long as the duration of the pressure exercise. This indicates that the recovery time after pressure exercise may be as important as the period of pressure exercise in deducing the risk of developing decubitus ulcers.
A GPU-Accelerated Parameter Interpolation Thermodynamic Integration Free Energy Method.
Giese, Timothy J; York, Darrin M
2018-03-13
There has been a resurgence of interest in free energy methods motivated by the performance enhancements offered by molecular dynamics (MD) software written for specialized hardware, such as graphics processing units (GPUs). In this work, we exploit the properties of a parameter-interpolated thermodynamic integration (PI-TI) method to connect states by their molecular mechanical (MM) parameter values. This pathway is shown to be better behaved for Mg²⁺ → Ca²⁺ transformations than traditional linear alchemical pathways (with and without soft-core potentials). The PI-TI method has the practical advantage that no modification of the MD code is required to propagate the dynamics, and unlike with linear alchemical mixing, only one electrostatic evaluation is needed (e.g., single call to particle-mesh Ewald) leading to better performance. In the case of AMBER, this enables all the performance benefits of GPU-acceleration to be realized, in addition to unlocking the full spectrum of features available within the MD software, such as Hamiltonian replica exchange (HREM). The TI derivative evaluation can be accomplished efficiently in a post-processing step by reanalyzing the statistically independent trajectory frames in parallel for high throughput. We also show how one can evaluate the particle mesh Ewald contribution to the TI derivative evaluation without needing to perform two reciprocal space calculations. We apply the PI-TI method with HREM on GPUs in AMBER to predict pKa values in double-stranded RNA molecules and make comparison with experiments. Convergence to under 0.25 units for these systems required 100 ns or more of sampling per window and coupling of windows with HREM. We find that MM charges derived from ab initio QM/MM fragment calculations improve the agreement between calculation and experimental results.
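The identity underlying the TI derivative evaluation is the standard thermodynamic integration formula, with the PI-TI pathway defining U(λ) by interpolating the MM parameters between the two end states:

```latex
% Thermodynamic integration identity underlying the TI derivative evaluation;
% in PI-TI, U(\lambda) is defined by interpolating the MM parameters
% (e.g., from the Mg^{2+} to the Ca^{2+} parameter set) along \lambda.
\Delta A = \int_0^1 \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} d\lambda
```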
Development of in-house serological methods for diagnosis and surveillance of chikungunya.
Galo, Saira Saborío; González, Karla; Téllez, Yolanda; García, Nadezna; Pérez, Leonel; Gresh, Lionel; Harris, Eva; Balmaseda, Ángel
2017-08-21
To develop and evaluate serological methods for chikungunya diagnosis and research in Nicaragua. Two IgM ELISA capture systems (MAC-ELISA) for diagnosis of acute chikungunya virus (CHIKV) infections, and two Inhibition ELISA Methods (IEM) to measure total antibodies against CHIKV, were developed using monoclonal antibodies (mAbs) and hyperimmune serum at the National Virology Laboratory of Nicaragua in 2014-2015. The sensitivity, specificity, predictive values, and agreement of the MAC-ELISAs were obtained by comparing the results of 198 samples (116 positive; 82 negative) with the Centers for Disease Control and Prevention's IgM ELISA (Atlanta, Georgia, United States; CDC-MAC-ELISA). For clinical evaluation of the four serological techniques, 260 paired acute- and convalescent-phase serum samples of suspected chikungunya cases were used. All four assays were standardized by determining the optimal concentrations of the different reagents. Processing times were substantially reduced compared to the CDC-MAC-ELISA. For the MAC-ELISA systems, a sensitivity of 96.6% and 97.4%, and a specificity of 98.8% and 91.5% were obtained using mAb and hyperimmune serum, respectively, compared with the CDC method. Clinical evaluation of the four serological techniques versus the CDC real-time RT-PCR assay resulted in a sensitivity of 95.7% and a specificity of 88.8%-95.9%. Two MAC-ELISA and two IEM systems were standardized, demonstrating very good performance for chikungunya diagnosis and research needs. This will enable more efficient epidemiological surveillance in Nicaragua, the first country in Central America to produce its own reagents for serological diagnosis of CHIKV. The methods evaluated here can be applied in other countries and will contribute to sustainable diagnostic systems to combat the disease.
Meyer, Georg F.; Wong, Li Ting; Timson, Emma; Perfect, Philip; White, Mark D.
2012-01-01
We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models to quantify human performance in simulated environments that have emerged from research in multisensory perception provide a framework for the objective evaluation of the contribution of individual cues to human performance measures of fidelity. We present three examples in a flight simulation environment as a case study: Experiment 1: Detection and categorisation of auditory and kinematic motion cues; Experiment 2: Performance evaluation in a target-tracking task; Experiment 3: Transferrable learning of auditory motion cues. We show how the contribution of individual cues to human performance can be robustly evaluated for each task and that the contribution is highly task dependent. The same auditory cues that can be discriminated and are optimally integrated in experiment 1, do not contribute to target-tracking performance in an in-flight refuelling simulation without training, experiment 2. In experiment 3, however, we demonstrate that the auditory cue leads to significant, transferrable, performance improvements with training. We conclude that objective fidelity evaluation requires a task-specific analysis of the contribution of individual cues. PMID:22957068
Modeling of the coupled magnetospheric and neutral wind dynamos
NASA Technical Reports Server (NTRS)
Thayer, Jeffrey P.
1994-01-01
This report summarizes the progress made in the first year of NASA Grant No. NAGW-3508 entitled 'Modeling of the Coupled Magnetospheric and Neutral Wind Dynamos.' The approach taken has been to impose magnetospheric boundary conditions with either pure voltage or current characteristics and solve the neutral wind dynamo equation under these conditions. The imposed boundary conditions determine whether the neutral wind dynamo will contribute to the high-latitude current system or the electric potential. The semi-annual technical report, dated December 15, 1993, provides further detail describing the scientific and numerical approach of the project. The numerical development has progressed and the dynamo solution for the case when the magnetosphere acts as a voltage source has been evaluated completely using spectral techniques. The simulation provides the field-aligned current distribution at high latitudes due to the neutral wind dynamo. A number of geophysical conditions can be simulated to evaluate the importance of the neutral wind dynamo contribution to the field-aligned current system. On average, field-aligned currents generated by the neutral wind dynamo contributed as much as 30 percent to the large-scale field-aligned current system driven by the magnetosphere. A term analysis of the high-latitude neutral wind dynamo equation describing the field aligned current distribution has also been developed to illustrate the important contributing factors involved in the process. The case describing the neutral dynamo response for a magnetosphere acting as a pure current generator requires the existing spectral code to be extended to a pseudo-spectral method and is currently under development.
Two laboratory methods for the calibration of GPS speed meters
NASA Astrophysics Data System (ADS)
Bai, Yin; Sun, Qiao; Du, Lei; Yu, Mei; Bai, Jie
2015-01-01
The set-ups of two calibration systems are presented to investigate calibration methods for GPS speed meters. The GPS speed meter under calibration is a special type of high-accuracy speed meter for vehicles, which uses Doppler demodulation of GPS signals to calculate the measured speed of a moving target. Three experiments are performed: simulated calibration, field-test signal replay calibration, and an in-field comparison with an optical speed meter. The experiments are conducted at specific speeds in the range of 40–180 km h⁻¹ with the same GPS speed meter as the device under calibration. The evaluation of measurement results validates both methods for calibrating GPS speed meters. The relative deviations between the measurement results of the GPS-based high accuracy speed meter and those of the optical speed meter are analyzed, and the equivalent uncertainty of the comparison is evaluated. The comparison results justify the utilization of GPS speed meters as reference equipment if no fewer than seven satellites are available. This study contributes to the widespread use of GPS-based high accuracy speed meters as legal reference equipment in traffic speed metrology.
Contini, Erika W; Wardle, Susan G; Carlson, Thomas A
2017-10-01
Visual object recognition is a complex, dynamic process. Multivariate pattern analysis methods, such as decoding, have begun to reveal how the brain processes complex visual information. Recently, temporal decoding methods for EEG and MEG have offered the potential to evaluate the temporal dynamics of object recognition. Here we review the contribution of M/EEG time-series decoding methods to understanding visual object recognition in the human brain. Consistent with the current understanding of the visual processing hierarchy, low-level visual features dominate decodable object representations early in the time-course, with more abstract representations related to object category emerging later. A key finding is that the time-course of object processing is highly dynamic and rapidly evolving, with limited temporal generalisation of decodable information. Several studies have examined the emergence of object category structure, and we consider to what degree category decoding can be explained by sensitivity to low-level visual features. Finally, we evaluate recent work attempting to link human behaviour to the neural time-course of object processing. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Guijun; Yang, Hao; Jin, Xiuliang; Pignatti, Stefano; Casa, Raffaele; Silvestro, Paolo Cosmo
2016-08-01
Drought is among the most costly natural disasters in China and worldwide, so it is important to evaluate drought-induced crop yield losses and to improve water use efficiency at the regional scale. First, crop biomass was estimated through the combined use of Synthetic Aperture Radar (SAR) and optical remote sensing data. The estimated biophysical variable was then assimilated into a crop growth model (FAO AquaCrop) by the Particle Swarm Optimization (PSO) method, from the farmland scale to the regional scale. At the farmland scale, the most important crop parameters of the AquaCrop model were identified in order to reduce the number of parameters used in the assimilation procedure; the Extended Fourier Amplitude Sensitivity Test (EFAST) method was used to assess the contribution of each crop parameter to the model output. Moreover, the AquaCrop model was calibrated using experimental data from Xiaotangshan, Beijing. At the regional scale, the methods were applied spatially and validated in the rural area of Yangling, Shaanxi Province, in 2014. This study provides guidance for irrigation decisions that balance water consumption against yield loss.
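AquaCrop is a full crop simulator and cannot be reproduced here; to make the assimilation step concrete, the sketch below runs a generic particle swarm optimization against a hypothetical logistic biomass curve standing in for AquaCrop, adjusting the sensitive parameters until the simulated biomass matches the remotely sensed series. The toy model, parameter names, and swarm settings are illustrative assumptions; only the PSO update rule itself is standard.

import numpy as np

rng = np.random.default_rng(0)

def toy_biomass_model(params, days):
    # Hypothetical logistic stand-in for AquaCrop biomass output.
    b_max, rate, t0 = params
    return b_max / (1.0 + np.exp(-rate * (days - t0)))

def pso_assimilate(obs, days, bounds, n_particles=30, n_iter=100,
                   w=0.7, c1=1.5, c2=1.5):
    # Minimize the mismatch between simulated and observed biomass.
    lo, hi = np.array(bounds).T
    cost = lambda p: np.mean((toy_biomass_model(p, days) - obs) ** 2)
    pos = rng.uniform(lo, hi, (n_particles, len(lo)))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, *pos.shape))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        c = np.array([cost(p) for p in pos])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], c[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest   # parameter set whose simulation best fits the observations

In the study's setting, the observations would be the SAR/optical biomass estimates, the model AquaCrop, and the searched parameters those identified as most influential by the EFAST sensitivity analysis.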
Franssen, Frits; van Andel, Esther; Swart, Arno; van der Giessen, Joke
2016-02-01
The performance of a 400-μm-mesh-size sieve (sieve400) has not previously been compared with that of a 180-μm-mesh-size sieve (sieve180). Using pork samples spiked with 0 to 10 Trichinella muscle larvae and an artificial digestion method, sieve performance was evaluated for control of Trichinella in meat-producing animals. The use of a sieve400 resulted in 12% lower larval counts, 147% more debris, and 28% longer counting times compared with the use of a sieve180. Although no false-negative results were obtained, prolonged counting times with the sieve400 may have an impact on performance in a high-throughput environment such as a slaughterhouse laboratory. Based on our results, the sieve180 remains the sieve of choice for Trichinella control in meat in slaughterhouse laboratories, according to the European Union reference method (European Commission regulation 2075/2005). Furthermore, the results of the present study contribute to the discussion of harmonization of meat inspection requirements among countries.
NASA Astrophysics Data System (ADS)
Klomp, Sander; van der Sommen, Fons; Swager, Anne-Fré; Zinger, Svitlana; Schoon, Erik J.; Curvers, Wouter L.; Bergman, Jacques J.; de With, Peter H. N.
2017-03-01
Volumetric Laser Endomicroscopy (VLE) is a promising technique for the detection of early neoplasia in Barrett's Esophagus (BE). VLE generates hundreds of high-resolution, grayscale, cross-sectional images of the esophagus. At present, however, classifying these images is a time-consuming and cumbersome task performed by an expert using a clinical prediction model. This paper explores the feasibility of using computer vision techniques to accurately predict the presence of dysplastic tissue in VLE BE images. Our contribution is threefold. First, widely applied machine learning techniques and feature extraction methods are benchmarked. Second, three new features based on the clinical detection model are proposed, offering superior classification accuracy and speed compared to earlier work. Third, automated parameter tuning is evaluated by applying simple grid search and feature selection methods. The results are evaluated on a clinically validated dataset of 30 dysplastic and 30 non-dysplastic VLE images. Optimal classification accuracy is obtained with a support vector machine using our modified Haralick features and optimal image cropping, yielding an area under the receiver operating characteristic curve of 0.95, compared to 0.81 for the clinical prediction model. Optimal execution time is achieved using the proposed mean and median features, which are extracted at least a factor of 2.5 faster than alternative features with comparable performance.
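The paper's modified Haralick features and clinical dataset are not reproduced in this abstract; as a hedged illustration of the general pipeline it describes (texture features plus a support vector machine), the sketch below extracts standard GLCM-based Haralick-style statistics with scikit-image and scores an SVM by cross-validated AUC. The quantization level, GLCM offsets, chosen properties, and SVM settings are assumptions, not the paper's configuration. (Recent scikit-image versions spell the functions graycomatrix/graycoprops; older versions use 'grey'.)

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def haralick_style_features(img, levels=32):
    # Quantize to few gray levels so the co-occurrence matrix is well populated.
    q = (img.astype(float) / img.max() * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ('contrast', 'homogeneity', 'energy', 'correlation')
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def cross_validated_auc(images, labels):
    # images: list of 2-D grayscale arrays; labels: 0/1 (non-dysplastic/dysplastic)
    X = np.array([haralick_style_features(im) for im in images])
    clf = SVC(kernel='rbf', C=1.0)
    return cross_val_score(clf, X, labels, cv=5, scoring='roc_auc').mean()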