20 CFR 640.3 - Interpretation of Federal law requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 3 2014-04-01 2014-04-01 false Interpretation of Federal law requirements... STANDARD FOR BENEFIT PAYMENT PROMPTNESS-UNEMPLOYMENT COMPENSATION § 640.3 Interpretation of Federal law... require that a State law include provision for such methods of administration as will reasonably insure...
Interpretation of Radiological Images: Towards a Framework of Knowledge and Skills
ERIC Educational Resources Information Center
van der Gijp, A.; van der Schaaf, M. F.; van der Schaaf, I. C.; Huige, J. C. B. M.; Ravesloot, C. J.; van Schaik, J. P. J.; ten Cate, Th. J.
2014-01-01
The knowledge and skills that are required for radiological image interpretation are not well documented, even though medical imaging is gaining importance. This study aims to develop a comprehensive framework of knowledge and skills, required for two-dimensional and multiplanar image interpretation in radiology. A mixed-method study approach was…
A new method for the automatic interpretation of Schlumberger and Wenner sounding curves
Zohdy, A.A.R.
1989-01-01
A fast iterative method for the automatic interpretation of Schlumberger and Wenner sounding curves is based on obtaining interpreted depths and resistivities from shifted electrode spacings and adjusted apparent resistivities, respectively. The method is fully automatic. It does not require an initial guess of the number of layers, their thicknesses, or their resistivities; and it does not require extrapolation of incomplete sounding curves. The number of layers in the interpreted model equals the number of digitized points on the sounding curve. The resulting multilayer model is always well-behaved with no thin layers of unusually high or unusually low resistivities. For noisy data, interpretation is done in two sets of iterations (two passes). Anomalous layers, created because of noise in the first pass, are eliminated in the second pass. Such layers are eliminated by considering the best-fitting curve from the first pass to be a smoothed version of the observed curve and automatically reinterpreting it (second pass). The application of the method is illustrated by several examples. -Author
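The multiplicative adjustment loop described above can be sketched in code. This is a toy illustration, not Zohdy's published algorithm: `forward_model` below is a crude moving-average stand-in for a true Schlumberger/Wenner forward kernel (a real implementation evaluates the resistivity transform with linear digital filters), and the function names are invented for this sketch.

```python
import numpy as np

def forward_model(rho_layers):
    # Crude stand-in for a sounding-curve forward kernel: a smoothing
    # of the layer resistivities. Illustrative only.
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(rho_layers, kernel, mode="same")

def iterate_model(rho_obs, n_iter=50):
    # One layer per digitized point; the observed curve itself serves
    # as the initial model, so no starting guess is needed.
    rho = rho_obs.astype(float).copy()
    for _ in range(n_iter):
        rho_calc = forward_model(rho)
        rho *= rho_obs / rho_calc   # multiplicative correction
    return rho
```

Each iteration adjusts every layer resistivity by the ratio of observed to computed apparent resistivity, driving the computed curve toward the observed one without any explicit layer-count choice.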
28 CFR 36.309 - Examinations and courses.
Code of Federal Regulations, 2012 CFR
2012-07-01
... include taped examinations, interpreters or other effective methods of making orally delivered materials... qualified readers for individuals with visual impairments or learning disabilities, transcribers for... and services required by this section may include taped texts, interpreters or other effective methods...
Multi-Reader ROC studies with Split-Plot Designs: A Comparison of Statistical Methods
Obuchowski, Nancy A.; Gallas, Brandon D.; Hillis, Stephen L.
2012-01-01
Rationale and Objectives Multi-reader imaging trials often use a factorial design, where study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of the design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper we compare three methods of analysis for the split-plot design. Materials and Methods Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean ANOVA approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. Results The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% CIs falls close to the nominal coverage for small and large sample sizes. Conclusions The split-plot MRMC study design can be statistically efficient compared with the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal CI coverage, are available for this study design. PMID:23122570
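The type I error evaluation logic used in such simulation studies can be illustrated generically. This sketch is not the Roe-Metz model or any of the three MRMC methods; it only shows how an empirical type I error rate is estimated by repeatedly testing data generated under a true null (here a paired t-test with a fixed critical value):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_nominal, n_sims, n = 0.05, 2000, 50
rejections = 0
for _ in range(n_sims):
    # Null is true: both "modalities" have identical score distributions.
    d = rng.normal(size=n) - rng.normal(size=n)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    # Two-sided t critical value for df = 49 at alpha = 0.05 (~2.01).
    if abs(t) > 2.01:
        rejections += 1
empirical_alpha = rejections / n_sims
```

If the test is well calibrated, `empirical_alpha` lands near the nominal 0.05; values consistently below it indicate the slight conservatism the abstract reports.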
Using the MDCT thick slab MinIP method for the follow-up of pulmonary emphysema.
Lan, Hai; Nishitani, Hiromu; Nishihara, Sadamitsu; Ueno, Junji; Takao, Shoichiro; Iwamoto, Seiji; Kawanaka, Takashi; Mahmut, Mawlan; Qingge, Si
2011-08-01
The purpose of this study was to evaluate the usefulness of thick slab minimum intensity projection (MinIP) as a follow-up method in patients with pulmonary emphysema. This method was used to determine the presence or absence of changes over time in the lung field based on multi-detector-row CT (MDCT) data. Among patients diagnosed with pulmonary emphysema who underwent 16-MDCT (slice thickness, 1 mm) twice at an interval of 6 months or more, 12 patients without changes in the lung field and 14 with clear changes in the lung field were selected as subjects. An image interpretation experiment was performed by five image interpreters. Pulmonary emphysema was followed up using two types of thick slab MinIP (thick slab MinIP 1 and 2) and multi-planar reformation (MPR), and the results of image interpretation were evaluated by receiver operating characteristic (ROC) analysis. In addition, the time required for image interpretation was compared among the three follow-up methods. The area under the ROC curve (Az) was 0.794 for thick slab MinIP 1, 0.778 for the thick slab MinIP 2, and 0.759 for MPR, showing no significant differences among the three methods. Individual differences in each item were significantly more marked for MPR than for thick slab MinIP. The time required for image interpretation was around 18 seconds for thick slab MinIP 1, 11 seconds for thick slab MinIP 2, and approximately 127 seconds for MPR, showing significant differences among the three methods. There were no significant differences in the results of image interpretation regarding the presence or absence of changes in the lung fields between thick slab MinIP and MPR. However, thick slab MinIP showed a shorter image interpretation time and smaller individual differences in the results among image interpreters than MPR, suggesting the usefulness of this method for determining the presence or absence of changes with time in the lung fields of patients with pulmonary emphysema.
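The core projection step behind thick slab MinIP is simple to state. A minimal sketch, assuming the CT volume is a NumPy array of Hounsfield-unit values with axis 0 as the slice direction; the slab bounds and array sizes are illustrative, not the study's parameters:

```python
import numpy as np

# Toy volume, shape (slices, rows, cols); in the study these would be
# contiguous 1 mm MDCT slices grouped into a thick slab.
volume = np.random.default_rng(1).integers(-1000, 400, size=(20, 64, 64))

# Thick slab MinIP: project the minimum value along the slice axis.
# Low-attenuation (emphysematous) voxels dominate the projection,
# which is why MinIP suits emphysema follow-up.
slab = volume[5:15]
minip = slab.min(axis=0)
```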
Brownstein, Catherine A; Beggs, Alan H; Homer, Nils; Merriman, Barry; Yu, Timothy W; Flannery, Katherine C; DeChene, Elizabeth T; Towne, Meghan C; Savage, Sarah K; Price, Emily N; Holm, Ingrid A; Luquette, Lovelace J; Lyon, Elaine; Majzoub, Joseph; Neupert, Peter; McCallie, David; Szolovits, Peter; Willard, Huntington F; Mendelsohn, Nancy J; Temme, Renee; Finkel, Richard S; Yum, Sabrina W; Medne, Livija; Sunyaev, Shamil R; Adzhubey, Ivan; Cassa, Christopher A; de Bakker, Paul I W; Duzkale, Hatice; Dworzyński, Piotr; Fairbrother, William; Francioli, Laurent; Funke, Birgit H; Giovanni, Monica A; Handsaker, Robert E; Lage, Kasper; Lebo, Matthew S; Lek, Monkol; Leshchiner, Ignaty; MacArthur, Daniel G; McLaughlin, Heather M; Murray, Michael F; Pers, Tune H; Polak, Paz P; Raychaudhuri, Soumya; Rehm, Heidi L; Soemedi, Rachel; Stitziel, Nathan O; Vestecka, Sara; Supper, Jochen; Gugenmus, Claudia; Klocke, Bernward; Hahn, Alexander; Schubach, Max; Menzel, Mortiz; Biskup, Saskia; Freisinger, Peter; Deng, Mario; Braun, Martin; Perner, Sven; Smith, Richard J H; Andorf, Janeen L; Huang, Jian; Ryckman, Kelli; Sheffield, Val C; Stone, Edwin M; Bair, Thomas; Black-Ziegelbein, E Ann; Braun, Terry A; Darbro, Benjamin; DeLuca, Adam P; Kolbe, Diana L; Scheetz, Todd E; Shearer, Aiden E; Sompallae, Rama; Wang, Kai; Bassuk, Alexander G; Edens, Erik; Mathews, Katherine; Moore, Steven A; Shchelochkov, Oleg A; Trapane, Pamela; Bossler, Aaron; Campbell, Colleen A; Heusel, Jonathan W; Kwitek, Anne; Maga, Tara; Panzer, Karin; Wassink, Thomas; Van Daele, Douglas; Azaiez, Hela; Booth, Kevin; Meyer, Nic; Segal, Michael M; Williams, Marc S; Tromp, Gerard; White, Peter; Corsmeier, Donald; Fitzgerald-Butt, Sara; Herman, Gail; Lamb-Thrush, Devon; McBride, Kim L; Newsom, David; Pierson, Christopher R; Rakowsky, Alexander T; Maver, Aleš; Lovrečić, Luca; Palandačić, Anja; Peterlin, Borut; Torkamani, Ali; Wedell, Anna; Huss, Mikael; Alexeyenko, Andrey; Lindvall, Jessica M; Magnusson, Måns; Nilsson, 
Daniel; Stranneheim, Henrik; Taylan, Fulya; Gilissen, Christian; Hoischen, Alexander; van Bon, Bregje; Yntema, Helger; Nelen, Marcel; Zhang, Weidong; Sager, Jason; Zhang, Lu; Blair, Kathryn; Kural, Deniz; Cariaso, Michael; Lennon, Greg G; Javed, Asif; Agrawal, Saloni; Ng, Pauline C; Sandhu, Komal S; Krishna, Shuba; Veeramachaneni, Vamsi; Isakov, Ofer; Halperin, Eran; Friedman, Eitan; Shomron, Noam; Glusman, Gustavo; Roach, Jared C; Caballero, Juan; Cox, Hannah C; Mauldin, Denise; Ament, Seth A; Rowen, Lee; Richards, Daniel R; San Lucas, F Anthony; Gonzalez-Garay, Manuel L; Caskey, C Thomas; Bai, Yu; Huang, Ying; Fang, Fang; Zhang, Yan; Wang, Zhengyuan; Barrera, Jorge; Garcia-Lobo, Juan M; González-Lamuño, Domingo; Llorca, Javier; Rodriguez, Maria C; Varela, Ignacio; Reese, Martin G; De La Vega, Francisco M; Kiruluta, Edward; Cargill, Michele; Hart, Reece K; Sorenson, Jon M; Lyon, Gholson J; Stevenson, David A; Bray, Bruce E; Moore, Barry M; Eilbeck, Karen; Yandell, Mark; Zhao, Hongyu; Hou, Lin; Chen, Xiaowei; Yan, Xiting; Chen, Mengjie; Li, Cong; Yang, Can; Gunel, Murat; Li, Peining; Kong, Yong; Alexander, Austin C; Albertyn, Zayed I; Boycott, Kym M; Bulman, Dennis E; Gordon, Paul M K; Innes, A Micheil; Knoppers, Bartha M; Majewski, Jacek; Marshall, Christian R; Parboosingh, Jillian S; Sawyer, Sarah L; Samuels, Mark E; Schwartzentruber, Jeremy; Kohane, Isaac S; Margulies, David M
2014-03-25
There is tremendous potential for genome sequencing to improve clinical diagnosis and care once it becomes routinely accessible, but this will require formalizing research methods into clinical best practices in the areas of sequence data generation, analysis, interpretation and reporting. The CLARITY Challenge was designed to spur convergence in methods for diagnosing genetic disease starting from clinical case history and genome sequencing data. DNA samples were obtained from three families with heritable genetic disorders and genomic sequence data were donated by sequencing platform vendors. The challenge was to analyze and interpret these data with the goals of identifying disease-causing variants and reporting the findings in a clinically useful format. Participating contestant groups were solicited broadly, and an independent panel of judges evaluated their performance. A total of 30 international groups were engaged. The entries reveal a general convergence of practices on most elements of the analysis and interpretation process. However, even given this commonality of approach, only two groups identified the consensus candidate variants in all disease cases, demonstrating a need for consistent fine-tuning of the generally accepted methods. There was greater diversity of the final clinical report content and in the patient consenting process, demonstrating that these areas require additional exploration and standardization. The CLARITY Challenge provides a comprehensive assessment of current practices for using genome sequencing to diagnose and report genetic diseases. There is remarkable convergence in bioinformatic techniques, but medical interpretation and reporting are areas that require further development by many groups.
Remote sensing as a source of land cover information utilized in the universal soil loss equation
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.; Scarpace, F. L.
1979-01-01
In this study, methods for gathering the land use/land cover information required by the USLE were investigated with medium altitude, multi-date color and color infrared 70-mm positive transparencies using human and computer-based interpretation techniques. Successful results, which compare favorably with traditional field study methods, were obtained within the test site watershed with airphoto data sources and human airphoto interpretation techniques. Computer-based interpretation techniques were not capable of identifying soil conservation practices but were successful to varying degrees in gathering other types of desired land use/land cover information.
20 CFR 640.3 - Interpretation of Federal law requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... require that a State law include provision for such methods of administration as will reasonably insure... Security Act to require that, in the administration of a State law, there shall be substantial compliance... benefits. Factors reasonably beyond a State's control may cause its performance to drop below the level of...
On the interpretation of weight vectors of linear models in multivariate neuroimaging.
Haufe, Stefan; Meinecke, Frank; Görgen, Kai; Dähne, Sven; Haynes, John-Dylan; Blankertz, Benjamin; Bießmann, Felix
2014-02-15
The increase in spatiotemporal resolution of neuroimaging devices is accompanied by a trend towards more powerful multivariate analysis methods. Often it is desired to interpret the outcome of these methods with respect to the cognitive processes under study. Here we discuss which methods allow for such interpretations, and provide guidelines for choosing an appropriate analysis for a given experimental goal: For a surgeon who needs to decide where to remove brain tissue it is most important to determine the origin of cognitive functions and associated neural processes. In contrast, when communicating with paralyzed or comatose patients via brain-computer interfaces, it is most important to accurately extract the neural processes specific to a certain mental state. These equally important but complementary objectives require different analysis methods. Determining the origin of neural processes in time or space from the parameters of a data-driven model requires what we call a forward model of the data; such a model explains how the measured data was generated from the neural sources. Examples are general linear models (GLMs). Methods for the extraction of neural information from data can be considered as backward models, as they attempt to reverse the data generating process. Examples are multivariate classifiers. Here we demonstrate that the parameters of forward models are neurophysiologically interpretable in the sense that significant nonzero weights are only observed at channels the activity of which is related to the brain process under study. In contrast, the interpretation of backward model parameters can lead to wrong conclusions regarding the spatial or temporal origin of the neural signals of interest, since significant nonzero weights may also be observed at channels the activity of which is statistically independent of the brain process under study. 
As a remedy for the linear case, we propose a procedure for transforming backward models into forward models. This procedure enables the neurophysiological interpretation of the parameters of linear backward models. We hope that this work raises awareness for an often encountered problem and provides a theoretical basis for conducting better interpretable multivariate neuroimaging analyses. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
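For the linear case, the proposed backward-to-forward transformation has a closed form: given data covariance Σ_X, extraction filters W, and extracted-source covariance Σ_ŝ, the activation pattern is A = Σ_X W Σ_ŝ⁻¹. A minimal sketch on simulated data (channel count, noise level, and the least-squares filter are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_channels = 5000, 8

# One latent source projected into the channels, plus sensor noise.
s = rng.normal(size=(n, 1))
a_true = rng.normal(size=(1, n_channels))
X = s @ a_true + 0.5 * rng.normal(size=(n, n_channels))

# Backward model: least-squares extraction filter W, s_hat = X @ W.
W, *_ = np.linalg.lstsq(X, s, rcond=None)
s_hat = X @ W

# Forward-model transform: A = Cov(X) @ W / Var(s_hat).
A = np.cov(X.T) @ W / np.cov(s_hat.T)
```

The recovered pattern `A` aligns with the true mixing pattern `a_true`, whereas the raw filter weights `W` need not, which is the paper's central point.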
NASA Technical Reports Server (NTRS)
Schulte, Peter Z.; Moore, James W.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
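In its simplest full-factorial form, a DoE input sweep of this kind reduces to enumerating every combination of factor levels. The factor names and levels below are illustrative placeholders, not actual CPAS simulation inputs:

```python
from itertools import product

# Hypothetical simulation factors and levels.
factors = {
    "drag_coefficient": [0.8, 1.0, 1.2],
    "deploy_altitude_ft": [5000, 8000],
    "payload_mass_kg": [8000, 9500],
}

# Full-factorial design: one simulation run per combination of levels.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
# Each run dict would then be fed to the simulator and its outputs
# checked against load and rate-of-descent requirement limits.
```

Fractional or space-filling designs replace this exhaustive enumeration when the number of factors makes 3 × 2 × 2-style products impractical.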
The relationship between symbolic interactionism and interpretive description.
Oliver, Carolyn
2012-03-01
In this article I explore the relationship between symbolic interactionist theory and interpretive description methodology. The two are highly compatible, making symbolic interactionism an excellent theoretical framework for interpretive description studies. The pragmatism underlying interpretive description supports locating the methodology within this cross-disciplinary theory to make it more attractive to nonnursing researchers and expand its potential to address practice problems across the applied disciplines. The theory and method are so compatible that symbolic interactionism appears to be part of interpretive description's epistemological foundations. Interpretive description's theoretical roots have, to date, been identified only very generally in interpretivism and the philosophy of nursing. A more detailed examination of its symbolic interactionist heritage furthers the contextualization or forestructuring of the methodology to meet one of its own requirements for credibility.
Concepts of ‘personalization’ in personalized medicine: implications for economic evaluation
Rogowski, Wolf; Payne, Katherine; Schnell-Inderst, Petra; Manca, Andrea; Rochau, Ursula; Jahn, Beate; Alagoz, Oguzhan; Leidl, Reiner; Siebert, Uwe
2015-01-01
Context This paper assesses if, and how, existing methods for economic evaluation are applicable to the evaluation of PM and if not, where extension to methods may be required. Method Structured workshop with a pre-defined group of experts (n=47), run using a modified nominal group technique. Workshop findings were recorded using extensive note taking and summarised using thematic data analysis. The workshop was complemented by structured literature searches. Results The key finding emerging from the workshop, using an economic perspective, was that two distinct, but linked, interpretations of the concept of PM exist (personalization by ‘physiology’ or ‘preferences’). These interpretations involve specific challenges for the design and conduct of economic evaluations. Existing evaluative (extra-welfarist) frameworks were generally considered appropriate for evaluating PM. When ‘personalization’ is viewed as using physiological biomarkers, challenges include: representing complex care pathways; representing spill-over effects; meeting data requirements such as evidence on heterogeneity; choosing appropriate time horizons for the value of further research in uncertainty analysis. When viewed as tailoring medicine to patient preferences, further work is needed regarding: revealed preferences, e.g. treatment (non)adherence; stated preferences, e.g. risk interpretation and attitude; consideration of heterogeneity in preferences; and the appropriate framework (welfarism vs. extra-welfarism) to incorporate non-health benefits. Conclusion Ideally, economic evaluations should take account of both interpretations of PM and consider physiology and preferences. It is important for decision makers to be cognizant of the issues involved with the economic evaluation of PM to appropriately interpret the evidence and target future research funding. PMID:25249200
Teaching thoughtful practice: narrative pedagogy in addictions education.
Vandermause, Roxanne K; Townsend, Ryan P
2010-07-01
Preparing practitioners for this rapidly changing and demanding health care environment is challenging. A surge in knowledge development and scientific advancement has placed a priority on technical skill and a focus on content driven educational processes that prepare students for evidence-based practice. However, the most difficult health care scenarios require thinking-in-action and thoughtfulness as well as didactic knowledge. It is our contention that interpretive educational methods, like narrative pedagogy, will promote judgment-based practice that includes use of evidence and delivery of thoughtful care. In this article, we describe and interpret a narrative approach to addictions content and teaching thoughtful practice. We present our pedagogical process, including observations and field notes, to show how interpretive pedagogies can be introduced into nursing curricula. By presenting this process, the reader is invited to consider interpretive methods as a way to inspire and habituate thoughtful practice and judgment-based care. Copyright 2009 Elsevier Ltd. All rights reserved.
Automated Discrimination Method of Muscular and Subcutaneous Fat Layers Based on Tissue Elasticity
NASA Astrophysics Data System (ADS)
Inoue, Masahiro; Fukuda, Osamu; Tsubai, Masayoshi; Muraki, Satoshi; Okumura, Hiroshi; Arai, Kohei
Balance between human body composition, e.g. bones, muscles, and fat, is a major and basic indicator of personal health. Body composition analysis using ultrasound has been developed rapidly. However, interpretation of echo signal is conducted manually, and accuracy and confidence in interpretation requires experience. This paper proposes an automated discrimination method of tissue boundaries for measuring the thickness of subcutaneous fat and muscular layers. A portable one-dimensional ultrasound device was used in this study. The proposed method discriminated tissue boundaries based on tissue elasticity. Validity of the proposed method was evaluated in twenty-one subjects (twelve women, nine men; aged 20-70 yr) at three anatomical sites. Experimental results show that the proposed method can achieve considerably high discrimination performance.
NASA Astrophysics Data System (ADS)
Baumstark, René; Duffey, Renee; Pu, Ruiliang
2016-11-01
The offshore extent of seagrass habitat along the West Florida (USA) coast represents an important corridor for inshore-offshore migration of economically important fish and shellfish. Surviving at the fringe of light requirements, offshore seagrass beds are sensitive to changes in water clarity. Beyond and intermingled with the offshore seagrass areas are large swaths of colonized hard bottom. These offshore habitats of the West Florida coast have lacked mapping efforts needed for status and trends monitoring. The objective of this study was to propose an object-based classification method for mapping offshore habitats and to compare results to traditional photo-interpreted maps. Benthic maps were created from WorldView-2 satellite imagery using an Object Based Image Analysis (OBIA) method and a visual photo-interpretation method. A logistic regression analysis identified depth and distance from shore as significant parameters for discriminating spectrally similar seagrass and colonized hard bottom features. Seagrass, colonized hard bottom and unconsolidated sediment (sand) were mapped with 78% overall accuracy using the OBIA method compared to 71% overall accuracy using the photo-interpretation method. This study suggests an alternative for mapping deeper, offshore habitats capable of producing higher thematic and spatial resolution maps compared to those created with the traditional photo-interpretation method.
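The logistic-regression step for testing depth and distance as discriminating parameters can be sketched with synthetic data. The coefficients, value ranges, and gradient-descent fit below are invented for illustration; the study's actual data and software are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
depth = rng.uniform(0, 20, n)   # water depth in m (hypothetical range)
dist = rng.uniform(0, 30, n)    # distance from shore in km
# Simulated ground truth: seagrass (1) favored shallow and nearshore.
logit_true = 3.0 - 0.2 * depth - 0.1 * dist
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_true))).astype(float)

# Standardize predictors, then fit logistic regression by gradient descent.
X = np.column_stack([
    np.ones(n),
    (depth - depth.mean()) / depth.std(),
    (dist - dist.mean()) / dist.std(),
])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - y) / n
# Negative fitted slopes indicate seagrass probability falls with both
# depth and distance from shore, mirroring the kind of result reported.
```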
Vitali, Francesca; Li, Qike; Schissler, A Grant; Berghout, Joanne; Kenost, Colleen; Lussier, Yves A
2017-12-18
The development of computational methods capable of analyzing -omics data at the individual level is critical for the success of precision medicine. Although unprecedented opportunities now exist to gather data on an individual's -omics profile ('personalome'), interpreting and extracting meaningful information from single-subject -omics remain underdeveloped, particularly for quantitative non-sequence measurements, including complete transcriptome or proteome expression and metabolite abundance. Conventional bioinformatics approaches have largely been designed for making population-level inferences about 'average' disease processes; thus, they may not adequately capture and describe individual variability. Novel approaches intended to exploit a variety of -omics data are required for identifying individualized signals for meaningful interpretation. In this review-intended for biomedical researchers, computational biologists and bioinformaticians-we survey emerging computational and translational informatics methods capable of constructing a single subject's 'personalome' for predicting clinical outcomes or therapeutic responses, with an emphasis on methods that provide interpretable readouts. Two findings stand out: (i) single-subject analytics of the transcriptome shows the greatest development to date, and (ii) the methods were all validated in simulations, cross-validations or independent retrospective data sets. This survey uncovers a growing field that offers numerous opportunities for the development of novel validation methods and opens the door for future studies focusing on the interpretation of comprehensive 'personalomes' through the integration of multiple -omics, providing valuable insights into individual patient outcomes and treatments. © The Author 2017. Published by Oxford University Press.
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recent published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize the statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
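The sample-size point above is easy to demonstrate: with very large n, a correlation explaining well under 1% of the variance is still overwhelmingly "significant". A sketch using a Fisher z-test on simulated data (the numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x = rng.normal(size=n)
y = 0.03 * x + rng.normal(size=n)   # true correlation ~ 0.03

r = np.corrcoef(x, y)[0, 1]
# Fisher z-statistic for H0: rho = 0.
z = np.arctanh(r) * np.sqrt(n - 3)
# r is negligible as an effect size (r^2 well under 1% of variance),
# yet z far exceeds the 1.96 threshold, so p is essentially zero.
```

This is exactly the trap the abstract warns about: statistical significance without consideration of effect size or clinical meaningfulness.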
Army Sociocultural Performance Requirements
2014-06-01
L., Crafts, J. L., & Brooks, J. E. (July 1995). Intercultural communication requirements for Special Forces teams. (Study Report 1683). Arlington... Communication Uses alternative, sometimes novel, methods to communicate when verbal language is not shared; conveys information about mood, intent...status, and demeanor via gestures, tone of voice, and facial expressions; improvises communication techniques as necessary. WI Works with Interpreters
Human factors studies of control configurations for advanced transport aircraft
NASA Technical Reports Server (NTRS)
Snyder, Harry L.; Monty, Robert W.; Old, Joe
1985-01-01
This research investigated the threshold levels of display luminance contrast which were required to interpret static, achromatic, integrated displays of primary flight information. A four-factor within-subjects design was used to investigate the influences of type of flight variable information, the level of ambient illumination, the type of control input, and the size of the display symbology on the setting of these interpretability thresholds. A three-alternative forced choice paradigm was used in conjunction with the method of adjustments to obtain a measure of the upper limen of display luminance contrast needed to interpret a complex display of primary flight information. The pattern of results and the absolute magnitudes of the luminance contrast settings were found to be in good agreement with previously reported data from psychophysical investigations of display luminance contrast requirements.
GUIDELINES TO ASSESSING REGIONAL VULNERABILITIES
Decision-makers today face increasingly complex environmental problems that require integrative and innovative approaches for analyzing, modeling, and interpreting various types of information. ReVA acknowledges this need and is designed to evaluate methods and models for synthe...
Fast and Scalable Gaussian Process Modeling with Applications to Astronomical Time Series
NASA Astrophysics Data System (ADS)
Foreman-Mackey, Daniel; Agol, Eric; Ambikasaran, Sivaram; Angus, Ruth
2017-12-01
The growing field of large-scale time domain astronomy requires methods for probabilistic data analysis that are computationally tractable, even with large data sets. Gaussian processes (GPs) are a popular class of models used for this purpose, but since the computational cost scales, in general, as the cube of the number of data points, their application has been limited to small data sets. In this paper, we present a novel method for GP modeling in one dimension where the computational requirements scale linearly with the size of the data set. We demonstrate the method by applying it to simulated and real astronomical time series data sets. These demonstrations are examples of probabilistic inference of stellar rotation periods, asteroseismic oscillation spectra, and transiting planet parameters. The method exploits structure in the problem when the covariance function is expressed as a mixture of complex exponentials, without requiring evenly spaced observations or uniform noise. This form of covariance arises naturally when the process is a mixture of stochastically driven damped harmonic oscillators—providing a physical motivation for and interpretation of this choice—but we also demonstrate that it can be a useful effective model in some other cases. We present a mathematical description of the method and compare it to existing scalable GP methods. The method is fast and interpretable, with a range of potential applications within astronomical data analysis and beyond. We provide well-tested and documented open-source implementations of this method in C++, Python, and Julia.
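A minimal sketch of the kind of covariance the abstract describes, using a single exponential-cosine term (one pair of complex exponentials, the autocovariance of a stochastically driven damped oscillator). The parameter values are arbitrary assumptions, and the linear-cost solver itself is not reproduced here; this only shows that such a term yields a valid covariance on unevenly spaced times.

```python
import numpy as np

def exp_cosine_kernel(t1, t2, sigma2=1.0, c=0.5, d=2.0):
    """One mixture term: k(tau) = sigma^2 * exp(-c*|tau|) * cos(d*tau).
    Its spectral density is a sum of two positive Lorentzians, so the
    kernel is positive definite (parameters here are illustrative)."""
    tau = np.abs(t1[:, None] - t2[None, :])
    return sigma2 * np.exp(-c * tau) * np.cos(d * tau)

# Unevenly spaced observation times: no uniform grid is required
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 10.0, size=200))
K = exp_cosine_kernel(t, t) + 1e-6 * np.eye(t.size)  # jitter for stability

# A valid kernel must give a positive-definite covariance matrix
L = np.linalg.cholesky(K)                  # fails unless K is positive definite
sample = L @ rng.standard_normal(t.size)   # one GP draw with this covariance
```

A naive solve with this dense matrix still costs O(n^3); the paper's contribution is exploiting the semiseparable structure of exactly this kernel family to reach O(n).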
NASA Astrophysics Data System (ADS)
Baumstark, R. D.; Duffey, R.; Pu, R.
2016-12-01
The offshore extent of seagrass habitat along the West Florida (USA) coast represents an important corridor for inshore-offshore migration of economically important fish and shellfish. Surviving at the fringe of their light requirements, offshore seagrass beds are sensitive to changes in water clarity. Beyond and intermingled with the offshore seagrass areas are large swaths of colonized hard bottom. These offshore habitats of the West Florida coast have lacked the mapping efforts needed for status and trends monitoring. The objective of this study was to propose an object-based classification method for mapping offshore habitats and to compare the results to traditional photo-interpreted maps. Benthic maps depicting the spatial distribution and percent biological cover were created from WorldView-2 satellite imagery using an Object-Based Image Analysis (OBIA) method and a visual photo-interpretation method. A logistic regression analysis identified depth and distance from shore as significant parameters for discriminating spectrally similar seagrass and colonized hard bottom features. Seagrass, colonized hard bottom and unconsolidated sediment (sand) were mapped with 78% overall accuracy using the OBIA method, compared to 71% overall accuracy using the photo-interpretation method. This study presents an alternative for mapping deeper, offshore habitats, capable of producing maps of higher thematic (percent biological cover) and spatial resolution than those created with the traditional photo-interpretation method.
78 FR 66865 - Interpretation of Rest Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
...-1259] Interpretation of Rest Requirements AGENCY: Federal Aviation Administration (FAA), DOT. ACTION... Proposed Interpretation seeking public comment on the application of certain rest requirements during on.... FAA-2010-1259, relating to rest requirements, and published in the Federal Register on December 23...
NASA Astrophysics Data System (ADS)
Bernard, J.
2012-12-01
Manufacturers of geophysical instruments have faced, over the past decades, the rapid evolution of electronics and computer science. More automation has been introduced into the equipment and into the processing and interpretation software, which may suggest that conducting geophysical surveys requires less understanding of the method and less experience than in the past. Hence some misunderstanding of the skills needed to integrate geophysical results into the broader body of information that the applied geologist must acquire to be successful. Globally, the demand in geophysical investigation goes towards greater penetration depth, requiring more powerful transmitters, and towards better resolution, requiring more data, as in 3D analysis. Budget considerations strongly favour high efficiency in the field combined with high-speed data processing. Innovation is required in all aspects of geophysics to fit market needs, including new technological (instruments, software) and methodological (methods, procedures, arrays) developments. The structures in charge of the geophysical work can be public organisations (institutes, ministries, geological surveys, …) or can come from the private sector (large companies, sub-contractors, consultants, …), each with its own constraints in the field work and in the processing and interpretation phases.
In the applications concerning Groundwater investigations, Mining Exploration, Environmental and Engineering surveys, examples of data and their interpretation presently carried out all around the world will be presented for DC Resistivity (Vertical Electrical Sounding, 2D, 3D Resistivity Imaging, Resistivity Monitoring), Induced Polarisation (Time Domain 2D, 3D arrays for mining and environmental applications), Magnetic Resonance Sounding (direct detection and characterisation of groundwater) and Electromagnetic (multi-component and multi-spacing Frequency Domain Sounding and Profiling techniques). The place that Geophysics takes in the market among the other investigation techniques is, and will remain, dependent on the quality of the results obtained, despite the uncertainties linked to the field (noise aspects) and to the interpretation (equivalence aspects), under the control of budget decisions.
Resistivity Imaging measurements for groundwater investigations
FRW and domain walls in higher spin gravity
NASA Astrophysics Data System (ADS)
Aros, R.; Iazeolla, C.; Noreña, J.; Sezgin, E.; Sundell, P.; Yin, Y.
2018-03-01
We present exact solutions to Vasiliev's bosonic higher spin gravity equations in four dimensions with positive and negative cosmological constant that admit an interpretation in terms of domain walls, quasi-instantons and Friedmann-Robertson-Walker (FRW) backgrounds. Their isometry algebras are infinite-dimensional higher-spin extensions of spacetime isometries generated by six Killing vectors. The solutions presented are obtained by using a method of holomorphic factorization in noncommutative twistor space and gauge functions. In interpreting the solutions in terms of Fronsdal-type fields in space-time, a field-dependent higher spin transformation is required, which is implemented at leading order. To this order, the scalar field solves the Klein-Gordon equation with conformal mass in (A)dS4. We interpret the FRW solution with de Sitter asymptotics in the context of inflationary cosmology and we expect that the domain wall and FRW solutions are associated with spontaneously broken scaling symmetries in their holographic description. We observe that the factorization method provides a convenient framework for setting up a perturbation theory around the exact solutions, and we propose that the nonlinear completion of particle excitations over FRW and domain wall solutions requires black hole-like states.
Automated recognition of stratigraphic marker shales from geophysical logs in iron ore deposits
NASA Astrophysics Data System (ADS)
Silversides, Katherine; Melkumyan, Arman; Wyman, Derek; Hatherly, Peter
2015-04-01
The mining of stratiform ore deposits requires a means of determining the location of stratigraphic boundaries. A variety of geophysical logs may provide the required data but, in the case of banded iron formation hosted iron ore deposits in the Hamersley Ranges of Western Australia, only one geophysical log type (natural gamma) is collected for this purpose. The information from these logs is currently processed by slow manual interpretation. In this paper we present an alternative method of automatically identifying recurring stratigraphic markers in natural gamma logs from multiple drill holes. Our approach is demonstrated using natural gamma geophysical logs that contain features corresponding to the presence of stratigraphically important marker shales. The host stratigraphic sequence is highly consistent throughout the Hamersley and the marker shales can therefore be used to identify the stratigraphic location of the banded iron formation (BIF) or BIF hosted ore. The marker shales are identified using Gaussian Processes (GP) trained by either manual or active learning methods and the results are compared to the existing geological interpretation. The manual method involves the user selecting the signatures for improving the library, whereas the active learning method uses the measure of uncertainty provided by the GP to select specific examples for the user to consider for addition. The results demonstrate that both GP methods can identify a feature, but the active learning approach has several benefits over the manual method. These benefits include greater accuracy in the identified signatures, faster library building, and an objective approach for selecting signatures that includes the full range of signatures across a deposit in the library. When using the active learning method, it was found that the current manual interpretation could be replaced in 78.4% of the holes with an accuracy of 95.7%.
28 CFR 901.2 - Interpretation of fingerprint submission requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Interpretation of fingerprint submission... FINGERPRINT SUBMISSION REQUIREMENTS § 901.2 Interpretation of fingerprint submission requirements. (a) Article V of the Compact requires the submission of fingerprints or other approved forms of positive...
28 CFR 901.2 - Interpretation of fingerprint submission requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Interpretation of fingerprint submission... FINGERPRINT SUBMISSION REQUIREMENTS § 901.2 Interpretation of fingerprint submission requirements. (a) Article V of the Compact requires the submission of fingerprints or other approved forms of positive...
28 CFR 901.2 - Interpretation of fingerprint submission requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Interpretation of fingerprint submission... FINGERPRINT SUBMISSION REQUIREMENTS § 901.2 Interpretation of fingerprint submission requirements. (a) Article V of the Compact requires the submission of fingerprints or other approved forms of positive...
28 CFR 901.2 - Interpretation of fingerprint submission requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Interpretation of fingerprint submission... FINGERPRINT SUBMISSION REQUIREMENTS § 901.2 Interpretation of fingerprint submission requirements. (a) Article V of the Compact requires the submission of fingerprints or other approved forms of positive...
28 CFR 901.2 - Interpretation of fingerprint submission requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Interpretation of fingerprint submission... FINGERPRINT SUBMISSION REQUIREMENTS § 901.2 Interpretation of fingerprint submission requirements. (a) Article V of the Compact requires the submission of fingerprints or other approved forms of positive...
Noise Reduction in High-Throughput Gene Perturbation Screens
USDA-ARS?s Scientific Manuscript database
Motivation: Accurate interpretation of perturbation screens is essential for a successful functional investigation. However, the screened phenotypes are often distorted by noise, and their analysis requires specialized statistical analysis tools. The number and scope of statistical methods available...
Dentist-patient communication in the multilingual dental setting.
Goldsmith, C; Slack-Smith, L; Davies, G
2005-12-01
Communication between dentists and patients can be exceptionally challenging when the patient and the dentist do not speak the same language, as is frequently the case in multicultural Australia. The aim of this study was to describe the issues involved in dealing with limited-English speaking patients in order to formulate recommendations on how to improve dental communication. A cross sectional study was performed using a postal survey to Australian Dental Association member dental practitioners in Western Australia. Responses were collated and data analysis was performed using SPSS 11.5 for Windows. Most respondents encounter language-related communication barriers weekly or monthly, and the most satisfactory method of communication is informal interpreters. Despite reporting satisfaction working with professional chairside interpreters or dental staff interpreters, most respondents did not use them. The most common alternative communication methods were diagrams and models. Endodontics and periodontics provided the greatest challenge in communication. Informed consent was reportedly compromised due to language barriers by 29 per cent of respondents. Recommendations to improve communication included access to interpretation services, dentist technique/attitude to communication and patient preparedness for English-speaking encounters. Many respondents do not utilize the preferential communication methods, creating a potential compromise to both informed consent and the patients' best interests. The use of professional interpreters is recommended, and discussion should be supplemented with means of non-verbal communication. Dentists require access to lists of multilingual dentists and greater awareness of interpretation services to improve multilingual dentist-patient communication.
Speeding up local correlation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kats, Daniel
2014-12-28
We present two techniques that can substantially speed up the local correlation methods. The first one allows one to avoid the expensive transformation of the electron-repulsion integrals from atomic orbitals to virtual space. The second one introduces an algorithm for the residual equations in the local perturbative treatment that, in contrast to the standard scheme, does not require holding the amplitudes or residuals in memory. It is shown that even an interpreter-based implementation of the proposed algorithm in the context of local MP2 method is faster and requires less memory than the highly optimized variants of conventional algorithms.
A general method for decomposing the causes of socioeconomic inequality in health.
Heckley, Gawain; Gerdtham, Ulf-G; Kjellsson, Gustav
2016-07-01
We introduce a general decomposition method applicable to all forms of bivariate rank dependent indices of socioeconomic inequality in health, including the concentration index. The technique is based on recentered influence function regression and requires only the application of OLS to a transformed variable with similar interpretation. Our method requires few identifying assumptions to yield valid estimates in most common empirical applications, unlike current methods favoured in the literature. Using the Swedish Twin Registry and a within twin pair fixed effects identification strategy, our new method finds no evidence of a causal effect of education on income-related health inequality. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
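The recentered influence function (RIF) the method builds on can be sketched for a single quantile: transform the outcome, then run plain OLS on the transformed variable. This is a generic textbook construction on simulated data, not the authors' rank-dependent-index estimator; the density estimate and data are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def rif_quantile(y, tau=0.5):
    """Recentered influence function of the tau-th quantile:
    RIF(y; q_tau) = q_tau + (tau - 1{y <= q_tau}) / f(q_tau)."""
    q = np.quantile(y, tau)
    f_q = gaussian_kde(y)(np.array([q]))[0]   # density estimate at the quantile
    return q + (tau - (y <= q)) / f_q

rng = np.random.default_rng(0)
y = rng.standard_normal(2000)
X = np.column_stack([np.ones(2000), rng.standard_normal(2000)])

# 'RIF regression': ordinary least squares on the transformed outcome
beta, *_ = np.linalg.lstsq(X, rif_quantile(y), rcond=None)

# By construction, the mean of the RIF recovers the statistic itself
print(np.mean(rif_quantile(y)), np.quantile(y, 0.5))
```

The appeal noted in the abstract is visible here: once the outcome is transformed, estimation is just OLS, and coefficients inherit the familiar regression interpretation.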
Applied photo interpretation for airbrush cartography
NASA Technical Reports Server (NTRS)
Inge, J. L.; Bridges, P. M.
1976-01-01
Lunar and planetary exploration has required the development of new techniques of cartographic portrayal. Conventional photo-interpretive methods employing size, shape, shadow, tone, pattern, and texture are applied to computer-processed satellite television images. Comparative judgements are affected by illumination, resolution, variations in surface coloration, and transmission or processing artifacts. The portrayal of tonal densities in a relief illustration is performed using a unique airbrush technique derived from hill-shading of contour maps. The control of tone and line quality is essential because the mid-gray to dark tone densities must be finalized prior to the addition of highlights to the drawing. This is done with an electric eraser until the drawing is completed. The drawing density is controlled with a reflectance-reading densitometer to meet certain density guidelines. The versatility of planetary photo-interpretive methods for airbrushed map portrayals is demonstrated by the application of these techniques to the synthesis of nonrelief data.
NASA Astrophysics Data System (ADS)
Xie, Tian; Grossman, Jeffrey C.
2018-04-01
The use of machine learning methods for accelerating the design of crystalline materials usually requires manually constructed feature vectors or complex transformations of atom coordinates to input the crystal structure, which either constrains the model to certain crystal types or makes it difficult to provide chemical insights. Here, we develop a crystal graph convolutional neural network framework to directly learn material properties from the connection of atoms in the crystal, providing a universal and interpretable representation of crystalline materials. Our method provides highly accurate predictions of eight different density functional theory calculated properties of crystals with various structure types and compositions after being trained with 10^4 data points. Further, our framework is interpretable because one can extract the contributions from local chemical environments to global properties. Using an example of perovskites, we show how this information can be utilized to discover empirical rules for materials design.
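The graph-convolution idea can be sketched in a few lines: per-atom features are updated from their neighbours and then pooled into one crystal-level vector. The shapes, adjacency, update rule and random "weights" here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_feat = 4, 8
X = rng.standard_normal((n_atoms, n_feat))       # per-atom feature vectors
A = np.array([[0, 1, 1, 0],                      # adjacency from atom connectivity
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
W = rng.standard_normal((n_feat, n_feat)) * 0.1  # learnable weights (random here)

def conv(X, A, W):
    """One message-passing layer: average neighbour features, then a
    residual update with a nonlinearity."""
    deg = A.sum(axis=1, keepdims=True)
    H = (A @ X) / deg                  # mean over neighbours
    return np.tanh(X + H @ W)

H = conv(conv(X, A, W), A, W)          # two convolution layers
crystal_vector = H.mean(axis=0)        # pooling -> global crystal representation
prediction = crystal_vector @ rng.standard_normal(n_feat)  # linear readout
```

Because the global prediction is a pooled sum of per-atom contributions, one can trace a property back to local chemical environments, which is the interpretability mechanism the abstract highlights.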
Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent
2012-07-01
In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is an increasing need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to the different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods of interpretation of synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A.baumannii has been the subject of a considerable amount of research on antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred in A.baumannii infections, such as imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A.baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method.
Subsequently, the activities of the two combinations were tested in the dilution range of 4 x MIC to 0.03 x MIC in 96-well checkerboard plates. The results were interpreted separately using the four different interpretation methods frequently preferred by researchers, the aim being to determine to what extent the rates of synergistic, indifferent and antagonistic interactions were affected by the interpretation method. The differences between the interpretation methods were tested by chi-square analysis for each combination used. Statistically significant differences were detected between the four interpretation methods for the determination of synergistic and indifferent interactions (p< 0.0001). The highest rates of synergy were observed with both combinations by the method that used the lowest fractional inhibitory concentration index of all the non-turbid wells along the turbidity/non-turbidity interface. There was no statistically significant difference between the four methods for the detection of antagonism (p> 0.05). In conclusion, although there is a standard procedure for checkerboard synergy testing, it fails to yield standard results owing to the different methods of interpreting those results. Thus, there is a need to standardise the interpretation method for checkerboard synergy testing. To determine the most appropriate method of interpretation, further studies investigating the clinical benefits of synergic combinations, and comparing the consistency of the results with those of other standard combination tests such as time-kill studies, are required.
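The quantity underlying all such interpretation schemes is the fractional inhibitory concentration index (FICI). A minimal sketch follows, using commonly cited cutoffs; the MIC values are hypothetical, and, as this study stresses, other cutoff schemes and well-selection rules are in use and can change the reported synergy rate.

```python
def fici(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    """Fractional inhibitory concentration index for one checkerboard well:
    FICI = MIC_A(combo)/MIC_A(alone) + MIC_B(combo)/MIC_B(alone)."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(index):
    # Commonly cited cutoffs (one of several schemes in the literature)
    if index <= 0.5:
        return "synergy"
    if index <= 4.0:
        return "indifference"
    return "antagonism"

# Hypothetical example: imipenem MIC falls from 32 to 4 and
# ampicillin/sulbactam from 64 to 8 in combination
idx = fici(32, 64, 4, 8)    # 4/32 + 8/64 = 0.25
print(idx, interpret(idx))  # -> 0.25 synergy
```

Which well's FICI is reported (e.g., the lowest along the turbidity interface versus an average over wells) is exactly the interpretation choice the study shows to be non-standardised.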
NASA Astrophysics Data System (ADS)
Szafranko, Elżbieta
2017-10-01
Assessment of the variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, there are different methods that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systematic approach which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.
Phillips, Jeffrey D.
2002-01-01
In 1997, the U.S. Geological Survey (USGS) contracted with Sial Geosciences Inc. for a detailed aeromagnetic survey of the Santa Cruz basin and Patagonia Mountains area of south-central Arizona. The contractor's Operational Report is included as an Appendix in this report. This section describes the data processing performed by the USGS on the digital aeromagnetic data received from the contractor. This processing was required in order to remove flight line noise, estimate the depths to the magnetic sources, and estimate the locations of the magnetic contacts. Three methods were used for estimating source depths and contact locations: the horizontal gradient method, the analytic signal method, and the local wavenumber method. The depth estimates resulting from each method are compared, and the contact locations are combined into an interpretative map showing the dip direction for some contacts.
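The horizontal gradient method mentioned above can be sketched on a synthetic grid: contacts between magnetic source bodies are marked by maxima of the horizontal gradient magnitude of the field. The step anomaly, noise level and cell size below are invented, not the survey data.

```python
import numpy as np

ny, nx, dx = 64, 64, 100.0                 # 100 m cell size (assumed)
field = np.zeros((ny, nx))
field[:, nx // 2:] = 50.0                  # 50 nT step across a vertical contact
field += np.random.default_rng(0).normal(0, 0.5, field.shape)  # measurement noise

gy, gx = np.gradient(field, dx)            # derivatives along rows (y) and columns (x)
hgm = np.hypot(gx, gy)                     # horizontal gradient magnitude

# The contact is located where the gradient magnitude peaks
col = np.argmax(hgm.mean(axis=0))
print(f"estimated contact near column {col} (true edge at column {nx // 2})")
```

The analytic signal and local wavenumber methods used in the report build on the same grid derivatives, adding the vertical derivative of the field to estimate source depths as well as contact locations.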
Detection of antinuclear antibodies in SLE.
Kumar, Yashwant; Bhatia, Alka
2014-01-01
The antinuclear antibodies (ANA), also known as antinuclear factors (ANF), are autoantibodies that bind to certain structures within the nucleus. In systemic lupus erythematosus (SLE), they are produced in excess; hence their detection in the blood of patients is important for diagnosis and monitoring of the disease. Several methods are available to detect ANA; nevertheless, the indirect immunofluorescence antinuclear antibody test (IF-ANA) is considered the "reference method" for their detection. Though IF-ANA is relatively easy to perform, its interpretation requires considerable skill and experience. The chapter therefore aims to provide readers with comprehensive details not only of the methodology but also of the interpretation and reporting of IF-ANA results.
Screencasts: Formative Assessment for Mathematical Thinking
ERIC Educational Resources Information Center
Soto, Melissa; Ambrose, Rebecca
2016-01-01
Increased attention to reasoning and justification in mathematics classrooms requires the use of more authentic assessment methods. Particularly important are tools that allow teachers and students opportunities to engage in formative assessment practices such as gathering data, interpreting understanding, and revising thinking or instruction.…
Formulation of a dynamic analysis method for a generic family of hoop-mast antenna systems
NASA Technical Reports Server (NTRS)
Gabriele, A.; Loewy, R.
1981-01-01
Analytical studies of mast-cable-hoop-membrane type antennas were conducted using a transfer matrix numerical analysis approach. This method, by virtue of its specialization and the inherently easy compartmentalization of the formulation and numerical procedures, can be significantly more efficient in computer time required and in the time needed to review and interpret the results.
An Automated, High-Throughput Method for Interpreting the Tandem Mass Spectra of Glycosaminoglycans
NASA Astrophysics Data System (ADS)
Duan, Jiana; Jonathan Amster, I.
2018-05-01
The biological interactions between glycosaminoglycans (GAGs) and other biomolecules are heavily influenced by structural features of the glycan. The structure of GAGs can be assigned using tandem mass spectrometry (MS2), but analysis of these data, to date, requires manual interpretation, a slow process that presents a bottleneck to the broader deployment of this approach to solving biologically relevant problems. Automated interpretation remains a challenge, as GAG biosynthesis is not template-driven, and therefore, one cannot predict structures from genomic data, as is done with proteins. The lack of a structure database, a consequence of the non-template biosynthesis, requires a de novo approach to interpretation of the mass spectral data. We propose a model for rapid, high-throughput GAG analysis by using an approach in which candidate structures are scored for the likelihood that they would produce the features observed in the mass spectrum. To make this approach tractable, a genetic algorithm is used to greatly reduce the search-space of isomeric structures that are considered. The time required for analysis is significantly reduced compared to an approach in which every possible isomer is considered and scored. The model is coded in a software package using the MATLAB environment. This approach was tested on tandem mass spectrometry data for long-chain, moderately sulfated chondroitin sulfate oligomers that were derived from the proteoglycan bikunin. The bikunin data were previously interpreted manually. Our approach examines glycosidic fragments to localize SO3 modifications to specific residues and yields the same structures reported in the literature, only much more quickly.
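The genetic-algorithm search can be sketched on a toy sulfation-pattern problem: candidate structures are bit strings marking which residues carry an SO3 group, and fitness scores agreement with fragment evidence. The "observed" pattern and scoring function here are invented stand-ins for real MS2 data, and the operators are generic, not the paper's implementation (which is in MATLAB).

```python
import random

random.seed(0)
N = 12                                           # residues that may carry SO3
target = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0]    # pattern implied by the spectrum

def score(candidate):
    """Fraction of positions consistent with the (simulated) fragment ions."""
    return sum(c == t for c, t in zip(candidate, target)) / N

def evolve(pop, generations=40, mut=0.05):
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: len(pop) // 2]           # truncation selection (elitist)
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)         # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(30)]
best = evolve(pop)
print(score(best))
```

The point mirrored from the paper is the search-space reduction: the population converges on high-scoring patterns without enumerating and scoring all 2^12 isomers.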
Norris, Ross L; Martin, Jennifer H; Thompson, Erin; Ray, John E; Fullinfaw, Robert O; Joyce, David; Barras, Michael; Jones, Graham R; Morris, Raymond G
2010-10-01
The measurement of drug concentrations, for clinical purposes, occurs in many diagnostic laboratories throughout Australia and New Zealand. However, the provision of a comprehensive therapeutic drug monitoring (TDM) service requires the additional elements of pre- and postanalytical advice to ensure that concentrations reported are meaningful, interpretable, and clinically applicable to the individual patient. The aim of this project was to assess the status of TDM services in Australia and New Zealand. A range of professions involved in key aspects of TDM was surveyed by questionnaire in late 2007. Information gathered included: the list of drugs assayed; analytical methods used; interpretation services offered; interpretative methods used; and further monitoring advice provided. Fifty-seven responses were received, of which 42% were from hospitals (public and/or private); 11% a hospital (public and/or private) and pathology provider; and 47% a pathology provider only (public and/or private). Results showed that TDM is applied to a large number of different drugs. Poorly performing assay methods were used in some cases, even when published guidelines recommended alternative practices. Although there was a wide array of assays available, the evidence suggested a need for better selection of assay methods. In addition, only limited advice and/or interpretation of results was offered. Of concern, less than 50% of those providing advice on aminoglycoside dosing in adults used pharmacokinetic tools with six of 37 (16.2%) respondents using Bayesian pharmacokinetic tools, the method recommended in the Australian Therapeutic Guidelines: Antibiotic. In conclusion, the survey highlighted deficiencies in the provision of TDM services, in particular assay method selection and both quality and quantity of postanalytical advice. A range of recommendations, some of which may have international implications, are discussed. 
There is a need to include measures of impact on clinical decision-making when assessing assay methodologies. Best practice guidelines and professional standards of practice in TDM are needed, supported by an active program of professional development to ensure the benefits of TDM are realized. This will require significant partnerships between the various professions involved.
Morris, Heather C; Monaco, Lisa A; Steele, Andrew; Wainwright, Norm
2010-10-01
Historically, colony-forming units as determined by plate cultures have been the standard unit for microbiological analysis of environmental samples, medical diagnostics, and products for human use. However, the time and materials required make plate cultures expensive and potentially hazardous in the closed environments of future NASA missions aboard the International Space Station and missions to other Solar System targets. The Limulus Amebocyte Lysate (LAL) assay is an established method for ensuring the sterility and cleanliness of samples in the meat-packing and pharmaceutical industries. Each of these industries has verified numerical requirements for the correct interpretation of results from this assay. The LAL assay is a rapid, point-of-use, verified assay that has already been approved by NASA Planetary Protection as an alternate, molecular method for the examination of outbound spacecraft. We hypothesize that standards for molecular techniques, similar to those used by the pharmaceutical and meat-packing industries, need to be set by space agencies to ensure accurate data interpretation and subsequent decision making. In support of this idea, we present research that has been conducted to relate the LAL assay to plate cultures, and we recommend values obtained from these investigations that could assist in interpretation and analysis of data obtained from the LAL assay.
20 CFR 602.11 - Secretary's interpretation.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 3 2013-04-01 2013-04-01 false Secretary's interpretation. 602.11 Section... IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Federal Requirements § 602.11 Secretary's interpretation. (a) The Secretary interprets section 303(a)(1), SSA, to require that a State law provide for such...
20 CFR 602.11 - Secretary's interpretation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Secretary's interpretation. 602.11 Section... IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Federal Requirements § 602.11 Secretary's interpretation. (a) The Secretary interprets section 303(a)(1), SSA, to require that a State law provide for such...
Feline dental radiography and radiology: A primer.
Niemiec, Brook A
2014-11-01
Information crucial to the diagnosis and treatment of feline oral diseases can be ascertained using dental radiography, and the inclusion of this technology has been shown to be the best way to improve a dental practice. Becoming familiar with the techniques required for dental radiology and radiography can, therefore, be greatly beneficial. Novices to dental radiography may need some time to adjust and become comfortable with the techniques. If using dental radiographic film, the generally recommended 'E' or 'F' speeds may be frustrating at first, due to their more specific exposure and image development requirements. Although interpreting dental radiographs is similar to interpreting a standard bony radiograph, there are pathologic states that are unique to the oral cavity and several normal anatomic structures that may mimic pathologic changes. Determining which teeth have been imaged also requires a firm knowledge of oral anatomy as well as the architecture of dental films/digital systems. This article draws on a range of dental radiography and radiology resources, and the benefit of the author's own experience, to review the basics of taking and interpreting intraoral dental radiographs. A simplified method for positioning the tubehead is explained and classic examples of some common oral pathologies are provided. © ISFM and AAFP 2014.
A Ground Truthing Method for AVIRIS Overflights Using Canopy Absorption Spectra
NASA Technical Reports Server (NTRS)
Gamon, John A.; Serrano, Lydia; Roberts, Dar A.; Ustin, Susan L.
1996-01-01
Remote sensing for ecological field studies requires ground truthing for accurate interpretation of remote imagery. However, traditional vegetation sampling methods are time consuming and hard to relate to the scale of an AVIRIS scene. The large errors associated with manual field sampling, the contrasting formats of remote and ground data, and problems with coregistration of field sites with AVIRIS pixels can lead to difficulties in interpreting AVIRIS data. As part of a larger study of fire risk in the Santa Monica Mountains of southern California, we explored a ground-based optical method of sampling vegetation using spectrometers mounted both above and below vegetation canopies. The goal was to use optical methods to provide a rapid, consistent, and objective means of "ground truthing" that could be related both to AVIRIS imagery and to conventional ground sampling (e.g., plot harvests and pigment assays).
Automated interferometric alignment system for paraboloidal mirrors
Maxey, L. Curtis
1993-01-01
A systematic method is described for interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it is applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface, will produce a collimated wavefront like that obtained from a correctly aligned paraboloid.
Automated interferometric alignment system for paraboloidal mirrors
Maxey, L.C.
1993-09-28
A systematic method is described for interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it is applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface, will produce a collimated wavefront like that obtained from a correctly aligned paraboloid. 14 figures.
Transforming Multidisciplinary Customer Requirements to Product Design Specifications
NASA Astrophysics Data System (ADS)
Ma, Xiao-Jie; Ding, Guo-Fu; Qin, Sheng-Feng; Li, Rong; Yan, Kai-Yin; Xiao, Shou-Ne; Yang, Guang-Wu
2017-09-01
With the increasing complexity of complex mechatronic products, it is necessary to involve multidisciplinary design teams; thus, traditional customer requirements modeling for a single-discipline team becomes difficult to apply in a multidisciplinary team and project, since team members with various disciplinary backgrounds may interpret the customers' requirements differently. A new synthesized multidisciplinary customer requirements modeling method is provided for obtaining and describing a common understanding of customer requirements (CRs) and, more importantly, for transferring them into detailed and accurate product design specifications (PDS) that different team members can interact with effectively. A case study of designing a high-speed train verifies the rationality and feasibility of the proposed multidisciplinary requirement modeling method for complex mechatronic product development. This research offers guidance for realizing customer-driven personalized customization of complex mechatronic products.
ERIC Educational Resources Information Center
Lijewska, Agnieszka; Chmiel, Agnieszka
2015-01-01
Conference interpreters form a special case of language users because the simultaneous interpretation practice requires very specific lexical processing. Word comprehension and production in respective languages is performed under strict time constraints and requires constant activation of the involved languages. The present experiment aimed at…
ERIC Educational Resources Information Center
DeJong, William; Wechsler, Henry
Under the Drug-Free Schools and Campuses Act, institutions of higher education are required to review the effectiveness of their alcohol and drug prevention programs biannually. This guide offers a method for gathering and interpreting student survey data on alcohol-related problems based on the methodology of the College Alcohol Survey developed…
Tensile and shear methods for measuring strength of bilayer tablets.
Chang, Shao-Yu; Li, Jian-Xin; Sun, Changquan Calvin
2017-05-15
Both shear and tensile measurement methods have been used to quantify interfacial bonding strength of bilayer tablets. The shear method is more convenient to perform, but reproducible strength data requires careful control of the placement of tablet and contact point for shear force application. Moreover, data obtained from the shear method depend on the orientation of the bilayer tablet. Although more time-consuming to perform, the tensile method yields data that are straightforward to interpret. Thus, the tensile method is preferred in fundamental bilayer tableting research to minimize ambiguity in data interpretation. Using both shear and tensile methods, we measured the mechanical strength of bilayer tablets made of several different layer combinations of lactose and microcrystalline cellulose. We observed a good correlation between strength obtained by the tensile method and carefully conducted shear method. This suggests that the shear method may be used for routine quality test of bilayer tablets during manufacturing because of its speed and convenience, provided a protocol for careful control of the placement of the tablet interface, tablet orientation, and blade is implemented. Copyright © 2017 Elsevier B.V. All rights reserved.
Resource Recovery. Energy and Environment. Teacher's Aid.
ERIC Educational Resources Information Center
Reynolds, Smith and Hills, Inc., Jacksonville, FL.
Designed to assist students in understanding solid waste resource recovery, this teaching aid package aims to get students involved in practical activities that require participation, observation, and interpretation. Provided in this package are definitions, methods, causes and effects, costs, and benefits of resource recovery presented in the…
20 CFR 650.3 - Secretary's interpretation of Federal law requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Secretary's interpretation of Federal law... Federal law requirements. (a) The Secretary interprets sections 303(a)(1) and 303(a)(3) above to require that a State law include provision for— (1) Hearing and decision for claimants who are parties to an...
20 CFR 650.3 - Secretary's interpretation of Federal law requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 20 Employees' Benefits 3 2012-04-01 2012-04-01 false Secretary's interpretation of Federal law... Federal law requirements. (a) The Secretary interprets sections 303(a)(1) and 303(a)(3) above to require that a State law include provision for— (1) Hearing and decision for claimants who are parties to an...
20 CFR 650.3 - Secretary's interpretation of Federal law requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 3 2013-04-01 2013-04-01 false Secretary's interpretation of Federal law... Federal law requirements. (a) The Secretary interprets sections 303(a)(1) and 303(a)(3) above to require that a State law include provision for— (1) Hearing and decision for claimants who are parties to an...
20 CFR 650.3 - Secretary's interpretation of Federal law requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Secretary's interpretation of Federal law... Federal law requirements. (a) The Secretary interprets sections 303(a)(1) and 303(a)(3) above to require that a State law include provision for— (1) Hearing and decision for claimants who are parties to an...
20 CFR 650.3 - Secretary's interpretation of Federal law requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 3 2014-04-01 2014-04-01 false Secretary's interpretation of Federal law... Federal law requirements. (a) The Secretary interprets sections 303(a)(1) and 303(a)(3) above to require that a State law include provision for— (1) Hearing and decision for claimants who are parties to an...
Hudelson, Patricia; Perneger, Thomas; Kolly, Véronique; Perron, Noëlle Junod
2012-01-01
Specific knowledge and skills are needed to work effectively with an interpreter, but most doctors have received limited training. Self-assessed competency may not accurately identify training needs. The purpose of this study is to explore the association between self-assessed competency at working with an interpreter and the ability to identify elements of good practice, using a written vignette. A mailed questionnaire was sent to 619 doctors and medical students in Geneva, Switzerland. 58.6% of respondents considered themselves to be highly competent at working with a professional interpreter, but 22% failed to mention even one element of good practice in response to the vignette, and only 39% could name more than one. There was no association between self-rated competency and number of elements mentioned. Training efforts should challenge the assumption that working with an interpreter is intuitive. Evaluation of clinicians' ability to work with an interpreter should not be limited to self-ratings. In the context of large-scale surveys, written vignettes may provide a simple method for identifying knowledge of good practice and topics requiring further training.
A public resource facilitating clinical use of genomes
Ball, Madeleine P.; Thakuria, Joseph V.; Zaranek, Alexander Wait; Clegg, Tom; Rosenbaum, Abraham M.; Wu, Xiaodi; Angrist, Misha; Bhak, Jong; Bobe, Jason; Callow, Matthew J.; Cano, Carlos; Chou, Michael F.; Chung, Wendy K.; Douglas, Shawn M.; Estep, Preston W.; Gore, Athurva; Hulick, Peter; Labarga, Alberto; Lee, Je-Hyuk; Lunshof, Jeantine E.; Kim, Byung Chul; Kim, Jong-Il; Li, Zhe; Murray, Michael F.; Nilsen, Geoffrey B.; Peters, Brock A.; Raman, Anugraha M.; Rienhoff, Hugh Y.; Robasky, Kimberly; Wheeler, Matthew T.; Vandewege, Ward; Vorhaus, Daniel B.; Yang, Joyce L.; Yang, Luhan; Aach, John; Ashley, Euan A.; Drmanac, Radoje; Kim, Seong-Jin; Li, Jin Billy; Peshkin, Leonid; Seidman, Christine E.; Seo, Jeong-Sun; Zhang, Kun; Rehm, Heidi L.; Church, George M.
2012-01-01
Rapid advances in DNA sequencing promise to enable new diagnostics and individualized therapies. Achieving personalized medicine, however, will require extensive research on highly reidentifiable, integrated datasets of genomic and health information. To assist with this, participants in the Personal Genome Project choose to forgo privacy via our institutional review board-approved “open consent” process. The contribution of public data and samples facilitates both scientific discovery and standardization of methods. We present our findings after enrollment of more than 1,800 participants, including whole-genome sequencing of 10 pilot participant genomes (the PGP-10). We introduce the Genome-Environment-Trait Evidence (GET-Evidence) system. This tool automatically processes genomes and prioritizes both published and novel variants for interpretation. In the process of reviewing the presumed healthy PGP-10 genomes, we find numerous literature references implying serious disease. Although it is sometimes impossible to rule out a late-onset effect, stringent evidence requirements can address the high rate of incidental findings. To that end we develop a peer production system for recording and organizing variant evaluations according to standard evidence guidelines, creating a public forum for reaching consensus on interpretation of clinically relevant variants. Genome analysis becomes a two-step process: using a prioritized list to record variant evaluations, then automatically sorting reviewed variants using these annotations. Genome data, health and trait information, participant samples, and variant interpretations are all shared in the public domain—we invite others to review our results using our participant samples and contribute to our interpretations. We offer our public resource and methods to further personalized medical research. PMID:22797899
Beekhuijzen, Manon; Schneider, Steffen; Barraclough, Narinder; Hallmark, Nina; Hoberman, Alan; Lordi, Sheri; Moxon, Mary; Perks, Deborah; Piersma, Aldert H; Makris, Susan L
2018-05-02
In recent years several OECD test guidelines have been updated and some will be updated shortly with the requirement to measure thyroid hormone levels in the blood of mammalian laboratory species. There is, however, an imperative need for clarification and guidance regarding the collection, assessment, and interpretation of thyroid hormone data for regulatory toxicology and risk assessment. Clarification and guidance is needed for 1) timing and methods of blood collection, 2) standardization and validation of the analytical methods, 3) triggers for additional measurements, 4) the need for T4 measurements in postnatal day (PND) 4 pups, and 5) the interpretation of changes in thyroid hormone levels regarding adversity. Discussions on these topics have already been initiated, and involve expert scientists from a number of international multisector organizations. This paper provides an overview of existing issues, current activities and recommendations for moving forward. Copyright © 2018 Elsevier Inc. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes large enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bac...
Reporting and Interpreting Effect Size in Quantitative Agricultural Education Research
ERIC Educational Resources Information Center
Kotrlik, Joe W.; Williams, Heather A.; Jabor, M. Khata
2011-01-01
The Journal of Agricultural Education (JAE) requires authors to follow the guidelines stated in the Publication Manual of the American Psychological Association [APA] (2009) in preparing research manuscripts, and to utilize accepted research and statistical methods in conducting quantitative research studies. The APA recommends the reporting of…
Measuring Growth with Vertical Scales
ERIC Educational Resources Information Center
Briggs, Derek C.
2013-01-01
A vertical score scale is needed to measure growth across multiple tests in terms of absolute changes in magnitude. Since the warrant for subsequent growth interpretations depends upon the assumption that the scale has interval properties, the validation of a vertical scale would seem to require methods for distinguishing interval scales from…
Antecedents of obesity - analysis, interpretation, and use of longitudinal data.
Gillman, Matthew W; Kleinman, Ken
2007-07-01
The obesity epidemic causes misery and death. Most epidemiologists accept the hypothesis that characteristics of the early stages of human development have lifelong influences on obesity-related health outcomes. Unfortunately, there is a dearth of data of sufficient scope and individual history to help unravel the associations of prenatal, postnatal, and childhood factors with adult obesity and health outcomes. Here the authors discuss analytic methods, the interpretation of models, and the use to which such rare and valuable data may be put in developing interventions to combat the epidemic. For example, analytic methods such as quantile and multinomial logistic regression can describe the effects on body mass index range rather than just its mean; structural equation models may allow comparison of the contributions of different factors at different periods in the life course. Interpretation of the data and model construction is complex, and it requires careful consideration of the biologic plausibility and statistical interpretation of putative causal factors. The goals of discovering modifiable determinants of obesity during the prenatal, postnatal, and childhood periods must be kept in sight, and analyses should be built to facilitate them. Ultimately, interventions in these factors may help prevent obesity-related adverse health outcomes for future generations.
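The quantile-regression idea mentioned above can be illustrated with a toy sketch: minimizing the check (pinball) loss at level tau recovers the tau-th quantile of the outcome, which is why quantile regression can describe the tails of the body mass index distribution rather than just its mean. The values and function names below are illustrative, not from the study:

```python
def pinball_loss(c, ys, tau):
    """Check (pinball) loss of a constant prediction c at quantile level tau."""
    return sum(max(tau * (y - c), (tau - 1) * (y - c)) for y in ys)

def constant_quantile_fit(ys, tau):
    """Constant that minimizes the pinball loss over the observed values."""
    return min(ys, key=lambda c: pinball_loss(c, ys, tau))

bmi = [19, 22, 24, 27, 41]               # toy body-mass-index values
print(constant_quantile_fit(bmi, 0.5))   # 24: the median, robust to the outlier
print(constant_quantile_fit(bmi, 0.9))   # 41: the upper tail a mean model would blur
```

A full quantile regression adds covariates (prenatal, postnatal, childhood factors) to the prediction, but the loss function is the same.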
NASA Astrophysics Data System (ADS)
Whidden, E.; Roulet, N.
2003-04-01
Interpretation of a site average terrestrial flux may be complicated in the presence of inhomogeneities. Inhomogeneity may invalidate the basic assumptions of aerodynamic flux measurement. Chamber measurement may miss or misinterpret important temporal or spatial anomalies. Models may smooth over important nonlinearities depending on the scale of application. Although inhomogeneity is usually seen as a design problem, many sites have spatial variance that may have a large impact on net flux, and in many cases a large homogeneous surface is unrealistic. The sensitivity and validity of a site average flux are investigated in the presence of an inhomogeneous site. Directional differences are used to evaluate the validity of aerodynamic methods and the computation of a site average tower flux. Empirical and modelling methods are used to interpret the spatial controls on flux. An ecosystem model, Ecosys, is used to assess spatial length scales appropriate to the ecophysiologic controls. A diffusion model is used to compare tower, chamber, and model data, by spatially weighting contributions within the tower footprint. Diffusion model weighting is also used to improve tower flux estimates by producing footprint averaged ecological parameters (soil moisture, soil temperature, etc.). Although uncertainty remains in the validity of measurement methods and the accuracy of diffusion models, a detailed spatial interpretation is required at an inhomogeneous site. Agreement in flux estimates between methods improves with spatial interpretation, showing its importance to estimating a site average flux. Small-scale temporal and spatial anomalies may be relatively unimportant to overall flux, but accounting for medium-scale differences in ecophysiological controls is necessary. A combination of measurements and modelling can be used to define the appropriate time and length scales of significant non-linearity due to inhomogeneity.
Clinical Applications of Gastrointestinal Manometry in Children
2014-01-01
Manometry is a noninvasive diagnostic tool for identifying motility dysfunction of the gastrointestinal tract. Despite the great technical advances in monitoring motility, performance of the study in pediatric patients has several limitations that should be considered during the procedure and interpretation of the test results. This article reviews the clinical applications of conventional esophageal and anorectal manometries in children by describing a technique for performing the test. This review will develop the uniformity required for the methods of performance, the parameters for measurement, and interpretation of test results that could be applied in pediatric clinical practice. PMID:24749084
Molecular profiles to biology and pathways: a systems biology approach.
Van Laere, Steven; Dirix, Luc; Vermeulen, Peter
2016-06-16
Interpreting molecular profiles in a biological context requires specialized analysis strategies. Initially, lists of relevant genes were screened to identify enriched concepts associated with pathways or specific molecular processes. However, the shortcomings of interpreting gene lists with predefined gene sets have driven the development of novel methods that rely heavily on network-based concepts. These algorithms have the advantage of allowing a more holistic view of the signaling properties of the condition under study and of being suitable for integrating different data types, such as gene expression, gene mutation, and even histological parameters.
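The gene-list screening approach described above is conventionally implemented as a hypergeometric over-representation test: given an overlap of k genes between a hit list of n genes and a pathway of K genes drawn from a genome of N, the p-value is the probability of seeing at least k overlapping genes by chance. A minimal stdlib sketch with made-up numbers (the toy genome sizes are illustrative, not from the paper):

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """P(overlap >= k) under the hypergeometric null: drawing n genes
    at random from a genome of N that contains K pathway genes."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / total

# Toy genome of 20 genes, a 5-gene pathway, and a 5-gene hit list
# sharing 4 genes with the pathway.
print(round(enrichment_pvalue(20, 5, 5, 4), 4))  # 0.0049: strong enrichment
```

Network-based methods replace the fixed gene sets used here with connectivity information, but the null-model logic is the same.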
Interpretation of magnetotelluric resistivity and phase soundings over horizontal layers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patella, D.
1976-02-01
The present paper deals with a new inverse method for quantitatively interpreting magnetotelluric apparent resistivity and phase-lag sounding curves over horizontally stratified earth sections. The recurrent character of the general formula relating the wave impedance of an (n-1)-layered medium to that of an n-layered medium suggests the use of the method of reduction to a lower boundary plane, as originally termed by Koefoed in the case of dc resistivity soundings. The layering parameters are so directly derived by a simple iterative procedure. The method is applicable for any number of layers, but only when both apparent resistivity and phase-lag sounding curves are jointly available. Moreover, no sophisticated algorithm is required: a simple desk electronic calculator together with a sheet of two-layer apparent resistivity and phase-lag master curves is sufficient to reproduce earth sections which, in the range of equivalence, are all consistent with field data.
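The recursive impedance relation the abstract alludes to is not reproduced there; for orientation, the standard recursion for a stack of horizontal layers (layer n with thickness $h_n$, conductivity $\sigma_n$, propagation constant $k_n$ and intrinsic impedance $Z_{0n}$) is usually written as

```latex
Z_n = Z_{0n}\,\frac{Z_{n+1} + Z_{0n}\tanh(k_n h_n)}
                   {Z_{0n} + Z_{n+1}\tanh(k_n h_n)},
\qquad
k_n = \sqrt{i\omega\mu_0\sigma_n},
\qquad
Z_{0n} = \frac{i\omega\mu_0}{k_n},
```

so the surface impedance $Z_1$ is built up from the basal half-space outward. Whether Patella's reduction scheme uses exactly this parameterization should be confirmed against the paper itself.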
Resolution of structural heterogeneity in dynamic crystallography
Ren, Zhong; Chan, Peter W. Y.; Moffat, Keith; Pai, Emil F.; Royer, William E.; Šrajer, Vukica; Yang, Xiaojing
2013-01-01
Dynamic behavior of proteins is critical to their function. X-ray crystallography, a powerful yet mostly static technique, faces inherent challenges in acquiring dynamic information despite decades of effort. Dynamic ‘structural changes’ are often indirectly inferred from ‘structural differences’ by comparing related static structures. In contrast, the direct observation of dynamic structural changes requires the initiation of a biochemical reaction or process in a crystal. Both the direct and the indirect approaches share a common challenge in analysis: how to interpret the structural heterogeneity intrinsic to all dynamic processes. This paper presents a real-space approach to this challenge, in which a suite of analytical methods and tools to identify and refine the mixed structural species present in multiple crystallographic data sets have been developed. These methods have been applied to representative scenarios in dynamic crystallography, and reveal structural information that is otherwise difficult to interpret or inaccessible using conventional methods. PMID:23695239
Resolution of structural heterogeneity in dynamic crystallography.
Ren, Zhong; Chan, Peter W Y; Moffat, Keith; Pai, Emil F; Royer, William E; Šrajer, Vukica; Yang, Xiaojing
2013-06-01
Dynamic behavior of proteins is critical to their function. X-ray crystallography, a powerful yet mostly static technique, faces inherent challenges in acquiring dynamic information despite decades of effort. Dynamic `structural changes' are often indirectly inferred from `structural differences' by comparing related static structures. In contrast, the direct observation of dynamic structural changes requires the initiation of a biochemical reaction or process in a crystal. Both the direct and the indirect approaches share a common challenge in analysis: how to interpret the structural heterogeneity intrinsic to all dynamic processes. This paper presents a real-space approach to this challenge, in which a suite of analytical methods and tools to identify and refine the mixed structural species present in multiple crystallographic data sets have been developed. These methods have been applied to representative scenarios in dynamic crystallography, and reveal structural information that is otherwise difficult to interpret or inaccessible using conventional methods.
Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.
1999-01-01
This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user.
A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
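The reporting conventions described above reduce to a small decision rule. The sketch below is an illustration of that rule, not USGS code, and it makes the simplifying assumption that any measurement below the long-term method detection level counts as a non-detection:

```python
def report_concentration(conc, lt_mdl):
    """Apply the reporting conventions described above.

    lt_mdl : long-term method detection level (LT-MDL)
    lrl    : laboratory reporting level, set at twice the LT-MDL
    """
    lrl = 2 * lt_mdl
    if conc < lt_mdl:           # simplifying assumption: treat as non-detection
        return f"<{lrl}"        # censored to the laboratory reporting level
    if conc < lrl:
        return f"E{conc}"       # estimated concentration between LT-MDL and LRL
    return str(conc)            # quantified at or above the LRL

print(report_concentration(0.2, 0.5))  # <1.0  (censored non-detection)
print(report_concentration(0.7, 0.5))  # E0.7  (estimated)
print(report_concentration(1.5, 0.5))  # 1.5   (quantified)
```

As the report notes, users may still apply further censoring downstream depending on the intended use of the data.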
Interpretive biases in chronic insomnia: an investigation using a priming paradigm.
Ree, Melissa J; Harvey, Allison G
2006-09-01
Disorder-congruent interpretations of ambiguous stimuli characterize several psychological disorders and have been implicated in their maintenance. Models of insomnia have highlighted the importance of cognitive processes, but the possibility that biased interpretations are important has been minimally investigated. Hence, a priming methodology was employed to investigate the presence of an interpretive bias in insomnia. A sample of 78 participants, differing in the presence of a diagnosis of insomnia, severity of sleep disturbance, and sleepiness, was required to read ambiguous sentences and make a lexical decision about target words that followed. Sleepiness at the time of the experiment was associated with the likelihood with which participants made insomnia- and threat-consistent interpretations of ambiguous sentences. The results suggest that there is a general bias towards threatening interpretations when individuals are sleepy, and that cognitive accounts of insomnia require revision to include a role for interpretive bias when people are sleepy. Future research is required to investigate whether this interpretive bias plays a causal role in the maintenance of insomnia.
Revealing plant cryptotypes: defining meaningful phenotypes among infinite traits.
Chitwood, Daniel H; Topp, Christopher N
2015-04-01
The plant phenotype is infinite. Plants vary morphologically and molecularly over developmental time, in response to the environment, and genetically. Exhaustive phenotyping not only remains out of reach but is also the limiting factor in interpreting the wealth of genetic information currently available. Although phenotyping methods are always improving, an impasse remains: even if we could measure the entirety of phenotype, how would we interpret it? We propose the concept of cryptotype to describe latent, multivariate phenotypes that maximize the separation of a priori classes. Whether the infinite points comprising a leaf outline or shape descriptors defining root architecture, statistical methods to discern the quantitative essence of an organism will be required as we approach measuring the totality of phenotype. Copyright © 2015 Elsevier Ltd. All rights reserved.
Studying neuroanatomy using MRI.
Lerch, Jason P; van der Kouwe, André J W; Raznahan, Armin; Paus, Tomáš; Johansen-Berg, Heidi; Miller, Karla L; Smith, Stephen M; Fischl, Bruce; Sotiropoulos, Stamatios N
2017-02-23
The study of neuroanatomy using imaging enables key insights into how our brains function, are shaped by genes and environment, and change with development, aging and disease. Developments in MRI acquisition, image processing and data modeling have been key to these advances. However, MRI provides an indirect measurement of the biological signals we aim to investigate. Thus, artifacts and key questions of correct interpretation can confound the readouts provided by anatomical MRI. In this review we provide an overview of the methods for measuring macro- and mesoscopic structure and for inferring microstructural properties; we also describe key artifacts and confounds that can lead to incorrect conclusions. Ultimately, we believe that, although methods need to improve and caution is required in interpretation, structural MRI continues to have great promise in furthering our understanding of how the brain works.
An Oil Spill in a Tube: An Accessible Approach for Teaching Environmental NMR Spectroscopy
ERIC Educational Resources Information Center
Simpson, André J.; Mitchell, Perry J.; Masoom, Hussain; Mobarhan, Yalda Liaghati; Adamo, Antonio; Dicks, Andrew P.
2015-01-01
NMR spectroscopy has great potential as an instrumental method for environmental chemistry research and monitoring but may be underused in teaching laboratories because of its complexity and the level of expertise required in operating the instrument and interpreting data. This laboratory experiment introduces environmental NMR spectroscopy to…
Code of Federal Regulations, 2013 CFR
2013-01-01
... informed in writing of its right to a hearing, of the method by which a hearing may be requested, and that... hearing. If the individual making the request speaks a language other than English and the State agency is required by § 272.4(c)(3) to provide bilingual staff or interpreters who speak the appropriate language...
ERIC Educational Resources Information Center
Jelicic, Helena; Phelps, Erin; Lerner, Richard M.
2009-01-01
Developmental science rests on describing, explaining, and optimizing intraindividual changes and, hence, empirically requires longitudinal research. Problems of missing data arise in most longitudinal studies, thus creating challenges for interpreting the substance and structure of intraindividual change. Using a sample of reports of longitudinal…
Helicopter rotor and engine sizing for preliminary performance estimation
NASA Technical Reports Server (NTRS)
Talbot, P. D.; Bowles, J. V.; Lee, H. C.
1986-01-01
Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
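The kind of preliminary estimation the report describes can be illustrated with a textbook momentum-theory calculation of hover power from gross weight and disk area. This is a generic sketch under stated assumptions, not the report's actual method; the figure-of-merit value, the sea-level air density, and the example weight and radius are all assumptions for illustration.

```python
import math

# Momentum-theory estimate of installed hover power for a single-rotor
# helicopter: ideal power = T^1.5 / sqrt(2 * rho * A), divided by an
# assumed figure of merit to account for real-rotor losses.

def hover_power(weight_n, rotor_radius_m, air_density=1.225, figure_of_merit=0.75):
    """Estimate hover power (W) from gross weight (N) and rotor radius (m)."""
    disk_area = math.pi * rotor_radius_m ** 2                       # m^2
    ideal_power = weight_n ** 1.5 / math.sqrt(2 * air_density * disk_area)
    return ideal_power / figure_of_merit                            # W

# Example: 40 kN gross weight, 7 m rotor radius (hypothetical numbers)
print(round(hover_power(40e3, 7.0) / 1e3), "kW")  # roughly 550 kW under these assumptions
```

Disk loading (weight divided by disk area) enters through the same formula: for a fixed weight, a smaller rotor raises disk loading and drives the required hover power up, which is one of the fundamental trades the report's sizing methods capture.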
Green, Julie M.; Wilcke, Jeffrey R.; Abbott, Jonathon; Rees, Loren P.
2006-01-01
Objective: This study evaluated an existing SNOMED-CT® model for structured recording of heart murmur findings and compared it to a concept-dependent attributes model using content from SNOMED-CT. Methods: The authors developed a model for recording heart murmur findings as an alternative to SNOMED-CT's use of Interprets and Has interpretation. A micro-nomenclature was then created to support each model using subset and extension mechanisms described for SNOMED-CT. Each micro-nomenclature included a partonomy of cardiac cycle timing values. A mechanism for handling ranges of values was also devised. One hundred clinical heart murmurs were recorded using purpose-built recording software based on both models. Results: Each micro-nomenclature was extended through the addition of the same list of concepts. SNOMED role grouping was required in both models. All 100 clinical murmurs were described using each model. The only major differences between the two models were the number of relationship rows required for storage and the hierarchical assignments of concepts within the micro-nomenclatures. Conclusion: The authors were able to capture 100 clinical heart murmurs with both models. Requirements for implementing the two models were virtually identical. In fact, data stored using these models could be easily interconverted. There is no apparent penalty for implementing either approach. PMID:16501179
Anatomical guidance for functional near-infrared spectroscopy: AtlasViewer tutorial
Aasted, Christopher M.; Yücel, Meryem A.; Cooper, Robert J.; Dubb, Jay; Tsuzuki, Daisuke; Becerra, Lino; Petkov, Mike P.; Borsook, David; Dan, Ippeita; Boas, David A.
2015-01-01
Functional near-infrared spectroscopy (fNIRS) is an optical imaging method used to noninvasively measure cerebral hemoglobin concentration changes induced by brain activation. Using structural guidance in fNIRS research enhances interpretation of results and facilitates comparisons between studies. AtlasViewer is an open-source software package we have developed that incorporates multiple spatial registration tools to enable structural guidance in the interpretation of fNIRS studies. We introduce the reader to the layout of the AtlasViewer graphical user interface, its folder structure, and the user files required to create fNIRS probes with sources and detectors registered to desired locations on the head; to evaluate probe fabrication error and intersubject probe placement variability; and to estimate measurement sensitivity to different brain regions as well as image reconstruction performance. Further, we detail how AtlasViewer provides a generic head atlas for guiding interpretation of fNIRS results but also permits users to provide subject-specific head anatomies to interpret their results. We anticipate that AtlasViewer will be a valuable tool in improving the anatomical interpretation of fNIRS studies. PMID:26157991
Dickinson, Louise; Ahmed, Hashim U; Allen, Clare; Barentsz, Jelle O; Carey, Brendan; Futterer, Jurgen J; Heijmink, Stijn W; Hoskin, Peter J; Kirkham, Alex; Padhani, Anwar R; Persad, Raj; Puech, Philippe; Punwani, Shonit; Sohaib, Aslam S; Tombal, Bertrand; Villers, Arnauld; van der Meulen, Jan; Emberton, Mark
2011-04-01
Multiparametric magnetic resonance imaging (mpMRI) may have a role in detecting clinically significant prostate cancer in men with raised serum prostate-specific antigen levels. Variations in technique and the interpretation of images have contributed to inconsistency in its reported performance characteristics. Our aim was to make recommendations on a standardised method for the conduct, interpretation, and reporting of prostate mpMRI for prostate cancer detection and localisation. A consensus meeting of 16 European prostate cancer experts was held, following the UCLA-RAND Appropriateness Method and facilitated by an independent chair. Before the meeting, 520 items were scored for "appropriateness" by panel members, discussed face to face, and rescored. Agreement was reached on 67% of 260 items related to imaging sequence parameters. T2-weighted, dynamic contrast-enhanced, and diffusion-weighted MRI were the key sequences incorporated into the minimum requirements. Consensus was also reached on 54% of 260 items related to image interpretation and reporting, including features of malignancy on individual sequences. A 5-point scale was agreed on for communicating the probability of malignancy, with a minimum of 16 prostatic regions of interest, to include a pictorial representation of suspicious foci. Limitations relate to consensus methodology. Dominant personalities are known to affect the opinions of the group and were countered by a neutral chairperson. Consensus was reached on a number of areas related to the conduct, interpretation, and reporting of mpMRI for the detection, localisation, and characterisation of prostate cancer. Before optimal dissemination of this technology, these outcomes will require formal validation in prospective trials. Copyright © 2010 European Association of Urology. Published by Elsevier B.V. All rights reserved.
28 CFR 904.2 - Interpretation of the criminal history record screening requirement.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Interpretation of the criminal history... PRIVACY COMPACT COUNCIL STATE CRIMINAL HISTORY RECORD SCREENING STANDARDS § 904.2 Interpretation of the criminal history record screening requirement. Compact Article IV(c) provides that “Any record obtained...
NASA Astrophysics Data System (ADS)
Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi
2010-03-01
X-ray CT images have been widely used in clinical routine in recent years. CT images scanned by a modern CT scanner can show the details of various organs and tissues, which means various organs and tissues can be interpreted simultaneously on CT images. However, CT image interpretation requires a lot of time and energy, so support for interpreting CT images based on image-processing techniques is expected. The interpretation of spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme for spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) automated extraction of the skeletal region based on CT number thresholding; (2) automated extraction of the center line of the spinal canal; (3) generation of the median plane image of the spine, reformatted based on the spinal canal; and (4) quantification of the spinal curvature. The proposed scheme was applied to 10 cases and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation between values obtained by the proposed (vector) method and the Cobb angle (95% confidence interval for lumbar lordosis: 0.81-0.99). The proposed method also provides reproducible results (inter- and intra-observer variability within 2°). These experimental results suggest that the proposed method is efficient for quantifying spinal curvature on CT images.
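The abstract does not spell out the exact "vector" formulation, but the general idea of measuring a curvature angle as the angle between two direction vectors along the canal center line can be sketched as follows; the function and the example vectors are hypothetical illustrations, not the paper's implementation.

```python
import math

# Illustrative curvature measure: the angle between the direction of the
# spinal-canal center line at the upper and lower ends of a curve, which
# is the same geometric quantity a Cobb angle approximates from endplates.

def angle_between(v1, v2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical center-line directions at the upper and lower end of a curve
upper = (0.2, 1.0)
lower = (-0.3, 1.0)
print(round(angle_between(upper, lower), 1))  # 28.0 (degrees)
```

Because the direction vectors come from an automatically extracted center line rather than manually placed endplate lines, a measure like this can be computed without observer input, which is consistent with the low inter- and intra-observer variability the authors report.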
An Automated Method for Identifying Artifact in Independent Component Analysis of Resting-State fMRI
Bhaganagarapu, Kaushik; Jackson, Graeme D.; Abbott, David F.
2013-01-01
An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: does not require temporal information about an fMRI paradigm; does not require the user to train the algorithm; requires only the fMRI images (additional acquisition of anatomical imaging not required); is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; can be applied to resting-state fMRI; is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26 and 72% of the components as artifact (mean 55%). About 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available. PMID:23847511
Interactive visualization to advance earthquake simulation
Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.
2008-01-01
The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.
Invited Commentary: Antecedents of Obesity—Analysis, Interpretation, and Use of Longitudinal Data
Gillman, Matthew W.; Kleinman, Ken
2007-01-01
The obesity epidemic causes misery and death. Most epidemiologists accept the hypothesis that characteristics of the early stages of human development have lifelong influences on obesity-related health outcomes. Unfortunately, there is a dearth of data of sufficient scope and individual history to help unravel the associations of prenatal, postnatal, and childhood factors with adult obesity and health outcomes. Here the authors discuss analytic methods, the interpretation of models, and the use to which such rare and valuable data may be put in developing interventions to combat the epidemic. For example, analytic methods such as quantile and multinomial logistic regression can describe the effects on body mass index range rather than just its mean; structural equation models may allow comparison of the contributions of different factors at different periods in the life course. Interpretation of the data and model construction is complex, and it requires careful consideration of the biologic plausibility and statistical interpretation of putative causal factors. The goals of discovering modifiable determinants of obesity during the prenatal, postnatal, and childhood periods must be kept in sight, and analyses should be built to facilitate them. Ultimately, interventions in these factors may help prevent obesity-related adverse health outcomes for future generations. PMID:17490988
NASA Astrophysics Data System (ADS)
Cobden, L. J.
2017-12-01
Mineral physics provides the essential link between seismic observations of the Earth's interior, and laboratory (or computer-simulated) measurements of rock properties. In this presentation I will outline the procedure for quantitative conversion from thermochemical structure to seismic structure (and vice versa) using the latest datasets from seismology and mineralogy. I will show examples of how this method can allow us to infer major chemical and dynamic properties of the deep mantle. I will also indicate where uncertainties and limitations in the data require us to exercise caution, in order not to "over-interpret" seismic observations. Understanding and modelling these uncertainties serves as a useful guide for mineralogists to ascertain which mineral parameters are most useful in seismic interpretation, and enables seismologists to optimise their data assembly and inversions for quantitative interpretations.
Methylation analysis of polysaccharides: Technical advice.
Sims, Ian M; Carnachan, Susan M; Bell, Tracey J; Hinkley, Simon F R
2018-05-15
Glycosyl linkage (methylation) analysis is used widely for the structural determination of oligo- and poly-saccharides. The procedure involves derivatisation of the individual component sugars of a polysaccharide to partially methylated alditol acetates which are analysed and quantified by gas chromatography-mass spectrometry. The linkage positions for each component sugar can be determined by correctly identifying the partially methylated alditol acetates. Although the methods are well established, there are many technical aspects to this procedure and both careful attention to detail and considerable experience are required to achieve a successful methylation analysis and to correctly interpret the data generated. The aim of this article is to provide the technical details and critical procedural steps necessary for a successful methylation analysis and to assist researchers (a) in interpreting data correctly and (b) in providing the comprehensive data required for reviewers to fully assess the work. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Hayes, David; Widanski, Bozena
2013-01-01
A laboratory experiment is described that introduces students to "real-world" hazardous waste management issues chemists face. The students are required to define an analytical problem, choose a laboratory analysis method, investigate cost factors, consider quality-control issues, interpret the meaning of results, and provide management…
Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos
2017-01-01
Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may serve the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle and facilitate the application of physics principles to the study of the human brain and cognition.
Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2016-01-01
The recently introduced Orbitrap Fusion mass spectrometer permits various types of MS2 acquisition methods. To date, these different MS2 strategies and the optimal data interpretation approach for each have not been adequately evaluated. This study comprehensively investigated four MS2 strategies: HCD-OT (higher-energy collisional dissociation with Orbitrap detection), HCD-IT (HCD with ion trap, IT), CID-IT (collision-induced dissociation with IT), and CID-OT on the Orbitrap Fusion. To achieve an extensive comparison and identify the optimal data interpretation method for each technique, several search engines (SEQUEST and Mascot) and post-processing methods (score-based, PeptideProphet, and Percolator) were assessed across all techniques for the analysis of a human cell proteome. It was found that divergent conclusions could be drawn from the same dataset when different data interpretation approaches were used, underscoring the need for a relatively fair comparison among techniques. Percolator was chosen for the comparison of techniques because it performed best across all search engines and MS2 strategies. For the analysis of the human cell proteome using individual MS2 strategies, the highest number of identifications was achieved by HCD-OT, followed by HCD-IT and CID-IT. Based on these results, we concluded that a relatively fair platform for data interpretation is necessary to avoid divergent conclusions from the same dataset, and that HCD-OT and HCD-IT may be preferable for protein/peptide identification using the Orbitrap Fusion. PMID:27472422
Integrating concepts and skills: Slope and kinematics graphs
NASA Astrophysics Data System (ADS)
Tonelli, Edward P., Jr.
The concept of force is a foundational idea in physics. To predict the results of applying forces to objects, a student must be able to interpret data representing changes in distance, time, speed, and acceleration. Comprehension of kinematics concepts requires students to interpret motion graphs, where rates of change are represented as slopes of line segments. Studies have shown that majorities of students who show proficiency with mathematical concepts fail to accurately interpret motion graphs. The primary aim of this study was to examine how students apply their knowledge of slope when interpreting kinematics graphs. To answer the research questions, a mixed-methods research design, which included a survey and interviews, was adopted. Ninety-eight (N=98) high school students completed surveys, which were quantitatively analyzed along with qualitative information collected from interviews of students (N=15) and teachers (N=2). The study showed that students who recalled methods for calculating slopes and speeds calculated slopes accurately, but calculated speeds inaccurately. When comparing slopes and speeds, most students resorted to calculating instead of visual inspection. Most students recalled and applied memorized rules. Students who calculated slopes and speeds inaccurately failed to recall methods of calculating slopes and speeds, but when comparing speeds, these students connected the concepts of distance and time to the line segments and the rates of change they represented. This study's findings will likely help mathematics and science educators to better assist their students to apply their knowledge of the definition of slope and skills in kinematics concepts.
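The mathematical identity at the heart of the study can be shown in a few lines: on a distance-time graph, the slope of a segment between two points is the speed. The function and the example points are illustrative, not taken from the study's survey items.

```python
# On a distance-time graph, computing the slope of a line segment and
# computing the object's speed over that interval are the same operation.

def slope(p1, p2):
    """Rise over run between two (t, d) points on a distance-time graph."""
    (t1, d1), (t2, d2) = p1, p2
    return (d2 - d1) / (t2 - t1)

# Segment from (2 s, 10 m) to (6 s, 30 m)
speed = slope((2, 10), (6, 30))
print(speed)  # 5.0 -- slope in m/s, i.e., the speed over the interval
```

The study's finding is precisely that many students perform this calculation correctly when asked for "slope" but not when asked for "speed", even though the two questions name the same computation.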
Estimation of gene induction enables a relevance-based ranking of gene sets.
Bartholomé, Kilian; Kreutz, Clemens; Timmer, Jens
2009-07-01
In order to handle and interpret the vast amounts of data produced by microarray experiments, the analysis of sets of genes with a common biological functionality has been shown to be advantageous compared to single gene analyses. Some statistical methods have been proposed to analyse the differential gene expression of gene sets in microarray experiments. However, most of these methods either require threshold values to be chosen for the analysis, or they need some reference set for the determination of significance. We present a method that estimates the number of differentially expressed genes in a gene set without requiring a threshold value for significance of genes. The method is self-contained (i.e., it does not require a reference set for comparison). In contrast to other methods which are focused on significance, our approach emphasizes the relevance of the regulation of gene sets. The presented method measures the degree of regulation of a gene set and is a useful tool to compare the induction of different gene sets and place the results of microarray experiments into the biological context. An R-package is available.
Moitessier, N; Englebienne, P; Lee, D; Lawandi, J; Corbeil, C R
2008-01-01
Accelerating the drug discovery process requires predictive computational protocols capable of reducing or simplifying the synthetic and/or combinatorial challenge. Docking-based virtual screening methods have been developed and successfully applied to a number of pharmaceutical targets. In this review, we first present the current status of docking and scoring methods, with exhaustive lists of these. We next discuss reported comparative studies, outlining criteria for their interpretation. In the final section, we describe some of the remaining developments that would potentially lead to a universally applicable docking/scoring method. PMID:18037925
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Require such member to hold itself out as being willing to buy and sell security futures for its own... for security futures-authority, purpose, interpretation, and scope. 242.400 Section 242.400 Commodity..., AND NMS AND CUSTOMER MARGIN REQUIREMENTS FOR SECURITY FUTURES Customer Margin Requirements for...
Correcting AUC for Measurement Error.
Rosner, Bernard; Tworoger, Shelley; Qiu, Weiliang
2015-12-01
Diagnostic biomarkers are used frequently in epidemiologic and clinical work. The ability of a diagnostic biomarker to discriminate between subjects who develop disease (cases) and subjects who do not (controls) is often measured by the area under the receiver operating characteristic curve (AUC). The diagnostic biomarkers are usually measured with error. Ignoring measurement error can cause biased estimation of AUC, which results in misleading interpretation of the efficacy of a diagnostic biomarker. Several methods have been proposed to correct AUC for measurement error, most of which required the normality assumption for the distributions of diagnostic biomarkers. In this article, we propose a new method to correct AUC for measurement error and derive approximate confidence limits for the corrected AUC. The proposed method does not require the normality assumption. Both real data analyses and simulation studies show good performance of the proposed measurement error correction method.
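To make concrete the quantity being corrected, the empirical AUC equals the Mann-Whitney probability that a randomly chosen case has a higher biomarker value than a randomly chosen control. The sketch below computes that uncorrected estimate only; it does not reproduce the article's measurement-error correction, and the function name and example values are hypothetical.

```python
# Empirical AUC as the Mann-Whitney statistic: the fraction of
# (case, control) pairs in which the case's biomarker value is higher,
# counting ties as one half.

def empirical_auc(cases, controls):
    """P(case biomarker > control biomarker), ties counted as 1/2."""
    wins = 0.0
    for x in cases:
        for y in controls:
            if x > y:
                wins += 1
            elif x == y:
                wins += 0.5
    return wins / (len(cases) * len(controls))

print(empirical_auc([2.0, 3.0, 5.0], [1.0, 2.0, 4.0]))  # ≈ 0.722
```

Measurement error in the biomarker adds noise to both groups, blurring the separation between cases and controls and biasing an estimate like this toward 0.5; correcting that attenuation is what the proposed method addresses.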
Antiperovitch, Pavel; Zareba, Wojciech; Steinberg, Jonathan S; Bacharova, Ljuba; Tereshchenko, Larisa G; Farre, Jeronimo; Nikus, Kjell; Ikeda, Takanori; Baranchuk, Adrian
2018-03-01
Despite its importance in everyday clinical practice, the ability of physicians to interpret electrocardiograms (ECGs) is highly variable. ECG patterns are often misdiagnosed, and electrocardiographic emergencies are frequently missed, leading to adverse patient outcomes. Currently, many medical education programs lack an organized curriculum and competency assessment to ensure trainees master this essential skill. ECG patterns previously mentioned in the literature were organized into groups from A to D based on their clinical importance and distributed among levels of training. Incremental versions of this organization were circulated among members of the International Society of Electrocardiology and the International Society of Holter and Noninvasive Electrocardiology until complete consensus was reached. We present reasonably attainable ECG interpretation competencies for undergraduate and postgraduate trainees. Previous literature suggests that methods of teaching ECG interpretation are less important and can be selected based on the available resources of each education program and student preference. The evidence clearly favors summative trainee evaluation methods, which would facilitate learning and ensure that appropriate competencies are acquired. Resources should be allocated to ensure that every trainee reaches their training milestones and that no electrocardiographic emergency (class A condition) is ever missed. We hope that these guidelines will inform medical education programs and encourage them to allocate sufficient resources and develop organized curricula. Assessments must be in place to ensure trainees acquire the level-appropriate ECG interpretation skills that are required for safe clinical practice. © 2017 Society of Hospital Medicine.
Quantifying induced effects of subsurface renewable energy storage
NASA Astrophysics Data System (ADS)
Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas
2015-04-01
New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of hosting large amounts of energy or substances. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. To achieve this aim, synthetic scenarios for the use of the geological subsurface as an energy storage system are developed and parameterized in this work. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized.
This work is part of the ANGUS+ project, funded by the German Ministry of Education and Research (BMBF).
Cho, Yunju; Ahmed, Arif; Islam, Annana; Kim, Sunghwan
2015-01-01
Because of the increasing importance of heavy and unconventional crude oil as an energy source, there is a growing need for petroleomics: the pursuit of more complete and detailed knowledge of the chemical compositions of crude oil. Crude oil has an extremely complex nature; hence, techniques with ultra-high resolving capabilities, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), are necessary. FT-ICR MS has been successfully applied to the study of heavy and unconventional crude oils such as bitumen and shale oil. However, the analysis of crude oil with FT-ICR MS is not trivial, and it has pushed analysis to the limits of instrumental and methodological capabilities. For example, high-resolution mass spectra of crude oils may contain over 100,000 peaks that require interpretation. To visualize large data sets more effectively, data processing methods such as Kendrick mass defect analysis and statistical analyses have been developed. The successful application of FT-ICR MS to the study of crude oil has been critically dependent on key developments in FT-ICR MS instrumentation and data processing methods. This review offers an introduction to the basic principles, FT-ICR MS instrumentation development, ionization techniques, and data interpretation methods for petroleomics and is intended for readers having no prior experience in this field of study. © 2014 Wiley Periodicals, Inc.
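Kendrick mass defect analysis, one of the data-processing methods mentioned above, rescales measured masses so that members of a CH2 homologous series collapse onto a single defect value, which lets tens of thousands of peaks be organized into series. A short sketch with hypothetical peak masses (conventions for the nominal mass rounding vary between groups):

```python
def kendrick_mass_defect(mz):
    """Kendrick mass defect using the CH2 (14.01565 Da) base unit.
    Homologues differing only by CH2 groups share the same defect."""
    kendrick_mass = mz * 14.0 / 14.01565
    # Defect = nominal (rounded) Kendrick mass minus exact Kendrick mass.
    return round(kendrick_mass) - kendrick_mass

# A hypothetical CH2 homologous series: each mass differs by 14.01565 Da.
series = [254.24967, 268.26532, 282.28097]
defects = [kendrick_mass_defect(m) for m in series]

# All members share (to within float precision) one defect value, so they
# fall on a horizontal line in a Kendrick plot.
print(max(defects) - min(defects) < 1e-6)
```

Plotting Kendrick mass defect against nominal Kendrick mass is what turns an uninterpretable list of 100,000+ peaks into visually separable compound classes.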
Kilkenny, Monique F; Lannin, Natasha A; Anderson, Craig S; Dewey, Helen M; Kim, Joosup; Barclay-Moss, Karen; Levi, Chris; Faux, Steven; Hill, Kelvin; Grabsch, Brenda; Middleton, Sandy; Thrift, Amanda G; Grimley, Rohan; Donnan, Geoffrey; Cadilhac, Dominique A
2018-03-01
In multicultural Australia, some patients with stroke cannot fully understand, or speak, English. Language barriers may reduce quality of care and consequent outcomes after stroke, yet little has been reported empirically. An observational study was conducted of patients with stroke or transient ischemic attack (2010-2015) captured from 45 hospitals participating in the Australian Stroke Clinical Registry. The use of interpreters in hospitals, which is routinely documented, was used as a proxy for severe language barriers. Health-Related Quality of Life was assessed using the EuroQoL-5 dimension-3 level measured 90 to 180 days after stroke. Logistic regression was undertaken to assess the association between domains of EuroQoL-5 dimension and interpreter status. Among 34 562 registrants, 1461 (4.2%) required an interpreter. Compared with patients without interpreters, patients requiring an interpreter were more often women (53% versus 46%; P<0.001), aged ≥75 years (68% versus 51%; P<0.001), and had greater access to stroke unit care (85% versus 78%; P<0.001). After accounting for patient characteristics and stroke severity, patients requiring interpreters had discharge outcomes (eg, mortality, discharge to rehabilitation) comparable to those of patients not needing interpreters. However, these patients reported poorer Health-Related Quality of Life (visual analogue scale coefficient, -9; 95% CI, -12.38 to -5.62), including more problems with self-care (odds ratio, 2.22; 95% CI, 1.82 to 2.72), pain (odds ratio, 1.84; 95% CI, 1.52 to 2.34), anxiety or depression (odds ratio, 1.60; 95% CI, 1.33 to 1.93), and usual activities (odds ratio, 1.62; 95% CI, 1.32 to 2.00). Patients requiring interpreters reported poorer Health-Related Quality of Life after stroke/transient ischemic attack despite greater access to stroke units. These findings should be interpreted with caution because we were unable to account for prestroke Health-Related Quality of Life. Further research is needed.
© 2018 American Heart Association, Inc.
Determining significant material properties: A discovery approach
NASA Technical Reports Server (NTRS)
Karplus, Alan K.
1992-01-01
The following is a laboratory experiment designed to further understanding of materials science. The experiment itself can be informative for persons of any age past elementary school, and even for some in elementary school. The preparation of the plastic samples is readily accomplished by persons with reasonable dexterity in the cutting of paper designs. The completion of the statistical Design of Experiments, which uses Yates' Method, requires only basic math (addition and subtraction). Interpretive work requires plotting of data and making observations. Knowledge of statistical methods would be helpful. The purpose of this experiment is to acquaint students with the seven classes of recyclable plastics, and to provide hands-on learning about the response of these plastics to mechanical tensile loading.
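Yates' Method mentioned above reduces the analysis of a full 2^k factorial experiment to repeated pairwise additions and subtractions of the responses taken in standard order. A compact sketch (the tensile-strength readings are hypothetical):

```python
def yates(responses):
    """Yates' algorithm for a full 2^k factorial experiment.
    Input: responses in standard (Yates) order, e.g. (1), a, b, ab for k=2.
    Output: contrast totals [grand total, A, B, AB, ...]."""
    data = list(responses)
    n = len(data)
    k = n.bit_length() - 1
    assert n == 2 ** k, "length must be a power of two"
    for _ in range(k):
        # One pass: pairwise sums, then pairwise differences.
        sums = [data[i] + data[i + 1] for i in range(0, n, 2)]
        diffs = [data[i + 1] - data[i] for i in range(0, n, 2)]
        data = sums + diffs
    return data

# Toy 2^2 example in standard order (1), a, b, ab:
print(yates([10, 14, 12, 20]))  # [56, 12, 8, 4]
```

The first entry is the grand total (56); the remaining entries are the contrast totals for factor A, factor B, and the AB interaction, each dividable by 2^(k-1) to give effect estimates.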
Wehde, M. E.
1995-01-01
The common method of digital image comparison by subtraction imposes various constraints on the image contents. Precise registration of images is required to assure proper evaluation of surface locations. The attribute being measured and the calibration and scaling of the sensor are also important to the validity and interpretability of the subtraction result. Influences of sensor gains and offsets complicate the subtraction process. The presence of any uniform systematic transformation component in one of two images to be compared distorts the subtraction results and requires analyst intervention to interpret or remove it. A new technique has been developed to overcome these constraints. Images to be compared are first transformed using the cumulative relative frequency as a transfer function. The transformed images represent the contextual relationship of each surface location with respect to all others within the image. The process of differentiating between the transformed images results in a percentile rank ordered difference. This process produces consistent terrain-change information even when the above requirements necessary for subtraction are relaxed. This technique may be valuable to an appropriately designed hierarchical terrain-monitoring methodology because it does not require human participation in the process.
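The transform described above can be sketched as follows: each pixel is replaced by its cumulative relative frequency (percentile rank) within its own image before differencing, which cancels any monotonic sensor gain or offset. Variable names and the toy images are illustrative, not the original study's data.

```python
import numpy as np

def percentile_rank(image):
    """Replace each pixel by its cumulative relative frequency within the
    image, i.e. its percentile rank. The result depends only on the pixel's
    contextual position in the image's value distribution."""
    flat = image.ravel()
    ranks = flat.argsort().argsort()  # rank of each pixel (0 .. n-1)
    return (ranks.reshape(image.shape) + 1) / flat.size

rng = np.random.default_rng(1)
scene = rng.random((64, 64))
# The same scene seen through a different linear sensor calibration:
recalibrated = 2.5 * scene + 7.0

# Direct subtraction is dominated by the calibration difference ...
direct = np.abs(recalibrated - scene).mean()
# ... while the percentile-rank-ordered difference cancels it entirely.
ranked = np.abs(percentile_rank(recalibrated) - percentile_rank(scene)).max()
print(direct > 1.0, ranked == 0.0)
```

Because the rank transform removes any uniform monotonic component, a nonzero rank-ordered difference flags genuine contextual change rather than a sensor gain/offset artifact.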
TACIT: An open-source text analysis, crawling, and interpretation tool.
Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra
2017-04-01
As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.
Fault diagnosis model for power transformers based on information fusion
NASA Astrophysics Data System (ADS)
Dong, Ming; Yan, Zhang; Yang, Li; Judd, Martin D.
2005-07-01
Methods used to assess the insulation status of power transformers before they deteriorate to a critical state include dissolved gas analysis (DGA), partial discharge (PD) detection and transfer function techniques, etc. All of these approaches require experience in order to correctly interpret the observations. Artificial intelligence (AI) is increasingly used to improve interpretation of the individual datasets. However, a satisfactory diagnosis may not be obtained if only one technique is used. For example, the exact location of PD cannot be predicted if only DGA is performed. Moreover, using diverse methods may result in different diagnosis solutions, a problem that is addressed in this paper through the introduction of a fuzzy information fusion model. An inference scheme is proposed that yields consistent conclusions and manages the inherent uncertainty in the various methods. With the aid of information fusion, a framework is established that allows different diagnostic tools to be combined in a systematic way. The application of the information fusion technique to insulation diagnostics of transformers is shown to be promising by means of examples.
Interaction techniques for radiology workstations: impact on users' productivity
NASA Astrophysics Data System (ADS)
Moise, Adrian; Atkins, M. Stella
2004-04-01
As radiologists progress from reading images presented on film to modern computer systems with images presented on high-resolution displays, many new problems arise. Although the digital medium has many advantages, the radiologist's job becomes cluttered with many new tasks related to image manipulation. This paper presents our solution for supporting radiologists' interpretation of digital images by automating image presentation during sequential interpretation steps. Our method supports scenario-based interpretation, which groups data temporally, according to the mental paradigm of the physician. We extended current hanging protocols with support for "stages". A stage reflects the presentation of digital information required to complete a single step within a complex task. We demonstrated the benefits of staging in a user study with 20 lay subjects involved in a visual conjunctive search for targets, similar to a radiology task of identifying anatomical abnormalities. We designed a task and a set of stimuli which allowed us to simulate the interpretation workflow from a typical radiology scenario - reading a chest computed radiography exam when a prior study is also available. The simulation was possible by abstracting the radiologist's task and the basic workstation navigation functionality. We introduced "Stages," an interaction technique attuned to the radiologist's interpretation task. Compared to the traditional user interface, Stages generated a 14% reduction in the average interpretation time.
Interpreter use in an inner city accident and emergency department.
Leman, P
1997-01-01
OBJECTIVE: To determine the extent of communication problems that arose from patients whose primary language was non-English presenting to an inner city accident and emergency (A&E) department. METHODS: A prospective survey over seven consecutive days during September 1995. All adult patients other than those directly referred by their general practitioner to an inpatient team had a questionnaire completed by the A&E doctor first seeing the patient. The doctor recorded language ability and form of interpreter used, and estimated any prolongation of the consultation and ability to improve communication by the use of additional services. RESULTS: 103 patients (17%) did not speak English as their primary language; 55 patients (9.1% of the study population) had an English language ability rated as other than good, and 16 (29%) of these consultations could have been improved by the use of additional interpreter services; 28 patients overall (4.6% of the study population) required the use of an interpreter, who was usually a relative. CONCLUSIONS: A significant number of patients presenting to A&E have difficulty in communicating in English. These consultations could often have been improved by the use of additional interpreter services. Telephone interpreter services may provide the answer for use in A&E departments because of their instant and 24 hour availability. PMID:9132201
Thellier GUI: An integrated tool for analyzing paleointensity data from Thellier-type experiments
NASA Astrophysics Data System (ADS)
Shaar, Ron; Tauxe, Lisa
2013-03-01
Thellier-type experiments are a method used to estimate the intensity of the ancient geomagnetic field from samples carrying thermoremanent magnetization. The analysis of Thellier-type experimental data is conventionally done by manually interpreting data from each specimen individually. The main limitations of this approach are: (1) manual interpretation is highly subjective and can be biased by misleading concepts, (2) the procedure is time consuming, and (3) unless the measurement data are published, the final results cannot be reproduced by readers. These issues compound when trying to combine paleointensity data from a collection of studies. Here, we address these problems by introducing the Thellier GUI: a comprehensive tool for interpreting Thellier-type experimental data. The tool presents a graphical user interface, which allows manual interpretation of the data, but also includes two new interpretation tools: (1) Thellier Auto Interpreter: an automatic interpretation procedure based on a given set of experimental requirements, and (2) Consistency Test: a self-test for the consistency of the results assuming groups of samples that should have the same paleointensity values. We apply the new tools to data from two case studies. These demonstrate that interpretation of non-ideal Arai plots is nonunique and different selection criteria can lead to significantly different conclusions. Hence, we recommend adopting the automatic interpretation approach, as it allows a more objective interpretation, which can be easily repeated or revised by others. When the analysis is combined with a Consistency Test, the credibility of the interpretations is enhanced. We also make the case that published paleointensity studies should include the measurement data (as supplementary files or as contributions to the MagIC database) so that results based on a particular data set can be reproduced and assessed by others.
A Growing Consensus for Change in Interpretation of Clinical Research Evidence.
Wilkerson, Gary B; Denegar, Craig R
2018-03-01
The paradigm of evidence-based practice (EBP) is well established among the health care professions, but perspectives on the best methods for acquiring, analyzing, appraising, and using research evidence are evolving. The EBP paradigm has shifted away from a hierarchy of research-evidence quality to recognize that multiple research methods can yield evidence to guide clinicians and patients through a decision-making process. Whereas the "frequentist" approach to data interpretation through hypothesis testing has been the dominant analytical method used by and taught to athletic training students and scholars, this approach is not optimal for integrating evidence into routine clinical practice. Moreover, the dichotomy of rejecting, or failing to reject, a null hypothesis is inconsistent with the Bayesian-like clinical decision-making process that skilled health care providers intuitively use. We propose that data derived from multiple research methods can be best interpreted by reporting a credible lower limit that represents the smallest treatment effect at a specified level of certainty, which should be judged in relation to the smallest effect considered to be clinically meaningful. Such an approach can provide a quantifiable estimate of certainty that an individual patient needs follow-up attention to prevent an adverse outcome or that a meaningful level of therapeutic benefit will be derived from a given intervention. The practice of athletic training will be influenced by the evolution of the EBP paradigm. Contemporary practice will require clinicians to expand their critical-appraisal skills to effectively integrate the results derived from clinical research into the care of individual patients. Proper interpretation of a credible lower limit value for a magnitude ratio has the potential to increase the likelihood of favorable patient outcomes, thereby advancing the practice of evidence-based athletic training.
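The reporting approach proposed above can be illustrated with a toy calculation (all numbers hypothetical): compute a one-sided lower limit for the effect estimate at a chosen level of certainty and judge it against the smallest clinically meaningful effect, rather than rejecting or failing to reject a null hypothesis.

```python
# Sketch of a "credible lower limit" comparison, assuming approximate
# normality of the effect estimator. The effect size, standard error, and
# MCID below are invented for illustration only.
def lower_limit(effect, se, certainty=0.90):
    """One-sided lower confidence/credible limit for an effect estimate."""
    z = {0.90: 1.282, 0.95: 1.645}[certainty]  # standard normal quantiles
    return effect - z * se

mcid = 2.0  # hypothetical minimal clinically important difference
ll = lower_limit(effect=5.0, se=1.5, certainty=0.90)

# True here: we are at least ~90% certain the benefit exceeds the MCID.
print(ll > mcid)
```

The decision rests on where the lower limit falls relative to the MCID, not on whether a P value crosses 0.05, which mirrors the Bayesian-like reasoning the authors describe clinicians using intuitively.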
ERIC Educational Resources Information Center
Bradley, D.
1977-01-01
A description of the foreign language curriculum at the university level in which courses in simultaneous translation are required. The size and composition of the groups are described as well as methods used to develop skill in translating and interpreting. Results are assessed. (Text is in Spanish.) (AMH)
Medical Interpreters in Outpatient Practice.
Jacobs, Barb; Ryan, Anne M; Henrichs, Katherine S; Weiss, Barry D
2018-01-01
This article provides an overview of the federal requirements related to providing interpreter services for non-English-speaking patients in outpatient practice. Antidiscrimination provisions in federal law require health programs and clinicians receiving federal financial assistance to take reasonable steps to provide meaningful access to individuals with limited English proficiency who are eligible for or likely to be encountered in their health programs or activities. Federal financial assistance includes grants, contracts, loans, tax credits and subsidies, as well as payments through Medicaid, the Children's Health Insurance Program, and most Medicare programs. The only exception is providers whose only federal assistance is through Medicare Part B, an exception that applies to a very small percentage of practicing physicians. All required language assistance services must be free and provided by qualified translators and interpreters. Interpreters must meet specified qualifications and ideally be certified. Although the cost of interpreter services can be considerable, ranging from $45-$150/hour for in-person interpreters, to $1.25-$3.00/minute for telephone interpreters, and $1.95-$3.49/minute for video remote interpreting, it may be reimbursed or covered by a patient's Medicaid or other federally funded medical insurance. Failure to use qualified interpreters can have serious negative consequences for both practitioners and patients. In one study, 1 of every 40 malpractice claims was related, in whole or in part, to failure to provide appropriate interpreter services. Most importantly, however, the use of qualified interpreters results in better and more efficient patient care. © 2018 Annals of Family Medicine, Inc.
Toward RADSCAT measurements over the sea and their interpretation
NASA Technical Reports Server (NTRS)
Claassen, J. P.; Fung, A. K.; Wu, S. T.; Chan, H. L.
1973-01-01
Investigations into several areas which are essential to the execution and interpretation of suborbital observations by the composite radiometer-scatterometer sensor (RADSCAT) are reported. Experiments and theory were developed to demonstrate the remote anemometric capability of the sensor over the sea through various weather conditions. It is shown that weather situations found in extratropical cyclones are useful for demonstrating the all-weather capability of the composite sensor. The large scale fluctuations of the wind over the sea dictate the observational coverage required to correlate measurements with the mean surface wind speed. Various theoretical investigations were performed to establish a premise for the joint interpretation of the experimental data. The effects of clouds and rain on downward radiometric observations over the sea were computed. A method of predicting atmospheric attenuation from joint observations is developed. In other theoretical efforts, the emission and scattering characteristics of the sea were derived. Composite surface theories with coherent and noncoherent assumptions were employed.
21 CFR 369.7 - Warnings required by official compendia.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Warnings required by official compendia. 369.7... (CONTINUED) DRUGS FOR HUMAN USE INTERPRETATIVE STATEMENTS RE WARNINGS ON DRUGS AND DEVICES FOR OVER-THE-COUNTER SALE Definitions and Interpretations § 369.7 Warnings required by official compendia. Any drug...
Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Kastner, Monika; MacDonald, Heather; Cogo, Elise; Lillie, Erin; Tran, Judy; Straus, Sharon E
2016-05-01
To describe and compare, through a scoping review, emerging knowledge synthesis methods for integrating qualitative and quantitative evidence in health care, in terms of expertise required, similarities, differences, strengths, limitations, and steps involved in using the methods. Electronic databases (e.g., MEDLINE) were searched, and two reviewers independently selected studies and abstracted data for qualitative analysis. In total, 121 articles reporting seven knowledge synthesis methods (critical interpretive synthesis, integrative review, meta-narrative review, meta-summary, mixed studies review, narrative synthesis, and realist review) were included after screening of 17,962 citations and 1,010 full-text articles. Common similarities among methods related to the entire synthesis process, while common differences related to the research question and eligibility criteria. The most common strength was a comprehensive synthesis providing rich contextual data, whereas the most common weakness was a highly subjective method that was not reproducible. For critical interpretive synthesis, meta-narrative review, meta-summary, and narrative synthesis, guidance was not provided for some steps of the review process. Some of the knowledge synthesis methods provided guidance on all steps, whereas other methods were missing guidance on the synthesis process. Further work is needed to clarify these emerging knowledge synthesis methods. Copyright © 2016 Elsevier Inc. All rights reserved.
Seismic facies analysis based on self-organizing map and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian
2015-01-01
Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has an obvious effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are collected for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
NASA Technical Reports Server (NTRS)
Werner, Charles L.; Wegmueller, Urs; Small, David L.; Rosen, Paul A.
1994-01-01
Terrain slopes, which can be measured with Synthetic Aperture Radar (SAR) interferometry either from a height map or from the interferometric phase gradient, were used to calculate the local incidence angle and the correct pixel area. Both are required for correct thematic interpretation of SAR data. The interferometric correlation depends on the pixel area projected on a plane perpendicular to the look vector and requires correction for slope effects. Methods for normalization of the backscatter and interferometric correlation for ERS-1 SAR are presented.
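A simplified one-dimensional sketch of the slope corrections described above, assuming the terrain slope lies entirely in the plane of incidence. The formulas are a common first-order approximation (local incidence = look angle minus range-facing slope; illuminated ground area of a slant-range pixel scaling as 1/sin of the local incidence angle), not necessarily the authors' exact normalization.

```python
import math

def slope_normalization(sigma0_db, look_angle_deg, range_slope_deg):
    """Toy slope correction for SAR backscatter (slope in the plane of
    incidence only). Returns the local incidence angle in degrees and the
    area-normalized backscatter in dB."""
    theta_loc = math.radians(look_angle_deg - range_slope_deg)
    theta_flat = math.radians(look_angle_deg)
    # Ground area of the pixel relative to flat terrain:
    area_ratio = math.sin(theta_flat) / math.sin(theta_loc)
    # Remove the area effect from the measured backscatter (in dB):
    corrected_db = sigma0_db - 10.0 * math.log10(area_ratio)
    return math.degrees(theta_loc), corrected_db

# A foreslope tilted 10 degrees toward an ERS-1-like 23-degree look angle
# appears brighter only because more ground is packed into each pixel:
loc, corr = slope_normalization(sigma0_db=-6.0, look_angle_deg=23.0,
                                range_slope_deg=10.0)
print(round(loc, 1), round(corr, 2))
```

Slopes facing the sensor shrink the local incidence angle and inflate the per-pixel ground area, so the correction pushes the apparent backscatter of the foreslope back down; interferometric correlation needs an analogous area normalization.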
Interpreters' Perceptions about the Goals of the Science Museum in Taiwan.
ERIC Educational Resources Information Center
Chin, Chi-Chin
The competence of interpreters, so called "docents," influences visitors' learning in museums. The study reported in this paper investigated 16 interpreters' perceptions about: the educational goals of the science museum in Taiwan, the function of the interpreter in the science museum, the requirements for a competent interpreter, and…
Clinical analysis of genome next-generation sequencing data using the Omicia platform
Coonrod, Emily M; Margraf, Rebecca L; Russell, Archie; Voelkerding, Karl V; Reese, Martin G
2013-01-01
Aims Next-generation sequencing is being implemented in the clinical laboratory environment for the purposes of candidate causal variant discovery in patients affected with a variety of genetic disorders. The successful implementation of this technology for diagnosing genetic disorders requires a rapid, user-friendly method to annotate variants and generate short lists of clinically relevant variants of interest. This report describes Omicia’s Opal platform, a new software tool designed for variant discovery and interpretation in a clinical laboratory environment. The software allows clinical scientists to process, analyze, interpret and report on personal genome files. Materials & Methods To demonstrate the software, the authors describe the interactive use of the system for the rapid discovery of disease-causing variants using three cases. Results & Conclusion Here, the authors show the features of the Opal system and their use in uncovering variants of clinical significance. PMID:23895124
Multivariate temporal dictionary learning for EEG.
Barthélemy, Q; Gouy-Pailler, C; Isaac, Y; Souloumiac, A; Larue, A; Mars, J I
2013-04-30
This article addresses the issue of representing electroencephalographic (EEG) signals in an efficient way. While classical approaches use a fixed Gabor dictionary to analyze EEG signals, this article proposes a data-driven method to obtain an adapted dictionary. To reach an efficient dictionary learning, appropriate spatial and temporal modeling is required. Inter-channels links are taken into account in the spatial multivariate model, and shift-invariance is used for the temporal model. Multivariate learned kernels are informative (a few atoms code plentiful energy) and interpretable (the atoms can have a physiological meaning). Using real EEG data, the proposed method is shown to outperform the classical multichannel matching pursuit used with a Gabor dictionary, as measured by the representative power of the learned dictionary and its spatial flexibility. Moreover, dictionary learning can capture interpretable patterns: this ability is illustrated on real data, learning a P300 evoked potential. Copyright © 2013 Elsevier B.V. All rights reserved.
Ewusie, Joycelyne E; Blondal, Erik; Soobiah, Charlene; Beyene, Joseph; Thabane, Lehana; Straus, Sharon E; Hamid, Jemila S
2017-07-02
An interrupted time series (ITS) design involves collecting data across multiple time points before and after the implementation of an intervention to assess the effect of the intervention on an outcome. ITS designs have become increasingly common, with frequent use in assessing the impact of evidence implementation interventions. Several statistical methods are currently available for analysing data from ITS designs; however, there is a lack of guidance on which methods are optimal for different data types and on their implications for interpreting results. Our objective is to conduct a scoping review of existing methods for analysing ITS data, to summarise their characteristics and properties, and to examine how the results are reported. We also aim to identify gaps and methodological deficiencies. We will search electronic databases from inception until August 2016 (eg, MEDLINE and JSTOR). Two reviewers will independently screen titles, abstracts and full-text articles and complete the data abstraction. The anticipated outcome will be a summarised description of all the methods that have been used in analysing ITS data in health research, how those methods were applied, their strengths and limitations, and the transparency of interpretation/reporting of the results. We will provide summary tables of the characteristics of the included studies. We will also describe the similarities and differences of the various methods. Ethical approval is not required for this study since we are only considering the methods used in the analysis and no identifiable patient data will be used. Results will be disseminated through open access peer-reviewed publications. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
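Segmented regression, one of the standard ITS analysis methods such a review would cover, can be sketched as follows: the design matrix encodes a baseline trend plus a level change and a slope change at the intervention point. The data here are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(24)                    # 24 monthly time points
after = (t >= 12).astype(float)      # indicator: intervention at month 12
t_after = after * (t - 12)           # time elapsed since the intervention

# Simulated outcome: baseline level 50, upward trend 0.5/month, then an
# immediate level drop of 5 units at the intervention, plus noise.
y = 50 + 0.5 * t - 5 * after + rng.normal(0, 0.5, t.size)

# Segmented regression via ordinary least squares:
X = np.column_stack([np.ones_like(t, dtype=float), t, after, t_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, trend, level_change, slope_change = beta

print(round(level_change, 1))  # close to -5, the simulated level drop
```

The coefficient on the indicator estimates the immediate (level) effect and the coefficient on `t_after` the change in trend; in practice autocorrelation of the residuals also needs handling, one of the methodological issues such reviews compare across techniques.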
A 'digital' technique for manual extraction of data from aerial photography
NASA Technical Reports Server (NTRS)
Istvan, L. B.; Bondy, M. T.
1977-01-01
The interpretation procedure described uses a grid cell approach. In addition, a random point is located in each cell. The procedure required that the cell/point grid be established on a base map, and identical grids be made to precisely match the scale of the photographic frames. The grid is then positioned on the photography by visual alignment to obvious features. Several alignments on one frame are sometimes required to make a precise match of all points to be interpreted. This system inherently corrects for distortions in the photography. Interpretation is then done cell by cell. In order to meet the time constraints, first order interpretation should be maintained. The data is put onto coding forms, along with other appropriate data, if desired. This 'digital' manual interpretation technique has proven to be efficient, and time and cost effective, while meeting strict requirements for data format and accuracy.
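The cell-plus-random-point grid described above is straightforward to reproduce computationally; a minimal sketch, where the map extent and grid dimensions are illustrative rather than taken from the report:

```python
import random

def cell_points(x0, y0, width, height, ncols, nrows, seed=42):
    # One random sample point uniformly inside each grid cell,
    # mirroring the cell/point grid laid over the base map.
    rng = random.Random(seed)
    dx, dy = width / ncols, height / nrows
    points = {}
    for row in range(nrows):
        for col in range(ncols):
            points[(row, col)] = (x0 + (col + rng.random()) * dx,
                                  y0 + (row + rng.random()) * dy)
    return points

# A 10 x 10 cell grid over a 100 x 100 map extent.
grid = cell_points(0.0, 0.0, 100.0, 100.0, ncols=10, nrows=10)
```

Each cell's point falls strictly inside that cell, so interpretation can proceed cell by cell with the random point giving an unbiased sample location.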
Photochemical transformations of diazocarbonyl compounds: expected and novel reactions
NASA Astrophysics Data System (ADS)
Galkina, O. S.; Rodina, L. L.
2016-05-01
Photochemical reactions of diazocarbonyl compounds are firmly established in synthetic practice as an efficient method for ring contraction and homologation of carboxylic acids and as a carbene generation method. However, interpretation of the observed transformations of diazo compounds in electronically excited states is incomplete and requires a careful study of the fine mechanisms of these processes specific to different excited states of diazo compounds resorting to modern methods of investigation, including laser technology. The review is devoted to analysis of new data in the chemistry of excited states of diazocarbonyl compounds. The bibliography includes 155 references.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crockett, C.S.; Haas, C.N.
1996-11-01
Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data and develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs and provides concepts and tools which other utilities can use to aid in their own macro and microwatershed management efforts.
Fusion of monocular cues to detect man-made structures in aerial imagery
NASA Technical Reports Server (NTRS)
Shufelt, Jefferey; Mckeown, David M.
1991-01-01
The extraction of buildings from aerial imagery is a complex problem for automated computer vision. It requires locating regions in a scene that possess properties distinguishing them as man-made objects as opposed to naturally occurring terrain features. It is reasonable to assume that no single detection method can correctly delineate or verify buildings in every scene. A cooperative-methods paradigm is useful in approaching the building extraction problem. Using this paradigm, each extraction technique provides information which can be added or assimilated into an overall interpretation of the scene. Thus, the main objective is to explore the development of a computer vision system that integrates the results of various scene analysis techniques into an accurate and robust interpretation of the underlying three-dimensional scene. The problem of building hypothesis fusion in aerial imagery is discussed. Building extraction techniques are briefly surveyed, including four building extraction, verification, and clustering systems. A method for fusing the symbolic data generated by these systems is described, and applied to monocular image and stereo image data sets. Evaluation methods for the fusion results are described, and the fusion results are analyzed using these methods.
NASA Astrophysics Data System (ADS)
Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte
2007-01-01
We present an effective method for comparing subjective audiovisual quality and the features related to the quality changes of different video cameras. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. 26 observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of cameras' visual video quality than to the features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimations with audiovisual material as well. The IBQ approach is especially valuable when the induced quality changes are multidimensional.
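The quantitative ACR side of such a study typically reduces to a mean opinion score per device; a toy sketch with invented camera names and ratings (not the paper's data):

```python
def mean_opinion_scores(ratings):
    # Average the 1-5 Absolute Category Rating scores given by all
    # observers for each camera.
    return {camera: sum(scores) / len(scores)
            for camera, scores in ratings.items()}

mos = mean_opinion_scores({"camera_a": [4, 5, 4], "camera_b": [2, 3, 2]})
```

The qualitative IBQ descriptions are then used to explain why one device's mean score sits above another's.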
A Review of the Role of Social Cognition in Major Depressive Disorder
Weightman, Michael James; Air, Tracy Michele; Baune, Bernhard Theodor
2014-01-01
Background: Social cognition – the ability to identify, perceive, and interpret socially relevant information – is an important skill that plays a significant role in successful interpersonal functioning. Social cognitive performance is recognized to be impaired in several psychiatric conditions, but the relationship with major depressive disorder is less well understood. The aim of this review is to characterize the current understanding of: (i) the different domains of social cognition and a possible relationship with major depressive disorder, (ii) the clinical presentation of social cognition in acute and remitted depressive states, and (iii) the effect of severity of depression on social cognitive performance. Methods: Electronic databases were searched to identify clinical studies investigating social cognition in a major depressive disorder population, yielding 31 studies for this review. Results: Patients with major depressive disorder appear to interpret social cognitive stimuli differently to healthy controls: depressed individuals may interpret emotion through a mood-congruent bias and have difficulty with cognitive theory of mind tasks requiring interpretation of complex mental states. Social cognitive performance appears to be inversely associated with severity of depression, whilst the bias toward negative emotions persists even in remission. Some deficits may normalize following effective pharmacotherapy. Conclusions: The difficulties with social interaction observed in major depressive disorder may, at least in part, be due to an altered ability to correctly interpret emotional stimuli and mental states. These features seem to persist even in remission, although some may respond to intervention. Further research is required in this area to better understand the functional impact of these findings and the way in which targeted therapy could aid depressed individuals with social interactions. PMID:25566100
Arc-evaporated carbon films: optical properties and electron mean free paths
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, M.W.; Arakawa, E.T.; Dolfini, S.M.
1984-01-01
This paper describes briefly a method which can be used to calculate inelastic mean free paths for electrons with energies in the range of interest for the interpretation of surface phenomena. This method requires a knowledge of the optical properties of the material for the photon energies associated with the oscillator strength of the valence electrons. However, in general it is easier to obtain accurate values of the required properties than it is to measure the electron attenuation lengths in the energy region of interest. This technique, demonstrated here for arc-evaporated carbon, can be used for any material for which the optical properties can be measured over essentially the whole energy range corresponding to the valence electron response.
Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis
2015-01-01
Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276
Tricco, Andrea C; Antony, Jesmin; Soobiah, Charlene; Kastner, Monika; Cogo, Elise; MacDonald, Heather; D'Souza, Jennifer; Hui, Wing; Straus, Sharon E
2016-05-01
To describe and compare, through a scoping review, emerging knowledge synthesis methods for generating and refining theory, in terms of expertise required, similarities, differences, strengths, limitations, and steps involved in using the methods. Electronic databases (e.g., MEDLINE) were searched, and two reviewers independently selected studies and abstracted data for qualitative analysis. In total, 287 articles reporting nine knowledge synthesis methods (concept synthesis, critical interpretive synthesis, integrative review, meta-ethnography, meta-interpretation, meta-study, meta-synthesis, narrative synthesis, and realist review) were included after screening of 17,962 citations and 1,010 full-text articles. Strengths of the methods included comprehensive synthesis providing rich contextual data and suitability for identifying gaps in the literature, informing policy, aiding in clinical decisions, addressing complex research questions, and synthesizing patient preferences, beliefs, and values. However, many of the methods were highly subjective and not reproducible. For integrative review, meta-ethnography, and realist review, guidance was provided on all steps of the review process, whereas meta-synthesis had guidance on the fewest number of steps. Guidance for conducting the steps was often vague and sometimes absent. Further work is needed to provide direction on operationalizing these methods. Copyright © 2016 Elsevier Inc. All rights reserved.
Perception of Health Problems Among Competitive Runners
Jelvegård, Sara; Timpka, Toomas; Bargoria, Victor; Gauffin, Håkan; Jacobsson, Jenny
2016-01-01
Background: Approximately 2 of every 3 competitive runners sustain at least 1 health problem each season. Most of these problems are nontraumatic injuries with gradual onset. The main known risk indicator for sustaining a new running-related injury episode is a history of a previous injury, suggesting that behavioral habits are part of the causal mechanisms. Purpose: Identification of elements associated with purposeful interpretations of body perceptions and balanced behavioral responses may supply vital information for prevention of health problems in runners. This study set out to explore competitive runners’ cognitive appraisals of perceived symptoms of injury and illness and how these appraisals are transformed into behavior. Study Design: Cross-sectional study; Level of evidence, 3. Methods: The study population consisted of Swedish middle- and long-distance runners from the national top 15 list. Qualitative research methods were used to categorize interview data and perform a thematic analysis. The categories resulting from the analysis were used to construct an explanatory model. Results: Saturation of the thematic classification required that data from 8 male and 6 female runners (age range, 20-36 years) were collected. Symptoms interpreted to be caused by illness or injury with a sudden onset were found to lead to immediate action and changes to training and competition programs (activity pacing). In contrast, perceptions interpreted to be due to injuries with gradual onset led to varied behavioral reactions. These behavioral responses were planned with regard to short-term consequences and were characterized by indifference and neglect of long-term implications, consistent with an overactivity behavioral pattern. The latter pattern was consistent with a psychological adaptation to stimuli that are presented progressively to the athlete.
Conclusion: Competitive runners appraise whether a health problem requires immediate withdrawal from training based on whether the problem is interpreted as an illness and/or has a sudden onset. The ensuing behaviors follow 2 distinct patterns that can be termed “activity pacing” and “overactivity.” PMID:28210643
Imbalanced target prediction with pattern discovery on clinical data repositories.
Chan, Tak-Ming; Li, Yuxi; Chiau, Choo-Chiap; Zhu, Jane; Jiang, Jie; Huo, Yong
2017-04-20
Clinical data repositories (CDR) have great potential to improve outcome prediction and risk modeling. However, most clinical studies require careful study design, dedicated data collection efforts, and sophisticated modeling techniques before a hypothesis can be tested. We aim to bridge this gap, so that clinical domain users can perform first-hand prediction on existing repository data without complicated handling, and obtain insightful patterns of imbalanced targets for a formal study before it is conducted. We specifically target interpretability for domain users, so that the model can be conveniently explained and applied in clinical practice. We propose an interpretable pattern model which is noise (missing) tolerant for practice data. To address the challenge of imbalanced targets of interest in clinical research, e.g., deaths less than a few percent, the geometric mean of sensitivity and specificity (G-mean) optimization criterion is employed, with which a simple but effective heuristic algorithm is developed. We compared pattern discovery to clinically interpretable methods on two retrospective clinical datasets. They contain 14.9% deaths in 1 year in the thoracic dataset and 9.1% deaths in the cardiac dataset, respectively. In spite of the imbalance challenge shown on other methods, pattern discovery consistently shows competitive cross-validated prediction performance. Compared to logistic regression, Naïve Bayes, and decision tree, pattern discovery achieves statistically significant (p-values < 0.01, Wilcoxon signed rank test) favorable averaged testing G-means and F1-scores (harmonic mean of precision and sensitivity). Without requiring sophisticated technical processing of data and tweaking, the prediction performance of pattern discovery is consistently comparable to the best achievable performance. Pattern discovery has proved to be robust and valuable for target prediction on existing clinical data repositories with imbalance and noise.
The prediction results and interpretable patterns can provide insights in an agile and inexpensive way for the potential formal studies.
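The G-mean criterion named in the abstract is simple to compute; a self-contained sketch with invented binary labels (not the study's data):

```python
import math

def g_mean(y_true, y_pred):
    # Geometric mean of sensitivity and specificity (label 1 = event).
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return math.sqrt(sensitivity * specificity)

# With a ~10% event rate, always predicting "no event" is 90% accurate
# yet scores a G-mean of 0, which is why the criterion suits
# imbalanced targets.
y_true = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
majority = [0] * 10
```

Maximizing G-mean rather than accuracy forces a classifier to attend to the rare positive class.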
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
18 CFR 284.6 - Rate interpretations.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Rate interpretations... AUTHORITIES General Provisions and Conditions § 284.6 Rate interpretations. (a) Procedure. A pipeline may... rates and charges comply with the requirements of this part. (b) Address. Requests for interpretations...
King, B
2001-11-01
The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999 there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply with the new requirements can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty and reporting the information.
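For the measurement-uncertainty requirement discussed above, the usual starting point is root-sum-of-squares combination of independent standard uncertainty components, expanded with a coverage factor; a sketch with illustrative component values (not from the paper):

```python
import math

def combined_uncertainty(components):
    # Root-sum-of-squares combination of independent standard
    # uncertainty components.
    return math.sqrt(sum(u * u for u in components))

# Illustrative contributions (e.g. repeatability, calibration) in mg/L.
u_c = combined_uncertainty([0.3, 0.4])
expanded = 2 * u_c  # coverage factor k = 2, roughly 95 % confidence
```

How finely the component budget is itemized is exactly the fit-for-purpose judgment the paper argues should be driven by customer requirements.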
NASA Astrophysics Data System (ADS)
Hay, D. Robert; Brassard, Michel; Matthews, James R.; Garneau, Stephane; Morchat, Richard
1995-06-01
The convergence of a number of contemporary technologies with increasing demands for improvements in inspection capabilities in maritime applications has created new opportunities for ultrasonic inspection. An automated ultrasonic inspection and data collection system, APHIUS (automated pressure hull intelligent ultrasonic system), incorporates hardware and software developments to meet specific requirements for maritime vessels, in particular, submarines in the Canadian Navy. Housed within a hardened portable computer chassis, instrumentation for digital ultrasonic data acquisition and transducer position measurement provides new capabilities that meet more demanding requirements for inspection of the aging submarine fleet. Digital data acquisition enables a number of important new capabilities, including archiving of the complete inspection session, interpretation assistance through imaging, and automated interpretation using artificial intelligence methods. With this new reliable inspection system, in conjunction with a complementary study of the significance of real defect type and location, comprehensive new criteria can be generated which will eliminate unnecessary defect removal. As a consequence, cost savings will be realized through shortened submarine refit schedules.
Experimental Protein Structure Verification by Scoring with a Single, Unassigned NMR Spectrum.
Courtney, Joseph M; Ye, Qing; Nesbitt, Anna E; Tang, Ming; Tuttle, Marcus D; Watt, Eric D; Nuzzio, Kristin M; Sperling, Lindsay J; Comellas, Gemma; Peterson, Joseph R; Morrissey, James H; Rienstra, Chad M
2015-10-06
Standard methods for de novo protein structure determination by nuclear magnetic resonance (NMR) require time-consuming data collection and interpretation efforts. Here we present a qualitatively distinct and novel approach, called Comparative, Objective Measurement of Protein Architectures by Scoring Shifts (COMPASS), which identifies the best structures from a set of structural models by numerical comparison with a single, unassigned 2D (13)C-(13)C NMR spectrum containing backbone and side-chain aliphatic signals. COMPASS does not require resonance assignments. It is particularly well suited for interpretation of magic-angle spinning solid-state NMR spectra, but also applicable to solution NMR spectra. We demonstrate COMPASS with experimental data from four proteins--GB1, ubiquitin, DsbA, and the extracellular domain of human tissue factor--and with reconstructed spectra from 11 additional proteins. For all these proteins, with molecular mass up to 25 kDa, COMPASS distinguished the correct fold, most often within 1.5 Å root-mean-square deviation of the reference structure. Copyright © 2015 Elsevier Ltd. All rights reserved.
Migaszewski, Z.M.; Lamothe, P.J.; Crock, J.G.; Galuszka, A.; Dolegowska, S.
2011-01-01
Trace element concentrations in plant bioindicators are often determined to assess the quality of the environment. Instrumental methods used for trace element determination require digestion of samples. There are different methods of sample preparation for trace element analysis, and the selection of the best method should be fitted for the purpose of a study. Our hypothesis is that the method of sample preparation is important for interpretation of the results. Here we compare the results of 36 element determinations performed by ICP-MS on ashed and on acid-digested (HNO3, H2O2) samples of two moss species (Hylocomium splendens and Pleurozium schreberi) collected in Alaska and in south-central Poland. We found that dry ashing of the moss samples prior to analysis resulted in considerably lower detection limits of all the elements examined. We also show that this sample preparation technique facilitated the determination of interregional and interspecies differences in the chemistry of trace elements. Compared to the Polish mosses, the Alaskan mosses displayed more positive correlations of the major rock-forming elements with ash content, reflecting those elements' geogenic origin. Of the two moss species, P. schreberi from both Alaska and Poland was also highlighted by a larger number of positive element pair correlations. The cluster analysis suggests that the more uniform element distribution pattern of the Polish mosses primarily reflects regional air pollution sources. Our study has shown that the method of sample preparation is an important factor in statistical interpretation of the results of trace element determinations. © 2010 Springer-Verlag.
14 CFR 65.55 - Knowledge requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Knowledge requirements. 65.55 Section 65.55 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN...) General system of weather and NOTAM collection, dissemination, interpretation, and use; (4) Interpretation...
14 CFR 65.55 - Knowledge requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Knowledge requirements. 65.55 Section 65.55 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN...) General system of weather and NOTAM collection, dissemination, interpretation, and use; (4) Interpretation...
Korjus, Kristjan; Hebart, Martin N.; Vicente, Raul
2016-01-01
Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier’s generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term “Cross-validation and cross-testing” improving this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do. PMID:27564393
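For reference, the standard partition the authors contrast against can be sketched as follows (this is the conventional held-out-test-plus-cross-validation scheme, not the proposed "cross-validation and cross-testing" algorithm itself):

```python
import random

def split_train_test(n, test_frac=0.2, seed=0):
    # Hold out a test set; the remainder is available for
    # cross-validated parameter selection.
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    n_test = int(n * test_frac)
    return idx[n_test:], idx[:n_test]

def cv_folds(train_idx, k=5):
    # k disjoint validation folds over the training indices.
    return [train_idx[i::k] for i in range(k)]

train, test = split_train_test(100)
folds = cv_folds(train, k=5)
```

The trade-off the paper targets is visible here: every index reserved for `test` is an index unavailable for fitting and parameter selection.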
In with the new, out with the old? Auto-extraction for remote sensing archaeology
NASA Astrophysics Data System (ADS)
Cowley, David C.
2012-09-01
This paper explores aspects of the inter-relationships between traditional archaeological interpretation of remote sensed data (principally visual examination of aerial photographs/satellite) and those drawing on automated feature extraction and processing. Established approaches to archaeological interpretation of aerial photographs are heavily reliant on individual observation (eye/brain) in an experience and knowledge-based process. Increasingly, however, much more complex and extensive datasets are becoming available to archaeology and these require critical reflection on analytical and interpretative processes. Archaeological applications of Airborne Laser Scanning (ALS) are becoming increasingly routine, and as the spatial resolution of hyper-spectral data improves, its potentially massive implications for archaeological site detection may prove to be a sea-change. These complex datasets demand new approaches, as traditional methods based on direct observation by an archaeological interpreter will never do more than scratch the surface, and will fail to fully extend the boundaries of knowledge. Inevitably, changing analytical and interpretative processes can create tensions, especially, as has been the case in archaeology, when the innovations in data and analysis come from outside the discipline. These tensions often centre on the character of the information produced, and a lack of clarity on the place of archaeological interpretation in the workflow. This is especially true for ALS data and autoextraction techniques, and carries implications for all forms of remote sensed archaeological datasets, including hyperspectral data and aerial photographs.
Improving data quality in neuronal population recordings
Harris, Kenneth D.; Quian Quiroga, Rodrigo; Freeman, Jeremy; Smith, Spencer
2017-01-01
Understanding how the brain operates requires understanding how large sets of neurons function together. Modern recording technology makes it possible to simultaneously record the activity of hundreds of neurons, and technological developments will soon allow recording of thousands or tens of thousands. As with all experimental techniques, these methods are subject to confounds that complicate the interpretation of such recordings, and could lead to erroneous scientific conclusions. Here, we discuss methods for assessing and improving the quality of data from these techniques, and outline likely future directions in this field. PMID:27571195
Nutrimetry: BMI assessment as a function of development.
Selem-Solís, Jorge Enrique; Alcocer-Gamboa, Alberto; Hattori-Hara, Mónica; Esteve-Lanao, Jonathan; Larumbe-Zabala, Eneko
2018-02-01
Adequate nutritional assessment is required to fight malnutrition (undernutrition and overfeeding) in children and adolescents. For this, joint interpretation of certain indicators (body mass index [BMI], height, weight, etc.) is recommended. This is done clinically, but not epidemiologically. The aim of this paper is to present "nutrimetry", a simple method that crosses anthropometric information allowing for bivariate interpretation at both levels (clinical and epidemiological). Data from 41,001 children and adolescents aged 0-19 years, taken from Mexico's National Health and Nutrition Survey 2012, were analyzed. Data crossed were BMI-for-age z-scores (BAZ) with height-for-age z-scores (HAZ) according to the World Health Organization (WHO) standards. Conditional prevalences were calculated in a 3×3 grid and were compared with expected values. This method identified subgroups in each BAZ category showing heterogeneity of the sample with regard to WHO standards for HAZ and nutritional status. According to the method, nutritional status patterns differed among Mexican states and age and sex groups. Nutrimetry is a helpful and accessible tool to be used in epidemiology. It allows for detecting unexpected distributions of conditional prevalences, its graphical representation facilitates communication of results by geographic areas, and enriched interpretation of BAZ helps guide intervention actions according to their codes. Copyright © 2017 SEEN y SED. Publicado por Elsevier España, S.L.U. All rights reserved.
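The 3×3 cross-classification at the heart of nutrimetry can be sketched as follows; the z-score cut-offs used here (-2 and +1 SD) are illustrative stand-ins for the WHO category boundaries the paper applies:

```python
def z_band(z, low=-2.0, high=1.0):
    # Three bands per indicator; the cut-offs are illustrative
    # stand-ins for the WHO category boundaries.
    return 0 if z < low else (2 if z > high else 1)

def nutrimetry_cell(baz, haz):
    # Cross BMI-for-age (BAZ) with height-for-age (HAZ) into a
    # 3x3 grid cell: (height band, BMI band).
    return (z_band(haz), z_band(baz))

cell = nutrimetry_cell(baz=1.7, haz=0.2)  # normal height, high BAZ
```

Tallying cells over a survey population yields the conditional prevalences in the 3×3 grid that the method compares against expected values.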
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, L.; Burton, A.; Lu, H.X.
Accurate velocity models are a necessity for reliable migration results. Velocity analysis generally involves the use of methods such as normal moveout analysis (NMO), seismic traveltime tomography, or iterative prestack migration. These techniques can be effective, and each has its own advantages and disadvantages. Conventional NMO methods are relatively inexpensive but basically require simplifying assumptions about geology. Tomography is a more general method but requires traveltime interpretation of prestack data. Iterative prestack depth migration is very general but is computationally expensive. In some cases, there is the opportunity to estimate vertical velocities by use of well information. The well information can be used to optimize poststack migrations, thereby eliminating some of the time and expense of iterative prestack migration. The optimized poststack migration procedure defined here computes the velocity model which minimizes the depth differences between seismic images and formation depths at the well by using a least squares inversion method. The optimization methods described in this paper will hopefully produce "migrations without migraines."
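As a deliberately simplified illustration of the least-squares idea (not the authors' actual parameterization), a single constant velocity tying two-way image times to formation depths at a well minimizes the sum of squared depth differences, which has a closed-form solution:

```python
def optimal_velocity(times_s, well_depths_m):
    """One-parameter least-squares well tie: find the scalar velocity v
    minimizing sum((v * t / 2 - z)^2) over well picks, where t is two-way
    traveltime and z is formation depth. Closed form: v = 2*sum(t*z)/sum(t*t).
    A toy stand-in for the paper's multi-parameter model optimization."""
    num = sum(t * z for t, z in zip(times_s, well_depths_m))
    den = sum(t * t for t in times_s)
    return 2.0 * num / den

v = optimal_velocity([1.0, 2.0], [1500.0, 3000.0])  # consistent picks -> 3000 m/s
```

In the paper's setting the unknown is a full velocity model rather than one scalar, but the misfit being minimized has the same depth-difference form.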
Applications of three-dimensional modeling in electromagnetic exploration
NASA Astrophysics Data System (ADS)
Pellerin, Louise Donna
Numerical modeling is used in geophysical exploration to understand physical mechanisms of a geophysical method, compare different exploration techniques, and interpret field data. Exploring the physics of a geophysical response enhances the geophysicist's insight, resulting in better survey design and interpretation. Comparing exploration methods numerically can eliminate the use of a technique that cannot resolve the exploration target. Interpreting field data to determine the structure of the earth is the ultimate goal of the exploration geophysicist. Applications of three-dimensional (3-D) electromagnetic (EM) modeling in mining, geothermal and environmental exploration demonstrate the importance of numerical modeling as a geophysical tool. Detection of a confined, conductive target with a vertical electric source (VES) can be an effective technique if properly used. The vertical magnetic field response is due solely to multi-dimensional structures, and current channeling is the dominant mechanism. A VES is deployed in a bore hole, hence the orientation of the hole is critical to the response. A deviation of more than a degree from the vertical can result in a host response that overwhelms the target response. Only the in-phase response at low frequencies can be corrected to a purely vertical response. The geothermal system studied consists of a near-surface clay cap and a deep reservoir. The magnetotelluric (MT), controlled-source audio magnetotelluric (CSAMT), long-offset time-domain electromagnetic (LOTEM) and central-loop transient electromagnetic (TEM) methods are appraised for their ability to detect the reservoir and delineate the cap. The reservoir anomaly is supported by boundary charges and therefore is detectable only with the deep-sounding electric field methods, MT and LOTEM. The cap is easily delineated with all techniques. For interpretation I developed an approximate 3-D inversion that refines a 1-D interpretation by removing lateral distortions.
An iterative inverse procedure invokes EM reciprocity while operating on a localized portion of the survey area thereby greatly reducing the computational requirements. The scheme is illustrated with three synthetic data sets representative of problems in environmental geophysics.
ERIC Educational Resources Information Center
Leffert, J. S.; Siperstein, G. N.; Widaman, K. F.
2010-01-01
Background: A key aspect of social perception is the interpretation of others' intentions. Children with intellectual disabilities (IDs) have difficulty interpreting benign intentions when a negative event occurs. From a cognitive processing perspective, interpreting benign intentions can be challenging because it requires integration of…
Interpreting Space-Mission LET Requirements for SEGR in Power MOSFETs
NASA Technical Reports Server (NTRS)
Lauenstein, J. M.; Ladbury, R. L.; Batchelor, D. A.; Goldsman, N.; Kim, H. S.; Phan, A. M.
2010-01-01
A Technology Computer Aided Design (TCAD) simulation-based method is developed to evaluate whether derating of high-energy heavy-ion accelerator test data bounds the risk for single-event gate rupture (SEGR) from much higher energy on-orbit ions for a mission linear energy transfer (LET) requirement. It is shown that a typical derating factor of 0.75 applied to a single-event effect (SEE) response curve defined by high-energy accelerator SEGR test data provides reasonable on-orbit hardness assurance, although in a high-voltage power MOSFET, it did not bound the risk of failure.
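The derating step itself can be sketched as below: interpolate the tested failure-voltage curve at the mission LET requirement and apply the 0.75 factor. The curve points, function name, and linear interpolation are illustrative assumptions; the paper's actual hardness-assurance evaluation is TCAD-simulation based.

```python
def derated_safe_vds(test_lets, fail_vds, mission_let, derating=0.75):
    """Linearly interpolate the SEGR failure-voltage response curve at the
    mission LET, then apply the derating factor to obtain a derated safe
    drain-source voltage. Inputs are hypothetical test-curve points."""
    pts = sorted(zip(test_lets, fail_vds))
    for (l0, v0), (l1, v1) in zip(pts, pts[1:]):
        if l0 <= mission_let <= l1:
            frac = (mission_let - l0) / (l1 - l0)
            return derating * (v0 + frac * (v1 - v0))
    raise ValueError("mission LET outside tested range")

# invented accelerator data: failure V_ds falls as LET rises
safe_v = derated_safe_vds([20.0, 40.0, 60.0], [180.0, 140.0, 100.0], 50.0)
```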
Next-generation genotype imputation service and methods.
Das, Sayantan; Forer, Lukas; Schönherr, Sebastian; Sidore, Carlo; Locke, Adam E; Kwong, Alan; Vrieze, Scott I; Chew, Emily Y; Levy, Shawn; McGue, Matt; Schlessinger, David; Stambolian, Dwight; Loh, Po-Ru; Iacono, William G; Swaroop, Anand; Scott, Laura J; Cucca, Francesco; Kronenberg, Florian; Boehnke, Michael; Abecasis, Gonçalo R; Fuchsberger, Christian
2016-10-01
Genotype imputation is a key component of genetic association studies, where it increases power, facilitates meta-analysis, and aids interpretation of signals. Genotype imputation is computationally demanding and, with current tools, typically requires access to a high-performance computing cluster and to a reference panel of sequenced genomes. Here we describe improvements to imputation machinery that reduce computational requirements by more than an order of magnitude with no loss of accuracy in comparison to standard imputation tools. We also describe a new web-based service for imputation that facilitates access to new reference panels and greatly improves user experience and productivity.
NASA Technical Reports Server (NTRS)
Rockfeller, W C
1939-01-01
Equations have been developed for the analysis of the performance of the ideal airplane, leading to an approximate physical interpretation of the performance problem. The basic sea-level airplane parameters have been generalized to altitude parameters, and a new parameter has been introduced and physically interpreted. The performance analysis for actual airplanes has been obtained in terms of the equivalent ideal airplane in order that the charts developed for use in practical calculations will for the most part apply to any type of engine-propeller combination and system of control, the only additional material required being the actual engine and propeller curves for the propulsion unit. Finally, a more exact method for the calculation of the climb characteristics for the constant-speed controllable propeller is presented in the appendix.
Nurse faculty experiences in problem-based learning: an interpretive phenomenologic analysis.
Paige, Jane B; Smith, Regina O
2013-01-01
This study explored the nurse faculty experience of participating in a problem-based learning (PBL) faculty development program. Utilizing PBL as a pedagogical method requires a paradigm shift in the way faculty think about teaching, learning, and the teacher-student relationship. An interpretive phenomenological analysis approach was used to explore the faculty experience in a PBL development program. Four themes emerged: change in perception of the teacher-student relationship, struggle in letting go, uncertainty, and valuing PBL as a developmental process. Epistemic doubt happens when action and intent toward the PBL teaching perspective do not match underlying beliefs. Findings from this study call for ongoing administrative support for education on PBL while faculty take time to uncover hidden epistemological beliefs.
Solving subsurface structural problems using a computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witte, D.M.
1987-02-01
Until recently, the solution of subsurface structural problems has required a combination of graphical construction, trigonometry, time, and patience. Recent advances in software available for both mainframe and microcomputers now reduce the time and potential error of these calculations by an order of magnitude. Software for analysis of deviated wells, three-point problems, apparent dip, apparent thickness, and the intersection of two planes, as well as the plotting and interpretation of these data, can be used to allow timely and accurate exploration or operational decisions. The available computer software provides a set of utilities, or tools, rather than a comprehensive, intelligent system. The burden for selection of appropriate techniques, computation methods, and interpretations still lies with the explorationist user.
NASA Astrophysics Data System (ADS)
Parrish, Randall
2010-05-01
The analysis of provenance of clastic sediments is useful for reconstructing the characteristics and rates of exhumation of source areas, and sometimes placing minimum age constraints on depositional age. Due largely to increased availability and ease of access to LA-ICP-MS instrumentation, the analysis of provenance using single detrital accessory minerals has grown very rapidly over recent years. With this, however, comes a culture of casual users who may not fully appreciate the subtleties of measurement and isotope interpretation. The isotopic provenance literature is dominated by zircon-centric studies that use U-Pb dating and Hf isotope measurements of single zircons, but unfortunately an increasing number of these studies appear to lack sufficient understanding of U-Pb and Hf systematics; misleading interpretations are increasingly common. The inherent information contained in detrital accessory minerals is potentially immense, scientifically, but comprehensive interpretations attempting to reconstruct the geological make-up and evolution of sources require dating of multiple types of accessory minerals (i.e. zircon, titanite, monazite, garnet inclusions, micas, allanite, rutile, apatite) by various methods (U-Pb, fission track, Ar-Ar…) at times accompanied by isotope geochemical data (Lu-Hf, Sm-Nd, Rb-Sr) of phases where Sr, Hf, or REE comprise a major element (≥0.5%). Many approaches have been demonstrated but the mix of methodologies needs to be tailored to the problem, in view of the variable effort and expense needed to acquire good datasets. To date there are few comprehensive multi-mineral, multi-isotope system applications, and too many studies that follow a prescriptive cookbook that lacks innovation and fails to address a problem. The field needs to focus effort on the approaches that can solve a problem well rather than doing either just the easy methods or too many methods only moderately well.
Zircon studies require strategies that reduce or eliminate discordance, collect sufficient data on each grain to make a robust age interpretation, improve accuracy of data by more attention to standards and uncertainties, can analyze thin overgrowths that reveal the magmatic or metamorphic age, and minimize sample consumption, not an easy task for the vast majority of laboratories doing provenance applications. Detrital monazite, monazite-in garnet, titanite and rutile can reveal much of the higher temperature metamorphic time-temperature path, and coupled U-Pb and fission track studies of single zircon and apatite grains can be useful for determining lower temperature exhumation rates. Isotope geochemistry (Hf-Nd-Sr-O) is more time consuming but can be pivotal to distinguish subtle differences in sources and to test specific hypotheses. Examples of improved methods and applications will be presented to illustrate the presentation.
Fuzzy Logic in Legal Education
ERIC Educational Resources Information Center
Balkir, Z. Gonul; Alniacik, Umit; Apaydin, Eylem
2011-01-01
The necessity of examination of every case within its peculiar conditions in social sciences requires different approaches complying with the spirit and nature of social sciences. Multiple realities require different and various perceptual interpretations. In modern world and social sciences, interpretation of perception of valued and multi-valued…
Present Practice of Using Nautical Depth to Manage Navigation Channels in the Presence of Fluid Mud
2017-05-01
material surfaces cannot be interpreted reliably unless other correlating information is developed. Surveying of fluid mud properties. At some locations...depth to manage navigation channels and ports requires a mud property that determines a navigability criteria, a practical method for surveying that...for managing navigation channels, (3) issues related to conducting hydrographic surveying in waterways with fluid mud bottoms, (4) the newest
ERIC Educational Resources Information Center
Swanson, James; Arnold, L. Eugene; Kraemer, Helena; Hechtman, Lily; Molina, Brooke; Hinshaw, Stephen; Vitiello, Benedetto; Jensen, Peter; Steinhoff, Ken; Lerner, Marc; Greenhill, Laurence; Abikoff, Howard; Wells, Karen; Epstein, Jeffery; Elliott, Glen; Newcorn, Jeffrey; Hoza, Betsy; Wigal, Timothy
2008-01-01
Objective: To review the primary and secondary findings from the Multimodal Treatment study of ADHD (MTA) published over the past decade as three sets of articles. Method: In a two-part article--Part I: Executive Summary (without distracting details) and Part II: Supporting Details (with additional background and detail required by the complexity…
Immunomagnetic separation can enrich fixed solid tumors for epithelial cells.
Yaremko, M L; Kelemen, P R; Kutza, C; Barker, D; Westbrook, C A
1996-01-01
Immunomagnetic separation is a highly specific technique for the enrichment or isolation of cells from a variety of fresh tissues and microorganisms or molecules from suspensions. Because new techniques for molecular analysis of solid tumors are now applicable to fixed tissue but sometimes require or benefit from enrichment for tumor cells, we tested the efficacy of immunomagnetic separation for enriching fixed solid tumors for malignant epithelial cells. We applied it to two different tumors and fixation methods to separate neoplastic from non-neoplastic cells in primary colorectal cancers and metastatic breast cancers, and were able to enrich to a high degree of purity. Immunomagnetic separation was effective in unembedded fixed tissue as well as fixed paraffin-embedded tissue. The magnetically separated cells were amenable to fluorescence in situ hybridization and polymerase chain reaction amplification of their DNA with minimal additional manipulation. The high degree of enrichment achieved before amplification contributed to interpretation of loss of heterozygosity in metastatic breast cancers, and simplified fluorescence in situ hybridization analysis because only neoplastic cells were hybridized and counted. Immunomagnetic separation is effective for the enrichment of fixed solid tumors, can be performed with widely available commercial antibodies, and requires little specialized instrumentation. It can contribute to interpretation of results in situations where enrichment by other methods is difficult or not possible.
Metrics for Offline Evaluation of Prognostic Performance
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai
2010-01-01
Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, and so on. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment, they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed, followed by a formal notational framework to help standardize subsequent developments.
Visualization of volumetric seismic data
NASA Astrophysics Data System (ADS)
Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk
2015-04-01
Mostly driven by demands of high quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods proved to be valuable for the analysis of 3D seismic data cubes - especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steeply dipping faults, causing a lot of diffractions. Without further preprocessing these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use of the commercially available general-purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable an easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.
Use and interpretation of logistic regression in habitat-selection studies
Keating, Kim A.; Cherry, Steve
2004-01-01
Logistic regression is an important tool for wildlife habitat-selection studies, but the method frequently has been misapplied due to an inadequate understanding of the logistic model, its interpretation, and the influence of sampling design. To promote better use of this method, we review its application and interpretation under 3 sampling designs: random, case-control, and use-availability. Logistic regression is appropriate for habitat use-nonuse studies employing random sampling and can be used to directly model the conditional probability of use in such cases. Logistic regression also is appropriate for studies employing case-control sampling designs, but careful attention is required to interpret results correctly. Unless bias can be estimated or probability of use is small for all habitats, results of case-control studies should be interpreted as odds ratios, rather than probability of use or relative probability of use. When data are gathered under a use-availability design, logistic regression can be used to estimate approximate odds ratios if probability of use is small, at least on average. More generally, however, logistic regression is inappropriate for modeling habitat selection in use-availability studies. In particular, using logistic regression to fit the exponential model of Manly et al. (2002:100) does not guarantee maximum-likelihood estimates, valid probabilities, or valid likelihoods. We show that the resource selection function (RSF) commonly used for the exponential model is proportional to a logistic discriminant function. Thus, it may be used to rank habitats with respect to probability of use and to identify important habitat characteristics or their surrogates, but it is not guaranteed to be proportional to probability of use. Other problems associated with the exponential model also are discussed. 
We describe an alternative model based on Lancaster and Imbens (1996) that offers a method for estimating conditional probability of use in use-availability studies. Although promising, this model fails to converge to a unique solution in some important situations. Further work is needed to obtain a robust method that is broadly applicable to use-availability studies.
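A minimal sketch of the kind of logistic fit discussed above, whose slope exponentiates to an odds ratio. The gradient-ascent fitter and the toy cover data are illustrative assumptions, not the estimators or data used in the paper.

```python
import math

def fit_logistic(x, y, iters=5000, lr=0.1):
    """Plain gradient-ascent logistic regression with one covariate.
    Returns (intercept, slope); exp(slope) is the odds ratio per unit of x."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# invented data: used sites (y=1) tend to have higher cover x than available sites (y=0)
x = [0.9, 0.8, 0.7, 0.85, 0.2, 0.3, 0.1, 0.25]
y = [1, 1, 1, 1, 0, 0, 0, 0]
b0, b1 = fit_logistic(x, y)
odds_ratio = math.exp(b1)
```

As the paper stresses, under a case-control or use-availability design this odds ratio is interpretable, but it should not be read as a probability of use.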
On simulation of no-slip condition in the method of discrete vortices
NASA Astrophysics Data System (ADS)
Shmagunov, O. A.
2017-10-01
When modeling flows of an incompressible fluid, it is convenient sometimes to use the method of discrete vortices (MDV), where the continuous vorticity field is approximated by a set of discrete vortex elements moving in the velocity field. The vortex elements have a clear physical interpretation, they do not require the construction of grids and are automatically adaptive, since they concentrate in the regions of greatest interest and successfully describe the flows of a non-viscous fluid. The possibility of using MDV in simulating flows of a viscous fluid was considered in the previous papers using the examples of flows past bodies with sharp edges with the no-penetration condition at solid boundaries. However, the appearance of vorticity on smooth boundaries requires the no-slip condition to be met when MDV is realized, which substantially complicates the initially simple method. In this connection, an approach is considered that allows solving the problem by simple means.
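A minimal free-space point-vortex sketch of the MDV idea follows; with no boundaries present, neither the no-penetration nor the no-slip condition is enforced, so this shows only the basic machinery the paper builds on. Names, the smoothing parameter, and the Euler time step are illustrative.

```python
import math

def induced_velocity(vortices, x, y, eps=1e-6):
    """Velocity at (x, y) induced by point vortices [(xi, yi, gamma_i)]
    via the 2D Biot-Savart law in free space (eps desingularizes r -> 0)."""
    u = v = 0.0
    for xi, yi, g in vortices:
        dx, dy = x - xi, y - yi
        r2 = dx * dx + dy * dy + eps
        u += -g * dy / (2.0 * math.pi * r2)
        v += g * dx / (2.0 * math.pi * r2)
    return u, v

def step(vortices, dt):
    """Advance every vortex one explicit Euler step in the velocity
    field induced by all the other vortices."""
    new = []
    for i, (x, y, g) in enumerate(vortices):
        others = [vv for j, vv in enumerate(vortices) if j != i]
        u, v = induced_velocity(others, x, y)
        new.append((x + dt * u, y + dt * v, g))
    return new

# two equal-sign vortices begin to co-rotate about their midpoint
moved = step([(-0.5, 0.0, 1.0), (0.5, 0.0, 1.0)], 0.1)
```

Enforcing boundary conditions on a smooth wall, the subject of the paper, requires additional vorticity generation at the surface on top of this scheme.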
Localized Smart-Interpretation
NASA Astrophysics Data System (ADS)
Lundh Gulbrandsen, Mats; Mejer Hansen, Thomas; Bach, Torben; Pallesen, Tom
2014-05-01
The complex task of setting up a geological model consists not only of combining available geological information into a conceptually plausible model, but also requires consistency with available data, e.g. geophysical data. However, in many cases the direct geological information, e.g. borehole samples, is very sparse, so in order to create a geological model the geologist needs to rely on the geophysical data. The problem, however, is that the amount of geophysical data is in many cases so vast that it is practically impossible to integrate all of it in the manual interpretation process. This means that much of the information available from the geophysical surveys goes unexploited, which is a problem because the resulting geological model does not fulfill its full potential and hence is less trustworthy. We suggest an approach to geological modeling that (1) allows all geophysical data to be considered when building the geological model, (2) is fast, and (3) allows quantification of geological modeling. The method is constructed to build a statistical model, f(d,m), describing the relation between what the geologist interprets, d, and what the geologist knows, m. The parameter m reflects any available information that can be quantified, such as geophysical data, the result of a geophysical inversion, elevation maps, etc. The parameter d reflects an actual interpretation, such as for example the depth to the base of a groundwater reservoir. First we infer a statistical model f(d,m) by examining sets of actual interpretations made by a geological expert, [d1, d2, ...], and the information used to perform the interpretation, [m1, m2, ...]. This makes it possible to quantify how the geological expert performs interpolation through f(d,m). As the geological expert proceeds with interpreting, the number of interpreted data points from which the statistical model is inferred increases, and therefore the accuracy of the statistical model increases.
When a model f(d,m) has successfully been inferred, we are able to simulate how the geological expert would perform an interpretation given some external information m, through f(d|m). We will demonstrate this method applied to geological interpretation and densely sampled airborne electromagnetic data. In short, our goal is to build a statistical model describing how a geological expert performs geological interpretation given some geophysical data. We then wish to use this statistical model to perform semi-automatic interpretation, wherever such geophysical data exist, in a manner consistent with the choices made by a geological expert. The benefits of such a statistical model are that (1) it provides a quantification of how a geological expert performs interpretation based on available diverse data, (2) all available geophysical information can be used, and (3) it allows much faster interpretation of large data sets.
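As a deliberately oversimplified stand-in for f(d,m), an ordinary least-squares line can relate a single geophysical feature m to an interpreted depth d; the feature, the data values, and the linear form are all invented for illustration.

```python
def fit_line(m, d):
    """Ordinary least squares d ~ a + b*m: a one-feature toy version of a
    statistical model relating geophysical information m to the expert's
    interpreted value d."""
    n = len(m)
    mbar = sum(m) / n
    dbar = sum(d) / n
    b = sum((mi - mbar) * (di - dbar) for mi, di in zip(m, d)) / \
        sum((mi - mbar) ** 2 for mi in m)
    a = dbar - b * mbar
    return a, b

# hypothetical pairs: a resistivity-derived feature vs. interpreted depth (m)
m = [10.0, 20.0, 30.0, 40.0]
d = [52.0, 61.0, 72.0, 81.0]
a, b = fit_line(m, d)
predict = lambda mi: a + b * mi  # simulate an interpretation at new m
```

In the actual method the relation is a full statistical model updated as the expert adds interpretations, so predictions come with uncertainty rather than a single line.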
A method for the measurement and analysis of ride vibrations of transportation systems
NASA Technical Reports Server (NTRS)
Catherines, J. J.; Clevenson, S. A.; Scholl, H. F.
1972-01-01
The measurement and recording of ride vibrations which affect passenger comfort in transportation systems and the subsequent data-reduction methods necessary for interpreting the data present exceptional instrumentation requirements and necessitate the use of computers for specialized analysis techniques. A method is presented for both measuring and analyzing ride vibrations of the type encountered in ground and air transportation systems. A portable system for measuring and recording low-frequency, low-amplitude accelerations and specialized data-reduction procedures are described. Sample vibration measurements in the form of statistical parameters representative of typical transportation systems are also presented to demonstrate the utility of the techniques.
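The "statistical parameters" step of such data reduction might look like the following minimal sketch, computing RMS and peak acceleration from a recorded trace; the sample values and function name are invented.

```python
def ride_statistics(accel_g):
    """RMS and peak absolute value of an acceleration trace (in g):
    two simple statistical ride-quality parameters."""
    n = len(accel_g)
    rms = (sum(a * a for a in accel_g) / n) ** 0.5
    peak = max(abs(a) for a in accel_g)
    return rms, peak

# invented low-amplitude acceleration samples
samples = [0.01, -0.02, 0.03, -0.01, 0.02, -0.03]
rms, peak = ride_statistics(samples)
```

Frequency-domain reductions such as power spectral densities would extend this in the same spirit.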
A privacy challenge to longitudinal study methods: patient-derived codes.
Clay, Fiona J; Ozanne-Smith, Joan; Watson, Wendy; Congiu, Melinda; Fox, Barbara
2006-08-01
Recent changes to privacy legislation in Australia have resulted in more stringent requirements with respect to maintaining the confidentiality of patient health information. We describe a method employed to de-identify health information collected in a longitudinal study using codes. Using a patient-derived code that did not change during the life of the study follow-up resulted in errors in a quarter of the follow-up surveys. This may introduce bias that could compromise the validity of the study. Alternative methods of coding may alleviate some of these issues. However, removal of some of the constraints imposed by interpretations of privacy legislation may be the best way forward.
Clark, Michael; Young, Trudie; Fallon, Maureen
2018-06-01
Successful prevention of pressure ulcers is the end product of a complex series of care processes including, but not limited to, the assessment of vulnerability to pressure damage; skin assessment and care; nutritional support; repositioning; and the use of beds, mattresses, and cushions to manage mechanical loads on the skin and soft tissues. The purpose of this review was to examine where and how Statistical Process Control (SPC) measures have been used to assess the success of quality improvement initiatives intended to improve pressure ulcer prevention. A search of 7 electronic bibliographic databases was performed on May 17th, 2017, for studies that met the inclusion criteria. SPC methods have been reported in 9 publications since 2010 to interpret changes in the incidence of pressure ulcers over time. While these methods offer more rapid interpretation of changes in incidence than is gained from a comparison of 2 arbitrarily selected time points pre- and post-implementation of change, more work is required to ensure that the clinical and scientific communities adopt the most appropriate SPC methods. © 2018 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
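One common SPC choice for incidence data of this kind is a u-chart; a minimal sketch follows, assuming monthly ulcer counts and patient-day exposures (both invented), with the usual 3-sigma limits.

```python
def u_chart(counts, exposures):
    """u-chart for event rates: center line ubar = total events / total
    exposure, with per-period 3-sigma limits ubar +/- 3*sqrt(ubar/n_i),
    floored at zero. One of several SPC chart types; data are illustrative."""
    ubar = sum(counts) / sum(exposures)
    limits = []
    for n in exposures:
        half = 3.0 * (ubar / n) ** 0.5
        limits.append((max(0.0, ubar - half), ubar + half))
    return ubar, limits

counts = [4, 6, 3, 5]               # pressure ulcers per month (invented)
exposures = [900, 1100, 950, 1000]  # patient-days at risk (invented)
ubar, limits = u_chart(counts, exposures)
```

A point outside its limits then signals special-cause variation rather than an arbitrary before/after contrast.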
Mapping forest canopy gaps using air-photo interpretation and ground surveys
Fox, T.J.; Knutson, M.G.; Hines, R.K.
2000-01-01
Canopy gaps are important structural components of forested habitats for many wildlife species. Recent improvements in the spatial accuracy of geographic information system tools facilitate accurate mapping of small canopy features such as gaps. We compared canopy-gap maps generated using ground survey methods with those derived from air-photo interpretation. We found that maps created from high-resolution air photos were more accurate than those created from ground surveys. Errors of omission were 25.6% for the ground-survey method and 4.7% for the air-photo method. One variable of interest in songbird research is the distance from nests to gap edges. Distances from real and simulated nests to gap edges were longer using the ground-survey maps versus the air-photo maps, indicating that gap omission could potentially bias the assessment of spatial relationships. If research or management goals require location and size of canopy gaps and specific information about vegetation structure, we recommend a 2-fold approach. First, canopy gaps can be located and the perimeters defined using 1:15,000-scale or larger aerial photographs and the methods we describe. Mapped gaps can then be field-surveyed to obtain detailed vegetation data.
Interpretive versus didactic learning approach towards oral biology: a student's perspective.
Farooq, Imran
2014-10-01
This study analyzed the preference of dental students for oral biology questions that require either an interpretive or a descriptive approach to answer, and compared these preferences with the students' final examination results retrospectively. A questionnaire requiring the student's academic number and containing two questions (one asked with an interpretive approach, the other with a descriptive approach) from random topics of the oral biology course was distributed among students who had already appeared in the final examination. The majority of the students who had achieved good grades (A+, A, B+, B) preferred interpretive questions, whereas the majority of the students with average grades (C+, C, D+, D) selected descriptive questions. A common reason for picking the interpretive question was that it enhances critical thinking. The descriptive questions were argued to give students a chance to explain more. Hence, students should be encouraged to learn interpretively to promote enquiry-based learning (EBL) and critical thinking.
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective: To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design: Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures: The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results: Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions: The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice.
Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854
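The kind of analysis the review finds missing — a measure of precision for the between-arm cost comparison — can be sketched with a bootstrap confidence interval, which is often preferred for right-skewed cost data. All numbers below are simulated for illustration only; they are not data from any reviewed trial.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient costs for two randomised arms; cost data are
# typically right-skewed, so log-normal values are simulated here.
arm_a = rng.lognormal(mean=7.0, sigma=0.8, size=120)
arm_b = rng.lognormal(mean=7.2, sigma=0.8, size=120)

def mean_diff_ci(x, y, n_boot=2000, alpha=0.05):
    """Bootstrap percentile CI for the difference in mean costs (y - x)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = (rng.choice(y, size=y.size).mean()
                    - rng.choice(x, size=x.size).mean())
    lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2])
    return y.mean() - x.mean(), lo, hi

diff, ci_lo, ci_hi = mean_diff_ci(arm_a, arm_b)
print(f"difference in mean costs: {diff:.1f}, 95% CI ({ci_lo:.1f}, {ci_hi:.1f})")
```

Reporting the interval alongside the point estimate is exactly the "measure of precision" the review found absent in most trials.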
NASA Astrophysics Data System (ADS)
Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke
2016-05-01
Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells and may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize these resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information alone is insufficient for classifying GSCs and for paving the way toward improved therapeutics for heterogeneous glioma.
Estimation and interpretation of genetic effects with epistasis using the NOIA model.
Alvarez-Castro, José M; Carlborg, Orjan; Rönnegård, Lars
2012-01-01
We introduce this communication with a brief outline of the historical landmarks in genetic modeling, especially concerning epistasis. Then, we present methods for the use of genetic modeling in QTL analyses. In particular, we summarize the essential expressions of the natural and orthogonal interactions (NOIA) model of genetic effects. Our motivation for reviewing that theory here is twofold. First, this review presents a digest of the expressions for the application of the NOIA model, which are often mixed with intermediate and additional formulae in the original articles. Second, we make the required theory handy for the reader to relate the genetic concepts to the particular mathematical expressions underlying them. We illustrate those relations by providing graphical interpretations and a diagram summarizing the key features for applying genetic modeling with epistasis in comprehensive QTL analyses. Finally, we briefly review some examples of the application of NOIA to real data and the way it improves the interpretability of the results.
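As a toy illustration of the kind of model-based estimation described above, the sketch below recovers additive and dominance effects at a single biallelic locus by weighted regression of genotypic values on centred genotype codes. This is a reduced sketch in the spirit of the NOIA statistical formulation, not its exact design matrix, and all values (genotypic values, genotype frequencies) are hypothetical.

```python
import numpy as np

# Hypothetical genotypic values for genotypes A1A1, A1A2, A2A2 at one locus,
# and assumed genotype frequencies in the population (illustration only).
G = np.array([10.0, 14.0, 12.0])      # genotypic values
p = np.array([0.25, 0.50, 0.25])      # genotype frequencies

n2 = np.array([0.0, 1.0, 2.0])        # count of the A2 allele per genotype
h = np.array([0.0, 1.0, 0.0])         # heterozygosity indicator

# Weighted regression of G on a centred allele count and a centred
# heterozygosity indicator; the centring makes the columns orthogonal
# under these frequencies, mirroring the orthogonality property of NOIA.
X = np.column_stack([np.ones(3), n2 - (p * n2).sum(), h - (p * h).sum()])
W = np.diag(p)
mu, alpha, delta = np.linalg.solve(X.T @ W @ X, X.T @ W @ G)

print(f"reference mu = {mu:.2f}, additive = {alpha:.2f}, dominance = {delta:.2f}")
```

With these symmetric frequencies the estimates reduce to the familiar functional quantities: the additive effect a = (12 − 10)/2 = 1 and the dominance deviation d = 14 − 11 = 3.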
Achee, Nicole L; Youngblood, Laura; Bangs, Michael J; Lavery, James V; James, Stephanie
2015-02-01
A thorough search of the existing literature has revealed that there are currently no published recommendations or guidelines for the interpretation of US regulations on the use of human participants in vector biology research (VBR). An informal survey of vector biologists has indicated that issues related to human participation in vector research have been largely debated by academic, national, and local Institutional Review Boards (IRBs) in the countries where the research is being conducted, and that interpretations and subsequent requirements made by these IRBs have varied widely. This document is intended to provide investigators and corresponding scientific and ethical review committee members an introduction to VBR methods involving human participation and the legal and ethical framework in which such studies are conducted with a focus on US Federal Regulations. It is also intended to provide a common perspective for guiding researchers, IRB members, and other interested parties (i.e., public health officials conducting routine entomological surveillance) in the interpretation of human subjects regulations pertaining to VBR.
Interpretation and mapping of gypsy moth defoliation from ERTS (LANDSAT)-1 temporal composites
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator); Kowalik, W. S.
1975-01-01
The author has identified the following significant results. Photointerpretation of temporally composited color Diazo transparencies of ERTS (LANDSAT) images is a practical method for detecting and locating levels of widespread defoliation. ERTS 9 x 9 inch images are essentially orthographic and are produced at a nearly constant 1:1,000,000 scale. This allows direct superposition of scenes for temporal composites. ERTS coverage provides a sweeping 180 km (110 mile) wide view, permitting one interpreter to rapidly delineate defoliation in an area requiring days or weeks of work by aerial surveys or computerized processing. Defoliation boundaries can be located on the images within maximum errors on the order of hundreds of meters. The enhancement process is much less expensive than aerial surveys or computerized processing. Maps produced directly from interpretation are manageable working products. The 18-day periodic coverage of ERTS is not frequent enough to replace aerial survey mapping because defoliation and refoliation move as waves.
Code of Federal Regulations, 2011 CFR
2011-10-01
... interpretation of diagnostic radiology and other diagnostic tests. 415.180 Section 415.180 Public Health CENTERS... for the interpretation of diagnostic radiology and other diagnostic tests. (a) General rule. Physician fee schedule payment is made for the interpretation of diagnostic radiology and other diagnostic tests...
Code of Federal Regulations, 2010 CFR
2010-10-01
... interpretation of diagnostic radiology and other diagnostic tests. 415.180 Section 415.180 Public Health CENTERS... for the interpretation of diagnostic radiology and other diagnostic tests. (a) General rule. Physician fee schedule payment is made for the interpretation of diagnostic radiology and other diagnostic tests...
Dynamic stress changes during earthquake rupture
Day, S.M.; Yu, G.; Wald, D.J.
1998-01-01
We assess two competing dynamic interpretations that have been proposed for the short slip durations characteristic of kinematic earthquake models derived by inversion of earthquake waveform and geodetic data. The first interpretation would require a fault constitutive relationship in which rapid dynamic restrengthening of the fault surface occurs after passage of the rupture front, a hypothesized mechanical behavior that has been referred to as "self-healing." The second interpretation would require sufficient spatial heterogeneity of stress drop to permit rapid equilibration of elastic stresses with the residual dynamic friction level, a condition we refer to as "geometrical constraint." These interpretations imply contrasting predictions for the time dependence of the fault-plane shear stresses. We compare these predictions with dynamic shear stress changes for the 1992 Landers (M 7.3), 1994 Northridge (M 6.7), and 1995 Kobe (M 6.9) earthquakes. Stress changes are computed from kinematic slip models of these earthquakes, using a finite-difference method. For each event, static stress drop is highly variable spatially, with high stress-drop patches embedded in a background of low, and largely negative, stress drop. The time histories of stress change show predominantly monotonic stress change after passage of the rupture front, settling to a residual level, without significant evidence for dynamic restrengthening. The stress change at the rupture front is usually gradual rather than abrupt, probably reflecting the limited resolution inherent in the underlying kinematic inversions. On the basis of this analysis, as well as recent similar results obtained independently for the Kobe and Morgan Hill earthquakes, we conclude that, at the present time, the self-healing hypothesis is unnecessary to explain earthquake kinematics.
[Electrocardiographic interpretation in athletes : 2017 recommendations for non-cardiologists].
Meyer, Philippe; Gabus, Vincent
2017-07-12
A resting electrocardiogram (ECG) is recommended for screening for sudden cardiac death in young athletes. However, ECG interpretation in athletes requires adequate training, because normal physiological adaptations to training can sometimes be hard to distinguish from abnormal findings suggestive of underlying pathology. In 2017, a consensus of international experts established new recommendations for a clear and accurate interpretation of ECGs in athletes. This article aims to guide non-cardiologists through these new data, allowing better triage of anomalies requiring further investigation.
Translating Radiometric Requirements for Satellite Sensors to Match International Standards.
Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong
2014-01-01
International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where a traditional error analysis framework persists. For a satellite instrument under development, this can create confusion about whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product-level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework, leading to uniform interpretation throughout the development and operation of any satellite instrument.
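For independent components, the propagation-of-uncertainties step referred to above reduces to a root-sum-square combination of the component standard uncertainties, optionally expanded by a coverage factor. The component names and values below are illustrative assumptions, not actual ABI/GOES-R requirements.

```python
import math

# Hypothetical component requirements for a radiometric product, each
# originally specified as a separate "error" term (illustrative values).
components = {
    "calibration": 0.8,   # % standard uncertainty
    "linearity":   0.3,
    "striping":    0.4,
    "navigation":  0.2,
}

# Propagation of uncertainties for independent terms: the combined
# standard uncertainty is the root-sum-square of the components.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (~95% level).
U = 2 * u_combined
print(f"combined u = {u_combined:.3f} %, expanded U (k=2) = {U:.3f} %")
```

The single combined figure is the kind of unified specification the paper argues for, in place of a list of separately budgeted "errors".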
Ruppel, T; van den Berg, N; Hoffmann, W
2016-10-01
Objective: Triggered by the AGnES model project of the University Medicine Greifswald, the Code of Social Law V was changed by the German Lower and Upper Houses of Parliament (Bundestag and Bundesrat) in 2008 so that the delegation of GPs' activities to non-physician colleagues was allowed under highly restricted preconditions. Delegated home visits should become an integral part of standard care in Germany. In this study, the implementation of § 87 para 2b clause 5 SGB V, established in Annex 8 of the Federal Collective Agreement, was checked for its legality in terms of qualification requirements. Methods: The problem was examined with the legal methods of interpretation in pursuance of the norm and the methods of systematic, historic and teleologic interpretation. Results: Even though Parliament clearly required orientation to the AGnES model project (in order to assure safe and effective care in delegated home visits), the self-governing bodies' implementation of the law fell far short of these guidelines. The main outcome of the legal analysis was that the implementation arrangements of the Code of Social Law V are predominantly illegal. Conclusions: The parties to the Federal Collective Agreement have to change the arrangements to meet the requirements of Parliament and to avoid liability risks for delegating GPs. © Georg Thieme Verlag KG Stuttgart · New York.
The effects of deep network topology on mortality prediction.
Hao Du; Ghassemi, Mohammad M; Mengling Feng
2016-08-01
Deep learning has achieved remarkable results in the areas of computer vision, speech recognition, natural language processing and, most recently, even playing Go. The application of deep learning to problems in healthcare, however, has gained attention only in recent years, and its ultimate place at the bedside remains a topic of skeptical discussion. While there is a growing academic interest in the application of Machine Learning (ML) techniques to clinical problems, many in the clinical community see little incentive to upgrade from simpler methods, such as logistic regression, to deep learning. Logistic regression, after all, provides odds ratios, p-values and confidence intervals that allow for ease of interpretation, while deep nets are often seen as 'black boxes' which are difficult to understand and, as yet, have not demonstrated performance levels far exceeding their simpler counterparts. If deep learning is ever to take a place at the bedside, it will require studies which (1) showcase the performance of deep-learning methods relative to other approaches and (2) interpret the relationships between network structure, model performance, features and outcomes. We have chosen these two requirements as the goal of this study. In our investigation, we utilized a publicly available EMR dataset of over 32,000 intensive care unit patients and trained a Deep Belief Network (DBN) to predict patient mortality at discharge. Utilizing an evolutionary algorithm, we demonstrate automated topology selection for DBNs. We demonstrate that with the correct topology selection, DBNs can achieve better prediction performance compared to several benchmarking methods.
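The topology-selection idea can be sketched as a small evolutionary loop over candidate hidden-layer configurations. The fitness function below is a stand-in (a real one would train and validate a DBN on held-out data), and the layer sizes and mutation operators are assumptions for illustration, not the paper's actual algorithm.

```python
import random

random.seed(1)

def fitness(topology):
    """Stand-in for validation performance of a DBN with the given
    hidden-layer sizes; here we simply prefer a moderate total size
    and fewer layers (illustrative only)."""
    return -abs(sum(topology) - 300) - 5 * len(topology)

def mutate(topology):
    """Randomly grow, shrink, or resize a hidden-layer configuration."""
    t = list(topology)
    op = random.choice(["grow", "shrink", "resize"])
    if op == "grow" and len(t) < 4:
        t.append(random.choice([32, 64, 128]))
    elif op == "shrink" and len(t) > 1:
        t.pop(random.randrange(len(t)))
    else:
        i = random.randrange(len(t))
        t[i] = max(16, t[i] + random.choice([-32, 32]))
    return t

# Simple (mu + lambda) evolutionary loop over topologies.
population = [[random.choice([64, 128, 256])] for _ in range(6)]
for generation in range(30):
    offspring = [mutate(random.choice(population)) for _ in range(6)]
    population = sorted(population + offspring, key=fitness, reverse=True)[:6]

best = population[0]
print("best topology:", best, "fitness:", fitness(best))
```

Replacing the stub fitness with cross-validated AUC of a trained network turns this loop into the kind of automated topology search the abstract describes.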
[Hierarchy structuring for mammography technique by interpretive structural modeling method].
Kudo, Nozomi; Kurowarabi, Kunio; Terashita, Takayoshi; Nishimoto, Naoki; Ogasawara, Katsuhiko
2009-10-20
Participation in screening mammography is currently desired in Japan because of the increase in breast cancer morbidity. However, the pain and discomfort of mammography are recognized as a significant deterrent for women considering this examination. Thus quick procedures, sufficient experience, and advanced skills are required of radiologic technologists. The aim of this study was to make the key points of the imaging technique explicit and to help practitioners understand the complicated procedure. We interviewed 3 technologists who were highly skilled in mammography, and 14 factors were retrieved by using brainstorming and the KJ method. We then applied Interpretive Structural Modeling (ISM) to the factors and developed a hierarchical concept structure. The result showed a six-layer hierarchy whose top node was explanation of the entire mammography procedure. Male technologists were identified as a negative factor. Factors concerned with explanation were at the upper nodes. We paid particular attention to X-ray techniques and related considerations. The findings will help beginners improve their skills.
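The core of ISM is mechanical: build a reachability matrix as the boolean transitive closure of the expert-elicited direct-influence matrix, then partition factors into hierarchy levels. The sketch below uses a hypothetical five-factor influence matrix (the study itself used 14 factors elicited from technologists).

```python
import numpy as np

# Hypothetical direct-influence matrix among five factors
# (A[i, j] = True means "factor i influences factor j").
A = np.array([
    [0, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
], dtype=bool)

# Reachability matrix: boolean transitive closure of A plus the
# identity (Warshall's algorithm).
R = A | np.eye(5, dtype=bool)
for k in range(5):
    R = R | (R[:, [k]] & R[[k], :])

# Level partitioning: a factor sits at the current top level when its
# reachability set (within the remaining factors) is contained in its
# antecedent set.
levels, remaining = [], set(range(5))
while remaining:
    level = set()
    for i in remaining:
        reach = {j for j in remaining if R[i, j]}
        ante = {j for j in remaining if R[j, i]}
        if reach <= ante:
            level.add(i)
    levels.append(sorted(level))
    remaining -= level

print("levels (top first):", levels)
```

For this toy chain 0 → 1 → {2, 3} → 4, the partition yields four levels with factor 4 at the top, mirroring how the study's six-layer hierarchy is derived from pairwise influence judgments.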
A possibilistic approach to clustering
NASA Technical Reports Server (NTRS)
Krishnapuram, Raghu; Keller, James M.
1993-01-01
Fuzzy clustering has been shown to be advantageous over crisp (or traditional) clustering methods in that total commitment of a vector to a given class is not required at each iteration of the pattern recognition process. Recently fuzzy clustering methods have shown spectacular ability to detect not only hypervolume clusters, but also clusters which are actually 'thin shells', i.e., curves and surfaces. Most analytic fuzzy clustering approaches are derived from the 'Fuzzy C-Means' (FCM) algorithm. The FCM uses the probabilistic constraint that the memberships of a data point across classes sum to one. This constraint was used to generate the membership update equations for an iterative algorithm. Recently, we cast the clustering problem into the framework of possibility theory using an approach in which the resulting partition of the data can be interpreted as a possibilistic partition, and the membership values may be interpreted as degrees of possibility of the points belonging to the classes. We show the ability of this approach to detect linear and quartic curves in the presence of considerable noise.
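The contrast between the probabilistic FCM constraint and the possibilistic interpretation can be shown numerically. For fuzzifier m = 2, the FCM membership of a point in class i is u_i = (1/d_i²) / Σ_k (1/d_k²), while a possibilistic typicality can be written t_i = 1 / (1 + d_i²/η_i); the distances and scale parameters η_i below are hypothetical.

```python
import numpy as np

# Squared distances from one data point to three cluster prototypes
# (hypothetical values), fuzzifier m = 2.
d2 = np.array([1.0, 4.0, 9.0])
m = 2.0

# Fuzzy C-Means membership update: the probabilistic constraint forces
# memberships across classes to sum to one.
u = (1.0 / d2) ** (1.0 / (m - 1))
u /= u.sum()

# Possibilistic typicality: each class is judged against its own scale
# parameter eta_i (assumed here), so the values need not sum to one.
eta = np.array([2.0, 2.0, 2.0])
t = 1.0 / (1.0 + (d2 / eta) ** (1.0 / (m - 1)))

print("FCM memberships:", np.round(u, 3))
print("possibilistic typicalities:", np.round(t, 3))
```

Note that the FCM values are relative (they trade off against each other), whereas each typicality depends only on its own class — the property that lets the possibilistic partition downweight noise points in all classes at once.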
Deng, Fengyuan; Ulcickas, James R W; Simpson, Garth J
2016-11-03
Fluorescence optical rotary dispersion (F-ORD) is proposed as a novel chiral-specific and interface-specific spectroscopic method. F-ORD measurements of uniaxial assemblies are predicted to be fully electric-dipole-allowed, with corresponding increases in sensitivity to chirality relative to chiral-specific measurements in isotropic assemblies that are commonly interpreted through coupling between electric and magnetic dynamic dipoles. Observations of strong chiral sensitivity in prior single-molecule fluorescence measurements of chiral interfacial molecules are in excellent qualitative agreement with the predictions of the F-ORD mechanism and challenging to otherwise explain. F-ORD may provide methods to suppress background fluorescence in studies of biological interfaces, as the detected signal requires both polar local order and interfacial chirality. In addition, the molecular-level descriptions of the mechanisms underpinning F-ORD may also potentially apply to aid in interpreting chiral-specific Raman and surface-enhanced Raman spectroscopy measurements of uniaxially oriented assemblies, opening up opportunities for chiral-specific and interface-specific vibrational spectroscopy.
Enabling scientific workflows in virtual reality
Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.
2006-01-01
To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.
Challenges, uncertainties, and issues facing gas production from gas-hydrate deposits
Moridis, G.J.; Collett, T.S.; Pooladi-Darvish, M.; Hancock, S.; Santamarina, C.; Boswel, R.; Kneafsey, T.; Rutqvist, J.; Kowalsky, M.B.; Reagan, M.T.; Sloan, E.D.; Sum, A.K.; Koh, C.A.
2011-01-01
The current paper complements the Moridis et al. (2009) review of the status of the effort toward commercial gas production from hydrates. We aim to describe the concept of the gas-hydrate (GH) petroleum system; to discuss advances, requirements, and suggested practices in GH prospecting and GH deposit characterization; and to review the associated technical, economic, and environmental challenges and uncertainties, which include the following: accurate assessment of producible fractions of the GH resource; development of methods for identifying suitable production targets; sampling of hydrate-bearing sediments (HBS) and sample analysis; analysis and interpretation of geophysical surveys of GH reservoirs; well-testing methods; interpretation of well-testing results; geomechanical and reservoir/well stability concerns; well design, operation, and installation; field operations and extending production beyond sand-dominated GH reservoirs; monitoring production and geomechanical stability; laboratory investigations; fundamental knowledge of hydrate behavior; the economics of commercial gas production from hydrates; and associated environmental concerns. © 2011 Society of Petroleum Engineers.
CET exSim: mineral exploration experience via simulation
NASA Astrophysics Data System (ADS)
Wong, Jason C.; Holden, Eun-Jung; Kovesi, Peter; McCuaig, T. Campbell; Hronsky, Jon
2013-08-01
Undercover mineral exploration is a challenging task as it requires understanding of subsurface geology by relying heavily on remotely sensed (i.e. geophysical) data. Cost-effective exploration is essential in order to increase the chance of success using finite budgets. This requires effective decision-making in both the process of selecting the optimum data collection methods and in the process of achieving accuracy during subsequent interpretation. Traditionally, developing the skills, behaviour and practices of exploration decision-making requires many years of experience through working on exploration projects under various geological settings, commodities and levels of available resources. This implies long periods of sub-optimal exploration decision-making, before the necessary experience has been successfully obtained. To address this critical industry issue, our ongoing research focuses on the development of the unique and novel e-learning environment, exSim, which simulates exploration scenarios where users can test their strategies and learn the consequences of their choices. This simulator provides an engaging platform for self-learning and experimentation in exploration decision strategies, providing a means to build experience more effectively. The exSim environment also provides a unique platform on which numerous scenarios and situations (e.g. deposit styles) can be simulated, potentially allowing the user to become virtually familiarised with a broader scope of exploration practices. Harnessing the power of computer simulation, visualisation and an intuitive graphical user interface, the simulator provides a way to assess the user's exploration decisions and subsequent interpretations. In this paper, we present the prototype functionalities in exSim including: simulation of geophysical surveys, follow-up drill testing and interpretation assistive tools.
NASA Technical Reports Server (NTRS)
Davis, Robert E.
2002-01-01
The presentation provides an overview of requirement and interpretation letters, mechanical systems safety interpretation letter, design and verification provisions, and mechanical systems verification plan.
Clinical Laboratories – Production Factories or Specialized Diagnostic Centers
Tóth, Judit
2016-01-01
Since a large proportion of medical decisions are based on laboratory results, clinical laboratories should meet the increasing demand of clinicians and their patients. Huge central laboratories may process over 10 million tests annually; they act as production factories, measuring emergency and routine tests with sufficient speed and accuracy. At the same time, they also serve as specialized diagnostic centers where well-trained experts analyze and interpret special test results. It is essential to improve and constantly monitor this complex laboratory service, by several methods. Sample transport by pneumatic tube system, use of an advanced laboratory information system and point-of-care testing may result in decreased total turnaround time. The optimization of test ordering may result in a faster and more cost-effective laboratory service. Autovalidation can save time for laboratory specialists, when the analysis of more complex results requires their attention. Small teams of experts responsible for special diagnostic work, and their interpretative reporting according to predetermined principles, may help to minimize subjectivity of these special reports. Although laboratory investigations have become so diversely developed in the past decades, it is essential that the laboratory can provide accurate results relatively quickly, and that laboratory specialists can support the diagnosis and monitoring of patients by adequate interpretation of esoteric laboratory methods. PMID:27683528
NASA Astrophysics Data System (ADS)
Ironi, Liliana; Tentoni, Stefania
2009-08-01
The last decade has witnessed major advancements in the direct application of functional imaging techniques to several clinical contexts. Unfortunately, this is not the case in Electrocardiology. As a matter of fact, epicardial maps, which can reveal electrical conduction pathologies that routine surface ECG analysis may miss, can be obtained non-invasively from body surface data through mathematical model-based reconstruction methods. However, their interpretation still requires highly specialized skills that belong to few experts. The automated detection of salient patterns in the map, grounded on the existing interpretation rationale, would therefore represent a major contribution towards the clinical use of such valuable tools, whose diagnostic potential is still largely unexploited. We focus on epicardial activation isochronal maps, which convey information about the heart's electric function in terms of the depolarization wavefront kinematics. An approach grounded on the integration of a Spatial Aggregation (SA) method with concepts borrowed from Computational Geometry provides a computational framework to extract, from the given activation data, a few basic features that characterize the wavefront propagation, as well as a more specific set of features that identify an important class of heart rhythm pathologies, namely reentry arrhythmias due to block of conduction.
Arias, María Luisa Flores; Champion, Jane Dimmitt; Soto, Norma Elva Sáenz
2017-08-01
The aim was the development of a Spanish Version Contraceptive Self-efficacy Scale for use among heterosexual Mexican populations of reproductive age (18-35 years). Use of family planning methods has decreased in Mexico, which may lead to an increase in unintended pregnancies. Contraceptive self-efficacy is considered a predictor and precursor of the use of family planning methods. A cross-sectional, descriptive study design was used to assess contraceptive self-efficacy among a heterosexual Mexican population (N=160) of reproductive age (18-35 years). Adaptation of the Spanish Version Contraceptive Self-efficacy Scale was conducted prior to instrument administration. Exploratory and confirmatory factor analyses identified seven factors with an explained variance of 72.812%. The adapted scale had a Cronbach alpha of 0.771. A significant correlation between the Spanish Version Contraceptive Self-efficacy Scale and the use of family planning methods was identified. The Spanish Version Contraceptive Self-efficacy Scale has an acceptable Cronbach alpha. Exploratory factor analysis identified 7 components. A positive correlation between self-reported contraceptive self-efficacy and family planning method use was identified. This scale may be used among heterosexual Mexican men and women of reproductive age. The factor analysis (7 factors versus 4 factors for the original scale) identified a discrepancy in interpretation between the Spanish and English language versions. Findings obtained via the Spanish version among heterosexual Mexican men and women of reproductive age require interpretation in light of the differences identified in these analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
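The reliability coefficient reported above can be computed directly: Cronbach's alpha is α = k/(k−1) · (1 − Σ item variances / variance of the total score). The sketch below uses small hypothetical Likert-type data, not the study's responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical Likert-type responses (5 respondents x 4 items).
scores = [
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
]
print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")
```

Values around 0.7 or above, like the 0.771 reported for the adapted scale, are conventionally taken as acceptable internal consistency.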
Integration of Geophysical Methods By A Generalised Probability Tomography Approach
NASA Astrophysics Data System (ADS)
Mauriello, P.; Patella, D.
In modern science, the propensity interpretative approach stands on the assumption that any physical system consists of two kinds of reality: actual and potential. Geophysical data systems, too, have potentialities that extend far beyond the few actual models normally attributed to them. Indeed, any geophysical data set is in itself inherently ambiguous. Classical deterministic inversion, including tomography, usually forces a measured data set to collapse into a few rather subjective models based on some available a priori information. Classical interpretation is thus an intrinsically limited approach requiring a very deep logical extension. We think that a way to highlight a system's full potentiality is to introduce probability as the leading paradigm in dealing with field data systems. Probability tomography has recently been introduced as a completely new approach to data interpretation. It was originally formulated for the self-potential method and has since been extended to the geoelectric, natural-source electromagnetic induction, gravity and magnetic methods. Following the same rationale, in this paper we generalize the probability tomography theory to a generic geophysical anomaly vector field, including the treatment of scalar fields as a particular case. This generalization then makes it possible to address for the first time the problem of the integration of different methods by a conjoint probability tomography imaging procedure. The aim is to infer the existence of an unknown buried object through the analysis of an ad hoc occurrence probability function, blending the physical messages brought forth by a set of singularly observed anomalies.
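The occurrence-probability idea can be illustrated in a toy 1D setting: cross-correlate an observed anomaly profile with the theoretical signature of an elementary source placed at each trial position, normalising so the result lies in [−1, 1]. The geometry and source model below are hypothetical assumptions, not the paper's full vector-field theory.

```python
import numpy as np

# Observation coordinates along a 1D profile.
x = np.linspace(-50.0, 50.0, 201)

def scanner(xq, depth=10.0):
    """Anomaly of an elementary (monopole-like) source at (xq, depth)."""
    return depth / ((x - xq) ** 2 + depth ** 2) ** 1.5

# Synthetic "observed" anomaly: a source at x = 12 plus a little noise.
rng = np.random.default_rng(0)
data = scanner(12.0) + 0.0005 * rng.normal(size=x.size)

# Normalised cross-correlation of the data with the scanner function at
# each trial position: an occurrence-probability-like function in [-1, 1].
eta = np.array([
    np.sum(data * scanner(xq))
    / np.sqrt(np.sum(data ** 2) * np.sum(scanner(xq) ** 2))
    for xq in x
])

x_best = x[np.argmax(eta)]
print(f"occurrence probability peaks at x = {x_best:.1f}")
```

The peak of the normalised correlation marks the most probable source position; the conjoint procedure in the paper combines such functions computed from several geophysical methods.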
The verification of ethnographic data.
Pool, Robert
2017-09-01
Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.
NASA Astrophysics Data System (ADS)
Mackens, Sonja; Klitzsch, Norbert; Grützner, Christoph; Klinger, Riccardo
2017-09-01
Detailed information on shallow sediment distribution in basins is required to achieve solutions for problems in Quaternary geology, geomorphology, neotectonics, (geo)archaeology, and climatology. Usually, detailed information is obtained by studying outcrops and shallow drillings. Unfortunately, such data are often sparsely distributed and thus cannot characterise entire basins in detail. Therefore, they are frequently combined with remote sensing methods to overcome this limitation. Remote sensing can cover entire basins but provides information on the land surface only. Geophysical methods can close the gap between detailed sequences of the shallow sediment inventory from drillings at a few spots and continuous surface information from remote sensing. However, their interpretation in terms of sediment types is often challenging, especially under permafrost conditions. Here we present an approach for the joint interpretation of the geophysical methods ground penetrating radar (GPR) and capacitive coupled resistivity (CCR), drill core, and remote sensing data. The methods GPR and CCR were chosen because they allow relatively fast surveying and provide complementary information. We apply the approach to the middle Orkhon Valley in central Mongolia where fluvial, alluvial, and aeolian processes led to complex sediment architecture. The GPR and CCR data, measured on profiles with a total length of about 60 km, indicate the presence of two distinct layers across the entire survey area: (i) a thawed layer at the surface, and (ii) a frozen layer below. In a first interpretation step, we establish a geophysical classification by considering the geophysical signatures of both layers. We use sedimentological information from core logs to relate the geophysical classes to sediment types. This analysis reveals internal structures of Orkhon River sediments, such as channels and floodplain sediments. 
We also distinguish alluvial fan deposits and aeolian sediments by their distinct geophysical signature. With this procedure we map aeolian sediments, debris flow sediments, floodplains, and channel sediments along the measured profiles in the entire basin. We show that the joint interpretation of drillings and geophysical profile measurements matches the information from remote sensing data, i.e., the sediment architecture of vast areas can be characterised by combining these techniques. The method presented here proves powerful for characterising large areas with minimal effort and can be applied to similar settings.
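The classification step described above, relating GPR/CCR signatures to sediment types via core logs, can be caricatured as a simple decision rule. This is a sketch only: the thresholds and class names below are invented for illustration and are not the study's calibrated values.

```python
def classify_signature(resistivity_ohm_m, gpr_penetration_m):
    """Toy joint GPR/CCR classification. Thresholds are hypothetical,
    NOT values from the Orkhon Valley study."""
    if resistivity_ohm_m > 1000 and gpr_penetration_m > 5:
        return "aeolian sand"         # dry, resistive, radar-transparent
    if resistivity_ohm_m > 300:
        return "alluvial fan debris"  # coarse, moderately resistive
    if gpr_penetration_m < 1:
        return "floodplain fines"     # conductive silts/clays attenuate radar
    return "channel sediments"

print(classify_signature(1500, 8))   # a resistive, radar-transparent signature
```

In the real workflow such rules would be derived from the core-log calibration, and each profile segment would additionally inherit the two-layer (thawed/frozen) context before classification.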
Murphy, Marilyn K.; Kowalski, Kurt P.; Grapentine, Joel L.
2010-01-01
The geocontrol template method was developed to georeference multiple, overlapping analog aerial photographs without reliance upon conventionally obtained horizontal ground control. The method was tested as part of a long-term wetland habitat restoration project at a Lake Erie coastal wetland complex in the U.S. Fish and Wildlife Service Ottawa National Wildlife Refuge. As in most coastal wetlands, annually identifiable ground-control features required to georeference photo-interpreted data are difficult to find. The geocontrol template method relies on the following four components: (a) an uncontrolled aerial photo mosaic of the study area, (b) global positioning system (GPS) derived horizontal coordinates of each photo’s principal point, (c) a geocontrol template created by the transfer of fiducial markings and calculated principal points to clear acetate from individual photographs arranged in a mosaic, and (d) the root-mean-square-error testing of the system to ensure an acceptable level of planimetric accuracy. Once created for a study area, the geocontrol template can be registered in geographic information system (GIS) software to facilitate interpretation of multiple images without individual image registration. The geocontrol template enables precise georeferencing of single images within larger blocks of photographs using a repeatable and consistent method.
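Component (d), the root-mean-square-error test, can be sketched in a few lines. The coordinates below are hypothetical, not the refuge survey data.

```python
import math

def rmse(predicted, reference):
    """Root-mean-square error between matched 2-D point sets (same units, e.g. metres)."""
    assert len(predicted) == len(reference)
    sq = [(px - rx) ** 2 + (py - ry) ** 2
          for (px, py), (rx, ry) in zip(predicted, reference)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical check points: template-derived vs GPS-surveyed coordinates (m)
template_pts = [(100.0, 200.0), (350.5, 120.2), (80.3, 410.9)]
gps_pts      = [(101.2, 199.1), (349.8, 121.0), (81.0, 410.0)]

error = rmse(template_pts, gps_pts)
print(f"planimetric RMSE = {error:.2f} m")
```

In practice the template would be accepted only if this error falls below the project's planimetric accuracy tolerance.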
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and one of the most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The treating physician is responsible for properly formulating the analytical request and for providing the laboratory with complete and precise patient information, which are indispensable prerequisites for proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of cure. Besides the gold-standard blood culture technique, microbiological methods that shorten the time to a relevant result are increasingly used today. For certain pathogens, the organism can be identified directly from the blood culture bottle after propagation, using serological, automated/semi-automated or molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption-ionization time-of-flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately of all relevant information when a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnostics, the microbiologist and the clinician should interact directly.
10 CFR 34.5 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Interpretations. 34.5 Section 34.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS General Provisions § 34.5 Interpretations. Except as specifically authorized by the...
10 CFR 34.5 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Interpretations. 34.5 Section 34.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS General Provisions § 34.5 Interpretations. Except as specifically authorized by the...
10 CFR 34.5 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Interpretations. 34.5 Section 34.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS General Provisions § 34.5 Interpretations. Except as specifically authorized by the...
10 CFR 34.5 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Interpretations. 34.5 Section 34.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS General Provisions § 34.5 Interpretations. Except as specifically authorized by the...
10 CFR 34.5 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Interpretations. 34.5 Section 34.5 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS General Provisions § 34.5 Interpretations. Except as specifically authorized by the...
NASA Astrophysics Data System (ADS)
Clausing, Eric; Kraetzer, Christian; Dittmann, Jana; Vielhauer, Claus
2012-10-01
An important part of criminalistic forensics is the analysis of toolmarks. Such toolmarks often consist of many individual striations, scratches and dents, which can support conclusions regarding the sequence of events or the tools used. To obtain reliable results from an automated analysis and contactless acquisition of such toolmarks, a detailed digital representation of the marks, their orientation, and their placement relative to each other is required. For marks from firearms and tools, the desired result of an analysis is a conclusion as to whether or not a mark was generated by a tool under suspicion. For toolmark analysis on locking cylinders, the aim is not identification of the tool used but rather identification of the opening method. The challenge of such an identification is that a one-to-one comparison of two images is not sufficient: although two marked objects may look completely different with respect to the specific location and shape of the marks found, they can still represent samples of the identical opening method. This paper provides a first approach for modelling toolmarks on lock pins and considers the different requirements for generating a detailed and interpretable digital representation of these traces. These requirements are 'detail', i.e. adequate features that allow a suitable representation and interpretation of single marks; 'meta detail', i.e. adequate representation of the context and connections between all marks; and 'distinctiveness', i.e. the possibility of reliably distinguishing different sample types by the corresponding model. The model is evaluated with a set of 15 physical samples (resulting in 675 digital scans) of lock pins from cylinders opened with different opening methods, scanned contactlessly with a confocal laser microscope. The presented results suggest a high suitability for the intended purpose of opening-method determination.
Accuracy of Screening Mammography Interpretation by Characteristics of Radiologists
Barlow, William E.; Chi, Chen; Carney, Patricia A.; Taplin, Stephen H.; D’Orsi, Carl; Cutter, Gary; Hendrick, R. Edward; Elmore, Joann G.
2011-01-01
Background Radiologists differ in their ability to interpret screening mammograms accurately. We investigated the relationship of radiologist characteristics to actual performance from 1996 to 2001. Methods Screening mammograms (n = 469 512) interpreted by 124 radiologists were linked to cancer outcome data. The radiologists completed a survey that included questions on demographics, malpractice concerns, years of experience interpreting mammograms, and the number of mammograms read annually. We used receiver operating characteristics (ROC) analysis to analyze variables associated with sensitivity, specificity, and the combination of the two, adjusting for patient variables that affect performance. All P values are two-sided. Results Within 1 year of the mammogram, 2402 breast cancers were identified. Relative to low annual interpretive volume (≤1000 mammograms), greater interpretive volume was associated with higher sensitivity (P = .001; odds ratio [OR] for moderate volume [1001–2000] = 1.68, 95% CI = 1.18 to 2.39; OR for high volume [>2000] = 1.89, 95% CI = 1.36 to 2.63). Specificity decreased with volume (OR for 1001–2000 = 0.65, 95% CI = 0.52 to 0.83; OR for more than 2000 = 0.76, 95% CI = 0.60 to 0.96), compared with 1000 or less (P = .002). Greater number of years of experience interpreting mammograms was associated with lower sensitivity (P = .001), but higher specificity (P = .003). ROC analysis using the ordinal BI-RADS interpretation showed an association between accuracy and both previous mammographic history (P = .012) and breast density (P<.001). No association was observed between accuracy and years interpreting mammograms (P = .34) or mammography volume (P = .94), after adjusting for variables that affect the threshold for calling a mammogram positive. Conclusions We found no evidence that greater volume or experience at interpreting mammograms is associated with better performance. 
However, they may affect sensitivity and specificity, possibly by determining the threshold for calling a mammogram positive. Increasing volume requirements is unlikely to improve overall mammography performance. PMID:15601640
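The two performance measures the study models, sensitivity and specificity, follow directly from a reader's 2x2 outcome table. A minimal sketch with hypothetical counts (not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical outcomes for one reader on a screening set
sens, spec = sensitivity_specificity(tp=80, fn=20, tn=900, fp=100)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # 0.80 and 0.90
```

The study's point is that interpretive volume can move a reader along this trade-off (raising sensitivity while lowering specificity) without changing overall ROC accuracy, which is why volume shows no effect once the positivity threshold is adjusted for.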
Toeplitz Inverse Covariance-Based Clustering of Multivariate Time Series Data
Hallac, David; Vare, Sagar; Boyd, Stephen; Leskovec, Jure
2018-01-01
Subsequence clustering of multivariate time series is a useful tool for discovering repeated patterns in temporal data. Once these patterns have been discovered, seemingly complicated datasets can be interpreted as a temporal sequence of only a small number of states, or clusters. For example, raw sensor data from a fitness-tracking application can be expressed as a timeline of a select few actions (i.e., walking, sitting, running). However, discovering these patterns is challenging because it requires simultaneous segmentation and clustering of the time series. Furthermore, interpreting the resulting clusters is difficult, especially when the data is high-dimensional. Here we propose a new method of model-based clustering, which we call Toeplitz Inverse Covariance-based Clustering (TICC). Each cluster in the TICC method is defined by a correlation network, or Markov random field (MRF), characterizing the interdependencies between different observations in a typical subsequence of that cluster. Based on this graphical representation, TICC simultaneously segments and clusters the time series data. We solve the TICC problem through alternating minimization, using a variation of the expectation maximization (EM) algorithm. We derive closed-form solutions to efficiently solve the two resulting subproblems in a scalable way, through dynamic programming and the alternating direction method of multipliers (ADMM), respectively. We validate our approach by comparing TICC to several state-of-the-art baselines in a series of synthetic experiments, and we then demonstrate on an automobile sensor dataset how TICC can be used to learn interpretable clusters in real-world scenarios. PMID:29770257
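The alternating assign-and-update structure of TICC can be illustrated with a drastically simplified stand-in: k-means over sliding windows. This toy omits TICC's defining elements (the Toeplitz-constrained inverse covariance per cluster and the temporal consistency penalty) and only shows the segmentation-by-clustering idea:

```python
def cluster_subsequences(series, window, k, iters=20):
    """K-means over sliding windows: a toy stand-in for TICC's alternating
    assignment/update loop (no MRF structure, no temporal smoothness penalty)."""
    windows = [series[i:i + window] for i in range(len(series) - window + 1)]
    centers = [list(windows[(i * len(windows)) // k]) for i in range(k)]  # crude init
    for _ in range(iters):
        # assignment step: nearest centre in squared Euclidean distance
        labels = [min(range(k),
                      key=lambda c: sum((a - b) ** 2 for a, b in zip(w, centers[c])))
                  for w in windows]
        # update step: mean of the windows assigned to each cluster
        for c in range(k):
            members = [w for w, lab in zip(windows, labels) if lab == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Two regimes: a low-valued state followed by a high-valued state
series = [0.0] * 20 + [5.0] * 20
labels = cluster_subsequences(series, window=4, k=2)
print(labels[:5], labels[-5:])  # early and late windows fall into different clusters
```

TICC replaces the Euclidean distance with an MRF log-likelihood per cluster and the mean update with a Toeplitz graphical-lasso solve via ADMM, which is what makes the resulting clusters interpretable as dependency networks.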
Hydrogen Donor-Acceptor Fluctuations from Kinetic Isotope Effects: A Phenomenological Model
Roston, Daniel; Cheatum, Christopher M.; Kohen, Amnon
2012-01-01
Kinetic isotope effects (KIEs) and their temperature dependence can probe the structural and dynamic nature of enzyme-catalyzed proton or hydride transfers. The molecular interpretation of their temperature dependence requires expensive and specialized QM/MM calculations to provide a quantitative molecular understanding. Currently available phenomenological models use a non-adiabatic assumption that is not appropriate for most hydride and proton-transfer reactions, while others require more parameters than the experimental data justify. Here we propose a phenomenological interpretation of KIEs based on a simple method to quantitatively link the size and temperature dependence of KIEs to a conformational distribution of the catalyzed reaction. The present model assumes adiabatic hydrogen tunneling, and by fitting experimental KIE data, the model yields a population distribution for fluctuations of the distance between donor and acceptor atoms. Fits to data from a variety of proton and hydride transfers catalyzed by enzymes and their mutants, as well as non-enzymatic reactions, reveal that steeply temperature-dependent KIEs indicate the presence of at least two distinct conformational populations, each with different kinetic behaviors. We present the results of these calculations for several published cases and discuss how the predictions of the calculations might be experimentally tested. The current analysis does not replace molecular quantum mechanics/molecular mechanics (QM/MM) investigations, but it provides a fast and accessible way to quantitatively interpret KIEs in the context of a Marcus-like model. PMID:22857146
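The model's central idea, that a Boltzmann-weighted mixture of conformational populations with different donor-acceptor distances yields a temperature-dependent observed KIE, can be illustrated numerically. All rate constants, intrinsic KIEs and the free-energy gap below are invented for illustration, not fitted to any real enzyme:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def observed_kie(T, dG=12_000.0, kie_a=3.0, kie_b=30.0, kH_a=1.0, kH_b=500.0):
    """Observed KIE for a Boltzmann-weighted mixture of two conformers A and B.
    dG (J/mol) is the free-energy cost of populating B; rates are arbitrary units."""
    wA, wB = 1.0, math.exp(-dG / (R * T))           # relative populations
    kH = wA * kH_a + wB * kH_b                      # mixture H-transfer rate
    kD = wA * kH_a / kie_a + wB * kH_b / kie_b      # mixture D-transfer rate
    return kH / kD

for T in (280.0, 300.0, 320.0):
    print(f"T = {T:.0f} K  KIE_obs = {observed_kie(T):.2f}")
```

Because the minority conformer carries a different intrinsic KIE, its changing population with temperature makes the observed KIE temperature dependent, whereas a single conformation would give a temperature-independent mixture ratio; fitting such a model to measured KIE(T) data is what yields the donor-acceptor distance distributions the paper describes.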
10 CFR 61.5 - Interpretations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 2 2013-01-01 2013-01-01 false Interpretations. 61.5 Section 61.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE General Provisions § 61.5 Interpretations. Except as specifically authorized by the Commission in writing, no...
10 CFR 61.5 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 61.5 Section 61.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE General Provisions § 61.5 Interpretations. Except as specifically authorized by the Commission in writing, no...
10 CFR 61.5 - Interpretations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 2 2012-01-01 2012-01-01 false Interpretations. 61.5 Section 61.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE General Provisions § 61.5 Interpretations. Except as specifically authorized by the Commission in writing, no...
10 CFR 61.5 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 61.5 Section 61.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE General Provisions § 61.5 Interpretations. Except as specifically authorized by the Commission in writing, no...
10 CFR 61.5 - Interpretations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 2 2014-01-01 2014-01-01 false Interpretations. 61.5 Section 61.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR LAND DISPOSAL OF RADIOACTIVE WASTE General Provisions § 61.5 Interpretations. Except as specifically authorized by the Commission in writing, no...
A hermeneutic phenomenological understanding of men's healing from childhood maltreatment.
Willis, Danny G; Rhodes, Alison M; Dionne-Odom, James N; Lee, Kayoung; Terreri, Pamela
2015-03-01
To describe and interpret men's experience of healing from childhood maltreatment. Hermeneutic phenomenological. In-depth interviews. Community-based purposive, maximum variation sampling approach. Recruitment occurred through posting flyers and advertisements. Verbatim data were analyzed and themes of the meaning of healing were identified. The meaning of healing was interpreted as "moving beyond suffering." Five themes were identified to capture the multidimensional nature of the phenomenon: (a) breaking through the masculine veneer, (b) finding meaning, (c) choosing to live well, (d) caring for the self using holistic healing methods, and (e) engaging in humanizing relationships. Men who survived childhood maltreatment need to heal holistically in mind, body, and spirit. Meeting their needs requires the provision of highly compassionate, humanistic healing environments and healing-promotive nursing care. © The Author(s) 2014.
It Takes Time and Experience to Learn How to Interpret Gaze in Mentalistic Terms
ERIC Educational Resources Information Center
Leavens, David A.
2006-01-01
What capabilities are required for an organism to evince an "explicit" understanding of gaze as a mentalistic phenomenon? One possibility is that mentalistic interpretations of gaze, like concepts of unseen, supernatural beings, are culturally-specific concepts, acquired through cultural learning. These abstract concepts may either require a…
ERIC Educational Resources Information Center
Kane, Michael T.
2016-01-01
How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…
Generalized Sheet Transition Conditions for a Metascreen—A Fishnet Metasurface
NASA Astrophysics Data System (ADS)
Holloway, Christopher L.; Kuester, Edward F.
2018-05-01
We used a multiple-scale homogenization method to derive generalized sheet transition conditions (GSTCs) for electromagnetic fields at the surface of a metascreen, a metasurface with a "fishnet" structure. These surfaces are characterized by periodically spaced, arbitrarily shaped apertures in an otherwise relatively impenetrable surface. The parameters in these GSTCs are interpreted as effective surface susceptibilities and surface porosities, which are related to the geometry of the apertures that constitute the metascreen. Finally, we emphasize the subtle but important difference between the GSTCs required for metascreens and those required for metafilms (a metasurface with a "cermet" structure, i.e., an array of isolated, non-touching scatterers).
Donaldson revisited: is dangerousness a constitutional requirement for civil commitment?
Linburn, G E
1998-01-01
The Supreme Court decision O'Connor v. Donaldson (1975) has been widely interpreted to assert that dangerousness is a constitutional requirement for civil commitment. This interpretation is a misreading of the decision, which actually addressed the conditions disallowing indefinite, involuntary custodial confinement and not the requirements for an initial commitment. An excessive reliance on dangerousness narrowly construed as a restrictive requirement for civil commitment has distorted the commitment process by emphasizing the state's police power in protecting the public at the expense of its parens patriae responsibility to provide care and treatment for the severely mentally ill. In reality, the Court has been remarkably cautious in addressing the justifications for civil commitment and has allowed room for a broader interpretation of legitimate justifications that would permit greater latitude in the treatment of the severely mentally ill.
State of the art in bile analysis in forensic toxicology.
Bévalot, F; Cartiser, N; Bottinelli, C; Guitton, J; Fanton, L
2016-02-01
In forensic toxicology, alternative matrices to blood are useful in case of limited, unavailable or unusable blood sample, suspected postmortem redistribution or long drug intake-to-sampling interval. The present article provides an update on the state of knowledge for the use of bile in forensic toxicology, through a review of the Medline literature from 1970 to May 2015. Bile physiology and technical aspects of analysis (sampling, storage, sample preparation and analytical methods) are reported, to highlight specificities and consequences from an analytical and interpretative point of view. A table summarizes cause of death and quantification in bile and blood of 133 compounds from more than 200 case reports, providing a useful tool for forensic physicians and toxicologists involved in interpreting bile analysis. Qualitative and quantitative interpretation is discussed. As bile/blood concentration ratios are high for numerous molecules or metabolites, bile is a matrix of choice for screening when blood concentrations are low or non-detectable: e.g., cases of weak exposure or long intake-to-death interval. Quantitative applications have been little investigated, but small molecules with low bile/blood concentration ratios seem to be good candidates for quantitative bile-based interpretation. Further experimental data on the mechanism and properties of biliary extraction of xenobiotics of forensic interest are required to improve quantitative interpretation. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Interpreting BOLD: towards a dialogue between cognitive and cellular neuroscience.
Hall, Catherine N; Howarth, Clare; Kurth-Nelson, Zebulun; Mishra, Anusha
2016-10-05
Cognitive neuroscience depends on the use of blood oxygenation level-dependent (BOLD) functional magnetic resonance imaging (fMRI) to probe brain function. Although commonly used as a surrogate measure of neuronal activity, BOLD signals actually reflect changes in brain blood oxygenation. Understanding the mechanisms linking neuronal activity to vascular perfusion is, therefore, critical in interpreting BOLD. Advances in cellular neuroscience demonstrating differences in this neurovascular relationship in different brain regions, conditions or pathologies are often not accounted for when interpreting BOLD. Meanwhile, within cognitive neuroscience, the increasing use of high magnetic field strengths and the development of model-based tasks and analyses have broadened the capability of BOLD signals to inform us about the underlying neuronal activity, but these methods are less well understood by cellular neuroscientists. In 2016, a Royal Society Theo Murphy Meeting brought scientists from the two communities together to discuss these issues. Here, we consolidate the main conclusions arising from that meeting. We discuss areas of consensus about what BOLD fMRI can tell us about underlying neuronal activity, and how advanced modelling techniques have improved our ability to use and interpret BOLD. We also highlight areas of controversy in understanding BOLD and suggest research directions required to resolve these issues. This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. © 2016 The Author(s).
Hall, Gordon C Nagayama; Yip, Tiffany; Zárate, Michael A
2016-12-01
In their comments on Hall, Yip, and Zárate (2016), Dvorakova (2016) addresses cultural psychology methods and Yakushko, Hoffman, Consoli, and Lee (2016) address qualitative research methods. We provide evidence of the neglect of underrepresented groups in the publications of major journals in cultural psychology and qualitative psychology. We do not view any particular research method as inherently contributing to "epistemological violence" (Yakushko et al., 2016, p. 5), but it is the misguided application and/or interpretation of data generated from such methods that perpetuate oppression. We contend that best practices for representing ethnocultural diversity in research will require a diverse toolbox containing quantitative, qualitative, biological, and behavioral approaches. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Interpreter Services in An Inner City Teaching Hospital: A 6-Year Experience
Khwaja, Nadeem; Sharma, Saroj; Wong, Julian; Murray, David; Ghosh, Jonathan; Murphy, Michael O; Halka, Anastassi T; Walker, Michael G
2006-01-01
INTRODUCTION Being able to communicate effectively with patients is essential not only from a medicolegal standpoint but, more importantly, from a clinical governance perspective. Issues such as informed consent and patient choice within the NHS are currently being highlighted; for these to be available to patients, their language requirements are paramount. PATIENTS AND METHODS An audit was performed by the Linkworkers office at the Central Manchester & Manchester Children's Hospital NHS (CMMC) Trust on the total number of attendances and refusals per language in the period 1998–2003. RESULTS In the CMMC Trust, Urdu/Punjabi, Bengali, Cantonese, Somali, Arabic and French represent the majority of the workload, comprising almost 80% of cases in 2003. In the same year, an increase in demand for the languages of Eastern European countries became evident. Finding interpreters for these languages, even via agencies, can be extremely difficult. CONCLUSIONS If the current trend continues, the requirement for these services will increase exponentially. For this demand to be met adequately, these issues must be kept at the forefront of NHS planning. PMID:17132317
Hudelson, Patricia; Vilpert, Sarah
2009-01-01
Background Use of available interpreter services by hospital clinical staff is often suboptimal, despite evidence that trained interpreters contribute to quality of care and patient safety. Examination of intra-hospital variations in attitudes and practices regarding interpreter use can contribute to identifying factors that facilitate good practice. The purpose of this study was to describe attitudes, practices and preferences regarding communication with limited French proficiency (LFP) patients, examine how these vary across professions and departments within the hospital, and identify factors associated with good practices. Methods A self-administered questionnaire was mailed to random samples of 700 doctors, 700 nurses and 93 social workers at the Geneva University Hospitals, Switzerland. Results Seventy percent of respondents encounter LFP patients at least once a month, but this varied by department. 66% of respondents said they preferred working with ad hoc interpreters (patient's family and bilingual staff), mainly because these were easier to access. During the 6 months preceding the study, ad hoc interpreters were used at least once by 71% of respondents, and professional interpreters were used at least once by 51%. Overall, only nine percent of respondents had received any training in how and why to work with a trained interpreter. Only 23.2% of respondents said the clinical service in which they currently worked encouraged them to use professional interpreters. Respondents working in services where use of professional interpreters was encouraged were more likely to be of the opinion that the hospital should systematically provide a professional interpreter to LFP patients (40.3%) as compared with those working in a department that discouraged use of professional interpreters (15.5%), and they used professional interpreters more often during the previous 6 months. 
Conclusion Attitudes and practices regarding communication with LFP patients vary across professions and hospital departments. Fostering an institution-wide culture conducive to ensuring adequate communication with LFP patients will require both the development of a hospital-wide policy and service-level activities aimed at reinforcing this policy and putting it into practice. PMID:19832982
Methods for processing high-throughput RNA sequencing data.
Ares, Manuel
2014-11-03
High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.
NASA Astrophysics Data System (ADS)
Haring, Martijn T.; Liv, Nalan; Zonnevylle, A. Christiaan; Narvaez, Angela C.; Voortman, Lenard M.; Kruit, Pieter; Hoogenboom, Jacob P.
2017-03-01
In the biological sciences, data from fluorescence and electron microscopy is correlated to allow fluorescence biomolecule identification within the cellular ultrastructure and/or ultrastructural analysis following live-cell imaging. High-accuracy (sub-100 nm) image overlay requires the addition of fiducial markers, which makes overlay accuracy dependent on the number of fiducials present in the region of interest. Here, we report an automated method for light-electron image overlay at high accuracy, i.e. below 5 nm. Our method relies on direct visualization of the electron beam position in the fluorescence detection channel using cathodoluminescence pointers. We show that image overlay using cathodoluminescence pointers corrects for image distortions, is independent of user interpretation, and does not require fiducials, allowing image correlation with molecular precision anywhere on a sample.
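The registration idea described above can be sketched generically. The code below is our own illustrative version, not the authors' pipeline: given electron-beam pointer positions and the matching spots detected in the fluorescence channel, a least-squares affine fit recovers the map between the two coordinate frames. The grid size, distortion matrix, and noise level are invented for the demo.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map: dst ~ src @ A.T + t."""
    X = np.column_stack([src, np.ones(len(src))])
    sol, *_ = np.linalg.lstsq(X, dst, rcond=None)  # shape (3, 2)
    return sol[:2].T, sol[2]                       # A is 2x2, t is (2,)

# Synthetic "pointer" positions and a known distortion plus localization noise.
rng = np.random.default_rng(3)
src = rng.uniform(0.0, 100.0, (25, 2))
A_true = np.array([[1.02, 0.05], [-0.04, 0.98]])
t_true = np.array([12.0, -7.0])
dst = src @ A_true.T + t_true + rng.normal(0.0, 0.01, (25, 2))
A, t = fit_affine(src, dst)  # estimates of A_true and t_true
```

With many pointer positions spread over the field of view, the fit averages out per-spot localization noise, which is the same reason a dense pointer grid beats a handful of fiducials.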
PMID:28252673
Faraday rotation data analysis with least-squares elliptical fitting
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Adam D.; McHale, G. Brent; Goerz, David A.
2010-10-15
A method of analyzing Faraday rotation data from pulsed magnetic field measurements is described. The method uses direct least-squares elliptical fitting to measured data. The least-squares fit conic parameters are used to rotate, translate, and rescale the measured data. Interpretation of the transformed data provides improved accuracy and time-resolution characteristics compared with many existing methods of analyzing Faraday rotation data. The method is especially useful when linear birefringence is present at the input or output of the sensing medium, or when the relative angle of the polarizers used in analysis is not aligned with precision; under these circumstances the method is shown to return the analytically correct input signal. The method may be pertinent to other applications where analysis of Lissajous figures is required, such as the velocity interferometer system for any reflector (VISAR) diagnostics. The entire algorithm is fully automated and requires no user interaction. An example of algorithm execution is shown, using data from a fiber-based Faraday rotation sensor on a capacitive discharge experiment.
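The direct least-squares conic fit at the heart of such a method can be illustrated generically (this is our own sketch, not the authors' code): the conic coefficients are the smallest right singular vector of the design matrix built from the measured Lissajous points, and the ellipse orientation follows from the quadratic terms. Note that the orientation is only determined modulo 90° here, since the overall sign of the coefficient vector is arbitrary.

```python
import numpy as np

def fit_conic(x, y):
    """Coefficients (a,b,c,d,e,f) of a*x^2+b*x*y+c*y^2+d*x+e*y+f=0, |coeffs|=1."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The smallest right singular vector minimizes ||D @ p|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

def ellipse_angle(coeffs):
    """Axis orientation relative to x, determined modulo 90 degrees."""
    a, b, c = coeffs[:3]
    return 0.5 * np.arctan2(b, a - c)

# Synthetic Lissajous figure: a 3:1 ellipse tilted by a known 30 degrees.
t = np.linspace(0.0, 2.0 * np.pi, 200)
phi = np.deg2rad(30.0)
u, v = 3.0 * np.cos(t), 1.0 * np.sin(t)
x = u * np.cos(phi) - v * np.sin(phi)
y = u * np.sin(phi) + v * np.cos(phi)
angle = np.rad2deg(ellipse_angle(fit_conic(x, y)))
```

In the actual diagnostic the fitted conic parameters would then drive the rotate/translate/rescale step applied to the raw signals; this sketch stops at the fit itself.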
Intelligent Detection of Structure from Remote Sensing Images Based on Deep Learning Method
NASA Astrophysics Data System (ADS)
Xin, L.
2018-04-01
Utilizing high-resolution remote sensing images for earth observation has become a common method of land use monitoring. Traditional image interpretation requires substantial human participation, which is inefficient and makes accuracy difficult to guarantee. At present, artificial intelligence methods such as deep learning offer many advantages for image recognition. With a large number of remote sensing image samples and deep neural network models, objects of interest such as buildings can be deciphered rapidly. In terms of both efficiency and accuracy, deep learning is the stronger approach. This paper investigates the deep learning method using a large set of remote sensing image samples and verifies the feasibility of building extraction through experiments.
NASA Astrophysics Data System (ADS)
Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.
2016-01-01
The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies and the network that provides this information. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in the light of the development of new multi-frequency GNSS constellations, as well as from the point of view that the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution results based on short observation time spans. The method for PPP-RTK that is presented in this article does not have the above limitations, as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, or S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among them the essential ambiguity parameters, whose integer property follows clearly from the interpretation of the satellite phase biases estimated by the network.
The flexibility of the S-system method is furthermore demonstrated by the fact that all models in this article are derived in multi-epoch mode, allowing dynamic model constraints to be incorporated on all or subsets of the parameters.
NASA Astrophysics Data System (ADS)
De Carlo, Lorenzo; Perri, Maria Teresa; Caputo, Maria Clementina; Deiana, Rita; Vurro, Michele; Cassiani, Giorgio
2013-11-01
Electrical resistivity methods are widely used for environmental applications, and they are particularly useful for the characterization and monitoring of sites where the presence of contamination requires a thorough understanding of the location and movement of water, which can act as a carrier of solutes. One such application is landfill studies, where the strong electrical contrasts between waste, leachate and surrounding formations make electrical methods a nearly ideal tool for investigation. In spite of these advantages, however, electrical investigation of landfills also poses challenges, both logistical and interpretational. This paper presents the results of a study conducted on a decommissioned landfill close to the city of Corigliano d'Otranto, in the Apulia region (Southern Italy). The landfill is located in an abandoned quarry that was re-utilized about thirty years ago as a site for urban waste disposal. The waste was thought to be more than 20 m thick, and the landfill bottom was expected to be confined with an HDPE (high-density polyethylene) liner. During the digging operations performed to build a nearby new landfill, leachate was found, triggering an in-depth investigation that included non-invasive methods. The principal goal was to verify whether, and to what extent, the leachate is indeed confined by the HDPE liner. We performed both surface electrical resistivity tomography (ERT) and mise-à-la-masse (MALM) surveys, facing the severe challenges posed by the rugged terrain of the abandoned quarry complex. A conductive body, probably associated with leachate, was found as deep as 40 m below the current landfill surface, i.e., at a depth much larger than the expected 20 m thickness of waste. Given the logistical difficulties that limit the geometry of acquisition, we utilized synthetic forward modeling in order to confirm or dismiss interpretational hypotheses emerging from the ERT and MALM results.
This integration between measurements and modeling helped narrow the alternative interpretations and strengthened the confidence in results, confirming the effectiveness of non-invasive methods in landfill investigation and the importance of modeling in the interpretation of geophysical results.
On-Site Detection as a Countermeasure to Chemical Warfare/Terrorism.
Seto, Y
2014-01-01
On-site monitoring and detection are necessary in the crisis and consequence management of wars and terrorism involving chemical warfare agents (CWAs) such as sarin. The analytical performance required for on-site detection is mainly determined by the fatal vapor concentration and volatility of the CWAs involved. The analytical performance of presently available on-site technologies and of commercially available on-site equipment for detecting CWAs is interpreted and compared in this review, covering: classical manual methods, photometric methods, ion mobility spectrometry, vibrational spectrometry, gas chromatography, mass spectrometry, sensors, and other methods. Some of the data evaluated were obtained from our experiments using authentic CWAs. We concluded that (a) no technologies perfectly fulfill all of the on-site detection requirements and (b) adequate on-site detection requires (i) a combination of the monitoring-tape method and ion-mobility spectrometry for point detection and (ii) a combination of the monitoring-tape method, atmospheric pressure chemical ionization mass spectrometry with counterflow introduction, and gas chromatography with a trap and special detectors for continuous monitoring. The basic properties of CWAs, the concept of on-site detection, and the sarin gas attacks in Japan, as well as the forensic investigations thereof, are also explicated in this article. Copyright © 2014 Central Police University.
Simulation of spin label structure and its implication in molecular characterization
Fajer, Piotr; Fajer, Mikolai; Zawrotny, Michael; Yang, Wei
2016-01-01
Interpretation of EPR from spin labels in terms of structure and dynamics requires knowledge of label behavior. General strategies were developed for simulation of labels used in EPR of proteins. The criteria for those simulations are: (a) exhaustive sampling of rotamer space; (b) consensus of results independent of starting points; (c) inclusion of entropy. These criteria are satisfied only when the number of transitions in any dihedral angle exceeds 100 and the simulation maintains thermodynamic equilibrium. Methods such as conventional MD do not efficiently cross energetic barriers, while Simulated Annealing, Monte Carlo, and popular Rotamer Library methodologies are potential-energy based and ignore entropy (in addition to their specific shortcomings: environment fluctuations, fixed environment, or electrostatics). The Simulated Scaling method avoids the above flaws by modulating the force fields between 0 (allowing crossing of energy barriers) and full potential (sampling minima). The spin label diffuses on this surface while remaining in thermodynamic equilibrium. Simulations show that: (a) a single conformation is rare; often there are 2–4 populated rotamers; (b) the position of the NO group varies by up to 16 Å. These results illustrate the necessity for caution when interpreting EPR signals in terms of molecular structure. For example, a 10–16 Å distance change in DEER should not be interpreted as a large conformational change; it can well be a flip about the Cα-Cβ bond. Rigorous exploration of possible rotamer structures of a spin label is paramount in signal interpretation. We advocate the use of bifunctional labels, whose motion is restricted 10,000-fold and whose NO position is restricted to 2–5 Å. PMID:26478501
2014-01-01
Background Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in an RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. Methods We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. Results In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. Conclusions The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes. PMID:24888356
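A minimal sketch of the Bayesian-bootstrap idea behind such an analysis (entirely synthetic data and a made-up willingness-to-pay threshold, not the paper's case study or its external-evidence extension): each posterior draw reweights the patients in each trial arm with flat Dirichlet weights, yielding draws of incremental cost and incremental effect from which the probability of cost-effectiveness can be read off.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesian_bootstrap(cost_t, eff_t, cost_c, eff_c, n_draws=2000):
    """Posterior draws of (incremental cost, incremental effect)."""
    d_cost = np.empty(n_draws)
    d_eff = np.empty(n_draws)
    for i in range(n_draws):
        # Flat Dirichlet weights = Bayesian bootstrap resample of each arm.
        w_t = rng.dirichlet(np.ones(len(cost_t)))
        w_c = rng.dirichlet(np.ones(len(cost_c)))
        d_cost[i] = w_t @ cost_t - w_c @ cost_c
        d_eff[i] = w_t @ eff_t - w_c @ eff_c
    return d_cost, d_eff

# Synthetic trial arms: treatment costs more but is more effective (QALYs).
cost_t = rng.normal(1200.0, 200.0, 150); eff_t = rng.normal(0.70, 0.10, 150)
cost_c = rng.normal(1000.0, 200.0, 150); eff_c = rng.normal(0.60, 0.10, 150)
d_cost, d_eff = bayesian_bootstrap(cost_t, eff_t, cost_c, eff_c)

# Probability the treatment is cost-effective at willingness-to-pay lam:
lam = 5000.0
p_ce = np.mean(lam * d_eff - d_cost > 0.0)
```

Incorporating external evidence, as the paper proposes, would amount to modifying how these posterior draws are generated or weighted; the plain version above is only the starting point.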
Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-02-18
Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, to be Challenged, to be Killed), guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5).
Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.
Interpreting spectral unmixing coefficients: From spectral weights to mass fractions
NASA Astrophysics Data System (ADS)
Grumpe, Arne; Mengewein, Natascha; Rommel, Daniela; Mall, Urs; Wöhler, Christian
2018-01-01
It is well known that many common planetary minerals exhibit prominent absorption features. Consequently, the analysis of spectral reflectance measurements has become a major tool of remote sensing. Quantifying the mineral abundances, however, is not a trivial task. The interaction between the incident light rays and particulate surfaces, e.g., the lunar regolith, leads to a non-linear relationship between the reflectance spectra of the pure minerals, the so-called "endmembers", and the surface's reflectance spectrum. It is, however, possible to transform the non-linear reflectance mixture into a linear mixture of single-scattering albedos of the Hapke model. The abundances obtained by inverting the linear single-scattering albedo mixture may be interpreted as volume fractions which are weighted by the endmember's extinction coefficient. Commonly, identical extinction coefficients are assumed throughout all endmembers and the obtained volume fractions are converted to mass fractions using either measured or assumed densities. In theory, the proposed method may cover different grain sizes if each grain size range of a mineral is treated as a distinct endmember. Here, we present a method to transform the mixing coefficients to mass fractions for arbitrary combinations of extinction coefficients and densities. The required parameters are computed from reflectance measurements of well-defined endmember mixtures. Consequently, additional measurements, e.g., the endmember density, are no longer required. We evaluate the method based on laboratory measurements and various results presented in the literature, respectively. It is shown that the procedure transforms the mixing coefficients to mass fractions yielding an accuracy comparable to carefully calibrated laboratory measurements without additional knowledge. For our laboratory measurements, the square root of the mean squared error is less than 4.82 wt%.
In addition, the method corrects for systematic effects originating from mixtures of endmembers showing a highly varying albedo, e.g., plagioclase and pyroxene.
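The unmixing-and-conversion pipeline described above can be sketched as follows. This is our own toy version under simplifying assumptions (known endmember single-scattering-albedo spectra, invented densities and extinction coefficients), not the authors' calibrated procedure: spectral weights come from a least-squares fit in albedo space, are divided by extinction to give volume fractions, and are multiplied by density to give mass fractions.

```python
import numpy as np

def unmix(w_mixed, W_end):
    """Least-squares spectral weights, clipped to >= 0 and normalized to 1."""
    f, *_ = np.linalg.lstsq(W_end, w_mixed, rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

def weights_to_mass_fractions(f, density, extinction):
    """Spectral weights are extinction-weighted volume fractions; undo the
    extinction weighting, then convert volume fractions to mass fractions."""
    vol = f / extinction
    mass = vol * density
    return mass / mass.sum()

# Two synthetic endmember albedo spectra (columns) and an exact 40/60 mix.
W_end = np.array([[0.9, 0.2],
                  [0.8, 0.3],
                  [0.7, 0.5]])
true_f = np.array([0.4, 0.6])
w_mixed = W_end @ true_f

f = unmix(w_mixed, W_end)
mass = weights_to_mass_fractions(f,
                                 density=np.array([2.7, 3.3]),     # assumed g/cm^3
                                 extinction=np.array([1.0, 1.2]))  # assumed
```

The point of the paper is precisely that the density and extinction factors need not be assumed, since the required conversion parameters can be calibrated from measurements of known mixtures; the sketch only shows where those factors enter.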
Pathogen profiling for disease management and surveillance.
Sintchenko, Vitali; Iredell, Jonathan R; Gilbert, Gwendolyn L
2007-06-01
The usefulness of rapid pathogen genotyping is widely recognized, but its effective interpretation and application requires integration into clinical and public health decision-making. How can pathogen genotyping data best be translated to inform disease management and surveillance? Pathogen profiling integrates microbial genomics data into communicable disease control by consolidating phenotypic identity-based methods with DNA microarrays, proteomics, metabolomics and sequence-based typing. Sharing data on pathogen profiles should facilitate our understanding of transmission patterns and the dynamics of epidemics.
Qualitative research: a brief description.
Kemparaj, Umesh; Chavan, Sangeeta
2013-01-01
Qualitative research refers to a range of methodological approaches which aim to generate an in-depth and interpreted understanding of the social world, by learning about people's social and material circumstances, their experiences, perspectives, and histories. It requires researchers to become intensely involved, often remaining in the field for lengthy periods of time. The greatest value of qualitative research is its ability to address questions of relevance to public health knowledge and practice which are difficult to answer satisfactorily using quantitative methods.
Crosby, Sondra S
2013-08-07
Refugees are a vulnerable class of immigrants who have fled their countries, typically following war, violence, or natural disaster, and who have frequently experienced trauma. In primary care, engaging refugees to develop a positive therapeutic relationship is challenging. Relative to care of other primary care patients, there are important differences in symptom evaluation and developing treatment plans. To discuss the importance of and methods for obtaining refugee trauma histories, to recognize the psychological and physical manifestations of trauma characteristic of refugees, and to explore how cultural differences and limited English proficiency affect the refugee patient-clinician relationship and how to best use interpreters. MEDLINE and the Cochrane Library were searched from 1984 to 2012. Additional citations were obtained from lists of references from select research and review articles on this topic. Engagement with a refugee patient who has experienced trauma requires an understanding of the trauma history and the trauma-related symptoms. Mental health symptoms and chronic pain are commonly experienced by refugee patients. Successful treatment requires a multidisciplinary approach that is culturally acceptable to the refugee. Refugee patients frequently have experienced trauma requiring a directed history and physical examination, facilitated by an interpreter if necessary. Intervention should be sensitive to the refugee's cultural mores.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
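As a toy illustration of one of the methodologies named above (our own synthetic example, not NASA's analysis): a logistic-regression POD curve, POD(a) = 1 / (1 + exp(-(b0 + b1*a))), can be fit to hit/miss inspection data by Newton-Raphson, and the flaw size detected with 90% probability, a90, read off from the fitted coefficients. The flaw sizes and the true curve below are invented.

```python
import numpy as np

def fit_logistic(size, hit, iters=50):
    """Maximum-likelihood (b0, b1) via Newton-Raphson (IRLS)."""
    X = np.column_stack([np.ones_like(size), size])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                     # Bernoulli variance weights
        grad = X.T @ (hit - p)
        hess = X.T @ (X * W[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

# Synthetic hit/miss data: detection probability rises with flaw size.
rng = np.random.default_rng(2)
size = rng.uniform(0.5, 5.0, 400)                       # flaw size, e.g. mm
true_p = 1.0 / (1.0 + np.exp(-(-4.0 + 2.0 * size)))
hit = (rng.uniform(size=400) < true_p).astype(float)

b0, b1 = fit_logistic(size, hit)
a90 = (np.log(0.9 / 0.1) - b0) / b1                     # POD(a90) = 0.90
```

A credible POD demonstration would of course go further, e.g. attaching a confidence bound to a90 (the usual a90/95), which is exactly the kind of validation requirement the abstract emphasizes.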
Immunomagnetic separation can enrich fixed solid tumors for epithelial cells.
Yaremko, M. L.; Kelemen, P. R.; Kutza, C.; Barker, D.; Westbrook, C. A.
1996-01-01
Immunomagnetic separation is a highly specific technique for the enrichment or isolation of cells from a variety of fresh tissues and microorganisms or molecules from suspensions. Because new techniques for molecular analysis of solid tumors are now applicable to fixed tissue but sometimes require or benefit from enrichment for tumor cells, we tested the efficacy of immunomagnetic separation for enriching fixed solid tumors for malignant epithelial cells. We applied it to two different tumors and fixation methods to separate neoplastic from non-neoplastic cells in primary colorectal cancers and metastatic breast cancers, and were able to enrich to a high degree of purity. Immunomagnetic separation was effective in unembedded fixed tissue as well as fixed paraffin-embedded tissue. The magnetically separated cells were amenable to fluorescence in situ hybridization and polymerase chain reaction amplification of their DNA with minimal additional manipulation. The high degree of enrichment achieved before amplification contributed to interpretation of loss of heterozygosity in metastatic breast cancers, and simplified fluorescence in situ hybridization analysis because only neoplastic cells were hybridized and counted. Immunomagnetic separation is effective for the enrichment of fixed solid tumors, can be performed with widely available commercial antibodies, and requires little specialized instrumentation. It can contribute to interpretation of results in situations where enrichment by other methods is difficult or not possible. PMID:8546231
The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.
Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A
2010-03-01
Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully-functional processing modules with support for multiple GUI's, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).
PMID:20077162
Visualization of hyperspectral imagery
NASA Astrophysics Data System (ADS)
Hogervorst, Maarten A.; Bijl, Piet; Toet, Alexander
2007-04-01
We developed four new techniques to visualize hyperspectral image data for man-in-the-loop target detection. The methods respectively: (1) display the subsequent bands as a movie ("movie"), (2) map the data onto three channels and display these as a colour image ("colour"), (3) display the correlation between the pixel signatures and a known target signature ("match"), and (4) display the output of a standard anomaly detector ("anomaly"). The movie technique requires no assumptions about the target signature and involves no information loss. The colour technique produces a single image that can be displayed in real-time. A disadvantage of this technique is loss of information. A display of the match between a target signature and the pixel signatures can be interpreted easily and quickly, but this technique relies on precise knowledge of the target signature. The anomaly detector signifies pixels with signatures that deviate from the (local) background. We performed a target detection experiment with human observers to determine their relative performance with the four techniques. The results show that the "match" presentation yields the best performance, followed by "movie" and "anomaly", while performance with the "colour" presentation was the poorest. Each scheme has its advantages and disadvantages and is more or less suited for real-time or post-hoc processing. The rationale is that the final interpretation is best done by a human observer. In contrast to automatic target recognition systems, the interpretation of hyperspectral imagery by the human visual system is robust to noise and image transformations and requires a minimal number of assumptions (about the signature of the target and background, target shape, etc.). When more knowledge about target and background is available, this may be used to help the observer interpret the data (aided target detection).
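The "match" display can be sketched generically. This is our own minimal reading of technique (3), not the authors' implementation: per pixel, compute the normalized correlation (cosine similarity) between the pixel spectrum and the known target signature, and show the result as a grayscale map.

```python
import numpy as np

def match_map(cube, target):
    """cube: (rows, cols, bands); target: (bands,).
    Returns a (rows, cols) map of cosine similarity with the target."""
    flat = cube.reshape(-1, cube.shape[-1]).astype(float)
    num = flat @ target
    den = np.linalg.norm(flat, axis=1) * np.linalg.norm(target)
    return (num / np.maximum(den, 1e-12)).reshape(cube.shape[:2])

# Tiny synthetic cube: one pixel carries a (scaled) copy of the target.
rng = np.random.default_rng(1)
target = np.array([1.0, 0.2, 0.8, 0.1])
cube = rng.uniform(0.0, 1.0, (8, 8, 4))
cube[3, 5] = 2.0 * target          # scaled target signature
m = match_map(cube, target)
```

Because the measure is normalized, a brighter or darker copy of the target scores the same, which matches the abstract's point that this display is easy to interpret but stands or falls with the accuracy of the assumed target signature.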
New Developments in Observer Performance Methodology in Medical Imaging
Chakraborty, Dev P.
2011-01-01
A common task in medical imaging is assessing whether a new imaging system, or a variant of an existing one, is an improvement over an existing imaging technology. Imaging systems are generally quite complex, consisting of several components (e.g., image acquisition hardware, image processing and display hardware and software, and image interpretation by radiologists), each of which can affect performance. While it may appear odd to include the radiologist as a “component” of the imaging chain, since the radiologist’s decision determines subsequent patient care, the effect of the human interpretation has to be included. Physical measurements like modulation transfer function, signal-to-noise ratio, etc., are useful for characterizing the non-human parts of the imaging chain under idealized and often unrealistic conditions, such as uniform background phantoms, target objects with sharp edges, etc. Measuring the effect on performance of the entire imaging chain, including the radiologist, and using real clinical images, requires different methods that fall under the rubric of observer performance methods or “ROC analysis”. The purpose of this paper is to review recent developments in this field, particularly with respect to the free-response method. PMID:21978444
Using visual art and collaborative reflection to explore medical attitudes toward vulnerable persons
Kidd, Monica; Nixon, Lara; Rosenal, Tom; Jackson, Roberta; Pereles, Laurie; Mitchell, Ian; Bendiak, Glenda; Hughes, Lisa
2016-01-01
Background Vulnerable persons often face stigma-related barriers while seeking health care. Innovative education and professional development methods are needed to help change this. Method We describe an interdisciplinary group workshop designed around a discomfiting oil portrait, intended to trigger provocative conversations among health care students and practitioners, and we present our mixed methods analysis of participant reflections. Results After the workshop, participants were significantly more likely to endorse the statements that the observation and interpretive skills involved in viewing visual art are relevant to patient care and that visual art should be used in medical education to improve students’ observational skills, narrative skills, and empathy with their patients. Subsequent to the workshop, significantly more participants agreed that art interpretation should be required curriculum for health care students. Qualitative comments from two groups from two different education and professional contexts were examined for themes; conversations focused on issues of power, body image/self-esteem, and lessons for clinical practice. Conclusions We argue that difficult conversations about affective responses to vulnerable persons are possible in a collaborative context using well-chosen works of visual art that can stand in for a patient. PMID:27103949
Postmodern Bioethics through Literature.
ERIC Educational Resources Information Center
Goldstein, Daniel
1994-01-01
Explores a hermeneutical perspective of modern medicine. The author suggests that good medical decision making requires interpretation, and bioethics will be well served by incorporating this interpretive element. (LZ)
29 CFR 1926.1102 - Coal tar pitch volatiles; interpretation of term.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 8 2014-07-01 2014-07-01 false Coal tar pitch volatiles; interpretation of term. 1926.1102 Section 1926.1102 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... Hazardous Substances § 1926.1102 Coal tar pitch volatiles; interpretation of term. Note: The requirements...
29 CFR 1926.1102 - Coal tar pitch volatiles; interpretation of term.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 8 2011-07-01 2011-07-01 false Coal tar pitch volatiles; interpretation of term. 1926.1102 Section 1926.1102 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... Hazardous Substances § 1926.1102 Coal tar pitch volatiles; interpretation of term. Note: The requirements...
29 CFR 1926.1102 - Coal tar pitch volatiles; interpretation of term.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 8 2012-07-01 2012-07-01 false Coal tar pitch volatiles; interpretation of term. 1926.1102 Section 1926.1102 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... Hazardous Substances § 1926.1102 Coal tar pitch volatiles; interpretation of term. Note: The requirements...
29 CFR 1926.1102 - Coal tar pitch volatiles; interpretation of term.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 8 2010-07-01 2010-07-01 false Coal tar pitch volatiles; interpretation of term. 1926.1102 Section 1926.1102 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH... Hazardous Substances § 1926.1102 Coal tar pitch volatiles; interpretation of term. Note: The requirements...
ERIC Educational Resources Information Center
CAMPBELL, PAUL N.
The basic premise of this book is that learning to read orally is of fundamental importance to those who would fully appreciate or respond to literature. Because readers must interpret literature always for themselves and often for an audience, three aspects of oral interpretation are explored: (1) the choice of materials, which requires an…
Interpretation Analysis as a Competitive Event.
ERIC Educational Resources Information Center
Nading, Robert M.
Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…
Interpretacion: The Lived Experience of Interpretation in the Bilingual Psychotherapist
ERIC Educational Resources Information Center
Melchor, Rosemary Laura
2008-01-01
To enhance the effectiveness of therapy for Spanish-speaking individuals and families requires an understanding of the subtleties of language use and interpretive processing. The purpose of this phenomenological study was to describe the interpretive process in bilingual psychotherapists as they reflected upon their lived experiences of providing…
Landscape Interpretation with Augmented Reality and Maps to Improve Spatial Orientation Skill
ERIC Educational Resources Information Center
Carbonell Carrera, Carlos; Bermejo Asensio, Luis A.
2017-01-01
Landscape interpretation is needed for navigating and determining an orientation: with traditional cartography, interpreting 3D topographic information from 2D landform representations to get self-location requires spatial orientation skill. Augmented reality technology allows a new way to interact with 3D landscape representation and thereby…
Arús, Nádia A; da Silva, Átila M; Duarte, Rogério; da Silveira, Priscila F; Vizzotto, Mariana B; da Silveira, Heraldo L D; da Silveira, Heloisa E D
2017-06-01
The aims of this study were to evaluate and compare the performance of dental students in interpreting the temporomandibular joint (TMJ) with magnetic resonance imaging (MRI) scans using two learning methods (conventional and digital interactive learning) and to examine the usability of the digital learning object (DLO). The DLO consisted of tutorials about MRI and anatomic and functional aspects of the TMJ. In 2014, dental students in their final year of study who were enrolled in the elective "MRI Interpretation of the TMJ" course comprised the study sample. After exclusions for nonattendance and other reasons, 29 of the initial 37 students participated in the study, for a participation rate of 78%. The participants were divided into two groups: a digital interactive learning group (n=14) and a conventional learning group (n=15). Both methods were assessed by an objective test applied before and after training and classes. Aspects such as support and training requirements, complexity, and consistency of the DLO were also evaluated using the System Usability Scale (SUS). A significant between-group difference in the posttest results was found, with the conventional learning group scoring better than the DLO group, indicated by mean scores of 9.20 and 8.11, respectively, out of 10. However, when the pretest and posttest results were compared, both groups showed significantly improved performance. The SUS score was 89, which represented a high acceptance of the DLO by the users. The students who used the conventional method of learning showed superior performance in interpreting the TMJ using MRI compared to the group that used digital interactive learning.
Comparing errors in ED computer-assisted vs conventional pediatric drug dosing and administration.
Yamamoto, Loren; Kanemori, Joan
2010-06-01
Compared to fixed-dose single-vial drug administration in adults, pediatric drug dosing and administration requires a series of calculations, all of which are potentially error prone. The purpose of this study is to compare error rates and task completion times for common pediatric medication scenarios using computer program assistance vs conventional methods. Two versions of a 4-part paper-based test were developed. Each part consisted of a set of medication administration and/or dosing tasks. Emergency department and pediatric intensive care unit nurse volunteers completed these tasks using both methods (sequence assigned to start with a conventional or a computer-assisted approach). Completion times, errors, and the reason for the error were recorded. Thirty-eight nurses completed the study. Summing the completion of all 4 parts, the mean conventional total time was 1243 seconds vs the mean computer program total time of 879 seconds (P < .001). The conventional manual method had a mean of 1.8 errors vs the computer program with a mean of 0.7 errors (P < .001). Of the 97 total errors, 36 were due to misreading the drug concentration on the label, 34 were due to calculation errors, and 8 were due to misplaced decimals. Of the 36 label interpretation errors, 18 (50%) occurred with digoxin or insulin. Computerized assistance reduced errors and the time required for drug administration calculations. A pattern of errors emerged, noting that reading/interpreting certain drug labels were more error prone. Optimizing the layout of drug labels could reduce the error rate for error-prone labels. Copyright (c) 2010 Elsevier Inc. All rights reserved.
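The calculation chain the study automates reduces to a weight-based dose followed by a division by the label concentration, the step where most errors occurred. A minimal sketch, for illustration only (hypothetical drug numbers, not clinical guidance, and not the study's software):

```python
def dose_volume(weight_kg, dose_per_kg, concentration_mg_per_ml):
    """Return the weight-based dose (mg) and the volume (mL) to draw up.

    The concentration is the mg/mL value printed on the vial label;
    misreading it was the most common error category in the study.
    """
    dose_mg = weight_kg * dose_per_kg
    return dose_mg, dose_mg / concentration_mg_per_ml

# Hypothetical example: 0.01 mg/kg for a 12 kg child from a 0.1 mg/mL vial.
dose, vol = dose_volume(12, 0.01, 0.1)   # 0.12 mg -> 1.2 mL
```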
NASA Astrophysics Data System (ADS)
Swartwout, Michael Alden
New paradigms in space missions require radical changes in spacecraft operations. In the past, operations were insulated from competitive pressures of cost, quality and time by system infrastructures, technological limitations and historical precedent. However, modern demands now require that operations meet competitive performance goals. One target for improvement is the telemetry downlink, where significant resources are invested to acquire thousands of measurements for human interpretation. This cost-intensive method is used because conventional operations are not based on formal methodologies but on experiential reasoning and incrementally adapted procedures. Therefore, to improve the telemetry downlink it is first necessary to invent a rational framework for discussing operations. This research explores operations as a feedback control problem, develops the conceptual basis for the use of spacecraft telemetry, and presents a method to improve performance. The method is called summarization, a process to make vehicle data more useful to operators. Summarization enables rational trades for telemetry downlink by defining and quantitatively ranking these elements: all operational decisions, the knowledge needed to inform each decision, and all possible sensor mappings to acquire that knowledge. Summarization methods were implemented for the Sapphire microsatellite; conceptual health management and system models were developed and a degree-of-observability metric was defined. An automated tool was created to generate summarization methods from these models. Methods generated using a Sapphire model were compared against the conventional operations plan. Summarization was shown to identify the key decisions and isolate the most appropriate sensors. Secondly, a form of summarization called beacon monitoring was experimentally verified. Beacon monitoring automates the anomaly detection and notification tasks and migrates these responsibilities to the space segment. 
A set of experiments using Sapphire demonstrated significant cost and time savings compared to conventional operations. Summarization is based on rational concepts for defining and understanding operations. Therefore, it enables additional trade studies that were formerly not possible and also can form the basis for future detailed research into spacecraft operations.
Kent, Peter; Stochkendahl, Mette Jensen; Christensen, Henrik Wulff; Kongsted, Alice
2015-01-01
Recognition of homogeneous subgroups of patients can usefully improve prediction of their outcomes and the targeting of treatment. There are a number of research approaches that have been used to recognise homogeneity in such subgroups and to test their implications. One approach is to use statistical clustering techniques, such as Cluster Analysis or Latent Class Analysis, to detect latent relationships between patient characteristics. Influential patient characteristics can come from diverse domains of health, such as pain, activity limitation, physical impairment, social role participation, psychological factors, biomarkers and imaging. However, such 'whole person' research may result in data-driven subgroups that are complex, difficult to interpret and challenging to recognise clinically. This paper describes a novel approach to applying statistical clustering techniques that may improve the clinical interpretability of derived subgroups and reduce sample size requirements. This approach involves clustering in two sequential stages. The first stage involves clustering within health domains and therefore requires creating as many clustering models as there are health domains in the available data. This first stage produces scoring patterns within each domain. The second stage involves clustering using the scoring patterns from each health domain (from the first stage) to identify subgroups across all domains. We illustrate this using chest pain data from the baseline presentation of 580 patients. The new two-stage clustering resulted in two subgroups that approximated the classic textbook descriptions of musculoskeletal chest pain and atypical angina chest pain. The traditional single-stage clustering resulted in five clusters that were also clinically recognisable but displayed less distinct differences. In this paper, a new approach to using clustering techniques to identify clinically useful subgroups of patients is suggested. 
Research designs, statistical methods and outcome metrics suitable for performing that testing are also described. This approach has potential benefits but requires broad testing, in multiple patient samples, to determine its clinical value. The usefulness of the approach is likely to be context-specific, depending on the characteristics of the available data and the research question being asked of it.
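The two-stage idea can be sketched with off-the-shelf clustering. Note the paper discusses techniques such as Latent Class Analysis; this sketch substitutes k-means on synthetic two-domain data purely for simplicity, and the domain names are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 200
# Synthetic patients drawn from two latent subgroups, measured in two domains.
pain  = np.vstack([rng.normal(0, 1, (n // 2, 3)), rng.normal(3, 1, (n // 2, 3))])
psych = np.vstack([rng.normal(0, 1, (n // 2, 2)), rng.normal(3, 1, (n // 2, 2))])

# Stage 1: cluster within each health domain separately,
# producing a scoring pattern per domain.
pain_lbl  = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pain)
psych_lbl = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(psych)

# Stage 2: cluster patients on their domain-level scoring patterns
# to form the cross-domain subgroups.
patterns = np.column_stack([pain_lbl, psych_lbl]).astype(float)
subgroup = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(patterns)
```

The stage-2 input has one column per domain rather than one per raw variable, which is what makes the resulting subgroups easier to interpret and less demanding of sample size.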
Blackboard architecture for medical image interpretation
NASA Astrophysics Data System (ADS)
Davis, Darryl N.; Taylor, Christopher J.
1991-06-01
There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.
Youngblood, Laura; Bangs, Michael J.; Lavery, James V.; James, Stephanie
2015-01-01
Abstract A thorough search of the existing literature has revealed that there are currently no published recommendations or guidelines for the interpretation of US regulations on the use of human participants in vector biology research (VBR). An informal survey of vector biologists has indicated that issues related to human participation in vector research have been largely debated by academic, national, and local Institutional Review Boards (IRBs) in the countries where the research is being conducted, and that interpretations and subsequent requirements made by these IRBs have varied widely. This document is intended to provide investigators and corresponding scientific and ethical review committee members an introduction to VBR methods involving human participation and the legal and ethical framework in which such studies are conducted with a focus on US Federal Regulations. It is also intended to provide a common perspective for guiding researchers, IRB members, and other interested parties (i.e., public health officials conducting routine entomological surveillance) in the interpretation of human subjects regulations pertaining to VBR. PMID:25700039
Castagné, Raphaële; Boulangé, Claire Laurence; Karaman, Ibrahim; Campanella, Gianluca; Santos Ferreira, Diana L; Kaluarachchi, Manuja R; Lehne, Benjamin; Moayyeri, Alireza; Lewis, Matthew R; Spagou, Konstantina; Dona, Anthony C; Evangelos, Vangelis; Tracy, Russell; Greenland, Philip; Lindon, John C; Herrington, David; Ebbels, Timothy M D; Elliott, Paul; Tzoulaki, Ioanna; Chadeau-Hyam, Marc
2017-10-06
1H NMR spectroscopy of biofluids generates reproducible data allowing detection and quantification of small molecules in large population cohorts. Statistical models to analyze such data are now well-established, and the use of univariate metabolome-wide association studies (MWAS) investigating the spectral features separately has emerged as a computationally efficient and interpretable alternative to multivariate models. MWAS rely on the accurate estimation of a metabolome-wide significance level (MWSL) to be applied to control the family-wise error rate. Subsequent interpretation requires efficient visualization and formal feature annotation, which, in turn, call for efficient prioritization of spectral variables of interest. Using human serum 1H NMR spectroscopic profiles from 3948 participants in the Multi-Ethnic Study of Atherosclerosis (MESA), we have performed a series of MWAS for serum levels of glucose. We first propose an extension of the conventional MWSL that yields stable estimates of the MWSL across the different model parameterizations and distributional features of the outcome. We propose both efficient visualization methods and a strategy based on subsampling and internal validation to prioritize the associations. Our work proposes and illustrates practical and scalable solutions to facilitate the implementation of the MWAS approach and improve interpretation in large cohort studies. PMID:28823158
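The standard permutation logic behind a metabolome-wide significance level can be sketched as follows. This is a generic min-p permutation procedure on synthetic data, not the authors' extended estimator; all sizes and names are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, p = 150, 40
X = rng.normal(size=(n, p))   # stand-in for spectral features
y = rng.normal(size=n)        # stand-in for an outcome such as serum glucose

def min_pvalue(X, y):
    # Smallest univariate p-value across all spectral features,
    # i.e. one MWAS pass reduced to its most significant hit.
    return min(stats.pearsonr(X[:, j], y)[1] for j in range(X.shape[1]))

# Permute the outcome to break any true association, then collect the
# null distribution of the minimum p-value.
null_min_p = np.array([min_pvalue(X, rng.permutation(y)) for _ in range(200)])

# The MWSL is the alpha-quantile of that null distribution: applying it
# per feature controls the family-wise error rate at alpha.
mwsl = np.quantile(null_min_p, 0.05)
```

Because the minimum over correlated features is taken before the quantile, the resulting threshold is less conservative than a plain Bonferroni correction.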
Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M
2015-03-01
It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
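The relatedness step can be illustrated with a case-mix "membership model": fit a classifier to distinguish development from validation records and read off its c-statistic. This is a hedged sketch on synthetic data; the variable names and the 0.5 mean shift are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
dev = rng.normal(0.0, 1.0, (300, 4))   # development-sample predictors
val = rng.normal(0.5, 1.0, (300, 4))   # validation sample with shifted case mix

X = np.vstack([dev, val])
membership = np.r_[np.zeros(len(dev)), np.ones(len(val))]

# c-statistic of the membership model quantifies case-mix difference:
# near 0.5 -> samples are interchangeable (reproducibility end of the scale);
# near 1.0 -> clearly different case mix (transportability end).
model = LogisticRegression(max_iter=1000).fit(X, membership)
auc = roc_auc_score(membership, model.predict_proba(X)[:, 1])
```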
Sadatsafavi, Mohsen; Marra, Carlo; Aaron, Shawn; Bryan, Stirling
2014-06-03
Cost-effectiveness analyses (CEAs) that use patient-specific data from a randomized controlled trial (RCT) are popular, yet such CEAs are criticized because they neglect to incorporate evidence external to the trial. A popular method for quantifying uncertainty in a RCT-based CEA is the bootstrap. The objective of the present study was to further expand the bootstrap method of RCT-based CEA for the incorporation of external evidence. We utilize the Bayesian interpretation of the bootstrap and derive the distribution for the cost and effectiveness outcomes after observing the current RCT data and the external evidence. We propose simple modifications of the bootstrap for sampling from such posterior distributions. In a proof-of-concept case study, we use data from a clinical trial and incorporate external evidence on the effect size of treatments to illustrate the method in action. Compared to the parametric models of evidence synthesis, the proposed approach requires fewer distributional assumptions, does not require explicit modeling of the relation between external evidence and outcomes of interest, and is generally easier to implement. A drawback of this approach is potential computational inefficiency compared to the parametric Bayesian methods. The bootstrap method of RCT-based CEA can be extended to incorporate external evidence, while preserving its appealing features such as no requirement for parametric modeling of cost and effectiveness outcomes.
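The Bayesian reading of the bootstrap replaces resampling with Dirichlet(1, ..., 1) weights on the observed patients. A sketch on hypothetical trial data (all numbers invented; the paper's modifications for incorporating external evidence are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical per-patient trial data: cost and effectiveness in each arm.
cost_t, eff_t = rng.normal(5000, 900, 100), rng.normal(0.70, 0.10, 100)
cost_c, eff_c = rng.normal(4000, 900, 100), rng.normal(0.60, 0.10, 100)

def bb_mean(x, rng):
    # One Bayesian-bootstrap draw of the mean: Dirichlet(1,...,1) weights
    # over the observations instead of resampling with replacement.
    w = rng.dirichlet(np.ones(len(x)))
    return w @ x

# Joint posterior draws of (incremental cost, incremental effectiveness).
draws = np.array([
    (bb_mean(cost_t, rng) - bb_mean(cost_c, rng),
     bb_mean(eff_t, rng) - bb_mean(eff_c, rng))
    for _ in range(2000)
])
icer = draws[:, 0].mean() / draws[:, 1].mean()  # incremental cost per unit effect
```

The cloud of draws can be plotted on the cost-effectiveness plane or summarized as an acceptability curve; no parametric model of the cost and effectiveness outcomes is needed.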
2011-01-01
Background The Baby Friendly Hospital (Health) Initiative (BFHI) is a global initiative aimed at protecting, promoting and supporting breastfeeding and is based on the ten steps to successful breastfeeding. Worldwide, over 20,000 health facilities have attained BFHI accreditation but only 77 Australian hospitals (approximately 23%) have received accreditation. Few studies have investigated the factors that facilitate or hinder implementation of BFHI but it is acknowledged this is a major undertaking requiring strategic planning and change management throughout an institution. This paper examines the perceptions of BFHI held by midwives and nurses working in one Area Health Service in NSW, Australia. Methods The study used an interpretive, qualitative approach. A total of 132 health professionals, working across four maternity units, two neonatal intensive care units and related community services, participated in 10 focus groups. Data were analysed using thematic analysis. Results Three main themes were identified: 'Belief and Commitment'; 'Interpreting BFHI' and 'Climbing a Mountain'. Participants considered the BFHI implementation a high priority; an essential set of practices that would have positive benefits for babies and mothers both locally and globally as well as for health professionals. It was considered achievable but would take commitment and hard work to overcome the numerous challenges including a number of organisational constraints. There were, however, differing interpretations of what was required to attain BFHI accreditation with the potential that misinterpretation could hinder implementation. A model described by Greenhalgh and colleagues on adoption of innovation is drawn on to interpret the findings. Conclusion Despite strong support for BFHI, the principles of this global strategy are interpreted differently by health professionals and further education and accurate information is required. 
It may be that the current processes used to disseminate and implement BFHI need to be reviewed. The findings suggest that there is a contradiction between the broad philosophical stance and best practice approach of this global strategy and the tendency for health professionals to focus on the ten steps as a set of tasks or a checklist to be accomplished. The perceived procedural approach to implementation may be contributing to lower rates of breastfeeding continuation. PMID:21878131
On the validity of Freud's dream interpretations.
Michael, Michael
2008-03-01
In this article I defend Freud's method of dream interpretation against those who criticize it as involving a fallacy, namely the reverse causal fallacy, and those who criticize it as permitting many interpretations, indeed any that the interpreter wants to put on the dream. The first criticism misconstrues the logic of the interpretative process: it does not involve an unjustified reversal of causal relations, but rather a legitimate attempt at an inference to the best explanation. The judgement of whether or not a particular interpretation is the best explanation depends on the details of the case in question. I outline the kinds of probabilities involved in making the judgement. My account also helps to cash out the metaphors of the jigsaw and crossword puzzles that Freudians have used in response to the 'many interpretations' objection. However, in defending Freud's method of dream interpretation, I do not thereby defend his theory of dreams, which cannot be justified by his interpretations alone.
Causal Cognition, Force Dynamics and Early Hunting Technologies
Gärdenfors, Peter; Lombard, Marlize
2018-01-01
With this contribution we analyze ancient hunting technologies as one way to explore the development of causal cognition in the hominin lineage. Building on earlier work, we separate seven grades of causal thinking. By looking at variations in force dynamics as a central element in causal cognition, we analyze the thinking required for different hunting technologies such as stabbing spears, throwing spears, launching atlatl darts, shooting arrows with a bow, and the use of poisoned arrows. Our interpretation demonstrates that there is an interplay between the extension of the human body through technology and the expansion of our cognitive abilities to reason about causes. It adds content and dimension to the trend of including embodied cognition in evolutionary studies and in the interpretation of the archeological record. Our method could explain variation in technology sets between archaic and modern human groups. PMID:29483885
Visualizing tumor evolution with the fishplot package for R.
Miller, Christopher A; McMichael, Joshua; Dang, Ha X; Maher, Christopher A; Ding, Li; Ley, Timothy J; Mardis, Elaine R; Wilson, Richard K
2016-11-07
Massively-parallel sequencing at depth is now enabling tumor heterogeneity and evolution to be characterized in unprecedented detail. Tracking these changes in clonal architecture often provides insight into therapeutic response and resistance. In complex cases involving multiple timepoints, standard visualizations, such as scatterplots, can be difficult to interpret. Current data visualization methods are also typically manual and laborious, and often only approximate subclonal fractions. We have developed an R package that accurately and intuitively displays changes in clonal structure over time. It requires simple input data and produces illustrative and easy-to-interpret graphs suitable for diagnosis, presentation, and publication. The simplicity, power, and flexibility of this tool make it valuable for visualizing tumor evolution, and it has potential utility in both research and clinical settings. The fishplot package is available at https://github.com/chrisamiller/fishplot.
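fishplot itself is an R package; as a rough Python analogue, a stacked-area chart conveys the same changing subclonal fractions over time, though without fishplot's smoothed "fish" shapes. All fractions and timepoints below are hypothetical:

```python
import matplotlib
matplotlib.use("Agg")          # render off-screen
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical clonal fractions at three timepoints (rows: clones, cols: time).
time = [0, 30, 60]                       # days, hypothetical
frac = np.array([[100, 40, 10],          # founding clone shrinks on therapy
                 [0,  35,  5],           # subclone responds, then collapses
                 [0,   0, 80]])          # resistant subclone expands

fig, ax = plt.subplots()
polys = ax.stackplot(time, frac, labels=["clone 1", "clone 2", "clone 3"])
ax.set_xlabel("days")
ax.set_ylabel("tumor fraction (%)")
ax.legend(loc="upper left")
```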
Remote sensing as a tool for estimating soil erosion potential
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.
1979-01-01
The Universal Soil Loss Equation is a frequently used methodology for estimating soil erosion potential. The Universal Soil Loss Equation requires a variety of types of geographic information (e.g. topographic slope, soil erodibility, land use, crop type, and soil conservation practice) in order to function. This information is traditionally gathered from topographic maps, soil surveys, field surveys, and interviews with farmers. Remote sensing data sources and interpretation techniques provide an alternative method for collecting information regarding land use, crop type, and soil conservation practice. Airphoto interpretation techniques and medium altitude, multi-date color and color infrared positive transparencies (70mm) were utilized in this study to determine their effectiveness for gathering the desired land use/land cover data. Successful results were obtained within the test site, a 6136 hectare watershed in Dane County, Wisconsin.
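The equation itself is the product of six empirical factors, A = R · K · LS · C · P. A direct transcription (the input values below are hypothetical, chosen only to illustrate the arithmetic):

```python
def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    A  : predicted average annual soil loss (tons/acre/yr)
    R  : rainfall-runoff erosivity      K : soil erodibility
    LS : slope length-steepness factor  C : cover-management factor
    P  : support-practice factor
    """
    return R * K * LS * C * P

# Hypothetical inputs for a cropped, contour-farmed field.
A = usle(R=150, K=0.3, LS=1.2, C=0.25, P=0.5)   # -> 6.75 tons/acre/yr
```

Remote sensing chiefly informs the C and P factors (land use, crop type, conservation practice); slope and erodibility still come from topographic maps and soil surveys.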
An evaluation of the Goddard Space Flight Center Library
NASA Technical Reports Server (NTRS)
Herner, S.; Lancaster, F. W.; Wright, N.; Ockerman, L.; Shearer, B.; Greenspan, S.; Mccartney, J.; Vellucci, M.
1979-01-01
The character and degree of coincidence between the current and future missions, programs, and projects of the Goddard Space Flight Center and the current and future collection, services, and facilities of its library were determined from structured interviews and discussions with various classes of facility personnel. In addition to the tabulation and interpretation of the data from the structured interview survey, five types of statistical analyses were performed to corroborate (or contradict) the survey results and to produce useful information not readily attainable through survey material. Conclusions reached regarding compatibility between needs and holdings, services and buildings, library hours of operation, methods of early detection and anticipation of changing holdings requirements, and the impact of near future programs are presented along with a list of statistics needing collection, organization, and interpretation on a continuing or longitudinal basis.
Identity versus determinism: Émile Meyerson's neo-Kantian interpretation of the quantum theory
NASA Astrophysics Data System (ADS)
Mills, M. Anthony
2014-08-01
Despite the praise his writing garnered during his lifetime, e.g., from readers such as Einstein and de Broglie, Émile Meyerson has been largely forgotten. The rich tradition of French épistémologie has recently been taken up in some Anglo-American scholarship, but Meyerson-who popularized the term épistémologie through his historical method of analyzing science, and criticized positivism long before Quine and Kuhn-remains overlooked. If Meyerson is remembered at all, it is as a historian of classical science. This paper attempts to rectify both states of affairs by explicating one of Meyerson's last and untranslated works, Réel et déterminisme dans la théorie quantique, an opuscule on quantum physics. I provide an overview of Meyerson's philosophy, his critique of Max Planck's interpretation of quantum physics, and then outline and evaluate Meyerson's neo-Kantian alternative. I then compare and contrast this interpretation with Cassirer's neo-Kantian program. Finally I show that, while Meyerson believes the revolutionary new physics requires "profoundly" modifying our conception of reality, ultimately, he thinks, it secures the legitimacy of his thesis: that science seeks explanations in the form of what he calls "identification." I hope my research will enable a more general and systematic engagement with Meyerson's work, especially with a view to assessing its viability as a philosophical method today.
NASA Astrophysics Data System (ADS)
Huang, Zhongjie; Siozos-Rousoulis, Leonidas; De Troyer, Tim; Ghorbaniasl, Ghader
2018-02-01
This paper presents a time-domain method for noise prediction of supersonic rotating sources in a moving medium. The proposed approach can be interpreted as an extensive time-domain solution for the convected permeable Ffowcs Williams and Hawkings equation, which is capable of avoiding the Doppler singularity. The solution requires special treatment for construction of the emission surface. The derived formula can explicitly and efficiently account for subsonic uniform constant flow effects on radiated noise. Implementation of the methodology is realized through the Isom thickness noise case and high-speed impulsive noise prediction from helicopter rotors.
Towards automated sleep classification in infants using symbolic and subsymbolic approaches.
Kubat, M; Flotzinger, D; Pfurtscheller, G
1993-04-01
The paper addresses the problem of automatic sleep classification. A special effort is made to find a method of extracting reasonable descriptions of the individual sleep stages from sample measurements of EEG, EMG, EOG, etc., and from a classification of these measurements provided by an expert. The method should satisfy three requirements: classification accuracy, interpretability of the results, and the ability to select the relevant and discard the irrelevant variables. The solution suggested in this paper consists of a combination of the subsymbolic algorithm LVQ with the symbolic decision tree generator ID3. Results demonstrating the feasibility and utility of our approach are also presented.
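As a rough illustration of the subsymbolic half of such a combination, a textbook LVQ1 update rule can be sketched as follows (a generic sketch, not the paper's implementation, and without the ID3 decision-tree stage):

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Minimal LVQ1: pull the nearest prototype toward samples of its
    own class, push it away from samples of other classes."""
    P = np.array(prototypes, dtype=float)
    for _ in range(epochs):
        for x, label in zip(np.asarray(X, dtype=float), y):
            j = int(np.argmin(np.linalg.norm(P - x, axis=1)))  # nearest prototype
            step = lr * (x - P[j])
            P[j] += step if proto_labels[j] == label else -step
    return P
```

After training, each prototype summarizes a region of feature space; symbolic rules (as with ID3 in the paper) can then be learned over the quantized representation.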
Fixation methods for electron microscopy of human and other liver
Wisse, Eddie; Braet, Filip; Duimel, Hans; Vreuls, Celien; Koek, Ger; Olde Damink, Steven WM; van den Broek, Maartje AJ; De Geest, Bart; Dejong, Cees HC; Tateno, Chise; Frederik, Peter
2010-01-01
An electron microscopic study of the liver requires expertise and complicated, time-consuming processing of hepatic tissues and cells. The interpretation of electron microscopy (EM) images requires knowledge of the liver fine structure and experience with the numerous artifacts introduced by fixation, embedding, sectioning, contrast staining and microscopic imaging. Hence, the aim of this paper is to present a detailed summary of different methods for the preparation of hepatic cells and tissue, for the purpose of preserving long-standing expertise and to encourage new investigators and clinicians to include EM studies of liver cells and tissue in their projects. PMID:20556830
How concept images affect students' interpretations of Newton's method
NASA Astrophysics Data System (ADS)
Engelke Infante, Nicole; Murphy, Kristen; Glenn, Celeste; Sealey, Vicki
2018-07-01
Knowing when students have the prerequisite knowledge to be able to read and understand a mathematical text is a perennial concern for instructors. Using text describing Newton's method and Vinner's notion of concept image, we exemplify how prerequisite knowledge influences understanding. Through clinical interviews with first-semester calculus students, we determined how evoked concept images of tangent lines and roots contributed to students' interpretation and application of Newton's method. Results show that some students' concept images of root and tangent line developed throughout the interview process, and most students were able to adequately interpret the text on Newton's method. However, students with insufficient concept images of tangent line and students who were unwilling or unable to modify their concept images of tangent line after reading the text were not successful in interpreting Newton's method.
NASA Astrophysics Data System (ADS)
Ansari, Muhammad Ahsan; Zai, Sammer; Moon, Young Shik
2017-01-01
Manual analysis of the bulk data generated by computed tomography angiography (CTA) is time-consuming, and interpretation of such data requires the prior knowledge and expertise of a radiologist. Therefore, an automatic method that can isolate the coronary arteries from a given CTA dataset is required. We present an automatic yet effective segmentation method to delineate the coronary arteries from a three-dimensional CTA data cloud. Instead of a region-growing process, which is usually time-consuming and prone to leakages, the method is based on optimal thresholding, which is applied to the Hessian-based vesselness measure in a localized way (slice by slice) to track the coronaries carefully to their distal ends. Moreover, to make the process automatic, we detect the aorta using the Hough transform technique. The proposed segmentation method is independent of the starting point used to initiate its process and is fast in the sense that the coronary arteries are obtained without any preprocessing or postprocessing steps. We used 12 real clinical datasets to show the efficiency and accuracy of the presented method. Experimental results reveal that the proposed method achieves 95% average accuracy.
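The localized (slice-by-slice) optimal-thresholding step can be sketched with a plain Otsu threshold. This is a generic NumPy illustration, not the authors' code, and it omits the vesselness computation and the Hough-based aorta detection:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Optimal threshold in Otsu's sense: maximize between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    w = np.cumsum(hist)                    # cumulative class-0 weight
    m = np.cumsum(hist * edges[:-1])       # cumulative class-0 weighted sum
    total_w, total_m = w[-1], m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (total_m * w - total_w * m) ** 2 / (w * (total_w - w))
    between[~np.isfinite(between)] = 0.0   # guard empty-class divisions
    return edges[np.argmax(between)]

def segment_slicewise(vesselness_volume):
    """Threshold each slice independently, mimicking a localized scheme."""
    return np.stack([v > otsu_threshold(v.ravel()) for v in vesselness_volume])
```

In the paper the input would be the Hessian-based vesselness of each CTA slice; here any 3D array works.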
Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu
2010-01-01
Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC.
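A minimal version of such a regression-based linearity check can be sketched as follows (ordinary least squares with the coefficient of determination and residuals left available for inspection; outlier treatment and the full IUPAC evaluation of assumptions are omitted):

```python
import numpy as np

def assess_linearity(conc, response):
    """Fit a calibration line and return basic linearity diagnostics:
    slope, intercept, R-squared, and residuals (for trend/outlier checks)."""
    slope, intercept = np.polyfit(conc, response, 1)
    fitted = slope * conc + intercept
    residuals = response - fitted
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((response - response.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return slope, intercept, r_squared, residuals
```

In practice the residuals would be plotted against concentration: a random scatter supports linearity, while curvature or a trend argues against it.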
The storybook method: research feedback with young participants.
Anderson, Kate; Balandin, Susan
2011-12-01
Children are valuable informants for social research; however, their participation presents additional ethical and practical challenges. Of these challenges, feedback to verify the researchers' interpretations drawn from children's data, and the dissemination of project findings to young participants, have proven difficult to overcome. In this paper, we outline the Storybook method, an approach to feedback in research with young children. In the example study, illustrations, interactive pop-ups, and third-person disclosure were used to aid children aged 7-9 years to overcome the power imbalance in interviews with adults. The Storybook method facilitated active participation in the validation process. Potential modifications of the method for use with older populations, including adults with intellectual disabilities, complex communication needs, and those requiring alternate access to written texts, are also explored.
Interpreting Tools by Imagining Their Uses
ERIC Educational Resources Information Center
Kwan, Alistair
2017-01-01
By prompting imagined or actual bodily experience, we can guide interpretation of tools to emphasize the action that those tools perform. The technique requires little more than an extension from looking at an object, to imagining how the body engages with it, perhaps even trying out those specialist postures, to nourish an interpretation centered…
Unregulated Autonomy: Uncredentialed Educational Interpreters in Rural Schools
ERIC Educational Resources Information Center
Fitzmaurice, Stephen
2017-01-01
Although many rural Deaf and Hard of Hearing students attend public schools most of the day and use the services of educational interpreters to gain access to the school environment, little information exists on what interpreters are doing in rural school systems in the absence of credentialing requirements. The researcher used ethnographic…
Court Interpreters and Translators: Developing Ethical and Professional Standards.
ERIC Educational Resources Information Center
Funston, Richard
Changing needs in the courtroom have raised questions about the need for standards in court interpreter qualifications. In California, no formal training or familiarity with the legal system is required for certification, which is done entirely by language testing. The fact that often court interpreters are officers of the court may be…
Russell, Steven M; Doménech-Sánchez, Antonio; de la Rica, Roberto
2017-06-23
Colorimetric tests are becoming increasingly popular in point-of-need analyses due to the possibility of detecting the signal with the naked eye, which eliminates the need for bulky and costly instruments available only in laboratories. However, colorimetric tests may be interpreted incorrectly by nonspecialists due to disparities in color perception or a lack of training. Here we solve this issue with a method that not only detects colorimetric signals but also interprets them so that the test outcome is understandable by anyone. It consists of an augmented reality (AR) app that uses a camera to detect the colored signals generated by a nanoparticle-based immunoassay, and that yields a warning symbol or message when the concentration of analyte is higher than a certain threshold. The proposed method detected the model analyte mouse IgG with a limit of detection of 0.3 μg mL⁻¹, which was comparable to the limit of detection afforded by classical densitometry performed with a nonportable device. When adapted to the detection of E. coli, the app always yielded a "hazard" warning symbol when the concentration of E. coli in the sample was above the infective dose (10⁶ cfu mL⁻¹ or higher). The proposed method could help nonspecialists make a decision about drinking from a potentially contaminated water source by yielding an unambiguous message that is easily understood by anyone. The widespread availability of smartphones, along with the inexpensive paper test that requires no enzymes to generate the signal, makes the proposed assay promising for analyses in remote locations and developing countries.
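The core decision step, converting a raw color signal into an unambiguous verdict against a calibrated threshold, can be sketched as follows (a simplification: the function name, the use of mean red-channel intensity, and the single-threshold rule are illustrative assumptions, not the app's actual image pipeline):

```python
import numpy as np

def colorimetric_verdict(rgb_patch, signal_at_limit):
    """Reduce a test-spot image patch to a scalar color signal (here the
    mean red-channel intensity, a simplification) and return a
    plain-language verdict instead of a raw number."""
    signal = float(np.mean(rgb_patch[..., 0]))
    verdict = "HAZARD" if signal >= signal_at_limit else "OK"
    return verdict, signal
```

The calibration value `signal_at_limit` would be set from assay measurements at the decision concentration (e.g., the infective dose).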
Hybrid Theory of Electron-Hydrogenic Systems Elastic Scattering
NASA Technical Reports Server (NTRS)
Bhatia, A. K.
2007-01-01
Accurate electron-hydrogen and electron-hydrogenic cross sections are required to interpret fusion experiments, laboratory plasma physics and properties of the solar and astrophysical plasmas. We have developed a method in which the short-range and long-range correlations can be included at the same time in the scattering equations. The phase shifts have rigorous lower bounds and the scattering lengths have rigorous upper bounds. The phase shifts in the resonance region can be used to calculate very accurately the resonance parameters.
Macro-actor execution on multilevel data-driven architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaudiot, J.L.; Najjar, W.
1988-12-31
The data-flow model of computation brings high programmability to multiprocessors at the expense of increased overhead. Applying the model at a higher level leads to better performance but also introduces a loss of parallelism. We demonstrate here syntax-directed program decomposition methods for the creation of large macro-actors in numerical algorithms. In order to alleviate some of the problems introduced by the lower-resolution interpretation, we describe a multilevel-resolution approach and analyze the requirements for its actual hardware and software integration.
Perspectives in astrophysical databases
NASA Astrophysics Data System (ADS)
Frailis, Marco; de Angelis, Alessandro; Roberto, Vito
2004-07-01
Astrophysics has become a domain extremely rich in scientific data. Data mining tools are needed for information extraction from such large data sets. This calls for an approach to data management that emphasizes the efficiency and simplicity of data access; efficiency is obtained using multidimensional access methods, and simplicity is achieved by properly handling metadata. Moreover, clustering and classification techniques on large data sets pose additional requirements in terms of computation and memory scalability and the interpretability of results. In this study we review some possible solutions.
Optical diffraction interpretation: an alternative to interferometers
NASA Astrophysics Data System (ADS)
Bouillet, S.; Audo, F.; Fréville, S.; Eupherte, L.; Rouyer, C.; Daurios, J.
2015-08-01
The Laser MégaJoule (LMJ) is a French high-power laser project that requires thousands of large optical components. The wavefront performance of all those optics is critical to achieving the desired focal-spot shape and to limiting the hot spots that could damage the components. Fizeau interferometers and interferometric microscopes are the most commonly used tools to cover the whole range of interesting spatial frequencies. However, in some particular cases, such as diffractive and/or coated and/or aspheric optics, an interferometric set-up becomes very expensive, requiring a costly reference component or an interferometer designed specifically for the wavelength. Despite the increasing spatial resolution of Fizeau interferometers, it may still not be sufficient when trying to access the highest spatial frequencies of a transmitted wavefront, for instance. The method we developed is based upon intermediate-field measurements of laser beam diffraction and their interpretation with a Fourier analysis and the Talbot effect theory. We demonstrated in previous papers that it is a credible alternative to classical methods. In this paper we go further by analyzing the main error sources and discussing the main practical difficulties.
Laxmisan, A.; McCoy, A.B.; Wright, A.; Sittig, D.F.
2012-01-01
Objective: Clinical summarization, the process by which relevant patient information is electronically summarized and presented at the point of care, is of increasing importance given the increasing volume of clinical data in electronic health record systems (EHRs). There is a paucity of research on electronic clinical summarization, including the capabilities of currently available EHR systems. Methods: We compared different aspects of the general clinical summary screens used in twelve different EHR systems using a previously described conceptual model: AORTIS (Aggregation, Organization, Reduction and Transformation, Interpretation, and Synthesis). Results: We found a wide variation in the EHRs' summarization capabilities: all systems were capable of simple aggregation and organization of limited clinical content, but only one demonstrated an ability to synthesize information from the data. Conclusion: Improvement of the clinical summary screen functionality of currently available EHRs is necessary. Further research should identify strategies and methods for creating easy-to-use, well-designed clinical summary screens that aggregate, organize, and reduce all pertinent patient information as well as provide clinical interpretation and synthesis as required. PMID:22468161
Subtype Diagnosis of Primary Aldosteronism: Is Adrenal Vein Sampling Always Necessary?
Buffolo, Fabrizio; Monticone, Silvia; Williams, Tracy A.; Rossato, Denis; Burrello, Jacopo; Tetti, Martina; Veglio, Franco; Mulatero, Paolo
2017-01-01
Aldosterone producing adenoma and bilateral adrenal hyperplasia are the two most common subtypes of primary aldosteronism (PA) that require targeted and distinct therapeutic approaches: unilateral adrenalectomy or lifelong medical therapy with mineralocorticoid receptor antagonists. According to the 2016 Endocrine Society Guideline, adrenal venous sampling (AVS) is the gold standard test to distinguish between unilateral and bilateral aldosterone overproduction and therefore, to safely refer patients with PA to surgery. Despite significant advances in the optimization of the AVS procedure and the interpretation of hormonal data, a standardized protocol across centers is still lacking. Alternative methods are sought to either localize an aldosterone producing adenoma or to predict the presence of unilateral disease and thereby substantially reduce the number of patients with PA who proceed to AVS. In this review, we summarize the recent advances in subtyping PA for the diagnosis of unilateral and bilateral disease. We focus on the developments in the AVS procedure, the interpretation criteria, and comparisons of the performance of AVS with the alternative methods that are currently available. PMID:28420172
A systematic review of methods to diagnose oral dryness and salivary gland function
2012-01-01
Background: The most advocated clinical method for diagnosing salivary dysfunction is to quantitate unstimulated and stimulated whole saliva (sialometry). Since there is an expected and wide variation in salivary flow rates among individuals, the assessment of dysfunction can be difficult. The aim of this systematic review is to evaluate the quality of the evidence for the efficacy of diagnostic methods used to identify oral dryness. Methods: A literature search, with specific indexing terms and a hand search, was conducted for publications that described a method to diagnose oral dryness. The electronic databases of PubMed, Cochrane Library, and Web of Science were used as data sources. Four reviewers selected publications on the basis of predetermined inclusion and exclusion criteria. Data were extracted from the selected publications using a protocol. Original studies were interpreted with the aid of the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool. Results: The database searches resulted in 224 titles and abstracts. Of these abstracts, 80 publications were judged to meet the inclusion criteria and were read in full. A total of 18 original studies were judged relevant and interpreted for this review. In all studies, the results of the test method were compared to those of a reference method. Based on the interpretation (with the aid of the QUADAS tool), it can be reported that the patient selection criteria were not clearly described and that the test or reference methods were not described in sufficient detail to be reproducible. None of the included studies reported information on uninterpretable/intermediate results or data on observer or instrument variation. Seven of the studies presented their results as a percentage of correct diagnoses.
Conclusions The evidence for the efficacy of clinical methods to assess oral dryness is sparse and it can be stated that improved standards for the reporting of diagnostic accuracy are needed in order to assure the methodological quality of studies. There is need for effective diagnostic criteria and functional tests in order to detect those individuals with oral dryness who may require oral treatment, such as alleviation of discomfort and/or prevention of diseases. PMID:22870895
NASA Technical Reports Server (NTRS)
Ashley, R. P. (Principal Investigator); Goetz, A. F. H.; Rowan, L. C.; Abrams, M. J.
1979-01-01
The author has identified the following significant results. LANDSAT images enhanced by the band-ratioing method can be used for reconnaissance alteration mapping in moderately to heavily vegetated semiarid terrain as well as in the sparsely vegetated semiarid terrain where the technique was originally developed. Significant vegetation cover in a scene, however, requires the use of MSS ratios 4/5, 4/6, and 6/7 rather than 4/5, 5/6, and 6/7, and requires careful interpretation of the results. Supplemental information suitable for vegetation identification and cover estimates, such as standard LANDSAT false-color composites and low-altitude aerial photographs of selected areas, is desirable.
You are lost without a map: Navigating the sea of protein structures.
Lamb, Audrey L; Kappock, T Joseph; Silvaggi, Nicholas R
2015-04-01
X-ray crystal structures propel biochemistry research like no other experimental method, since they answer many questions directly and inspire new hypotheses. Unfortunately, many users of crystallographic models mistake them for actual experimental data. Crystallographic models are interpretations, several steps removed from the experimental measurements, making it difficult for nonspecialists to assess the quality of the underlying data. Crystallographers mainly rely on "global" measures of data and model quality to build models. Robust validation procedures based on global measures now largely ensure that structures in the Protein Data Bank (PDB) are correct. However, global measures do not allow users of crystallographic models to judge the reliability of "local" features in a region of interest. Refinement of a model to fit an electron density map requires interpretation of the data to produce a single "best" overall model. This process requires inclusion of the most probable conformations in areas of poor density. Users who misunderstand this can be misled, especially in regions of the structure that are mobile, including active sites, surface residues, and especially ligands. This article aims to equip users of macromolecular models with tools to critically assess local model quality. Structure users should always check the agreement of the electron density map and the derived model in all areas of interest, even if the global statistics are good. We provide illustrated examples of interpreted electron density as a guide for those unaccustomed to viewing electron density. Copyright © 2014 Elsevier B.V. All rights reserved.
Whiffin, Nicola; Walsh, Roddy; Govind, Risha; Edwards, Matthew; Ahmad, Mian; Zhang, Xiaolei; Tayal, Upasana; Buchan, Rachel; Midwinter, William; Wilk, Alicja E; Najgebauer, Hanna; Francis, Catherine; Wilkinson, Sam; Monk, Thomas; Brett, Laura; O'Regan, Declan P; Prasad, Sanjay K; Morris-Rosendahl, Deborah J; Barton, Paul J R; Edwards, Elizabeth; Ware, James S; Cook, Stuart A
2018-01-25
Purpose: Internationally adopted variant interpretation guidelines from the American College of Medical Genetics and Genomics (ACMG) are generic and require disease-specific refinement. Here we developed CardioClassifier (http://www.cardioclassifier.org), a semiautomated decision-support tool for inherited cardiac conditions (ICCs). Methods: CardioClassifier integrates data retrieved from multiple sources with user-input case-specific information, through an interactive interface, to support variant interpretation. Combining disease- and gene-specific knowledge with variant observations in large cohorts of cases and controls, we refined 14 computational ACMG criteria and created three ICC-specific rules. Results: We benchmarked CardioClassifier on 57 expertly curated variants and show full retrieval of all computational data, concordantly activating 87.3% of rules. A generic annotation tool identified fewer than half as many clinically actionable variants (64/219 vs. 156/219, Fisher's P = 1.1 × 10⁻¹⁸), with important false positives, illustrating the critical importance of disease- and gene-specific annotations. CardioClassifier identified putatively disease-causing variants in 33.7% of 327 cardiomyopathy cases, comparable with leading ICC laboratories. Through addition of manually curated data, variants found in over 40% of cardiomyopathy cases are fully annotated, without requiring additional user-input data. Conclusion: CardioClassifier is an ICC-specific decision-support tool that integrates expertly curated computational annotations with case-specific data to generate fast, reproducible, and interactive variant pathogenicity reports, according to best practice guidelines. GENETICS in MEDICINE advance online publication, 25 January 2018; doi:10.1038/gim.2017.258.
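The Fisher's exact comparison cited above (64/219 vs. 156/219 actionable variants) is a standard hypergeometric tail computation. A one-sided version (the paper reports a two-sided P) can be sketched exactly with integer arithmetic:

```python
from math import comb
from fractions import Fraction

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability, with margins fixed, of a count in cell `a` at least
    as large as observed."""
    r1, r2, c1 = a + b, c + d, a + c   # row totals and first column total
    n = r1 + r2
    denom = comb(n, c1)
    num = sum(comb(r1, i) * comb(r2, c1 - i)
              for i in range(a, min(r1, c1) + 1))
    return float(Fraction(num, denom))
```

For the paper's table this would be `fisher_one_sided(156, 63, 64, 155)`; exact integer arithmetic avoids underflow at such extreme p-values.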
An exact noniterative linear method for locating sources based on measuring receiver arrival times.
Militello, C; Buenafuente, S R
2007-06-01
In this paper an exact, linear solution to the source localization problem based on the time of arrival at the receivers is presented. The method is unique in that the source's position can be obtained by solving a system of linear equations: three for a plane and four for a volume. This simplification comes at the cost of one receiver beyond the mathematical minimum (3+1 in two dimensions and 4+1 in three dimensions). The equations are easily worked out for any receiver configuration, and their geometrical interpretation is straightforward. Unlike other methods, the system of reference used to describe the receivers' positions is completely arbitrary. The relationship between this method and previously published ones is discussed, showing how the present, more general method overcomes nonlinearity and unknown-dependency issues.
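The linearization described can be sketched as follows: squaring each range equation ||s − rᵢ||² = c²(tᵢ − t₀)² and subtracting the first from the rest cancels the quadratic terms ||s||² and t₀², leaving a system that is linear in the source position s and the emission time t₀ (hence one receiver beyond the geometric minimum). This NumPy sketch is a generic reconstruction of that idea, not the authors' exact formulation:

```python
import numpy as np

def locate_source(receivers, arrival_times, c=343.0):
    """Exact linear TOA localization: returns (source position, emission
    time t0). Needs dim+2 receivers (4 in 2D, 5 in 3D) in general position;
    here dim+2 gives a square system, more gives least squares."""
    r = np.asarray(receivers, dtype=float)    # shape (m, dim)
    t = np.asarray(arrival_times, dtype=float)
    # Subtract equation 0 from equations 1..m-1; unknowns are [s, t0]:
    #   -2(r_i - r_0)·s + 2c^2(t_i - t_0)·t0 = c^2(t_i^2 - t_0^2) - (|r_i|^2 - |r_0|^2)
    A = np.hstack([-2.0 * (r[1:] - r[0]),
                   (2.0 * c * c * (t[1:] - t[0]))[:, None]])
    b = c * c * (t[1:] ** 2 - t[0] ** 2) - (np.sum(r[1:] ** 2, axis=1)
                                            - np.sum(r[0] ** 2))
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:-1], sol[-1]
```

Because only differences of receiver positions and times appear, the coordinate origin is indeed arbitrary, matching the abstract's claim.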
Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-01-01
Background Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. Objective This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Methods Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Results Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis—Possible, Implementable, (to be) Challenged, (to be) Killed—guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). 
Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Conclusions Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers. PMID:26892952
Does periodic lung screening of films meet standards?
Binay, Songul; Arbak, Peri; Safak, Alp Alper; Balbay, Ege Gulec; Bilgin, Cahit; Karatas, Naciye
2016-01-01
Objective: To determine whether the techniques used in workers' periodic chest x-ray screening comply with quality standards, which is the responsibility of physicians; to evaluate differences in interpretation among physicians at different levels of training; and to assess the importance of standardized interpretation. Methods: Previously taken chest radiographs of 400 workers in a factory producing glass run channels were evaluated against technical and quality standards by three observers (pulmonologist, radiologist, pulmonologist assistant). There was perfect concordance between the radiologist and the pulmonologist for underpenetrated films, and between the pulmonologist and the pulmonologist assistant for overpenetrated films. Results: The pulmonologist (52%) interpreted the dose of the films as regular more often than the other observers (radiologist, 44.3%; pulmonologist assistant, 30.4%). The pulmonologist (81.7%) interpreted films as taken in the inspiratory phase less frequently than the other observers (radiologist, 92.1%; pulmonologist assistant, 92.6%). The pulmonologist's rate (53.5%) was higher than the other observers' (radiologist, 44.6%; pulmonologist assistant, 41.8%) in assessing patient positioning as symmetrical. The pulmonologist assistant (15.3%) most commonly reported parenchymal findings (radiologist, 2.2%; pulmonologist, 12.9%). Conclusion: It is necessary to reorganize the technical standards and exposure procedures to improve the quality of chest radiographs. Reappraisal of all interpreters and continuous training of technicians are required. PMID:28083054
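The abstract does not name its concordance statistic; Cohen's kappa is a standard choice for chance-corrected agreement between two observers rating the same films, and can be sketched as:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two observers over the same items:
    kappa = (observed agreement - expected agreement) / (1 - expected)."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa of 1 indicates perfect concordance; 0 indicates agreement no better than chance.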
[Relevance of big data for molecular diagnostics].
Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T
2018-04-01
Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageably vast data sets. What are the algorithms behind the big data discussion? In principle, high-throughput technologies in molecular research introduced big data, and the development and application of analysis tools, into the field of rheumatology some 15 years ago. This especially includes omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or the development of new ones. For these steps, structuring and evaluation according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economics or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data from the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, including in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven, individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.
Karliner, Leah S; Pérez-Stable, Eliseo J; Gildengorin, Ginny
2004-01-01
PURPOSE Provision of interpreter services for non-English-speaking patients is a federal requirement. We surveyed clinicians to describe their experience using interpreters. SUBJECTS AND METHODS In this cross-sectional study we surveyed clinicians in three academic outpatient settings in San Francisco (N = 194) regarding their most recent patient encounter that involved an interpreter. Questions about the visit covered the type of interpreter, satisfaction with the content of the clinical encounter, potential problems, and frequency of need. Respondents were also asked about previous training in interpreter use, languages spoken, and demographics. Questionnaires were self-administered in approximately 10 minutes. RESULTS Of 194 questionnaires mailed, 158 were completed (81% response rate) and 67% were from resident physicians. Most respondents (78%) were very satisfied or satisfied with the medical care they provided, and 85% were satisfied with their ability to diagnose and treat disease, but only 45% were satisfied with their ability to empower patients with knowledge about their disease, treatment, or medication. Even though 71% felt they were able to make a personal connection with their patient, only 33% felt they had learned about another culture as a result of the encounter. Clinicians reported difficulties eliciting exact symptoms (70%), explaining treatments (44%), and eliciting treatment preferences (51%). Clinicians perceived that lack of knowledge of a patient's culture hindered their ability to provide quality medical care, although only 18% felt they were unable to establish trust or rapport. Previous training in interpreter use was associated with increased use of professional interpreters (odds ratio [OR], 3.2; 95% confidence interval [CI], 1.4 to 7.5) and increased satisfaction with medical care provided (OR, 2.6; 95% CI, 1.1 to 6.6). 
CONCLUSIONS Clinicians reported communication difficulties affecting their ability to understand symptoms and treat disease, as well as their ability to empower patients regarding their healthcare. Training in the use of interpreters may improve communication and clinical care, and thus health outcomes. PMID:15009797
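Odds ratios like those above are typically computed from a 2x2 table with a log-method (Woolf) confidence interval; a sketch with hypothetical counts chosen only for illustration, not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI (Woolf/log method) for a 2x2 table:
    a = trained & used professional interpreter, b = trained & did not,
    c = untrained & used professional interpreter, d = untrained & did not.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (the paper reports OR 3.2; these are not its data).
print(tuple(round(x, 2) for x in odds_ratio_ci(30, 20, 25, 55)))  # (3.3, 1.58, 6.9)
```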
NASA Astrophysics Data System (ADS)
Black, S.; Hynek, B. M.; Kierein-Young, K. S.; Avard, G.; Alvarado-Induni, G.
2015-12-01
Proper characterization of mineralogy is an essential part of geologic interpretation. This process becomes even more critical when attempting to interpret the history of a region remotely, via satellites and/or landed spacecraft. Orbiters and landed missions to Mars carry with them a wide range of analytical tools to aid in the interpretation of Mars' geologic history. However, many instruments make a single type of measurement (e.g., APXS: elemental chemistry; XRD: mineralogy), and multiple data sets must be utilized to develop a comprehensive understanding of a sample. Hydrothermal alteration products often exist in intimate mixtures, and vary widely across a site due to changing pH, temperature, and fluid/gas chemistries. These characteristics require that we develop a detailed understanding regarding the possible mineral mixtures that may exist, and their detectability in different instrument data sets. This comparative analysis study utilized several analytical methods on existing or planned Mars rovers (XRD, Raman, LIBS, Mössbauer, and APXS) combined with additional characterization (thin section, VNIR, XRF, SEM-EMP) to develop a comprehensive suite of data for hydrothermal alteration products collected from Poás and Turrialba volcanoes in Costa Rica. Analyzing the same samples across a wide range of instruments allows for direct comparisons of results, and identification of instrumentation "blind spots." This provides insight into the ability of in-situ analyses to comprehensively characterize sites on Mars exhibiting putative hydrothermal characteristics, such as the silica and sulfate deposits at Gusev crater [e.g., Squyres et al., 2008], as well as valuable information for future mission planning and data interpretation. References: Squyres et al. (2008), Detection of Silica-Rich Deposits on Mars, Science, 320, 1063-1067, doi:10.1126/science.1155429.
Denman, A R; Crockett, R G M; Groves-Kirkby, C J; Phillips, P S
2016-10-01
Radon gas is naturally occurring, and can concentrate in the built environment. It is radioactive, and high concentration levels within buildings, including homes, have been shown to increase the risk of lung cancer in the occupants. As a result, several methods have been developed to measure radon. The long-term average radon level determines the risk to occupants, but there is always pressure to complete measurements more quickly, particularly when buying and selling the home. For many years, the three-month exposure using etched-track detectors has been the de facto standard, but a decade ago, Phillips et al. (2003), in a DEFRA-funded project, evaluated the use of 1-week and 1-month measurements. They found that the measurement methods were accurate, but the challenge lay in the wide variation in radon levels, with diurnal, seasonal, and other patterns due to climatic factors and room use. In the report on this work, and in subsequent papers, the group proposed methodologies for 1-week, 1-month and 3-month measurements and their interpretation. Other work, however, has suggested that 2-week exposures were preferable to 1-week ones. In practice, the radon remediation industry uses a range of exposure times, and further guidance is required to help interpret these results. This paper reviews the data from this study and a subsequent 4-year study of 4 houses, re-analysing the results and extending them to other exposures, particularly for 2-week and 2-month exposures, and provides comprehensive guidance for the use of etched-track detectors, the value and use of Seasonal Correction Factors (SCFs), the uncertainties in short- and medium-term exposures and the interpretation of results. Copyright © 2016 Elsevier Ltd. All rights reserved.
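Applying a Seasonal Correction Factor amounts to multiplying a short-term measurement by a month-dependent factor to estimate the annual mean; a minimal sketch with made-up placeholder factors, not the published UK SCF tables:

```python
# Illustrative seasonal correction: a short-term radon measurement is
# multiplied by a factor depending on when the exposure started, to
# estimate the annual mean. These factors are placeholders only.
SCF = {"Jan": 0.80, "Apr": 1.00, "Jul": 1.25, "Oct": 0.95}

def annual_mean_estimate(measured_bq_m3, start_month):
    """Scale a short-term result (Bq/m3) to an annual-mean estimate."""
    return measured_bq_m3 * SCF[start_month]

# A 1-month exposure of 160 Bq/m3 started in July: summer levels run
# low, so the factor scales the result up.
print(annual_mean_estimate(160, "Jul"))  # 200.0
```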
Prevalence of Mixed-Methods Sampling Designs in Social Science Research
ERIC Educational Resources Information Center
Collins, Kathleen M. T.
2006-01-01
The purpose of this mixed-methods study was to document the prevalence of sampling designs utilised in mixed-methods research and to examine the interpretive consistency between interpretations made in mixed-methods studies and the sampling design used. Classification of studies was based on a two-dimensional mixed-methods sampling model. This…
Education of Women in Islam: A Critical Islamic Interpretation of the Quran
ERIC Educational Resources Information Center
Abukari, Abdulai
2014-01-01
In Islam, knowledge, its acquisition and application is a fundamental requirement for all Muslims to enable them to believe, think, and act according to the principles of the religion. However, differences in style of interpretation of the Qur'an have led to text being interpreted against its own fundamental worldview; an example is the…
What information on measurement uncertainty should be communicated to clinicians, and how?
Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea
2018-02-02
The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the right interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratories accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even if MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information that should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients), it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe how MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the laboratory's efforts to provide more accurate and reliable results with a more objective tool for physicians' interpretation of them. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
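The usual (GUM-style) way to obtain the MU figure reported alongside a result is to combine independent standard-uncertainty components in quadrature and expand by a coverage factor; a minimal sketch with hypothetical components:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine independent standard-uncertainty components in quadrature
    (root-sum-of-squares) and expand with coverage factor k (k=2 gives
    roughly 95% coverage)."""
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# Hypothetical components for a glucose result (mmol/L): imprecision,
# calibrator uncertainty, bias correction.
U = expanded_uncertainty([0.08, 0.05, 0.06])
print(round(U, 3))  # 0.224
```

The report would then carry the result as value ± U, e.g. 5.4 ± 0.22 mmol/L.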
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national-highway runoff and urban-stormwater data including flows, concentrations and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To properly apply the correct model, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying variables being used to analyze data may determine which statistical methods are appropriate for data analysis. 
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process. In order to deal with uncertainty problems, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
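The logarithmic-transformation point above can be illustrated with an ordinary least-squares fit in log-log space, a common form for load-versus-flow relations; the variable names and synthetic data are illustrative only:

```python
import math

def fit_log_log(x, y):
    """Least-squares fit of log10(y) = b0 + b1*log10(x); returns (b0, b1).
    Fitting in log space suits the log-normally distributed runoff data
    described above."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
          / sum((a - mx) ** 2 for a in lx))
    b0 = my - b1 * mx
    return b0, b1

# Synthetic flow/load pairs following load = 2 * flow^1.5 exactly,
# so the fit should recover coefficient 2 and exponent 1.5.
flows = [1, 10, 100, 1000]
loads = [2 * f ** 1.5 for f in flows]
b0, b1 = fit_log_log(flows, loads)
print(round(10 ** b0, 3), round(b1, 3))  # 2.0 1.5
```

With real data, the scatterplot inspection and regression diagnostics the abstract calls for would follow this fitting step.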
On the role of entailment patterns and scalar implicatures in the processing of numerals
Panizza, Daniele; Chierchia, Gennaro; Clifton, Charles
2009-01-01
There has been much debate, in both the linguistics and the psycholinguistics literature, concerning numbers and the interpretation of number-denoting determiners ('numerals'). Such debate concerns, in particular, the nature and distribution of upper-bounded ('exact') interpretations vs. lower-bounded ('at-least') construals. In the present paper we show that the interpretation and processing of numerals are affected by the entailment properties of the context in which they occur. Experiment 1 established off-line preferences using a questionnaire. Experiment 2 investigated the processing issue through an eye-tracking experiment using a silent reading task. Our results show that the upper-bounded interpretation of numerals occurs more often in an upward-entailing context than in a downward-entailing context. Reading times of the numeral itself were longer when it was embedded in an upward-entailing context than when it was not, indicating that processing resources were required when the context triggered an upper-bounded interpretation. However, reading of a following context that required an upper-bounded interpretation triggered more regressions towards the numeral when it had occurred in a downward-entailing context than in an upward-entailing one. Such findings show that speakers' interpretation and processing of numerals are systematically affected by the polarity of the sentence in which they occur, and support the hypothesis that the upper-bounded interpretation of numerals is due to a scalar implicature. PMID:20161494
Interpreting Medical Information Using Machine Learning and Individual Conditional Expectation.
Nohara, Yasunobu; Wakata, Yoshifumi; Nakashima, Naoki
2015-01-01
Recently, machine-learning techniques have spread to many fields. However, machine learning is still not popular in the medical research field because its models are difficult to interpret. In this paper, we introduce a method of interpreting medical information using a machine-learning technique. The method gives a new explanation of the partial dependence plot and the individual conditional expectation plot for the medical research field.
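An individual conditional expectation (ICE) plot sweeps one feature over a grid for each instance while holding the other features fixed, yielding one prediction curve per instance; a minimal sketch with a toy model (the model and feature names are hypothetical, not from the paper):

```python
def ice_curves(model, rows, feature_index, grid):
    """Individual conditional expectation: for each row, vary one feature
    over `grid` with the other features fixed, recording the model's
    prediction. Returns one curve (list of predictions) per row."""
    curves = []
    for row in rows:
        curve = []
        for value in grid:
            modified = list(row)
            modified[feature_index] = value
            curve.append(model(modified))
        curves.append(curve)
    return curves

# Toy risk model: risk rises with age (feature 0) and is boosted by a
# comorbidity flag (feature 1). Purely illustrative.
model = lambda x: 0.01 * x[0] + 0.2 * x[1]
patients = [[50, 0], [50, 1]]
curves = ice_curves(model, patients, 0, [40, 60, 80])
print([[round(v, 2) for v in c] for c in curves])  # [[0.4, 0.6, 0.8], [0.6, 0.8, 1.0]]
```

Averaging the ICE curves over all rows gives the corresponding partial dependence curve.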
NASA Astrophysics Data System (ADS)
Riveiro, B.; DeJong, M.; Conde, B.
2016-06-01
Despite the tremendous advantages of the laser scanning technology for the geometric characterization of built constructions, there are important limitations preventing more widespread implementation in the structural engineering domain. Even though the technology provides extensive and accurate information to perform structural assessment and health monitoring, many people are resistant to the technology due to the processing times involved. Thus, new methods that can automatically process LiDAR data and subsequently provide an automatic and organized interpretation are required. This paper presents a new method for fully automated point cloud segmentation of masonry arch bridges. The method efficiently creates segmented, spatially related and organized point clouds, which each contain the relevant geometric data for a particular component (pier, arch, spandrel wall, etc.) of the structure. The segmentation procedure comprises a heuristic approach for the separation of different vertical walls, and later image processing tools adapted to voxel structures allows the efficient segmentation of the main structural elements of the bridge. The proposed methodology provides the essential processed data required for structural assessment of masonry arch bridges based on geometric anomalies. The method is validated using a representative sample of masonry arch bridges in Spain.
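A first step toward the voxel structures mentioned above is bucketing the point cloud into a regular grid; a minimal sketch (the voxel size and points are illustrative):

```python
from collections import defaultdict

def voxelize(points, size):
    """Bucket 3-D points into a voxel grid keyed by integer cell indices,
    the kind of structure the segmentation pipeline operates on."""
    grid = defaultdict(list)
    for p in points:
        key = tuple(int(c // size) for c in p)
        grid[key].append(p)
    return grid

# Four points, 1.0 m voxels: the first two share a voxel.
pts = [(0.2, 0.3, 0.1), (0.7, 0.9, 0.4), (1.5, 0.2, 0.0), (3.1, 2.2, 0.5)]
g = voxelize(pts, 1.0)
print(len(g), len(g[(0, 0, 0)]))  # 3 2
```

Image-processing operations (connected components, morphological filters) can then run on the occupied cells rather than on millions of raw points.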
Milreu, Paulo Vieira; Klein, Cecilia Coimbra; Cottret, Ludovic; Acuña, Vicente; Birmelé, Etienne; Borassi, Michele; Junot, Christophe; Marchetti-Spaccamela, Alberto; Marino, Andrea; Stougie, Leen; Jourdan, Fabien; Crescenzi, Pierluigi; Lacroix, Vincent; Sagot, Marie-France
2014-01-01
The increasing availability of metabolomics data makes it possible to better understand the metabolic processes involved in the immediate response of an organism to environmental changes and stress. The data usually come in the form of a list of metabolites whose concentrations significantly changed under some conditions, and are thus not easy to interpret without being able to precisely visualize how such metabolites are interconnected. We present a method that organizes the data from any metabolomics experiment into metabolic stories. Each story corresponds to a possible scenario explaining the flow of matter between the metabolites of interest. These scenarios may then be ranked in different ways depending on which interpretation one wishes to emphasize for the causal link between two affected metabolites: enzyme activation, enzyme inhibition or domino effect on the concentration changes of substrates and products. Equally probable stories under any selected ranking scheme can be further grouped into a single anthology that summarizes, in a unique subnetwork, all equivalently plausible alternative stories. An anthology is simply a union of such stories. We detail an application of the method to the response of yeast to cadmium exposure. We use this system as a proof of concept for our method, and we show that we are able to find a story that reproduces very well the current knowledge about the yeast response to cadmium. We further show that this response is mostly based on enzyme activation. We also provide a framework for exploring the alternative pathways or side effects this local response is expected to have in the rest of the network. We discuss several interpretations for the changes we see, and we suggest hypotheses that could in principle be experimentally tested. Noticeably, our method requires simple input data and could be used in a wide variety of applications. 
The code for the method presented in this article is available at http://gobbolino.gforge.inria.fr.
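A toy analogue of enumerating candidate "stories" is listing the directed simple paths between two affected metabolites; the mini-network below is hypothetical, not the paper's yeast data:

```python
def simple_paths(graph, source, target, path=None):
    """Enumerate directed simple paths; each path is one candidate
    scenario linking two metabolites of interest."""
    path = (path or []) + [source]
    if source == target:
        return [path]
    found = []
    for nxt in graph.get(source, []):
        if nxt not in path:  # simple paths only: no revisits
            found.extend(simple_paths(graph, nxt, target, path))
    return found

# Hypothetical mini-network with two alternative routes from a stressor
# to a downstream product (node names are illustrative).
net = {"Cd": ["GSH", "MT"], "GSH": ["GS-Cd"], "MT": ["GS-Cd"]}
print(simple_paths(net, "Cd", "GS-Cd"))
```

The union of equally ranked paths would correspond to the paper's notion of an anthology.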
Interpretation and presentation of results. Chickens will come home to roost.
Beattie, A; Donovan, B; Mant, A; Bridges-Webb, C
1984-06-01
Collecting information is an obsessional activity which can become an end in itself. Interpreting and presenting results in a way that others can learn from them requires reflection, selectivity and the ability to accept criticism. In selecting the journal to which the article will be submitted, consider which has the most appropriate readership, and ensure the article conforms to the requirements and style of that journal.
Karemere, H; Kahindo, J B; Ribesse, N; Macq, J
2013-01-01
Because hospitals are complex enterprises requiring adaptive systems, it is appropriate to apply the theory and terminology of governance, or better yet adaptive governance, to the interpretation of their management. This study focused on understanding hospital governance in Logo, Bunia, and Katana, three hospitals in two regions of the eastern DRC, which has been characterized by intermittent armed conflict since 1996. In such a context of war and continuous insecurity, how can governance be interpreted for hospitals required to adapt to a constantly changing environment to be able to continue to provide health care? The method was a critical interpretive synthesis of the literature, identified by searching for keywords related to governance. The concepts of governance, adaptive governance, performance, leadership, and complex adaptive systems are defined. The interpretation of the concepts helps us to better understand (1) the hospital as a complex adaptive system, (2) the governance of tertiary referral hospitals, (3) analysis of hospital performance, and (4) leadership for good governance of these hospitals. The interpretation of these concepts raises several questions about their application to the eastern DRC. Conclusion. This critical interpretive synthesis opens the door to a new way of exploring tertiary hospitals and their governance in the eastern DRC.
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence faces growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. Given that the most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE), exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
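The CPI calculation itself is straightforward: at each locus, square the summed population frequencies of all alleles observed in the mixture, then multiply across loci. A sketch with hypothetical allele frequencies:

```python
def cpi(locus_allele_freqs):
    """Combined Probability of Inclusion: per locus, CPI_locus is the
    squared sum of the frequencies of all alleles observed in the
    mixture; the combined value is the product across loci."""
    result = 1.0
    for freqs in locus_allele_freqs:
        result *= sum(freqs) ** 2
    return result

# Hypothetical two-locus mixture: alleles with frequencies
# 0.1, 0.2, 0.3 at locus 1 and 0.25, 0.25 at locus 2.
print(round(cpi([[0.1, 0.2, 0.3], [0.25, 0.25]]), 4))  # 0.09
```

Note that, per the protocol discussed in the article, loci with possible dropout must be excluded from the calculation rather than fed into this product.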
Overcoming Language Barriers in Health Care: Costs and Benefits of Interpreter Services
Jacobs, Elizabeth A.; Shepard, Donald S.; Suaya, Jose A.; Stone, Esta-Lee
2004-01-01
Objectives. We assessed the impact of interpreter services on the cost and the utilization of health care services among patients with limited English proficiency. Methods. We measured the change in delivery and cost of care provided to patients enrolled in a health maintenance organization before and after interpreter services were implemented. Results. Compared with English-speaking patients, patients who used the interpreter services received significantly more recommended preventive services, made more office visits, and had more prescriptions written and filled. The estimated cost of providing interpreter services was $279 per person per year. Conclusions. Providing interpreter services is a financially viable method for enhancing delivery of health care to patients with limited English proficiency. PMID:15117713
Convertini, Livia S.; Quatraro, Sabrina; Ressa, Stefania; Velasco, Annalisa
2015-01-01
Background. Even though the interpretation of natural language messages is generally conceived as the result of a conscious processing of the message content, the influence of unconscious factors is also well known. What is still insufficiently known is the way such factors work. We have tackled interpretation assuming it is a process, whose basic features are the same for the whole humankind, and employing a naturalistic approach (careful observation of phenomena in conditions the closest to “natural” ones, and precise description before and independently of data statistical analysis). Methodology. Our field research involved a random sample of 102 adults. We presented them with a complete real world-like case of written communication using unabridged message texts. We collected data (participants’ written reports on their interpretations) in controlled conditions through a specially designed questionnaire (closed and opened answers); then, we treated it through qualitative and quantitative methods. Principal Findings. We gathered some evidence that, in written message interpretation, between reading and the attribution of conscious meaning, an intermediate step could exist (we named it “disassembling”) which looks like an automatic reaction to the text words/expressions. Thus, the process of interpretation would be a discontinuous sequence of three steps having different natures: the initial “decoding” step (i.e., reading, which requires technical abilities), disassembling (the automatic reaction, an unconscious passage) and the final conscious attribution of meaning. If this is true, words and expressions would firstly function like physical stimuli, before being taken into account as symbols. Such hypothesis, once confirmed, could help explaining some links between the cultural (human communication) and the biological (stimulus-reaction mechanisms as the basis for meanings) dimension of humankind. PMID:26528419
Design and operation of internal dosimetry programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaBone, T.R.
1991-01-01
The proposed revision to USNRC 10 CFR 20 and the USDOE Order 5480.11 require intakes of radioactive material to be evaluated. Radiation dose limits are based on the sum of the effective dose equivalent from intakes and the whole-body dose from external sources. These significant changes in the regulations will require, at a minimum, a complete review of personnel monitoring programs to determine their adequacy. In this session we will review a systematic method of designing a routine personnel monitoring program that will comply with the requirements of the new regulations. Specific questions discussed are: (a) What are the goals and objectives of a routine personnel monitoring program? (b) When is a routine personnel monitoring program required? (c) What are the required capabilities of the routine personnel monitoring program? (d) What should be done with the information generated in a personnel monitoring program? Specific recommendations and interpretations are given in the session. 5 refs., 3 figs., 33 tabs.
Caldwell, Michelle; Dickerhoof, Erica; Hall, Anastasia; Odakura, Bryan; Fanchiang, Hsin-Chen
2014-01-01
Objective. To describe and analyze the potential use of games in the commercially available EyeToy Play and EyeToy Play 2 on required/targeted training skills and feedback provided for clinical application. Methods. A summary table including all games was created. Two movement experts naïve to the software validated required/targeted training skills and feedback for 10 randomly selected games. Ten healthy school-aged children played to further validate the required/targeted training skills. Results. All but two (muscular and cardiovascular endurance) had excellent agreement in required/targeted training skills, and there was 100% agreement on feedback. Children's performance in required/targeted training skills (number of unilateral reaches and bilateral reaches, speed, muscular endurance, and cardiovascular endurance) significantly differed between games (P < .05). Conclusion. EyeToy Play games could be used to train children's arm function. However, a careful evaluation of the games is needed since performance might not be consistent between players and therapists' interpretation. PMID:25610652
Cultural and Family Challenges to Managing Type 2 Diabetes in Immigrant Chinese Americans
Chesla, Catherine A.; Chun, Kevin M.; Kwan, Christine M.L.
2009-01-01
OBJECTIVE Although Asians demonstrate elevated levels of type 2 diabetes, little attention has been directed to their unique cultural beliefs and practices regarding diabetes. We describe cultural and family challenges to illness management in foreign-born Chinese American patients with type 2 diabetes and their spouses. RESEARCH DESIGN AND METHODS This was an interpretive comparative interview study with 20 foreign-born Chinese American couples (n = 40) living with type 2 diabetes. Multiple (six to seven) semistructured interviews with each couple in individual, group, and couple settings elicited beliefs about diabetes and narratives of care within the family and community. Interpretive narrative and thematic analysis were completed. A separate respondent group of 19 patients and spouses who met the inclusion criteria reviewed and confirmed the themes developed from the initial couples. RESULTS Cultural and family challenges to diabetes management within foreign-born Chinese American families included how 1) diabetes symptoms challenged family harmony, 2) dietary prescriptions challenged food beliefs and practices, and 3) disease management requirements challenged established family role responsibilities. CONCLUSIONS Culturally nuanced care with immigrant Chinese Americans requires attentiveness to the social context of disease management. Patients' and families' disease management decisions are seldom made independent of their concerns for family well-being, family face, and the reciprocal responsibilities required by varied family roles. Framing disease recommendations to include cultural concerns for balance and significant food rituals are warranted. PMID:19628812
Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data
NASA Astrophysics Data System (ADS)
Gao, Y.; Liu, R.; Liu, J.; Cheng, T.
2018-04-01
Remote sensing automatic interpretation symbol (RSAIS) is an inexpensive and fast method of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture storage method for massive data. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim of creating an efficient approach for RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China and has been updated annually since 2016. The RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It is also notable that it has the potential to support automatic image interpretation with the assistance of deep-learning technology in the remote sensing big data era.
Reasoning and Data Representation in a Health and Lifestyle Support System.
Hanke, Sten; Kreiner, Karl; Kropf, Johannes; Scase, Marc; Gossy, Christian
2017-01-01
Case-based reasoning and data interpretation is an artificial intelligence approach that capitalizes on past experience to solve current problems, and this can be used as a method for practical intelligent systems. Case-based data reasoning is able to provide decision support for experts and clinicians in health systems as well as lifestyle systems. In this project we focused on developing a solution for healthy ageing that considers daily activities, nutrition and cognitive activities. The data analysis of the reasoner followed state-of-the-art guidelines from clinical practice. Guidelines provide a general framework to guide clinicians and require background knowledge to become operational, which is precisely the kind of information recorded in practice cases; cases complement guidelines very well and help to interpret them. It is expected that the interest in case-based reasoning systems in the health.
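The retrieval step at the heart of case-based reasoning can be sketched as weighted nearest-neighbour matching over stored cases; the features and advice strings below are hypothetical, not the project's case base:

```python
def retrieve_nearest(cases, query, weights):
    """Weighted nearest-neighbour retrieval, the core step of case-based
    reasoning: return the stored case most similar to the new situation."""
    def distance(case):
        return sum(w * abs(a - b)
                   for w, a, b in zip(weights, case["features"], query))
    return min(cases, key=distance)

# Hypothetical case base: (daily steps in thousands, sleep hours,
# fruit/veg portions) -> previously validated advice.
cases = [
    {"features": (3, 6, 1), "advice": "increase daily activity"},
    {"features": (9, 8, 4), "advice": "maintain current habits"},
]
print(retrieve_nearest(cases, (4, 6, 2), [1.0, 1.0, 1.0])["advice"])  # increase daily activity
```

In a full CBR cycle, the retrieved case's solution would then be adapted to the new user and the outcome stored as a new case.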
Positron Annihilation Studies of High-Tc Superconductors
NASA Astrophysics Data System (ADS)
Peter, M.; Manuel, A. A.
1989-01-01
First we present the principles involved in the study of the two-photon momentum distribution: The method requires deconvolution of the positron wavefunction and the estimation of matrix elements effects. Single crystal samples must be of sufficient quality to avoid positron trapping (tested by positron lifetime measurements). In ordinary metals (alkalis, transition- and rare earth metals and compounds) two-photon momentum distribution studies have given results in close agreement with relevant band structure calculations. Discrepancies have been successfully described as enhancement effects due to correlations. In the superconducting oxides, measurements are more difficult because there are fewer conduction electrons and more trapping. Correlation effects of a different nature are expected to be important and might render the band picture inappropriate. Two-photon momentum distribution measurements have now been made by several groups, but have been interpreted in different ways. We relate the current state of affairs, and our present interpretation, to the latest available results.
Interpretation of link fluctuations in climate networks during El Niño periods
NASA Astrophysics Data System (ADS)
Martin, E. A.; Paczuski, M.; Davidsen, J.
2013-05-01
Recent work has shown that the topologies of functional climate networks are sensitive to El Niño events. One important interpretation of the findings was that parts of the globe act in correlated relationships which become weaker, on average, during El Niño periods (this was shown using monthly averaged data, where no time lag is required, and with daily averaged data, where time lags were utilized). In contrast, we show that El Niño periods actually exhibit higher correlations than "normal" climate conditions, while typically having lower correlations than La Niña periods. We also show that it is crucial to establish the sensitivity and robustness of any method used to extract functional climate networks; parameters such as time lags can significantly influence and even totally alter the outcome.
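The links in such functional climate networks are typically scored by the maximum of a (possibly lagged) cross-correlation between two time series. A minimal stdlib-only sketch of that construction, purely illustrative and not the authors' code; the maximum lag is an assumed free parameter, which is exactly the kind of methodological choice the abstract warns can alter the outcome:

```python
import math

def pearson(x, y):
    # Plain Pearson correlation of two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def max_lagged_corr(x, y, max_lag):
    # Link strength as the maximum absolute correlation over candidate
    # time lags of y relative to x (lags 0 .. max_lag).
    best = 0.0
    for lag in range(max_lag + 1):
        xs = x[:len(x) - lag] if lag else x
        c = pearson(xs, y[lag:])
        best = max(best, abs(c))
    return best
```

A series and a copy of it delayed by two steps reach correlation 1 at lag 2, so the lagged maximum recovers the link that a lag-0 correlation would understate.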
Translation of Genotype to Phenotype by a Hierarchy of Cell Subsystems.
Yu, Michael Ku; Kramer, Michael; Dutkowski, Janusz; Srivas, Rohith; Licon, Katherine; Kreisberg, Jason; Ng, Cherie T; Krogan, Nevan; Sharan, Roded; Ideker, Trey
2016-02-24
Accurately translating genotype to phenotype requires accounting for the functional impact of genetic variation at many biological scales. Here we present a strategy for genotype-phenotype reasoning based on existing knowledge of cellular subsystems. These subsystems and their hierarchical organization are defined by the Gene Ontology or a complementary ontology inferred directly from previously published datasets. Guided by the ontology's hierarchical structure, we organize genotype data into an "ontotype," that is, a hierarchy of perturbations representing the effects of genetic variation at multiple cellular scales. The ontotype is then interpreted using logical rules generated by machine learning to predict phenotype. This approach substantially outperforms previous, non-hierarchical methods for translating yeast genotype to cell growth phenotype, and it accurately predicts the growth outcomes of two new screens of 2,503 double gene knockouts impacting DNA repair or nuclear lumen. Ontotypes also generalize to larger knockout combinations, setting the stage for interpreting the complex genetics of disease.
Test techniques for model development of repetitive service energy storage capacitors
NASA Astrophysics Data System (ADS)
Thompson, M. C.; Mauldin, G. H.
1984-03-01
The performance of the Sandia perfluorocarbon family of energy storage capacitors was evaluated. The capacitors have a much lower charge noise signature creating new instrumentation performance goals. Thermal response to power loading and the importance of average and spot heating in the bulk regions require technical advancements in real time temperature measurements. Reduction and interpretation of thermal data are crucial to the accurate development of an intelligent thermal transport model. The thermal model is of prime interest in the high repetition rate, high average power applications of power conditioning capacitors. The accurate identification of device parasitic parameters has ramifications in both the average power loss mechanisms and peak current delivery. Methods to determine the parasitic characteristics and their nonlinearities and terminal effects are considered. Meaningful interpretations for model development, performance history, facility development, instrumentation, plans for the future, and present data are discussed.
Wilhelm, Dalit; Zlotnick, Cheryl
2014-07-01
Two tools were created to help international students better understand culture by becoming more astute observers of nonverbal behaviors, particularly behaviors depicting emotions among Norwegian students. The two tools were a trilingual list of words illustrating emotions and an exercise with images for practicing verbalizing observations of emotional expression. Students compared the subdued behaviors of Norwegians with the very vivid behaviors of Israelis. The intense emotional expression of Israelis influenced their interpretations. By making comparisons and through the experiences with Israelis, they learned more about culture and their own emotional expression. Creative strategies can contribute to students' understanding of, and reflection on, patients in a different culture. Encouraging students to grasp the nuances of emotional expression is part of understanding a different culture. Students, like faculty, learn that self-exploration is an evolving process that requires checking one's assumptions and interpretations. © The Author(s) 2013.
McLaughlin, L; McConnell, J; McFadden, S; Bond, R; Hughes, C
2017-11-01
This systematic review aimed to determine the strength of evidence available in the literature on the effect of training to develop the skills required by radiographers to interpret plain radiography chest images. Thirteen articles were included in the review. Sample size varied from one reporting radiographer to 148 radiography students/experienced radiographers. The quality of the articles achieved a mean score of 7.5/10, indicating that the evidence is strong and the quality of studies in this field is high. Investigative approaches included audit of participants' performance in clinical practice after formal training, evaluation of informal training, and the impact of short feedback sessions on performance. All studies demonstrated positive effects on user performance. Using a combination of training techniques can help maximise learning and accommodate those with different preferred learning styles. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
Gassner, Christoph; Rainer, Esther; Pircher, Elfriede; Markut, Lydia; Körmöczi, Günther F.; Jungbauer, Christof; Wessin, Dietmar; Klinghofer, Roswitha; Schennach, Harald; Schwind, Peter; Schönitzer, Diether
2009-01-01
Summary Background Validations of routinely used serological typing methods require intense performance evaluations, typically including large numbers of samples, before routine application. However, such evaluations could be improved by considering information about the frequency of standard blood groups and their variants. Methods Using RHD and ABO population genetic data, a Caucasian-specific donor panel was compiled for a performance comparison of three RhD and ABO serological typing methods: MDmulticard (Medion Diagnostics), ID-System (DiaMed) and ScanGel (Bio-Rad). The final test panel included standard and variant RHD and ABO genotypes, e.g. RhD categories, partial and weak RhDs, RhD DELs, and ABO samples, mainly to interpret weak serological reactivity for blood group A specificity. All samples were from individuals recorded in our local DNA blood group typing database. Results For 'standard' blood groups, performance results were clearly interpretable for all three serological methods compared. However, when focusing on specific variant phenotypes, pronounced differences in reaction strengths and specificities were observed between them. Conclusions A genetically and ethnically predefined donor test panel consisting of only 93 individual samples delivered highly significant results for serological performance comparisons. Such small panels offer impressive representative power, higher than that achievable by statistical chance and large numbers alone. PMID:21113264
NASA Astrophysics Data System (ADS)
Vasantrao, Baride Mukund; Bhaskarrao, Patil Jitendra; Mukund, Baride Aarti; Baburao, Golekar Rushikesh; Narayan, Patil Sanjaykumar
2017-12-01
The area chosen for the present study is Dhule district, a drought-prone area of Maharashtra State, India. Dhule district suffers from water scarcity, and therefore no extra water is available to supply agricultural and industrial growth. To understand the lithological characters in terms of hydro-geological conditions, it is necessary to understand the geology of the area. It is now an established fact that geophysical methods give better information about subsurface geology. Geophysical electrical surveys with four-electrode configurations, i.e., the Wenner and Schlumberger methods, were carried out at the same selected sites to observe their similarity and to compare the two in terms of use and handling in the field. A total of 54 VES soundings were carried out across the Dhule district, representing different lithological units. The VES curves were drawn using the inverse slope method for the Wenner configuration, while IPI2win software and curve matching techniques were used for the Schlumberger configuration. Region-wise lithologs were prepared based on the resistivities and thicknesses obtained with the Wenner method, and region-wise curves were prepared based on resistivity layers for the Schlumberger method. Comparing the two methods, it is observed that both have merits and demerits. From the field point of view, the Wenner inverse slope method is handier for calculation and interpretation but requires a long lateral spread, which is a constraint; the Schlumberger method is easier to apply but more unwieldy to interpret. The work amply proves the applicability of geophysical techniques in the water resource evaluation procedure. The technique is suitable for areas with a similar geological setup elsewhere.
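In both configurations the measured voltage and current reduce to an apparent resistivity through a geometric factor that depends on the electrode spacing. The standard textbook factors can be sketched as follows; this is illustrative arithmetic only, not the study's own processing (which used the inverse slope method, IPI2win software and curve matching):

```python
import math

def wenner_rho_a(a, dV, I):
    # Wenner array: four collinear electrodes at equal spacing a.
    # Geometric factor K = 2 * pi * a, so rho_a = K * (dV / I).
    return 2.0 * math.pi * a * dV / I

def schlumberger_rho_a(AB2, MN2, dV, I):
    # Schlumberger array: AB/2 is the half-spacing of the current
    # electrodes, MN/2 the half-spacing of the potential electrodes.
    K = math.pi * (AB2 ** 2 - MN2 ** 2) / (2.0 * MN2)
    return K * dV / I
```

For example, a Wenner spread with a = 10 m measuring 0.5 V at 1 A gives an apparent resistivity of 2π · 10 · 0.5 ≈ 31.4 Ω·m.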
Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2013-02-01
Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modeling. In this paper we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modeling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalization property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally very efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analyzed on two real-world case studies (Marina catchment (Singapore) and Canning River (Western Australia)) representing two different morphoclimatic contexts, comparatively with other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of input variables they provide can be given a physically meaningful interpretation.
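The core Extra-Trees idea, choosing split features and cut-points at random and relying on ensemble averaging rather than split optimization, can be shown with a deliberately miniature sketch. This uses single-split "stumps" on synthetic data rather than the full trees of the actual algorithm, so it is a toy illustration of the randomization principle, not the method used in the paper:

```python
import random

def global_mean(y):
    return sum(y) / len(y)

def fit_random_stump(X, y):
    # Extra-Trees idea in miniature: pick the split feature and the
    # cut-point at random instead of searching for the optimal ones.
    f = random.randrange(len(X[0]))
    col = [row[f] for row in X]
    t = random.uniform(min(col), max(col))
    left = [yi for row, yi in zip(X, y) if row[f] <= t]
    right = [yi for row, yi in zip(X, y) if row[f] > t]
    fallback = global_mean(y)  # used when a side happens to be empty
    lmean = global_mean(left) if left else fallback
    rmean = global_mean(right) if right else fallback
    return f, t, lmean, rmean

def predict(ensemble, row):
    # Averaging over many random stumps tames the high variance of the
    # individual, fully randomized learners.
    votes = [lm if row[f] <= t else rm for f, t, lm, rm in ensemble]
    return sum(votes) / len(votes)

random.seed(0)
X = [[i / 10.0] for i in range(10)]   # one input feature, values 0.0 .. 0.9
y = [row[0] for row in X]             # target equals the input
ensemble = [fit_random_stump(X, y) for _ in range(200)]
low, high = predict(ensemble, [0.05]), predict(ensemble, [0.95])
```

Despite every split being random, the averaged ensemble still recovers the increasing trend in the data (prediction at 0.95 exceeds that at 0.05).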
Assessing the predictive capability of randomized tree-based ensembles in streamflow modelling
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2013-07-01
Combining randomization methods with ensemble prediction is emerging as an effective option to balance accuracy and computational efficiency in data-driven modelling. In this paper, we investigate the prediction capability of extremely randomized trees (Extra-Trees), in terms of accuracy, explanation ability and computational efficiency, in a streamflow modelling exercise. Extra-Trees are a totally randomized tree-based ensemble method that (i) alleviates the poor generalisation property and tendency to overfitting of traditional standalone decision trees (e.g. CART); (ii) is computationally efficient; and (iii) allows one to infer the relative importance of the input variables, which might help in the ex-post physical interpretation of the model. The Extra-Trees potential is analysed on two real-world case studies - Marina catchment (Singapore) and Canning River (Western Australia) - representing two different morphoclimatic contexts. The evaluation is performed against other tree-based methods (CART and M5) and parametric data-driven approaches (ANNs and multiple linear regression). Results show that Extra-Trees perform comparably to the best of the benchmarks (i.e. M5) in both watersheds, while outperforming the other approaches in terms of computational requirements when adopted on large datasets. In addition, the ranking of input variables they provide can be given a physically meaningful interpretation.
NASA Astrophysics Data System (ADS)
Wilcox, Dawn Renee
This dissertation examined elementary teachers' beliefs and perceptions of effective science instruction and documented how these teachers interpreted and implemented a model for Inquiry-Based (I-B) science in their classrooms. The study chronicled a group of teachers working in a large public school division and documented how they interpreted and implemented reform-based science methods after participating in a professional development course on I-B science methods administered by the researcher. I-B science teaching and its implementation are discussed as one potential method of addressing the current call for national education reform to meet the increasing need for all students to achieve scientific literacy, and the role of teachers in that effort. The conviction behind science reform efforts is that all students are able to learn science and consequently must be given crucial opportunities, in the right environment, that permit optimal science learning in our nation's schools. Following this group of teachers as they attempted to deliver I-B science teaching revealed the challenges elementary science teachers face and the professional supports necessary for them to effectively meet science standards. This dissertation serves as partial fulfillment of the requirements for the degree of Doctor of Philosophy in Education at George Mason University.
Laboratory diagnosis of Chlamydia pneumoniae infections
Peeling, Rosanna W
1995-01-01
Chlamydia pneumoniae is an important cause of respiratory illness. There is a need for accurate and rapid laboratory diagnostic methods that will lead to improved patient care, appropriate use of antimicrobial therapy and a better understanding of the epidemiology of this emerging pathogen. Culture is highly specific but is technically demanding, expensive, has a long turnaround time and its sensitivity is highly dependent on transport conditions. Antigen detection tests such as enzyme immunoassay and direct fluorescent antibody assay, and molecular detection methods such as the polymerase chain reaction assay, may provide a rapid diagnosis without the requirement for stringent transport conditions. The results of these tests should be interpreted with caution until more thorough evaluation is available. Serology remains the method of choice. The limitations of different serological methods for the laboratory diagnosis of C pneumoniae are discussed. PMID:22514397
Unregulated Autonomy: Uncredentialed Educational Interpreters in Rural Schools.
Fitzmaurice, Stephen
2017-01-01
Although many rural Deaf and Hard of Hearing students attend public schools most of the day and use the services of educational interpreters to gain access to the school environment, little information exists on what interpreters are doing in rural school systems in the absence of credentialing requirements. The researcher used ethnographic interviews and field observations of three educational interpreters with no certification or professional assessment to explore how uncredentialed interpreters were enacting their role in a rural high school. The findings indicate that uncredentialed interpreters in rural settings perform four major functions during their school day: preparing the environment, staff, and materials; interpreting a variety of content; interacting with numerous stakeholders; and directly instructing Deaf and Hard of Hearing students. Generally, educational interpreters in rural districts operate with unregulated autonomy, a situation that warrants further research and a national standard for all educational interpreters.
Giske, Christian G.; Haldorsen, Bjørg; Matuschek, Erika; Schønning, Kristian; Leegaard, Truls M.; Kahlmeter, Gunnar
2014-01-01
Different antimicrobial susceptibility testing methods to detect low-level vancomycin resistance in enterococci were evaluated in a Scandinavian multicenter study (n = 28). A phenotypically and genotypically well-characterized diverse collection of Enterococcus faecalis (n = 12) and Enterococcus faecium (n = 18) strains with and without nonsusceptibility to vancomycin was examined blindly in Danish (n = 5), Norwegian (n = 13), and Swedish (n = 10) laboratories using the EUCAST disk diffusion method (n = 28) and the CLSI agar screen (n = 18) or the Vitek 2 system (bioMérieux) (n = 5). The EUCAST disk diffusion method (very major error [VME] rate, 7.0%; sensitivity, 0.93; major error [ME] rate, 2.4%; specificity, 0.98) and CLSI agar screen (VME rate, 6.6%; sensitivity, 0.93; ME rate, 5.6%; specificity, 0.94) performed significantly better (P = 0.02) than the Vitek 2 system (VME rate, 13%; sensitivity, 0.87; ME rate, 0%; specificity, 1). The performance of the EUCAST disk diffusion method was challenged by differences in vancomycin inhibition zone sizes as well as the experience of the personnel in interpreting fuzzy zone edges as an indication of vancomycin resistance. Laboratories using Oxoid agar (P < 0.0001) or Merck Mueller-Hinton (MH) agar (P = 0.027) for the disk diffusion assay performed significantly better than did laboratories using BBL MH II medium. Laboratories using Difco brain heart infusion (BHI) agar for the CLSI agar screen performed significantly better (P = 0.017) than did those using Oxoid BHI agar. In conclusion, both the EUCAST disk diffusion and CLSI agar screening methods performed acceptably (sensitivity, 0.93; specificity, 0.94 to 0.98) in the detection of VanB-type vancomycin-resistant enterococci with low-level resistance. Importantly, use of the CLSI agar screen requires careful monitoring of the vancomycin concentration in the plates. Moreover, disk diffusion methodology requires that personnel be trained in interpreting zone edges. 
PMID:24599985
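The error rates and sensitivities quoted above are related by simple complements: the very major error (VME) rate is computed over resistant isolates and the major error (ME) rate over susceptible ones. A sketch of that arithmetic; the counts below are hypothetical, since the study's exact denominators are not given in the abstract:

```python
def error_rates(vme, me, n_resistant, n_susceptible):
    # Very major error (VME): a resistant isolate reported susceptible.
    # Major error (ME): a susceptible isolate reported resistant.
    vme_rate = vme / n_resistant
    me_rate = me / n_susceptible
    sensitivity = 1.0 - vme_rate   # resistant isolates correctly flagged
    specificity = 1.0 - me_rate    # susceptible isolates correctly cleared
    return vme_rate, me_rate, sensitivity, specificity

# Hypothetical counts chosen to reproduce the disk diffusion figures
# reported above (VME 7.0% / sensitivity 0.93, ME 2.4% / specificity ~0.98).
rates = error_rates(vme=7, me=2, n_resistant=100, n_susceptible=100)
```

With these illustrative counts the function returns a 7% VME rate, 2% ME rate, sensitivity 0.93 and specificity 0.98, matching the complements used in the study's reporting.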
NASA Astrophysics Data System (ADS)
Utama, M. Iqbal Bakti; Lu, Xin; Zhan, Da; Ha, Son Tung; Yuan, Yanwen; Shen, Zexiang; Xiong, Qihua
2014-10-01
Patterning two-dimensional materials into specific spatial arrangements and geometries is essential for both fundamental studies of materials and practical applications in electronics. However, the currently available patterning methods generally require etching steps that rely on complicated and expensive procedures. We report here a facile patterning method for atomically thin MoSe2 films using stripping with an SU-8 negative resist layer exposed to electron beam lithography. Additional steps of chemical and physical etching were not necessary in this SU-8 patterning method. The SU-8 patterning was used to define a ribbon channel from a field effect transistor of MoSe2 film, which was grown by chemical vapor deposition. The narrowing of the conduction channel area with SU-8 patterning was crucial in suppressing the leakage current within the device, thereby allowing a more accurate interpretation of the electrical characterization results from the sample. An electrical transport study, enabled by the SU-8 patterning, showed a variable range hopping behavior at high temperatures. Electronic supplementary information (ESI) available: Further experiments on patterning and additional electrical characterization data. See DOI: 10.1039/c4nr03817g
2014-01-01
Background Objective physical assessment of patients with lumbar spondylosis involves plain film radiograph (PFR) viewing and interpretation by radiologists. Physiotherapists also routinely assess PFR within the scope of their practice. However, studies appraising the level of agreement of physiotherapists' PFR interpretation with that of radiologists are not common in Ghana. Method Forty-one (41) physiotherapists took part in the cross-sectional survey. An assessment guide was developed from a radiologist's interpretation of three PFR of patients with lumbar spondylosis. The three PFR were selected from a pool of different radiographs based on clarity, common visible pathological features, coverage of body segments and a short post-production period. Physiotherapists were required to view the same PFR, after which they were assessed with the assessment guide according to the number of features identified correctly or incorrectly. The score range on the assessment form was 0–24, interpreted as follows: 0–8 points (low), 9–16 points (moderate) and 17–24 points (high) levels of agreement. Data were analyzed using the one-sample t-test and Fisher's exact test at α = 0.05. Results The mean score of interpretation for the physiotherapists was 12.7 ± 2.6 points, compared to the radiologist's interpretation of 24 points (assessment guide). The physiotherapists' levels of agreement were significantly associated with their academic qualification (p = 0.006) and sex (p = 0.001). However, their levels of agreement were not significantly associated with their age group (p = 0.098), work settings (p = 0.171), experience (p = 0.666), preferred PFR view (p = 0.088) or continuing education (p = 0.069). Conclusions The physiotherapists' skills fall short of expectation for interpreting PFR of patients with lumbar spondylosis. The levels of agreement with the radiologist's interpretation have no link with years of clinical practice, age, work settings or continuing education. 
Thus, routine PFR viewing techniques should be made a priority in physiotherapists’ continuing professional education. PMID:24678695
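The 0–24 scoring bands described in the Method section map directly onto the reported levels of agreement. A trivial sketch of that banding, with the boundaries taken from the abstract:

```python
def agreement_band(score):
    # Bands from the assessment guide: 0-8 low, 9-16 moderate, 17-24 high.
    if not 0 <= score <= 24:
        raise ValueError("score must be between 0 and 24")
    if score <= 8:
        return "low"
    if score <= 16:
        return "moderate"
    return "high"
```

The physiotherapists' mean score of 12.7 falls in the moderate band, consistent with the study's reading of the results.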
Halliwell, Barry; Whiteman, Matthew
2004-01-01
Free radicals and other reactive species (RS) are thought to play an important role in many human diseases. Establishing their precise role requires the ability to measure them and the oxidative damage that they cause. This article first reviews what is meant by the terms free radical, RS, antioxidant, oxidative damage and oxidative stress. It then critically examines methods used to trap RS, including spin trapping and aromatic hydroxylation, with a particular emphasis on those methods applicable to human studies. Methods used to measure oxidative damage to DNA, lipids and proteins and methods used to detect RS in cell culture, especially the various fluorescent ‘probes' of RS, are also critically reviewed. The emphasis throughout is on the caution that is needed in applying these methods in view of possible errors and artifacts in interpreting the results. PMID:15155533
Measuring Patient Preferences: An Overview of Methods with a Focus on Discrete Choice Experiments.
Hazlewood, Glen S
2018-05-01
There is increasing recognition of the importance of patient preferences and methodologies to measure them. In this article, methods to quantify patient preferences are reviewed, with a focus on discrete choice experiments. In a discrete choice experiment, patients are asked to choose between 2 or more treatments. The results can be used to quantify the relative importance of treatment outcomes and/or other considerations relevant to medical decision making. Conducting and interpreting a discrete choice experiment requires multiple steps and an understanding of the potential biases that can arise, which we review in this article with examples in rheumatic diseases. Copyright © 2018 Elsevier Inc. All rights reserved.
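Discrete choice data of this kind are commonly analyzed with a conditional (multinomial) logit model, in which the probability of choosing a treatment is a softmax over the alternatives' utilities. A hedged sketch under that standard model; the coefficients b_eff and b_risk and the treatment attributes below are hypothetical, not estimates from any real experiment:

```python
import math

def choice_probabilities(utilities):
    # Conditional logit: P(choose j) is proportional to exp(U_j).
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical part-worths: utility = b_eff * efficacy + b_risk * risk.
b_eff, b_risk = 1.2, -0.8
treatments = {"A": (0.9, 0.3), "B": (0.6, 0.1)}  # (efficacy, risk)
U = [b_eff * eff + b_risk * risk for eff, risk in treatments.values()]
probs = choice_probabilities(U)
```

The ratio of the coefficients (here b_eff / |b_risk| = 1.5) is what gives the "relative importance" interpretation discussed in the article: how much risk a respondent would accept for a unit gain in efficacy.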
Profit intensity and cases of non-compliance with the law of demand/supply
NASA Astrophysics Data System (ADS)
Makowski, Marcin; Piotrowski, Edward W.; Sładkowski, Jan; Syska, Jacek
2017-05-01
We consider properties of the measurement intensity ρ of a random variable for which the probability density function, represented by the corresponding Wigner function, attains negative values on part of the domain. We give a simple economic interpretation of this problem. The model is used to demonstrate the applicability of the method to the analysis of negative probability on markets exhibiting anomalies in the law of supply and demand (e.g. Giffen goods). It turns out that the new conditions for optimizing the intensity ρ require a new strategy. We propose such a strategy (the so-called à rebours strategy) based on the fixed point method and explore its effectiveness.
Emissivity correction for interpreting thermal radiation from a terrestrial surface
NASA Technical Reports Server (NTRS)
Sutherland, R. A.; Bartholic, J. F.; Gerber, J. F.
1979-01-01
A general method of accounting for emissivity in making temperature determinations of graybody surfaces from radiometric data is presented. The method differs from previous treatments in that a simple blackbody calibration and graphical approach is used rather than numerical integrations, which require detailed knowledge of an instrument's spectral characteristics. Errors caused by approximating instrumental response with the Stefan-Boltzmann law rather than with an appropriately weighted Planck integral are also examined. In the 8-14 micron wavelength interval, it is shown that errors are at most on the order of 3 C for the extremes of the earth's temperature and emissivity. For more practical limits, however, errors are less than 0.5 C.
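The simplest form of such a graybody correction inverts the broadband radiance balance for the surface temperature. The sketch below uses the Stefan-Boltzmann approximation that the paper itself cautions about, so it is illustrative only; temperatures are in kelvin, and T_sky is an assumed effective sky brightness temperature accounting for reflected downwelling radiation:

```python
def graybody_surface_temp(T_brightness, emissivity, T_sky=0.0):
    # Invert L = eps*sigma*T_s**4 + (1 - eps)*sigma*T_sky**4, where the
    # radiometer's brightness temperature T_b is defined by L = sigma*T_b**4.
    # The Stefan-Boltzmann constant sigma cancels out of the inversion.
    T4 = (T_brightness ** 4 - (1.0 - emissivity) * T_sky ** 4) / emissivity
    return T4 ** 0.25
```

For a blackbody (emissivity 1) the correction vanishes; for a real surface with emissivity below 1 and a cold sky, the true surface temperature exceeds the measured brightness temperature, which is the direction of the errors discussed in the abstract.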
Interactive Visualization to Advance Earthquake Simulation
NASA Astrophysics Data System (ADS)
Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn
2008-04-01
The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.
Ahrensberg, J M; Olesen, F; Hansen, R P; Schrøder, H; Vedsted, P
2013-01-01
Background: Early diagnosis of childhood cancer provides hope for better prognoses. Shorter diagnostic intervals (DI) in primary care require better knowledge of the association between presenting symptoms, interpretation of symptoms and the wording of the referral letter. Methods: A Danish nationwide population-based study. Data on 550 children aged <15 years with an incident cancer diagnosis (January 2007–December 2010) were collected through questionnaires to parents (response rate=69%) and general practitioners (GPs) (response rate=87%). The DI from the first presentation in general practice until diagnosis was categorised as short or long based on quartiles. Associations between variables and long DIs were assessed using logistic regression. Results: The GPs interpreted symptoms as 'vague' in 25.4%, 'serious' in 50.0% and 'alarm' in 19.0% of cases. Symptom interpretation varied by cancer type (P<0.001) and was associated with the DI (P<0.001). Vomiting was associated with a shorter DI for central nervous system (CNS) tumours, and pain with a longer DI for leukaemia. Referral letter wording was associated with DI (P<0.001); the shortest DIs were observed when cancer suspicion was raised in the letter. Conclusion: The GPs play an important role in recognising early signs of childhood cancer, as their symptom interpretation and referral wording have a profound impact on the diagnostic process. PMID:23449354
The role of practical wisdom in nurse manager practice: why experience matters.
Cathcart, Eloise Balasco; Greenspan, Miriam
2013-10-01
To illustrate through the interpretation of one representative nurse manager's narrative how the methodology of practice articulation gives language to the ways practical wisdom develops in leadership practice and facilitates learning. Patricia Benner's corpus of research has demonstrated that reflection on clinical narratives comes closer than other pedagogical methods to replicating and enhancing the experiential learning required for the development of practical wisdom. Using Benner's methodology of practice articulation, 91 nurse managers wrote and read to a peer group a narrative of their lived experience in the role. The groups interpreted the narratives to extract the skilled knowledge and ethics embedded in the practice of the nurse manager authors. One narrative was chosen for this paper because it is a particularly clear exemplar of how practical wisdom develops in nurse manager practice. Articulating and reflecting on experiential learning led to an understanding of how practical wisdom developed in one nurse manager's practice. Interpretation of the narrative of one nurse manager illustrated how reflection on a complex ethical dilemma was a source of character development for the individual and the peer group. Describing and interpreting how practical wisdom develops for individual nurse managers can be a source of learning for the narrative author and other role incumbents who need to make sound decisions and take prudent action in ethically challenging situations. © 2013 John Wiley & Sons Ltd.
Validity threats: overcoming interference with proposed interpretations of assessment data.
Downing, Steven M; Haladyna, Thomas M
2004-03-01
Factors that interfere with the ability to interpret assessment scores or ratings in the proposed manner threaten validity. To be interpreted in a meaningful manner, all assessments in medical education require sound, scientific evidence of validity. The purpose of this essay is to discuss 2 major threats to validity: construct under-representation (CU) and construct-irrelevant variance (CIV). Examples of each type of threat for written, performance and clinical performance examinations are provided. The CU threat to validity refers to undersampling the content domain. Using too few items, cases or clinical performance observations to adequately generalise to the domain represents CU. Variables that systematically (rather than randomly) interfere with the ability to meaningfully interpret scores or ratings represent CIV. Issues such as flawed test items written at inappropriate reading levels or statistically biased questions represent CIV in written tests. For performance examinations, such as standardised patient examinations, flawed cases or cases that are too difficult for student ability contribute CIV to the assessment. For clinical performance data, systematic rater error, such as halo or central tendency error, represents CIV. The term face validity is rejected as representative of any type of legitimate validity evidence, although the fact that the appearance of the assessment may be an important characteristic other than validity is acknowledged. There are multiple threats to validity in all types of assessment in medical education. Methods to eliminate or control validity threats are suggested.
Alvarez-Meza, Andres M.; Orozco-Gutierrez, Alvaro; Castellanos-Dominguez, German
2017-01-01
We introduce Enhanced Kernel-based Relevance Analysis (EKRA), which aims to support the automatic identification of brain activity patterns using electroencephalographic recordings. EKRA is a data-driven strategy that incorporates two kernel functions to take advantage of the available joint information, associating neural responses with a given stimulus condition. A Centered Kernel Alignment functional is adjusted to learn the linear projection that best discriminates the input feature set, optimizing the required free parameters automatically. Our approach is carried out in two scenarios: (i) feature selection, computing a relevance vector from extracted neural features to facilitate the physiological interpretation of a given brain activity task, and (ii) enhanced feature selection, performing an additional transformation of relevant features to improve the overall identification accuracy. Accordingly, we provide an alternative feature relevance analysis strategy that improves system performance while favoring data interpretability. For validation purposes, EKRA is tested on two well-known brain activity tasks: motor imagery discrimination and epileptic seizure detection. The obtained results show that the EKRA approach estimates a relevant representation space from the provided supervised information, emphasizing the salient input features. As a result, our proposal outperforms state-of-the-art methods in brain activity discrimination accuracy, with the benefit of enhanced physiological interpretation of the task at hand. PMID:29056897
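The Centered Kernel Alignment functional at the core of EKRA measures similarity between two kernel matrices after double-centering. A minimal NumPy sketch follows; the linear kernel, the label kernel, and the toy data are illustrative assumptions, not the authors' EEG pipeline:

```python
import numpy as np

def centered_kernel_alignment(K, L):
    """CKA between two n-by-n kernel matrices: normalized Frobenius
    inner product after double-centering each kernel."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc, Lc = H @ K @ H, H @ L @ H
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))              # 20 trials, 5 neural features
y = np.array([0] * 10 + [1] * 10)         # binary stimulus labels

K = X @ X.T                               # linear kernel on the features
L = (y[:, None] == y[None, :]).astype(float)  # ideal label kernel

alignment = centered_kernel_alignment(K, L)
```

Maximizing this alignment over a learned linear projection of X is, in essence, how a CKA-based method selects discriminative features; a kernel's alignment with itself is exactly 1.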
How people interpret healthy eating: contributions of qualitative research.
Bisogni, Carole A; Jastran, Margaret; Seligson, Marc; Thompson, Alyssa
2012-01-01
To identify how qualitative research has contributed to understanding the ways people in developed countries interpret healthy eating. Bibliographic database searches identified reports of qualitative, empirical studies published in English, peer-reviewed journals since 1995. Authors coded, discussed, recoded, and analyzed papers reporting qualitative research studies related to participants' interpretations of healthy eating. Studies emphasized a social constructionist approach, and most used focus groups and/or individual, in-depth interviews to collect data. Study participants explained healthy eating in terms of food, food components, food production methods, physical outcomes, psychosocial outcomes, standards, personal goals, and as requiring restriction. Researchers described meanings as specific to life stages and different life experiences, such as parenting and disease onset. Identity (self-concept), social settings, resources, food availability, and conflicting considerations were themes in participants' explanations for not eating according to their ideals for healthy eating. People interpret healthy eating in complex and diverse ways that reflect their personal, social, and cultural experiences, as well as their environments. Their meanings include but are broader than the food composition and health outcomes considered by scientists. The rich descriptions and concepts generated by qualitative research can help practitioners and researchers think beyond their own experiences and be open to audience members' perspectives as they seek to promote healthy ways of eating. Copyright © 2012 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Real-time monitoring of CO2 storage sites: Application to Illinois Basin-Decatur Project
Picard, G.; Berard, T.; Chabora, E.; Marsteller, S.; Greenberg, S.; Finley, R.J.; Rinck, U.; Greenaway, R.; Champagnon, C.; Davard, J.
2011-01-01
Optimization of carbon dioxide (CO2) storage operations for efficiency and safety requires use of monitoring techniques and implementation of control protocols. The monitoring techniques consist of permanent sensors and tools deployed for measurement campaigns. Large amounts of data are thus generated. These data must be managed and integrated for interpretation at different time scales. A fast interpretation loop involves combining continuous measurements from permanent sensors as they are collected to enable a rapid response to detected events; a slower loop requires combining large datasets gathered over longer operational periods from all techniques. The purpose of this paper is twofold. First, it presents an analysis of the monitoring objectives to be performed in the slow and fast interpretation loops. Second, it describes the implementation of the fast interpretation loop with a real-time monitoring system at the Illinois Basin-Decatur Project (IBDP) in Illinois, USA. © 2011 Published by Elsevier Ltd.
Clark, D S
1998-10-13
The Premerger Notification Office ("PNO") of the Federal Trade Commission ("FTC"), with the concurrence of the Assistant Attorney General in charge of the Antitrust Division of the Department of Justice ("DOJ"), is adopting a Formal Interpretation of the Hart-Scott-Rodino Act, which requires certain persons planning certain mergers, consolidations, or other acquisitions to report information about the proposed transactions to the FTC and DOJ. The Interpretation concerns the reportability of certain transactions involving a Limited Liability Company ("LLC"), a relatively new form of entity authorized by state statutes. Under the Interpretation, the formation of an LLC will be reportable if it will unite two or more pre-existing businesses under common control. Similarly, acquisitions of existing LLC membership interests will be reportable if they would have the effect of uniting two or more pre-existing businesses under common control.
Remote sensing programs and courses in engineering and water resources
NASA Technical Reports Server (NTRS)
Kiefer, R. W.
1981-01-01
The content of typical basic and advanced remote sensing and image interpretation courses is described, and typical remote sensing graduate programs of study in civil engineering and in interdisciplinary environmental remote sensing and water resources management programs are outlined. Ideally, graduate programs with an emphasis on remote sensing and image interpretation should be built around a core of five courses: (1) a basic course in fundamentals of remote sensing upon which the more specialized advanced remote sensing courses can build; (2) a course dealing with visual image interpretation; (3) a course dealing with quantitative (computer-based) image interpretation; (4) a basic photogrammetry course; and (5) a basic surveying course. These five courses comprise up to one-half of the course work required for the M.S. degree. The nature of other course work and thesis requirements varies greatly, depending on the department in which the degree is being awarded.
Elmore, Joann G.; Longton, Gary M.; Pepe, Margaret S.; Carney, Patricia A.; Nelson, Heidi D.; Allison, Kimberly H.; Geller, Berta M.; Onega, Tracy; Tosteson, Anna N. A.; Mercan, Ezgi; Shapiro, Linda G.; Brunyé, Tad T.; Morgan, Thomas R.; Weaver, Donald L.
2017-01-01
Background: Digital whole slide imaging may be useful for obtaining second opinions and is used in many countries. However, the U.S. Food and Drug Administration requires verification studies. Methods: Pathologists were randomized to interpret one of four sets of breast biopsy cases during two phases, separated by ≥9 months, using glass slides or digital format (sixty cases per set, one slide per case, n = 240 cases). Accuracy was assessed by comparing interpretations to a consensus reference standard. Intraobserver reproducibility was assessed by comparing the agreement of interpretations on the same cases between two phases. Estimated probabilities of confirmation by a reference panel (i.e., predictive values) were obtained by incorporating data on the population prevalence of diagnoses. Results: Sixty-five percent of responding pathologists were eligible, and 252 consented to randomization; 208 completed Phase I (115 glass, 93 digital); and 172 completed Phase II (86 glass, 86 digital). Accuracy was slightly higher using glass compared to digital format and varied by category: invasive carcinoma, 96% versus 93% (P = 0.04); ductal carcinoma in situ (DCIS), 84% versus 79% (P < 0.01); atypia, 48% versus 43% (P = 0.08); and benign without atypia, 87% versus 82% (P < 0.01). There was a small decrease in intraobserver agreement when the format changed compared to when glass slides were used in both phases (P = 0.08). Predictive values for confirmation by a reference panel using glass versus digital were: invasive carcinoma, 98% and 97% (not significant [NS]); DCIS, 70% and 57% (P = 0.007); atypia, 38% and 28% (P = 0.002); and benign without atypia, 97% and 96% (NS). Conclusions: In this large randomized study, digital format interpretations were similar to glass slide interpretations of benign and invasive cancer cases. However, cases in the middle of the spectrum, where more inherent variability exists, may be more problematic in digital format. 
Future studies evaluating the effect these findings exert on clinical practice and patient outcomes are required. PMID:28382226
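The "estimated probability of confirmation by a reference panel" reported in this study is a positive predictive value, obtainable from an interpreter's sensitivity and specificity and the population prevalence of each diagnosis via Bayes' rule. The numbers below are invented for illustration, not the study's estimates:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(reference panel confirms | pathologist calls the diagnosis)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values only (not taken from the study):
ppv = positive_predictive_value(sensitivity=0.93, specificity=0.98,
                                prevalence=0.25)
print(round(ppv, 3))   # 0.939
```

This is why predictive values in the abstract differ so much across categories: a borderline category such as atypia has both lower accuracy and lower prevalence, which drags the predictive value down even when raw agreement looks moderate.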
Lu, Tong; Tai, Chiew-Lan; Yang, Huafei; Cai, Shijie
2009-08-01
We present a novel knowledge-based system to automatically convert real-life engineering drawings to content-oriented high-level descriptions. The proposed method divides the complex interpretation process into two parts: knowledge representation and knowledge-based interpretation. We propose a new hierarchical descriptor-based knowledge representation method to organize the various types of engineering objects and their complex high-level relations. The descriptors are defined using an Extended Backus-Naur Form (EBNF), facilitating modification and maintenance. When interpreting a set of related engineering drawings, the knowledge-based interpretation system first constructs an EBNF-tree from the knowledge representation file, then searches for potential engineering objects guided by a depth-first order of the nodes in the EBNF-tree. Experimental results and comparisons with other interpretation systems demonstrate that our knowledge-based system is accurate and robust for high-level interpretation of complex real-life engineering projects.
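The depth-first search over a descriptor tree that the authors describe can be sketched in miniature. The descriptor names, tree shape, and match test below are invented for illustration and are much simpler than an EBNF-derived hierarchy:

```python
class Descriptor:
    """One node of a descriptor tree: an object type plus sub-descriptors."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

def depth_first_match(node, drawing_objects, found=None):
    """Visit descriptors in depth-first order, collecting those whose
    name appears among the drawing's recognized objects."""
    if found is None:
        found = []
    if node.name in drawing_objects:
        found.append(node.name)
    for child in node.children:
        depth_first_match(child, drawing_objects, found)
    return found

# A toy descriptor hierarchy for a mechanical drawing.
tree = Descriptor("drawing", [
    Descriptor("title_block", [Descriptor("revision_table")]),
    Descriptor("part_view", [Descriptor("dimension"),
                             Descriptor("weld_symbol")]),
])

objects = {"title_block", "dimension", "part_view"}
print(depth_first_match(tree, objects))
```

The real system matches geometric primitives against EBNF productions rather than string names, but the traversal order, parents before children, left subtree before right, is the same idea.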
Policy Tree Optimization for Adaptive Management of Water Resources Systems
NASA Astrophysics Data System (ADS)
Herman, J. D.; Giuliani, M.
2016-12-01
Water resources systems must cope with irreducible uncertainty in supply and demand, requiring policy alternatives capable of adapting to a range of possible future scenarios. Recent studies have developed adaptive policies based on "signposts" or "tipping points", which are threshold values of indicator variables that signal a change in policy. However, there remains a need for a general method to optimize the choice of indicators and their threshold values in a way that is easily interpretable for decision makers. Here we propose a conceptual framework and computational algorithm to design adaptive policies as a tree structure (i.e., a hierarchical set of logical rules) using a simulation-optimization approach based on genetic programming. We demonstrate the approach using Folsom Reservoir, California as a case study, in which operating policies must balance the risk of both floods and droughts. Given a set of feature variables, such as reservoir level, inflow observations and forecasts, and time of year, the resulting policy defines the conditions under which flood control and water supply hedging operations should be triggered. Importantly, the tree-based rule sets are easy to interpret for decision making, and can be compared to historical operating policies to understand the adaptations needed under possible climate change scenarios. Several remaining challenges are discussed, including the empirical convergence properties of the method, and extensions to irreversible decisions such as infrastructure. Policy tree optimization, and corresponding open-source software, provide a generalizable, interpretable approach to designing adaptive policies under uncertainty for water resources systems.
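A policy tree of the kind described, thresholds on indicator variables at internal nodes and operating actions at the leaves, is straightforward to represent and evaluate. The indicators, thresholds, and actions below are invented for illustration; the paper's contribution is searching over such trees with genetic programming, which this sketch does not attempt:

```python
# Internal node: (indicator, threshold, subtree_if_below, subtree_if_at_or_above)
# Leaf: an action string.
policy = ("storage_fraction", 0.30,
          "hedge_water_supply",                 # reservoir low -> ration supply
          ("inflow_forecast", 0.80,
           "normal_operations",                 # moderate forecast inflows
           "flood_control_release"))            # high forecast -> pre-release

def evaluate(tree, state):
    """Walk the policy tree with current indicator values; return an action."""
    while isinstance(tree, tuple):
        indicator, threshold, below, at_or_above = tree
        tree = below if state[indicator] < threshold else at_or_above
    return tree

print(evaluate(policy, {"storage_fraction": 0.2, "inflow_forecast": 0.9}))
print(evaluate(policy, {"storage_fraction": 0.5, "inflow_forecast": 0.9}))
```

The appeal for decision makers is exactly what the walk shows: every action traces back to a short chain of readable threshold conditions.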
Validation of high throughput sequencing and microbial forensics applications.
Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel
2014-01-01
High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of the specific application, and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.
Application of ground-penetrating-radar methods in hydrogeologic studies
Beres, Milan; Haeni, F.P.
1991-01-01
A ground-penetrating-radar system was used to study selected stratified-drift deposits in Connecticut. Ground-penetrating radar is a surface-geophysical method that depends on the emission, transmission, reflection, and reception of an electromagnetic pulse and can produce continuous high-resolution profiles of the subsurface rapidly and efficiently. Traverse locations on land included a well field in the town of Mansfield, a sand and gravel pit and a farm overlying a potential aquifer in the town of Coventry, and Haddam Meadows State Park in the town of Haddam. Traverse locations on water included the Willimantic River in Coventry and Mansfield Hollow Lake in Mansfield. The penetration depth of the radar signal ranged from about 20 feet in fine-grained glaciolacustrine sediments to about 70 feet in coarse sand and gravel. Some land records in coarse-grained sediments show a distinct, continuous reflection from the water table about 5 to 11 feet below land surface. Parallel reflectors on the records are interpreted as fine-grained sediments. Hummocky or chaotic reflectors are interpreted as cross-bedded or coarse-grained sediments. Other features observed on some of the radar records include the till and bedrock surface. Records collected on water had distinct water-bottom multiples (more than one reflection) and diffraction patterns from boulders. The interpretation of the radar records, which required little or no processing, was verified by using lithologic logs from test holes located along some of the land traverses and near the water traverses.
Bulik, Catharine C.; Fauntleroy, Kathy A.; Jenkins, Stephen G.; Abuali, Mayssa; LaBombardi, Vincent J.; Nicolau, David P.; Kuti, Joseph L.
2010-01-01
We describe the levels of agreement between broth microdilution, Etest, Vitek 2, Sensititre, and MicroScan methods to accurately define the meropenem MIC and categorical interpretation of susceptibility against carbapenemase-producing Klebsiella pneumoniae (KPC). A total of 46 clinical K. pneumoniae isolates with KPC genotypes, all modified Hodge test and blaKPC positive, collected from two hospitals in NY were included. Results obtained by each method were compared with those from broth microdilution (the reference method), and agreement was assessed based on MICs and Clinical Laboratory Standards Institute (CLSI) interpretative criteria using 2010 susceptibility breakpoints. Based on broth microdilution, 0%, 2.2%, and 97.8% of the KPC isolates were classified as susceptible, intermediate, and resistant to meropenem, respectively. Results from MicroScan demonstrated the most agreement with those from broth microdilution, with 95.6% agreement based on the MIC and 2.2% classified as minor errors, and no major or very major errors. Etest demonstrated 82.6% agreement with broth microdilution MICs, a very major error rate of 2.2%, and a minor error rate of 2.2%. Vitek 2 MIC agreement was 30.4%, with a 23.9% very major error rate and a 39.1% minor error rate. Sensititre demonstrated MIC agreement for 26.1% of isolates, with a 3% very major error rate and a 26.1% minor error rate. Application of FDA breakpoints had little effect on minor error rates but increased very major error rates to 58.7% for Vitek 2 and Sensititre. Meropenem MIC results and categorical interpretations for carbapenemase-producing K. pneumoniae differ by methodology. Confirmation of testing results is encouraged when an accurate MIC is required for antibiotic dosing optimization. PMID:20484603
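The agreement analysis in this study, comparing each device's meropenem MIC category against broth microdilution, follows the standard error taxonomy: a very major error is a false-susceptible call, a major error is a false-resistant call, and a minor error is any disagreement involving the intermediate category. The breakpoints and MIC pairs below are illustrative assumptions, not the CLSI or FDA values used in the paper:

```python
def categorize(mic, s_max, r_min):
    """Map an MIC (ug/mL) to S/I/R given susceptible and resistant breakpoints."""
    if mic <= s_max:
        return "S"
    if mic >= r_min:
        return "R"
    return "I"

def error_type(reference_cat, test_cat):
    """Classify a disagreement between the reference and test methods."""
    if reference_cat == test_cat:
        return "agreement"
    if reference_cat == "R" and test_cat == "S":
        return "very major"   # false susceptible: the dangerous error
    if reference_cat == "S" and test_cat == "R":
        return "major"        # false resistant
    return "minor"            # any error involving the I category

# Illustrative breakpoints; consult current CLSI tables for real values.
S_MAX, R_MIN = 1, 4

pairs = [(16, 16), (16, 0.5), (16, 2), (8, 8)]  # (reference MIC, device MIC)
results = [error_type(categorize(r, S_MAX, R_MIN), categorize(t, S_MAX, R_MIN))
           for r, t in pairs]
print(results)
```

Shifting the breakpoints, as the paper does when swapping CLSI for FDA criteria, reclassifies the same MIC pairs and can turn minor errors into very major ones, which is exactly the effect reported for Vitek 2 and Sensititre.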
Interpretive model for "A Concurrency Method"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, C.L.
1987-01-01
This paper describes an interpreter for "A Concurrency Method," in which concurrency is the inherent mode of operation and not an appendage to sequentiality. This method is based on the notions of data-driven execution and single-assignment while preserving a natural manner of programming. The interpreter is designed for and implemented on a network of Corvus Concept Personal Workstations, which are based on the Motorola MC68000 super-microcomputer. The interpreter utilizes the MC68000 processors in each workstation by communicating across OMNINET, the local area network designed for the workstations. The interpreter is a complete system, containing an editor, a compiler, an operating system with load balancer, and a communication facility. The system includes the basic arithmetic and trigonometric primitive operations for mathematical computations as well as the ability to construct more complex operations from these. 9 refs., 5 figs.
Speciation of Mg in biogenic calcium carbonates
NASA Astrophysics Data System (ADS)
Farges, F.; Meibom, A.; Flank, A.-M.; Lagarde, P.; Janousch, M.; Stolarski, J.
2009-11-01
A selection of marine biominerals, mostly aragonitic coral skeletons, was probed at the Mg K-edge by XANES spectroscopy coupled with μXRF methods and compared to an extensive set of relevant model compounds (silicates, carbonates, oxides and organics). Extensive methodologies are required to better describe the speciation of Mg in these minerals. A combination of ab-initio XANES calculations for defective clusters around Mg in aragonite, together with wavelet analyses of the XANES region, is required to robustly interpret the spectra. Using these methodologies, the speciation of Mg ranges from a magnesite-type environment in some scleractinian corals to an organic-type environment. In all environments, the Mg domains probed appear to be less than 1 nm in size.
Comparison of texture synthesis methods for content generation in ultrasound simulation for training
NASA Astrophysics Data System (ADS)
Mattausch, Oliver; Ren, Elizabeth; Bajka, Michael; Vanhoey, Kenneth; Goksel, Orcun
2017-03-01
Navigation and interpretation of ultrasound (US) images require substantial expertise, the training of which can be aided by virtual-reality simulators. However, a major challenge in creating plausible simulated US images is the generation of realistic ultrasound speckle. Since typical ultrasound speckle exhibits many properties of Markov Random Fields, it is conceivable to use texture synthesis for generating plausible US appearance. In this work, we investigate popular classes of texture synthesis methods for generating realistic US content. In a user study, we evaluate their performance for reproducing homogeneous tissue regions in B-mode US images from small image samples of similar tissue and report the best-performing synthesis methods. We further show that regression trees can be used on speckle texture features to learn a predictor for US realism.
Analysis of Nonstationary Time Series for Biological Rhythms Research.
Leise, Tanya L
2017-06-01
This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
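One of the simplest rhythm checks the article covers, estimating a dominant period, can be done by locating the largest autocorrelation peak. The sketch below uses a clean synthetic 24-hour rhythm; real circadian data is nonstationary in period, phase, and amplitude, which is precisely why the authors recommend wavelet methods over a single global estimate like this one:

```python
import numpy as np

def dominant_period(x, min_lag=2):
    """Estimate the period of a rhythmic series as the lag of the
    largest autocorrelation value beyond min_lag."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..n-1
    acf /= acf[0]                                       # normalize to acf[0] = 1
    return min_lag + int(np.argmax(acf[min_lag:len(x) // 2]))

t = np.arange(240)                        # 240 hourly samples (10 days)
signal = np.cos(2 * np.pi * t / 24.0)     # stationary 24-hour rhythm
print(dominant_period(signal))            # 24
```

A wavelet transform generalizes this by producing a period estimate at every time point, so a drifting free-running period shows up as a ridge that moves across the scalogram instead of a single number.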
Rohling, Martin L; Williamson, David J; Miller, L Stephen; Adams, Russell L
2003-11-01
The aim of this project was to validate an alternative global measure of neurocognitive impairment (Rohling Interpretive Method, or RIM) that could be generated from data gathered from a flexible battery approach. A critical step in this process is to establish the utility of the technique against current standards in the field. In this paper, we compared results from the Rohling Interpretive Method to those obtained from the General Neuropsychological Deficit Scale (GNDS; Reitan & Wolfson, 1988) and the Halstead-Russell Average Impairment Rating (AIR; Russell, Neuringer & Goldstein, 1970) on a large previously published sample of patients assessed with the Halstead-Reitan Battery (HRB). Findings support the use of the Rohling Interpretive Method in producing summary statistics similar in diagnostic sensitivity and specificity to the traditional HRB indices.
Bilman, Els M; Kleef, Ellen van; Mela, David J; Hulshof, Toine; van Trijp, Hans C M
2012-12-01
The aim of this study was to explore (a) whether and how consumers may (over-)interpret satiety claims, and (b) whether and to what extent consumers recognize that personal efforts are required to realize possible satiety-related or weight loss benefits. Following means-end chain theory, we explored for a number of satiety claims the extent of inference-making to higher-level benefits than actually stated in the claim, using internet-based questions and tasks. Respondents (N=1504) in the U.K., France, Italy and Germany participated in the study. The majority of these respondents correctly interpret satiety-related claims; i.e. they largely limit their interpretation to what was actually stated. They do not expect a "magic bullet" effect, but understand that personal efforts are required to translate product attributes into potential weight control benefits. Less-restrained eaters were at lower risk of over-interpreting satiety-related claims, whilst respondents with a stronger belief that their weight is something they can control accept more personal responsibility and better understand that personal efforts are required to be effective in weight control. Overall, these results indicate there is likely to be a relatively low level of consumer misinterpretation of satiety-related claims on food products. Copyright © 2012 International Life Sciences Institute. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Gaffey, Michael J.
2005-01-01
There is significant dispute concerning the interpretation and meteoritic affinities of S-type asteroids. Some of this arises from the use of inappropriate analysis methods and the derivation of conclusions which cannot be supported by those interpretive methodologies [1]. The most frequently applied inappropriate technique is curve matching. Whether matching spectra from a spectral library or mixing end-member spectra to match the asteroid spectrum, curve matching for S-type spectra suffers from a suite of weaknesses that are virtually impossible to overcome. Chief among these is the lack of a comprehensive comparison set. Lacking a complete library that includes both the mineralogical variations and the spectrally significant physical variations (e.g., particle size, petrographic relationships, etc.), curve matches are plagued with potential unresolved ambiguities. The other major weakness of virtually all curve matching efforts is that equal weight is given to matching all portions of the spectrum. In actuality, some portions of the spectrum (e.g., centers of absorption features) must be matched very accurately, while other portions of the spectrum (e.g., continuum regions and overall slopes) do not require good matches since they are strongly affected by parameters unrelated to the mineralogy of the sample.
Segura-Totten, Miriam; Dalman, Nancy E.
2013-01-01
Analysis of the primary literature in the undergraduate curriculum is associated with gains in student learning. In particular, the CREATE (Consider, Read, Elucidate hypotheses, Analyze and interpret the data, and Think of the next Experiment) method is associated with an increase in student critical thinking skills. We adapted the CREATE method within a required cell biology class and compared the learning gains of students using CREATE to those of students involved in less structured literature discussions. We found that while both sets of students had gains in critical thinking, students who used the CREATE method did not show significant improvement over students engaged in a more traditional method for dissecting the literature. Students also reported similar learning gains for both literature discussion methods. Our study suggests that, at least in our educational context, the CREATE method does not lead to higher learning gains than a less structured way of reading primary literature. PMID:24358379
Comparison of the dye method with the thermocouple psychrometer for measuring leaf water potentials.
Knipling, E B; Kramer, P J
1967-10-01
The dye method for measuring water potential was examined and compared with the thermocouple psychrometer method in order to evaluate its usefulness for measuring leaf water potentials of forest trees and common laboratory plants. Psychrometer measurements are assumed to represent the true leaf water potentials. Because of the contamination of test solutions by cell sap and leaf surface residues, dye method values of most species varied about 1 to 5 bars from psychrometer values over the leaf water potential range of 0 to -30 bars. The dye method is useful for measuring changes and relative values of leaf water potential. Because of species differences in the relationships of dye method values to true leaf water potentials, dye method values should be interpreted with caution when comparing different species or the same species growing in widely different environments. Despite its limitations, the dye method is useful to many workers because it is simple, requires no elaborate equipment, and can be used in both the laboratory and field.
NASA Astrophysics Data System (ADS)
Soh, BaoLin P.; Lee, Warwick B.; Wong, Jill; Sim, Llewellyn; Hillis, Stephen L.; Tapia, Kriscia A.; Brennan, Patrick C.
2016-03-01
Aim: To compare the performance of Australian and Singaporean breast readers interpreting a single test-set that consisted of mammographic examinations collected from the Australian population. Background: In the teleradiology era, breast readers are interpreting mammographic examinations from different populations. The question arises whether two groups of readers with similar training backgrounds demonstrate the same level of performance when presented with a population familiar to only one of the groups. Methods: Fifty-three Australian and 15 Singaporean breast radiologists participated in this study. All radiologists were trained in mammogram interpretation and had a median of 9 and 15 years of experience in reading mammograms, respectively. Each reader interpreted the same BREAST test-set consisting of sixty de-identified mammographic examinations arising from an Australian population. Performance parameters, including JAFROC, ROC, case sensitivity and specificity, were compared between Australian and Singaporean readers using a Mann-Whitney U test. Results: A significant difference (P=0.036) was demonstrated between the JAFROC scores of the Australian and Singaporean breast radiologists. No other significant differences were observed. Conclusion: JAFROC scores for Australian radiologists were higher than those obtained by their Singaporean counterparts. Whilst it is tempting to attribute this to reader expertise, that may be a simplistic explanation considering the very similar training and audit backgrounds of the two populations of radiologists. The influence of reading images that differ from those radiologists normally encounter cannot be ruled out and requires further investigation, particularly in the light of increasing international outsourcing of radiologic reporting.
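The group comparison described above can be sketched in plain Python. The score lists below are illustrative placeholders, not the study's data, and the normal approximation used here is a rough stand-in for the exact test one would use with small samples:

```python
import math

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.

    Returns (U, p). A rough sketch only; for small samples or many ties,
    an exact test (e.g. scipy.stats.mannwhitneyu) is preferable.
    """
    n1, n2 = len(x), len(y)
    # Rank the pooled sample, assigning average ranks to ties.
    pooled = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg_rank
        i = j + 1
    r1 = sum(ranks[:n1])              # rank sum of the first group
    u1 = r1 - n1 * (n1 + 1) / 2       # U statistic for group 1
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return u1, p

# Hypothetical JAFROC figures of merit (illustrative values only).
australian = [0.82, 0.79, 0.85, 0.77, 0.81, 0.84, 0.80, 0.83]
singaporean = [0.74, 0.78, 0.72, 0.76, 0.75, 0.73, 0.77, 0.71]
u, p = mann_whitney_u(australian, singaporean)
print(f"U = {u}, two-sided p = {p:.4f}")
```

With clearly separated illustrative scores like these, the test reports a small p-value, mirroring the direction of the reported finding.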
A possible new approach to understanding mental disorder.
Sharples, P J
2012-09-01
The aetiology of mental disorders is not fully understood. This paper presents an analysis of the conceptual control process, exploring the tools of conceptual application and the phases and mechanism of the control process, and seeks to show how the illness states of mental disorder naturally come to occur. Living occurs in a world of change. For living to occur, some control is required, and to exert control and provide direction for the conceptual process, some interpretation of significance, some definition of need, is also required. Such interpretation, monitoring significance in relation to the many aspects of change, forms the base on which living occurs. Change in human terms is intrinsically insecure, and interpretation of significance is an interpretation of security, an interpretation of control in living. Conceptual control is a process applied to maintain security, to maintain a secure base for the interpretation of significance; it is a process applied to produce and hold a sense of control. Powering such a process is an active process and so requires some form of energy. Human beings have a sense of that energy, exhibited in terms such as full of energy, tired, and exhausted. Because energy is required to power the control process, the sense of energy is accompanied by a sense of the ability to provide power, of the ability to hold and maintain control, of security. As available energy declines, holding the same sense of control becomes difficult, and a person in the same setting comes to feel more insecure. This can result in a person experiencing mental disorder of mild to severe degree: mild where the conceptual process is applied to manage only one or a few particular needs, severe and more general where the insecurity affects the base of interpretation. In this latter case, seeking to protect security can lead to mania, mood-incongruent delusions, and schizophrenia.
A failing ability to protect can lead to generalized anxiety disorder, mood-congruent delusions, and different presentations and degrees of depression. Copyright © 2012 Elsevier Ltd. All rights reserved.
Storage and retrieval of mass spectral information
NASA Technical Reports Server (NTRS)
Hohn, M. E.; Humberston, M. J.; Eglinton, G.
1977-01-01
Computer handling of mass spectra serves two main purposes: the interpretation of the occasional, problematic mass spectrum, and the identification of the large number of spectra generated in the gas-chromatographic-mass spectrometric (GC-MS) analysis of complex natural and synthetic mixtures. Methods available fall into the three categories of library search, artificial intelligence, and learning machine. Optional procedures for coding, abbreviating and filtering a library of spectra minimize time and storage requirements. Newer techniques make increasing use of probability and information theory in accessing files of mass spectral information.
Multiple directed graph large-class multi-spectral processor
NASA Technical Reports Server (NTRS)
Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki
1988-01-01
Numerical analysis techniques for the interpretation of high-resolution imaging-spectrometer data are described and demonstrated. The method proposed involves the use of (1) a hierarchical classifier with a tree structure generated automatically by a Fisher linear-discriminant-function algorithm and (2) a novel multiple-directed-graph scheme which reduces the local maxima and the number of perturbations required. Results for a 500-class test problem involving simulated imaging-spectrometer data are presented in tables and graphs; 100-percent-correct classification is achieved with an improvement factor of 5.
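A two-class Fisher linear discriminant, the building block such a hierarchical tree applies at each node, can be sketched as follows. The data here are synthetic two-feature samples, not imaging-spectrometer data, and the midpoint threshold is a common simplification:

```python
import numpy as np

def fisher_direction(class_a, class_b):
    """Fisher linear discriminant direction w = Sw^{-1} (m_a - m_b)."""
    m_a, m_b = class_a.mean(axis=0), class_b.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices.
    sw = ((class_a - m_a).T @ (class_a - m_a)
          + (class_b - m_b).T @ (class_b - m_b))
    return np.linalg.solve(sw, m_a - m_b)

rng = np.random.default_rng(0)
a = rng.normal([0, 0], 0.5, size=(50, 2))  # synthetic class-A feature vectors
b = rng.normal([3, 1], 0.5, size=(50, 2))  # synthetic class-B feature vectors

w = fisher_direction(a, b)
# Classify by projecting onto w and splitting at the midpoint of the
# projected class means.
threshold = w @ (a.mean(axis=0) + b.mean(axis=0)) / 2
pred_a = a @ w > threshold          # True where samples fall on A's side
pred_b = b @ w > threshold          # False where samples fall on B's side
print("class A accuracy:", pred_a.mean(), "class B accuracy:", (~pred_b).mean())
```

A hierarchical classifier repeats this split at each tree node over progressively smaller groups of classes; the 500-class processor in the abstract automates that tree construction.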
Spatial Resolution Requirements for Accurate Identification of Drivers of Atrial Fibrillation
Roney, Caroline H.; Cantwell, Chris D.; Bayer, Jason D.; Qureshi, Norman A.; Lim, Phang Boon; Tweedy, Jennifer H.; Kanagaratnam, Prapa; Vigmond, Edward J.; Ng, Fu Siong
2017-01-01
Background— Recent studies have demonstrated conflicting mechanisms underlying atrial fibrillation (AF), with the spatial resolution of data often cited as a potential reason for the disagreement. The purpose of this study was to investigate whether the variation in spatial resolution of mapping may lead to misinterpretation of the underlying mechanism in persistent AF. Methods and Results— Simulations of rotors and focal sources were performed to estimate the minimum number of recording points required to correctly identify the underlying AF mechanism. The effects of different data types (action potentials and unipolar or bipolar electrograms) and rotor stability on resolution requirements were investigated. We also determined the ability of clinically used endocardial catheters to identify AF mechanisms using clinically recorded and simulated data. The spatial resolution required for correct identification of rotors and focal sources is a linear function of spatial wavelength (the distance between wavefronts) of the arrhythmia. Rotor localization errors are larger for electrogram data than for action potential data. Stationary rotors are more reliably identified compared with meandering trajectories, for any given spatial resolution. All clinical high-resolution multipolar catheters are of sufficient resolution to accurately detect and track rotors when placed over the rotor core, although the low-resolution basket catheter is prone to false detections and may incorrectly identify rotors that are not present. Conclusions— The spatial resolution of AF data can significantly affect the interpretation of the underlying AF mechanism. Therefore, the interpretation of human AF data must be taken in the context of the spatial resolution of the recordings. PMID:28500175
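The central relationship above, minimum mapping resolution scaling linearly with the arrhythmia's spatial wavelength, can be illustrated with a back-of-envelope calculation. The Nyquist-style factor of two and the example numbers below are assumptions for illustration, not values from the study:

```python
def max_electrode_spacing(conduction_velocity_mm_ms, cycle_length_ms,
                          samples_per_wavelength=2.0):
    """Spatial wavelength = conduction velocity x cycle length; an
    (assumed) Nyquist-style rule then caps inter-electrode spacing at
    wavelength / samples_per_wavelength."""
    wavelength_mm = conduction_velocity_mm_ms * cycle_length_ms
    return wavelength_mm / samples_per_wavelength

# Illustrative persistent-AF values: 0.5 mm/ms conduction, 180 ms cycle length.
spacing = max_electrode_spacing(0.5, 180)
print(f"max inter-electrode spacing ~ {spacing:.0f} mm under these assumptions")
```

The linearity is the point: halving the spatial wavelength (faster, more disorganised activation) halves the tolerable electrode spacing, which is why a fixed-resolution catheter can be adequate for one arrhythmia and misleading for another.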
Method of interpretation of remotely sensed data and applications to land use
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Dossantos, A. P.; Foresti, C.; Demoraesnovo, E. M. L.; Niero, M.; Lombardo, M. A.
1981-01-01
Instructional material describing a methodology of remote sensing data interpretation and examples of applications to land use survey are presented. The image interpretation elements are discussed for different types of sensor systems: aerial photographs, radar, and MSS/LANDSAT. Visual and automatic LANDSAT image interpretation is emphasized.
NASA Astrophysics Data System (ADS)
Lohweg, Volker; Schaede, Johannes; Türke, Thomas
2006-02-01
The authenticity checking and inspection of bank notes is a highly labour-intensive process in which, traditionally, every note on every sheet is inspected manually. However, with the advent of increasingly sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more print techniques and new security features are established, total quality of security, authenticity and bank note printing must be assured, which calls for a broader sensorial concept. We propose a concept for both authenticity checking and inspection methods for pattern recognition and classification for securities and banknotes, based on sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used for vending or sorting machines as well as for printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines visible camera images with IR-spectral sensitive sensors and with acoustical and other measurements, such as the temperature and pressure of printing machines.
Combining nutrient intake from food/beverages and vitamin/mineral supplements.
Garriguet, Didier
2010-12-01
To calculate total intake of a nutrient and estimate inadequate intake for a population, the amounts derived from food/beverages and from vitamin/mineral supplements must be combined. The two methods Statistics Canada has suggested present problems of interpretation. Data collected from 34,386 respondents to the 2004 Canadian Community Health Survey-Nutrition were used to compare four methods of combining nutrient intake from food/beverages and vitamin/mineral supplements: adding average intake from supplements to the 24-hour food/beverage recall and estimating the usual distribution in the population (Method 1); estimating usual individual intake from food/beverages and adding intake from supplements (Method 2); and dividing the population into supplement users and non-users and applying Method 1 or Method 2 and combining the estimates based on the percentages of users and non-users (Methods 3 and 4). Interpretation problems arise with Methods 1 and 2; for example, the percentage of the population with inadequate intake of vitamin C and folate equivalents falls outside the expected minimum-maximum range. These interpretation problems are not observed with Methods 3 and 4. Interpretation problems that may arise in combining food and supplement intake of a given nutrient are overcome if the population is divided into supplement users and non-users before Method 1 or Method 2 is applied.
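The user/non-user split behind Methods 3 and 4 reduces to a mixture calculation: estimate the prevalence of inadequate intake separately in each subgroup, then weight by subgroup share. A minimal sketch, with all numbers illustrative rather than taken from the survey:

```python
def combined_inadequate_prevalence(p_users, prev_users, prev_nonusers):
    """Population prevalence of inadequate intake when supplement users
    and non-users are modelled separately (the logic of Methods 3 and 4)."""
    return p_users * prev_users + (1.0 - p_users) * prev_nonusers

# Illustrative: 40% supplement users; 5% of users and 30% of non-users
# fall below the estimated average requirement for some nutrient.
overall = combined_inadequate_prevalence(0.40, 0.05, 0.30)
print(f"{overall:.0%} of the population below requirement")
```

Because each subgroup's usual-intake distribution is estimated on its own, neither estimate is distorted by mixing a bimodal population, which is the interpretation problem the abstract reports for Methods 1 and 2.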
20 CFR 602.11 - Secretary's interpretation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Secretary's interpretation. 602.11 Section 602.11 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR QUALITY CONTROL IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Federal Requirements § 602.11 Secretary's...
48 CFR 9901.305 - Requirements for standards and interpretive rulings.
Code of Federal Regulations, 2014 CFR
2014-10-01
... promulgation of cost accounting standards and interpretations thereof, the Board shall: (a) Take into account, after consultation and discussion with the Comptroller General, professional accounting organizations... ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET...
48 CFR 9901.305 - Requirements for standards and interpretive rulings.
Code of Federal Regulations, 2013 CFR
2013-10-01
... promulgation of cost accounting standards and interpretations thereof, the Board shall: (a) Take into account, after consultation and discussion with the Comptroller General, professional accounting organizations... ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET...
48 CFR 9901.305 - Requirements for standards and interpretive rulings.
Code of Federal Regulations, 2011 CFR
2011-10-01
... promulgation of cost accounting standards and interpretations thereof, the Board shall: (a) Take into account, after consultation and discussion with the Comptroller General, professional accounting organizations... ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET...
48 CFR 9901.305 - Requirements for standards and interpretive rulings.
Code of Federal Regulations, 2012 CFR
2012-10-01
... promulgation of cost accounting standards and interpretations thereof, the Board shall: (a) Take into account, after consultation and discussion with the Comptroller General, professional accounting organizations... ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET...
48 CFR 9901.305 - Requirements for standards and interpretive rulings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... promulgation of cost accounting standards and interpretations thereof, the Board shall: (a) Take into account, after consultation and discussion with the Comptroller General, professional accounting organizations... ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS SUBSTANTIAL PRODUCT HAZARD REPORTS General Interpretation § 1115.1 Purpose. The purpose of this part 1115 is to set forth the Consumer Product Safety Commission's (Commission's) interpretation of the reporting requirements imposed on...
Pugh, Aaron L.
2014-01-01
Users of streamflow information often require streamflow statistics and basin characteristics at various locations along a stream. The USGS periodically calculates and publishes streamflow statistics and basin characteristics for streamflow-gaging stations and partial-record stations, but these data commonly are scattered among many reports that may or may not be readily available to the public. The USGS also provides and periodically updates regional analyses of streamflow statistics that include regression equations and other prediction methods for estimating statistics for ungaged and unregulated streams across the State. Use of these regional predictions for a stream can be complex and often requires the user to determine a number of basin characteristics that may require interpretation. Basin characteristics may include drainage area, classifiers for physical properties, climatic characteristics, and other inputs. Obtaining these input values for gaged and ungaged locations traditionally has been time consuming, subjective, and can lead to inconsistent results.
Evaluating the Veterans Health Administration's Staffing Methodology Model: A Reliable Approach.
Taylor, Beth; Yankey, Nicholas; Robinson, Claire; Annis, Ann; Haddock, Kathleen S; Alt-White, Anna; Krein, Sarah L; Sales, Anne
2015-01-01
All Veterans Health Administration facilities have been mandated to use a standardized method of determining appropriate direct-care staffing by nursing personnel. A multi-step process was designed to lead to projection of the full-time equivalent employees required for safe and effective care across all inpatient units. These projections were intended to develop appropriate budgets for each facility. While staffing levels can be increased, even in facilities subject to budget and personnel caps, doing so requires considerable commitment at all levels of the facility. This commitment must extend from front-line nursing personnel to senior leadership, not only in nursing and patient care services, but throughout the hospital. Learning to interpret and rely on data requires a considerable shift in thinking for many facilities, which have relied on historical levels to budget for staffing, an approach that does not take into account the dynamic character of nursing units and patient need.
Hedden, Sarra L; Woolson, Robert F; Carter, Rickey E; Palesch, Yuko; Upadhyaya, Himanshu P; Malcolm, Robert J
2009-07-01
"Loss to follow-up" can be substantial in substance abuse clinical trials. When extensive losses to follow-up occur, one must cautiously analyze and interpret the findings of a research study. Aims of this project were to introduce the types of missing data mechanisms and describe several methods for analyzing data with loss to follow-up. Furthermore, a simulation study compared Type I error and power of several methods when missing data amount and mechanism varies. Methods compared were the following: Last observation carried forward (LOCF), multiple imputation (MI), modified stratified summary statistics (SSS), and mixed effects models. Results demonstrated nominal Type I error for all methods; power was high for all methods except LOCF. Mixed effect model, modified SSS, and MI are generally recommended for use; however, many methods require that the data are missing at random or missing completely at random (i.e., "ignorable"). If the missing data are presumed to be nonignorable, a sensitivity analysis is recommended.
From SOPs to Reports to Evaluations: Learning and Memory ...
In an era of global trade and regulatory cooperation, consistent and scientifically based interpretation of developmental neurotoxicity (DNT) studies is essential. Because there is flexibility in the selection of test method(s), consistency can be especially challenging for learning and memory tests required by EPA and OECD DNT guidelines (chemicals and pesticides) and recommended for ICH prenatal/postnatal guidelines (pharmaceuticals). A well reasoned uniform approach is particularly important for variable endpoints and if non-standard tests are used. An understanding of the purpose behind the tests and expected outcomes is critical, and attention to elements of experimental design, conduct, and reporting can improve study design by the investigator as well as accuracy and consistency of interpretation by evaluators. This understanding also directs which information must be clearly described in study reports. While missing information may be available in standardized operating procedures (SOPs), if not clearly reflected in report submissions there may be questions and misunderstandings by evaluators which could impact risk assessments. A practical example will be presented to provide insights into important variables and reporting approaches. Cognitive functions most often tested in guidelines studies include associative, positional, sequential, and spatial learning and memory in weanling and adult animals. These complex behaviors tap different bra
Kidd, Monica; Nixon, Lara; Rosenal, Tom; Jackson, Roberta; Pereles, Laurie; Mitchell, Ian; Bendiak, Glenda; Hughes, Lisa
2016-01-01
Vulnerable persons often face stigma-related barriers while seeking health care. Innovative education and professional development methods are needed to help change this. We describe an interdisciplinary group workshop designed around a discomfiting oil portrait, intended to trigger provocative conversations among health care students and practitioners, and we present our mixed methods analysis of participant reflections. After the workshop, participants were significantly more likely to endorse the statements that the observation and interpretive skills involved in viewing visual art are relevant to patient care and that visual art should be used in medical education to improve students' observational skills, narrative skills, and empathy with their patients. Subsequent to the workshop, significantly more participants agreed that art interpretation should be required curriculum for health care students. Qualitative comments from two groups from two different education and professional contexts were examined for themes; conversations focused on issues of power, body image/self-esteem, and lessons for clinical practice. We argue that difficult conversations about affective responses to vulnerable persons are possible in a collaborative context using well-chosen works of visual art that can stand in for a patient.
Learning from samples of one or fewer*
March, J; Sproull, L; Tamuz, M
2003-01-01
Organizations learn from experience. Sometimes, however, history is not generous with experience. We explore how organizations convert infrequent events into interpretations of history, and how they balance the need to achieve agreement on interpretations with the need to interpret history correctly. We ask what methods are used, what problems are involved, and what improvements might be made. Although the methods we observe are not guaranteed to lead to consistent agreement on interpretations, valid knowledge, improved organizational performance, or organizational survival, they provide possible insights into the possibilities for and problems of learning from fragments of history. PMID:14645764
Geiser, Christian; Burns, G. Leonard; Servera, Mateu
2014-01-01
Models of confirmatory factor analysis (CFA) are frequently applied to examine the convergent validity of scores obtained from multiple raters or methods in so-called multitrait-multimethod (MTMM) investigations. We show that interesting incremental information about method effects can be gained from including mean structures and tests of measurement invariance (MI) across methods in MTMM models. We present a modeling framework for testing MI in the first step of a CFA-MTMM analysis. We also discuss the relevance of MI in the context of four more complex CFA-MTMM models with method factors. We focus on three recently developed multiple-indicator CFA-MTMM models for structurally different methods [the correlated traits-correlated (methods – 1), latent difference, and latent means models; Geiser et al., 2014a; Pohl and Steyer, 2010; Pohl et al., 2008] and one model for interchangeable methods (Eid et al., 2008). We demonstrate that some of these models require or imply MI by definition for a proper interpretation of trait or method factors, whereas others do not, and explain why MI may or may not be required in each model. We show that in the model for interchangeable methods, testing for MI is critical for determining whether methods can truly be seen as interchangeable. We illustrate the theoretical issues in an empirical application to an MTMM study of attention deficit and hyperactivity disorder (ADHD) with mother, father, and teacher ratings as methods. PMID:25400603
Interpretations, perspectives and intentions in surrogate motherhood
van Zyl, L.; van Niekerk, A.
2000-01-01
In this paper we examine the questions "What does it mean to be a surrogate mother?" and "What would be an appropriate perspective for a surrogate mother to have on her pregnancy?" In response to the objection that such contracts are alienating or dehumanising since they require women to suppress their evolving perspective on their pregnancies, liberal supporters of surrogate motherhood argue that the freedom to contract includes the freedom to enter a contract to bear a child for an infertile couple. After entering the contract the surrogate may not be free to interpret her pregnancy as that of a non-surrogate mother, but there is more than one appropriate way of interpreting one's pregnancy. To restrict or ban surrogacy contracts would be to prohibit women from making other particular interpretations of their pregnancies they may wish to make, requiring them to live up to a culturally constituted image of ideal motherhood. We examine three interpretations of a "surrogate pregnancy" that are implicit in the views and arguments put forward by ethicists, surrogacy agencies, and surrogate mothers themselves. We hope to show that our concern in this regard goes beyond the view that surrogacy contracts deny or suppress the natural, instinctive or conventional interpretation of pregnancy. Key Words: Surrogate motherhood • parental rights and responsibilities PMID:11055048
Redman-MacLaren, Michelle; Mills, Jane; Tommbe, Rachael
2014-01-01
Background Participatory approaches to qualitative research practice constantly change in response to evolving research environments. Researchers are increasingly encouraged to undertake secondary analysis of qualitative data, despite epistemological and ethical challenges. Interpretive focus groups can be described as a more participative method for groups to analyse qualitative data. Objective To facilitate interpretive focus groups with women in Papua New Guinea to extend analysis of existing qualitative data and co-create new primary data. The purpose of this was to inform a transformational grounded theory and subsequent health promoting action. Design A two-step approach was used in a grounded theory study about how women experience male circumcision in Papua New Guinea. Participants analysed portions or ‘chunks’ of existing qualitative data in story circles and built upon this analysis by using the visual research method of storyboarding. Results New understandings of the data were evoked when women in interpretive focus groups analysed the data ‘chunks’. Interpretive focus groups encouraged women to share their personal experiences about male circumcision. The visual method of storyboarding enabled women to draw pictures to represent their experiences. This provided an additional focus for whole-of-group discussions about the research topic. Conclusions Interpretive focus groups offer opportunity to enhance trustworthiness of findings when researchers undertake secondary analysis of qualitative data. The co-analysis of existing data and co-generation of new data between research participants and researchers informed an emergent transformational grounded theory and subsequent health promoting action. PMID:25138532
Video medical interpretation over 3G cellular networks: a feasibility study.
Locatis, Craig; Williamson, Deborah; Sterrett, James; Detzler, Isabel; Ackerman, Michael
2011-12-01
To test the feasibility of using cell phone technology to provide video medical interpretation services at a distance. Alternative cell phone services were researched and videoconferencing technologies were tried out to identify the video products and telecommunication services needed to meet video medical interpretation requirements. The video and telecommunication technologies were then evaluated in a pharmacy setting and compared with use of the telephone. Outcomes were similar to findings in previous research involving video medical interpretation with higher bandwidth and video quality. Patients appreciated the interpretation service no matter how it was provided, while health providers and interpreters preferred video. It is possible to provide video medical interpretation services via cellular communication using lower bandwidth videoconferencing technology that provides sufficient quality, at least in pharmacy settings. However, a number of issues need to be addressed to ensure quality of service.
40 CFR 168.85 - Other export requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Other export requirements. 168.85... STATEMENTS OF ENFORCEMENT POLICIES AND INTERPRETATIONS Export Policy and Procedures for Exporting Unregistered Pesticides § 168.85 Other export requirements. This section describes other requirements found in...
Detection of coupling delay: A problem not yet solved
NASA Astrophysics Data System (ADS)
Coufal, David; Jakubík, Jozef; Jajcay, Nikola; Hlinka, Jaroslav; Krakovská, Anna; Paluš, Milan
2017-08-01
Nonparametric detection of coupling delay in unidirectionally and bidirectionally coupled nonlinear dynamical systems is examined. Both continuous- and discrete-time systems are considered. Two methods of detection are assessed: the conditional mutual information (CMI) method (also known as the transfer entropy method) and the convergent cross mapping (CCM) method. Computer simulations show that neither method is generally reliable in the detection of coupling delays. For continuous-time chaotic systems, the CMI method appears to be more sensitive and applicable over a broader range of coupling parameters than the CCM method. For the tested discrete-time dynamical systems, the CCM method was found to be more sensitive, while the CMI method required a much stronger coupling strength to yield correct results. However, when the studied systems contain a strong oscillatory component in their dynamics, the results of both methods become ambiguous. The presented study suggests that the results of the tested algorithms should be interpreted with utmost care and that the nonparametric detection of coupling delay, in general, is a problem not yet solved.
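Both methods share one piece of machinery: scan candidate delays and score a dependence measure at each lag. As a simplified stand-in, the sketch below scores lags with plain cross-correlation rather than conditional mutual information; that is a much weaker measure (it misses nonlinear dependence), used here only to keep the example self-contained:

```python
import math
import random

def best_lag(x, y, max_lag):
    """Return the lag d in 1..max_lag maximizing |corr(x[t], y[t+d])|.
    Cross-correlation stands in for the CMI / transfer-entropy score."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
        va = math.sqrt(sum((p - ma) ** 2 for p in a))
        vb = math.sqrt(sum((q - mb) ** 2 for q in b))
        return cov / (va * vb)
    scores = {d: abs(corr(x[:-d], y[d:])) for d in range(1, max_lag + 1)}
    return max(scores, key=scores.get)

# Toy unidirectional coupling: y follows x with a delay of 5 samples.
random.seed(1)
x = [random.gauss(0, 1) for _ in range(2000)]
y = [0.0] * 5 + [xi + random.gauss(0, 0.1) for xi in x[:-5]]
print(best_lag(x, y, max_lag=10))  # 5
```

The abstract's caveat applies directly to this scan-and-score logic: when the driving signal is strongly oscillatory, the score becomes nearly periodic in the lag, several candidate delays score comparably, and the argmax stops being informative.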
French, Julian M
2014-07-01
Variation in the interpretation of the regulatory guidelines has resulted in a diversity of techniques employed to examine the internal structures of the foetal rabbit head. Examination of the foetal rabbit brain, using a single transverse section as the sole technique, is considered not to be sufficiently thorough to be regarded as an adequate examination method. It is not compliant with published EPA and OECD guidelines covering required examination of the internal head structures, nor is it considered to conform to the spirit of the safety assessment required by the ICH guideline. Fixation of approximately half of the heads in each litter to allow the examination of multiple transverse sections enables the major structures within the head to be assessed effectively. This method is compliant with current guidelines, represents "good practice" and should be consistently adopted for the examination of the internal head structures of the term rabbit foetus. Copyright © 2014 Elsevier Inc. All rights reserved.
Schmidt, Mark E; Chiao, Ping; Klein, Gregory; Matthews, Dawn; Thurfjell, Lennart; Cole, Patricia E; Margolin, Richard; Landau, Susan; Foster, Norman L; Mason, N Scott; De Santi, Susan; Suhy, Joyce; Koeppe, Robert A; Jagust, William
2015-09-01
In vivo imaging of amyloid burden with positron emission tomography (PET) provides a means for studying the pathophysiology of Alzheimer's and related diseases. Measurement of subtle changes in amyloid burden requires quantitative analysis of image data. Reliable quantitative analysis of amyloid PET scans acquired at multiple sites and over time requires rigorous standardization of acquisition protocols, subject management, tracer administration, image quality control, and image processing and analysis methods. We review critical points in the acquisition and analysis of amyloid PET, identify ways in which technical factors can contribute to measurement variability, and suggest methods for mitigating these sources of noise. Improved quantitative accuracy could reduce the sample size necessary to detect intervention effects when amyloid PET is used as a treatment end point and allow more reliable interpretation of change in amyloid burden and its relationship to clinical course. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
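One common quantitative endpoint in amyloid PET is the standardized uptake value ratio (SUVR), the mean tracer uptake in cortical target regions divided by uptake in a reference region; the sketch below uses invented numbers purely for illustration.

```python
import numpy as np

# SUVR sketch: mean target-region uptake over a reference-region value.
# The uptake values and region choice are illustrative, not study data.
target = np.array([1.42, 1.38, 1.51, 1.47])   # mean SUV per cortical target ROI
reference = 1.05                              # reference-region (e.g. cerebellar) mean SUV
suvr = target.mean() / reference
print(round(suvr, 2))
```

Standardizing how target and reference regions are delineated is one of the processing choices the review identifies as a source of measurement variability.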
Lin, Abraham; Harris, Mitchell; Zalis, Michael
2010-07-01
Electronic medical record (EMR) systems permit integration of contextual nonimaging EMR data into examination interpretation; however, the extra effort required to search and review these nonradiologic data is not well characterized. We assessed the gross frequency and pattern of EMR usage in the interpretation of diagnostic CT and MRI examinations. We defined nonradiologic EMR data as laboratory data, nonimaging specialty reports, clinical notes, and administrative data not available on PACS. For abdominal, neuroradiologic, and musculoskeletal CT and MRI, we prospectively recorded the time required for image analysis (including prior imaging studies and their reports), nonradiologic EMR use, and initial report drafting by fellows and staff in randomized sessions. We assessed EMR use as a fraction of work activity and according to technique, subspecialty, inpatient status, and radiologist experience. We observed 372 CT and MRI interpretations by 33 radiologists. For CT, radiologists used the EMR in 34% of abdominal, 57% of neuroradiologic, and 38% of musculoskeletal interpretations. For MRI, the EMR was used in 73% of abdominal, 56% of neuroradiologic, and 33% of musculoskeletal interpretations. For CT, EMR usage comprised 18%, 14%, and 18% of diagnostic effort (image analysis plus EMR use) for abdominal, neuroradiologic, and musculoskeletal interpretations, respectively; for MRI, EMR usage comprised 21%, 16%, and 15% of diagnostic effort for abdominal, neuroradiologic, and musculoskeletal interpretations, respectively. Frequency of EMR use was significantly greater for neuroradiology CT and abdominal MRI (p < 0.05, Fisher's test). EMR usage was not consistently related to inpatient status or to radiologist experience. For CT and MRI interpretation, EMR usage is frequent and comprises a significant fraction of diagnostic effort.
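The group comparison reported above relies on Fisher's exact test; a minimal sketch with scipy, using hypothetical 2x2 counts (not the study's actual per-category tallies), looks like this:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table (counts are illustrative, not from the study):
# rows = neuroradiology CT vs. abdominal CT; cols = EMR used / not used.
table = [[34, 26],   # neuro CT:     34 of 60 interpretations used the EMR
         [22, 42]]   # abdominal CT: 22 of 64 interpretations used the EMR
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(round(odds_ratio, 2), round(p, 4))
```

The exact test is preferred here over a chi-square approximation because per-cell counts in subspecialty breakdowns can be small.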
31 CFR Appendix C to Part 103 - Interpretive Rules
Code of Federal Regulations, 2010 CFR
2010-07-01
...-money laundering programs. See 31 CFR 103.125. Specifically, this Interpretive Guidance clarifies that the anti-money laundering program regulation requires Money Services Businesses to establish adequate and appropriate policies, procedures, and controls commensurate with the risks of money laundering and...
What Bell proved: A reply to Blaylock
NASA Astrophysics Data System (ADS)
Maudlin, Tim
2010-01-01
Blaylock argues that the derivation of Bell's inequality requires a hidden assumption, counterfactual definiteness, of which Bell was unaware. A careful analysis of Bell's argument shows that Bell presupposes only locality and the predictions of standard quantum mechanics. Counterfactual definiteness, insofar as it is required, is derived in the course of the argument rather than presumed. Bell's theorem has no direct bearing on the many worlds interpretation not because that interpretation denies counterfactual definiteness but because it does not recover the predictions of standard quantum mechanics.
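The locality constraint at issue can be illustrated numerically: enumerating every deterministic local assignment of outcomes shows that the CHSH combination never exceeds 2, whereas quantum mechanics predicts 2*sqrt(2). This sketch illustrates the standard CHSH bound, not Bell's original derivation.

```python
import itertools
import math

# Deterministic local strategies: Alice fixes outcomes A(a), A(a') in {-1, +1}
# and Bob fixes B(b), B(b').  For each strategy compute the CHSH combination
# S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
best = 0
for Aa, Aap, Bb, Bbp in itertools.product((-1, 1), repeat=4):
    S = Aa * Bb + Aa * Bbp + Aap * Bb - Aap * Bbp
    best = max(best, abs(S))

print(best)                         # local bound: 2
print(round(2 * math.sqrt(2), 3))   # quantum (Tsirelson) value: 2.828
```

Any local theory whose predictions are mixtures of such strategies inherits the bound of 2, which quantum correlations violate.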
Boriollo, Marcelo Fabiano Gomes; Rosa, Edvaldo Antonio Ribeiro; Gonçalves, Reginaldo Bruno; Höfling, José Francisco
2006-03-01
The typing of C. albicans by MLEE (multilocus enzyme electrophoresis) is dependent on the interpretation of enzyme electrophoretic patterns, and the study of the epidemiological relationships of these yeasts can be conducted by cluster analysis. Therefore, the aims of the present study were to first determine the discriminatory power of genetic interpretation (deduction of the allelic composition of diploid organisms) and numerical interpretation (mere determination of the presence and absence of bands) of MLEE patterns, and then to determine the concordance (Pearson product-moment correlation coefficient) and similarity (Jaccard similarity coefficient) of the groups of strains generated by three cluster analysis models, and the discriminatory power of such models as well [model A: genetic interpretation, genetic distance matrix of Nei (d(ij)) and UPGMA dendrogram; model B: genetic interpretation, Dice similarity matrix (S(D1)) and UPGMA dendrogram; model C: numerical interpretation, Dice similarity matrix (S(D2)) and UPGMA dendrogram]. MLEE was found to be a powerful and reliable tool for the typing of C. albicans due to its high discriminatory power (>0.9). Discriminatory power indicated that numerical interpretation is a method capable of discriminating a greater number of strains (47 versus 43 subtypes), but also pointed to model B as a method capable of providing a greater number of groups, suggesting its use for the typing of C. albicans by MLEE and cluster analysis. Very good agreement was only observed between the elements of the matrices S(D1) and S(D2), but a large majority of the groups generated in the three UPGMA dendrograms showed similarity S(J) between 4.8% and 75%, suggesting disparities in the conclusions obtained by the cluster assays.
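The numerical interpretation described above (presence/absence of bands, Dice similarity, UPGMA dendrogram) can be sketched with scipy; the band patterns below are invented for illustration. Note that scipy's 'dice' metric returns a dissimilarity (1 - S_D), and average linkage is the UPGMA criterion.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Toy presence/absence band patterns for four hypothetical strains
# (rows = strains, columns = electrophoretic bands).
bands = np.array([[1, 1, 0, 1, 0],
                  [1, 1, 0, 0, 0],
                  [0, 1, 1, 0, 1],
                  [0, 1, 1, 1, 1]], dtype=bool)

dice_dis = pdist(bands, metric="dice")      # dissimilarity = 1 - S_D
dice_sim = 1 - squareform(dice_dis)

tree = linkage(dice_dis, method="average")  # average linkage = UPGMA
groups = fcluster(tree, t=0.5, criterion="distance")
print(dice_sim.round(2))
print(groups)
```

Swapping the metric (e.g. Jaccard) or the interpretation of patterns changes the similarity matrix and hence the groups, which is exactly the sensitivity the study quantifies.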
Comparison of air-coupled GPR data analysis results determined by multiple analysts
NASA Astrophysics Data System (ADS)
Martino, Nicole; Maser, Ken
2016-04-01
Current bridge deck condition assessments using ground penetrating radar (GPR) requires a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.) For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion, requires that a user "picks" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" have information like signal amplitude and two way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck, was assessed with a number of nondestructive evaluation techniques including 2GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The post processing of the selected data points was then completed and the results from each analyst were contour plotted to observe any discrepancies. The paper describes the differences between ground coupled and air-coupled GPR systems, the data collection and analysis methods used by two different analysts for one case study, and the results of the two different analyses.
Novel hyperspectral prediction method and apparatus
NASA Astrophysics Data System (ADS)
Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf
2009-05-01
Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and by statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
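A matched-filter calibration of the kind described combines a measured signal spectrum s with a statistically estimated noise covariance Sigma via b = Sigma^-1 s / (s^T Sigma^-1 s), so that b . x predicts analyte content. The sketch below simulates this under assumed signal and noise shapes; it is not the Hyper-Cal implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
nchan = 50

# "Signal": pure-component spectrum of the analyte (a Gaussian band here).
s = np.exp(-0.5 * ((np.arange(nchan) - 25) / 4.0) ** 2)

# "Noise": covariance of everything that is not the analyte, estimated
# statistically from simulated baseline/interferent spectra.
baseline = rng.normal(scale=0.05, size=(400, nchan))
baseline += rng.normal(size=(400, 1)) * np.linspace(0.1, 0.3, nchan)  # drift
Sigma = np.cov(baseline, rowvar=False) + 1e-6 * np.eye(nchan)

# Matched filter: b = Sigma^-1 s / (s^T Sigma^-1 s); prediction is b . x.
b = np.linalg.solve(Sigma, s)
b /= s @ b

x = 0.7 * s + baseline[0]   # a pixel containing 0.7 units of the analyte
print(float(b @ x))
```

Because the prediction is a single dot product per pixel, it maps naturally onto the sustained-throughput FPGA pipeline the abstract describes.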
Heap, Marion; Sinanovic, Edina
2017-01-01
Background: The World Health Organisation estimates disabling hearing loss to be around 5.3%, while a study of hearing impairment and auditory pathology in Limpopo, South Africa found a prevalence of nearly 9%. Although Sign Language Interpreters (SLIs) improve the communication challenges in health care, they are unaffordable for many signing Deaf people and people with disabling hearing loss. On the other hand, there are no legal provisions in place to ensure the provision of SLIs in the health sector in most countries, including South Africa. To advocate for funding of such initiatives, reliable cost estimates are essential, and such data are scarce. To bridge this gap, this study estimated the costs of providing such a service within a South African District health service, based on estimates obtained from a pilot project that initiated the first South African Sign Language Interpreter (SASLI) service in health care. Methods: The ingredients method was used to calculate the unit cost per SASLI-assisted visit from a provider perspective. The unit costs per SASLI-assisted visit were then used in estimating the costs of scaling up this service to the District Health Services. The average annual SASLI utilisation rate per person was calculated on Stata v.12 using the project's registry from 2008–2013. Sensitivity analyses were carried out to determine the effect of changing the discount rate and personnel costs. Results: Average Sign Language Interpreter services' utilisation rates increased from 1.66 to 3.58 per person per year, with a median of 2 visits, from 2008–2013. The cost per visit was US$189.38 in 2013, whilst the estimated costs of scaling up this service ranged from US$14.2 million to US$76.5 million in the Cape Metropole District. These cost estimates represented 2.3%-12.2% of the budget for the Western Cape District Health Services for 2013.
Conclusions: In the presence of Sign Language Interpreters, Deaf Sign Language users utilise health care services to a similar extent as the hearing population. However, this service requires significant capital investment by government to enable access to healthcare for the Deaf. PMID:29272272
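The scale-up arithmetic behind such estimates is, at its simplest, unit cost times utilisation rate times covered population. The sketch below reuses the abstract's unit cost and utilisation range but assumes a made-up population figure.

```python
# Back-of-envelope scale-up: unit cost x utilisation x covered population.
unit_cost = 189.38                    # US$ per SASLI-assisted visit (2013)
visits_low, visits_high = 1.66, 3.58  # visits per person per year (2008-2013)
deaf_population = 50_000              # hypothetical covered population

low = unit_cost * visits_low * deaf_population
high = unit_cost * visits_high * deaf_population
print(f"annual cost: US${low / 1e6:.1f}M to US${high / 1e6:.1f}M")
```

The study's actual range additionally reflects sensitivity analyses over discount rates and personnel costs, which this sketch omits.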
Kane, J.S.; Evans, J.R.; Jackson, J.C.
1989-01-01
Accurate and precise determinations of tin in geological materials are needed for fundamental studies of tin geochemistry, and for tin prospecting purposes. Achieving the required accuracy is difficult because of the different matrices in which Sn can occur (i.e. sulfides, silicates and cassiterite), and because of the variability of literature values for Sn concentrations in geochemical reference materials. We have evaluated three methods for the analysis of samples for Sn concentration: graphite furnace atomic absorption spectrometry (HGA-AAS) following iodide extraction, inductively coupled plasma atomic emission spectrometry (ICP-OES), and energy-dispersive X-ray fluorescence (EDXRF) spectrometry. Two of these methods (HGA-AAS and ICP-OES) required sample decomposition either by acid digestion or fusion, while the third (EDXRF) was performed directly on the powdered sample. Analytical details of all three methods, their potential errors, and the steps necessary to correct these errors were investigated. Results showed that similar accuracy was achieved from all methods for unmineralized samples, which contain no known Sn-bearing phase. For mineralized samples, which contain Sn-bearing minerals, either cassiterite or stannous sulfides, only the EDXRF and fusion ICP-OES methods provided acceptable accuracy. This summary of our study provides information which helps to assure correct interpretation of data bases for underlying geochemical processes, regardless of method of data collection and its inherent limitations. © 1989.
Non-invasive neuroimaging using near-infrared light
NASA Technical Reports Server (NTRS)
Strangman, Gary; Boas, David A.; Sutton, Jeffrey P.
2002-01-01
This article reviews diffuse optical brain imaging, a technique that employs near-infrared light to non-invasively probe the brain for changes in parameters relating to brain function. We describe the general methodology, including types of measurements and instrumentation (including the tradeoffs inherent in the various instrument components), and the basic theory required to interpret the recorded data. A brief review of diffuse optical applications is included, with an emphasis on research that has been done with psychiatric populations. Finally, we discuss some practical issues and limitations that are relevant when conducting diffuse optical experiments. We find that, while diffuse optics can provide substantial advantages to the psychiatric researcher relative to the alternative brain imaging methods, the method remains substantially underutilized in this field.
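The basic theory for interpreting diffuse optical recordings is the modified Beer-Lambert law, which maps optical-density changes at two wavelengths to hemoglobin concentration changes; the extinction coefficients and measurement values below are illustrative placeholders, not tabulated constants.

```python
import numpy as np

# Modified Beer-Lambert law: dOD(lambda) = (e_HbO*dHbO + e_HbR*dHbR) * L * DPF.
# The extinction coefficients here are illustrative, not tabulated values.
E = np.array([[0.35, 2.05],    # 690 nm: [e_HbO, e_HbR]  (mM^-1 cm^-1)
              [1.06, 0.78]])   # 830 nm
L, DPF = 3.0, 6.0              # source-detector separation (cm), path-length factor

d_od = np.array([0.010, 0.014])            # measured optical-density changes
d_hb = np.linalg.solve(E * L * DPF, d_od)  # -> [dHbO, dHbR] in mM
print(d_hb)
```

Solving this 2x2 system per channel and time point is the core of converting raw intensity changes into the hemodynamic signals the review discusses.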
The resolving power of in vitro genotoxicity assays for cigarette smoke particulate matter.
Scott, K; Saul, J; Crooks, I; Camacho, O M; Dillon, D; Meredith, C
2013-06-01
In vitro genotoxicity assays are often used to compare tobacco smoke particulate matter (PM) from different cigarettes. The quantitative aspect of the comparisons requires appropriate statistical methods and replication levels, to support the interpretation in terms of power and significance. This paper recommends a uniform statistical analysis for the Ames test, mouse lymphoma mammalian cell mutation assay (MLA) and the in vitro micronucleus test (IVMNT); involving a hierarchical decision process with respect to slope, fixed effect and single dose comparisons. With these methods, replication levels of 5 (Ames test TA98), 4 (Ames test TA100), 10 (Ames test TA1537), 6 (MLA) and 4 (IVMNT) resolved a 30% difference in PM genotoxicity. Copyright © 2013 Elsevier Ltd. All rights reserved.
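The link between replication level and resolvable difference can be illustrated with a standard two-sample power calculation; the 20% coefficient of variation and the normal approximation below are assumptions for illustration, not figures from the paper.

```python
import math
from scipy.stats import norm

# Illustrative two-sample power calculation: replicates per group needed to
# resolve a 30% difference in mean response at 80% power, alpha = 0.05,
# assuming a coefficient of variation of 20% (a made-up figure).
delta, cv, alpha, power = 0.30, 0.20, 0.05, 0.80
z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
n_per_group = math.ceil(2 * ((z_a + z_b) * cv / delta) ** 2)
print(n_per_group)
```

The paper's assay-specific replication levels come from the same trade-off, with the variance term estimated empirically per assay and strain rather than assumed.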
Methods for Quantitative Interpretation of Retarding Field Analyzer Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calvey, J.R.; Crittenden, J.A.; Dugan, G.F.
2011-03-28
Over the course of the CesrTA program at Cornell, over 30 Retarding Field Analyzers (RFAs) have been installed in the CESR storage ring, and a great deal of data has been taken with them. These devices measure the local electron cloud density and energy distribution, and can be used to evaluate the efficacy of different cloud mitigation techniques. Obtaining a quantitative understanding of RFA data requires use of cloud simulation programs, as well as a detailed model of the detector itself. In a drift region, the RFA can be modeled by postprocessing the output of a simulation code, and one can obtain best fit values for important simulation parameters with a chi-square minimization method.
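The chi-square minimization step can be sketched generically: compare measured RFA signals against a parameterized model and minimize the weighted squared residuals. The exponential model form below is a toy stand-in, not the actual postprocessed simulation output.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the postprocessed simulation: predicted RFA current
# vs. retarding voltage for a given (cloud density, energy scale).
volts = np.linspace(0, 200, 40)

def model(v, density, e_scale):
    return density * np.exp(-v / e_scale)   # illustrative functional form

rng = np.random.default_rng(2)
truth = (5.0, 60.0)
sigma = 0.05
data = model(volts, *truth) + rng.normal(0, sigma, volts.size)

def chi2(p):
    return np.sum(((data - model(volts, *p)) / sigma) ** 2)

fit = minimize(chi2, x0=[1.0, 30.0], method="Nelder-Mead")
print(fit.x.round(1))
```

In practice the "model" is the simulated detector response, so each chi-square evaluation is far more expensive than this toy, but the fitting logic is the same.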
Laboratory Diagnosis of Parasites from the Gastrointestinal Tract.
Garcia, Lynne S; Arrowood, Michael; Kokoskin, Evelyne; Paltridge, Graeme P; Pillai, Dylan R; Procop, Gary W; Ryan, Norbert; Shimizu, Robyn Y; Visvesvara, Govinda
2018-01-01
This Practical Guidance for Clinical Microbiology document on the laboratory diagnosis of parasites from the gastrointestinal tract provides practical information for the recovery and identification of relevant human parasites. The document is based on a comprehensive literature review and expert consensus on relevant diagnostic methods. However, it does not include didactic information on human parasite life cycles, organism morphology, clinical disease, pathogenesis, treatment, or epidemiology and prevention. As greater emphasis is placed on neglected tropical diseases, it becomes highly probable that patients with gastrointestinal parasitic infections will become more widely recognized in areas where parasites are endemic and not endemic. Generally, these methods are nonautomated and require extensive bench experience for accurate performance and interpretation. Copyright © 2017 American Society for Microbiology.
10 CFR 72.5 - Interpretations.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 2 2011-01-01 2011-01-01 false Interpretations. 72.5 Section 72.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE General Provisions § 72.5...
10 CFR 72.5 - Interpretations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Interpretations. 72.5 Section 72.5 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) LICENSING REQUIREMENTS FOR THE INDEPENDENT STORAGE OF SPENT NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE General Provisions § 72.5...
Regression models for analyzing costs and their determinants in health care: an introductory review.
Gregori, Dario; Petrinco, Michele; Bo, Simona; Desideri, Alessandro; Merletti, Franco; Pagano, Eva
2011-06-01
This article aims to describe the various approaches in multivariable modelling of healthcare costs data and to synthesize the respective criticisms as proposed in the literature. We present regression methods suitable for the analysis of healthcare costs and then apply them to an experimental setting in cardiovascular treatment (COSTAMI study) and an observational setting in diabetes hospital care. We show how methods can produce different results depending on the degree of matching between the underlying assumptions of each method and the specific characteristics of the healthcare problem. The matching of healthcare cost models to the analytic objectives and characteristics of the data available to a study requires caution. The study results and interpretation can be heavily dependent on the choice of model with a real risk of spurious results and conclusions.
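A classic pitfall with cost data is retransformation bias when modelling log(cost) by OLS; Duan's smearing estimator is one standard correction. The sketch below uses simulated right-skewed costs (a made-up data-generating process, not the COSTAMI or diabetes data) to show the naive retransformation underestimating mean cost.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
age = rng.uniform(40, 80, n)
# Right-skewed costs with multiplicative error, typical of healthcare data.
cost = np.exp(1.0 + 0.03 * age + rng.normal(0, 0.8, n))

# Naive approach: OLS on log(cost), then exponentiate the linear predictor.
X = np.column_stack([np.ones(n), age])
beta, *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
naive = np.exp(X @ beta)

# Duan's smearing estimator corrects the retransformation bias E[exp(u)].
resid = np.log(cost) - X @ beta
smear = np.mean(np.exp(resid))
corrected = naive * smear

print(round(cost.mean(), 1), round(naive.mean(), 1), round(corrected.mean(), 1))
```

Gamma GLMs with a log link, which the review also covers, avoid the retransformation step entirely; which model matches the data best is exactly the caution the authors raise.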
Biomagnetism using SQUIDs: status and perspectives
NASA Astrophysics Data System (ADS)
Sternickel, Karsten; Braginski, Alex I.
2006-03-01
Biomagnetism involves the measurement and analysis of very weak local magnetic fields of living organisms and various organs in humans. Such fields can be of physiological origin or due to magnetic impurities or markers. This paper reviews existing and prospective applications of biomagnetism in clinical research and medical diagnostics. Currently, such applications require sensitive magnetic SQUID sensors and amplifiers. The practicality of biomagnetic methods depends especially on techniques for suppressing the dominant environmental electromagnetic noise, and on suitable nearly real-time data processing and interpretation methods. Of the many biomagnetic methods and applications, only the functional studies of the human brain (magnetoencephalography) and liver susceptometry are in clinical use, while functional diagnostics of the human heart (magnetocardiography) approaches the threshold of clinical acceptance. Particularly promising for the future is the ongoing research into low-field magnetic resonance anatomical imaging using SQUIDs.
NASA Astrophysics Data System (ADS)
Silalahi, R. L. R.; Mustaniroh, S. A.; Ikasari, D. M.; Sriulina, R. P.
2018-03-01
UD. Bunda Foods is an SME located in the district of Sidoarjo. UD. Bunda Foods has problems maintaining its milkfish's quality assurance and developing marketing strategies. Improving on these problems would enable UD. Bunda Foods to compete with other similar SMEs and to market its product for further expansion of the business. The objectives of this study were to determine the model of the institutional structure of the milkfish supply chain, and to determine the elements, the sub-elements, and the relationships among the elements. The method used in this research was Interpretive Structural Modeling (ISM), involving 5 experts as respondents, consisting of 1 practitioner, 1 academician, and 3 government organisation employees. The results showed two key elements: the requirement element and the goal element. Based on the Drive Power-Dependence (DP-D) matrix, the key sub-elements of the requirement element (raw material continuity, appropriate marketing strategy, and production capital) were positioned in the Linkage sector quadrant. The DP-D matrix for the key sub-elements of the goal element showed a similar position. The findings suggest several managerial implications to be carried out by UD. Bunda Foods, including establishing good relationships with all involved institutions, obtaining capital assistance, and attending the marketing training provided by the government.
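The ISM machinery referred to above (reachability matrix, driving power, dependence, sector classification) can be sketched in a few lines; the adjacency matrix here is hypothetical, not the study's actual structural self-interaction matrix.

```python
import numpy as np

# Hypothetical structural self-interaction matrix for five sub-elements
# (True = row element influences column element); values are illustrative.
A = np.array([[1, 1, 0, 1, 0],
              [0, 1, 1, 0, 0],
              [0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 0, 0, 1]], dtype=bool)

# Final reachability matrix: transitive closure via Warshall's algorithm.
R = A.copy()
for k in range(len(R)):
    R |= np.outer(R[:, k], R[k, :])

drive = R.sum(axis=1)   # driving power = row sums
dep = R.sum(axis=0)     # dependence    = column sums

# DP-D (MICMAC) quadrants: Linkage = high driving power AND high dependence.
half = len(R) / 2
for i, (dp, d) in enumerate(zip(drive, dep)):
    sector = ("Linkage" if dp > half and d > half else
              "Independent" if dp > half else
              "Dependent" if d > half else "Autonomous")
    print(f"sub-element {i + 1}: DP={dp}, D={d}, {sector}")
```

Sub-elements falling in the Linkage quadrant are both influential and influenced, which is why the study flags them as keys for intervention.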
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiley, H. S.
There comes a time in every field of science when things suddenly change. While it might not be immediately apparent that things are different, a tipping point has occurred. Biology is now at such a point. The reason is the introduction of high-throughput genomics-based technologies. I am not talking about the consequences of the sequencing of the human genome (and every other genome within reach). The change is due to new technologies that generate an enormous amount of data about the molecular composition of cells. These include proteomics, transcriptional profiling by sequencing, and the ability to globally measure microRNAs and post-translational modifications of proteins. These mountains of digital data can be mapped to a common frame of reference: the organism's genome. With the new high-throughput technologies, we can generate tens of thousands of data points from each sample. Data are now measured in terabytes and the time necessary to analyze data can now require years. Obviously, we can't wait to interpret the data fully before the next experiment. In fact, we might never be able to even look at all of it, much less understand it. This volume of data requires sophisticated computational and statistical methods for its analysis and is forcing biologists to approach data interpretation as a collaborative venture.
Teaching and Assessing Professionalism in Radiology Resident Education.
Kelly, Aine Marie; Gruppen, Larry D; Mullan, Patricia B
2017-05-01
Radiologists in teaching hospitals and in practices with residents rotating through are involved in the education of their residents. The Accreditation Council for Graduate Medical Education requires evidence that trainees are taught and demonstrate competency not only in medical knowledge and in patient care-the historic focus of radiology education-but also in the so-called non-interpretative core competencies, which include professionalism and interpersonal skills. In addition to accreditation agencies, the prominent assessment practices represented by the American Board of Radiology core and certifying examinations for trainees, as well as Maintenance of Certification for practitioners, are planning to feature more non-interpretative competency assessment, including professionalism to a greater extent. Because professionalism was incorporated as a required competency in medical education as a whole, more clarity about the justification and expected content for teaching about competence in professionalism, as well as greater understanding and evidence about appropriate and effective teaching and assessment methods, have emerged. This article summarizes justifications and expectations for teaching and assessing professionalism in radiology residents and best practices on how to teach and evaluate professionalism that can be used by busy radiology faculty in their everyday practice supervising radiology residents. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Toward Precision Healthcare: Context and Mathematical Challenges
Colijn, Caroline; Jones, Nick; Johnston, Iain G.; Yaliraki, Sophia; Barahona, Mauricio
2017-01-01
Precision medicine refers to the idea of delivering the right treatment to the right patient at the right time, usually with a focus on a data-centered approach to this task. In this perspective piece, we use the term “precision healthcare” to describe the development of precision approaches that bridge from the individual to the population, taking advantage of individual-level data, but also taking the social context into account. These problems give rise to a broad spectrum of technical, scientific, policy, ethical and social challenges, and new mathematical techniques will be required to meet them. To ensure that the science underpinning “precision” is robust, interpretable and well-suited to meet the policy, ethical and social questions that such approaches raise, the mathematical methods for data analysis should be transparent, robust, and able to adapt to errors and uncertainties. In particular, precision methodologies should capture the complexity of data, yet produce tractable descriptions at the relevant resolution while preserving intelligibility and traceability, so that they can be used by practitioners to aid decision-making. Through several case studies in this domain of precision healthcare, we argue that this vision requires the development of new mathematical frameworks, both in modeling and in data analysis and interpretation. PMID:28377724
Shoreline Change and Storm-Induced Beach Erosion Modeling: A Collection of Seven Papers
1990-03-01
reducing, and analyzing the data in a systematic manner. Most physical data needed for evaluating and interpreting shoreline and beach evolution processes... proposed development concepts using both physical and numerical models. b. Analyzed and interpreted model results. c. Provided technical documentation of... interpret study results in the context required for "Confirmation" hearings. The Corps of Engineers, Los Angeles District (SPL), has also begun studies
Comparison of two teaching methods for cardiac arrhythmia interpretation among nursing students.
Varvaroussis, Dimitrios P; Kalafati, Maria; Pliatsika, Paraskevi; Castrén, Maaret; Lott, Carsten; Xanthos, Theodoros
2014-02-01
The aim of this study was to compare the six-stage method (SSM) for instructing primary cardiac arrhythmia interpretation to students without basic electrocardiogram (ECG) knowledge with a descriptive teaching method in a single educational intervention. This is a randomized trial. Following a brief instructional session, undergraduate nursing students, assigned to group A (SSM) and group B (descriptive teaching method), undertook a written test in cardiac rhythm recognition immediately after the educational intervention (initial exam). Participants were also examined with an unannounced retention test (final exam) one month after instruction. Altogether 134 students completed the study. Interpretation accuracy for each cardiac arrhythmia was assessed. Mean score at the initial exam was 8.71±1.285 for group A and 8.74±1.303 for group B. Mean score at the final exam was 8.25±1.46 for group A vs 7.84±1.44 for group B. Overall results showed that the SSM was as effective as the descriptive teaching method. The study showed that in each group bradyarrhythmias were identified correctly by more students than tachyarrhythmias. No significant difference between the two teaching methods was seen for any specific cardiac arrhythmia. The SSM effectively develops competency in interpreting common cardiac arrhythmias in students without ECG knowledge. More research is needed to support this conclusion, and the method's effectiveness must be evaluated if implemented with trainee groups with preexisting basic ECG interpretation knowledge. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
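With the reported summary statistics, the between-group comparison at the final exam can be reproduced approximately; the even 67/67 split assumed below is an assumption, since the abstract does not give per-group sample sizes.

```python
from scipy.stats import ttest_ind_from_stats

# Final-exam summary statistics from the abstract; the per-group n of 67
# assumes the 134 students split evenly, which the abstract does not state.
t, p = ttest_ind_from_stats(mean1=8.25, std1=1.46, nobs1=67,
                            mean2=7.84, std2=1.44, nobs2=67)
print(round(t, 2), round(p, 3))
```

Under this assumption the difference in retention scores does not reach significance at the 0.05 level, consistent with the abstract's conclusion that the two methods were comparable.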
Deepwater Horizon - Estimating surface oil volume distribution in real time
NASA Astrophysics Data System (ADS)
Lehr, B.; Simecek-Beatty, D.; Leifer, I.
2011-12-01
Spill responders to the Deepwater Horizon (DWH) oil spill required both the relative spatial distribution and total oil volume of the surface oil. The former was needed on a daily basis to plan and direct local surface recovery and treatment operations. The latter was needed less frequently to provide information for strategic response planning. Unfortunately, the standard spill observation methods were inadequate for an oil spill this size, and new, experimental methods were not ready to meet the operational demands of near real-time results. Traditional surface oil estimation tools for large spills include satellite-based sensors to define the spatial extent (but not thickness) of the oil, complemented with trained observers in small aircraft, sometimes supplemented by active or passive remote sensing equipment, to determine surface percent coverage of the 'thick' part of the slick, where the vast majority of the surface oil exists. These tools were also applied to DWH in the early days of the spill, but the sheer size of the spill prevented synoptic observation of the surface slick from small aircraft. Also, satellite images of the spill, while large in number, varied considerably in image quality, requiring skilled interpretation to identify oil and eliminate false positives. Qualified staff to perform this task were soon in short supply. However, large spills are often events that overcome organizational inertia to the use of new technology. Two prime examples in DWH were the application of hyper-spectral scans from a high-altitude aircraft and more traditional fixed-wing aircraft using multi-spectral scans processed by a neural network to determine, respectively, absolute or relative oil thickness. But, with new technology, come new challenges.
The hyper-spectral instrument required special viewing conditions that were not present on a daily basis and analysis infrastructure to process the data that was not available at the command post. Very few days provided sufficient observation quality and spatial coverage. Future application of this method will require solving both the observational and analysis challenges demonstrated at DWH. Similarly, the multi-spectral scanner results could only be interpreted by a handful of individuals, causing some logistical problems incorporating the observational results with the incident command decisions. This roadblock may go away as the spill response community becomes more familiar with the technology.
Field camp: Using traditional methods to train the next generation of petroleum geologists
Puckette, J.O.; Suneson, N.H.
2009-01-01
The summer field camp experience provides many students with their best opportunity to learn the scientific process by making observations and collecting, recording, evaluating, and interpreting geologic data. Field school projects enhance student professional development by requiring cooperation and interpersonal interaction, report writing to communicate interpretations, and the development of project management skills to achieve a common goal. The field school setting provides students with the opportunity to observe geologic features and their spatial distribution, size, and shape that will impact students' future careers as geoscientists. The Les Huston Geology Field Camp (a.k.a. Oklahoma Geology Camp) near Cañon City, Colorado, focuses on time-tested traditional methods of geological mapping and fieldwork to accomplish these goals. The curriculum consists of an introduction to field techniques (pacing, orienteering, measuring strike and dip, and using a Jacob's staff), sketching outcrops, section measuring (one illustrating facies changes), three mapping exercises (of increasing complexity), and a field geophysics project. Accurate rock and contact descriptions are emphasized, and attitudes and contacts are mapped in the field. Mapping is done on topographic maps at 1:12,000 and 1:6000 scales; air photos are provided. Global positioning system (GPS)-assisted mapping is allowed, but we insist that locations be recorded in the field and confirmed using visual observations. The course includes field trips to the Cripple Creek and Leadville mining districts, Florissant/Guffey volcano area, Pikes Peak batholith, and the Denver Basin. Each field trip is designed to emphasize aspects of geology that are not stressed in the field exercises. Students are strongly encouraged to accurately describe geologic features and gather evidence to support their interpretations of the geologic history. Concise reports are a part of each major exercise.
Students are grouped into teams to (1) introduce the team concept and develop interpersonal skills that are fundamental components of many professions, (2) ensure safety, and (3) mix students with varying academic backgrounds and physical strengths. This approach has advantages and disadvantages. Students with academic strengths in specific areas assist those with less experience, thereby becoming engaged in the teaching process. However, some students contribute less to final map projects than others, and assigning grades to individual team members can be difficult. The greatest challenges we face involve group dynamics and student personalities. We continue to believe that traditional field methods, aided by (but not relying upon) new technologies, are the key to constructing and/or interpreting geologic maps. The requirement that students document field evidence using careful observations teaches skills that will be beneficial throughout their professional careers. © 2009 The Geological Society of America. All rights reserved.
Electrocardiogram interpretation and arrhythmia management: a primary and secondary care survey.
Begg, Gordon; Willan, Kathryn; Tyndall, Keith; Pepper, Chris; Tayebjee, Muzahir
2016-05-01
There is increasing desire among service commissioners to treat arrhythmia in primary care. Accurate interpretation of the electrocardiogram (ECG) is fundamental to this. ECG interpretation has previously been shown to vary widely but there is little recent data. To examine the interpretation of ECGs in primary and secondary care. A cross-sectional survey of participants' interpretation of six ECGs and hypothetical management of patients based on those ECGs, at primary care educational events, and a cardiology department in Leeds. A total of 262 primary care clinicians and 20 cardiology clinicians were surveyed via questionnaire. Answers were compared with expert electrophysiologist opinion. In primary care, abnormal ECGs were interpreted as normal by 23% of responders. ST elevation and prolonged QT were incorrectly interpreted as normal by 1% and 22%, respectively. In cardiology, abnormal ECGs were interpreted as normal by 3%. ECG provision and interpretation remains inconsistent in both primary and secondary care. Primary care practitioners are less experienced and less confident with ECG interpretation than cardiologists, and require support in this area. © British Journal of General Practice 2016.
ERIC Educational Resources Information Center
Seigler, Timothy John
2005-01-01
The purpose of this article is to 1) examine the interpretive method applied to the United States Constitution referred to as "Original Intent" and the degree, if any, to which it is superior in objectivity to other methods, 2) discuss whether the application of the interpretive method would have an effect preferred by conservative or…
The generalized scattering coefficient method for plane wave scattering in layered structures
NASA Astrophysics Data System (ADS)
Liu, Yu; Li, Chao; Wang, Huai-Yu; Zhou, Yun-Song
2017-02-01
The generalized scattering coefficient (GSC) method is pedagogically derived and employed to study the scattering of plane waves in homogeneous and inhomogeneous layered structures. The numerical stabilities and accuracies of this method and other commonly used numerical methods are discussed and compared. For homogeneous layered structures, concise scattering formulas with clear physical interpretations and strong numerical stability are obtained by introducing the GSCs. For inhomogeneous layered structures, three numerical methods are employed: the staircase approximation method, the power series expansion method, and the differential equation method based on the GSCs. We investigate the accuracies and convergence behaviors of these methods by comparing their predictions to the exact results. The conclusions are as follows. The staircase approximation method has a slow convergence in spite of its simple and intuitive implementation, and a fine stratification within the inhomogeneous layer is required for obtaining accurate results. The expansion method results are sensitive to the expansion order, and the treatment becomes very complicated for relatively complex configurations, which restricts its applicability. By contrast, the GSC-based differential equation method possesses a simple implementation while providing fast and accurate results.
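The staircase idea the abstract describes (slicing a graded structure into thin homogeneous slabs and chaining per-slab matrices) can be sketched with the classical characteristic-matrix method for normal-incidence plane waves. This is a generic illustration of that family of methods, not the authors' GSC formulation; all refractive indices and thicknesses below are arbitrary examples.

```python
import cmath
import math

def characteristic_matrix(n, d, wavelength):
    """2x2 characteristic matrix of one homogeneous slab (normal incidence)."""
    delta = 2 * math.pi * n * d / wavelength
    c, s = cmath.cos(delta), cmath.sin(delta)
    return [[c, 1j * s / n], [1j * n * s, c]]

def matmul2(a, b):
    """Multiply two 2x2 matrices."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def reflectance_transmittance(layers, n_in=1.0, n_out=1.0, wavelength=500e-9):
    """Reflectance and transmittance of a slab stack between two half-spaces.

    layers: list of (refractive_index, thickness) slabs, incidence side first.
    """
    m = [[1, 0], [0, 1]]
    for n, d in layers:
        m = matmul2(m, characteristic_matrix(n, d, wavelength))
    den = n_in * (m[0][0] + m[0][1] * n_out) + (m[1][0] + m[1][1] * n_out)
    r = (n_in * (m[0][0] + m[0][1] * n_out) - (m[1][0] + m[1][1] * n_out)) / den
    t = 2 * n_in / den
    return abs(r) ** 2, (n_out / n_in) * abs(t) ** 2

def graded_layers(n_profile, total_d, slices):
    """Staircase approximation: sample a graded index at slab midpoints."""
    dz = total_d / slices
    return [(n_profile((i + 0.5) * dz), dz) for i in range(slices)]
```

For lossless (real-index) slabs, energy conservation gives R + T = 1, which serves as a quick sanity check; the slow convergence noted in the abstract corresponds to needing many `slices` in `graded_layers` before R and T stabilize.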
A guide to understanding social science research for natural scientists.
Moon, Katie; Blackman, Deborah
2014-10-01
Natural scientists are increasingly interested in social research because they recognize that conservation problems are commonly social problems. Interpreting social research, however, requires at least a basic understanding of the philosophical principles and theoretical assumptions of the discipline, which are embedded in the design of social research. Natural scientists who engage in social science but are unfamiliar with these principles and assumptions can misinterpret their results. We developed a guide to assist natural scientists in understanding the philosophical basis of social science to support the meaningful interpretation of social research outcomes. The 3 fundamental elements of research are ontology, what exists in the human world that researchers can acquire knowledge about; epistemology, how knowledge is created; and philosophical perspective, the philosophical orientation of the researcher that guides her or his action. Many elements of the guide also apply to the natural sciences. Natural scientists can use the guide to assist them in interpreting social science research to determine how the ontological position of the researcher can influence the nature of the research; how the epistemological position can be used to support the legitimacy of different types of knowledge; and how philosophical perspective can shape the researcher's choice of methods and affect interpretation, communication, and application of results. The use of this guide can also support and promote the effective integration of the natural and social sciences to generate more insightful and relevant conservation research outcomes. © 2014 Society for Conservation Biology.
Design and interpretation of anthropometric and fitness testing of basketball players.
Drinkwater, Eric J; Pyne, David B; McKenna, Michael J
2008-01-01
The volume of literature on fitness testing in court sports such as basketball is considerably less than for field sports or individual sports such as running and cycling. Team sport performance is dependent upon a diverse range of qualities including size, fitness, sport-specific skills, team tactics, and psychological attributes. As basketball has evolved, coaches and players have placed a high priority on body size and physical fitness. A player's size has a large influence on the position in the team, while the high-intensity, intermittent nature of the physical demands requires players to have a high level of fitness. Basketball coaches and sport scientists often use a battery of sport-specific physical tests to evaluate body size and composition, and aerobic fitness and power. This testing may be used to track changes within athletes over time to evaluate the effectiveness of training programmes or screen players for selection. Sports science research is establishing typical (or 'reference') values for both within-athlete changes and between-athlete differences. Newer statistical approaches such as magnitude-based inferences have emerged that are providing more meaningful interpretation of fitness testing results in the field for coaches and athletes. Careful selection and implementation of tests, and more pertinent interpretation of data, will enhance the value of fitness testing in high-level basketball programmes. This article presents reference values of fitness and body size in basketball players, and identifies practical methods of interpreting changes within players and differences between players beyond null-hypothesis testing.
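One common magnitude-based convention for interpreting a within-athlete change is to compare it against the smallest worthwhile change (SWC), often taken as 0.2 of the between-athlete standard deviation. The sketch below illustrates that convention only; the 0.2 factor, the "clear" criterion, and all numbers are illustrative assumptions, not the article's reference values.

```python
import statistics

def smallest_worthwhile_change(squad_scores, factor=0.2):
    """SWC: a fraction (commonly 0.2) of the between-athlete SD."""
    return factor * statistics.stdev(squad_scores)

def interpret_change(change, swc, typical_error):
    """Label an observed change against the SWC, noting measurement noise.

    A change is 'clear' here only if it exceeds SWC plus the test's typical
    error -- a deliberately crude stand-in for full magnitude-based inference.
    """
    if abs(change) < swc:
        label = "trivial"
    else:
        label = "substantial increase" if change > 0 else "substantial decrease"
    clear = abs(change) > swc + typical_error
    return label, clear
```

For example, against a squad of vertical-jump scores with a between-athlete SD of about 3.8 cm, the SWC is about 0.76 cm, so a 2 cm gain reads as a substantial (and, with a 1 cm typical error, clear) improvement, while a 0.5 cm gain reads as trivial.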
Protein flexibility: coordinate uncertainties and interpretation of structural differences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rashin, Alexander A., E-mail: alexander-rashin@hotmail.com; LH Baker Center for Bioinformatics and Department of Biochemistry, Biophysics and Molecular Biology, 112 Office and Lab Building, Iowa State University, Ames, IA 50011-3020; Rashin, Abraham H. L.
2009-11-01
Criteria for the interpretability of coordinate differences and a new method for identifying rigid-body motions and nonrigid deformations in protein conformational changes are developed and applied to functionally induced and crystallization-induced conformational changes. Valid interpretations of conformational movements in protein structures determined by X-ray crystallography require that the movement magnitudes exceed their uncertainty threshold. Here, it is shown that such thresholds can be obtained from the distance difference matrices (DDMs) of 1014 pairs of independently determined structures of bovine ribonuclease A and sperm whale myoglobin, with no explanations provided for reportedly minor coordinate differences. The smallest magnitudes of reportedly functional motions are just above these thresholds. Uncertainty thresholds can provide objective criteria that distinguish between true conformational changes and apparent ‘noise’, showing that some previous interpretations of protein coordinate changes attributed to external conditions or mutations may be doubtful or erroneous. The use of uncertainty thresholds, DDMs, the newly introduced CDDMs (contact distance difference matrices) and a novel simple rotation algorithm allows a more meaningful classification and description of protein motions, distinguishing between various rigid-fragment motions and nonrigid conformational deformations. It is also shown that half of 75 pairs of identical molecules, each from the same asymmetric crystallographic cell, exhibit coordinate differences that range from just outside the coordinate uncertainty threshold to the full magnitude of large functional movements. Thus, crystallization might often induce protein conformational changes that are comparable to those related to or induced by the protein function.
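A distance difference matrix (DDM) compares two conformations without any superposition step: element (i, j) is the change in the i-j interatomic distance. The sketch below shows that core construction plus a simple thresholding pass; the threshold here is a free parameter, whereas the paper derives it empirically from pairs of independently determined structures, and the CDDMs and rotation algorithm are not reproduced.

```python
import math

def distance_matrix(coords):
    """All pairwise Euclidean distances for a list of (x, y, z) points."""
    n = len(coords)
    return [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]

def distance_difference_matrix(coords_a, coords_b):
    """DDM: element (i, j) is d_ij(B) - d_ij(A); superposition-free."""
    da, db = distance_matrix(coords_a), distance_matrix(coords_b)
    n = len(coords_a)
    return [[db[i][j] - da[i][j] for j in range(n)] for i in range(n)]

def significant_motions(ddm, threshold):
    """Atom pairs whose distance change exceeds the uncertainty threshold."""
    n = len(ddm)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(ddm[i][j]) > threshold]
```

A rigid-body motion of a fragment leaves all intra-fragment entries near zero, which is why DDMs separate rigid motions from nonrigid deformations.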
Statistical Hypothesis Testing in Intraspecific Phylogeography: NCPA versus ABC
Templeton, Alan R.
2009-01-01
Nested clade phylogeographic analysis (NCPA) and approximate Bayesian computation (ABC) have been used to test phylogeographic hypotheses. Multilocus NCPA tests null hypotheses, whereas ABC discriminates among a finite set of alternatives. The interpretive criteria of NCPA are explicit and allow complex models to be built from simple components. The interpretive criteria of ABC are ad hoc and require the specification of a complete phylogeographic model. The conclusions from ABC are often influenced by implicit assumptions arising from the many parameters needed to specify a complex model. These complex models confound many assumptions so that biological interpretations are difficult. Sampling error is accounted for in NCPA, but ABC ignores important sources of sampling error that create pseudo-statistical power. NCPA generates the full sampling distribution of its statistics, but ABC only yields local probabilities, which in turn make it impossible to distinguish between a good fitting model, a non-informative model, and an over-determined model. Both NCPA and ABC use approximations, but convergences of the approximations used in NCPA are well defined whereas those in ABC are not. NCPA can analyze a large number of locations, but ABC cannot. Finally, the dimensionality of the tested hypothesis is known in NCPA, but not for ABC. As a consequence, the “probabilities” generated by ABC are not true probabilities and are statistically non-interpretable. Accordingly, ABC should not be used for hypothesis testing, but simulation approaches are valuable when used in conjunction with NCPA or other methods that do not rely on highly parameterized models. PMID:19192182
Referential communication in children with ADHD: challenges in the role of a listener.
Nilsen, Elizabeth S; Mangal, Leilani; Macdonald, Kristi
2013-04-01
Successful communication requires that listeners accurately interpret the meaning of speakers' statements. The present work examined whether children with and without attention-deficit/hyperactivity disorder (ADHD) differ in their ability to interpret referential statements (i.e., phrases that denote objects or events) from speakers. Children (6 to 9 years old), diagnosed with ADHD (n = 27) and typically developing (n = 26), took part in an interactive task in which they were asked by an adult speaker to retrieve objects from a display case. Children interpreted the referential statements in contexts that either did or did not require perspective-taking. Children's eye movements and object choices were recorded. Parents completed questionnaires assessing their child's frequency of ADHD symptoms and pragmatic communicative abilities. Behavioral and eye movement measures revealed that children with ADHD made more interpretive errors and were less likely to consider target referents across the 2 communicative conditions. Furthermore, ADHD symptoms related to children's performance on the communicative task and to parental report of the child's pragmatic skills. Children with ADHD are less accurate in their interpretations of referential statements. Such difficulties would lead to greater occurrences of miscommunication.
Keshvari, Mahrokh; Mohammadi, Eesa; Boroujeni, Ali Zargham; Farajzadegan, Ziba
2012-01-01
Objectives: Health care providers in the rural centers offer the primary health services in the form of proficiencies and professions to the most required target population in the health system. These services are provided under particular conditions and to particular populations, with a variety of limitations. This study aimed to describe and interpret the experiences of the employees from their own working condition in the rural health centers. Methods: The present study conducted in a qualitative research approach and content analysis method through individual and group interviews with 26 employed primary health care providers (including 7 family physicians, 7 midwives, and 12 health workers) in the rural health centers in Isfahan in 2009. Sampling was done using purposive sampling method. The data were analyzed using qualitative content analysis on a constant comparative basis. Results: During the content analysis process, six themes were obtained; “instability and frequent changes”, “involved in laws and regulations”, “pressure and stress due to unbalanced workload and manpower”, “helplessness in performing the tasks and duties”, “sense of identity threat and low self-concept”, and “deprivation of professional development”. The mentioned themes indicate a main and more important theme called “burnout”. Conclusions: Health services providers in the rural health centers are working in stressful and challenging conditions and feel deprived of that for which they are responsible to the community. PMID:22826774
NASA Astrophysics Data System (ADS)
De Boissieu, Florian; Sevin, Brice; Cudahy, Thomas; Mangeas, Morgan; Chevrel, Stéphane; Ong, Cindy; Rodger, Andrew; Maurizot, Pierre; Laukamp, Carsten; Lau, Ian; Touraivane, Touraivane; Cluzel, Dominique; Despinoy, Marc
2018-02-01
Accurate maps of Earth's geology, especially its regolith, are required for managing the sustainable exploration and development of mineral resources. This paper shows how airborne imaging hyperspectral data collected over weathered peridotite rocks in vegetated, mountainous terrane in New Caledonia were processed using a combination of methods to generate a regolith-geology map that could be used for more efficiently targeting Ni exploration. The image processing combined two established methods: spectral feature extraction and support vector machine (SVM) classification. The rationale is that spectral feature extraction can rapidly reduce data complexity by both targeting only the diagnostic mineral absorptions and masking those pixels complicated by vegetation, cloud and deep shade, while SVM is a supervised classification method able to generate an optimal non-linear classifier from these features that generalises well even with limited training data. Key minerals targeted are serpentine, which is considered as an indicator for hydrolysed peridotitic rock, and iron oxy-hydroxides (hematite and goethite), which are considered as diagnostic of laterite development. The final classified regolith map was assessed against interpreted regolith field sites, which yielded approximately 70% similarity for all unit types, as well as against a regolith-geology map interpreted using traditional datasets (not hyperspectral imagery). Importantly, the hyperspectral derived mineral map provided much greater detail enabling a more precise understanding of the regolith-geological architecture where there are exposed soils and rocks.
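The spectral feature extraction step typically reduces each pixel's spectrum to a handful of diagnostic absorption measures, such as a continuum-removed band depth at a mineral's characteristic wavelength. The sketch below shows that generic band-depth calculation; the wavelengths and reflectances are invented for illustration (they do not correspond to the paper's actual serpentine or iron-oxide features), and the masking and SVM stages are omitted.

```python
def band_depth(wavelengths, reflectance, left, center, right):
    """Continuum-removed absorption depth at `center`.

    The continuum is a straight line between the reflectances at the `left`
    and `right` shoulder wavelengths; depth is 1 - R_center / R_continuum.
    Values are read from the nearest sampled wavelength.
    """
    def value_at(w):
        i = min(range(len(wavelengths)), key=lambda k: abs(wavelengths[k] - w))
        return wavelengths[i], reflectance[i]

    (wl, rl), (wc, rc), (wr, rr) = value_at(left), value_at(center), value_at(right)
    continuum = rl + (rr - rl) * (wc - wl) / (wr - wl)
    return 1.0 - rc / continuum
```

A pixel with a pronounced absorption at the target wavelength yields a depth well above zero, while a featureless (or vegetated, hence masked) spectrum yields a depth near zero; a vector of such depths per pixel is what a classifier like an SVM would then consume.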
Interpretation of magnetic anomalies using a genetic algorithm
NASA Astrophysics Data System (ADS)
Kaftan, İlknur
2017-08-01
A genetic algorithm (GA) is an artificial intelligence method used for optimization. We applied a GA to the inversion of magnetic anomalies over a thick dike. Inversion of nonlinear geophysical problems using a GA has advantages because it does not require model gradients or well-defined initial model parameters. The evolution process consists of selection, crossover, and mutation genetic operators that look for the best fit to the observed data and a solution consisting of plausible compact sources. The efficiency of a GA is demonstrated on both synthetic and real magnetic anomalies of dikes by estimating model parameters such as the depth to the top of the dike (H), the half-width of the dike (B), the distance from the origin to the reference point (D), the dip of the thick dike (δ), and the susceptibility contrast (k). The synthetic case considers both noise-free and noisy magnetic data. In the real case, the vertical magnetic anomaly from the Pima copper mine in Arizona, USA, and the vertical magnetic anomaly in the Bayburt-Sarıhan skarn zone in northeastern Turkey have been inverted and interpreted. We compared the estimated parameters with the results of conventional inversion methods used in previous studies. We can conclude that the GA method used in this study is a useful tool for evaluating magnetic anomalies for dike models.
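The selection/crossover/mutation loop the abstract describes can be sketched as a small real-coded GA that minimizes a misfit function over bounded parameters. This is a generic sketch, not the paper's exact operators; the toy quadratic misfit below stands in for the dike forward model, whose output would normally be compared against the observed anomaly.

```python
import random

def genetic_inversion(misfit, bounds, pop_size=60, generations=120, seed=1):
    """Minimise misfit(params) over box `bounds` with a simple real-coded GA:
    elitist selection, arithmetic crossover, and Gaussian mutation."""
    rng = random.Random(seed)

    def rand_individual():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    pop = [rand_individual() for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=misfit)[: pop_size // 5]   # keep the best fifth
        children = list(elite)
        while len(children) < pop_size:
            p, q = rng.sample(elite, 2)                    # parents from the elite
            w = rng.random()
            child = [w * a + (1 - w) * b for a, b in zip(p, q)]  # crossover
            if rng.random() < 0.3:                         # occasional mutation
                i = rng.randrange(len(child))
                lo, hi = bounds[i]
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = children
    return min(pop, key=misfit)
```

Because only misfit values are ever evaluated, no gradients of the forward model are needed, which is the advantage the abstract highlights for nonlinear geophysical inversion.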
[The Scope, Quality and Safety Requirements of Drug Abuse Testing].
Küme, Tuncay; Karakükcü, Çiğdem; Pınar, Aslı; Coşkunol, Hakan
2017-01-01
The aim of this review is to inform about the scopes and requirements of drug abuse testing. Drug abuse testing is one of the tools for determination of drug use. It must fulfill quality and safety requirements when used for legal and administrative decisions. Drug abuse testing must fulfill some requirements like selection of the appropriate test matrix, appropriate screening test panel, sampling in detection window, patient consent, identification of the donor, appropriate collection site, sample collection with observation, identification and control of the sample, specimen custody chain in preanalytical phase; analysis in authorized laboratories, specimen validity tests, reliable testing methods, strict quality control, two-step analysis in analytical phase; storage of the split specimen, confirmation of the split specimen in the event of an objection, result custody chain, appropriate cut-off concentration, the appropriate interpretation of the result in postanalytical phase. The workflow and analytical processes of drug abuse testing are explained in last regulation of the Department of Medical Laboratory Services, Ministry of Health in Turkey. The clinical physicians have to know and apply the quality and safety requirements in drug abuse testing according to last regulations in Turkey.
Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program
NASA Technical Reports Server (NTRS)
Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby
2017-01-01
Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.
DOT National Transportation Integrated Search
1999-08-01
Several different interpretations of the American Association of State Highway and Transportation Officials' (AASHTO's) Moisture Sensitivity Test exist. The official AASHTO interpretation of this test method does not account for water which has been ...
Bond, R R; Kligfield, P D; Zhu, T; Finlay, D D; Drew, B; Guldenring, D; Breen, C; Clifford, G D; Wagner, G S
2015-01-01
The 12-lead electrocardiogram (ECG) is a complex set of cardiac signals that require a high degree of skill and clinical knowledge to interpret. Therefore, it is imperative to record and understand how expert readers interpret the 12-lead ECG. This short paper showcases how eye tracking technology and audio data can be fused together and visualised to gain insight into the interpretation techniques employed by an eminent ECG champion, namely Dr Rory Childers. Copyright © 2015 Elsevier Inc. All rights reserved.
Harun, Rashed; Grassi, Christine M; Munoz, Miranda J; Torres, Gonzalo E; Wagner, Amy K
2015-03-02
Fast-scan cyclic voltammetry (FSCV) is an electrochemical method that can assess real-time in vivo dopamine (DA) concentration changes to study the kinetics of DA neurotransmission. Electrical stimulation of dopaminergic (DAergic) pathways can elicit FSCV DA responses that largely reflect a balance of DA release and reuptake. Interpretation of these evoked DA responses requires a framework to discern the contribution of DA release and reuptake. The current, widely implemented interpretive framework for doing so is the Michaelis-Menten (M-M) model, which is grounded on two assumptions: (1) DA release rate is constant during stimulation, and (2) DA reuptake occurs through dopamine transporters (DAT) in a manner consistent with M-M enzyme kinetics. Though the M-M model can simulate evoked DA responses that rise convexly, response types that predominate in the ventral striatum, the M-M model cannot simulate dorsal striatal responses that rise concavely. Based on current neurotransmission principles and experimental FSCV data, we developed a novel, quantitative, neurobiological framework to interpret DA responses that assumes DA release decreases exponentially during stimulation and continues post-stimulation at a diminishing rate. Our model also incorporates dynamic M-M kinetics to describe DA reuptake as a process of decreasing reuptake efficiency. We demonstrate that this quantitative, neurobiological model is an extension of the traditional M-M model that can simulate heterogeneous regional DA responses following manipulation of stimulation duration, frequency, and DA pharmacology. The proposed model can advance our interpretive framework for future in vivo FSCV studies examining regional DA kinetics and their alteration by disease and DA pharmacology. Copyright © 2015 Elsevier B.V. All rights reserved.
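The traditional M-M framework the abstract critiques reduces to one differential equation: dC/dt = R(t) − Vmax·C/(Km + C), with constant release R during the stimulus and none afterwards. A minimal Euler-integration sketch of that traditional model (not the authors' extended framework) is below; all parameter values are illustrative, not measured kinetics.

```python
def simulate_evoked_da(release_rate, vmax, km, stim_dur, total_dur, dt=0.001):
    """Euler-integrate dC/dt = R(t) - Vmax*C/(Km + C).

    Release R is constant while t < stim_dur and zero afterwards, per the
    traditional M-M assumptions; returns the concentration trace over time.
    """
    steps = int(total_dur / dt)
    c, trace = 0.0, []
    for i in range(steps):
        t = i * dt
        release = release_rate if t < stim_dur else 0.0
        reuptake = vmax * c / (km + c)
        c = max(0.0, c + dt * (release - reuptake))
        trace.append(c)
    return trace
```

With constant release, the trace rises convexly toward the balance point where release equals reuptake and peaks when the stimulus ends, which is exactly why this form cannot reproduce the concavely rising dorsal striatal responses that motivated the authors' release-decay extension.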
Dingwall, Kylie M; Pinkerton, Jennifer; Lindeman, Melissa A
2013-01-31
Achieving culturally fair assessments of cognitive functioning for Aboriginal people is difficult due to a scarcity of appropriately validated tools for use with this group. As a result, some Aboriginal people with cognitive impairments may lack fair and equitable access to services. The objective of this study was to examine current clinical practice in the Northern Territory regarding cognitive assessment for Aboriginal people thereby providing some guidance for clinicians new to this practice setting. Qualitative enquiry was used to describe practice context, reasons for assessment, and current practices in assessing cognition for Aboriginal Australians. Semi-structured interviews were conducted with 22 clinicians working with Aboriginal clients in central and northern Australia. Results pertaining to assessment methods are reported. A range of standardised tests were utilised with little consistency across clinical practice. Nevertheless, it was recognised that such tests bear severe limitations, requiring some modification and significant caution in their interpretation. Clinicians relied heavily on informal assessment or observations, contextual information and clinical judgement. Cognitive tests developed specifically for Aboriginal people are urgently needed. In the absence of appropriate, validated tests, clinicians have relied on and modified a range of standardised and informal assessments, whilst recognising the severe limitations of these. Past clinical training has not prepared clinicians adequately for assessing Aboriginal clients, and experience and clinical judgment were considered crucial for fair interpretation of test scores. Interpretation guidelines may assist inexperienced clinicians to consider whether they are achieving fair assessments of cognition for Aboriginal clients.
Using Analytical Techniques to Interpret Financial Statements.
ERIC Educational Resources Information Center
Walters, Donald L.
1986-01-01
Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)
Interpreting Assessment Scores of Nonliterate Learners with Ethnographic Data.
ERIC Educational Resources Information Center
Griffin, Suzanne M.
Research findings are reported that suggest that valid interpretation of assessment scores on illiterate and preliterate learners requires the use of ethnographic data. Data from observation notes, photos, and audiotapes indicated that learners' understanding of their tasks affected their performance in assessment situations. Previous findings…
Interpretation in consultations with immigrant patients with cancer: how accurate is it?
Butow, Phyllis N; Goldstein, David; Bell, Melaine L; Sze, Ming; Aldridge, Lynley J; Abdo, Sarah; Tanious, Michelle; Dong, Skye; Iedema, Rick; Vardy, Janette; Ashgari, Ray; Hui, Rina; Eisenbruch, Maurice
2011-07-10
Immigrants with cancer often have professional and/or family interpreters to overcome challenges communicating with their health team. This study explored the rate and consequences of nonequivalent interpretation in medical oncology consultations. Consecutive immigrant patients newly diagnosed with incurable cancer, who spoke Arabic, Cantonese, Mandarin, or Greek, were recruited from the practices of 10 medical oncologists in nine hospitals. Their first two consultations were audiotaped, transcribed, translated into English and coded. Thirty-two of 78 participants had an interpreter at 49 consultations; 43% of interpreters were family, 35% professional, 18% both a professional and family, and 4% a health professional. Sixty-five percent of professional interpretations were equivalent to the original speech versus 50% for family interpreters (P = .02). Seventy percent of nonequivalent interpretations were inconsequential or positive; however, 10% could result in misunderstanding, in 5% the tone was more authoritarian than originally intended, and in 3% more certainty was conveyed. There were no significant differences in interpreter type for equivalency of interpretations. Nonequivalent interpretation is common, and not always innocuous. Our study suggests that there may remain a role for family or telephone versus face-to-face professional interpreters. Careful communication between oncologists and interpreters is required to ensure optimal communication with the patient.
Lessons learned from translators and interpreters from the Dinka tribe of southern Sudan.
Baird, Martha B
2011-04-01
This article discusses the methodological challenges associated with working with translators and interpreters from the Dinka tribe of southern Sudan during an ethnographic study with refugee Dinka women who were resettled with their children in the United States. Navigating the cultural differences between the researcher, the translator, and the interpreters provided a deeper understanding about the culture of the study population. The lessons learned included the importance of cultural congruence between the interpreters and participants; the education, training, and experience of the interpreters; and the difficulties encountered in preparing interpreters according to university institutional review board requirements. Cultural differences such as time perception and communication and literacy styles were negotiated throughout each phase of the study. The most valuable lesson learned from this experience was the importance of the relationship between the researcher, the translator, and the interpreters as well as between the interpreters and participants to achieve credibility and trustworthiness of the study results.
Regional and supraregional biochemistry services in Scotland: a survey of hospital laboratory users.
Murphy, M J; Dryburgh, F J; Shepherd, J
1994-01-01
AIM--To ascertain the views of Scottish hospital laboratory users on aspects of regional and supraregional biochemical services offered by the Institute of Biochemistry at Glasgow Royal Infirmary. METHODS--A questionnaire was circulated asking questions or inviting opinions under various headings, including current patterns of usage of the services provided, availability of information on specimen collection requirements and reference ranges, current arrangements for transport of specimens, turnaround times for delivery of reports, layout and content of request and report forms, quantity and quality of interpretive advice, potential changes in laboratory services, and overall impression of the services provided. Opportunities were provided for free text comment. The questionnaire was circulated in 1992 to heads of department in 23 Scottish hospital biochemistry laboratories. RESULTS--Twenty one replies were received. Services used widely included trace metals/vitamins (n = 20) and specialised endocrine tests (n = 19). Other services also used included specialised lipid tests (n = 13), toxicology (n = 12), thyroid function tests (n = 9), and tumour markers (n = 8). Fifteen laboratories used one or more of the services at least weekly. Most (n = 20) welcomed the idea of a handbook providing information on specimen collection and reference ranges. Nine identified loss of specimens as a problem. Other perceived problems included the absence of reference ranges from report forms, quantity and quality of interpretive advice, and turnaround times of some tests. Overall impressions of the service(s) offered were very good (n = 12); adequate (n = 7); poor (n = 1). CONCLUSIONS--Useful information was obtained about patterns of use and transport arrangements. Areas identified as requiring follow up included provision of information, alternative ways of communicating reports, and improvement in quantity and quality of interpretive advice. PMID:8027390
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. 
These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.
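To make the group-comparison methods listed in this abstract concrete, the following sketch (not taken from the article; the sample data are made up) shows the two tests the reviewed papers used most often, a two-sample t test and a χ2 test, computed with SciPy.

```python
# Illustrative sketch with made-up samples: the two most common
# group-comparison tests reported in the reviewed journal-club papers.
from scipy import stats

# Two-sample t test: compare the means of two independent groups.
group_a = [5.1, 4.9, 5.4, 5.0, 5.2]
group_b = [5.8, 6.0, 5.7, 6.1, 5.9]
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Chi-square test: compare category counts in a 2x2 contingency table.
table = [[30, 10], [20, 20]]
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(f"t = {t_stat:.2f}, p = {t_p:.4f}")
print(f"chi2 = {chi2:.2f}, p = {chi_p:.4f}, dof = {dof}")
```

Interpreting these outputs (a p-value against a chosen significance level, degrees of freedom, expected counts) is exactly the kind of skill the authors argue residents need.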
75 FR 63067 - Interpretation of “Children's Product”
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... a level of sophistication required to operate the locomotives. Additionally, the commenters note... railroad hobbyists, the costs involved, and the level of sophistication required to operate them. Model...
Teratology testing under REACH.
Barton, Steve
2013-01-01
REACH guidelines may require teratology testing for new and existing chemicals. This chapter discusses procedures to assess the need for teratology testing and the conduct and interpretation of teratology tests where required.
Madden, A M; Smith, S
2016-02-01
Evaluation of body composition is an important part of assessing nutritional status and provides prognostically useful data and an opportunity to monitor the effects of nutrition-related disease progression and nutritional intervention. The aim of this narrative review is to critically evaluate body composition methodology in adults, focusing on anthropometric variables. The variables considered include height, weight, body mass index and alternative indices, trunk measurements (waist and hip circumferences and sagittal abdominal diameter) and limb measurements (mid-upper arm and calf circumferences) and skinfold thickness. The importance of adhering to a defined measurement protocol, checking measurement error and the need to interpret measurements using appropriate population-specific cut-off values to identify health risks were highlighted. Selecting the optimum method for assessing body composition using anthropometry depends on the purpose (i.e. evaluating obesity or undernutrition) and requires practitioners to have a good understanding of both practical and theoretical limitations and to be able to interpret the results wisely. © 2014 The British Dietetic Association Ltd.
Retrieving Coherent Receiver Function Images with Dense Arrays
NASA Astrophysics Data System (ADS)
Zhong, M.; Zhan, Z.
2016-12-01
Receiver functions highlight converted phases (e.g., Ps, PpPs, sP) and have been widely used to study seismic interfaces. With a dense array, receiver functions (RFs) at multiple stations form a RF image that can provide more robust/detailed constraints. However, due to noise in data, non-uniqueness of deconvolution, and local structures that cannot be detected across neighboring stations, traditional RF images are often noisy and hard to interpret. Previous attempts to enhance coherence by stacking RFs from multiple events or post-filtering the RF images have not produced satisfactory improvements. Here, we propose a new method to retrieve coherent RF images with dense arrays. We take advantage of the waveform coherency at neighboring stations and invert for a small number of coherent arrivals for their RFs. The new RF images contain only the coherent arrivals required to fit data well. Synthetic tests and preliminary applications on real data demonstrate that the new RF images are easier to interpret and improve our ability to infer Earth structures using receiver functions.
Platelet Function Analyzed by Light Transmission Aggregometry.
Hvas, Anne-Mette; Favaloro, Emmanuel J
2017-01-01
Analysis of platelet function is widely used for diagnostic work-up in patients with increased bleeding tendency. During the last decades, platelet function testing has also been introduced for evaluation of antiplatelet therapy, but this is still recommended for research purposes only. Platelet function can also be assessed for hyper-aggregability, but this is less often evaluated. Light transmission aggregometry (LTA) was introduced in the early 1960s and has since been considered the gold standard. This optical detection system is based on changes in turbidity measured as a change in light transmission, which is proportional to the extent of platelet aggregation induced by addition of an agonist. LTA is a flexible method, as different agonists can be used in varying concentrations, but performance of the test requires large blood volumes and experienced laboratory technicians as well as specialized personal to interpret results. In the present chapter, a protocol for LTA is described including all steps from pre-analytical preparation to interpretation of results.
Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford
2010-01-01
The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or by estimating the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337
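The core difficulty the abstract describes, assigning posterior presence probabilities when a "degenerate" peptide maps to several proteins, can be sketched with a deliberately tiny toy model (this is not the authors' algorithm; the priors and emission probabilities are assumed): enumerate all presence/absence states of the candidate proteins and apply Bayes' rule.

```python
# Hypothetical toy model, not the paper's method: posterior presence
# probabilities for two proteins A and B, where peptide "x" is degenerate
# (maps to both) and peptide "y" maps only to A. All parameters assumed.
from itertools import product

prior = 0.5    # assumed prior that any given protein is present
p_emit = 0.9   # assumed P(peptide observed | a parent protein is present)
p_noise = 0.05 # assumed P(peptide observed | no parent protein present)

parents = {"x": {"A", "B"}, "y": {"A"}}  # peptide -> candidate parents
observed = {"x": True, "y": True}        # both peptides were detected

weights = []
for a, b in product([0, 1], repeat=2):   # enumerate presence of (A, B)
    present = {p for p, on in zip("AB", (a, b)) if on}
    w = (prior if a else 1 - prior) * (prior if b else 1 - prior)
    for pep, obs in observed.items():
        p_obs = p_emit if parents[pep] & present else p_noise
        w *= p_obs if obs else 1 - p_obs
    weights.append(((a, b), w))

total = sum(w for _, w in weights)
posterior = {
    "A": sum(w for (a, _), w in weights if a) / total,
    "B": sum(w for (_, b), w in weights if b) / total,
}
print(posterior)
```

In this toy run, A (supported by a unique peptide) receives a high posterior, while B, supported only by the shared peptide, stays near its prior, which is the qualitative behavior a degeneracy-aware model should show. The brute-force enumeration is exponential in the number of proteins; the paper's graph-transforming algorithms exist precisely to avoid that cost.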
Interpreting Electromagnetic Reflections In Glaciology
NASA Astrophysics Data System (ADS)
Eisen, O.; Nixdorf, U.; Wilhelms, F.; Steinhage, D.; Miller, H.
Electromagnetic reflection (EMR) measurements are active remote sensing methods that have become a major tool for glaciological investigations. Although the basic processes are well understood, the unambiguous interpretation of EMR data, especially internal layering, still requires further information. The Antarctic ice sheet provides a unique setting for investigating the relation between physical-chemical properties of ice and EMR data. Cold ice, smooth surface topography, and low accumulation make it feasible to use low-energy ground penetrating radar (GPR) devices to penetrate several tens to hundreds of meters of ice, covering several thousands of years of snow deposition history. Thus, sufficient internal layers, primarily of volcanic origin, are recorded to enable studies on a local and regional scale. Based on dated ice core records, GPR measurements at various frequencies, and airborne radio-echo sounding (RES) from Dronning Maud Land (DML), Antarctica, combined with numerical modeling techniques, we investigate the influence of internal layering characteristics and properties of the propagating electromagnetic wave on EMR data.
2011-12-05
general body language, and gestures. PRT commanders were instructed to make eye contact with the “local leader” when speaking, not the “team...typically have little to no English speaking skill, the task of communicating is accomplished through interpreters. The PRT interpreter becomes...interpreter to listen and speak at the same time that Speaker A is speaking, providing a shorter discourse pattern, but requiring a far greater degree of
Pylkkänen, Paavo
2015-12-01
The theme of phenomenology and quantum physics is here tackled by examining some basic interpretational issues in quantum physics. One key issue in quantum theory from the very beginning has been whether it is possible to provide a quantum ontology of particles in motion in the same way as in classical physics, or whether we are restricted to stay within a more limited view of quantum systems, in terms of complementary but mutually exclusive phenomena. In phenomenological terms we could describe the situation by saying that according to the usual interpretation of quantum theory (especially Niels Bohr's), quantum phenomena require a kind of epoché (i.e. a suspension of assumptions about reality at the quantum level). However, there are other interpretations (especially David Bohm's) that seem to re-establish the possibility of a mind-independent ontology at the quantum level. We will show that even such ontological interpretations contain novel, non-classical features, which require them to give a special role to "phenomena" or "appearances", a role not encountered in classical physics. We will conclude that while ontological interpretations of quantum theory are possible, quantum theory implies the need of a certain kind of epoché even for this type of interpretations. While different from the epoché connected to phenomenological description, the "quantum epoché" nevertheless points to a potentially interesting parallel between phenomenology and quantum philosophy. Copyright © 2015. Published by Elsevier Ltd.
Milreu, Paulo Vieira; Klein, Cecilia Coimbra; Cottret, Ludovic; Acuña, Vicente; Birmelé, Etienne; Borassi, Michele; Junot, Christophe; Marchetti-Spaccamela, Alberto; Marino, Andrea; Stougie, Leen; Jourdan, Fabien; Crescenzi, Pierluigi; Lacroix, Vincent; Sagot, Marie-France
2014-01-01
Motivation: The increasing availability of metabolomics data enables a better understanding of the metabolic processes involved in the immediate response of an organism to environmental changes and stress. The data usually come in the form of a list of metabolites whose concentrations significantly changed under some conditions, and are thus not easy to interpret without being able to precisely visualize how such metabolites are interconnected. Results: We present a method that organizes the data from any metabolomics experiment into metabolic stories. Each story corresponds to a possible scenario explaining the flow of matter between the metabolites of interest. These scenarios may then be ranked in different ways depending on which interpretation one wishes to emphasize for the causal link between two affected metabolites: enzyme activation, enzyme inhibition or domino effect on the concentration changes of substrates and products. Equally probable stories under any selected ranking scheme can be further grouped into a single anthology that summarizes, in a unique subnetwork, all equivalently plausible alternative stories. An anthology is simply a union of such stories. We detail an application of the method to the response of yeast to cadmium exposure. We use this system as a proof of concept for our method, and we show that we are able to find a story that reproduces very well the current knowledge about the yeast response to cadmium. We further show that this response is mostly based on enzyme activation. We also provide a framework for exploring the alternative pathways or side effects this local response is expected to have in the rest of the network. We discuss several interpretations for the changes we see, and we suggest hypotheses that could in principle be experimentally tested. Noticeably, our method requires simple input data and could be used in a wide variety of applications.
Availability and implementation: The code for the method presented in this article is available at http://gobbolino.gforge.inria.fr. Contact: pvmilreu@gmail.com; vincent.lacroix@univ-lyon1.fr; marie-france.sagot@inria.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24167155
46 CFR 15.101 - Purpose of regulations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY MERCHANT MARINE OFFICERS AND SEAMEN MANNING REQUIREMENTS... uniform minimum requirements for the manning of vessels. In general, they implement, interpret, or apply the specific statutory manning requirements in title 46, U.S.C., implement various international...
47 CFR 76.75 - Specific EEO program requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Equal Employment Opportunity Requirements § 76.75 Specific EEO... necessary. Nothing in this section shall be interpreted to require a multichannel video programming...) In addition to using such recruitment sources, a multichannel video programming distributor...
47 CFR 76.75 - Specific EEO program requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Equal Employment Opportunity Requirements § 76.75 Specific EEO... necessary. Nothing in this section shall be interpreted to require a multichannel video programming...) In addition to using such recruitment sources, a multichannel video programming distributor...
47 CFR 76.75 - Specific EEO program requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Equal Employment Opportunity Requirements § 76.75 Specific EEO... necessary. Nothing in this section shall be interpreted to require a multichannel video programming...) In addition to using such recruitment sources, a multichannel video programming distributor...
47 CFR 76.75 - Specific EEO program requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Equal Employment Opportunity Requirements § 76.75 Specific EEO... necessary. Nothing in this section shall be interpreted to require a multichannel video programming...) In addition to using such recruitment sources, a multichannel video programming distributor...
A project-based geoscience curriculum: select examples
NASA Astrophysics Data System (ADS)
Brown, L. M.; Kelso, P. R.; White, R. J.; Rexroad, C. B.
2007-12-01
Principles of constructivist educational philosophy serve as a foundation for the recently completed National Science Foundation sponsored undergraduate curricular revision undertaken by the Geology Department of Lake Superior State University. We integrate lecture and laboratory sessions utilizing active learning strategies that focus on real-world geoscience experiences and problems. In this presentation, we discuss details of three research-like projects that require students to access original data, process and model the data using appropriate geological software, interpret and defend results, and disseminate results in reports, posters, and class presentations. The projects are from three upper division courses, Carbonate Systems, Sequence Stratigraphy, and Geophysical Systems, where teams of two to four students are presented with defined problems of durations ranging from a few weeks to an entire semester. Project goals and location, some background information, and specified dates and expectations for interim and final written and oral reports are provided to students. Some projects require the entire class to work on one data set, some require each team to be initially responsible for a portion of the project with teams ultimately merging data for interpretation and to arrive at final conclusions. Some projects require students to utilize data from appropriate geological web sites such as state geological surveys. Others require students to design surveys and utilize appropriate instruments of their choice for field data collection. Students learn usage and applications of appropriate geological software in compiling, processing, modeling, and interpreting data and preparing formal reports and presentations. Students uniformly report heightened interest and motivation when engaged in these projects. 
Our new curriculum has resulted in an increase in students' quantitative and interpretive skills along with dramatic improvement in communication and interpersonal skills related to group dynamics.
Decision support at home (DS@HOME) – system architectures and requirements
2012-01-01
Background Demographic change with its consequences of an aging society and an increase in the demand for care in the home environment has triggered intensive research activities in sensor devices and smart home technologies. While many advanced technologies are already available, there is still a lack of decision support systems (DSS) for the interpretation of data generated in home environments. The aim of the research for this paper is to present the state-of-the-art in DSS for these data, to define characteristic properties of such systems, and to define the requirements for successful home care DSS implementations. Methods A literature review was performed along with the analysis of cross-references. Characteristic properties are proposed and requirements are derived from the available body of literature. Results 79 papers were identified and analyzed, of which 20 describe implementations of decision components. Most authors mention server-based decision support components, but only few papers provide details about the system architecture or the knowledge base. A list of requirements derived from the analysis is presented. Among the primary drawbacks of current systems are the missing integration of DSS in current health information system architectures including interfaces, the missing agreement among developers with regard to the formalization and customization of medical knowledge and a lack of intelligent algorithms to interpret data from multiple sources including clinical application systems. Conclusions Future research needs to address these issues in order to provide useful information – and not only large amounts of data – for both the patient and the caregiver. Furthermore, there is a need for outcome studies allowing for identifying successful implementation concepts. PMID:22640470
29 CFR 548.200 - Requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 3 2014-07-01 2014-07-01 false Requirements. 548.200 Section 548.200 Labor Regulations... ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Requirements for A Basic Rate § 548.200 Requirements. The following conditions must be satisfied if a “basic” rate is to be considered proper under...
29 CFR 548.200 - Requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 3 2013-07-01 2013-07-01 false Requirements. 548.200 Section 548.200 Labor Regulations... ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Requirements for A Basic Rate § 548.200 Requirements. The following conditions must be satisfied if a “basic” rate is to be considered proper under...
29 CFR 548.200 - Requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 3 2012-07-01 2012-07-01 false Requirements. 548.200 Section 548.200 Labor Regulations... ESTABLISHED BASIC RATES FOR COMPUTING OVERTIME PAY Interpretations Requirements for A Basic Rate § 548.200 Requirements. The following conditions must be satisfied if a “basic” rate is to be considered proper under...
Modelling switching-time effects in high-frequency power conditioning networks
NASA Technical Reports Server (NTRS)
Owen, H. A.; Sloane, T. H.; Rimer, B. H.; Wilson, T. G.
1979-01-01
Power transistor networks which switch large currents in highly inductive environments are beginning to find application in the hundred kilohertz switching frequency range. Recent developments in the fabrication of metal-oxide-semiconductor field-effect transistors in the power device category have enhanced the movement toward higher switching frequencies. Models for switching devices and of the circuits in which they are imbedded are required to properly characterize the mechanisms responsible for turning on and turning off effects. Easily interpreted results in the form of oscilloscope-like plots assist in understanding the effects of parametric studies using topology oriented computer-aided analysis methods.
Molecular Diagnostics of Fusion and Laboratory Plasmas
NASA Astrophysics Data System (ADS)
Fantz, U.
2005-05-01
The presence of molecules in the cold scrape-off layer of fusion experiments and industrial plasmas requires an understanding of the molecular dynamics in these low temperature plasmas. Suitable diagnostic methods can provide insight into molecular processes in the plasma volume as well as for plasma surface interactions. A very simple but powerful technique is molecular emission spectroscopy. Spectra are obtained easily, whereas interpretation might be very complex and relies on the availability of atomic and molecular data. Examples are given for hydrogen plasmas and plasmas with hydrocarbons, both of which are of importance in industrial applications as well as in fusion experiments.
NASA Technical Reports Server (NTRS)
Harwood, P. (Principal Investigator); Finley, R.; Mcculloch, S.; Marphy, D.; Hupp, B.
1976-01-01
The author has identified the following significant results. Image interpretation mapping techniques were successfully applied to test site 5, an area with a semi-arid climate. The land cover/land use classification required further modification. A new program, HGROUP, added to the ADP classification schedule provides a convenient method for examining the spectral similarity between classes. This capability greatly simplifies the task of combining 25-30 unsupervised subclasses into about 15 major classes that approximately correspond to the land use/land cover classification scheme.
1993-03-25
manually...contributed, but I would especially like to acknowledge Bob Nagy of the Geophysics Department for his assistance in providing, and interpreting the...feet to under 250 feet at an altitude of 30,000 feet. The method does require the sounding data to be manually examined to pick out the layer. When a
Highlights in the study of exoplanet atmospheres
NASA Astrophysics Data System (ADS)
Burrows, Adam S.
2014-09-01
Exoplanets are now being discovered in profusion. To understand their character, however, we require spectral models and data. These elements of remote sensing can yield temperatures, compositions and even weather patterns, but only if significant improvements in both the parameter retrieval process and measurements are made. Despite heroic efforts to garner constraining data on exoplanet atmospheres and dynamics, reliable interpretation has frequently lagged behind ambition. I summarize the most productive, and at times novel, methods used to probe exoplanet atmospheres; highlight some of the most interesting results obtained; and suggest various broad theoretical topics in which further work could pay significant dividends.
Life Sciences Research Facility automation requirements and concepts for the Space Station
NASA Technical Reports Server (NTRS)
Rasmussen, Daryl N.
1986-01-01
An evaluation is made of the methods and preliminary results of a study on prospects for the automation of the NASA Space Station's Life Sciences Research Facility. In order to remain within current Space Station resource allocations, approximately 85 percent of planned life science experiment tasks must be automated; these tasks encompass specimen care and feeding, cage and instrument cleaning, data acquisition and control, sample analysis, waste management, instrument calibration, materials inventory and management, and janitorial work. Task automation will free crews for specimen manipulation, tissue sampling, data interpretation and communication with ground controllers, and experiment management.
Design, analysis, and interpretation of field quality-control data for water-sampling projects
Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.
2015-01-01
The report provides extensive information about statistical methods used to analyze quality-control data in order to estimate potential bias and variability in environmental data. These methods include construction of confidence intervals on various statistical measures, such as the mean, percentiles and percentages, and standard deviation. The methods are used to compare quality-control results with the larger set of environmental data in order to determine whether the effects of bias and variability might interfere with interpretation of these data. Examples from published reports are presented to illustrate how the methods are applied, how bias and variability are reported, and how the interpretation of environmental data can be qualified based on the quality-control analysis.
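One of the statistical measures named above, a confidence interval on the mean, can be sketched briefly (the blank-sample values and the 95% level are assumed for illustration and are not from the report):

```python
# Minimal sketch with made-up data: a 95% t-based confidence interval on
# the mean of field-blank QC results, one way to bound potential
# contamination bias before interpreting the environmental data.
import math
from statistics import mean, stdev

from scipy import stats

blanks = [0.02, 0.00, 0.05, 0.01, 0.03, 0.00, 0.04, 0.02]  # assumed values
n = len(blanks)
m = mean(blanks)
se = stdev(blanks) / math.sqrt(n)          # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)      # two-sided 95% critical value
ci = (m - t_crit * se, m + t_crit * se)
print(f"mean = {m:.4f}, 95% CI = ({ci[0]:.4f}, {ci[1]:.4f})")
```

If such an interval overlaps concentrations reported in the environmental samples, interpretation of those data would need to be qualified, which is the report's central point.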
A Meta-Analysis of Reliability Coefficients in Second Language Research
ERIC Educational Resources Information Center
Plonsky, Luke; Derrick, Deirdre J.
2016-01-01
Ensuring internal validity in quantitative research requires, among other conditions, reliable instrumentation. Unfortunately, however, second language (L2) researchers often fail to report and even more often fail to interpret reliability estimates beyond generic benchmarks for acceptability. As a means to guide interpretations of such estimates,…
ERIC Educational Resources Information Center
MacNeela, Pádraig; Gannon, Niall
2014-01-01
Volunteering among university students is an important expression of civic engagement, but the impact of this experience on the development of emerging adults requires further contextualization. Adopting interpretative phenomenological analysis as a qualitative research approach, we carried out semistructured interviews with 10 students of one…
29 CFR 780.606 - Interpretation of term “agriculture.”
Code of Federal Regulations, 2010 CFR
2010-07-01
... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements for Exemption § 780.606 Interpretation of term “agriculture.” Section 3(f) of the Act, which defines...
29 CFR 780.606 - Interpretation of term “agriculture.”
Code of Federal Regulations, 2011 CFR
2011-07-01
... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements for Exemption § 780.606 Interpretation of term “agriculture.” Section 3(f) of the Act, which defines...
29 CFR 780.606 - Interpretation of term “agriculture.”
Code of Federal Regulations, 2012 CFR
2012-07-01
... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements for Exemption § 780.606 Interpretation of term “agriculture.” Section 3(f) of the Act, which defines...
29 CFR 780.606 - Interpretation of term “agriculture.”
Code of Federal Regulations, 2014 CFR
2014-07-01
... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements for Exemption § 780.606 Interpretation of term “agriculture.” Section 3(f) of the Act, which defines...
29 CFR 780.606 - Interpretation of term “agriculture.”
Code of Federal Regulations, 2013 CFR
2013-07-01
... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements for Exemption § 780.606 Interpretation of term “agriculture.” Section 3(f) of the Act, which defines...