Loarie, Thomas M; Applegate, David; Kuenne, Christopher B; Choi, Lawrence J; Horowitz, Diane P
2003-01-01
Market segmentation analysis identifies discrete segments of the population whose beliefs are consistent with exhibited behaviors such as purchase choice. This study applies market segmentation analysis to low myopes (-1 to -3 D with less than 1 D cylinder) in their consideration and choice of a refractive surgery procedure to discover opportunities within the market. A quantitative survey based on focus group research was sent to a demographically balanced sample of myopes using contact lenses and/or glasses. A variable reduction process followed by a clustering analysis was used to discover discrete belief-based segments. The resulting segments were validated both analytically and through in-market testing. Discontented individuals who wear contact lenses are the primary target for vision correction surgery. However, 81% of the target group is apprehensive about laser in situ keratomileusis (LASIK). They are nervous about the procedure and strongly desire reversibility and exchangeability. There exists a large untapped opportunity for vision correction surgery within the low myope population. Market segmentation analysis helped determine how to best meet this opportunity through repositioning existing procedures or developing new vision correction technology, and could also be applied to identify opportunities in other vision correction populations.
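The two analytic steps described, variable reduction followed by clustering into belief-based segments, can be sketched in miniature. The snippet below is illustrative only: the respondent scores, survey dimensions, and segment count are hypothetical, and the study's actual reduction and clustering methods are not specified in this abstract.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: group belief-score vectors into k segments."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        # assign each respondent to the nearest segment center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # recompute each center as the mean belief profile of its segment
        centers = [[sum(dim) / len(cl) for dim in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# hypothetical survey scores: (apprehension about surgery, desire for reversibility)
respondents = [(0.9, 0.8), (0.85, 0.9), (0.1, 0.2), (0.15, 0.1)]
centers, segments = kmeans(respondents, k=2)
```

With well-separated belief profiles like these, the two recovered segments would correspond to an apprehensive, reversibility-seeking group and a contented one.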
Zhang, Yanbin; Lin, Guanfeng; Wang, Shengru; Zhang, Jianguo; Shen, Jianxiong; Wang, Yipeng; Guo, Jianwei; Yang, Xinyu; Zhao, Lijuan
2016-01-01
Study Design. Retrospective study. Objective. To study the behavior of the unfused thoracic curve in Lenke type 5C curves during follow-up and to identify risk factors for its correction loss. Summary of Background Data. Few studies have focused on the spontaneous behavior of the unfused thoracic curve after selective thoracolumbar or lumbar (TL/L) fusion during follow-up, or on the risk factors for spontaneous correction loss. Methods. We retrospectively reviewed 45 patients (41 females and 4 males) with adolescent idiopathic scoliosis (AIS) who underwent selective TL/L fusion from 2006 to 2012 in a single institution. Follow-up averaged 36 months (range, 24–105 months). Patients were divided into two groups. Thoracic curves in group A improved or maintained their magnitude after spontaneous correction, with negative or no correction loss during follow-up. Thoracic curves in group B deteriorated after spontaneous correction, with positive correction loss. Univariate and multivariate analyses were performed to identify risk factors for correction loss of the unfused thoracic curve. Results. The minor thoracic curve averaged 26° preoperatively. It was corrected to 13° immediately after surgery, a spontaneous correction of 48.5%. At final follow-up it measured 14°, a correction loss of 1°. Thoracic curves did not deteriorate after spontaneous correction in the 23 cases in group A, while progression of the thoracic curve was identified in the 22 cases in group B. In multivariate analysis, two risk factors were independently associated with thoracic correction loss: higher flexibility and a better immediate spontaneous correction rate of the thoracic curve. Conclusion. Posterior selective TL/L fusion with pedicle screw constructs is an effective treatment for Lenke 5C AIS. Nonstructural thoracic curves with higher flexibility or better immediate correction are more likely to progress during follow-up, and close attention must be paid to these patients to guard against decompensation.
Level of Evidence: 4 PMID:27831989
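The correction metrics quoted in the abstract above follow simple formulas. A brief sketch using the whole-degree group means reported (the study's 48.5% figure was presumably averaged per patient before rounding, hence the small discrepancy with the 50% obtained from the rounded means):

```python
def correction_rate(pre, post):
    """Spontaneous correction rate (%): share of the preoperative curve that resolved."""
    return 100.0 * (pre - post) / pre

def correction_loss(final, post):
    """Correction loss (degrees): positive values mean the curve deteriorated."""
    return final - post

# whole-degree mean Cobb angles quoted in the abstract
pre, post, final = 26.0, 13.0, 14.0
rate = correction_rate(pre, post)    # 50.0 from rounded means (study reports 48.5%)
loss = correction_loss(final, post)  # 1.0 degree, matching the abstract
```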
9 CFR 417.3 - Corrective actions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Corrective actions. 417.3 Section 417... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.3 Corrective actions. (a) The written HACCP plan shall identify the corrective action to be followed in response to a deviation from a critical limit...
Ramirez, Ivan I; Arellano, Daniel H; Adasme, Rodrigo S; Landeros, Jose M; Salinas, Francisco A; Vargas, Alvaro G; Vasquez, Francisco J; Lobos, Ignacio A; Oyarzun, Magdalena L; Restrepo, Ruben D
2017-02-01
Waveform analysis by visual inspection can be a reliable, noninvasive, and useful tool for detecting patient-ventilator asynchrony. However, it is a skill that requires a properly trained professional. This observational study was conducted in 17 urban ICUs. Health-care professionals (HCPs) working in these ICUs were asked to recognize different types of asynchrony shown in 3 evaluation videos. The HCPs were categorized according to years of experience, prior training in mechanical ventilation, profession, and number of asynchronies identified correctly. A total of 366 HCPs were evaluated. Statistically significant differences were found when HCPs with and without prior training in mechanical ventilation (trained vs non-trained HCPs) were compared according to the number of asynchronies detected correctly (of the HCPs who identified 3 asynchronies, 63 [81%] trained vs 15 [19%] non-trained, P < .001; 2 asynchronies, 72 [65%] trained vs 39 [35%] non-trained, P = .034; 1 asynchrony, 55 [47%] trained vs 61 [53%] non-trained, P = .02; 0 asynchronies, 17 [28%] trained vs 44 [72%] non-trained, P < .001). HCPs with prior training in mechanical ventilation also increased their odds of identifying ≥2 asynchronies correctly nearly 4-fold (odds ratio 3.67, 95% CI 1.93-6.96, P < .001). However, neither years of experience nor profession was associated with the ability of HCPs to identify asynchrony. HCPs who have specific training in mechanical ventilation increase their ability to identify asynchrony using waveform analysis. Neither experience nor profession proved to be a relevant factor in identifying asynchrony correctly using waveform analysis. Copyright © 2017 by Daedalus Enterprises.
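The near 4-fold odds ratio can be approximated directly from the counts reported above: 135 of 207 trained HCPs identified ≥2 asynchronies versus 54 of 159 non-trained HCPs. The unadjusted calculation, sketched below, lands close to the reported 3.67 (which was presumably adjusted):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: odds of success in the exposed group (a:b)
    over odds in the unexposed group (c:d)."""
    return (a / b) / (c / d)

# counts from the abstract: HCPs identifying >=2 vs <2 asynchronies
trained_ge2, trained_lt2 = 63 + 72, 55 + 17      # 135 vs 72
untrained_ge2, untrained_lt2 = 15 + 39, 61 + 44  # 54 vs 105

crude_or = odds_ratio(trained_ge2, trained_lt2, untrained_ge2, untrained_lt2)
# crude_or is about 3.65, near the adjusted 3.67 (95% CI 1.93-6.96) reported
```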
Client Accounts of Corrective Experiences in Psychotherapy: Implications for Clinical Practice.
Angus, Lynne; Constantino, Michael J
2017-02-01
The Patient Perceptions of Corrective Experiences in Individual Therapy (PPCEIT; Constantino, Angus, Friedlander, Messer, & Moertl, 2011) posttreatment interview guide was developed to provide clinical researchers with an effective mode of inquiry to identify and further explore clients' firsthand accounts of corrective and transformative therapy experiences and their determinants. Not only do findings from the analysis of client corrective experience (CE) accounts help identify what and how CEs happen in or as a result of psychotherapy, but the measure itself may also provide therapists with an effective tool to further enhance clients' awareness, understanding, and integration of transformative change experiences. Accordingly, we discuss in this afterword to the series the implications for clinical practice arising from (a) the thematic analysis of client CE accounts, drawn from a range of clinical samples and international research programs and (b) the clinical effect of completing the PPCEIT posttreatment interview inquiry. We also identify directions for future clinical training and research. © 2016 Wiley Periodicals, Inc.
Berruyer, M; Atkinson, S; Lebel, D; Bussières, J-F
2016-01-01
Insulin is a high-alert drug. The main objective of this descriptive cross-sectional study was to evaluate the risks associated with insulin use in healthcare centers. The secondary objective was to propose corrective measures to reduce the risks associated with the most critical failure modes identified in the analysis. We conducted a failure mode and effects analysis (FMEA) in obstetrics-gynecology, neonatology, and pediatrics. Five multidisciplinary meetings occurred in August 2013. A total of 44 out of 49 failure modes were analyzed. Nine of the 44 (20%) failure modes were deemed critical, with criticality scores ranging from 540 to 720. Following the multidisciplinary meetings, all participants agreed that FMEA was a useful tool for identifying failure modes and their relative importance. This approach identified many corrective measures. This shared experience increased awareness of safety issues with insulin in our mother-child center. This study identified the main failure modes and associated corrective measures. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Roth, K. C.; Walenkamp, M. M. J.; van Geenen, R. C. I.; Reijman, M.; Verhaar, J. A. N.; Colaris, J. W.
2017-01-01
The aim of this study was to identify predictors of a superior functional outcome after corrective osteotomy for paediatric malunited radius and both-bone forearm fractures. We performed a systematic review and meta-analysis of individual participant data, searching databases up to 1 October 2016. Our primary outcome was the gain in pronosupination seen after corrective osteotomy. Individual participant data of 11 cohort studies were included, concerning 71 participants with a median age of 11 years at trauma. Corrective osteotomy was performed after a median of 12 months after trauma, leading to a mean gain of 77° in pronosupination after a median follow-up of 29 months. Analysis of variance and multiple regression analysis revealed that predictors of superior functional outcome after corrective osteotomy are: an interval between trauma and corrective osteotomy of less than 1 year, an angular deformity of greater than 20° and the use of three-dimensional computer-assisted techniques. Level of evidence: II PMID:28891765
Human factors process failure modes and effects analysis (HF PFMEA) software tool
NASA Technical Reports Server (NTRS)
Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)
2011-01-01
Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified; the error is related to the verb used to describe the task. The likelihoods of occurrence, detection, and correction of the human error are identified, along with the severity of its effect. From these, the risk of potential harm is determined and compared with a risk threshold to identify whether corrective measures are appropriate.
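The scoring-and-thresholding step resembles a conventional FMEA risk priority number. The sketch below is a generic RPN illustration with made-up ratings and threshold, not the patented tool's actual formula:

```python
def risk_priority_number(occurrence, severity, detectability):
    """Classic FMEA RPN: each factor rated on a 1-10 scale, where a high
    detectability rating means the error is UNLIKELY to be caught."""
    return occurrence * severity * detectability

def corrective_measures_needed(rpn, threshold=200):
    """Flag the failure mode when its risk clears the review threshold."""
    return rpn >= threshold

# hypothetical human error: operator omits a checklist step
rpn = risk_priority_number(occurrence=6, severity=8, detectability=5)  # 240
flag = corrective_measures_needed(rpn)  # True: corrective action warranted
```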
Kodak, Tiffany; Campbell, Vincent; Bergmann, Samantha; LeBlanc, Brittany; Kurtz-Nelson, Eva; Cariveau, Tom; Haq, Shaji; Zemantic, Patricia; Mahon, Jacob
2016-09-01
Prior research shows that learners have idiosyncratic responses to error-correction procedures during instruction. Thus, assessments that identify error-correction strategies to include in instruction can aid practitioners in selecting individualized, efficacious, and efficient interventions. The current investigation conducted an assessment to compare 5 error-correction procedures that have been evaluated in the extant literature and are common in instructional practice for children with autism spectrum disorder (ASD). Results showed that the assessment identified efficacious and efficient error-correction procedures for all participants, and 1 procedure was efficient for 4 of the 5 participants. To examine the social validity of error-correction procedures, participants selected among efficacious and efficient interventions in a concurrent-chains assessment. We discuss the results in relation to prior research on error-correction procedures and current instructional practices for learners with ASD. © 2016 Society for the Experimental Analysis of Behavior.
The Type and Linguistic Foci of Oral Corrective Feedback in the L2 Classroom: A Meta-Analysis
ERIC Educational Resources Information Center
Brown, Dan
2016-01-01
Research on corrective feedback (CF), a central focus of second language acquisition (SLA), has increasingly examined how teachers employ CF in second language classrooms. Lyster and Ranta's (1997) seminal study identified six types of CF that teachers use in response to students' errors (recast, explicit correction, elicitation, clarification…
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
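The tree-generation idea can be illustrated with a one-level decision tree (a stump) that finds the single metric threshold best separating high-effort modules. The module metrics below are hypothetical, and the authors' actual algorithm and parameter combinations are not reproduced here:

```python
def best_stump(rows, labels):
    """One-level decision tree: choose the (feature, threshold) pair whose rule
    'metric > threshold => high effort' misclassifies the fewest modules."""
    best = None  # (errors, feature, threshold)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            errs = sum((r[f] > t) != y for r, y in zip(rows, labels))
            if best is None or errs < best[0]:
                best = (errs, f, t)
    return best[1], best[2]

# hypothetical module metrics: (source lines, number of changes)
modules = [(3200, 40), (11000, 95), (800, 5), (15000, 120)]
high_effort = [False, True, False, True]

feature, threshold = best_stump(modules, high_effort)
# splits on source lines > 3200, classifying all four modules correctly
```

A full tree generator applies the same search recursively to each side of the split; evaluating many attribute subsets and split criteria is what produced the 9,600 candidate trees above.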
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-04
... Development Center in Massachusetts so that it can be correctly identified through DNA analysis. (Since the... distinguished from each other. DNA analysis is the only way to accurately identify these insects.) The PPQ or... regulatory officials to identify and track specific specimens through the DNA identification tests that we...
Trounson, Justin S; Pfeifer, Jeffrey E
2017-10-01
This study explored correctional officers' response tendencies (i.e., cognitive, interpersonal, and behavioral response patterns they engage in) when managing workplace adversity. In total, 53 Australian correctional officers participated in the study. Eight exploratory focus group discussions ( n = 42) were conducted to identify a set of officer-endorsed response tendencies. Thematic analysis of group data revealed that correctional officers engage in a range of response tendencies when facing workplace adversity and that these tendencies may be categorized as interpersonally, cognitively, or behaviorally based. Semistructured interviews ( n = 11) were then conducted to provide further depth of information regarding officer response tendency usage. Results are discussed in terms of common themes, future research, and implications for developing training programs designed to ameliorate the effects of workplace adversity.
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
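The statistical-significance step can be illustrated with the standard E-value relation, the expected number of candidates scoring as well by chance. This is the generic definition, not MiCId's internal computation:

```python
def e_value(p_value, n_candidates):
    """Expected number of candidates scoring this well by chance alone.
    With thousands of sequenced microbes, a small p-value may still give E > 1."""
    return p_value * n_candidates

def significant(p_value, n_candidates, alpha=0.01):
    """Call an identification significant when its E-value is below alpha."""
    return e_value(p_value, n_candidates) < alpha

# a peptide-spectrum match with p = 1e-6 searched against 5,000 candidates
e = e_value(1e-6, 5000)  # 0.005: still significant at alpha = 0.01
```

The motivation matches the abstract's point: as the number of sequenced candidates grows, the same raw match score becomes less significant, so accurate E-values are what keep prioritization correct.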
All-digital precision processing of ERTS images
NASA Technical Reports Server (NTRS)
Bernstein, R. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.
Ashley, Laura; Armitage, Gerry; Taylor, Julie
2017-03-01
Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide future recommendations for the use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions locally for the safeguarding board to consider in its response to children exposed to domestic abuse. © 2016 John Wiley & Sons Ltd.
Identification and analysis of student conceptions used to solve chemical equilibrium problems
NASA Astrophysics Data System (ADS)
Voska, Kirk William
This study identified and quantified chemistry conceptions students use when solving chemical equilibrium problems requiring the application of Le Chatelier's principle, and explored the feasibility of designing a paper and pencil test for this purpose. It also demonstrated the utility of conditional probabilities to assess test quality. A 10-item pencil-and-paper, two-tier diagnostic instrument, the Test to Identify Student Conceptualizations (TISC), was developed and administered to 95 second-semester university general chemistry students after they received regular course instruction concerning equilibrium in homogeneous aqueous, heterogeneous aqueous, and homogeneous gaseous systems. The content validity of TISC was established through a review of TISC by a panel of experts; construct validity was established through semi-structured interviews and conditional probabilities. Nine students were then selected from a stratified random sample for interviews to validate TISC. The probability that TISC correctly identified an answer given by a student in an interview was p = .64, while the probability that TISC correctly identified a reason given by a student in an interview was p = .49. Each TISC item contained two parts. In the first part the student selected the correct answer to a problem from a set of four choices. In the second part students wrote reasons for their answer to the first part. TISC questions were designed to identify students' conceptions concerning the application of Le Chatelier's principle, the constancy of the equilibrium constant, K, and the effect of a catalyst. Eleven prevalent incorrect conceptions were identified. This study found students consistently selected correct answers more frequently (53% of the time) than they provided correct reasons (33% of the time). The association between student answers and respective reasons on each TISC item was quantified using conditional probabilities calculated from logistic regression coefficients.
The probability a student provided correct reasoning (B) when the student selected a correct answer (A) ranged from P(B|A) = .32 to P(B|A) = .82. However, the probability a student selected a correct answer when they provided correct reasoning ranged from P(A|B) = .96 to P(A|B) = 1. The K-R 20 reliability for TISC was found to be .79.
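The conditional probabilities reported can be recovered from the 2×2 counts of (correct answer, correct reason) on an item. The counts below are hypothetical but reproduce the reported pattern of correct answers outpacing correct reasons and P(A|B) approaching 1:

```python
def conditional_probs(n_both, n_answer, n_reason):
    """P(B|A) and P(A|B), where A = correct answer, B = correct reasoning,
    from the joint count and the two marginal counts."""
    return n_both / n_answer, n_both / n_reason

# hypothetical counts for one TISC item (95 students): 50 correct answers,
# 31 correct reasons, 30 students with both
p_reason_given_answer, p_answer_given_reason = conditional_probs(30, 50, 31)
# 0.60 and ~0.97: correct answers outnumber correct reasons, and correct
# reasoning almost guarantees a correct answer
```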
Kury, Fabrício S P; Cimino, James J
2015-01-01
The corrections ("stipulations") to a proposed research study protocol produced by an institutional review board (IRB) can often be repetitive across many studies; however, there is no standard set of stipulations that could be used, for example, by researchers wishing to anticipate and correct problems in their research proposals prior to submitting to an IRB. The objective of this research was to computationally identify the most repetitive types of stipulations generated in the course of IRB deliberations. The text of each stipulation was normalized using natural language processing techniques. An undirected weighted network was constructed in which each stipulation was represented by a node, and each link, if present, had a weight corresponding to the TF-IDF cosine similarity of the stipulations. Network analysis software was then used to identify clusters in the network representing similar stipulations. The final results were correlated with additional data to produce further insights about the IRB workflow. From a corpus of 18,582 stipulations we identified 31 types of repetitive stipulations. Those types accounted for 3,870 stipulations (20.8% of the corpus) produced for 697 (88.7%) of all protocols in 392 (also 88.7%) of all the CNS IRB meetings with stipulations entered in our data source. A notable proportion of the corrections produced by the IRB can therefore be considered highly repetitive. Our shareable method relied on minimal manual analysis and provides an intuitive exploration with theoretically unbounded granularity. Finer granularity yielded these insights without requiring knowledge of IRB panel expertise or other human supervision.
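The similarity network described can be sketched with a minimal TF-IDF and cosine-similarity implementation. The stipulation texts and link threshold below are illustrative, not drawn from the study corpus:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """TF-IDF weight of each term in each normalized stipulation."""
    df = Counter(t for d in docs for t in set(d.split()))
    n = len(docs)
    return [{t: c * math.log(n / df[t]) for t, c in Counter(d.split()).items()}
            for d in docs]

def cosine(u, v):
    """Cosine similarity of two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

stipulations = [
    "revise consent form reading level",
    "revise consent form contact section",
    "clarify radiation dose in protocol",
]
vecs = tfidf_vectors(stipulations)
# link stipulations whose similarity clears a threshold; clusters of linked
# nodes then correspond to repetitive stipulation types
links = [(i, j) for i in range(len(vecs)) for j in range(i + 1, len(vecs))
         if cosine(vecs[i], vecs[j]) > 0.1]
```

Here only the two consent-form stipulations are linked, so they form one cluster, which is the network-analysis step of the pipeline in miniature.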
NASA Technical Reports Server (NTRS)
Wiseman, S.M.; Arvidson, R.E.; Wolff, M. J.; Smith, M. D.; Seelos, F. P.; Morgan, F.; Murchie, S. L.; Mustard, J. F.; Morris, R. V.; Humm, D.;
2014-01-01
The empirical volcano-scan atmospheric correction is widely applied to Martian near infrared CRISM and OMEGA spectra between 1000 and 2600 nanometers to remove prominent atmospheric gas absorptions with minimal computational investment. This correction method employs division by a scaled empirically-derived atmospheric transmission spectrum that is generated from observations of the Martian surface in which different path lengths through the atmosphere were measured and transmission calculated using the Beer-Lambert Law. Identifying and characterizing both artifacts and residual atmospheric features left by the volcano-scan correction is important for robust interpretation of CRISM and OMEGA volcano scan corrected spectra. In order to identify and determine the cause of spectral artifacts introduced by the volcano-scan correction, we simulated this correction using a multiple scattering radiative transfer algorithm (DISORT). Simulated transmission spectra that are similar to actual CRISM- and OMEGA-derived transmission spectra were generated from modeled Olympus Mons base and summit spectra. Results from the simulations were used to investigate the validity of assumptions inherent in the volcano-scan correction and to identify artifacts introduced by this method of atmospheric correction. We found that the most prominent artifact, a bowl-shaped feature centered near 2000 nanometers, is caused by the inaccurate assumption that absorption coefficients of CO2 in the Martian atmosphere are independent of column density. In addition, spectral albedo and slope are modified by atmospheric aerosols. Residual atmospheric contributions that are caused by variable amounts of dust aerosols, ice aerosols, and water vapor are characterized by the analysis of CRISM volcano-scan corrected spectra from the same location acquired at different times under variable atmospheric conditions.
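The Beer-Lambert step can be sketched: given radiance spectra measured through two different atmospheric path lengths (e.g. the base and summit of Olympus Mons), a one-airmass transmission spectrum follows, and correction is division by a scaled power of it. The three-channel spectra and scale factor below are illustrative numbers only:

```python
def transmission_one_airmass(base, summit, dm):
    """Beer-Lambert inversion: base/summit = T**dm, with dm the difference in
    atmospheric path length (in airmasses) between the two observations."""
    return [(b / s) ** (1.0 / dm) for b, s in zip(base, summit)]

def volcano_scan_correct(spectrum, trans, scale):
    """Divide by the transmission spectrum raised to a scale factor chosen to
    cancel the gas band in the target spectrum."""
    return [r / (t ** scale) for r, t in zip(spectrum, trans)]

# illustrative 3-channel spectra with a CO2 band in the middle channel
base   = [0.30, 0.18, 0.30]  # long path through the atmosphere: deep band
summit = [0.30, 0.27, 0.30]  # short path: shallower band
trans = transmission_one_airmass(base, summit, dm=1.5)
corrected = volcano_scan_correct([0.25, 0.19, 0.25], trans, scale=1.2)
# an overestimated scale leaves corrected[1] above its neighbors: a residual
# artifact analogous to the bowl-shaped feature analyzed in the study
```

The derivation assumes the band scales with path length at a fixed absorption coefficient; when that assumption fails, as the study shows for CO2 at varying column density, the division leaves exactly this kind of residual.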
ERIC Educational Resources Information Center
McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.
2016-01-01
Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…
Convergent genetic and expression data implicate immunity in Alzheimer's disease
Jones, Lesley; Lambert, Jean-Charles; Wang, Li-San; Choi, Seung-Hoan; Harold, Denise; Vedernikov, Alexey; Escott-Price, Valentina; Stone, Timothy; Richards, Alexander; Bellenguez, Céline; Ibrahim-Verbaas, Carla A; Naj, Adam C; Sims, Rebecca; Gerrish, Amy; Jun, Gyungah; DeStefano, Anita L; Bis, Joshua C; Beecham, Gary W; Grenier-Boley, Benjamin; Russo, Giancarlo; Thornton-Wells, Tricia A; Jones, Nicola; Smith, Albert V; Chouraki, Vincent; Thomas, Charlene; Ikram, M Arfan; Zelenika, Diana; Vardarajan, Badri N; Kamatani, Yoichiro; Lin, Chiao-Feng; Schmidt, Helena; Kunkle, Brian; Dunstan, Melanie L; Ruiz, Agustin; Bihoreau, Marie-Thérèse; Reitz, Christiane; Pasquier, Florence; Hollingworth, Paul; Hanon, Olivier; Fitzpatrick, Annette L; Buxbaum, Joseph D; Campion, Dominique; Crane, Paul K; Becker, Tim; Gudnason, Vilmundur; Cruchaga, Carlos; Craig, David; Amin, Najaf; Berr, Claudine; Lopez, Oscar L; De Jager, Philip L; Deramecourt, Vincent; Johnston, Janet A; Evans, Denis; Lovestone, Simon; Letteneur, Luc; Kornhuber, Johanes; Tárraga, Lluís; Rubinsztein, David C; Eiriksdottir, Gudny; Sleegers, Kristel; Goate, Alison M; Fiévet, Nathalie; Huentelman, Matthew J; Gill, Michael; Emilsson, Valur; Brown, Kristelle; Kamboh, M Ilyas; Keller, Lina; Barberger-Gateau, Pascale; McGuinness, Bernadette; Larson, Eric B; Myers, Amanda J; Dufouil, Carole; Todd, Stephen; Wallon, David; Love, Seth; Kehoe, Pat; Rogaeva, Ekaterina; Gallacher, John; George-Hyslop, Peter St; Clarimon, Jordi; Lleὀ, Alberti; Bayer, Anthony; Tsuang, Debby W; Yu, Lei; Tsolaki, Magda; Bossù, Paola; Spalletta, Gianfranco; Proitsi, Petra; Collinge, John; Sorbi, Sandro; Garcia, Florentino Sanchez; Fox, Nick; Hardy, John; Naranjo, Maria Candida Deniz; Razquin, Cristina; Bosco, Paola; Clarke, Robert; Brayne, Carol; Galimberti, Daniela; Mancuso, Michelangelo; Moebus, Susanne; Mecocci, Patrizia; del Zompo, Maria; Maier, Wolfgang; Hampel, Harald; Pilotto, Alberto; Bullido, Maria; Panza, Francesco; Caffarra, Paolo; Nacmias, 
Benedetta; Gilbert, John R; Mayhaus, Manuel; Jessen, Frank; Dichgans, Martin; Lannfelt, Lars; Hakonarson, Hakon; Pichler, Sabrina; Carrasquillo, Minerva M; Ingelsson, Martin; Beekly, Duane; Alavarez, Victoria; Zou, Fanggeng; Valladares, Otto; Younkin, Steven G; Coto, Eliecer; Hamilton-Nelson, Kara L; Mateo, Ignacio; Owen, Michael J; Faber, Kelley M; Jonsson, Palmi V; Combarros, Onofre; O'Donovan, Michael C; Cantwell, Laura B; Soininen, Hilkka; Blacker, Deborah; Mead, Simon; Mosley, Thomas H; Bennett, David A; Harris, Tamara B; Fratiglioni, Laura; Holmes, Clive; de Bruijn, Renee FAG; Passmore, Peter; Montine, Thomas J; Bettens, Karolien; Rotter, Jerome I; Brice, Alexis; Morgan, Kevin; Foroud, Tatiana M; Kukull, Walter A; Hannequin, Didier; Powell, John F; Nalls, Michael A; Ritchie, Karen; Lunetta, Kathryn L; Kauwe, John SK; Boerwinkle, Eric; Riemenschneider, Matthias; Boada, Mercè; Hiltunen, Mikko; Martin, Eden R; Pastor, Pau; Schmidt, Reinhold; Rujescu, Dan; Dartigues, Jean-François; Mayeux, Richard; Tzourio, Christophe; Hofman, Albert; Nöthen, Markus M; Graff, Caroline; Psaty, Bruce M; Haines, Jonathan L; Lathrop, Mark; Pericak-Vance, Margaret A; Launer, Lenore J; Farrer, Lindsay A; van Duijn, Cornelia M; Van Broekhoven, Christine; Ramirez, Alfredo; Schellenberg, Gerard D; Seshadri, Sudha; Amouyel, Philippe; Holmans, Peter A
2015-01-01
Background: Late-onset Alzheimer's disease (AD) is heritable, with 20 genes showing genome-wide association in the International Genomics of Alzheimer's Project (IGAP). To identify the biology underlying the disease we extended these genetic data in a pathway analysis. Methods: The ALIGATOR and GSEA algorithms were used in the IGAP data to identify associated functional pathways and correlated gene expression networks in human brain. Results: ALIGATOR identified an excess of curated biological pathways showing enrichment of association. Enriched areas of biology included the immune response (p = 3.27 × 10^-12 after multiple testing correction for pathways), regulation of endocytosis (p = 1.31 × 10^-11), cholesterol transport (p = 2.96 × 10^-9) and proteasome-ubiquitin activity (p = 1.34 × 10^-6). Correlated gene expression analysis identified four significant network modules, all related to the immune response (corrected p = 0.002–0.05). Conclusions: The immune response, regulation of endocytosis, cholesterol transport and protein ubiquitination represent prime targets for AD therapeutics. PMID:25533204
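Pathway enrichment of this kind is conventionally assessed with a hypergeometric tail probability: is the overlap between associated genes and a pathway's gene set larger than chance? The sketch below is that generic test with illustrative numbers, not the ALIGATOR algorithm itself:

```python
from math import comb

def enrichment_p(n_genome, n_pathway, n_hits, n_overlap):
    """Hypergeometric tail: probability of seeing >= n_overlap pathway genes
    among n_hits associated genes drawn at random from the genome."""
    total = comb(n_genome, n_hits)
    return sum(comb(n_pathway, k) * comb(n_genome - n_pathway, n_hits - k)
               for k in range(n_overlap, min(n_pathway, n_hits) + 1)) / total

# illustrative numbers: 60 of 20,000 genes sit in an immune pathway, and 5 of
# 100 disease-associated genes fall in it (only 0.3 expected by chance)
p = enrichment_p(20000, 60, 100, 5)
# a multiple-testing correction, as reported above, then scales p by the
# number of pathways tested
```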
Convergent genetic and expression data implicate immunity in Alzheimer's disease.
2015-06-01
Late-onset Alzheimer's disease (AD) is heritable with 20 genes showing genome-wide association in the International Genomics of Alzheimer's Project (IGAP). To identify the biology underlying the disease, we extended these genetic data in a pathway analysis. The ALIGATOR and GSEA algorithms were used in the IGAP data to identify associated functional pathways and correlated gene expression networks in human brain. ALIGATOR identified an excess of curated biological pathways showing enrichment of association. Enriched areas of biology included the immune response (P = 3.27 × 10(-12) after multiple testing correction for pathways), regulation of endocytosis (P = 1.31 × 10(-11)), cholesterol transport (P = 2.96 × 10(-9)), and proteasome-ubiquitin activity (P = 1.34 × 10(-6)). Correlated gene expression analysis identified four significant network modules, all related to the immune response (corrected P = .002-.05). The immune response, regulation of endocytosis, cholesterol transport, and protein ubiquitination represent prime targets for AD therapeutics. Copyright © 2015. Published by Elsevier Inc.
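The pathway p-values above are reported after multiple-testing correction. As a hedged illustration of one standard approach to such correction (not necessarily the procedure ALIGATOR applies internally), the Benjamini-Hochberg false-discovery-rate adjustment can be sketched in a few lines; the raw p-values below are hypothetical stand-ins, not the IGAP values:

```python
def benjamini_hochberg(pvalues):
    """Return BH-adjusted p-values (FDR) in the original input order."""
    n = len(pvalues)
    order = sorted(range(n), key=lambda i: pvalues[i])
    adjusted = [0.0] * n
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity
    for rank in range(n, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvalues[i] * n / rank)
        adjusted[i] = prev
    return adjusted

# Hypothetical raw pathway p-values (illustrative only)
raw = [3.0e-12, 1.3e-11, 3.0e-9, 1.3e-6, 0.04]
print(benjamini_hochberg(raw))
```

The adjustment multiplies each p-value by n/rank and then enforces monotonicity from the top down, so a small p-value can never receive a larger adjusted value than a bigger one.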
Methylation analysis of polysaccharides: Technical advice.
Sims, Ian M; Carnachan, Susan M; Bell, Tracey J; Hinkley, Simon F R
2018-05-15
Glycosyl linkage (methylation) analysis is used widely for the structural determination of oligo- and poly-saccharides. The procedure involves derivatisation of the individual component sugars of a polysaccharide to partially methylated alditol acetates which are analysed and quantified by gas chromatography-mass spectrometry. The linkage positions for each component sugar can be determined by correctly identifying the partially methylated alditol acetates. Although the methods are well established, there are many technical aspects to this procedure and both careful attention to detail and considerable experience are required to achieve a successful methylation analysis and to correctly interpret the data generated. The aim of this article is to provide the technical details and critical procedural steps necessary for a successful methylation analysis and to assist researchers (a) with interpreting data correctly and (b) in providing the comprehensive data required for reviewers to fully assess the work. Copyright © 2018 Elsevier Ltd. All rights reserved.
Kangas, Michael J; Burks, Raychelle M; Atwater, Jordyn; Lukowicz, Rachel M; Garver, Billy; Holmes, Andrea E
2018-02-01
With the increasing availability of digital imaging devices, colorimetric sensor arrays are rapidly becoming a simple yet effective tool for the identification and quantification of various analytes. Colorimetric arrays combine data from many colorimetric sensors, and the multidimensional nature of the resulting data necessitates chemometric analysis. Herein, an 8-sensor colorimetric array was used to analyze selected acidic and basic samples (0.5-10 M) to determine which chemometric methods are best suited for classification and quantification of analytes within clusters. PCA, HCA, and LDA were used to visualize the data set. All three methods showed well-separated clusters for each of the acid or base analytes and moderate separation between analyte concentrations, indicating that the sensor array can be used to identify and quantify samples. Furthermore, PCA could be used to determine which sensors showed the most effective analyte identification. LDA, KNN, and HQI were used for identification of analyte and concentration. HQI and KNN correctly identified the analytes in all cases, while LDA correctly identified 95 of 96 analytes. Additional studies demonstrated that controlling for solvent and image effects was unnecessary for all chemometric methods utilized in this study.
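Of the methods named above, the hit quality index (HQI) is the simplest to sketch: the unknown's vector of sensor responses is matched against each library vector by squared Pearson correlation, and the best-scoring library entry wins. The sketch below is a minimal plain-Python illustration; the 8-sensor readings are entirely hypothetical, not data from the study:

```python
def hqi(u, v):
    """Hit quality index: squared Pearson correlation of two response vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [x - mu for x in u]
    dv = [x - mv for x in v]
    num = sum(a * b for a, b in zip(du, dv)) ** 2
    den = sum(a * a for a in du) * sum(b * b for b in dv)
    return num / den if den else 0.0

def identify(unknown, library):
    """Return the library entry whose reference vector gives the highest HQI."""
    return max(library, key=lambda name: hqi(unknown, library[name]))

# Hypothetical 8-sensor responses (illustrative only)
library = {
    "HCl 1M":  [10, 240, 30, 5, 200, 90, 15, 60],
    "NaOH 1M": [200, 20, 180, 90, 10, 150, 220, 40],
}
sample = [12, 235, 28, 7, 198, 88, 17, 58]   # noisy HCl-like reading
print(identify(sample, library))
```

A perfect match gives HQI = 1.0 exactly, so the score doubles as a match-quality diagnostic, not just a classifier.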
Comparison of methods for the identification of microorganisms isolated from blood cultures.
Monteiro, Aydir Cecília Marinho; Fortaleza, Carlos Magno Castelo Branco; Ferreira, Adriano Martison; Cavalcante, Ricardo de Souza; Mondelli, Alessandro Lia; Bagagli, Eduardo; da Cunha, Maria de Lourdes Ribeiro de Souza
2016-08-05
Bloodstream infections are responsible for thousands of deaths each year. The rapid identification of the microorganisms causing these infections permits correct therapeutic management that will improve the prognosis of the patient. In an attempt to reduce the time spent on this step, microorganism identification devices have been developed, including the VITEK(®) 2 system, which is currently used in routine clinical microbiology laboratories. This study evaluated the accuracy of the VITEK(®) 2 system in the identification of 400 microorganisms isolated from blood cultures and compared the results to those obtained with conventional phenotypic and genotypic methods. In parallel to the phenotypic identification methods, the DNA of these microorganisms was extracted directly from the blood culture bottles for genotypic identification by the polymerase chain reaction (PCR) and DNA sequencing. The automated VITEK(®) 2 system correctly identified 94.7 % (379/400) of the isolates. The YST and GN cards resulted in 100 % correct identifications of yeasts (15/15) and Gram-negative bacilli (165/165), respectively. The GP card correctly identified 92.6 % (199/215) of Gram-positive cocci, while the ANC card was unable to correctly identify any Gram-positive bacilli (0/5). The performance of the VITEK(®) 2 system was considered acceptable and statistical analysis showed that the system is a suitable option for routine clinical microbiology laboratories to identify different microorganisms.
Identifying the theory of dark matter with direct detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gluscevic, Vera; Gresham, Moira I.; McDermott, Samuel D.
2015-12-01
Identifying the true theory of dark matter depends crucially on accurately characterizing interactions of dark matter (DM) with other species. In the context of DM direct detection, we present a study of the prospects for correctly identifying the low-energy effective DM-nucleus scattering operators connected to UV-complete models of DM-quark interactions. We take a census of plausible UV-complete interaction models with different low-energy leading-order DM-nuclear responses. For each model (corresponding to different spin-, momentum-, and velocity-dependent responses), we create a large number of realizations of recoil-energy spectra, and use Bayesian methods to investigate the probability that experiments will be able to select the correct scattering model within a broad set of competing scattering hypotheses. We conclude that agnostic analysis of a strong signal (such as Generation-2 experiments would see if cross sections are just below the current limits) seen on xenon and germanium experiments is likely to correctly identify the momentum dependence of the dominant response, ruling out models with either 'heavy' or 'light' mediators, and enabling downselection of allowed models. However, a unique determination of the correct UV completion will critically depend on the availability of measurements from a wider variety of nuclear targets, including iodine or fluorine. We investigate how model-selection prospects depend on the energy window available for the analysis. In addition, we discuss the accuracy of DM particle mass determination under a wide variety of scattering models, and investigate the impact of specific types of particle-physics uncertainties on prospects for model selection.
Statistical inference of static analysis rules
NASA Technical Reports Server (NTRS)
Engler, Dawson Richards (Inventor)
2009-01-01
Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to that rule are counted in the program code; each code instance has an associated counted number of observances of the correctness rule. Respective numbers of violations of the correctness rule by different code instances that relate to the rule are also counted; each code instance has an associated counted number of violations. A respective likelihood of validity is determined for each code instance as a function of the counted numbers of observances and violations. The likelihood of validity indicates the relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of the violated correctness rule.
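The counting scheme described here can be illustrated with a minimal sketch. The score below is a simple z-like statistic of our own choosing (the patent abstract does not prescribe a specific formula), and the instance names and counts are hypothetical:

```python
from math import sqrt

def rank_violations(counts):
    """counts: {instance: (observances, violations)}.
    Rank code instances so that violations of rules with strong supporting
    evidence (many observances, few violations) float to the top."""
    def score(item):
        obs, viol = item[1]
        n = obs + viol
        p = obs / n          # observed rate of following the candidate rule
        # z-like statistic against the null that the "rule" is a coin flip
        return (p - 0.5) * sqrt(n)
    return sorted(counts.items(), key=score, reverse=True)

# Hypothetical counts for the rule "lock() must be paired with unlock()"
counts = {
    "driver.c:lock_a": (98, 2),  # almost always observed -> likely a real rule
    "misc.c:lock_b":   (3, 3),   # mixed evidence -> probably not a rule here
}
print(rank_violations(counts))
```

The intuition matches the abstract: a violation is only interesting if the surrounding codebase shows the rule is usually observed, so instances with high observance counts rank above instances with ambiguous evidence.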
Species identification of corynebacteria by cellular fatty acid analysis.
Van den Velde, Sandra; Lagrou, Katrien; Desmet, Koen; Wauters, Georges; Verhaegen, Jan
2006-02-01
We evaluated the usefulness of cellular fatty acid analysis for the identification of corynebacteria. To this end, 219 well-characterized strains belonging to 21 Corynebacterium species were analyzed with the Sherlock System of MIDI (Newark, DE). Most Corynebacterium species have a qualitatively different fatty acid profile. Corynebacterium coyleae (subgroup 1), Corynebacterium riegelii, Corynebacterium simulans, and Corynebacterium imitans differ only quantitatively. Corynebacterium afermentans afermentans and C. coyleae (subgroup 2) have similar qualitative and quantitative profiles. The commercially available database (CLIN 40, MIDI) identified only one third of the 219 strains correctly at the species level. We created a new database with these 219 strains. This new database was tested with 34 clinical isolates and correctly identified 29 of them. Strains that remained unidentified were 2 Corynebacterium aurimucosum (not included in our database), 1 C. afermentans afermentans, and 2 Corynebacterium pseudodiphtheriticum. Cellular fatty acid analysis with a self-created database can be used for the identification and differentiation of corynebacteria.
Assessing Feedback in a Mobile Videogame.
Brand, Leah; Beltran, Alicia; Hughes, Sheryl; O'Connor, Teresia; Baranowski, Janice; Nicklas, Theresa; Chen, Tzu-An; Dadabhoy, Hafza R; Diep, Cassandra S; Buday, Richard; Baranowski, Tom
2016-06-01
Player feedback is an important part of serious games, although there is no consensus regarding its delivery or optimal content. "Mommio" is a serious game designed to help mothers motivate their preschoolers to eat vegetables. The purpose of this study was to assess optimal format and content of player feedback for use in "Mommio." The current study posed 36 potential "Mommio" gameplay feedback statements to 20 mothers using a Web survey and interview. Mothers were asked about the meaning and helpfulness of each feedback statement. Several themes emerged upon thematic analysis, including identifying an effective alternative in the case of corrective feedback, avoiding vague wording, using succinct and correct grammar, avoiding provocation of guilt, and clearly identifying why players' game choice was correct or incorrect. Guidelines are proposed for future feedback statements.
Analysis of Levene's Test under Design Imbalance.
ERIC Educational Resources Information Center
Keyes, Tim K.; Levy, Martin S.
1997-01-01
H. Levene (1960) proposed a heuristic test for heteroscedasticity in the case of a balanced two-way layout, based on analysis of variance of absolute residuals. Conditions under which design imbalance affects the test's characteristics are identified, and a simple correction involving leverage is proposed. (SLD)
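Levene's heuristic is exactly a one-way ANOVA F statistic computed on absolute deviations from each group's mean. A minimal stdlib sketch of the mean-centred variant follows; note it implements only the basic balanced-design statistic, not the leverage correction for imbalance that the article proposes:

```python
def levene_W(*groups):
    """Levene's test statistic: one-way ANOVA F on absolute deviations
    from each group's mean (Levene 1960, mean-centred variant)."""
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    k = len(z)
    N = sum(len(g) for g in z)
    zbar = sum(sum(g) for g in z) / N                  # grand mean of |dev|
    zbars = [sum(g) / len(g) for g in z]               # group means of |dev|
    between = sum(len(g) * (zb - zbar) ** 2 for g, zb in zip(z, zbars))
    within = sum(sum((x - zb) ** 2 for x in g) for g, zb in zip(z, zbars))
    return ((N - k) / (k - 1)) * (between / within)

# Two groups with visibly different spread give a large W
a = [1.0, 1.1, 0.9, 1.0, 1.05]
b = [0.0, 2.0, -1.0, 3.0, 1.0]
print(levene_W(a, b))
```

The statistic is referred to an F distribution with (k-1, N-k) degrees of freedom; large values indicate heteroscedasticity.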
DOE Office of Scientific and Technical Information (OSTI.GOV)
NNSA /NSO
The Corrective Action Investigation Plan contains the U.S. Department of Energy, National Nuclear Security Administration Nevada Operations Office's approach to collect the data necessary to evaluate corrective action alternatives appropriate for the closure of Corrective Action Unit (CAU) 204 under the Federal Facility Agreement and Consent Order. Corrective Action Unit 204 is located on the Nevada Test Site approximately 65 miles northwest of Las Vegas, Nevada. This CAU comprises six Corrective Action Sites (CASs) which include: 01-34-01, Underground Instrument House Bunker; 02-34-01, Instrument Bunker; 03-34-01, Underground Bunker; 05-18-02, Chemical Explosives Storage; 05-33-01, Kay Blockhouse; 05-99-02, Explosive Storage Bunker. Based on site history, process knowledge, and previous field efforts, contaminants of potential concern for Corrective Action Unit 204 collectively include radionuclides, beryllium, high explosives, lead, polychlorinated biphenyls, total petroleum hydrocarbons, silver, warfarin, and zinc phosphide. The primary question for the investigation is: ''Are existing data sufficient to evaluate appropriate corrective actions?'' To address this question, resolution of two decision statements is required. Decision I is to ''Define the nature of contamination'' by identifying any contamination above preliminary action levels (PALs); Decision II is to ''Determine the extent of contamination identified above PALs.'' If PALs are not exceeded, the investigation is completed. If PALs are exceeded, then Decision II must be resolved. In addition, data will be obtained to support waste management decisions. 
Field activities will include radiological land area surveys, geophysical surveys to identify any subsurface metallic and nonmetallic debris, field screening for applicable contaminants of potential concern, collection and analysis of surface and subsurface soil samples from biased locations, and step-out sampling to define the extent of contamination, as necessary. The results of this field investigation will support a defensible evaluation of corrective action alternatives in the corrective action decision document.
Weinberger, Sarah; Klarholz-Pevere, Carola; Liefeldt, Lutz; Baeder, Michael; Steckhan, Nico; Friedersdorff, Frank
2018-03-22
To analyse the influence of CT-based depth correction in the assessment of split renal function in potential living kidney donors. In 116 consecutive living kidney donors, preoperative split renal function was assessed using CT-based depth correction. The influence on donor side selection and on postoperative renal function of the living kidney donors was analyzed. Linear regression analysis was performed to identify predictors of postoperative renal function. A left versus right kidney depth variation of more than 1 cm was found in 40/114 donors (35%). Eleven patients (10%) had a difference of more than 5% in relative renal function after depth correction. Kidney depth variation and changes in relative renal function after depth correction would have influenced side selection in 30 of 114 living kidney donors. CT depth correction did not improve the predictability of postoperative renal function of the living kidney donor. In general, it was not possible to predict the postoperative renal function from preoperative total and relative renal function. In multivariate linear regression analysis, age and BMI were identified as the most important predictors of postoperative renal function of the living kidney donors. Our results clearly indicate that, concerning the postoperative renal function of living kidney donors, the relative renal function of the donated kidney seems to be less important than other factors. A multimodal assessment with consideration of all available results, including kidney size, location of the kidney, and split renal function, remains necessary.
Chung, Yi-Chieh; Cannella-Malone, Helen I
2010-11-01
This study examined the effects of presession exposure to attention, response blocking, attention with response blocking, and noninteraction conditions on subsequent engagement in automatically maintained challenging behavior and correct responding in four individuals with significant intellectual disabilities. Following a functional analysis, the effects of the four presession conditions were examined using multielement designs. Results varied across the 4 participants (e.g., presession noninteraction acted as an abolishing operation for 2 participants, but as an establishing operation for the other 2). As such, the results both replicated and contradicted previous research examining the effects of motivating operations on automatically maintained challenging behavior. Although the results varied across participants, at least one condition that decreased challenging behavior and increased correct responding was identified for each participant. These findings suggest that presession manipulations that decrease subsequent automatically maintained challenging behavior while simultaneously increasing correct responding may need to be identified individually when the maintaining contingencies cannot be identified.
Assessing Feedback in a Mobile Videogame
Brand, Leah; Beltran, Alicia; Hughes, Sheryl; O'Connor, Teresia; Baranowski, Janice; Nicklas, Theresa; Chen, Tzu-An; Dadabhoy, Hafza R.; Diep, Cassandra S.; Buday, Richard
2016-01-01
Abstract Background: Player feedback is an important part of serious games, although there is no consensus regarding its delivery or optimal content. “Mommio” is a serious game designed to help mothers motivate their preschoolers to eat vegetables. The purpose of this study was to assess optimal format and content of player feedback for use in “Mommio.” Materials and Methods: The current study posed 36 potential “Mommio” gameplay feedback statements to 20 mothers using a Web survey and interview. Mothers were asked about the meaning and helpfulness of each feedback statement. Results: Several themes emerged upon thematic analysis, including identifying an effective alternative in the case of corrective feedback, avoiding vague wording, using succinct and correct grammar, avoiding provocation of guilt, and clearly identifying why players' game choice was correct or incorrect. Conclusions: Guidelines are proposed for future feedback statements. PMID:27058403
Hiroyasu, Tomoyuki; Hayashinuma, Katsutoshi; Ichikawa, Hiroshi; Yagi, Nobuaki
2015-08-01
A preprocessing method for endoscopy image analysis using texture analysis is proposed. In a previous study, we proposed a feature value that combines a co-occurrence matrix and a run-length matrix to analyze the extent of early gastric cancer from images taken with narrow-band imaging endoscopy. However, the obtained feature value does not identify lesion zones correctly due to the influence of noise and halation. Therefore, we propose a new preprocessing method with a non-local means filter for de-noising and contrast limited adaptive histogram equalization. We have confirmed that the pattern of gastric mucosa in images can be improved by the proposed method. Furthermore, the lesion zone is shown more correctly by the obtained color map.
De-identifying an EHR database - anonymity, correctness and readability of the medical record.
Pantazos, Kostas; Lauesen, Soren; Lippert, Soren
2011-01-01
Electronic health records (EHR) contain a large amount of structured data and free text. Exploring and sharing clinical data can improve healthcare and facilitate the development of medical software. However, revealing confidential information is against ethical principles and laws. We de-identified a Danish EHR database with 437,164 patients. The goal was to generate a version with real medical records, but related to artificial persons. We developed a de-identification algorithm that uses lists of named entities, simple language analysis, and special rules. Our algorithm consists of 3 steps: collect lists of identifiers from the database and external resources, define a replacement for each identifier, and replace identifiers in structured data and free text. Some patient records could not be safely de-identified, so the de-identified database has 323,122 patient records with an acceptable degree of anonymity, readability and correctness (F-measure of 95%). The algorithm has to be adjusted for each culture, language and database.
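The replacement step of such an algorithm can be sketched as follows. This is a toy illustration, not the authors' implementation: the name list, the CPR-like number pattern, and the hashing scheme are all assumptions made for the example. The key property shown is the one the article emphasizes: each identifier must map to the same artificial value everywhere, so the de-identified record stays internally consistent and readable.

```python
import hashlib
import re

# Hypothetical identifier list; a real system would load such lists
# from the database and external name registries.
NAME_MAP = {"Jens": "Anders", "Hansen": "Moeller"}

def pseudonym(word):
    """Deterministic replacement: the same identifier always maps to the
    same artificial value across the whole database."""
    if word in NAME_MAP:
        return NAME_MAP[word]
    # CPR-like numbers (ddmmyy-nnnn) get a stable fake number of the same shape
    if re.fullmatch(r"\d{6}-\d{4}", word):
        digest = hashlib.sha256(word.encode()).hexdigest()
        head = int(digest[:8], 16) % 1000000
        tail = int(digest[8:12], 16) % 10000
        return f"{head:06d}-{tail:04d}"
    return word

def deidentify(text):
    return " ".join(pseudonym(w) for w in text.split())

print(deidentify("Jens Hansen 010203-1234 admitted with fever"))
```

Free-text de-identification in practice needs far more than token lookup (context rules, abbreviations, misspellings), which is why the authors report that some records could not be safely de-identified at all.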
Adderson, Elisabeth E.; Boudreaux, Jan W.; Cummings, Jessica R.; Pounds, Stanley; Wilson, Deborah A.; Procop, Gary W.; Hayden, Randall T.
2008-01-01
We compared the relative levels of effectiveness of three commercial identification kits and three nucleic acid amplification tests for the identification of coryneform bacteria by testing 50 diverse isolates, including 12 well-characterized control strains and 38 organisms obtained from pediatric oncology patients at our institution. Between 33.3 and 75.0% of control strains were correctly identified to the species level by phenotypic systems or nucleic acid amplification assays. The most sensitive tests were the API Coryne system and amplification and sequencing of the 16S rRNA gene using primers optimized for coryneform bacteria, which correctly identified 9 of 12 control isolates to the species level; all strains with a high-confidence call were correctly identified. Organisms not correctly identified were species not included in the test kit databases, species not producing a pattern of reactions included in kit databases, or species that could not be differentiated among several genospecies based on reaction patterns. Nucleic acid amplification assays had limited abilities to identify some bacteria to the species level, and comparison of sequence homologies was complicated by the inclusion of allele sequences obtained from uncultivated and uncharacterized strains in databases. The utility of rpoB genotyping was limited by the small number of representative gene sequences that are currently available for comparison. The correlation between identifications produced by different classification systems was poor, particularly for clinical isolates. PMID:18160450
Bidra, Avinash S; Nguyen, Viensuong; Manzotti, Anna; Kuo, Chia-Ling
2018-01-01
To study the subjective differences in direct lip support assessments and to determine if dentists and laypeople are able to discern and correctly identify direct changes in lip support between flange and flangeless dentures. A random sample of 20 maxillary edentulous patients described in part 2 of the study was used for analysis. A total of 60 judges comprising 15 general dentists, 15 prosthodontists, and 30 laypeople, the majority of whom were distinct from part 2 of the study, were recruited. All images used in this study were cropped at the infraorbital level and converted to black and white tone to encourage the judges to focus on lip support. The judges were unblinded to the study objectives, told what to look for, and asked to rate the lip support of each of the 80 images on a 100 mm visual analog scale (VAS). The judges then took a discriminatory sensory analysis test (triangle test) in which they were required to correctly identify the image with a flangeless denture out of a set of 3 images. Both the VAS and triangle test ratings were conducted twice in a random order, and mean ratings were used for all analyses. The overall VAS ratings of lip support for images with flangeless dentures were slightly lower compared to images with labial flanges, and this difference was statistically significant (p < 0.0001). This was true for both profile and frontal images. However, the magnitude of these differences was too small (no greater than 5 mm on a 100-mm scale) to be clinically significant or meaningful. The differences in VAS ratings were not significant between the judges. For the triangle test, judges overall correctly identified the flangeless denture image in 55% of frontal image sets and 60% of profile image sets. The difference in correct identification rate between frontal and profile images was statistically significant (p < 0.0001). 
For frontal and profile images, prosthodontists had the highest correct identification rate (61% and 69%), followed by general dentists (53% and 68%) and by laypeople (53% and 50%). The difference in correct identification rate was statistically significant between various judges (p = 0.012). For all judges, the likelihood of correctly identifying images with flangeless dentures was significantly greater than 1/3, which was the minimum chance for correct identification (p < 0.0001). Removal of a labial flange in a maxillary denture resulted in slightly lower ratings of lip support compared to images with a labial flange, but the differences were clinically insignificant. When judges were forced to look for differences, flangeless dentures were detected more often in profile images. Prosthodontists detected the flangeless dentures more often than general dentists and laypeople. © 2017 by the American College of Prosthodontists.
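The claim that correct identification significantly exceeds the 1/3 chance level of a triangle test is a one-sided binomial question, which can be checked exactly. A minimal sketch follows; the judge counts used in the usage line are illustrative, not recomputed from the paper:

```python
from math import comb

def triangle_test_p(correct, trials, chance=1/3):
    """Exact one-sided binomial p-value: probability of at least
    `correct` successes out of `trials` when guessing at rate `chance`."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(correct, trials + 1)
    )

# Illustrative: 33 of 60 judges correct, against a 1/3 guessing rate
print(triangle_test_p(33, 60))
```

A result far enough above the expected `trials/3` successes drives this tail probability well below conventional significance thresholds, which is the logic behind the "significantly greater than 1/3" statement.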
76 FR 59125 - Environmental Impacts Statements; Notice of Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
.... Markham 805-585-2150. EIS No. 20110317, Draft EIS, USFS, MT, Lonesome Wood Vegetation Management 2 Project... the 2006 FEIS Analysis and to Correct the Deficiencies that the Meister Panel Identified, Land and...
The human brain representation of odor identification.
Kjelvik, Grete; Evensmoen, Hallvard R; Brezova, Veronika; Håberg, Asta K
2012-07-01
Odor identification (OI) tests are increasingly used clinically as biomarkers for Alzheimer's disease and schizophrenia. The aim of this study was to directly compare the neuronal correlates of identified vs. nonidentified odors. Seventeen females with normal olfactory function underwent a functional magnetic resonance imaging (fMRI) experiment with postscanning assessment of spontaneous uncued OI. An event-related analysis was performed to compare within-subject activity to spontaneously identified vs. nonidentified odors at the whole-brain level, and in anatomic and functional regions of interest (ROIs) in the medial temporal lobe (MTL). Parameter estimate values and blood oxygenation level-dependent (BOLD) signal curves for correctly identified and nonidentified odors were derived from functional ROIs in the hippocampus and the entorhinal, piriform, and orbitofrontal cortices. Numbers of activated voxels and maximum parameter estimate values were obtained from anatomic ROIs in the hippocampus and the entorhinal cortex. At the whole-brain level, correct OI gave rise to increased activity in the left entorhinal cortex and secondary olfactory structures, including the orbitofrontal cortex. Increased activation was also observed in fusiform, primary visual, and auditory cortices, plus the inferior frontal and inferior temporal gyri. The anatomic MTL ROI analysis showed increased activation in the left entorhinal cortex, right hippocampus, and posterior parahippocampal gyri in correct OI. In the entorhinal cortex and hippocampus the BOLD signal increased specifically in response to identified odors and decreased for nonidentified odors. In orbitofrontal and piriform cortices both identified and nonidentified odors gave rise to an increased BOLD signal, but the response to identified odors was significantly greater than that for nonidentified odors. 
These results support a specific role for entorhinal cortex and hippocampus in OI, whereas piriform and orbitofrontal cortices are active in both smelling and OI. Moreover, episodic as well as semantic memory systems appeared to support OI.
Consistency of FMEA used in the validation of analytical procedures.
Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M
2011-02-20
In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as failure modes needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that FMEA always be carried out under the supervision of an experienced FMEA facilitator and that the FMEA team have at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
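The percentile rule described above (urgent above the 90th percentile of RPN values, necessary between the 75th and 90th) is easy to sketch. RPN is the product S × O × D; the values below are hypothetical, not the teams' actual scores:

```python
from statistics import quantiles

def classify_failure_modes(rpn):
    """rpn: {failure_mode: S*O*D}. Flag modes above the 90th percentile
    as 'urgent' and those between the 75th and 90th as 'necessary'."""
    values = sorted(rpn.values())
    # quantiles(..., n=100) returns the 1st..99th percentile cut points
    cuts = quantiles(values, n=100)
    p75, p90 = cuts[74], cuts[89]
    urgent = {m for m, v in rpn.items() if v > p90}
    necessary = {m for m, v in rpn.items() if p75 < v <= p90}
    return urgent, necessary

# Hypothetical RPNs (severity x occurrence x detection, each ranked 1-10)
rpn = {f"FM{i}": i * 7 for i in range(1, 21)}   # RPNs 7, 14, ..., 140
urgent, necessary = classify_failure_modes(rpn)
print(urgent, necessary)
```

Note that because the two teams used different ranking scales, their raw RPNs are not directly comparable; only the within-team percentile cutoffs are, which is part of why the study found the outcomes inconsistent.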
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant Evenson
2008-02-01
This Corrective Action Decision Document has been prepared for Corrective Action Unit (CAU) 563, Septic Systems, in accordance with the Federal Facility Agreement and Consent Order (FFACO, 1996; as amended January 2007). The corrective action sites (CASs) for CAU 563 are located in Areas 3 and 12 of the Nevada Test Site, Nevada, and comprise the following four sites: • 03-04-02, Area 3 Subdock Septic Tank • 03-59-05, Area 3 Subdock Cesspool • 12-59-01, Drilling/Welding Shop Septic Tanks • 12-60-01, Drilling/Welding Shop Outfalls. The purpose of this Corrective Action Decision Document is to identify and provide the rationale for the recommendation of a corrective action alternative (CAA) for the four CASs within CAU 563. Corrective action investigation (CAI) activities were performed from July 17 through November 19, 2007, as set forth in the CAU 563 Corrective Action Investigation Plan (NNSA/NSO, 2007). Analytes detected during the CAI were evaluated against appropriate final action levels (FALs) to identify the contaminants of concern (COCs) for each CAS. The results of the CAI identified COCs at one of the four CASs in CAU 563 and required the evaluation of CAAs. Assessment of the data generated from investigation activities conducted at CAU 563 revealed the following: • CASs 03-04-02, 03-59-05, and 12-60-01 do not contain contamination at concentrations exceeding the FALs. • CAS 12-59-01 contains arsenic and chromium contamination above FALs in surface and near-surface soils surrounding a stained location within the site. Based on the evaluation of analytical data from the CAI, review of future and current operations at CAS 12-59-01, and the detailed and comparative analysis of the potential CAAs, the following corrective actions are recommended for CAU 563.
Feedback in action within bedside teaching encounters: a video ethnographic study.
Rizan, Chantelle; Elsey, Christopher; Lemon, Thomas; Grant, Andrew; Monrouxe, Lynn V
2014-09-01
Feedback associated with teaching activities is often synonymous with reflection on action, which comprises the evaluative assessment of performance out of its original context. Feedback in action (as correction during clinical encounters) is an underexplored, complementary resource facilitating students' understanding and learning. The purpose of this study was to explore the interactional patterns and correction modalities utilised in feedback sequences between doctors and students within general practice-based bedside teaching encounters (BTEs). A qualitative video ethnographic approach was used. Participants were recorded in their natural settings to allow interactional practices to be contextually explored. We examined 12 BTEs recorded across four general practices and involving 12 patients, four general practitioners and four medical students (209 minutes and 20 seconds of data) taken from a larger corpus. Data analysis was facilitated by Transana video analysis software and informed by previous conversation analysis research in ordinary conversation, classrooms and health care settings. A range of correction strategies across a spectrum of underlying explicitness were identified. Correction strategies classified at extreme poles of this scale (high or low explicitness) were believed to be less interactionally effective. For example, those using abrupt closing of topics (high explicitness) or interactional ambiguity (low explicitness) were thought to be less effective than embedded correction strategies that enabled the student to reach the correct answer with support. We believe that educators who are explicitly taught linguistic strategies for how to manage feedback in BTEs might manage learning more effectively. For example, clinicians might maximise learning moments during BTEs by avoiding abrupt or ambiguous feedback practices. Embedded correction strategies can enhance student participation by guiding students towards the correct answer. 
Clinician corrections can sensitively manage student face-saving by minimising the exposure of student error to patients. Furthermore, we believe that the effective practices highlighted by our analysis might facilitate successful transformation of feedback in action into feedback for action. © 2014 John Wiley & Sons Ltd.
2010-01-01
Background Discrimination between clinical and environmental strains within many bacterial species is currently underexplored. Genomic analyses have clearly shown the enormous variability in genome composition between different strains of a bacterial species. In this study we have used Legionella pneumophila, the causative agent of Legionnaires' disease, to search for genomic markers related to pathogenicity. During a large surveillance study in The Netherlands, well-characterized patient-derived strains and environmental strains were collected. We have used a mixed-genome microarray to perform comparative-genome analysis of 257 strains from this collection. Results Microarray analysis indicated that 480 DNA markers (out of in total 3360 markers) showed clear variation in presence between individual strains and these were therefore selected for further analysis. Unsupervised statistical analysis of these markers showed the enormous genomic variation within the species but did not show any correlation with a pathogenic phenotype. We therefore used supervised statistical analysis to identify discriminating markers. Genetic programming was used both to identify predictive markers and to define their interrelationships. A model consisting of five markers was developed that together correctly predicted 100% of the clinical strains and 69% of the environmental strains. Conclusions A novel approach for identifying predictive markers enabling discrimination between clinical and environmental isolates of L. pneumophila is presented. Out of over 3000 possible markers, five were selected that together enabled correct prediction of all the clinical strains included in this study. This novel approach for identifying predictive markers can be applied to all bacterial species, allowing for better discrimination between strains well equipped to cause human disease and relatively harmless strains. PMID:20630115
Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.
2009-01-01
Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
A Coupling Analysis Approach to Capture Unexpected Behaviors in Ares 1
NASA Astrophysics Data System (ADS)
Kis, David
Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process. Identifying and correcting couplings early helps ensure that no unanticipated behaviors arise during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect solid rocket motors in adverse ways. Within the Ares 1 rocket, unexpected vibrations deemed potentially harmful to future crew were recorded during late-stage flight; these propagated from the engine chamber to the Orion crew module. This research proposes the use of a coupling strength analysis during the design and development phase to identify potential unanticipated behaviors such as thrust oscillation. Once these behaviors and couplings are identified, a value function, based on research in Value Driven Design, is proposed to evaluate mitigation strategies and their impact on system value. The results from this study showcase a strong coupling interaction from structural displacement back onto the fluid flow of the Ares 1 that was previously deemed inconsequential. These findings show that a coupling strength analysis can aid engineers and managers in identifying unanticipated behaviors and rank-ordering their importance based on their impact on value.
NASA Astrophysics Data System (ADS)
Gillen, Rebecca; Firbank, Michael J.; Lloyd, Jim; O'Brien, John T.
2015-09-01
This study investigated whether the appearance and diagnostic accuracy of HMPAO brain perfusion SPECT images could be improved by using CT-based attenuation and scatter correction compared with the uniform attenuation correction method. A cohort of subjects clinically categorized as Alzheimer's Disease (n=38), Dementia with Lewy Bodies (n=29) or healthy normal controls (n=30) underwent SPECT imaging with Tc-99m HMPAO and a separate CT scan. The SPECT images were processed using: (a) a correction map derived from the subject's CT scan, (b) the Chang uniform approximation for correction, or (c) no attenuation correction. Images were visually inspected. The ratios between key regions of interest known to be affected or spared in each condition were calculated for each correction method, and the differences between these ratios were evaluated. The images produced using the different corrections were noted to be visually different. However, ROI analysis found similar statistically significant differences between control and dementia groups and between AD and DLB groups regardless of the correction map used. We did not identify an improvement in diagnostic accuracy in images corrected using CT-based attenuation and scatter correction, compared with those corrected using a uniform correction map.
Canaris, Gay J; Flach, Stephen D; Tape, Thomas G; Stierwalt, Kathyrn M; Haggstrom, David A; Wigton, Robert S
2003-05-01
Although microscopic urinalysis (micro UA) is commonly used in clinical practice, and residents are trained in micro UA, proficiency in this procedure has not been studied. In 1996-97, 38 residents in the University of Nebraska Medical Center's internal medicine (IM) residency program were evaluated on their technical ability to perform micro UA, and on their cognitive skills in recognizing common micro UA findings. After identifying deficits in the residents' cognitive competency, two educational interventions were applied and residents were tested after each intervention. A total of 24 residents (63%) correctly prepared the specimen for analysis (the technical portion). On the cognitive portion, only one of the 38 residents correctly identified 80% of all micro UA findings in the urinary sediment, although 11 (29%) residents identified UA findings specific to urinary tract infection (UTI). The first educational intervention did little to improve residents' performance. A second more intensive intervention resulted in 10 (45%) residents identifying 80% of all micro UA findings, and 19 (86%) residents correctly identifying UTI findings. Many residents were not proficient in performing micro UA, even after intensive educational interventions. Although micro UA is a simple procedure, residents' mastery cannot be assumed. Residency programs should assess competency in this procedure.
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Sacks, David B.; Yu, Yi-Kuo
2018-06-01
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
General practitioners' knowledge and concern about electromagnetic fields.
Berg-Beckhoff, Gabriele; Breckenkamp, Jürgen; Larsen, Pia Veldt; Kowall, Bernd
2014-12-01
Our aim is to explore general practitioners' (GPs') knowledge about EMF, and to assess whether different knowledge structures are related to GPs' concern about EMF. Random samples were drawn from lists of GPs in Germany in 2008. Knowledge about EMF was assessed by seven items. A latent class analysis was conducted to identify latent structures in GPs' knowledge. Further, GPs' concern about EMF health risk was measured using a score comprising six items. The association between GPs' concern about EMF and their knowledge was analysed using multiple linear regression. In total, 435 GPs (response rate 23.3%) participated in the study. Four groups were identified by the latent class analysis: 43.1% of the GPs gave mainly correct answers; 23.7% answered low-frequency EMF questions correctly; 19.2% answered only the questions relating EMF to health risks; and 14.0% answered mostly "don't know". There was no association between GPs' latent knowledge classes, or the number of correct answers given by the GPs, and their EMF concern, whereas the number of incorrect answers was associated with EMF concern. Greater EMF concern in subjects with more incorrect answers suggests paying particular attention to misconceptions regarding EMF in risk communication.
Singleton, Christa-Marie; Debastiani, Summer; Rose, Dale; Kahn, Emily B
2014-01-01
To identify the extent to which the Homeland Security Exercise and Evaluation Program's (HSEEP) After Action Report/Improvement Plan (AAR/IP) template was followed by public health entities and facilitated the identification of detailed corrective actions and continuous improvement. Data were drawn from the US H1N1 Public Health Emergency Response (PHER) federal grant awardees (n = 62). After action report/improvement plan text was examined to identify the presence of AAR/IP HSEEP elements and characterized as "minimally complete," "partially complete," or "complete." Corrective actions (CAs) and recommendations within the IP focusing on performance deficits were coded as specific, measurable, and time-bound, and by whether they were associated with a problem that met root cause criteria and whether the CA/recommendation was intended to address or fix the root cause. A total of 2619 CAs/recommendations were identified. More than half (n = 1480, 57%) addressed root causes. CAs/recommendations associated with complete AARs more frequently addressed root cause (58% vs 51%, χ² = 9.1, P < 0.003) and were more specific (34% vs 23%, χ² = 32.3, P < 0.0001), measurable (30% vs 18%, χ² = 37.9, P < 0.0001), and time-bound (38% vs 15%, χ² = 115.5, P < 0.0001) than those associated with partially complete AARs. The same pattern was not observed with completeness of IPs. Corrective actions and recommendations were similarly specific and measurable. Recommendations addressed root causes significantly more often than CAs did. Our analysis indicates a possible lack of awardee distinction between CAs and recommendations in AARs. As HSEEP adapts to align with the 2011 National Preparedness Goal and National Preparedness System, future HSEEP documents should emphasize the importance of root cause analysis as a required element within AAR documents and templates in the exercise and real incident environment, as well as the need for specific and measurable CAs.
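The 2x2 comparisons reported above can be reproduced in form with Pearson's chi-squared statistic (1 degree of freedom). The counts below are hypothetical, chosen only to show how a 58% vs 51% split yields a statistic near the reported χ² = 9.1 when the groups each hold on the order of a thousand items:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (the abstract reports percentages, not group sizes):
# 580/1000 root-cause CAs from complete AARs vs 510/1000 from partial ones.
stat = chi2_2x2(580, 420, 510, 490)
significant = stat > 3.84  # critical value at P = 0.05, 1 df
```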
Must Educational Responsibility Be an Illusion?
ERIC Educational Resources Information Center
Inbar, Dan
1982-01-01
Examines the concepts of and relationship between responsibility and authority. Identifies seven characteristics of educational systems (authority, interdependence, the unified whole, goals, evaluation, correction, and knowledge) as central to an analysis of educational responsibility. Suggests restructuring the educational system on a…
Machine-learned cluster identification in high-dimensional data.
Ultsch, Alfred; Lötsch, Jörn
2017-02-01
High-dimensional biomedical data are frequently clustered to identify subgroup structures pointing at distinct disease subtypes. It is crucial that the cluster algorithm used works correctly. However, by imposing a predefined shape on the clusters, classical algorithms occasionally suggest a cluster structure in homogeneously distributed data or assign data points to incorrect clusters. We analyzed whether this can be avoided by using emergent self-organizing feature maps (ESOM). Data sets with different degrees of complexity were submitted to ESOM analysis with large numbers of neurons, using an interactive R-based bioinformatics tool. On top of the trained ESOM the distance structure in the high-dimensional feature space was visualized in the form of a so-called U-matrix. Clustering results were compared with those provided by classical common cluster algorithms including single linkage, Ward and k-means. Ward clustering imposed cluster structures on cluster-less "golf ball", "cuboid" and "S-shaped" data sets that contained no structure at all (random data). Ward clustering also imposed structures on permuted real-world data sets. By contrast, the ESOM/U-matrix approach correctly found that these data contain no cluster structure. However, ESOM/U-matrix was correct in identifying clusters in biomedical data truly containing subgroups. It was always correct in cluster structure identification in further canonical artificial data. Using intentionally simple data sets, it is shown that popular clustering algorithms typically used for biomedical data sets may fail to cluster data correctly, suggesting that they are also likely to perform erroneously on high-dimensional biomedical data. The present analyses emphasized that generally established classical hierarchical clustering algorithms carry a considerable tendency to produce erroneous results.
By contrast, unsupervised machine-learned analysis of cluster structures, applied using the ESOM/U-matrix method, is a viable, unbiased method to identify true clusters in the high-dimensional space of complex data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
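The failure mode described, partitioning algorithms reporting clusters in structureless data, can be demonstrated with a minimal one-dimensional k-means (Lloyd's algorithm) on uniform random data. This is a simplified illustration of the general problem, not the paper's ESOM/U-matrix method:

```python
import random

def kmeans_1d(xs, k, iters=50, seed=1):
    """Minimal Lloyd's algorithm in one dimension."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    labels = [0] * len(xs)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = [min(range(k), key=lambda j: (x - centers[j]) ** 2) for x in xs]
        for j in range(k):
            members = [x for x, lab in zip(xs, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels, centers

rng = random.Random(0)
data = [rng.random() for _ in range(300)]  # uniform: no true cluster structure
labels, centers = kmeans_1d(data, 3)
# k-means nonetheless partitions the data into exactly 3 "clusters"
n_found = len(set(labels))
```

The algorithm converges and confidently reports three clusters even though the data are homogeneously distributed, which is exactly why a structure-visualizing check such as the U-matrix is useful before trusting cluster labels.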
Exploratory Mediation Analysis via Regularization
Serang, Sarfaraz; Jacobucci, Ross; Brimhall, Kim C.; Grimm, Kevin J.
2017-01-01
Exploratory mediation analysis refers to a class of methods used to identify a set of potential mediators of a process of interest. Despite its exploratory nature, conventional approaches are rooted in confirmatory traditions, and as such have limitations in exploratory contexts. We propose a two-stage approach called exploratory mediation analysis via regularization (XMed) to better address these concerns. We demonstrate that this approach is able to correctly identify mediators more often than conventional approaches and that its estimates are unbiased. Finally, this approach is illustrated through an empirical example examining the relationship between college acceptance and enrollment. PMID:29225454
ERIC Educational Resources Information Center
Brusco, Michael J.; Singh, Renu; Steinley, Douglas
2009-01-01
The selection of a subset of variables from a pool of candidates is an important problem in several areas of multivariate statistics. Within the context of principal component analysis (PCA), a number of authors have argued that subset selection is crucial for identifying those variables that are required for correct interpretation of the…
Deterministic figure correction of piezoelectrically adjustable slumped glass optics
NASA Astrophysics Data System (ADS)
DeRoo, Casey T.; Allured, Ryan; Cotroneo, Vincenzo; Hertz, Edward; Marquez, Vanessa; Reid, Paul B.; Schwartz, Eric D.; Vikhlinin, Alexey A.; Trolier-McKinstry, Susan; Walker, Julian; Jackson, Thomas N.; Liu, Tianning; Tendulkar, Mohit
2018-01-01
Thin x-ray optics with high angular resolution (≤ 0.5 arcsec) over a wide field of view enable the study of a number of astrophysically important topics and feature prominently in Lynx, a next-generation x-ray observatory concept currently under NASA study. In an effort to address this technology need, piezoelectrically adjustable, thin mirror segments capable of figure correction after mounting and on-orbit are under development. We report on the fabrication and characterization of an adjustable cylindrical slumped glass optic. This optic has realized 100% piezoelectric cell yield and employs lithographically patterned traces and anisotropic conductive film connections to address the piezoelectric cells. In addition, the measured responses of the piezoelectric cells are found to be in good agreement with finite-element analysis models. While the optic as manufactured is outside the range of absolute figure correction, simulated corrections using the measured responses of the piezoelectric cells are found to improve 5 to 10 arcsec mirrors to 1 to 3 arcsec [half-power diameter (HPD), single reflection at 1 keV]. Moreover, a measured relative figure change which would correct the figure of a representative slumped glass piece from 6.7 to 1.2 arcsec HPD is empirically demonstrated. We employ finite-element analysis-modeled influence functions to understand the current frequency limitations of the correction algorithm employed and identify a path toward achieving subarcsecond corrections.
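Influence-function-based figure correction of the kind described is commonly posed as a least-squares problem: find actuator commands v minimizing ||e - Av||, where the columns of A are the measured responses of the piezoelectric cells. A minimal sketch with synthetic, randomly generated influence functions (all shapes and values are hypothetical, not the optic's measured data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_cells = 200, 12  # figure-map samples and piezo cells (hypothetical)
A = rng.standard_normal((n_points, n_cells))  # columns = cell influence functions
v_true = rng.uniform(-1.0, 1.0, n_cells)      # commands that produced the error
figure_error = A @ v_true                     # figure error to be corrected

# Solve min ||figure_error - A v|| for the correction commands.
v, *_ = np.linalg.lstsq(A, figure_error, rcond=None)
residual_rms = float(np.sqrt(np.mean((figure_error - A @ v) ** 2)))
```

In practice the correctable figure error is limited by how much of it lies in the span of the influence functions; components outside that span remain as residual, which is why modeled influence functions are used to understand the frequency limits of the correction.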
Use of failure mode effect analysis (FMEA) to improve medication management process.
Jain, Khushboo
2017-03-13
Purpose Medication management is a complex process, at high risk of error with life-threatening consequences. The focus should be on devising strategies to avoid errors and make the process self-reliable by ensuring prevention of errors and/or error detection at subsequent stages. The purpose of this paper is to use failure mode effect analysis (FMEA), a systematic proactive tool, to identify the likelihood of and the causes for the process to fail at various steps, and to prioritise them to devise risk-reduction strategies to improve patient safety. Design/methodology/approach The study was designed as an observational analytical study of the medication management process in the inpatient area of a multi-speciality hospital in Gurgaon, Haryana, India. A team was formed to study the complex process of medication management in the hospital. The FMEA tool was used. Corrective actions were developed based on the prioritised failure modes, which were implemented and monitored. Findings The percentage distribution of medication errors as observed by the team was highest for transcription errors (37 per cent) followed by administration errors (29 per cent), indicating the need to identify the causes and effects of their occurrence. In all, 11 failure modes were identified, of which the five major ones were prioritised based on the risk priority number (RPN). The process was repeated after corrective actions were taken, which resulted in about 40 per cent (average) and around 60 per cent reduction in the RPN of the prioritised failure modes. Research limitations/implications FMEA is a time-consuming process and requires a multidisciplinary team with a good understanding of the process being analysed. FMEA only helps in identifying the possibilities of a process to fail; it does not eliminate them. Additional efforts are required to develop action plans and implement them.
Frank discussion and agreement among the team members are required not only for successfully conducting FMEA but also for implementing the corrective actions. Practical implications FMEA is an effective proactive risk-assessment tool and a continuous process that can be carried out in phases. The corrective actions taken resulted in a reduction in RPN, subject to further evaluation and use by others depending on the facility type. Originality/value The application of the tool helped the hospital identify failures in the medication management process, thereby prioritising and correcting them, leading to improvement.
WHODAS 2.0 as a Measure of Severity of Illness: Results of a FLDA Analysis
Barahona, Igor; Aroca, Fuensanta; Peñuelas-Calvo, Inmaculada; Rodríguez-Jover, Alba; Amodeo-Escribano, Susana; González-Granado, Marta
2018-01-01
WHODAS 2.0 is the standard measure of disability promoted by the World Health Organization, whereas the Clinical Global Impression (CGI) is a widely used scale for determining severity of mental illness. Although a close relationship between these two scales would be expected, there are no relevant studies on the topic. In this study, we explore whether WHODAS 2.0 can be used to identify severity of illness as measured by CGI using Fisher Linear Discriminant Analysis (FLDA), and to identify which individual items of WHODAS 2.0 best predict CGI scores given by clinicians. One hundred and twenty-two patients were assessed with WHODAS 2.0 and CGI during three months in outpatient mental health facilities of four hospitals in Madrid, Spain. Compared with the traditional scoring of WHODAS 2.0, FLDA improves accuracy by nearly 15%; with FLDA, WHODAS 2.0 correctly classifies 59.0% of the patients. Furthermore, FLDA identifies item 6.6 (illness effect on personal finances) and item 4.5 (damaged sexual life) as the most important items for clinicians to score the severity of illness. PMID:29770158
Can the behavioral sciences self-correct? A social epistemic study.
Romero, Felipe
2016-12-01
Advocates of the self-corrective thesis argue that scientific method will refute false theories and find closer approximations to the truth in the long run. I discuss a contemporary interpretation of this thesis in terms of frequentist statistics in the context of the behavioral sciences. First, I identify experimental replications and systematic aggregation of evidence (meta-analysis) as the self-corrective mechanism. Then, I present a computer simulation study of scientific communities that implement this mechanism to argue that frequentist statistics may converge upon a correct estimate or not depending on the social structure of the community that uses it. Based on this study, I argue that methodological explanations of the "replicability crisis" in psychology are limited and propose an alternative explanation in terms of biases. Finally, I conclude suggesting that scientific self-correction should be understood as an interaction effect between inference methods and social structures. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Use of Natural Pozzolan in Concrete as an Additive or Substitute for Cement
2011-12-01
identified opal and chert as the common forms of reactive silica. For cracking and expansion to result from the ASR, the following combi...chemical composition of three natural pozzolanic samples was determined through XRD analysis. In addition to these analyses, several additional tests...reflected angle, which results in an inaccurate plot. The correct angle is required to determine the correct composition. A very finely ground sample
NASA Technical Reports Server (NTRS)
Miles, J. H.; Stevens, G. H.; Leininger, G. G.
1975-01-01
Ground reflections generate undesirable effects on acoustic measurements such as those conducted outdoors for jet noise research, aircraft certification, and motor vehicle regulation. Cepstral techniques developed in speech processing are adapted to identify echo delay time and to correct for ground reflection effects. A sample result is presented using an actual narrowband sound pressure level spectrum. The technique can readily be adapted to existing fast Fourier transform type spectrum measurement instrumentation to provide field measurements of echo time delays.
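The cepstral echo-detection idea can be sketched as follows: an echo at delay d adds a periodic ripple to the log magnitude spectrum, which shows up as a peak at quefrency d in the real cepstrum. The signal, delay, and attenuation below are synthetic assumptions, not the paper's measured data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, delay, atten = 1024, 100, 0.5  # samples, echo delay (samples), echo attenuation
s = rng.standard_normal(n)        # broadband source signal
x = s.copy()
x[delay:] += atten * s[:-delay]   # add a ground-reflection-like echo

# Real cepstrum: inverse FFT of the log magnitude spectrum.
spec = np.fft.fft(x)
ceps = np.fft.ifft(np.log(np.abs(spec) + 1e-12)).real

# The echo appears as a peak at quefrency == delay (search away from 0,
# where the source's own spectral envelope dominates).
est = int(np.argmax(ceps[20:n // 2])) + 20
```

Once the delay is identified this way, the ripple it causes in the measured spectrum can be modeled and removed, which is the correction step the abstract describes.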
Correction factors in determining speed of sound among freshmen in undergraduate physics laboratory
NASA Astrophysics Data System (ADS)
Lutfiyah, A.; Adam, A. S.; Suprapto, N.; Kholiq, A.; Putri, N. P.
2018-03-01
This paper identifies the correction factors in determining the speed of sound encountered by freshmen in an undergraduate physics laboratory. The results are compared with the speed of sound determined by a senior student. Both used the same instrument, namely a resonance tube with apparatus. The speed of sound obtained by the senior student was 333.38 m s-1, with a deviation from theory of about 3.98%. The freshmen's speed-of-sound results fell into three categories: accurate values (52.63%), middle values (31.58%), and low values (15.79%). Based on the analysis, several correction factors are suggested: human error in determining the first and second harmonics, the end correction related to tube diameter, and environmental factors such as temperature, humidity, density, and pressure.
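The standard resonance-tube reduction, including how the tube's end correction cancels between successive resonances, can be sketched with hypothetical readings (the fork frequency and resonance lengths below are illustrative, not the paper's data):

```python
# Hypothetical resonance-tube readings with a 512 Hz tuning fork.
f = 512.0              # fork frequency in Hz
L1, L2 = 0.161, 0.492  # first and second resonance lengths in metres

# Successive resonances are half a wavelength apart, so subtracting them
# cancels the (unknown) end correction at the open end.
wavelength = 2.0 * (L2 - L1)
v = f * wavelength     # speed of sound estimate

# The end correction itself (compare with the rule of thumb ~0.3 x tube diameter):
end_correction = wavelength / 4.0 - L1
```

Using only the first resonance (v = 4 f L1) ignores the end correction and biases the result low, which is one plausible source of the "lower value" category among the freshmen.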
The event-related potential effects of cognitive conflict in a Chinese character-generation task.
Qiu, Jiang; Zhang, Qinglin; Li, Hong; Luo, Yuejia; Yin, Qinging; Chen, Antao; Yuan, Hong
2007-06-11
High-density event-related potentials were recorded to examine the electrophysiologic correlates of the evaluation of possible answers provided during a Chinese character-generation task. We examined three conditions: the character given was what participants initially generated (Consistent answer), the character given was correct (Unexpected Correct answer), or it was incorrect (Unexpected Incorrect answer). Results showed that Unexpected Correct and Incorrect answers elicited a more negative event-related potential deflection (N320) than did Consistent answers between 300 and 400 ms. Dipole source analysis of difference waves (Unexpected Correct or Incorrect minus Consistent answers) localized the generator of the N320 in the anterior cingulate cortex. The N320 therefore likely reflects the cognitive change or conflict between old and new ways of thinking while identifying and judging characters.
Dai, Guang-ming; Campbell, Charles E; Chen, Li; Zhao, Huawei; Chernyak, Dimitri
2009-01-20
In wavefront-driven vision correction, ocular aberrations are often measured on the pupil plane and the correction is applied on a different plane. The problem with this practice is that any changes undergone by the wavefront as it propagates between planes are not currently included in devising customized vision correction. With some valid approximations, we have developed an analytical foundation based on geometric optics in which Zernike polynomials are used to characterize the propagation of the wavefront from one plane to another. Both the boundary and the magnitude of the wavefront change after the propagation. Taylor monomials were used to realize the propagation because of their simple form for this purpose. The method we developed to identify changes in low-order aberrations was verified with the classical vertex correction formula. The method we developed to identify changes in high-order aberrations was verified with ZEMAX ray-tracing software. Although the method may not be valid for highly irregular wavefronts and it was only proven for wavefronts with low-order or high-order aberrations, our analysis showed that changes in the propagating wavefront are significant and should, therefore, be included in calculating vision correction. This new approach could be of major significance in calculating wavefront-driven vision correction whether by refractive surgery, contact lenses, intraocular lenses, or spectacles.
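The classical vertex correction formula cited above as the check for the low-order case can be stated in a few lines. A minimal sketch (the function name is ours): a wavefront of power P diopters propagated a distance d meters has effective power P/(1 − dP):

```python
def vertex_corrected_power(power_d, distance_m):
    """Classical vertex correction: effective power (diopters) of a
    wavefront of power `power_d` after propagating `distance_m` meters.
    Valid for low-order (defocus-like) aberrations only."""
    return power_d / (1.0 - distance_m * power_d)

# A -5 D spectacle lens at a 12 mm vertex distance is equivalent to
# roughly -4.72 D at the corneal plane:
corneal = vertex_corrected_power(-5.0, 0.012)
```

The abstract's point is that an analogous (Zernike/Taylor-based) propagation rule is needed for high-order aberrations as well, where no such simple closed form was previously applied.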
Shen, Jim K; Faaborg, Daniel; Rouse, Glenn; Kelly, Isaac; Li, Roger; Alsyouf, Muhannad; Myklak, Kristene; Distelberg, Brian; Staack, Andrea
2017-09-01
Translabial ultrasound (TUS) is a useful tool for identifying and assessing synthetic slings. This study evaluates the ability of urology trainees to learn basic pelvic anatomy and sling assessment on TUS. Eight urology trainees (six residents and two medical students) received a lecture reviewing basic anatomy and sling assessment on TUS, followed by review of two training cases. Next, they underwent a 126-question examination assessing their ability to identify anatomic planes and structures in those planes, identify the presence of slings, and assess the location and intactness of a sling. The correct response rate was compared to that of an attending radiologist experienced in reading TUS. Non-parametric tests (Fisher's exact test, chi-squared tests, and Yates' correction) were used for statistical analysis, with P < 0.05 considered significant. 847/1008 (84.0%) of questions were answered correctly by the eight trainees, compared to 119/126 (94.4%) by the radiologist (P = 0.001). The trainees' correct response rates and the Fisher's exact test P values associated with the difference in correct answers between radiologist and trainees were as follows: identification of anatomic plane (94.4%; P = 0.599), identification of structure in sagittal view (80.6%; P = 0.201), identification of structure in transverse view (88.2%; P = 0.696), presence of synthetic sling (95.8%; P = 1.000), location of sling along the urethra (71.5%; P = 0.403), intactness of sling (82.6%; P = 0.311), and laterality of sling disruption (75.0%; P = 0.076). Urology trainees can quickly learn to identify anatomic landmarks and assess slings on TUS with reasonable proficiency compared to an experienced attending radiologist. © 2017 Wiley Periodicals, Inc.
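The Yates-corrected chi-squared statistic mentioned above has a closed form for a 2 × 2 table. A stdlib-only sketch (the example table is reconstructed from the reported overall correct/incorrect counts and is illustrative only; the study itself reports a Fisher's exact P value):

```python
import math

def yates_chi2(a, b, c, d):
    # 2x2 table [[a, b], [c, d]] with Yates' continuity correction:
    # chi2 = N * (|ad - bc| - N/2)^2 / ((a+b)(c+d)(a+c)(b+d))
    n = a + b + c + d
    num = n * (abs(a * d - b * c) - n / 2.0) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

def p_value_1df(chi2):
    # Survival function of the chi-squared distribution with 1 df
    return math.erfc(math.sqrt(chi2 / 2.0))

# Correct/incorrect counts: trainees 847/161, radiologist 119/7
stat = yates_chi2(847, 161, 119, 7)
p = p_value_1df(stat)
```

This corrected statistic is deliberately conservative for small expected cell counts, which is why the study also used Fisher's exact test for the per-category comparisons.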
Menz, Hylton B; Allan, Jamie J; Bonanno, Daniel R; Landorf, Karl B; Murley, George S
2017-01-01
Foot orthoses are widely used in the prevention and treatment of foot disorders. The aim of this study was to describe the characteristics of custom-made foot orthosis prescriptions from an Australian podiatric orthotic laboratory. One thousand consecutive foot orthosis prescription forms were obtained from a commercial prescription foot orthosis laboratory located in Melbourne, Victoria, Australia (Footwork Podiatric Laboratory). Each item on the prescription form was documented in relation to orthosis type, cast correction, arch fill technique, cast modifications, shell material, shell modifications and cover material. Cluster analysis and discriminant function analysis were applied to identify patterns in the prescription data. Prescriptions were obtained from 178 clinical practices across Australia and Hong Kong, with patients ranging in age from 5 to 92 years. Three broad categories ('clusters') were observed, indicative of increasing 'control' of rearfoot pronation. A combination of five variables (rearfoot cast correction, cover shape, orthosis type, forefoot cast correction and plantar fascial accommodation) was able to identify these clusters with an accuracy of 70%. Significant differences between clusters were observed in relation to the age and sex of the patient and the geographic location of the prescribing clinician. Foot orthosis prescriptions are complex, but can be broadly classified into three categories. Selection of these prescription subtypes appears to be influenced by both patient factors (age and sex) and clinician factors (clinic location).
A Critical Meta-Analysis of Lens Model Studies in Human Judgment and Decision-Making
Kaufmann, Esther; Reips, Ulf-Dietrich; Wittmann, Werner W.
2013-01-01
Achieving accurate judgment (‘judgmental achievement’) is of utmost importance in daily life across multiple domains. The lens model and the lens model equation provide useful frameworks for modeling components of judgmental achievement and for creating tools to help decision makers (e.g., physicians, teachers) reach better judgments (e.g., a correct diagnosis, an accurate estimation of intelligence). Previous meta-analyses of judgment and decision-making studies have attempted to evaluate overall judgmental achievement and have provided the basis for evaluating the success of bootstrapping (i.e., replacing judges by linear models that guide decision making). However, previous meta-analyses have failed to appropriately correct for a number of study design artifacts (e.g., measurement error, dichotomization), which may have potentially biased estimations (e.g., of the variability between studies) and led to erroneous interpretations (e.g., with regard to moderator variables). In the current study we therefore conduct the first psychometric meta-analysis of judgmental achievement studies that corrects for a number of study design artifacts. We identified 31 lens model studies (N = 1,151, k = 49) that met our inclusion criteria. We evaluated overall judgmental achievement as well as whether judgmental achievement depended on decision domain (e.g., medicine, education) and/or level of expertise (expert vs. novice). We also evaluated whether using corrected estimates affected conclusions with regard to the success of bootstrapping with psychometrically corrected models. Further, we introduce a new psychometric trim-and-fill method to estimate the effect sizes of potentially missing studies and to correct psychometric meta-analyses for the effects of publication bias.
Comparison of the results of the psychometric meta-analysis with the results of a traditional meta-analysis (which only corrected for sampling error) indicated that artifact correction leads to (a) an increase in the values of the lens model components, (b) reduced heterogeneity between studies, and (c) an increase in the success of bootstrapping. We argue that psychometric meta-analysis is useful for accurately evaluating human judgment and for demonstrating the success of bootstrapping. PMID:24391781
ERIC Educational Resources Information Center
Cullen, John B.; Perrewe, Pamela L.
1981-01-01
Used factors identified in the literature as predictors of centralization/decentralization as potential discriminating variables among several decision making configurations in university affiliated professional schools. The model developed from multiple discriminant analysis had reasonable success in classifying correctly only the decentralized…
Error Pattern Analysis Applied to Technical Writing: An Editor's Guide for Writers.
ERIC Educational Resources Information Center
Monagle, E. Brette
The use of error pattern analysis can reduce the time and money spent on editing and correcting manuscripts. What is required is noting, classifying, and keeping a frequency count of errors. First an editor should take a typical page of writing and circle each error. After the editor has done a sufficiently large number of pages to identify an…
Root Cause Analysis Webinar: Q&A with Roni Silverstein. REL Mid-Atlantic Webinar
ERIC Educational Resources Information Center
Regional Educational Laboratory Mid-Atlantic, 2014
2014-01-01
Root cause analysis is a powerful method schools use to analyze data to solve problems; it aims to identify and correct the root causes of problems or events, rather than simply addressing their symptoms. In this webinar, veteran practitioner, Roni Silverstein, talked about the value of this process and practical ways to use it in your school or…
YÜCEL, MERYEM A.; SELB, JULIETTE; COOPER, ROBERT J.; BOAS, DAVID A.
2014-01-01
As near-infrared spectroscopy (NIRS) broadens its application area to different age and disease groups, motion artifacts in the NIRS signal due to subject movement are becoming an important challenge. Motion artifacts generally produce signal fluctuations that are larger than physiological NIRS signals, so it is crucial to correct for them before obtaining an estimate of stimulus-evoked hemodynamic responses. There are various methods for correction, such as principal component analysis (PCA), wavelet-based filtering, and spline interpolation. Here, we introduce a new approach to motion artifact correction, targeted principal component analysis (tPCA), which applies a PCA filter only to the segments of data identified as motion artifacts. It is expected that this will overcome the issue of filtering out desired signals that plagues standard PCA filtering of entire data sets. We compared the new approach with the most effective motion artifact correction algorithms on a set of data acquired simultaneously with a collodion-fixed probe (low motion artifact content) and a standard Velcro probe (high motion artifact content). Our results show that tPCA gives statistically better results in recovering the hemodynamic response function (HRF) than wavelet-based filtering and spline interpolation for the Velcro probe. It yields a significant reduction in mean-squared error (MSE) and a significant enhancement in Pearson's correlation coefficient to the true HRF. The collodion-fixed fiber probe with no motion correction performed better than the Velcro probe corrected for motion artifacts in terms of MSE and Pearson's correlation coefficient. Thus, if the experimental study permits, the use of a collodion-fixed fiber probe may be desirable. If the use of a collodion-fixed probe is not feasible, then we suggest the use of tPCA in the processing of motion-artifact-contaminated data. PMID:25360181
Identification of FGF7 as a novel susceptibility locus for chronic obstructive pulmonary disease.
Brehm, John M; Hagiwara, Koichi; Tesfaigzi, Yohannes; Bruse, Shannon; Mariani, Thomas J; Bhattacharya, Soumyaroop; Boutaoui, Nadia; Ziniti, John P; Soto-Quiros, Manuel E; Avila, Lydiana; Cho, Michael H; Himes, Blanca; Litonjua, Augusto A; Jacobson, Francine; Bakke, Per; Gulsvik, Amund; Anderson, Wayne H; Lomas, David A; Forno, Erick; Datta, Soma; Silverman, Edwin K; Celedón, Juan C
2011-12-01
Traditional genome-wide association studies (GWASs) of large cohorts of subjects with chronic obstructive pulmonary disease (COPD) have successfully identified novel candidate genes, but several other plausible loci do not meet strict criteria for genome-wide significance after correction for multiple testing. The authors hypothesise that by applying unbiased weights derived from unique populations they can identify additional COPD susceptibility loci. Methods: The authors performed a homozygosity haplotype analysis on a group of subjects with and without COPD to identify regions of conserved homozygosity haplotype (RCHHs). Weights were constructed based on the frequency of these RCHHs in cases versus controls, and used to adjust the p values from a large collaborative GWAS of COPD. The authors identified 2318 RCHHs, of which 576 were significantly (p<0.05) over-represented in cases. After applying the weights constructed from these regions to a collaborative GWAS of COPD, the authors identified two single nucleotide polymorphisms (SNPs) in a novel gene (fibroblast growth factor-7 (FGF7)) that gained genome-wide significance by the false discovery rate method. In a follow-up analysis, both SNPs (rs12591300 and rs4480740) were significantly associated with COPD in an independent population (combined p values of 7.9E-7 and 2.8E-6, respectively). In another independent population, increased lung tissue FGF7 expression was associated with worse measures of lung function. Weights constructed from a homozygosity haplotype analysis of an isolated population successfully identify novel genetic associations from a GWAS on a separate population. This method can be used to identify promising candidate genes that fail to meet strict correction for multiple testing.
RADC SCAT automated sneak circuit analysis tool
NASA Astrophysics Data System (ADS)
Depalma, Edward L.
The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool uses an expert system shell to assist the analyst, so that prior experience with sneak analysis is not required. Both sneak circuits and design concerns are targeted by this tool, and both digital and analog circuits are examined. SCAT focuses the analysis at the assembly level, rather than on the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office
2004-04-01
This Corrective Action Decision Document identifies the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office's corrective action alternative recommendation for each of the corrective action sites (CASs) within Corrective Action Unit (CAU) 204: Storage Bunkers, Nevada Test Site (NTS), Nevada, under the Federal Facility Agreement and Consent Order. An evaluation of analytical data from the corrective action investigation, review of current and future operations at each CAS, and a detailed comparative analysis of potential corrective action alternatives were used to determine the appropriate corrective action for each CAS. There are six CASs in CAU 204, which are all located between Areas 1, 2, 3, and 5 on the NTS. The No Further Action alternative was recommended for CASs 01-34-01, 02-34-01, 03-34-01, and 05-99-02; and a Closure in Place with Administrative Controls recommendation was the preferred corrective action for CASs 05-18-02 and 05-33-01. These alternatives were judged to meet all requirements for the technical components evaluated as well as applicable state and federal regulations for closure of the sites and will eliminate potential future exposure pathways to the contaminated media at CAU 204.
Multicenter Evaluation of the Vitek MS v3.0 System for the Identification of Filamentous Fungi.
Rychert, Jenna; Slechta, E Sue; Barker, Adam P; Miranda, Edwin; Babady, N Esther; Tang, Yi-Wei; Gibas, Connie; Wiederhold, Nathan; Sutton, DeAnna; Hanson, Kimberly E
2018-02-01
Invasive fungal infections are an important cause of morbidity and mortality, affecting primarily immunocompromised patients. While fungal identification to the species level is critical to providing appropriate therapy, it can be slow and laborious and often relies on subjective morphological criteria. The use of matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry has the potential to speed up and improve the accuracy of identification. In this multicenter study, we evaluated the accuracy of the Vitek MS v3.0 system in identifying 1,601 clinical mold isolates compared to identification by DNA sequence analysis supported by morphological and phenotypic testing. Among the 1,519 isolates representing organisms in the v3.0 database, 91% (n = 1,387) were correctly identified to the species level. An additional 27 isolates (2%) were correctly identified to the genus level. Fifteen isolates were incorrectly identified, due to either a single incorrect identification (n = 13) or multiple identifications from different genera (n = 2). In those cases where a single identification was provided that was not correct, the misidentification was within the same genus. The Vitek MS v3.0 was unable to identify 91 (6%) isolates, despite repeat testing. These isolates were distributed among all the genera. When considering all isolates tested, even those not represented in the database, the Vitek MS v3.0 provided a single correct identification 98% of the time. These findings demonstrate that the Vitek MS v3.0 system is highly accurate for the identification of common molds encountered in the clinical mycology laboratory. Copyright © 2018 American Society for Microbiology.
Genomic characteristics of cattle copy number variations
USDA-ARS?s Scientific Manuscript database
We performed a systematic analysis of cattle copy number variations (CNVs) using the Bovine HapMap SNP genotyping data, including 539 animals of 21 modern cattle breeds and 6 outgroups. After correcting genomic waves and considering the trio information, we identified 682 candidate CNV regions (CNVR...
NASA Astrophysics Data System (ADS)
Teuho, J.; Johansson, J.; Linden, J.; Saunavaara, V.; Tolvanen, T.; Teräs, M.
2014-01-01
Selection of reconstruction parameters has an effect on image quantification in PET, with an additional contribution from the scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed using an anatomical brain phantom to identify and measure the amount of bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of measurement and comparison of 10 anatomical VOIs, with the HRRT considered as the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map ignoring the bone. Thus, further improvements to the PET/MR reconstruction and attenuation correction could be achieved by optimization of RAMLA-specific reconstruction parameters and inclusion of bone in the attenuation template.
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied for the risk assessment of snails manufacturing. A tentative approach of FMEA application to the snails industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snails processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over snails processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells and poisonous mushrooms, were the processes identified as the ones with the highest RPN (280, 240, 147, 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a snails processing industry is considered imperative.
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
Application of wavelet multi-resolution analysis for correction of seismic acceleration records
NASA Astrophysics Data System (ADS)
Ansari, Anooshiravan; Noorzad, Assadollah; Zare, Mehdi
2007-12-01
During an earthquake, many stations record the ground motion, but only a few of the records can be corrected using conventional high-pass and low-pass filtering methods; the others are identified as highly contaminated by noise and, as a result, useless. There are two major problems associated with these noisy records. First, since the signal-to-noise ratio (S/N) is low, it is not possible to discriminate between the original signal and the noise either in the frequency domain or in the time domain. Consequently, it is not possible to cancel out the noise using conventional filtering methods. The second problem is the non-stationary character of the noise. In other words, in many cases the characteristics of the noise vary over time, and in these situations it is not possible to apply frequency-domain correction schemes. When correcting acceleration signals contaminated with high-level non-stationary noise, an important question is whether it is possible to estimate the state of the noise in different bands of time and frequency. Wavelet multi-resolution analysis decomposes a signal into different time-frequency components and, besides suggesting a suitable criterion for identifying the noise within each component, provides the mathematical tools required for correcting highly noisy acceleration records. In this paper, the characteristics of the wavelet de-noising procedures are examined through the correction of selected real and synthetic acceleration time histories. It is concluded that this method provides a very flexible and efficient tool for the correction of very noisy and non-stationary records of ground acceleration. In addition, a two-step correction scheme is proposed for long-period correction of the acceleration records. This method has the advantage of yielding stable results in the displacement time history and response spectrum.
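The band-wise de-noising idea can be illustrated in miniature with a one-level Haar decomposition and soft thresholding (a sketch only; the paper's approach selects criteria per time-frequency component rather than the single global threshold used here):

```python
import math

def haar_decompose(signal):
    # One level of the Haar wavelet transform: split an even-length
    # signal into approximation (low-pass) and detail (high-pass) bands.
    s = 1.0 / math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def soft_threshold(coeffs, thresh):
    # Shrink detail coefficients toward zero; coefficients whose
    # magnitude falls below the threshold are treated as noise.
    return [math.copysign(max(abs(c) - thresh, 0.0), c) for c in coeffs]

def haar_reconstruct(approx, detail):
    # Inverse of haar_decompose (perfect reconstruction).
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out
```

Repeating the decomposition on the approximation band gives the multi-resolution pyramid, and time-varying (non-stationary) noise can then be thresholded differently in each band and time segment.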
NASA Technical Reports Server (NTRS)
Welsh, David; Denham, Samuel; Allen, Christopher
2011-01-01
In many cases, an initial symptom of hardware malfunction is unusual or unexpected acoustic noise. Many industries such as automotive, heating and air conditioning, and petro-chemical processing use noise and vibration data along with rotating machinery analysis techniques to identify noise sources and correct hardware defects. The NASA/Johnson Space Center Acoustics Office monitors the acoustic environment of the International Space Station (ISS) through periodic sound level measurement surveys. Trending of the sound level measurement survey results can identify in-flight hardware anomalies. The crew of the ISS also serves as a "detection tool" in identifying unusual hardware noises; in these cases the spectral analysis of audio recordings made on orbit can be used to identify hardware defects that are related to rotating components such as fans, pumps, and compressors. In this paper, three examples of the use of sound level measurements and audio recordings for the diagnosis of in-flight hardware anomalies are discussed: identification of blocked inter-module ventilation (IMV) ducts, diagnosis of abnormal ISS Crew Quarters rack exhaust fan noise, and the identification and replacement of a defective flywheel assembly in the Treadmill with Vibration Isolation (TVIS) hardware. In each of these examples, crew time was saved by identifying the off-nominal component or condition that existed and by directing in-flight maintenance activities to address and correct each of these problems.
Evaluation of the Microbial Identification System for identification of clinically isolated yeasts.
Crist, A E; Johnson, L M; Burke, P J
1996-01-01
The Microbial Identification System (MIS; Microbial ID, Inc., Newark, Del.) was evaluated for the identification of 550 clinically isolated yeasts. The organisms evaluated were fresh clinical isolates identified by methods routinely used in our laboratory (API 20C and conventional methods) and included Candida albicans (n = 294), C. glabrata (n = 145), C. tropicalis (n = 58), C. parapsilosis (n = 33), and other yeasts (n = 20). In preparation for fatty acid analysis, yeasts were inoculated onto Sabouraud dextrose agar and incubated at 28 degrees C for 24 h. Yeasts were harvested, saponified, derivatized, and extracted, and fatty acid analysis was performed according to the manufacturer's instructions. Fatty acid profiles were analyzed, and computer identifications were made with the Yeast Clinical Library (database version 3.8). Of the 550 isolates tested, 374 (68.0%) were correctly identified to the species level, with 87 (15.8%) being incorrectly identified and 89 (16.2%) giving no identification. Repeat testing of isolates giving no identification resulted in an additional 18 isolates being correctly identified. This gave the MIS an overall identification rate of 71.3%. The most frequently misidentified yeast was C. glabrata, which was identified as Saccharomyces cerevisiae 32.4% of the time. On the basis of these results, the MIS, with its current database, does not appear suitable for the routine identification of clinically important yeasts. PMID:8880489
Statistical testing and power analysis for brain-wide association study.
Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng
2018-04-05
The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons; thus, it can efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
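The power difference between FWER control (Bonferroni) and FDR control that the abstract refers to can be seen with a stdlib-only sketch of both procedures (the p-values in the usage example are illustrative):

```python
def bonferroni(pvals, alpha=0.05):
    # Reject H0 where p < alpha / m; controls the FWER but is
    # conservative when m is large.
    m = len(pvals)
    return [p < alpha / m for p in pvals]

def benjamini_hochberg(pvals, alpha=0.05):
    # BH step-up procedure controlling the FDR: sort the p-values,
    # find the largest rank k with p_(k) <= alpha * k / m, and
    # reject the hypotheses with the k smallest p-values.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= alpha * rank / m:
            k_max = rank
    reject = [False] * m
    for rank, idx in enumerate(order, start=1):
        if rank <= k_max:
            reject[idx] = True
    return reject

# Bonferroni rejects nothing here, while BH rejects four hypotheses:
pv = [0.01, 0.02, 0.03, 0.04, 0.2]
```

The random-field approach described in the abstract goes further still, exploiting the spatial smoothness of the statistic map rather than treating the tests as exchangeable.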
ERIC Educational Resources Information Center
Silverstein, Roni
2014-01-01
Root cause analysis is a powerful method schools use to analyze data to solve problems; it aims to identify and correct the root causes of problems or events, rather than simply addressing their symptoms. Veteran practitioner, Roni Silverstein, presented the value of this process and practical ways to use it in your school or district. This…
Evaluation and updating of the Medical Malacology Collection (Fiocruz-CMM) using molecular taxonomy.
Aguiar-Silva, Cryslaine; Mendonça, Cristiane Lafetá Furtado; da Cunha Kellis Pinheiro, Pedro Henrique; Mesquita, Silvia Gonçalves; Carvalho, Omar Dos Santos; Caldeira, Roberta Lima
2014-01-01
The Medical Malacology Collection (Coleção de Malacologia Médica, Fiocruz-CMM) is a depository of medically relevant mollusks, especially from the genus Biomphalaria, which includes the hosts of Schistosoma mansoni. Taxonomic studies of these snails have traditionally focused on the morphology of the reproductive system. However, determination of some species is complicated by the similarity of these characters. Molecular techniques have been used to try to overcome this problem. The Fiocruz-CMM uses morphological and/or molecular methods for species identification. However, part of the collection has not been identified by molecular techniques, and some specimens remained unidentified. The present study employs polymerase chain reaction-based analysis of restriction fragment length polymorphisms (PCR-RFLP) to evaluate the identification of Biomphalaria in the Fiocruz-CMM, correct existing errors, assess the suitability of taxonomic synonyms, and identify unknown specimens. The results indicated that 56.7% of the mollusk specimens were correctly identified, 4.0% were wrongly identified, and 0.4% were identified under taxonomic synonyms. Additionally, the PCR-RFLP analysis identified for the first time 17.6% of the specimens in the Collection. However, 3.1% of the specimens could not be identified because the mollusk tissues were degraded, and 18.2% of the specimens were inconclusively identified, demonstrating the need for new taxonomic studies of this group. The data were used to update records at the Reference Center on Environmental Information (CRIA). These studies demonstrate the importance of using more than one technique for taxonomic confirmation and of good preservation of collection specimens.
Genomic and evolutionary characteristics of cattle copy number variations
USDA-ARS?s Scientific Manuscript database
We performed a systematic analysis of cattle copy number variations (CNVs) using the Bovine HapMap SNP genotyping data, including 539 animals of 21 modern cattle breeds and 6 outgroups. After correcting genomic waves and considering the trio information, we identified 682 candidate CNV regions (CNVR...
Cracking the code: the accuracy of coding shoulder procedures and the repercussions.
Clement, N D; Murray, I R; Nie, Y X; McBirnie, J M
2013-05-01
Coding of patients' diagnoses and surgical procedures is subject to error levels of up to 40%, with consequences for the distribution of resources and financial recompense. Our aim was to explore and address the reasons behind coding errors of shoulder diagnoses and surgical procedures and to evaluate a potential solution. A retrospective review of 100 patients who had undergone surgery was carried out. Coding errors were identified and the reasons explored. A coding proforma was designed to address these errors and was prospectively evaluated for 100 patients. The financial implications were also considered. Retrospective analysis revealed that an entirely correct primary diagnosis was assigned in only 54 patients (54%), and only 7 (7%) patients had a correct procedure code assigned. Coders identified indistinct clinical notes and poor clarity of procedure codes as reasons for errors. The proforma was significantly more likely to assign the correct diagnosis (odds ratio 18.2, p < 0.0001) and the correct procedure code (odds ratio 310.0, p < 0.0001). Using the proforma resulted in a £28,562 increase in revenue for the 100 patients evaluated relative to the income generated from the coding department. High error levels for coding are due to misinterpretation of notes and ambiguity of procedure codes. This can be addressed by allowing surgeons to assign the diagnosis and procedure using a simplified list that is passed directly to coding.
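The odds ratios reported above compare the odds of a correct code with and without the proforma. A minimal sketch of the computation, with a Wald 95% confidence interval on the log-odds scale, is shown below; the 95/100 proforma count is a hypothetical illustration (the paper's exact cell counts are not given in the abstract), while 54/100 is the reported baseline:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table [[a, b], [c, d]]:
    rows = with/without proforma, cols = correct/incorrect code."""
    return (a / b) / (c / d)

def or_ci95(a, b, c, d):
    """Odds ratio with a Wald 95% CI computed on the log-odds scale."""
    or_ = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 95/100 correct with proforma vs the reported 54/100 without
or_, lo, hi = or_ci95(95, 5, 54, 46)
print(f"OR = {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

With real cell counts from the study, the same two functions would reproduce the reported odds ratios of 18.2 and 310.0.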
Lefave, Melissa; Harrell, Brad; Wright, Molly
2016-06-01
The purpose of this project was to assess the ability of anesthesiologists, nurse anesthetists, and registered nurses to correctly identify anatomic landmarks of cricoid pressure and apply the correct amount of force. The project included an educational intervention with a one-group pretest-posttest design. Participants demonstrated cricoid pressure on a laryngotracheal model. After an educational intervention video, participants were asked to repeat cricoid pressure on the model. Participants with a nurse anesthesia background applied more appropriate force pretest than other participants; however, post-test results, while improved, showed no significant difference among providers. Participant identification of the correct anatomy of the cricoid cartilage and application of correct force were significantly improved after education. This study revealed that, before the educational intervention, participants lacked prior knowledge of correct cricoid anatomy and pressure as well as the ability to apply correct force to the laryngotracheal model. The intervention used in this study proved successful in educating health care providers. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.
Cashmore, Aaron W; Indig, Devon; Hampton, Stephen E; Hegney, Desley G; Jalaludin, Bin B
2016-11-01
Little is known about the environmental and organisational determinants of workplace violence in correctional health settings. This paper describes the views of health professionals working in these settings on the factors influencing workplace violence risk. All employees of a large correctional health service in New South Wales, Australia, were invited to complete an online survey. The survey included an open-ended question seeking the views of participants about the factors influencing workplace violence in correctional health settings. Responses to this question were analysed using qualitative thematic analysis. Participants identified several factors that they felt reduced the risk of violence in their workplace, including: appropriate workplace health and safety policies and procedures; professionalism among health staff; the presence of prison guards and the quality of security provided; and physical barriers within clinics. Conversely, participants perceived workplace violence risk to be increased by: low health staff-to-patient and correctional officer-to-patient ratios; high workloads; insufficient or underperforming security staff; and poor management of violence, especially horizontal violence. The views of these participants should inform efforts to prevent workplace violence among correctional health professionals.
Thematic mapping, land use, geological structure and water resources in central Spain
NASA Technical Reports Server (NTRS)
Delascuevas, N. (Principal Investigator)
1976-01-01
The author has identified the following significant results. The images can be positioned in an absolute reference system (geographical coordinates or polar stereographic coordinates) by means of their marginal indicators. By digital analysis of LANDSAT data and geometric positioning of pixels in UTM projection, accuracy was achieved for corrected MSS information that could be used for updating maps at scale 1:200,000 or smaller. Results show that adjustment to the UTM grid was better achieved with a first-order, or even second-order, geometric correction algorithm. Digital analysis of LANDSAT data from the Madrid area showed that this line of study was promising for automatic classification of data applied to thematic cartography and soils identification.
Chocholova, Erika; Bertok, Tomas; Jane, Eduard; Lorencova, Lenka; Holazova, Alena; Belicka, Ludmila; Belicky, Stefan; Mislovicova, Danica; Vikartovska, Alica; Imrich, Richard; Kasak, Peter; Tkac, Jan
2018-06-01
In this study, one hundred serum samples from healthy people and patients with rheumatoid arthritis (RA) were analyzed. Standard immunoassays for detection of 10 different RA markers and analysis of glycan markers on antibodies in 10 different assay formats with several lectins were applied to each serum sample. A dataset containing 2000 data points was mined using artificial neural networks (ANN). We identified key RA markers, which can discriminate between healthy people and seropositive RA patients (serum containing autoantibodies) with an accuracy of 83.3%. Combining RA markers with glycan analysis provided a much better discrimination accuracy of 92.5%. Immunoassays completely failed to identify seronegative RA patients (serum not containing autoantibodies), while glycan analysis correctly identified 43.8% of these patients. Further, we revealed other parameters critical for successful glycan analysis, such as the type of sample, the format of analysis, and the orientation of captured antibodies for glycan analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
Contact Stress Analysis of Spiral Bevel Gears Using Finite Element Analysis
NASA Technical Reports Server (NTRS)
Bibel, G. D.; Kumar, A; Reddy, S.; Handschuh, R.
1995-01-01
A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.
ERIC Educational Resources Information Center
Goncher, Andrea M.; Jayalath, Dhammika; Boles, Wageeh
2016-01-01
Concept inventory tests are one method to evaluate conceptual understanding and identify possible misconceptions. The multiple-choice question format, offering a choice between a correct selection and common misconceptions, can provide an assessment of students' conceptual understanding in various dimensions. Misconceptions of some engineering…
Axis I Screens and Suicide Risk in Jails: A Comparative Analysis
ERIC Educational Resources Information Center
Harrison, Kimberly S.; Rogers, Richard
2007-01-01
Mental health professionals conducting screenings in jail settings face formidable challenges in identifying inmates at risk for major depression and suicide. Psychologists often rely on correctional staff to provide initial appraisals of those inmates requiring further evaluation. In a sample of 100 jail detainees, the effectiveness of two…
Willis, Sarah R; Ahmed, Hashim U; Moore, Caroline M; Donaldson, Ian; Emberton, Mark; Miners, Alec H; van der Meulen, Jan
2014-01-01
Objective: To compare the diagnostic outcomes of the current approach of transrectal ultrasound (TRUS)-guided biopsy in men with suspected prostate cancer to an alternative approach using multiparametric MRI (mpMRI), followed by MRI-targeted biopsy if positive. Design: Clinical decision analysis was used to synthesise data from recently emerging evidence in a format that is relevant for clinical decision making. Population: A hypothetical cohort of 1000 men with suspected prostate cancer. Interventions: mpMRI and, if positive, MRI-targeted biopsy compared with TRUS-guided biopsy in all men. Outcome measures: We report the number of men expected to undergo a biopsy as well as the numbers of correctly identified patients with or without prostate cancer. A probabilistic sensitivity analysis was carried out using Monte Carlo simulation to explore the impact of statistical uncertainty in the diagnostic parameters. Results: In 1000 men, mpMRI followed by MRI-targeted biopsy ‘clinically dominates’ TRUS-guided biopsy as it results in fewer expected biopsies (600 vs 1000), more men being correctly identified as having clinically significant cancer (320 vs 250), and fewer men being falsely identified (20 vs 50). The mpMRI-based strategy dominated TRUS-guided biopsy in 86% of the simulations in the probabilistic sensitivity analysis. Conclusions: Our analysis suggests that mpMRI followed by MRI-targeted biopsy is likely to result in fewer and better biopsies than TRUS-guided biopsy. Future research in prostate cancer should focus on providing precise estimates of key diagnostic parameters. PMID:24934207
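The probabilistic sensitivity analysis described above can be sketched in a few lines: draw the uncertain diagnostic parameters from distributions, propagate each draw through the cohort model, and count the fraction of draws in which one strategy dominates. The prevalence and Beta parameters below are hypothetical placeholders, not the study's inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sim, cohort = 10_000, 1000
prev = 0.30  # hypothetical prevalence of clinically significant cancer

# Hypothetical Beta distributions expressing uncertainty in each pathway
sens_mri = rng.beta(85, 15, n_sim)    # centred near 0.85
spec_mri = rng.beta(95, 5, n_sim)     # centred near 0.95
sens_trus = rng.beta(70, 30, n_sim)   # centred near 0.70
spec_trus = rng.beta(90, 10, n_sim)   # centred near 0.90

# Expected true positives and false positives per simulated draw
tp_mri = cohort * prev * sens_mri
fp_mri = cohort * (1 - prev) * (1 - spec_mri)
tp_trus = cohort * prev * sens_trus
fp_trus = cohort * (1 - prev) * (1 - spec_trus)

# "Clinical dominance": more cancers found AND fewer false identifications
dominates = (tp_mri > tp_trus) & (fp_mri < fp_trus)
print(f"mpMRI pathway dominates in {dominates.mean():.0%} of simulations")
```

With the study's actual parameter distributions, the analogous count is what yields the reported 86% dominance figure.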
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs; to identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity, and their detectability. The team defined monitoring, control measures, and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1,225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2008-05-01
The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach of FMEA application to the salmon industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out over salmon processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, filet-making, cooling/freezing, and distribution were identified as the processes with the highest RPN (252, 240, 210, 210, 210, 210, and 200, respectively) and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out, resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing industry is anticipated to prove advantageous to industrialists, state food inspectors, and consumers.
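The Risk Priority Number driving the ranking above is a simple product of three ordinal scores. The sketch below shows the computation with hypothetical severity/occurrence/detection scores (the abstract reports only the resulting RPNs, not the individual scores):

```python
def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number = S x O x D, each typically scored 1-10."""
    return severity * occurrence * detection

ACCEPTABLE_LIMIT = 130  # upper acceptable RPN limit cited in the abstract

# Hypothetical scores for one failure mode before and after corrective action
before = rpn(severity=7, occurrence=6, detection=6)  # 252, above the limit
after = rpn(severity=7, occurrence=3, detection=5)   # 105, below the limit
print(before, after)  # 252 105
```

Corrective actions typically lower occurrence (e.g. process controls) or detection scores (e.g. added inspections) rather than severity, which is why the post-correction RPNs fall below the limit.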
Evaluation of Molecular Methods for Identification of Salmonella Serovars
Gurnik, Simone; Ahmad, Aaminah; Blimkie, Travis; Murphy, Stephanie A.; Kropinski, Andrew M.; Nash, John H. E.
2016-01-01
Classification by serotyping is the essential first step in the characterization of Salmonella isolates and is important for surveillance, source tracking, and outbreak detection. To improve detection and reduce the burden of salmonellosis, several rapid and high-throughput molecular Salmonella serotyping methods have been developed. The aim of this study was to compare three commercial kits, Salm SeroGen (Salm Sero-Genotyping AS-1 kit), Check&Trace (Check-Points), and xMAP (xMAP Salmonella serotyping assay), to the Salmonella genoserotyping array (SGSA) developed by our laboratory. They were assessed using a panel of 321 isolates that represent commonly reported serovars from human and nonhuman sources globally. The four methods correctly identified 73.8% to 94.7% of the isolates tested. The methods correctly identified 85% and 98% of the clinically important Salmonella serovars Enteritidis and Typhimurium, respectively. The methods correctly identified 75% to 100% of the nontyphoidal, broad host range Salmonella serovars, including Heidelberg, Hadar, Infantis, Kentucky, Montevideo, Newport, and Virchow. The sensitivity and specificity of Salmonella serovars Typhimurium and Enteritidis ranged from 85% to 100% and 99% to 100%, respectively. It is anticipated that whole-genome sequencing will replace serotyping in public health laboratories in the future. However, at present, it is approximately three times more expensive than molecular methods. Until consistent standards and methodologies are deployed for whole-genome sequencing, data analysis and interlaboratory comparability remain a challenge. The use of molecular serotyping will provide a valuable high-throughput alternative to traditional serotyping. This comprehensive analysis provides a detailed comparison of commercial kits available for the molecular serotyping of Salmonella. PMID:27194688
Identification and topographic localization of metallic foreign bodies by metal detector.
Muensterer, Oliver J; Joppich, Ingolf
2004-08-01
Exact localization of ingested metal objects is necessary to guide therapy. This study prospectively evaluates the accuracy of foreign body (FB) identification and localization by metal detector (MTD) in a systematic topographic fashion. Patients who presented after an alleged or witnessed metal FB ingestion were scanned with an MTD. In case of a positive signal, the location was recorded in a topographic diagram, and radiographs were obtained. The diagnostic accuracy of the MTD scan for FB identification and topographic localization was determined by chi-square analysis, and concordance was calculated by the McNemar test and expressed as kappa. A total of 70 MTD examinations were performed on 65 patients (age 6 months to 16 years); 5 patients were scanned twice on different days. The majority had swallowed coins and button batteries (n = 41). Of these, 29 items were correctly identified, and 11 of 12 were correctly ruled out (coins and button batteries: sensitivity, 100% [95% confidence interval 95% to 100%]; specificity, 91.7% [95% CI 76% to 100%], kappa = 0.94). When all metallic objects were included, 41 of 46 were correctly identified, and 22 of 24 were correctly ruled out (sensitivity, 89.1% [95% CI 80% to 98%]; specificity, 91.7% [95% CI 81% to 100%], kappa = 0.78). Five miscellaneous objects were not identified (sensitivity for items other than coins and button batteries 71% [95% CI 49% to 92%], kappa = 0.56). Localization by MTD was correct in 30 of 41 identified objects (73%). The error rates of junior and senior pediatric surgery residents did not differ significantly (P = .82). Ingested coins and button batteries can be safely and accurately found by metal detector. For these indications, the MTD is a radiation-free diagnostic alternative to conventional radiographs. Other items, however, cannot be ruled out reliably by MTD. In these cases, radiographic imaging is still indicated.
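The accuracy statistics quoted above follow directly from a 2x2 table. The sketch below reconstructs the coin/button-battery cell counts implied by the abstract (29 true positives, 0 false negatives, 11 true negatives, 1 false positive) and computes sensitivity, specificity, and Cohen's kappa:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa from 2x2 counts."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    po = (tp + tn) / n  # observed agreement
    # expected agreement by chance, from the marginal totals
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

# Cell counts implied by the coin/button-battery figures in the abstract
sens, spec, kappa = diagnostic_stats(tp=29, fp=1, fn=0, tn=11)
print(f"sens={sens:.1%} spec={spec:.1%} kappa={kappa:.2f}")
# sens=100.0% spec=91.7% kappa=0.94
```

The computed values match the reported sensitivity of 100%, specificity of 91.7%, and kappa of 0.94 for coins and button batteries.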
Gettig, Jacob P
2006-04-01
To determine the prevalence of established multiple-choice test-taking correct and incorrect answer cues in the American College of Clinical Pharmacy's Updates in Therapeutics: The Pharmacotherapy Preparatory Course, 2005 Edition, as an equal or lesser surrogate indication of the prevalence of such cues in the Pharmacotherapy board certification examination. All self-assessment and patient case question-and-answer sets were assessed individually to determine if they were subject to selected correct and incorrect answer cues commonly seen in multiple-choice question writing. If the question was considered evaluable, correct answer cues-longest answer, mid-range number, one of two similar choices, and one of two opposite choices-were tallied. In addition, incorrect answer cues- inclusionary language and grammatical mismatch-were also tallied. Each cue was counted if it did what was expected or did the opposite of what was expected. Multiple cues could be identified in each question. A total of 237 (47.7%) of 497 questions in the manual were deemed evaluable. A total of 325 correct answer cues and 35 incorrect answer cues were identified in the 237 evaluable questions. Most evaluable questions contained one to two correct and/or incorrect answer cue(s). Longest answer was the most frequently identified correct answer cue; however, it was the least likely to identify the correct answer. Inclusionary language was the most frequently identified incorrect answer cue. Incorrect answer cues were considerably more likely to identify incorrect answer choices than correct answer cues were able to identify correct answer choices. The use of established multiple-choice test-taking cues is unlikely to be of significant help when taking the Pharmacotherapy board certification examination, primarily because of the lack of questions subject to such cues and the inability of correct answer cues to accurately identify correct answers. 
Incorrect answer cues, especially the use of inclusionary language, almost always will accurately identify an incorrect answer choice. Assuming that questions in the preparatory course manual were equal or lesser surrogates of those in the board certification examination, it is unlikely that intuition alone can replace adequate preparation and studying as the sole determinant of examination success.
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Baojuan; Zhang, Xi; Zhang, Linchuan; Li, Liang; Lu, Hongbing
2016-03-01
To explore the alterations in cerebral blood flow (CBF) and functional connectivity between survivors with recent-onset post-traumatic stress disorder (PTSD) and without PTSD who survived the same coal mine flood disaster. In this study, a processing pipeline using an arterial spin labeling (ASL) sequence was proposed. Considering the low spatial resolution of the ASL sequence, a linear regression method was first used to correct the partial volume (PV) effect for better CBF estimation. Then the alterations of CBF between the two groups were analyzed using both uncorrected and PV-corrected CBF maps. Using the altered CBF regions detected in the CBF analysis as seed regions, the functional connectivity abnormalities in PTSD patients were investigated. The CBF analysis using PV-corrected maps indicates CBF deficits in the bilateral frontal lobe, right superior frontal gyrus and right corpus callosum of PTSD patients, while only the right corpus callosum was identified in the uncorrected CBF analysis. Furthermore, the regional CBF of the right superior frontal gyrus exhibits a significantly negative correlation with symptom severity in PTSD patients. The resting-state functional connectivity indicates increased connectivity between the left frontal lobe and right parietal lobe. These results indicate that PV-corrected CBF exhibits more subtle perfusion changes and may benefit further perfusion and connectivity analysis. The symptom-specific perfusion deficits and aberrant connectivity in the above memory-related regions may be putative biomarkers for recent-onset PTSD induced by a single prolonged trauma exposure and help predict the severity of PTSD.
NASA Astrophysics Data System (ADS)
Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina
2018-01-01
In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
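At its core, a combined distribution function is a joint histogram over two structural coordinates (for example, ion-ligand distance and ligand-ion-ligand angle), whose peak pattern fingerprints the coordination polyhedron. The sketch below builds such a CDF from synthetic Gaussian-distributed coordinates standing in for a real MD trajectory; all numbers are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical MD trajectory samples: ion-oxygen distances (angstroms)
# and O-ion-O angles (degrees) for an invented octahedral-like complex
dist = rng.normal(2.4, 0.1, 5000)
angle = rng.normal(90.0, 8.0, 5000)

# The CDF is a joint histogram over the two coordinates; the location of
# its maxima is the geometric fingerprint assigned to a reference polyhedron
cdf, d_edges, a_edges = np.histogram2d(
    dist, angle, bins=(40, 60), range=[[2.0, 3.0], [40.0, 180.0]], density=True
)
i, j = np.unravel_index(np.argmax(cdf), cdf.shape)
print(f"CDF peak near d={d_edges[i]:.2f} A, angle={a_edges[j]:.1f} deg")
```

A real octahedron would show angle peaks near both 90 and 180 degrees; comparing the observed peak pattern against such reference fingerprints is the assignment step the procedure automates.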
Coughlan, Helena; Reddington, Kate; Tuite, Nina; Boo, Teck Wee; Cormican, Martin; Barrett, Louise; Smith, Terry J; Clancy, Eoin; Barry, Thomas
2015-10-01
Haemophilus influenzae is recognised as an important human pathogen associated with invasive infections, including bloodstream infection and meningitis. Currently used molecular-based diagnostic assays lack specificity in correctly detecting and identifying H. influenzae. As such, there is a need to develop novel diagnostic assays for the specific identification of H. influenzae. Whole-genome comparative analysis was performed to identify putative diagnostic targets that are unique in nucleotide sequence to H. influenzae. From this analysis, we identified two putative H. influenzae diagnostic targets, phoB and pstA, for use in real-time PCR diagnostic assays. Real-time PCR diagnostic assays using these targets were designed and optimised to specifically detect and identify all 55 H. influenzae strains tested. These novel rapid assays can be applied to the specific detection and identification of H. influenzae for use in epidemiological studies and could also enable improved monitoring of invasive disease caused by these bacteria. Copyright © 2015 Elsevier Inc. All rights reserved.
Stöggl, Thomas; Holst, Anders; Jonasson, Arndt; Andersson, Erik; Wunsch, Tobias; Norström, Christer; Holmberg, Hans-Christer
2014-10-31
The purpose of the current study was to develop and validate an automatic algorithm for classification of cross-country (XC) ski-skating gears (G) using Smartphone accelerometer data. Eleven XC skiers (seven men, four women) with regional-to-international levels of performance carried out roller skiing trials on a treadmill using fixed gears (G2left, G2right, G3, G4left, G4right) and a 950-m trial using different speeds and inclines, applying gears and sides as they normally would. Gear classification by the Smartphone (worn on the chest) was compared with classification based on video recordings. For machine learning, a collective database was compared to individual data. The Smartphone application identified the trials with fixed gears correctly in all cases. In the 950-m trial, participants executed 140 ± 22 cycles as assessed by video analysis, with the automatic Smartphone application giving a similar value. Based on collective data, gears were identified correctly 86.0% ± 8.9% of the time, a value that rose to 90.3% ± 4.1% (P < 0.01) with machine learning from individual data. Classification was most often incorrect during transitions between gears, especially to or from G3. Identification was most often correct for skiers who made relatively few transitions between gears. The accuracy of the automatic procedure for identifying G2left, G2right, G3, G4left and G4right was 96%, 90%, 81%, 88% and 94%, respectively. The algorithm identified gears correctly 100% of the time when a single gear was used and 90% of the time when different gears were employed during a variable protocol. This algorithm could be improved with respect to identification of transitions between gears or the side employed within a given gear.
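Per-gear accuracies like those reported above are the diagonal of a confusion matrix divided by the row totals. The matrix below is hypothetical, constructed only to roughly echo the reported 96/90/81/88/94% figures (the study's actual counts are not given in the abstract); note how G3's off-diagonal mass reflects the transition errors described:

```python
import numpy as np

# Hypothetical confusion matrix: rows = true gear, cols = predicted gear,
# in the order G2left, G2right, G3, G4left, G4right
cm = np.array([
    [48, 1, 1, 0, 0],
    [2, 45, 2, 1, 0],
    [1, 2, 42, 4, 3],   # G3 is confused most often, as in the study
    [0, 1, 3, 44, 2],
    [0, 0, 2, 1, 47],
])

per_gear_acc = np.diag(cm) / cm.sum(axis=1)  # per-class recall
overall_acc = np.trace(cm) / cm.sum()
print(per_gear_acc.round(2), round(float(overall_acc), 3))
```

Reporting both per-class and overall accuracy matters here because transitions concentrate the errors in one class (G3) that an overall figure alone would mask.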
Contact stress analysis of spiral bevel gears using nonlinear finite element static analysis
NASA Technical Reports Server (NTRS)
Bibel, G. D.; Kumar, A.; Reddy, S.; Handschuh, R.
1993-01-01
A procedure is presented for performing three-dimensional stress analysis of spiral bevel gears in mesh using the finite element method. The procedure involves generating a finite element model by solving equations that identify tooth surface coordinates. Coordinate transformations are used to orientate the gear and pinion for gear meshing. Contact boundary conditions are simulated with gap elements. A solution technique for correct orientation of the gap elements is given. Example models and results are presented.
Mod 1 wind turbine generator failure modes and effects analysis
NASA Technical Reports Server (NTRS)
1979-01-01
A failure modes and effects analysis (FMEA) was directed primarily at identifying those critical failure modes that would be hazardous to life or would result in major damage to the system. Each subsystem was approached from the top down, and broken down to successive lower levels where it appeared that the criticality of the failure mode warranted more detail analysis. The results were reviewed by specialists from outside the Mod 1 program, and corrective action taken wherever recommended.
Suggestibility and state anxiety: how the two concepts relate in a source identification paradigm.
Ridley, Anne M; Clifford, Brian R
2006-01-01
Source identification tests provide a stringent method for testing the suggestibility of memory because they reduce response bias and experimental demand characteristics. Using the techniques and materials of Maria Zaragoza and her colleagues, we investigated how state anxiety affects the ability of undergraduates to identify correctly the source of misleading post-event information. The results showed that individuals high in state anxiety were less likely to make source misattributions of misleading information, indicating lower levels of suggestibility. This effect was strengthened when forgotten or non-recognised misleading items (for which a source identification task is not possible) were excluded from the analysis. Confidence in the correct attribution of misleading post-event information to its source was significantly less than confidence in source misattributions. Participants who were high in state anxiety tended to be less confident than those lower in state anxiety when they correctly identified the source of both misleading post-event information and non-misled items. The implications of these findings are discussed, drawing on the literature on anxiety and cognition as well as suggestibility.
Patlewicz, Grace; Casati, Silvia; Basketter, David A; Asturiol, David; Roberts, David W; Lepoittevin, Jean-Pierre; Worth, Andrew P; Aschberger, Karin
2016-12-01
Predictive testing to characterize substances for their skin sensitization potential has historically been based on animal tests such as the Local Lymph Node Assay (LLNA). In recent years, regulations in the cosmetics and chemicals sectors have provided strong impetus to develop non-animal alternatives. Three test methods have undergone OECD validation: the direct peptide reactivity assay (DPRA), the KeratinoSens™ and the human Cell Line Activation Test (h-CLAT). Whilst these methods perform relatively well in predicting LLNA results, a concern raised is their ability to predict chemicals that need activation to be sensitizing (pre- or pro-haptens). The current study reviewed an EURL ECVAM dataset of 127 substances for which information was available in the LLNA and the three non-animal test methods. Twenty-eight of the sensitizers needed to be activated, with the majority being pre-haptens. These were correctly identified by one or more of the test methods. Six substances were categorized exclusively as pro-haptens, but were correctly identified by at least one of the cell-based assays. The analysis here showed that skin metabolism was not likely to be a major consideration for assessing sensitization potential and that sensitizers requiring activation could be identified correctly using one or more of the current non-animal methods. Published by Elsevier Inc.
DOT National Transportation Integrated Search
2017-10-31
NOTE: This is a revision of Letter Report V324-FA5JBQ-LR4. The purpose of the revision is to correctly identify the system designation for which the validation applies. The applicant originally identified the materials provided in support of this val...
Phaser.MRage: automated molecular replacement
Bunkóczi, Gábor; Echols, Nathaniel; McCoy, Airlie J.; Oeffner, Robert D.; Adams, Paul D.; Read, Randy J.
2013-01-01
Phaser.MRage is a molecular-replacement automation framework that implements a full model-generation workflow and provides several layers of model exploration to the user. It is designed to handle a large number of models and can distribute calculations efficiently onto parallel hardware. In addition, phaser.MRage can identify correct solutions and use this information to accelerate the search. Firstly, it can quickly score all alternative models of a component once a correct solution has been found. Secondly, it can perform extensive analysis of identified solutions to find protein assemblies and can employ assembled models for subsequent searches. Thirdly, it is able to use a priori assembly information (derived from, for example, homologues) to speculatively place and score molecules, thereby customizing the search procedure to a certain class of protein molecule (for example, antibodies) and incorporating additional biological information into molecular replacement. PMID:24189240
NASA Astrophysics Data System (ADS)
Joshi, K. D.; Marchant, T. E.; Moore, C. J.
2017-03-01
A shading correction algorithm for the improvement of cone-beam CT (CBCT) images (Phys. Med. Biol. 53 5719–33) has been further developed, optimised and validated extensively using 135 clinical CBCT images of patients undergoing radiotherapy treatment of the pelvis, lungs and head and neck. An automated technique has been developed to efficiently analyse the large number of clinical images. Small regions of similar tissue (for example fat tissue) are automatically identified using CT images. The same regions on the corresponding CBCT image are analysed to ensure that they do not contain pixels representing multiple types of tissue. The mean value of all selected pixels and the non-uniformity, defined as the median absolute deviation of the mean values in each small region, are calculated. Comparisons between CT and raw and corrected CBCT images are then made. Analysis of fat regions in pelvis images shows an average difference in mean pixel value between CT and CBCT of 136.0 HU in raw CBCT images, which is reduced to 2.0 HU after the application of the shading correction algorithm. The average difference in non-uniformity of fat pixels is reduced from 33.7 in raw CBCT to 2.8 in shading-corrected CBCT images. Similar results are obtained in the analysis of lung and head and neck images.
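The two summary statistics described in this record can be sketched in a few lines (a minimal illustration only; the region lists, function names, and toy values are assumptions, not the authors' implementation):

```python
from statistics import mean, median

def region_means(regions):
    # Mean pixel value (e.g. in HU) of each small same-tissue region.
    return [mean(r) for r in regions]

def non_uniformity(regions):
    # Median absolute deviation of the per-region means, matching the
    # definition of non-uniformity given in the abstract.
    means = region_means(regions)
    m = median(means)
    return median(abs(v - m) for v in means)

def mean_difference(ct_regions, cbct_regions):
    # Average difference in mean pixel value between matched CT and
    # CBCT regions of the same tissue type.
    ct = region_means(ct_regions)
    cb = region_means(cbct_regions)
    return mean(c - b for c, b in zip(ct, cb))
```

Applied to matched fat regions before and after shading correction, a drop in both statistics would mirror the improvements reported above.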
A fractional Fourier transform analysis of a bubble excited by an ultrasonic chirp.
Barlow, Euan; Mulholland, Anthony J
2011-11-01
The fractional Fourier transform is proposed here as a model-based signal processing technique for determining the size of a bubble in a fluid. The bubble is insonified with an ultrasonic chirp and the radiated pressure field is recorded. This experimental bubble response is then compared with a series of theoretical model responses to identify the closest match between experiment and theory, which allows the correct bubble size to be identified. The fractional Fourier transform is used to produce a more detailed description of each response, and two-dimensional cross correlation is then employed to identify the similarities between the experimental response and each theoretical response. In this paper the experimental bubble response is simulated by adding various levels of noise to the theoretical model output. The method is compared to the standard technique of using time-domain cross correlation. The proposed method is shown to be far more robust at correctly sizing the bubble and can cope with much lower signal-to-noise ratios.
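The matching step, choosing the theoretical response that best correlates with the measurement, can be sketched as follows (a simplified zero-lag, one-dimensional stand-in for the paper's fractional-Fourier/2-D cross-correlation pipeline; all names and values are illustrative assumptions):

```python
from math import sqrt

def norm_xcorr(a, b):
    # Zero-lag normalised cross-correlation of two equal-length signals.
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    if na == 0 or nb == 0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def best_match(experimental, theoretical_by_radius):
    # Return the candidate bubble radius whose model response
    # correlates best with the measured response.
    return max(theoretical_by_radius,
               key=lambda r: norm_xcorr(experimental, theoretical_by_radius[r]))
```

In the paper the comparison is between fractional-Fourier representations via 2-D cross correlation, but the selection logic, pick the radius maximising the similarity score, is the same.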
Deriving pathway maps from automated text analysis using a grammar-based approach.
Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn
2006-04-01
We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.
Evaluating soil risks associated with severe wildfire and ground-based logging
Keith M. Reynolds; Paul F. Hessburg; Richard E. Miller; Robert T. Meurisse
2011-01-01
Rehabilitation and timber-salvage activities after wildfire require rapid planning and rational decisions. Identifying areas with high risk for erosion and soil productivity losses is important. Moreover, allocation of corrective and mitigative efforts must be rational and prioritized. Our logic-based analysis of forested soil polygons on the Okanogan-Wenatchee...
Artifacts and power corrections: Reexamining Z_psi(p^2) and Z_V in the momentum-subtraction scheme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boucaud, Ph.; Leroy, J. P.; Le Yaouanc, A.
2006-08-01
The next-to-leading-order (NLO) term in the operator product expansion (OPE) of the quark propagator vector part Z_psi and the vertex function g_1 of the vector current in the Landau gauge should be dominated by the same condensate as in the gluon propagator. On the other hand, the perturbative part has been calculated to a very high precision thanks to Chetyrkin and collaborators. We test this on the lattice, with both clover and overlap fermion actions at beta = 6.0, 6.4, 6.6, 6.8. Elucidation of discretization artifacts appears to be absolutely crucial. First, hypercubic artifacts are eliminated by a more powerful method, which gives results notably different from the standard democratic method. Then, the presence of unexpected, very large, nonperturbative, O(4)-symmetric discretization artifacts, increasing towards small momenta, is demonstrated by considering Z_V^MOM, which should be constant in the absence of such artifacts. They impede in general the analysis of the OPE. However, in two special cases with overlap action--(1) for Z_psi; (2) for g_1, but only at large p^2--we are able to identify the condensate; it agrees with the one resulting from gluonic Green functions. We conclude that the OPE analysis of quark and gluon Green functions has reached a quite consistent status, and that the power corrections have been correctly identified. A practical consequence of the whole analysis is that the renormalization constant Z_psi (=Z_2^(-1) of the momentum-subtraction (MOM) scheme) may differ sizably from the one given by democratic selection methods. More generally, the values of the renormalization constants may be seriously affected by the differences in the treatment of the various types of artifacts, and by the subtraction of power corrections.
Naserpour Farivar, Taghi; Najafipour, Reza; Johari, Pouran; Aslanimehr, Masoumeh; Peymani, Amir; Jahani Hashemi, Hoasan; Mirzaui, Baman
2014-10-01
We developed and evaluated the utility of a quadruplex TaqMan real-time PCR assay that allows simultaneous identification of vancomycin-resistant genotypes and clinically relevant enterococci. The specificity of the assay was tested using reference strains of vancomycin-resistant and susceptible enterococci. In total, 193 clinical isolates were identified and subsequently genotyped using the quadruplex TaqMan real-time PCR assay and melting curve analysis. Representative quadruplex TaqMan real-time PCR amplification curves were obtained for Enterococcus faecium, Enterococcus faecalis, vanA-containing E. faecium, and vanB-containing E. faecalis. Phenotypic and genotypic analysis of the isolates gave the same results for 82 enterococcal isolates, while in 5 isolates they were inconsistent. We had three mixed strains, which were detected by the TaqMan real-time PCR assay and could not be identified correctly using phenotypic methods. Vancomycin-resistant enterococci (VRE) genotyping and identification of clinically relevant enterococci were performed rapidly and correctly using the multiplex TaqMan real-time PCR assay.
Adult CHD: the ongoing need for physician counselling about heredity and contraceptive options.
Londono-Obregon, Camila; Goldmuntz, Elizabeth; Davey, Brooke T; Zhang, Xuemei; Slap, Gail B; Kim, Yuli Y
2017-05-01
Purpose Current guidelines recommend that patients with CHD receive age-appropriate counselling on reproduction, pregnancy, and risk of heredity. Our aim was to examine patient knowledge of reproductive health and explore the association between patient knowledge of CHD transmission risk and earlier physician counselling in adults with CHD. We performed a cross-sectional survey of patients with CHD aged 18 years and older in a paediatric hospital. Of the 100 patients who completed the questionnaire, most did not report counselling on heredity (66%) or contraception (71%). Of the 54 women, 25 (46%) identified their contraceptive options correctly; 42 (78%) women were classified as being at significantly increased risk for an adverse outcome during pregnancy, and of these 20 (48%) identified this risk correctly. Of all patients surveyed, 72% did not know that having CHD placed them at increased risk for having a child with CHD. On multivariate analysis, factors associated with correct knowledge about risk of recurrence were correct identification of CHD diagnosis (p=0.04) and patient-reported counselling (p=0.001). Knowledge about heredity, pregnancy risk, and contraceptive options is inadequate among adults with CHD followed up in a paediatric subspecialty clinic. The majority of patients did not report a history of counselling about reproductive health. There is a strong correlation between history of counselling by the patient's cardiologist and correct knowledge about recurrence risk, suggesting that effective reproductive counselling can positively impact this knowledge gap.
A novel approach to identify genes that determine grain protein deviation in cereals.
Mosleth, Ellen F; Wan, Yongfang; Lysenko, Artem; Chope, Gemma A; Penson, Simon P; Shewry, Peter R; Hawkesford, Malcolm J
2015-06-01
Grain yield and protein content were determined for six wheat cultivars grown over 3 years at multiple sites and at multiple nitrogen (N) fertilizer inputs. Although grain protein content was negatively correlated with yield, some grain samples had higher protein contents than expected based on their yields, a trait referred to as grain protein deviation (GPD). We used novel statistical approaches to identify gene transcripts significantly related to GPD across environments. The yield and protein content were initially adjusted for nitrogen fertilizer inputs and then adjusted for yield (to remove the negative correlation with protein content), resulting in a parameter termed corrected GPD. Significant genetic variation in corrected GPD was observed for six cultivars grown over a range of environmental conditions (a total of 584 samples). Gene transcript profiles were determined in a subset of 161 samples of developing grain to identify transcripts contributing to GPD. Principal component analysis (PCA), analysis of variance (ANOVA) and means of scores regression (MSR) were used to identify individual principal components (PCs) correlating with GPD alone. Scores of the selected PCs, which were significantly related to GPD and protein content but not to the yield and significantly affected by cultivar, were identified as reflecting a multivariate pattern of gene expression related to genetic variation in GPD. Transcripts with consistent variation along the selected PCs were identified by an approach hereby called one-block means of scores regression (one-block MSR). © 2014 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
Open Reading Frame Phylogenetic Analysis on the Cloud
2013-01-01
Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships among members of Norovirus. PMID:23671843
Lindblad, Anne S; Manukyan, Zorayr; Purohit-Sheth, Tejashri; Gensler, Gary; Okwesili, Paul; Meeker-O'Connell, Ann; Ball, Leslie; Marler, John R
2014-04-01
Site monitoring and source document verification account for 15%-30% of clinical trial costs. An alternative is to streamline site monitoring to focus on correcting trial-specific risks identified by central data monitoring. This risk-based approach could preserve or even improve the quality of clinical trial data and human subject protection compared to site monitoring focused primarily on source document verification. To determine whether a central review by statisticians using data submitted to the Food and Drug Administration (FDA) by clinical trial sponsors can identify problem sites and trials that failed FDA site inspections. An independent Analysis Center (AC) analyzed data from four anonymous new drug applications (NDAs) where FDA had performed site inspections overseen by FDA's Office of Scientific Investigations (OSI). FDA team members in the OSI chose the four NDAs from among all NDAs with data in Study Data Tabulation Model (SDTM) format. Two of the NDAs had data that OSI had deemed unreliable in support of the application after FDA site inspections identified serious data integrity problems. The other two NDAs had clinical data that OSI deemed reliable after site inspections. At the outset, the AC knew only that the experimental design specified two NDAs with significant problems. FDA gave the AC no information about which NDAs had problems, how many sites were inspected, or how many were found to have problems until after the AC analysis was complete. The AC evaluated randomization balance, enrollment patterns, study visit scheduling, variability of reported data, and last digit reference. The AC classified sites as 'High Concern', 'Moderate Concern', 'Mild Concern', or 'No Concern'. The AC correctly identified the two NDAs with data deemed unreliable by OSI. 
In addition, central data analysis correctly identified 5 of 6 (83%) sites for which FDA recommended rejection of data and 13 of 15 sites (87%) for which any regulatory deviations were identified during inspection. Of the six sites for which OSI reviewed inspections and found no deviations, the central process flagged four at the lowest level of concern, one at a moderate level, and one was not flagged. Central data monitoring during the conduct of a trial while data checking was in progress was not evaluated. Systematic central monitoring of clinical trial data can identify problems at the same trials and sites identified during FDA site inspections. Central data monitoring in conjunction with an overall monitoring process that adapts to identify risks as a trial progresses has the potential to reduce the frequency of site visits while increasing data integrity and decreasing trial costs compared to processes that are dependent primarily on source documentation.
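One of the central checks mentioned above, last-digit reference, can be sketched as a digit-preference test (a hypothetical illustration; the Analysis Center's actual statistics and thresholds are not given in the source):

```python
from collections import Counter

def last_digit_chi2(values):
    # Chi-square statistic of the final-digit distribution against a
    # uniform expectation over digits 0-9 (a digit-preference check).
    digits = [abs(int(v)) % 10 for v in values]
    counts = Counter(digits)
    expected = len(digits) / 10
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

def flag_sites(site_values, threshold=16.92):
    # 16.92 is the chi-square 0.95 quantile with 9 degrees of freedom;
    # sites exceeding it show suspiciously strong last-digit preference.
    return [site for site, vals in site_values.items()
            if last_digit_chi2(vals) > threshold]
```

A site reporting only round numbers would be flagged, while a site whose reported values end uniformly in all ten digits would not; real central monitoring combines several such signals (enrollment patterns, visit scheduling, variability) before assigning a concern level.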
Scientific data reduction and analysis plan: PI services
NASA Technical Reports Server (NTRS)
Feldman, P. D.; Fastie, W. G.
1971-01-01
This plan comprises two parts. The first concerns the real-time data display to be provided by MSC during the mission. The prime goal is to assess the operation of the UVS and to identify any problem areas that could be corrected during the mission. It is desirable to identify any possible observations of unusual scientific interest in order to repeat these observations at a later point in the mission, or to modify the time line with respect to the operating modes of the UVS. The second part of the plan discusses the more extensive postflight analysis of the data in terms of the scientific objectives of this experiment.
Le, Laetitia Minh Mai; Reitter, Delphine; He, Sophie; Bonle, Franck Té; Launois, Amélie; Martinez, Diane; Prognon, Patrice; Caudron, Eric
2017-12-01
Handling cytotoxic drugs is associated with chemical contamination of workplace surfaces. The potential mutagenic, teratogenic and oncogenic properties of those drugs create a risk of occupational exposure for healthcare workers, from reception of starting materials to the preparation and administration of cytotoxic therapies. Failure Mode, Effects and Criticality Analysis (FMECA) was used as a proactive method to assess the risks involved in the chemotherapy compounding process. The FMECA was carried out by a multidisciplinary team from 2011 to 2016. Potential failure modes of the process were identified and ranked by Risk Priority Number (RPN) to prioritize corrective actions. Twenty-five potential failure modes were identified. Based on the RPN results, the corrective action plan was revised annually to reduce the risk of exposure and improve practices. Since 2011, 16 specific measures were implemented successively. In six years, a cumulative RPN reduction of 626 was observed, with a decrease from 912 to 286 (-69%), despite an increase in cytotoxic compounding activity of around 23.2%. In order to anticipate and prevent occupational exposure, FMECA is a valuable tool to identify, prioritize and eliminate potential failure modes for operators involved in the cytotoxic drug preparation process before the failures occur. Copyright © 2017 Elsevier B.V. All rights reserved.
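The RPN bookkeeping behind such a FMECA can be sketched as follows (the severity/occurrence/detectability scales and the example failure modes are assumptions for illustration; the study's actual scoring grid is not given in the abstract):

```python
def rpn(severity, occurrence, detectability):
    # Risk Priority Number: the usual FMECA product of the three scores.
    return severity * occurrence * detectability

def prioritise(failure_modes):
    # Rank (name, (severity, occurrence, detectability)) pairs by
    # descending RPN so corrective actions target the riskiest steps.
    return sorted(failure_modes, key=lambda fm: rpn(*fm[1]), reverse=True)

def total_rpn(failure_modes):
    # Cumulative RPN across the process, the quantity the study tracks
    # year over year (e.g. the reported drop from 912 to 286).
    return sum(rpn(*fm[1]) for fm in failure_modes)
```

Corrective actions lower a mode's occurrence or improve its detectability score, so re-scoring after each annual revision shrinks the cumulative total.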
Marcotte, Thomas D.; Deutsch, Reena; Michael, Benedict Daniel; Franklin, Donald; Cookson, Debra Rosario; Bharti, Ajay R.; Grant, Igor; Letendre, Scott L.
2013-01-01
Background Neurocognitive (NC) impairment (NCI) occurs commonly in people living with HIV. Despite substantial effort, no biomarkers have been sufficiently validated for diagnosis and prognosis of NCI in the clinic. The goal of this project was to identify diagnostic or prognostic biomarkers for NCI in a comprehensively characterized HIV cohort. Methods Multidisciplinary case review selected 98 HIV-infected individuals and categorized them into four NC groups using normative data: stably normal (SN), stably impaired (SI), worsening (Wo), or improving (Im). All subjects underwent comprehensive NC testing, phlebotomy, and lumbar puncture at two timepoints separated by a median of 6.2 months. Eight biomarkers were measured in CSF and blood by immunoassay. Results were analyzed using mixed model linear regression and staged recursive partitioning. Results At the first visit, subjects were mostly middle-aged (median 45) white (58%) men (84%) who had AIDS (70%). Of the 73% who took antiretroviral therapy (ART), 54% had HIV RNA levels below 50 c/mL in plasma. Mixed model linear regression identified that only MCP-1 in CSF was associated with neurocognitive change group. Recursive partitioning models aimed at diagnosis (i.e., correctly classifying neurocognitive status at the first visit) were complex and required most biomarkers to achieve misclassification limits. In contrast, prognostic models were more efficient. A combination of three biomarkers (sCD14, MCP-1, SDF-1α) correctly classified 82% of Wo and SN subjects, including 88% of SN subjects. A combination of two biomarkers (MCP-1, TNF-α) correctly classified 81% of Im and SI subjects, including 100% of SI subjects. Conclusions This analysis of well-characterized individuals identified concise panels of biomarkers associated with NC change. Across all analyses, the two most frequently identified biomarkers were sCD14 and MCP-1, indicators of monocyte/macrophage activation. 
While the panels differed depending on the outcome and on the degree of misclassification, nearly all stable patients were correctly classified. PMID:24101401
NASA Astrophysics Data System (ADS)
Shahriari Nia, Morteza; Wang, Daisy Zhe; Bohlman, Stephanie Ann; Gader, Paul; Graves, Sarah J.; Petrovic, Milenko
2015-01-01
Hyperspectral images can be used to identify savannah tree species at the landscape scale, which is a key step in measuring biomass and carbon, and tracking changes in species distributions, including invasive species, in these ecosystems. Before automated species mapping can be performed, image processing and atmospheric correction is often performed, which can potentially affect the performance of classification algorithms. We determine how three processing and correction techniques (atmospheric correction, Gaussian filters, and shade/green vegetation filters) affect the prediction accuracy of classification of tree species at pixel level from airborne visible/infrared imaging spectrometer imagery of longleaf pine savanna in Central Florida, United States. Species classification using fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) atmospheric correction outperformed ATCOR in the majority of cases. Green vegetation (normalized difference vegetation index) and shade (near-infrared) filters did not increase classification accuracy when applied to large and continuous patches of specific species. Finally, applying a Gaussian filter reduces interband noise and increases species classification accuracy. Using the optimal preprocessing steps, our classification accuracy of six species classes is about 75%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOE /NV
This Corrective Action Decision Document has been prepared for Corrective Action Unit 340, the NTS Pesticide Release Sites, in accordance with the Federal Facility Agreement and Consent Order of 1996 (FFACO, 1996). Corrective Action Unit 340 is located at the Nevada Test Site, Nevada, and is comprised of the following Corrective Action Sites: 23-21-01, Area 23 Quonset Hut 800 Pesticide Release Ditch; 23-18-03, Area 23 Skid Huts Pesticide Storage; and 15-18-02, Area 15 Quonset Hut 15-11 Pesticide Storage. The purpose of this Corrective Action Decision Document is to identify and provide a rationale for the selection of a recommended corrective action alternative for each Corrective Action Site. The scope of this Corrective Action Decision Document consists of the following tasks: Develop corrective action objectives; Identify corrective action alternative screening criteria; Develop corrective action alternatives; Perform detailed and comparative evaluations of the corrective action alternatives in relation to the corrective action objectives and screening criteria; and Recommend and justify a preferred corrective action alternative for each Corrective Action Site.
Identification of Particles in Parenteral Drug Raw Materials.
Lee, Kathryn; Lankers, Markus; Valet, Oliver
2018-04-18
Particles in drug products are undesirable and are therefore regulated. These particles can come from the very beginning of the manufacturing process, from the raw materials. To prevent particles, it is important to understand what they are and where they come from so that raw material quality, processing, and shipping can be improved. Thus, it is important to correctly identify particles seen in raw materials. Raw materials need to be of a certain quality with respect to physical and chemical composition, and need to have no contaminants in the form of particles that could contaminate the product or indicate the raw materials are not pure enough to make a good quality product. Particles are often noticed when handling raw materials because their color, size, or shape differs from the raw material itself. Particles may appear to the eye to be very different things than they actually are, so microscopic, chemical, and elemental analyses are required for proper identification. This paper shows how three different spectroscopy tools, used correctly and together, can identify extrinsic, intrinsic, and inherent particles. Sources can be humans and the environment (extrinsic), the process itself (intrinsic), or the formulation (inherent). Microscope versions of Raman spectroscopy, laser-induced breakdown spectroscopy (LIBS), and IR spectroscopy are excellent tools for identifying particles because they are fast and accurate techniques needing minimal sample preparation that can provide chemical composition as well as images that can be used for identification. The micro-analysis capabilities allow for easy analysis of different portions of samples, so multiple components can be identified and sample preparation can be reduced. Using just one of these techniques may not be sufficient to identify the source of contamination adequately.
The complementarity of the techniques allows various chemical, molecular, and elemental components to be identified and supports image-based analysis. Correct interpretation of the results from these techniques is also very important. Copyright © 2018, Parenteral Drug Association.
Processor register error correction management
Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.
2016-12-27
Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.
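The patent's table-driven duplication scheme can be modelled in miniature (a hypothetical software simulation; the register names, the shadow-copy naming convention, and the assumption that the duplicate holds the good value are all illustrative, not the claimed hardware design):

```python
def build_ec_table(sensitive_registers):
    # Error-correction table: each compiler-identified sensitive logical
    # register maps to the name of its duplicate (shadow) slot.
    return {r: r + "_dup" for r in sensitive_registers}

def check_and_repair(reg_file, ec_table):
    # Compare each protected register with its duplicate; on mismatch,
    # restore from the duplicate (assumed good here, for illustration)
    # and report which registers were repaired.
    repaired = []
    for reg, dup in ec_table.items():
        if reg_file[reg] != reg_file[dup]:
            reg_file[reg] = reg_file[dup]
            repaired.append(reg)
    return repaired
```

With a single duplicate a mismatch only detects corruption; deciding which copy is authoritative would need parity or a third copy in a real design.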
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-28
... Identifier: CMS-10003] Public Information Collection Requirements Submitted to the Office of Management and Budget (OMB); Correction AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Correction of notice. SUMMARY: This document corrects a technical error in the notice [Document Identifier: CMS...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... Identifier: CMS-10379] Public Information Collection Requirements Submitted to the Office of Management and Budget (OMB); Correction AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS. ACTION: Correction of notice. SUMMARY: This document corrects the information provided for [Document Identifier: CMS...
Growth Modulation in Achondroplasia.
McClure, Philip K; Kilinc, Eray; Birch, John G
2017-09-01
Achondroplasia is the most common skeletal dysplasia, with a rate of nearly 1/10,000. The development of lower extremity deformity is well documented, and various modes of correction have been reported. There are no reports on the use of growth modulation to correct angular deformity in achondroplasia. Medical records from 1985 to 2015 were reviewed for the diagnosis of achondroplasia and growth modulation procedures. Patients who had been treated for angular deformity of the legs by growth modulation were identified. A detailed analysis of their medical record and preoperative and final lower extremity radiographs was completed. Four patients underwent growth modulation procedures, all to correct existing varus deformity of the legs. Three of the 4 patients underwent bilateral distal femoral and proximal tibial growth modulation. The remaining patient underwent tibial correction only. Two of the 4 patients had a combined proximal fibular epiphysiodesis. All limbs had some improvement of alignment; however, 1 patient went on to bilateral osteotomies. Only 1 limb corrected to a neutral axis with growth modulation alone at last follow-up; in that limb, initial implantation had been done before 5 years of age. Growth modulation is an effective means of deformity correction in the setting of achondroplasia. However, implantation may need to be done earlier than would be typical for patients without achondroplasia. Osteotomy may still be required after growth modulation for incomplete correction.
2013-01-01
Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both web and standalone application and the instruction for installation can be downloaded from our web site. PMID:23497726
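The core comparison SVAw reports, regressing expression on the variable of interest before and after adjusting for a surrogate variable, can be sketched with ordinary least squares (a one-surrogate toy version; real SVA estimates the surrogates themselves, which this sketch takes as given):

```python
def slope(x, y):
    # Ordinary least-squares slope of y on x.
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def residualise(y, s):
    # Remove the component of y explained by surrogate variable s,
    # leaving residual expression for the downstream regression.
    b = slope(s, y)
    ms = sum(s) / len(s)
    return [v - b * (u - ms) for v, u in zip(y, s)]
```

Here `slope(x, y)` gives the unadjusted effect of the primary variable, and `slope(x, residualise(y, s))` gives the effect after the surrogate's contribution has been removed, mirroring the before/after table SVAw produces.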
Global Surveillance of Emerging Influenza Virus Genotypes by Mass Spectrometry
Sampath, Rangarajan; Russell, Kevin L.; Massire, Christian; Eshoo, Mark W.; Harpin, Vanessa; Blyn, Lawrence B.; Melton, Rachael; Ivy, Cristina; Pennella, Thuy; Li, Feng; Levene, Harold; Hall, Thomas A.; Libby, Brian; Fan, Nancy; Walcott, Demetrius J.; Ranken, Raymond; Pear, Michael; Schink, Amy; Gutierrez, Jose; Drader, Jared; Moore, David; Metzgar, David; Addington, Lynda; Rothman, Richard; Gaydos, Charlotte A.; Yang, Samuel; St. George, Kirsten; Fuschino, Meghan E.; Dean, Amy B.; Stallknecht, David E.; Goekjian, Ginger; Yingst, Samuel; Monteville, Marshall; Saad, Magdi D.; Whitehouse, Chris A.; Baldwin, Carson; Rudnick, Karl H.; Hofstadler, Steven A.; Lemon, Stanley M.; Ecker, David J.
2007-01-01
Background Effective influenza surveillance requires new methods capable of rapid and inexpensive genomic analysis of evolving viral species for pandemic preparedness, to understand the evolution of circulating viral species, and for vaccine strain selection. We have developed one such approach based on previously described broad-range reverse transcription PCR/electrospray ionization mass spectrometry (RT-PCR/ESI-MS) technology. Methods and Principal Findings Analysis of base compositions of RT-PCR amplicons from influenza core gene segments (PB1, PB2, PA, M, NS, NP) is used to provide sub-species identification and infer influenza virus H and N subtypes. Using this approach, we detected and correctly identified 92 mammalian and avian influenza isolates, representing 30 different H and N types, including 29 avian H5N1 isolates. Further, direct analysis of 656 human clinical respiratory specimens collected over a seven-year period (1999–2006) showed correct identification of the viral species and subtypes with >97% sensitivity and specificity. Base-composition-derived clusters inferred from this analysis showed 100% concordance to previously established clades. Ongoing surveillance of samples from the recent influenza virus seasons (2005–2006) showed evidence for the emergence and establishment of new genotypes of circulating H3N2 strains worldwide. Mixed viral quasispecies were found in approximately 1% of these recent samples, providing a view into viral evolution. Conclusion/Significance Thus, rapid RT-PCR/ESI-MS analysis can be used to simultaneously identify all species of influenza viruses with clade-level resolution, identify mixed viral populations, and monitor global spread and emergence of novel viral genotypes. This high-throughput method promises to become an integral component of influenza surveillance. PMID:17534439
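The "base composition" signature underlying the RT-PCR/ESI-MS approach is, at its core, the count vector of the four nucleotides in an amplicon, which the mass spectrometer infers from the measured masses of the two strands. A minimal illustrative sketch (the sequence and helper name are hypothetical, not from the paper):

```python
# Illustrative sketch: the base-composition signature of an amplicon is
# simply its (A, G, C, T) count vector; two amplicons with different
# compositions are distinguishable without sequencing them.
from collections import Counter

def base_composition(seq):
    """Return the (A, G, C, T) count vector of a nucleotide sequence."""
    counts = Counter(seq.upper())
    return tuple(counts.get(base, 0) for base in "AGCT")

print(base_composition("AAGCTT"))  # (2, 1, 1, 2)
```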
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO22000 analysis with HACCP is carried out for pastry processing and packaging. The main emphasis, however, was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were identified as the processes with the highest RPN values (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect, or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative.
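For readers unfamiliar with FMEA scoring, the Risk Priority Number is conventionally the product of severity, occurrence, and detectability ratings (each typically scored 1-10). The ratings below are hypothetical, chosen only to reproduce the RPN values quoted in the abstract:

```python
# FMEA Risk Priority Number sketch with hypothetical ratings.
def rpn(severity, occurrence, detectability):
    """Risk Priority Number = S x O x D."""
    return severity * occurrence * detectability

UPPER_LIMIT = 130  # acceptance limit used in the abstract
hazards = {
    "raw material storage": (9, 5, 5),   # -> RPN 225
    "final product storage": (9, 5, 5),  # -> RPN 225
    "freezing": (6, 4, 6),               # -> RPN 144
}
for name, (s, o, d) in hazards.items():
    score = rpn(s, o, d)
    status = "corrective action required" if score > UPPER_LIMIT else "acceptable"
    print(f"{name}: RPN={score} ({status})")
```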
Carleton, W. Christopher; Campbell, David
2018-01-01
Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating—the most common chronometric technique in archaeological and palaeoenvironmental research—creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20–30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence. PMID:29351329
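The forecasting recursion at the heart of the PEWMA method can be hard to picture from prose alone. Below is a deliberately simplified sketch with a fixed smoothing weight; the actual PEWMA model additionally adapts the weight at each step from the Poisson likelihood of the observed count, so this shows only the recursive skeleton:

```python
# Simplified fixed-weight EWMA forecast for a count time-series.
# (PEWMA adapts `weight` per step from the Poisson likelihood; that
# adaptation is omitted here for clarity.)
def ewma_forecast(counts, weight=0.8):
    """Return one-step-ahead forecasts; forecast t uses counts before t."""
    forecasts = []
    level = float(counts[0])  # initialise the level at the first observation
    for y in counts:
        forecasts.append(level)                      # forecast before seeing y
        level = weight * level + (1.0 - weight) * y  # then update the level
    return forecasts

print(ewma_forecast([5, 7, 6, 9]))
```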
Carleton, W Christopher; Campbell, David; Collard, Mark
2018-01-01
Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating, the most common chronometric technique in archaeological and palaeoenvironmental research, creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20-30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence.
Trobisch, Per D; Samdani, Amer F; Betz, Randal R; Bastrom, Tracey; Pahys, Joshua M; Cahill, Patrick J
2013-06-01
Iatrogenic flattening of lumbar lordosis in patients with adolescent idiopathic scoliosis (AIS) was a major downside of first-generation instrumentation. Current instrumentation systems allow a three-dimensional scoliosis correction, but flattening of lumbar lordosis remains a significant problem that is associated with decreased health-related quality of life. This study sought to identify risk factors for loss of lumbar lordosis in patients who had surgical correction of AIS with the use of segmental instrumentation. Patients were included if they had surgical correction for Lenke type 1 or 2 AIS with segmental pedicle screw instrumentation and if they had a minimum follow-up of 24 months. Two groups were created based on the average loss of lumbar lordosis. The two groups were then compared, and multivariate analysis was performed to identify parameters that correlated with loss of lumbar lordosis. Four hundred and seventeen patients were analyzed for this study. The change in lumbar lordosis at 24 months follow-up was an increase of 10° for group 1 and a decrease of 15° for group 2. Risk factors for loss of lumbar lordosis included a high preoperative lumbar lordosis, surgical decrease of thoracic kyphosis, and the particular operating surgeon. The lowest instrumented vertebra and the spinopelvic parameters were among the many parameters that did not seem to influence loss of lumbar lordosis. This study identified important risk factors for decrease of lumbar lordosis in patients who had surgical treatment for AIS with segmental pedicle screw instrumentation, including a high preoperative lumbar lordosis, surgical decrease of thoracic kyphosis, and factors attributable to a particular operating surgeon that were not quantified in this study.
Color correction strategies in optical design
NASA Astrophysics Data System (ADS)
Pfisterer, Richard N.; Vorndran, Shelby D.
2014-12-01
An overview of color correction strategies is presented. Starting with basic first-order aberration theory, we identify known color-corrected solutions for doublets and triplets. Reviewing the modern approaches of Robb-Mercado, Rayces-Aguilar, and C. de Albuquerque et al., we find that they confirm the existence of glass combinations for doublets and triplets that yield the color-corrected solutions we already know exist. Finally, we explore the use of the y, ȳ diagram in conjunction with aberration theory to identify the solution space of glasses capable of leading to color-corrected solutions in arbitrary optical systems.
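The first-order color correction referred to here can be made concrete with the classic thin-lens achromat condition: for two thin elements in contact, the element powers must satisfy φ1 + φ2 = φ and φ1/V1 + φ2/V2 = 0, where V1 and V2 are the Abbe numbers of the two glasses. A short sketch (the glass values are illustrative crown/flint numbers, not from the paper):

```python
# Thin-lens achromatic doublet: solve phi1 + phi2 = phi_total together
# with phi1/V1 + phi2/V2 = 0 for the two element powers.
def achromat_powers(phi_total, v1, v2):
    if v1 == v2:
        raise ValueError("the two glasses must have different Abbe numbers")
    phi1 = phi_total * v1 / (v1 - v2)
    phi2 = -phi_total * v2 / (v1 - v2)
    return phi1, phi2

p1, p2 = achromat_powers(10.0, 64.2, 36.4)  # 10 D doublet, crown + flint
assert abs(p1 + p2 - 10.0) < 1e-9           # total power preserved
assert abs(p1 / 64.2 + p2 / 36.4) < 1e-9    # achromatic condition satisfied
```

As the result shows, the crown element carries positive power and the flint element negative power, which is the familiar cemented-achromat layout.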
Hartmann, Linda; Neveling, Kornelia; Borkens, Stephanie; Schneider, Hildegard; Freund, Marcel; Grassman, Elke; Theiss, Stephan; Wawer, Angela; Burdach, Stefan; Auerbach, Arleen D.; Schindler, Detlev; Hanenberg, Helmut; Schaal, Heiner
2010-01-01
The U1 small nuclear RNA (U1 snRNA) as a component of the major U2-dependent spliceosome recognizes 5′ splice sites (5′ss) containing GT as the canonical dinucleotide in the intronic positions +1 and +2. The c.165+1G>T germline mutation in the 5′ss of exon 2 of the Fanconi anemia C (FANCC) gene commonly predicted to prevent correct splicing was identified in nine FA patients from three pedigrees. RT-PCR analysis of the endogenous FANCC mRNA splicing pattern of patient-derived fibroblasts revealed aberrant mRNA processing, but surprisingly also correct splicing at the TT dinucleotide, albeit with lower efficiency. This consequently resulted in low levels of correctly spliced transcript and minute levels of normal posttranslationally processed FANCD2 protein, indicating that this naturally occurring TT splicing might contribute to the milder clinical manifestations of the disease in these patients. Functional analysis of this FANCC 5′ss within splicing reporters revealed that both the noncanonical TT dinucleotide and the genomic context of FANCC were required for the residual correct splicing at this mutant 5′ss. Finally, use of lentiviral vectors as a delivery system to introduce expression cassettes for TT-adapted U1 snRNAs into primary FANCC patient fibroblasts allowed the correction of the DNA-damage-induced G2 cell-cycle arrest in these cells, thus representing an alternative transcript-targeting approach for genetic therapy of inherited splice-site mutations. PMID:20869034
Lewis, Noah D H; Keshen, Sam G N; Lenke, Lawrence G; Zywiel, Michael G; Skaggs, David L; Dear, Taylor E; Strantzas, Samuel; Lewis, Stephen J
2015-08-01
A retrospective analysis. The purpose of this study was to determine whether the deformity angular ratio (DAR) can reliably assess the neurological risks of patients undergoing deformity correction. Identifying high-risk patients and procedures can help ensure that appropriate measures are taken to minimize neurological complications during spinal deformity corrections. Subjectively, surgeons look at radiographs and evaluate the riskiness of the procedure. However, 2 curves of similar magnitude and location can have significantly different risks of neurological deficit during surgery. Whether the curve spans many levels or just a few can significantly influence surgical strategies. Lenke et al. have proposed the DAR, which is a measure of curve magnitude per level of deformity. The data from 35 pediatric spinal deformity correction procedures with thoracic 3-column osteotomies were reviewed. Measurements from preoperative radiographs were used to calculate the DAR. Binary logistic regression was used to model the relationship between DARs (independent variables) and the presence or absence of an intraoperative alert (dependent variable). In patients undergoing 3-column osteotomies, sagittal curve magnitude and total curve magnitude were associated with increased incidence of transcranial motor evoked potential changes. A total DAR greater than 45° per level and a sagittal DAR greater than 22° per level were associated with a 75% incidence of a motor evoked potential alert, with the incidence increasing to 90% with a sagittal DAR of 28° per level. In patients undergoing 3-column osteotomies for severe spinal deformities, the DAR was predictive of patients developing intraoperative motor evoked potential alerts. Identifying accurate radiographical, patient, and procedural risk factors in the correction of severe deformities can help prepare the surgical team to improve safety and outcomes when carrying out complex spinal corrections. Level of Evidence: 3.
Mouratidis, Athanasios; Lens, Willy; Vansteenkiste, Maarten
2010-10-01
We relied on self-determination theory (SDT; Deci & Ryan, 2000) to investigate to what extent autonomy-supportive corrective feedback (i.e., feedback that coaches communicate to their athletes after poor performance or mistakes) is associated with athletes' optimal motivation and well-being. To test this hypothesis, we conducted a cross-sectional study with 337 (67.1% male) Greek adolescent athletes (age M = 15.59, SD = 2.37) from various sports. In line with SDT, we found through path analysis that an autonomy-supportive versus controlling communication style was positively related to future intentions to persist and to well-being, and negatively related to ill-being. These relations were partially mediated by the perceived legitimacy of the corrective feedback (i.e., the degree of acceptance of corrective feedback) and, in turn, by intrinsic motivation, identified regulation, and external regulation for doing sports. Results indicate that autonomy-supportive feedback can still be motivating even when it conveys a message of low competence.
Two-compartment modeling of tissue microcirculation revisited.
Brix, Gunnar; Salehi Ravesh, Mona; Griebel, Jürgen
2017-05-01
Conventional two-compartment modeling of tissue microcirculation is used for tracer kinetic analysis of dynamic contrast-enhanced (DCE) computed tomography or magnetic resonance imaging studies although it is well-known that the underlying assumption of an instantaneous mixing of the administered contrast agent (CA) in capillaries is far from being realistic. It was thus the aim of the present study to provide theoretical and computational evidence in favor of a conceptually alternative modeling approach that makes it possible to characterize the bias inherent to compartment modeling and, moreover, to approximately correct for it. Starting from a two-region distributed-parameter model that accounts for spatial gradients in CA concentrations within blood-tissue exchange units, a modified lumped two-compartment exchange model was derived. It has the same analytical structure as the conventional two-compartment model, but indicates that the apparent blood flow identifiable from measured DCE data is substantially overestimated, whereas the three other model parameters (i.e., the permeability-surface area product as well as the volume fractions of the plasma and interstitial distribution space) are unbiased. Furthermore, a simple formula was derived to approximately compute a bias-corrected flow from the estimates of the apparent flow and permeability-surface area product obtained by model fitting. To evaluate the accuracy of the proposed modeling and bias correction method, representative noise-free DCE curves were analyzed. They were simulated for 36 microcirculation and four input scenarios by an axially distributed reference model. As analytically proven, the considered two-compartment exchange model is structurally identifiable from tissue residue data. The apparent flow values estimated for the 144 simulated tissue/input scenarios were considerably biased. After bias-correction, the deviations between estimated and actual parameter values were (11.2 ± 6.4) % (vs. 
(105 ± 21) % without correction) for the flow, (3.6 ± 6.1) % for the permeability-surface area product, (5.8 ± 4.9) % for the vascular volume, and (2.5 ± 4.1) % for the interstitial volume; individual deviations of more than 20% were the exception and marginal. Increasing the duration of CA administration had a statistically significant but opposite effect on the accuracy of the estimated flow (which declined) and the intravascular volume (which improved). Physiologically well-defined tissue parameters are structurally identifiable and accurately estimable from DCE data by the conceptually modified two-compartment model in combination with the bias correction. The accuracy of the bias-corrected flow is nearly comparable to that of the three other (theoretically unbiased) model parameters. As compared to conventional two-compartment modeling, this feature constitutes a major advantage for tracer kinetic analysis of both preclinical and clinical DCE imaging studies. © 2017 American Association of Physicists in Medicine.
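For context, the conventional two-compartment exchange structure that the abstract builds on can be written as two coupled mass-balance equations. This is the standard textbook form, not necessarily the authors' modified parameterization:

```latex
% C_a: arterial input; C_p, C_i: plasma and interstitial concentrations;
% F_p: plasma flow; PS: permeability-surface area product;
% v_p, v_i: fractional distribution volumes.
\begin{align}
  v_p \frac{\mathrm{d}C_p}{\mathrm{d}t}
    &= F_p \bigl( C_a(t) - C_p(t) \bigr) - PS \bigl( C_p(t) - C_i(t) \bigr), \\
  v_i \frac{\mathrm{d}C_i}{\mathrm{d}t}
    &= PS \bigl( C_p(t) - C_i(t) \bigr).
\end{align}
```

The instantaneous-mixing assumption criticized in the abstract is the treatment of the plasma pool as a single well-mixed compartment with concentration C_p(t), rather than allowing axial concentration gradients along the capillary.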
D'Angelo, Heather; Fleischhacker, Sheila; Rose, Shyanika W; Ribisl, Kurt M
2014-07-01
Identifying tobacco retail outlets for U.S. FDA compliance checks or calculating tobacco outlet density is difficult in the 13 States without tobacco retail licensing or where licensing lists are unavailable for research. This study uses primary data collection to identify tobacco outlets in three counties in a non-licensing state and validate two commercial secondary data sources. We calculated sensitivity and positive predictive values (PPV) to examine the evidence of validity for two secondary data sources, and conducted a geospatial analysis to determine correct allocation to census tract. ReferenceUSA had almost perfect sensitivity (0.82) while Dun & Bradstreet (D&B) had substantial sensitivity (0.69) for identifying tobacco outlets; combined, sensitivity improved to 0.89. D&B identified fewer "false positives" with a PPV of 0.82 compared to 0.71 for ReferenceUSA. More than 90% of the outlets identified by ReferenceUSA were geocoded to the correct census tract. Combining two commercial data sources resulted in enumeration of nearly 90% of tobacco outlets in a three county area. Commercial databases appear to provide a reasonably accurate way to identify tobacco outlets for enforcement operations and density estimation. Copyright © 2014 Elsevier Ltd. All rights reserved.
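The two validation metrics used here reduce to simple ratios over field-validation counts. The counts below are hypothetical, chosen only to reproduce the ReferenceUSA figures (sensitivity 0.82, PPV 0.71) quoted in the abstract:

```python
# Sensitivity and positive predictive value for validating a commercial
# outlet list against ground-truth field data (hypothetical counts).
def sensitivity(true_pos, false_neg):
    """Share of ground-truth outlets that the commercial list finds."""
    return true_pos / (true_pos + false_neg)

def ppv(true_pos, false_pos):
    """Share of listed outlets that actually exist."""
    return true_pos / (true_pos + false_pos)

tp, fn, fp = 82, 18, 33
print(round(sensitivity(tp, fn), 2))  # 0.82
print(round(ppv(tp, fp), 2))          # 0.71
```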
Cortico-muscular coherence on artifact corrected EEG-EMG data recorded with a MRI scanner.
Muthuraman, M; Galka, A; Hong, V N; Heute, U; Deuschl, G; Raethjen, J
2013-01-01
Simultaneous recording of electroencephalogram (EEG) and electromyogram (EMG) with magnetic resonance imaging (MRI) provides great potential for studying human brain activity with high temporal and spatial resolution. However, the MRI environment contaminates the recorded signals with artifacts, and correcting these artifacts is a prerequisite for further spectral analysis. Coherence analysis can reveal the cortical representation of a peripheral muscle signal in particular motor tasks, e.g. finger movements. Artifact correction was performed with two different algorithms: the Brain Vision Analyzer (BVA) and the MATLAB FMRIB plug-in for EEGLAB. The Welch periodogram method was used for estimating the cortico-muscular coherence. Our analysis revealed coherence at a frequency of 5 Hz on the contralateral side of the brain. The entropy of the calculated coherence was estimated to obtain the distribution of coherence over the scalp. The aim of the paper is to identify the optimal algorithm for rectifying the MR artifacts and, as a first step, to use both EEG and EMG signals in conjunction with MRI for further studies.
Classification and prediction of pilot weather encounters: A discriminant function analysis.
O'Hare, David; Hunter, David R; Martinussen, Monica; Wiggins, Mark
2011-05-01
Flight into adverse weather continues to be a significant hazard for General Aviation (GA) pilots. Weather-related crashes have a significantly higher fatality rate than other GA crashes. Previous research has identified lack of situational awareness, risk perception, and risk tolerance as possible explanations for why pilots would continue into adverse weather. However, very little is known about the nature of these encounters or the differences between pilots who avoid adverse weather and those who do not. Visitors to a web site described an experience with adverse weather and completed a range of measures of personal characteristics. The resulting data from 364 pilots were carefully screened and subjected to a discriminant function analysis. Two significant functions were found. The first, accounting for 69% of the variance, reflected measures of risk awareness and pilot judgment, while the second differentiated pilots in terms of their experience levels. The variables measured in this study enabled us to correctly discriminate between the three groups of pilots considerably better (53% correct classifications) than would have been possible by chance (33% correct classifications). The implications of these findings for targeting safety interventions are discussed.
Raman Spectroscopy an Option for the Early Detection of Citrus Huanglongbing.
Pérez, Moisés Roberto Vallejo; Mendoza, María Guadalupe Galindo; Elías, Miguel Ghebre Ramírez; González, Francisco Javier; Contreras, Hugo Ricardo Navarro; Servín, Carlos Contreras
2016-05-01
This research describes the application of portable field Raman spectroscopy, combined with statistical analysis of the resulting spectra employing principal component analysis (PCA) and linear discriminant analysis (LDA), and determines that this method provides a high degree of reliability in the early detection of Huanglongbing (HLB) in Sweet Orange, a disease caused by the bacterium Candidatus Liberibacter asiaticus. Symptomatic and asymptomatic plant samples of Sweet Orange (Citrus sinensis), Persian Lime (C. latifolia), and Mexican Lime (C. aurantifolia) trees were collected from several municipalities, three in Colima State and three in Jalisco State (where HLB is present). In addition, Sweet Orange samples were taken from two other Mexican municipalities, one in San Luis Potosí and the other in Veracruz (where HLB is absent). All samples were analyzed by real-time PCR to determine their phytosanitary condition, and their spectral signatures were obtained with an ID-Raman mini. Spectral anomalies in HLB-positive orange trees were identified in bands related to carbohydrates (905 cm(-1), 1043 cm(-1), 1127 cm(-1), 1208 cm(-1), 1370 cm(-1), 1272 cm(-1), 1340 cm(-1), and 1260-1280 cm(-1)), amino acids and proteins (815 cm(-1), 830 cm(-1), 852 cm(-1), 918 cm(-1), 926 cm(-1), 970 cm(-1), 1002 cm(-1), 1053 cm(-1), and 1446 cm(-1)), and lipids (1734 cm(-1), 1736 cm(-1), 1738 cm(-1), 1745 cm(-1), and 1746 cm(-1)). Moreover, PCA-LDA showed a sensitivity of 86.9% (the percentage of positives that are correctly identified), a specificity of 91.4% (the percentage of negatives that are correctly identified), and a precision of 89.2% (the proportion of all tests that are correct) in discriminating between HLB-positive orange plants and healthy plants. The Raman spectroscopy technique permitted rapid diagnoses, was low-cost, simple, and practical to administer, and produced immediate results.
These are essential features for phytosanitary epidemiological surveillance activities that may conduct a targeted selection of highly suspicious trees to undergo molecular DNA analysis. © The Author(s) 2016.
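The three figures of merit reported for the PCA-LDA classifier are simple ratios over a confusion matrix (note that the abstract's "precision", the proportion of all tests that are correct, is what is usually called overall accuracy). The confusion-matrix counts below are hypothetical, chosen to roughly match the reported percentages:

```python
# Classifier figures of merit from a hypothetical confusion matrix.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

def overall_accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

tp, fn = 86, 13  # HLB-positive trees: correctly identified vs missed
tn, fp = 85, 8   # healthy trees: correctly identified vs falsely flagged
print(round(100 * sensitivity(tp, fn), 1))   # 86.9
print(round(100 * specificity(tn, fp), 1))   # 91.4
print(round(100 * overall_accuracy(tp, tn, fp, fn), 1))
```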
1992-04-01
contractor’s existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the... either from test or from analysis of field data. The procedures of MIL-STD-756B assume that the reliability of a... to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more
[Quality assurance of hospital medical records as a risk management tool].
Terranova, Giuseppina; Cortesi, Elisabetta; Briani, Silvia; Giannini, Raffaella
2006-01-01
A retrospective analysis of hospital medical records was performed jointly by the Medicolegal department of the Pistoia Local Health Unit N. 3 and by the management of the SS. Cosma and Damiano di Pescia Hospital. Evaluation was based on ANDEM criteria, JCAHO standards, and the 1992 discharge abstract guidelines of the Italian Health Ministry. In the first phase of the study, data were collected and processed for each hospital ward and then discussed with clinicians and audited. After auditing, appropriate actions were agreed upon for correcting identified problems. Approximately one year later a second smaller sample of medical records was evaluated and a higher compliance rate with the established corrective actions was found in all wards for all data categories. In this study the evaluation of medical records can be considered in the wider context of risk management, a multidisciplinary process directed towards identifying and monitoring risk through the use of appropriate quality indicators.
ERIC Educational Resources Information Center
MacMillan, Peter D.
2000-01-01
Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…
Psychological and Pedagogical Conditions for the Prevention of Deviant Behavior among Adolescents
ERIC Educational Resources Information Center
Vist, Natalya V.
2016-01-01
This article focuses on such a highly relevant subject as the prevention and correction of deviant behavior in the adolescent environment. The study revealed the main vectors for the development of the modern science of deviant behavior, identified the main causes of deviations and carried out a comparative analysis of the work on the prevention…
USDA-ARS?s Scientific Manuscript database
To evaluate the percentage of body fat (%BF)-BMI relationship, identify %BF levels corresponding to adult BMI cut points, and examine %BF-BMI agreement in a diverse Hispanic/Latino population. %BF by bioelectrical impedance analysis was corrected against %BF by 18O dilution in 434 participants of th...
ERIC Educational Resources Information Center
Girod, Gerald R.
An experiment was performed to determine the efficiency of simulation teaching techniques in training elementary education teachers to identify and correct classroom management problems. The two presentation modes compared were film and audiotape. Twelve hypotheses were tested via analysis of variance to determine the relative efficiency of these…
A Revised Earthquake Catalogue for South Iceland
NASA Astrophysics Data System (ADS)
Panzera, Francesco; Zechar, J. Douglas; Vogfjörd, Kristín S.; Eberhard, David A. J.
2016-01-01
In 1991, a new seismic monitoring network named SIL was started in Iceland with a digital seismic system and automatic operation. The system is equipped with software that reports the automatic location and magnitude of earthquakes, usually within 1-2 min of their occurrence. Normally, automatic locations are manually checked and re-estimated with corrected phase picks, but locations are subject to random errors and systematic biases. In this article, we consider the quality of the catalogue and produce a revised catalogue for South Iceland, the area with the highest seismic risk in Iceland. We explore the effects of filtering events using some common recommendations based on network geometry and station spacing and, as an alternative, filtering based on a multivariate analysis that identifies outliers in the hypocentre error distribution. We identify and remove quarry blasts, and we re-estimate the magnitude of many events. This revised catalogue, which we consider to be filtered, cleaned, and corrected, should be valuable for building future seismicity models and for assessing seismic hazard and risk. We present a comparative seismicity analysis using the original and revised catalogues: we report characteristics of South Iceland seismicity in terms of b value and magnitude of completeness. Our work demonstrates the importance of carefully checking an earthquake catalogue before proceeding with seismicity analysis.
Authenticity assessment of banknotes using portable near infrared spectrometer and chemometrics.
da Silva Oliveira, Vanessa; Honorato, Ricardo Saldanha; Honorato, Fernanda Araújo; Pereira, Claudete Fernandes
2018-05-01
Spectra recorded using a portable near infrared (NIR) spectrometer, together with Soft Independent Modeling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) models associated with the Successive Projections Algorithm (SPA), were applied to identify counterfeit and authentic Brazilian Real (R$20, R$50 and R$100) banknotes, enabling simple field analysis. NIR spectra (950-1650 nm) were recorded from seven different areas of the banknotes (two with fluorescent ink, one over the watermark, three printed by the intaglio process, and one over the serial numbers, printed by typography). SIMCA and SPA-LDA models were built using first-derivative preprocessed spectral data from one of the intaglio areas. With the SIMCA models, all authentic (300) banknotes were correctly classified and the counterfeits (227) were not classified. For the two-class SPA-LDA models (authentic and counterfeit currencies), all the test samples were correctly classified into their respective class. The number of variables selected by SPA varied from two to nineteen for the R$20, R$50 and R$100 currencies. These results show that the use of a portable near-infrared spectrometer with SIMCA or SPA-LDA models can be an effective, fast, and non-destructive way to verify the authenticity of banknotes while permitting field analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
Crowdsourcing Participatory Evaluation of Medical Pictograms Using Amazon Mechanical Turk
Willis, Matt; Sun, Peiyuan; Wang, Jun
2013-01-01
Background Consumer and patient participation proved to be an effective approach for medical pictogram design, but it can be costly and time-consuming. We proposed and evaluated an inexpensive approach that crowdsourced the pictogram evaluation task to Amazon Mechanical Turk (MTurk) workers, who are usually referred to as the “turkers”. Objective To answer two research questions: (1) Is the turkers’ collective effort effective for identifying design problems in medical pictograms? and (2) Do the turkers’ demographic characteristics affect their performance in medical pictogram comprehension? Methods We designed a Web-based survey (open-ended tests) to ask 100 US turkers to type in their guesses of the meaning of 20 US pharmacopeial pictograms. Two judges independently coded the turkers’ guesses into four categories: correct, partially correct, wrong, and completely wrong. The comprehensibility of a pictogram was measured by the percentage of correct guesses, with each partially correct guess counted as 0.5 correct. We then conducted a content analysis on the turkers’ interpretations to identify misunderstandings and assess whether the misunderstandings were common. We also conducted a statistical analysis to examine the relationship between turkers’ demographic characteristics and their pictogram comprehension performance. Results The survey was completed within 3 days of our posting the task to the MTurk, and the collected data are publicly available in the multimedia appendix for download. The comprehensibility for the 20 tested pictograms ranged from 45% to 98%, with an average of 72.5%. The comprehensibility scores of 10 pictograms were strongly correlated to the scores of the same pictograms reported in another study that used oral response–based open-ended testing with local people. The turkers’ misinterpretations shared common errors that exposed design problems in the pictograms. 
Participant performance was positively correlated with their educational level. Conclusions The results confirmed that crowdsourcing can be used as an effective and inexpensive approach for participatory evaluation of medical pictograms. Through Web-based open-ended testing, the crowd can effectively identify problems in pictogram designs. The results also confirmed that education has a significant effect on the comprehension of medical pictograms. Since low-literate people are underrepresented in the turker population, further investigation is needed to examine to what extent turkers’ misunderstandings overlap with those elicited from low-literate people. PMID:23732572
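The scoring rule above (percent correct, with each partially correct guess counted as 0.5) is simple to reproduce. A minimal sketch in Python, with a hypothetical coding of 100 guesses for one pictogram (the category labels match the study's four codes, but the counts are illustrative, not the study's data):

```python
from collections import Counter

def comprehensibility(codes):
    """Percentage of correct guesses for one pictogram, with each
    partially correct guess counted as 0.5 correct (as in the study)."""
    counts = Counter(codes)
    score = counts["correct"] + 0.5 * counts["partially correct"]
    return 100.0 * score / len(codes)

# Hypothetical coding of 100 turker guesses for one pictogram:
codes = (["correct"] * 60 + ["partially correct"] * 25
         + ["wrong"] * 10 + ["completely wrong"] * 5)
print(comprehensibility(codes))  # 72.5
```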
What is the evidence for retrieval problems in the elderly?
White, N; Cunningham, W R
1982-01-01
To determine whether older adults experience particular problems with retrieval, groups of young and elderly adults were given free recall and recognition tests of supraspan lists of unrelated words. Analysis of number of words correctly recalled and recognized yielded a significant age by retention test interaction: greater age differences were observed for recall than for recognition. In a second analysis of words recalled and recognized, corrected for guessing, the interaction disappeared. It was concluded that previous interpretations that age by retention test interactions are indicative of retrieval problems of the elderly may have been confounded by methodological problems. Furthermore, it was suggested that researchers in aging and memory need to be explicit in identifying their underlying models of error processes when analyzing recognition scores: different error models may lead to different results and interpretations.
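The abstract does not specify which error model the authors used for the guessing correction; one common choice for recognition data is the two-high-threshold correction, (hits - false alarms) / (1 - false-alarm rate). A hedged sketch with hypothetical rates shows how two groups with equal raw hit rates can differ once guessing is discounted:

```python
def corrected_recognition(hit_rate, false_alarm_rate):
    """Two-high-threshold correction for guessing: the estimated probability
    of genuine recognition, discounting lucky 'old' responses. This is one
    common model, not necessarily the one used in the study."""
    return (hit_rate - false_alarm_rate) / (1.0 - false_alarm_rate)

# Hypothetical groups with the same raw hit rate but different guessing:
print(round(corrected_recognition(0.90, 0.20), 4))  # 0.875
print(round(corrected_recognition(0.90, 0.05), 4))  # 0.8947
```

As the abstract notes, different error models can yield different corrected scores, which is exactly why the choice of model should be made explicit.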
James, Veronica J.; O’Malley Ford, Judith M.
2014-01-01
Double blind analysis of a batch of thirty skin tissue samples from potential prostate cancer sufferers correctly identified all “control” patients, patients with high and low grade prostate cancers, the presence of benign prostate hyperplasia (BPH), perineural invasions, and the one lymphatic invasion. Identification was by analysis of fibre diffraction patterns interpreted using a schema developed from observations in nine previous studies. The method, schema, and specific experiment results are reported in this paper, with some implications then drawn.
Towards process-informed bias correction of climate change simulations
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Shepherd, Theodore G.; Widmann, Martin; Zappa, Giuseppe; Walton, Daniel; Gutiérrez, José M.; Hagemann, Stefan; Richter, Ingo; Soares, Pedro M. M.; Hall, Alex; Mearns, Linda O.
2017-11-01
Biases in climate model simulations introduce biases in subsequent impact simulations. Therefore, bias correction methods are operationally used to post-process regional climate projections. However, many problems have been identified, and some researchers question the very basis of the approach. Here we demonstrate that a typical cross-validation is unable to identify improper use of bias correction. Several examples show the limited ability of bias correction to correct and to downscale variability, and demonstrate that bias correction can cause implausible climate change signals. Bias correction cannot overcome major model errors, and naive application might result in ill-informed adaptation decisions. We conclude with a list of recommendations and suggestions for future research to reduce, post-process, and cope with climate model biases.
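Empirical quantile mapping is one of the operational bias correction methods this paper critiques. A minimal numpy sketch, using synthetic gamma-distributed "rainfall" rather than the authors' data or implementation, illustrates the mechanics: each future model value is replaced by the observed value at the same quantile of the historical model distribution.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: replace each future model value with the
    observed value at the same quantile of the historical model distribution."""
    mh = np.sort(model_hist)
    oh = np.sort(obs_hist)
    # quantile of each future value within the historical model distribution
    q = np.searchsorted(mh, model_future, side="right") / len(mh)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(oh, q)

rng = np.random.default_rng(0)
model = rng.gamma(2.0, 2.0, 5000)   # biased historical model rainfall
obs = rng.gamma(2.0, 3.0, 5000)     # historical observations
future = rng.gamma(2.0, 2.4, 5000)  # model projection
corrected = quantile_map(model, obs, future)
```

Note that this transfer function is calibrated entirely on the historical period, which is why, as the paper argues, cross-validation of this kind cannot reveal a corrupted climate change signal.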
Rahimy, Ehsan; McCannel, Colin A
2016-04-01
To assess the literature regarding macular hole reopening rates stratified by whether the internal limiting membrane (ILM) was peeled during vitrectomy surgery. Systematic review and meta-analysis of studies reporting on macular hole reopenings among previously surgically closed idiopathic macular holes. A comprehensive literature search using the National Library of Medicine PubMed interface was used to identify potentially eligible publications in English. The minimum mean follow-up period for reports to be included in this study was 12 months. Analysis was divided into eyes that underwent vitrectomy with and without ILM peeling. The primary outcome parameter was the proportion of macular hole reopenings among previously closed holes between the two groups. Secondary outcome parameters included duration from initial surgery to hole reopening and preoperative and postoperative best-corrected visual acuities among the non-ILM peeling and ILM peeling groups. A total of 50 publications reporting on 5,480 eyes met inclusion criteria and were assessed in this meta-analysis. The reopening rate without ILM peeling was 7.12% (125 of 1,756 eyes), compared with 1.18% (44 of 3,724 eyes) with ILM peeling (odds ratio: 0.16; 95% confidence interval: 0.11-0.22; Fisher's exact test: P < 0.0001). There were no other identifiable associations or risk factors for reopening. The results of this meta-analysis support the concept that ILM peeling during macular hole surgery reduces the likelihood of macular hole reopening.
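The reported odds ratio and confidence interval can be reproduced from the counts above. A short sketch (the 2x2 layout and the Wald interval with z = 1.96 are standard assumptions, not details taken from the paper's methods section):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.
    Rows: exposed/unexposed; columns: event/no event."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Reopenings: 44 of 3,724 eyes with ILM peeling vs 125 of 1,756 without
or_, lo, hi = odds_ratio_ci(44, 3724 - 44, 125, 1756 - 125)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.16 0.11 0.22
```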
Exposed and embedded corrections in aphasia therapy: issues of voice and identity.
Simmons-Mackie, Nina; Damico, Jack S
2008-01-01
Because communication after the onset of aphasia can be fraught with errors, therapist corrections are pervasive in therapy for aphasia. Although corrections are designed to improve the accuracy of communication, some corrections can have social and emotional consequences during interactions. That is, exposure of errors can potentially silence the 'voice' of a speaker by orienting to an utterance as unacceptable. Although corrections can marginalize speakers with aphasia, the practice has not been widely investigated. A qualitative study of corrections during aphasia therapy was undertaken to describe corrections in therapy, identify patterns of occurrence, and develop hypotheses regarding the potential effects of corrections. Videotapes of six individual and five group aphasia therapy sessions were analysed. Sequences consistent with a definition of a therapist 'correction' were identified. Corrections were defined as instances when the therapist offered a 'fix' for a perceived error in the client's talk even though the intent was apparent. Two categories of correction were identified and were consistent with Jefferson's (1987) descriptions of exposed and embedded corrections. Exposed corrections involved explicit correcting by the therapist, while embedded corrections occurred implicitly within the ongoing talk. Patterns of occurrence appeared consistent with philosophical orientations of therapy sessions. Exposed corrections were more prevalent in sessions focusing on repairing deficits, while embedded corrections were prevalent in sessions focusing on natural communication events (e.g. conversation). In addition, exposed corrections were sometimes used when client offerings were plausible or appropriate, but were inconsistent with therapist expectations. The observation that some instances of exposed corrections effectively silenced the voice or self-expression of the person with aphasia has significant implications for outcomes from aphasia therapy. 
By focusing on accurate productions versus communicative intents, therapy runs the risk of reducing self-esteem and communicative confidence, as well as reinforcing a sense of 'helplessness' and disempowerment among people with aphasia. The results suggest that clinicians should carefully calibrate the use of exposed and embedded corrections to balance linguistic and psychosocial goals.
Yeast species associated with orange juice: evaluation of different identification methods.
Arias, Covadonga R; Burns, Jacqueline K; Friedrich, Lorrie M; Goodrich, Renee M; Parish, Mickey E
2002-04-01
Five different methods were used to identify yeast isolates from a variety of citrus juice sources. A total of 99 strains, including reference strains, were identified using a partial sequence of the 26S rRNA gene, restriction pattern analysis of the internal transcribed spacer region (5.8S-ITS), classical methodology, the RapID Yeast Plus system, and API 20C AUX. Twenty-three different species were identified representing 11 different genera. Distribution of the species was considerably different depending on the type of sample. Fourteen different species were identified from pasteurized single-strength orange juice that had been contaminated after pasteurization (PSOJ), while only six species were isolated from fresh-squeezed, unpasteurized orange juice (FSOJ). Among PSOJ isolates, Candida intermedia and Candida parapsilosis were the predominant species. Hanseniaspora occidentalis and Hanseniaspora uvarum represented up to 73% of total FSOJ isolates. Partial sequence of the 26S rRNA gene yielded the best results in terms of correct identification, followed by classical techniques and 5.8S-ITS analysis. The commercial identification kits RapID Yeast Plus system and API 20C AUX were able to correctly identify only 35 and 13% of the isolates, respectively. Six new 5.8S-ITS profiles were described, corresponding to Clavispora lusitaniae, Geotrichum citri-aurantii, H. occidentalis, H. vineae, Pichia fermentans, and Saccharomycopsis crataegensis. With the addition of these new profiles to the existing database, the use of 5.8S-ITS sequence became the best tool for rapid and accurate identification of yeast isolates from orange juice.
de Jong, Simone; Vidler, Lewis R; Mokrab, Younes; Collier, David A; Breen, Gerome
2016-08-01
Genome-wide association studies (GWAS) have identified thousands of novel genetic associations for complex genetic disorders, leading to the identification of potential pharmacological targets for novel drug development. In schizophrenia, 108 conservatively defined loci that meet genome-wide significance have been identified, and hundreds of additional sub-threshold associations harbour information on the genetic aetiology of the disorder. In the present study, we used gene-set analysis based on the known binding targets of chemical compounds to identify the 'drug pathways' most strongly associated with schizophrenia-associated genes, with the aim of identifying potential drug repositioning opportunities and clues for novel treatment paradigms, especially in multi-target drug development. We compiled 9389 gene sets (2496 with unique gene content) and interrogated gene-based p-values from the PGC2-SCZ analysis. Although no single drug exceeded experiment-wide significance (corrected p < 0.05), highly ranked gene sets reached suggestive significance, including those for the dopamine receptor antagonists metoclopramide and trifluoperazine and the tyrosine kinase inhibitor neratinib. This is a proof-of-principle analysis showing the potential utility of schizophrenia GWAS data for the direct identification of candidate drugs and molecules that show polypharmacy.
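The study interrogated gene-based p-values rather than simple overlaps, but the core idea of gene-set analysis can be illustrated with a plain hypergeometric overlap test: how surprising is the overlap between a drug's target set and the GWAS-implicated genes? All numbers below are hypothetical:

```python
from math import comb

def hypergeom_enrichment_p(genome_size, n_hits, set_size, overlap):
    """P(overlap >= observed) when drawing `set_size` genes at random from a
    genome containing `n_hits` disease-associated genes (hypergeometric test).
    A simplified stand-in for the gene-based analysis used in the study."""
    total = comb(genome_size, set_size)
    p = sum(comb(n_hits, k) * comb(genome_size - n_hits, set_size - k)
            for k in range(overlap, min(n_hits, set_size) + 1))
    return p / total

# Hypothetical: 20,000 genes, 500 GWAS-implicated, a 50-gene drug-target
# set sharing 8 genes with the GWAS hits (expected overlap ~1.25)
p = hypergeom_enrichment_p(20000, 500, 50, 8)
```

With thousands of drug gene sets tested, such raw p-values would still need multiple-testing correction, which is why no single drug exceeded experiment-wide significance here.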
Halabi, Najeeb M.; Martinez, Alejandra; Al-Farsi, Halema; Mery, Eliane; Puydenus, Laurence; Pujol, Pascal; Khalak, Hanif G.; McLurcan, Cameron; Ferron, Gwenael; Querleu, Denis; Al-Azwani, Iman; Al-Dous, Eman; Mohamoud, Yasmin A.; Malek, Joel A.; Rafii, Arash
2016-01-01
Identifying genes where a variant allele is preferentially expressed in tumors could lead to a better understanding of cancer biology and optimization of targeted therapy. However, tumor sample heterogeneity complicates standard approaches for detecting preferential allele expression. We therefore developed a novel approach combining genome and transcriptome sequencing data from the same sample that corrects for sample heterogeneity and identifies significant preferentially expressed alleles. We applied this analysis to epithelial ovarian cancer samples consisting of matched primary ovary and peritoneum and lymph node metastasis. We find that preferentially expressed variant alleles include germline and somatic variants, are shared at a relatively high frequency between patients, and are in gene networks known to be involved in cancer processes. Analysis at a patient level identifies patient-specific preferentially expressed alleles in genes that are targets for known drugs. Analysis at a site level identifies patterns of site specific preferential allele expression with similar pathways being impacted in the primary and metastasis sites. We conclude that genes with preferentially expressed variant alleles can act as cancer drivers and that targeting those genes could lead to new therapeutic strategies. PMID:26735499
Artificial Intelligence Techniques for Automatic Screening of Amblyogenic Factors
Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard
2008-01-01
Purpose To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. Methods In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. The images were analyzed manually with this system. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. Results The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the “gold standard” specialist examination with a “refer/do not refer” decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as well as significant anisometropia. The program was less correct in identifying more moderate refractive errors, below +5 and less than −7. Conclusions Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting amblyogenic factors in children aged 6 months to 6 years. PMID:19277222
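The learned decision tree itself is not given in the abstract, but a refer/do-not-refer rule of that shape is easy to sketch. The thresholds below are loosely inspired by the refractive-error bands mentioned (+5 to -7 D) and are purely illustrative, not the study's learned tree:

```python
def refer(spherical_equivalent, anisometropia, strabismus_detected):
    """Hypothetical refer/do-not-refer rule in the shape of a small decision
    tree. Thresholds (diopters) are illustrative only."""
    if strabismus_detected:
        return True
    if spherical_equivalent >= 5.0 or spherical_equivalent <= -7.0:
        return True  # high refractive error
    return anisometropia >= 1.5  # significant interocular difference

print(refer(2.0, 0.5, False))  # False
print(refer(6.0, 0.0, False))  # True
```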
Runner's knowledge of their foot type: do they really know?
Hohmann, Erik; Reaburn, Peter; Imhoff, Andreas
2012-09-01
Correct, individually selected running shoes may reduce the incidence of running injuries. However, the runner needs to be aware of their foot anatomy to ensure the "correct" footwear is chosen. The purpose of this study was to compare the individual runner's knowledge of their arch type to the arch index derived from a static footprint. We examined 92 recreational runners with a mean age of 35.4±11.4 (12-63) years. A questionnaire was used to investigate the knowledge of the runners about arch height and overpronation. A clinical examination was undertaken using defined criteria and the arch index was analysed using weight-bearing footprints. Forty-five runners (49%) identified their foot arch correctly. Eighteen of the 41 flat-arched runners (44%) identified their arch correctly. Twenty-four of the 48 normal-arched athletes (50%) identified their arch correctly. Three subjects with a high arch identified their arch correctly. Thirty-eight runners assessed themselves as overpronators; only four (11%) of these athletes were positively identified. Of the 34 athletes who did not categorize themselves as overpronators, four runners (12%) had clinical overpronation. The findings of this research suggest that runners possess poor knowledge of both their foot arch and dynamic pronation.
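The arch index referred to above is commonly computed following Cavanagh and Rodgers: the footprint (toes excluded) is divided lengthwise into thirds, and the index is the midfoot contact area divided by the total contact area. A sketch on a synthetic binary footprint; the grid and regions are invented for illustration, and the abstract does not state which arch-index variant the authors used:

```python
import numpy as np

def arch_index(footprint):
    """Arch index from a binary footprint mask. Rows run heel to forefoot;
    toes are assumed already excluded. Index = midfoot area / total area."""
    rows = footprint.shape[0]
    third = rows // 3
    total = footprint.sum()
    midfoot = footprint[third:2 * third].sum()
    return midfoot / total

# Toy footprint: broad heel and forefoot, narrow midfoot strip
fp = np.zeros((30, 10), dtype=bool)
fp[0:10, 2:9] = True    # heel (70 cells)
fp[10:20, 5:8] = True   # midfoot (30 cells)
fp[20:30, 1:9] = True   # forefoot (80 cells)
print(round(arch_index(fp), 3))  # 0.167
```

In the Cavanagh-Rodgers scheme, an index above about 0.26 is usually read as flat-arched and below about 0.21 as high-arched.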
Defense Logistics Agency Disposition Services Afghanistan Disposal Process Needed Improvement
2013-11-08
audit, and management was proactive in correcting the deficiencies we identified. DLA DS eliminated backlogs, identified and corrected system problems, provided additional system training, corrected coding errors, added personnel to key positions, addressed scale issues, submitted debit... Service Automated Information System to the Reutilization Business Integration (RBI) solution. The implementation of RBI in Afghanistan occurred in...
DNA methylation and inflammation marker profiles associated with a history of depression.
Crawford, Bethany; Craig, Zoe; Mansell, Georgina; White, Isobel; Smith, Adam; Spaull, Steve; Imm, Jennifer; Hannon, Eilis; Wood, Andrew; Yaghootkar, Hanieh; Ji, Yingjie; Mullins, Niamh; Lewis, Cathryn M; Mill, Jonathan; Murphy, Therese M
2018-05-22
Depression is a common and disabling disorder, representing a major social and economic health issue. Moreover, depression is associated with the progression of diseases with an inflammatory etiology including many inflammatory-related disorders. At the molecular level, the mechanisms by which depression might promote the onset of these diseases and associated immune-dysfunction are not well understood. In this study we assessed genome-wide patterns of DNA methylation in whole blood-derived DNA obtained from individuals with a self-reported history of depression (n = 100) and individuals without a history of depression (n = 100) using the Illumina 450K microarray. Our analysis identified 6 significant (Sidak corrected P < 0.05) depression-associated differentially methylated regions (DMRs); the top-ranked DMR was located in exon 1 of the LTB4R2 gene (Sidak corrected P = 1.27 × 10^-14). Polygenic risk scores (PRS) for depression were generated and known biological markers of inflammation, telomere length (TL) and IL-6, were measured in DNA and serum samples, respectively. Next, we employed a systems-level approach to identify networks of co-methylated loci associated with a history of depression, in addition to depression PRS, TL and IL-6 levels. Our analysis identified one depression-associated co-methylation module (P = 0.04). Interestingly, the depression-associated module was highly enriched for pathways related to immune function and was also associated with TL and IL-6 cytokine levels. In summary, our genome-wide DNA methylation analysis of individuals with and without a self-reported history of depression identified several candidate DMRs of potential relevance to the pathogenesis of depression and its associated immune-dysfunction phenotype.
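The Sidak correction used for the DMR p-values has a closed form: for m independent tests, the corrected p is 1 - (1 - p)^m. A one-line sketch (the raw p and test count below are hypothetical, not taken from the study):

```python
def sidak(p, m):
    """Sidak-corrected p-value for one test among m independent tests."""
    return 1.0 - (1.0 - p) ** m

# Hypothetical: a raw regional p of 0.0005 among 100 tested regions
print(round(sidak(0.0005, 100), 4))  # 0.0488
```

For small p and large m this behaves much like the Bonferroni correction (p * m), but is slightly less conservative.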
NASA Technical Reports Server (NTRS)
Monaghan, Mark W.; Gillespie, Amanda M.
2013-01-01
During the shuttle era, NASA used a failure reporting system called Problem Reporting and Corrective Action (PRACA); its purpose was to identify and track system non-conformance. Over the years, PRACA evolved from a relatively simple way to identify system problems into a very complex tracking and report-generating database, and became the primary method for categorizing all anomalies, from corrosion to catastrophic failure. The systems documented in PRACA range from flight hardware to ground and facility support equipment. While the PRACA system is complex, it captures all the failure modes, times of occurrence, lengths of system delay, parts repaired or replaced, and corrective actions performed. The difficulty is mining these data and then using them to estimate component, Line Replaceable Unit (LRU), and system reliability metrics. In this paper, we present a methodology for categorizing qualitative data from the ground system PRACA database for common ground or facility support equipment. A heuristic developed for reviewing PRACA data then determines which reports identify a credible failure. These data are used to determine inter-arrival times and thereby estimate a reliability metric for a repairable component or LRU. This analysis is used to determine the failure modes of the equipment, estimate the probability of each component failure mode, and support various quantitative techniques for repairable system analysis. The result is an effective and concise reliability estimate for components used in manned space flight operations. The advantage is that the components or LRUs are evaluated in the same environment and conditions that occur during the launch process.
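Once credible failures have been extracted from the PRACA records, the inter-arrival-time step reduces to simple arithmetic: the mean gap between consecutive failure timestamps is a point estimate of the mean time between failures (MTBF) for a repairable unit, and its reciprocal is a failure rate. A minimal sketch with hypothetical failure timestamps (not actual PRACA data):

```python
def interarrival_mtbf(failure_times):
    """MTBF point estimate from sorted credible-failure timestamps (hours)
    for a repairable LRU, plus the corresponding failure rate per hour.
    Assumes a constant failure rate between events."""
    gaps = [t2 - t1 for t1, t2 in zip(failure_times, failure_times[1:])]
    mtbf = sum(gaps) / len(gaps)
    return mtbf, 1.0 / mtbf

# Hypothetical failure event times (operating hours) for one LRU:
times = [120.0, 410.0, 700.0, 1210.0, 1500.0]
mtbf, rate = interarrival_mtbf(times)
print(round(mtbf, 1), round(rate, 5))  # 345.0 0.0029
```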
Chung, W Joon; Goeckeler-Fried, Jennifer L; Havasi, Viktoria; Chiang, Annette; Rowe, Steven M; Plyler, Zackery E; Hong, Jeong S; Mazur, Marina; Piazza, Gary A; Keeton, Adam B; White, E Lucile; Rasmussen, Lynn; Weissman, Allan M; Denny, R Aldrin; Brodsky, Jeffrey L; Sorscher, Eric J
2016-01-01
Small molecules that correct the folding defects and enhance surface localization of the F508del mutation in the Cystic Fibrosis Transmembrane conductance Regulator (CFTR) comprise an important therapeutic strategy for cystic fibrosis lung disease. However, compounds that rescue the F508del mutant protein to wild type (WT) levels have not been identified. In this report, we consider obstacles to obtaining robust and therapeutically relevant levels of F508del CFTR. For example, markedly diminished steady state amounts of F508del CFTR compared to WT CFTR are present in recombinant bronchial epithelial cell lines, even when much higher levels of mutant transcript are present. In human primary airway cells, the paucity of Band B F508del is even more pronounced, although F508del and WT mRNA concentrations are comparable. Therefore, to augment levels of "repairable" F508del CFTR and identify small molecules that then correct this pool, we developed compound library screening protocols based on automated protein detection. First, cell-based imaging measurements were used to semi-quantitatively estimate distribution of F508del CFTR by high content analysis of two-dimensional images. We evaluated ~2,000 known bioactive compounds from the NIH Roadmap Molecular Libraries Small Molecule Repository in a pilot screen and identified agents that increase the F508del protein pool. Second, we analyzed ~10,000 compounds representing diverse chemical scaffolds for effects on total CFTR expression using a multi-plate fluorescence protocol and describe compounds that promote F508del maturation. Together, our findings demonstrate proof of principle that agents identified in this fashion can augment the level of endoplasmic reticulum (ER) resident "Band B" F508del CFTR suitable for pharmacologic correction. 
As further evidence in support of this strategy, PYR-41, a compound that inhibits the E1 ubiquitin-activating enzyme, was shown to synergistically enhance F508del rescue by C18, a small-molecule corrector. Our combined results indicate that increasing the levels of ER-localized CFTR available for repair provides a novel route to correct F508del CFTR.
Kurz, Sascha; Pieroh, Philipp; Lenk, Maximilian; Josten, Christoph; Böhme, Jörg
2017-01-01
Rationale: Pelvic malunion is a rare complication and is technically challenging to correct owing to the complex three-dimensional (3D) geometry of the pelvic girdle. Hence, precise preoperative planning is required to ensure appropriate correction. Reconstructive surgery is generally a 2- or 3-stage procedure, with transiliac osteotomy serving as an alternative to address limb length discrepancy. Patient concerns: A 38-year-old female patient with a Mears type IV pelvic malunion and previous failed reconstructive surgery was admitted to our department due to progressive immobilization, increasing pain, especially at the posterior pelvic arch, and a leg length discrepancy. The leg length discrepancy was approximately 4 cm, and rotation of the right hip joint was associated with pain. Diagnosis: Radiography and computed tomography (CT) revealed a hypertrophic malunion at the site of the previous posterior osteotomy (Mears type IV) involving the anterior and middle column, according to the 3-column concept, as well as malunion of the left anterior arch (Mears type IV). Interventions: The surgery was planned virtually via 3D reconstruction, using the patient's CT, and subsequently performed via transiliac osteotomy and symphysiotomy. The finite element method (FEM) was used to plan the osteotomy and osteosynthesis, including an estimation of the risk of implant failure. Outcomes: There was no incidence of neurological injury or infection, and the remaining leg length discrepancy was ≤ 2 cm. The patient recovered independent, pain-free mobility. Virtual 3D planning provided a more precise measurement of correction parameters than radiographic-based measurements. FEM analysis identified the highest risk for implant failure at the symphyseal plate osteosynthesis and the parasymphyseal screws. No implant failure was observed.
Lessons: Transiliac osteotomy, with additional osteotomy or symphysiotomy, was a suitable surgical procedure for the correction of pelvic malunion and provided adequate correction of leg length discrepancy. Virtual 3D planning enabled precise determination of correction parameters, with FEM analysis providing an appropriate method to predict areas of implant failure. PMID:29049196
A Hybrid Wavelet-Based Method for the Peak Detection of Photoplethysmography Signals.
Li, Suyi; Jiang, Shanqing; Jiang, Shan; Wu, Jiang; Xiong, Wenji; Diao, Shu
2017-01-01
The noninvasive peripheral oxygen saturation (SpO2) and the pulse rate can be extracted from photoplethysmography (PPG) signals. However, the accuracy of the extraction is directly affected by the quality of the signal obtained and the peak of the signal identified; therefore, a hybrid wavelet-based method is proposed in this study. Firstly, we suppressed the partial motion artifacts and corrected the baseline drift by using a wavelet method based on the principle of wavelet multiresolution. And then, we designed a quadratic spline wavelet modulus maximum algorithm to identify the PPG peaks automatically. To evaluate this hybrid method, a reflective pulse oximeter was used to acquire ten subjects' PPG signals under sitting, raising hand, and gently walking postures, and the peak recognition results on the raw signal and on the corrected signal were compared, respectively. The results showed that the hybrid method not only corrected the morphologies of the signal well but also optimized the peaks identification quality, subsequently elevating the measurement accuracy of SpO2 and the pulse rate. As a result, our hybrid wavelet-based method profoundly optimized the evaluation of respiratory function and heart rate variability analysis.
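As a rough illustration of the pipeline (detrend, then pick peaks), here is a deliberately simplified numpy sketch using a moving-average baseline and local maxima with a refractory period. It is not the paper's quadratic spline wavelet modulus maximum algorithm, and the synthetic 1.2 Hz sinusoid stands in for a real PPG recording:

```python
import numpy as np

def find_ppg_peaks(signal, fs, min_rr=0.4):
    """Simplified PPG peak picking: subtract a ~1 s moving-average baseline,
    then keep local maxima above zero separated by at least `min_rr` seconds.
    A stand-in for the paper's wavelet modulus-maximum approach."""
    win = int(fs)
    kernel = np.ones(win) / win
    baseline = np.convolve(signal, kernel, mode="same")
    x = signal - baseline
    candidates = [i for i in range(1, len(x) - 1)
                  if x[i] > 0 and x[i] >= x[i - 1] and x[i] > x[i + 1]]
    # enforce a refractory period between accepted peaks
    out, gap = [], int(min_rr * fs)
    for i in candidates:
        if not out or i - out[-1] >= gap:
            out.append(i)
    return out

fs = 100
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * t / 10  # 1.2 Hz pulse + drift
peaks = find_ppg_peaks(ppg, fs)
print(len(peaks))
```

With a 1.2 Hz "pulse" over 10 s, roughly 12 peaks should be recovered despite the added baseline drift; the pulse rate then follows from the median inter-peak interval.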
Matias, Denise Margaret S; Borgemeister, Christian; von Wehrden, Henrik
2018-02-24
One of the traditional livelihood practices of indigenous Tagbanuas in Palawan, Philippines is wild honey hunting and gathering from the giant honey bee (Apis dorsata F.). In order to analyze the linkages of the social and ecological systems involved in this indigenous practice, we conducted spatial, quantitative, and qualitative analyses on field data gathered through mapping of global positioning system coordinates, community surveys, and key informant interviews. We found that only 24% of the 251 local community members surveyed could correctly identify the giant honey bee. Inferential statistics showed that a lower level of formal education strongly correlates with correct identification of the giant honey bee. Spatial analysis revealed that mean NDVI of sampled nesting tree areas has dropped from 0.61 in the year 1988 to 0.41 in 2015. However, those who correctly identified the giant honey bee lived in areas with high vegetation cover. Decreasing vegetation cover limits the presence of wild honey bees and this may also be limiting direct experience of the community with wild honey bees. However, with causality yet to be established, we recommend conducting further studies to concretely model feedbacks between ecological changes and local knowledge.
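NDVI, the vegetation measure used in the spatial analysis above, is the normalized difference of near-infrared and red surface reflectance. A sketch with hypothetical reflectance values (the satellite product and band values are not specified in the abstract):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from near-infrared and red
    reflectance; ranges from -1 to 1, with higher values indicating
    denser green vegetation."""
    return (nir - red) / (nir + red)

# Hypothetical reflectances: a dense canopy vs a partly cleared plot
print(round(ndvi(0.45, 0.08), 2))  # 0.7
print(round(ndvi(0.30, 0.12), 2))  # 0.43
```

A drop in mean NDVI from 0.61 to 0.41, as reported for the nesting-tree areas, is consistent with substantial vegetation loss over the period.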
An exploration of public knowledge of warning signs for cancer.
Keeney, Sinead; McKenna, Hugh; Fleming, Paul; McIlfatrick, Sonja
2011-02-01
Warning signs of cancer have long been used as an effective way to summarise and communicate early indications of cancer to the public. Given the increasing global burden of cancer, communicating these warning signs to the public is more important than ever before. This paper presents part of a larger study which explored the attitudes, knowledge and behaviours of people in mid-life towards cancer prevention. The focus of this paper is on assessing the knowledge of members of the public aged between 35 and 54 years. A questionnaire listing 17 warning signs of cancer, comprising the correct warning signs and distracter signs, was administered to a representative sample of the population. Respondents were asked to identify the seven correct warning signs. Findings show that respondents correctly identified a mean of 4.8 cancer warning signs. Analysis by demographics shows that being female, being older, having a higher level of educational attainment and being in a higher socio-economic group are predictors of a better level of knowledge of cancer warning signs. Recommendations are proffered with regard to better targeting, clarification and communication of cancer warning signs. Copyright © 2010 Elsevier Ltd. All rights reserved.
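As a minimal sketch of the scoring scheme this abstract describes (seven true warning signs embedded among 17 listed items, scored by the number of true signs a respondent ticks), assuming hypothetical item names rather than the study's actual questionnaire:

```python
# Score a cancer warning-signs questionnaire: respondents tick the items
# they believe are warning signs; the score counts true signs ticked.
# Item names are illustrative placeholders, not the study's actual items.
TRUE_SIGNS = {"lump", "unexplained_bleeding", "persistent_cough",
              "mole_change", "unexplained_weight_loss",
              "persistent_pain", "swallowing_difficulty"}   # 7 true signs
DISTRACTORS = {"headache", "fever", "itching", "thirst", "sneezing",
               "hiccups", "bruising_easily", "dry_skin", "hair_loss",
               "cold_hands"}                                # 10 distracters

def score(responses):
    """Number of true warning signs correctly identified (0-7)."""
    return len(set(responses) & TRUE_SIGNS)

assert len(TRUE_SIGNS | DISTRACTORS) == 17                  # 17 listed items
print(score({"lump", "persistent_cough", "headache", "mole_change"}))  # 3
```

Averaging this score over respondents gives the 4.8-out-of-7 figure the study reports.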
A Hybrid Wavelet-Based Method for the Peak Detection of Photoplethysmography Signals
Jiang, Shanqing; Jiang, Shan; Wu, Jiang; Xiong, Wenji
2017-01-01
The noninvasive peripheral oxygen saturation (SpO2) and the pulse rate can be extracted from photoplethysmography (PPG) signals. However, the accuracy of the extraction is directly affected by the quality of the acquired signal and of the identified peaks; therefore, a hybrid wavelet-based method is proposed in this study. First, we suppressed partial motion artifacts and corrected the baseline drift by using a wavelet method based on the principle of wavelet multiresolution. We then designed a quadratic spline wavelet modulus maximum algorithm to identify the PPG peaks automatically. To evaluate this hybrid method, a reflective pulse oximeter was used to acquire PPG signals from ten subjects under sitting, hand-raising, and gentle walking postures, and peak recognition results on the raw and corrected signals were compared. The results showed that the hybrid method not only corrected the morphology of the signal but also improved peak identification quality, thereby elevating the measurement accuracy of SpO2 and the pulse rate. As a result, our hybrid wavelet-based method substantially improved the evaluation of respiratory function and heart rate variability analysis. PMID:29250135
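A minimal, pure-Python sketch of the general idea: baseline drift lives in the coarse wavelet approximation, so zeroing it and reconstructing leaves a drift-free signal on which peaks are easier to find. This uses a Haar transform and a plain local-maximum detector as stand-ins for the authors' multiresolution method and quadratic spline modulus-maxima algorithm; the signal, decomposition depth, and threshold are illustrative.

```python
def haar_dwt(x):
    """One level of the Haar transform: approximation and detail halves."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return a, d

def haar_idwt(a, d):
    """Invert one Haar level (exact reconstruction)."""
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def remove_baseline(x, levels):
    """Multiresolution baseline removal: decompose, zero the coarsest
    approximation (the slow drift), and reconstruct from details only."""
    details, a = [], list(x)
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    a = [0.0] * len(a)                   # discard the baseline/drift band
    for d in reversed(details):
        a = haar_idwt(a, d)
    return a

def find_peaks(x, thresh):
    """Plain local-maximum detector above a fixed threshold."""
    return [i for i in range(1, len(x) - 1)
            if x[i] > thresh and x[i] >= x[i - 1] and x[i] > x[i + 1]]

# Synthetic "PPG": narrow pulses every 32 samples riding on a linear drift.
n = 256
sig = [0.02 * i for i in range(n)]       # baseline drift
for p in range(16, n, 32):
    sig[p] += 3.0                        # pulse peaks
corrected = remove_baseline(sig, levels=5)
print(find_peaks(corrected, thresh=1.0))  # pulse positions 16, 48, ..., 240
```

On the raw drifting signal a fixed threshold would miss early pulses or accept late drift; after baseline removal one threshold recovers every pulse.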
Local collective motion analysis for multi-probe dynamic imaging and microrheology
NASA Astrophysics Data System (ADS)
Khan, Manas; Mason, Thomas G.
2016-08-01
Dynamical artifacts, such as mechanical drift, advection, and hydrodynamic flow, can adversely affect multi-probe dynamic imaging and passive particle-tracking microrheology experiments. Alternatively, active driving by molecular motors can cause interesting non-Brownian motion of probes in local regions. Existing drift-correction techniques, which require large ensembles of probes or fast temporal sampling, are inadequate for handling complex spatio-temporal drifts and non-Brownian motion of localized domains containing relatively few probes. Here, we report an analytical method based on local collective motion (LCM) analysis of as few as two probes for detecting the presence of non-Brownian motion and for accurately eliminating it to reveal the underlying Brownian motion. By calculating an ensemble-average, time-dependent, LCM mean square displacement (MSD) of two or more localized probes and comparing this MSD to constituent single-probe MSDs, we can identify temporal regimes during which either thermal or athermal motion dominates. Single-probe motion, when referenced relative to the moving frame attached to the multi-probe LCM trajectory, provides a true Brownian MSD after scaling by an appropriate correction factor that depends on the number of probes used in LCM analysis. We show that LCM analysis can be used to correct many different dynamical artifacts, including spatially varying drifts, gradient flows, cell motion, time-dependent drift, and temporally varying oscillatory advection, thereby offering a significant improvement over existing approaches.
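A sketch of the core idea under simplifying assumptions (independent, identical probes sharing a common drift): the LCM trajectory is the per-frame mean of the probe positions, single-probe motion is measured in that moving frame, and the resulting MSD is rescaled. The factor N/(N-1) below is the variance correction for subtracting the mean of N independent walkers, our assumption rather than the paper's derived probe-number factor.

```python
import random

def lcm_corrected_msd(trajs, lag):
    """MSD of probe motion in the frame of the local collective motion
    (per-frame mean of all probe positions), rescaled by N/(N-1): the
    variance correction for subtracting the mean of N independent walkers
    (our assumption; the paper derives its own probe-number factor)."""
    n = len(trajs)
    frames = len(trajs[0])
    lcm = [sum(t[f] for t in trajs) / n for f in range(frames)]
    rel = [[t[f] - lcm[f] for f in range(frames)] for t in trajs]
    disps = [(r[f + lag] - r[f]) ** 2
             for r in rel for f in range(frames - lag)]
    return (n / (n - 1)) * sum(disps) / len(disps)

# Pure common-mode drift: every probe shares the same motion, so the
# frame-relative MSD vanishes exactly.
drift_only = [[0.5 * f + 2.0 for f in range(100)] for _ in range(4)]
print(lcm_corrected_msd(drift_only, lag=10))  # 0.0

# Brownian steps (variance 1 per frame) plus a strong shared linear drift:
# the corrected MSD at lag 1 should recover the thermal value of ~1.0.
random.seed(0)
def walk(frames, drift_per_frame):
    x, out = 0.0, []
    for f in range(frames):
        out.append(x + drift_per_frame * f)
        x += random.gauss(0.0, 1.0)
    return out
probes = [walk(2000, 5.0) for _ in range(8)]
print(round(lcm_corrected_msd(probes, lag=1), 2))
```

The second example shows why the rescaling is needed: subtracting the mean of N walkers removes 1/N of each probe's own thermal variance along with the drift.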
Identification of Clinical Isolates of Mycobacteria with Gas-Liquid Chromatography Alone
Tisdall, Philip A.; Roberts, Glenn D.; Anhalt, John P.
1979-01-01
Identification of 18 mycobacterial species was performed by analysis of profiles obtained by using gas-liquid chromatography. Organisms were saponified in methanolic NaOH, and the reaction mixture was treated with BF3 in methanol and extracted with a hexane-chloroform mixture. An identification scheme was developed from 128 stock strains and tested against a collection of 79 clinical isolates. By using gas-liquid chromatographic profiles alone, 58% of specimens were correctly identified to species level, and an additional 41% were correctly identified to a group of two or three organisms. Use in a clinical laboratory over a 2-month period proved chromatography to be as accurate as and more rapid than concurrent biochemical testing. Of 81 isolates tested, 64% were identified to species level by chromatography alone. An additional 35% were differentiated to the same groups of two or three organisms as found in our analysis of stock strains. These groups consisted of: Mycobacterium tuberculosis, M. bovis, and M. xenopi; M. avium complex, M. gastri, and M. scrofulaceum; or M. fortuitum and M. chelonei. Identification to species level from these groups could usually be done by colonial morphology alone and could always be done by the addition of one selected biochemical test. This study demonstrated the practical application of gas-liquid chromatography in the identification of mycobacteria in a clinical laboratory. In particular, all strains of M. gordonae and M. kansasii were identified to species level. M. tuberculosis was definitively identified in 85% of cases. When it could not be definitely identified, the only alternatives were M. bovis and M. xenopi, both of which are rare causes of infection. PMID:118984
Classification and correction of the radar bright band with polarimetric radar
NASA Astrophysics Data System (ADS)
Hall, Will; Rico-Ramirez, Miguel; Kramer, Stefan
2015-04-01
The annular region of enhanced radar reflectivity, known as the Bright Band (BB), occurs when the radar beam intersects a layer of melting hydrometeors. Radar reflectivity is related to rainfall through a power law equation, and so this enhanced region can lead to overestimations of rainfall by a factor of up to 5; it is therefore important to correct for it. The BB region can be identified using several techniques, including hydrometeor classification and freezing level forecasts from mesoscale meteorological models. Advances in dual-polarisation radar measurements and continued research in the field have led to increased accuracy in identifying the melting snow region. A method proposed by Kitchen et al. (1994), a form of which is currently used operationally in the UK, utilises idealised Vertical Profiles of Reflectivity (VPR) to correct for the BB enhancement. A simpler and more computationally efficient method involves the formation of an average VPR from multiple elevations for correction, which can still produce a significant decrease in error (Vignal et al. 2000). The purpose of this research is to evaluate a method that relies only on analysis of measurements from an operational C-band polarimetric radar, without the need for computationally expensive models. Initial results show that LDR is a strong classifier of melting snow, with a high Critical Success Index of 97% compared to the other variables. An algorithm based on idealised VPRs resulted in the largest decrease in error when BB-corrected scans are compared to rain gauges and to lower-level scans, with a reduction in RMSE of 61% for rain-rate measurements. References: Kitchen, M., R. Brown, and A. G. Davies, 1994: Real-time correction of weather radar data for the effects of bright band, range and orographic growth in widespread precipitation. Q.J.R. Meteorol. Soc., 120, 1231-1254. Vignal, B., et al., 2000: Three methods to determine profiles of reflectivity from volumetric radar data to correct precipitation estimates. J. Appl. Meteor., 39(10), 1715-1726.
Hoffman, Howard J.; Rawal, Shristi; Li, Chuan-Ming; Duffy, Valerie B.
2016-01-01
The U.S. NHANES included chemosensory assessments in the 2011–2014 protocol. We provide an overview of this protocol and 2012 olfactory exam findings. Of the 1818 NHANES participants aged ≥40 years, 1281 (70.5 %) completed the exam; non-participation mostly was due to time constraints. Health technicians administered an 8-item, forced-choice, odor identification task scored as normosmic (6–8 odors identified correctly) versus olfactory dysfunction, including hyposmic (4–5 correct) and anosmic/severe hyposmic (0–3 correct). Interviewers recorded self-reported smell alterations (during past year, since age 25, phantosmia), histories of sinonasal problems, xerostomia, dental extractions, head or facial trauma, and chemosensory-related treatment and changes in quality of life. Olfactory dysfunction was found in 12.4 % (13.3 million adults; 55 % males/45 % females) including 3.2 % anosmic/severe hyposmic (3.4 million; 74 % males/26 % females). Selected age-specific prevalences were 4.2 % (40–49 years), 12.7 % (60–69 years), and 39.4 % (80+ years). Among adults ≥70 years, misidentification rates for warning odors were 20.3 % for smoke and 31.3 % for natural gas. The highest sensitivity (correctly identifying dysfunction) and specificity (correctly identifying normosmia) of self-reported olfactory alteration was among anosmics/severe hyposmics (54.4 % and 78.1 %, respectively). In age- and sex-adjusted logistic regression analysis, risk factors of olfactory dysfunction were racial/ethnic minority, income-to-poverty ratio ≤ 1.1, education
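The exam's scoring bands and the sensitivity/specificity comparison can be sketched as follows; the toy records in the example are illustrative, not NHANES data.

```python
def classify(n_correct):
    """Map an 8-item odor identification score to the exam category."""
    if n_correct >= 6:
        return "normosmic"
    if n_correct >= 4:
        return "hyposmic"
    return "anosmic/severe hyposmic"

def sens_spec(records):
    """records: (self_reported_alteration, has_dysfunction) pairs.
    Sensitivity: fraction of dysfunction cases that self-reported;
    specificity: fraction of normosmics that did not self-report."""
    tp = sum(1 for rep, dys in records if rep and dys)
    fn = sum(1 for rep, dys in records if not rep and dys)
    tn = sum(1 for rep, dys in records if not rep and not dys)
    fp = sum(1 for rep, dys in records if rep and not dys)
    return tp / (tp + fn), tn / (tn + fp)

print(classify(7), classify(5), classify(2))
toy = [(True, True), (False, True), (False, False), (True, False)]
print(sens_spec(toy))  # (0.5, 0.5)
```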
Analysis of MMU FDIR expert system
NASA Technical Reports Server (NTRS)
Landauer, Christopher
1990-01-01
This paper describes the analysis of a rulebase for fault diagnosis, isolation, and recovery for NASA's Manned Maneuvering Unit (MMU). The MMU is used by a human astronaut to move around a spacecraft in space. In order to provide maneuverability, there are several thrusters oriented in various directions, and hand-controlled devices for useful groups of them. The rulebase describes some error detection procedures, and corrective actions that can be applied in a few cases. The approach taken in this paper is to treat rulebases as symbolic objects and compute correctness and 'reasonableness' criteria that use the statistical distribution of various syntactic structures within the rulebase. The criteria should identify awkward situations, and otherwise signal anomalies that may be errors. The rulebase analysis algorithms are derived from mathematical and computational criteria that implement certain principles developed for rulebase evaluation. The principles are Consistency, Completeness, Irredundancy, Connectivity, and, finally, Distribution. Several errors were detected in the delivered rulebase. Some of these errors were easily fixed. Some errors could not be fixed with the available information. A geometric model of the thruster arrangement is needed to show how to correct certain other distribution anomalies that are in fact errors. The investigations reported here were partially supported by The Aerospace Corporation's Sponsored Research Program.
1998-06-01
quality management can have on the intermediate level of maintenance. Power quality management is a preventative process that focuses on identifying and correcting problems that cause bad power. Using cost-benefit analysis, we compare the effects of implementing a power quality management program at AIMD Lemoore and AIMD Fallon. The implementation of power quality management can result in wide-scale logistical support changes with regard to the life cycle costs of maintaining the DoD's current inventory
Kokaly, Raymond F.
2011-01-01
This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.
galaxie--CGI scripts for sequence identification through automated phylogenetic analysis.
Nilsson, R Henrik; Larsson, Karl-Henrik; Ursing, Björn M
2004-06-12
The prevalent use of similarity searches like BLAST to identify sequences and species implicitly assumes that the reference database has extensive sequence sampling. This is often not the case, limiting the reliability of the outcome as a basis for sequence identification. Phylogenetic inference outperforms similarity searches in retrieving correct phylogenies and, consequently, sequence identities, and a project was initiated to design a freely available script package for sequence identification through automated Web-based phylogenetic analysis. Three CGI scripts were designed to facilitate qualified sequence identification from a Web interface. Query sequences are aligned to pre-made alignments or to alignments made by ClustalW with entries retrieved from a BLAST search. The subsequent phylogenetic analysis is based on the PHYLIP package for inferring neighbor-joining and parsimony trees. The scripts are highly configurable. A service installation and a version for local use are found at http://andromeda.botany.gu.se/galaxiewelcome.html and http://galaxie.cgb.ki.se
Jain, Ram B
2017-07-01
Prevalence of smoking is needed to estimate the need for future public health resources. The objectives were to compute and compare smoking prevalence rates by using self-reported smoking statuses, two serum cotinine (SCOT) based biomarker methods, and one urinary 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) based biomarker method, and then to use these estimates to develop correction factors applicable to self-reported prevalences to arrive at corrected smoking prevalence rates. Data from the National Health and Nutrition Examination Survey (NHANES) for 2007-2012 for those aged ≥20 years (N = 16826) were used. The self-reported prevalence rate for the total population was 21.6% when computed as the weighted number of self-reported smokers divided by the weighted number of all participants, and 24% when computed as the weighted number of self-reported smokers divided by the weighted number of self-reported smokers and nonsmokers. The corrected prevalence rate was found to be 25.8%. A 1% underestimate in smoking prevalence is equivalent to failing to identify 2.2 million smokers in the US in a given year. This underestimation, if not corrected, could lead to a serious gap in the public health services available and needed to provide adequate preventive and corrective treatment to smokers.
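The arithmetic behind the correction can be sketched with the abstract's own figures; expressing the correction as a multiplicative factor (corrected over self-reported prevalence) is our illustrative construction, not necessarily the paper's exact procedure.

```python
# Illustrative arithmetic using the abstract's figures. The correction
# factor as a ratio of corrected to self-reported prevalence is our
# construction for illustration.
self_reported = 0.216   # self-reported smokers / all participants
corrected = 0.258       # biomarker-corrected prevalence

factor = corrected / self_reported
print(round(factor, 3))  # multiplier to apply to self-reported prevalence

# The abstract equates a 1% prevalence underestimate with ~2.2 million
# uncounted smokers, implying an adult reference population of ~220 million.
adults = 2.2e6 / 0.01
missed = (corrected - self_reported) * adults
print(round(missed / 1e6, 1), "million smokers uncounted")
```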
Machine Learned Replacement of N-Labels for Basecalled Sequences in DNA Barcoding.
Ma, Eddie Y T; Ratnasingham, Sujeevan; Kremer, Stefan C
2018-01-01
This study presents a machine learning method that increases the number of identified bases in Sanger sequencing. The system post-processes a KB-basecalled chromatogram. It selects a recoverable subset of N-labels in the KB-called chromatogram to replace with basecalls (A, C, G, T). An N-label correction is defined given an additional read of the same sequence and a human-finished sequence. Corrections are added to the dataset when an alignment determines that the additional read and the human agree on the identity of the N-label. KB must also rate the replacement with quality value of in the additional read. Corrections are only available during system training. To develop the system, nearly 850,000 N-labels were obtained from Barcode of Life Datasystems, the premier database of genetic markers called DNA Barcodes. Increasing the number of correct bases improves reference sequence reliability, increases sequence identification accuracy, and assures analysis correctness. In keeping with barcoding standards, our system maintains an error rate of percent. Our system only applies corrections when it estimates a low rate of error. Tested on this data, our automation selects and recovers: 79 percent of N-labels from COI (the animal barcode); 80 percent from matK and rbcL (plant barcodes); and 58 percent from non-protein-coding sequences (across eukaryotes).
Han, Hyemin; Glenn, Andrea L
2018-06-01
In fMRI research, the goal of correcting for multiple comparisons is to identify areas of activity that reflect true effects, and thus would be expected to replicate in future studies. Finding an appropriate balance between trying to minimize false positives (Type I error) while not being too stringent and omitting true effects (Type II error) can be challenging. Furthermore, the advantages and disadvantages of these types of errors may differ for different areas of study. In many areas of social neuroscience that involve complex processes and considerable individual differences, such as the study of moral judgment, effects are typically smaller and statistical power weaker, leading to the suggestion that less stringent corrections that allow for more sensitivity may be beneficial and also result in more false positives. Using moral judgment fMRI data, we evaluated four commonly used methods for multiple comparison correction implemented in Statistical Parametric Mapping 12 by examining which method produced the most precise overlap with results from a meta-analysis of relevant studies and with results from nonparametric permutation analyses. We found that voxelwise thresholding with familywise error correction based on Random Field Theory provides a more precise overlap (i.e., without omitting too few regions or encompassing too many additional regions) than either clusterwise thresholding, Bonferroni correction, or false discovery rate correction methods.
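Two of the correction families the abstract compares, Bonferroni and false discovery rate control, can be sketched in a few lines; this Benjamini-Hochberg implementation is a generic one, not the Statistical Parametric Mapping 12 code the study evaluated, and the p-values are illustrative.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg FDR: reject the k smallest p-values, where k is
    the largest rank with p_(k) <= (k/m) * q. Returns a reject flag per
    input p-value."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * q:
            k = rank
    rejected = set(order[:k])
    return [i in rejected for i in range(m)]

def bonferroni(pvals, alpha=0.05):
    """Familywise error control by dividing alpha across all m tests."""
    return [p <= alpha / len(pvals) for p in pvals]

# Ten illustrative p-values (e.g., one per candidate region):
ps = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.368]
print(sum(benjamini_hochberg(ps)), "rejected by FDR")        # 2
print(sum(bonferroni(ps)), "rejected by Bonferroni")         # 1
```

The example shows the trade-off the abstract discusses: FDR admits more findings (more sensitivity, more potential false positives) than Bonferroni on the same data.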
Brahmam, G.N.V.; Vijayaraghavan, K.
2011-01-01
The prevalence of chronic energy deficiency (CED) among one-third of the Indian population is attributed to inadequacy of consumption of nutrients. However, considering the complexity of diets among Indians, the relationship between a particular dietary pattern and the nutritional status of the population has not been established so far. A community-based cross-sectional study was undertaken to assess estimates, at district level, of diet and nutritional status in Orissa State, India. Factor analysis was used for exploring the existence of consumption patterns of food and nutrients and their relationship with the nutritional status of the rural adult population. Data on 2,864 adult men and 3,525 adult women in Orissa state revealed that there exist six patterns among food groups, explaining 59% of the total variation, and three patterns among nutrients, explaining 73% of the total variation, among both adult men and women. The discriminant function analysis revealed that, overall, 53% of the men were correctly classified as either with chronic energy deficiency (CED) or without CED. Similarly, overall, 54% of the women were correctly classified as either with CED or without CED. The sensitivity of the model was 65% for both men and women, and the specificity was 46% and 41%, respectively, for men and women. In the case of classification of overweight/obesity, the prediction of the model was about 75% among both men and women, along with high sensitivity. Using factor analysis, the dietary patterns were identified from the food and nutrient intake data. There exists a strong relationship between the dietary patterns and the nutritional status of rural adults. These results will help identify the community people with CED and help planners formulate nutritional interventions accordingly. PMID:21957671
Sequencing artifacts in the type A influenza databases and attempts to correct them.
Suarez, David L; Chester, Nikki; Hatfield, Jason
2014-07-01
There are over 276 000 influenza gene sequences in public databases, with the quality of the sequences determined by the contributor. As part of a high school class project, influenza sequences with possible errors were identified in the public databases based on the size of the gene being longer than expected, with the hypothesis that these sequences would contain an error. Students contacted sequence submitters alerting them to the possible sequence issue(s) and requested that the suspect sequence(s) be corrected as appropriate. Type A influenza viruses were screened, and gene segments longer than the accepted size were identified for further analysis. Attention was placed on sequences with additional nucleotides upstream or downstream of the highly conserved non-coding ends of the viral segments. A total of 1081 sequences were identified that met this criterion. Three types of errors were commonly observed: non-influenza primer sequence was not removed from the sequence; the PCR product was cloned and plasmid sequence was included in the sequence; and Taq polymerase added an adenine at the end of the PCR product. Internal insertions of nucleotide sequence were also commonly observed, but in many cases it was unclear whether the sequence was correct or actually contained an error. A total of 215 sequences, or 22.8% of the suspect sequences, were corrected in the public databases in the first year of the student project. Unfortunately, 138 additional sequences with possible errors were added to the databases in the second year. Additional awareness of the need for data integrity of sequences submitted to public databases is needed to fully reap the benefits of these large data sets. © 2014 The Authors. Influenza and Other Respiratory Viruses Published by John Wiley & Sons Ltd.
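The screening criterion (segment longer than the accepted length) and one of the common error signatures (a single Taq-added 3' adenine) can be sketched as follows; the expected lengths and accession names are hypothetical placeholders, not authoritative influenza segment lengths.

```python
# Screen influenza segment sequences for over-length records and tag the
# likeliest cause. Expected lengths and accessions below are hypothetical
# placeholders, not authoritative influenza segment lengths.
EXPECTED = {"NS": 890, "M": 1027}

def screen(records, expected=EXPECTED):
    """Return (accession, suspected_cause) for each over-length segment.
    A single extra 3' adenine is tagged as a possible Taq artifact;
    anything else is left for manual review."""
    flagged = []
    for acc, segment, seq in records:
        extra = len(seq) - expected[segment]
        if extra <= 0:
            continue
        cause = ("taq_3prime_A" if extra == 1 and seq.endswith("A")
                 else "unknown_insertion")
        flagged.append((acc, cause))
    return flagged

records = [("X1", "NS", "G" * 890),          # accepted length: not flagged
           ("X2", "NS", "G" * 890 + "A"),    # one extra 3' A: Taq artifact
           ("X3", "M", "G" * 1030)]          # 3 extra bases: manual review
print(screen(records))
```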
Pramual, Pairot; Simwisat, Kusumart; Martin, Jon
2016-01-28
Chironomidae are a highly diverse group of insects. Members of this family are often included in programs monitoring the health of freshwater ecosystems. However, difficulty in morphological identification, particularly of larval stages, is the major obstacle to this application. In this study, we tested the efficiency of mitochondrial cytochrome c oxidase I (COI) sequences as the DNA barcoding region for species identification of Chironomidae in Thailand. The results revealed 14 species with a high success rate (>90%) of correct species identification, which suggests the potential usefulness of the technique. However, some morphological species possess high (>3%) intraspecific genetic divergence, suggesting that these species could be species complexes that need further morphological or cytological examination. Sequence-based species delimitation analyses indicated that most specimens identified as Chironomus kiiensis, Tokunaga 1936, in Japan are conspecific with C. striatipennis, Kieffer 1912, although a small number form a separate cluster. A review of the descriptions of Kiefferulus tainanus (Kieffer 1912) and its junior synonym, K. biroi (Kieffer 1918), following our results, suggests that this synonymy is probably not correct and that K. tainanus occurs in Japan, China and Singapore, while K. biroi occurs in India and Thailand. Our results therefore revealed the usefulness of DNA barcoding for correct species identification of Chironomidae, particularly the immature stages. In addition, DNA barcodes could also uncover hidden diversity that can guide further taxonomic study, and offer a more efficient way to identify species than morphological analysis where large numbers of specimens are involved, provided the identifications of DNA barcodes in the databases are correct. Our studies indicate that this is not the case, and we identify cases of misidentification for C. flaviplumus, Tokunaga 1940, and K. tainanus.
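A minimal sketch of the divergence screen implied by the abstract: compute uncorrected pairwise COI distances within each morphospecies and flag species whose maximum intraspecific divergence exceeds the 3% threshold as possible species complexes. Sequences here are short toy strings, not real barcodes.

```python
def p_distance(a, b):
    """Uncorrected pairwise distance: fraction of mismatched positions."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def flag_complexes(specimens, threshold=0.03):
    """Flag morphospecies whose maximum intraspecific divergence exceeds
    the threshold (3% here, per the abstract) as possible complexes."""
    by_species = {}
    for name, seq in specimens:
        by_species.setdefault(name, []).append(seq)
    flagged = []
    for name, seqs in by_species.items():
        worst = max((p_distance(s1, s2)
                     for i, s1 in enumerate(seqs) for s2 in seqs[i + 1:]),
                    default=0.0)
        if worst > threshold:
            flagged.append(name)
    return flagged

specimens = [("sp1", "ACGTACGTACGTACGTACGT"),
             ("sp1", "ACGTACGTACGTACGTACGT"),   # identical: 0% divergence
             ("sp2", "ACGTACGTACGTACGTACGT"),
             ("sp2", "TCGTACGTACGTACGA" + "CGTA"[:4])]
print(flag_complexes(specimens))  # ['sp2']
```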
Wright Nunes, Julie A; Anderson, Cheryl A M; Greene, Jane H; Ikizler, Talat Alp; Cavanaugh, Kerri L
2015-03-31
Reducing dietary sodium has potential to benefit patients with chronic kidney disease (CKD). Little research is available defining dietary sodium knowledge gaps in patients with pre-dialysis CKD. We designed a brief screening tool to rapidly identify patient knowledge gaps related to dietary sodium for patients with CKD not yet on dialysis. A Short Sodium Knowledge Survey (SSKS) was developed and administered to patients with pre-dialysis CKD. We also asked patients if they received counseling on dietary sodium reduction and about recommended intake limits. We performed logistic regression to examine the association between sodium knowledge and patient characteristics. Characteristics of patients who answered all SSKS questions correctly were compared to those who did not. One-hundred fifty-five patients were surveyed. The mean (SD) age was 56.6 (15.1) years, 84 (54%) were men, and 119 (77%) were white. Sixty-seven patients (43.2%) correctly identified their daily intake sodium limit. Fifty-eight (37.4%) were unable to answer all survey questions correctly. In analysis adjusted for age, sex, race, education, health literacy, CKD stage, self-reported hypertension and attendance in a kidney education class, women and patients of non-white race had lower odds of correctly answering survey questions (0.36 [0.16,0.81]; p = 0.01 women versus men and 0.33 [0.14,0.76]; p = 0.01 non-white versus white, respectively). Our survey provides a mechanism to quickly identify dietary sodium knowledge gaps in patients with CKD. Women and patients of non-white race may have knowledge barriers impeding adherence to sodium reduction advice.
Pärn, Jaan; Verhoeven, Jos T A; Butterbach-Bahl, Klaus; Dise, Nancy B; Ullah, Sami; Aasa, Anto; Egorov, Sergey; Espenberg, Mikk; Järveoja, Järvi; Jauhiainen, Jyrki; Kasak, Kuno; Klemedtsson, Leif; Kull, Ain; Laggoun-Défarge, Fatima; Lapshina, Elena D; Lohila, Annalea; Lõhmus, Krista; Maddison, Martin; Mitsch, William J; Müller, Christoph; Niinemets, Ülo; Osborne, Bruce; Pae, Taavi; Salm, Jüri-Ott; Sgouridis, Fotis; Sohar, Kristina; Soosaar, Kaido; Storey, Kathryn; Teemusk, Alar; Tenywa, Moses M; Tournebize, Julien; Truu, Jaak; Veber, Gert; Villa, Jorge A; Zaw, Seint Sann; Mander, Ülo
2018-04-26
The original version of this Article contained an error in the first sentence of the Acknowledgements section, which incorrectly referred to the Estonian Research Council grant identifier as "PUTJD618". The correct version replaces the grant identifier with "PUTJD619". This has been corrected in both the PDF and HTML versions of the Article.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Patrick
The purpose of this Corrective Action Decision Document is to identify and provide the rationale for the recommendation of corrective action alternatives (CAAs) for the 14 CASs within CAU 568. Corrective action investigation (CAI) activities were performed from April 2014 through May 2015, as set forth in the Corrective Action Investigation Plan for Corrective Action Unit 568: Area 3 Plutonium Dispersion Sites, Nevada National Security Site, Nevada; and in accordance with the Soils Activity Quality Assurance Plan, which establishes requirements, technical planning, and general quality practices. The purpose of the CAI was to fulfill data needs as defined during the DQO process. The CAU 568 dataset of investigation results was evaluated based on a data quality assessment. This assessment demonstrated that the dataset is complete and acceptable for use in fulfilling the DQO data needs. Based on the evaluation of analytical data from the CAI, review of future and current operations at the 14 CASs, and the detailed and comparative analysis of the potential CAAs, the following corrective actions are recommended for CAU 568: • No further action is the preferred corrective action for CASs 03-23-17, 03-23-22, 03-23-26. • Closure in place is the preferred corrective action for CAS 03-23-19; 03-45-01; the SE DCBs at CASs 03-23-20, 03-23-23, 03-23-31, 03-23-32, 03-23-33, and 03-23-34; and the Pascal-BHCA at CAS 03-23-31. • Clean closure is the preferred corrective action for CASs 03-08-04, 03-23-30, and 03-26-04; and the four well head covers at CASs 03-23-20, 03-23-23, 03-23-31, and 03-23-33.
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
NASA Technical Reports Server (NTRS)
Schmidt, R. F.
1984-01-01
An analytical/numerical approach to identifying and correcting the aberrations introduced by a general displacement of the feed from the focal point of a single offset paraboloid antenna used in deployable radiometer systems is developed. A 15 meter reflector with 18 meter focal length is assumed for the analysis, which considers far field radiation pattern quality, focal region fields, and aberrations appearing in the aperture plane. The latter are obtained by ray tracing in the transmit mode and are expressed in terms of optical notation. Attention is given to the physical restraints imposed on corrective elements by real microwave systems and to the intermediate near field aspects of the problem in three dimensions. The subject of wave fronts and caustics in the receive mode is introduced for comparative purposes. Several specific examples are given for aberration reduction at eight beamwidths of scan at a frequency of 1.414 GHz.
Guerrini, G; Morabito, A; Samaja, M
2000-10-01
The aim is to determine if a single measurement of blood 2,3-diphosphoglycerate combined with gas analysis (pH, PCO2, PO2 and saturation) can identify the cause of an altered blood-oxygen affinity: the presence of an abnormal haemoglobin or a red cell disorder. The population (n=94) was divided into healthy controls (A, n=14), carriers of red cell disorders (B, n=72) and carriers of high oxygen affinity haemoglobins (C, n=8). These variables were measured both in samples equilibrated at selected PCO2 and PO2 and in venous blood. In the univariable approach applied to equilibrated samples, we correctly identified C subjects in 93.6% or 96.8% of the cases depending on the selected variable, the standard P50 (PO2 at which 50% of haemoglobin is oxygenated) or a composite variable calculated from the above measurements. After introducing the haemoglobin concentration as a further discriminating variable, the A and B subjects were correctly identified in 91.9% or 94.2% of the cases, respectively. These figures become 93.0% or 86.1%, and 93.7% or 94.9% of the cases when using direct readings from venous blood, thereby avoiding the blood equilibration step. This test is also feasible in blood samples stored at 4 degrees C for 48 h, or at room temperature for 8 h.
Li, Yunhai; Lee, Kee Khoon; Walsh, Sean; Smith, Caroline; Hadingham, Sophie; Sorefan, Karim; Cawley, Gavin; Bevan, Michael W
2006-03-01
Establishing transcriptional regulatory networks by analysis of gene expression data and promoter sequences shows great promise. We developed a novel promoter classification method using a Relevance Vector Machine (RVM) and Bayesian statistical principles to identify discriminatory features in the promoter sequences of genes that can correctly classify transcriptional responses. The method was applied to microarray data obtained from Arabidopsis seedlings treated with glucose or abscisic acid (ABA). Of those genes showing >2.5-fold changes in expression level, approximately 70% were correctly predicted as being up- or down-regulated (under 10-fold cross-validation), based on the presence or absence of a small set of discriminative promoter motifs. Many of these motifs have known regulatory functions in sugar- and ABA-mediated gene expression. One promoter motif that was not known to be involved in glucose-responsive gene expression was identified as the strongest classifier of glucose-up-regulated gene expression. We show it confers glucose-responsive gene expression in conjunction with another promoter motif, thus validating the classification method. We were able to establish a detailed model of glucose and ABA transcriptional regulatory networks and their interactions, which will help us to understand the mechanisms linking metabolism with growth in Arabidopsis. This study shows that machine learning strategies coupled to Bayesian statistical methods hold significant promise for identifying functionally significant promoter sequences.
Sources of error in the retracted scientific literature.
Casadevall, Arturo; Steen, R Grant; Fang, Ferric C
2014-09-01
Retraction of flawed articles is an important mechanism for correction of the scientific literature. We recently reported that the majority of retractions are associated with scientific misconduct. In the current study, we focused on the subset of retractions for which no misconduct was identified, in order to identify the major causes of error. Analysis of the retraction notices for 423 articles indexed in PubMed revealed that the most common causes of error-related retraction are laboratory errors, analytical errors, and irreproducible results. The most common laboratory errors are contamination and problems relating to molecular biology procedures (e.g., sequencing, cloning). Retractions due to contamination were more common in the past, whereas analytical errors are now increasing in frequency. A number of publications that have not been retracted despite being shown to contain significant errors suggest that barriers to retraction may impede correction of the literature. In particular, few cases of retraction due to cell line contamination were found despite recognition that this problem has affected numerous publications. An understanding of the errors leading to retraction can guide practices to improve laboratory research and the integrity of the scientific literature. Perhaps most important, our analysis has identified major problems in the mechanisms used to rectify the scientific literature and suggests a need for action by the scientific community to adopt protocols that ensure the integrity of the publication process. © FASEB.
Comparing multilayer brain networks between groups: Introducing graph metrics and recommendations.
Mandke, Kanad; Meier, Jil; Brookes, Matthew J; O'Dea, Reuben D; Van Mieghem, Piet; Stam, Cornelis J; Hillebrand, Arjan; Tewarie, Prejaas
2018-02-01
There is an increasing awareness of the advantages of multi-modal neuroimaging. Networks obtained from different modalities are usually treated in isolation, although accumulating evidence indicates that these networks show non-trivial interdependencies. Even networks obtained from a single modality, such as frequency-band specific functional networks measured with magnetoencephalography (MEG), are often treated independently. Here, we discuss how a multilayer network framework allows for integration of multiple networks into a single network description and how graph metrics can be applied to quantify multilayer network organisation for group comparison. We analyse how well-known biases for single layer networks, such as effects of group differences in link density and/or average connectivity, influence multilayer networks, and we compare four schemes that aim to correct for such biases: the minimum spanning tree (MST), effective graph resistance cost minimisation, efficiency cost optimisation (ECO) and a normalisation scheme based on singular value decomposition (SVD). These schemes can be applied to the layers independently or to the multilayer network as a whole. For correction applied to whole multilayer networks, only the SVD showed sufficient bias correction. For correction applied to individual layers, three schemes (ECO, MST, SVD) could correct for biases. By using generative models as well as empirical MEG and functional magnetic resonance imaging (fMRI) data, we further demonstrated that all schemes were sensitive to changes in network topology when the original networks were perturbed. In conclusion, uncorrected multilayer network analysis leads to biases. These biases may differ between centres and studies and could consequently lead to unreproducible results in a similar manner as for single layer networks. We therefore recommend using correction schemes prior to multilayer network analysis for group comparisons.
Copyright © 2017 Elsevier Inc. All rights reserved.
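The MST correction mentioned above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the 4-node connectivity matrix is invented, and inverting connection strengths so that an MST algorithm keeps the strongest links is a common convention assumed here.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Toy symmetric connectivity matrix (higher = stronger coupling).
conn = np.array([
    [0.0, 0.8, 0.1, 0.3],
    [0.8, 0.0, 0.5, 0.2],
    [0.1, 0.5, 0.0, 0.9],
    [0.3, 0.2, 0.9, 0.0],
])

# MST algorithms minimise total edge weight, so invert the strengths
# to retain the *strongest* connections in the tree.
dist = np.where(conn > 0, 1.0 / conn, 0.0)
mst = minimum_spanning_tree(dist).toarray()

# An MST over N nodes always has N - 1 edges, which fixes link density
# across subjects and so removes density-related group bias.
n_edges = np.count_nonzero(mst)
```

Because every subject's backbone has exactly N - 1 edges, subsequent graph metrics are compared on equal footing.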
Stöggl, Thomas; Holst, Anders; Jonasson, Arndt; Andersson, Erik; Wunsch, Tobias; Norström, Christer; Holmberg, Hans-Christer
2014-01-01
The purpose of the current study was to develop and validate an automatic algorithm for classification of cross-country (XC) ski-skating gears (G) using Smartphone accelerometer data. Eleven XC skiers (seven men, four women) with regional-to-international levels of performance carried out roller skiing trials on a treadmill using fixed gears (G2left, G2right, G3, G4left, G4right) and a 950-m trial using different speeds and inclines, applying gears and sides as they normally would. Gear classification by the Smartphone (worn on the chest) was compared with classification based on video recordings. For machine learning, a collective database was compared to individual data. The Smartphone application identified the trials with fixed gears correctly in all cases. In the 950-m trial, participants executed 140 ± 22 cycles as assessed by video analysis, with the automatic Smartphone application giving a similar value. Based on collective data, gears were identified correctly 86.0% ± 8.9% of the time, a value that rose to 90.3% ± 4.1% (P < 0.01) with machine learning from individual data. Classification was most often incorrect during transition between gears, especially to or from G3. Identification was most often correct for skiers who made relatively few transitions between gears. The accuracy of the automatic procedure for identifying G2left, G2right, G3, G4left and G4right was 96%, 90%, 81%, 88% and 94%, respectively. The algorithm identified gears correctly 100% of the time when a single gear was used and 90% of the time when different gears were employed during a variable protocol. This algorithm could be improved with respect to identification of transitions between gears or the side employed within a given gear. PMID:25365459
NASA Astrophysics Data System (ADS)
Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.
2016-12-01
Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction that is critical for a range of water resources applications, such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and also too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross validation show that the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS even with the analog approach alone. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology.
Impacts of various factors such as ensemble size, lead time, and choice of climate indices will also be discussed.
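The first (analog-selection) step can be sketched as follows. This is a simplified stand-in for the study's method: the archive, the scalar basin-mean forecast, and the use of absolute difference as the similarity measure are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical archive: 100 past coarse forecasts (e.g. basin-mean
# precipitation) and their matching fine-scale analysis fields
# (here 10 grid cells each); values are synthetic.
past_fcst = rng.gamma(2.0, 2.0, size=100)
past_anal = past_fcst[:, None] * rng.uniform(0.5, 1.5, size=(100, 10))

def analog_ensemble(current_fcst, k=5):
    """Pick the k archived forecasts closest to the current one and
    return their fine-scale analysis fields as the post-processed
    ensemble (downscaling and bias correction in one step)."""
    idx = np.argsort(np.abs(past_fcst - current_fcst))[:k]
    return past_anal[idx]

ens = analog_ensemble(4.2)
```

The second step of the study would then reweight these ensemble members by the match between archived and current climate indices; that Bayesian weighting is omitted here.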
Graves, T.A.; Farley, S.; Goldstein, M.I.; Servheen, C.
2007-01-01
We identified primary habitat and functional corridors across a landscape using Global Positioning System (GPS) collar locations of brown bears (Ursus arctos). After deriving density, speed, and angular deviation of movement, we classified landscape function for a group of animals with a cluster analysis. We described areas with high amounts of sinuous movement as primary habitat patches and areas with high amounts of very directional, fast movement as highly functional bear corridors. The time between bear locations and scale of analysis influenced the number and size of corridors identified. Bear locations should be collected at intervals ≤6 h to correctly identify travel corridors. Our corridor identification technique will help managers move beyond the theoretical discussion of corridors and linkage zones to active management of landscape features that will preserve connectivity. © 2007 Springer Science+Business Media, Inc.
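The per-step movement metrics that feed the cluster analysis can be derived from consecutive fixes. A minimal sketch, assuming planar coordinates (e.g. km in a projected system) and a fixed interval between fixes; the straight-line toy track is invented to show how a corridor signature (high speed, near-zero turning angle) looks:

```python
import numpy as np

def movement_metrics(x, y, dt_hours):
    """Step speed and turning angle from consecutive GPS fixes."""
    dx, dy = np.diff(x), np.diff(y)
    speed = np.hypot(dx, dy) / dt_hours          # distance per hour
    heading = np.arctan2(dy, dx)
    # Turning angle between successive steps, wrapped to (-pi, pi].
    turn = np.diff(heading)
    turn = (turn + np.pi) % (2 * np.pi) - np.pi
    return speed, turn

# A straight, steady track: directional movement with zero turning,
# the kind of signature the study labels a functional corridor.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.0, 0.0, 0.0])
speed, turn = movement_metrics(x, y, dt_hours=6.0)
```

Sinuous movement in primary habitat would instead show low speeds and large, variable turning angles.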
McVea, Charmaine S; Gow, Kathryn; Lowe, Roger
2011-07-01
This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.
Woods, Carl T; Raynor, Annette J; Bruce, Lyndell; McDonald, Zane
2016-01-01
This study examined if a video decision-making task could discriminate talent-identified junior Australian football players from their non-talent-identified counterparts. Participants were recruited from the 2013 under 18 (U18) West Australian Football League competition and classified into two groups: talent-identified (State U18 Academy representatives; n = 25; 17.8 ± 0.5 years) and non-talent-identified (non-State U18 Academy selection; n = 25; 17.3 ± 0.6 years). Participants completed a video decision-making task consisting of 26 clips sourced from Australian Football League game-day footage, recording responses on a sheet provided. A score of "1" was given for correct and "0" for incorrect responses, with the participant's total score used as the criterion value. One-way analysis of variance tested the main effect of "status" on the task criterion, whilst a bootstrapped receiver operating characteristic (ROC) curve assessed the discriminant ability of the task. An area under the curve (AUC) of 1 (100%) represented perfect discrimination. Between-group differences were evident (P < 0.05) and the ROC curve was maximised with a score of 15.5/26 (60%) (AUC = 89.0%), correctly classifying 92% and 76% of the talent-identified and non-talent-identified participants, respectively. Future research should investigate the mechanisms leading to the superior decision-making observed in the talent-identified group.
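The ROC analysis used here is straightforward to reproduce in outline. A minimal sketch with scikit-learn; the twelve scores below are invented for illustration and only the analysis steps (ROC curve, AUC, Youden's J for the optimal cut-off) mirror the study:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Invented task scores (out of 26) for talent-identified (1) and
# non-talent-identified (0) players.
scores = np.array([20, 18, 17, 16, 15, 14, 16, 15, 13, 12, 11, 10])
status = np.array([1,  1,  1,  1,  1,  1,  0,  0,  0,  0,  0,  0])

fpr, tpr, thresholds = roc_curve(status, scores)
area = auc(fpr, tpr)

# Youden's J (TPR - FPR) picks the cut-off that best separates the
# groups, analogous to the 15.5/26 criterion reported above.
best_cutoff = thresholds[np.argmax(tpr - fpr)]
```

An AUC of 0.5 would mean the task carries no discriminant information; values near 1 indicate it separates the groups well.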
Stöckel, Stephan; Meisel, Susann; Elschner, Mandy; Melzer, Falk; Rösch, Petra; Popp, Jürgen
2015-01-01
Burkholderia mallei (the etiologic agent of glanders in equines and rarely humans) and Burkholderia pseudomallei, causing melioidosis in humans and animals, are designated category B biothreat agents. The intrinsically high resistance of both agents to many antibiotics, their potential use as bioweapons, and their low infectious dose necessitate rapid and accurate detection methods. Current methods to identify these organisms may require up to 1 week, as they rely on phenotypic characteristics and an extensive set of biochemical reactions. In this study, Raman microspectroscopy, a cultivation-independent typing technique for single bacterial cells with the potential for being a rapid point-of-care analysis system, is evaluated to identify and differentiate B. mallei and B. pseudomallei within hours. Here, not only broth-cultured microbes but also bacteria isolated from pelleted animal feedstuff were taken into account. A database of Raman spectra allowed a calculation of classification functions, which were trained to differentiate Raman spectra of not only both pathogens but also of five further Burkholderia spp. and four species of the closely related genus Pseudomonas. The developed two-stage classification system comprising two support vector machine (SVM) classifiers was then challenged by a test set of 11 samples to simulate the case of a real-world scenario, when "unknown samples" are to be identified. In the end, all test set samples were identified correctly, even if the contained bacterial strains were not incorporated in the database before or were isolated from animal feedstuff. Specifically, the five test samples bearing B. mallei and B. pseudomallei were correctly identified at the species level with accuracies between 93.9 and 98.7%. The sample analysis itself requires no biomass enrichment step prior to the analysis and can be performed under biosafety level 1 (BSL 1) conditions after inactivating the bacteria with formaldehyde.
Ferrazzi, Giulio; Kuklisova Murgasova, Maria; Arichi, Tomoki; Malamateniou, Christina; Fox, Matthew J; Makropoulos, Antonios; Allsop, Joanna; Rutherford, Mary; Malik, Shaihan; Aljabar, Paul; Hajnal, Joseph V
2014-11-01
There is growing interest in exploring fetal functional brain development, particularly with Resting State fMRI. However, during a typical fMRI acquisition, the womb moves due to maternal respiration and the fetus may perform large-scale and unpredictable movements. Conventional fMRI processing pipelines, which assume that brain movements are infrequent or at least small, are not suitable. Previous published studies have tackled this problem by adopting conventional methods and discarding as much as 40% or more of the acquired data. In this work, we developed and tested a processing framework for fetal Resting State fMRI, capable of correcting gross motion. The method comprises bias field and spin history corrections in the scanner frame of reference, combined with slice to volume registration and scattered data interpolation to place all data into a consistent anatomical space. The aim is to recover an ordered set of samples suitable for further analysis using standard tools such as Group Independent Component Analysis (Group ICA). We have tested the approach using simulations and in vivo data acquired at 1.5 T. After full motion correction, Group ICA performed on a population of 8 fetuses extracted 20 networks, 6 of which were identified as matching those previously observed in preterm babies. Copyright © 2014 Elsevier Inc. All rights reserved.
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2° with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakajima, K.; Bunko, H.; Tada, A.
1984-01-01
Twenty-one patients with the Wolff-Parkinson-White (WPW) syndrome who underwent surgical division of the accessory conduction pathway (ACP) were studied by gated blood-pool scintigraphy. In each case, a functional image of the phase was generated, based on the fundamental frequency of the Fourier transform. The location of the ACP was confirmed by electrophysiologic study, epicardial mapping, and surgery. Phase analysis identified the side of preexcitation correctly in 16 out of 20 patients with WPW syndrome with a delta wave. All patients with right-cardiac type (N=9) had initial contraction in the right ventricle (RV). In patients with left-cardiac type (N=10), six had initial movement in the left ventricle (LV); but in the other four the ACPs in the anterior or lateral wall of the LV could not be detected. In patients with multiple ACPs (N=2), one right-cardiac type had initial contraction in the RV, while in the other (with an intermittent WPW syndrome) the ACP was not detected. These observations indicate that abnormal wall motion is associated with the conduction anomalies of the WPW syndrome. We conclude that phase analysis can correctly identify the side of initial contraction in the WPW syndrome before and after surgery. However, as a method of preoperative study, it seems difficult to determine the precise site of the ACP by phase analysis alone.
Contrast-detail phantom scoring methodology.
Thomas, Jerry A; Chakrabarti, Kish; Kaczmarek, Richard; Romanyukha, Alexander
2005-03-01
Published results of medical imaging studies which make use of contrast detail mammography (CDMAM) phantom images for analysis are difficult to compare since data are often not analyzed in the same way. In order to address this situation, the concept of ideal contrast detail curves is suggested. The ideal contrast detail curves are constructed based on the requirement of having the same product of the diameter and contrast (disk thickness) of the minimal correctly determined object for every row of the CDMAM phantom image. A correlation and comparison of five different quality parameters of the CDMAM phantom image determined for obtained ideal contrast detail curves is performed. The image quality parameters compared include: (1) contrast detail curve--a graph correlating the "minimal correct reading" diameter with disk thickness; (2) correct observation ratio--the ratio of the number of correctly identified objects to the actual total number of objects multiplied by 100; (3) image quality figure--the sum of the product of the diameter of the smallest scored object and its relative contrast; (4) figure-of-merit--the zero disk diameter value obtained from extrapolation of the contrast detail curve to the origin (e.g., zero disk diameter); and (5) k-factor--the product of the thickness and the diameter of the smallest correctly identified disks. The analysis carried out showed the existence of a nonlinear relationship between the above parameters, which means that use of different parameters of CDMAM image quality potentially can cause different conclusions about changes in image quality. Construction of the ideal contrast detail curves for CDMAM phantom is an attempt to determine the quantitative limits of the CDMAM phantom as employed for image quality evaluation. These limits are determined by the relationship between certain parameters of a digital mammography system and the set of the gold disk sizes in the CDMAM phantom.
Recommendations are made on the selection of CDMAM phantom regions that should be used for scoring at different image quality levels, and on which scoring methodology may be most appropriate. Special attention is also paid to the use of the CDMAM phantom for image quality assessment of digital mammography systems, particularly in the vicinity of the Nyquist frequency.
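The "ideal curve" criterion and the image quality figure (parameter 3 above) reduce to simple arithmetic per phantom row. A toy sketch with invented per-row readings; the constant diameter-times-thickness product of 0.1 is chosen deliberately so this hypothetical observer sits exactly on an ideal contrast detail curve:

```python
# Invented per-row data: disk thickness (contrast surrogate) and the
# diameter of the smallest disk correctly identified in that row.
rows = [
    {"thickness_um": 0.05, "min_diam_mm": 2.00},
    {"thickness_um": 0.10, "min_diam_mm": 1.00},
    {"thickness_um": 0.20, "min_diam_mm": 0.50},
]

# Image quality figure: sum over rows of (smallest correctly scored
# diameter) x (its contrast/thickness).
iqf = sum(r["thickness_um"] * r["min_diam_mm"] for r in rows)

# Ideal-curve check: the diameter x thickness product of the minimal
# correctly determined object should be the same in every row.
products = [r["thickness_um"] * r["min_diam_mm"] for r in rows]
```

Real scoring would also fold in the correct observation ratio and the extrapolated figure-of-merit; those are omitted here.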
Chen, Shan; Li, Xiao-ning; Liang, Yi-zeng; Zhang, Zhi-min; Liu, Zhao-xia; Zhang, Qi-ming; Ding, Li-xia; Ye, Fei
2010-08-01
During Raman spectroscopy analysis, organic molecules and contaminations will obscure or swamp Raman signals. The present study starts from Raman spectra of prednisone acetate tablets and glibenclamide tablets, which are acquired from the BWTek i-Raman spectrometer. The background is corrected by the R package baselineWavelet. Then principal component analysis and random forests are used to perform clustering analysis. Through analyzing the Raman spectra of the two medicines, the accuracy and validity of this background-correction algorithm are checked and the influence of fluorescence background on Raman spectra clustering analysis is discussed. Thus, it is concluded that it is important to correct fluorescence background for further analysis, and an effective background-correction solution is provided for clustering or other analysis.
Body, Barbara A; Beard, Melodie A; Slechta, E Susan; Hanson, Kimberly E; Barker, Adam P; Babady, N Esther; McMillen, Tracy; Tang, Yi-Wei; Brown-Elliott, Barbara A; Iakhiaeva, Elena; Vasireddy, Ravikiran; Vasireddy, Sruthi; Smith, Terry; Wallace, Richard J; Turner, S; Curtis, L; Butler-Wu, Susan; Rychert, Jenna
2018-06-01
This multicenter study was designed to assess the accuracy and reproducibility of the Vitek MS v3.0 matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass spectrometry system for identification of Mycobacterium and Nocardia species compared to DNA sequencing. A total of 963 clinical isolates representing 51 taxa were evaluated. In all, 663 isolates were correctly identified to the species level (69%), with another 231 (24%) correctly identified to the complex or group level. Fifty-five isolates (6%) could not be identified despite repeat testing. All of the tuberculous mycobacteria (45/45; 100%) and most of the nontuberculous mycobacteria (569/606; 94%) were correctly identified at least to the group or complex level. However, not all species or subspecies within the M. tuberculosis , M. abscessus , and M. avium complexes and within the M. fortuitum and M. mucogenicum groups could be differentiated. Among the 312 Nocardia isolates tested, 236 (76%) were correctly identified to the species level, with an additional 44 (14%) correctly identified to the complex level. Species within the N. nova and N. transvalensis complexes could not always be differentiated. Eleven percent of the isolates (103/963) underwent repeat testing in order to get a final result. Identification of a representative set of Mycobacterium and Nocardia species was highly reproducible, with 297 of 300 (99%) replicates correctly identified using multiple kit lots, instruments, analysts, and sites. These findings demonstrate that the system is robust and has utility for the routine identification of mycobacteria and Nocardia in clinical practice. Copyright © 2018 American Society for Microbiology.
Investigating bias in squared regression structure coefficients
Nimon, Kim F.; Zientek, Linda R.; Thompson, Bruce
2015-01-01
The importance of structure coefficients and analogs of regression weights for analysis within the general linear model (GLM) has been well-documented. The purpose of this study was to investigate bias in squared structure coefficients in the context of multiple regression and to determine if a formula that had been shown to correct for bias in squared Pearson correlation coefficients and coefficients of determination could be used to correct for bias in squared regression structure coefficients. Using data from a Monte Carlo simulation, this study found that squared regression structure coefficients corrected with Pratt's formula produced less biased estimates and might be more accurate and stable estimates of population squared regression structure coefficients than estimates with no such corrections. While our findings are in line with prior literature that identified multicollinearity as a predictor of bias in squared regression structure coefficients but not coefficients of determination, the findings from this study are unique in that the level of predictive power, number of predictors, and sample size were also observed to contribute to bias in squared regression structure coefficients. PMID:26217273
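For readers unfamiliar with structure coefficients: a regression structure coefficient is the correlation between a predictor and the predicted criterion scores, and the squared version is what the study examines for bias. A minimal sketch on synthetic data (the coefficients and sample size are invented); Pratt's bias correction itself is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Two correlated predictors and a criterion (synthetic illustration).
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Ordinary least squares fit, then predicted criterion scores.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ beta

# Squared structure coefficients: squared correlation between each
# predictor and yhat. Unlike beta weights, these are not distorted
# by the predictors sharing variance.
rs2_x1 = np.corrcoef(x1, yhat)[0, 1] ** 2
rs2_x2 = np.corrcoef(x2, yhat)[0, 1] ** 2
```

These sample values overestimate their population counterparts in small samples, which is the bias the study's correction formula targets.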
Haley, Valerie B; DiRienzo, A Gregory; Lutterloh, Emily C; Stricof, Rachel L
2014-01-01
To assess the effect of multiple sources of bias on state- and hospital-specific National Healthcare Safety Network (NHSN) laboratory-identified Clostridium difficile infection (CDI) rates. Sensitivity analysis. A total of 124 New York hospitals in 2010. New York NHSN CDI events from audited hospitals were matched to New York hospital discharge billing records to obtain additional information on patient age, length of stay, and previous hospital discharges. "Corrected" hospital-onset (HO) CDI rates were calculated after (1) correcting inaccurate case reporting found during audits, (2) incorporating knowledge of laboratory results from outside hospitals, (3) excluding days when patients were not at risk from the denominator of the rates, and (4) adjusting for patient age. Data sets were simulated with each of these sources of bias reintroduced individually and combined. The simulated rates were compared with the corrected rates. Performance (ie, better, worse, or average compared with the state average) was categorized, and misclassification compared with the corrected data set was measured. Counting days patients were not at risk in the denominator reduced the state HO rate by 45% and resulted in 8% misclassification. Age adjustment and reporting errors also shifted rates (7% and 6% misclassification, respectively). Changing the NHSN protocol to require reporting of age-stratified patient-days and adjusting for patient-days at risk would improve comparability of rates across hospitals. Further research is needed to validate the risk-adjustment model before these data should be used as hospital performance measures.
A Proper-Motion Corrected, Cross-Matched Catalog Of M Dwarfs In SDSS And FIRST
NASA Astrophysics Data System (ADS)
Arai, Erin; West, A. A.; Thyagarajan, N.; Agüeros, M.; Helfand, D.
2011-05-01
We present a preliminary analysis of M dwarfs identified in both the Sloan Digital Sky Survey (SDSS) and the Very Large Array's (VLA) Faint Images of the Radio Sky at Twenty-centimeters survey (FIRST). The presence of magnetic fields is often associated with indirect magnetic activity measurements, such as H-alpha or X-ray emission. Radio emission, in contrast, is directly proportional to the magnetic field strength in addition to being another measure of activity. We search for stellar radio emission by cross-matching the SDSS DR7 M dwarf sample with the FIRST catalog. The SDSS data allow us to examine the spectra of our objects and correlate the magnetic activity (H-alpha) with the magnetic field strength (radio emission). Accurate positions and proper motions are important for obtaining a complete list of overlapping targets. Positions in FIRST and SDSS need to be proper motion corrected in order to ensure unique target matches, since nearby M dwarfs can have significant proper motions (up to 1'' per year). Some previous studies have neglected the significance of proper motions in identifying overlapping targets between SDSS and FIRST; we correct for some of these previous oversights. In addition, the FIRST data were taken in multiple epochs; individual images need to be proper motion corrected before the images can be co-added. Our cross-match catalog puts important constraints on models of magnetic field generation in low-mass stars, as well as on the habitability of any attendant planets.
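The scale of the proper-motion correction is easy to see with a small sketch. This assumes the common mu_alpha* convention (the RA motion already includes the cos(dec) factor) and an invented star; it is an illustration of the magnitude of the effect, not the catalog's pipeline:

```python
import numpy as np

def apply_proper_motion(ra_deg, dec_deg, pm_ra_masyr, pm_dec_masyr, dt_yr):
    """Shift a catalog position by proper motion over dt_yr years.
    pm_ra_masyr is assumed to include the cos(dec) factor
    (mu_alpha* convention), in milliarcseconds per year."""
    mas_to_deg = 1.0 / 3.6e6
    ra = ra_deg + pm_ra_masyr * dt_yr * mas_to_deg / np.cos(np.radians(dec_deg))
    dec = dec_deg + pm_dec_masyr * dt_yr * mas_to_deg
    return ra, dec

# A nearby M dwarf moving 1''/yr (1000 mas/yr): after 5 years between
# survey epochs it has shifted 5'', far more than a typical
# cross-match radius, so an uncorrected match would miss it.
ra, dec = apply_proper_motion(180.0, 0.0, 1000.0, 0.0, 5.0)
```

The same correction, applied per epoch, is what allows the multi-epoch FIRST images to be co-added without smearing the source.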
Improving Transgender Healthcare in the New York City Correctional System.
Jaffer, Mohamed; Ayad, John; Tungol, Jose Gabriel; MacDonald, Ross; Dickey, Nathaniel; Venters, Homer
2016-04-01
Correctional settings create unique challenges for patients with special needs, including transgender patients, who have an increased rate of overall discrimination, sexual abuse, healthcare disparities, and improper housing. As part of our correctional health quality improvement process, we sought to review and evaluate the adequacy of care for transgender patients in the New York City jail system. Using correctional pharmacy records, transgender patients receiving hormonal treatment were identified. A brief in-person survey was conducted to evaluate their care in the community before incarceration, medical care in jail, and experience in the jail environment. Survey findings and analysis of transgender patient healthcare-related complaints revealed opportunities for improvements in the provision of care and staff understanding of this population. Utilizing these findings, we conducted lesbian, gay, bisexual, and transgender (LGBT) trainings in all 12 jail clinics for medical, nursing, and mental health staff. Three months after LGBT training, patient complaints dropped by over 50%. After the development and implementation of a newly revised transgender healthcare policy, complaints dropped to zero within 6 months. Our efforts to assess the quality of care provided to transgender patients revealed significant areas for improvement. Although we have made important gains in providing quality care through the implementation of policies and procedures rooted in community standards and the express wishes of our patients, we continue to engage this patient population to identify other issues that impact their health and well-being in the jail environment.
The description of cough sounds by healthcare professionals
Smith, Jaclyn A; Ashurst, H Louise; Jack, Sandy; Woodcock, Ashley A; Earis, John E
2006-01-01
Background Little is known of the language healthcare professionals use to describe cough sounds. We aimed to examine how they describe cough sounds and to assess whether these descriptions suggested they appreciate the basic sound qualities (as assessed by acoustic analysis) and the underlying diagnosis of the patient coughing. Methods 53 health professionals from two large respiratory tertiary referral centres were recruited; 22 doctors and 31 staff from professions allied to medicine. Participants listened to 9 sequences of spontaneous cough sounds from common respiratory diseases. For each cough they selected patient gender, the most appropriate descriptors and a diagnosis. Cluster analysis was performed to assess which cough sounds attracted similar descriptions. Results Gender was correctly identified in 93% of cases. The presence or absence of mucus was correct in 76.1% and wheeze in 39.3% of cases. However, identification of the clinical diagnosis from cough was poor, at 34.0%. Cluster analysis showed that coughs with the same acoustic properties, rather than the same diagnoses, attracted the same descriptions. Conclusion These results suggest that healthcare professionals can recognise some of the qualities of cough sounds but are poor at making diagnoses from them. It remains to be seen whether in the future cough sound acoustics will provide useful clinical information and whether their study will lead to the development of useful new outcome measures in cough monitoring. PMID:16436200
Massett, Holly A.; Mishkin, Grace; Rubinstein, Larry; Ivy, S. Percy; Denicoff, Andrea; Godwin, Elizabeth; DiPiazza, Kate; Bolognese, Jennifer; Zwiebel, James A.; Abrams, Jeffrey S.
2016-01-01
Accruing patients in a timely manner represents a significant challenge to early phase cancer clinical trials. The NCI Cancer Therapy Evaluation Program analyzed 19 months of corrective action plans (CAPs) received for slow-accruing Phase 1 and 2 trials to identify slow accrual reasons, evaluate whether proposed corrective actions matched these reasons, and assess the CAP impact on trial accrual, duration, and likelihood of meeting primary scientific objectives. Of the 135 CAPs analyzed, 69 were for Phase 1 trials and 66 for Phase 2 trials. Primary reasons cited for slow accrual were safety/toxicity (Phase 1: 48%), design/protocol concerns (Phase 1: 42%, Phase 2: 33%), and eligibility criteria (Phase 1: 41%, Phase 2: 35%). The most commonly proposed corrective actions were adding institutions (Phase 1: 43%, Phase 2: 85%) and amending the trial to change eligibility or design (Phase 1: 55%, Phase 2: 44%). Only 40% of CAPs provided proposed corrective actions that matched the reasons given for slow accrual. Seventy percent of trials were closed to accrual at time of analysis (Phase 1=48; Phase 2=46). Of these, 67% of Phase 1 and 70% of Phase 2 trials met their primary objectives, but they were active three times longer than projected. Among closed trials, 24% had an accrual rate increase associated with a greater likelihood of meeting their primary scientific objectives. Ultimately, trials receiving CAPs saw improved accrual rates. Future trials may benefit from implementing CAPs early in trial lifecycles, but it may be more beneficial to invest in earlier accrual planning. PMID:27401246
NASA Astrophysics Data System (ADS)
Lovejoy, McKenna R.; Wickert, Mark A.
2017-05-01
A known problem with infrared imaging devices is their non-uniformity. This non-uniformity is the result of dark current, amplifier mismatch, and the individual photoresponse of the detectors. To improve performance, non-uniformity correction (NUC) techniques are applied. Standard calibration techniques use linear or piecewise-linear models to approximate the non-uniform gain and offset characteristics as well as the nonlinear response. Piecewise-linear models perform better than the one- and two-point models, but in many cases require storing an unmanageable number of correction coefficients. Most nonlinear NUC algorithms use a second-order polynomial to improve performance and allow for a minimal number of stored coefficients. However, advances in technology now make higher-order polynomial NUC algorithms feasible. This study comprehensively tests higher-order polynomial NUC algorithms targeted at short-wave infrared (SWIR) imagers. Using data collected from actual SWIR cameras, the nonlinear techniques and corresponding performance metrics are compared with current linear methods, including the standard one- and two-point algorithms. Machine learning, including principal component analysis, is explored for identifying and replacing bad pixels. The data sets are analyzed and the impact of hardware implementation is discussed. Average floating-point results show 30% less non-uniformity in post-corrected data when using a third-order polynomial correction algorithm rather than a second-order algorithm. To maximize overall performance, a trade-off analysis on polynomial order and coefficient precision is performed. Comprehensive testing across multiple data sets provides next-generation model validation and performance benchmarks for higher-order polynomial NUC methods.
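The per-pixel polynomial correction described in this abstract can be sketched in a few lines: fit a third-order polynomial mapping each pixel's raw counts to known reference radiance levels, then apply the stored coefficients frame by frame. The camera model, array sizes, and radiance levels below are invented for the illustration; this is not the study's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated calibration data: each pixel views K uniform radiance levels.
# 'raw' holds the nonlinear, non-uniform per-pixel responses; 'levels'
# are the known reference radiances (all values illustrative).
K, H, W = 6, 4, 4
levels = np.linspace(0.1, 1.0, K)                  # reference radiances
gain = 1.0 + 0.1 * rng.standard_normal((H, W))     # per-pixel gain
offset = 0.05 * rng.standard_normal((H, W))        # per-pixel offset
raw = gain * levels[:, None, None] + offset + 0.02 * levels[:, None, None] ** 2

# Fit a third-order polynomial per pixel: radiance = p(raw counts).
order = 3
coeffs = np.empty((order + 1, H, W))
for i in range(H):
    for j in range(W):
        coeffs[:, i, j] = np.polyfit(raw[:, i, j], levels, order)

def correct(frame):
    """Apply the stored per-pixel polynomial to a raw frame."""
    out = np.zeros_like(frame)
    for i in range(frame.shape[0]):
        for j in range(frame.shape[1]):
            out[i, j] = np.polyval(coeffs[:, i, j], frame[i, j])
    return out

# Correcting a calibration frame should recover a uniform radiance.
corrected = correct(raw[2])
```

Storing four coefficients per pixel is the trade-off the abstract mentions: more than a two-point (gain/offset) correction, far fewer than a dense piecewise-linear lookup.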
Pathway analysis with next-generation sequencing data.
Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao
2015-04-01
Although pathway analysis methods have been developed and successfully applied to association studies of common variants, statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators have observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their inability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic based on smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. Through intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with rare variants, common variants, or both has the correct type 1 error rates. We also evaluate the power of the SFPCA-based statistic and 22 additional existing statistics, and find that the SFPCA-based statistic has much higher power than the other statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic yields much smaller P-values for identifying pathway associations than other existing methods.
Identification of misspelled words without a comprehensive dictionary using prevalence analysis.
Turchin, Alexander; Chu, Julia T; Shubina, Maria; Einbinder, Jonathan S
2007-10-11
Misspellings are common in medical documents and can be an obstacle to information retrieval. We evaluated an algorithm to identify misspelled words through analysis of their prevalence in a representative body of text. We evaluated the algorithm's accuracy of identifying misspellings of 200 anti-hypertensive medication names on 2,000 potentially misspelled words randomly selected from narrative medical documents. Prevalence ratios (the frequency of the potentially misspelled word divided by the frequency of the non-misspelled word) in physician notes were computed by the software for each of the words. The software results were compared to the manual assessment by an independent reviewer. Area under the ROC curve for identification of misspelled words was 0.96. Sensitivity, specificity, and positive predictive value were 99.25%, 89.72% and 82.9% for the prevalence ratio threshold (0.32768) with the highest F-measure (0.903). Prevalence analysis can be used to identify and correct misspellings with high accuracy.
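The prevalence-ratio idea from this abstract is simple enough to sketch directly: a candidate word is flagged as a misspelling when its corpus frequency is small relative to the correctly spelled word's frequency. The threshold 0.32768 is the one reported in the abstract; the corpus counts and drug-name misspellings below are invented for the example.

```python
from collections import Counter

# Illustrative corpus word counts standing in for frequencies observed in
# a large body of physician notes (all counts are made up).
counts = Counter({
    "lisinopril": 1200, "lisinoprol": 9,
    "atenolol": 800, "atenelol": 30,
    "metoprolol": 950,
})

THRESHOLD = 0.32768  # prevalence-ratio cutoff reported in the abstract

def is_misspelling(candidate, reference, counts):
    """Flag 'candidate' as a misspelling of 'reference' when the ratio of
    its corpus frequency to the reference's frequency is below threshold."""
    ratio = counts[candidate] / counts[reference]
    return ratio < THRESHOLD

print(is_misspelling("lisinoprol", "lisinopril", counts))   # ratio 9/1200
print(is_misspelling("metoprolol", "metoprolol", counts))   # ratio 1.0
```

In practice the candidate-reference pairs would come from an edit-distance match against the medication lexicon; only the ratio test is shown here.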
Grinde, Kelsey E.; Arbet, Jaron; Green, Alden; O'Connell, Michael; Valcarcel, Alessandra; Westra, Jason; Tintle, Nathan
2017-01-01
To date, gene-based rare variant testing approaches have focused on aggregating information across sets of variants to maximize statistical power in identifying genes showing significant association with diseases. Beyond identifying genes that are associated with diseases, the identification of causal variant(s) in those genes and estimation of their effect is crucial for planning replication studies and characterizing the genetic architecture of the locus. However, we illustrate that straightforward single-marker association statistics can suffer from substantial bias introduced by conditioning on gene-based test significance, due to the phenomenon often referred to as "winner's curse." We illustrate the ramifications of this bias on variant effect size estimation and variant prioritization/ranking approaches, outline parameters of genetic architecture that affect this bias, and propose a bootstrap resampling method to correct for this bias. We find that our correction method significantly reduces the bias due to winner's curse (average two-fold decrease in bias, p < 2.2 × 10⁻⁶) and, consequently, substantially improves mean squared error and variant prioritization/ranking. The method is particularly helpful in adjustment for winner's curse effects when the initial gene-based test has low power and for relatively more common, non-causal variants. Adjustment for winner's curse is recommended for all post-hoc estimation and ranking of variants after a gene-based test. Further work is necessary to continue seeking ways to reduce bias and improve inference in post-hoc analysis of gene-based tests under a wide variety of genetic architectures. PMID:28959274
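The bootstrap bias correction described above can be illustrated in miniature: resample the data, recompute the effect only in replicates where the screening test passes (mimicking the selection that causes winner's curse), and subtract the resulting bias estimate. The one-sample z-test screen, effect definition, and all numbers below are invented stand-ins, not the authors' gene-based test.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: phenotype values for carriers of a variant in a gene that
# was only followed up because a screening test was significant. The
# "effect" here is simply the carrier mean (values are illustrative).
pheno = rng.normal(loc=0.15, scale=1.0, size=200)

def screen_passes(x):
    """Stand-in for the gene-based test: a one-sample z-test at |z| > 1.96."""
    z = x.mean() / (x.std(ddof=1) / np.sqrt(len(x)))
    return abs(z) > 1.96

naive = pheno.mean()  # effect estimate, reported only after a significant screen

# Bootstrap estimate of the selection-induced (winner's curse) bias:
# resample, keep only replicates where the screen passes, and compare
# their mean effect with the naive estimate.
boot = []
for _ in range(2000):
    sample = rng.choice(pheno, size=len(pheno), replace=True)
    if screen_passes(sample):
        boot.append(sample.mean())
bias = np.mean(boot) - naive
corrected = naive - bias
```

Conditioning the bootstrap replicates on passing the screen reproduces the selection effect, which is what lets the resampling distribution estimate the bias.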
Awareness of a rape crisis center and knowledge about sexual violence among high school adolescents.
Lee, Sara H; Stark, Amrita K; O'Riordan, Mary Ann; Lazebnik, Rina
2015-02-01
This study examined awareness among adolescents of a local rape crisis center as well as their knowledge about sexual violence. The Cleveland Rape Crisis Center (CRCC) conducts sexual violence prevention programs for high school students. A written, anonymous survey was distributed to students prior to the start of the program. Students were asked if they had heard of the CRCC; knowledge about sexual violence was assessed with a series of 7 statements (rape myths) that participants identified as true or false. Surveys were reviewed retrospectively. Analyses were carried out for individual questions and frequencies compared using chi-square analysis. A total of 1633 surveys were collected; 1118 (68.5%) participants were female and 514 (31.5%) were male; ages ranged from 12 to 19 years. Respondents described themselves as being of European descent (45.9%), African descent (26.2%), or mixed race (17.7%). Just over half (863, 52.9%) of survey respondents had heard of the CRCC. Over half (950, 58.2%) of participants answered 5 or more questions correctly (range of correct answers 0 to 7). In general, more participants who were aware of the CRCC were able to identify statements about rape correctly (P < .01 for statements 1, 4, 5, 6, and 7, P < .001 for ≥ 5 correct). Age, gender, and race were all significantly associated with knowledge about rape. Females were consistently more likely to get an answer correct, as were participants of European descent. Awareness of the CRCC was associated with increased knowledge about sexual violence. Copyright © 2015 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.
Integrated Japanese Dependency Analysis Using a Dialog Context
NASA Astrophysics Data System (ADS)
Ikegaya, Yuki; Noguchi, Yasuhiro; Kogure, Satoru; Itoh, Toshihiko; Konishi, Tatsuhiro; Kondo, Makoto; Asoh, Hideki; Takagi, Akira; Itoh, Yukihiro
This paper describes how to perform syntactic parsing and semantic analysis in a dialog system. The paper especially deals with how to disambiguate potentially ambiguous sentences using the contextual information. Although syntactic parsing and semantic analysis are often studied independently of each other, correct parsing of a sentence often requires the semantic information on the input and/or the contextual information prior to the input. Accordingly, we merge syntactic parsing with semantic analysis, which enables syntactic parsing taking advantage of the semantic content of an input and its context. One of the biggest problems of semantic analysis is how to interpret dependency structures. We employ a framework for semantic representations that circumvents the problem. Within the framework, the meaning of any predicate is converted into a semantic representation which only permits a single type of predicate: an identifying predicate "aru". The semantic representations are expressed as sets of "attribute-value" pairs, and those semantic representations are stored in the context information. Our system disambiguates syntactic/semantic ambiguities of inputs referring to the attribute-value pairs in the context information. We have experimentally confirmed the effectiveness of our approach; specifically, the experiment confirmed high accuracy of parsing and correctness of generated semantic representations.
NASA Astrophysics Data System (ADS)
De Michelis, Paola; Tozzi, Roberta; Consolini, Giuseppe
2017-02-01
From the very first measurements made by the magnetometers onboard the Swarm satellites, launched by the European Space Agency (ESA) in late 2013, a discrepancy emerged between scalar and vector measurements. An accurate analysis of this phenomenon led to an empirical model of the disturbance, highly correlated with the Sun incidence angle, and to a corresponding correction of the vector data. The empirical model adopted by ESA significantly decreases the amplitude of the disturbance affecting VFM measurements, greatly improving the quality of the vector magnetic data. This study focuses on the characterization of the difference between the magnetic field intensity measured by the absolute scalar magnetometer (ASM) and that reconstructed from the vector field magnetometer (VFM) installed on the Swarm constellation. Applying the empirical mode decomposition method, we find the intrinsic mode functions (IMFs) associated with ASM-VFM total intensity differences obtained from data both uncorrected and corrected for the disturbance correlated with the Sun incidence angle. Surprisingly, no differences are found in the nature of the IMFs embedded in the analyzed signals: the IMFs are characterized by the same dominant periodicities before and after correction. The effect of the correction manifests as a decrease in the energy associated with some of the IMFs contributing to the corrected data. Some IMFs identified by analyzing the ASM-VFM intensity discrepancy are characterized by the same dominant periodicities as those obtained by analyzing the temperature fluctuations of the VFM electronic unit. Thus, the disturbance correlated with the Sun incidence angle may still be present in the corrected magnetic data. Furthermore, the ASM-VFM total intensity difference and the VFM electronic unit temperature display maximal shared information with a time delay that depends on local time.
Taken together, these findings may help relate the features of the observed VFM-ASM total intensity difference to the physical characteristics of the real disturbance, thus contributing to improving the empirical model proposed for correcting the data.
Kerfriden, P.; Schmidt, K.M.; Rabczuk, T.; Bordas, S.P.A.
2013-01-01
We propose to identify process zones in heterogeneous materials by tailored statistical tools. The process zone is redefined as the part of the structure where the random process cannot be correctly approximated in a low-dimensional deterministic space. Such a low-dimensional space is obtained by a spectral analysis performed on pre-computed solution samples. A greedy algorithm is proposed to identify both process zone and low-dimensional representative subspace for the solution in the complementary region. In addition to the novelty of the tools proposed in this paper for the analysis of localised phenomena, we show that the reduced space generated by the method is a valid basis for the construction of a reduced order model. PMID:27069423
Hosseini, Seyed Mojtaba; Etesaminia, Samira; Jafari, Mehrnoosh
2016-10-01
One of the important factors of correct management is to identify the reasons for patient tendency toward private hospitals. This study measures these factors based on service marketing mixes. A cross-sectional descriptive methodology was used, and the study was conducted over 6 months in 2015. The study population comprised patients of private hospitals in Tehran, selected by random sampling (n = 200). Data were collected with an author-made questionnaire on service marketing factors; the reliability and validity of the questionnaire were confirmed. Data analysis was done using a factor analysis test in SPSS 20. The results showed that the constant attendance of physicians and nurses had the highest effect (0.707) on patient tendency toward private hospitals.
Jackman, Patrick; Sun, Da-Wen; Allen, Paul; Valous, Nektarios A; Mendoza, Fernando; Ward, Paddy
2010-04-01
A method to discriminate between various grades of pork and turkey ham was developed using colour and wavelet texture features. Image analysis methods originally developed for predicting the palatability of beef were applied to rapidly identify the ham grade. With high quality digital images of 50-94 slices per ham it was possible to identify the greyscale that best expressed the differences between the various ham grades. The best 10 discriminating image features were then found with a genetic algorithm. Using the best 10 image features, simple linear discriminant analysis models produced 100% correct classifications for both pork and turkey on both calibration and validation sets. 2009 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Tursz, Anne; Crost, Monique; Gerbouin-Rerolle, Pascale; Cook, Jon M.
2010-01-01
Objectives: Test the hypothesis of an underestimation of infant homicides in mortality statistics in France; identify its causes; examine data from the judicial system and their contribution in correcting this underestimation. Methods: A retrospective, cross-sectional study was carried out in 26 courts in three regions of France of cases of infant…
Uncertainty Analysis Principles and Methods
2007-09-01
…error source. The Data Processor converts binary coded numbers to values, performs D/A curve fitting, and applies any correction factors that may be… describes the stages or modules involved in the measurement process. We now need to identify all relevant error sources and develop the mathematical…
Defense Mapping Agency (DMA) Raster-to-Vector Analysis
1984-11-30
model) to pinpoint critical deficiencies and understand trade-offs between alternative solutions. This may be exemplified by the allocation of human… process, prone to errors (i.e., human operator eye/motor control limitations), and its time-consuming nature (as a function of data density). It should… achieved through the facilities of computer interactive graphics. Each error or anomaly is individually identified by a human operator and corrected
McTaggart, Lisa R.; Lei, Eric; Richardson, Susan E.; Hoang, Linda; Fothergill, Annette; Zhang, Sean X.
2011-01-01
Compared to DNA sequence analysis, matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) correctly identified 100% of Cryptococcus species, distinguishing the notable pathogens Cryptococcus neoformans and C. gattii. Identification was greatly enhanced by supplementing a commercial spectral library with additional entries to account for subspecies variability. PMID:21653762
Shao, Yongni; Li, Yuan; Jiang, Linjun; Pan, Jian; He, Yong; Dou, Xiaoming
2016-11-01
The main goal of this research is to examine the feasibility of applying visible/near-infrared hyperspectral imaging (Vis/NIR-HSI) and Raman microspectroscopy for non-destructive identification of pesticide varieties (glyphosate and butachlor). Both technologies were explored to investigate how internal elements or characteristics of Chlorella pyrenoidosa change when pesticides are applied and, at the same time, to identify the pesticide varieties during this procedure. The successive projections algorithm (SPA) was used to identify the seven most effective wavelengths. With the wavelengths suggested by SPA, a linear discriminant analysis (LDA) model was established to classify the pesticide varieties, and the correct classification rate of the SPA-LDA model reached 100%. For the Raman technique, several partial least squares discriminant analysis models were established with different preprocessing methods, from which the preprocessing approach achieving the optimal result was identified. The sensitive wavelengths (SWs) related to the algae's pigments were chosen, and an LDA model was established, with correct identification reaching 90.0%. The results showed that both Vis/NIR-HSI and Raman microspectroscopy are capable of identifying pesticide varieties in an indirect but effective way, and that SPA is an effective wavelength-extraction method. The SWs corresponding to microalgae pigments, which were influenced by the pesticides, could also help characterize different pesticide varieties and benefit variety identification. Copyright © 2016 Elsevier Ltd. All rights reserved.
Evaluation of the new Vitek 2 ANC card for identification of medically relevant anaerobic bacteria.
Mory, Francine; Alauzet, Corentine; Matuszeswski, Céline; Riegel, Philippe; Lozniewski, Alain
2009-06-01
Of 261 anaerobic clinical isolates tested with the new Vitek 2 ANC card, 257 (98.5%) were correctly identified at the genus level. Among the 251 strains for which identification at the species level is possible with regard to the ANC database, 217 (86.5%) were correctly identified at the species level. Two strains (0.8%) were not identified, and eight were misidentified (3.1%). Of the 21 strains (8.1%) with low-level discrimination results, 14 were correctly identified at the species level by using the recommended additional tests. This system is a satisfactory new automated tool for the rapid identification of most anaerobic bacteria isolated in clinical laboratories.
Bennett, Derrick A; Landry, Denise; Little, Julian; Minelli, Cosetta
2017-09-19
Several statistical approaches have been proposed to assess and correct for exposure measurement error. We aimed to provide a critical overview of the most common approaches used in nutritional epidemiology. MEDLINE, EMBASE, BIOSIS and CINAHL were searched for reports published in English up to May 2016 in order to ascertain studies that described methods aimed to quantify and/or correct for measurement error for a continuous exposure in nutritional epidemiology using a calibration study. We identified 126 studies, 43 of which described statistical methods and 83 that applied any of these methods to a real dataset. The statistical approaches in the eligible studies were grouped into: a) approaches to quantify the relationship between different dietary assessment instruments and "true intake", which were mostly based on correlation analysis and the method of triads; b) approaches to adjust point and interval estimates of diet-disease associations for measurement error, mostly based on regression calibration analysis and its extensions. Two approaches (multiple imputation and moment reconstruction) were identified that can deal with differential measurement error. For regression calibration, the most common approach to correct for measurement error used in nutritional epidemiology, it is crucial to ensure that its assumptions and requirements are fully met. Analyses that investigate the impact of departures from the classical measurement error model on regression calibration estimates can be helpful to researchers in interpreting their findings. With regard to the possible use of alternative methods when regression calibration is not appropriate, the choice of method should depend on the measurement error model assumed, the availability of suitable calibration study data and the potential for bias due to violation of the classical measurement error model assumptions. 
On the basis of this review, we provide some practical advice for the use of methods to assess and adjust for measurement error in nutritional epidemiology.
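Regression calibration, the most common approach discussed in the review above, can be sketched in its simplest classical-error form: regress the outcome on the error-prone exposure, estimate the calibration slope lambda from a subset with a reference measurement, and divide the naive slope by lambda. The simulated intakes, sample sizes, and error variances below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated nutritional-epidemiology example: true long-term intake T, an
# error-prone food-frequency measure Q = T + noise, and an outcome Y that
# depends on T with true slope 0.5 (all values invented for the sketch).
n = 5000
T = rng.normal(50, 10, n)
Q = T + rng.normal(0, 10, n)          # classical measurement error
Y = 0.5 * T + rng.normal(0, 5, n)

# Naive analysis: regressing Y on the mismeasured exposure Q attenuates
# the slope toward zero (here by a factor of about 0.5).
beta_naive = np.polyfit(Q, Y, 1)[0]

# Calibration study: a subset with a reference measurement of true intake
# gives lambda, the slope of E[T | Q].
cal = slice(0, 500)
lam = np.polyfit(Q[cal], T[cal], 1)[0]

# Regression calibration: de-attenuate the naive slope.
beta_corrected = beta_naive / lam
```

The division by lambda is exactly the de-attenuation step whose validity depends on the classical measurement error assumptions the review emphasizes; under differential error this simple form breaks down.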
Assessment of statistical methods used in library-based approaches to microbial source tracking.
Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D
2003-12-01
Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
30 CFR 250.1452 - What if I correct the violation?
Code of Federal Regulations, 2011 CFR
2011-07-01
…Continental Shelf Civil Penalties, Penalties After a Period to Correct, § 250.1452 What if I correct the violation? The matter will be closed if you correct all of the violations identified in the Notice of…
NASA Technical Reports Server (NTRS)
Au, C. K.
1989-01-01
The Breit correction only accounts for part of the transverse photon exchange correction in the calculation of the energy levels in helium Rydberg states. The remaining leading corrections are identified and each is expressed in an effective potential form. The relevance to the Casimir correction potential in various limits is also discussed.
Training Correctional Educators: A Needs Assessment Study.
ERIC Educational Resources Information Center
Jurich, Sonia; Casper, Marta; Hull, Kim A.
2001-01-01
Focus groups and a training needs survey of Virginia correctional educators identified educational philosophy, communication skills, human behavior, and teaching techniques as topics of interest. Classroom observations identified additional areas: teacher isolation, multiple challenges, absence of grade structure, and safety constraints. (Contains…
Phonons in two-dimensional soft colloidal crystals.
Chen, Ke; Still, Tim; Schoenholz, Samuel; Aptowicz, Kevin B; Schindler, Michael; Maggs, A C; Liu, Andrea J; Yodh, A G
2013-08-01
The vibrational modes of pristine and polycrystalline monolayer colloidal crystals composed of thermosensitive microgel particles are measured using video microscopy and covariance matrix analysis. At low frequencies, the Debye relation for two-dimensional harmonic crystals is observed in both crystal types; at higher frequencies, evidence for van Hove singularities in the phonon density of states is significantly smeared out by experimental noise and measurement statistics. The effects of these errors are analyzed using numerical simulations. We introduce methods to correct for these limitations, which can be applied to disordered systems as well as crystalline ones, and we show that application of the error correction procedure to the experimental data leads to more pronounced van Hove singularities in the pristine crystal. Finally, quasilocalized low-frequency modes in polycrystalline two-dimensional colloidal crystals are identified and demonstrated to correlate with structural defects such as dislocations, suggesting that quasilocalized low-frequency phonon modes may be used to identify local regions vulnerable to rearrangements in crystalline as well as amorphous solids.
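The covariance matrix analysis named in this abstract can be sketched for a toy harmonic system: in thermal equilibrium the displacement covariance is C = kT·K⁻¹ for stiffness matrix K, so eigenvalues of the measured covariance give mode frequencies ω = 1/√λ (in units where kT = m = 1). The stiffness matrix, sample counts, and units below are invented for the illustration; real data would come from tracked particle positions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy harmonic "crystal": N coordinates with a known positive-definite
# stiffness (dynamical) matrix K, so the exact displacement covariance
# at kT = 1 is C_true = K^{-1}.
N = 6
A = rng.standard_normal((N, N))
K = A @ A.T + N * np.eye(N)
C_true = np.linalg.inv(K)

# Draw equilibrium displacement snapshots (as one would extract from
# video-microscopy particle tracks) and form the sample covariance.
L = np.linalg.cholesky(C_true)
snapshots = (L @ rng.standard_normal((N, 20000))).T
C = np.cov(snapshots, rowvar=False)

# Covariance-matrix analysis: eigenvalues lambda of C give mode
# frequencies omega = 1/sqrt(lambda); eigenvectors are the mode shapes.
evals, modes = np.linalg.eigh(C)
omega = 1.0 / np.sqrt(evals)

# Compare with the exact spectrum of K.
omega_exact = np.sqrt(np.linalg.eigvalsh(K))
```

The finite number of snapshots is exactly the "measurement statistics" error source the abstract discusses: with too few frames the sample covariance eigenvalues, and hence the inferred density of states, are systematically distorted.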
Knowledge of Abortion Laws and Services Among Low-Income Women in Three United States Cities.
Lara, Diana; Holt, Kelsey; Peña, Melanie; Grossman, Daniel
2015-12-01
Low-income women and women of color are disproportionately affected by unintended pregnancy. Lack of knowledge of abortion laws and services is one of several factors likely to hinder access to services, though little research has documented knowledge in this population. We surveyed a convenience sample of 1,262 women attending primary care or full-scope Ob/Gyn clinics serving low-income populations in three large cities and conducted multivariable analyses with four knowledge outcomes. Among all participants, 53% were first-generation immigrants, 25% identified the correct gestational age limit, 41% identified state parental consent laws, 67% knew partner consent is not required, and 55% knew where to obtain abortion services. In multivariable analysis, first-generation immigrants and primarily Spanish speakers were significantly less likely than higher-generation or primarily English speakers to display correct knowledge. Design and evaluation of strategies to improve knowledge about abortion, particularly among migrant women and non-primary English speakers, are needed.
Jahani, Sahar; Setarehdan, Seyed K; Boas, David A; Yücel, Meryem A
2018-01-01
Motion artifact contamination in near-infrared spectroscopy (NIRS) data has become an important challenge in realizing the full potential of NIRS for real-life applications. Various motion correction algorithms have been used to alleviate the effect of motion artifacts on the estimation of the hemodynamic response function. While smoothing methods, such as wavelet filtering, are excellent in removing motion-induced sharp spikes, the baseline shifts in the signal remain after this type of filtering. Methods, such as spline interpolation, on the other hand, can properly correct baseline shifts; however, they leave residual high-frequency spikes. We propose a hybrid method that takes advantage of different correction algorithms. This method first identifies the baseline shifts and corrects them using a spline interpolation method or targeted principal component analysis. The remaining spikes, on the other hand, are corrected by smoothing methods: Savitzky-Golay (SG) filtering or robust locally weighted regression and smoothing. We have compared our new approach with the existing correction algorithms in terms of hemodynamic response function estimation using the following metrics: mean-squared error, peak-to-peak error ([Formula: see text]), Pearson's correlation ([Formula: see text]), and the area under the receiver operator characteristic curve. We found that spline-SG hybrid method provides reasonable improvements in all these metrics with a relatively short computational time. The dataset and the code used in this study are made available online for the use of all interested researchers.
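The two-stage hybrid idea described above (correct baseline shifts first, then smooth residual spikes) can be sketched as follows. This is an illustrative toy, not the authors' algorithm: shifts are detected with a simple first-difference threshold, and a plain moving average stands in for Savitzky-Golay filtering; the signal and thresholds are synthetic.

```python
import numpy as np

def hybrid_correct(signal, shift_thresh=0.5, win=5):
    """Toy hybrid motion correction: remove abrupt baseline shifts
    (step changes), then smooth residual spikes.

    Sketch only -- shifts are found with a first-difference threshold,
    and a moving average stands in for Savitzky-Golay smoothing."""
    x = signal.astype(float).copy()
    # 1) Baseline-shift correction: subtract the cumulative step offset
    #    wherever the point-to-point jump exceeds the threshold.
    d = np.diff(x)
    steps = np.where(np.abs(d) > shift_thresh, d, 0.0)
    offset = np.concatenate([[0.0], np.cumsum(steps)])
    x = x - offset
    # 2) Spike smoothing: centered moving average with edge padding.
    pad = win // 2
    xp = np.pad(x, pad, mode="edge")
    kernel = np.ones(win) / win
    return np.convolve(xp, kernel, mode="valid")

# Synthetic example: flat signal with one baseline shift and one spike.
sig = np.zeros(100)
sig[50:] += 2.0   # baseline shift at sample 50
sig[20] += 3.0    # sharp motion spike at sample 20
corrected = hybrid_correct(sig)
```

On this synthetic trace both artifact types are removed; on real NIRS data the two stages would use the paper's spline/tPCA and SG/LOWESS components instead.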
Lubow, Bruce C; Ransom, Jason I
2016-01-01
Reliably estimating wildlife abundance is fundamental to effective management. Aerial surveys are one of the only spatially robust tools for estimating large mammal populations, but statistical sampling methods are required to address detection biases that affect accuracy and precision of the estimates. Although various methods for correcting aerial survey bias are employed on large mammal species around the world, these have rarely been rigorously validated. Several populations of feral horses (Equus caballus) in the western United States have been intensively studied, resulting in identification of all unique individuals. This provided a rare opportunity to test aerial survey bias correction on populations of known abundance. We hypothesized that a hybrid method combining simultaneous double-observer and sightability bias correction techniques would accurately estimate abundance. We validated this integrated technique on populations of known size and also on a pair of surveys before and after a known number was removed. Our analysis identified several covariates across the surveys that explained and corrected biases in the estimates. All six tests on known populations produced estimates with deviations from the known value ranging from -8.5% to +13.7% and <0.7 standard errors. Precision varied widely, from 6.1% CV to 25.0% CV. In contrast, the pair of surveys conducted around a known management removal produced an estimated change in population between the surveys that was significantly larger than the known reduction. Although the deviation was only 9.1%, the precision estimate (CV = 1.6%) may have been artificially low. It was apparent that use of a helicopter in those surveys perturbed the horses, introducing detection error and heterogeneity in a manner that could not be corrected by our statistical models.
Our results validate the hybrid method, highlight its potentially broad applicability, identify some limitations, and provide insight and guidance for improving survey designs.
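The double-observer component of the hybrid method above can be illustrated with a Lincoln-Petersen-style estimator: each observer's detection probability is estimated from the groups the other observer found, and the total count is inflated accordingly. The numbers are hypothetical, and the study's actual model additionally fitted sightability covariates by regression.

```python
def double_observer_estimate(both, only1, only2):
    """Lincoln-Petersen-style double-observer abundance estimate.

    both  -- groups detected by both observers
    only1 -- groups detected by observer 1 only
    only2 -- groups detected by observer 2 only
    Illustrative sketch with hypothetical counts."""
    p1 = both / (both + only2)        # observer 1 detection probability
    p2 = both / (both + only1)        # observer 2 detection probability
    p_any = 1 - (1 - p1) * (1 - p2)   # prob. at least one observer detects a group
    seen = both + only1 + only2
    return seen / p_any               # inflate the raw count by detection prob.

# Example: 60 groups seen by both, 15 by observer 1 only, 10 by observer 2 only.
n_hat = double_observer_estimate(60, 15, 10)
```

Here 85 groups were seen in total, but the estimated overall detection probability is below 1, so the corrected estimate exceeds the raw count.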
Accuracy of electrocardiogram reading by family practice residents.
Sur, D K; Kaye, L; Mikus, M; Goad, J; Morena, A
2000-05-01
This study evaluated the electrocardiogram (EKG) reading skills of family practice residents. A multicenter study was carried out to evaluate the accuracy of EKG reading in the family practice setting. Based on the frequency and potential for clinical significance, we chose 18 common findings on 10 EKGs for evaluation. The EKGs were then distributed to residents at six family practice residencies. Residents were given one point for the identification of each correct EKG finding and scored based on the number correct over a total of 18. Sixty-one residents (20 first year, 23 second year, and 18 third year) completed readings for 10 EKGs and were evaluated for their ability to identify 18 EKG findings. The median score out of 18 possible points for all first-, second-, and third-year residents was 12, 12, and 11.5, respectively. Twenty-one percent of residents did not correctly identify a tracing of an acute myocardial infarction. Data analysis showed no statistically significant difference among the three groups of residents. We evaluated the accuracy of EKG reading skills of family practice residents at each year of training. This study suggests that EKG reading skills do not improve during residency, and further study of curricular change to improve these skills should be considered.
7 CFR 91.28 - Issuance of corrected certificates or amendments for analysis reports.
Code of Federal Regulations, 2010 CFR
2010-01-01
... analysis reports. 91.28 Section 91.28 Agriculture Regulations of the Department of Agriculture (Continued... (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Reporting § 91.28 Issuance of corrected certificates or amendments for analysis reports. (a) A corrected certificate of analysis...
7 CFR 91.28 - Issuance of corrected certificates or amendments for analysis reports.
Code of Federal Regulations, 2013 CFR
2013-01-01
... analysis reports. 91.28 Section 91.28 Agriculture Regulations of the Department of Agriculture (Continued... (CONTINUED) COMMODITY LABORATORY TESTING PROGRAMS SERVICES AND GENERAL INFORMATION Reporting § 91.28 Issuance of corrected certificates or amendments for analysis reports. (a) A corrected certificate of analysis...
The femoral neck-shaft angle on plain radiographs: a systematic review.
Boese, Christoph Kolja; Dargel, Jens; Oppermann, Johannes; Eysel, Peer; Scheyerer, Max Joseph; Bredow, Jan; Lechler, Philipp
2016-01-01
The femoral neck-shaft angle (NSA) is an important measure for the assessment of the anatomy of the hip and planning of operations. Despite its common use, there remains disagreement concerning the method of measurement and the correction of hip rotation and femoral version of the projected NSA on conventional radiographs. We addressed the following questions: (1) What are the reported values for NSA in normal adult subjects and in osteoarthritis? (2) Is there a difference between non-corrected and rotation-corrected measurements? (3) Which methods are used for measuring the NSA on plain radiographs? (4) What could be learned from an analysis of the intra- and interobserver reliability? A systematic literature search was performed including 26 publications reporting the measurement of the NSA on conventional radiographs. The mean NSA of healthy adults (5,089 hips) was 128.8° (98-180°) and 131.5° (115-155°) in patients with osteoarthritis (1230 hips). The mean NSA was 128.5° (127-130.5°) for the rotation-corrected and 129.5° (119.6-151°) for the non-corrected measurements. Our data showed a high variance of the reported neck-shaft angles. Notably, we identified the inconsistency of the published methods of measurement as a central issue. The reported effect of rotation-correction cannot be reliably verified.
Yothers, Mitchell P; Browder, Aaron E; Bumm, Lloyd A
2017-01-01
We have developed a real-space method to correct distortion due to thermal drift and piezoelectric actuator nonlinearities on scanning tunneling microscope images using Matlab. The method uses the known structures typically present in high-resolution atomic and molecularly resolved images as an internal standard. Each image feature (atom or molecule) is first identified in the image. The locations of each feature's nearest neighbors are used to measure the local distortion at that location. The local distortion map across the image is simultaneously fit to our distortion model, which includes thermal drift in addition to piezoelectric actuator hysteresis and creep. The image coordinates of the features and image pixels are corrected using an inverse transform from the distortion model. We call this technique the thermal-drift, hysteresis, and creep transform. Performing the correction in real space allows defects, domain boundaries, and step edges to be excluded with a spatial mask. Additional real-space image analyses are now possible with these corrected images. Using graphite(0001) as a model system, we show lattice fitting to the corrected image, averaged unit cell images, and symmetry-averaged unit cell images. Statistical analysis of the distribution of the image features around their best-fit lattice sites measures the aggregate noise in the image, which can be expressed as feature confidence ellipsoids.
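The core move in the distortion correction above — fit a transform from measured feature positions to their ideal lattice sites, then apply it to the image coordinates — can be sketched with a single global affine fit. This is a much-simplified stand-in for the paper's thermal-drift/hysteresis/creep model, which is nonlinear and locally fitted; the lattice and distortion below are synthetic.

```python
import numpy as np

def affine_correction(measured, ideal):
    """Fit a global affine transform mapping measured feature positions
    to their ideal lattice sites and return corrected coordinates.
    Simplified illustration of the real-space correction idea."""
    ones = np.ones((len(measured), 1))
    A = np.hstack([measured, ones])          # (N, 3) design matrix
    # Solve A @ M ~= ideal for the 3x2 affine parameters M (least squares).
    M, *_ = np.linalg.lstsq(A, ideal, rcond=None)
    return A @ M

# Ideal square lattice, then a known shear + drift offset as "distortion".
ii, jj = np.meshgrid(np.arange(5), np.arange(5))
ideal = np.column_stack([ii.ravel(), jj.ravel()]).astype(float)
shear = np.array([[1.0, 0.05], [0.02, 1.0]])
measured = ideal @ shear + np.array([0.3, -0.1])   # distorted coordinates
corrected = affine_correction(measured, ideal)
```

Because the injected distortion is itself affine, the fit recovers the ideal positions exactly; real drift and creep vary across the image, which is why the paper fits a local distortion map instead.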
Taxman, Faye S; Kitsantas, Panagiota
2009-08-01
The purpose of this study was to investigate the structural and organizational factors that contribute to the availability and increased capacity of substance abuse treatment programs in correctional settings. We used classification and regression tree statistical procedures to identify how multi-level data can explain the variability in availability and capacity of substance abuse treatment programs in jails and probation/parole offices. The data for this study combined the National Criminal Justice Treatment Practices (NCJTP) Survey and the 2000 Census. The NCJTP survey was a nationally representative sample of correctional administrators for jails and probation/parole agencies. The sample included 295 substance abuse treatment programs that were classified according to the intensity of their services: high, medium, and low. The independent variables included jurisdictional-level structural variables, attributes of the correctional administrators, and program and service delivery characteristics of the correctional agency. The two most important variables in predicting the availability of all three types of services were stronger working relationships with other organizations and the adoption of a standardized substance abuse screening tool by correctional agencies. For high- and medium-intensity programs, capacity increased when an organizational learning strategy was used by administrators and the organization used a substance abuse screening tool. Implications for advancing treatment practices in correctional settings are discussed, including further work to test theories on how to better understand access to intensive treatment services. This study presents the first phase of understanding capacity-related issues regarding treatment programs offered in correctional settings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, B.; Deru M.; Torcellini, P.
2005-04-01
The Chesapeake Bay Foundation designed their new headquarters building to minimize its environmental impact on the already highly polluted Chesapeake Bay by incorporating numerous high-performance energy saving features into the building design. CBF then contacted NREL to perform a nonbiased energy evaluation of the building. Because their building attracted much attention in the sustainable design community, an unbiased evaluation was necessary to help designers replicate successes and identify and correct problem areas. This report focuses on NREL's monitoring and analysis of the overall energy performance of the building.
Eyewitness identification across the life span: A meta-analysis of age differences.
Fitzgerald, Ryan J; Price, Heather L
2015-11-01
Lineup identifications are often a critical component of criminal investigations. Over the past 35 years, researchers have been conducting empirical studies to assess the impact of witness age on identification accuracy. A previous meta-analysis indicated that children are less likely than adults to correctly reject a lineup that does not contain the culprit, but children 5 years and older are as likely as adults to make a correct identification if the culprit is in the lineup (Pozzulo & Lindsay, 1998). We report an updated meta-analysis of age differences in eyewitness identification, summarizing data from 20,244 participants across 91 studies. Contrary to extant reviews, we adopt a life span approach and examine witnesses from early childhood to late adulthood. Children's increased tendency to erroneously select a culprit-absent lineup member was replicated. Children were also less likely than young adults to correctly identify the culprit. Group data from culprit-absent and culprit-present lineups were used to produce signal detection measures, which indicated young adults were better able than children to discriminate between guilty and innocent suspects. A strikingly similar pattern emerged for older adults, who had even stronger deficits in discriminability than children, relative to adults. Although identifications by young adults were the most reliable, identifications by all witnesses had probative value.
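The signal-detection measure of discriminability mentioned above is conventionally computed as d' = z(H) - z(F), the difference of the inverse-normal-transformed hit and false-alarm rates. A minimal sketch, using made-up rates (the meta-analysis derives its rates from pooled group data):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Signal-detection discriminability: d' = z(H) - z(F).
    H = correct-identification rate on culprit-present lineups,
    F = rate of picking the innocent suspect on culprit-absent lineups.
    The rates below are hypothetical, for illustration only."""
    z = NormalDist().inv_cdf   # inverse standard-normal CDF
    return z(hit_rate) - z(false_alarm_rate)

adults   = d_prime(0.70, 0.15)   # hypothetical young-adult rates
children = d_prime(0.55, 0.30)   # hypothetical child rates
```

With these illustrative rates the adult d' exceeds the child d', mirroring the direction of the reported age effect.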
Tipping point analysis of ocean acoustic noise
NASA Astrophysics Data System (ADS)
Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen
2018-02-01
We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
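The potential analysis step mentioned above can be illustrated in its simplest form: estimate an empirical potential U(x) ~ -log p(x) from a histogram of the fluctuations, whose local minima correspond to system states. This is a sketch only; the paper fits a full one-dimensional stochastic model, not just a histogram, and the data here are synthetic.

```python
import numpy as np

def empirical_potential(x, bins=40):
    """Potential-analysis sketch: U(x) ~ -log p(x) from a histogram.
    Local minima of U indicate system states; illustrative only."""
    p, edges = np.histogram(x, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = p > 0                      # avoid log(0) in empty bins
    return centers[mask], -np.log(p[mask])

# Bimodal synthetic "fluctuations": two states near -1 and +1.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-1, 0.2, 5000), rng.normal(1, 0.2, 5000)])
centers, U = empirical_potential(x)
```

For this two-state signal the potential is low near the modes and high between them, i.e. a double-well shape.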
NASA Astrophysics Data System (ADS)
Asadi Haroni, Hooshang; Hassan Tabatabaei, Seyed
2016-04-01
The Muteh gold mining area is located 160 km northwest of the city of Isfahan. Gold mineralization is of mesothermal type and is associated with silicic, sericitic, and carbonate alterations, as well as with hematite and goethite. Image processing and interpretation were applied to ASTER satellite imagery covering about 400 km2 of the Muteh gold mining area to identify hydrothermal alterations and iron oxides associated with gold mineralization. After preprocessing steps such as radiometric and geometric corrections, the image processing methods of Principal Components Analysis (PCA), Least Square Fit (Ls-Fit), and Spectral Angle Mapper (SAM) were applied to the ASTER data to identify hydrothermal alterations and iron oxides. In this research, reference spectra of minerals such as chlorite, hematite, clay minerals, and phengite, identified from laboratory spectral analysis of collected samples, were used to map the hydrothermal alterations. Finally, the identified hydrothermal alterations and iron oxides were validated by visiting and sampling some of the mapped hydrothermal alterations.
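Of the methods named above, the Spectral Angle Mapper is the most compact to sketch: it scores each pixel by the angle between its spectrum and a reference mineral spectrum, so the score is insensitive to overall illumination scaling. The band values below are invented for illustration, not real chlorite reflectances.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper (SAM): angle (radians) between a pixel
    spectrum and a reference spectrum; small angle = close match."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards rounding error

ref = np.array([0.12, 0.18, 0.25, 0.22, 0.30])          # hypothetical 5-band spectrum
close = spectral_angle(ref * 1.7, ref)                   # brighter copy of ref
far = spectral_angle(np.array([0.30, 0.22, 0.10, 0.08, 0.05]), ref)
```

The scaled copy yields an angle near zero despite being 70% brighter, which is why SAM is popular for alteration mapping under variable illumination.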
Bias correction of satellite-based rainfall data
NASA Astrophysics Data System (ADS)
Bhattacharya, Biswa; Solomatine, Dimitri
2015-04-01
Limited hydro-meteorological data availability in many catchments restricts the possibility of reliable hydrological analyses, especially for near-real-time predictions. However, the variety of satellite-based and meteorological model products for rainfall provides new opportunities. Often, the accuracy of these rainfall products is unimpressive when compared to rain gauge measurements. The systematic differences of these rainfall products from gauge observations can be partially compensated by adopting a bias (error) correction. Many such methods correct the satellite-based rainfall data by comparing their mean value to the mean value of rain gauge data. Refined approaches may first find a suitable time scale at which different data products are better comparable and then employ a bias correction at that time scale. More elegant methods use quantile-to-quantile bias correction, which, however, assumes that the available (often limited) sample size is sufficient for comparing probabilities of different rainfall products. Analysis of rainfall data and understanding of the process of its generation reveal that the bias in different rainfall data varies in space and time. The time aspect is sometimes taken into account by considering seasonality. In this research we have adopted a bias correction approach that takes into account the variation of rainfall in space and time. A clustering-based approach is employed in which every new data point (e.g. of the Tropical Rainfall Measuring Mission (TRMM)) is first assigned to a specific cluster of that data product; then, by identifying the corresponding cluster of gauge data, the bias correction specific to that cluster is adopted. The presented approach considers the space-time variation of rainfall, and as a result the corrected data is more realistic. Keywords: bias correction, rainfall, TRMM, satellite rainfall
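The cluster-specific correction described above can be sketched with per-cluster multiplicative bias factors: for each cluster of rainfall conditions, the ratio of mean gauge rainfall to mean satellite rainfall. The clustering itself (e.g. k-means on rainfall features) is assumed to have already produced the labels, and the numbers are toy values.

```python
import numpy as np

def cluster_bias_factors(sat, gauge, labels):
    """Per-cluster multiplicative bias factors: mean gauge rainfall
    over mean satellite rainfall within each cluster. Minimal sketch
    of the clustering-based correction idea; `labels` is assumed to
    come from a prior clustering step."""
    factors = {}
    for k in np.unique(labels):
        m = labels == k
        factors[k] = gauge[m].mean() / sat[m].mean()
    return factors

def correct(sat_value, label, factors):
    """Apply the bias factor of the cluster the new point belongs to."""
    return sat_value * factors[label]

# Toy data: cluster 0 (light rain) biased high, cluster 1 (heavy) biased low.
sat = np.array([2.0, 4.0, 20.0, 40.0])
gauge = np.array([1.0, 2.0, 30.0, 60.0])
labels = np.array([0, 0, 1, 1])
f = cluster_bias_factors(sat, gauge, labels)
```

A single global factor would split the difference between the two regimes; the per-cluster factors correct each regime separately, which is the point of the approach.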
Gap analysis: synergies and opportunities for effective nursing leadership.
Davis-Ajami, Mary Lynn; Costa, Linda; Kulik, Susan
2014-01-01
Gap analysis encompasses a comprehensive process to identify, understand, address, and bridge gaps in service delivery and nursing practice. Conducting a gap analysis lends structure to information gathering and to the process of finding sustainable solutions to important deficiencies. Nursing leaders need to recognize, measure, monitor, and execute feasible, actionable solutions to help organizations address gaps between what is desired and actual real-world conditions, which contribute to the quality chasm in health care. Gap analysis represents a functional and comprehensive tool to address organizational deficiencies. Using gap analysis proactively helps organizations map out and sustain corrective efforts to close the quality chasm. Gaining facility with gap analysis should strengthen the nursing profession's contribution to narrowing the quality chasm.
NASA Technical Reports Server (NTRS)
1981-01-01
An approach to remote sensing that meets future mission requirements was investigated. The deterministic acquisition of data and the rapid correction of data for radiometric effects and image distortions are the most critical limitations of remote sensing. The following topics are discussed: onboard image correction systems, GCP navigation system simulation, GCP analysis, and image correction analysis measurement.
Ham, D Cal; Lin, Carol; Newman, Lori; Wijesooriya, N Saman; Kamb, Mary
2015-06-01
"Probable active syphilis," is defined as seroreactivity in both non-treponemal and treponemal tests. A correction factor of 65%, namely the proportion of pregnant women reactive in one syphilis test type that were likely reactive in the second, was applied to reported syphilis seropositivity data reported to WHO for global estimates of syphilis during pregnancy. To identify more accurate correction factors based on test type reported. Medline search using: "Syphilis [Mesh] and Pregnancy [Mesh]," "Syphilis [Mesh] and Prenatal Diagnosis [Mesh]," and "Syphilis [Mesh] and Antenatal [Keyword]. Eligible studies must have reported results for pregnant or puerperal women for both non-treponemal and treponemal serology. We manually calculated the crude percent estimates of subjects with both reactive treponemal and reactive non-treponemal tests among subjects with reactive treponemal and among subjects with reactive non-treponemal tests. We summarized the percent estimates using random effects models. Countries reporting both reactive non-treponemal and reactive treponemal testing required no correction factor. Countries reporting non-treponemal testing or treponemal testing alone required a correction factor of 52.2% and 53.6%, respectively. Countries not reporting test type required a correction factor of 68.6%. Future estimates should adjust reported maternal syphilis seropositivity by test type to ensure accuracy. Published by Elsevier Ireland Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
John McCord
2006-06-01
The U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO) initiated the Underground Test Area (UGTA) Project to assess and evaluate the effects of the underground nuclear weapons tests on groundwater beneath the Nevada Test Site (NTS) and vicinity. The framework for this evaluation is provided in Appendix VI, Revision No. 1 (December 7, 2000) of the Federal Facility Agreement and Consent Order (FFACO, 1996). Section 3.0 of Appendix VI, "Corrective Action Strategy," of the FFACO describes the process that will be used to complete corrective actions specifically for the UGTA Project. The objective of the UGTA corrective action strategy is to define contaminant boundaries for each UGTA corrective action unit (CAU) where groundwater may have become contaminated from the underground nuclear weapons tests. The contaminant boundaries are determined based on modeling of groundwater flow and contaminant transport. A summary of the FFACO corrective action process and the UGTA corrective action strategy is provided in Section 1.5. The FFACO (1996) corrective action process for the Yucca Flat/Climax Mine CAU 97 was initiated with the Corrective Action Investigation Plan (CAIP) (DOE/NV, 2000a). The CAIP included a review of existing data on the CAU and proposed a set of data collection activities to collect additional characterization data. These recommendations were based on a value of information analysis (VOIA) (IT, 1999), which evaluated the value of different possible data collection activities, with respect to reduction in uncertainty of the contaminant boundary, through simplified transport modeling. The Yucca Flat/Climax Mine CAIP identifies a three-step model development process to evaluate the impact of underground nuclear testing on groundwater to determine a contaminant boundary (DOE/NV, 2000a).
The three steps are as follows: (1) data compilation and analysis that provide the necessary modeling data, completed in two parts, the first addressing the groundwater flow model and the second the transport model; (2) development of a groundwater flow model; and (3) development of a groundwater transport model. This report presents the results of the first part of the first step, documenting the data compilation, evaluation, and analysis for the groundwater flow model. The second part, documentation of transport model data, will be the subject of a separate report. The purpose of this document is to present the compilation and evaluation of the available hydrologic data and information relevant to the development of the Yucca Flat/Climax Mine CAU groundwater flow model, which is a fundamental tool in the prediction of the extent of contaminant migration. Where appropriate, data and information documented elsewhere are summarized with reference to the complete documentation. The specific task objectives for hydrologic data documentation are as follows: (1) identify and compile available hydrologic data and supporting information required to develop and validate the groundwater flow model for the Yucca Flat/Climax Mine CAU; (2) assess the quality of the data and associated documentation, and assign qualifiers to denote levels of quality; and (3) analyze the data to derive expected values or spatial distributions and estimates of the associated uncertainty and variability.
Xiao, Dongping; Liu, Huaitong; Zhou, Qiang; Xie, Yutong; Ma, Qichao
2016-01-01
According to the operating specifications of existing electric field measuring instruments, measuring technicians must be located far from the instruments to eliminate the influence of the human body occupancy on a spatial electric field. Nevertheless, in order to develop a portable safety protection instrument with an effective electric field warning function for working staff in a high-voltage environment, it is necessary to study the influence of an approaching human body on the measurement of an electric field and to correct the measurement results. A single-shaft electric field measuring instrument called the Type LP-2000, which was developed by our research team, is used as the research object in this study. First, we explain the principle of electric field measurement and describe the capacitance effect produced by the human body. Through a theoretical analysis, we show that the measured electric field value decreases as a human body approaches. Their relationship is linearly proportional. Then, the ratio is identified as a correction coefficient to correct for the influence of human body proximity. The conclusion drawn from the theoretical analysis is proved via simulation. The correction coefficient kb = 1.8010 is obtained on the basis of the linear fitting of simulated data. Finally, a physical experiment is performed. When no human is present, we compare the results from the Type LP-2000 measured with Narda EFA-300 and the simulated value to verify the accuracy of the Type LP-2000. For the case of an approaching human body, the correction coefficient kb* = 1.9094 is obtained by comparing the data measured with the Type LP-2000 to the simulated value. The correction coefficient obtained from the experiment (i.e., kb*) is highly consistent with that obtained from the simulation (i.e., kb). 
Two experimental programs were set up; under these programs, the excitation voltages and the distances of the measuring points are regulated to produce different electric field intensities. Using kb* = 1.9094, the corrected measurement of electric field intensity can accurately reflect the original environmental electric field intensity, and the maximal error is less than 6% in all the data comparisons. These results verify the effectiveness of our proposed method. PMID:27294936
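Since the abstract reports that the measured value is proportionally reduced by human-body proximity, the correction is a single multiplication by the fitted coefficient. A minimal sketch using the experimentally fitted value (the function name and the example reading are illustrative):

```python
def correct_field(measured_v_per_m, kb=1.9094):
    """Correct an electric-field reading for human-body proximity.

    The abstract reports a linear proportionality between the measured
    and undisturbed fields, with experimentally fitted coefficient
    kb* = 1.9094; the corrected field is the measured value times kb*."""
    return measured_v_per_m * kb

# Example: a hypothetical reading of 1000 V/m taken with a body nearby.
corrected = correct_field(1000.0)
```

The corrected value is nearly double the reading, consistent with the body's capacitance effect suppressing the local field at the sensor.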
Xiao, Dongping; Liu, Huaitong; Zhou, Qiang; Xie, Yutong; Ma, Qichao
2016-06-10
According to the operating specifications of existing electric field measuring instruments, measuring technicians must be located far from the instruments to eliminate the influence of the human body occupancy on a spatial electric field. Nevertheless, in order to develop a portable safety protection instrument with an effective electric field warning function for working staff in a high-voltage environment, it is necessary to study the influence of an approaching human body on the measurement of an electric field and to correct the measurement results. A single-shaft electric field measuring instrument called the Type LP-2000, which was developed by our research team, is used as the research object in this study. First, we explain the principle of electric field measurement and describe the capacitance effect produced by the human body. Through a theoretical analysis, we show that the measured electric field value decreases as a human body approaches. Their relationship is linearly proportional. Then, the ratio is identified as a correction coefficient to correct for the influence of human body proximity. The conclusion drawn from the theoretical analysis is proved via simulation. The correction coefficient kb = 1.8010 is obtained on the basis of the linear fitting of simulated data. Finally, a physical experiment is performed. When no human is present, we compare the results from the Type LP-2000 measured with Narda EFA-300 and the simulated value to verify the accuracy of the Type LP-2000. For the case of an approaching human body, the correction coefficient kb* = 1.9094 is obtained by comparing the data measured with the Type LP-2000 to the simulated value. The correction coefficient obtained from the experiment (i.e., kb*) is highly consistent with that obtained from the simulation (i.e., kb). 
Two experimental programs are set; under these programs, the excitation voltages and distance measuring points are regulated to produce different electric field intensities. Using kb = 1.9094, the corrected measurement of electric field intensity can accurately reflect the original environmental electric field intensity, and the maximal error is less than 6% in all the data comparisons. These results verify the effectiveness of our proposed method. PMID:27294936
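The correction step this record describes is a simple linear scaling of the instrument reading by the fitted coefficient. A minimal sketch (the coefficient 1.9094 is from the abstract; the example reading is invented):

```python
# Hedged sketch: applying a body-proximity correction coefficient to a
# field-meter reading. KB is the experimentally fitted ratio between the
# undisturbed field and the reading taken with an operator nearby.
KB = 1.9094  # correction coefficient kb* reported in the abstract

def correct_field(measured_kv_per_m: float, kb: float = KB) -> float:
    """Scale a reading taken with a human nearby back to the free-space field."""
    return kb * measured_kv_per_m

def relative_error(corrected: float, reference: float) -> float:
    """Relative error of the corrected value against a reference field."""
    return abs(corrected - reference) / reference

# Hypothetical 5.0 kV/m reading with an operator present:
corrected = correct_field(5.0)  # estimate of the true environmental field
```

The abstract's validation amounts to checking that `relative_error` stays below 0.06 across the experimental programs.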
Robert, Mark E; Linthicum, Fred H
2016-01-01
The profile count method for estimating cell number in sectioned tissue applies a correction factor for double counts (resulting from transection during sectioning) of the count units selected to represent the cell. For human spiral ganglion cell counts, we attempted to address apparent confusion between published correction factors for nucleus and nucleolus count units, which are identical despite the role of count unit diameter in a commonly used correction factor formula. We examined a portion of human cochlea to empirically derive correction factors for the 2 count units, using 3-dimensional reconstruction software to identify double counts. The study was conducted at the Neurotology and House Histological Temporal Bone Laboratory at the University of California, Los Angeles. Using a fully sectioned and stained human temporal bone, we identified and generated digital images of sections of the modiolar region of the lower first turn of the cochlea, identified count units with a light microscope, labeled them on corresponding digital sections, and used 3-dimensional reconstruction software to identify double-counted count units. For 25 consecutive sections, we determined that double-count correction factors for the nucleus count unit (0.91) and the nucleolus count unit (0.92) matched the published factors. We discovered that nuclei and, therefore, spiral ganglion cells were undercounted by 6.3% when using nucleolus count units. We determined that correction factors for count units must include an element for undercounting spiral ganglion cells as well as the double-count element. We recommend a correction factor of 0.91 for the nucleus count unit and 0.98 for the nucleolus count unit when using 20-µm sections. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
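The recommended nucleolus factor follows from combining the two elements the abstract names: the double-count factor (0.92) and the 6.3% undercount. A hedged arithmetic sketch (the combination rule shown is an illustration of how those two published numbers reconcile, not the authors' stated formula):

```python
# Profile-count correction: raw profile tallies are scaled by a factor.
def corrected_count(raw_profiles: int, factor: float) -> float:
    """Estimate cell number from raw profile counts via a correction factor."""
    return raw_profiles * factor

double_count_factor = 0.92   # nucleolus count unit, 20-um sections (abstract)
undercount = 0.063           # nucleoli miss 6.3% of ganglion cells (abstract)

# Inflating the double-count factor by the undercount reproduces the
# recommended net factor of ~0.98.
net_factor = round(double_count_factor * (1 + undercount), 2)
```

For example, 100 raw nucleolus profiles would yield an estimate of about 98 cells under the recommended factor.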
Parameter as a Switch Between Dynamical States of a Network in Population Decoding.
Yu, Jiali; Mao, Hua; Yi, Zhang
2017-04-01
Population coding is a method to represent stimuli using the collective activities of a number of neurons. However, it is difficult to extract information from these population codes because of the noise inherent in neuronal responses. Moreover, it is a challenge to identify the right parameter of the decoding model, which plays a key role in convergence. To address this problem, a population decoding model is proposed for parameter selection. Our method successfully identified the key conditions for a nonzero continuous attractor. Both the theoretical analysis and the application studies demonstrate the correctness and effectiveness of this strategy.
Correction of elevation offsets in multiple co-located lidar datasets
Thompson, David M.; Dalyander, P. Soupy; Long, Joseph W.; Plant, Nathaniel G.
2017-04-07
Topographic elevation data collected with airborne light detection and ranging (lidar) can be used to analyze short- and long-term changes to beach and dune systems. Analysis of multiple lidar datasets at Dauphin Island, Alabama, revealed systematic, island-wide elevation differences on the order of tens of centimeters that were not attributable to real-world change and, therefore, were likely to represent systematic sampling offsets. These offsets vary between the datasets, but appear spatially consistent within a given survey. This report describes a method that was developed to identify and correct offsets between lidar datasets collected over the same site at different times so that true elevation changes over time, associated with sediment accumulation or erosion, can be analyzed.
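One common way to remove a survey-wide offset like this is to estimate it as a robust statistic of the elevation differences at co-located points and subtract it. A minimal sketch under stated assumptions (the offset is spatially uniform within a survey, and a set of points unaffected by real change is available; the report's actual procedure may differ):

```python
from statistics import median

def estimate_offset(z_survey_a, z_survey_b):
    """Median elevation difference (m) at co-located, presumed-stable points."""
    return median(b - a for a, b in zip(z_survey_a, z_survey_b))

def remove_offset(z_survey_b, offset):
    """Shift survey B onto survey A's vertical datum."""
    return [z - offset for z in z_survey_b]

# Illustrative elevations at the same four stable points in two surveys:
a = [1.00, 1.50, 2.00, 2.40]
b = [1.22, 1.71, 2.21, 2.63]   # carries a ~0.2 m systematic shift
off = estimate_offset(a, b)
b_corrected = remove_offset(b, off)
```

The median is preferred over the mean here because a few points with genuine morphologic change act as outliers.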
The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison
Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth
2006-01-01
Background: Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results: The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion: The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. 
In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497
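The shared downstream step both pipelines rely on is a per-gene two-sample test on normalized log-intensities. A minimal sketch (gene values invented; Expresso's two-stage mixed-model ANOVA is considerably more elaborate than this Welch t-statistic):

```python
from statistics import mean, variance
from math import sqrt

def t_statistic(xs, ys):
    """Welch t-statistic comparing one gene's replicates in two conditions."""
    nx, ny = len(xs), len(ys)
    return (mean(xs) - mean(ys)) / sqrt(variance(xs) / nx + variance(ys) / ny)

# Illustrative log2 median intensity values (MIV) for one gene:
control = [7.1, 7.3, 7.0, 7.2]   # condition A replicates
treated = [8.9, 9.1, 8.8, 9.0]   # condition B replicates
t = t_statistic(control, treated)  # large |t| flags differential expression
```

In practice the statistic is converted to a p-value and adjusted for the thousands of genes tested simultaneously.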
NCLEX-RN performance: predicting success on the computerized examination.
Beeman, P B; Waterhouse, J K
2001-01-01
Since the adoption of the Computerized Adaptive Testing (CAT) format of the National Council Licensure Examination for Registered Nurses (NCLEX-RN), no studies have been reported in the literature on predictors of successful performance by baccalaureate nursing graduates on the licensure examination. In this study, a discriminant analysis was used to identify which of 21 variables were significant predictors of success on the CAT NCLEX-RN. The convenience sample consisted of 289 individuals who graduated from a baccalaureate nursing program between 1995 and 1998. Seven significant predictor variables were identified. The total number of C+ or lower grades earned in nursing theory courses was the best predictor, followed by grades in several individual nursing courses. More than 93 per cent of graduates were correctly classified. Ninety-four per cent of NCLEX "passes" were correctly classified, as were 92 per cent of NCLEX failures. This degree of accuracy in classifying CAT NCLEX-RN failures represents a marked improvement over results reported in previous studies of licensure examinations, and suggests the discriminant function will be helpful in identifying future students in danger of failure. J Prof Nurs 17:158-165, 2001. Copyright 2001 by W.B. Saunders Company
Vlek, Anneloes; Kolecka, Anna; Khayhan, Kantarawee; Theelen, Bart; Groenewald, Marizeth; Boel, Edwin
2014-01-01
An interlaboratory study using matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) to determine the identification of clinically important yeasts (n = 35) was performed at 11 clinical centers, one company, and one reference center using the Bruker Daltonics MALDI Biotyper system. The optimal cutoff for the MALDI-TOF MS score was investigated using receiver operating characteristic (ROC) curve analyses. The percentages of correct identifications were compared for different sample preparation methods and different databases. Logistic regression analysis was performed to analyze the association between the number of spectra in the database and the percentage of strains that were correctly identified. A total of 5,460 MALDI-TOF MS results were obtained. Using all results, the area under the ROC curve was 0.95 (95% confidence interval [CI], 0.94 to 0.96). With a sensitivity of 0.84 and a specificity of 0.97, a cutoff value of 1.7 was considered optimal. The overall percentage of correct identifications (formic acid-ethanol extraction method, score ≥ 1.7) was 61.5% when the commercial Bruker Daltonics database (BDAL) was used, and it increased to 86.8% by using an extended BDAL supplemented with a Centraalbureau voor Schimmelcultures (CBS)-KNAW Fungal Biodiversity Centre in-house database (BDAL+CBS in-house). A greater number of main spectra (MSP) in the database was associated with a higher percentage of correct identifications (odds ratio [OR], 1.10; 95% CI, 1.05 to 1.15; P < 0.01). The results from the direct transfer method ranged from 0% to 82.9% correct identifications, with the results of the top four centers ranging from 71.4% to 82.9% correct identifications. This study supports the use of a cutoff value of 1.7 for the identification of yeasts using MALDI-TOF MS. The inclusion of enough isolates of the same species in the database can enhance the proportion of correctly identified strains. 
Further optimization of the preparation methods, especially of the direct transfer method, may contribute to improved diagnosis of yeast-related infections. PMID:24920782
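The cutoff selection in this record comes from an ROC analysis; one standard way to pick the operating point is to maximize Youden's J = sensitivity + specificity - 1 over candidate cutoffs. A minimal sketch (scores and ground truth invented; the study's ROC over 5,460 results is what actually produced the 1.7 cutoff):

```python
def sens_spec(scores, truth, cutoff):
    """Sensitivity and specificity of calling identifications at >= cutoff."""
    tp = sum(s >= cutoff and t for s, t in zip(scores, truth))
    fn = sum(s < cutoff and t for s, t in zip(scores, truth))
    tn = sum(s < cutoff and not t for s, t in zip(scores, truth))
    fp = sum(s >= cutoff and not t for s, t in zip(scores, truth))
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, truth, candidates):
    """Candidate cutoff maximizing Youden's J = sens + spec - 1."""
    return max(candidates, key=lambda c: sum(sens_spec(scores, truth, c)) - 1)

# Illustrative MALDI Biotyper scores and whether the ID was actually correct:
scores = [1.2, 1.5, 1.6, 1.8, 1.9, 2.1, 2.3, 1.4]
truth = [False, False, True, True, True, True, True, False]
cut = best_cutoff(scores, truth, [1.5, 1.7, 2.0])
```

With this toy data the 1.7 candidate wins, mirroring the abstract's sensitivity/specificity trade-off at that threshold.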
De Sio, S; Cedrone, F; Greco, E; Di Traglia, M; Sanità, D; Mandolesi, D; Stansfeld, S A
2016-01-01
Psychosocial hazards and work-related stress have reached epidemic proportions in Europe. Italian law introduced in 2008 an obligation for companies to assess work-related stress risk in order to protect their workers' safety and health. The purpose of our study was to propose an accurate measurement tool, using the HSE indicator tool, for more appropriate and significant work-related stress prevention measures. The study was conducted on 204 visual display unit (VDU) operators: 106 male and 98 female. All subjects were administered the HSE questionnaire. The sample was studied through a 4-step process, using the HSE analysis tool and a statistical analysis based on the odds ratio calculation. The assessment model used demonstrated the presence of work-related stress in VDU operators and additional "critical" aspects which had failed to emerge through the classical use of the HSE analysis tool. The approach we propose makes it possible to obtain a complete picture of the perception of work-related stress and can point out the most appropriate corrective actions.
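The statistical analysis above hinges on odds-ratio calculations from 2x2 tables. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio (a*d)/(b*c) for the 2x2 table [[a, b], [c, d]],
    rows = exposed/unexposed, columns = outcome present/absent."""
    return (a * d) / (b * c)

# Hypothetical counts: 30 of 106 exposed workers report high stress,
# versus 15 of 98 unexposed workers.
or_value = odds_ratio(30, 76, 15, 83)  # > 1 suggests an association
```

An OR above 1 flags a questionnaire dimension as a candidate "critical" aspect; confidence intervals would be needed before acting on it.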
NASA Astrophysics Data System (ADS)
Nallala, Jayakrupakar; Gobinet, Cyril; Diebold, Marie-Danièle; Untereiner, Valérie; Bouché, Olivier; Manfait, Michel; Sockalingum, Ganesh Dhruvananda; Piot, Olivier
2012-11-01
Innovative diagnostic methods that could complement conventional histopathology for cancer diagnosis are urgently needed. In this perspective, we propose a new concept based on spectral histopathology, using IR spectral micro-imaging, directly applied to paraffinized colon tissue array stabilized in an agarose matrix without any chemical pre-treatment. In order to correct spectral interferences from paraffin and agarose, a mathematical procedure is implemented. The corrected spectral images are then processed by a multivariate clustering method to automatically recover, on the basis of their intrinsic molecular composition, the main histological classes of the normal and the tumoral colon tissue. The spectral signatures from different histological classes of the colonic tissues are analyzed using statistical methods (Kruskal-Wallis test and principal component analysis) to identify the most discriminant IR features. These features allow characterizing some of the biomolecular alterations associated with malignancy. Thus, via a single analysis, in a label-free and nondestructive manner, main changes associated with nucleotide, carbohydrates, and collagen features can be identified simultaneously between the compared normal and the cancerous tissues. The present study demonstrates the potential of IR spectral imaging as a complementary modern tool, to conventional histopathology, for an objective cancer diagnosis directly from paraffin-embedded tissue arrays.
Gilbert, Andrew R.; Keshavan, Matcheri S.; Diwadkar, Vaibhav; Nutche, Jeffrey; MacMaster, Frank; Easter, Phillip C.; Buhagiar, Christian J.; Rosenberg, David R.
2008-01-01
Neuroimaging studies have identified alterations in frontostriatal circuitry in OCD. Voxel-based morphometry (VBM) allows for the assessment of differences in gray matter density across the whole brain. VBM has not previously been used to examine regional gray matter density in pediatric OCD patients and the siblings of pediatric OCD patients. Volumetric magnetic resonance imaging (MRI) studies were conducted in 10 psychotropic-naïve pediatric patients with OCD, 10 unaffected siblings of pediatric patients with OCD, and 10 healthy controls. VBM analysis was conducted using SPM2. Statistical comparisons were performed with the general linear model, implementing small volume random field corrections for a priori regions of interest (anterior cingulate cortex or ACC, striatum, and thalamus). VBM analysis revealed significantly lower gray matter density in OCD patients compared to healthy controls in the left ACC and bilateral medial superior frontal gyrus (SFG). Furthermore, a small volume correction was used to identify a significantly greater gray matter density in the right putamen in OCD patients as compared to unaffected siblings of OCD patients. These findings in patients, siblings, and healthy controls, although preliminary, suggest the presence of gray matter structural differences between affected subjects and healthy controls as well as between affected subjects and individuals at risk for OCD. PMID:18314272
Preliminary evaluation of a nest usage sensor to detect double nest occupations of laying hens.
Zaninelli, Mauro; Costa, Annamaria; Tangorra, Francesco Maria; Rossi, Luciana; Agazzi, Alessandro; Savoini, Giovanni
2015-01-26
Conventional cage systems will be replaced by housing systems that allow hens to move freely. These systems may improve hens' welfare, but they lead to some disadvantages: disease, bone fractures, cannibalism, piling and lower egg production. New selection criteria for existing commercial strains should be identified considering individual data about laying performance and the behavior of hens. Many recording systems have been developed to collect these data. However, the management of double nest occupations remains critical for the correct egg-to-hen assignment. To limit such events, most systems adopt specific trap devices and additional mechanical components. Others, instead, only prevent these occurrences by narrowing the nest, without any detection and management. The aim of this study was to develop and test a nest usage "sensor", based on imaging analysis, that is able to automatically detect a double nest occupation. Results showed that the developed sensor correctly identified the double nest occupation occurrences. Therefore, the imaging analysis resulted in being a useful solution that could simplify the nest construction for this type of recording system, allowing the collection of more precise and accurate data, since double nest occupations would be managed and the normal laying behavior of hens would not be discouraged by the presence of the trap devices. PMID:25629704
Ford, Simon; Dosani, Maryam; Robinson, Ashley J; Campbell, G Claire; Ansermino, J Mark; Lim, Joanne; Lauder, Gillian R
2009-12-01
The ilioinguinal (II)/iliohypogastric (IH) nerve block is a safe, frequently used block that has been improved in efficacy and safety by the use of ultrasound guidance. We assessed the frequency with which pediatric anesthesiologists with limited experience with ultrasound-guided regional anesthesia could correctly identify anatomical structures within the inguinal region. Our primary outcome was to compare the frequency of correct identification of the transversus abdominis (TA) muscle with the frequency of correct identification of the II/IH nerves. We used 2 ultrasound machines with different capabilities to assess a potential equipment effect on success of structure identification and time taken for structure identification. Seven pediatric anesthesiologists with <6 mo experience with ultrasound-guided regional anesthesia performed a total of 127 scans of the II region in anesthetized children. The muscle planes and the II and IH nerves were identified and labeled. The ultrasound images were reviewed by a blinded expert to mark accuracy of structure identification and time taken for identification. Two ultrasound machines (Sonosite C180plus and Micromaxx, both from Sonosite, Bothell, WA) were used. There was no difference in the frequency of correct identification of the TA muscle compared with the II/IH nerves (chi(2) test, TA versus II, P = 0.45; TA versus IH, P = 0.50). Ultrasound machine selection did show a nonsignificant trend in improving correct II/IH nerve identification (II nerve chi(2) test, P = 0.02; IH nerve chi(2) test, P = 0.04; Bonferroni-corrected significance threshold 0.017) but not for the muscle planes (chi(2) test, P = 0.83) or time taken (1-way analysis of variance, P = 0.07). A curve of improving accuracy with number of scans was plotted, with reliability of TA recognition occurring after 14-15 scans and II/IH identification after 18 scans. 
We have demonstrated that although there is no difference in the overall accuracy of muscle plane versus II/IH nerve identification, the muscle planes are reliably identified after fewer scans of the inguinal region. We suggest that a reliable end point for the inexperienced practitioner of ultrasound-guided II/IH nerve block may be the TA/internal oblique plane where the nerves are reported to be found in 100% of cases.
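The study's machine comparison is a 2x2 chi-square test of correct versus incorrect identifications, judged against a Bonferroni-adjusted alpha. A minimal sketch (counts invented; with 1 degree of freedom the p-value follows from the complementary error function, so no stats library is needed):

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def p_value_1df(x2):
    """Upper-tail p-value of a chi-square statistic with 1 df."""
    return erfc(sqrt(x2 / 2))

alpha = 0.05 / 3  # Bonferroni correction for three comparisons

# Hypothetical correct/incorrect identification counts on two machines:
x2 = chi2_2x2(40, 24, 28, 35)
p = p_value_1df(x2)
significant = p < alpha  # a raw p ~0.04 fails the corrected threshold
```

This reproduces the abstract's situation: raw p-values in the 0.02 to 0.04 range read as a "nonsignificant trend" once the threshold drops to about 0.017.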
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mark Krause
2010-08-01
This Corrective Action Decision Document (CADD) presents information supporting the selection of corrective action alternatives (CAAs) leading to the closure of Corrective Action Unit (CAU) 562, Waste Systems, in Areas 2, 23, and 25 of the Nevada Test Site, Nevada. This complies with the requirements of the Federal Facility Agreement and Consent Order (FFACO) that was agreed to by the State of Nevada; U.S. Department of Energy (DOE), Environmental Management; U.S. Department of Defense; and DOE, Legacy Management. Corrective Action Unit 562 comprises the following corrective action sites (CASs): • 02-26-11, Lead Shot • 02-44-02, Paint Spills and French Drain • 02-59-01, Septic System • 02-60-01, Concrete Drain • 02-60-02, French Drain • 02-60-03, Steam Cleaning Drain • 02-60-04, French Drain • 02-60-05, French Drain • 02-60-06, French Drain • 02-60-07, French Drain • 23-60-01, Mud Trap Drain and Outfall • 23-99-06, Grease Trap • 25-60-04, Building 3123 Outfalls The purpose of this CADD is to identify and provide the rationale for the recommendation of CAAs for the 13 CASs within CAU 562. Corrective action investigation (CAI) activities were performed from July 27, 2009, through May 12, 2010, as set forth in the CAU 562 Corrective Action Investigation Plan. The purpose of the CAI was to fulfill the following data needs as defined during the data quality objective (DQO) process: • Determine whether COCs are present. • If COCs are present, determine their nature and extent. • Provide sufficient information and data to complete appropriate corrective actions. A data quality assessment (DQA) performed on the CAU 562 data demonstrated the quality and acceptability of the data for use in fulfilling the DQO data needs. Analytes detected during the CAI were evaluated against appropriate final action levels (FALs) to identify the COCs for each CAS. The results of the CAI identified COCs at 10 of the 13 CASs in CAU 562, and thus corrective action is required. 
Assessment of the data generated from investigation activities conducted at CAU 562 is shown in Table ES-1. Based on the evaluation of analytical data from the CAI, review of future and current operations at the 13 CASs, and the detailed and comparative analysis of the potential CAAs, the following corrective actions are recommended for CAU 562. • No further action is the preferred corrective action for CASs 02-60-01, 02-60-06, and 02-60-07. • Clean closure is the preferred corrective action for CASs 02-26-11, 02-44-02, 02-59-01, 02-60-02, 02-60-03, 02-60-04, 02-60-05, 23-60-01, 23-99-06, and 25-60-04. The preferred CAAs were evaluated on technical merit focusing on performance, reliability, feasibility, safety, and cost. The alternatives were judged to meet all requirements for the technical components evaluated. The alternatives meet all applicable federal and state regulations for closure of the site and will reduce potential exposures to contaminated media to acceptable levels. The DOE, National Nuclear Security Administration Nevada Site Office provides the following recommendations: • No further corrective action is required at CASs 02-60-01, 02-60-06, and 02-60-07. • Clean closure is recommended for the remaining 10 CASs in CAU 562. • A Corrective Action Plan will be submitted to the Nevada Division of Environmental Protection that contains a detailed description of the proposed actions that will be taken to implement the selected corrective actions.
Effectiveness of health management departments of universities that train health managers in Turkey.
Karagoz, Sevgul; Balci, Ali
2007-01-01
This research aimed to examine the effectiveness of the health management departments of universities which train health managers in Turkey. The study compares, for lecturers and students, nine variables of organisational effectiveness. These nine dimensions are derived from Cameron (1978; 1981; 1986). Factor analysis was used to validate the scale developed by the researcher. For internal consistency and reliability, the Cronbach Alpha reliability coefficient and item-total correlation were applied. A questionnaire was administered to a total of 207 people in health management departments in Turkey. In analysis of the data, descriptive statistics and the t-test were used. According to our research findings, at individual university level, lecturers found their departments more effective than did their students. The highest effectiveness was perceived at Baskent University, a private university. The best outcome was achieved for 'organisational health', and 'the ability to acquire resources' achieved the lowest outcome. Effectiveness overall was found to be moderate. Copyright (c) 2006 John Wiley & Sons, Ltd.
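The reliability check named above, Cronbach's alpha, is straightforward to compute from per-item score columns. A minimal pure-Python sketch (item scores invented for illustration):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists,
    each inner list holding one item's scores across all respondents."""
    k = len(items)
    item_vars = sum(pvariance(it) for it in items)
    totals = [sum(vals) for vals in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Three Likert items (1-5) answered by five hypothetical respondents:
items = [[4, 5, 3, 4, 4], [4, 4, 3, 5, 4], [5, 5, 3, 4, 4]]
alpha = cronbach_alpha(items)  # ~0.81, conventionally "acceptable"
```

Population variance is used consistently for both items and totals; the ratio is the same as with sample variance, so the choice does not affect alpha.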
Detection of counterfeit brand spirits using 1H NMR fingerprints in comparison to sensory analysis.
Kuballa, Thomas; Hausler, Thomas; Okaru, Alex O; Neufeld, Maria; Abuga, Kennedy O; Kibwage, Isaac O; Rehm, Jürgen; Luy, Burkhard; Walch, Stephan G; Lachenmeier, Dirk W
2018-04-15
Beverage fraud involving counterfeiting of brand spirits is an increasing problem, not only due to deception of the consumer but also because it poses health risks, e.g., from possible methanol admixture. Suspicious spirit samples from Russia and Kenya were analysed using 1H nuclear magnetic resonance (NMR) spectroscopy in comparison to authentic products. Using linear regression analysis of spectral integral values, 4 counterfeited samples from Russia and 2 from Kenya were easily identifiable with R² < 0.7. Sensory analysis using triangle test methodology confirmed significant taste differences between counterfeited and authentic samples, but the assessors were unable to correctly identify the counterfeited product in the majority of cases. An important conclusion is that consumers cannot be assumed to be self-responsible when consuming counterfeit alcohol, because there is no general ability to organoleptically detect counterfeit alcohol. Copyright © 2017 Elsevier Ltd. All rights reserved.
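The screening idea above reduces to regressing a suspect sample's spectral integrals against an authentic product's and flagging a low coefficient of determination. A minimal sketch (integral values invented; the 0.7 threshold is from the abstract):

```python
def r_squared(x, y):
    """R^2 of the least-squares line y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Illustrative NMR integral values at matched spectral regions:
authentic = [1.0, 2.0, 3.0, 4.0, 5.0]
suspect = [1.1, 1.9, 3.2, 3.9, 5.1]   # closely tracks the authentic profile
r2 = r_squared(authentic, suspect)
flagged = r2 < 0.7                     # counterfeit suspicion threshold
```

A genuine sample should track the reference closely (R² near 1), while the counterfeits in the study fell below 0.7.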
NASA Astrophysics Data System (ADS)
Warren, Aaron R.
2009-11-01
Time-series designs are an alternative to pretest-posttest methods that are able to identify and measure the impacts of multiple educational interventions, even for small student populations. Here, we use an instrument employing standard multiple-choice conceptual questions to collect data from students at regular intervals. The questions are modified by asking students to distribute 100 Confidence Points among the options in order to indicate the perceived likelihood of each answer option being the correct one. Tracking the class-averaged ratings for each option produces a set of time-series. ARIMA (autoregressive integrated moving average) analysis is then used to test for, and measure, changes in each series. In particular, it is possible to discern which educational interventions produce significant changes in class performance. Cluster analysis can also identify groups of students whose ratings evolve in similar ways. A brief overview of our methods and an example are presented.
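The raw series in this design are class averages of the 100 Confidence Points each student distributes across answer options. A simplified sketch of that aggregation step (student ratings invented; the actual intervention analysis uses ARIMA models, not shown here):

```python
def class_average(ratings):
    """Average confidence profile across students.
    ratings: one list per student of points assigned to each answer option;
    each student's points sum to 100."""
    n = len(ratings)
    return [sum(col) / n for col in zip(*ratings)]

# Three hypothetical students rating a four-option question in one week:
week3 = [[60, 20, 10, 10], [40, 30, 20, 10], [50, 30, 10, 10]]
avg = class_average(week3)  # one point in each option's time series
```

Collecting `avg` at regular intervals yields the per-option time series that ARIMA then tests for level shifts at each intervention.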
NASA Astrophysics Data System (ADS)
Meng, Qizhi; Xie, Fugui; Liu, Xin-Jun
2018-06-01
This paper deals with the conceptual design, kinematic analysis and workspace identification of a novel four degrees-of-freedom (DOFs) high-speed spatial parallel robot for pick-and-place operations. The proposed spatial parallel robot consists of a base, four arms and a 1½ mobile platform. The mobile platform is a major innovation that avoids output singularity and offers the advantages of both single and double platforms. To investigate the characteristics of the robot's DOFs, a line graph method based on Grassmann line geometry is adopted in mobility analysis. In addition, the inverse kinematics is derived, and the constraint conditions to identify the correct solution are also provided. On the basis of the proposed concept, the workspace of the robot is identified using a set of presupposed parameters by taking input and output transmission index as the performance evaluation criteria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patrick Matthews
Corrective Action Unit (CAU) 371 is located in Areas 11 and 18 of the Nevada Test Site, which is approximately 65 miles northwest of Las Vegas, Nevada. Corrective Action Unit 371 is comprised of the two corrective action sites (CASs) listed below: • 11-23-05, Pin Stripe Contamination Area • 18-45-01, U-18j-2 Crater (Johnnie Boy) These sites are being investigated because existing information on the nature and extent of potential contamination is insufficient to evaluate and recommend corrective action alternatives. Additional information will be obtained by conducting a corrective action investigation before evaluating corrective action alternatives and selecting the appropriate corrective action for each CAS. The results of the field investigation will support a defensible evaluation of viable corrective action alternatives that will be presented in the Corrective Action Decision Document. The sites will be investigated based on the data quality objectives (DQOs) developed on November 19, 2008, by representatives of the Nevada Division of Environmental Protection; U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office; Stoller-Navarro Joint Venture; and National Security Technologies, LLC. The DQO process was used to identify and define the type, amount, and quality of data needed to develop and evaluate appropriate corrective actions for CAU 371. Appendix A provides a detailed discussion of the DQO methodology and the DQOs specific to each CAS. The scope of the corrective action investigation for CAU 371 includes the following activities: • Move surface debris and/or materials, as needed, to facilitate sampling. • Conduct radiological surveys. • Measure in situ external dose rates using thermoluminescent dosimeters or other dose measurement devices. • Collect and submit environmental samples for laboratory analysis to determine internal dose rates. 
• Combine internal and external dose rates to determine whether total dose rates exceed final action levels (FALs). • Collect and submit environmental samples for laboratory analysis to determine whether chemical contaminants are present at concentrations exceeding FALs. • If contamination exceeds FALs, define the extent of the contamination exceeding FALs. • Investigate waste to determine whether potential source material is present. This Corrective Action Investigation Plan has been developed in accordance with the Federal Facility Agreement and Consent Order that was agreed to by the State of Nevada; U.S. Department of Energy; and U.S. Department of Defense. Under the Federal Facility Agreement and Consent Order, this Corrective Action Investigation Plan will be submitted to the Nevada Division of Environmental Protection for approval. Fieldwork will be conducted following approval of the plan.
Correction of Measured Taxicab Exhaust Emission Data Based on CMEM Model
NASA Astrophysics Data System (ADS)
Li, Q.; Jia, T.
2017-09-01
Carbon dioxide emissions from urban road traffic come mainly from automobile exhaust. However, the carbon dioxide measurements obtained by on-board instruments are unreliable due to time delay error. To improve data reliability, we propose a method to correct instrument-measured vehicle carbon dioxide emissions based on the CMEM model. First, a synthetic time series of carbon dioxide emissions is simulated using the CMEM model and GPS velocity data. Then, taking the simulated data as the control group, the time delay error of the measured carbon dioxide emissions is estimated by asynchronous correlation analysis, and outliers are automatically identified and corrected using the DTW algorithm. Taking taxi trajectory data from Wuhan as an example, the results show that (1) the correlation coefficient between the measured data and the control-group data improves from 0.52 to 0.59 after mitigating the systematic time delay error, and (2) after adjusting the outliers, which account for 4.73% of the total data, the correlation coefficient rises to 0.63, indicating strong correlation. Low-carbon traffic has become a focus of local government; to support energy saving and emission reduction, we studied the distribution of carbon emissions from motor vehicle exhaust, and the corrected data can be used for further air quality analysis.
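The delay-estimation step described above can be sketched as follows. This is an illustration, not the paper's implementation: the function name and synthetic data are invented, and the DTW-based outlier correction is omitted.

```python
import numpy as np

def estimate_delay(measured, simulated, max_lag):
    """Return the lag (in samples) that maximizes Pearson correlation
    between a measured series and a model-simulated control series."""
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = measured[lag:], simulated[:len(simulated) - lag]
        else:
            a, b = measured[:lag], simulated[-lag:]
        r = np.corrcoef(a, b)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 20, 400)) + 0.05 * rng.standard_normal(400)
delayed = np.roll(truth, 7)          # measured series lags by 7 samples
lag, r = estimate_delay(delayed, truth, max_lag=20)
```

Once the systematic lag is recovered, the measured series can be shifted back before any pointwise comparison with the control group.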
Using Mason number to predict MR damper performance from limited test data
NASA Astrophysics Data System (ADS)
Becnel, Andrew C.; Wereley, Norman M.
2017-05-01
The Mason number can be used to produce a single master curve which relates MR fluid stress versus strain rate behavior across a wide range of shear rates, temperatures, and applied magnetic fields. As applications of MR fluid energy absorbers expand to a variety of industries and operating environments, Mason number analysis offers a path to designing devices with desired performance from a minimal set of preliminary test data. Temperature strongly affects the off-state viscosity of the fluid, as the passive viscous force drops considerably at higher temperatures. Yield stress is not similarly affected, and stays relatively constant with changing temperature. In this study, a small model-scale MR fluid rotary energy absorber is used to measure the temperature correction factor of a commercially-available MR fluid from LORD Corporation. This temperature correction factor is identified from shear stress vs. shear rate data collected at four different temperatures. Measurements of the MR fluid yield stress are also obtained and related to a standard empirical formula. From these two MR fluid properties - temperature-dependent viscosity and yield stress - the temperature-corrected Mason number is shown to predict the force vs. velocity performance of a full-scale rotary MR fluid energy absorber. This analysis technique expands the design space of MR devices to high shear rates and allows for comprehensive predictions of overall performance across a wide range of operating conditions from knowledge only of the yield stress vs. applied magnetic field and a temperature-dependent viscosity correction factor.
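The master-curve idea above can be sketched numerically. One common (assumed) convention takes the Mason number as the ratio of viscous stress to field-induced yield stress; the Arrhenius-style temperature model and all constants below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def viscosity(T_kelvin, eta_ref=0.1, T_ref=298.0, b=1500.0):
    """Assumed Arrhenius-style temperature correction: off-state
    viscosity drops as temperature rises."""
    return eta_ref * np.exp(b * (1.0 / T_kelvin - 1.0 / T_ref))

def mason_number(gamma_dot, T_kelvin, tau_y):
    """Assumed definition: viscous stress over field-induced yield stress."""
    return viscosity(T_kelvin) * gamma_dot / tau_y

tau_y = 40e3                             # yield stress at some field, Pa (illustrative)
for T in (280.0, 300.0, 330.0):
    gamma_dot = np.logspace(1, 4, 4)     # shear rates, 1/s
    # Bingham-plastic stress at this temperature:
    tau = tau_y + viscosity(T) * gamma_dot
    mn = mason_number(gamma_dot, T, tau_y)
    # For a Bingham fluid, tau / tau_y = 1 + Mn regardless of temperature,
    # which is the collapse onto a single master curve.
```

The point of the nondimensionalization is that data taken at different temperatures and shear rates land on the same normalized stress vs. Mason number curve.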
A novel approach for evaluating the risk of health care failure modes.
Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang
2012-12-01
Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with data envelopment analysis (DEA) using the slack-based measure (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach are illustrated by a health care case. As a systematic approach for improving the service quality of health care, the method offers quantitative corrective information on the risk indexes that can thereafter reduce failure possibility. For safety improvement, these new targets for the risk indexes could be used for management by objectives. FMEA alone cannot provide quantitative corrective information on risk indexes; the proposed approach overcomes this chief shortcoming. By combining the DEA SBM model with FMEA, two goals, increased patient safety and reduced medical cost, can be achieved together.
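A minimal sketch of the FMEA-DEA idea follows. Note the hedges: the paper uses the slack-based-measure (SBM) model, while this illustration uses the simpler input-oriented CCR model; treating the three risk indexes as DEA inputs with a dummy unit output per failure mode is a common convention in FMEA-DEA hybrids, and the data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Rows: failure modes; columns: severity, occurrence, detection (invented).
X = np.array([[2.0, 2.0, 2.0],   # failure mode 0: lowest risk on all indexes
              [4.0, 6.0, 3.0],
              [7.0, 3.0, 5.0],
              [5.0, 5.0, 8.0]])
Y = np.ones(len(X))              # dummy unit output for every failure mode

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of failure mode k (1.0 = efficient)."""
    n, m = X.shape
    # Decision variables: [theta, lambda_0 ... lambda_{n-1}]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                          # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    A_ub.append(np.concatenate(([0.0], -Y)))    # sum_j lam_j * y_j >= y_k
    b_ub.append(-Y[k])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [ccr_efficiency(k) for k in range(len(X))]
```

An efficiency below 1.0 indicates how far a failure mode's risk indexes could be proportionally reduced relative to the efficient frontier, which is the kind of quantitative corrective target the abstract describes.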
Massett, Holly A; Mishkin, Grace; Rubinstein, Larry; Ivy, S Percy; Denicoff, Andrea; Godwin, Elizabeth; DiPiazza, Kate; Bolognese, Jennifer; Zwiebel, James A; Abrams, Jeffrey S
2016-11-15
Accruing patients in a timely manner represents a significant challenge to early phase cancer clinical trials. The NCI Cancer Therapy Evaluation Program analyzed 19 months of corrective action plans (CAP) received for slow-accruing phase I and II trials to identify slow accrual reasons, evaluate whether proposed corrective actions matched these reasons, and assess the CAP impact on trial accrual, duration, and likelihood of meeting primary scientific objectives. Of the 135 CAPs analyzed, 69 were for phase I trials and 66 for phase II trials. Primary reasons cited for slow accrual were safety/toxicity (phase I: 48%), design/protocol concerns (phase I: 42%, phase II: 33%), and eligibility criteria (phase I: 41%, phase II: 35%). The most commonly proposed corrective actions were adding institutions (phase I: 43%, phase II: 85%) and amending the trial to change eligibility or design (phase I: 55%, phase II: 44%). Only 40% of CAPs provided proposed corrective actions that matched the reasons given for slow accrual. Seventy percent of trials were closed to accrual at time of analysis (phase I = 48; phase II = 46). Of these, 67% of phase I and 70% of phase II trials met their primary objectives, but they were active three times longer than projected. Among closed trials, 24% had an accrual rate increase associated with a greater likelihood of meeting their primary scientific objectives. Ultimately, trials receiving CAPs saw improved accrual rates. Future trials may benefit from implementing CAPs early in trial life cycles, but it may be more beneficial to invest in earlier accrual planning. Clin Cancer Res; 22(22); 5408-16. ©2016 AACRSee related commentary by Mileham and Kim, p. 5397. ©2016 American Association for Cancer Research.
Tsai, Wei-Chung; Lee, Kun-Tai; Wu, Ming-Tsang; Chu, Chih-Sheng; Lin, Tsung-Hsien; Hsu, Po-Chao; Su, Ho-Ming; Voon, Wen-Chol; Lai, Wen-Ter; Sheu, Sheng-Hsiung
2013-07-01
The 12-lead electrocardiogram (ECG) is a commonly used tool to assess left atrial enlargement, which is a marker of left ventricular diastolic dysfunction (LVDD). The aim of this study was to evaluate the association of ECG P-wave measurements with left atrial volume (LAV) index and LVDD. This study enrolled 270 patients. Four ECG P-wave parameters corrected for heart rate were measured: corrected P-wave maximum duration (PWdurMaxC), corrected P-wave dispersion (PWdisperC), corrected P-wave area (PWareaC) and corrected mean P-wave duration (meanPWdurC). LAV and left ventricular diastolic parameters were measured by echocardiography. LVDD was defined as a pseudonormal or restrictive mitral inflow pattern. The 4 P-wave parameters were significantly correlated with the LAV index after adjusting for age, sex, diabetes, hypertension, coronary artery disease, body mass index and diastolic blood pressure in multivariate analysis. The standardized β coefficients of PWdurMaxC, PWdisperC, meanPWdurC and PWareaC were 0.338, 0.298, 0.215 and 0.296, respectively. The 4 P-wave parameters were also significantly associated with LVDD in multivariate logistic regression analysis. The odds ratios (95% confidence intervals) of PWdurMaxC, PWdisperC, meanPWdurC and PWareaC were 1.03 (1.01-1.04), 1.02 (1.04-1.04), 1.04 (1.02-1.07) and 1.01 (1.00-1.02), respectively. This study demonstrated that PWdurMaxC, PWdisperC, meanPWdurC and PWareaC were important determinants of the LAV index and LVDD. Therefore, screening patients by means of the 12-lead ECG may be helpful in identifying a high-risk group with increased LAV index and LVDD.
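The abstract does not state which heart-rate correction formula was applied. As a hedged illustration only, the sketch below assumes a Bazett-style normalization (dividing by the square root of the RR interval in seconds), a common convention for rate-correcting ECG intervals; the values are invented.

```python
import math

def bazett_correct(value_ms, heart_rate_bpm):
    """Assumed Bazett-style rate correction: divide by sqrt(RR in seconds)."""
    rr_seconds = 60.0 / heart_rate_bpm
    return value_ms / math.sqrt(rr_seconds)

p_wave_max_ms = 118.0                     # illustrative raw P-wave maximum duration
pw_dur_max_c = bazett_correct(p_wave_max_ms, heart_rate_bpm=75)
```

At 60 bpm (RR = 1 s) the corrected and raw values coincide; above 60 bpm the correction inflates the measured duration.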
Hosseini, Seyed Mojtaba; Etesaminia, Samira; Jafari, Mehrnoosh
2016-01-01
Introduction: An important element of sound hospital management is identifying the reasons patients favor private hospitals. This study measures these factors based on the service marketing mix. Patients and methods: This cross-sectional descriptive study was conducted over 6 months in 2015. The study population comprised patients of private hospitals in Tehran, selected by random sampling (n = 200). Data were collected with an author-designed questionnaire on service marketing factors, whose reliability and validity were confirmed. Data were analyzed by factor analysis in SPSS 20. Results: The results showed that constant attendance of physicians and nurses has the highest effect (0.707) on patient tendency toward private hospitals. PMID:27999486
Masuyama, Kotoka; Shojo, Hideki; Nakanishi, Hiroaki; Inokuchi, Shota; Adachi, Noboru
2017-01-01
Sex determination is important in archeology and anthropology for the study of past societies, cultures, and human activities. Sex determination is also one of the most important components of individual identification in criminal investigations. We developed a new method of sex determination by detecting a single-nucleotide polymorphism in the amelogenin gene using amplified product-length polymorphisms in combination with sex-determining region Y analysis. We particularly focused on the most common types of postmortem DNA damage in ancient and forensic samples: fragmentation and nucleotide modification resulting from deamination. Amplicon size was designed to be less than 60 bp to make the method more useful for analyzing degraded DNA samples. All DNA samples collected from eight Japanese individuals (four male, four female) were evaluated correctly using our method. The detection limit for accurate sex determination was determined to be 20 pg of DNA. We compared our new method with commercial short tandem repeat analysis kits using DNA samples artificially fragmented by ultraviolet irradiation. Our novel method was the most robust for highly fragmented DNA samples. To deal with allelic dropout resulting from deamination, we adopted "bidirectional analysis," which analyzed samples from both sense and antisense strands. This new method was applied to 14 Jomon individuals (3500-year-old bone samples) whose sex had been identified morphologically. We could correctly identify the sex of 11 out of 14 individuals. These results show that our method is reliable for the sex determination of highly degenerated samples.
76 FR 44010 - Medicare Program; Hospice Wage Index for Fiscal Year 2012; Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
.... 93.774, Medicare-- Supplementary Medical Insurance Program) Dated: July 15, 2011. Dawn L. Smalls... corrects technical errors that appeared in the notice of CMS ruling published in the Federal Register on... FR 26731), there were technical errors that are identified and corrected in the Correction of Errors...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desai, V; Labby, Z; Culberson, W
Purpose: To determine whether body site-specific treatment plans form unique “plan class” clusters in a multi-dimensional analysis of plan complexity metrics such that a single beam quality correction determined for a representative plan could be universally applied within the “plan class”, thereby increasing the dosimetric accuracy of a detector’s response within a subset of similarly modulated nonstandard deliveries. Methods: We collected 95 clinical volumetric modulated arc therapy (VMAT) plans from four body sites (brain, lung, prostate, and spine). The lung data were further subdivided into SBRT and non-SBRT data for a total of five plan classes. For each control point in each plan, a variety of aperture-based complexity metrics were calculated and stored as unique characteristics of each patient plan. A multiple comparison of means analysis was performed such that every plan class was compared to every other plan class for every complexity metric in order to determine which groups could be considered different from one another. Statistical significance was assessed after correcting for multiple hypothesis testing. Results: Six out of a possible 10 pairwise plan class comparisons were uniquely distinguished based on at least nine out of 14 of the proposed metrics (Brain/Lung, Brain/SBRT lung, Lung/Prostate, Lung/SBRT Lung, Lung/Spine, Prostate/SBRT Lung). Eight out of 14 of the complexity metrics could distinguish at least six out of the possible 10 pairwise plan class comparisons. Conclusion: Aperture-based complexity metrics could prove to be useful tools to quantitatively describe a distinct class of treatment plans. Certain plan-averaged complexity metrics could be considered unique characteristics of a particular plan. 
A new approach to generating plan-class specific reference (PCSR) fields could be established through a targeted preservation of select complexity metrics or a clustering algorithm that identifies plans exhibiting similar modulation characteristics. Measurements and simulations will better elucidate potential plan-class specific dosimetry correction factors.
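The pairwise comparison pattern described above can be sketched as follows. This is a hedged stand-in: the abstract does not specify the test or the multiple-testing correction used, so the sketch applies two-sample t-tests with a Bonferroni adjustment to simulated plan-averaged metric values.

```python
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated plan-averaged aperture complexity metric per plan class (invented).
plan_classes = {
    "brain":    rng.normal(0.30, 0.05, 20),
    "lung":     rng.normal(0.45, 0.05, 20),
    "prostate": rng.normal(0.31, 0.05, 20),
}

pairs = list(itertools.combinations(plan_classes, 2))
alpha = 0.05 / len(pairs)            # Bonferroni-adjusted threshold
distinct = {}
for a, b in pairs:
    t, p = stats.ttest_ind(plan_classes[a], plan_classes[b])
    distinct[(a, b)] = p < alpha     # True if the classes separate on this metric
```

A pair of plan classes that separates on many metrics would be a candidate for distinct beam quality corrections, per the conclusion above.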
Sturrock, Hugh J W; Gething, Pete W; Ashton, Ruth A; Kolaczinski, Jan H; Kabatereine, Narcis B; Brooker, Simon
2011-09-01
In schistosomiasis control, there is a need to geographically target treatment to populations at high risk of morbidity. This paper evaluates alternative sampling strategies for surveys of Schistosoma mansoni to target mass drug administration in Kenya and Ethiopia. Two main designs are considered: lot quality assurance sampling (LQAS) of children from all schools; and a geostatistical design that samples a subset of schools and uses semi-variogram analysis and spatial interpolation to predict prevalence in the remaining unsurveyed schools. Computerized simulations are used to investigate the performance of the sampling strategies in correctly classifying schools according to treatment needs and their cost-effectiveness in identifying high prevalence schools. LQAS performs better than geostatistical sampling in correctly classifying schools, but at a higher cost per high prevalence school correctly classified. It is suggested that the optimal surveying strategy for S. mansoni needs to take into account the goals of the control programme and the financial and drug resources available.
The distinct emotional flavor of Gnostic writings from the early Christian era.
Whissell, Cynthia
2008-02-01
More than 500,000 scored words in 83 documents were used to conclude that it is possible to identify the source of documents (proto-orthodox Christian versus early Gnostic) on the basis of the emotions underlying the words. Twenty-seven New Testament works and seven Gnostic documents (including the gospels of Thomas, Judas, and Mary [Magdalene]) were scored with the Dictionary of Affect in Language. Patterns of emotional word use focusing on eight types of extreme emotional words were employed in a discriminant function analysis to predict source. Prediction was highly successful (canonical r = .81, 97% correct identification of source). When the discriminant function was tested with more than 30 additional Gnostic and Christian works including a variety of translations and some wisdom books, it correctly classified all of them. The majority of the predictive power of the function (97% of all correct categorizations, 70% of the canonical r2) was associated with the preferential presence of passive and passive/pleasant words in Gnostic documents.
Baumann, Soo Mee; Webb, Patrick; Zeller, Manfred
2013-03-01
Cross-cultural validity of food security indicators is commonly presumed without questioning the suitability of generic indicators in different geographic settings. However, ethnic differences in the perception of, and reporting on, food insecurity, as well as variations in consumption patterns, may limit the comparability of results. Although research on correction factors for the standardization of food security indicators is in progress, so far no universal indicator has been identified. The current paper considers the ability of the Food Consumption Score (FCS), developed by the World Food Programme in southern Africa in 1996, to meet the requirement of local cultural validity in a Laotian context. The analysis is based on research that seeks to identify options for correcting possible biases linked to cultural disparities. Based on the results of a household survey conducted in different agroecological zones of Laos in 2009, the FCS was validated against a benchmark of calorie consumption. Changing the thresholds and excluding small amounts of food items consumed were tested as options to correct for biases caused by cultural disparities. The FCS in its original form underestimates the food insecurity level in the surveyed villages. However, the closeness of fit of the FCS to the benchmark classification improves when small amounts of food items are excluded from the assessment. Further research in different cultural settings is required to generate more insight into the extent to which universal thresholds can be applied to dietary diversity indicators with or without locally determined correction factors such as the exclusion of small amounts of food items.
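For readers unfamiliar with the indicator, the FCS computation can be sketched as below, using the standard WFP food-group weights and the commonly used 21/35 thresholds. The paper's point is precisely that these thresholds, and the inclusion of small amounts, may need local adjustment; the household data here are invented.

```python
# Standard WFP food-group weights for the Food Consumption Score.
FCS_WEIGHTS = {
    "staples": 2.0, "pulses": 3.0, "vegetables": 1.0, "fruit": 1.0,
    "meat_fish": 4.0, "milk": 4.0, "sugar": 0.5, "oil": 0.5,
}

def food_consumption_score(days_consumed):
    """Weighted sum of days each food group was consumed in the past 7 days."""
    return sum(FCS_WEIGHTS[g] * min(d, 7) for g, d in days_consumed.items())

def fcs_class(score, poor=21.0, borderline=35.0):
    """Classify a score using the commonly applied 21/35 cut-offs."""
    if score <= poor:
        return "poor"
    return "borderline" if score <= borderline else "acceptable"

household = {"staples": 7, "pulses": 2, "vegetables": 5, "fruit": 1,
             "meat_fish": 1, "milk": 0, "sugar": 6, "oil": 7}
score = food_consumption_score(household)
```

The correction options tested in the paper amount to changing the `poor`/`borderline` parameters or zeroing out food groups consumed only in small amounts before the weighted sum.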
Analysis of interacting entropy-corrected holographic and new agegraphic dark energies
NASA Astrophysics Data System (ADS)
Ranjit, Chayan; Debnath, Ujjal
In the present work, we assume a flat FRW universe filled with interacting dark matter and dark energy. For the dark energy, we consider the entropy-corrected holographic dark energy (ECHDE) model and the entropy-corrected new agegraphic dark energy (ECNADE) model. For the entropy-corrected models, we assume logarithmic and power-law corrections. For the ECHDE model, the length scale L is taken to be the Hubble horizon or the future event horizon. The ω_de-ω'_de analysis for the different horizons is discussed.
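For reference, the entropy-corrected density expressions the abstract refers to are commonly written as follows; the coefficients γ₁, γ₂, β and the exponent α parameterize the corrections, and sign and normalization conventions vary between papers, so these should be read as representative forms rather than the authors' exact expressions.

```latex
% Logarithmic entropy-corrected holographic dark energy density:
\rho_{\Lambda} = 3c^{2} M_{p}^{2} L^{-2}
               + \gamma_{1}\, L^{-4} \ln\!\left(M_{p}^{2} L^{2}\right)
               + \gamma_{2}\, L^{-4}

% Power-law entropy-corrected form:
\rho_{\Lambda} = 3c^{2} M_{p}^{2} L^{-2} - \beta\, M_{p}^{2} L^{-\alpha}
```

The correction terms are subleading for large L, so both forms recover the standard holographic density 3c²M_p²L⁻² at late times.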
Oberle, Michael; Wohlwend, Nadia; Jonas, Daniel; Maurer, Florian P; Jost, Geraldine; Tschudin-Sutter, Sarah; Vranckx, Katleen; Egli, Adrian
2016-01-01
The technical, biological, and inter-center reproducibility of matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) typing data has not yet been explored. The aim of this study is to compare typing data from multiple centers employing bioinformatics using bacterial strains from two past outbreaks and non-related strains. Participants received twelve extended-spectrum beta-lactamase-producing E. coli isolates and followed the same standard operating procedure (SOP), including a full-protein extraction protocol. All laboratories provided visually read spectra via flexAnalysis (Bruker, Germany). Raw data from each laboratory allowed calculation of the technical and biological reproducibility between centers using BioNumerics (Applied Maths NV, Belgium). Technical and biological reproducibility ranged from 96.8% to 99.4% and from 47.6% to 94.4%, respectively. The inter-center reproducibility showed comparable clustering among identical isolates. Principal component analysis indicated a higher tendency to cluster within the same center; we therefore used a discriminant analysis, which completely separated the clusters. Next, we defined a reference center and performed a statistical analysis to identify peaks specific to the outbreak clusters. Finally, we trained a linear support vector machine classifier on the determined peaks. A validation showed that, within the set of the reference center, the identification of the cluster was 100% correct, with a large contrast between the score for the correct cluster and the next best scoring cluster. Based on the sufficient technical and biological reproducibility of MALDI-TOF MS spectra, detection of specific clusters is possible from spectra obtained at different centers. However, we believe that a shared SOP and a bioinformatics approach are required to make the analysis robust and reliable.
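The peak-based discrimination step can be sketched as follows. Hedges: the paper trains a linear support vector machine; this stand-in uses a simpler linear decision rule (difference-of-means direction through the midpoint), and the peak intensities are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
# Rows: spectra; columns: intensities at a few discriminatory m/z peaks.
cluster_a = rng.normal([1.0, 0.2, 0.9], 0.1, size=(10, 3))
cluster_b = rng.normal([0.3, 1.1, 0.8], 0.1, size=(10, 3))

# Linear decision rule: project onto the difference-of-means direction
# and threshold at the midpoint between the two cluster centroids.
w = cluster_a.mean(axis=0) - cluster_b.mean(axis=0)
midpoint = (cluster_a.mean(axis=0) + cluster_b.mean(axis=0)) / 2
bias = -w @ midpoint

def predict(spectrum):
    return "A" if w @ spectrum + bias > 0 else "B"

train_acc = (
    sum(predict(s) == "A" for s in cluster_a)
    + sum(predict(s) == "B" for s in cluster_b)
) / 20
```

A margin-maximizing SVM would place the separating plane more robustly when clusters overlap, which is presumably why the authors chose it over a simple centroid rule.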
Brain Injury Vision Symptom Survey (BIVSS) Questionnaire.
Laukkanen, Hannu; Scheiman, Mitchell; Hayes, John R
2017-01-01
Validation of the Brain Injury Vision Symptom Survey (BIVSS), a self-administered survey for vision symptoms related to traumatic brain injury (TBI). A 28-item vision symptom questionnaire was completed by 107 adult subjects (mean age 42.1, 16.2 SD, range 18-75) who self-reported as having sustained mild-to-moderate TBI and two groups of reference adult subjects (first-year optometry students: mean age 23.2, 2.8 SD, range 20-39; and 71 third-year optometry students: mean age 26.0, 2.9 SD, range 22-42) without TBI. Both a Likert-style method of analysis with factor analysis and a Rasch analysis were used. Logistic regression was used to determine sensitivity and specificity. At least 27 of 28 questions were completed by 93.5% of TBI subjects, and all 28 items were completed by all of the 157 reference subjects. BIVSS sensitivity was 82.2% for correctly predicting TBI and 90.4% for correctly predicting the optometry students. Factor analysis identified eight latent variables; six factors were positive in their risk for TBI. Other than dry eye and double vision, the TBI patients were significantly more symptomatic than either cohort of optometry students by at least one standard deviation (p < 0.001). Twenty-five of 28 questions were within limits for creating a single-dimension Rasch scale. Nearly all of the adult TBI subjects were able to self-complete the BIVSS, and there was significant mean score separation between TBI and non-TBI groups. The Rasch analysis revealed a single dimension associated with TBI. Using the Likert method with the BIVSS, it may be possible to identify different vision symptom profiles with TBI patients. The BIVSS seems to be a promising tool for better understanding the complex and diverse nature of vision symptoms that are associated with brain injury.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant Evenson
2006-05-01
This Corrective Action Decision Document has been prepared for Corrective Action Unit (CAU) 151, Septic Systems and Discharge Area, at the Nevada Test Site, Nevada, according to the ''Federal Facility Agreement and Consent Order'' (FFACO) (1996). Corrective Action Unit 151 is comprised of eight corrective action sites (CASs): (1) CAS 02-05-01, UE-2ce Pond; (2) CAS 12-03-01, Sewage Lagoons (6); (3) CAS 12-04-01, Septic Tanks; (4) CAS 12-04-02, Septic Tanks; (5) CAS 12-04-03, Septic Tank; (6) CAS 12-47-01, Wastewater Pond; (7) CAS 18-03-01, Sewage Lagoon; and (8) CAS 18-99-09, Sewer Line (Exposed). The purpose of this Corrective Action Decision Document is to identify and provide the rationale for the recommendation of corrective action alternatives (CAAs) for each of the eight CASs within CAU 151. Corrective action investigation (CAI) activities were performed from September 12 through November 18, 2005, as set forth in the CAU 151 Corrective Action Investigation Plan and Record of Technical Change No. 1. Additional confirmation sampling was performed on December 9, 2005; January 10, 2006; and February 13, 2006. Analytes detected during the CAI were evaluated against appropriate final action levels (FALs) to identify the contaminants of concern for each CAS. The results of the CAI identified contaminants of concern at two of the eight CASs in CAU 151 and required the evaluation of CAAs. Assessment of the data generated from investigation activities conducted at CAU 151 revealed the following: (1) Soils at CASs 02-05-01, 12-04-01, 12-04-02, 12-04-03, 12-47-01, 18-03-01, 18-99-09, and Lagoons B through G of CAS 12-03-01 do not contain contamination at concentrations exceeding the FALs. (2) Lagoon A of CAS 12-03-01 has arsenic above FALs in shallow subsurface soils. (3) One of the two tanks of CAS 12-04-01, System No.1, has polychlorinated biphenyls (aroclor-1254), trichloroethane, and cesium-137 above FALs in the sludge. 
Both CAS 12-04-01, System No.1 tanks contain trichloroethane and 1,4-dichlorobenzene above ''Resource Conservation and Recovery Act'' toxicity characteristic limits. Based on the evaluation of analytical data from the CAI, review of future and current operations at the eight CASs, and the detailed and comparative analysis of the potential CAAs, the following corrective actions are recommended for CAU 151. No Further Action is the recommended corrective action for soils at CASs 02-05-01, 12-04-01, 12-04-02, 12-04-03, 18-03-01, and 18-99-09; and Lagoons C, D, F, and G of CAS 12-03-01. No Further Action with implementation of a best management practice (BMP) is recommended for soils at CAS 12-47-01 and Lagoons B and E of CAS 12-03-01. To be protective of future workers should the present scenario used to calculate FALs change, an administrative use restriction will be recorded per the FFACO agreement as a BMP. Close in Place with Administrative Controls is the recommended corrective action for Lagoon A of CAS 12-03-01. Based on the evaluation of analytical data from the CAI; review of future and current operations at CASs 12-04-01, 12-04-02, and 12-04-03; and the detailed and comparative analysis of the potential CAAs, the following corrective actions are recommended for the septic tanks at these CASs. No Further Action with implementation of BMPs is the recommended corrective action for septic tanks that do not contain potential source material from CAS 12-04-01, System No.4 (four tanks); CAS 12-04-02, System No.5 (six tanks); and CAS 12-04-03, System No.3 (four tanks). Clean Closure with implementation of BMPs is the recommended corrective action for the septic tanks from CAS 12-04-01, System No.1 (two tanks). The preferred CAAs were evaluated on technical merit focusing on performance, reliability, feasibility, safety, and cost. The alternatives were judged to meet all requirements for the technical components evaluated. 
The alternatives meet all applicable federal and state regulations for closure of the site and will reduce potential exposure pathways to the contaminated media to an acceptable level at CAU 151.
Mathematical misconception in calculus 1: Identification and gender difference
NASA Astrophysics Data System (ADS)
Nassir, Asyura Abd; Abdullah, Nur Hidayah Masni; Ahmad, Salimah; Tarmuji, Nor Habibah; Idris, Aminatul Solehah
2017-08-01
Years of experience teaching mathematics show that students repeatedly make the same types of mistakes. This paper presents an insight into the categories of mistakes, how male and female students differ in the mistakes commonly made, and the ability of students to identify the mistakes. Samples of mistakes were taken from Calculus 1 final exam answer scripts, then listed and analyzed. Data analysis revealed that students' misconceptions fall into four categories: misunderstanding the meaning of brackets, misconception of basic mathematics rules, misconception in notation, and misconception in properties of trigonometry. A mistake identification test consisting of ten false mathematical statements was designed based on the mistakes made by the previous batch of students, covering the topics of algebra, trigonometry, indices, limits, differentiation and integration. The test was then given to students enrolled in the Calculus 1 course. Respondents were randomly selected from among two hundred engineering students. Data obtained were analyzed using basic descriptive analysis and the chi-square test to capture gender differences in the mistakes made in each category. Findings indicate that thirty-five percent of the students were able to identify the mistakes and make a proper correction for at most two statements. Thirty-one percent of the students were able to identify the mistakes but unable to make a proper correction. Twenty-five percent of the students failed to identify the mistakes in six out of the ten false statements. Female students' misconceptions were more likely to involve basic mathematics rules than male students'. The findings of this study could serve as baseline information on what to stress in improving the teaching and learning of mathematics.
Power System Transient Stability Based on Data Mining Theory
NASA Astrophysics Data System (ADS)
Cui, Zhen; Shi, Jia; Wu, Runsheng; Lu, Dan; Cui, Mingde
2018-01-01
To study the stability of power systems, a transient stability assessment approach based on data mining theory is designed. By introducing association rule analysis from data mining theory, an association classification method for transient stability assessment is presented, and a mathematical model of transient stability assessment based on data mining technology is established. Combining rule reasoning with classification prediction, association classification is used to perform the transient stability assessment. A transient stability index is used to identify the samples that cannot be correctly classified by association classification; for these samples, the state is determined by time domain simulation according to the critical stability of each sample, ensuring the accuracy of the final results. The results show that this stability assessment system can improve assessment speed while keeping the analysis results completely correct, and that the improved algorithm can reveal the inherent relation between changes in the power system operating mode and changes in the degree of transient stability.
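The association-classification idea above can be sketched with the basic support/confidence machinery of association rule mining. The operating-condition features, labels, and data below are invented for illustration; the paper's actual rule-mining and index computation are not specified in the abstract.

```python
# Each sample: (set of operating-condition features, stability label).
samples = [
    ({"high_load", "line_out"}, "unstable"),
    ({"high_load", "line_out"}, "unstable"),
    ({"high_load"}, "stable"),
    ({"low_load"}, "stable"),
    ({"low_load", "line_out"}, "stable"),
    ({"high_load", "line_out", "weak_grid"}, "unstable"),
]

def rule_stats(antecedent, label):
    """Support and confidence of the rule: antecedent -> label."""
    matches = [s for s in samples if antecedent <= s[0]]
    support = len(matches) / len(samples)
    confidence = (
        sum(s[1] == label for s in matches) / len(matches) if matches else 0.0
    )
    return support, confidence

sup, conf = rule_stats({"high_load", "line_out"}, "unstable")
```

Rules with high confidence classify directly; samples matched only by low-confidence rules are the ones the paper hands off to time domain simulation.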
Lastra-Mejías, Miguel; Torreblanca-Zanca, Albertina; Aroca-Santos, Regina; Cancilla, John C; Izquierdo, Jesús G; Torrecilla, José S
2018-08-01
A set of 10 honeys spanning a diverse range of botanical origins was successfully characterized through fluorescence spectroscopy using inexpensive light-emitting diodes (LEDs) as light sources. Each LED-honey combination tested produced a unique emission spectrum, enabling every honey to be authenticated and correctly labeled with its botanical origin. Furthermore, the analysis was backed up by a mathematical analysis based on partial least squares models, which led to a correct classification rate of over 95% for each type of honey. Finally, the same approach was followed to analyze rice syrup, a common honey adulterant that is challenging to identify when mixed with honey. A LED-dependent, unique fluorescence spectrum was found for the syrup, which presumably qualifies this approach for the design of uncomplicated, fast, and cost-effective quality control and adulteration-assessment tools for different types of honey. Copyright © 2018 Elsevier B.V. All rights reserved.
Utility of three anthropometric indices in assessing the cardiometabolic risk profile in children.
Buchan, Duncan S; Boddy, Lynne M; Grace, Fergal M; Brown, Elise; Sculthorpe, Nicholas; Cunningham, Conor; Murphy, Marie H; Dagger, Rebecca; Foweather, Lawrence; Graves, Lee E F; Hopkins, Nicola D; Stratton, Gareth; Baker, Julien S
2017-05-06
To evaluate the ability of body mass index (BMI), waist circumference (WC), and waist-to-height ratio (WHtR) to identify increased cardiometabolic risk in pre-adolescents. This is a cross-sectional study involving 192 children (10.92 ± 0.58 years, 56% female) from the United Kingdom between 2010 and 2013. Receiver operating characteristic curves determined the discriminatory ability of BMI, WC, and WHtR to identify individuals with increased cardiometabolic risk (increased clustered triglycerides, HDL-cholesterol, systolic blood pressure, cardiorespiratory fitness, and glucose). A WHtR ≥ 0.5 increased the odds of having increased cardiometabolic risk by 5.2 (95% confidence interval 2.6-10.3). Similar associations were observed for BMI and WC. Both BMI-z and WHtR were fair predictors of increased cardiometabolic risk, although BMI-z demonstrated the best trade-off between sensitivity and specificity, 76.1% and 63.6%, compared with 68.1% and 65.5% for WHtR. Cross-validation analysis revealed that BMI-z and WHtR correctly classified 84% of individuals (kappa score = 0.671, 95% CI 0.55, 0.79). The sensitivity of the cut-points suggests that 89.3% of individuals were correctly classified as being at risk, with only 10.7% misdiagnosed, whereas the specificity of the cut-points indicated that 77.8% of individuals were correctly identified as being healthy, with 22.2% incorrectly diagnosed as being at risk. Findings suggest that WHtR provides cardiometabolic risk estimates similar to those of age- and sex-adjusted BMI. © 2016 Wiley Periodicals, Inc.
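The sensitivity/specificity trade-off of a fixed cut-point like WHtR ≥ 0.5 can be computed directly from a binary reference label. A minimal sketch with made-up values, not the study's data:

```python
# Sketch: sensitivity and specificity of the WHtR >= 0.5 cut-point against a
# binary "at risk" reference label (toy values, purely illustrative).
def sens_spec(values, at_risk, cutoff=0.5):
    tp = sum(1 for v, r in zip(values, at_risk) if v >= cutoff and r)
    fn = sum(1 for v, r in zip(values, at_risk) if v < cutoff and r)
    tn = sum(1 for v, r in zip(values, at_risk) if v < cutoff and not r)
    fp = sum(1 for v, r in zip(values, at_risk) if v >= cutoff and not r)
    sensitivity = tp / (tp + fn)   # fraction of at-risk children flagged
    specificity = tn / (tn + fp)   # fraction of healthy children cleared
    return sensitivity, specificity

whtr = [0.42, 0.55, 0.61, 0.47, 0.48, 0.44]
risk = [False, True, True, False, True, False]
se, sp = sens_spec(whtr, risk)
```

Sweeping `cutoff` over the observed values and plotting sensitivity against 1 - specificity yields the ROC curve the study used.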
Macera, Annalisa; Lario, Chiara; Petracchini, Massimo; Gallo, Teresa; Regge, Daniele; Floriani, Irene; Ribero, Dario; Capussotti, Lorenzo; Cirillo, Stefano
2013-03-01
To compare the diagnostic accuracy and sensitivity of Gd-EOB-DTPA MRI and diffusion-weighted imaging (DWI), alone and in combination, for detecting colorectal liver metastases in patients who had undergone preoperative chemotherapy. Thirty-two consecutive patients with a total of 166 liver lesions were retrospectively enrolled. Of the lesions, 144 (86.8 %) were metastatic at pathology. Three image sets (1, Gd-EOB-DTPA; 2, DWI; 3, combined Gd-EOB-DTPA and DWI) were independently reviewed by two observers. Statistical analysis was performed on a per-lesion basis. Evaluation of image set 1 correctly identified 127/166 lesions (accuracy 76.5 %; 95 % CI 69.3-82.7) and 106/144 metastases (sensitivity 73.6 %, 95 % CI 65.6-80.6). Evaluation of image set 2 correctly identified 108/166 lesions (accuracy 65.1 %, 95 % CI 57.3-72.3) and 87/144 metastases (sensitivity 60.4 %, 95 % CI 51.9-68.5). Evaluation of image set 3 correctly identified 148/166 lesions (accuracy 89.2 %, 95 % CI 83.4-93.4) and 131/144 metastases (sensitivity 91 %, 95 % CI 85.1-95.1). Differences were statistically significant (P < 0.001). Notably, similar results were obtained analysing only small lesions (<1 cm). The combination of DWI with Gd-EOB-DTPA-enhanced MR imaging significantly increases the diagnostic accuracy and sensitivity in patients with colorectal liver metastases treated with preoperative chemotherapy, and it is particularly effective in the detection of small lesions.
Toye, Warren; Das, Ram; Kron, Tomas; Franich, Rick; Johnston, Peter; Duchesne, Gillian
2009-05-01
To develop an in vivo dosimetry-based investigative action level for a corrective protocol in HDR brachytherapy boost treatment. The dose delivered to points within the urethra and rectum was measured using TLD in vivo dosimetry in 56 patients. Comparisons between the urethral and rectal measurements and TPS calculations showed differences related to the relative position of the implant and TLD trains, and allowed shifts of the implant position relative to the prostate to be estimated. Analysis of the rectal dose measurements is consistent with implant movement, which was previously identified only from the urethral data. Shift-corrected doses were compared with results from the TPS. Comparison of peak doses to the urethra and rectum was assessed against the proposed corrective protocol to limit overdosing of these critical structures. An initial investigative level of a 20% difference between measured and TPS peak dose was established, corresponding to one-third of patients, which was practical for the caseload. These patients were assessed, resulting in corrective action being applied for one patient. Multiple triggering for selective investigative action is outlined. The use of a single in vivo measurement in the first fraction optimizes patient benefit at acceptable cost.
Investigation of Primary Mirror Segment's Residual Errors for the Thirty Meter Telescope
NASA Technical Reports Server (NTRS)
Seo, Byoung-Joon; Nissly, Carl; Angeli, George; MacMynowski, Doug; Sigrist, Norbert; Troy, Mitchell; Williams, Eric
2009-01-01
The primary mirror segment aberrations after shape corrections with warping harness have been identified as the single largest error term in the Thirty Meter Telescope (TMT) image quality error budget. In order to better understand the likely errors and how they will impact the telescope performance we have performed detailed simulations. We first generated unwarped primary mirror segment surface shapes that met TMT specifications. Then we used the predicted warping harness influence functions and a Shack-Hartmann wavefront sensor model to determine estimates for the 492 corrected segment surfaces that make up the TMT primary mirror. Surface and control parameters, as well as the number of subapertures were varied to explore the parameter space. The corrected segment shapes were then passed to an optical TMT model built using the Jet Propulsion Laboratory (JPL) developed Modeling and Analysis for Controlled Optical Systems (MACOS) ray-trace simulator. The generated exit pupil wavefront error maps provided RMS wavefront error and image-plane characteristics like the Normalized Point Source Sensitivity (PSSN). The results have been used to optimize the segment shape correction and wavefront sensor designs as well as provide input to the TMT systems engineering error budgets.
Rules based process window OPC
NASA Astrophysics Data System (ADS)
O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark
2008-03-01
As a preliminary step towards model-based process window OPC, we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65 nm active and poly layers was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. Comparing the pre- and post-rules-based operations, 2.1 million sites were corrected for active in a small chip, and 59 million were found at poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those repaired by rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle, model-based procedure is needed that changes only those sites which have unsatisfactory lithographic margin.
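A rules table of the kind described above amounts to a lookup keyed on the geometric features of each failure site. A hypothetical sketch, where the rule names, thresholds, and actions are invented for illustration and are not the paper's actual 65 nm rules:

```python
# Hypothetical rules-table lookup: classify a failure site by its segment
# runlength and figure space, and map it to a corrective action. All
# thresholds and action names below are illustrative assumptions.
RULES = [
    # (max_runlength_nm, max_space_nm, action) -- first match wins
    (100, 120, "bias_edge_out"),
    (200, 120, "add_serif"),
    (float("inf"), float("inf"), "no_change"),
]

def classify_site(runlength_nm, space_nm):
    for max_run, max_space, action in RULES:
        if runlength_nm <= max_run and space_nm <= max_space:
            return action
    return "no_change"

action = classify_site(80, 110)
```

The paper's finding that >75% of rules-corrected active sites would have printed cleanly illustrates the blind spot of such tables: they act on geometry alone, with no view of the actual lithographic margin.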
Li, Zihui; Du, Boping; Li, Jing; Zhang, Jinli; Zheng, Xiaojing; Jia, Hongyan; Xing, Aiying; Sun, Qi; Liu, Fei; Zhang, Zongde
2017-03-01
Tuberculous meningitis (TBM) is the most severe and frequent form of central nervous system tuberculosis. The current lack of efficient diagnostic tests makes it difficult to differentiate TBM from other common types of meningitis, especially viral meningitis (VM). Metabolomics is an important tool for identifying disease-specific biomarkers. However, little metabolomic information is available on adult TBM. We used ¹H nuclear magnetic resonance-based metabolomics to investigate the metabolic features of the CSF from 18 TBM and 20 VM patients. Principal component analysis and orthogonal signal correction-partial least squares-discriminant analysis (OSC-PLS-DA) were applied to analyze the profiling data. Metabolites were identified using the Human Metabolome Database, and pathway analysis was performed with MetaboAnalyst 3.0. The OSC-PLS-DA model could distinguish TBM from VM with high reliability. A total of 25 key metabolites that contributed to their discrimination were identified, including some, such as betaine and cyclohexane, rarely reported before in TBM. Pathway analysis indicated that amino acid and energy metabolism in the CSF differed significantly between TBM and VM. The 25 key metabolites identified in our study may be potential biomarkers for TBM differential diagnosis and are worthy of further investigation. Copyright © 2017 Elsevier B.V. All rights reserved.
The Pareto Analysis for Establishing Content Criteria in Surgical Training.
Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N
2016-01-01
Current surgical training is still highly dependent on expensive operating room (OR) experience. Although there have been many attempts to transfer more training to the skills laboratory, little research is focused on which technical behaviors can lead to the highest profit when they are trained outside the OR. The Pareto principle states that in any population that contributes to a common effect, a few account for the bulk of the effect. This principle has been widely used in business management to increase company profits. This study uses the Pareto principle for establishing content criteria for more efficient surgical training. A retrospective study was conducted to assess verbal guidance provided by 9 supervising surgeons to 12 trainees performing 64 laparoscopic cholecystectomies in the OR. The verbal corrections were documented, tallied, and clustered according to the aimed change in novice behavior. The corrections were rank ordered, and a cumulative distribution curve was used to calculate which corrections accounted for 80% of the total number of verbal corrections. In total, 253 different verbal corrections were uttered 1587 times and were categorized into 40 different clusters of aimed changes in novice behaviors. The 35 highest-ranking verbal corrections (14%) and the 11 highest-ranking clusters (28%) accounted for 80% of the total number of given verbal corrections. Following the Pareto principle, we were able to identify the aspects of trainee behavior that account for most corrections given by supervisors during a laparoscopic cholecystectomy on humans. This strategy can be used for the development of new training programs to prepare the trainee in advance for the challenges encountered in the clinical setting in an OR. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
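The rank-ordering and cumulative-distribution step described above can be sketched directly. The tally values below are illustrative only, not the study's actual correction counts:

```python
# Sketch of the Pareto (80/20) cut used in the study: rank-order correction
# counts and find how many top items account for 80% of the total.
def pareto_cutoff(counts, fraction=0.8):
    ordered = sorted(counts, reverse=True)
    total = sum(ordered)
    running = 0
    for k, c in enumerate(ordered, start=1):
        running += c
        if running >= fraction * total:
            return k, ordered[:k]
    return len(ordered), ordered

# Toy tally of verbal corrections per cluster (illustrative numbers only).
tally = [40, 25, 12, 8, 6, 4, 3, 2]
k, top = pareto_cutoff(tally)
```

Applied to the study's 253 corrections and 40 clusters, this procedure is what identified the 35 corrections (14%) and 11 clusters (28%) accounting for 80% of all verbal guidance.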
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert Boehlecke
2004-04-01
The six bunkers included in CAU 204 were primarily used to monitor atmospheric testing or store munitions. The "Corrective Action Investigation Plan (CAIP) for Corrective Action Unit 204: Storage Bunkers, Nevada Test Site, Nevada" (NNSA/NV, 2002a) provides information relating to the history, planning, and scope of the investigation; therefore, it will not be repeated in this CADD. This CADD identifies potential corrective action alternatives and provides a rationale for the selection of a recommended corrective action alternative for each CAS within CAU 204. The evaluation of corrective action alternatives is based on process knowledge and the results of investigative activities conducted in accordance with the CAIP (NNSA/NV, 2002a) that was approved prior to the start of the Corrective Action Investigation (CAI). Record of Technical Change (ROTC) No. 1 to the CAIP (approval pending) documents changes to the preliminary action levels (PALs) agreed to by the Nevada Division of Environmental Protection (NDEP) and DOE, National Nuclear Security Administration Nevada Site Office (NNSA/NSO). This ROTC specifically discusses the radiological PALs and their application to the findings of the CAU 204 corrective action investigation. The scope of this CADD consists of the following: (1) Develop corrective action objectives; (2) Identify corrective action alternative screening criteria; (3) Develop corrective action alternatives; (4) Perform detailed and comparative evaluations of corrective action alternatives in relation to corrective action objectives and screening criteria; and (5) Recommend and justify a preferred corrective action alternative for each CAS within CAU 204.
NASA Astrophysics Data System (ADS)
Brindha, Elumalai; Rajasekaran, Ramu; Aruna, Prakasarao; Koteeswaran, Dornadula; Ganesan, Singaravelu
2017-01-01
Urine has emerged as one of the diagnostically promising biofluids, as it contains many metabolites. As the concentration and the physicochemical properties of urinary metabolites may vary under pathological transformation, Raman spectroscopic characterization of urine has been exploited as a significant tool for identifying several diseased conditions, including cancers. In the present study, an attempt was made to study the high-wavenumber (HWVN) Raman spectroscopic characterization of urine samples from normal subjects and from oral premalignant and malignant patients. It is concluded that the urinary metabolites flavoproteins, tryptophan, and phenylalanine are responsible for the observed spectral variations between the normal and abnormal groups. Principal component analysis-based linear discriminant analysis was carried out to verify the diagnostic potential of the present technique. The discriminant analysis performed across normal and oral premalignant subjects classified 95.6% of the original and 94.9% of the cross-validated grouped cases correctly. In the second analysis, performed across normal and oral malignant groups, the accuracy for the original and cross-validated grouped cases was 96.4% and 92.1%, respectively. Similarly, the third analysis, performed across the normal, oral premalignant, and malignant groups, classified 93.3% and 91.2% of the original and cross-validated grouped cases correctly.
21 CFR 820.100 - Corrective and preventive action.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., work operations, concessions, quality audit reports, quality records, service records, complaints, returned product, and other sources of quality data to identify existing and potential causes of... (CONTINUED) MEDICAL DEVICES QUALITY SYSTEM REGULATION Corrective and Preventive Action § 820.100 Corrective...
NASA Astrophysics Data System (ADS)
Hughes, J. D.; White, J.; Doherty, J.
2011-12-01
Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.
Comparison of seven protocols to identify fecal contamination sources using Escherichia coli
Stoeckel, D.M.; Mathes, M.V.; Hyer, K.E.; Hagedorn, C.; Kator, H.; Lukasik, J.; O'Brien, T. L.; Fenger, T.W.; Samadpour, M.; Strickler, K.M.; Wiggins, B.A.
2004-01-01
Microbial source tracking (MST) uses various approaches to classify fecal-indicator microorganisms to source hosts. Reproducibility, accuracy, and robustness of seven phenotypic and genotypic MST protocols were evaluated by use of Escherichia coli from an eight-host library of known-source isolates and a separate, blinded challenge library. In reproducibility tests, measuring each protocol's ability to reclassify blinded replicates, only one (pulsed-field gel electrophoresis; PFGE) correctly classified all test replicates to host species; three protocols classified 48-62% correctly, and the remaining three classified fewer than 25% correctly. In accuracy tests, measuring each protocol's ability to correctly classify new isolates, ribotyping with EcoRI and PvuII approached 100% correct classification but only 6% of isolates were classified; four of the other six protocols (antibiotic resistance analysis, PFGE, and two repetitive-element PCR protocols) achieved better than random accuracy rates when 30-100% of challenge isolates were classified. In robustness tests, measuring each protocol's ability to recognize isolates from nonlibrary hosts, three protocols correctly classified 33-100% of isolates as "unknown origin," whereas four protocols classified all isolates to a source category. A relevance test, summarizing interpretations for a hypothetical water sample containing 30 challenge isolates, indicated that false-positive classifications would hinder interpretations for most protocols. Study results indicate that more representation in known-source libraries and better classification accuracy would be needed before field application. Thorough reliability assessment of classification results is crucial before and during application of MST protocols.
Chase, John H; Bolyen, Evan; Rideout, Jai Ram; Caporaso, J Gregory
2016-01-01
The number of samples in high-throughput comparative "omics" studies is increasing rapidly due to declining experimental costs. To keep sample data and metadata manageable and to ensure the integrity of scientific results as the scale of these projects continues to increase, it is essential that we transition to better-designed sample identifiers. Ideally, sample identifiers should be globally unique across projects, project teams, and institutions; short (to facilitate manual transcription); correctable with respect to common types of transcription errors; opaque, meaning that they do not contain information about the samples; and compatible with existing standards. We present cual-id, a lightweight command line tool that creates, or mints, sample identifiers that meet these criteria without reliance on centralized infrastructure. cual-id allows users to assign universally unique identifiers, or UUIDs, that are globally unique to their samples. UUIDs are too long to be conveniently written on sampling materials, such as swabs or microcentrifuge tubes, however, so cual-id additionally generates human-friendly 4- to 12-character identifiers that map to their UUIDs and are unique within a project. By convention, we use "cual-id" to refer to the software, "CualID" to refer to the short, human-friendly identifiers, and "UUID" to refer to the globally unique identifiers. CualIDs are used by humans when they manually write or enter identifiers, while the longer UUIDs are used by computers to unambiguously reference a sample. Finally, cual-id optionally generates printable label sticker sheets containing Code 128 bar codes and CualIDs for labeling of sample collection and processing materials. IMPORTANCE The adoption of identifiers that are globally unique, correctable, and easily handwritten or manually entered into a computer will be a major step forward for sample tracking in comparative omics studies. 
As the fields transition to more-centralized sample management, for example, across labs within an institution, across projects funded under a common program, or in systems designed to facilitate meta- and/or integrated analysis, sample identifiers generated with cual-id will not need to change; thus, costly and error-prone updating of data and metadata identifiers will be avoided. Further, using cual-id will ensure that transcription errors in sample identifiers do not require the discarding of otherwise-useful samples that may have been expensive to obtain. Finally, cual-id is simple to install and use and is free for all use. No centralized infrastructure is required to ensure global uniqueness, so it is feasible for any lab to get started using these identifiers within their existing infrastructure.
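The two-tier scheme the abstract describes, a globally unique UUID per sample paired with a short, project-unique human-friendly identifier, can be sketched in a few lines. This is a minimal illustration, not cual-id's actual implementation; in particular, the reduced alphabet (ambiguous glyphs such as 0/O and 1/l removed) is our assumption about how transcription errors are made correctable:

```python
# Sketch of a two-tier sample-identifier scheme: mint a UUID per sample and
# pair it with a short ID drawn from an unambiguous alphabet. Illustrative
# only -- not the cual-id tool's actual algorithm or alphabet.
import random
import uuid

ALPHABET = "23456789abcdefghjkmnpqrstuvwxyz"  # ambiguous glyphs removed

def mint_ids(n, short_len=6, seed=None):
    rng = random.Random(seed)
    mapping = {}
    while len(mapping) < n:
        short = "".join(rng.choice(ALPHABET) for _ in range(short_len))
        if short not in mapping:           # enforce within-project uniqueness
            mapping[short] = str(uuid.uuid4())
    return mapping

ids = mint_ids(5, seed=42)
```

Humans write the short keys on tubes and swabs; software resolves them through the mapping to the UUIDs, which remain stable across labs and projects.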
ERIC Educational Resources Information Center
Saavedra, Pedro; Kuchak, JoAnn
An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
Mahjoub, Mohamed; Bouafia, Nabiha; Cheikh, Asma Ben; Ezzi, Olfa; Njah, Mansour
2016-11-25
This study provides an overview of healthcare professionals' perception of patient safety, based on an analysis of the concepts of freedom of expression and non-punitive response to error, in order to identify and correct errors in our health system. This concept is a cornerstone of the patient safety culture among healthcare professionals and plays a central role in the quality improvement strategy.
Multivariate proteomic profiling identifies novel accessory proteins of coated vesicles
Antrobus, Robin; Hirst, Jennifer; Bhumbra, Gary S.; Kozik, Patrycja; Jackson, Lauren P.; Sahlender, Daniela A.
2012-01-01
Despite recent advances in mass spectrometry, proteomic characterization of transport vesicles remains challenging. Here, we describe a multivariate proteomics approach to analyzing clathrin-coated vesicles (CCVs) from HeLa cells. siRNA knockdown of coat components and different fractionation protocols were used to obtain modified coated vesicle-enriched fractions, which were compared by stable isotope labeling of amino acids in cell culture (SILAC)-based quantitative mass spectrometry. Ten datasets were combined through principal component analysis into a "profiling" cluster analysis. Overall, 136 CCV-associated proteins were predicted, including 36 new proteins. The method identified >93% of established CCV coat proteins and assigned >91% correctly to intracellular or endocytic CCVs. Furthermore, the profiling analysis extends to less well characterized types of coated vesicles, and we identify and characterize the first AP-4 accessory protein, which we have named tepsin. Finally, our data explain how sequestration of TACC3 in cytosolic clathrin cages causes the severe mitotic defects observed in auxilin-depleted cells. The profiling approach can be adapted to address related cell and systems biological questions. PMID:22472443
Measuring System Value in the Ares 1 Rocket Using an Uncertainty-Based Coupling Analysis Approach
NASA Astrophysics Data System (ADS)
Wenger, Christopher
Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process to ensure no unanticipated behaviors or unintended consequences arise in the system during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect the health and safety of any crew onboard. Within the Ares 1 rocket, larger-than-anticipated vibrations were recorded during late-stage flight that propagated from the engine chamber to the Orion crew module. Upon investigation, engineers found the root cause to be feedback from the rocket's structure onto the fluid flow within the engine. The goal of this paper is to showcase a coupling-strength analysis from the field of Multidisciplinary Design Optimization to identify the major interactions that caused the thrust oscillation event in the Ares 1. Once these are identified, an uncertainty analysis of the coupled system, using an uncertainty-based optimization technique, is used to identify the likelihood that these strong or weak interactions will occur.
Singh, Amit; Rhee, Kyung E; Brennan, Jesse J; Kuelbs, Cynthia; El-Kareh, Robert; Fisher, Erin S
2016-03-01
Increase parent/caregiver ability to correctly identify the attending in charge and define terminology of treatment team members (TTMs). We hypothesized that correct TTM identification would increase with use of an electronic communication tool. Secondary aims included assessing subjects' satisfaction with and trust of TTM and interest in computer activities during hospitalization. Two similar groups of parents/legal guardians/primary caregivers of children admitted to the Pediatric Hospital Medicine teaching service with an unplanned first admission were surveyed before (Phase 1) and after (Phase 2) implementation of a novel electronic medical record (EMR)-based tool with names, photos, and definitions of TTMs. Physicians were also surveyed only during Phase 1. Surveys assessed TTM identification, satisfaction, trust, and computer use. More subjects in Phase 2 correctly identified attending physicians by name (71% vs. 28%, P < .001) and correctly defined terms intern, resident, and attending (P ≤ .03) compared with Phase 1. Almost all subjects (>79%) and TTMs (>87%) reported that subjects' ability to identify TTMs moderately or strongly impacted satisfaction and trust. The majority of subjects expressed interest in using computers to understand TTMs in each phase. Subjects' ability to correctly identify attending physicians and define TTMs was significantly greater for those who used our tool. In our study, subjects reported that TTM identification impacted aspects of the TTM relationship, yet few could correctly identify TTMs before tool use. This pilot study showed early success in engaging subjects with the EMR in the hospital and suggests that families would engage in computer-based activities in this setting. Copyright © 2016 by the American Academy of Pediatrics.
Goodacre, R; Hiom, S J; Cheeseman, S L; Murdoch, D; Weightman, A J; Wade, W G
1996-02-01
Curie-point pyrolysis mass spectra were obtained from 29 oral asaccharolytic Eubacterium strains and 6 abscess isolates previously identified as Peptostreptococcus heliotrinreducens. Pyrolysis mass spectrometry (PyMS) with cluster analysis was able to clarify the taxonomic position of this group of organisms. Artificial neural networks (ANNs) were then trained by supervised learning (with the back-propagation algorithm) to recognize the strains from their pyrolysis mass spectra; all Eubacterium strains were correctly identified, and the abscess isolates were identified as unnamed Eubacterium taxon C2, distinct from the type strain of P. heliotrinreducens. These results demonstrate that the combination of PyMS and ANNs provides a rapid and accurate identification technique.
Holden, Ronald R; Lambert, Christine E
2015-12-01
Van Hooft and Born (Journal of Applied Psychology 97:301-316, 2012) presented data challenging both the correctness of a congruence model of faking on personality test items and the relative merit (i.e., effect size) of response latencies for identifying fakers. We suggest that their analysis of response times was suboptimal, and that it followed neither from a congruence model of faking nor from published protocols on appropriately filtering the noise in personality test item answering times. Using new data and following recommended analytic procedures, we confirmed the relative utility of response times for identifying personality test fakers, and our obtained results, again, reinforce a congruence model of faking.
Classification of white wine aromas with an electronic nose.
Lozano, J; Santos, J P; Horrillo, M C
2005-09-15
This paper reports the use of a tin dioxide multisensor array based electronic nose for recognition of 29 typical aromas in white wine. A headspace technique was used to extract the aroma of the wine. Multivariate analysis, including principal component analysis (PCA) as well as probabilistic neural networks (PNNs), was used to identify the main aroma added to the wine. The results showed that in spite of the strong influence of ethanol and other majority compounds of the wine, the system could correctly discriminate the aromatic compounds added to the wine with a minimum accuracy of 97.2%.
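The PCA stage of such an analysis can be sketched with synthetic sensor data. All numbers below (16 sensors, 3 aroma classes) are illustrative assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic stand-in for a tin dioxide sensor array: 16 sensors,
# 3 aroma classes, 10 headspace measurements per class.
n_sensors, n_per = 16, 10
class_means = rng.random((3, n_sensors)) * 5.0
X = np.vstack([m + 0.1 * rng.standard_normal((n_per, n_sensors))
               for m in class_means])

# PCA projects the 16-dimensional sensor responses onto the two
# directions of maximum variance; distinct aromas separate in this plane.
scores = PCA(n_components=2).fit_transform(X)
print(scores.shape)  # one 2-D score per measurement
```

A classifier such as a probabilistic neural network would then be trained on these (or the full-dimensional) responses to assign each measurement to an aroma class.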
Molecular reclassification of Crohn's disease: a cautionary note on population stratification.
Maus, Bärbel; Jung, Camille; Mahachie John, Jestinah M; Hugot, Jean-Pierre; Génin, Emmanuelle; Van Steen, Kristel
2013-01-01
Complex human diseases commonly differ in their phenotypic characteristics, e.g., Crohn's disease (CD) patients are heterogeneous with regard to disease location and disease extent. The genetic susceptibility to Crohn's disease is widely acknowledged and has been demonstrated by identification of over 100 CD associated genetic loci. However, relating CD subphenotypes to disease susceptible loci has proven to be a difficult task. In this paper we discuss the use of cluster analysis on genetic markers to identify genetic-based subgroups while taking into account possible confounding by population stratification. We show that it is highly relevant to consider the confounding nature of population stratification in order to avoid that detected clusters are strongly related to population groups instead of disease-specific groups. Therefore, we explain the use of principal components to correct for population stratification while clustering affected individuals into genetic-based subgroups. The principal components are obtained using 30 ancestry informative markers (AIM), and the first two PCs are determined to discriminate between continental origins of the affected individuals. Genotypes on 51 CD associated single nucleotide polymorphisms (SNPs) are used to perform latent class analysis, hierarchical and Partitioning Around Medoids (PAM) cluster analysis within a sample of affected individuals with and without the use of principal components to adjust for population stratification. It is seen that without correction for population stratification clusters seem to be influenced by population stratification while with correction clusters are unrelated to continental origin of individuals.
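The correction strategy described above can be sketched as follows. This is a minimal sketch under simplifying assumptions: genotypes are simulated, scikit-learn's `KMeans` stands in for PAM (which scikit-learn does not provide), and the PCs are regressed out of the disease-SNP genotypes before clustering:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Two continental populations with different allele frequencies at 30
# ancestry-informative markers (AIMs); genotypes coded 0/1/2.
n_per_pop, n_aims, n_snps = 100, 30, 51
freqs = rng.uniform(0.1, 0.9, (2, n_aims))
aims = np.vstack([rng.binomial(2, f, (n_per_pop, n_aims)) for f in freqs])

# The first principal components of the AIM genotypes capture ancestry.
pcs = PCA(n_components=2).fit_transform(aims.astype(float))

# Disease-associated SNP genotypes, artificially confounded with ancestry
# to mimic population stratification.
disease = rng.binomial(2, 0.5, (2 * n_per_pop, n_snps)).astype(float)
disease += 0.5 * pcs[:, [0]]

# Regress the ancestry PCs out of each SNP before clustering, so that
# clusters reflect disease-related structure rather than continental origin.
design = np.column_stack([np.ones(len(pcs)), pcs])
beta, *_ = np.linalg.lstsq(design, disease, rcond=None)
adjusted = disease - design @ beta

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(adjusted)
print(labels.shape)
```

Without the regression step, the dominant cluster split tends to recover the two continental populations, which is precisely the confounding the paper warns about.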
[Differentiation by geometric morphometrics among 11 Anopheles (Nyssorhynchus) in Colombia].
Calle, David Alonso; Quiñones, Martha Lucía; Erazo, Holmes Francisco; Jaramillo, Nicolás
2008-09-01
The correct identification of the Anopheles species of the subgenus Nyssorhynchus is important because this subgenus includes the main malaria vectors in Colombia. This information is necessary for focusing a malaria control program. Geometric morphometrics was used to evaluate morphometric variation among 11 species of the subgenus Nyssorhynchus present in Colombia and to distinguish females of each species. The specimens were obtained from series and family broods from females collected with protected human hosts as attractants. The field-collected specimens and their progeny were identified at each of the associated stages by conventional keys. For some species, wild females were used. Landmarks were selected on wings from digital pictures of 336 individuals and digitized as coordinates. The coordinate matrix was processed by generalized Procrustes analysis, which generated size and shape variables free of non-biological variation. Size and shape variables were analyzed by univariate and multivariate statistics. The subdivision of the subgenus Nyssorhynchus into sections is not correlated with wing shape. Discriminant analyses correctly classified 97% of females in the section Albimanus and 86% in the section Argyritarsis. In addition, these methodologies allowed the correct identification of 3 sympatric species from Putumayo which have been difficult to identify in the adult female stage. Geometric morphometrics was demonstrated to be a very useful tool as an adjunct to the taxonomy of females; the use of this method is recommended in studies of the subgenus Nyssorhynchus in Colombia.
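The Procrustes-then-discriminant pipeline can be sketched with synthetic landmark data. This is a simplified stand-in: `scipy.spatial.procrustes` performs ordinary (pairwise) superimposition against one reference wing rather than the iterative generalized Procrustes analysis of the study, and the landmark configurations are simulated:

```python
import numpy as np
from scipy.spatial import procrustes
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)

# Synthetic wing configurations: 2 species, 30 wings each, 8 landmarks in
# 2-D; species B differs in the position of one landmark, plus noise.
base = rng.random((8, 2))
def wings(shape_shift, n):
    return [base + shape_shift + 0.02 * rng.standard_normal((8, 2))
            for _ in range(n)]
shift_b = np.zeros((8, 2)); shift_b[0] += 0.15
configs = wings(np.zeros((8, 2)), 30) + wings(shift_b, 30)
species = np.array([0] * 30 + [1] * 30)

# Procrustes superimposition removes position, scale and rotation,
# leaving shape variables (flattened aligned coordinates).
ref = configs[0]
aligned = [procrustes(ref, c)[1].ravel() for c in configs]

# Discriminant analysis classifies females from the shape variables.
lda = LinearDiscriminantAnalysis().fit(aligned, species)
print(lda.score(aligned, species))
```

Cross-validated rather than training-set classification rates would correspond to the percentages reported in the abstract.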
The Essential Genome of Escherichia coli K-12
2018-01-01
ABSTRACT Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. PMID:29463657
Birch, Gabriel Carisle; Griffin, John Clark
2015-07-23
Numerous methods are available to measure the spatial frequency response (SFR) of an optical system. A recent change to the ISO 12233 photography resolution standard includes a sinusoidal Siemens star test target. We take the sinusoidal Siemens star proposed by the ISO 12233 standard, measure system SFR, and perform an analysis of errors induced by incorrectly identifying the center of a test target. We show a closed-form solution for the radial profile intensity measurement given an incorrectly determined center and describe how this error reduces the measured SFR of the system. As a result, using the closed-form solution, we propose a two-step process by which test target centers are corrected and the measured SFR is restored to the nominal, correctly centered values.
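The effect of a center error can be demonstrated numerically (this sketch is not the paper's closed-form solution; the star parameters are illustrative). Sampling a sinusoidal Siemens star along a circle drawn around a miscentered origin phase-distorts the sinusoid, reducing its fitted modulation:

```python
import numpy as np

def ring_modulation(center, n_cycles=36, radius=50.0, n_samples=720):
    """Fitted sinusoidal amplitude of a sinusoidal Siemens star
    I(theta) = 0.5 + 0.5*cos(n_cycles*theta) (true center at the origin),
    sampled on a circle of `radius` around the assumed `center`."""
    phi = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)
    x = center[0] + radius * np.cos(phi)
    y = center[1] + radius * np.sin(phi)
    intensity = 0.5 + 0.5 * np.cos(n_cycles * np.arctan2(y, x))
    # Least-squares sinusoid fit at the nominal angular frequency.
    c = 2 / n_samples * np.sum(intensity * np.cos(n_cycles * phi))
    s = 2 / n_samples * np.sum(intensity * np.sin(n_cycles * phi))
    return np.hypot(c, s)

print(ring_modulation((0.0, 0.0)))  # correctly centered: full modulation 0.5
print(ring_modulation((3.0, 0.0)))  # miscentered: strongly reduced modulation
```

Since measured SFR is derived from this modulation, the center error directly depresses the measured SFR, which motivates correcting the center estimate before computing the radial profile.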
Strife, Robert J; Wang, Yongdong; Kuehl, Don
2018-06-19
Restricted Spectral Accuracy (RSA) is applied to Orbitrap data (240,000 resolution at m/z 400) to more clearly break out the scoring and ranking of allowable elemental compositions (EC) in a candidate list. The correct EC is usually top ranked and separated from other answers by 10-40% within the dimensionless 0-100% scale, providing a single, definitive EC. The A+2 position (where A denotes the monoisotopic line position) is especially advantageous in RSA. It has enough intensity and more complexity than (A+1) fine lines and is like a fingerprint. Avoidance of coalescence phenomena and careful ion population control are essential. This article is protected by copyright. All rights reserved.
Identifying Potential Ventilator Auto-Triggering Among Organ Procurement Organization Referrals.
Henry, Nicholas R; Russian, Christopher J; Nespral, Joseph
2016-06-01
Ventilator auto-trigger is the delivery of an assisted mechanical ventilated breath over the set ventilator frequency in the absence of a spontaneous inspiratory effort and can be caused by inappropriate ventilator trigger sensitivity. Ventilator auto-trigger can be misinterpreted as a spontaneous breath and has the potential to delay or prevent brain death testing and confuse health-care professionals and/or patient families. The aim was to determine the frequency of organ donor referrals from 1 Organ Procurement Organization (OPO) that could benefit from an algorithm designed to assist organ recovery coordinators in identifying and correcting ventilator auto-triggering. This retrospective analysis evaluated documentation of organ donor referrals from 1 OPO in central Texas during the 2013 calendar year that resulted in the withdrawal of care by the patient's family and the recovery of organs. The frequency of referrals that presented with absent brain stem reflexes except for additional respirations over the set ventilator rate was determined to assess the need for the proposed algorithm. Documentation of 672 OPO referrals was evaluated. Documentation from 42 referrals that resulted in the withdrawal of care and 21 referrals that resulted in the recovery of organs were identified with absent brain stem reflexes except for spontaneous respirations on the mechanical ventilator. As a result, an algorithm designed to identify and correct ventilator auto-trigger could have been used 63 times during the 2013 calendar year. © 2016, NATCO.
ERIC Educational Resources Information Center
Marsden, Mary Ellen, Ed.; Straw, Richard S., Ed.
This report presents methodology and findings from the Uniform Facility Data Set (UFDS) 1997 Survey of Correctional Facilities, which surveyed about 7,600 adult and juvenile correctional facilities to identify those that provide on-site substance abuse treatment to their inmates or residents. The survey assesses substance abuse treatment provided…
Correcting the Alar Base Retraction in Crooked Nose by Dissection of Levator Alaque Nasi Muscle.
Taş, Süleyman
2016-10-01
Nasal base retraction results from cephalic malposition of the alar base in the vertical plane causing disharmonies in the alar base. In the literature, there are some excisional procedures to correct this deformity, but they may result in nostril distortion, stenosis, or upper lip elevation. Here, a new technique is reported for the correction of nasal base retraction in crooked nose by manipulating the levator labii alaeque nasi muscle. Sixteen patients, 6 women and 10 men ranging in age from 21 to 42 years, who had alar retraction with a crooked nose, were operated on, with a follow-up period of 12 months. Preoperative and postoperative frontal, profile, base, and oblique base views were taken in a standard manner and analyzed with Image software. Comparison of preoperative and postoperative photographs demonstrated that nasal base retractions were corrected in all cases without distortion or recurrence. Nasal obstruction was reduced after surgery, and self-evaluation of nasal patency scores significantly increased in all patients (P < 0.001). Functional and aesthetic outcomes were satisfactory for the surgeons and the patients. Careful analysis to identify the deformity and proper selection of the technique will ensure a pleasing outcome. The new techniques presented for the correction of nasal base retraction and prevention of the recurrence of the dorsal deviation will help rhinoplasty surgeons obtain pleasing outcomes.
Garbarino, John R.; Taylor, Howard E.
1987-01-01
Inductively coupled plasma mass spectrometry is employed in the determination of Ni, Cu, Sr, Cd, Ba, Ti, and Pb in nonsaline, natural water samples by stable isotope dilution analysis. Hydrologic samples were directly analyzed without any unusual pretreatment. Interference effects related to overlapping isobars, formation of metal oxide and multiply charged ions, and matrix composition were identified and suitable methods of correction evaluated. A comparability study showed that single-element isotope dilution analysis was only marginally better than sequential multielement isotope dilution analysis. Accuracy and precision of the single-element method were determined on the basis of results obtained for standard reference materials. The instrumental technique was shown to be ideally suited for programs associated with certification of standard reference materials.
De Molfetta, Greice Andreotti; Felix, Temis Maria; Riegel, Mariluce; Ferraz, Victor Evangelista de Faria; de Pina Neto, João Monteiro
2002-12-01
Angelman syndrome (AS) and Prader-Willi syndrome (PWS) are distinct human neurogenetic disorders; however, a clinical overlap between AS and PWS has been identified. We report on a further case of a patient showing the PWS phenotype with the AS molecular defect. Despite the PWS phenotype, the DNA methylation analysis of SNRPN revealed an AS pattern. Cytogenetic and FISH analysis showed normal chromosomes 15 and microsatellite analysis showed heterozygous loci inside and outside the 15q11-13 region. The presence of these atypical cases could be more frequent than previously expected and we reinforce that the DNA methylation analysis is important for the correct diagnosis of severe mental deficiency, congenital hypotonia and obesity.
Li, Ke; Cavaignac, Etienne; Xu, Wei; Cheng, Qiang; Telmon, Norbert; Huang, Wei
2018-02-20
Morphologic data of the knee is very important in the design of total knee prostheses. Generally, the designs of total knee prostheses are based on the knee anatomy of Caucasian populations. Moreover, in forensic medicine, a person's age and sex might be estimated by the shape of their knees. The aim of this study is to utilize three-dimensional morphometric analysis of the knee in a Chinese population to reveal sexual dimorphism and age-related differences. Sexually dimorphic differences and age-related differences of the distal femur were studied by using geometric morphometric analysis of ten osteometric landmarks on three-dimensional reconstructions of 259 knees in a Chinese population. Generalized Procrustes analysis, PCA, and discriminant analyses such as Mahalanobis distance and Goodall's F test were conducted to identify sexual dimorphism and age-related differences of the knee. The shape of the distal femur between males and females is significantly different. A difference between males and females in distal femur shape was identified by PCA; PC1 and PC2 accounted for 61.63% of the variance measured. The correct sex was assigned in 84.9% of cases by CVA, and the cross-validation revealed an 81.1% rate of correct sex estimation. The osteometric analysis also showed significant differences between the three age-related subgroups (< 40, 40-60, > 60 years, p < 0.005). This study showed both sex-related and age-related differences in the distal femur in a Chinese population by 3D geometric morphometric analysis. Our bone measurements and geometric morphometric analysis suggest that population characteristics should be taken into account and may provide references for the design of total knee prostheses in a Chinese population. Moreover, this reliable, accurate method could be used to perform diachronic and interethnic comparisons.
Stępień-Pyśniak, Dagmara; Hauschild, Tomasz; Różański, Paweł; Marek, Agnieszka
2017-06-28
The aim of this study was to explore the accuracy and feasibility of matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) in identifying bacteria from environmental sources, as compared with rpoA gene sequencing, and to evaluate the occurrence of bacteria of the genus Enterococcus in wild birds. In addition, a phyloproteomic analysis of certain Enterococcus species with spectral relationships was performed. The enterococci were isolated from 25 species of wild birds in central Europe (Poland). Proteomic (MALDI-TOF MS) and genomic (rpoA gene sequencing) methods were used to identify all the isolates. Using MALDI-TOF MS, all 54 (100%) isolates were identified as Enterococcus spp. Among these, 51 (94.4%) isolates were identified to the species level (log(score) ≥2.0), and three isolates (5.6%) were identified at a level of probable genus identification (log(score) 1.88-1.927). Phylogenetic analysis based on rpoA sequences confirmed that all enterococci had been correctly identified. Enterococcus faecalis was the most prevalent enterococcal species (50%) and Enterococcus faecium (33.3%) the second most frequent species, followed by Enterococcus hirae (9.3%), Enterococcus durans (3.7%), and Enterococcus casseliflavus (3.7%). The phyloproteomic analysis of the spectral profiles of the isolates showed that MALDI-TOF MS is able to differentiate among similar species of the genus Enterococcus.
Spontaneous movements of preterm infants is associated with outcome of gross motor development.
Miyagishima, Saori; Asaka, Tadayoshi; Kamatsuka, Kaori; Kozuka, Naoki; Kobayashi, Masaki; Igarashi, Lisa; Hori, Tsukasa; Tsutsumi, Hiroyuki
2018-04-30
We conducted a longitudinal cohort study to analyze the relationship between outcome of gross motor development in preterm infants and factors that might affect their development. Preterm infants with a birth weight of <1500 g were recruited. We measured spontaneous antigravity limb movements with a 3D motion capture system at 3 months corrected age. Gross motor developmental outcomes at 6 and 12 months corrected age were evaluated using the Alberta Infant Motor Scale (AIMS). Statistical analysis was carried out by canonical correlation analysis. Eighteen preterm infants were included. In the 6 months corrected age analysis, spontaneous movement had a major effect on the Prone and Sitting subscales of the AIMS at 6 months corrected age. In the 12 months corrected age analysis, spontaneous movement had a major effect on the Sitting and Standing subscales of the AIMS at 12 months corrected age. In preterm infants, better antigravity spontaneous movements at 3 months corrected age were significantly correlated with better gross motor development at 6 or 12 months corrected age. Copyright © 2018 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
Exploring "psychic transparency" during pregnancy: a mixed-methods approach.
Oriol, Cécile; Tordjman, Sylvie; Dayan, Jacques; Poulain, Patrice; Rosenblum, Ouriel; Falissard, Bruno; Dindoyal, Asha; Naudet, Florian
2016-08-12
Psychic transparency is described as a psychic crisis occurring during pregnancy. The objective was to test whether it was clinically detectable. Seven primiparous and seven nulliparous subjects were recorded during 5 min of spontaneous speech about their dreams. Twenty-five raters from five groups (psychoanalysts, psychiatrists, general practitioners, pregnant women and medical students) listened to the audiotapes. They were asked to rate the probability of the women being pregnant or not. Their ability to discriminate the primiparous women was tested. The probability of being identified correctly or not was calculated for each woman. A qualitative analysis of the speech samples was performed. No group of raters was able to correctly classify pregnant and non-pregnant women. However, the raters' choices were not completely random. The wish to be pregnant or to have a baby could be linked to a primiparous classification whereas job priorities could be linked to a nulliparous classification. It was not possible to detect psychic transparency in this study. The wish for a child might be easier to identify. In addition, the raters' choices seemed to be connected to social representations of motherhood.
Predicting Fog in the Nocturnal Boundary Layer
NASA Astrophysics Data System (ADS)
Izett, Jonathan; van de Wiel, Bas; Baas, Peter; van der Linden, Steven; van Hooft, Antoon; Bosveld, Fred
2017-04-01
Fog is a global phenomenon that presents a hazard to navigation and human safety, resulting in significant economic impacts for air and shipping industries as well as causing numerous road traffic accidents. Accurate prediction of fog events, however, remains elusive both in terms of timing and occurrence itself. Statistical methods based on set threshold criteria for key variables such as wind speed have been developed, but high rates of correct prediction of fog events still lead to similarly high "false alarms" when the conditions appear favourable, but no fog forms. Using data from the CESAR meteorological observatory in the Netherlands, we analyze specific cases and perform statistical analyses of event climatology, in order to identify the necessary conditions for correct prediction of fog. We also identify potential "missing ingredients" in current analysis that could help to reduce the number of false alarms. New variables considered include the indicators of boundary layer stability, as well as the presence of aerosols conducive to droplet formation. The poster presents initial findings of new research as well as plans for continued research.
Tang, Hua; Chen, Wei; Lin, Hao
2016-04-01
Immunoglobulins, also called antibodies, are a group of cell surface proteins which are produced by the immune system in response to the presence of a foreign substance (called an antigen). They play key roles in many medical, diagnostic and biotechnological applications. Correct identification of immunoglobulins is crucial to the comprehension of humoral immune function. With the avalanche of protein sequences identified in the postgenomic age, it is highly desirable to develop computational methods to timely identify immunoglobulins. In view of this, we designed a predictor called "IGPred" by formulating protein sequences with the pseudo amino acid composition into which nine physiochemical properties of amino acids were incorporated. Jackknife cross-validated results showed that 96.3% of immunoglobulins and 97.5% of non-immunoglobulins can be correctly predicted, indicating that IGPred holds very high potential to become a useful tool for antibody analysis. For the convenience of most experimental scientists, a web-server for IGPred was established at http://lin.uestc.edu.cn/server/IGPred. We believe that the web-server will become a powerful tool to study immunoglobulins and to guide related experimental validations.
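The feature encoding can be sketched in the style of Chou's pseudo amino acid composition. This is a simplified illustration, not IGPred's actual formulation: it uses a single physicochemical property (Kyte-Doolittle hydropathy) where IGPred incorporated nine, and the weight and lag values are arbitrary:

```python
import numpy as np

# Kyte-Doolittle hydropathy scale (one property; IGPred used nine).
HYDRO = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
         "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
         "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
         "Y": -1.3, "V": 4.2}
AA = sorted(HYDRO)

def pseaac(seq, lam=3, w=0.05):
    """Pseudo amino acid composition: the 20 amino acid frequencies plus
    `lam` sequence-order correlation factors computed from a standardized
    physicochemical profile of the sequence."""
    vals = np.array([HYDRO[a] for a in seq])
    vals = (vals - vals.mean()) / vals.std()          # standardize property
    comp = np.array([seq.count(a) for a in AA]) / len(seq)
    theta = np.array([np.mean((vals[:-k] - vals[k:]) ** 2)
                      for k in range(1, lam + 1)])    # order correlations
    feats = np.concatenate([comp, w * theta])
    return feats / feats.sum()

x = pseaac("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")  # hypothetical sequence
print(len(x))  # 20 composition features + lam pseudo components
```

A classifier trained on such vectors, evaluated by jackknife cross-validation, would correspond to the prediction step reported in the abstract.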
Ambiguities in model-independent partial-wave analysis
NASA Astrophysics Data System (ADS)
Krinner, F.; Greenwald, D.; Ryabchikov, D.; Grube, B.; Paul, S.
2018-06-01
Partial-wave analysis is an important tool for analyzing large data sets in hadronic decays of light and heavy mesons. It commonly relies on the isobar model, which assumes multihadron final states originate from successive two-body decays of well-known undisturbed intermediate states. Recently, analyses of heavy-meson decays and diffractively produced states have attempted to overcome the strong model dependences of the isobar model. These analyses have overlooked that model-independent, or freed-isobar, partial-wave analysis can introduce mathematical ambiguities in results. We show how these ambiguities arise and present general techniques for identifying their presence and for correcting for them. We demonstrate these techniques with specific examples in both heavy-meson decay and pion-proton scattering.
Seidman, Dominika; Carlson, Kimberly; Weber, Shannon; Witt, Jacki; Kelly, Patricia J
2016-05-01
The Centers for Disease Control and Prevention defines HIV prevention as a core family planning service. The HIV community identified family planning visits as key encounters for women to access preexposure prophylaxis (PrEP) for HIV prevention. No studies explore US family planning providers' knowledge of and attitudes towards PrEP. We conducted a national survey of clinicians to understand barriers and facilitators to PrEP implementation in family planning. Family planning providers recruited via website postings, national meetings, and email completed an anonymous survey in 2015. Descriptive statistics were performed. Among 604 respondents, 495 were eligible for analysis and 342 were potential PrEP prescribers (physicians, nurse practitioners, midwives or physicians assistants). Among potential prescribers, 38% correctly defined PrEP [95% confidence interval (CI): 32.5-42.8], 37% correctly stated the efficacy of PrEP (95% CI: 32.0-42.4), and 36% chose the correct HIV test after a recent exposure (95% CI: 30.6-40.8). Characteristics of those who answered knowledge questions correctly included age less than 35 years, practicing in the Northeast or West, routinely offering HIV testing, providing rectal sexually transmitted infection screening or having seen any PrEP guidelines. Even among providers in the Northeast and West, the proportion of respondents answering questions correctly was less than 50%. Thirty-six percent of respondents had seen any PrEP guidelines. Providers identified lack of training as the main barrier to PrEP implementation; 87% wanted PrEP education. To offer comprehensive HIV prevention services, family planning providers urgently need training on PrEP and HIV testing. US family planning providers have limited knowledge about HIV PrEP and HIV testing, and report lack of provider training as the main barrier to PrEP provision. Provider education is needed to ensure that family planning clients access comprehensive HIV prevention methods. 
Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hoehndorf, Robert; Schofield, Paul N.; Gkoutos, Georgios V.
2015-06-01
Phenotypes are the observable characteristics of an organism arising from its response to the environment. Phenotypes associated with engineered and natural genetic variation are widely recorded using phenotype ontologies in model organisms, as are signs and symptoms of human Mendelian diseases in databases such as OMIM and Orphanet. Exploiting these resources, several computational methods have been developed for integration and analysis of phenotype data to identify the genetic etiology of diseases or suggest plausible interventions. A similar resource would be highly useful not only for rare and Mendelian diseases, but also for common, complex and infectious diseases. We apply a semantic text-mining approach to identify the phenotypes (signs and symptoms) associated with over 6,000 diseases. We evaluate our text-mined phenotypes by demonstrating that they can correctly identify known disease-associated genes in mice and humans with high accuracy. Using a phenotypic similarity measure, we generate a human disease network in which diseases that have similar signs and symptoms cluster together, and we use this network to identify closely related diseases based on common etiological, anatomical as well as physiological underpinnings.
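The phenotypic similarity idea can be sketched with a set-overlap measure. This is a deliberately minimal stand-in: the study used ontology-based semantic similarity over phenotype ontologies, not plain Jaccard overlap, and the diseases and terms below are hypothetical:

```python
# Diseases represented as sets of text-mined phenotype terms (signs and
# symptoms); all names here are illustrative.
diseases = {
    "disease_A": {"fever", "cough", "dyspnea"},
    "disease_B": {"fever", "cough", "fatigue"},
    "disease_C": {"tremor", "rigidity", "bradykinesia"},
}

def jaccard(p, q):
    """Fraction of shared phenotype terms between two diseases."""
    return len(p & q) / len(p | q)

sims = {(a, b): jaccard(diseases[a], diseases[b])
        for a in diseases for b in diseases if a < b}

# Diseases sharing signs and symptoms score highest and would cluster
# together in the resulting disease network.
print(max(sims, key=sims.get))
```

Replacing Jaccard with an ontology-aware measure lets partially matching terms (e.g. parent and child phenotype classes) contribute to the similarity, which is what makes the network cluster by etiology, anatomy and physiology.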
NASA Astrophysics Data System (ADS)
Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.
2017-12-01
Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, and (ii) implement an online correction scheme (i.e., within the model) for GFS following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis increments represent the corrections that new observations make to, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6 hr, assuming that initial model errors grow linearly and, at first, ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low-dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal, diurnal and semidiurnal model biases in GFS to reduce both systematic and random errors. As the error growth in the short term is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with GFS correcting temperature and specific humidity online show a reduction in model bias in the 6-hr forecast.
This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
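The bias-estimation-and-online-correction idea can be illustrated with a toy scalar model (this sketch is not the GFS methodology itself; the model, forcing and decay constants are hypothetical):

```python
import numpy as np

# Toy setup: "nature" includes a constant forcing b that the model lacks,
# so the model accumulates a systematic 6-hr forecast error.
dt, window, b = 0.25, 6.0, 0.8      # time step (hr), analysis cycle (hr), forcing
steps = int(window / dt)

def forecast(x0, correction=0.0):
    x = x0
    for _ in range(steps):
        x += dt * (-0.05 * x + correction)   # model tendency (forcing missing)
    return x

def truth(x0):
    x = x0
    for _ in range(steps):
        x += dt * (-0.05 * x + b)            # nature includes the forcing
    return x

# Analysis cycle: increment = analysis (taken as truth) minus 6-hr forecast.
x, increments = 1.0, []
for _ in range(40):
    xa = truth(x)
    increments.append(xa - forecast(x))
    x = xa

# Bias estimate: time-averaged analysis increment divided by the 6-hr window.
bias_per_hr = np.mean(increments) / window

# Online correction: add the estimated bias as a forcing in the tendency.
err_raw = abs(truth(1.0) - forecast(1.0))
err_corrected = abs(truth(1.0) - forecast(1.0, correction=bias_per_hr))
print(err_raw, err_corrected)
```

The corrected forecast error is far smaller than the raw one, mirroring the reported reduction in 6-hr forecast bias when temperature and humidity are corrected online.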
Osborne, Jason W
2006-05-01
D'Amico, Neilands, and Zambarano (2001) published SPSS syntax to perform power analyses for three complex procedures: ANCOVA, MANOVA, and repeated measures ANOVA. Unfortunately, the published SPSS syntax for performing the repeated measures analysis needed some minor revision in order to perform the analysis correctly. This article presents the corrected syntax that will successfully perform the repeated measures analysis and provides some guidance on modifying the syntax to customize the analysis.
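The corrected SPSS syntax itself is not reproduced here; as a cross-check, the underlying computation (power of a repeated measures F test via the noncentral F distribution) can be sketched in Python. The noncentrality convention below (λ = f²·n·k·ε, with the sphericity correction ε applied to both degrees of freedom and noncentrality) is one common choice, stated here as an assumption:

```python
from scipy.stats import f, ncf

def rm_anova_power(n, k, effect_f, eps=1.0, alpha=0.05):
    """Power of a one-way repeated measures ANOVA F test.
    n: subjects, k: repeated measurements, effect_f: Cohen's f,
    eps: sphericity correction, alpha: significance level."""
    df1 = (k - 1) * eps
    df2 = (k - 1) * (n - 1) * eps
    lam = effect_f ** 2 * n * k * eps          # noncentrality parameter
    crit = f.ppf(1 - alpha, df1, df2)          # central-F critical value
    return 1 - ncf.cdf(crit, df1, df2, lam)    # noncentral-F tail = power

print(round(rm_anova_power(n=30, k=3, effect_f=0.25), 3))
```

As expected, power increases with sample size and effect size, which gives a quick sanity check against any syntax-based implementation.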
Implementing Training for Correctional Educators. Correctional/Special Education Training Project.
ERIC Educational Resources Information Center
Wolford, Bruce I., Ed.; And Others
These papers represent the collected thoughts of the contributors to a national training and dissemination conference dealing with identifying and developing linkages between postsecondary special education and criminal justice preservice education programs in order to improve training for correctional educators working with disabled clients. The…
Vocational Education in Corrections. Information Series No. 237.
ERIC Educational Resources Information Center
Day, Sherman R.; McCane, Mel R.
Vocational education programs in America's correctional institutions have been financially handicapped, since security demands the greatest portion of resource allocations. Four eras in the development of the correctional system are generally identified: era of punishment and retribution, era of restraint or reform, era of rehabilitation and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-05
... cracking was identified as stress corrosion. This condition, if not corrected, could lead to in-flight... identified as stress corrosion. This condition, if not corrected, could lead to in-flight failure of the tab..., using a material that is more resistant to stress corrosion. The improved material rudder spring tab...
ERIC Educational Resources Information Center
Murray, Ellen R.
2016-01-01
According to the literature, identifying and treating tuberculosis (TB) in correctional facilities have been problematic for the inmates and also for the communities into which inmates are released. The importance of training those who can identify this disease early into incarceration is vital to halt the transmission. Although some training has…
ERIC Educational Resources Information Center
Croker, Robert E.; And Others
A study identified the learning style preferences and brain hemisphericity of female inmates at the Pocatello Women's Correctional Center in Pocatello, Idaho. It also identified teaching methodologies to which inmates were exposed while in a learning environment as well as preferred teaching methods. Data were gathered by the Learning Type Measure…
Work productivity loss from depression: evidence from an employer survey.
Rost, Kathryn M; Meng, Hongdao; Xu, Stanley
2014-12-18
National working groups identify the need for return-on-investment research conducted from the purchaser perspective; however, the field has not developed standardized methods for measuring the basic components of return on investment, including costing out the value of work productivity loss due to illness. Recent literature is divided on whether the most commonly used method underestimates or overestimates this loss. The goal of this manuscript is to characterize between- and within-method variation in the cost of work productivity loss from illness estimated by the most commonly used method and its two refinements. One senior health benefit specialist from each of 325 companies employing 100+ workers completed a cross-sectional survey describing their company size, industry, and policies/practices regarding work loss, which allowed the research team to derive the variables needed to estimate work productivity loss from illness using three methods. Compensation estimates were derived by multiplying lost work hours from presenteeism and absenteeism by wage/fringe. Disruption correction adjusted this estimate to account for co-worker disruption, while friction correction accounted for labor substitution. The analysis compared bootstrapped means and medians across these three methods. The average company realized an annual $617 (SD = $75) per capita loss from depression by the compensation method and a $649 (SD = $78) loss by disruption correction, compared to a $316 (SD = $58) loss by friction correction (p < .0001). Agreement across estimates was 0.92 (95% CI 0.90, 0.93). Although the methods identify similar companies with high costs from lost productivity, friction correction reduces the size of compensation estimates of productivity loss by one half.
In analyzing the potential consequences of method selection for the dissemination of interventions to employers, intervention developers are encouraged to include friction methods in their estimate of the economic value of interventions designed to improve absenteeism and presenteeism. Business leaders in industries where labor substitution is common are encouraged to seek friction corrected estimates of return on investment. Health policy analysts are encouraged to target the dissemination of productivity enhancing interventions to employers with high losses rather than all employers. NCT01013220.
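The three costing methods can be sketched as simple arithmetic. The disruption multiplier and substitution fraction below are hypothetical placeholders, chosen only to echo the direction of the adjustments described above (upward for co-worker disruption, downward for labor substitution), not values from the study:

```python
def compensation_cost(lost_hours, wage_plus_fringe):
    """Most common method: lost hours valued at wage plus fringe."""
    return lost_hours * wage_plus_fringe

def disruption_corrected(base_cost, coworker_disruption_multiplier=1.05):
    """Adjust upward for co-worker disruption (multiplier is hypothetical)."""
    return base_cost * coworker_disruption_multiplier

def friction_corrected(base_cost, substitution_fraction=0.5):
    """Adjust downward where labor substitution absorbs part of the loss
    (the fraction is hypothetical, chosen to echo the roughly one-half
    reduction reported in the study)."""
    return base_cost * (1 - substitution_fraction)

base = compensation_cost(40, 30.0)  # 40 lost hours at $30/hr wage+fringe
```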
Potential use of ionic species for identifying source land-uses of stormwater runoff.
Lee, Dong Hoon; Kim, Jin Hwi; Mendoza, Joseph A; Lee, Chang-Hee; Kang, Joo-Hyon
2017-02-01
Identifying critical land-uses or source areas is important for prioritizing resources for cost-effective stormwater management. This study investigated the use of information on ionic composition as a fingerprint to identify the source land-use of stormwater runoff. We used 12 ionic species in stormwater runoff monitored for a total of 20 storm events at five sites with different land-use compositions during the 2012-2014 wet seasons. A stepwise forward discriminant function analysis (DFA) with the jack-knifed cross-validation approach was used to select the ionic species that best discriminate the land-use of the source. Of the 12 ionic species, 9 (K+, Mg2+, Na+, NH4+, Br-, Cl-, F-, NO2-, and SO42-) were selected for better performance of the DFA. The DFA successfully differentiated stormwater samples from urban, rural, and construction sites using concentrations of the ionic species (correct classification rates of 70%, 95%, and 91%, respectively). Over 80% of the new data cases were correctly classified by the trained DFA model. When applied to data cases from a mixed land-use catchment and its downstream site, the DFA model showed a greater impact of urban areas in the earlier part of a storm event and of rural areas in the later part.
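A jack-knifed discriminant analysis of this kind can be sketched with a minimal linear discriminant classifier in plain NumPy; the "ionic" features, class centers, and sample sizes below are invented for illustration:

```python
import numpy as np

def lda_fit(X, y):
    """Fit linear discriminant analysis: class means, priors, and the
    inverse of the pooled within-class covariance."""
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}
    n, p = X.shape
    S = np.zeros((p, p))
    for c in classes:
        Xc = X[y == c] - means[c]
        S += Xc.T @ Xc
    S /= (n - len(classes))
    return classes, means, priors, np.linalg.inv(S)

def lda_predict(x, model):
    """Assign the class with the largest linear discriminant score."""
    classes, means, priors, Sinv = model
    scores = [x @ Sinv @ means[c] - 0.5 * means[c] @ Sinv @ means[c]
              + np.log(priors[c]) for c in classes]
    return classes[int(np.argmax(scores))]

# Synthetic "ionic fingerprints": 3 land-use classes, 4 hypothetical ion features
rng = np.random.default_rng(0)
centers = np.array([[5, 1, 1, 2], [1, 5, 2, 1], [2, 2, 5, 5]], dtype=float)
X = np.vstack([c + rng.normal(0, 0.5, size=(30, 4)) for c in centers])
y = np.repeat([0, 1, 2], 30)

# Jack-knifed (leave-one-out) cross-validation, as in the study's DFA
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    model = lda_fit(X[mask], y[mask])
    correct += lda_predict(X[i], model) == y[i]
loo_accuracy = correct / len(y)
```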
Phylogenetic Analysis and Classification of the Fungal bHLH Domain
Sailsbery, Joshua K.; Atchley, William R.; Dean, Ralph A.
2012-01-01
The basic Helix-Loop-Helix (bHLH) domain is an essential highly conserved DNA-binding domain found in many transcription factors in all eukaryotic organisms. The bHLH domain has been well studied in the Animal and Plant Kingdoms but has yet to be characterized within Fungi. Herein, we obtained and evaluated the phylogenetic relationship of 490 fungal-specific bHLH containing proteins from 55 whole genome projects composed of 49 Ascomycota and 6 Basidiomycota organisms. We identified 12 major groupings within Fungi (F1–F12); identifying conserved motifs and functions specific to each group. Several classification models were built to distinguish the 12 groups and elucidate the most discerning sites in the domain. Performance testing on these models, for correct group classification, resulted in a maximum sensitivity and specificity of 98.5% and 99.8%, respectively. We identified 12 highly discerning sites and incorporated those into a set of rules (simplified model) to classify sequences into the correct group. Conservation of amino acid sites and phylogenetic analyses established that like plant bHLH proteins, fungal bHLH–containing proteins are most closely related to animal Group B. The models used in these analyses were incorporated into a software package, the source code for which is available at www.fungalgenomics.ncsu.edu. PMID:22114358
Saturn 5 launch vehicle flight evaluation report-AS-509 Apollo 14 mission
NASA Technical Reports Server (NTRS)
1971-01-01
A postflight analysis of the Apollo 14 flight is presented. The basic objective of the flight evaluation is to acquire, reduce, analyze, and report on flight data to the extent required to assure future mission success and vehicle reliability. Actual flight failures are identified, their causes are determined and corrective actions are recommended. Summaries of launch operations and spacecraft performance are included. The significant events for all phases of the flight are analyzed.
2013-10-01
Correct group assignment of samples was assessed by unsupervised hierarchical clustering using the Unweighted Pair-Group Method with Arithmetic averages (UPGMA), based on centering of log2-transformed MAS5.0 signal values; probe-set clustering was performed by the UPGMA method using cosine correlation as the similarity metric. The 108 differentially regulated genes identified were subjected to unsupervised hierarchical clustering analysis using the UPGMA algorithm.
Li, Qiang; Ren, Xiaojing; Lu, Chuan; Li, Weixia; Huang, Yuxian; Chen, Liang
2017-01-01
To evaluate the performance of the aspartate transaminase-to-platelet ratio index (APRI) and the fibrosis index based on four factors (FIB-4) in predicting significant fibrosis and cirrhosis in hepatitis B virus e antigen (HBeAg)-negative chronic hepatitis B (CHB) patients with alanine transaminase (ALT) ≤ twice the upper limit of normal (2 ULN). Histologic and laboratory data of 236 HBeAg-negative CHB patients with ALT ≤ 2 ULN were analyzed. Predicted fibrosis stage, based on established scales and cut-offs for APRI and FIB-4, was compared with METAVIR scores obtained from liver biopsy. In this study, the area under the receiver operating characteristic curve (AUROC) of APRI was lower than that of FIB-4 (0.62 vs 0.69; P = 0.019) for diagnosing significant fibrosis; however, APRI and FIB-4 were comparable for diagnosing cirrhosis (0.77 vs 0.81; P = 0.374). When the cut-off proposed by the WHO HBV guideline for APRI (>2.0) was used, no cirrhotic patients were correctly predicted. For FIB-4, the WHO-proposed cut-off of 3.25 correctly identified significant fibrosis 83% of the time; for APRI, the WHO-proposed cut-off of 1.5 identified significant fibrosis 56% of the time. In ruling out significant fibrosis, the WHO-proposed APRI cut-off of 0.5 had a predictive value of 39%, and the FIB-4 cut-off of 1.45 correctly identified lack of significant fibrosis in 47% of the patients. In this study, based on ROC analysis, the optimal cut-offs were 0.46 and 0.65 for APRI, and 1.05 and 1.29 for FIB-4, for diagnosing significant fibrosis and cirrhosis, respectively. When the new APRI cut-off (>0.65) was used, 82% of the cirrhotic patients were correctly predicted. In ruling out significant fibrosis, the new APRI cut-off (<0.46) had a predictive value of 80%, and the new FIB-4 cut-off (<1.05) correctly identified lack of significant fibrosis in 84% of the patients. 
The WHO guidelines proposed cut-offs might be higher for HBeAg-negative CHB patients with ALT ≤2 ULN, and might underestimate the proportion of significant fibrosis and cirrhosis. A new set of cut-offs should be used to predict significant fibrosis and cirrhosis in this specific population. PMID:28328813
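The two indices themselves are simple ratios of routine laboratory values. A minimal sketch, applying the study's new significant-fibrosis cut-offs (combining the two scores with "or" is our illustration, not the paper's decision rule):

```python
import math

def apri(ast, ast_uln, platelets):
    """AST-to-platelet ratio index: AST in U/L, ULN typically 40 U/L,
    platelet count in 10^9/L."""
    return (ast / ast_uln) * 100 / platelets

def fib4(age, ast, alt, platelets):
    """Fibrosis-4 index: age in years, AST/ALT in U/L, platelets in 10^9/L."""
    return age * ast / (platelets * math.sqrt(alt))

score_apri = apri(ast=40, ast_uln=40, platelets=200)
score_fib4 = fib4(age=50, ast=40, alt=36, platelets=200)

# Study-specific cut-offs from the abstract (not the WHO defaults)
significant_fibrosis = score_apri > 0.46 or score_fib4 > 1.05
```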
Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D
2018-02-01
OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
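Segmented regression, one of the analysis methods tallied above, models an interrupted time series with a level change and a slope change at the intervention. A minimal sketch on synthetic monthly data (the coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Monthly counts: 24 pre-intervention and 24 post-intervention points
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
level = (t >= n_pre).astype(float)       # step change at the intervention
trend = np.maximum(0, t - n_pre + 1.0)   # slope change after the intervention
y = 10 + 0.1 * t - 3.0 * level - 0.2 * trend + rng.normal(0, 0.5, t.size)

# Segmented regression: y = b0 + b1*time + b2*level + b3*trend
X = np.column_stack([np.ones_like(t, dtype=float), t, level, trend])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta   # b2: immediate level change, b3: change in slope
```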
NASA Technical Reports Server (NTRS)
Pham, Timothy T.; Machuzak, Richard J.; Bedrossian, Alina; Kelly, Richard M.; Liao, Jason C.
2012-01-01
This software provides an automated capability to measure and qualify the frequency stability performance of the Deep Space Network (DSN) ground system, using daily spacecraft tracking data. The results help to verify if the DSN performance is meeting its specification, therefore ensuring commitments to flight missions; in particular, the radio science investigations. The rich set of data also helps the DSN Operations and Maintenance team to identify the trends and patterns, allowing them to identify the antennas of lower performance and implement corrective action in a timely manner. Unlike the traditional approach where the performance can only be obtained from special calibration sessions that are both time-consuming and require manual setup, the new method taps into the daily spacecraft tracking data. This new approach significantly increases the amount of data available for analysis, roughly by two orders of magnitude, making it possible to conduct trend analysis with good confidence. The software is built with automation in mind for end-to-end processing. From the inputs gathering to computation analysis and later data visualization of the results, all steps are done automatically, making the data production at near zero cost. This allows the limited engineering resource to focus on high-level assessment and to follow up with the exceptions/deviations. To make it possible to process the continual stream of daily incoming data without much effort, and to understand the results quickly, the processing needs to be automated and the data summarized at a high level. Special attention needs to be given to data gathering, input validation, handling anomalous conditions, computation, and presenting the results in a visual form that makes it easy to spot items of exception/deviation so that further analysis can be directed and corrective actions followed.
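Frequency stability of this kind is conventionally summarized by the Allan deviation. A minimal sketch on synthetic white frequency noise (the DSN-specific extraction of frequency residuals from tracking data is not shown):

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation of fractional-frequency data y
    at averaging factor m (averaging time tau = m * tau0)."""
    n_blocks = len(y) // m
    ybar = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

# For white frequency noise the Allan deviation falls as tau^(-1/2)
rng = np.random.default_rng(2)
y = rng.normal(0, 1e-14, 100_000)
adev_1 = allan_deviation(y, 1)
adev_100 = allan_deviation(y, 100)
```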
Validating the Use of Deep Learning Neural Networks for Correction of Large Hydrometric Datasets
NASA Astrophysics Data System (ADS)
Frazier, N.; Ogden, F. L.; Regina, J. A.; Cheng, Y.
2017-12-01
Collection and validation of Earth systems data can be time consuming and labor intensive. In particular, high resolution hydrometric data, including rainfall and streamflow measurements, are difficult to obtain due to a multitude of complicating factors. Measurement equipment is subject to clogs, environmental disturbances, and sensor drift. Manual intervention is typically required to identify, correct, and validate these data. Weirs can become clogged and the pressure transducer may float or drift over time. We typically employ a graphical tool called Time Series Editor to manually remove clogs and sensor drift from the data. However, this process is highly subjective and requires hydrological expertise: two different people may produce two different data sets. To use these data for scientific discovery and model validation, a more consistent method is needed to process this field data. Deep learning neural networks have proved to be excellent mechanisms for recognizing patterns in data. We explore the use of Recurrent Neural Networks (RNN) to capture the patterns in the data over time using various gating mechanisms (LSTM and GRU), network architectures, and hyper-parameters to build an automated data correction model. We also explore the amount of manually corrected training data required to train the network to reasonable accuracy. The benefits of this approach are that the time to process a data set is significantly reduced, and the results are 100% reproducible after training is complete. Additionally, we train the RNN and calibrate a physically-based hydrological model against the same portion of data. Both the RNN and the model are applied to the remaining data using a split-sample methodology. Performance of the machine-learning model is evaluated for plausibility by comparing with the output of the hydrological model, and this analysis identifies potential periods where additional investigation is warranted.
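The gating mechanisms mentioned above can be sketched as a single GRU recurrent step in plain NumPy. The input features and layer size are hypothetical, and a real correction model would be trained with a deep learning framework; this only illustrates the update/reset gating:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One step of a GRU cell: update gate z, reset gate r, candidate
    state h_tilde; the new state is a gated blend of old and candidate."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)
    r = sigmoid(Wr @ x + Ur @ h + br)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)
    return (1 - z) * h + z * h_tilde

# Hypothetical sizes: 3 input features (e.g., stage, rainfall, clog flag),
# 8 hidden units, small random weights
rng = np.random.default_rng(3)
n_in, n_h = 3, 8
params = tuple(
    arr for _ in range(3)
    for arr in (rng.normal(0, 0.1, (n_h, n_in)),
                rng.normal(0, 0.1, (n_h, n_h)),
                np.zeros(n_h))
)

h = np.zeros(n_h)
for x in rng.normal(0, 1, (50, n_in)):  # run over a short synthetic sequence
    h = gru_step(x, h, params)
```

Because the new state is a convex combination of the old state and a tanh-bounded candidate, the hidden state stays bounded in [-1, 1], which is part of what makes gated units stable over long sequences.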
NASA Astrophysics Data System (ADS)
Oreiro, F. A.; Wziontek, H.; Fiore, M. M. E.; D'Onofrio, E. E.; Brunini, C.
2018-05-01
The Argentinean-German Geodetic Observatory is located 13 km from the Río de la Plata, in an area that is frequently affected by storm surges that can vary the level of the river by over ±3 m. Water-level information from seven tide gauge stations located on the Río de la Plata is used to calculate, every hour, an empirical model of water heights (tidal + non-tidal component) and an empirical model of storm surge (non-tidal component) for the period 01/2016-12/2016. Using the SPOTL software, the gravimetric response of the models and the tidal response are calculated, showing that for the observatory location, the range of the tidal component (3.6 nm/s2) is only 12% of the range of the non-tidal component (29.4 nm/s2). The gravimetric response of the storm surge model is subtracted from the superconducting gravimeter observations, after applying the traditional corrections, and a reduction of 7% in the RMS is obtained. The wavelet transform is applied to the same series, before and after the non-tidal correction, and a clear decrease in the spectral energy at periods between 2 and 12 days is identified between the series. Using the same software, east, north, and up displacements are calculated, with ranges of 3, 2, and 11 mm, respectively. The residuals obtained after applying the non-tidal correction clearly reveal the influence of rain events in the superconducting gravimeter observations, indicating the need to analyze this and other hydrological and geophysical effects.
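The reported RMS reduction is simply the change in root-mean-square residual after subtracting the modeled non-tidal gravity signal. A minimal sketch with a synthetic storm-surge-like signal plus noise (amplitudes and periods are invented, and here the model matches the signal exactly, so the reduction is far larger than the 7% obtained with real data):

```python
import numpy as np

def rms(x):
    return np.sqrt(np.mean(x ** 2))

# Synthetic gravity residuals after traditional corrections: a non-tidal
# loading signal (hypothetical, nm/s^2) plus instrument noise
rng = np.random.default_rng(4)
t = np.arange(24 * 365, dtype=float)               # hourly samples, one year
loading = 10.0 * np.sin(2 * np.pi * t / (24 * 5))  # ~5-day surge-like signal
noise = rng.normal(0, 3.0, t.size)
observed = loading + noise

# Subtract the modeled gravimetric response of the storm-surge model
corrected = observed - loading
rms_reduction_pct = (1 - rms(corrected) / rms(observed)) * 100
```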
Høeg, Tracy B; Moldow, Birgitte; Klein, Ronald; La Cour, Morten; Klemp, Kristian; Erngaard, Ditte; Ellervik, Christina; Buch, Helena
2016-03-01
To examine non-mydriatic fundus photography (FP) and fundus autofluorescence (FAF) as alternative non-invasive imaging modalities to fluorescein angiography (FA) in the detection of cuticular drusen (CD). Among 2953 adults from the Danish Rural Eye Study (DRES) with gradable FP, three study groups were selected: (1) all those with suspected CD without age-related macular degeneration (AMD) on FP, (2) all those with suspected CD with AMD on FP and (3) a randomly selected group with early AMD. Groups 1, 2 and 3 underwent FA and FAF and group 4 underwent FAF only as part of the DRES CD substudy. Main outcome measures included percentage of correct positive and correct negative diagnoses, Cohen's κ and prevalence-adjusted and bias-adjusted κ (PABAK) coefficients of test and grader reliability. CD was correctly identified on FP 88.9% of the time and correctly identified as not being present 83.3% of the time. CD was correctly identified on FAF 62.0% of the time and correctly identified as not being present 100.0% of the time. Compared with FA, FP has a PABAK of 0.75 (0.60 to 1.5) and FAF a PABAK of 0.44 (0.23 to 0.95). FP is a promising, non-invasive substitute for FA in the diagnosis of CD. FAF was less reliable than FP in detecting CD.
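The two agreement statistics used above differ only in how chance agreement is handled; PABAK in particular reduces to a function of raw agreement alone. A minimal sketch (the 87.5% agreement figure is a hypothetical input, not a value from the study):

```python
def cohens_kappa(p_observed, p_expected):
    """Cohen's kappa: agreement corrected for chance agreement p_expected."""
    return (p_observed - p_expected) / (1 - p_expected)

def pabak(p_observed):
    """Prevalence-adjusted bias-adjusted kappa: fixes chance agreement
    at 0.5, so it depends only on the raw agreement proportion."""
    return 2 * p_observed - 1

agreement = 0.875                      # hypothetical raw agreement
k_pabak = pabak(agreement)
k_cohen = cohens_kappa(agreement, 0.5) # equals PABAK when p_expected = 0.5
```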
Oakes, Theres; Heather, James M.; Best, Katharine; Byng-Maddick, Rachel; Husovsky, Connor; Ismail, Mazlina; Joshi, Kroopa; Maxwell, Gavin; Noursadeghi, Mahdad; Riddell, Natalie; Ruehl, Tabea; Turner, Carolin T.; Uddin, Imran; Chain, Benny
2017-01-01
The T cell receptor (TCR) repertoire can provide a personalized biomarker for infectious and non-infectious diseases. We describe a protocol for amplifying, sequencing, and analyzing TCRs which is robust, sensitive, and versatile. The key experimental step is ligation of a single-stranded oligonucleotide to the 3′ end of the TCR cDNA. This allows amplification of all possible rearrangements using a single set of primers per locus. It also introduces a unique molecular identifier to label each starting cDNA molecule. This molecular identifier is used to correct for sequence errors and for effects of differential PCR amplification efficiency, thus producing more accurate measures of the true TCR frequency within the sample. This integrated experimental and computational pipeline is applied to the analysis of human memory and naive subpopulations, and results in consistent measures of diversity and inequality. After error correction, the distribution of TCR sequence abundance in all subpopulations followed a power law over a wide range of values. The power law exponent differed between naïve and memory populations, but was consistent between individuals. The integrated experimental and analysis pipeline we describe is appropriate to studies of T cell responses in a broad range of physiological and pathological contexts. PMID:29075258
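The unique-molecular-identifier correction described above can be sketched as grouping reads by UMI and taking a per-position majority consensus, so PCR duplicates and sequencing errors collapse to one sequence per starting cDNA molecule. The UMIs, sequences, and the single error below are invented for illustration; a real pipeline also handles errors within the UMI itself and UMI collisions:

```python
from collections import Counter, defaultdict

def collapse_by_umi(reads):
    """Group (umi, sequence) reads by UMI and build a per-position
    majority consensus for each group."""
    groups = defaultdict(list)
    for umi, seq in reads:
        groups[umi].append(seq)
    consensus = {}
    for umi, seqs in groups.items():
        consensus[umi] = "".join(
            Counter(bases).most_common(1)[0][0] for bases in zip(*seqs)
        )
    return consensus

# Hypothetical reads: two molecules, one with a sequencing error in one read
reads = [
    ("AACG", "ACGTACGT"),
    ("AACG", "ACGTACGT"),
    ("AACG", "ACGTACGA"),   # error in the last base, outvoted by consensus
    ("TTGC", "GGGTACGT"),
]
molecules = collapse_by_umi(reads)
```

Counting the keys of `molecules`, rather than the raw reads, is what corrects for differential PCR amplification efficiency.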
Rose, Peter G.; Java, James; Whitney, Charles W.; Stehman, Frederick B.; Lanciano, Rachelle; Thomas, Gillian M.; DiSilvestro, Paul A.
2015-01-01
Purpose To evaluate the prognostic factors in locally advanced cervical cancer limited to the pelvis and develop nomograms for 2-year progression-free survival (PFS), 5-year overall survival (OS), and pelvic recurrence. Patients and Methods We retrospectively reviewed 2,042 patients with locally advanced cervical carcinoma enrolled onto Gynecologic Oncology Group clinical trials of concurrent cisplatin-based chemotherapy and radiotherapy. Nomograms for 2-year PFS, 5-year OS, and pelvic recurrence were created as visualizations of Cox proportional hazards regression models. The models were validated by bootstrap-corrected, relatively unbiased estimates of discrimination and calibration. Results Multivariable analysis identified prognostic factors including histology, race/ethnicity, performance status, tumor size, International Federation of Gynecology and Obstetrics stage, tumor grade, pelvic node status, and treatment with concurrent cisplatin-based chemotherapy. PFS, OS, and pelvic recurrence nomograms had bootstrap-corrected concordance indices of 0.62, 0.64, and 0.73, respectively, and were well calibrated. Conclusion Prognostic factors were used to develop nomograms for 2-year PFS, 5-year OS, and pelvic recurrence for locally advanced cervical cancer clinically limited to the pelvis treated with concurrent cisplatin-based chemotherapy and radiotherapy. These nomograms can be used to better estimate individual and collective outcomes. PMID:25732170
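A nomogram's discrimination is summarized by the concordance index. A minimal sketch of Harrell's C for right-censored data; the bootstrap optimism correction mentioned above, which refits the model on resamples and subtracts the average optimism, is not shown:

```python
def harrell_c(times, events, risks):
    """Harrell's concordance index: among usable pairs (where the earlier
    time is an observed event), the fraction in which the subject failing
    earlier has the higher predicted risk; risk ties count one half."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / usable

# Perfectly ranked toy data: higher predicted risk -> earlier failure,
# with one censored observation (event = 0)
c = harrell_c(times=[1, 2, 3, 4], events=[1, 1, 1, 0], risks=[4, 3, 2, 1])
```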
NASA Astrophysics Data System (ADS)
Bogiatzis, P.; Altoé, I. L.; Karamitrou, A.; Ishii, M.; Ishii, H.
2015-12-01
DigitSeis is a new open-source, interactive digitization software written in MATLAB that converts digital, raster images of analog seismograms to readily usable, discretized time series using image processing algorithms. DigitSeis automatically identifies and corrects for various geometrical distortions of seismogram images that are acquired through the original recording, storage, and scanning procedures. With human supervision, the software further identifies and classifies important features such as time marks and notes, corrects time-mark offsets from the main trace, and digitizes the combined trace with an analysis to obtain as accurate timing as possible. Although a large effort has been made to minimize the human input, DigitSeis provides interactive tools for challenging situations such as trace crossings and stains in the paper. The effectiveness of the software is demonstrated with the digitization of seismograms that are over half a century old from the Harvard-Adam Dziewoński observatory that is still in operation as a part of the Global Seismographic Network (station code HRV and network code IU). The spectral analysis of the digitized time series shows no spurious features that may be related to the occurrence of minute and hour marks. They also display signals associated with significant earthquakes, and a comparison of the spectrograms with modern recordings reveals similarities in the background noise.
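The core digitization step, recovering a trace from a scanned raster image, can be sketched as a column-wise darkness-weighted centroid. The synthetic image below stands in for a scanned seismogram; DigitSeis itself handles much harder cases (trace crossings, stains, time marks, geometric distortions):

```python
import numpy as np

def digitize_trace(img):
    """Recover a 1-D trace from a grayscale image (0 = black ink,
    1 = white paper) by taking, in each column, the darkness-weighted
    centroid row of the ink."""
    darkness = 1.0 - img
    rows = np.arange(img.shape[0])[:, None]
    return (rows * darkness).sum(axis=0) / darkness.sum(axis=0)

# Synthetic "seismogram": a sine trace drawn into a white image
h, w = 120, 400
img = np.ones((h, w))
x = np.arange(w)
true_y = 60 + 30 * np.sin(2 * np.pi * x / 80)
img[np.round(true_y).astype(int), x] = 0.0

trace = digitize_trace(img)
max_err = np.max(np.abs(trace - true_y))   # at most the rounding error
```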
Grammatikopoulou, Ioanna; Olsen, Søren Bøye
2013-11-30
Based on a Contingent Valuation survey aiming to reveal the willingness to pay (WTP) for conservation of a wetland area in Greece, we show how protest and warm glow motives can be taken into account when modeling WTP. In a sample of more than 300 respondents, we find that 54% of the positive bids are rooted to some extent in warm glow reasoning, while 29% of the zero bids can be classified as expressions of protest rather than preferences. In previous studies, warm glow bidders are only rarely identified, while protesters are typically identified and excluded from further analysis. We test for selection bias associated with simple removal of both protesters and warm glow bidders in our data. Our findings show that removal of warm glow bidders does not significantly distort WTP, whereas we find strong evidence of selection bias associated with removal of protesters. We show how to correct for such selection bias by using a sample selection model. In our empirical sample, using the typical approach of removing protesters from the analysis, the value of protecting the wetland is significantly underestimated, by as much as 46%, unless correcting for selection bias.
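A standard way to implement such a selection correction is Heckman's two-step estimator: a probit model of selection (here, not protesting), then the inverse Mills ratio added as a regressor in the outcome equation. The sketch below uses synthetic data with invented coefficients, not the paper's survey data, and the specific sample selection model the authors used may differ:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 4000

# Selection ("not a protester") depends on z; WTP depends on x.
# Correlated errors (rho = 0.6) create the selection bias.
z = rng.normal(0, 1, n)
x = rng.normal(0, 1, n)
u = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], n)
selected = (0.5 + 1.0 * z + u[:, 0]) > 0
wtp = 2.0 + 1.0 * x + u[:, 1]              # observed only if selected

# Step 1: probit of selection on z, by maximum likelihood
def neg_loglik(b):
    idx = b[0] + b[1] * z
    return -(norm.logcdf(idx)[selected].sum()
             + norm.logcdf(-idx)[~selected].sum())

b_probit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS").x
idx = b_probit[0] + b_probit[1] * z
imr = norm.pdf(idx) / norm.cdf(idx)        # inverse Mills ratio

# Step 2: OLS on the selected sample, with the IMR as a correction term;
# its coefficient estimates rho * sigma of the outcome error
Xs = np.column_stack([np.ones(int(selected.sum())), x[selected], imr[selected]])
beta, *_ = np.linalg.lstsq(Xs, wtp[selected], rcond=None)
```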
Mohallem, José R
2008-04-14
Recent post-Hartree-Fock calculations of the diagonal Born-Oppenheimer correction empirically show that it behaves quite similarly to atomic nuclear mass corrections. An almost constant contribution per electron is identified, which converges with system size for specific series of organic molecules. This feature permits pocket-calculator evaluation of the corrections within thermochemical accuracy (10⁻¹ mhartree, or kcal/mol).
Saffert, Ryan T.; Cunningham, Scott A.; Ihde, Sherry M.; Monson Jobe, Kristine E.; Mandrekar, Jayawant; Patel, Robin
2011-01-01
We compared the BD Phoenix automated microbiology system to the Bruker Biotyper (version 2.0) matrix-assisted laser desorption ionization–time of flight (MALDI-TOF) mass spectrometry (MS) system for identification of Gram-negative bacilli, using biochemical testing and/or genetic sequencing to resolve discordant results. The BD Phoenix correctly identified 363 (83%) and 330 (75%) isolates to the genus and species level, respectively. The Bruker Biotyper correctly identified 408 (93%) and 360 (82%) isolates to the genus and species level, respectively. The 440 isolates were grouped into common (308) and infrequent (132) isolates in the clinical laboratory. For the 308 common isolates, the BD Phoenix and Bruker Biotyper correctly identified 294 (95%) and 296 (96%) of the isolates to the genus level, respectively. For species identification, the BD Phoenix and Bruker Biotyper correctly identified 93% of the common isolates (285 and 286, respectively). In contrast, for the 132 infrequent isolates, the Bruker Biotyper correctly identified 112 (85%) and 74 (56%) isolates to the genus and species level, respectively, compared to the BD Phoenix, which identified only 69 (52%) and 45 (34%) isolates to the genus and species level, respectively. Statistically, the Bruker Biotyper overall outperformed the BD Phoenix for identification of Gram-negative bacilli to the genus (P < 0.0001) and species (P = 0.0005) level in this sample set. When isolates were categorized as common or infrequent isolates, there was statistically no difference between the instruments for identification of common Gram-negative bacilli (P > 0.05). However, the Bruker Biotyper outperformed the BD Phoenix for identification of infrequently isolated Gram-negative bacilli (P < 0.0001). PMID:21209160
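Paired accuracy comparisons like this, where both systems identify the same isolates, are commonly tested with McNemar's test on the discordant pairs. A minimal sketch with invented counts; the paper does not state which paired test it used, so this is illustrative:

```python
from scipy.stats import binomtest

def mcnemar_exact(b, c):
    """Exact McNemar test for paired classification results:
    b = isolates only system A identified correctly,
    c = isolates only system B identified correctly.
    Under the null of equal accuracy, b ~ Binomial(b + c, 0.5)."""
    return binomtest(b, b + c, 0.5).pvalue

# Hypothetical paired counts: for infrequent organisms, system B is correct
# where A failed far more often than the reverse; for common organisms the
# discordant counts are nearly balanced
p_infrequent = mcnemar_exact(b=4, c=40)
p_common = mcnemar_exact(b=10, c=12)
```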
Evaluating Reported Candidate Gene Associations with Polycystic Ovary Syndrome
Pau, Cindy; Saxena, Richa; Welt, Corrine Kolka
2013-01-01
Objective: To replicate variants in candidate genes associated with PCOS in a population of European PCOS and control subjects. Design: Case-control association analysis and meta-analysis. Setting: Major academic hospital. Patients: Women of European ancestry with PCOS (n=525) and controls (n=472), aged 18 to 45 years. Intervention: Variants previously associated with PCOS in candidate gene studies were genotyped (n=39). Metabolic, reproductive and anthropometric parameters were examined as a function of the candidate variants. All genetic association analyses were adjusted for age, BMI and ancestry and were reported after correction for multiple testing. Main Outcome Measure: Association of candidate gene variants with PCOS. Results: Three variants, rs3797179 (SRD5A1), rs12473543 (POMC), and rs1501299 (ADIPOQ), were nominally associated with PCOS. However, they did not remain significant after correction for multiple testing and none of the variants replicated in a sufficiently powered meta-analysis. Variants in the FBN3 gene (rs17202517 and rs73503752) were associated with smaller waist circumferences and variant rs727428 in the SHBG gene was associated with lower SHBG levels. Conclusion: Previously identified variants in candidate genes do not appear to be associated with PCOS risk. PMID:23375202
[Internal audit in medical laboratory: what means of control for an effective audit process?].
Garcia-Hejl, Carine; Chianéa, Denis; Dedome, Emmanuel; Sanmartin, Nancy; Bugier, Sarah; Linard, Cyril; Foissaud, Vincent; Vest, Philippe
2013-01-01
To prepare the French Accreditation Committee (COFRAC) visit for initial certification of our medical laboratory, our management evaluated its quality management system (QMS) and all its technical activities. This evaluation was performed through an internal audit, which was outsourced. The auditors had expertise in auditing, thorough knowledge of biological standards, and were independent. Several nonconformities were identified at that time, including a lack of control of several steps of the internal audit process itself. Hence, the necessary corrective actions were taken in order to meet the requirements of the standards, in particular the formalization of all stages, from the audit program to the implementation, review and follow-up of the corrective actions taken, as well as the implementation of the resources needed to carry out audits within a pre-established schedule. To ensure optimal control of each step, the main concepts of risk management were applied: process approach, root cause analysis, and failure modes, effects and criticality analysis (FMECA). After a critical analysis of our practices, this methodology allowed us to define our "internal audit" process, then to formalize it and follow it up, supported by a complete documentary system.
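FMECA prioritizes failure modes by a criticality score; in its common formulation this is the Risk Priority Number, the product of severity, occurrence, and detectability ratings. A minimal sketch (the 1-10 scales and the example failure mode are illustrative assumptions, not taken from the audit described here):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number, the standard FMECA criticality score.

    Each factor is conventionally rated on a 1-10 scale; higher is
    worse (for detection, 10 means hardest to detect before impact).
    """
    return severity * occurrence * detection

# Hypothetical failure mode for an audit-process step:
# "corrective action not followed up".
print(rpn(severity=7, occurrence=4, detection=5))  # 140
```

Failure modes are then addressed in decreasing order of RPN, which is one way to focus control on the audit-process steps that matter most.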
Depner, Rachel M; Grant, Pei C; Byrwa, David J; Breier, Jennifer M; Lodi-Smith, Jennifer; Luczkiewicz, Debra L; Kerr, Christopher W
2018-05-01
The age demographic of the incarcerated is quickly shifting from young to old. Correctional facilities are responsible for navigating inmate access to healthcare; currently, there is no standardization for access to end-of-life care. There is growing research support for prison-based end-of-life care programs that incorporate inmate peer caregivers as a way to meet the needs of the elderly and dying who are incarcerated. This project aims to (a) describe a prison-based end-of-life program utilizing inmate peer caregivers, (b) identify inmate-caregiver motivations for participation, and (c) analyze the role of building trust and meaningful relationships within the correctional end-of-life care setting. A total of 22 semi-structured interviews were conducted with inmate-caregivers. Data were analyzed using Consensual Qualitative Research methodology. All inmate-caregivers currently participating in the end-of-life peer care program at Briarcliff Correctional Facility were given the opportunity to participate. All participants were male, over the age of 18, and also incarcerated at Briarcliff Correctional Facility, a maximum security, state-level correctional facility. In total, five over-arching and distinct domains emerged; this manuscript focuses on the following three: (a) program description, (b) motivation, and (c) connections with others. Findings suggest that inmate-caregivers believe they provide a unique and necessary adaptation to prison-based end-of-life care resulting in multilevel benefits. These additional perceived benefits go beyond a marginalized group gaining access to patient-centered end-of-life care and include potential inmate-caregiver rehabilitation, correctional medical staff feeling supported, and correctional facilities meeting end-of-life care mandates. Additional research is imperative to work toward greater standardization of and access to end-of-life care for the incarcerated.
15 CFR 30.9 - Transmitting and correcting Electronic Export Information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... in the AES and transmitting any changes to that information as soon as they are known. Corrections, cancellations, or amendments to that information shall be electronically identified and transmitted to the AES... authorized agent has received an error message from AES, the corrections shall take place as required. Fatal...
15 CFR 30.9 - Transmitting and correcting Electronic Export Information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... in the AES and transmitting any changes to that information as soon as they are known. Corrections, cancellations, or amendments to that information shall be electronically identified and transmitted to the AES... authorized agent has received an error message from AES, the corrections shall take place as required. Fatal...
15 CFR 30.9 - Transmitting and correcting Electronic Export Information.
Code of Federal Regulations, 2011 CFR
2011-01-01
... in the AES and transmitting any changes to that information as soon as they are known. Corrections, cancellations, or amendments to that information shall be electronically identified and transmitted to the AES... authorized agent has received an error message from AES, the corrections shall take place as required. Fatal...
15 CFR 30.9 - Transmitting and correcting Electronic Export Information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... in the AES and transmitting any changes to that information as soon as they are known. Corrections, cancellations, or amendments to that information shall be electronically identified and transmitted to the AES... authorized agent has received an error message from AES, the corrections shall take place as required. Fatal...
The King-Devick (K-D) test and concussion diagnosis in semi-professional rugby union players.
Molloy, John H; Murphy, Ian; Gissane, Conor
2017-08-01
To determine the utility of the King-Devick (K-D) test in identifying sports-related concussion in semi-professional rugby players. Descriptive cohort study. 176 male players were recruited from a semi-professional rugby union competition in New Zealand (NZ). Baseline K-D scores were obtained in the pre-season. Post-match K-D and Pitch Side Concussion Assessment Version 2 (PSCA2) scores were obtained in those with suspected concussion. Post-match K-D scores were also administered to selected control players. 19 concussions in 18 players were analysed. In addition, 33 controls were used for analysis. A positive K-D test was identified in 53% of players with concussion post-match. Conversely, a positive test was identified in 33% of controls. The sensitivity and specificity of the K-D test were calculated as 53% and 69% respectively. The positive predictive value and negative predictive value were 48% and 73% respectively. The PSCA2 correctly identified 74% of concussions. The K-D test identified 3 cases not identified by the PSCA2. When the PSCA2 and K-D were combined, 89% of concussions were correctly identified. The K-D test does not appear to be effective if used as a stand-alone test for the diagnosis of concussion. However, if used alongside current side-line cognitive and balance tests, it may assist in more accurately diagnosing sports-related concussion. Further research should look to utilise the K-D test within in-match protocols to establish whether this improves their diagnostic accuracy for sports-related concussion. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
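The four reported accuracy figures follow directly from the 2×2 table of test result versus concussion status. A minimal sketch (the counts below are illustrative reconstructions from the reported 19 concussions and 33 controls, not the study's exact table):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard 2x2 diagnostic-accuracy metrics."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Illustrative counts roughly consistent with the reported rates
# (10 of 19 concussed players positive, 11 of 33 controls positive):
m = diagnostic_metrics(tp=10, fp=11, tn=22, fn=9)
print({k: round(v, 2) for k, v in m.items()})
```

With these assumed counts the values land near the reported 53% sensitivity and 48%/73% predictive values; small differences from the published figures would reflect the actual table.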
Kim, Han Jo; Iyer, Sravisht; Diebo, Basel G; Kelly, Michael P; Sciubba, Daniel; Schwab, Frank; Lafage, Virginie; Mundis, Gregory M; Shaffrey, Christopher I; Smith, Justin S; Hart, Robert; Burton, Douglas; Bess, Shay; Klineberg, Eric O
2018-05-01
Retrospective cohort study. To describe the rate and risk factors for venous thromboembolic events (VTEs; defined as deep venous thrombosis [DVT] and/or pulmonary embolism [PE]) in adult spinal deformity (ASD) surgery. ASD patients with VTE were identified in a prospective, multicenter database. Complications, revision, and mortality rate were examined. Patient demographics, operative details, and radiographic and clinical outcomes were compared with a non-VTE group. A multivariate binary regression model was used to identify predictors of VTE. A total of 737 patients were identified; 32 (4.3%) had VTE (DVT = 14; PE = 18). At baseline, VTE patients were less likely to be employed in jobs requiring physical labor (59.4% vs 79.7%, P < .01) and more likely to have osteoporosis (29% vs 15.1%, P = .037) and liver disease (6.5% vs 1.4%, P = .027). Patients with VTE had a larger preoperative sagittal vertical axis (SVA; 93 mm vs 55 mm, P < .01) and underwent larger SVA corrections. VTE was associated with a combined anterior/posterior approach (45% vs 25%, P = .028). VTE patients had a longer hospital stay (10 vs 7 days, P < .05) and higher mortality rate (6.3% vs 0.7%, P < .01). Multivariate analysis demonstrated that osteoporosis, lack of physical labor, and increased SVA correction were independent predictors of VTE (r² = .11, area under the curve = 0.74, P < .05). The incidence of VTE in ASD is 4.3% with a DVT rate of 1.9% and PE rate of 2.4%. Osteoporosis, lack of physical labor, and increased SVA correction were independent predictors of VTE. Patients with VTE had a higher mortality rate compared with non-VTE patients.
Influences on women's decision making about intrauterine device use in Madagascar.
Gottert, Ann; Jacquin, Karin; Rahaivondrafahitra, Bakoly; Moracco, Kathryn; Maman, Suzanne
2015-04-01
We explored influences on decision making about intrauterine device (IUD) use among women in the Women's Health Project (WHP), managed by Population Services International in Madagascar. We conducted six small group photonarrative discussions (n=18 individuals) and 12 individual in-depth interviews with women who were IUD users and nonusers. All participants had had contact with WHP counselors in three sites in Madagascar. Data analysis involved creating summaries of each transcript, coding in Atlas.ti and then synthesizing findings in a conceptual model. We identified three stages of women's decision making about IUD use, and specific forms of social support that seemed helpful at each stage. During the first stage, receiving correct information from a trusted source such as a counselor conveys IUD benefits and corrects misinformation, but lingering fears about the method often appeared to delay method adoption among interested women. During the second stage, hearing testimony from satisfied users and receiving ongoing emotional support appeared to help alleviate these fears. During the third stage, accompaniment by a counselor or peer seemed to help some women gain confidence to go to the clinic to receive the IUD. Identifying and supplying the types of social support women find helpful at different stages of the decision-making process could help program managers better respond to women's staged decision-making process about IUD use. This qualitative study suggests that women in Madagascar perceive multiple IUD benefits but also fear the method even after misinformation is corrected, leading to a staged decision-making process about IUD use. Programs should identify and supply the types of social support that women find helpful at each stage of decision making. Copyright © 2015 Elsevier Inc. All rights reserved.
de Vries, Paul S; Sabater-Lleal, Maria; Chasman, Daniel I; Trompet, Stella; Ahluwalia, Tarunveer S; Teumer, Alexander; Kleber, Marcus E; Chen, Ming-Huei; Wang, Jie Jin; Attia, John R; Marioni, Riccardo E; Steri, Maristella; Weng, Lu-Chen; Pool, Rene; Grossmann, Vera; Brody, Jennifer A; Venturini, Cristina; Tanaka, Toshiko; Rose, Lynda M; Oldmeadow, Christopher; Mazur, Johanna; Basu, Saonli; Frånberg, Mattias; Yang, Qiong; Ligthart, Symen; Hottenga, Jouke J; Rumley, Ann; Mulas, Antonella; de Craen, Anton J M; Grotevendt, Anne; Taylor, Kent D; Delgado, Graciela E; Kifley, Annette; Lopez, Lorna M; Berentzen, Tina L; Mangino, Massimo; Bandinelli, Stefania; Morrison, Alanna C; Hamsten, Anders; Tofler, Geoffrey; de Maat, Moniek P M; Draisma, Harmen H M; Lowe, Gordon D; Zoledziewska, Magdalena; Sattar, Naveed; Lackner, Karl J; Völker, Uwe; McKnight, Barbara; Huang, Jie; Holliday, Elizabeth G; McEvoy, Mark A; Starr, John M; Hysi, Pirro G; Hernandez, Dena G; Guan, Weihua; Rivadeneira, Fernando; McArdle, Wendy L; Slagboom, P Eline; Zeller, Tanja; Psaty, Bruce M; Uitterlinden, André G; de Geus, Eco J C; Stott, David J; Binder, Harald; Hofman, Albert; Franco, Oscar H; Rotter, Jerome I; Ferrucci, Luigi; Spector, Tim D; Deary, Ian J; März, Winfried; Greinacher, Andreas; Wild, Philipp S; Cucca, Francesco; Boomsma, Dorret I; Watkins, Hugh; Tang, Weihong; Ridker, Paul M; Jukema, Jan W; Scott, Rodney J; Mitchell, Paul; Hansen, Torben; O'Donnell, Christopher J; Smith, Nicholas L; Strachan, David P; Dehghan, Abbas
2017-01-01
An increasing number of genome-wide association (GWA) studies are now using the higher resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In order to assess the improvement of 1000G over HapMap imputation in identifying associated loci, we compared the results of GWA studies of circulating fibrinogen based on the two reference panels. Using both HapMap and 1000G imputation we performed a meta-analysis of 22 studies comprising the same 91,953 individuals. We identified six additional signals using 1000G imputation, while 29 loci were associated using both HapMap and 1000G imputation. One locus identified using HapMap imputation was not significant using 1000G imputation. The genome-wide significance threshold of 5×10⁻⁸ is based on the number of independent statistical tests using HapMap imputation, and 1000G imputation may lead to further independent tests that should be corrected for. When using a stricter Bonferroni correction for the 1000G GWA study (P-value < 2.5×10⁻⁸), the number of loci significant only using HapMap imputation increased to 4 while the number of loci significant only using 1000G decreased to 5. In conclusion, 1000G imputation enabled the identification of 20% more loci than HapMap imputation, although the advantage of 1000G imputation became less clear when a stricter Bonferroni correction was used. More generally, our results provide insights that are applicable to the implementation of other dense reference panels that are under development.
de Vries, Paul S.; Sabater-Lleal, Maria; Chasman, Daniel I.; Trompet, Stella; Kleber, Marcus E.; Chen, Ming-Huei; Wang, Jie Jin; Attia, John R.; Marioni, Riccardo E.; Weng, Lu-Chen; Grossmann, Vera; Brody, Jennifer A.; Venturini, Cristina; Tanaka, Toshiko; Rose, Lynda M.; Oldmeadow, Christopher; Mazur, Johanna; Basu, Saonli; Yang, Qiong; Ligthart, Symen; Hottenga, Jouke J.; Rumley, Ann; Mulas, Antonella; de Craen, Anton J. M.; Grotevendt, Anne; Taylor, Kent D.; Delgado, Graciela E.; Kifley, Annette; Lopez, Lorna M.; Berentzen, Tina L.; Mangino, Massimo; Bandinelli, Stefania; Morrison, Alanna C.; Hamsten, Anders; Tofler, Geoffrey; de Maat, Moniek P. M.; Draisma, Harmen H. M.; Lowe, Gordon D.; Zoledziewska, Magdalena; Sattar, Naveed; Lackner, Karl J.; Völker, Uwe; McKnight, Barbara; Huang, Jie; Holliday, Elizabeth G.; McEvoy, Mark A.; Starr, John M.; Hysi, Pirro G.; Hernandez, Dena G.; Guan, Weihua; Rivadeneira, Fernando; McArdle, Wendy L.; Slagboom, P. Eline; Zeller, Tanja; Psaty, Bruce M.; Uitterlinden, André G.; de Geus, Eco J. C.; Stott, David J.; Binder, Harald; Hofman, Albert; Franco, Oscar H.; Rotter, Jerome I.; Ferrucci, Luigi; Spector, Tim D.; Deary, Ian J.; März, Winfried; Greinacher, Andreas; Wild, Philipp S.; Cucca, Francesco; Boomsma, Dorret I.; Watkins, Hugh; Tang, Weihong; Ridker, Paul M.; Jukema, Jan W.; Scott, Rodney J.; Mitchell, Paul; Hansen, Torben; O'Donnell, Christopher J.; Smith, Nicholas L.; Strachan, David P.
2017-01-01
An increasing number of genome-wide association (GWA) studies are now using the higher resolution 1000 Genomes Project reference panel (1000G) for imputation, with the expectation that 1000G imputation will lead to the discovery of additional associated loci when compared to HapMap imputation. In order to assess the improvement of 1000G over HapMap imputation in identifying associated loci, we compared the results of GWA studies of circulating fibrinogen based on the two reference panels. Using both HapMap and 1000G imputation we performed a meta-analysis of 22 studies comprising the same 91,953 individuals. We identified six additional signals using 1000G imputation, while 29 loci were associated using both HapMap and 1000G imputation. One locus identified using HapMap imputation was not significant using 1000G imputation. The genome-wide significance threshold of 5×10−8 is based on the number of independent statistical tests using HapMap imputation, and 1000G imputation may lead to further independent tests that should be corrected for. When using a stricter Bonferroni correction for the 1000G GWA study (P-value < 2.5×10−8), the number of loci significant only using HapMap imputation increased to 4 while the number of loci significant only using 1000G decreased to 5. In conclusion, 1000G imputation enabled the identification of 20% more loci than HapMap imputation, although the advantage of 1000G imputation became less clear when a stricter Bonferroni correction was used. More generally, our results provide insights that are applicable to the implementation of other dense reference panels that are under development. PMID:28107422
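The two thresholds discussed above are ordinary Bonferroni corrections of a 0.05 family-wise error rate: the conventional genome-wide cutoff of 5×10⁻⁸ corresponds to roughly one million independent common-variant tests, and halving it to 2.5×10⁻⁸ corresponds to doubling the effective number of tests under the denser 1000G panel. As a sketch:

```python
def bonferroni_threshold(alpha, n_tests):
    """Per-test significance threshold controlling the family-wise
    error rate at alpha across n independent tests."""
    return alpha / n_tests

# ~1 million effective tests gives the conventional GWAS threshold:
print(bonferroni_threshold(0.05, 1_000_000))
# Doubling the effective number of tests halves the threshold:
print(bonferroni_threshold(0.05, 2_000_000))
```

This is why a denser imputation panel can simultaneously add discoveries (more variants tested) and remove them (a stricter per-test threshold), as the comparison above illustrates.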
New Swift UVOT data reduction tools and AGN variability studies
NASA Astrophysics Data System (ADS)
Gelbord, Jonathan; Edelson, Rick
2017-08-01
The efficient slewing and flexible scheduling of the Swift observatory have made it possible to conduct monitoring campaigns that are both intensive and prolonged, with multiple visits per day sustained over weeks and months. Recent Swift monitoring campaigns of a handful of AGN provide simultaneous optical, UV and X-ray light curves that can be used to measure variability and interband correlations on timescales from hours to months, providing new constraints for the structures within AGN and the relationships between them. However, the first of these campaigns, thrice-per-day observations of NGC 5548 through four months, revealed anomalous dropouts in the UVOT light curves (Edelson, Gelbord, et al. 2015). We identified the cause as localized regions of reduced detector sensitivity that are not corrected by standard processing. Properly interpreting the light curves required identifying and screening out the affected measurements. We are now using archival Swift data to better characterize these low-sensitivity regions. Our immediate goal is to produce a more complete mapping of their locations so that affected measurements can be identified and screened before further analysis. Our longer-term goal is to build a more quantitative model of the effect in order to define a correction for measured fluxes, if possible, or at least to put limits on the impact upon any observation. We will combine data from numerous background stars in well-monitored fields in order to quantify the strength of the effect as a function of filter as well as location on the detector, and to test for other dependencies such as evolution over time or sensitivity to the count rate of the target. Our UVOT sensitivity maps and any correction tools will be provided to the community of Swift users.
Page, Brent T; Shields, Christine E; Merz, William G; Kurtzman, Cletus P
2006-09-01
This study was designed to compare the identification of ascomycetous yeasts recovered from clinical specimens by using phenotypic assays (PA) and a molecular flow cytometric (FC) method. Large-subunit rRNA domains 1 and 2 (D1/D2) gene sequence analysis was also performed and served as the reference for correct strain identification. A panel of 88 clinical isolates was tested that included representatives of nine commonly encountered species and six infrequently encountered species. The PA included germ tube production, fermentation of seven carbohydrates, morphology on corn meal agar, urease and phenoloxidase activities, and carbohydrate assimilation tests when needed. The FC method (Luminex) employed species-specific oligonucleotides attached to polystyrene beads, which were hybridized with D1/D2 amplicons from the unidentified isolates. The PA identified 81 of 88 strains correctly but misidentified 4 of Candida dubliniensis, 1 of C. bovina, 1 of C. palmioleophila, and 1 of C. bracarensis. The FC method correctly identified 79 of 88 strains and did not misidentify any isolate but did not identify nine isolates because oligonucleotide probes were not available in the current library. The FC assay takes approximately 5 h, whereas the PA takes from 2 h to 5 days for identification. In conclusion, PA did well with the commonly encountered species, was not accurate for uncommon species, and takes significantly longer than the FC method. These data strongly support the potential of FC technology for rapid and accurate identification of medically important yeasts. With the introduction of new antifungals, rapid, accurate identification of pathogenic yeasts is more important than ever for guiding antifungal chemotherapy.
van Veen, S. Q.; Claas, E. C. J.; Kuijper, Ed J.
2010-01-01
Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) is suitable for high-throughput and rapid diagnostics at low costs and can be considered an alternative for conventional biochemical and molecular identification systems in a conventional microbiological laboratory. First, we evaluated MALDI-TOF MS using 327 clinical isolates previously cultured from patient materials and identified by conventional techniques (Vitek-II, API, and biochemical tests). Discrepancies were analyzed by molecular analysis of the 16S genes. Of 327 isolates, 95.1% were identified correctly to genus level, and 85.6% were identified to species level by MALDI-TOF MS. Second, we performed a prospective validation study, including 980 clinical isolates of bacteria and yeasts. Overall performance of MALDI-TOF MS was significantly better than conventional biochemical systems for correct species identification (92.2% and 83.1%, respectively) and produced fewer incorrect genus identifications (0.1% and 1.6%, respectively). Correct species identification by MALDI-TOF MS was observed in 97.7% of Enterobacteriaceae, 92% of nonfermentative Gram-negative bacteria, 94.3% of staphylococci, 84.8% of streptococci, 84% of a miscellaneous group (mainly Haemophilus, Actinobacillus, Cardiobacterium, Eikenella, and Kingella [HACEK]), and 85.2% of yeasts. MALDI-TOF MS had significantly better performance than conventional methods for species identification of staphylococci and genus identification of bacteria belonging to HACEK group. Misidentifications by MALDI-TOF MS were clearly associated with an absence of sufficient spectra from suitable reference strains in the MALDI-TOF MS database. We conclude that MALDI-TOF MS can be implemented easily for routine identification of bacteria (except for pneumococci and viridans streptococci) and yeasts in a medical microbiological laboratory. PMID:20053859
van Veen, S Q; Claas, E C J; Kuijper, Ed J
2010-03-01
Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) is suitable for high-throughput and rapid diagnostics at low costs and can be considered an alternative for conventional biochemical and molecular identification systems in a conventional microbiological laboratory. First, we evaluated MALDI-TOF MS using 327 clinical isolates previously cultured from patient materials and identified by conventional techniques (Vitek-II, API, and biochemical tests). Discrepancies were analyzed by molecular analysis of the 16S genes. Of 327 isolates, 95.1% were identified correctly to genus level, and 85.6% were identified to species level by MALDI-TOF MS. Second, we performed a prospective validation study, including 980 clinical isolates of bacteria and yeasts. Overall performance of MALDI-TOF MS was significantly better than conventional biochemical systems for correct species identification (92.2% and 83.1%, respectively) and produced fewer incorrect genus identifications (0.1% and 1.6%, respectively). Correct species identification by MALDI-TOF MS was observed in 97.7% of Enterobacteriaceae, 92% of nonfermentative Gram-negative bacteria, 94.3% of staphylococci, 84.8% of streptococci, 84% of a miscellaneous group (mainly Haemophilus, Actinobacillus, Cardiobacterium, Eikenella, and Kingella [HACEK]), and 85.2% of yeasts. MALDI-TOF MS had significantly better performance than conventional methods for species identification of staphylococci and genus identification of bacteria belonging to HACEK group. Misidentifications by MALDI-TOF MS were clearly associated with an absence of sufficient spectra from suitable reference strains in the MALDI-TOF MS database. We conclude that MALDI-TOF MS can be implemented easily for routine identification of bacteria (except for pneumococci and viridans streptococci) and yeasts in a medical microbiological laboratory.
Kim, Dokyoon; Basile, Anna O; Bang, Lisa; Horgusluoglu, Emrin; Lee, Seunggeun; Ritchie, Marylyn D; Saykin, Andrew J; Nho, Kwangsik
2017-05-18
Rapid advancement of next generation sequencing technologies such as whole genome sequencing (WGS) has facilitated the search for genetic factors that influence disease risk in the field of human genetics. To identify rare variants associated with human diseases or traits, an efficient genome-wide binning approach is needed. In this study we developed a novel biological knowledge-based binning approach for rare-variant association analysis and then applied the approach to structural neuroimaging endophenotypes related to late-onset Alzheimer's disease (LOAD). For rare-variant analysis, we used the knowledge-driven binning approach implemented in Bin-KAT, an automated tool, that provides 1) binning/collapsing methods for multi-level variant aggregation with a flexible, biologically informed binning strategy and 2) an option of performing unified collapsing and statistical rare variant analyses in one tool. A total of 750 non-Hispanic Caucasian participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort who had both WGS data and magnetic resonance imaging (MRI) scans were used in this study. Mean bilateral cortical thickness of the entorhinal cortex extracted from MRI scans was used as an AD-related neuroimaging endophenotype. SKAT was used for a genome-wide gene- and region-based association analysis of rare variants (MAF (minor allele frequency) < 0.05) and potential confounding factors (age, gender, years of education, intracranial volume (ICV) and MRI field strength) for entorhinal cortex thickness were used as covariates. Significant associations were determined using FDR adjustment for multiple comparisons. Our knowledge-driven binning approach identified 16 functional exonic rare variants in FANCC significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). 
In addition, the approach identified 7 evolutionary conserved regions, which were mapped to FAF1, RFX7, LYPLAL1 and GOLGA3, significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In further analysis, the functional exonic rare variants in FANCC were also significantly associated with hippocampal volume and cerebrospinal fluid (CSF) Aβ 1-42 (p-value < 0.05). Our novel binning approach identified rare variants in FANCC as well as 7 evolutionary conserved regions significantly associated with a LOAD-related neuroimaging endophenotype. FANCC (Fanconi anemia complementation group C) has been shown to modulate TLR and p38 MAPK-dependent expression of IL-1β in macrophages. Our results warrant further investigation in a larger independent cohort and demonstrate that the biological knowledge-driven binning approach is a powerful strategy to identify rare variants associated with AD and other complex disease.
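The FDR adjustment used for the bin-level tests is typically the Benjamini-Hochberg step-up procedure: sort the p-values, compare the i-th smallest against q·i/m, and declare significant every p-value up to the largest one under that line. A minimal sketch (the example p-values are made up; the study's actual adjustment tool is not specified beyond "FDR"):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of p-values significant at FDR level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    ranks = np.arange(1, len(p) + 1)
    below = p[order] <= q * ranks / len(p)
    mask = np.zeros(len(p), dtype=bool)
    if below.any():
        k = below.nonzero()[0].max()   # largest rank passing the BH line
        mask[order[:k + 1]] = True     # step-up: everything below it passes
    return mask

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))  # only the two smallest survive
```

Unlike a Bonferroni correction, this controls the expected fraction of false discoveries rather than the chance of any single one, which is why it is the usual choice for exploratory gene- and region-based scans.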
Plöchl, Michael; Ossandón, José P.; König, Peter
2012-01-01
Eye movements introduce large artifacts to electroencephalographic recordings (EEG) and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded, hence in standard EEG paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods including regression and blind source separation have been proposed. Yet, there is no automated standard procedure established. By simultaneously recording eye movements and 64-channel EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials and eyelid artifacts, and study their interrelations during different types of eye and eyelid movements. In concordance with earlier studies, our results confirm that these artifacts arise from different independent sources and that depending on electrode site, gaze direction, and choice of reference these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm that uses eye tracker information to objectively identify eye-artifact related ICA-components (ICs) in an automated manner. In the data presented here, the algorithm performed very similarly to human experts when the experts were given both the topographies of the ICs and their respective activations in a large number of trials.
Moreover, it performed more reliably and was almost twice as effective as human experts when they had to base their decisions on IC topographies only. Furthermore, a receiver operating characteristic (ROC) analysis demonstrated an optimal balance of false positives and false negatives, with an area under the curve (AUC) of more than 0.99. Removing the automatically detected ICs from the data resulted in removal or substantial suppression of ocular artifacts, including microsaccadic spike potentials, while the relevant neural signal remained unaffected. In conclusion, the present work aims at a better understanding of individual eye movement artifacts, their interrelations and the respective implications for eye artifact correction. Additionally, the proposed ICA-procedure provides a tool for optimized detection and correction of eye movement-related artifact components. PMID:23087632
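The core idea of using concurrent eye-tracker data to label eye-related ICs can be sketched as follows. The abstract does not give the exact criterion, so this sketch substitutes a simple, hypothetical rule: flag any IC whose activation time course correlates with the gaze trace beyond a fixed threshold (the function name, threshold value, and toy data are all assumptions, not the paper's algorithm):

```python
import numpy as np

def flag_ocular_ics(ic_activations, gaze_signal, threshold=0.3):
    """Flag ICs whose time course correlates with eye-tracker gaze.

    ic_activations : (n_ics, n_samples) array of IC time courses
    gaze_signal    : (n_samples,) horizontal or vertical gaze trace
    threshold      : absolute-correlation cutoff (hypothetical value)
    """
    flagged = []
    for i, activation in enumerate(ic_activations):
        r = np.corrcoef(activation, gaze_signal)[0, 1]
        if abs(r) > threshold:
            flagged.append(i)
    return flagged

# Toy data: IC 0 tracks gaze, IC 1 is unrelated noise.
rng = np.random.default_rng(0)
gaze = np.sin(np.linspace(0, 20, 1000))
ics = np.vstack([gaze + 0.2 * rng.standard_normal(1000),
                 rng.standard_normal(1000)])
print(flag_ocular_ics(ics, gaze))  # flags IC 0, the gaze-driven component
```

The flagged ICs would then be zeroed before back-projection, which is how ICA-based cleaning removes ocular components while leaving the remaining neural sources untouched.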
Sokolenko, Stanislav; Aucoin, Marc G
2015-09-04
The growing ubiquity of metabolomic techniques has facilitated high-frequency time-course data collection for an increasing number of applications. While the concentration trends of individual metabolites can be modeled with common curve fitting techniques, a more accurate representation of the data needs to consider effects that act on more than one metabolite in a given sample. To this end, we present a simple algorithm that uses nonparametric smoothing carried out on all observed metabolites at once to identify and correct systematic error from dilution effects. In addition, we develop a simulation of metabolite concentration time-course trends to supplement available data and explore algorithm performance. Although we focus on nuclear magnetic resonance (NMR) analysis in the context of cell culture, a number of possible extensions are discussed. Realistic metabolic data were successfully simulated using a four-step process. Starting with a set of metabolite concentration time-courses from a metabolomic experiment, each time-course was classified as increasing, decreasing, concave, or approximately constant. Trend shapes were simulated from generic functions corresponding to each classification. The resulting shapes were then scaled to simulated compound concentrations. Finally, the scaled trends were perturbed using a combination of random and systematic errors. To detect systematic errors, a nonparametric fit was applied to each trend and percent deviations calculated at every time-point. Systematic errors could be identified at time-points where the median percent deviation exceeded a threshold value, determined by the choice of smoothing model and the number of observed trends. Regardless of model, increasing the number of observations over a time-course resulted in more accurate error estimates, although the improvement was not particularly large between 10 and 20 samples per trend.
The presented algorithm was able to identify systematic errors as small as 2.5% under a wide range of conditions. Both the simulation framework and the error correction method represent examples of time-course analysis that can be applied to further developments in ¹H-NMR methodology and the more general application of quantitative metabolomics.
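The detection step described above — smooth every metabolite trend, express residuals as percent deviations, and flag time-points where the median deviation across all metabolites exceeds a threshold — can be sketched as follows. This is an illustrative reimplementation: the polynomial smoother, the toy trends, and the injected 10% dilution-like error are assumptions, not the paper's actual smoother or data.

```python
import numpy as np

def flag_systematic_errors(time, trends, threshold=2.5, deg=3):
    """Flag time-points where the median percent deviation across all
    metabolite trends exceeds `threshold` (in %). A low-order polynomial
    fit stands in here for the paper's nonparametric smoother."""
    devs = []
    for y in trends:                                    # one array per metabolite
        fit = np.polyval(np.polyfit(time, y, deg), time)
        devs.append(100.0 * np.abs(y - fit) / fit)
    med = np.median(np.array(devs), axis=0)
    return np.where(med > threshold)[0]

# Toy trends matching the paper's shape classes: increasing, decreasing, concave
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 20)
trends = [1 + 0.5 * t, 10 - 0.5 * t, 5 + 2 * t - 0.2 * t**2]
trends = [y + rng.normal(0, 0.01, t.size) for y in trends]
for y in trends:
    y[7] *= 1.10   # dilution-like error hitting every metabolite at one time-point
```

A dilution error affects all compounds in a sample by the same factor, which is exactly why the median across trends isolates it from compound-specific noise.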
Jaimes-Bautista, A G; Rodríguez-Camacho, M; Martínez-Juárez, I E; Rodríguez-Agudelo, Y
2017-08-29
Patients with temporal lobe epilepsy (TLE) perform poorly on semantic verbal fluency (SVF) tasks. Completing these tasks successfully involves multiple cognitive processes simultaneously. Therefore, quantitative analysis of SVF (number of correct words in one minute), conducted in most studies, has been found to be insufficient to identify cognitive dysfunction underlying SVF difficulties in TLE. To determine whether a sample of patients with TLE had SVF difficulties compared with a control group (CG), and to identify the cognitive components associated with SVF difficulties using quantitative and qualitative analysis. SVF was evaluated in 25 patients with TLE and 24 healthy controls; the semantic verbal fluency test included 5 semantic categories: animals, fruits, occupations, countries, and verbs. All 5 categories were analysed quantitatively (number of correct words per minute and interval of execution: 0-15, 16-30, 31-45, and 46-60 seconds); the categories animals and fruits were also analysed qualitatively (clusters, cluster size, switches, perseverations, and intrusions). Patients generated fewer words for all categories and intervals and fewer clusters and switches for animals and fruits than the CG (P<.01). Differences between groups were not significant in terms of cluster size and number of intrusions and perseverations (P>.05). Our results suggest an association between SVF difficulties in TLE and difficulty activating semantic networks, impaired strategic search, and poor cognitive flexibility. Attention, inhibition, and working memory are preserved in these patients. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Sensitivity and Specificity of Eustachian Tube Function Tests in Adults
Doyle, William J.; Swarts, J. Douglas; Banks, Julianne; Casselbrant, Margaretha L; Mandel, Ellen M; Alper, Cuneyt M.
2013-01-01
Objective Determine if Eustachian Tube (ET) function (ETF) tests can identify ears with physician-diagnosed ET dysfunction (ETD) in a mixed population at high sensitivity and specificity, and define the inter-relatedness of ETF test parameters. Methods ETF was evaluated using the Forced-Response, Inflation-Deflation, Valsalva, and Sniffing tests in 15 control ears of adult subjects after unilateral myringotomy (Group I) and in 23 ears of 19 adult subjects with ventilation tubes inserted for ETD (Group II). Data were analyzed using logistic regression including each parameter independently and then a step-down Discriminant Analysis including all ETF test parameters to predict group assignment. Factor Analysis operating over all parameters was used to explore relatedness. Results The Discriminant Analysis identified 4 ETF test parameters (Valsalva, ET opening pressure, dilatory efficiency, and % positive pressure equilibrated) that together correctly assigned ears to Group II at a sensitivity of 95% and a specificity of 83%. Individual parameters representing the efficiency of ET opening during swallowing showed moderately accurate assignments of ears to their respective groups. Three factors captured approximately 98% of the variance among parameters: the first had negative loadings of the ETF structural parameters; the second had positive loadings of the muscle-assisted ET opening parameters; and the third had negative loadings of the muscle-assisted ET opening parameters and positive loadings of the structural parameters. Discussion These results show that ETF tests can correctly assign individual ears to physician-diagnosed ETD with high sensitivity and specificity and that ETF test parameters can be grouped into structural-functional categories. PMID:23868429
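A minimal sketch of this kind of discriminant analysis with sensitivity/specificity reporting, on hypothetical data: the feature distributions, group sizes, and effect size below are invented for illustration and are unrelated to the study's actual ETF measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Hypothetical data: four ETF test parameters per ear, shifted in the ETD group
control = rng.normal(0.0, 1.0, size=(60, 4))
etd = rng.normal(1.5, 1.0, size=(60, 4))
X = np.vstack([control, etd])
y = np.array([0] * 60 + [1] * 60)           # 1 = physician-diagnosed ETD

model = LinearDiscriminantAnalysis().fit(X, y)
pred = model.predict(X)

# Sensitivity: fraction of ETD ears correctly assigned; specificity: same for controls
sensitivity = ((pred == 1) & (y == 1)).sum() / (y == 1).sum()
specificity = ((pred == 0) & (y == 0)).sum() / (y == 0).sum()
```

In practice, as in the study, a step-down procedure would prune parameters, and performance should be assessed out of sample rather than on the training ears as done here for brevity.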
Hong, Jae-Young; Suh, Seung-Woo; Modi, Hitesh N; Yang, Jae-Hyuk; Park, Si-Young
2013-06-01
To identify factors that can affect postoperative shoulder balance in adolescent idiopathic scoliosis (AIS), 89 AIS patients with six types of curvatures who underwent surgery were included in this study. Whole-spine antero-posterior and lateral radiographs were obtained pre- and postoperatively. On the radiographs, the shape of and changes in the curvatures were analyzed. In addition, four shoulder parameters and coronal balance were analyzed in an effort to identify factors significantly related to postoperative shoulder balance. In general, all four shoulder parameters (CHD, CA, CRID, RSH) were slightly increased at final follow-up (t test, P < 0.05), although there was a decrease in Lenke type II and IV curvatures. However, pre- and postoperative shoulder parameters were not significantly different between curvature types (ANOVA, P > 0.05). Moreover, there were no significant differences in pre- and postoperative shoulder level between groups with different proximal fusion levels (ANOVA, P > 0.05). In the analysis of coronal curvature changes, no difference in any individual coronal curvature was observed between the improved and aggravated shoulder balance groups (P > 0.05). However, the middle-to-distal curve change ratio was significantly lower in patients with aggravated shoulder balance (P < 0.05). In addition, patients with smaller preoperative shoulder imbalance showed a higher chance of aggravation after surgery with similar postoperative changes (P < 0.05). Significant relations were found between the correction rates of the middle and distal curvatures and postoperative shoulder balance. In addition, the preoperative shoulder level difference can be a determinant of postoperative shoulder balance.
The Canadian Precipitation Analysis (CaPA): Evaluation of the statistical interpolation scheme
NASA Astrophysics Data System (ADS)
Evans, Andrea; Rasmussen, Peter; Fortin, Vincent
2013-04-01
CaPA (Canadian Precipitation Analysis) is a data assimilation system that employs statistical interpolation to combine observed precipitation with gridded precipitation fields produced by Environment Canada's Global Environmental Multiscale (GEM) model into a final gridded precipitation analysis. Precipitation is important in many fields and applications, including agricultural water management projects, flood control programs, and hydroelectric power generation planning. Precipitation is a key input to hydrological models, and there is a desire to have access to the best available information about precipitation in time and space. The principal goal of CaPA is to produce this type of information. In order to perform the necessary statistical interpolation, CaPA requires the estimation of a semi-variogram. This semi-variogram is used to describe the spatial correlations between precipitation innovations, defined as the observed precipitation amounts minus the GEM amounts forecast at the observation locations. Currently, CaPA uses a single isotropic variogram across the entire analysis domain. The present project investigates the implications of this choice by first conducting a basic variographic analysis of precipitation innovation data across the Canadian prairies, with specific interest in identifying and quantifying potential anisotropy within the domain. This focus is further expanded by identifying the effect of storm type on the variogram. The ultimate goal of the variographic analysis is to develop improved semi-variograms for CaPA that better capture the spatial complexities of precipitation over the Canadian prairies. CaPA presently applies a Box-Cox data transformation to both the observations and the GEM data, prior to the calculation of the innovations. The data transformation is necessary to satisfy the normal distribution assumption, but introduces a significant bias.
The second part of the investigation aims at devising a bias correction scheme based on a moving-window averaging technique. For both the variogram and bias correction components of this investigation, a series of trial runs are conducted to evaluate the impact of these changes on the resulting CaPA precipitation analyses.
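The statistical interpolation in CaPA rests on an empirical semi-variogram of the innovations. A minimal sketch of the classical (Matheron) estimator on a toy station grid follows; the coordinates, field values, and lag bins are illustrative assumptions, not CaPA's actual configuration.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical (Matheron) estimator: for each lag bin, average
    0.5 * (z_i - z_j)^2 over all station pairs whose separation
    distance falls in the bin."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    half_sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)    # each pair counted once
    d, s = dist[i, j], half_sq[i, j]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(s[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Toy "innovation" field with a smooth spatial trend on a 10x10 station grid
gx, gy = np.meshgrid(np.arange(10.0), np.arange(10.0))
coords = np.column_stack([gx.ravel(), gy.ravel()])
values = 0.5 * coords[:, 0] + 0.3 * coords[:, 1]
gamma = empirical_semivariogram(coords, values, np.array([0, 2, 4, 6, 8]))
```

Anisotropy, the focus of the project above, would be probed by binning pairs by direction as well as by distance; the isotropic estimator here pools all directions.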
Corrective responses in human food intake identified from an analysis of 7-d food-intake records
Bray, George A; Flatt, Jean-Pierre; Volaufova, Julia; DeLany, James P; Champagne, Catherine M
2009-01-01
Background We tested the hypothesis that ad libitum food intake shows corrective responses over periods of 1–5 d. Design This was a prospective study of food intake in women. Methods Two methods, a weighed food intake and a measured food intake, were used to determine daily nutrient intake during 2 wk in 20 women. Energy expenditure was measured by doubly labeled water contemporaneously with the weighed food-intake record. The daily deviations in macronutrient and energy intake from the average 7-d values were compared with the deviations observed 1, 2, 3, 4, and 5 d later to estimate the corrective responses. Results Both methods of recording food intake gave similar patterns of macronutrient and total energy intakes and of deviations from average intakes. The intraindividual CVs for energy intake ranged from ±12% to ±47%, with an average of ±25%. Reported energy intake was 85.5–95.0% of total energy expenditure determined by doubly labeled water. Significant corrective responses were observed in food intakes with a 3- to 4-d lag; these disappeared when data were randomized within each subject. Conclusions Human beings show corrective responses to deviations from average energy and macronutrient intakes with a lag time of 3–4 d, but not 1–2 d. This suggests that short-term studies may fail to recognize important signals of food-intake regulation that operate over several days. These corrective responses probably play a crucial role in bringing about weight stability. PMID:19064509
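The lagged corrective-response idea — compare each day's deviation from the subject's average intake with the deviation a few days later — can be sketched as a lagged correlation. The simulated "eater" below, who compensates for over- or under-eating three days later, is an invented illustration, not the study's data or its statistical method.

```python
import numpy as np

def lag_response(intake, lag):
    """Correlation between each day's deviation from mean intake and the
    deviation `lag` days later; negative values indicate correction."""
    dev = intake - intake.mean()
    return np.corrcoef(dev[:-lag], dev[lag:])[0, 1]

# Simulated eater who compensates for over/under-eating three days later
rng = np.random.default_rng(4)
days = 300
intake = np.full(days, 2000.0)          # kcal/d baseline (illustrative)
for d in range(3, days):
    intake[d] = 2000.0 - 0.6 * (intake[d - 3] - 2000.0) + rng.normal(0, 150)
```

The study's shuffling control corresponds to randomizing `intake` within the subject, which destroys the lag-3 structure while preserving the marginal distribution.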
Analysis of routine cytopathologic reports in 1,598 histologically verified benign breast lesions.
Pogacnik, A; Us-Krasovec, M
2004-02-01
This retrospective study was designed to evaluate the accuracy of cytopathologic diagnosis and of correct classification of benign breast diseases. A total of 1,598 FNABs were identified as meeting the study criteria; of these, 1,258 (78.7%) cases were cytologically benign, 88 (5.5%) suspicious, and 3 (0.18%) false-positive, and in 249 (15.6%) cases an inadequate sample was obtained. A specific diagnosis was made in 847/1,258 (67.3%) cases; the other 411 were diagnosed as benign NOS. Of the 847 specific FNAB diagnoses, 451 were fibroadenomas, 27 phyllodes tumors, 289 fibrocystic diseases, 4 proliferative fibrocystic diseases, 38 papillomas, 22 fat necrosis, 9 mastitis, 1 pseudolymphoma, 2 lipomas, 2 duct ectasias, and 2 atheromas. In our study group, the cytopathologic diagnosis of benign breast diseases, excluding unsatisfactory aspirates, was correct in 93% of cases. A specific diagnosis was correct on average in 50% of cases; only for fibroadenoma (FA) was accuracy over 60%, and in adequately sampled tumors the predictive value of FA was 86.2%. Copyright 2004 Wiley-Liss, Inc.
Jiang, Hongzhen; Zhao, Jianlin; Di, Jianglei; Qin, Chuan
2009-10-12
We propose an effective reconstruction method for correcting the joint misplacement of sub-holograms caused by the displacement error of the CCD in spatial synthetic aperture digital Fresnel holography. For every two adjacent sub-holograms along the motion path of the CCD, we reconstruct the corresponding holographic images under different joint distances between the sub-holograms and then determine the accurate joint distance by evaluating the quality of the corresponding synthetic reconstructed images. The accurate relative position relationships of the sub-holograms can then be confirmed from all of the identified joint distances, with which the accurate synthetic reconstructed image can be obtained by superposing the reconstruction results of the sub-holograms. The numerical reconstruction results are in agreement with the theoretical analysis. Compared with the traditional reconstruction method, this method not only corrects the joint misplacement of the sub-holograms without being limited by the actual overlap between adjacent sub-holograms, but also brings the joint precision of the sub-holograms to sub-pixel accuracy.
Quinn, Kieran L; Crystal, Eugene; Lashevsky, Ilan; Arouny, Banafsheh; Baranchuk, Adrian
2016-07-01
We have previously developed a novel digital tool capable of automatically recognizing correct electrocardiography (ECG) diagnoses in an online exam and demonstrated a significant improvement in diagnostic accuracy when utilizing an inductive-deductive reasoning strategy over a pattern recognition strategy. In this study, we sought to validate these findings with participants at the International Winter Arrhythmia School meeting, one of the foremost electrophysiology events in Canada. Preregistration for the event was sent by e-mail. The exam was administered on day 1 of the conference. Results and analysis were presented to participants the following morning. Twenty-five attendees completed the exam, providing a total of 500 responses to be marked. The online tool automatically identified 195 of a total of 395 correct responses (49%). In total, 305 responses required secondary manual review, of which 200 were added to the correct responses pool. The overall accuracy of correct ECG diagnosis for all participants was 69% when using a pattern recognition strategy and 84% when using an inductive-deductive strategy. Utilization of a novel digital tool to evaluate ECG competency can be set up as a workshop at international meetings or educational events, and results can be presented during the sessions to ensure immediate feedback. © 2015 Wiley Periodicals, Inc.
Ongoing transmission of hepatitis B virus infection among inmates at a state correctional facility.
Khan, Amy J; Simard, Edgar P; Bower, William A; Wurtzel, Heather L; Khristova, Marina; Wagner, Karla D; Arnold, Kathryn E; Nainan, Omana V; LaMarre, Madeleine; Bell, Beth P
2005-10-01
We sought to determine hepatitis B virus (HBV) infection prevalence, associated exposures, and incidence among male inmates at a state correctional facility. A cross-sectional serological survey was conducted in June 2000, and susceptible inmates were retested in June 2001. At baseline, 230 inmates (20.5%; 95% confidence interval [CI]=18.2%, 22.9%) exhibited evidence of HBV infection, including 11 acute and 11 chronic infections. Inmates with HBV infection were more likely than susceptible inmates to have injected drugs (38.8% vs 18.0%; adjusted prevalence odds ratio [OR]=3.0; 95% CI=1.9, 4.9), to have had more than 25 female sex partners (27.7% vs 17.5%; adjusted prevalence OR=2.0; 95% CI=1.4, 3.0), and to have been incarcerated for more than 14 years (38.4% vs 17.6%; adjusted prevalence OR=1.7; 95% CI=1.1, 2.6). One year later, 18 (3.6%) showed evidence of new HBV infection. Among 19 individuals with infections, molecular analysis identified 2 clusters involving 10 inmates, each with a unique HBV sequence. We documented ongoing HBV transmission at a state correctional facility. Similar transmission may occur at other US correctional facilities and could be prevented by vaccination of inmates.
Celebi, N; Zwirner, K; Lischner, U; Bauder, M; Ditthard, K; Schürger, S; Riessen, R; Engel, C; Balletshofer, B; Weyrich, P
2012-04-01
Ultrasound is a widely used diagnostic tool. In medical education, it can be used to teach sonographic anatomy as well as the basics of ultrasound diagnostics. Some medical schools have begun implementing student-tutor-led teaching sessions in sonographic abdominal anatomy in order to meet the growing demand for ultrasound teaching. However, while this teaching concept has proven to be feasible and well accepted, there are limited data regarding its effectiveness. We investigated whether student tutors teach sonographic anatomy as effectively as faculty staff sonographers. Fifty medical students were randomly assigned to one of two groups; 46 of these could be included in the analysis. One group was taught by student tutors (ST) and the other by a faculty staff sonographer (FS). Using a pre-/post-test design, students were required to locate and label 15 different abdominal structures. They printed out three pictures in three minutes and subsequently labeled the structures they were able to identify. The pictures were then rated by two blinded faculty staff sonographers. A mean difference of one point in the improvement of correctly identified abdominal structures between the pre-test and post-test between the two groups was regarded as equivalent. In the pre-test, the ST (FS) group correctly identified 1.6 ± 1.0 (2.0 ± 1.1) structures. Both the ST and FS groups improved in the post-test, correctly identifying 7.8 ± 2.8 vs. 8.9 ± 2.9 structures, respectively (p < .0001 each). Comparing the improvement of the ST group (6.2 ± 2.8 structures) with that of the FS group (6.9 ± 3.2) showed equivalent results between the two groups (p < .05, testing for equivalence). Basic abdominal sonographic anatomy can be taught effectively by student tutors. © Georg Thieme Verlag KG Stuttgart · New York.
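The conclusion above rests on an equivalence test of the two groups' mean improvements. One common way to do this is the two one-sided tests (TOST) procedure, sketched below on simulated scores. The margin of 1.5 structures, the sample sizes, and the group parameters are illustrative assumptions (the study used a one-point margin and its own test, which are not reproduced here).

```python
import numpy as np
from scipy import stats

def tost_equivalence(a, b, margin):
    """Two one-sided t-tests (TOST): the groups are declared equivalent when
    both one-sided p-values are small, i.e. when the returned maximum
    p-value is below alpha. Uses a simple unpooled standard error."""
    diff = np.mean(a) - np.mean(b)
    se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
    df = len(a) + len(b) - 2
    p_low = stats.t.sf((diff + margin) / se, df)    # H0: diff <= -margin
    p_high = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
    return max(p_low, p_high)

# Simulated improvement scores for two groups (values are illustrative)
rng = np.random.default_rng(5)
st = rng.normal(6.2, 2.8, 300)    # student-tutor group
fs = rng.normal(6.3, 3.0, 300)    # faculty-sonographer group
p = tost_equivalence(st, fs, margin=1.5)
```

The logic is the reverse of an ordinary t-test: a small p-value here means the difference is demonstrably inside the equivalence margin, not that a difference exists.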
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebelmann, K.L.
1990-01-01
Following the detection of chlorinated volatile organic compounds in the groundwater beneath the SDA in the summer of 1987, hydrogeological characterization of the Radioactive Waste Management Complex (RWMC), Idaho National Engineering Laboratory (INEL), was required by the Resource Conservation and Recovery Act (RCRA). The waste site, the Subsurface Disposal Area (SDA), is the subject of a RCRA Corrective Action Program. Regulatory requirements for the Corrective Action Program dictate a phased approach to evaluation of the SDA. In the first phase of the program, the SDA is the subject of a RCRA Facility Investigation (RFI), which will obtain information to fully characterize the physical properties of the site, determine the nature and extent of contamination, and identify pathways for migration of contaminants. If the need for corrective measures is identified during the RFI, a Corrective Measures Study (CMS) will be performed as the second phase. Information generated during the RFI will be used to aid in the selection and implementation of appropriate corrective measures to correct the release. Following the CMS, the final phase is the implementation of the selected corrective measures. 4 refs., 1 fig.
Identification accuracy of children versus adults: a meta-analysis.
Pozzulo, J D; Lindsay, R C
1998-10-01
Identification accuracy of children and adults was examined in a meta-analysis. Preschoolers (M = 4 years) were less likely than adults to make correct identifications. Children over the age of 5 did not differ significantly from adults with regard to correct identification rate. Children of all ages examined were less likely than adults to correctly reject a target-absent lineup. Even adolescents (M = 12-13 years) did not reach an adult rate of correct rejection. Compared to simultaneous lineup presentation, sequential lineups increased the child-adult gap for correct rejections. Providing child witnesses with identification practice or training did not increase their correct rejection rates. Suggestions for children's inability to correctly reject target-absent lineups are discussed. Future directions for identification research are presented.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... NUCLEAR REGULATORY COMMISSION [NRC-2011-0254] Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request for comment; correction. SUMMARY: This document corrects a notice appearing...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wickline, Alfred
2004-04-01
This Corrective Action Decision Document (CADD) has been prepared for Corrective Action Unit (CAU) 204 Storage Bunkers, Nevada Test Site (NTS), Nevada, in accordance with the ''Federal Facility Agreement and Consent Order'' (FFACO) that was agreed to by the State of Nevada; U.S. Department of Energy (DOE); and the U.S. Department of Defense (FFACO, 1996). The NTS is approximately 65 miles (mi) north of Las Vegas, Nevada (Figure 1-1). The Corrective Action Sites (CASs) within CAU 204 are located in Areas 1, 2, 3, and 5 of the NTS, in Nye County, Nevada (Figure 1-2). Corrective Action Unit 204 comprises the six CASs identified in Table 1-1. As shown in Table 1-1, the FFACO describes four of these CASs as bunkers, one as chemical exchange storage, and one as a blockhouse. Subsequent investigations have identified four of these structures as instrumentation bunkers (CASs 01-34-01, 02-34-01, 03-34-01, 05-33-01), one as an explosives storage bunker (CAS 05-99-02), and one as both (CAS 05-18-02). The six bunkers included in CAU 204 were primarily used to monitor atmospheric testing or store munitions. The ''Corrective Action Investigation Plan (CAIP) for Corrective Action Unit 204: Storage Bunkers, Nevada Test Site, Nevada'' (NNSA/NV, 2002a) provides information relating to the history, planning, and scope of the investigation; therefore, it will not be repeated in this CADD. This CADD identifies potential corrective action alternatives and provides a rationale for the selection of a recommended corrective action alternative for each CAS within CAU 204. The evaluation of corrective action alternatives is based on process knowledge and the results of investigative activities conducted in accordance with the CAIP (NNSA/NV, 2002a) that was approved prior to the start of the Corrective Action Investigation (CAI). Record of Technical Change (ROTC) No. 1 to the CAIP (approval pending) documents changes to the preliminary action levels (PALs) agreed to by the Nevada Division of Environmental Protection (NDEP) and DOE, National Nuclear Security Administration Nevada Site Office (NNSA/NSO). This ROTC specifically discusses the radiological PALs and their application to the findings of the CAU 204 corrective action investigation.
Correction for spatial averaging in laser speckle contrast analysis
Thompson, Oliver; Andrews, Michael; Hirst, Evan
2011-01-01
Practical laser speckle contrast analysis systems face a problem of spatial averaging of speckles, due to the pixel size in the cameras used. Existing practice is to use a system factor in speckle contrast analysis to account for spatial averaging. The linearity of the system factor correction has not previously been confirmed. The problem of spatial averaging is illustrated using computer simulation of time-integrated dynamic speckle, and the linearity of the correction confirmed using both computer simulation and experimental results. The valid linear correction allows various useful compromises in the system design. PMID:21483623
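The spatial-averaging effect discussed above is easy to demonstrate numerically: speckle contrast is K = σ/μ, and integrating several speckles per detector pixel lowers the measured K. A minimal sketch, assuming statistically independent, exponentially distributed pixel intensities (a simplification — real speckle patterns are spatially correlated, which is exactly what the system factor must account for):

```python
import numpy as np

rng = np.random.default_rng(2)
# Fully developed, resolved speckle: intensity is exponentially distributed,
# for which the ideal contrast K = sigma/mean equals 1.
speckle = rng.exponential(1.0, size=(256, 256))

def contrast(img):
    """Speckle contrast K = standard deviation / mean of the intensity."""
    return img.std() / img.mean()

# Spatial averaging: each detector pixel integrates a 2x2 block of speckles
binned = speckle.reshape(128, 2, 128, 2).mean(axis=(1, 3))
K_full = contrast(speckle)    # close to the ideal value of 1
K_binned = contrast(binned)   # reduced by averaging, roughly 0.5 here
```

For N independent averaged speckles the contrast drops as 1/sqrt(N); correlated speckle falls between this and the unaveraged case, which is what a calibrated system factor corrects for.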
Fractography and estimates of fracture origin size from fracture mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, G.D.; Swab, J.J.
1996-12-31
Fracture mechanics should be used routinely in fractographic analyses in order to verify that the correct feature has been identified as the fracture origin. This was highlighted in a recent Versailles Advanced Materials and Standards (VAMAS) fractographic analysis round robin. The practice of using fracture mechanics as an aid to fractographic interpretation is codified in a new ASTM Standard Practice. Conversely, very good estimates of fracture toughness often come from fractographic analysis of strength-tested specimens. In many instances, however, the calculated flaw size differs from the empirically measured flaw size. This paper reviews the factors which may cause these discrepancies.
Kuchenbecker, J; Blum, M; Paul, F
2016-03-01
In acute unilateral optic neuritis (ON), color vision defects combined with a decrease in visual acuity and contrast sensitivity frequently occur. This study investigated whether a web-based color vision test is a reliable detector of acquired color vision defects in ON and, if so, which charts are particularly suitable. In 12 patients with acute unilateral ON, a web-based color vision test (www.farbsehtest.de) with 25 color plates (16 Velhagen/Broschmann and 9 Ishihara color plates) was performed. For each patient, the affected eye was tested first, followed by the unaffected eye. The mean best-corrected distance visual acuity (BCDVA) was 0.36 ± 0.20 in the ON eye and 1.0 ± 0.1 in the contralateral eye. The number of incorrectly read plates correlated with the visual acuity. For the ON eye, a total of 134 plates were correctly identified and 166 plates were incorrectly identified, while for the disease-free fellow eye, 276 plates were correctly identified and 24 plates were incorrectly identified. The two blue/yellow plates were identified correctly 14 times and incorrectly 10 times with the ON eye, and exclusively correctly (24 times) with the fellow eye. The Velhagen/Broschmann plates were incorrectly identified significantly more frequently than the Ishihara plates. In 4 out of 16 Velhagen/Broschmann plates and 5 out of 9 Ishihara plates, no statistically significant differences between the ON eye and the fellow eye could be detected. The number of incorrectly identified plates correlated with a decrease in visual acuity. Red/green and blue/yellow plates were incorrectly identified significantly more frequently with the ON eye, and the Velhagen/Broschmann color plates were incorrectly identified significantly more frequently than the Ishihara color plates. Thus, under defined test conditions, the web-based color vision test can also be used to detect acquired color vision defects, such as those caused by ON.
Optimization of the test by altering the combination of plates may be a useful next step.
Systematic Analysis Of Ocean Colour Uncertainties
NASA Astrophysics Data System (ADS)
Lavender, Samantha
2013-12-01
This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel standard atmospheric correction, with its Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. Analysis of the atmospheric by-products showed that they yield important information about the separation of the atmospheric and in-water signals, helping to signpost possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis so that the methodology can be implemented within an operational system.
Error Detection/Correction in Collaborative Writing
ERIC Educational Resources Information Center
Pilotti, Maura; Chodorow, Martin
2009-01-01
In the present study, we examined error detection/correction during collaborative writing. Subjects were asked to identify and correct errors in two contexts: a passage written by the subject (familiar text) and a passage written by a person other than the subject (unfamiliar text). A computer program inserted errors in function words prior to the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-16
... Insurance; and Program No. 93.774, Medicare-- Supplementary Medical Insurance Program) Dated: November 9...: Correction notice. SUMMARY: This document corrects a technical error that appeared in the notice published in... of July 22, 2010 (75 FR 42836), there was a technical error that we are identifying and correcting in...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
..., Medicare--Hospital Insurance; and Program No. 93.774, Medicare-- Supplementary Medical Insurance Program.... SUMMARY: This document corrects a typographical error that appeared in the notice published in the Federal... typographical error that is identified and corrected in the Correction of Errors section below. II. Summary of...
Neutron Capture and the Antineutrino Yield from Nuclear Reactors.
Huber, Patrick; Jaffke, Patrick
2016-03-25
We identify a new, flux-dependent correction to the antineutrino spectrum as produced in nuclear reactors. The abundance of certain nuclides, whose decay chains produce antineutrinos above the threshold for inverse beta decay, has a nonlinear dependence on the neutron flux, unlike the vast majority of antineutrino-producing nuclides, whose decay rate is directly related to the fission rate. We have identified four of these so-called nonlinear nuclides and determined that they result in an antineutrino excess at energies below 3.2 MeV, dependent on the reactor thermal neutron flux. We develop an analytic model for the size of the correction and compare it to the results of detailed reactor simulations for various existing reactors, spanning 3 orders of magnitude in neutron flux. In a typical pressurized water reactor the resulting correction can reach ∼0.9% of the low-energy flux, which is comparable in size to other known low-energy corrections from spent nuclear fuel and the nonequilibrium correction. For naval reactors the nonlinear correction may reach the 5% level by the end of cycle.
Ruiz-Aragón, Jesús; Ballestero-Téllez, Mónica; Gutiérrez-Gutiérrez, Belén; de Cueto, Marina; Rodríguez-Baño, Jesús; Pascual, Álvaro
2017-10-27
The rapid identification of bacteraemia-causing pathogens could assist clinicians in the timely prescription of targeted therapy, thereby reducing the morbidity and mortality of this infection. In recent years, numerous techniques that rapidly and directly identify positive blood cultures have been marketed, with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) being one of the most commonly used. The aim of this systematic review and meta-analysis was to evaluate the accuracy of MALDI-TOF (Bruker ® ) for the direct identification of positive blood culture bottles. A meta-analysis was performed to summarize the results of the 32 studies evaluated. The overall quality of the studies was moderate. For Gram-positive bacteria, overall rates of correct identification of the species ranged from 0.17 to 0.98, with a cumulative rate (random-effects model) of 0.72 (95% CI: 0.64-0.80). For Gram-negative bacteria, correct identification rates ranged from 0.66 to 1.00, with a cumulative effect of 0.92 (95% CI: 0.88-0.95). For Enterobacteriaceae, the rate was 0.96 (95% CI: 0.94-0.97). MALDI-TOF mass spectrometry shows high accuracy for the correct identification of Gram-negative bacteria, particularly Enterobacteriaceae, directly from positive blood culture bottles, and moderate accuracy for the identification of Gram-positive bacteria (low for some species). Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
The response of numerical weather prediction analysis systems to FGGE 2b data
NASA Technical Reports Server (NTRS)
Hollingsworth, A.; Lorenc, A.; Tracton, S.; Arpe, K.; Cats, G.; Uppala, S.; Kallberg, P.
1985-01-01
An intercomparison of analyses of the main FGGE Level IIb data set is presented with three advanced analysis systems. The aims of the work are to estimate the extent and magnitude of the differences between the analyses, to identify the reasons for the differences, and finally to estimate the significance of the differences. Extratropical analyses only are considered. Objective evaluations of analysis quality, such as fit to observations, statistics of analysis differences, and mean fields are discussed. In addition, substantial emphasis is placed on subjective evaluation of a series of case studies that were selected to illustrate the importance of different aspects of the analysis procedures, such as quality control, data selection, resolution, dynamical balance, and the role of the assimilating forecast model. In some cases, the forecast models are used as selective amplifiers of analysis differences to assist in deciding which analysis was more nearly correct in the treatment of particular data.
Detection and correction of prescription errors by an emergency department pharmacy service.
Stasiak, Philip; Afilalo, Marc; Castelino, Tanya; Xue, Xiaoqing; Colacone, Antoinette; Soucy, Nathalie; Dankoff, Jerrald
2014-05-01
Emergency departments (EDs) are recognized as a high-risk setting for prescription errors. Pharmacist involvement may be important in reviewing prescriptions to identify and correct errors. The objectives of this study were to describe the frequency and type of prescription errors detected by pharmacists in EDs, determine the proportion of errors that could be corrected, and identify factors associated with prescription errors. This prospective observational study was conducted in a tertiary care teaching ED on 25 consecutive weekdays. Pharmacists reviewed all documented prescriptions and flagged and corrected errors for patients in the ED. We collected information on patient demographics, details on prescription errors, and the pharmacists' recommendations. A total of 3,136 ED prescriptions were reviewed. The proportion of prescriptions in which a pharmacist identified an error was 3.2% (99 of 3,136; 95% confidence interval [CI] 2.5-3.8). The types of identified errors were wrong dose (28 of 99, 28.3%), incomplete prescription (27 of 99, 27.3%), wrong frequency (15 of 99, 15.2%), wrong drug (11 of 99, 11.1%), wrong route (1 of 99, 1.0%), and other (17 of 99, 17.2%). The pharmacy service intervened and corrected 78 (78 of 99, 78.8%) errors. Factors associated with prescription errors were patient age over 65 (odds ratio [OR] 2.34; 95% CI 1.32-4.13), prescriptions with more than one medication (OR 5.03; 95% CI 2.54-9.96), and those written by emergency medicine residents compared to attending emergency physicians (OR 2.21, 95% CI 1.18-4.14). Pharmacists in a tertiary ED are able to correct the majority of prescriptions in which they find errors. Errors are more likely to be identified in prescriptions written for older patients, those containing multiple medication orders, and those prescribed by emergency residents.
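The headline error rate and its confidence interval can be checked with a normal-approximation proportion CI; a minimal sketch, not the authors' analysis code:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# 99 of 3,136 reviewed prescriptions contained an identified error
p, lo, hi = proportion_ci(99, 3136)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # → 3.2% (95% CI 2.5%-3.8%)
```

The result matches the interval reported in the abstract (2.5 to 3.8).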
Saturn 5 launch vehicle flight evaluation report-AS-511 Apollo 16 mission
NASA Technical Reports Server (NTRS)
1972-01-01
A postflight analysis of the Apollo 16 mission is presented. The basic objective of the flight evaluation is to acquire, reduce, analyze, and report on flight data to the extent required to assure future mission success and vehicle reliability. Actual flight problems are identified, their causes are determined, and recommendations are made for corrective actions. Summaries of launch operations and spacecraft performance are included. Significant events for all phases of the flight are provided in tabular form.
Automated quality control for stitching of textile articles
NASA Technical Reports Server (NTRS)
Miller, Jeffrey L. (Inventor); Markus, Alan (Inventor)
1999-01-01
Quality control for stitching of a textile article is performed by measuring thread tension in the stitches as the stitches are being made, determining locations of the stitches, and generating a map including the locations and stitching data derived from the measured thread tensions. The stitching data can be analyzed, off-line or in real time, to identify defective stitches. Defective stitches can then be repaired. Real time analysis of the thread tensions allows problems such as broken needle threads to be corrected immediately.
Analysis of Navy radome failure problems
NASA Technical Reports Server (NTRS)
Tatnall, G. J.; Foulke, K.
1974-01-01
A survey of radome failure problems in military aircraft under actual operating conditions was conducted. The aircraft were operating from aircraft carriers in the Pacific Ocean. Critical problem areas were identified and a plan was developed for failure prevention. The development and application of repair kits for correcting the erosion damage are reported. It is stated that the rain erosion damage survey established a strong justification for qualification testing of all materials and designs which may have questionable life expectancy on the aircraft.
Corrective Action Plan in response to the March 1992 Tiger Team Assessment of the Ames Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-11-20
On March 5, 1992, a Department of Energy (DOE) Tiger Team completed an assessment of the Ames Laboratory, located in Ames, Iowa. The purpose of the assessment was to provide the Secretary of Energy with a report on the status and performance of Environment, Safety and Health (ES&H) programs at Ames Laboratory. Detailed findings of the assessment are presented in the report, DOE/EH-0237, Tiger Team Assessment of the Ames Laboratory. This document, the Ames Laboratory Corrective Action Plan (ALCAP), presents corrective actions to overcome deficiencies cited in the Tiger Team Assessment. The Tiger Team identified 53 Environmental findings, from which the Team derived four key findings. In the Safety and Health (S&H) area, 126 concerns were identified, eight of which were designated Category II (there were no Category I concerns). Seven key concerns were derived from the 126 concerns. The Management Subteam developed 19 findings which have been summarized in four key findings. The eight S&H Category II concerns identified in the Tiger Team Assessment were given prompt management attention. Actions to address these deficiencies have been described in individual corrective action plans, which were submitted to DOE Headquarters on March 20, 1992. The ALCAP includes actions described in this early response, as well as a long term strategy and framework for correcting all remaining deficiencies. Accordingly, the ALCAP presents the organizational structure, management systems, and specific responses that are being developed to implement corrective actions and to resolve root causes identified in the Tiger Team Assessment. The Chicago Field Office (CH), Iowa State University (ISU), the Institute for Physical Research and Technology (IPRT), and Ames Laboratory prepared the ALCAP with input from the DOE Headquarters, Office of Energy Research (ER).
Evaluation of facial expression in acute pain in cats.
Holden, E; Calvo, G; Collins, M; Bell, A; Reid, J; Scott, E M; Nolan, A M
2014-12-01
To describe the development of a facial expression tool differentiating pain-free cats from those in acute pain. Observers shown facial images from painful and pain-free cats were asked to identify if they were in pain or not. From facial images, anatomical landmarks were identified and distances between these were mapped. Selected distances underwent statistical analysis to identify features discriminating pain-free and painful cats. Additionally, thumbnail photographs were reviewed by two experts to identify discriminating facial features between the groups. Observers (n = 68) had difficulty in identifying pain-free from painful cats, with only 13% of observers being able to discriminate more than 80% of painful cats. Analysis of 78 facial landmarks and 80 distances identified six significant factors differentiating pain-free and painful faces including ear position and areas around the mouth/muzzle. Standardised mouth and ear distances when combined showed excellent discrimination properties, correctly differentiating pain-free and painful cats in 98% of cases. Expert review supported these findings and a cartoon-type picture scale was developed from thumbnail images. Initial investigation into facial features of painful and pain-free cats suggests potentially good discrimination properties of facial images. Further testing is required for development of a clinical tool. © 2014 British Small Animal Veterinary Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
ITLV.
1998-06-01
This Corrective Action Decision Document has been prepared for the Area 3 Septic Waste Systems 2 and 6 (Corrective Action Unit 427) in accordance with the Federal Facility Agreement and Consent Order of 1996 (FFACO, 1996). Corrective Action Unit 427 is located at the Tonopah Test Range, Nevada, and is comprised of the following Corrective Action Sites, each an individual septic waste system (DOE/NV, 1996a): Septic Waste System 2 is Corrective Action Site Number 03-05-002-SW02. Septic Waste System 6 is Corrective Action Site Number 03-05-002-SW06. The purpose of this Corrective Action Decision Document is to identify and provide a rationale for the selection of a recommended corrective action alternative for each Corrective Action Site. The scope of this Corrective Action Decision Document consists of the following tasks: Develop corrective action objectives. Identify corrective action alternative screening criteria. Develop corrective action alternatives. Perform detailed and comparative evaluations of the corrective action alternatives in relation to the corrective action objectives and screening criteria. Recommend and justify a preferred corrective action alternative for each CAS. From November 1997 through January 1998, a corrective action investigation was performed as set forth in the Corrective Action Investigation Plan for Corrective Action Unit No. 427: Area 3 Septic Waste System Numbers 2 and 6, Tonopah Test Range, Nevada (DOE/NV, 1997b). Details can be found in Appendix A of this document. The results indicated that contamination is present in some portions of the CAU and not in others, as described in Table ES-1 and shown in Figure A.2-2 of Appendix A. Based on the potential exposure pathways, the following corrective action objectives have been identified for Corrective Action Unit 427: Prevent or mitigate human exposure to subsurface soils containing TPH at concentrations greater than 100 milligrams per kilogram (NAC, 1996b). Close Septic Tank 33-5 in accordance with Nevada Administrative Code 459 (NAC, 1996c). Prevent adverse impacts to groundwater quality. Based on the review of existing data, future land use, and current operations at the Tonopah Test Range, the following alternatives were developed for consideration at the Area 3 Septic Waste Systems 2 and 6: Alternative 1 - No Further Action; Alternative 2 - Closure of Septic Tank 33-5 and Administrative Controls; Alternative 3 - Closure of Septic Tank 33-5, Excavation, and Disposal. The corrective action alternatives were evaluated based on four general corrective action standards and five remedy selection decision factors. Based on the results of this evaluation, the preferred alternative for Corrective Action Unit 427 is Alternative 2, Closure of Septic Tank 33-5 and Administrative Controls. The preferred corrective action alternative was evaluated on technical merit, focusing on performance, reliability, feasibility, and safety. The alternative was judged to meet all requirements for the technical components evaluated. The alternative meets all applicable state and federal regulations for closure of the site and will reduce potential future exposure pathways to the contaminated soils. During corrective action implementation, this alternative will present minimal potential threat to site workers who come in contact with the waste. However, procedures will be developed and implemented to ensure worker health and safety.
Analytical and computational approaches to define the Aspergillus niger secretome.
Tsang, Adrian; Butler, Gregory; Powlowski, Justin; Panisko, Ellen A; Baker, Scott E
2009-03-01
We used computational and mass spectrometric approaches to characterize the Aspergillus niger secretome. The 11,200 gene models predicted in the genome of A. niger strain ATCC 1015 were the data source for the analysis. Depending on the computational methods used, 691 to 881 proteins were predicted to be secreted proteins. We cultured A. niger in six different media and analyzed the extracellular proteins produced using mass spectrometry. A total of 222 proteins were identified, with 39 proteins expressed under all six conditions and 74 proteins expressed under only one condition. The secreted proteins identified by mass spectrometry were used to guide the correction of about 20 gene models. Additional analysis focused on extracellular enzymes of interest for biomass processing. Of the 63 glycoside hydrolases predicted to be capable of hydrolyzing cellulose, hemicellulose or pectin, 94% of the exo-acting enzymes and only 18% of the endo-acting enzymes were experimentally detected.
Extracellular space preservation aids the connectomic analysis of neural circuits.
Pallotto, Marta; Watkins, Paul V; Fubara, Boma; Singer, Joshua H; Briggman, Kevin L
2015-12-09
Dense connectomic mapping of neuronal circuits is limited by the time and effort required to analyze 3D electron microscopy (EM) datasets. Algorithms designed to automate image segmentation suffer from substantial error rates and require significant manual error correction. Any improvement in segmentation error rates would therefore directly reduce the time required to analyze 3D EM data. We explored preserving extracellular space (ECS) during chemical tissue fixation to improve the ability to segment neurites and to identify synaptic contacts. ECS preserved tissue is easier to segment using machine learning algorithms, leading to significantly reduced error rates. In addition, we observed that electrical synapses are readily identified in ECS preserved tissue. Finally, we determined that antibodies penetrate deep into ECS preserved tissue with only minimal permeabilization, thereby enabling correlated light microscopy (LM) and EM studies. We conclude that preservation of ECS benefits multiple aspects of the connectomic analysis of neural circuits.
Applications of Skylab data to land use and climatological analysis
NASA Technical Reports Server (NTRS)
Alexander, R. H. (Principal Investigator); Lewis, J. E., Jr.; Lins, H. F., Jr.; Jenner, C. B.; Outcalt, S. I.; Pease, R. W.
1976-01-01
The author has identified the following significant results. The Skylab study in the Central Atlantic Regional Ecological Test Site encompassed two separate but related tasks: (1) evaluation of photographic sensors S190A and B as sources of land use data for planning and managing land resources in major metropolitan regions, and (2) evaluation of the multispectral scanner S192 used in conjunction with associated data and analytical techniques as a data source on urban climates and the surface energy balance. Photographs from the Skylab S190B earth terrain camera were of greatest interest in the land use analysis task; they were of sufficiently high resolution to identify and map many level 2 and 3 land use categories. After being corrected to allow for atmospheric effects, output from thermal and visible bands of the S192 was employed in constructing computer map plots of albedo and surface temperature.
Paediatric acid-base disorders: A case-based review of procedures and pitfalls
Carmody, J Bryan; Norwood, Victoria F
2013-01-01
Acid-base disorders occur frequently in paediatric patients. Despite the perception that their analysis is complex and difficult, a straightforward set of rules is sufficient to interpret even the most complex disorders – provided certain pitfalls are avoided. Using a case-based approach, the present article reviews the fundamental concepts of acid-base analysis and highlights common mistakes and oversights. Specific topics include the proper identification of the primary disorder; distinguishing compensatory changes from additional primary disorders; use of the albumin-corrected anion gap to generate a differential diagnosis for patients with metabolic acidosis; screening for mixed disorders with the delta-delta formula; recognizing the limits of compensation; use of the anion gap to identify ‘hidden’ acidosis; and the importance of using information from the history and physical examination to identify the specific cause of a patient’s acid-base disturbance. PMID:24381489
NASA Astrophysics Data System (ADS)
Hananto, R. B.; Kusmayadi, T. A.; Riyadi
2018-05-01
The research aims to identify students' critical thinking processes in solving geometry problems. The geometry problem selected in this study concerned a flat-sided solid (the cube). The critical thinking process was examined across visual, auditory, and kinesthetic learning styles. This was a descriptive qualitative study. The subjects were 3 students selected by purposive sampling, one each with a visual, auditory, or kinesthetic learning style. Data were collected through tests, interviews, and observation. The results showed that the students' critical thinking processes in the identify and define steps were similar across learning styles in solving problems. Differences in critical thinking appeared in the enumerate, analyze, list, and self-correct steps. It was also found that the critical thinking process of the student with a kinesthetic learning style was better than that of the students with visual and auditory learning styles.
Parenting style in relation to pathogenic and protective factors of Type A behaviour pattern.
Castro, J; de Pablo, J; Toro, J; Valdés, M
1999-07-01
Studies of type A behaviour pattern suggest that it can be promoted as a whole by certain parental rearing styles. However, the association of the different components of the type A behaviour with specific rearing practices has not been clarified. The relationship between parents' rearing style and the different type A behaviour components of their children was analysed in a sample of 312 university students. Parental rearing style was assessed with the EMBU, a Swedish measure originally designed to assess one's recollections of one's parents' rearing behaviour. Type A pattern was measured by the JAS, a self-administered questionnaire that gives the global type A score and three of its components. Hard Driving was related to Rejection and Favouring Subject in males. Speed-Impatience was related to Rejection and Control in both sexes, and Job Involvement was related to Control and Favouring Subject in females. In a discriminant factor analysis in males, Rejection, Control and Favouring Subject on the part of fathers correctly classified 80% of the subjects identified as having high or low Speed-Impatience, and the variables of Rejection and Favouring Subject (also by fathers) correctly classified 69.23% of the subjects identified as high or low Hard Driving. In females, Control and Favouring Subject on the part of mothers and low Rejection by fathers correctly classified 70.37% of the subjects with high or low Job Involvement. These results suggest that different rearing characteristics are related to the various components of the type A behaviour pattern.
The intelligibility in Context Scale: validity and reliability of a subjective rating measure.
McLeod, Sharynne; Harrison, Linda J; McCormack, Jane
2012-04-01
To describe a new measure of functional intelligibility, the Intelligibility in Context Scale (ICS), and evaluate its validity, reliability, and sensitivity using 3 clinical measures of severity of speech sound disorder: (a) percentage of phonemes correct (PPC), (b) percentage of consonants correct (PCC), and (c) percentage of vowels correct (PVC). Speech skills of 120 preschool children (109 with parent-/teacher-identified concern about how they talked and made speech sounds and 11 with no identified concern) were assessed with the Diagnostic Evaluation of Articulation and Phonology (Dodd, Hua, Crosbie, Holm, & Ozanne, 2002). Parents completed the 7-item ICS, which rates the degree to which children's speech is understood by different communication partners (parents, immediate family, extended family, friends, acquaintances, teachers, and strangers) on a 5-point scale. Parents' ratings showed that most children were always (5) or usually (4) understood by parents, immediate family, and teachers, but only sometimes (3) by strangers. Factor analysis confirmed the internal consistency of the ICS items; therefore, ratings were averaged to form an overall intelligibility score. The ICS had high internal reliability (α = .93), sensitivity, and construct validity. Criterion validity was established through significant correlations between the ICS and PPC (r = .54), PCC (r = .54), and PVC (r = .36). The ICS is a promising new measure of functional intelligibility. These data provide initial support for the ICS as an easily administered, valid, and reliable estimate of preschool children's intelligibility when speaking with people of varying levels of familiarity and authority.
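Since the ICS score is the average of its seven 5-point item ratings, scoring is trivial to sketch (the example ratings below are hypothetical, chosen to mirror the pattern the abstract describes):

```python
def ics_score(ratings):
    """Average the seven 5-point ICS item ratings into one overall
    intelligibility score (1 = never understood, 5 = always understood)."""
    assert len(ratings) == 7 and all(1 <= r <= 5 for r in ratings)
    return sum(ratings) / len(ratings)

# Hypothetical child: usually/always understood by family and teachers,
# only sometimes by strangers
print(round(ics_score([5, 5, 4, 4, 4, 5, 3]), 2))  # → 4.29
```

Averaging is justified here by the factor analysis reported above, which confirmed the internal consistency of the seven items.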
The cost of adherence mismeasurement in serious mental illness: a claims-based analysis.
Shafrin, Jason; Forma, Felicia; Scherer, Ethan; Hatch, Ainslie; Vytlacil, Edward; Lakdawalla, Darius
2017-05-01
To quantify how adherence mismeasurement affects the estimated impact of adherence on inpatient costs among patients with serious mental illness (SMI). Proportion of days covered (PDC) is a common claims-based measure of medication adherence. Because PDC does not measure medication ingestion, however, it may inaccurately measure adherence. We derived a formula to correct the bias that occurs in adherence-utilization studies resulting from errors in claims-based measures of adherence. We conducted a literature review to identify the correlation between gold-standard and claims-based adherence measures. We derived a bias-correction methodology to address claims-based medication adherence measurement error. We then applied this methodology to a case study of patients with SMI who initiated atypical antipsychotics in 2 large claims databases. Our literature review identified 6 studies of interest. The 4 most relevant ones measured correlations between 0.38 and 0.91. Our preferred estimate implies that the effect of adherence on inpatient spending estimated from claims data would understate the true effect by a factor of 5.3, if there were no other sources of bias. Although our procedure corrects for measurement error, such error also may amplify or mitigate other potential biases. For instance, if adherent patients are healthier than nonadherent ones, measurement error makes the resulting bias worse. On the other hand, if adherent patients are sicker, measurement error mitigates the other bias. Measurement error due to claims-based adherence measures is worth addressing, alongside other more widely emphasized sources of bias in inference.
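Under the classical errors-in-variables model, an OLS effect estimated with a noisy adherence proxy is attenuated by the squared proxy-truth correlation, so the correction divides by ρ². A sketch of that arithmetic (ρ = 0.434 is an illustrative value inside the 0.38-0.91 range the review reports; the paper's actual bias-correction formula may differ):

```python
def deattenuate(beta_hat, rho):
    """Correct an OLS slope for classical measurement error in the
    regressor: the naive estimate is attenuated by rho**2, the squared
    correlation between the claims-based proxy and true adherence."""
    return beta_hat / rho ** 2

rho = 0.434                    # illustrative proxy-truth correlation
print(round(1 / rho ** 2, 1))  # understatement factor → 5.3
```

This reproduces the abstract's factor of 5.3: a proxy-truth correlation near 0.43 implies the claims-based estimate captures only about a fifth of the true effect, absent other biases.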
The class analysis of poverty: a response to Tony Novak.
Wright, E O
1996-01-01
In responding to Tony Novak's criticisms of his earlier article "The Class Analysis of Poverty," the author makes four principal points. First, contrary to Novak's views, a class analysis of poverty should define poverty in terms of both income-poverty and asset-poverty. Second, while Novak is correct that the term "underclass" often has a pejorative meaning, it remains an important concept for identifying segments of the population that are deeply oppressed economically, but not exploited. Third, the concepts of class analysis must be elaborated at a variety of levels of abstraction, not simply the highest level of the pure "mode of production," as is implied by Novak's arguments. Finally, class analysis must acknowledge and conceptualize the specific forms of complexity of contemporary class structures, which is impossible if it restricts its class concepts to a simple polarized notion.
Moore, D F; Harwood, V J; Ferguson, D M; Lukasik, J; Hannah, P; Getrich, M; Brownell, M
2005-01-01
The accuracy of ribotyping and antibiotic resistance analysis (ARA) for prediction of sources of faecal bacterial pollution in an urban southern California watershed was determined using blinded proficiency samples. Antibiotic resistance patterns and HindIII ribotypes of Escherichia coli (n = 997), and antibiotic resistance patterns of Enterococcus spp. (n = 3657) were used to construct libraries from sewage samples and from faeces of seagulls, dogs, cats, horses and humans within the watershed. The three libraries were analysed to determine the accuracy of host source prediction. The internal accuracy of the libraries (average rate of correct classification, ARCC) with six source categories was 44% for E. coli ARA, 69% for E. coli ribotyping and 48% for Enterococcus ARA. Each library's predictive ability towards isolates that were not part of the library was determined using a blinded proficiency panel of 97 E. coli and 99 Enterococcus isolates. Twenty-eight per cent (by ARA) and 27% (by ribotyping) of the E. coli proficiency isolates were assigned to the correct source category. Sixteen per cent were assigned to the same source category by both methods, and 6% were assigned to the correct category. Addition of 2480 E. coli isolates to the ARA library did not improve the ARCC or proficiency accuracy. In contrast, 45% of Enterococcus proficiency isolates were correctly identified by ARA. None of the methods performed well enough on the proficiency panel to be judged ready for application to environmental samples. Most microbial source tracking (MST) studies published have demonstrated library accuracy solely by the internal ARCC measurement. Low rates of correct classification for E. coli proficiency isolates compared with the ARCCs of the libraries indicate that testing of bacteria from samples that are not represented in the library, such as blinded proficiency samples, is necessary to accurately measure predictive ability. The library-based MST methods used in this study may not be suited for determination of the source(s) of faecal pollution in large, urban watersheds.
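The ARCC statistic quoted above is simply the mean of the per-source correct-classification rates taken from a library's confusion matrix. A minimal sketch, using a hypothetical three-source matrix (not data from the study):

```python
def arcc(confusion):
    """Average rate of correct classification: the mean, across host
    sources, of diagonal count / row total (rows are the true sources)."""
    rates = [row[i] / sum(row) for i, row in enumerate(confusion)]
    return sum(rates) / len(rates)

# Hypothetical 3-source confusion matrix (rows: true source of isolate)
cm = [[40,  5,  5],
      [10, 30, 10],
      [ 8,  2, 40]]
print(round(arcc(cm), 2))  # → 0.73
```

Averaging per-source rates (rather than pooling all isolates) keeps a large, easily classified source category from masking poor performance on a small one.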
New methods for engineering site characterization using reflection and surface wave seismic survey
NASA Astrophysics Data System (ADS)
Chaiprakaikeow, Susit
This study presents two new seismic testing methods for engineering application, a new shallow seismic reflection method and Time Filtered Analysis of Surface Waves (TFASW). Both methods are described in this dissertation. The new shallow seismic reflection was developed to measure reflection at a single point using two to four receivers, assuming homogeneous, horizontal layering. It uses one or more shakers driven by a swept sine function as a source, and the cross-correlation technique to identify wave arrivals. The phase difference between the source forcing function and the ground motion due to the dynamic response of the shaker-ground interface was corrected by using a reference geophone. Attenuated high frequency energy was also recovered using the whitening in frequency domain. The new shallow seismic reflection testing was performed at the crest of Porcupine Dam in Paradise, Utah. The testing used two horizontal Vibroseis sources and four receivers for spacings between 6 and 300 ft. Unfortunately, the results showed no clear evidence of the reflectors despite correction of the magnitude and phase of the signals. However, an improvement in the shape of the cross-correlations was noticed after the corrections. The results showed distinct primary lobes in the corrected cross-correlated signals up to 150 ft offset. More consistent maximum peaks were observed in the corrected waveforms. TFASW is a new surface (Rayleigh) wave method to determine the shear wave velocity profile at a site. It is a time domain method as opposed to the Spectral Analysis of Surface Waves (SASW) method, which is a frequency domain method. This method uses digital filtering to optimize bandwidth used to determine the dispersion curve. Results from testings at three different sites in Utah indicated good agreement with the dispersion curves measured using both TFASW and SASW methods. 
The advantage of the TFASW method is that its dispersion curves showed less scatter at long wavelengths, a result of the wider bandwidth used in those tests.
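The swept-sine source and cross-correlation arrival pick described in this abstract can be sketched as follows. This is a toy example with a synthetic chirp and a single delayed, attenuated arrival; the sampling rate, sweep band, delay, and noise level are illustrative assumptions, not parameters from the dissertation:

```python
import numpy as np

fs = 1000.0                          # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
f0, f1 = 10.0, 100.0                 # linear sweep from 10 Hz to 100 Hz
source = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * t[-1]) * t ** 2))

true_delay = 0.250                   # travel time to recover, seconds
n = int(true_delay * fs)
received = np.zeros_like(source)
received[n:] = 0.6 * source[:-n]     # delayed, attenuated arrival
received += 0.05 * np.random.default_rng(0).standard_normal(t.size)

# cross-correlate the receiver trace with the source sweep;
# the lag of the correlation peak is the picked arrival time
xc = np.correlate(received, source, mode="full")
lags = np.arange(-source.size + 1, source.size)
picked_delay = lags[np.argmax(xc)] / fs
```

Because the chirp has a sharp autocorrelation, the correlation peak sits at the true lag even in moderate noise, which is the property the method exploits.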
Marcello, Javier; Eugenio, Francisco; Perdomo, Ulises; Medina, Anabella
2016-01-01
The precise mapping of vegetation covers in semi-arid areas is a complex task, as this type of environment consists of sparse vegetation mainly composed of small shrubs. The launch of high resolution satellites, with additional spectral bands and the ability to alter the viewing angle, offers a useful technology for this objective. In this context, atmospheric correction is a fundamental step in the pre-processing of such remote sensing imagery and, consequently, different algorithms have been developed for this purpose over the years. They are commonly categorized as image-based methods or as more advanced physical models based on radiative transfer theory. Despite the relevance of this topic, few comparative studies covering several methods have been carried out using high resolution data or applied specifically to vegetation covers. In this work, the performance of five representative atmospheric correction algorithms (DOS, QUAC, FLAASH, ATCOR and 6S) was assessed using high resolution WorldView-2 imagery and field spectroradiometer data collected simultaneously, with the goal of identifying the most appropriate techniques. The study also included a detailed analysis of the influence of parameterization on the final results of the correction, the aerosol model and its optical thickness being important parameters to adjust properly. The effects of the corrections were studied in vegetation and soil sites belonging to different protected semi-arid ecosystems (high mountain and coastal areas). In summary, the superior performance of model-based algorithms, 6S in particular, was demonstrated, achieving reflectance estimations very close to the in-situ measurements (RMSE between 2% and 3%).
Finally, an example of the importance of atmospheric correction for vegetation estimation in these natural areas is presented, enabling robust mapping of species and analysis of multitemporal variations related to human activity and climate change. PMID:27706064
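Of the five algorithms compared, dark object subtraction (DOS) is the simplest, purely image-based one: it assumes an additive haze offset per band and estimates it from the darkest pixels. A minimal sketch; the toy image, haze values, and the choice of a low percentile as the "dark object" estimate are all hypothetical:

```python
import numpy as np

def dark_object_subtraction(dn, percentile=1.0):
    """Image-based DOS: estimate a per-band 'dark object' value (a low
    percentile of the pixel values) and subtract it from every pixel,
    clipping the result at zero."""
    dn = np.asarray(dn, dtype=float)
    dark = np.percentile(dn, percentile, axis=(0, 1))  # one offset per band
    return np.clip(dn - dark, 0.0, None)

# toy 3-band scene with an additive per-band haze offset (hypothetical values)
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 200.0, size=(64, 64, 3))
haze = np.array([30.0, 20.0, 10.0])
corrected = dark_object_subtraction(scene + haze)
```

This is why DOS is attractive as a baseline (no ancillary data needed) and also why physical models such as 6S, which account for aerosol type and optical thickness, tend to outperform it.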
Keihaninejad, Shiva; Heckemann, Rolf A.; Gousias, Ioannis S.; Hajnal, Joseph V.; Duncan, John S.; Aljabar, Paul; Rueckert, Daniel; Hammers, Alexander
2012-01-01
Brain images contain information suitable for automatically sorting subjects into categories such as healthy controls and patients. We sought to identify morphometric criteria for distinguishing controls (n = 28) from patients with unilateral temporal lobe epilepsy (TLE), 60 with and 20 without hippocampal atrophy (TLE-HA and TLE-N, respectively), and for determining the presumed side of seizure onset. The framework employs multi-atlas segmentation to estimate the volumes of 83 brain structures. A kernel-based separability criterion was then used to identify structures whose volumes discriminate between the groups. Next, we applied support vector machines (SVM) to the selected set for classification on the basis of volumes. We also computed pairwise similarities between all subjects and used spectral analysis to convert these into per-subject features. SVM was again applied to these feature data. After training on a subgroup, all TLE-HA patients were correctly distinguished from controls, achieving an accuracy of 96 ± 2% in both classification schemes. For TLE-N patients, the accuracy was 86 ± 2% based on structural volumes and 91 ± 3% using spectral analysis. Structures discriminating between patients and controls were mainly localized ipsilaterally to the presumed seizure focus. For the TLE-HA group, they were mainly in the temporal lobe; for the TLE-N group they included orbitofrontal regions, as well as the ipsilateral substantia nigra. Correct lateralization of the presumed seizure onset zone was achieved using hippocampi and parahippocampal gyri in all TLE-HA patients using either classification scheme; in the TLE-N patients, lateralization was accurate based on structural volumes in 86 ± 4%, and in 94 ± 4% with the spectral analysis approach. Unilateral TLE has imaging features that can be identified automatically, even when they are invisible to human experts. Such morphometric image features may serve as classification and lateralization criteria. 
The technique also detects unsuspected distinguishing features like the substantia nigra, warranting further study. PMID:22523539
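The volume-based classification step can be illustrated with a linear SVM trained by stochastic subgradient descent on the hinge loss (a Pegasos-style sketch). The per-structure volumes below are synthetic stand-ins; the study's actual pipeline used multi-atlas segmentation of 83 structures, a kernel-based separability criterion for feature selection, and a full SVM implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical volumes of 2 discriminative structures: controls vs. patients
controls = rng.normal(loc=[3.2, 4.1], scale=0.12, size=(28, 2))
patients = rng.normal(loc=[2.6, 3.7], scale=0.12, size=(60, 2))
X = np.vstack([controls, patients])
X -= X.mean(axis=0)                 # center so no explicit bias term is needed
y = np.concatenate([np.ones(28), -np.ones(60)])

# linear SVM via stochastic subgradient descent on the regularized hinge loss
w, lam = np.zeros(2), 0.01
for step in range(1, 4001):
    i = rng.integers(y.size)
    eta = 1.0 / (lam * step)        # Pegasos step size schedule
    w *= 1.0 - eta * lam            # shrink (regularization subgradient)
    if y[i] * (X[i] @ w) < 1.0:     # margin violated: hinge subgradient
        w += eta * y[i] * X[i]

accuracy = (np.sign(X @ w) == y).mean()
```

On well-separated volumes like these, the learned hyperplane classifies nearly all subjects correctly, mirroring the high training-set accuracies reported in the abstract.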
Doherty, Cailbhe; Bleakley, Chris; Hertel, Jay; Caulfield, Brian; Ryan, John; Delahunt, Eamonn
2016-04-01
Impairments in motor control may underlie the chronic ankle instability (CAI) that can develop in the year after an acute lateral ankle sprain (LAS) injury. No prospective analysis is currently available identifying the mechanisms by which these impairments develop and contribute to long-term outcome after LAS. To identify the motor control deficits that predict CAI outcome after a first-time LAS injury. Cohort study (diagnosis); Level of evidence, 2. Eighty-two individuals were recruited after sustaining a first-time LAS injury. Several biomechanical analyses were performed for these individuals, who completed 5 movement tasks at 3 time points: (1) 2 weeks, (2) 6 months, and (3) 12 months after LAS occurrence. A logistic regression analysis of several "salient" biomechanical parameters identified from the movement tasks, in addition to scores from the Cumberland Ankle Instability Tool and the Foot and Ankle Ability Measure (FAAM) recorded at the 2-week and 6-month time points, was used to predict 12-month outcome. At the 2-week time point, an inability to complete 2 of the movement tasks (a single-leg drop landing and a drop vertical jump) was predictive of CAI outcome and correctly classified 67.6% of cases (sensitivity, 83%; specificity, 55%; P = .004). At the 6-month time point, several deficits exhibited by the CAI group during 1 of the movement tasks (reach distances and sagittal plane joint positions at the hip, knee, and ankle during the posterior reach directions of the Star Excursion Balance Test), together with their scores on the activities of daily living subscale of the FAAM, were predictive of outcome and correctly classified 84.8% of cases (sensitivity, 75%; specificity, 91%; P < .001). An inability to complete jumping and landing tasks within 2 weeks of a first-time LAS, and poorer dynamic postural control and lower self-reported function 6 months after a first-time LAS, were predictive of eventual CAI outcome. © 2016 The Author(s).
An Extreme-Value Approach to Anomaly Vulnerability Identification
NASA Technical Reports Server (NTRS)
Everett, Chris; Maggio, Gaspare; Groen, Frank
2010-01-01
The objective of this paper is to present a method for importance analysis in parametric probabilistic modeling where the result of interest is the identification of potential engineering vulnerabilities associated with postulated anomalies in system behavior. In the context of Accident Precursor Analysis (APA), under which this method has been developed, these vulnerabilities, designated as anomaly vulnerabilities, are conditions that produce high risk in the presence of anomalous system behavior. The method defines a parameter-specific Parameter Vulnerability Importance measure (PVI), which identifies anomaly risk-model parameter values that indicate the potential presence of anomaly vulnerabilities, and allows them to be prioritized for further investigation. This entails analyzing each uncertain risk-model parameter over its credible range of values to determine where it produces the maximum risk. A parameter that produces high system risk for a particular range of values suggests that the system is vulnerable to the modeled anomalous conditions, if indeed the true parameter value lies in that range. Thus, PVI analysis provides a means of identifying and prioritizing anomaly-related engineering issues that at the very least warrant improved understanding to reduce uncertainty, such that true vulnerabilities may be identified and proper corrective actions taken.
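One plausible reading of the PVI computation, not necessarily the paper's exact definition, is to scan each uncertain parameter over its credible range, locate the value that maximizes the conditional risk, and compare that worst-case risk to a nominal one. The risk model, credible range, and nominal point below are all hypothetical:

```python
import numpy as np

def conditional_risk(p):
    """Hypothetical risk model: system risk conditional on the postulated
    anomaly, as a function of one uncertain model parameter p. The peak
    near p = 0.8 stands in for a vulnerable region of parameter space."""
    return 1e-4 + 5e-3 * np.exp(-((p - 0.8) ** 2) / 0.01)

# scan the parameter over its credible range and find where risk peaks
grid = np.linspace(0.0, 1.0, 1001)
risks = conditional_risk(grid)
p_worst = grid[np.argmax(risks)]

# one reading of a PVI: worst-case risk relative to risk at a nominal value
pvi = risks.max() / conditional_risk(0.5)
```

A large ratio flags the parameter for prioritized investigation: if the true value lies near `p_worst`, the modeled anomaly represents a genuine vulnerability.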
Tragante, Vinicius; Barnes, Michael R.; Ganesh, Santhi K.; Lanktree, Matthew B.; Guo, Wei; Franceschini, Nora; Smith, Erin N.; Johnson, Toby; Holmes, Michael V.; Padmanabhan, Sandosh; Karczewski, Konrad J.; Almoguera, Berta; Barnard, John; Baumert, Jens; Chang, Yen-Pei Christy; Elbers, Clara C.; Farrall, Martin; Fischer, Mary E.; Gaunt, Tom R.; Gho, Johannes M.I.H.; Gieger, Christian; Goel, Anuj; Gong, Yan; Isaacs, Aaron; Kleber, Marcus E.; Leach, Irene Mateo; McDonough, Caitrin W.; Meijs, Matthijs F.L.; Melander, Olle; Nelson, Christopher P.; Nolte, Ilja M.; Pankratz, Nathan; Price, Tom S.; Shaffer, Jonathan; Shah, Sonia; Tomaszewski, Maciej; van der Most, Peter J.; Van Iperen, Erik P.A.; Vonk, Judith M.; Witkowska, Kate; Wong, Caroline O.L.; Zhang, Li; Beitelshees, Amber L.; Berenson, Gerald S.; Bhatt, Deepak L.; Brown, Morris; Burt, Amber; Cooper-DeHoff, Rhonda M.; Connell, John M.; Cruickshanks, Karen J.; Curtis, Sean P.; Davey-Smith, George; Delles, Christian; Gansevoort, Ron T.; Guo, Xiuqing; Haiqing, Shen; Hastie, Claire E.; Hofker, Marten H.; Hovingh, G. Kees; Kim, Daniel S.; Kirkland, Susan A.; Klein, Barbara E.; Klein, Ronald; Li, Yun R.; Maiwald, Steffi; Newton-Cheh, Christopher; O’Brien, Eoin T.; Onland-Moret, N. Charlotte; Palmas, Walter; Parsa, Afshin; Penninx, Brenda W.; Pettinger, Mary; Vasan, Ramachandran S.; Ranchalis, Jane E.; M Ridker, Paul; Rose, Lynda M.; Sever, Peter; Shimbo, Daichi; Steele, Laura; Stolk, Ronald P.; Thorand, Barbara; Trip, Mieke D.; van Duijn, Cornelia M.; Verschuren, W. Monique; Wijmenga, Cisca; Wyatt, Sharon; Young, J. 
Hunter; Zwinderman, Aeilko H.; Bezzina, Connie R.; Boerwinkle, Eric; Casas, Juan P.; Caulfield, Mark J.; Chakravarti, Aravinda; Chasman, Daniel I.; Davidson, Karina W.; Doevendans, Pieter A.; Dominiczak, Anna F.; FitzGerald, Garret A.; Gums, John G.; Fornage, Myriam; Hakonarson, Hakon; Halder, Indrani; Hillege, Hans L.; Illig, Thomas; Jarvik, Gail P.; Johnson, Julie A.; Kastelein, John J.P.; Koenig, Wolfgang; Kumari, Meena; März, Winfried; Murray, Sarah S.; O’Connell, Jeffery R.; Oldehinkel, Albertine J.; Pankow, James S.; Rader, Daniel J.; Redline, Susan; Reilly, Muredach P.; Schadt, Eric E.; Kottke-Marchant, Kandice; Snieder, Harold; Snyder, Michael; Stanton, Alice V.; Tobin, Martin D.; Uitterlinden, André G.; van der Harst, Pim; van der Schouw, Yvonne T.; Samani, Nilesh J.; Watkins, Hugh; Johnson, Andrew D.; Reiner, Alex P.; Zhu, Xiaofeng; de Bakker, Paul I.W.; Levy, Daniel; Asselbergs, Folkert W.; Munroe, Patricia B.; Keating, Brendan J.
2014-01-01
Blood pressure (BP) is a heritable risk factor for cardiovascular disease. To investigate genetic associations with systolic BP (SBP), diastolic BP (DBP), mean arterial pressure (MAP), and pulse pressure (PP), we genotyped ∼50,000 SNPs in up to 87,736 individuals of European ancestry and combined these in a meta-analysis. We replicated findings in an independent set of 68,368 individuals of European ancestry. Our analyses identified 11 previously undescribed associations in independent loci containing 31 genes including PDE1A, HLA-DQB1, CDK6, PRKAG2, VCL, H19, NUCB2, RELA, HOXC@ complex, FBN1, and NFAT5 at the Bonferroni-corrected array-wide significance threshold (p < 6 × 10⁻⁷) and confirmed 27 previously reported associations. Bioinformatic analysis of the 11 loci provided support for a putative role in hypertension of several genes, such as CDK6 and NUCB2. Analysis of potential pharmacological targets in databases of small molecules showed that ten of the genes are predicted to be a target for small molecules. In summary, we identified previously unknown loci associated with BP. Our findings extend our understanding of genes involved in BP regulation, which may provide new targets for therapeutic intervention or drug response stratification. PMID:24560520
Spectroscopy methods for identifying the country of origin
NASA Astrophysics Data System (ADS)
Hondrogiannis, Ellen; Ehrlinger, Erin; Miziolek, Andrzej W.
2013-05-01
There is a need in many industries and government functions to identify the source of origin of various materials. For example, the food industry needs to ensure that the claimed source of a food product (e.g., coffee, spices) is in fact legitimate, given the variation in quality among source locations worldwide. Another example is identifying the source country of imported commodities going through Customs, so as to assess the correct tariff, which varies depending on the source country. Laser Induced Breakdown Spectroscopy (LIBS) holds promise as a field-portable tool for rapid identification of the country of origin of various materials. Recent research at Towson University has identified the elemental markers needed to discriminate select spices back to their country of origin using wavelength dispersive X-ray fluorescence (WDXRF). The WDXRF device, however, is not particularly suitable for convenient and fast field analysis. We are extending this study to evaluate the potential of a benchtop commercial LIBS device that could be located at ports of entry, and to compare its performance with WDXRF. Our initial study on the spice cumin demonstrated that discriminant function models not only achieved 100% separation between the 4 countries of origin (China, India, Syria, and Turkey) but also, when tested, showed 100% correct matching to the country of origin. This study adds to the growing number of publications that indicate the power of LIBS elemental fingerprinting for provenance determinations.
Costa-Alcalde, José Javier; Barbeito-Castiñeiras, Gema; González-Alba, José María; Aguilera, Antonio; Galán, Juan Carlos; Pérez-Del-Molino, María Luisa
2018-06-02
The American Thoracic Society and the Infectious Diseases Society of America recommend that clinically significant non-tuberculous mycobacteria (NTM) be identified to the species level in order to determine their clinical significance. The aim of this study was to evaluate identification of rapidly growing NTM (RGM) isolated from clinical samples using MALDI-TOF MS and a commercial molecular system, comparing the results with identification by a reference method. We included 46 clinical isolates of RGM and identified them using the commercial molecular system GenoType® CM/AS (Hain Lifescience, Germany), MALDI-TOF MS (Bruker) and, as the reference method, partial rpoβ gene sequencing followed by BLAST and phylogenetic analysis against the 1093 sequences available in GenBank. Agreement with the reference method, partial rpoβ sequencing, was 27/43 (62.8%) for GenoType® and 38/43 (88.3%) for MALDI-TOF MS. For all the samples correctly classified by GenoType®, we obtained the same result with MALDI-TOF MS (27/27). However, MALDI-TOF MS also correctly identified 68.75% (11/16) of the samples that GenoType® had misclassified (p=0.005). MALDI-TOF MS thus classified significantly better than GenoType®. When a MALDI-TOF MS score >1.85 was achieved, MALDI-TOF MS and partial rpoβ gene sequencing were equivalent. GenoType® was not able to distinguish between species belonging to the M. fortuitum complex. The MALDI-TOF MS methodology is simple, rapid and associated with lower consumable costs than GenoType®. The partial rpoβ sequencing methods with BLAST and phylogenetic analysis were not able to identify some RGM unequivocally; sequencing of additional regions would therefore be indicated in these cases. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Tan, Kok Chooi; Lim, Hwee San; Matjafri, Mohd Zubir; Abdullah, Khiruddin
2012-06-01
Atmospheric corrections for multi-temporal optical satellite images are necessary, especially in change detection analyses such as normalized difference vegetation index (NDVI) ratioing. Abrupt change detection analysis using remote-sensing techniques requires radiometric congruity and atmospheric correction to monitor terrestrial surfaces over time. Two atmospheric correction methods were used for this study: relative radiometric normalization and the simplified method for atmospheric correction (SMAC) in the solar spectrum. A multi-temporal data set consisting of two sets of Landsat images of Penang Island, Malaysia, from the period between 1991 and 2002, was used to compare NDVI maps generated with the proposed atmospheric correction methods. Land surface temperature (LST) was retrieved using ATCOR3_T in PCI Geomatica 10.1 image processing software. Linear regression analysis was utilized to analyze the relationship between NDVI and LST. This study reveals that both of the proposed atmospheric correction methods yielded high accuracy, as shown by the linear correlation coefficients. To check the accuracy of the equation obtained through linear regression analysis for each satellite image, 20 points were randomly chosen. The results showed that the SMAC method yielded a constant error when predicting the NDVI value from the equation derived by linear regression analysis. The average errors from both proposed atmospheric correction methods were less than 10%.
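The NDVI at the core of this change-detection analysis is a simple normalized band ratio of near-infrared and red reflectance. A minimal sketch; the small epsilon guarding against division by zero is an implementation convenience, not part of the index definition:

```python
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Normalized difference vegetation index from near-infrared and red
    reflectance bands; values near +1 indicate dense vegetation, values
    near 0 indicate bare soil, negative values typically indicate water."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Because NDVI is a ratio of band differences to band sums, uncorrected additive atmospheric offsets bias it, which is why atmospheric correction matters before NDVI differencing between dates.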
Error analysis and correction of discrete solutions from finite element codes
NASA Technical Reports Server (NTRS)
Thurston, G. A.; Stein, P. A.; Knight, N. F., Jr.; Reissner, J. E.
1984-01-01
Many structures are an assembly of individual shell components. Therefore, results for stresses and deflections from finite element solutions for each shell component should agree with the equations of shell theory. This paper examines the problem of applying shell theory to the error analysis and the correction of finite element results. The general approach to error analysis and correction is discussed first. Relaxation methods are suggested as one approach to correcting finite element results for all or parts of shell structures. Next, the problem of error analysis of plate structures is examined in more detail. The method of successive approximations is adapted to take discrete finite element solutions and to generate continuous approximate solutions for postbuckled plates. Preliminary numerical results are included.
NASA Astrophysics Data System (ADS)
Howard, T. A.; Nandy, D.; Koepke, A. C.
2008-01-01
One of the main sources of uncertainty in quantifying the kinematic properties of coronal mass ejections (CMEs) using coronagraphs is the fact that coronagraph images are projected into the sky plane, resulting in measurements that can differ significantly from their actual values. By identifying solar surface source regions of CMEs using X-ray and Hα flare and disappearing filament data, and through considerations of CME trajectories in three-dimensional (3-D) geometry, we have devised a methodology to correct for the projection effect. We outline this method here. The methodology was automated and applied to over 10,000 CMEs in the Coordinated Data Analysis Workshop (CDAW) catalog of SOHO Large Angle Spectroscopic Coronagraph observations spanning 1996-2005, in which we could associate 1961 CMEs with an appropriate surface event. In the latter subset, deprojected speeds, accelerations, and launch angles were determined to study CME kinematics. Our analysis of this subset of events reconfirms some important trends, notably that the previously uncovered solar-cycle variations of CME properties are preserved, CMEs with greater width have higher speeds, and slower CMEs tend to accelerate while faster CMEs tend to decelerate. This indicates that statistical trends in CME properties recovered from plane-of-sky measurements may be preserved even in the face of more sophisticated 3-D measurements from spacecraft such as STEREO, if CME trajectories are predominantly radial. However, our results also show that the magnitude of corrected measurements can differ significantly from the projected plane-of-sky measurements on a case-by-case basis, and that acceleration is more sensitive to the deprojection process than speed. Average corrected speeds and accelerations tend to be factors of 1.7 and 4.4 higher than their projected values, with mean corrected speed and acceleration magnitudes on the order of 1000 km/s and 50 m/s², respectively.
We conclude that while using the plane-of-sky measurements may be suitable for studies of general trends in a large sample of events, correcting for projection effects is mandatory for those investigations which rely on a numerically precise determination of the properties of individual CMEs.
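The basic geometry behind deprojection can be sketched as follows: if a CME propagates radially at an angle out of the sky plane, the coronagraph measures only the plane-of-sky component, so the true speed is recovered by dividing by the cosine of that angle. This is a simplified single-angle sketch; the paper's methodology derives the angle from surface source locations and full 3-D trajectory considerations:

```python
import math

def deprojected_speed(v_sky_plane, angle_from_sky_plane_deg):
    """Correct a plane-of-sky speed for projection: a radially propagating
    CME at an angle theta out of the sky plane appears slowed by cos(theta)."""
    theta = math.radians(angle_from_sky_plane_deg)
    if math.isclose(math.cos(theta), 0.0, abs_tol=1e-9):
        raise ValueError("propagation along the line of sight cannot "
                         "be deprojected with this simple geometry")
    return v_sky_plane / math.cos(theta)
```

For example, a CME measured at 500 km/s in the sky plane but propagating 60° out of it would have a corrected speed of 1000 km/s, illustrating how average corrections of the order quoted above (a factor of 1.7 in speed) arise.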
Consultation behaviour of doctor-shopping patients and factors that reduce shopping.
Ohira, Yoshiyuki; Ikusaka, Masatomi; Noda, Kazutaka; Tsukamoto, Tomoko; Takada, Toshihiko; Miyahara, Masahito; Funakoshi, Hiraku; Basugi, Ayako; Keira, Katsunori; Uehara, Takanori
2012-04-01
To investigate the subsequent behaviour of doctor-shopping patients (defined as those attending multiple hospitals for the same complaint) who consulted our department and factors related to cessation of doctor shopping. Patients who presented without referral to the Department of General Medicine at Chiba University Hospital in Japan (our department) completed a questionnaire at their first visit. A follow-up questionnaire was also sent to them in order to assess doctor shopping after 3 months. Then items in the questionnaires were investigated for significant differences between patients who continued or stopped doctor shopping. Logistic regression analysis was performed with items showing a significant difference between patients who stopped doctor shopping and those who continued it, in order to identify independent determinants of the cessation of shopping. A total of 978 patients who presented spontaneously to our department consented to this study, and 929 patients (95.0%) completed questionnaires correctly. Among them, 203 patients (21.9%) were identified as doctor shoppers. The follow-up survey was completed correctly by 138 patients (68.0%). Among them, 25 patients (18.1%) were found to have continued doctor shopping, which was a significantly lower rate than before (P < 0.001). Logistic regression analysis selected the following factors as independent determinants of the cessation of doctor shopping: 'confirmation of the diagnosis' (odds ratio: 8.12, 95% confidence interval: 1.46-45.26), and 'satisfaction with consultation' (odds ratio: 2.07, 95% confidence interval: 1.42-3.01). Doctor shopping decreased significantly after patients consulted our department, with 'confirmation of the diagnosis' and 'satisfaction with consultation' being identified as contributing factors. © 2010 Blackwell Publishing Ltd.
NASA Technical Reports Server (NTRS)
Carlson, S.; Culler, T.; Muller, R. A.; Tetreault, M.; Perlmutter, S.
1994-01-01
The parallax of all stars of visual magnitude greater than about 6.5 has already been measured. If Nemesis is a main-sequence star 1 parsec away, this requires Nemesis's mass to be less than about 0.4 solar masses. If it were less than about 0.05 solar masses its gravity would be too weak to trigger a comet storm. If Nemesis is on the main sequence, this mass range requires it to be a red dwarf. A red dwarf companion would probably have been missed by standard astronomical surveys. Nearby stars are usually found because they are bright or have high proper motion. However, Nemesis's proper motion would now be 0.01 arcsec/yr, and if it is a red dwarf its magnitude is about 10 - too dim to attract attention. Unfortunately, standard four-color photometry does not distinguish between red dwarfs and giants. So although surveys such as the Dearborn Red Star Catalog list stars by magnitude and spectral type, they do not identify the dwarfs. Every star of the correct spectral type and magnitude must be scrutinized. Our candidate list is a hybrid; candidate red stars are identified in the astrometrically poor Dearborn Red Star Catalog and their positions are corrected using the Hubble Guide Star Catalog. When errors in the Dearborn catalog make it impossible to identify the corresponding Hubble star, the fields are split so that we have one centering on each possible candidate. We are currently scrutinizing 3098 fields, which we believe contain all possible red dwarf candidates in the northern hemisphere. Since our last report the analysis and database software has been completely rebuilt to take advantage of updated hardware, to make the data more accessible, and to implement improved methods of data analysis. The software is now completed and we are eliminating stars every clear night.
Gaudio, Carlo; Mirabelli, Francesca; Pelliccia, Francesco; Francone, Marco; Tanzilli, Gaetano; Di Michele, Sara; Leonetti, Stefania; De Vincentis, Giuseppe; Carbone, Iacopo; Mangieri, Enrico; Catalano, Carlo; Passariello, Roberto
2009-07-10
The 64-slice multidetector-row computed tomography (MDCT) is an accurate noninvasive technique for assessing the degree of luminal narrowing in coronary arteries of patients with chronic ischemic disease. The aim of this study was to determine the value of MDCT, in comparison with invasive coronary angiography (ICA), for detecting the presence and extent of coronary atherosclerotic plaques in a population of asymptomatic, hypertensive patients considered to be at high risk for cardiovascular events. We studied 67 asymptomatic, hypertensive patients at high risk (EuroSCORE >5%). All patients had negative or nondiagnostic findings at exercise stress testing and therefore underwent both MDCT and ICA. In the per-patient analysis, MDCT correctly identified 16/17 (94%) patients with significant coronary artery disease involving at least 1 vessel and 48/50 (96%) normal subjects. In the per-segment analysis, MDCT correctly detected 21/22 (95%) coronary segments with a stenosis ≥50% and 856/868 (98%) normal segments, with a high negative predictivity of normal scans (100%). There was good concordance between MDCT and ICA, with a high Pearson correlation coefficient between the coronary narrowings measured with the two techniques (r=0.84, p<0.01). The mean coronary calcium score was higher for the 17 patients with significant coronary artery disease on ICA than for the 50 patients without (422 ± 223 HU vs 72 ± 21 HU, p<0.001). The ROC curves identified 160 as the best calcium volumetric score cut-off value for identifying ≥1 significant coronary stenosis, with a sensitivity of 88% and specificity of 85%. MDCT is an excellent noninvasive technique for early identification of significant coronary stenoses in high-risk asymptomatic hypertensive patients and might provide unique information for the screening of this broad population.
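The per-segment sensitivity and specificity quoted above follow directly from confusion-matrix counts. A sketch using counts reconstructed from the abstract (21 of 22 stenosed segments detected; 856 of 868 normal segments correctly called normal); the positive and negative predictive values are our derived quantities, not figures stated in the abstract:

```python
# confusion-matrix counts reconstructed from the per-segment analysis
tp, fn = 21, 1      # stenosed segments: detected / missed
tn, fp = 856, 12    # normal segments: correctly called normal / false alarms

sensitivity = tp / (tp + fn)   # fraction of stenosed segments detected
specificity = tn / (tn + fp)   # fraction of normal segments called normal
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value
```

The very high NPV is what supports the abstract's point that a normal MDCT scan reliably rules out significant stenosis in this population.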
Lee, Jaewoong; Park, Yeon Joon; Park, Dong Jin; Park, Kang Gyun; Lee, Hae Kyung
2017-01-01
We evaluated the performance of the BD MAX StaphSR Assay (SR assay; BD, USA) for direct detection of Staphylococcus aureus and methicillin resistance, not only in S. aureus but also in coagulase-negative staphylococci (CNS), from positive blood cultures. From 228 blood culture bottles, 103 S. aureus [45 methicillin-resistant S. aureus (MRSA), 55 methicillin-susceptible S. aureus (MSSA), 3 mixed infections (1 MRSA+Enterococcus faecalis, 1 MSSA+MRCNS, 1 MSSA+MSCNS)] and 125 CNS (102 MRCNS, 23 MSCNS) were identified by Vitek 2. For further analysis, we obtained the cycle threshold (Ct) values from the BD MAX system software to determine an appropriate cutoff value. For discrepancy analysis, conventional mecA/mecC PCR and oxacillin minimum inhibitory concentrations (MICs) were determined. Compared to Vitek 2, the SR assay identified all 103 S. aureus isolates correctly but failed to detect methicillin resistance in three MRSA isolates. All 55 MSSA isolates were correctly identified by the SR assay. In the concordant cases, the highest Ct values for nuc, mecA, and mec right-extremity junction (MREJ) were 25.6, 22, and 22.2, respectively. Therefore, we selected Ct values of 0-27 as the range of positivity; applying this cutoff, the sensitivity/specificity of the SR assay were 100%/100% for detecting S. aureus, and 97.9%/98.1% and 99.0%/95.8% for detecting methicillin resistance in S. aureus and CNS, respectively. We propose a Ct cutoff value for the nuc/mec assay without considering MREJ, because mixed cultures of MSSA and MRCNS were very rare (0.4%) in the positive blood cultures.
Ypinazar, Valmae A; Margolis, Stephen A
2004-07-30
Little is known about teaching medical ethics across cultural and linguistic boundaries. This study examined two successive cohorts of first year medical students in a six-year undergraduate MBBS program. The objective was to investigate whether Arabic-speaking students studying medicine in an Arabic country would be able to correctly identify some of the principles of Western medical ethical reasoning. This cohort study was conducted on first year students in a six-year undergraduate program studying medicine in English, their second language, at a medical school in the Arabian Gulf. The ethics teaching was based on the four-principle approach (autonomy, beneficence, non-maleficence and justice) and delivered by a non-Muslim native English speaker with no knowledge of the Arabic language. Although the course was respectful of Arabic culture and tradition, the content excluded an analysis of Islamic medical ethics and focused on Western ethical reasoning. Following two 45-minute interactive seminars, students in groups of 3 or 4 visited a primary health care centre for one morning, sitting in with an attending physician seeing his or her patients in Arabic. Each student submitted a personal report for summative assessment detailing the ethical issues they had observed. All 62 students enrolled in these courses participated. Each student, acting independently, was able to correctly identify a median of 4 different medical ethical issues (range 2-9) and to correctly identify and accurately label a median of 2 different medical ethical issues (range 2-7). There were no significant correlations between their English language skills or general academic ability and the number or accuracy of ethical issues identified. This study has demonstrated that these students could identify medical ethical issues based on Western constructs, despite learning in English, their second language, being in the third week of their medical school experience, and having had minimal instruction.
This result was independent of their academic and English language skills suggesting that ethical principles as espoused in the four principal approach may be common to the students' Islamic religious beliefs, allowing them to access complex medical ethical reasoning skills at an early stage in the medical curriculum.
Ypinazar, Valmae A; Margolis, Stephen A
2004-01-01
Background Little is known about teaching medical ethics across cultural and linguistic boundaries. This study examined two successive cohorts of first year medical students in a six year undergraduate MBBS program. Methods The objective was to investigate whether Arabic speaking students studying medicine in an Arabic country would be able to correctly identify some of the principles of Western medical ethical reasoning. This cohort study was conducted on first year students in a six-year undergraduate program studying medicine in English, their second language, at a medical school in the Arabian Gulf. The ethics teaching was based on the four-principle approach (autonomy, beneficence, non-maleficence and justice) and delivered by a non-Muslim native English speaker with no knowledge of the Arabic language. Although the course was respectful of Arabic culture and tradition, the content excluded an analysis of Islamic medical ethics and focused on Western ethical reasoning. Following two 45-minute interactive seminars, students in groups of 3 or 4 visited a primary health care centre for one morning, sitting in with an attending physician seeing his or her patients in Arabic. Each student submitted a personal report for summative assessment detailing the ethical issues they had observed. Results All 62 students enrolled in these courses participated. Each student, acting independently, was able to correctly identify a median of 4 different medical ethical issues (range 2–9) and to identify and accurately label a median of 2 different medical ethical issues (range 2–7). There were no significant correlations between their English language skills or general academic ability and the number or accuracy of ethical issues identified. 
Conclusions This study demonstrated that these students could identify medical ethical issues based on Western constructs, despite learning in English, their second language, being in only the third week of their medical school experience, and having received minimal instruction. This result was independent of their academic and English language skills, suggesting that the ethical principles espoused in the four-principle approach may be common to the students' Islamic religious beliefs, allowing them to access complex medical ethical reasoning skills at an early stage in the medical curriculum. PMID:15283868
Chang, Hing-Chiu; Bilgin, Ali; Bernstein, Adam; Trouard, Theodore P.
2018-01-01
Over the past several years, significant efforts have been made to improve the spatial resolution of diffusion-weighted imaging (DWI), aiming at better detecting subtle lesions and more reliably resolving white-matter fiber tracts. A major concern with high-resolution DWI is the limited signal-to-noise ratio (SNR), which may significantly offset the advantages of high spatial resolution. Although the SNR of DWI data can be improved by denoising in post-processing, existing denoising procedures may potentially reduce the anatomic resolvability of high-resolution imaging data. Additionally, non-Gaussian noise induced signal bias in low-SNR DWI data may not always be corrected with existing denoising approaches. Here we report an improved denoising procedure, termed diffusion-matched principal component analysis (DM-PCA), which comprises 1) identifying a group of (not necessarily neighboring) voxels that demonstrate very similar magnitude signal variation patterns along the diffusion dimension, 2) correcting low-frequency phase variations in complex-valued DWI data, 3) performing PCA along the diffusion dimension for real- and imaginary-components (in two separate channels) of phase-corrected DWI voxels with matched diffusion properties, 4) suppressing the noisy PCA components in real- and imaginary-components, separately, of phase-corrected DWI data, and 5) combining real- and imaginary-components of denoised DWI data. Our data show that the new two-channel (i.e., for real- and imaginary-components) DM-PCA denoising procedure performs reliably without noticeably compromising anatomic resolvability. Non-Gaussian noise induced signal bias could also be reduced with the new denoising method. The DM-PCA based denoising procedure should prove highly valuable for high-resolution DWI studies in research and clinical uses. PMID:29694400
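As a rough illustration of steps 3 and 4 above, PCA-based suppression of noisy components along the diffusion dimension for one channel of a matched-voxel group might be sketched as follows (a minimal NumPy toy, not the authors' implementation; the function name, rank choice, and low-rank test signal are invented for the example):

```python
import numpy as np

def pca_denoise(signals, n_keep):
    # signals: (n_voxels, n_diffusion) array holding one channel (e.g. the
    # real part of phase-corrected DWI data) for voxels with matched
    # diffusion properties; n_keep: principal components to retain.
    mean = signals.mean(axis=0, keepdims=True)
    centered = signals - mean
    # SVD along the diffusion dimension; small singular values ~ noise
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    s[n_keep:] = 0.0  # suppress the noisy components
    return (u * s) @ vt + mean

# Toy demonstration: a rank-2 "signal" plus Gaussian noise
rng = np.random.default_rng(0)
truth = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 30))
noisy = truth + 0.5 * rng.normal(size=truth.shape)
denoised = pca_denoise(noisy, n_keep=2)
err_noisy = np.linalg.norm(noisy - truth)
err_denoised = np.linalg.norm(denoised - truth)
```

In the full two-channel DM-PCA procedure this step would be applied separately to the real and imaginary components after low-frequency phase correction, and the denoised channels then recombined.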
Magnetic Resonance Fingerprinting of Adult Brain Tumors: Initial Experience
Badve, Chaitra; Yu, Alice; Dastmalchian, Sara; Rogers, Matthew; Ma, Dan; Jiang, Yun; Margevicius, Seunghee; Pahwa, Shivani; Lu, Ziang; Schluchter, Mark; Sunshine, Jeffrey; Griswold, Mark; Sloan, Andrew; Gulani, Vikas
2016-01-01
Background Magnetic resonance fingerprinting (MRF) allows rapid simultaneous quantification of T1 and T2 relaxation times. This study assesses the utility of MRF in differentiating between common types of adult intra-axial brain tumors. Methods MRF acquisition was performed in 31 patients with untreated intra-axial brain tumors: 17 glioblastomas, 6 WHO grade II lower-grade gliomas and 8 metastases. T1, T2 of the solid tumor (ST), immediate peritumoral white matter (PW), and contralateral white matter (CW) were summarized within each region of interest. Statistical comparisons on mean, standard deviation, skewness and kurtosis were performed using the univariate Wilcoxon rank sum test across various tumor types. Bonferroni correction was used to correct for multiple-comparison testing. Multivariable logistic regression analysis was performed for discrimination between glioblastomas and metastases and area under the receiver operator curve (AUC) was calculated. Results Mean T2 values could differentiate solid tumor regions of lower-grade gliomas from metastases (mean ± SD: 172 ± 53 ms and 105 ± 27 ms, respectively; p = 0.004, significant after Bonferroni correction). Mean T1 of PW surrounding lower-grade gliomas differed from PW around glioblastomas (mean ± SD: 1066 ± 218 ms and 1578 ± 331 ms, respectively; p = 0.004, significant after Bonferroni correction). Logistic regression analysis revealed that mean T2 of ST offered the best separation between glioblastomas and metastases with an AUC of 0.86 (95% CI 0.69–1.00, p < 0.0001). Conclusion MRF allows rapid simultaneous T1, T2 measurement in brain tumors and surrounding tissues. MRF-based relaxometry can identify quantitative differences between solid-tumor regions of lower-grade gliomas and metastases and between peritumoral regions of glioblastomas and lower-grade gliomas. PMID:28034994
MR Fingerprinting of Adult Brain Tumors: Initial Experience.
Badve, C; Yu, A; Dastmalchian, S; Rogers, M; Ma, D; Jiang, Y; Margevicius, S; Pahwa, S; Lu, Z; Schluchter, M; Sunshine, J; Griswold, M; Sloan, A; Gulani, V
2017-03-01
MR fingerprinting allows rapid simultaneous quantification of T1 and T2 relaxation times. This study assessed the utility of MR fingerprinting in differentiating common types of adult intra-axial brain tumors. MR fingerprinting acquisition was performed in 31 patients with untreated intra-axial brain tumors: 17 glioblastomas, 6 World Health Organization grade II lower grade gliomas, and 8 metastases. T1, T2 of the solid tumor, immediate peritumoral white matter, and contralateral white matter were summarized within each ROI. Statistical comparisons on mean, SD, skewness, and kurtosis were performed by using the univariate Wilcoxon rank sum test across various tumor types. Bonferroni correction was used to correct for multiple-comparison testing. Multivariable logistic regression analysis was performed for discrimination between glioblastomas and metastases, and area under the receiver operator curve was calculated. Mean T2 values could differentiate solid tumor regions of lower grade gliomas from metastases (mean, 172 ± 53 ms, and 105 ± 27 ms, respectively; P = .004, significant after Bonferroni correction). The mean T1 of peritumoral white matter surrounding lower grade gliomas differed from peritumoral white matter around glioblastomas (mean, 1066 ± 218 ms, and 1578 ± 331 ms, respectively; P = .004, significant after Bonferroni correction). Logistic regression analysis revealed that the mean T2 of solid tumor offered the best separation between glioblastomas and metastases with an area under the curve of 0.86 (95% CI, 0.69-1.00; P < .0001). MR fingerprinting allows rapid simultaneous T1 and T2 measurement in brain tumors and surrounding tissues. MR fingerprinting-based relaxometry can identify quantitative differences between solid tumor regions of lower grade gliomas and metastases and between peritumoral regions of glioblastomas and lower grade gliomas. © 2017 by American Journal of Neuroradiology.
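The Bonferroni criterion applied here can be illustrated with a minimal sketch; the p-values and the assumed number of comparisons below are hypothetical, chosen only to show how a raw p = 0.004 survives correction when few tests are involved:

```python
def bonferroni(p_values, alpha=0.05):
    # A comparison stays significant if its raw p-value is below
    # alpha divided by the number of comparisons performed.
    m = len(p_values)
    return [p < alpha / m for p in p_values]

# Four hypothetical comparisons; the threshold becomes 0.05 / 4 = 0.0125
flags = bonferroni([0.004, 0.004, 0.20, 0.81])
# flags -> [True, True, False, False]
```

With more comparisons the per-test threshold tightens, which is why the study reports significance explicitly "after Bonferroni correction".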
Doctors' confusion over ratios and percentages in drug solutions: the case for standard labelling
Wheeler, Daniel Wren; Remoundos, Dionysios Dennis; Whittlestone, Kim David; Palmer, Michael Ian; Wheeler, Sarah Jane; Ringrose, Timothy Richard; Menon, David Krishna
2004-01-01
The different ways of expressing concentrations of drugs in solution, as ratios or percentages or mass per unit volume, are a potential cause of confusion that may contribute to dose errors. To assess doctors' understanding of what they signify, all active subscribers to doctors.net.uk, an online community exclusively for UK doctors, were invited to complete a brief web-based multiple-choice questionnaire that explored their familiarity with solutions of adrenaline (expressed as a ratio), lidocaine (expressed as a percentage) and atropine (expressed in mg per mL), and their ability to calculate the correct volume to administer in clinical scenarios relevant to all specialties. 2974 (24.6%) replied. The mean score achieved was 4.80 out of 6 (SD 1.38). Only 85.2% and 65.8% correctly identified the mass of drug in the adrenaline and lidocaine solutions, respectively, whilst 93.1% identified the correct concentration of atropine. More would have administered the correct volume of adrenaline and lidocaine in clinical scenarios (89.4% and 81.0%, respectively) but only 65.5% identified the correct volume of atropine. The labelling of drug solutions as ratios or percentages is antiquated and confusing. Labelling should be standardized to mass per unit volume. PMID:15286190
Christ, Ana Paula Guarnieri; Ramos, Solange Rodrigues; Cayô, Rodrigo; Gales, Ana Cristina; Hachich, Elayse Maria; Sato, Maria Inês Zanoli
2017-05-15
MALDI-TOF mass spectrometry biotyping has proven to be a reliable method for identifying bacteria at the species level based on analysis of the ribosomal protein mass fingerprint. We evaluated the usefulness of this method for identifying Enterococcus species isolated from marine recreational water at Brazilian beaches. A total of 127 Enterococcus spp. isolates were identified to species level by bioMérieux's API® 20 Strep and MALDI-TOF systems. The biochemical test identified 117/127 isolates (92%), whereas MALDI-TOF identified 100% of the isolates, with an agreement of 63% between the methods. The 16S rRNA gene sequencing of isolates with discrepant results showed that MALDI-TOF and API® correctly identified 74% and 11% of these isolates, respectively. This discrepancy probably reflects the bias of the API® system toward identifying clinical isolates. MALDI-TOF proved to be a feasible approach for identifying Enterococcus from environmental matrices, increasing the rapidness and accuracy of results. Copyright © 2017 Elsevier Ltd. All rights reserved.
A method to correct coordinate distortion in EBSD maps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y.B., E-mail: yubz@dtu.dk; Elbrønd, A.; Lin, F.X.
2014-10-15
Drift during electron backscatter diffraction mapping leads to coordinate distortions in resulting orientation maps, which affects, in some cases significantly, the accuracy of analysis. A method, thin plate spline, is introduced and tested to correct such coordinate distortions in the maps after the electron backscatter diffraction measurements. The accuracy of the correction as well as theoretical and practical aspects of using the thin plate spline method is discussed in detail. By comparing with other correction methods, it is shown that the thin plate spline method is most efficient to correct different local distortions in the electron backscatter diffraction maps. Highlights: • A new method is suggested to correct nonlinear spatial distortion in EBSD maps. • The method corrects EBSD maps more precisely than presently available methods. • Errors less than 1–2 pixels are typically obtained. • Direct quantitative analysis of dynamic data are available after this correction.
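A from-scratch sketch of a 2-D thin plate spline correction of the kind described might look like this (NumPy only; the control points and the pure-translation drift in the demo are invented, and the published implementation may differ in details such as smoothing):

```python
import numpy as np

def thin_plate_spline(src, dst):
    # Fit a 2-D thin plate spline mapping control points src onto dst,
    # e.g. features located in the distorted EBSD map and their true
    # positions in an undistorted reference image.
    n = len(src)
    d = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d > 0, d * d * np.log(d), 0.0)  # kernel U(r) = r^2 log r
    P = np.hstack([np.ones((n, 1)), src])            # affine part
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.vstack([dst, np.zeros((3, 2))])
    params = np.linalg.solve(A, b)
    w, a = params[:n], params[n:]

    def transform(pts):
        d = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            U = np.where(d > 0, d * d * np.log(d), 0.0)
        return U @ w + np.hstack([np.ones((len(pts), 1)), pts]) @ a

    return transform

# Demo: five control points displaced by a pure translation (a simple drift)
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.3]])
dst = src + np.array([2.0, 3.0])
correct = thin_plate_spline(src, dst)
corrected = correct(np.array([[0.25, 0.75]]))
```

A pure translation is reproduced exactly by the affine part of the spline; the radial kernel terms take over when the drift varies locally across the map.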
Coding "Corrective Recasts": The Maintenance of Meaning and More Fundamental Problems
ERIC Educational Resources Information Center
Hauser, Eric
2005-01-01
A fair amount of descriptive research in the field of second language acquisition has looked at the presence of what have been labeled corrective recasts. This research has relied on the methodological practice of coding to identify particular turns as "corrective recasts." Often, the coding criteria make use of the notion of the maintenance of…
Investigating the Speech Act of Correction in Iraqi EFL Context
ERIC Educational Resources Information Center
Darweesh, Abbas Deygan; Mehdi, Wafaa Sahib
2016-01-01
The present paper investigates the performance of the Iraqi students for the speech act of correction and how it is realized with status unequal. It attempts to achieve the following aims: (1) Setting out the felicity conditions for the speech act of correction in terms of Searle conditions; (2) Identifying the semantic formulas that realize the…
Can Australians identify snakes?
Morrison, J J; Pearn, J H; Covacevich, J; Nixon, J
1983-07-23
A study of the ability of Australians to identify snakes was undertaken, in which 558 volunteers (primary and secondary schoolchildren, doctors, and university science and medical students) took part. Overall, subjects correctly identified an average of 19% of snakes; 28% of subjects could identify a taipan, 59% a death adder, 18% a tiger snake, 23% an eastern (or common) brown snake, and 0.5% a rough-scaled snake. Eighty-six per cent of subjects who grew up in rural areas could identify a death adder; only 4% of those who grew up in an Australian capital city could identify a nonvenomous python. Male subjects identified snakes more accurately than did female subjects. Doctors and medical students correctly identified an average of 25% of snakes. The ability to identify medically significant Australian snakes was classified according to the observer's background, education, and sex, and according to the individual snake species. Australians need to be better educated about snakes indigenous to this country.
VanWeelden, Kimberly; Cevasco, Andrea M
2010-01-01
The purposes of the current study were to determine geriatric clients' recognition of 32 popular songs and songs from musicals by asking whether they: (a) had heard the songs before; (b) could "name the tune" of each song; and (c) could list the decade in which each song was composed. Additionally, comparisons were made between the geriatric clients' recognition of these songs and music therapy students' recognition of the same songs, based on data from an earlier study (VanWeelden, Juchniewicz, & Cevasco, 2008). Results showed that 90% or more of the geriatric clients had heard 28 of the 32 songs, 80% or more of the graduate students had heard 20 songs, and 80% of the undergraduates had heard 18 songs. The geriatric clients correctly identified 3 songs with 80% or more accuracy; the graduate students also correctly identified these songs, while the undergraduates identified 2 of the 3. Geriatric clients identified the decades of 3 songs with 50% or greater accuracy. Neither the undergraduate nor the graduate students identified any songs by the correct decade with over 50% accuracy. Further results are discussed.
Oberle, Michael; Wohlwend, Nadia; Jonas, Daniel; Maurer, Florian P.; Jost, Geraldine; Tschudin-Sutter, Sarah; Vranckx, Katleen; Egli, Adrian
2016-01-01
Background The technical, biological, and inter-center reproducibility of matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) typing data has not yet been explored. The aim of this study is to compare typing data from multiple centers employing bioinformatics, using bacterial strains from two past outbreaks and non-related strains. Material/Methods Participants received twelve extended-spectrum beta-lactamase-producing E. coli isolates and followed the same standard operating procedure (SOP), including a full-protein extraction protocol. All laboratories provided visually read spectra via flexAnalysis (Bruker, Germany). Raw data from each laboratory allowed calculation of the technical and biological reproducibility between centers using BioNumerics (Applied Maths NV, Belgium). Results Technical and biological reproducibility ranged between 96.8–99.4% and 47.6–94.4%, respectively. The inter-center reproducibility showed comparable clustering among identical isolates. Principal component analysis indicated a higher tendency to cluster within the same center. Therefore, we used a discriminant analysis, which completely separated the clusters. Next, we defined a reference center and performed a statistical analysis to identify peaks specific to the outbreak clusters. Finally, we used a classifier algorithm and a linear support vector machine on the determined peaks as classifier. A validation showed that, within the set of the reference center, identification of the cluster was 100% correct, with a large contrast between the score for the correct cluster and the next best scoring cluster. Conclusions Based on the sufficient technical and biological reproducibility of MALDI-TOF MS based spectra, detection of specific clusters is possible from spectra obtained from different centers. However, we believe that a shared SOP and a bioinformatics approach are required to make the analysis robust and reliable. PMID:27798637
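The final step, a linear support vector machine over the discriminating peaks, can be sketched from scratch (a minimal hinge-loss sub-gradient trainer on an invented two-peak toy table; the study itself used BioNumerics, not this code):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    # Minimal linear SVM: stochastic sub-gradient descent on the
    # L2-regularized hinge loss. X: (n, d) peak intensities; y in {-1, +1}.
    rng = np.random.default_rng(0)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) < 1:      # margin violated: step toward sample
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:                               # margin satisfied: only regularize
                w = (1 - lr * lam) * w
    return w, b

# Invented peak table: outbreak cluster 1 carries a marker peak in column 0,
# cluster 2 in column 1
X = np.array([[5.0, 0.1], [4.5, 0.3], [0.2, 4.8], [0.1, 5.2]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

In practice a new spectrum would be assigned to the cluster whose side of the learned hyperplane its peak intensities fall on.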
Evaluation of the Vitek 2 ANC card for identification of clinical isolates of anaerobic bacteria.
Lee, E H L; Degener, J E; Welling, G W; Veloo, A C M
2011-05-01
An evaluation of the Vitek 2 ANC card (bioMérieux, Marcy l'Etoile, France) was performed with 301 anaerobic isolates. Each strain was identified by 16S rRNA gene sequencing, which is considered to be the reference method. The Vitek 2 ANC card correctly identified 239 (79.4%) of the 301 clinical isolates to the genus level, including 100 species that were not represented in the database. Correct species identification was obtained for 60.1% (181/301) of the clinical isolates. For the isolates not identified to the species level, a correct genus identification was obtained for 47.0% of them (47/100), and 16 were accurately designated not identified. Although the Vitek 2 ANC card allows the rapid and acceptable identification of the most common clinically important anaerobic bacteria within 6 h, improvement is required for the identification of members of the genera Fusobacterium, Prevotella, and Actinomyces and certain Gram-positive anaerobic cocci (GPAC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
none
1998-03-01
This Corrective Action Decision Document (CADD) has been prepared for the Area 9 Unexploded Ordnance (UXO) Landfill (Corrective Action Unit [CAU] 453) in accordance with the Federal Facility Agreement and Consent Order (FFACO) of 1996. Corrective Action Unit 453 is located at the Tonopah Test Range (TTR), Nevada, and is comprised of three individual landfill cells located northwest of Area 9. The cells are listed as one Corrective Action Site (CAS) 09-55-001-0952. The landfill cells have been designated as: • Cell A9-1 • Cell A9-2 • Cell A9-3. The purpose of this CADD is to identify and provide a rationale for the selection of a recommended corrective action alternative for CAU 453. The scope of this CADD consists of the following tasks: • Develop corrective action objectives. • Identify corrective action alternative screening criteria. • Develop corrective action alternatives. • Perform detailed and comparative evaluations of the corrective action alternatives in relation to the corrective action objectives and screening criteria. • Recommend and justify a preferred corrective action alternative for the CAU. In June and July 1997, a corrective action investigation was performed that consisted of activities set forth in the Corrective Action Investigation Plan (CAIP) (DOE/NV, 1997). Subsurface investigation of the soils surrounding the cells revealed no contaminants of concern (COCs) above preliminary action levels. The cell contents were not investigated due to the potential for live UXO. Details concerning the analytical and investigation results can be found in Appendix A of this CADD. Based on the potential exposure pathways, the following corrective action objectives have been identified for CAU 453: • Prevent or mitigate human exposure to subsurface soils containing COCs, solid waste, and/or UXO. • Prevent adverse impacts to groundwater quality.
Based on the review of existing data, future land use, and current operations at the TTR, the following alternatives have been developed for consideration at the Area 9 UXO Landfill CAU: • Alternative 1 - No Further Action • Alternative 2 - Closure in Place by Administrative Controls • Alternative 3 - Closure in Place by Capping • Alternative 4 - Clean Closure by Removal. The corrective action alternatives were evaluated based on four general corrective action standards and five remedy selection decision factors. Based on the results of this evaluation, Alternative 2, Closure in Place by Administrative Controls, was selected as the preferred corrective action alternative. The preferred corrective action alternative was evaluated on its technical merits, focusing on performance, reliability, feasibility, and safety. The alternative was judged to meet all requirements for the technical components evaluated and to represent the most cost-effective corrective action. The alternative meets all applicable state and federal regulations for closure of the site and will reduce potential future exposure pathways to the contents of the landfill. During corrective action implementation, this alternative will present minimal potential threat to site workers. However, appropriate health and safety procedures will be developed and implemented.
Method of Analysis for Determining and Correcting Mirror Deformation due to Gravity
2014-01-01
Clark, James H., III; Penado, F. Ernesto
The as-built beam compressor assembly consists of primary and secondary Zerodur® mirrors held…
ERIC Educational Resources Information Center
Turner, Jill; Rafferty, Lisa A.; Sullivan, Ray; Blake, Amy
2017-01-01
In this action research case study, the researchers used a multiple baseline across two student pairs design to investigate the effects of the error self-correction method on the spelling accuracy behaviors for four fifth-grade students who were identified as being at risk for learning disabilities. The dependent variable was the participants'…
Use of Facial Recognition Software to Identify Disaster Victims With Facial Injuries.
Broach, John; Yong, Rothsovann; Manuell, Mary-Elise; Nichols, Constance
2017-10-01
After large-scale disasters, victim identification frequently presents a challenge and a priority for responders attempting to reunite families and ensure proper identification of deceased persons. The purpose of this investigation was to determine whether currently commercially available facial recognition software can successfully identify disaster victims with facial injuries. Photos of 106 people were taken before and after application of moulage designed to simulate traumatic facial injuries. These photos, as well as photos from volunteers' personal photo collections, were analyzed by using facial recognition software to determine whether this technology could accurately identify a person with facial injuries. The study results suggest that a responder could expect a correct match between submitted photos and photos of injured patients between 39% and 45% of the time, and a much higher percentage of correct returns when submitted photos were of optimal quality, with percentages correct exceeding 90% in most situations. The present results suggest that the use of this software would provide significant benefit to responders. Although a correct result was returned only about 40% of the time, this would still likely represent a benefit for a responder trying to identify hundreds or thousands of victims. (Disaster Med Public Health Preparedness. 2017;11:568-572).
Atomistic cluster alignment method for local order mining in liquids and glasses
NASA Astrophysics Data System (ADS)
Fang, X. W.; Wang, C. Z.; Yao, Y. X.; Ding, Z. J.; Ho, K. M.
2010-11-01
An atomistic cluster alignment method is developed to identify and characterize the local atomic structural order in liquids and glasses. With the "order mining" idea for structurally disordered systems, the method can detect the presence of any type of local order in the system and can quantify the structural similarity between a given set of templates and the aligned clusters in a systematic and unbiased manner. Moreover, population analysis can also be carried out for various types of clusters in the system. The advantages of the method in comparison with other previously developed analysis methods are illustrated by performing the structural analysis for four prototype systems (i.e., pure Al, pure Zr, Zr35Cu65, and Zr36Ni64). The results show that the cluster alignment method can identify various types of short-range orders (SROs) in these systems correctly, while some of these SROs are difficult to capture by most of the currently available analysis methods (e.g., the Voronoi tessellation method). Such a full three-dimensional atomistic analysis method is generic and can be applied to describe the magnitude and nature of noncrystalline ordering in many disordered systems.
NASA Astrophysics Data System (ADS)
Ogruc Ildiz, G.; Arslan, M.; Unsalan, O.; Araujo-Andrade, C.; Kurt, E.; Karatepe, H. T.; Yilmaz, A.; Yalcinkaya, O. B.; Herken, H.
2016-01-01
In this study, a methodology based on Fourier-transform infrared spectroscopy combined with principal component analysis and partial least squares methods is proposed for the analysis of blood plasma samples in order to identify spectral changes correlated with biomarkers associated with schizophrenia and bipolar disorder. Our main goal was to use the spectral information to calibrate statistical models to discriminate and classify blood plasma samples belonging to bipolar and schizophrenic patients. IR spectra of 30 blood plasma samples from each of three groups (bipolar patients, schizophrenic patients, and a healthy control group) were collected. The results obtained from principal component analysis (PCA) show a clear discrimination between the bipolar (BP), schizophrenic (SZ), and control group (CG) blood samples, and also make it possible to identify three main spectral regions that show the major differences correlated with both mental disorders (biomarkers). Furthermore, a model for the classification of the blood samples was calibrated using partial least squares discriminant analysis (PLS-DA), allowing the correct classification of BP, SZ, and CG samples. The results obtained by applying this methodology suggest that it can be used as a complementary diagnostic tool for the detection and discrimination of these mental diseases.
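The PCA stage of such a workflow can be sketched as follows (NumPy only; the synthetic "spectra," with one group carrying an extra band at a single channel, are invented for the illustration and are far simpler than real plasma IR spectra):

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    # Project mean-centered spectra onto their leading principal
    # components via SVD; rows of vt are the component directions.
    centered = spectra - spectra.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

# Synthetic spectra: group B has an extra absorption band at channel 10
rng = np.random.default_rng(1)
base = rng.normal(size=40)
band = np.zeros(40)
band[10] = 3.0
group_a = np.stack([base + 0.1 * rng.normal(size=40) for _ in range(3)])
group_b = np.stack([base + band + 0.1 * rng.normal(size=40) for _ in range(3)])
scores = pca_scores(np.vstack([group_a, group_b]))
# the sign of the first score separates the two groups
```

In the study's pipeline the discriminating spectral information would then feed a PLS-DA model for the actual group assignment; this sketch only shows how the score space exposes the group structure.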
Medical management of deliberate drug overdose: a neglected area for suicide prevention?
Gunnell, D; Ho, D; Murray, V
2004-01-01
Overdoses account for a quarter of all suicides in England. The number of people who survive the immediate effects of their overdose long enough to reach medical attention, but who subsequently die in hospital, is unknown. The aim of this study was to determine the proportion of overdose suicides dying in hospital and to describe their sociodemographic characteristics. Cross-sectional analysis of routinely collected Hospital Episode Statistics data for England (1997 to 1999) was used to identify hospital admissions for overdose among people aged 12+ and the outcome of these admissions. Between 1997 and 1999 there were 233 756 hospital admissions for overdose; 1149 (0.5%) of these ended in the death of the patient. Such deaths accounted for 28% [corrected] of all overdose suicides and 8% [corrected] of total suicides. The median time between admission and death was three days (interquartile range one to nine days). The most commonly identified drugs taken in fatal overdose were paracetamol compounds, benzodiazepines, and tricyclic/tetracyclic antidepressants. Around a quarter of all overdose suicide deaths occur subsequent to hospital admission. Further, more detailed research is required to discover whether better pre-admission and in-hospital medical management of those taking serious overdoses might prevent some of these deaths.
A use case study on late stent thrombosis for ontology-based temporal reasoning and analysis.
Clark, Kim; Sharma, Deepak; Qin, Rui; Chute, Christopher G; Tao, Cui
2014-01-01
In this paper, we show how we have applied the Clinical Narrative Temporal Relation Ontology (CNTRO) and its associated temporal reasoning system (the CNTRO Timeline Library) to trend temporal information within medical device adverse event report narratives. 238 narratives documenting occurrences of late stent thrombosis adverse events from the Food and Drug Administration's (FDA) Manufacturing and User Facility Device Experience (MAUDE) database were annotated and evaluated using the CNTRO Timeline Library to identify, order, and calculate the duration of temporal events. The CNTRO Timeline Library had a 95% accuracy in correctly ordering events within the 238 narratives. 41 narratives included an event in which the duration was documented, and the CNTRO Timeline Library had an 80% accuracy in correctly determining these durations. 77 narratives included documentation of a duration between events, and the CNTRO Timeline Library had a 76% accuracy in determining these durations. This paper also includes an example of how this temporal output from the CNTRO ontology can be used to verify recommendations for length of drug administration, and proposes that these same tools could be applied to other medical device adverse event narratives in order to identify currently unknown temporal trends.
Emotion recognition in girls with conduct problems.
Schwenck, Christina; Gensthaler, Angelika; Romanos, Marcel; Freitag, Christine M; Schneider, Wolfgang; Taurines, Regina
2014-01-01
A deficit in emotion recognition has been suggested to underlie conduct problems. Although several studies have been conducted on this topic so far, most concentrated on male participants. The aim of the current study was to compare recognition of morphed emotional faces in girls with conduct problems (CP) with elevated or low callous-unemotional (CU+ vs. CU-) traits and a matched healthy developing control group (CG). Sixteen girls with CP-CU+, 16 girls with CP-CU- and 32 controls (mean age: 13.23 years, SD=2.33 years) were included. Video clips with morphed faces were presented in two runs to assess emotion recognition. Multivariate analysis of variance with the factors group and run was performed. Girls with CP-CU- needed more time than the CG to encode sad, fearful, and happy faces and they correctly identified sadness less often. Girls with CP-CU+ outperformed the other groups in the identification of fear. Learning effects throughout runs were the same for all groups except that girls with CP-CU- correctly identified fear less often in the second run compared to the first run. Results need to be replicated with comparable tasks, which might result in subgroup-specific therapeutic recommendations.
NASA Astrophysics Data System (ADS)
Nganvongpanit, Korakot; Buddhachat, Kittisak; Piboon, Promporn; Euppayo, Thippaporn; Kaewmong, Patcharaporn; Cherdsukjai, Phaothep; Kittiwatanawong, Kongkiat; Thitaram, Chatchote
2017-04-01
The elemental composition was investigated and applied for identifying the sex and habitat of dugongs, in addition to distinguishing dugong tusks and teeth from other animal wildlife materials such as Asian elephant (Elephas maximus) tusks and tiger (Panthera tigris tigris) canine teeth. A total of 43 dugong tusks, 60 dugong teeth, 40 dolphin teeth, 1 whale tooth, 40 Asian elephant tusks and 20 tiger canine teeth were included in the study. Elemental analyses were conducted using a handheld X-ray fluorescence analyzer (HH-XRF). There was no significant difference in the elemental composition of male and female dugong tusks, whereas the overall accuracy for identifying habitat (the Andaman Sea and the Gulf of Thailand) was high (88.1%). Dolphin teeth were correctly identified 100% of the time. Furthermore, we demonstrated a discrepancy in elemental composition among dugong tusks, Asian elephant tusks and tiger canine teeth, with a high correct prediction rate of 98.2% among these species. Here, we demonstrate the feasible use of HH-XRF for preliminary species classification and habitat determination prior to using more advanced techniques such as molecular biology.
Raghav, Raj; Middleton, Rachael; Ahamed, Rinshiya; Arjunan, Raji; Caliendo, Valentina
2015-12-01
Arterial and venous blood gas analysis is useful in the assessment of tissue oxygenation and ventilation and in the diagnosis of metabolic and respiratory derangements. It can be performed with a relatively small volume of blood in avian patients under emergency situations. Arterial and venous blood gas analysis was performed in 30 healthy gyr falcons (Falco rusticolus) under anaesthesia to establish temperature-corrected reference intervals for arterial blood gas values and to compare them to temperature-corrected venous blood gas values with a portable point-of-care blood gas analyzer (i-STAT 1, Abbott Laboratories, Abbott Park, IL, USA). Statistically significant differences were observed between the temperature-corrected values of pH, partial pressure of carbon dioxide (Pco2), and partial pressure of oxygen (Po2) and the corresponding nontemperature-corrected values of these parameters in both arterial and venous blood. Values of temperature-corrected pH, temperature-corrected Pco2, bicarbonate concentrations, and base excess of extracellular fluid did not differ significantly between arterial and venous blood, suggesting that, in anesthetized gyr falcons, venous blood gas analysis can be used in place of arterial blood gas analysis in clinical situations. Values for hematocrit, measured by the point-of-care analyzer, were significantly lower compared with those obtained by the microhematocrit method.
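The temperature corrections applied by such analyzers are typically of the Rosenthal form used in human medicine. The sketch below illustrates that form only; the avian-specific factors are not given in the abstract, so treat the coefficients as assumptions:

```python
def temperature_correct(ph_37, pco2_37, body_temp_c):
    """Rosenthal-style correction of blood gas values measured at 37 C
    to the patient's actual body temperature. Coefficients are the
    human-medicine conventions, shown for illustration only."""
    dt = body_temp_c - 37.0
    ph_corr = ph_37 - 0.0147 * dt             # pH falls ~0.015 per degree C rise
    pco2_corr = pco2_37 * 10 ** (0.019 * dt)  # PCO2 rises ~4.4% per degree C rise
    return ph_corr, pco2_corr
```

At the measurement temperature itself the correction is the identity: `temperature_correct(7.40, 40.0, 37.0)` returns `(7.40, 40.0)`.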
Zhang, Jiamei; Wang, Yan
2016-01-01
Since roughly sixty percent of ametropes have astigmatism, which degrades visual quality, correcting astigmatism is a central concern in vision correction procedures, especially corneal refractive surgery. Previously, the postoperative spherical equivalent or the residual cylinder in diopters was used as a quantitative index to evaluate astigmatism correction; however, such measures neglect the effect of astigmatic axis shift on the treatment. Treating astigmatism as a vector describes both its magnitude and direction accurately, so vector analysis is increasingly applied in the evaluation of astigmatism correction. This paper reviews current vector analysis methods, evaluation indices, and their application to the correction of astigmatism in corneal refractive surgery.
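Vector treatment of astigmatism is commonly done with Thibos power-vector notation, which maps a sphero-cylindrical prescription onto three orthogonal components; a minimal sketch (the choice of notation is an assumption, not specified by the paper):

```python
import math

def power_vector(sphere, cylinder, axis_deg):
    """Convert a sphero-cylindrical prescription into Thibos power-vector
    components: M (spherical equivalent), J0 (with/against-the-rule
    astigmatism), and J45 (oblique astigmatism)."""
    theta = math.radians(axis_deg)
    M = sphere + cylinder / 2.0
    J0 = -(cylinder / 2.0) * math.cos(2.0 * theta)
    J45 = -(cylinder / 2.0) * math.sin(2.0 * theta)
    return M, J0, J45

def blur_strength(M, J0, J45):
    """Overall blur strength: the Euclidean length of the power vector."""
    return math.sqrt(M * M + J0 * J0 + J45 * J45)
```

Because J0 and J45 carry the axis information, comparing pre- and postoperative power vectors captures the axis shift that a scalar residual cylinder would miss.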
NASA Astrophysics Data System (ADS)
Jiang, Fulin; Tang, Jie; Fu, Dinfa; Huang, Jianping; Zhang, Hui
2018-04-01
Multistage stress-strain curve correction based on an instantaneous friction factor was studied for axisymmetric uniaxial hot compression of 7150 aluminum alloy. Experimental friction factors were calculated based on continuous isothermal axisymmetric uniaxial compression tests at various deformation parameters. Then, an instantaneous friction factor equation was fitted by mathematical analysis. After verification by comparing single-pass flow stress correction with traditional average-friction-factor correction, the instantaneous friction factor equation was applied to correct multistage stress-strain curves. The corrected results were reasonable and were validated by multistage relative-softening calculations. This research provides broad potential for implementing axisymmetric uniaxial compression in multistage physical simulations and for friction optimization in finite element analysis.
Begum, Housne Ara; Mascie-Taylor, Cgn; Nahar, Shamsun
2007-01-01
To examine the efficiency of the Bangladesh Integrated Nutritional Program (BINP) in identifying which infants should be supplemented, whether full supplementation was given for the stipulated period of time, and whether the correct exit criteria from the supplementation programme were used. To test whether targeted food supplementation of infants between 6-12 months of age resulted in enhanced weight gain. Mallickbari Union, Bhaluka, a rural area located about 100 km north of Dhaka, Bangladesh. Five hundred and twenty-six infants followed for 6 to 12 months. Of the 526 infants studied, 368 should have received supplementation based on BINP criteria but only 111 infants (30%) did so, while a further 13% were incorrectly given supplementation. So in total over half (52.8%) of the sample was incorrectly identified for supplementation. In addition, less than a quarter of the infants received the full 90 days of supplementation and close to half of the infants exited the programme without the requisite weight gain. Infants were assigned to one of four groups: correctly supplemented, correctly non-supplemented, incorrectly supplemented or incorrectly non-supplemented. This classification provided natural controls; the correctly supplemented infants versus the incorrectly non-supplemented infants, and the correctly non-supplemented infants versus the incorrectly supplemented infants. There were no significant differences in weight gain between the correctly supplemented group and the incorrectly non-supplemented group or between the correctly non-supplemented and the incorrectly supplemented groups, nor was there any evidence of growth faltering in the incorrectly non-supplemented group. This study found serious programmatic deficiencies - inability to identify growth faltering in infants, failure to supplement for the full time period and incorrect exit procedures. There was no evidence that food supplementation had any impact on improving infant weight gain.
Noble, L D; Gow, J A
1998-03-01
Bacteria belonging to the family Vibrionaceae were suspended using saline and a solution prepared from a marine-cations supplement. The effect of this on the profile of oxidized substrates obtained when using Biolog GN MicroPlates was investigated. Thirty-nine species belonging to the genera Aeromonas, Listonella, Photobacterium, and Vibrio were studied. Of the strains studied, species of Listonella, Photobacterium, and Vibrio could be expected to benefit from a marine-cations supplement that contained Na+, K+, and Mg2+. Bacteria that are not of marine origin are usually suspended in normal saline. Of the 39 species examined, 9 were not included in the Biolog data base and were not identified. Of the 30 remaining species, 50% were identified correctly using either of the suspending solutions. A further 20% were correctly identified only when suspended in saline. Three species, or 10%, were correctly identified only after suspension in the marine-cations supplemented solution. The remaining 20% of species were not correctly identified by either method. Generally, more substrates were oxidized when the bacteria had been suspended in the more complex salts solution. Usually, when identifications were incorrect, the use of the marine-cations supplemented suspending solution had resulted in many more substrates being oxidized. Based on these results, it would be preferable to use saline to suspend the cells when using Biolog for identification of species of Vibrionaceae. A salts solution containing a marine-cations supplement would be preferable for environmental studies where the objective is to determine profiles of substrates that the bacteria have the potential to oxidize. If identifications are done using marine-cations supplemented suspending solution, it would be advisable to include reference cultures to determine the effect of the supplement. 
Of the Vibrio and Listonella species associated with human clinical specimens, 8 out of the 11 studied were identified correctly when either of the suspending solutions was used.
Evaluation of the Biotyper MALDI-TOF MS system for identification of Staphylococcus species.
Zhu, Wenming; Sieradzki, Krzysztof; Albrecht, Valerie; McAllister, Sigrid; Lin, Wen; Stuchlik, Olga; Limbago, Brandi; Pohl, Jan; Kamile Rasheed, J
2015-10-01
The Bruker Biotyper MALDI-TOF MS (Biotyper) system, with a modified 30-minute formic acid extraction method, was evaluated by its ability to identify 216 clinical Staphylococcus isolates from the CDC reference collection comprising 23 species previously identified by conventional biochemical tests. 16S rDNA sequence analysis was used to resolve discrepancies. Of these, 209 (96.8%) isolates were correctly identified: 177 (84.7%) isolates had scores ≥2.0, while 32 (15.3%) had scores between 1.70 and 1.99. The Biotyper identification was inconsistent with the biochemical identification for seven (3.2%) isolates, but the Biotyper identifications were confirmed by 16S rDNA analysis. The distribution of low scores was strongly species-dependent, e.g. only 5% of Staphylococcus epidermidis and 4.8% of Staphylococcus aureus isolates scored below 2.0, while 100% of Staphylococcus cohnii, 75% of Staphylococcus sciuri, and 60% of Staphylococcus caprae produced low but accurate Biotyper scores. Our results demonstrate that the Biotyper can reliably identify Staphylococcus species with greater accuracy than conventional biochemical testing. Broadening of the reference database by inclusion of additional examples of under-represented species could further optimize Biotyper results.
Grant, Ashleigh; Wilkinson, T J; Holman, Derek R; Martin, Michael C
2005-09-01
Analysis of fingerprints has predominantly focused on matching the pattern of ridges to a specific person as a form of identification. The present work focuses on identifying extrinsic materials that are left within a person's fingerprint after recent handling of such materials. Specifically, we employed infrared spectromicroscopy to locate and positively identify microscopic particles from a mixture of common materials in the latent human fingerprints of volunteer subjects. We were able to find and correctly identify all test substances based on their unique infrared spectral signatures. Spectral imaging is demonstrated as a method for automating recognition of specific substances in a fingerprint. We also demonstrate the use of attenuated total reflectance (ATR) and synchrotron-based infrared spectromicroscopy for obtaining high-quality spectra from particles that were too thick or too small, respectively, for reflection/absorption measurements. We believe the application of this rapid, nondestructive analytical technique to the forensic study of latent human fingerprints has the potential to add a new layer of information available to investigators. Using fingerprints to not only identify who was present at a crime scene, but also to link who was handling key materials, will be a powerful investigative tool.
Vaidyanathan, Balu; Radhakrishnan, Reshma; Sarala, Deepa Aravindakshan; Sundaram, Karimassery Ramaiyar; Kumar, Raman Krishna
2009-08-01
Malnutrition is common in children with congenital heart disease (CHD), especially in developing countries. To examine the impact of corrective intervention on the nutritional status of children with CHD and identify factors associated with suboptimal recovery. Consecutive patients with CHD in a tertiary center in South India were evaluated for nutritional status before and 2 years after corrective intervention. Anthropometry was performed at presentation and every 6 months for 2 years, and z scores were compared. Malnutrition was defined as a weight-for-age, height-for-age, and weight/height z score <-2. Determinants of malnutrition were entered into a multivariate logistic regression analysis model. Of 476 patients undergoing corrective intervention (surgical: 344; catheter-based: 132) z scores of less than -2 for weight for age, height for age, and weight/height were recorded in 59%, 26.3%, and 55.9% of patients, respectively, at presentation. On follow-up (425 patients [92.5% of survivors; 20.63 +/- 13.1 months of age]), z scores for weight for age and weight/height improved significantly from the baseline (weight: -1.42 +/- 1.03 vs -2.19 +/- 1.16; P < .001; weight/height: -1.15 +/- 1.25 vs -2.09 +/- 1.3; P < .001). Height-for-age z scores were not significantly different. Malnutrition persisted in 116 (27.3%) patients on follow-up and was associated with a birth weight of
Improving Global Net Surface Heat Flux with Ocean Reanalysis
NASA Astrophysics Data System (ADS)
Carton, J.; Chepurin, G. A.; Chen, L.; Grodsky, S.
2017-12-01
This project addresses the current level of uncertainty in surface heat flux estimates. Time-mean surface heat flux estimates provided by atmospheric reanalyses differ by 10-30 W/m2. They are generally unbalanced globally, and have been shown by ocean simulation studies to be incompatible with ocean temperature and velocity measurements. Here a method is presented 1) to identify the spatial and temporal structure of the underlying errors and 2) to reduce them by exploiting hydrographic observations and the analysis increments produced by an ocean reanalysis using sequential data assimilation. The method is applied to fluxes computed from daily state variables obtained from three widely used reanalyses: MERRA2, ERA-Interim, and JRA-55, during the eight-year period 2007-2014. For each of these, seasonal heat flux errors/corrections are obtained. In a second set of experiments the heat fluxes are corrected and the ocean reanalysis experiments are repeated. This second round of experiments shows that the time-mean error in the corrected fluxes is reduced to within ±5 W/m2 over the interior subtropical and midlatitude oceans, with the most significant changes occurring over the Southern Ocean. The global heat flux imbalance of each reanalysis is reduced to within a few W/m2 with this single correction. Encouragingly, the corrected forms of the three sets of fluxes are also shown to converge. In the final discussion we present experiments beginning with a modified form of the ERA-Interim reanalysis, produced by the DAKKAR program, in which state variables have been individually corrected based on independent measurements. Finally, we discuss the separation of flux error from model error.
Pifferi, Massimo; Bush, Andrew; Pioggia, Giovanni; Di Cicco, Maria; Chinellato, Iolanda; Bodini, Alessandro; Macchia, Pierantonio; Boner, Attilio L
2011-02-01
Asthma control is emphasized by new guidelines but remains poor in many children. Evaluation of control relies on subjective patient recall and may be overestimated by health-care professionals. This study assessed the value of spirometry and fractional exhaled nitric oxide (FeNO) measurements, used alone or in combination, in models developed by a machine learning approach in the objective classification of asthma control according to Global Initiative for Asthma guidelines and tested the model in a second group of children with asthma. Fifty-three children with persistent atopic asthma underwent two to six evaluations of asthma control, including spirometry and FeNO. Soft computing evaluation was performed by means of artificial neural networks and principal component analysis. The model was then tested in a cross-sectional study in an additional 77 children with allergic asthma. The machine learning method was not able to distinguish different levels of control using either spirometry or FeNO values alone. However, their use in combination modeled by soft computing was able to discriminate levels of asthma control. In particular, the model is able to recognize all children with uncontrolled asthma and correctly identify 99.0% of children with totally controlled asthma. In the cross-sectional study, the model prospectively identified correctly all the uncontrolled children and 79.6% of the controlled children. Soft computing analysis of spirometry and FeNO allows objective categorization of asthma control status.
Identifying biologically relevant putative mechanisms in a given phenotype comparison
Hanoudi, Samer; Donato, Michele; Draghici, Sorin
2017-01-01
A major challenge in life science research is understanding the mechanism involved in a given phenotype. The ability to identify the correct mechanisms is needed in order to understand fundamental and very important phenomena such as mechanisms of disease, immune system responses to various challenges, and mechanisms of drug action. The current data analysis methods focus on the identification of the differentially expressed (DE) genes using their fold change and/or p-values. Major shortcomings of this approach are that: i) it does not consider the interactions between genes; ii) its results are sensitive to the selection of the threshold(s) used, and iii) the set of genes produced by this approach is not always conducive to formulating mechanistic hypotheses. Here we present a method that can construct networks of genes that can be considered putative mechanisms. The putative mechanisms constructed by this approach are not limited to the set of DE genes, but also consider all known and relevant gene-gene interactions. We analyzed three real datasets for which both the causes of the phenotype and the true mechanisms were known. We show that the method identified the correct mechanisms when applied to microarray datasets from mouse. We compared the results of our method with the results of the classical approach, showing that our method produces more meaningful biological insights. PMID:28486531
Lupcho, Tiffany; Harrist, Alexia; Van Houten, Clay
2016-10-28
On October 12, 2015, a county health department notified the Wyoming Department of Health of an outbreak of gastrointestinal illness among residents and staff members at a local correctional facility. The majority of ill persons reported onset of symptoms within 1-3 hours after eating lunch served at the facility cafeteria at noon on October 11. Residents and staff members reported that tortilla chips served at the lunch tasted and smelled like chemicals. The Wyoming Department of Health and county health department personnel conducted case-control studies to identify the outbreak source. Consuming lunch at the facility on October 11 was highly associated with illness; multivariate logistic regression analysis found that tortilla chips were the only food item associated with illness. Hexanal and peroxide, markers for rancidity, were detected in tortilla chips and composite food samples from the lunch. No infectious agent was detected in human stool specimens or food samples. Extensive testing of lunch items did not identify any unusual chemical. Epidemiologic and laboratory evidence implicated rancid tortilla chips as the most likely source of illness. This outbreak serves as a reminder to consider alternative food testing methods during outbreaks of unusual gastrointestinal illness when typical foodborne pathogens are not identified. For interpretation of alternative food testing results, samples of each type of food not suspected to be contaminated are needed to serve as controls.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, were loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with the standards in force and by starting implementation of a cryopreservation management module.
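The Risk Priority Number used above is conventionally the product of severity, occurrence, and detectability scores. A hedged illustration follows; the 1-10 scale and the hazard scores are assumptions for demonstration, not values from the study:

```python
def risk_priority_number(severity, occurrence, detectability):
    """FMEA-style RPN: product of three ordinal scores on an assumed
    1-10 scale. By convention a higher detectability score means the
    failure is harder to detect, so it raises the risk."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("scores are expected on a 1-10 scale")
    return severity * occurrence * detectability

# Rank hazards named in the abstract by RPN (scores are hypothetical)
hazards = {
    "loss of dose": (9, 4, 5),
    "loss of tracking": (8, 3, 6),
    "manual transcription slip": (6, 5, 4),
}
ranked = sorted(hazards, key=lambda h: risk_priority_number(*hazards[h]),
                reverse=True)
```

Sorting by RPN surfaces the hazards that deserve monitoring or corrective actions first, which is how the technique prioritizes Critical Control Points.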
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.
1976-01-01
Geographical correction effects on LANDSAT image data are identified, using the nearest neighbor, bilinear interpolation and bicubic interpolation techniques. Potential impacts of registration on image compression and classification are explored.
Corrections Officer Physical Abilities Report. Standards and Training for Corrections Program.
ERIC Educational Resources Information Center
California State Board of Corrections, Sacramento.
A study examined the physical ability requirements for entry-level corrections officers in California. The study, which was undertaken at the request of the California Board of Corrections, had the following objectives: statewide job analysis of the requirements of three entry-level positions in county agencies--corrections officer, probation…
Hand-writing motion tracking with vision-inertial sensor fusion: calibration and error correction.
Zhou, Shengli; Fei, Fei; Zhang, Guanglie; Liu, Yunhui; Li, Wen J
2014-08-25
The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial sensor and vision sensor fusion. Due to the low sampling rates supported by web-based vision sensors and the accumulation of errors in inertial sensors, ego-motion tracking with vision sensors is commonly afflicted by slow update rates, while motion tracking with inertial sensors suffers from rapid deterioration in accuracy over time. This paper starts with a discussion of the algorithms developed for calibrating the two relative rotations of the system using only one reference image. Next, stochastic noises associated with the inertial sensor are identified using Allan Variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial sensor and vision sensor fusion. Compared with results from conventional sensor fusion models, we have shown that ego-motion tracking can be greatly enhanced using the proposed error correction model.
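The Allan Variance step used to identify inertial-sensor noise can be sketched generically; this non-overlapping implementation is illustrative, not the authors' code:

```python
import numpy as np

def allan_variance(samples, dt, m):
    """Non-overlapping Allan variance of a rate-sensor signal sampled
    at fixed interval dt, for cluster size m (averaging time tau = m*dt)."""
    x = np.asarray(samples, dtype=float)
    n_clusters = len(x) // m
    if n_clusters < 2:
        raise ValueError("need at least two clusters of length m")
    # mean of each consecutive cluster of m samples
    means = x[: n_clusters * m].reshape(n_clusters, m).mean(axis=1)
    # half the mean squared difference between successive cluster means
    avar = 0.5 * np.mean(np.diff(means) ** 2)
    return m * dt, avar
```

Plotting the Allan deviation (the square root of avar) against tau on log-log axes exposes the characteristic slopes of white noise (-1/2) and bias instability (0), which is how the stochastic error model for the Kalman filter is typically built.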
The commercial use of satellite data to monitor the potato crop in the Columbia Basin
NASA Technical Reports Server (NTRS)
Waddington, George R., Jr.; Lamb, Frank G.
1990-01-01
The imaging of potato crops with satellites is described and evaluated in terms of the commercial application of the remotely sensed data. The identification and analysis of the crops is accomplished with multiple images acquired from the Landsat MSS and TM systems. The data are processed on a PC with image-processing software which produces images of the seven 1024 x 1024 pixel windows, which are subdivided into 21 512 x 512 pixel windows. Maximizing the imaged data throughout the year aids in the identification of crop types by IR reflectance. The classification techniques involve the use of six or seven spectral classes for particular image dates. Comparisons with ground-truth data show good agreement; for example, potato fields are identified correctly 90 percent of the time. Acreage estimates and crop-condition assessments can be made from satellite data and used for corrective agricultural action.
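Crop discrimination by IR reflectance of the kind described is commonly implemented with a normalized difference vegetation index; a simplified per-pixel sketch (the index and the threshold are assumptions, not details from the abstract):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.
    Healthy vegetation reflects strongly in the near-IR band and
    absorbs red light, pushing NDVI toward +1."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

def classify_pixel(nir, red, threshold=0.4):
    """Toy two-class rule: flag a pixel as vegetated crop above a
    hypothetical NDVI threshold."""
    return "crop" if ndvi(nir, red) > threshold else "non-crop"
```

A real classifier of the sort the abstract describes would use several spectral classes per image date rather than a single threshold, but the IR-versus-red contrast is the same signal.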
Coulomb-free and Coulomb-distorted recolliding quantum orbits in photoelectron holography
NASA Astrophysics Data System (ADS)
Maxwell, A. S.; Figueira de Morisson Faria, C.
2018-06-01
We perform a detailed analysis of the different types of orbits in the Coulomb quantum orbit strong-field approximation (CQSFA), ranging from direct to those undergoing hard collisions. We show that some of them exhibit clear counterparts in the standard formulations of the strong-field approximation for direct and rescattered above-threshold ionization, and show that the standard orbit classification commonly used in Coulomb-corrected models is over-simplified. We identify several types of rescattered orbits, such as those responsible for the low-energy structures reported in the literature, and determine the momentum regions in which they occur. We also find formerly overlooked interference patterns caused by backscattered Coulomb-corrected orbits and assess their effect on photoelectron angular distributions. These orbits improve the agreement of photoelectron angular distributions computed with the CQSFA with the outcome of ab initio methods for high-energy photoelectrons perpendicular to the field polarization axis.
Stanford, T; Pollack, R H
1984-09-01
A cross-sectional study comparing response time and the percentage of items correctly identified in three color vision tests (Pflügertrident, HRR-AO pseudoisochromatic plates, and AO pseudoisochromatic plates) was carried out on 72 women (12 in each decade) ranging from ages 20 to 79 years. Overall, time scores increased across the age groups. Analysis of the correctness scores indicated that the AO pseudoisochromatic plates requiring the identification of numbers was more difficult than the other tests which consisted of geometric forms or the letter E. This differential difficulty increased as a function of age. There was no indication of color defect per se which led to the conclusion that figure complexity may be the key variable determining performance. The results were similar to those obtained by Lee and Pollack (1978) in their study of the Embedded Figures Test.
Motor neurons in Drosophila flight control: could b1 be the one?
NASA Astrophysics Data System (ADS)
Whitehead, Samuel; Shirangi, Troy; Cohen, Itai
Similar to balancing a stick on one's fingertip, flapping flight is inherently unstable; maintaining stability is a delicate balancing act made possible only by near-constant, often-subtle corrective actions. For fruit flies, such corrective responses need not only be robust, but also fast: the Drosophila flight control reflex has a response latency time of ~5 ms, ranking it among the fastest reflexes in the animal kingdom. How is such rapid, robust control implemented physiologically? Here we present an analysis of a putatively crucial component of the Drosophila flight control circuit: the b1 motor neuron. Specifically, we apply mechanical perturbations to freely-flying Drosophila and analyze the differences in kinematics patterns between flies with manipulated and un-manipulated b1 motor neurons. Ultimately, we hope to identify the functional role of b1 in flight stabilization, with the aim of linking it to previously-proposed, reduced-order models for reflexive control.
Jurgens, Anneke; Anderson, Angelika; Moore, Dennis W
2012-01-01
To investigate the integrity with which parents and carers implement PECS in naturalistic settings, utilizing a sample of videos obtained from YouTube. Twenty-one YouTube videos meeting selection criteria were identified. The videos were reviewed for instances of seven implementer errors and, where appropriate, presence of a physical prompter. Forty-three per cent of videos and 61% of PECS exchanges contained errors in parent implementation of specific teaching strategies of the PECS training protocol. Vocal prompts, incorrect error correction and the absence of timely reinforcement occurred most frequently, while gestural prompts, insistence on speech, incorrect use of the open hand prompt and not waiting for the learner to initiate occurred less frequently. Results suggest that parents engage in vocal prompting and incorrect use of the 4-step error correction strategy when using PECS with their children, errors likely to result in prompt dependence.
[Identification of meridian-acupoint diagrams and meridian diagrams].
Shen, Wei-hong
2008-08-01
In acu-moxibustion literature, there are two kinds of diagrams: meridian-acupoint diagrams and meridian diagrams. Because they are very similar in outline, and because typical ancient meridian diagrams are now rarely seen, meridian-acupoint diagrams have long been mistaken for meridian diagrams, resulting in confusion in the acu-moxibustion academic community. The present paper stresses the importance of distinguishing them in academic research and introduces some methods for identifying them correctly. The key points for identifying meridian-acupoint diagrams and meridian diagrams are the legend of the diagram and the drawing style of the ancient charts. In addition, the author gives a detailed explanation of some acu-moxibustion charts that are easily confused. In order to distinguish meridian-acupoint diagrams from meridian diagrams correctly, one should understand the diagrams' intrinsic information as much as possible and analyze them comprehensively.
Healing assessment of tile sets for error tolerance in DNA self-assembly.
Hashempour, M; Mashreghian Arani, Z; Lombardi, F
2008-12-01
An assessment of the effectiveness of healing for error tolerance in DNA self-assembly tile sets for algorithmic/nano-manufacturing applications is presented. Initially, the conditions for correct binding of a tile to an existing aggregate are analysed using a Markovian approach; based on this analysis, it is proved that correct aggregation (as identified with a so-called ideal tile set) is not always met for the existing tile sets for nano-manufacturing. A metric for assessing tile sets for healing by utilising punctures is proposed. Tile sets are investigated and assessed with respect to features such as error (mismatched tile) movement, punctured area and bond types. Subsequently, it is shown that the proposed metric can comprehensively assess the healing effectiveness of a puncture type for a tile set and its capability to attain error tolerance for the desired pattern. Extensive simulation results are provided.