Mouse Assay for Determination of Arsenic Bioavailability in Contaminated Soils
Background: Accurate assessment of human exposure from arsenic-contaminated soils depends on estimating arsenic (As) soil bioavailability. Development of bioavailability assays provides data needed for human health risk assessments and supports development and valida...
Internal Medicine Residents Do Not Accurately Assess Their Medical Knowledge
ERIC Educational Resources Information Center
Jones, Roger; Panda, Mukta; Desbiens, Norman
2008-01-01
Background: Medical knowledge is essential for appropriate patient care; however, the accuracy of internal medicine (IM) residents' assessment of their medical knowledge is unknown. Methods: IM residents predicted their overall percentile performance 1 week (on average) before and after taking the in-training exam (ITE), an objective and well…
ERIC Educational Resources Information Center
Holt, Josh E.; Kinchin, Gary; Clarke, Gill
2012-01-01
Background: Coaches developing young talent in team sports must maximise practice and learning of essential game skills and accurately and continuously assess the performance and potential of each player. Relative age effects highlight an erroneous process of initial and on-going player assessment, based largely on subjective opinions of game…
Ferrario, J; Byrne, C; Dupuy, A E
1997-06-01
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
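The blank-subtraction step described in this abstract can be sketched in a few lines. This is an illustrative sketch only: the detection-limit convention (mean blank + 3 SD) and all concentration values are assumptions, not the authors' actual procedure.

```python
import statistics

def blank_corrected(sample_ppt, blank_ppt):
    """Subtract the mean method-blank background from a measured concentration.

    Returns the corrected value, or None when the measurement cannot be
    distinguished from blank variability (here: mean + 3*SD, an
    illustrative detection-limit convention).
    """
    mean_blank = statistics.mean(blank_ppt)
    sd_blank = statistics.stdev(blank_ppt)
    detection_limit = mean_blank + 3 * sd_blank
    if sample_ppt <= detection_limit:
        return None  # not resolvable from background contamination
    return sample_ppt - mean_blank

# Hypothetical coplanar-PCB method blanks (ppt) and two sample measurements
blanks = [0.8, 1.1, 0.9, 1.0, 1.2]
print(blank_corrected(5.0, blanks))   # well above blank background -> 4.0
print(blank_corrected(1.3, blanks))   # within blank variability -> None
```

Tracking the blanks over the course of a study, as the authors describe, would additionally let the detection limit reflect the observed variability of the background itself.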
Clinical Diagnosis among Diverse Populations: A Multicultural Perspective.
ERIC Educational Resources Information Center
Solomon, Alison
1992-01-01
Discusses four ways in which clinical diagnosis can be detrimental to minority clients: (1) cultural expressions of symptomatology; (2) unreliable research instruments; (3) clinician bias; and (4) institutional racism. Recommendations to avoid misdiagnosis begin with accurate assessment of a client's history and cultural background. (SLD)
Detection of wheat powdery mildew by differentiating background factors using hyperspectral imaging
USDA-ARS?s Scientific Manuscript database
Accurate assessment of crop disease severities is the key for precision application of pesticides to prevent disease infestation. In-situ hyperspectral imaging technology can provide high-resolution imagery with spectra for rapid identification of crop disease and determining disease infestation pat...
ERIC Educational Resources Information Center
Beail, N.; Mitchell, K.; Vlissides, N.; Jackson, T.
2015-01-01
Background: When assessing the mental health needs of people who have intellectual disabilities (ID) it is important to use measures that have good validity and reliability to ensure accurate case recognition and reliable and valid outcome data. Measures developed for this purpose tend to be self-report or by informant report. Multi-trait…
The Joint Effects of Background Selection and Genetic Recombination on Local Gene Genealogies
Zeng, Kai; Charlesworth, Brian
2011-01-01
Background selection, the effects of the continual removal of deleterious mutations by natural selection on variability at linked sites, is potentially a major determinant of DNA sequence variability. However, the joint effects of background selection and genetic recombination on the shape of the neutral gene genealogy have proved hard to study analytically. The only existing formula concerns the mean coalescent time for a pair of alleles, making it difficult to assess the importance of background selection from genome-wide data on sequence polymorphism. Here we develop a structured coalescent model of background selection with recombination and implement it in a computer program that efficiently generates neutral gene genealogies for an arbitrary sample size. We check the validity of the structured coalescent model against forward-in-time simulations and show that it accurately captures the effects of background selection. The model produces more accurate predictions of the mean coalescent time than the existing formula and supports the conclusion that the effect of background selection is greater in the interior of a deleterious region than at its boundaries. The level of linkage disequilibrium between sites is elevated by background selection, to an extent that is well summarized by a change in effective population size. The structured coalescent model is readily extendable to more realistic situations and should prove useful for analyzing genome-wide polymorphism data. PMID:21705759
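The conclusion that background selection is well summarized by a change in effective population size can be illustrated with a toy pairwise-coalescent draw. The reduction factor B = 0.8 and the population size below are arbitrary illustrative numbers, and this sketch does not reproduce the paper's structured coalescent model.

```python
import math
import random

def pairwise_coalescent_time(n_e, rng):
    """Draw the coalescent time (in generations) for two lineages in a
    Wright-Fisher population of effective size n_e: a geometric waiting
    time with per-generation coalescence probability 1/(2*n_e), drawn
    by inverse-transform sampling."""
    p = 1.0 / (2 * n_e)
    return int(math.log(rng.random()) / math.log(1.0 - p)) + 1

rng = random.Random(42)
N, B = 1000, 0.8   # baseline N_e and an illustrative BGS reduction factor
n_e = int(round(N * B))
times = [pairwise_coalescent_time(n_e, rng) for _ in range(20000)]
mean_t = sum(times) / len(times)
# Under the rescaling approximation, E[T2] ~ 2 * N * B = 1600 generations
print(round(mean_t))
```

The paper's point is that this simple rescaling captures the elevation of linkage disequilibrium reasonably well, while the full structured coalescent is needed for finer features of the genealogy.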
Background Adverse cardiovascular events have been linked with PM2.5 exposure obtained primarily from air quality monitors, which rarely co-locate with participant residences. Modeled PM2.5 predictions at finer resolution may more accurately predict residential exposure; however...
Assessing the Eating Behaviors of Low-Income, Urban Adolescents
ERIC Educational Resources Information Center
Fahlman, Mariane; McCaughtry, Nate; Martin, Jeffrey; Garn, Alex C.; Shen, Bo
2012-01-01
Background: There is a need for instruments that can accurately determine the effectiveness of nutrition interventions targeting low-income, inner-city adolescents. Purpose: To examine the development of a valid and reliable eating behavior scale (EBS) for use in school-based nutrition interventions in urban, inner-city communities dominated by…
Background: Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public–private consortia have been formed with the goal of improving chemical safety assessment by the means of high-throughput screening. Methods and results: A database co...
External validation of a simple clinical tool used to predict falls in people with Parkinson disease
Duncan, Ryan P.; Cavanaugh, James T.; Earhart, Gammon M.; Ellis, Terry D.; Ford, Matthew P.; Foreman, K. Bo; Leddy, Abigail L.; Paul, Serene S.; Canning, Colleen G.; Thackeray, Anne; Dibble, Leland E.
2015-01-01
Background Assessment of fall risk in an individual with Parkinson disease (PD) is a critical yet often time-consuming component of patient care. Recently, a simple clinical prediction tool based only on fall history in the previous year, freezing of gait in the past month, and gait velocity <1.1 m/s was developed and accurately predicted future falls in a sample of individuals with PD. Methods We sought to externally validate the utility of the tool by administering it to a different cohort of 171 individuals with PD. Falls were monitored prospectively for 6 months following predictor assessment. Results The tool accurately discriminated future fallers from non-fallers (area under the curve [AUC] = 0.83; 95% CI 0.76–0.89), comparable to the developmental study. Conclusion The results validated the utility of the tool, allowing clinicians to quickly and accurately identify an individual's risk of an impending fall. PMID:26003412
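The three-predictor tool can be sketched as a simple additive score. The equal weighting and the "two or more predictors" high-risk convention below are illustrative assumptions; the validated scoring rule and cutoff should be taken from the original study.

```python
def fall_risk_score(fell_last_year: bool, fog_last_month: bool,
                    gait_velocity_ms: float) -> int:
    """Count the three dichotomous predictors named in the abstract:
    a fall in the previous year, freezing of gait in the past month,
    and gait velocity below 1.1 m/s."""
    return (int(fell_last_year)
            + int(fog_last_month)
            + int(gait_velocity_ms < 1.1))

def high_risk(score: int) -> bool:
    # Illustrative cutoff, not the threshold validated in the study
    return score >= 2

print(fall_risk_score(True, False, 1.3))   # 1
print(fall_risk_score(True, True, 0.9))    # 3
```

The appeal of such a tool, as the abstract notes, is that all three inputs can be collected in a few minutes of routine clinical assessment.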
Segmentation of bone pixels from EROI Image using clustering method for bone age assessment
NASA Astrophysics Data System (ADS)
Bakthula, Rajitha; Agarwal, Suneeta
2016-03-01
The bone age of a human can be identified from the ossification of the carpal and epiphyseal bones, which is informative up to the teenage years. Accurate age estimation depends on a clean separation of bone pixels from soft-tissue pixels in the ROI image. Traditional approaches such as Canny, Sobel, clustering, region growing, and watershed can be applied, but these methods require proper pre-processing and accurate initial seed-point estimation to give accurate results. This paper therefore proposes a new approach to segment bone from soft-tissue and background pixels. First, pixels are enhanced using BPE and edges are identified by HIPI; a K-means clustering is then applied for segmentation. The performance of the proposed approach has been evaluated and compared with existing methods.
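A minimal 1-D K-means over pixel intensities illustrates the clustering step (background vs. soft tissue vs. bone). The intensity values and initial centers are synthetic, and the paper's BPE enhancement and HIPI edge steps are not reproduced here.

```python
def kmeans_1d(values, centers, iters=50):
    """Plain K-means on scalar intensities: assign each value to the
    nearest center, recompute centers as cluster means, repeat until
    the centers stop moving."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[i].append(v)
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:
            break
        centers = new
    return centers, clusters

# Synthetic intensities: background ~10, soft tissue ~90, bone ~200
pixels = [8, 12, 10, 88, 92, 95, 198, 205, 202]
centers, clusters = kmeans_1d(pixels, [0.0, 100.0, 255.0])
print([round(c) for c in centers])  # -> [10, 92, 202]
```

In a real radiograph the clusters overlap far more than in this toy data, which is why the paper precedes clustering with enhancement and edge detection.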
ERIC Educational Resources Information Center
Chen, Ruey-Shin; Liu, I-Fan
2017-01-01
Currently, e-learning systems are being widely used in all stages of education. However, it is difficult for school administrators to accurately assess the actual usage performance of a new system, especially when an organization wishes to update the system for users from different backgrounds using new devices such as smartphones. To allow school…
ERIC Educational Resources Information Center
Janssen, R.; Maes, B.
2013-01-01
Background: People with intellectual disabilities (ID) have an increased vulnerability to develop psychiatric problems. Moreover, the early recognition and the accurate diagnosis of psychiatric disorders in the population of persons with ID are challenging. Method: A Dutch version of the Mini PAS-ADD, which is a screening instrument for…
Accurate assessment and identification of naturally occurring cellular cobalamins
Hannibal, Luciana; Axhemi, Armend; Glushchenko, Alla V.; Moreira, Edward S.; Brasch, Nicola E.; Jacobsen, Donald W.
2009-01-01
Background Accurate assessment of cobalamin profiles in human serum, cells, and tissues may have clinical diagnostic value. However, non-alkyl forms of cobalamin undergo β-axial ligand exchange reactions during extraction, which leads to inaccurate profiles having little or no diagnostic value. Methods Experiments were designed to: 1) assess β-axial ligand exchange chemistry during the extraction and isolation of cobalamins from cultured bovine aortic endothelial cells, human foreskin fibroblasts, and human hepatoma HepG2 cells, and 2) establish extraction conditions that would provide a more accurate assessment of endogenous forms containing both exchangeable and non-exchangeable β-axial ligands. Results The cobalamin profile of cells grown in the presence of [57Co]-cyanocobalamin as a source of vitamin B12 shows that the following derivatives are present: [57Co]-aquacobalamin, [57Co]-glutathionylcobalamin, [57Co]-sulfitocobalamin, [57Co]-cyanocobalamin, [57Co]-adenosylcobalamin, [57Co]-methylcobalamin, as well as other yet unidentified corrinoids. When the extraction is performed in the presence of excess cold aquacobalamin acting as a scavenger cobalamin (i.e., "cold trapping"), the recovery of both [57Co]-glutathionylcobalamin and [57Co]-sulfitocobalamin decreases to low but consistent levels. In contrast, the [57Co]-nitrocobalamin observed in extracts prepared without excess aquacobalamin is undetectable in extracts prepared with cold trapping. Conclusions This demonstrates that β-ligand exchange occurs with non-covalently bound β-ligands. The exception to this observation is cyanocobalamin, with a non-covalent but non-exchangeable −CN group. It is now possible to obtain accurate profiles of cellular cobalamins. PMID:18973458
Diagnosis of adolescent polycystic ovary syndrome.
Hardy, Tristan S E; Norman, Robert J
2013-08-01
Polycystic ovary syndrome (PCOS) is the most common endocrinopathy affecting women of reproductive age and is increasingly recognized as a disorder manifesting in the peripubertal and adolescent period. Diagnosis in the adolescent is difficult due to the high background rate of menstrual irregularity, the high prevalence of polycystic ovarian morphology and hyperandrogenic features in this population. Recent guidelines suggest that menstrual irregularity for over two years, reduced reliance on ultrasound diagnosis of polycystic ovarian morphology, and accurate assessment of hyperandrogenic and metabolic features are suitable strategies for the diagnosis of PCOS in the adolescent. Accurate diagnosis is important given the long-term implications of the disorder, with increasing emphasis on metabolic sequelae. Copyright © 2013 Elsevier Inc. All rights reserved.
Fábián, Anna; Bor, Renáta; Farkas, Klaudia; Bálint, Anita; Milassin, Ágnes; Rutka, Mariann; Tiszlavicz, László; Wittmann, Tibor; Nagy, Ferenc; Molnár, Tamás; Szepes, Zoltán
2016-01-01
Background. Rectal tumour management depends highly on locoregional extension. Rectal endoscopic ultrasound (ERUS) is a good alternative to computed tomography and magnetic resonance imaging. However, in Hungary only a small proportion of rectal tumours are examined with ERUS. Methods. Our retrospective study (2006-2012) evaluates the diagnostic accuracy of ERUS and compares the results, the first data from Central Europe, with those from Western Europe. The effects of neoadjuvant therapy, rectal probe type, and investigator's experience were also assessed. Results. 311 of the 647 ERUS examinations assessed locoregional extension. Histological comparison was available in 177 cases: 67 patients underwent surgery alone; 110 received neoadjuvant chemoradiotherapy (CRT); ERUS preceded CRT in 77 and followed it in 33 patients. T-staging was accurate in 72% of primarily operated patients. N-staging was less accurate (62%). CRT impaired staging accuracy (64% and 59% for T- and N-staging, respectively). Rigid probes were more accurate (79%). At least 30 examinations are needed to master the technique. Conclusions. The sensitivity of ERUS is consistent with the literature. ERUS is easy to learn and more accurate in early stages but unnecessary for restaging after CRT. Staging accuracy is similar in Western and Central Europe, although the number of examinations should be increased.
Yu, Mi; Kang, Kyung Ja
2017-06-01
Accurate, skilled communication in handover is of high priority in maintaining patients' safety. Nursing students have few chances to practice nurse-to-doctor handover in clinical training, and some have little knowledge of what constitutes effective handover or lack confidence in conveying information. This study aimed to develop a role-play simulation program involving the Situation, Background, Assessment, Recommendation (SBAR) technique for nurse-to-doctor handover; implement the program; and analyze its effects on SBAR communication, communication clarity, handover confidence, and education satisfaction in nursing students. A non-equivalent control-group pretest-posttest quasi-experimental design was used with a convenience sample of 62 senior nursing students from two Korean universities. The differences in SBAR communication, communication clarity, handover confidence, and education satisfaction between the control and intervention groups were measured before and after program participation. The intervention group showed higher SBAR communication scores (t=-3.05, p=0.003), communication clarity scores in doctor-notification scenarios (t=-5.50, p<0.001), and SBAR education satisfaction scores (t=-4.94, p<0.001) relative to those of the control group. There was no significant difference in handover confidence between groups (t=-1.97, p=0.054). The role-play simulation program developed in this study could be used to promote communication skills in nurse-to-doctor handover and cultivate communicative competence in nursing students. Copyright © 2017. Published by Elsevier Ltd.
Impact of reconstruction parameters on quantitative I-131 SPECT
NASA Astrophysics Data System (ADS)
van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.
2016-07-01
Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods for these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction, and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast-to-noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter-corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction: the quantification error relative to a dose-calibrator-derived measurement was <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction than with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since the factor may be patient dependent. Monte Carlo-based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
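The TEW estimate compared above is the standard triple-energy-window method: scatter under the photopeak is approximated by a trapezoid spanned by counts in two narrow flanking windows. The window widths and counts below are illustrative numbers, not data from this study.

```python
def tew_scatter(c_low, c_high, w_low, w_high, w_peak):
    """Triple-energy-window scatter estimate: counts in the lower and upper
    flanking sub-windows (c_low/c_high, widths w_low/w_high in keV) define
    a trapezoid that is scaled to the photopeak window width w_peak."""
    return (c_low / w_low + c_high / w_high) * w_peak / 2.0

def primary_counts(c_peak, c_low, c_high, w_low=6.0, w_high=6.0, w_peak=60.0):
    """Scatter-corrected (primary) photopeak counts, floored at zero."""
    scatter = tew_scatter(c_low, c_high, w_low, w_high, w_peak)
    return max(c_peak - scatter, 0.0)

# Illustrative I-131 364 keV photopeak window with 6 keV flanking windows
print(primary_counts(c_peak=10000, c_low=90, c_high=30))  # -> 9400.0
```

The weighting factor mentioned in the abstract would multiply this scatter estimate; the point made there is that such a factor may vary from patient to patient, which limits TEW in clinical practice.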
Castro, Liliana Norma; Rendina, Alicia Elena; Orgeira, Maria Julia
2018-06-15
Contamination assessment in riverbed sediments depends on the accurate determination of background values. The aim of this study is to assess the degree of contamination and to evaluate the most adequate background for the determination of anthropogenic contamination by Cd, Cr, Cu, Ni, Pb and Zn in bed sediments of the Pampean area river basin (Matanza-Riachuelo River and tributary streams), Argentina. Geo-accumulation index (Igeo) values were calculated using selected lithogenic backgrounds (loess, loessoid sediments and paleosoils), the metal concentrations in the residual fraction (F4) of riverbed sediments, and a global average shale often applied in the estimation of toxic-metal Igeo. The IgeoF4, IgeoLZB and most of the other Igeos indicated that in areas used mainly for agriculture and cattle grazing, the superficial sediments were uncontaminated with Cd, Cr, Cu and Zn, and slightly contaminated with Ni and Pb. Conversely, in areas dedicated to urban and industrial use, metal contamination was greater. Overall, the relatively significant anthropogenic contamination of Cr > Pb ≥ Cu > Zn > Ni > Cd in the Riachuelo River area was associated with metallurgic activities, tanning and industrial waste. The comparative analysis of the different background values suggested that Buenos Aires' "pristine" loess can be recommended for evaluating the Igeo of riverbed sediments in the Pampean area. To complement the selected background, the enrichment factor (EF) normalized to Al was also calculated. In this case study, the Igeo and the EF using the LZB background displayed the same trend, showing the greatest degree of contamination, as expected, in the Riachuelo samples (RIA 1 and RIA 2) located in the urban/industrial area. Copyright © 2018 Elsevier B.V. All rights reserved.
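The two indices used above have standard definitions: the Müller geo-accumulation index Igeo = log2(Cn / (1.5·Bn)), and the enrichment factor normalized to a conservative element such as Al. The concentrations below are made-up illustrative values, not the study's measured data.

```python
import math

def igeo(c_sample, c_background):
    """Mueller geo-accumulation index: log2 of the sample concentration over
    1.5x the lithogenic background (the factor 1.5 buffers natural
    background variability)."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(c_metal, c_al, bg_metal, bg_al):
    """Enrichment factor normalized to Al: the metal/Al ratio in the sample
    divided by the same ratio in the chosen background material."""
    return (c_metal / c_al) / (bg_metal / bg_al)

# Illustrative concentrations (mg/kg); background choice drives both indices
print(round(igeo(c_sample=120.0, c_background=20.0), 2))                 # -> 2.0
print(round(enrichment_factor(120.0, 60000.0, 20.0, 70000.0), 2))        # -> 7.0
```

Both indices scale directly with the assumed background, which is why the study's comparison of loess, loessoid, paleosoil and average-shale backgrounds matters for the final contamination classification.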
The Cyber War: Maintaining and Controlling the Key Cyber Terrain of the Cyberspace Domain
2016-06-26
...solution strategy to assess options that will enable the commander to realize the Air Force's cyber mission. Recommendations will be made that will ... present a solution to assist the JFC in achieving cyberspace dominance. Background: In the modern world of advanced technology, control of ... the solutions are: 1) timely identification of key cyber terrain, 2) accurate mapping of the cyber terrain, 3) defense of key cyber terrain, and 4) ...
Measurement Issues in Health Disparities Research
Ramírez, Mildred; Ford, Marvella E; Stewart, Anita L; A Teresi, Jeanne
2005-01-01
Background Racial and ethnic disparities in health and health care have been documented; the elimination of such disparities is currently part of a national agenda. In order to meet this national objective, it is necessary that measures identify accurately the true prevalence of the construct of interest across diverse groups. Measurement error might lead to biased results, e.g., estimates of prevalence, magnitude of risks, and differences in mean scores. Addressing measurement issues in the assessment of health status may contribute to a better understanding of health issues in cross-cultural research. Objective To provide a brief overview of issues regarding measurement in diverse populations. Findings Approaches used to assess the magnitude and nature of bias in measures when applied to diverse groups include qualitative analyses, classic psychometric studies, as well as more modern psychometric methods. These approaches should be applied sequentially, and/or iteratively during the development of measures. Conclusions Investigators performing comparative studies face the challenge of addressing measurement equivalence, crucial for obtaining accurate results in cross-cultural comparisons. PMID:16179000
NASA Astrophysics Data System (ADS)
Kulisek, J. A.; Schweppe, J. E.; Stave, S. C.; Bernacki, B. E.; Jordan, D. V.; Stewart, T. N.; Seifert, C. E.; Kernan, W. J.
2015-06-01
Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this challenge, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements without the need for human analyst intervention. The method can be calibrated using radiation transport simulations along with data from previous flights over areas for which the isotopic composition need not be known. Over the examined measured and simulated data sets, the method generated accurate background estimates even in the presence of a strong 60Co source. The potential to track large and abrupt changes in background spectral shape and magnitude was demonstrated. The method can be implemented fairly easily in most modern computing languages and environments.
Aerts, Sam; Deschrijver, Dirk; Joseph, Wout; Verloock, Leen; Goeminne, Francis; Martens, Luc; Dhaene, Tom
2013-05-01
Human exposure to background radiofrequency electromagnetic fields (RF-EMF) has been increasing with the introduction of new technologies. There is a definite need for the quantification of RF-EMF exposure, but a robust exposure assessment is not yet possible, mainly due to the lack of a fast and efficient measurement procedure. In this article, a new procedure is proposed for accurately mapping the exposure to base station radiation in an outdoor environment based on surrogate modeling and sequential design, an entirely new approach in the domain of dosimetry for human RF exposure. We tested our procedure in an urban area of about 0.04 km² for Global System for Mobile Communications (GSM) technology at 900 MHz (GSM900) using a personal exposimeter. Fifty measurement locations were sufficient to obtain a coarse street exposure map, locating regions of high and low exposure; 70 measurement locations were sufficient to characterize the electric field distribution in the area and build an accurate predictive interpolation model. Hence, accurate GSM900 downlink outdoor exposure maps (for use in, e.g., governmental risk communication and epidemiological studies) are developed by combining the proven efficiency of sequential design with the speed of exposimeter measurements and their ease of handling. Copyright © 2013 Wiley Periodicals, Inc.
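Predicting exposure between scattered measurement points can be illustrated with simple inverse-distance weighting; the study itself used a more sophisticated surrogate model with sequential design, and the coordinates and field values below are invented for the sketch.

```python
def idw_predict(x, y, samples, power=2.0):
    """Inverse-distance-weighted prediction of field strength at (x, y)
    from scattered (xi, yi, value) measurements - a crude stand-in for
    the surrogate model described in the abstract."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exactly on a measurement location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical GSM900 E-field measurements: (x in m, y in m, V/m)
samples = [(0, 0, 0.5), (100, 0, 1.5), (0, 100, 1.0)]
print(round(idw_predict(50, 0, samples), 3))  # -> 1.0
```

Sequential design then chooses the *next* measurement location where the model is most uncertain, which is how the study reached an accurate map with only 70 locations.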
Mondal, Suman B.; Gao, Shengkui; Zhu, Nan; Hebimana-Griffin, LeMoyne; Akers, Walter J.; Liang, Rongguang; Gruev, Viktor; Margenthaler, Julie; Achilefu, Samuel
2017-01-01
Background The inability to directly visualize the patient and surgical site limits the use of current near infrared fluorescence-guided surgery systems for real-time sentinel lymph node biopsy and tumor margin assessment. Methods We evaluated an optical see-through goggle augmented imaging and navigation system (GAINS) for near-infrared fluorescence-guided surgery. Tumor-bearing mice injected with a near infrared cancer-targeting agent underwent fluorescence-guided tumor resection. Female Yorkshire pigs received hind leg intradermal indocyanine green injection and underwent fluorescence-guided popliteal lymph node resection. Four breast cancer patients received 99mTc-sulfur colloid and indocyanine green retroareolarly, before undergoing sentinel lymph node biopsy using radioactive tracking and fluorescence imaging. Three other breast cancer patients received indocyanine green retroareolarly before undergoing standard-of-care partial mastectomy, followed by fluorescence imaging of resected tumor and tumor cavity for margin assessment. Results Using near-infrared fluorescence from the dyes, the optical see-through GAINS accurately identified all mouse tumors, pig lymphatics, and 4 pig popliteal lymph nodes with high signal-to-background ratio. In 4 human breast cancer patients, 11 sentinel lymph nodes were identified with a detection sensitivity of 86.67 ± 0.27% for radioactive tracking and 100% for GAINS. Tumor margin status was accurately predicted by GAINS in all three patients, including clear margins in patients 1 and 2 and positive margins in patient 3 as confirmed by paraffin embedded section histopathology. Conclusions The optical see-through GAINS prototype enhances near infrared fluorescence-guided surgery for sentinel lymph node biopsy and tumor margin assessment in breast cancer patients without disrupting the surgical workflow in the operating room. PMID:28213790
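The signal-to-background ratio (SBR) quoted in such studies is simply the mean fluorescence intensity in the target region over the mean intensity of adjacent tissue. The pixel intensities below are hypothetical, not data from this study.

```python
from statistics import mean

def signal_to_background(signal_pixels, background_pixels):
    """Mean NIR fluorescence intensity in the target ROI divided by the
    mean intensity of adjacent background tissue - the SBR figure of
    merit cited in fluorescence-guided surgery work."""
    return mean(signal_pixels) / mean(background_pixels)

# Hypothetical intensities: lymph-node ROI vs. nearby tissue
print(signal_to_background([820, 860, 840], [95, 105, 100]))  # -> 8.4
```

A high SBR is what lets a display system like GAINS overlay the fluorescing node crisply on the surgeon's view.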
NASA Astrophysics Data System (ADS)
Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.
2018-05-01
Accurate remote-sensing water extraction is one of the primary tasks in studying a watershed's ecological environment. The Yanhe water system has the typical characteristics of a small water volume and narrow river channels, which makes conventional water extraction methods such as the Normalized Difference Water Index (NDWI) difficult to apply. A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. In the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds in Landsat/TM images of the Yanhe watershed were evaluated. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood were applied before NDWI water extraction to separate built-up land from small linear rivers. With the proposed method, a water map was extracted from 2010 Landsat/TM images of the watershed in China. An accuracy assessment compared the proposed method with conventional water indexes such as NDWI, the Modified NDWI (MNDWI), the Enhanced Water Index (EWI), and the Automated Water Extraction Index (AWEI). The results show that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively suppress confusing background objects compared to the conventional water indexes. The MST-NDWI method integrates NDWI with multi-spectral threshold segmentation, yielding markedly more accurate water extraction in the Yanhe watershed.
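The underlying NDWI formula is standard (McFeeters: (Green − NIR)/(Green + NIR)). The pre-segmentation sketched below uses only one of the paper's three band thresholds, and the cutoff values are illustrative guesses, not the maximum-likelihood thresholds from the study.

```python
def ndwi(green, nir):
    """McFeeters NDWI: (Green - NIR) / (Green + NIR); water tends positive."""
    return (green - nir) / (green + nir)

def mst_ndwi_is_water(green, nir, swir, ndwi_thresh=0.0, swir_max=0.12):
    """Sketch of the MST-NDWI idea: screen out bright built-up/soil pixels
    with a SWIR (TM5) threshold before applying the NDWI test."""
    if swir > swir_max:   # built-up land and dry soil reflect strongly in SWIR
        return False
    return ndwi(green, nir) > ndwi_thresh

# Surface reflectances (green, NIR, SWIR) for two synthetic pixels
print(mst_ndwi_is_water(0.10, 0.04, 0.03))  # narrow-river pixel -> True
print(mst_ndwi_is_water(0.20, 0.25, 0.30))  # built-up pixel -> False
```

Screening the confusable classes first is what allows a simple NDWI threshold to work even for the narrow channels that defeat NDWI on its own.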
A Temperature-Monitoring Vaginal Ring for Measuring Adherence
Boyd, Peter; Desjardins, Delphine; Kumar, Sandeep; Fetherston, Susan M.; Le-Grand, Roger; Dereuddre-Bosquet, Nathalie; Helgadóttir, Berglind; Bjarnason, Ásgeir; Narasimhan, Manjula; Malcolm, R. Karl
2015-01-01
Background Product adherence is a pivotal issue in the development of effective vaginal microbicides to reduce sexual transmission of HIV. To date, the six Phase III studies of vaginal gel products have relied primarily on self-reporting of adherence. Accurate and reliable methods for monitoring user adherence to microbicide-releasing vaginal rings have yet to be established. Methods A silicone elastomer vaginal ring prototype containing an embedded, miniature temperature logger has been developed and tested in vitro and in cynomolgus macaques for its potential to continuously monitor environmental temperature and accurately determine episodes of ring insertion and removal. Results In vitro studies demonstrated that DST nano-T temperature loggers encapsulated in medical grade silicone elastomer were able to accurately and continuously measure environmental temperature. The devices responded quickly to temperature changes despite being embedded in different thicknesses of silicone elastomer. Prototype vaginal rings measured higher temperatures compared with a subcutaneously implanted device, showed high sensitivity to diurnal fluctuations in vaginal temperature, and accurately detected periods of ring removal when tested in macaques. Conclusions Vaginal rings containing embedded temperature loggers may be useful in the assessment of product adherence in late-stage clinical trials. PMID:25965956
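As a sketch of how such temperature logs might be turned into adherence data, the following flags sustained drops below a body-temperature cut-off as candidate removal episodes. The 33 °C threshold and minimum run length are hypothetical choices for illustration, not values from the study:

```python
def removal_episodes(temps, threshold=33.0, min_samples=2):
    """Flag contiguous runs of logged temperatures below a (hypothetical)
    body-temperature threshold as candidate ring-removal episodes.
    Returns (start, end) index pairs with end exclusive; runs shorter
    than min_samples are treated as sensor noise and ignored."""
    episodes, start = [], None
    for i, t in enumerate(temps):
        if t < threshold:
            if start is None:
                start = i          # a possible removal begins here
        else:
            if start is not None and i - start >= min_samples:
                episodes.append((start, i))
            start = None
    if start is not None and len(temps) - start >= min_samples:
        episodes.append((start, len(temps)))  # removal running to end of log
    return episodes
```

With hourly sampling, the episode lengths give a direct estimate of cumulative time the ring was out.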
Accounting for orphaned aftershocks in the earthquake background rate
Van Der Elst, Nicholas
2017-01-01
Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
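The backbone of this argument, the slow Omori decay, is easy to illustrate numerically. In the sketch below the K, c, and p values are generic placeholders, not the paper's California estimates:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate ~ K / (c + t)**p, with t in days."""
    return K / (c + t) ** p

def omori_count(t1, t2, K=100.0, c=0.1, p=1.1):
    """Expected number of aftershocks between times t1 and t2, obtained
    by integrating the Omori rate analytically (valid for p != 1)."""
    antideriv = lambda t: (c + t) ** (1.0 - p) / (1.0 - p)
    return K * (antideriv(t2) - antideriv(t1))

# Fraction of a ten-year aftershock sequence that falls after a catalogue
# opening one year post-mainshock: these events would appear "orphaned"
# and inflate the apparent background rate.
orphaned = omori_count(365.0, 3650.0) / omori_count(0.0, 3650.0)
```

Even with these generic parameters a nontrivial fraction of the sequence lands outside the catalogue window, which is the heavy-tail effect the extended ETAS model accounts for.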
Airborne Particulate Threat Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patrick Treado; Oksana Klueva; Jeffrey Beckstead
Aerosol threat detection requires the ability to discern between threat agents and ambient background particulate matter (PM) encountered in the environment. To date, Raman imaging technology has been demonstrated as an effective strategy for the assessment of threat agents in the presence of specific, complex backgrounds. Expanding our understanding of the composition of ambient particulate matter background will improve the overall performance of Raman Chemical Imaging (RCI) detection strategies for the autonomous detection of airborne chemical and biological hazards. Improving RCI detection performance is strategic due to its potential to become a widely exploited detection approach by several U.S. government agencies. To improve the understanding of the ambient PM background with subsequent improvement in Raman threat detection capability, ChemImage undertook the Airborne Particulate Threat Assessment (APTA) Project in 2005-2008 through a collaborative effort with the National Energy Technology Laboratory (NETL), under cooperative agreement number DE-FC26-05NT42594. During Phase 1 of the program, a novel PM classification based on molecular composition was developed based on a comprehensive review of the scientific literature. In addition, testing protocols were developed for ambient PM characterization. A signature database was developed based on a variety of microanalytical techniques, including scanning electron microscopy, FT-IR microspectroscopy, optical microscopy, fluorescence and Raman chemical imaging techniques. An automated particle integrated collector and detector (APICD) prototype was developed for automated collection, deposition and detection of biothreat agents in background PM. During Phase 2 of the program, ChemImage continued to refine the understanding of ambient background composition. Additionally, ChemImage enhanced the APICD to provide improved autonomy, sensitivity and specificity.
Deliverables included a Final Report detailing our findings and APICD Gen II subsystems for automated collection, deposition and detection of ambient particulate matter. Key findings from the APTA Program include: ambient biological PM taxonomy; demonstration of key subsystems needed for autonomous bioaerosol detection; system design; efficient electrostatic collection; automated bioagent recognition; Raman analysis performance validating Td<9 sec; efficient collection surface regeneration; and development of a quantitative bioaerosol detection model. The objective of the APTA program was to advance the state of our knowledge of ambient background PM composition. Operation of an automated aerosol detection system was enhanced by a more accurate assessment of background variability, especially for sensitive and specific sensing strategies like Raman detection that are background-limited in performance. Based on this improved knowledge of background, the overall threat detection performance of Raman sensors was improved.
Background: Preflight Screening, In-flight Capabilities, and Postflight Testing
NASA Technical Reports Server (NTRS)
Gibson, Charles Robert; Duncan, James
2009-01-01
Recommendations for minimal in-flight capabilities: Retinal Imaging - provide in-flight capability for the visual monitoring of ocular health (specifically, imaging of the retina and optic nerve head) with the capability of downlinking video/still images. Tonometry - provide more accurate and reliable in-flight capability for measuring intraocular pressure. Ultrasound - explore capabilities of the current on-board system for monitoring ocular health. We currently have limited in-flight capabilities on board the International Space Station for performing an internal ocular health assessment: visual acuity, direct ophthalmoscope, ultrasound, and tonometry (Tonopen).
Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment
Bell, Michelle L.; Walker, Katy; Hubbell, Bryan
2011-01-01
Background: Air pollution epidemiology plays an integral role both in identifying the hazards of air pollution and in supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by the air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702
NASA Astrophysics Data System (ADS)
Peng, Yahui; Ma, Xiao; Gao, Xinyu; Zhou, Fangxu
2015-12-01
Computer vision is an important tool for sports video processing. However, its application in badminton match analysis is very limited. In this study, we proposed straightforward but robust histogram-based background estimation and player detection methods for badminton video clips, and compared the results with the naive averaging method and the mixture of Gaussians method, respectively. The proposed method yielded better background estimation results than the naive averaging method and more accurate player detection results than the mixture of Gaussians player detection method. The preliminary results indicate that the proposed histogram-based method can estimate the background and extract the players accurately. We conclude that the proposed method can be used for badminton player tracking, and further studies are warranted for automated match analysis.
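A per-pixel histogram mode across frames is one minimal way to realise histogram-based background estimation. The bin count and the simple nested-loop implementation below are illustrative only; the abstract does not specify the paper's binning or post-processing:

```python
import numpy as np

def histogram_background(frames, bins=32):
    """Estimate a static background as the per-pixel mode of an intensity
    histogram accumulated over frames. A sketch of the idea, assuming a
    fixed camera (as in broadcast badminton footage).

    frames: uint8 array of shape (n_frames, H, W)."""
    n, h, w = frames.shape
    edges = np.linspace(0, 256, bins + 1)
    idx = np.digitize(frames, edges) - 1               # histogram bin per pixel per frame
    centers = (edges[:-1] + edges[1:]) / 2.0
    bg = np.empty((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            counts = np.bincount(idx[:, i, j], minlength=bins)
            bg[i, j] = centers[np.argmax(counts)]      # most frequent intensity bin
    return bg
```

Unlike naive averaging, the mode ignores the transient intensities a moving player leaves at each pixel, which is why the players can then be extracted by differencing against this background.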
Spectral and Temporal Laser Fluorescence Analysis Such as for Natural Aquatic Environments
NASA Technical Reports Server (NTRS)
Chekalyuk, Alexander (Inventor)
2015-01-01
An Advanced Laser Fluorometer (ALF) can combine spectrally and temporally resolved measurements of laser-stimulated emission (LSE) for characterization of dissolved and particulate matter, including fluorescence constituents, in liquids. Spectral deconvolution (SDC) analysis of LSE spectral measurements can accurately retrieve information about individual fluorescent bands, such as can be attributed to chlorophyll-a (Chl-a), phycobiliprotein (PBP) pigments, or chromophoric dissolved organic matter (CDOM), among others. Improved physiological assessments of photosynthesizing organisms can use SDC analysis and temporal LSE measurements to assess variable fluorescence corrected for SDC-retrieved background fluorescence. Fluorescence assessments of Chl-a concentration based on LSE spectral measurements can be improved using photo-physiological information from temporal measurements. Quantitative assessments of PBP pigments, CDOM, and other fluorescent constituents, as well as basic structural characterizations of photosynthesizing populations, can be performed using SDC analysis of LSE spectral measurements.
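Spectral deconvolution of this kind is, at its core, a linear unmixing problem. A generic least-squares sketch is given below; the ALF's actual band shapes and any non-negativity constraints are not stated in the abstract, so the component matrix here is an assumption:

```python
import numpy as np

def spectral_deconvolution(spectrum, components):
    """Least-squares unmixing of a measured emission spectrum into known
    component band shapes (e.g. Chl-a, PBP pigments, CDOM). A generic
    sketch of SDC-style analysis, not the ALF's exact procedure.

    spectrum:   (n_wavelengths,) measured LSE spectrum
    components: (n_wavelengths, n_components) normalised band shapes
    Returns the fitted amplitude of each component band."""
    coeffs, *_ = np.linalg.lstsq(components, spectrum, rcond=None)
    return coeffs
```

The fitted amplitudes are what allow background fluorescence to be quantified and subtracted before variable-fluorescence assessments.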
The development of formative assessment probes for optics education
NASA Astrophysics Data System (ADS)
Dokter, Erin F. C.; Pompea, Stephen M.; Sparks, Robert T.; Walker, Constance E.
2010-08-01
Research exploring students' knowledge of optics from elementary school through college has revealed that many concepts can be difficult for students to grasp. This is particularly the case for fundamental concepts, such as the nature of light, how light interacts with matter, and how light behaves in optical systems. The use of formative assessment probes (low-stakes questions posed to students before instruction or in real time in the classroom) can inform instructors about students' background knowledge, and can also be used to monitor learning as instruction progresses. By understanding what students know prior to instruction, and how well they are learning in real time, instruction can be designed and modified to encourage the development of scientifically accurate knowledge.
Low contrast detection in abdominal CT: comparing single-slice and multi-slice tasks
NASA Astrophysics Data System (ADS)
Ba, Alexandre; Racine, Damien; Viry, Anaïs; Verdun, Francis R.; Schmidt, Sabine; Bochud, François O.
2017-03-01
Image quality assessment is crucial for the optimization of computed tomography (CT) protocols. Human and mathematical model observers are increasingly used for the detection of low contrast signals in abdominal CT, but are frequently limited to the use of a single image slice. Another limitation is that most of them only consider the detection of a signal embedded in a uniform background phantom. The purpose of this paper was to test whether human observer performance differs significantly between CT images read in single- or multiple-slice modes and whether these differences are the same for anatomical and uniform clinical images. We investigated detection performance and scrolling trends of human observers for a simulated liver lesion embedded in anatomical and uniform CT backgrounds. Results show that observers do not benefit significantly from the additional information provided in multi-slice reading mode. Regarding the background, performance is moderately higher for uniform than for anatomical images. Our results suggest that for low contrast detection in abdominal CT, the use of multi-slice model observers would probably only add a marginal benefit. On the other hand, the quality of a CT image is more accurately estimated with clinical anatomical backgrounds.
CERN-derived analysis of lunar radiation backgrounds
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.; Svoboda, Robert
1993-01-01
The Moon produces radiation which background-limits scientific experiments there. Early analyses of these backgrounds have either failed to take into consideration the effect of charm in particle physics (because they pre-dated its discovery), or have used branching ratios which are no longer strictly valid (due to new accelerator data). We are presently investigating an analytical program for deriving muon and neutrino spectra generated by the Moon, converting an existing CERN computer program known as GEANT which does the same for the Earth. In so doing, this will (1) determine an accurate prompt neutrino spectrum produced by the lunar surface; (2) determine the lunar subsurface particle flux; (3) determine the consequence of charm production physics upon the lunar background radiation environment; and (4) provide an analytical tool for the NASA astrophysics community with which to begin an assessment of the Moon as a scientific laboratory versus its particle radiation environment. This will be done on a recurring basis with the latest experimental results of the particle data groups at Earth-based high-energy accelerators, in particular with the latest branching ratios for charmed meson decay. This will be accomplished for the first time as a full 3-dimensional simulation.
Plancade, Sandra; Rozenholc, Yves; Lund, Eiliv
2012-12-11
Illumina BeadArray technology includes non-specific negative control features that allow a precise estimation of the background noise. As an alternative to the background subtraction proposed in BeadStudio, which leads to an important loss of information by generating negative values, a background correction method modeling the observed intensities as the sum of an exponentially distributed signal and normally distributed noise has been developed. Nevertheless, Wang and Ye (2012) present a kernel-based estimator of the signal distribution on Illumina BeadArrays and suggest that a gamma distribution would better model the signal density. Hence, the normal-exponential modeling may not be appropriate for Illumina data, and background corrections derived from this model may lead to erroneous estimates. We propose a more flexible modeling based on a gamma-distributed signal and normally distributed background noise and develop the associated background correction, implemented in the R-package NormalGamma. Our model proves to be markedly more accurate for Illumina BeadArrays: on the one hand, it is shown on two types of Illumina BeadChips that this model offers a better fit of the observed intensities. On the other hand, the comparison of the operating characteristics of several background correction procedures on spike-in and on normal-gamma simulated data shows high similarities, reinforcing the validity of the normal-gamma modeling. The performance of the background corrections based on the normal-gamma and normal-exponential models is compared on two dilution data sets, through testing procedures which represent various experimental designs. Surprisingly, we observe that the implementation of a more accurate parametrisation in the model-based background correction does not increase the sensitivity.
These results may be explained by the operating characteristics of the estimators: the normal-gamma background correction offers an improvement in terms of bias, but at the cost of a loss in precision. This paper addresses the lack of fit of the usual normal-exponential model by proposing a more flexible parametrisation of the signal distribution as well as the associated background correction. This new model proves to be considerably more accurate for Illumina microarrays, but the improvement in terms of modeling does not lead to a higher sensitivity in differential analysis. Nevertheless, this realistic modeling paves the way for future investigations, in particular to examine the characteristics of pre-processing strategies.
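The competing models are straightforward to simulate. The sketch below draws intensities under the normal-gamma model and shows why plain background subtraction (the BeadStudio approach criticized above) produces negative values; all parameter values are illustrative, not Illumina-fitted estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal-gamma model: observed intensity = gamma signal + normal noise.
shape, scale = 2.0, 100.0      # illustrative gamma signal parameters
mu, sigma = 50.0, 10.0         # illustrative background-noise parameters
n = 50_000
signal = rng.gamma(shape, scale, n)
observed = signal + rng.normal(mu, sigma, n)

# Plain background subtraction removes the noise mean but goes negative
# whenever the noise pushes a weak signal below mu, which is the loss of
# information a model-based correction is designed to avoid.
subtracted = observed - mu
negatives = (subtracted < 0).sum()
```

A model-based correction would instead estimate E[signal | observed], which is guaranteed non-negative under either the normal-exponential or normal-gamma parametrisation.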
Rodgers, G G; Tenzing, P; Clark, T D
2016-01-01
In light of an increasing trend in fish biology towards using static respirometry techniques without the inclusion of a mixing mechanism and without accurately accounting for the influence of microbial (background) respiration, this paper quantifies the effect of these approaches on the oxygen consumption rates (ṀO2) measured from juvenile barramundi Lates calcarifer (mean ± s.e. mass = 20·31 ± 0·81 g) and adult spiny chromis damselfish Acanthochromis polyacanthus (22·03 ± 2·53 g). Background respiration changed consistently and in a sigmoidal manner over time in the treatment with a mixing device (inline recirculation pump), whereas attempts to measure background respiration in the non-mixed treatment yielded highly variable estimates of ṀO2 that were probably artefacts due to the lack of water movement over the oxygen sensor during measurement periods. This had clear consequences when accounting for background respiration in the calculations of fish ṀO2. Exclusion of a mixing device caused a significantly lower estimate of ṀO2 in both species and reduced the capacity to detect differences between individuals as well as differences within an individual over time. There was evidence to suggest that the magnitude of these effects was dependent on the spontaneous activity levels of the fish, as the difference between mixed and non-mixed treatments was more pronounced for L. calcarifer (sedentary) than for A. polyacanthus (more spontaneously active). It is clear that respirometry set-ups for sedentary species must contain a mixing device to prevent oxygen stratification inside the respirometer. While more active species may provide a higher level of water mixing during respirometry measurements and theoretically reduce the need for a mixing device, the level of mixing cannot be quantified and may change with diurnal cycles in activity.
To ensure consistency across studies without relying on fish activity levels, and to enable accurate assessments of background respiration, it is recommended that all respirometry systems should include an appropriate mixing device. © 2016 The Fisheries Society of the British Isles.
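The background-correction step itself is simple arithmetic once reliable blank measurements exist. A minimal sketch follows, interpolating the blank linearly between pre- and post-trial measurements; this is a common simplification and an assumption here, since the study itself reports sigmoidal growth of background respiration over time:

```python
def background_corrected_mo2(total_mo2, t, bg_pre, bg_post, t_start, t_end):
    """Subtract microbial (background) respiration from a whole-chamber
    oxygen consumption rate measured at time t. The blank is interpolated
    linearly between pre- and post-trial measurements (a simplification;
    a sigmoidal fit would follow the study's observations more closely).
    All rates share the same units (e.g. mg O2 per hour)."""
    frac = (t - t_start) / (t_end - t_start)
    background = bg_pre + frac * (bg_post - bg_pre)
    return total_mo2 - background
```

Whatever the interpolation model, the correction is only as good as the blank measurements, which is precisely why unmixed chambers, with their noisy blanks, bias the fish ṀO2 estimates.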
Salina, Husain; Abdullah, Asma; Mukari, Siti Zamratol Mai-sarah; Azmi, Mohd Tamil
2010-04-01
Transient-evoked otoacoustic emission (TEOAE) is a well-established screening tool for universal newborn hearing screening. The aims of this study were to measure the effects of background noise on the recording of TEOAE and the duration required to complete the test at various noise levels. This prospective study was conducted from June 2006 until May 2007. The study population comprised newborns delivered at term in the postnatal wards. Newborns who were more than 8 h old and had passed a hearing screening test using screening auditory brainstem response (SABRe) were further tested with TEOAE in four different test environments [isolation room in the ward during non-peak hours (E1), isolation room in the ward during peak hours (E2), maternal bedside in the ward during non-peak hours (E3) and maternal bedside in the ward during peak hours (E4)]. This study showed that test environment significantly influenced the time required to complete testing in both ears, with F [534.23] = 0.945; P < 0.001 on the right ear and F [636.54] = 0.954; P < 0.001 on the left. Our study revealed that TEOAE testing was efficient in defining the presence of normal hearing in our postnatal wards at the maternal bedside during non-peak hours, with a specificity of 96.8%. Our study concludes that background noise levels for acceptable and accurate TEOAE recording in newborns should not exceed 65 dB A. In addition, when using TEOAE assessment in noisy environments, the time taken to obtain accurate results will greatly increase.
2012-01-01
Background Self-reported anthropometric data are commonly used to estimate prevalence of obesity in population and community-based studies. We aim to: 1) Determine whether survey participants are able and willing to self-report height and weight; 2) Assess the accuracy of self-reported compared to measured anthropometric data in a community-based sample of young people. Methods Participants (16–29 years) of a behaviour survey, recruited at a Melbourne music festival (January 2011), were asked to self-report height and weight; researchers independently weighed and measured a sub-sample. Body Mass Index was calculated and overweight/obesity classified as ≥25 kg/m2. Differences between measured and self-reported values were assessed using paired t-test/Wilcoxon signed ranks test. Accurate report of height and weight were defined as <2 cm and <2 kg difference between self-report and measured values, respectively. Agreement between classification of overweight/obesity by self-report and measured values was assessed using McNemar’s test. Results Of 1405 survey participants, 82% of males and 72% of females self-reported their height and weight. Among 67 participants who were also independently measured, self-reported height and weight were significantly less than measured height (p=0.01) and weight (p<0.01) among females, but no differences were detected among males. Overall, 52% accurately self-reported height, 30% under-reported, and 18% over-reported; 34% accurately self-reported weight, 52% under-reported and 13% over-reported. More females (70%) than males (35%) under-reported weight (p=0.01). Prevalence of overweight/obesity was 33% based on self-report data and 39% based on measured data (p=0.16). Conclusions Self-reported measurements may underestimate weight but accurately identified overweight/obesity in the majority of this sample of young people. PMID:23170838
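The classification rule used in the study is a one-liner, and it makes the sensitivity to under-reporting concrete. A sketch with illustrative (not study) numbers:

```python
def bmi(weight_kg, height_cm):
    """Body Mass Index in kg/m^2."""
    height_m = height_cm / 100.0
    return weight_kg / (height_m ** 2)

def overweight_or_obese(weight_kg, height_cm):
    """Cut-off used in the study: BMI >= 25 kg/m^2."""
    return bmi(weight_kg, height_cm) >= 25.0

# Illustrative person: under-reporting weight by 3 kg moves them
# across the overweight cut-off, misclassifying them as not overweight.
measured = overweight_or_obese(78.0, 176.0)       # BMI just above 25
self_report = overweight_or_obese(75.0, 176.0)    # BMI just below 25
```

Only participants near the 25 kg/m² boundary flip classification this way, which is consistent with the study's finding that prevalence estimates from self-report were only modestly lower despite widespread under-reporting.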
ERIC Educational Resources Information Center
Natale, Ruby; Uhlhorn, Susan B.; Lopez-Mitnik, Gabriela; Camejo, Stephanie; Englebert, Nicole; Delamater, Alan M.; Messiah, Sarah E.
2016-01-01
Background: One in four preschool-age children in the United States are currently overweight or obese. Previous studies have shown that caregivers of this age group often have difficulty accurately recognizing their child's weight status. The purpose of this study was to examine factors associated with accurate/inaccurate perception of child body…
Development of an Evidence-Based Clinical Algorithm for Practice in Hypotonia Assessment: A Proposal
2014-01-01
Background Assessing muscle tone in children is an essential part of the neurological assessment and is often critical to an accurate diagnosis and appropriate management. While there have been advances in child neurology, there remains much contention around the subjectivity of the clinical assessment of hypotonia, which is often the first step in the diagnostic process. Objective In response to this challenge, the objective of the study is to develop and validate a prototype of a decision-making process in the form of a clinical algorithm that will guide clinicians during this assessment. Methods Design research within a pragmatic stance will be employed in this study, proceeding through multiple phases of assessment, prototyping and evaluation. These will include a systematic review, processes of reflection and action, as well as validation methods. Given the mixed-methods nature of this study, NVIVO or ATLAS-ti will be used in the analysis of qualitative data and SPSS for quantitative data. Results Initial results from the systematic review revealed a paucity of scientific literature documenting the objective assessment of hypotonia in children. The review identified the need for more studies with greater methodological rigor in order to determine best practice with respect to the methods used in the assessment of low muscle tone in the paediatric population. Conclusions It is envisaged that this proposal will contribute to a more accurate clinical diagnosis of children with low muscle tone in the absence of a gold standard. We anticipate that the use of this tool will ultimately assist clinicians in moving to evidence-based practice whilst upholding best practice in the care of children with hypotonia. PMID:25485571
Accurate object tracking system by integrating texture and depth cues
NASA Astrophysics Data System (ADS)
Chen, Ju-Chin; Lin, Yu-Hang
2016-03-01
A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important for distinguishing the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, to reduce the risk of drift, which increases with textureless depth templates, an update mechanism is proposed that selects more precise tracking results and avoids incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and more accurate tracking results than other well-known algorithms.
NASA Astrophysics Data System (ADS)
Brahmi, Djamel; Cassoux, Nathalie; Serruys, Camille; Giron, Alain; Lehoang, Phuc; Fertil, Bernard
1999-05-01
To support ophthalmologists in their daily routine and enable the quantitative assessment of the progression of Cytomegalovirus infection as observed on series of retinal angiograms, a methodology allowing an accurate comparison of retinal borders has been developed. In order to evaluate the accuracy of borders, ophthalmologists were asked to repeatedly outline boundaries between infected and noninfected areas. As a matter of fact, accuracy of drawing relies on local features such as contrast, quality of image, and background, all factors that make the boundaries more or less perceptible from one part of an image to another. In order to estimate the accuracy of retinal borders directly from image analysis, an artificial neural network (a succession of unsupervised and supervised neural networks) has been designed to correlate accuracy of drawing (as calculated from ophthalmologists' hand-outlines) with local features of the underlying image. Our method has been applied to the quantification of CMV retinitis. It is shown that the accuracy of a border is properly predicted and characterized by a confidence envelope that allows, after a registration phase based on fixed landmarks such as vessel forks, the evolution of CMV infection to be accurately assessed.
Performance of JT-60SA divertor Thomson scattering diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kajita, Shin, E-mail: kajita.shin@nagoya-u.jp; Hatae, Takaki; Tojo, Hiroshi
2015-08-15
For the satellite tokamak JT-60 Super Advanced (JT-60SA), a divertor Thomson scattering measurement system is planned to be installed. In this study, we improved the design of the collection optics relative to the previous one, in which the solid angle of the collection optics was found to be very small, mainly because of poor accessibility to the measurement region. With this improvement, the solid angle was increased by up to approximately a factor of five. To accurately assess the measurement performance, background noise was evaluated using the plasma parameters of two typical discharges in JT-60SA calculated with the SONIC code. Moreover, the influence of the reflection of bremsstrahlung radiation by the wall was simulated using ray tracing. The errors in the temperature and the density are assessed based on the simulation results for three typical fields of view.
Validation of the use of synthetic imagery for camouflage effectiveness assessment
NASA Astrophysics Data System (ADS)
Newman, Sarah; Gilmore, Marilyn A.; Moorhead, Ian R.; Filbee, David R.
2002-08-01
CAMEO-SIM was developed as a laboratory method to assess the effectiveness of aircraft camouflage schemes. It is a physically accurate synthetic image generator, rendering in any waveband between 0.4 and 14 microns. Camouflage schemes are assessed by displaying imagery to observers under controlled laboratory conditions or by analyzing the digital image and calculating the contrast statistics between the target and background. Code verification has taken place during development. However, validation of CAMEO-SIM is essential to ensure that the imagery produced is suitable to be used for camouflage effectiveness assessment. Real world characteristics are inherently variable, so exact pixel to pixel correlation is unnecessary. For camouflage effectiveness assessment it is more important to be confident that the comparative effects of different schemes are correct, but prediction of detection ranges is also desirable. Several different tests have been undertaken to validate CAMEO-SIM for the purpose of assessing camouflage effectiveness. Simple scenes have been modeled and measured. Thermal and visual properties of the synthetic and real scenes have been compared. This paper describes the validation tests and discusses the suitability of CAMEO-SIM for camouflage assessment.
Issues and progress in determining background ozone and particle concentrations
NASA Astrophysics Data System (ADS)
Pinto, J. P.
2011-12-01
Exposure to ambient ozone is associated with a variety of health outcomes ranging from mild breathing discomfort to mortality. For the purpose of health risk and policy assessments EPA evaluates the anthropogenic increase in ozone above background concentrations and has defined the North American (NA) background concentration of O3 as that which would occur in the U.S. in the absence of anthropogenic emissions of precursors in the U.S., Canada, and Mexico. Monthly average NA background ozone has been used to evaluate health risks, but EPA and state air quality managers must also estimate day specific ozone background levels for high ozone episodes as part of urban scale photochemical modeling efforts to support ozone regulatory programs. The background concentration of O3 is of more concern than other air pollutants because it typically represents a much larger fraction of observed O3 than do the backgrounds of other criteria pollutants (particulate matter (PM), CO, NO2, SO2). NA background cannot be determined directly from ambient monitoring data because of the influence of NA precursor emissions on formation of ozone within NA. Instead, estimates of NA background O3 have been based on GEOS-Chem using simulations in which NA anthropogenic precursor emissions are zeroed out. Thus, modeled NA background O3 includes contributions from natural sources of precursors (including CH4, NMVOCs, NOx, and CO) everywhere in the world, anthropogenic sources of precursors outside of NA, and downward transport of O3 from the stratosphere. Although monitoring data cannot determine NA background directly, measurements by satellites, aircraft, ozonesondes and surface monitors have proved to be highly useful for identifying sources of background O3 and for evaluating the performance of the GEOS-Chem model. 
Model-simulated NA background concentrations are strong functions of location and season, with large inter-day variability; values increase with elevation, are higher in spring than in summer, and tend to be highest in the Intermountain West during spring. Estimates of annual average NA background, and of other background definitions that have been considered, will be presented. Issues associated with modeling background concentrations for both health-risk assessments and episodic regulatory air quality programs will be discussed, and proposals for new atmospheric measurements and model improvements needed to more accurately quantify background contributions to ozone will also be presented. The views expressed are those of the author and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
2014-01-01
Background Pediatric antiretroviral therapy (ART) has been shown to substantially reduce morbidity and mortality in HIV-infected infants and children. To accurately project program costs, analysts need accurate estimations of antiretroviral drug (ARV) costs for children. However, the costing of pediatric antiretroviral therapy is complicated by weight-based dosing recommendations, which change as children grow. Methods We developed a step-by-step methodology for estimating the cost of pediatric ARV regimens for children ages 0–13 years. The costing approach incorporates weight-based dosing recommendations to provide estimated ARV doses throughout childhood development. Published unit drug costs are then used to calculate average monthly drug costs. We compared our derived monthly ARV costs to published estimates to assess the accuracy of our methodology. Results The estimates of monthly ARV costs are provided for six commonly used first-line pediatric ARV regimens, considering three possible care scenarios. The costs derived in our analysis for children were fairly comparable to or slightly higher than available published ARV drug or regimen estimates. Conclusions The methodology described here can be used to provide an accurate estimation of pediatric ARV regimen costs for cost-effectiveness analysts to project the optimum packages of care for HIV-infected children, as well as for program administrators and budget analysts who wish to assess the feasibility of increasing pediatric ART availability in constrained budget environments. PMID:24885453
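The weight-band costing arithmetic described above can be sketched as follows. The dosing table, doses per day, and unit price below are hypothetical placeholders for illustration, not the study's actual dosing or price data.

```python
DAYS_PER_MONTH = 365.25 / 12  # ~30.44 days

# Hypothetical weight bands (kg) -> tablets per dose; not an actual dosing table
DOSE_TABLE = {(3.0, 5.9): 0.5, (6.0, 9.9): 0.75, (10.0, 13.9): 1.0, (14.0, 24.9): 1.5}

def tablets_per_dose(weight_kg):
    """Look up the per-dose tablet count for a child's weight band."""
    for (lo, hi), tablets in DOSE_TABLE.items():
        if lo <= weight_kg <= hi:
            return tablets
    raise ValueError("weight outside dosing table")

def monthly_cost_usd(weight_kg, unit_cost_usd, doses_per_day=2):
    """Average monthly cost for one ARV: tablets/dose x doses/day x days x price."""
    return tablets_per_dose(weight_kg) * doses_per_day * DAYS_PER_MONTH * unit_cost_usd

# An 8 kg child on a $0.10/tablet drug, dosed twice daily
print(round(monthly_cost_usd(8.0, 0.10), 2))  # -> 4.57
```

As the child's weight crosses a band boundary, the per-dose tablet count (and hence the monthly cost) steps upward, which is what makes pediatric costing weight-dependent.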
Bastani, Meysam; Vos, Larissa; Asgarian, Nasimeh; Deschenes, Jean; Graham, Kathryn; Mackey, John; Greiner, Russell
2013-01-01
Background Selecting the appropriate treatment for breast cancer requires accurately determining the estrogen receptor (ER) status of the tumor. However, the standard for determining this status, immunohistochemical analysis of formalin-fixed paraffin embedded samples, suffers from numerous technical and reproducibility issues. Assessment of ER-status based on RNA expression can provide more objective, quantitative and reproducible test results. Methods To learn a parsimonious RNA-based classifier of hormone receptor status, we applied a machine learning tool to a training dataset of gene expression microarray data obtained from 176 frozen breast tumors, whose ER-status was determined by applying ASCO-CAP guidelines to standardized immunohistochemical testing of formalin fixed tumor. Results This produced a three-gene classifier that can predict the ER-status of a novel tumor, with a cross-validation accuracy of 93.17±2.44%. When applied to an independent validation set and to four other public databases, some on different platforms, this classifier obtained over 90% accuracy in each. In addition, we found that this prediction rule separated the patients' recurrence-free survival curves with a hazard ratio lower than the one based on the IHC analysis of ER-status. Conclusions Our efficient and parsimonious classifier lends itself to high throughput, highly accurate and low-cost RNA-based assessments of ER-status, suitable for routine high-throughput clinical use. This analytic method provides a proof-of-principle that may be applicable to developing effective RNA-based tests for other biomarkers and conditions. PMID:24312637
Picher, Maria M; Küpcü, Seta; Huang, Chun-Jen; Dostalek, Jakub; Pum, Dietmar; Sleytr, Uwe B; Ertl, Peter
2013-05-07
In the current work we have developed a lab-on-a-chip containing embedded amperometric sensors in four microreactors that can be addressed individually and that are coated with crystalline surface protein monolayers to provide continuous, stable, reliable and accurate detection of blood glucose. It is envisioned that the microfluidic device will be used in a feedback loop mechanism to assess natural variations in blood glucose levels during hemodialysis to allow the individual adjustment of glucose. Reliable and accurate detection of blood glucose is accomplished by simultaneously performing (a) blood glucose measurements, (b) autocalibration routines, (c) mediator-interference detection, and (d) background subtractions. The electrochemical detection of blood glucose variations in the absence of electrode fouling events is performed by integrating crystalline surface layer proteins (S-layers) that function as an efficient antifouling coating, a highly oriented immobilization matrix for biomolecules and an effective molecular sieve with pore sizes of 4 to 5 nm. We demonstrate that the S-layer protein SbpA (from Lysinibacillus sphaericus CCM 2177) readily forms monomolecular lattice structures at the various microchip surfaces (e.g. glass, PDMS, platinum and gold) within 60 min, eliminating unspecific adsorption events in the presence of human serum albumin, human plasma and freshly drawn blood samples. The highly isoporous SbpA coating allows undisturbed diffusion of the mediator to and from the electrode surface, thus enabling bioelectrochemical measurements of glucose concentrations from 500 μM to 50 mM (calibration slope δI/δc of 8.7 nA mM(-1)). A final proof of concept implementing the four-microreactor microfluidic design is demonstrated using freshly drawn blood. Accurate and drift-free assessment of blood glucose concentrations (6.4 mM) is accomplished over 130 min at 37 °C using the immobilized enzyme glucose oxidase by calculating the difference between autocalibration (10 mM glc) and background measurements. The novel combination of biologically derived nanostructured surfaces with microchip technology constitutes a powerful new tool for multiplexed analysis of complex samples.
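The background-subtraction arithmetic is straightforward: divide the background-corrected current by the reported calibration slope of 8.7 nA per mM. The current values in this sketch are illustrative, not measurements from the device.

```python
SLOPE_NA_PER_MM = 8.7  # reported calibration slope, delta-I/delta-c (nA per mM)

def glucose_mm(sample_current_na, background_current_na):
    """Glucose concentration from a background-subtracted amperometric current."""
    return (sample_current_na - background_current_na) / SLOPE_NA_PER_MM

# Illustrative currents: 65.7 nA sample against a 10.0 nA background
print(round(glucose_mm(65.7, 10.0), 1))  # -> 6.4 (mM)
```

Subtracting the background current in each microreactor is what removes the common-mode drift, so only the glucose-dependent part of the signal reaches the calibration line.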
Braun, Elizabeth I; Huang, An; Tusa, Carolyn A; Yukica, Michael A; Pantano, Paul
2016-12-01
Carbon nanotubes (CNTs) are cylindrical molecules of carbon with diverse commercial applications. CNTs are also lightweight, easily airborne, and have been shown to be released during various phases of production and use. Therefore, as global CNT production increases, so do concerns that CNTs could pose a safety threat to those who are exposed to them. This makes it imperative to fully understand CNT release scenarios to make accurate risk assessments and to implement effective control measures. However, the current suite of direct-reading and off-line instrumentation used to monitor the release of CNTs in workplaces lacks high chemical specificity, which complicates risk assessments when the sampling and/or measurements are performed at a single site where multiple CNT types are handled in the presence of naturally occurring background particles, or dust. Herein, we demonstrate the utility of Raman spectroscopy to unequivocally identify whether particulate matter collected from a multi-user analytical balance workstation comprised CNTs, as well as whether the contamination included CNTs that were synthesized by a Ni/Y-catalyzed electric-arc method or a Co/Mo-catalyzed chemical vapor deposition method. Identifying the exact CNT type enabled a more accurate risk assessment, because the metallic impurities involved were known, and it also led to the identification of the users who handled these CNTs, a review of their handling techniques, and an improved protocol for safely weighing CNTs.
ERIC Educational Resources Information Center
Zwicker, Barrie, Ed.
Written for editors, reporters, and researchers, this publication contains background information on war and peace. Included are newspaper articles, essays, and excerpts from radio commentaries. The information is intended to help journalists provide more accurate coverage of war-and-peace issues, in particular more accurate coverage of the Soviet…
Energy Measurement Studies for CO2 Measurement with a Coherent Doppler Lidar System
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Koch, Grady J.; Vanvalkenburg, Randal L.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.
2010-01-01
The accurate measurement of energy is critical in the application of lidar systems for CO2 measurement. Different techniques of energy estimation in the online and offline pulses are investigated for post-processing of lidar returns. The cornerstone of these techniques is the accurate estimation of the spectrum of the lidar signal and the background noise. Since the background noise is not ideal white Gaussian noise, a simple average estimate of the noise level is not well suited to the energy estimation of lidar signal and noise. A brief review of the methods is presented in this paper.
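One way to see why a single averaged noise level misfits non-white background noise is to compare a flat estimate against a frequency-resolved (averaged-periodogram) estimate. This is a generic sketch, not the authors' processing chain, and all signal parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_pulses = 1024, 200

def averaged_periodogram(pulses):
    """Bartlett-style spectral estimate: |FFT|^2 averaged across pulses."""
    return (np.abs(np.fft.rfft(pulses, axis=1)) ** 2 / pulses.shape[1]).mean(axis=0)

# Colored (non-white) background noise: power rising linearly with frequency
white = rng.normal(size=(n_pulses, n_samples))
coloring = np.linspace(1.0, 3.0, n_samples // 2 + 1)
noise_psd = averaged_periodogram(white) * coloring

flat_level = noise_psd.mean()          # single average level (the naive approach)
high_freq_floor = noise_psd[-10:].mean()

# A flat level under-estimates the true floor where the colored noise is strong
print(high_freq_floor > flat_level)  # -> True
```

Estimating the noise floor per frequency bin, rather than as one number, is what lets the signal energy in the online/offline pulses be integrated over only the bins where it actually exceeds the local noise.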
Huo, Xueliang; Johnson-Long, Ashley N.; Ghovanloo, Maysam; Shinohara, Minoru
2015-01-01
The purpose of this study was to compare the motor performance of the tongue, using the Tongue Drive System, to hand operation for relatively complex tasks under different levels of background physical exertion. Thirteen young able-bodied adults performed tasks that tested the accuracy and variability in tracking a sinusoidal waveform, and the performance in playing two video games that require accurate and rapid movements with cognitive processing, using tongue and hand under two levels of background physical exertion. Results show that additional background physical activity did not influence rapid and accurate displacement motor performance, but compromised the slow waveform tracking and shooting performances for both hand and tongue. Slow waveform tracking performance by the tongue was compromised by an additional motor or cognitive task, but by an additional motor task only for the hand. Practitioner Summary: We investigated the influence of task complexity and background physical exertion on the motor performance of tongue and hand. Results indicate that task performance degrades with an additional concurrent task or physical exertion due to the limited attentional resources available for handling both the motor task and the background exertion. PMID:24003900
Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering
2012-01-01
Background Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, accurate and fast online sorting remains a challenging problem. Results Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or no human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean clustering, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of the resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from the premotor cortex of Macaque monkeys. The results of these tests showed excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with that of other robust spike sorting algorithms. Conclusions This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces. PMID:22871125
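The two core steps named above, SVD-based pre-processing and fuzzy C-mean clustering, can be sketched in plain NumPy. This is not the FSPS implementation; the two synthetic spike shapes and all parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def fuzzy_c_means(X, c, m=2.0, iters=100):
    """Plain-NumPy fuzzy C-means: returns cluster centers and memberships."""
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]          # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))                    # inverse-distance
        u /= u.sum(axis=1, keepdims=True)                     # row-normalize
    return centers, u

# Synthetic "spikes": two waveform shapes plus additive background noise
t = np.linspace(0.0, 1.0, 32)
shapes = [np.sin(2 * np.pi * t), -1.5 * np.sin(2 * np.pi * t)]
waves = np.array([shapes[i % 2] + 0.2 * rng.normal(size=32) for i in range(300)])

# SVD pre-processing: project waveforms onto the top-2 principal directions
centered = waves - waves.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
features = centered @ Vt[:2].T

centers, memb = fuzzy_c_means(features, c=2)
labels = memb.argmax(axis=1)
```

The soft memberships (rather than hard assignments) are what let overlapping, low-amplitude spikes be graded by how confidently they belong to each cluster.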
Accurate spectroscopic redshift of the multiply lensed quasar PSOJ0147 from the Pan-STARRS survey
NASA Astrophysics Data System (ADS)
Lee, C.-H.
2017-09-01
Context. The gravitational lensing time delay method provides a one-step determination of the Hubble constant (H0) with an uncertainty level on par with the cosmic distance ladder method. However, to further investigate the nature of dark energy, an H0 estimate at the 1% level is greatly needed. This requires dozens of strongly lensed quasars that are yet to be delivered by ongoing and forthcoming all-sky surveys. Aims: In this work we aim to determine the spectroscopic redshift of PSOJ0147, the first strongly lensed quasar candidate found in the Pan-STARRS survey. The main goal of our work is to derive an accurate redshift estimate of the background quasar for cosmography. Methods: To obtain timely spectroscopic follow-up, we took advantage of the fast-track service programme that is carried out by the Nordic Optical Telescope. Using a grism covering 3200-9600 Å, we identified prominent emission line features, such as Lyα, N V, O I, C II, Si IV, C IV, and [C III] in the spectra of the background quasar of the PSOJ0147 lens system. This enables us to accurately determine the redshift of the background quasar. Results: The spectrum of the background quasar exhibits prominent absorption features bluewards of the strong emission lines, such as Lyα, N V, and C IV. These blue absorption lines indicate that the background source is a broad absorption line (BAL) quasar. Unfortunately, the BAL features hamper an accurate determination of the redshift using the above-mentioned strong emission lines. Nevertheless, we are able to determine a redshift of 2.341 ± 0.001 from three of the four lensed quasar images with the clean forbidden line [C III]. In addition, we also derive a maximum outflow velocity of 9800 km s-1 from the broad absorption features bluewards of the C IV emission line. This value of maximum outflow velocity is in good agreement with other BAL quasars.
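The underlying arithmetic is standard: z = λ_obs/λ_rest − 1 for the emission-line redshift, and v ≈ c·Δλ/λ to first order for the outflow speed from a blueshifted absorption trough. The observed wavelength below is back-computed from the published redshift for illustration, not read from the spectra.

```python
C_KM_S = 299_792.458      # speed of light (km/s)
CIII_REST_A = 1908.73     # C III] semi-forbidden line rest wavelength (Angstrom)

def redshift(lam_obs_a, lam_rest_a):
    """Spectroscopic redshift from an identified emission line."""
    return lam_obs_a / lam_rest_a - 1.0

def outflow_kms(lam_trough_a, lam_emit_a):
    """First-order outflow speed from a blueshifted absorption trough."""
    return C_KM_S * (lam_emit_a - lam_trough_a) / lam_emit_a

# C III] at z = 2.341 is observed near 1908.73 * (1 + 2.341) ~ 6377 Angstrom,
# comfortably inside the 3200-9600 Angstrom grism coverage
print(round(redshift(6377.1, CIII_REST_A), 3))  # -> 2.341
```

A trough blueshifted by about 3.3% of the line wavelength corresponds to roughly the 9800 km/s maximum outflow reported above.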
The Active Side of Stereopsis: Fixation Strategy and Adaptation to Natural Environments.
Gibaldi, Agostino; Canessa, Andrea; Sabatini, Silvio P
2017-03-20
Depth perception in near viewing strongly relies on the interpretation of binocular retinal disparity to obtain stereopsis. Statistical regularities of retinal disparities have been claimed to strongly influence the neural mechanisms that underlie binocular vision, both to facilitate perceptual decisions and to reduce computational load. In this paper, we designed a novel and unconventional approach in order to assess the role of fixation strategy in conditioning the statistics of retinal disparity. We integrated accurate realistic three-dimensional models of natural scenes with binocular eye movement recording, to obtain accurate ground-truth statistics of retinal disparity experienced by a subject in near viewing. Our results show how the organization of the human binocular visual system is finely adapted to the disparity statistics characterizing actual fixations, thus revealing a novel role of the active fixation strategy in binocular visual function. This suggests an ecological explanation for the intrinsic preference of stereopsis for a close central object surrounded by a far background, as an early binocular aspect of the figure-ground segregation process.
Blane, Alison; Falkmer, Torbjörn; Lee, Hoe C; Dukic Willstrand, Tania
2018-01-01
Background Safe driving is a complex activity that requires calibration. This means the driver can accurately assess the level of task demand required for task completion and can accurately evaluate their driving capability. There is much debate on the calibration ability of post-stroke drivers. Objectives The aim of this study was to assess the cognition, self-rated performance, and estimation of task demand in a driving simulator with post-stroke drivers and controls. Methods A between-groups study design was employed, which included a post-stroke driver group and a group of similarly aged older control drivers. Both groups were observed driving in two simulator-based driving scenarios and asked to complete the NASA Task Load Index (TLX) to assess their perceived task demand and self-rate their driving performance. Participants also completed a battery of psychometric tasks to assess attention and executive function, which was used to determine whether post-stroke cognitive impairment impacted on calibration. Results There was no difference in the amount of perceived task demand required to complete the driving task. Despite impairments in cognition, the post-stroke drivers were not more likely to over-estimate their driving abilities than controls. On average, the post-stroke drivers self-rated themselves more poorly than the controls and this rating was related to cognitive ability. Conclusion This study suggests that post-stroke drivers may be aware of their deficits and adjust their driving behavior. Furthermore, using self-performance measures alongside a driving simulator and cognitive assessments may provide complementary fitness-to-drive assessments, as well as rehabilitation tools during post-stroke recovery.
Nonword Repetition Errors of Children with and without Specific Language Impairments (SLI)
ERIC Educational Resources Information Center
Burke, Heidi L.; Coady, Jeffry A.
2015-01-01
Background: Two ubiquitous findings from the literature are that (1) children with specific language impairments (SLI) repeat nonwords less accurately than peers with typical language development (TLD), and (2) all children repeat nonwords with frequent phonotactic patterns more accurately than low-probability nonwords. Many studies have examined…
Veira, Andreas; Jackson, Peter L; Ainslie, Bruce; Fudge, Dennis
2013-07-01
This study investigates the development and application of a simple method to calculate annual and seasonal PM2.5 and PM10 background concentrations in small cities and rural areas. The Low Pollution Sectors and Conditions (LPSC) method is based on existing measured long-term data sets and is designed for locations where particulate matter (PM) monitors are only influenced by local anthropogenic emission sources from particular wind sectors. The LPSC method combines the analysis of measured hourly meteorological data, PM concentrations, and geographical emission source distributions. PM background levels emerge from measured data for specific wind conditions, where air parcel trajectories measured at a monitoring station are assumed to have passed over geographic sectors with negligible local emissions. Seasonal and annual background levels were estimated for two monitoring stations in Prince George, Canada, and the method was also applied to four other small cities (Burns Lake, Houston, Quesnel, Smithers) in northern British Columbia. The analysis showed reasonable background concentrations for both monitoring stations in Prince George, whereas annual PM10 background concentrations at two of the other locations and PM2.5 background concentrations at one other location were implausibly high. For those locations where the LPSC method was successful, annual background levels ranged between 1.8 ± 0.1 μg/m3 and 2.5 ± 0.1 μg/m3 for PM2.5 and between 6.3 ± 0.3 μg/m3 and 8.5 ± 0.3 μg/m3 for PM10. Precipitation effects and patterns of seasonal variability in the estimated background concentrations were detectable for all locations where the method was successful. Overall the method was dependent on the configuration of local geography and sources with respect to the monitoring location, and may fail at some locations and under some conditions. 
Where applicable, the LPSC method can provide a fast and cost-efficient way to estimate background PM concentrations for small cities in sparsely populated regions like northern British Columbia. In rural areas like northern British Columbia, particulate matter (PM) monitoring stations are usually located close to emission sources and residential areas in order to assess the PM impact on human health. Thus there is a lack of accurate PM background concentration data that represent PM ambient concentrations in the absence of local emissions. The background calculation method developed in this study uses observed meteorological data as well as local source emission locations and provides annual, seasonal and precipitation-related PM background concentrations that are comparable to literature values for four out of six monitoring stations.
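The LPSC selection step reduces to masking out the hours whose wind direction lies in locally polluted sectors and averaging the remainder. The sector bounds and hourly data below are synthetic stand-ins, not the study's observations.

```python
import numpy as np

rng = np.random.default_rng(3)
n_hours = 8760  # one year of hourly records

wind_dir_deg = rng.uniform(0.0, 360.0, n_hours)
pm25 = 2.0 + rng.gamma(2.0, 0.25, n_hours)        # regional background, ~2.5 ug/m3

# Hours when flow crosses the (hypothetical) local-source sector pick up extra PM
polluted_sector = (wind_dir_deg > 90.0) & (wind_dir_deg < 210.0)
pm25[polluted_sector] += rng.gamma(2.0, 2.0, polluted_sector.sum())

# LPSC estimate: average only the hours arriving from low-pollution sectors
background = pm25[~polluted_sector].mean()
print(round(background, 1))  # -> 2.5
```

A seasonal version would simply apply the same mask within each season's subset of hours, which is why the method needs long-term records to keep enough clean-sector hours per season.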
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Subok; Jennings, Robert; Liu Haimo
Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, in both academia and industry. However, there is still much room for understanding how best to optimize and evaluate the devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from the 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of inclusion of depth and background-variability information into the assessment and optimization of the 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method in comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system performance estimates and comparisons. For the assessment of 3D systems, to accurately determine trade-offs between image quality and radiation dose, it is critical to incorporate randomness arising from the imaging chain, including background variability, into system performance calculations.
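The Hotelling-observer figure of merit behind this kind of comparison is detectability SNR² = Δs·K⁻¹·Δs, where Δs is the mean signal difference and K the background covariance. The toy covariances below stand in for the uniform (water) and variable (sphere-ensemble) backgrounds; none of the numbers come from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
npix = 64

# Toy background models: uniform -> white noise; variable -> correlated noise
A = rng.normal(size=(npix, npix))
K_uniform = np.eye(npix)
K_variable = 0.2 * np.eye(npix) + (A @ A.T) / npix

signal = np.zeros(npix)
signal[28:36] = 1.0  # small signal profile spanning 8 pixels

def hotelling_snr2(ds, K):
    """Hotelling observer detectability: SNR^2 = ds^T K^-1 ds."""
    return float(ds @ np.linalg.solve(K, ds))

print(hotelling_snr2(signal, K_uniform))        # ||ds||^2 = 8 for white noise
print(hotelling_snr2(signal, K_variable) > 0.0) # -> True
```

Because K differs between the uniform and variable backgrounds, the same signal yields different detectability, which is the mechanism behind the diverging trends reported above.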
Sensitivity and accuracy of hybrid fluorescence-mediated tomography in deep tissue regions.
Rosenhain, Stefanie; Al Rawashdeh, Wa'el; Kiessling, Fabian; Gremse, Felix
2017-09-01
Fluorescence-mediated tomography (FMT) enables noninvasive assessment of the three-dimensional distribution of near-infrared fluorescence in mice. The combination with micro-computed tomography (µCT) provides anatomical data, enabling improved fluorescence reconstruction and image analysis. The aim of our study was to assess sensitivity and accuracy of µCT-FMT under realistic in vivo conditions in deeply-seated regions. Accordingly, we acquired fluorescence reflectance images (FRI) and µCT-FMT scans of mice which were prepared with rectal insertions with different amounts of fluorescent dye. Default and high-sensitivity scans were acquired and background signal was analyzed for three FMT channels (670 nm, 745 nm, and 790 nm). Analysis was performed for the original and an improved FMT reconstruction using the µCT data. While FRI and the original FMT reconstruction could detect 100 pmol, the improved FMT reconstruction could detect 10 pmol and significantly improved signal localization. By using a finer sampling grid and increasing the exposure time, the sensitivity could be further improved to detect 0.5 pmol. Background signal was highest in the 670 nm channel and most prominent in the gastro-intestinal tract and in organs with high relative amounts of blood. In conclusion, we show that µCT-FMT allows sensitive and accurate assessment of fluorescence in deep tissue regions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Effect of education and clinical assessment on the accuracy of post partum blood loss estimation
2014-01-01
Background This research aimed to assess the effect of health care provider education on the accuracy of post partum blood loss estimation. Methods A non-randomized observational study was conducted at King Abdulaziz Medical City, Riyadh, Saudi Arabia between January 1, 2011 and June 30, 2011. One hundred and twenty-three health care providers who are involved in the estimation of post partum blood loss were eligible to participate. The participants were subjected to three research phases and an educational intervention. They assessed a total of 30 different simulated blood loss stations, with 10 stations in each of the research phases. These phases took place before and after educational sessions on how to visually estimate blood loss and how to best utilize patient data in clinical scenarios. We assessed the differences between the estimated blood loss and the actual measure. P-values were calculated to assess the differences between the estimations in the three research phases. Results The participants significantly under-estimated post partum blood loss. The accuracy was improved after training (p-value < 0.0001) and after analysing each patient’s clinical information (p-value = 0.042). The overall results were not affected by the participants’ clinical backgrounds or their years of experience. Under-estimation was more prominent in cases where above-average to excessive blood losses were simulated, while over-estimations or accurate estimations were more prominent in below-average blood loss incidents. Conclusion Simple education programmes can improve traditional findings related to under-estimation of blood loss. More sophisticated clinical education programmes may provide additional improvements. PMID:24646156
Children's Perception of Conversational and Clear American-English Vowels in Noise
ERIC Educational Resources Information Center
Leone, Dorothy; Levy, Erika S.
2015-01-01
Purpose: Much of a child's day is spent listening to speech in the presence of background noise. Although accurate vowel perception is important for listeners' accurate speech perception and comprehension, little is known about children's vowel perception in noise. "Clear speech" is a speech style frequently used by talkers in the…
Rouzé, Héloïse; Lecellier, Gaël J; Saulnier, Denis; Planes, Serge; Gueguen, Yannick; Wirshing, Herman H; Berteaux-Lecellier, Véronique
2017-01-01
The adaptive bleaching hypothesis (ABH) states that, depending on the symbiotic flexibility of coral hosts (i.e., the ability of corals to "switch" or "shuffle" their algal symbionts), coral bleaching can lead to a change in the composition of their associated Symbiodinium community and, thus, contribute to the coral's overall survival. In order to determine the flexibility of corals, molecular tools are required to provide accurate species delineations and to detect low levels of coral-associated Symbiodinium. Here, we used highly sensitive quantitative (real-time) PCR (qPCR) technology to analyse five common coral species from Moorea (French Polynesia), previously screened using only traditional molecular methods, to assess the presence of low-abundance (background) Symbiodinium spp. Similar to other studies, each coral species exhibited a strong specificity to a particular clade, irrespective of the environment. In addition, however, each of the five species harboured at least one additional Symbiodinium clade, among clades A-D, at background levels. Unexpectedly, and for the first time in French Polynesia, clade B was detected as a coral symbiont. These results increase the number of known coral-Symbiodinium associations from corals found in French Polynesia, and likely indicate an underestimation of the ability of the corals in this region to associate with and/or "shuffle" different Symbiodinium clades. Altogether, our data suggest that corals from French Polynesia may favor a trade-off between optimizing symbioses with specific Symbiodinium clades and maintaining associations with particular background clades that may play a role in the ability of corals to respond to environmental change.
Staffing Preschools: Background Information.
ERIC Educational Resources Information Center
Katz, Lilian G.; Weir, Mary K.
This report explores background variables related to preschool teaching and emphasizes that statistics in early childhood education fluctuate. Preprimary enrollment of 3- and 4-year-olds increased 26 percent from 1966 to 1967. Accurate figures on preschool teaching personnel are not available, but a large proportion of Head Start…
Indocyanine green-based fluorescent angiography in breast reconstruction
Chae, Michael P.; Rozen, Warren Matthew
2016-01-01
Background Fluorescent angiography (FA) has been useful for assessing blood flow and tissue perfusion in ophthalmology and other surgical disciplines for decades. In plastic surgery, indocyanine green (ICG) dye-based FA is a relatively novel imaging technology with high potential in various applications. We review the various FA detector systems currently available and critically appraise their utility in breast reconstruction. Methods A review of the published English literature dating from 1950 to 2015 using databases such as PubMed, Medline, Web of Science, and EMBASE was undertaken. Results In comparison to the older fluorescein dye, ICG has a superior side effect profile and can be accurately detected by various commercial devices, such as SPY Elite (Novadaq, Canada), FLARE (Curadel LLC, USA), PDE-Neo (Hamamatsu Photonics, Japan), Fluobeam 800 (Fluoptics, France), and IC-View (Pulsion Medical Systems AG, Germany). In breast reconstruction, ICG has been established as a safer, more accurate tracer agent, in lieu of the traditional blue dyes, for detection of sentinel lymph nodes with radioactive isotopes (99m-Technetium). In prosthesis-based breast reconstruction, intraoperative assessment of the mastectomy skin flap to guide excision of hypoperfused areas translates to improved clinical outcomes. Similarly, in autologous breast reconstructions, FA can be utilized to detect poorly perfused areas of the free flap, evaluate microvascular anastomosis for patency, and assess SIEA vascular territory for use as an alternative free flap with minimal donor site morbidity. Conclusions ICG-based FA is a novel, useful tool for various applications in breast reconstruction, although studies with a higher level of evidence are still needed to validate this technology. PMID:27047782
Use of technology in children’s dietary assessment
Boushey, CJ; Kerr, DA; Wright, J; Lutes, KD; Ebert, DS; Delp, EJ
2010-01-01
Background Information on dietary intake provides some of the most valuable insights for mounting intervention programmes for the prevention of chronic diseases. With the growing concern about adolescent overweight, the need to accurately measure diet becomes imperative. Assessment among adolescents is problematic as this group has irregular eating patterns and less enthusiasm for recording food intake. Subjects/Methods We used qualitative and quantitative techniques among adolescents to assess their preferences for dietary assessment methods. Results Dietary assessment methods using technology, for example, a personal digital assistant (PDA) or a disposable camera, were preferred over the pen and paper food record. Conclusions There was a strong preference for using methods that incorporate technology such as capturing images of food. This suggests that for adolescents, dietary methods that incorporate technology may improve cooperation and accuracy. Current computing technology includes higher resolution images, improved memory capacity and faster processors that allow small mobile devices to process information not previously possible. Our goal is to develop, implement and evaluate a mobile device (for example, PDA, mobile phone) food record that will translate to an accurate account of daily food and nutrient intake among adolescents. This mobile computing device will include digital images, a nutrient database and image analysis for identification and quantification of food consumption. Mobile computing devices provide a unique vehicle for collecting dietary information that reduces the burden on record keepers. Images of food can be marked with a variety of input methods that link the item for image processing and analysis to estimate the amount of food. Images before and after the foods are eaten can estimate the amount of food consumed. The initial stages and potential of this project will be described. PMID:19190645
Object Detection in Natural Backgrounds Predicted by Discrimination Performance and Models
NASA Technical Reports Server (NTRS)
Ahumada, A. J., Jr.; Watson, A. B.; Rohaly, A. M.; Null, Cynthia H. (Technical Monitor)
1995-01-01
In object detection, an observer looks for an object class member in a set of backgrounds. In discrimination, an observer tries to distinguish two images. Discrimination models predict the probability that an observer detects a difference between two images. We compare object detection and image discrimination with the same stimuli by: (1) making stimulus pairs of the same background with and without the target object and (2) either giving many consecutive trials with the same background (discrimination) or intermixing the stimuli (object detection). Six images of a vehicle in a natural setting were altered to remove the vehicle and mixed with the original image in various proportions. Detection observers rated the images for vehicle presence. Discrimination observers rated the images for any difference from the background image. Estimated detectabilities of the vehicles were found by maximizing the likelihood of a Thurstone category scaling model. The pattern of estimated detectabilities is similar for discrimination and object detection, and is accurately predicted by a Cortex Transform discrimination model. Predictions of a Contrast-Sensitivity-Function filter model and a Root-Mean-Square difference metric based on the digital image values are less accurate. The discrimination detectabilities averaged about twice those of object detection.
Werner-Wasik, Maria; Nelson, Arden D; Choi, Walter; Arai, Yoshio; Faulhaber, Peter F; Kang, Patrick; Almeida, Fabio D; Xiao, Ying; Ohri, Nitin; Brockway, Kristin D; Piper, Jonathan W; Nelson, Aaron S
2012-03-01
To evaluate the accuracy and consistency of a gradient-based positron emission tomography (PET) segmentation method, GRADIENT, compared with manual (MANUAL) and constant threshold (THRESHOLD) methods. Contouring accuracy was evaluated with sphere phantoms and clinically realistic Monte Carlo PET phantoms of the thorax. The sphere phantoms were 10-37 mm in diameter and were acquired at five institutions emulating clinical conditions. One institution also acquired a sphere phantom with multiple source-to-background ratios of 2:1, 5:1, 10:1, 20:1, and 70:1. One observer segmented (contoured) each sphere with GRADIENT and THRESHOLD from 25% to 50% at 5% increments. Subsequently, seven physicians segmented 31 lesions (7-264 mL) from 25 digital thorax phantoms using GRADIENT, THRESHOLD, and MANUAL. For spheres <20 mm in diameter, GRADIENT was the most accurate with a mean absolute % error in diameter of 8.15% (10.2% SD) compared with 49.2% (51.1% SD) for 45% THRESHOLD (p < 0.005). For larger spheres, the methods were statistically equivalent. For varying source-to-background ratios, GRADIENT was the most accurate for spheres >20 mm (p < 0.065) and <20 mm (p < 0.015). For digital thorax phantoms, GRADIENT was the most accurate (p < 0.01), with a mean absolute % error in volume of 10.99% (11.9% SD), followed by 25% THRESHOLD at 17.5% (29.4% SD), and MANUAL at 19.5% (17.2% SD). GRADIENT had the least systematic bias, with a mean % error in volume of -0.05% (16.2% SD) compared with 25% THRESHOLD at -2.1% (34.2% SD) and MANUAL at -16.3% (20.2% SD; p value <0.01). Interobserver variability was reduced using GRADIENT compared with both 25% THRESHOLD and MANUAL (p value <0.01, Levene's test). GRADIENT was the most accurate and consistent technique for target volume contouring. GRADIENT was also the most robust for varying imaging conditions. GRADIENT has the potential to play an important role for tumor delineation in radiation therapy planning and response assessment. 
Copyright © 2012. Published by Elsevier Inc.
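The constant-threshold baseline the study compares against is simple to state in code. Below is a minimal sketch, assuming a segmentation level set at a fixed fraction of peak uptake above background; the 40% fraction, background correction, and phantom geometry are illustrative assumptions, not the study's exact protocol:

```python
import numpy as np

def threshold_segment(pet, fraction=0.4, background=0.0):
    """Constant-threshold segmentation: keep voxels whose uptake exceeds a
    fixed fraction of the lesion peak, corrected for background uptake."""
    level = background + fraction * (pet.max() - background)
    return pet >= level

def sphere_diameter_mm(mask, voxel_mm=1.0):
    """Diameter of the sphere with the same volume as the segmented region."""
    volume = mask.sum() * voxel_mm ** 3
    return 2.0 * (3.0 * volume / (4.0 * np.pi)) ** (1.0 / 3.0)
```

On an ideal noiseless sphere phantom this recovers the diameter well; as the paper shows, partial-volume blurring in real PET images is what makes small spheres so much harder for fixed thresholds.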
Infrared images target detection based on background modeling in the discrete cosine domain
NASA Astrophysics Data System (ADS)
Ye, Han; Pei, Jihong
2018-02-01
Background modeling is a critical technology for detecting moving targets in video surveillance. Most background modeling techniques are aimed at land monitoring and operate in the spatial domain. Establishing a background model becomes difficult when the scene is a complex, fluctuating sea surface. In this paper, the stability of the background and its separability from targets are analyzed in depth in the discrete cosine transform (DCT) domain; on this basis, we propose a background modeling method. The proposed method models each frequency point as a single Gaussian to represent the background, and the target is extracted by suppressing the background coefficients. Experimental results show that our approach can establish an accurate background model for seawater, and the detection results outperform other background modeling methods in the spatial domain.
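The per-frequency single-Gaussian idea can be sketched compactly. The following is an illustrative reconstruction, not the authors' implementation; the orthonormal DCT matrix, the batch fitting, and the z-score threshold `k` are all assumptions made for the example:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis as a matrix, so dct2(X) = D @ X @ D.T."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    d = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    d[0, :] /= np.sqrt(2.0)
    return d

def fit_background(frames):
    """Single Gaussian per DCT frequency point: mean/variance over frames."""
    d = dct_matrix(frames[0].shape[0])
    coeffs = np.stack([d @ f @ d.T for f in frames])
    return coeffs.mean(axis=0), coeffs.var(axis=0)

def extract_target(frame, mean, var, k=3.0):
    """Zero coefficients consistent with the background model, then invert
    the DCT so only the anomalous (target) energy remains."""
    d = dct_matrix(frame.shape[0])
    c = d @ frame @ d.T
    z = np.abs(c - mean) / np.sqrt(var + 1e-6)
    residual = np.where(z > k, c - mean, 0.0)
    return d.T @ residual @ d
```

Because the background's energy is concentrated in stable frequency points, suppressing coefficients that fall within the fitted Gaussian leaves mostly target energy in the reconstruction.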
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
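The pixel-ratio scheme described above can be illustrated in a few lines of Python. The specific ratio thresholds and the calibration area below are hypothetical placeholders, not Easy Leaf Area's actual values:

```python
import numpy as np

def leaf_area_cm2(rgb, green_ratio=1.1, red_ratio=1.2, cal_area_cm2=4.0):
    """Classify pixels by channel ratios, then scale the leaf pixel count
    by a red calibration patch of known physical area."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    leaf = (g > green_ratio * r) & (g > green_ratio * b)   # green-dominant
    cal = (r > red_ratio * g) & (r > red_ratio * b)        # red-dominant
    n_cal = cal.sum()
    if n_cal == 0:
        raise ValueError("no calibration pixels found in image")
    return leaf.sum() * cal_area_cm2 / n_cal
```

Scaling by the red calibration region removes any dependence on camera distance, which is the trick that avoids ruler or scanner-bed measurements.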
Chen, Chen Hsiu; Kuo, Su Ching; Tang, Siew Tzuh
2017-05-01
No systematic meta-analysis is available on the prevalence of cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. To examine the prevalence of advanced/terminal cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. Systematic review and meta-analysis. MEDLINE, Embase, The Cochrane Library, CINAHL, and PsycINFO were systematically searched on accurate prognostic awareness in adult patients with advanced/terminal cancer (1990-2014). Pooled prevalences were calculated for accurate prognostic awareness by a random-effects model. Differences in weighted estimates of accurate prognostic awareness were compared by meta-regression. In total, 34 articles were retrieved for systematic review and meta-analysis. At best, only about half of advanced/terminal cancer patients accurately understood their prognosis (49.1%; 95% confidence interval: 42.7%-55.5%; range: 5.4%-85.7%). Accurate prognostic awareness was independent of service received and publication year, but highest in Australia, followed by East Asia, North America, and southern Europe and the United Kingdom (67.7%, 60.7%, 52.8%, and 36.0%, respectively; p = 0.019). Accurate prognostic awareness was higher by clinician assessment than by patient report (63.2% vs 44.5%, p < 0.001). Less than half of advanced/terminal cancer patients accurately understood their prognosis, with significant variations by region and assessment method. Healthcare professionals should thoroughly assess advanced/terminal cancer patients' preferences for prognostic information and engage them in prognostic discussion early in the cancer trajectory, thus facilitating their accurate prognostic awareness and the quality of end-of-life care decision-making.
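The pooled prevalence reported here comes from a random-effects model. A minimal DerSimonian-Laird sketch on raw proportions is below; this is illustrative only, since published meta-analyses typically transform proportions (e.g. logit or double-arcsine) before pooling:

```python
import math

def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions.
    Returns the pooled estimate and a 95% confidence interval."""
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]  # within-study variance
    w = [1.0 / vi for vi in v]
    fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)  # between-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]
    est = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return est, (est - 1.96 * se, est + 1.96 * se)
```

When between-study heterogeneity is large, as in the wide 5.4%-85.7% range above, tau-squared dominates the weights and the interval widens accordingly.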
Pain management in burn injury.
Montgomery, Robert K
2004-03-01
Traumatic burn injuries and the associated treatments are a tremendous pain management challenge. The degree of tissue damage in severe burns can initiate physiologic changes in nociceptive pathways that place the patient at risk for undertreatment. The use of analgesic guidelines that address both background and procedural pain and associated anxiety can provide a rational and consistent approach to treatment. The key to successful treatment is the continuous and accurate assessment of the patient's pain and the response to therapy. Medications, especially opioids, should be regularly evaluated and adjusted to achieve maximum effect and minimal side effects. Nursing's role is perhaps the most important in the essential focused surveillance of burn pain and its successful treatment.
How Reliable is the Acetabular Cup Position Assessment from Routine Radiographs?
Carvajal Alba, Jaime A.; Vincent, Heather K.; Sodhi, Jagdeep S.; Latta, Loren L.; Parvataneni, Hari K.
2017-01-01
Abstract Background: Cup position is crucial for optimal outcomes in total hip arthroplasty. Radiographic assessment of component position is routinely performed in the early postoperative period. Aims: The aims of this study were to determine in a controlled environment whether routine radiographic methods accurately and reliably assess the acetabular cup position, and to assess whether there is a statistical difference related to the rater's level of training. Methods: A pelvic model was mounted in a spatial frame. An acetabular cup was fixed in different degrees of version and inclination. Standardized radiographs were obtained. Ten observers, including five fellowship-trained orthopaedic surgeons and five orthopaedic residents, performed a blind assessment of cup position. Inclination was assessed from anteroposterior radiographs of the pelvis and version from cross-table lateral radiographs of the hip. Results: The radiographic methods used proved to be imprecise, especially when the cup was positioned at the extremes of version and inclination. Excellent inter-observer reliability (intra-class coefficient > 0.9) was evidenced. There were no differences related to the level of training of the raters. Conclusions: These widely used radiographic methods should be interpreted cautiously, and computed tomography should be utilized in cases when further intervention is contemplated. PMID:28852355
Chen, Yong; Huang, Biao; Hu, Wenyou; Weindorf, David C; Liu, Xiaoxiao; Niedermann, Silvana
2014-02-01
The risk assessment of trace elements in different environmental media in conventional and organic greenhouse vegetable production systems (CGVPS and OGVPS) can reveal the influence of different farming philosophies on trace element accumulation and its effects on human health, providing important baseline data for environmental protection and human health. This paper presents trace element accumulation characteristics of different land uses; reveals the difference in soil trace element accumulation both with and without consideration of background levels; compares trace element uptake by main vegetables; and assesses the trace element risks of soils, vegetables, waters, and agricultural inputs, using two selected greenhouse vegetable systems in Nanjing, China as examples. Results showed that greenhouse vegetable fields contained significant accumulations of Zn in CGVPS relative to rice-wheat rotation fields, open vegetable fields, and geochemical background levels, and this was the case for organic matter in OGVPS. Comparative analysis of the soil medium in the two systems with consideration of geochemical background levels, and evaluation of the geo-accumulation pollution index, achieved a more reasonable comparison and more accurate assessment than direct comparison analysis and evaluation of the Nemerow pollution index, respectively. According to the Chinese food safety standards and the value of the target hazard quotient or hazard index, trace element contents of vegetables were safe for local residents in both systems. However, the spatial distribution of the estimated hazard index for producers still presented certain hotspots which may pose potential risk to human health in CGVPS. The water was mainly influenced by nitrogen, especially in CGVPS, while the potential risk of Cd and Cu pollution came from sediments in OGVPS.
The main inputs for trace elements were fertilizers which were relatively safe based on relevant standards; but excess application caused trace element accumulations in the environmental media. Copyright © 2013 Elsevier B.V. All rights reserved.
Determining gestational age and preterm birth in rural Guatemala: A comparison of methods
Thompson, Lisa M.; Díaz Artiga, Anaité; Bryan, Joe P.; Arriaga, William E.; Omer, Saad B.; McCracken, John P.
2018-01-01
Background Preterm birth is the leading cause of death among children <5 years of age. Accurate determination of prematurity is necessary to provide appropriate neonatal care and guide preventive measures. To estimate the most accurate method to identify infants at risk for adverse outcomes, we assessed the validity of two widely available methods—last menstrual period (LMP) and the New Ballard (NB) neonatal assessment—against ultrasound in determining gestational age and preterm birth in highland Guatemala. Methods Pregnant women (n = 188) were recruited with a gestational age <20 weeks and followed until delivery. Ultrasound was performed by trained physicians and LMP was collected during recruitment. NB was performed on infants within 96 hours of birth by trained study nurses. LMP and NB accuracy at determining gestational age and identifying prematurity was assessed by comparing them to ultrasound. Results By ultrasound, infant mean gestational age at birth was 38.3 weeks (SD = 1.6), with 16% born at less than 37 weeks' gestation. LMP was more accurate than NB (mean difference of +0.13 weeks for LMP and +0.61 weeks for NB). However, LMP and NB estimates had low agreement with ultrasound-determined gestational age (Lin's concordance < 0.48 for both methods) and preterm birth (κ < 0.29 for both methods). By LMP, 18% were judged premature compared with 6% by NB. LMP underestimated gestational age among women presenting later to prenatal care (0.18 weeks for each additional week). Gestational age for preterm infants was overestimated by nearly one week using LMP and nearly two weeks using NB. New Ballard neuromuscular measurements were more predictive of preterm birth than those measuring physical criteria. Conclusion In an indigenous population in highland Guatemala, LMP overestimated prematurity by 2% and NB underestimated prematurity by 10% compared with ultrasound estimates.
New, simple and accurate methods are needed to identify preterm birth in resource-limited settings worldwide. PMID:29554145
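The agreement statistics used in the study above (mean difference in weeks, and κ for the binary preterm classification) are straightforward to compute. A minimal sketch with made-up data, not the study's:

```python
def mean_difference(estimates, reference):
    """Mean difference (bias) of a method against the reference, in weeks."""
    return sum(a - b for a, b in zip(estimates, reference)) / len(reference)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement beyond chance for two binary raters
    (here, preterm yes/no by one method vs. ultrasound)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                   # marginal rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (po - pe) / (1 - pe)
```

A small mean difference (like LMP's +0.13 weeks) can coexist with poor individual-level agreement (κ < 0.29), which is why the paper reports both.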
Meunier, Carl J; Roberts, James G; McCarty, Gregory S; Sombers, Leslie A
2017-02-15
Background-subtracted fast-scan cyclic voltammetry (FSCV) has emerged as a powerful analytical technique for monitoring subsecond molecular fluctuations in live brain tissue. Despite increasing utilization of FSCV, efforts to improve the accuracy of quantification have been limited due to the complexity of the technique and the dynamic recording environment. It is clear that variable electrode performance renders calibration necessary for accurate quantification; however, the nature of in vivo measurements can make conventional postcalibration difficult, or even impossible. Analyte-specific voltammograms and scaling factors that are critical for quantification can shift or fluctuate in vivo. This is largely due to impedance changes, and the effects of impedance on these measurements have not been characterized. We have previously reported that the background current can be used to predict electrode-specific scaling factors in situ. In this work, we employ model circuits to investigate the impact of impedance on FSCV measurements. Additionally, we take another step toward in situ electrode calibration by using the oxidation potential of quinones on the electrode surface to accurately predict the oxidation potential for dopamine at any point in an electrochemical experiment, as both are dependent on impedance. The model, validated both in adrenal slice and live brain tissue, enables information encoded in the shape of the background voltammogram to determine electrochemical parameters that are critical for accurate quantification. This improves data interpretation and provides a significant next step toward more automated methods for in vivo data analysis.
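Background subtraction in FSCV conventionally averages a handful of scans recorded just before the event of interest and subtracts that mean from every scan; concentration then follows from the peak faradaic current and an electrode-specific scaling factor. A minimal sketch (the array layout and the sensitivity value are illustrative assumptions, not values from this paper):

```python
import numpy as np

def background_subtract(scans, i_event, n_bg=10):
    """Subtract the mean of the n_bg scans immediately preceding i_event.
    scans: 2-D array, one cyclic voltammogram per row."""
    bg = scans[i_event - n_bg:i_event].mean(axis=0)
    return scans - bg

def estimate_concentration(subtracted_scan, sensitivity_nA_per_uM):
    """Peak faradaic current divided by the electrode-specific scaling
    factor gives an apparent concentration in uM."""
    return subtracted_scan.max() / sensitivity_nA_per_uM
```

The paper's point is precisely that `sensitivity_nA_per_uM` drifts with impedance in vivo, which is why they predict it in situ from the background voltammogram rather than relying on a fixed post-calibration.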
Measurement of the Velocity of the Neutrino with MINOS
2012-01-01
in the cosmic microwave background, but these are not direct measurements of the neutrino velocity. In September 2011, the OPERA experiment reported...neutrino interactions in the MINOS detectors is used to reject background from muons from cosmic rays. Each detector has a Truetime XL-AK single...accurate result. I. BACKGROUND The earliest measurements of the speed of the neutrino were made in the 1970s, with the Fermilab Main Ring narrow
Pallaro, Anabel; Tarducci, Gabriel
2014-12-01
The application of nuclear techniques in the area of nutrition is safe because they use stable isotopes. The deuterium dilution method is used in body composition and human milk intake analysis. It is a reference method for measuring body fat and for validating inexpensive tools, owing to its accuracy, its simplicity of application in individuals and populations, and its established usefulness in adults and children as an evaluation tool in clinical and health programs. It is a non-invasive technique, as it uses saliva, which facilitates assessment in pediatric populations. Changes in body fat are associated with non-communicable diseases; moreover, normal-weight individuals with high fat deposition have been reported. Furthermore, this technique is the only accurate way to determine whether infants are exclusively breast-fed and to validate conventional methods based on surveys of mothers.
Han, Sang-Wook; Park, Chang-Jin; Lee, Sang-Won; Ronald, Pamela C
2008-01-01
Background Xanthomonas oryzae pv. oryzae, the causal agent of bacterial blight disease, is a serious pathogen of rice. Here we describe a fluorescent marker system to study virulence and pathogenicity of X. oryzae pv. oryzae. Results A fluorescent X. oryzae pv. oryzae Philippine race 6 strain expressing green fluorescent protein (GFP) (PXO99GFP) was generated using the gfp gene under the control of the neomycin promoter in the vector pPneo-gfp. The PXO99GFP strain displayed identical virulence and avirulence properties as the wild-type control strain, PXO99. Using fluorescence microscopy, bacterial multiplication and colonization were directly observed in rice xylem vessels. Accurate and rapid determination of bacterial growth was assessed using fluorometry and an Enzyme-Linked Immunosorbent Assay (ELISA). Conclusion Our results indicate that the fluorescent marker system is useful for assessing bacterial infection and monitoring bacterial multiplication in planta. PMID:18826644
Design-based stereology: Planning, volumetry and sampling are crucial steps for a successful study.
Tschanz, Stefan; Schneider, Jan Philipp; Knudsen, Lars
2014-01-01
Quantitative data obtained by means of design-based stereology can add valuable information to studies performed on a diversity of organs, in particular when correlated to functional/physiological and biochemical data. Design-based stereology is based on a sound statistical background and can be used to generate accurate data which are in line with principles of good laboratory practice. In addition, by adjusting the study design an appropriate precision can be achieved to find relevant differences between groups. For the success of the stereological assessment detailed planning is necessary. In this review we focus on common pitfalls encountered during stereological assessment. An exemplary workflow is included, and based on authentic examples, we illustrate a number of sampling principles which can be implemented to obtain properly sampled tissue blocks for various purposes. Copyright © 2013 Elsevier GmbH. All rights reserved.
Assessing fullness of asthma patients' aerosol inhalers.
Rickenbach, M A; Julious, S A
1994-01-01
BACKGROUND. The importance of regular medication in order to control asthma symptoms is recognized. However, there is no accurate mechanism for assessing the fullness of aerosol inhalers. The contribution to asthma morbidity of unexpectedly running out of inhaled medication is unknown. AIM. A study was undertaken to determine how patients assess inhaler fullness and the accuracy of their assessments, and to evaluate the floatation method of assessing inhaler fullness. METHOD. An interview survey of 98 patients (51% of those invited to take part), using 289 inhalers, was completed at one general practice in Hampshire. RESULTS. One third of participants said they had difficulty assessing aerosol inhaler fullness, and those aged 60 years and over were found to be less accurate in assessing fullness than younger participants. Shaking the inhaler to feel the contents move was the commonest method of assessment. When placed in water, an inhaler canister floating on its side with a corner of the canister valve exposed to air indicates that the canister is less than 15% full (sensitivity 90%, specificity 99%). CONCLUSION. Floating a canister in water provides an objective measurement of aerosol inhaler fullness. Providing the method is recommended by the aerosol inhaler manufacturer, general practitioners should demonstrate the floatation method to patients experiencing difficulty in assessing inhaler fullness. PMID:7619099
Accurate estimates for North American background (NAB) ozone (O3) in surface air over the United States are needed for setting and implementing an attainable national O3 standard. These estimates rely on simulations with atmospheric chemistry-transport models that set North Amer...
Comparison of Measured Galactic Background Radiation at L-Band with Model
NASA Technical Reports Server (NTRS)
LeVine, David M.; Abraham, Saji; Kerr, Yann H.; Wilson, William J.; Skou, Niels; Sobjaerg, Sten
2004-01-01
Radiation from the celestial sky in the spectral window at 1.413 GHz is strong, and an accurate accounting of this background radiation is needed for calibration and retrieval algorithms. Modern radio astronomy measurements in this window have been converted into a brightness temperature map of the celestial sky at L-band suitable for such applications. This paper presents a comparison of the background predicted by this map with the measurements of several modern L-band remote sensing radiometers.
NASA Astrophysics Data System (ADS)
O'Keeffe, Brendon Andrew; Johnson, Michael
2017-01-01
Light pollution plays an ever increasing role in the operations of observatories across the world. This is especially true in urban environments like Columbus, GA, where Columbus State University's WestRock Observatory is located. Light pollution's effects on an observatory include high background levels, which result in a lower signal-to-noise ratio. Overall, this will limit what the telescope can detect, and therefore limit the capabilities of the observatory as a whole. Light pollution has been mapped in Columbus before using VIIRS DNB composites. However, this approach did not provide the detailed resolution required to narrow down the problem areas in the vicinity of the observatory. The purpose of this study is to assess the current state of light pollution surrounding the WestRock Observatory by measuring and mapping the brightness of the sky due to light pollution using light meters and geographic information system (GIS) software. Compared to VIIRS data, this study allows for an improved spatial resolution and a direct measurement of the sky background. This assessment will enable future studies to compare their results to the baseline established here, ensuring that any changes to the way the outdoors are illuminated, and their effects, can be accurately measured and counterbalanced.
Effects of noise levels and call types on the source levels of killer whale calls.
Holt, Marla M; Noren, Dawn P; Emmons, Candice K
2011-11-01
Accurate parameter estimates relevant to the vocal behavior of marine mammals are needed to assess potential effects of anthropogenic sound exposure including how masking noise reduces the active space of sounds used for communication. Information about how these animals modify their vocal behavior in response to noise exposure is also needed for such assessment. Prior studies have reported variations in the source levels of killer whale sounds, and a more recent study reported that killer whales compensate for vessel masking noise by increasing their call amplitude. The objectives of the current study were to investigate the source levels of a variety of call types in southern resident killer whales while also considering background noise level as a likely factor related to call source level variability. The source levels of 763 discrete calls along with corresponding background noise were measured over three summer field seasons in the waters surrounding the San Juan Islands, WA. Both noise level and call type were significant factors on call source levels (1-40 kHz band, range of 135.0-175.7 dB(rms) re 1 μPa at 1 m). These factors should be considered in models that predict how anthropogenic masking noise reduces vocal communication space in marine mammals.
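Source levels like those reported above are back-calculated from received levels with a transmission-loss model. A minimal sketch assuming simple spherical spreading (20 log10 r); the study's actual propagation model may differ:

```python
import math

def source_level_db(received_db, range_m):
    """Back-calculate source level (dB re 1 uPa at 1 m) from a received
    level, assuming spherical spreading transmission loss of 20*log10(r)."""
    return received_db + 20.0 * math.log10(range_m)
```

For example, a call received at 120 dB from a whale localized 100 m away implies a 160 dB source level under this assumption; absorption and multipath effects would modify the estimate in practice.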
Kafferlein, H; Ferstl, C; Burkhart-Reichl, A; Hennebruder, K; Drexler, H; Bruning, T; Angerer, J
2005-01-01
Background: N,N-dimethylformamide (DMF) was recently prioritised for field studies by the National Toxicology Program based on the potency of its reproductive toxic effects. Aims: To accurately measure exposure to DMF in occupational settings. Methods: In 35 healthy workers employed in the polyacrylic fibre industry, N-methylformamide (NMF) and N-acetyl-S-(N-methylcarbamoyl)cysteine (AMCC) in urine, and N-methylcarbamoylated haemoglobin (NMHb) in blood were measured. Workplace documentation and questionnaire information were used to categorise workers into groups exposed to low, medium, and high concentrations of DMF. Results: All three biomarkers can be used to identify occupational exposure to DMF. However, only the analysis of NMHb could accurately distinguish between workers exposed to different concentrations of DMF. The median concentrations were 55.1, 122.8, and 152.6 nmol/g globin in workers exposed to low, medium, and high concentrations of DMF, respectively. The use of NMHb made it possible to identify all working tasks with increased exposure to DMF. While fibre crimpers were found to be least exposed to DMF, persons washing, dyeing, or towing the fibres were found to be highly exposed. In addition, NMHb measurements were capable of uncovering working tasks that previously were not associated with increased exposure to DMF; for example, the person preparing the fibre-forming solution. Conclusions: Measurement of NMHb in blood is recommended rather than measurement of NMF and AMCC in urine to accurately assess exposure to DMF in health risk assessment. However, NMF and AMCC are useful biomarkers for occupational hygiene intervention. Further investigations regarding toxicity of DMF should focus on highly exposed persons in the polyacrylic fibre industry.
Additional measurements in occupational settings other than the polyacrylic fibre industry are also recommended, since the population at risk and the production volume of DMF are high. PMID:15837855
Robust Small Target Co-Detection from Airborne Infrared Image Sequences.
Gao, Jingli; Wen, Chenglin; Liu, Meiqin
2017-09-29
In this paper, a novel infrared target co-detection model combining the self-correlation features of backgrounds and the commonality features of targets in the spatio-temporal domain is proposed to detect small targets in a sequence of infrared images with complex backgrounds. Firstly, a dense target extraction model based on nonlinear weights is proposed, which suppresses image backgrounds and enhances small targets better than singular-value weighting. Secondly, a sparse target extraction model based on entry-wise weighted robust principal component analysis is proposed. The entry-wise weight adaptively incorporates a structural prior in terms of local weighted entropy; thus, it can extract real targets accurately and suppress background clutter efficiently. Finally, the commonality of targets in the spatio-temporal domain is used to construct a target refinement model for false-alarm suppression and target confirmation. Since real targets should appear in both the dense and sparse reconstruction maps of a single frame, and form trajectories after tracklet association across consecutive frames, the location correlation of the dense and sparse reconstruction maps for a single frame and the tracklet association of the location correlation maps for successive frames together discriminate strongly between small targets and background clutter. Experimental results demonstrate that the proposed small target co-detection method not only suppresses background clutter effectively, but also detects targets accurately, even in the presence of target-like interference.
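The local-entropy structural prior mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the window size and histogram bin count are assumptions:

```python
import numpy as np

def local_entropy(img, win=5, bins=16):
    """Shannon entropy of grey-level histograms in win x win
    neighborhoods.  Flat background regions give low entropy,
    while structured regions (targets, clutter edges) give high
    entropy, so the map can serve as an entry-wise weight."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros((h, w))
    lo, hi = img.min(), img.max() + 1e-12
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(patch, bins=bins, range=(lo, hi))
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = -np.sum(p * np.log2(p))
    return out
```

A weighted RPCA step would then scale each entry's penalty by a function of this map; that step is not reproduced here.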
Spatiotemporal models for the simulation of infrared backgrounds
NASA Astrophysics Data System (ADS)
Wilkes, Don M.; Cadzow, James A.; Peters, R. Alan, II; Li, Xingkang
1992-09-01
It is highly desirable for designers of automatic target recognizers (ATRs) to be able to test their algorithms on targets superimposed on a wide variety of background imagery. Background imagery in the infrared spectrum is expensive to gather from real sources; consequently, there is a need for accurate models for producing synthetic IR background imagery. We have developed a model for such imagery that will do the following: given a real infrared background image, generate another image, distinctly different from the one given, that has the same general visual characteristics as well as the first- and second-order statistics of the original image. The proposed model consists of a finite impulse response (FIR) kernel convolved with an excitation function, and histogram modification applied to the final solution. A procedure for deriving the FIR kernel using a signal enhancement algorithm has been developed, and the histogram modification step is a simple memoryless nonlinear mapping that imposes the first-order statistics of the original image onto the synthetic one; thus the overall model is a linear system cascaded with a memoryless nonlinearity. It has been found that the excitation function relates to the placement of features in the image, the FIR kernel controls the sharpness of the edges and the global spectrum of the image, and the histogram controls the basic coloration of the image. A drawback to this method of simulating IR backgrounds is that a database of actual background images must be collected in order to produce accurate FIR and histogram models. If this database must include images of all types of backgrounds obtained at all times of the day and all times of the year, the size of the database would be prohibitive. In this paper we propose improvements to the model described above that enable time-dependent modeling of the IR background.
This approach can greatly reduce the number of actual IR backgrounds that are required to produce a sufficiently accurate mathematical model for synthesizing a similar IR background for different times of the day. Original and synthetic IR backgrounds will be presented. Previous research in simulating IR backgrounds was performed by Strenzwilk, et al., Botkin, et al., and Rapp. The most recent work of Strenzwilk, et al. was based on the use of one-dimensional ARMA models for synthesizing the images. Their results were able to retain the global statistical and spectral behavior of the original image, but the synthetic image was not visually very similar to the original. The research presented in this paper is the result of an attempt to improve upon their results, and represents a significant improvement in quality over previously obtained results.
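The model described, a linear FIR stage cascaded with a memoryless histogram-imposing nonlinearity, can be sketched roughly as below. Rank-based histogram specification is used here as a stand-in for the paper's exact mapping, and the excitation and reference images are assumed to be the same size:

```python
import numpy as np

def synthesize(excitation, fir_kernel, reference):
    """Convolve an excitation field with the FIR kernel (linear
    stage), then impose the reference image's first-order
    statistics by histogram matching (memoryless nonlinearity)."""
    # circular convolution via FFT, kernel zero-padded to image size
    k = np.zeros_like(excitation)
    kh, kw = fir_kernel.shape
    k[:kh, :kw] = fir_kernel
    synth = np.real(np.fft.ifft2(np.fft.fft2(excitation) * np.fft.fft2(k)))
    # rank-based histogram specification: map sorted synthetic
    # values onto sorted reference values
    flat = synth.ravel()
    order = np.argsort(flat)
    matched = np.empty_like(flat)
    matched[order] = np.sort(reference.ravel())
    return matched.reshape(synth.shape)
```

By construction the output's grey-level histogram exactly matches the reference image's, while its spatial structure comes from the excitation and FIR kernel.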
Dual-tracer background subtraction approach for fluorescent molecular tomography
Holt, Robert W.; El-Ghussein, Fadi; Davis, Scott C.; Samkoe, Kimberley S.; Gunn, Jason R.; Leblond, Frederic
2013-01-01
Abstract. Diffuse fluorescence tomography requires high contrast-to-background ratios to accurately reconstruct inclusions of interest. This is a problem when imaging the uptake of fluorescently labeled molecularly targeted tracers in tissue, which can result in high levels of heterogeneously distributed background uptake. We present a dual-tracer background subtraction approach, wherein signal from the uptake of an untargeted tracer is subtracted from targeted tracer signal prior to image reconstruction, resulting in maps of targeted tracer binding. The approach is demonstrated in simulations, a phantom study, and a mouse glioma imaging study, showing substantial improvement over conventional and homogeneous background subtraction image reconstruction approaches. PMID:23292612
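The core subtraction step can be sketched as follows. The least-squares scale estimate is an assumption introduced here to account for tracer-dependent detection sensitivity; the study's actual inter-tracer calibration may differ:

```python
import numpy as np

def dual_tracer_subtract(targeted, untargeted, scale=None):
    """Subtract the untargeted-tracer signal from the targeted-tracer
    signal prior to reconstruction, leaving (approximately) the
    binding-specific component.  If no scale factor is supplied,
    one is estimated by least squares over the field of view."""
    targeted = np.asarray(targeted, float)
    untargeted = np.asarray(untargeted, float)
    if scale is None:
        scale = (np.dot(untargeted.ravel(), targeted.ravel())
                 / np.dot(untargeted.ravel(), untargeted.ravel()))
    return targeted - scale * untargeted
```

When the targeted signal is a pure multiple of the untargeted one (no specific binding anywhere), the auto-scaled difference is zero, which is the desired null behavior.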
Telemedicine in acute plastic surgical trauma and burns.
Jones, S. M.; Milroy, C.; Pickford, M. A.
2004-01-01
BACKGROUND: Telemedicine is a relatively new development within the UK, but is increasingly useful in many areas of medicine including plastic surgery. Plastic surgery centres often work on a hub-and-spoke basis with many district hospitals referring to one tertiary centre. The Queen Victoria Hospital is one such centre, receiving calls from more than 28 hospitals in the Southeast of England resulting in approximately 20 referrals a day. OBJECTIVE: A telemedicine system was developed to improve trauma management. This study was designed to establish whether digital images were sufficiently accurate to aid decision-making. A store-and-forward telemedicine system was devised and the images of 150 trauma referrals evaluated in terms of injury severity and operative priority by each member of the plastic surgical team. RESULTS: Correlation scores for assessed images were high. Accuracy of "transmitted image" in comparison to injury on examination scored > 97%. Operative priority scores tended to be higher than injury severity. CONCLUSIONS: Telemedicine is an accurate method by which to transfer information on plastic surgical trauma including burns. PMID:15239862
Subar, Amy F; Crafts, Jennifer; Zimmerman, Thea Palmer; Wilson, Michael; Mittl, Beth; Islam, Noemi G; McNutt, Suzanne; Potischman, Nancy; Buday, Richard; Hull, Stephen G; Baranowski, Tom; Guenther, Patricia M; Willis, Gordon; Tapia, Ramsey; Thompson, Frances E
2010-01-01
To assess the accuracy of portion-size estimates and participant preferences using various presentations of digital images. Two observational feeding studies were conducted. In both, each participant selected and consumed foods for breakfast and lunch, buffet style, serving themselves portions of nine foods representing five forms (eg, amorphous, pieces). Serving containers were weighed unobtrusively before and after selection as was plate waste. The next day, participants used a computer software program to select photographs representing portion sizes of foods consumed the previous day. Preference information was also collected. In Study 1 (n=29), participants were presented with four different types of images (aerial photographs, angled photographs, images of mounds, and household measures) and two types of screen presentations (simultaneous images vs an empty plate that filled with images of food portions when clicked). In Study 2 (n=20), images were presented in two ways that varied by size (large vs small) and number (4 vs 8). Convenience sample of volunteers of varying background in an office setting. Repeated-measures analysis of variance of absolute differences between actual and reported portion sizes by presentation methods. Accuracy results were largely not statistically significant, indicating that no one image type was most accurate. Accuracy results indicated that the use of eight vs four images was more accurate. Strong participant preferences supported presenting simultaneous vs sequential images. These findings support the use of aerial photographs in the automated self-administered 24-hour recall. For some food forms, images of mounds or household measures are as accurate as images of food and, therefore, are a cost-effective alternative to photographs of foods. Copyright 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.
Subar, Amy F.; Crafts, Jennifer; Zimmerman, Thea Palmer; Wilson, Michael; Mittl, Beth; Islam, Noemi G.; Mcnutt, Suzanne; Potischman, Nancy; Buday, Richard; Hull, Stephen G.; Baranowski, Tom; Guenther, Patricia M.; Willis, Gordon; Tapia, Ramsey; Thompson, Frances E.
2013-01-01
Objective To assess the accuracy of portion-size estimates and participant preferences using various presentations of digital images. Design Two observational feeding studies were conducted. In both, each participant selected and consumed foods for breakfast and lunch, buffet style, serving themselves portions of nine foods representing five forms (eg, amorphous, pieces). Serving containers were weighed unobtrusively before and after selection as was plate waste. The next day, participants used a computer software program to select photographs representing portion sizes of foods consumed the previous day. Preference information was also collected. In Study 1 (n=29), participants were presented with four different types of images (aerial photographs, angled photographs, images of mounds, and household measures) and two types of screen presentations (simultaneous images vs an empty plate that filled with images of food portions when clicked). In Study 2 (n=20), images were presented in two ways that varied by size (large vs small) and number (4 vs 8). Subjects/setting Convenience sample of volunteers of varying background in an office setting. Statistical analyses performed Repeated-measures analysis of variance of absolute differences between actual and reported portion sizes by presentation methods. Results Accuracy results were largely not statistically significant, indicating that no one image type was most accurate. Accuracy results indicated that the use of eight vs four images was more accurate. Strong participant preferences supported presenting simultaneous vs sequential images. Conclusions These findings support the use of aerial photographs in the automated self-administered 24-hour recall. For some food forms, images of mounds or household measures are as accurate as images of food and, therefore, are a cost-effective alternative to photographs of foods. PMID:20102828
Morrow, Linda; Hompesch, Marcus; Tideman, Ann M; Matson, Jennifer; Dunne, Nancy; Pardo, Scott; Parkes, Joan L; Schachner, Holly C; Simmons, David A
2011-01-01
Background This glucose clamp study assessed the performance of an electrochemical continuous glucose monitoring (CGM) system for monitoring levels of interstitial glucose. This novel system does not require use of a trocar or needle for sensor insertion. Method Continuous glucose monitoring sensors were inserted subcutaneously into the abdominal tissue of 14 adults with type 1 or type 2 diabetes. Subjects underwent an automated glucose clamp procedure with four consecutive post-steady-state glucose plateau periods (40 min each): (a) hypoglycemic (50 mg/dl), (b) hyperglycemic (250 mg/dl), (c) second hypoglycemic (50 mg/dl), and (d) euglycemic (90 mg/dl). Plasma glucose results obtained with YSI glucose analyzers were used for sensor calibration. Accuracy was assessed retrospectively for plateau periods and transition states, when glucose levels were changing rapidly (approximately 2 mg/dl/min). Results Mean absolute percent difference (APD) was lowest during hypoglycemic plateaus (11.68%, 14.15%) and the euglycemic-to-hypoglycemic transition (14.21%). Mean APD during the hyperglycemic plateau was 17.11%; mean APDs were 18.12% and 19.25% during the hypoglycemic-to-hyperglycemic and hyperglycemic-to-hypoglycemic transitions, respectively. Parkes (consensus) error grid analysis (EGA) and rate EGA of the plateaus and transition periods, respectively, yielded 86.8% and 68.6% accurate results (zone A) and 12.1% and 20.0% benign errors (zone B). Continuous EGA yielded 88.5%, 75.4%, and 79.3% accurate results and 8.3%, 14.3%, and 2.4% benign errors for the euglycemic, hyperglycemic, and hypoglycemic transition periods, respectively. Adverse events were mild and unlikely to be device related. Conclusion This novel CGM system was safe and accurate across the clinically relevant glucose range. PMID:21880226
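The mean absolute percent difference (APD) metric reported above is straightforward to compute from paired sensor and reference readings; a minimal sketch:

```python
import numpy as np

def mean_apd(cgm, reference):
    """Mean absolute percent difference between sensor readings and
    paired reference (e.g., YSI) values, both in mg/dl."""
    cgm = np.asarray(cgm, float)
    reference = np.asarray(reference, float)
    return 100.0 * np.mean(np.abs(cgm - reference) / reference)
```

For example, readings of 55 and 45 mg/dl against references of 50 and 50 mg/dl give `mean_apd([55, 45], [50, 50])  # → 10.0`.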
Physiological and biochemical basis of clinical liver function tests: a review.
Hoekstra, Lisette T; de Graaf, Wilmar; Nibourg, Geert A A; Heger, Michal; Bennink, Roelof J; Stieger, Bruno; van Gulik, Thomas M
2013-01-01
To review the literature on the most clinically relevant and novel liver function tests used for the assessment of hepatic function before liver surgery. Postoperative liver failure is the major cause of mortality and morbidity after partial liver resection and develops as a result of insufficient remnant liver function. Therefore, accurate preoperative assessment of the future remnant liver function is mandatory in the selection of candidates for safe partial liver resection. A MEDLINE search was performed using the key words "liver function tests," "functional studies in the liver," "compromised liver," "physiological basis," and "mechanistic background," with and without Boolean operators. Passive liver function tests, including biochemical parameters and clinical grading systems, are not accurate enough in predicting outcome after liver surgery. Dynamic quantitative liver function tests, such as the indocyanine green test and galactose elimination capacity, are more accurate as they measure the elimination process of a substance that is cleared and/or metabolized almost exclusively by the liver. However, these tests only measure global liver function. Nuclear imaging techniques ((99m)Tc-galactosyl serum albumin scintigraphy and (99m)Tc-mebrofenin hepatobiliary scintigraphy) can measure both total and future remnant liver function and potentially identify patients at risk for postresectional liver failure. Because of the complexity of liver function, one single test does not represent overall liver function. In addition to computed tomography volumetry, quantitative liver function tests should be used to determine whether a safe resection can be performed. Presently, (99m)Tc-mebrofenin hepatobiliary scintigraphy seems to be the most valuable quantitative liver function test, as it can measure multiple aspects of liver function in, specifically, the future remnant liver.
Simulations of eddy kinetic energy transport in barotropic turbulence
NASA Astrophysics Data System (ADS)
Grooms, Ian
2017-11-01
Eddy energy transport in rotating two-dimensional turbulence is investigated using numerical simulation. Stochastic forcing is used to generate an inhomogeneous field of turbulence and the time-mean energy profile is diagnosed. An advective-diffusive model for the transport is fit to the simulation data by requiring the model to accurately predict the observed time-mean energy distribution. Isotropic harmonic diffusion of energy is found to be an accurate model in the case of uniform, solid-body background rotation (the f plane), with a diffusivity that scales reasonably well with a mixing-length law κ ∝V ℓ , where V and ℓ are characteristic eddy velocity and length scales. Passive tracer dynamics are added and it is found that the energy diffusivity is 75 % of the tracer diffusivity. The addition of a differential background rotation with constant vorticity gradient β leads to significant changes to the energy transport. The eddies generate and interact with a mean flow that advects the eddy energy. Mean advection plus anisotropic diffusion (with reduced diffusivity in the direction of the background vorticity gradient) is moderately accurate for flows with scale separation between the eddies and mean flow, but anisotropic diffusion becomes a much less accurate model of the transport when scale separation breaks down. Finally, it is observed that the time-mean eddy energy does not look like the actual eddy energy distribution at any instant of time. In the future, stochastic models of the eddy energy transport may prove more useful than models of the mean transport for predicting realistic eddy energy distributions.
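The mixing-length scaling and the idea of fitting a diffusivity against a diagnosed mean profile can be sketched as follows. This is a simplified 1-D, constant-κ illustration of the balance F = -d/dy(κ dE/dy); the study fits a full advective-diffusive model, which is not reproduced here:

```python
import numpy as np

def mixing_length_diffusivity(u, v, length_scale, c=1.0):
    """Mixing-length estimate kappa = c * V * l, with V the rms
    eddy speed diagnosed from velocity fields u and v."""
    V = np.sqrt(np.mean(u**2 + v**2))
    return c * V * length_scale

def fit_constant_diffusivity(y, energy, forcing):
    """Least-squares fit of a constant kappa in the steady balance
    F = -d/dy(kappa * dE/dy), i.e. forcing ~ kappa * (-d2E/dy2)."""
    dy = y[1] - y[0]
    neg_d2e = -np.gradient(np.gradient(energy, dy), dy)
    return np.dot(neg_d2e, forcing) / np.dot(neg_d2e, neg_d2e)
```

With a diagnosed profile E(y) and known forcing, the second function recovers the κ that best closes the 1-D balance in a least-squares sense.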
Accurate beacon positioning method for satellite-to-ground optical communication.
Wang, Qiang; Tong, Ling; Yu, Siyuan; Tan, Liying; Ma, Jing
2017-12-11
In satellite laser communication systems, accurate positioning of the beacon is essential for establishing a steady laser communication link. For satellite-to-ground optical communication, the main influencing factors on the acquisition of the beacon are background noise and atmospheric turbulence. In this paper, we consider the influence of background noise and atmospheric turbulence on the beacon in satellite-to-ground optical communication, and propose a new locating algorithm for the beacon, which takes the correlation coefficients obtained by curve fitting of the image data as weights. By performing a long-distance laser communication experiment (11.16 km), we verified the feasibility of this method. Both simulation and experiment showed that the new algorithm can accurately obtain the position of the centroid of the beacon. Furthermore, for a light spot distorted by atmospheric turbulence, the locating accuracy of the new algorithm was 50% higher than that of the conventional gray centroid algorithm. This new approach will be beneficial for the design of satellite-to-ground optical communication systems.
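A rough sketch contrasting the conventional gray centroid with a fit-quality-weighted variant follows. The per-row quadratic-fit r² weights used here are a simplified stand-in for the paper's curve-fitting correlation coefficients, not its actual algorithm:

```python
import numpy as np

def gray_centroid(img):
    """Conventional intensity-weighted (gray) centroid."""
    img = np.asarray(img, float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

def fit_weighted_centroid(img):
    """Row-wise weighted centroid: each row's contribution is scaled
    by the squared correlation between the row and a fitted quadratic
    profile, so rows dominated by background noise (poor fits) carry
    little weight."""
    img = np.asarray(img, float)
    x = np.arange(img.shape[1])
    row_w = []
    for row in img:
        fit = np.polyval(np.polyfit(x, row, 2), x)
        r = np.corrcoef(row, fit)[0, 1]
        row_w.append(max(r, 0.0) ** 2)
    w = np.asarray(row_w)[:, None] * img
    ys, xs = np.indices(img.shape)
    return (ys * w).sum() / w.sum(), (xs * w).sum() / w.sum()
```

On a clean, symmetric spot both estimators agree; the weighted variant is intended to degrade more gracefully as background noise grows.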
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. Western blot quantification constitutes a critical step in obtaining accurate and reproducible results. Given the technical knowledge required for densitometry analysis, together with constraints on resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate, and reproducible approach that can be used when resource availability is limited. Copyright © 2018 Elsevier B.V. All rights reserved.
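A minimal densitometry sketch in the spirit of the workflow above. This mimics, but does not reproduce, either ImageJ's rolling-ball subtraction or the authors' proposed background method; the row-slice arguments are hypothetical choices the analyst would make per lane:

```python
import numpy as np

def band_intensity(lane, band_rows, bg_rows):
    """Integrated band intensity after local background subtraction.
    `lane` is a 2-D grayscale crop of one lane (film scans show dark
    bands on a light background, so invert before calling if needed);
    `band_rows` and `bg_rows` are row slices covering the band and a
    nearby signal-free region used to estimate local background."""
    lane = np.asarray(lane, float)
    background = np.median(lane[bg_rows])
    band = lane[band_rows] - background
    return band[band > 0].sum()

def relative_expression(target, loading_control):
    """Normalize a target band to its loading-control band."""
    return target / loading_control
```

Normalizing each target band to a loading control, as in the second function, is the standard guard against lane-to-lane loading differences.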
Comparison of model and human observer performance in FFDM, DBT, and synthetic mammography
NASA Astrophysics Data System (ADS)
Ikejimba, Lynda; Glick, Stephen J.; Samei, Ehsan; Lo, Joseph Y.
2016-03-01
Reader studies are important in assessing breast imaging systems. The purpose of this work was to assess task-based performance of full field digital mammography (FFDM), digital breast tomosynthesis (DBT), and synthetic mammography (SM) using different phantom types, and to determine an accurate observer model for human readers. Images were acquired on a Hologic Selenia Dimensions system with a uniform and anthropomorphic phantom. A contrast detail insert of small, low-contrast disks was created using an inkjet printer with iodine-doped ink and inserted in the phantoms. The disks varied in diameter from 210 to 630 μm, and in contrast from 1.1% contrast to 2.2% in regular increments. Human and model observers performed a 4-alternative forced choice experiment. The models were a non-prewhitening matched filter with eye model (NPWE) and a channelized Hotelling observer with either Gabor channels (Gabor-CHO) or Laguerre-Gauss channels (LG-CHO). With the given phantoms, reader scores were higher in FFDM and DBT than SM. The structure in the phantom background had a bigger impact on outcome for DBT than for FFDM or SM. All three model observers showed good correlation with humans in the uniform background, with ρ between 0.89 and 0.93. However, in the structured background, only the CHOs had high correlation, with ρ=0.92 for Gabor-CHO, 0.90 for LG-CHO, and 0.77 for NPWE. Because results of any analysis can depend on the phantom structure, conclusions of modality performance may need to be taken in the context of an appropriate model observer and a realistic phantom.
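A minimal channelized Hotelling observer with Gabor channels can be sketched as follows. The channel parameters and test-image construction are illustrative, not those used in the study:

```python
import numpy as np

def gabor_channel(size, freq, theta, sigma):
    """One even (cosine-phase) Gabor channel, unit-normalized and
    flattened for projection."""
    c = (size - 1) / 2.0
    y, x = np.indices((size, size)) - c
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * freq * xr)
    return (g / np.linalg.norm(g)).ravel()

def cho_dprime(signal_imgs, noise_imgs, channels):
    """CHO detectability: project images onto the channels, then
    d'^2 = dt^T S^-1 dt, with dt the mean channel-output difference
    and S the average within-class channel covariance."""
    U = np.stack(channels, axis=1)                    # (npix, nchan)
    ts = signal_imgs.reshape(len(signal_imgs), -1) @ U
    tn = noise_imgs.reshape(len(noise_imgs), -1) @ U
    dt = ts.mean(0) - tn.mean(0)
    S = 0.5 * (np.cov(ts, rowvar=False) + np.cov(tn, rowvar=False))
    return float(np.sqrt(dt @ np.linalg.solve(S, dt)))
```

A 4-AFC percent correct can then be obtained from d' under the usual signal-detection assumptions; that mapping is omitted here.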
British Military Mission (BMM) to Greece, 1942-44
2009-05-21
Failure to take into account and accurately assess political and military actions in such environments can lead to unintended consequences (potential civil war) affecting the stability of a country.
From the Cluster Temperature Function to the Mass Function at Low Z
NASA Technical Reports Server (NTRS)
Mushotzky, Richard (Technical Monitor); Markevitch, Maxim
2004-01-01
This XMM project consisted of three observations of the nearby, hot galaxy cluster Triangulum Australis, one of the cluster center and two offsets. The goal was to measure the radial gas temperature profile out to large radii and derive the total gravitating mass within the radius of average mass overdensity 500. The central pointing also provides data for a detailed two-dimensional gas temperature map of this interesting cluster. We have analyzed all three observations. The derivation of the temperature map using the central pointing is complete, and the paper is soon to be submitted. During the course of this study and of the analysis of archival XMM cluster observations, it became apparent that the commonly used XMM background flare screening techniques are often not accurate enough for studies of the cluster outer regions. The information on the cluster's total masses is contained at large off-center distances, and it is precisely the temperatures for those low-brightness regions that are most affected by the detector background anomalies. In particular, our two offset observations of the Triangulum have been contaminated by the background flares ("bad cosmic weather") to a degree where they could not be used for accurate spectral analysis. This forced us to expand the scope of our project. We needed to devise a more accurate method of screening and modeling the background flares, and to evaluate the uncertainty of the XMM background modeling. To do this, we have analyzed a large number of archival EPIC blank-field and closed-cover observations. As a result, we have derived stricter background screening criteria. It also turned out that mild flares affecting EPIC-pn can be modeled with an adequate accuracy. Such modeling has been used to derive our Triangulum temperature map. The results of our XMM background analysis, including the modeling recipes, are presented in a paper which is in final preparation and will be submitted soon. 
It will be useful not only for our future analysis but for other XMM cluster observations as well.
Arbour, Richard
2003-01-01
Practice concerns associated with the medical prescription and nurses' administration and monitoring of sedatives, analgesics, and neuromuscular blocking agents were identified by the clinical nurse specialist within a surgical intensive care unit of a large, tertiary-care referral center. These concerns were identified using a variety of needs assessment strategies. Results of the needs assessment were used to develop a program of care, including a teaching initiative, specific to these practice areas. The teaching initiative incorporated principles of andragogy, the theory of adult learning. Educational techniques included inservice education, bedside instruction using "teaching moments," competency-based education modules, and integration of instruction into critical care orientation. Content and approach were based on the background and level of experience of participants. Educational program outcomes included increased consistency in monitoring neuromuscular blockade by clinical assessment and peripheral nerve stimulation. A second outcome was more accurate patient assessment leading to the provision of drug therapy specific to the patients' clinical states, including anxiety or pain. The continuous quality improvement approach offers a model for improving patient care using individualized needs assessment, focused educational interventions, and program evaluation strategies.
2013-01-01
Background Calcium deficiency is a global public-health problem. Although the initial stage of calcium deficiency can lead to metabolic alterations or potential pathological changes, calcium deficiency is difficult to diagnose accurately. Moreover, the details of the molecular mechanism of calcium deficiency remain somewhat elusive. To accurately assess and provide appropriate nutritional intervention, we carried out a global analysis of metabolic alterations in response to calcium deficiency. Methods The metabolic alterations associated with calcium deficiency were first investigated in a rat model, using urinary metabonomics based on ultra-performance liquid chromatography coupled with quadrupole time-of-flight tandem mass spectrometry and multivariate statistical analysis. Correlations between dietary calcium intake and the biomarkers identified from the rat model were further analyzed to confirm the potential application of these biomarkers in humans. Results Urinary metabolic-profiling analysis could preliminarily distinguish between calcium-deficient and non-deficient rats after a 2-week low-calcium diet. We established an integrated metabonomics strategy for identifying reliable biomarkers of calcium deficiency using a time-course analysis of discriminating metabolites in a low-calcium diet experiment, repeating the low-calcium diet experiment and performing a calcium-supplement experiment. In total, 27 biomarkers were identified, including glycine, oxoglutaric acid, pyrophosphoric acid, sebacic acid, pseudouridine, indoxyl sulfate, taurine, and phenylacetylglycine. The integrated urinary metabonomics analysis, which combined biomarkers with regular trends of change (types A, B, and C), could accurately assess calcium-deficient rats at different stages and clarify the dynamic pathophysiological changes and molecular mechanism of calcium deficiency in detail. 
Significant correlations between calcium intake and two biomarkers, pseudouridine (Pearson correlation, r = 0.53, P = 0.0001) and citrate (Pearson correlation, r = -0.43, P = 0.001), were further confirmed in 70 women. Conclusions To our knowledge, this is the first report of reliable biomarkers of calcium deficiency, which were identified using an integrated strategy. The identified biomarkers give new insights into the pathophysiological changes and molecular mechanisms of calcium deficiency. The correlations between calcium intake and two of the biomarkers provide a rationale or potential for further assessment and elucidation of the metabolic responses of calcium deficiency in humans. PMID:23537001
Development and Utility of a Four-Channel Scanner for Wildland Fire Research and Applications
NASA Technical Reports Server (NTRS)
Ambrosia, Vincent G.; Brass, James A.; Higgins, Robert G.; Hildum, Edward; Peterson, David L. (Technical Monitor)
1996-01-01
The Airborne Infrared Disaster Assessment System (AIRDAS) is a four-channel scanner designed and built at NASA-Ames for the specific task of supporting research and applications on fire impacts on terrestrial and atmospheric processes, and also of serving as a vital instrument in the assessment of natural and man-induced disasters. The system has been flown on numerous airframes including the Navajo, King-Air, C-130, and Lear Jet 310 and a 206. The system includes a configuration composed of a 386 PC computer workstation, a non-linear detector amplifier, a sixteen-bit digitizer, dichroic filters, an Exabyte 8500 5 Gb tape output, VHS tape output, a Rockwell GPS, and a 2-axis gyro. The AIRDAS system collects digital data in four filterable wavelength regions: band 1 (0.61-0.68 microns), band 2 (1.57-1.7 microns), band 3 (3.6-5.5 microns), and band 4 (5.5-13.0 microns), with an FOV of 108 degrees, an IFOV of 2.62 mrads, and a digitized swath width of 720 pixels. The inclusion of the non-linear detector amplifier allows for the accurate measurement of emitted temperature from fires and hot spots. Lab testing of the scanner has indicated temperature assessments of 800 C without detector saturation. This is an advantage over previous systems, which were designed for thermal measurement of earth background temperatures and were ill-equipped for accurate determination of high intensity conditions. The scanner has been flown successfully on data collection missions since 1992 in the western US as well as Brazil. These and other research and applications results will be presented along with an assessment of future directions for the system.
Static and elevated pollen traps do not provide an accurate assessment of personal pollen exposure.
Penel, V; Calleja, M; Pichot, C; Charpin, D
2017-03-01
Background. Volumetric pollen traps are commonly used to assess pollen exposure. These traps are well suited for estimating the regional mean airborne pollen concentration but are likely not to provide an accurate index of personal exposure. In this study, we tested the hypothesis that hair sampling may provide different pollen counts from those from pollen traps, especially when the pollen exposure is diverse. Methods. We compared pollen counts in hair washes to counts provided by stationary volumetric and gravimetric pollen traps in 2 different settings: urban, with volunteers living a short distance from one another and from the static trap, and suburban, in which volunteers live in a scattered environment, quite far from the static trap. Results. Pollen counts in hair washes are in full agreement with trap counts for uniform pollen exposure. In contrast, for diverse pollen exposure, individual pollen counts in hair washes vary strongly in quantity and taxa composition between individuals and dates. These results demonstrate that the pollen counting method (hair washes vs. stationary pollen traps) may lead to different absolute and relative contributions of taxa to the total pollen count. Conclusions. In a geographic area with a high diversity of environmental exposure to pollen, static pollen traps, in contrast to hair washes, do not provide a reliable estimate of this higher diversity.
Measurement of radial artery contrast intensity to assess cardiac microbubble behavior.
Sosnovik, David E; Januzzi, James L; Church, Charles C; Mertsch, Judith A; Sears, Andrea L; Fetterman, Robert C; Walovitch, Richard C; Picard, Michael H
2003-12-01
We sought to determine whether analysis of the contrast signal from the radial artery is better able to reflect changes in left ventricular (LV) microbubble dynamics than the signal from the LV itself. Assessment of microbubble behavior from images of the LV may be affected by attenuation from overlying microbubbles and nonuniform background signal intensities. The signal intensity from contrast in a peripheral artery is not affected by these artifacts and may, thus, be more accurate. After injection of a contrast bolus into a peripheral vein, signal intensity was followed simultaneously in the LV and radial artery. The measurements were repeated using continuous, triggered, low and high mechanical index harmonic imaging of the LV. Peak and integrated signal intensities ranged from 25 dB and 1550 dB/s, respectively, with radial artery imaging to 5.6 dB and 471 dB/s with ventricular imaging. Although differences in microbubble behavior during the different imaging protocols could be determined from both the LV and radial artery curves, analysis of the radial artery curves yielded more consistent and robust differences. The signal from microbubbles in the radial artery is not affected by shadowing and is, thus, a more accurate reflection of microbubble behavior in the LV than the signal from the LV itself. This may have important implications for the measurement of myocardial perfusion by contrast echocardiography.
Towle, Erica L.; Richards, Lisa M.; Kazmi, S. M. Shams; Fox, Douglas J.; Dunn, Andrew K.
2013-01-01
BACKGROUND Assessment of the vasculature is critical for overall success in cranial vascular neurological surgery procedures. Although several methods of monitoring cortical perfusion intraoperatively are available, not all are appropriate or convenient in a surgical environment. Recently, 2 optical methods have emerged that are able to obtain high spatial resolution images with easily implemented instrumentation: indocyanine green (ICG) angiography and laser speckle contrast imaging (LSCI). OBJECTIVE To evaluate the usefulness of ICG and LSCI in measuring vessel perfusion. METHODS An experimental setup was developed that simultaneously collects measurements of ICG fluorescence and LSCI in a rodent model. A 785-nm laser diode was used for both excitation of the ICG dye and the LSCI illumination. A photothrombotic clot model was used to occlude specific vessels within the field of view to enable comparison of the 2 methods for monitoring vessel perfusion. RESULTS The induced blood flow change demonstrated that ICG is an excellent method for visualizing the volume and type of vessel at a single point in time; however, it is not always an accurate representation of blood flow. In contrast, LSCI provides a continuous and accurate measurement of blood flow changes without the need of an external contrast agent. CONCLUSION These 2 methods should be used together to obtain a complete understanding of tissue perfusion. PMID:22843129
Modeling haplotype block variation using Markov chains.
Greenspan, G; Geiger, D
2006-04-01
Models of background variation in genomic regions form the basis of linkage disequilibrium mapping methods. In this work we analyze a background model that groups SNPs into haplotype blocks and represents the dependencies between blocks by a Markov chain. We develop an error measure to compare the performance of this model against the common model that assumes that blocks are independent. By examining data from the International Haplotype Mapping project, we show how the Markov model over haplotype blocks is most accurate when representing blocks in strong linkage disequilibrium. This contrasts with the independent model, which is rendered less accurate by linkage disequilibrium. We provide a theoretical explanation for this surprising property of the Markov model and relate its behavior to allele diversity.
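The contrast between the two background models can be sketched in a few lines. The block haplotype frequencies and transition probabilities below are invented for illustration; they are not taken from the HapMap data analyzed in the paper.

```python
# Two background models over haplotype blocks: independent blocks vs. a
# first-order Markov chain capturing linkage disequilibrium between
# adjacent blocks. All probabilities are illustrative.

def prob_independent(blocks, freq):
    """P(haplotype) assuming each block is drawn independently."""
    p = 1.0
    for b, h in enumerate(blocks):
        p *= freq[b][h]
    return p

def prob_markov(blocks, freq, trans):
    """P(haplotype) with dependencies between adjacent blocks."""
    p = freq[0][blocks[0]]
    for b in range(1, len(blocks)):
        p *= trans[b - 1][blocks[b - 1]][blocks[b]]
    return p

# Two blocks with two common haplotypes each; strong LD couples them.
freq = [{"A": 0.6, "B": 0.4}, {"A": 0.6, "B": 0.4}]
trans = [{"A": {"A": 0.9, "B": 0.1}, "B": {"A": 0.15, "B": 0.85}}]

print(prob_independent("AA", freq))      # 0.36
print(prob_markov("AA", freq, trans))    # 0.54: LD raises P("AA")
```

Under strong linkage disequilibrium, the independent model underestimates the probability of the common coupled haplotypes, which is the regime where the abstract reports the Markov model to be most accurate.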
NASA Technical Reports Server (NTRS)
Costanza, Bryan T.; Horne, William C.; Schery, S. D.; Babb, Alex T.
2011-01-01
The Aero-Physics Branch at NASA Ames Research Center utilizes a 32- by 48-inch subsonic wind tunnel for aerodynamics research. The feasibility of acquiring acoustic measurements with a phased microphone array was recently explored. Acoustic characterization of the wind tunnel was carried out with a floor-mounted 24-element array and two ceiling-mounted speakers. The minimum speaker level for accurate level measurement was evaluated for various tunnel speeds up to a Mach number of 0.15 and streamwise speaker locations. A variety of post-processing procedures, including conventional beamforming and deconvolutional processing such as TIDY, were used. The speaker measurements, with and without flow, were used to compare actual versus simulated in-flow speaker calibrations. Data for wind-off speaker sound and wind-on tunnel background noise were found valuable for predicting sound levels for which the speakers were detectable when the wind was on. Speaker sources were detectable 2 - 10 dB below the peak background noise level with conventional data processing. The effectiveness of background noise cross-spectral matrix subtraction was assessed and found to improve the detectability of test sound sources by approximately 10 dB over a wide frequency range.
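The background cross-spectral matrix (CSM) subtraction assessed above can be sketched as follows. This is a minimal simulation, not the NASA Ames processing chain: the array size, snapshot count, steering vector, and source/noise levels are all invented for illustration.

```python
import numpy as np

# Sketch of background CSM subtraction for a phased microphone array:
# a wind-off/source-off background CSM is subtracted from the wind-on CSM
# before beamforming, lowering the effective noise floor.

rng = np.random.default_rng(0)
m, snaps = 8, 2000                     # microphones, snapshots

def csm(x):
    """Cross-spectral matrix estimate from (m, snaps) complex pressures."""
    return x @ x.conj().T / x.shape[1]

def beam_power(C, w):
    """Conventional beamformer output power for steering vector w."""
    return np.real(w.conj() @ C @ w)

steer = np.exp(1j * 2 * np.pi * rng.random(m))   # toy steering vector
steer /= np.linalg.norm(steer)

noise_off = rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps))
noise_on = rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps))
sig = 0.3 * (rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps))
src = steer[:, None] * sig                        # weak coherent source

C_bg = csm(noise_off)                # background-only measurement
C_tot = csm(noise_on + src)          # source plus background

p_raw = beam_power(C_tot, steer)
p_sub = beam_power(C_tot - C_bg, steer)   # background CSM subtracted
print(p_raw, p_sub)   # subtraction strips most of the noise floor
```

The source here sits well below the noise power per channel, yet survives the subtraction, which is qualitatively the ~10 dB detectability gain the abstract reports.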
Donnelly, Aoife; Misstear, Bruce; Broderick, Brian
2011-02-15
Background concentrations of nitrogen dioxide (NO2) are not constant but vary temporally and spatially. The current paper presents a powerful tool for the quantification of the effects of wind direction and wind speed on background NO2 concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving methods for predicting background concentrations adopted in air quality modelling studies. The relationship between measured NO2 concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO2 in Ireland. The method was expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis and circular statistics are employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO2 at all three sites. Environmental impact assessments are frequently based on short-term baseline monitoring, which produces a limited dataset. The presented nonparametric regression methods, in contrast to frequently used methods such as binning of the data, allow concentrations for missing data pairs to be estimated and a distinction between spurious and true peaks in concentrations to be made. The methods were found to provide a realistic estimation of long term concentration variation with wind direction and speed, even for cases where the data set is limited.
Accurate identification of the actual variation at each location and causative factors could be made, thus supporting the improved definition of background concentrations for use in air quality modelling studies. Copyright © 2010 Elsevier B.V. All rights reserved.
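A minimal sketch of the kind of estimator described above: a Nadaraya-Watson kernel regression with a Gaussian kernel, treating wind direction as a circular variable via the smallest angular difference. The monitoring values below are invented stand-ins for the Irish background-site data, and wind speed is omitted for brevity.

```python
import numpy as np

def ang_diff(a, b):
    """Smallest absolute difference between two angles, in degrees."""
    d = np.abs(a - b) % 360.0
    return np.minimum(d, 360.0 - d)

def nw_estimate(theta0, theta, y, bandwidth=30.0):
    """Gaussian-kernel-weighted mean concentration at wind direction theta0."""
    w = np.exp(-0.5 * (ang_diff(theta, theta0) / bandwidth) ** 2)
    return np.sum(w * y) / np.sum(w)

# Illustrative observations: higher NO2 for northerly winds.
theta = np.array([10.0, 20.0, 350.0, 180.0, 190.0])   # degrees
no2 = np.array([30.0, 32.0, 29.0, 10.0, 12.0])        # ug/m3

print(nw_estimate(0.0, theta, no2))    # dominated by northerly samples
print(nw_estimate(185.0, theta, no2))  # dominated by southerly samples
```

Because the kernel weights every observation, the estimator returns a smooth value even for directions with no direct observation, which is the advantage over sector binning that the abstract highlights.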
Chen, Shan; Li, Xiao-ning; Liang, Yi-zeng; Zhang, Zhi-min; Liu, Zhao-xia; Zhang, Qi-ming; Ding, Li-xia; Ye, Fei
2010-08-01
During Raman spectroscopy analysis, organic molecules and contaminations can obscure or swamp Raman signals. The present study starts from Raman spectra of prednisone acetate tablets and glibenclamide tablets, which were acquired with the BWTek i-Raman spectrometer. The background is corrected by the R package baselineWavelet. Then principal component analysis and random forests are used to perform clustering analysis. By analyzing the Raman spectra of the two medicines, the accuracy and validity of this background-correction algorithm are checked and the influence of fluorescence background on Raman spectra clustering analysis is discussed. It is concluded that correcting the fluorescence background is important for further analysis, and an effective background correction solution is provided for clustering and other analyses.
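The paper's correction uses the wavelet-based R package baselineWavelet; as a much simpler stand-in, the sketch below removes a smooth fluorescence baseline by iterative polynomial fitting, clipping points above the fit so that sharp Raman bands do not pull the baseline up. The synthetic "spectrum" is purely illustrative.

```python
import numpy as np

def poly_baseline(y, degree=3, n_iter=50):
    """Estimate a smooth baseline by repeated clipped polynomial fits."""
    x = np.linspace(-1.0, 1.0, len(y))
    work = y.copy()
    for _ in range(n_iter):
        coef = np.polyfit(x, work, degree)
        fit = np.polyval(coef, x)
        work = np.minimum(work, fit)   # clip peaks, keep background
    return fit

x = np.linspace(0.0, 1.0, 400)
background = 5.0 + 3.0 * x                                   # fluorescence ramp
peak = 10.0 * np.exp(-0.5 * ((x - 0.5) / 0.01) ** 2)         # narrow Raman band
spectrum = background + peak

corrected = spectrum - poly_baseline(spectrum)
print(corrected.max())   # residual band height close to the true 10
```

After subtraction the narrow band stands on a near-zero baseline, which is the state the clustering step (PCA, random forests) needs to avoid grouping spectra by fluorescence level rather than by chemistry.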
NASA Astrophysics Data System (ADS)
Weissman, L.; Kreisel, A.; Hirsh, T.; Aviv, O.; Berkovits, D.; Girshevitz, O.; Eisen, Y.
2015-01-01
The cross sections of 63Cu(d,p)64Cu and natCu(d,x)65Zn were determined for the deuteron beam energy range of 2.77-5.62 MeV at the SARAF Phase I variable energy LINAC. Thin copper foils were irradiated by a deuteron beam, followed by measurement of the produced activation at the Soreq NRC low-background γ-counting system. The results are consistent with data in the literature, but are of better accuracy. The data are important for assessment of the activation of components of Radio Frequency Quadrupole injectors and Medium Energy Beam Transport beam dumps in modern deuteron LINACs.
Assessing quality in cardiac surgery: why this is necessary in the twenty-first century
NASA Technical Reports Server (NTRS)
Swain, J. A.; Hartz, R. S.
2000-01-01
The cost and high-profile nature of coronary surgery mean that this is an area of close public scrutiny. As much pioneering work in data collection and risk analysis has been carried out by cardiac surgeons, substantial information exists, and the correct interpretation of those data is identified as an important issue. This paper considers the background and history of risk adjustment in cardiac surgery and the uses of quality data, examines the observed/expected mortality ratio, and looks at issues such as cost and reactions to outliers. The conclusion of the study is that the continuation of accurate data collection by the whole operative team and a strong commitment to constantly improving quality are crucial to its meaningful application.
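The observed/expected (O/E) mortality ratio examined above reduces to a simple calculation once a risk model supplies per-patient predictions; the numbers below are invented for illustration.

```python
# O/E mortality ratio: expected deaths are the sum of each patient's
# risk-adjusted predicted probability of death. Values are illustrative.
predicted_risk = [0.02, 0.10, 0.05, 0.30, 0.08]   # per-patient model output
observed_deaths = 1

expected_deaths = sum(predicted_risk)              # 0.55
oe_ratio = observed_deaths / expected_deaths
print(round(oe_ratio, 2))   # 1.82: more deaths than the risk model expected
```

A ratio above 1 flags worse-than-expected outcomes, but as the paper cautions, interpreting it fairly depends on the quality of both the data collection and the underlying risk adjustment.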
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarolli, Jay G.; Naes, Benjamin E.; Butler, Lamar
A fully convolutional neural network (FCN) was developed to supersede automatic or manual thresholding algorithms used for tabulating SIMS particle search data. The FCN was designed to perform a binary classification of pixels in each image belonging to a particle or not, thereby effectively removing background signal without manually or automatically determining an intensity threshold. Using 8,000 images from 28 different particle screening analyses, the FCN was trained to accurately predict pixels belonging to a particle with near 99% accuracy. Background eliminated images were then segmented using a watershed technique in order to determine isotopic ratios of particles. A comparison of the isotopic distributions of an independent data set segmented using the neural network, compared to a commercially available automated particle measurement (APM) program developed by CAMECA, highlighted the necessity for effective background removal to ensure that resulting particle identification is not only accurate, but preserves valuable signal that could be lost due to improper segmentation. The FCN approach improves the robustness of current state-of-the-art particle searching algorithms by reducing user input biases, resulting in an improved absolute signal per particle and decreased uncertainty of the determined isotope ratios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zotti, G. De; Negrello, M.; Castex, G.
We review aspects of Cosmic Microwave Background (CMB) spectral distortions which do not appear to have been fully explored in the literature. In particular, implications of recent evidence of heating of the intergalactic medium (IGM) by feedback from active galactic nuclei are investigated. Taking also into account the IGM heating associated with structure formation, we argue that values of the y parameter of several × 10^−6, i.e. a factor of a few below the COBE/FIRAS upper limit, are to be expected. The Compton scattering by the re-ionized plasma also re-processes primordial distortions, adding a y-type contribution. Hence no pure Bose-Einstein-like distortions are to be expected. An assessment of Galactic and extragalactic foregrounds, taking into account the latest results from the Planck satellite as well as the contributions from the strong CII and CO lines from star-forming galaxies, demonstrates that a foreground subtraction accurate enough to fully exploit the PIXIE sensitivity will be extremely challenging. Motivated by this fact we also discuss methods to detect spectral distortions not requiring absolute measurements and show that accurate determinations of the frequency spectrum of the CMB dipole amplitude may substantially improve over COBE/FIRAS limits on distortion parameters. Such improvements may be within reach of next generation CMB anisotropy experiments. The estimated amplitude of the Cosmic Infrared Background (CIB) dipole might be detectable by careful analyses of Planck maps at the highest frequencies. Thus Planck might provide interesting constraints on the CIB intensity, currently known with a ≅ 30% uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulisek, Jonathan A.; Schweppe, John E.; Stave, Sean C.
2015-06-01
Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements. This method is built upon the noise-adjusted singular value decomposition (NASVD) technique that was previously developed for estimating the potassium (K), uranium (U), and thorium (T) concentrations in soil post-flight. The method can be calibrated using K, U, and T spectra determined from radiation transport simulations along with basis functions, which may be determined empirically by applying maximum likelihood estimation (MLE) to previously measured airborne gamma-ray spectra. The method was applied to both measured and simulated airborne gamma-ray spectra, with and without man-made radiological source injections. Compared to schemes based on simple averaging, this technique was less sensitive to background contamination from the injected man-made sources and may be particularly useful when the gamma-ray background frequently changes during the course of the flight.
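The core NASVD idea referenced above can be sketched in a few lines: spectra are scaled so that Poisson counting noise is approximately uniform across channels, and a low-rank SVD reconstruction then captures the smooth background shape while rejecting the noise. The simulated spectra below are illustrative, not airborne survey data.

```python
import numpy as np

# NASVD-style denoising sketch: noise-adjust, truncate SVD, reconstruct.
rng = np.random.default_rng(1)
channels, n_spec = 64, 200
shape = np.exp(-np.linspace(0.0, 4.0, channels))       # background shape
rates = shape[None, :] * rng.uniform(50.0, 150.0, (n_spec, 1))
spectra = rng.poisson(rates).astype(float)             # noisy measurements

mean_spec = spectra.mean(axis=0)
scale = np.sqrt(np.maximum(mean_spec, 1e-9))           # Poisson noise adjust
U, s, Vt = np.linalg.svd(spectra / scale, full_matrices=False)

rank = 2                                               # keep leading components
smooth = (U[:, :rank] * s[:rank]) @ Vt[:rank] * scale  # denoised estimate

raw_err = np.abs(spectra - rates).mean()
svd_err = np.abs(smooth - rates).mean()
print(raw_err, svd_err)   # low-rank estimate is far closer to the true rates
```

Per-spectrum counting noise averages out across the flight line in the truncated basis, which is why the low-rank estimate tracks the true background much better than the raw counts do.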
Economic Assessment: A Model for Assessing Ability to Pay.
ERIC Educational Resources Information Center
Andre, Patricia; And Others
1978-01-01
Accurate assessment of the client's ability to pay is the cornerstone to fee collections in any service organization. York County Counseling Services implemented a new method of fee assessment and collection based on the principles of providing a service worth paying for, accurate assessment of ability to pay, and a budget-payment system. (Author)
Easy-interactive and quick psoriasis lesion segmentation
NASA Astrophysics Data System (ADS)
Ma, Guoli; He, Bei; Yang, Wenming; Shu, Chang
2013-12-01
This paper proposes an interactive psoriasis lesion segmentation algorithm based on the Gaussian Mixture Model (GMM). Psoriasis is an incurable skin disease that affects a large population in the world. PASI (Psoriasis Area and Severity Index) is the gold standard utilized by dermatologists to monitor the severity of psoriasis. Computer-aided methods of calculating PASI are more objective and accurate than human visual assessment, and psoriasis lesion segmentation is the basis of the whole calculation. This segmentation is different from common foreground/background segmentation problems. Our algorithm is inspired by GrabCut and consists of three main stages. First, the skin area is extracted from the background scene by transforming the RGB values into the YCbCr color space. Second, a rough segmentation of normal skin and psoriasis lesion is given. This initial segmentation is obtained by thresholding a single Gaussian model, and the thresholds are adjustable, which enables user interaction. Third, two GMMs, one for the initial normal skin and one for the psoriasis lesion, are built to refine the segmentation. Experimental results demonstrate the effectiveness of the proposed algorithm.
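A simplified single-feature sketch of stages two and three of the pipeline: an adjustable threshold gives the initial normal-skin/lesion split, then a Gaussian is fitted to each side and pixels are reassigned by likelihood. The paper builds full GMMs over color values; the one-dimensional "redness" feature and its parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
skin = rng.normal(0.30, 0.05, 500)     # redness feature, normal skin
lesion = rng.normal(0.70, 0.08, 300)   # redness feature, lesion
pixels = np.concatenate([skin, lesion])

labels = pixels > 0.5                  # stage 2: rough, tunable threshold

for _ in range(5):                     # stage 3: Gaussian refinement
    mu = [pixels[~labels].mean(), pixels[labels].mean()]
    sd = [pixels[~labels].std(), pixels[labels].std()]
    # Per-class Gaussian log-likelihood for every pixel.
    ll = [-np.log(sd[k]) - 0.5 * ((pixels - mu[k]) / sd[k]) ** 2
          for k in (0, 1)]
    labels = ll[1] > ll[0]             # reassign by likelihood

print(labels[:500].mean(), labels[500:].mean())  # ~0 for skin, ~1 for lesion
```

Because the class models are refitted from the current labeling, a slightly misplaced initial threshold is corrected automatically, which is the role the two refinement GMMs play in the full algorithm.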
Size–strain separation in diffraction line profile analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scardi, P.; Ermrich, M.; Fitch, A.
Separation of size and strain effects on diffraction line profiles has been studied in a round robin involving laboratory instruments and synchrotron radiation beamlines operating with different radiation, optics, detectors and experimental configurations. The studied sample, an extensively ball milled iron alloy powder, provides an ideal test case, as domain size broadening and strain broadening are of comparable size. The high energy available at some synchrotron radiation beamlines provides the best conditions for an accurate analysis of the line profiles, as the size–strain separation clearly benefits from a large number of Bragg peaks in the pattern; high counts, reliable intensity values in low-absorption conditions, smooth background and data collection at different temperatures also support the possibility to include diffuse scattering in the analysis, for the most reliable assessment of the line broadening effect. However, results of the round robin show that good quality information on domain size distribution and microstrain can also be obtained using standard laboratory equipment, even when patterns include relatively few Bragg peaks, provided that the data are of good quality in terms of high counts and low and smooth background.
Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui
2014-01-01
The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing commercial organic fertilizer quality. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique showed accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggested the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers. PMID:24586313
Shape-based human detection for threat assessment
NASA Astrophysics Data System (ADS)
Lee, Dah-Jye; Zhan, Pengcheng; Thomas, Aaron; Schoenberger, Robert B.
2004-07-01
Detection of intrusions for early threat assessment requires the capability of distinguishing whether the intrusion is a human, an animal, or another object. Most low-cost security systems use simple electronic motion detection sensors to monitor motion or the location of objects within the perimeter. Although cost effective, these systems suffer from high rates of false alarm, especially when monitoring open environments. Any moving object, including an animal, can falsely trigger the security system. Other security systems that utilize video equipment require human interpretation of the scene in order to make real-time threat assessments. A shape-based human detection technique has been developed for accurate early threat assessment in open and remote environments. Potential threats are isolated from the static background scene using differential motion analysis, and contours of the intruding objects are extracted for shape analysis. Contour points are simplified by removing redundant points connecting short and straight line segments and preserving only those with shape significance. Contours are represented in tangent space for comparison with shapes stored in a database. A power cepstrum technique has been developed to search for the best matched contour in the database and to distinguish a human from other objects at different viewing angles and distances.
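The tangent-space representation mentioned above can be sketched as a turning function: each closed contour becomes its tangent angle as a function of normalized arc length, resampled to a common grid and compared by an L2 distance. The square/rectangle/triangle contours are illustrative stand-ins for human versus non-human silhouettes, and this omits the paper's contour simplification and power cepstrum matching.

```python
import numpy as np

def turning_function(points, n=100):
    """Tangent angle vs. normalized arc length for a closed polygon."""
    pts = np.asarray(points, float)
    seg = np.roll(pts, -1, axis=0) - pts            # edge vectors
    ang = np.unwrap(np.arctan2(seg[:, 1], seg[:, 0]))
    frac = np.hypot(seg[:, 0], seg[:, 1])
    frac = frac / frac.sum()                        # edge length fractions
    starts = np.concatenate([[0.0], np.cumsum(frac)[:-1]])
    t = np.linspace(0.0, 1.0, n, endpoint=False)
    idx = np.searchsorted(starts, t, side="right") - 1
    return ang[idx]                                 # piecewise-constant angles

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
rectangle = [(0, 0), (2, 0), (2, 1), (0, 1)]
triangle = [(0, 0), (1, 0), (0.5, 1)]

d_rect = np.linalg.norm(turning_function(square) - turning_function(rectangle))
d_tri = np.linalg.norm(turning_function(square) - turning_function(triangle))
print(d_rect < d_tri)   # True: the square is closer in shape to the rectangle
```

Because arc length is normalized, the comparison is scale-invariant, which is what lets a matcher of this kind cope with intruders at different distances from the camera.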
A tool to estimate the Fermi Large Area Telescope background for short-duration observations
Vasileiou, Vlasios
2013-07-25
Here, the proper estimation of the background is a crucial component of data analyses in astrophysics, such as source detection, temporal studies, spectroscopy, and localization. For the case of the Large Area Telescope (LAT) on board the Fermi spacecraft, approaches to estimate the background for short (≲1000 s duration) observations fail if they ignore the strong dependence of the LAT background on the continuously changing observational conditions. We present a (to be) publicly available background-estimation tool created and used by the LAT Collaboration in several analyses of Gamma Ray Bursts. This tool can accurately estimate the expected LAT background for any observational conditions, including, for example, observations with rapid variations of the Fermi spacecraft's orientation occurring during automatic repointings.
A New Determination of the Extragalactic Diffuse X-Ray Background from EGRET Data
NASA Technical Reports Server (NTRS)
Strong, Andrew W.; Moskalenko, Igor V.; Reimer, Olaf
2004-01-01
We use the GALPROP model for cosmic-ray propagation to obtain a new estimate of the Galactic component of gamma rays, and show that away from the Galactic plane it gives an accurate prediction of the observed EGRET intensities in the energy range 30 MeV - 50 GeV. On this basis we re-evaluate the extragalactic gamma-ray background. We find that for some energies previous work underestimated the Galactic contribution at high latitudes and hence overestimated the background. Our new background spectrum shows a positive curvature similar to that expected for models of the extragalactic emission based on the blazar population.
NASA Astrophysics Data System (ADS)
Yi, Longtao; Liu, Zhiguo; Wang, Kai; Chen, Man; Peng, Shiqi; Zhao, Weigang; He, Jialin; Zhao, Guangcui
2015-03-01
A new method is presented to subtract the background from the energy dispersive X-ray fluorescence (EDXRF) spectrum using a cubic spline interpolation. To accurately obtain interpolation nodes, a smooth fitting and a set of discriminant formulations were adopted. From these interpolation nodes, the background is estimated by a calculated cubic spline function. The method has been tested on spectra measured from a coin and an oil painting using a confocal MXRF setup. In addition, the method has been tested on an existing sample spectrum. The result confirms that the method can properly subtract the background.
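The spline-based subtraction described above can be sketched with a self-contained natural cubic spline (implemented directly so no extra libraries are needed). The synthetic spectrum is illustrative, and the interpolation nodes are picked by hand in peak-free regions, whereas the paper selects them via smooth fitting and a set of discriminant formulations.

```python
import numpy as np

def natural_cubic_spline(xk, yk, x):
    """Evaluate the natural cubic spline through nodes (xk, yk) at points x."""
    n = len(xk)
    h = np.diff(xk)
    # Tridiagonal system for second derivatives M (natural ends: M=0).
    A = np.zeros((n, n))
    rhs = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0
    for i in range(1, n - 1):
        A[i, i - 1] = h[i - 1]
        A[i, i] = 2.0 * (h[i - 1] + h[i])
        A[i, i + 1] = h[i]
        rhs[i] = 6.0 * ((yk[i + 1] - yk[i]) / h[i] - (yk[i] - yk[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, rhs)
    idx = np.clip(np.searchsorted(xk, x) - 1, 0, n - 2)
    d = x - xk[idx]
    hi = h[idx]
    c = (yk[idx + 1] - yk[idx]) / hi - hi * (2.0 * M[idx] + M[idx + 1]) / 6.0
    return yk[idx] + c * d + 0.5 * M[idx] * d**2 + (M[idx + 1] - M[idx]) / (6.0 * hi) * d**3

x = np.linspace(0.0, 10.0, 500)                           # "energy" axis
background = 20.0 * np.exp(-0.2 * x) + 2.0                # smooth continuum
peaks = 15.0 * np.exp(-0.5 * ((x - 3.0) / 0.08) ** 2) \
      + 8.0 * np.exp(-0.5 * ((x - 7.0) / 0.08) ** 2)      # fluorescence lines
spectrum = background + peaks

nodes = np.array([0.0, 1.5, 4.5, 6.0, 8.5, 10.0])         # peak-free positions
node_y = np.interp(nodes, x, spectrum)
est_bg = natural_cubic_spline(nodes, node_y, x)           # estimated background
corrected = spectrum - est_bg
print(np.abs(est_bg - background).max())                  # small residual
```

With well-placed nodes the spline tracks the smooth continuum closely, so the subtracted spectrum retains the line intensities needed for quantitative XRF analysis.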
Learning to predict where human gaze is using quaternion DCT based regional saliency detection
NASA Astrophysics Data System (ADS)
Li, Ting; Xu, Yi; Zhang, Chongyang
2014-09-01
Many current visual attention approaches use semantic features to accurately capture human gaze. However, these approaches demand high computational cost and can hardly be applied to daily use. Recently, some quaternion-based saliency detection models, such as PQFT (phase spectrum of Quaternion Fourier Transform) and QDCT (Quaternion Discrete Cosine Transform), have been proposed to meet the real-time requirement of human gaze tracking tasks. However, these saliency detection methods use global PQFT and QDCT to locate jump edges of the input, which can hardly detect object boundaries accurately. To address the problem, we improve the QDCT-based saliency detection model by introducing a superpixel-wise regional saliency detection mechanism. The local smoothness of the saliency value distribution is emphasized to distinguish background noise from salient regions. Our saliency confidence measure distinguishes the patches belonging to the salient object from those of the background by deciding whether image patches belong to the same region: when an image patch belongs to a region consisting of other salient patches, this patch should be salient as well. We therefore use the saliency confidence map to derive background and foreground weights and optimize the saliency map obtained by QDCT. The optimization is accomplished by the least squares method and unifies local and global saliency by combining QDCT with a similarity measure between image superpixels. We evaluate our model on four commonly used datasets (Toronto, MIT, OSIE and ASD) using standard precision-recall curves (PR curves), the mean absolute error (MAE) and area under curve (AUC) measures. In comparison with most state-of-the-art models, our approach achieves higher consistency with human perception without training, and it recovers accurate human gaze even against cluttered backgrounds. Furthermore, it achieves a better compromise between speed and accuracy.
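One common way to phrase such a least-squares refinement is as a linear system: per-superpixel saliency values are pulled toward 1 where foreground confidence is high, toward 0 where background confidence is high, and smoothed over neighboring superpixels. The six-node chain, confidence weights, and smoothness strength below are invented for illustration and are not the paper's exact formulation.

```python
import numpy as np

n = 6
w_fg = np.array([0.0, 0.1, 0.9, 1.0, 0.2, 0.0])   # foreground confidence
w_bg = 1.0 - w_fg                                  # background confidence
lam = 0.5                                          # smoothness strength

# Minimize sum w_fg*(s-1)^2 + w_bg*s^2 + lam * sum (s_i - s_j)^2 over
# neighbors, i.e. solve (diag(w_fg + w_bg) + lam*L) s = w_fg, where L is
# the graph Laplacian of the superpixel adjacency (a chain 0-1-...-5 here).
L = np.zeros((n, n))
for i in range(n - 1):
    L[i, i] += 1.0
    L[i + 1, i + 1] += 1.0
    L[i, i + 1] -= 1.0
    L[i + 1, i] -= 1.0

A = np.diag(w_fg + w_bg) + lam * L
s = np.linalg.solve(A, w_fg)
print(np.round(s, 2))   # high in the middle, low at the ends
```

The closed-form solve makes the refinement cheap, which is consistent with the real-time emphasis of the quaternion-based pipeline it augments.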
Thrane, Jan-Erik; Kyle, Marcia; Striebel, Maren; Haande, Sigrid; Grung, Merete; Rohrlack, Thomas; Andersen, Tom
2015-01-01
The Gauss-peak spectra (GPS) method represents individual pigment spectra as weighted sums of Gaussian functions, and uses these to model absorbance spectra of phytoplankton pigment mixtures. We here present several improvements for this type of methodology, including adaptation to plate reader technology and efficient model fitting by open source software. We use a one-step modeling of both pigment absorption and background attenuation with non-negative least squares, following a one-time instrument-specific calibration. The fitted background is shown to be higher than a solvent blank, with features reflecting contributions from both scatter and non-pigment absorption. We assessed pigment aliasing due to absorption spectra similarity by Monte Carlo simulation, and used this information to select a robust set of identifiable pigments that are also expected to be common in natural samples. To test the method’s performance, we analyzed absorbance spectra of pigment extracts from sediment cores, 75 natural lake samples, and four phytoplankton cultures, and compared the estimated pigment concentrations with concentrations obtained using high performance liquid chromatography (HPLC). The deviance between observed and fitted spectra was generally very low, indicating that measured spectra could successfully be reconstructed as weighted sums of pigment and background components. Concentrations of total chlorophylls and total carotenoids could accurately be estimated for both sediment and lake samples, but individual pigment concentrations (especially carotenoids) proved difficult to resolve due to similarity between their absorbance spectra. In general, our modified-GPS method provides an improvement of the GPS method that is a fast, inexpensive, and high-throughput alternative for screening of pigment composition in samples of phytoplankton material. PMID:26359659
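The non-negative least squares unmixing at the heart of the method can be sketched as follows. The pigment bands are modeled as Gaussian functions in the spirit of the GPS approach, but the band centers, widths, and concentrations are invented, and a simple projected-gradient solver stands in for whatever NNLS routine the authors used.

```python
import numpy as np

def nnls_pg(A, b, n_iter=2000):
    """Non-negative least squares via projected gradient descent (sketch)."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A.T @ A, 2)     # 1/Lipschitz constant
    for _ in range(n_iter):
        x = np.maximum(0.0, x - step * (A.T @ (A @ x - b)))
    return x

wl = np.linspace(400.0, 700.0, 301)             # wavelength grid, nm

def band(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Columns: two "pigment" components plus a flat background term.
A = np.column_stack([
    band(430.0, 20.0) + 0.8 * band(660.0, 15.0),   # chlorophyll-like
    band(480.0, 25.0),                             # carotenoid-like
    np.ones_like(wl),                              # background attenuation
])

true = np.array([0.7, 0.3, 0.1])                # illustrative concentrations
measured = A @ true                             # noiseless mixture spectrum
est = nnls_pg(A, measured)
print(np.round(est, 3))                         # close to [0.7, 0.3, 0.1]
```

The non-negativity constraint is what keeps fitted concentrations physical; the aliasing problem the abstract describes appears when two columns of A become nearly collinear, exactly the carotenoid similarity it reports.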
Assessment of human respiration patterns via noncontact sensing using Doppler multi-radar system.
Gu, Changzhan; Li, Changzhi
2015-03-16
Human respiratory patterns at the chest and abdomen are associated with both physical and emotional states. Accurate measurement of the respiratory patterns provides an approach to assess and analyze the physical and emotional states of the subject persons. Not many research efforts have been made to wirelessly assess different respiration patterns, largely due to the inaccuracy of the conventional continuous-wave radar sensor in tracking the original signal pattern of slow respiratory movements. This paper presents accurate assessment of different respiratory patterns based on noncontact Doppler radar sensing and evaluates its feasibility. A 2.4 GHz DC-coupled multi-radar system was used for accurate measurement of the complete respiration patterns without any signal distortion. Experiments were carried out in the lab environment to measure the different respiration patterns when the subject person performed natural breathing, chest breathing and diaphragmatic breathing. The experimental results showed that accurate assessment of different respiration patterns is feasible using the proposed noncontact radar sensing technique.
PMID:25785310
Expected background in the LZ experiment
NASA Astrophysics Data System (ADS)
Kudryavtsev, Vitaly A.
2015-08-01
The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs, with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron-recoil and nuclear-recoil rates in the detector. Both internal (from the target material) and external (from detector components and the surrounding environment) backgrounds are considered. Very efficient suppression of the background rate is achieved with an outer liquid scintillator veto, a liquid xenon skin and fiducialisation. Based on current measurements of the radioactivity of different materials, it is shown that LZ can reduce the total background for a WIMP search to about 2 events in 1000 live days for a 5.6-tonne fiducial mass.
O'Donoghue, R T; Broderick, B M
2007-09-01
A 5-week monitoring campaign was carried out in Dublin City centre to establish which site, a roof-top or a green-field site, gave a more accurate estimate of the city-centre background. This background represented a conservative estimate of HC exposure in Dublin City centre, useful for quantifying health effects related to this form of pollution and for establishing a local background relative to the four surrounding main roads when the wind travels towards each road with the background receptor upwind. Over the entire monitoring campaign, the lowest concentrations and relative standard deviations were observed at the green-field site, regardless of time of day or meteorological effects.
Novel Psychoactive Substances (NPS): a Study on Persian Language Websites
BIGDELI, Imanollah; CORAZZA, Ornella; ASLANPOUR, Zoe; SCHIFANO, Fabrizio
2013-01-01
Background: During the past few years, there has been increasing recognition that the Internet plays a significant role in the synthesis, distribution and consumption of Novel Psychoactive Substances (NPS). The aim of this study was to assess the online availability of NPS in Persian language websites. Methods: The Google search engine was used to carry out an accurate qualitative assessment of information available on NPS in a sample of 104 websites. Results: The monitoring led to the identification of 14 NPS, including herbal, synthetic, pharmaceutical and combination drugs, that have been sold online. Conclusion: The availability of online marketing of NPS in Persian language websites may constitute a public health challenge across at least three Farsi-speaking countries in the Middle East. Hence, descriptions of this phenomenon are valuable to clinicians and health professionals in this region. Further international collaborative efforts may be able to tackle the growth and expansion of the regular offer of NPS. PMID:23802109
NASA Astrophysics Data System (ADS)
Mueller, K.; Yadav, V.; Lopez-Coto, I.; Karion, A.; Gourdji, S.; Martin, C.; Whetstone, J.
2018-03-01
There is increased interest in understanding urban greenhouse gas (GHG) emissions. To accurately estimate city emissions, the influence of extraurban fluxes must first be removed from urban GHG observations. This is especially true for regions downwind of large fluxes, such as the U.S. Northeastern Corridor-Baltimore/Washington, DC (NEC-B/W). To help site background towers for the NEC-B/W, we use a coupled Bayesian Information Criterion and geostatistical regression approach to select four background locations that best explain the CO2 variability due to extraurban fluxes modeled at 12 urban towers. The synthetic experiment uses an atmospheric transport and dispersion model coupled with two different flux inventories to create modeled observations and evaluate 15 candidate towers located along the urban domain for February and July 2013. The analysis shows that the average ratios of extraurban inflow to total modeled enhancements at urban towers are 21% to 36% in February and 31% to 43% in July. In July, the incoming air dominates the total variability of synthetic enhancements at the urban towers (R2 = 0.58). Modeled observations from the selected background towers generally capture the variability in the synthetic CO2 enhancements at urban towers (February: R2 = 0.75, root-mean-square error (RMSE) = 3.64 ppm; July: R2 = 0.43, RMSE = 4.96 ppm). However, errors associated with representing background air can be up to 10 ppm for any given observation, even with an optimal background tower configuration. More sophisticated methods may be necessary to represent background air in order to accurately estimate urban GHG emissions.
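The skill metrics quoted above (R2 and RMSE between background-derived predictions and urban enhancements) can be computed as in this small sketch; the synthetic CO2 enhancement values are hypothetical, not the study's data:

```python
import numpy as np

def r2_rmse(obs, pred):
    """Coefficient of determination and root-mean-square error."""
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    resid = obs - pred
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)
    return float(r2), rmse

# Synthetic CO2 enhancements (ppm): a background estimate that captures
# most, but not all, of the variability at an urban tower.
rng = np.random.default_rng(0)
urban = rng.normal(10.0, 5.0, 200)
background_est = urban + rng.normal(0.0, 2.0, 200)  # imperfect background
r2, rmse = r2_rmse(urban, background_est)
```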
Ultrasound determination of rotator cuff tear repairability
Tse, Andrew K; Lam, Patrick H; Walton, Judie R; Hackett, Lisa
2015-01-01
Background Rotator cuff repair aims to reattach the torn tendon to the greater tuberosity footprint with suture anchors. The present study aimed to assess the diagnostic accuracy of ultrasound in predicting rotator cuff tear repairability and to assess which sonographic and pre-operative features are strongest in predicting repairability. Methods The study was a retrospective analysis of measurements made prospectively in a cohort of 373 patients who had ultrasounds of their shoulder and underwent rotator cuff repair. Measurements of rotator cuff tear size and muscle atrophy were made pre-operatively by ultrasound to enable prediction of rotator cuff repairability. Tears were classified following ultrasound as repairable or irreparable, and were correlated with intra-operative repairability. Results Ultrasound assessment of rotator cuff tear repairability has a sensitivity of 86% (p < 0.0001) and a specificity of 67% (p < 0.0001). The strongest predictors of rotator cuff repairability were tear size (p < 0.001) and age (p = 0.004). Sonographic assessments of tear size ≥4 cm2 or anteroposterior tear length ≥25 mm indicated an irreparable rotator cuff tear. Conclusions Ultrasound assessment is accurate in predicting rotator cuff tear repairability. Tear size or anteroposterior tear length and age were the best predictors of repairability. PMID:27582996
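For reference, the reported sensitivity and specificity follow the standard confusion-table definitions; the counts below are hypothetical values chosen only to reproduce the quoted 86%/67% rates, not the study's actual 373-patient data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen to reproduce the quoted rates.
sens, spec = sensitivity_specificity(tp=86, fn=14, tn=67, fp=33)
```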
An Evaluation of the Pea Pod System for Assessing Body Composition of Moderately Premature Infants.
Forsum, Elisabet; Olhager, Elisabeth; Törnqvist, Caroline
2016-04-22
(1) BACKGROUND: Assessing the quality of growth in premature infants is important in order to provide them with optimal nutrition. The Pea Pod device, based on air displacement plethysmography, can assess the body composition of infants. However, this method has not been sufficiently evaluated in premature infants; (2) METHODS: In 14 infants aged 3-7 days, born after 32-35 completed weeks of gestation, body weight, body volume, fat-free mass density (predicted by the Pea Pod software), and total body water (isotope dilution) were assessed. Reference estimates of fat-free mass density and body composition were obtained using a three-component model; (3) RESULTS: Fat-free mass density values predicted using Pea Pod showed a small bias but were not significantly (p > 0.05) different from the reference estimates. Body fat (%), assessed using Pea Pod, was not significantly different from the reference estimates. The biological variability of fat-free mass density was 0.55% of the average value (1.0627 g/mL); (4) CONCLUSION: The results indicate that the Pea Pod system is accurate for groups of newborn, moderately premature infants. However, more studies using this system in premature infants are needed, and we provide suggestions on how to develop this area.
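As a rough illustration of how a three-component model combines body density (from weight and volume) with isotope-dilution body water, here is a sketch using Siri-type coefficients; these coefficients and the example values are assumptions for illustration and may differ from the study's exact reference equation:

```python
def fat_percent_3c(weight_kg, volume_l, tbw_kg):
    """Three-component body fat estimate from density and body water.

    Uses Siri-type coefficients (an illustrative assumption, not
    necessarily the study's exact equation).
    """
    density = weight_kg / volume_l          # kg/L, numerically g/mL
    water_fraction = tbw_kg / weight_kg
    return (2.118 / density - 0.78 * water_fraction - 1.354) * 100.0

# Hypothetical moderately premature infant: 2.5 kg, 2.45 L, 1.9 kg TBW.
fat_pct = fat_percent_3c(2.5, 2.45, 1.9)
```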
Diagnostic digital cytopathology: Are we ready yet?
House, Jarret C.; Henderson-Jackson, Evita B.; Johnson, Joseph O.; Lloyd, Mark C.; Dhillon, Jasreman; Ahmad, Nazeel; Hakam, Ardeshir; Khalbuss, Walid E.; Leon, Marino E.; Chhieng, David; Zhang, Xiaohui; Centeno, Barbara A.; Bui, Marilyn M.
2013-01-01
Background: The cytology literature relating to diagnostic accuracy using whole slide imaging is scarce. We studied the diagnostic concordance between glass and digital slides among diagnosticians with different profiles to assess the readiness of adopting digital cytology in routine practice. Materials and Methods: This cohort consisted of 22 de-identified previously screened and diagnosed cases, including non-gynecological and gynecological slides using standard preparations. Glass slides were digitalized using Aperio ScanScope XT (×20 and ×40). Cytopathologists with (3) and without (3) digital experience, cytotechnologists (4) and senior pathology residents (2) diagnosed the digital slides independently first and recorded the results. Glass slides were read and recorded separately 1-3 days later. Accuracy of diagnosis, time to diagnosis and diagnostician's profile were analyzed. Results: Across the 22 case pairs and four study groups, correct diagnoses were established in 93% of glass-slide and 86% of digital-slide readings. Both methods diagnosed positive cases more accurately (>95%) than negative cases. Cytopathologists with no digital experience, including the senior members, were the most accurate in digital diagnosis. Cytotechnologists had the fastest diagnosis time (3 min/digital vs. 1.7 min/glass), but not the best accuracy. Digital time was 1.5 min longer than glass-slide time per case for cytopathologists and cytotechnologists. Senior pathology residents were slower and less accurate with both methods. Cytopathologists with digital experience ranked 2nd fastest in time, yet last in accuracy for digital slides. Conclusions: There was good overall diagnostic agreement between the digital whole-slide images and glass slides. Although glass slide diagnosis was more accurate and faster, the results of technologists and pathologists with no digital cytology experience suggest that solid diagnostic ability is a strong indicator of readiness for digital adoption. PMID:24392242
Mock, Donald M.; Lankford, Gary L.; Matthews, Nell I.; Burmeister, Leon F.; Kahn, Daniel; Widness, John A.; Strauss, Ronald G.
2013-01-01
BACKGROUND Safe, accurate methods to reliably measure circulating red blood cell (RBC) kinetics are critical tools to investigate pathophysiology and therapy of anemia, including hemolytic anemias. This study documents the ability of a method using biotin-labeled RBCs (BioRBCs) to measure RBC survival (RCS) shortened by coating with a highly purified monomeric immunoglobulin G antibody to D antigen. STUDY DESIGN AND METHODS Autologous RBCs from 10 healthy D+ subjects were labeled with either biotin or 51Cr (reference method), coated (opsonized) either lightly (n = 4) or heavily (n = 6) with anti-D, and transfused. RCS was determined for BioRBCs and for 51Cr independently as assessed by three variables: 1) posttransfusion recovery at 24 hours (PTR24) for short-term RCS; 2) time to 50% decrease of the label (T50), and 3) mean potential life span (MPL) for long-term RCS. RESULTS BioRBCs tracked both normal and shortened RCS accurately relative to 51Cr. For lightly coated RBCs, mean PTR24, T50, and MPL results were not different between BioRBCs and 51Cr. For heavily coated RBCs, both short-term and long-term RCS were shortened by approximately 17 and 50%, respectively. Mean PTR24 by BioRBCs (84 ± 18%) was not different from 51Cr (81 ± 10%); mean T50 by BioRBCs (23 ± 17 days) was not different from 51Cr (22 ± 18 days). CONCLUSION RCS shortened by coating with anti-D can be accurately measured by BioRBCs. We speculate that BioRBCs will be useful for studying RCS in conditions involving accelerated removal of RBCs including allo- and autoimmune hemolytic anemias. PMID:22023312
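The T50 variable used above (time to 50% decrease of the label) can be read off a measured survival curve by linear interpolation, as in this sketch; the survival measurements are hypothetical:

```python
def t50(days, survival):
    """Time at which the survival curve first crosses 50% (linear interp.)."""
    for i in range(len(days) - 1):
        d0, s0, d1, s1 = days[i], survival[i], days[i + 1], survival[i + 1]
        if s0 >= 50.0 >= s1:
            # interpolate between the two bracketing measurements
            return d0 + (s0 - 50.0) * (d1 - d0) / (s0 - s1)
    raise ValueError("survival never crosses 50%")

# Hypothetical post-transfusion survival measurements (% of label remaining).
days = [1, 7, 14, 21, 28, 35]
surv = [95.0, 85.0, 70.0, 55.0, 45.0, 30.0]
half_time = t50(days, surv)
```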
Automatic segmentation of bones from digital hand radiographs
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Taira, Ricky K.; Shim, Hyeonjoon; Keaton, Patricia
1995-05-01
The purpose of this paper is to develop a robust and accurate method that automatically segments phalangeal and epiphyseal bones from digital pediatric hand radiographs exhibiting various stages of growth. The algorithm uses an object-oriented approach comprising several stages, beginning with the most general objects to be segmented, such as the outline of the hand from the background, and proceeding in a succession of stages to the most specific object, such as a specific phalangeal bone from a digit of the hand. Each stage employs custom operators tailored to the needs of that stage, which aid in producing more accurate results. The method is further aided by a knowledge base in which all model contours and other information, such as age, race, and sex, are stored. Shape models, 1-D wrist profiles, as well as an interpretation tree are used to map model and data contour segments. Shape analysis is performed using an arc-length orientation transform. The method is tested on close to 340 phalangeal and epiphyseal objects to be segmented from 17 cases of pediatric hand images obtained from our clinical PACS. Patient age ranges from 2 - 16 years. A pediatric radiologist preliminarily assessed the resulting object contours, which were found to be accurate to within 95% for cases with non-fused bones and to within 85% for cases with fused bones. With accurate and robust results, the method can be applied toward areas such as the determination of bone age, the development of a normal hand atlas, and the characterization of many congenital and acquired growth diseases. Furthermore, this method's architecture can be applied to other image segmentation problems.
2011-01-01
Background The last three decades have seen a series of HIV interventions in sub-Saharan Africa. However, youths still have a mixture of correct and incorrect HIV/AIDS knowledge of transmission routes and prevention strategies. Previous studies have identified parents and peers as the most important socializing agents for youths. This paper assesses the relationships between family structure, family/peer communication about sexuality and accurate knowledge of transmission routes and prevention strategies. Methods Data were drawn from the Cameroon Family Life and Health Survey (CFHS) conducted in 2002. The CFHS collected information on a representative sample of 4 950 people aged 10 years and over nested within 1 765 selected households from the 75 localities forming the administrative prefecture of Bandjoun, using detailed questionnaires about family, HIV/AIDS/STDs knowledge, sexual behaviors, contraception, health, media exposure, household assets and neighborhood characteristics. The survey cooperation rates were high (97%). For the purpose of this study, a sub-sample of 2 028 unmarried youths aged 12 - 29 years was utilized. Results Overall, 42% of respondents reported accurate knowledge of documented HIV transmission routes whereas 21% of them had inaccurate knowledge such as AIDS can be transmitted through mosquito bites or casual contact with an infected person. Only 9% of respondents were knowledgeable about all HIV prevention strategies. Multivariate analyses showed that family structure, communication with parents/guardians and peers about sexual topics were significantly associated with accurate HIV knowledge. Additionally, age, education, sexual experience and migration had significant effects on accurate knowledge. Finally, living in poor households and disadvantaged neighborhoods significantly increased inaccurate knowledge of HIV transmission modes and prevention strategies. 
Conclusions This paper provided evidence of the limited effects of HIV interventions/programmes in sub-Saharan Africa. Indeed, few respondents reported accurate knowledge of HIV transmission routes and prevention strategies. The findings showed that the family environment is of paramount significance as a source of accurate knowledge of HIV transmission routes and prevention strategies; however, families have been poorly integrated into the design and implementation of the first generation of HIV interventions. There is an urgent need for policymakers to work together with families to improve the efficiency of these interventions. Peer influence is likely more ambivalent, because peer-to-peer communication increases both accurate and inaccurate knowledge of HIV transmission routes. PMID:21595931
Sex-specific lean body mass predictive equations are accurate in the obese paediatric population
Jackson, Lanier B.; Henshaw, Melissa H.; Carter, Janet; Chowdhury, Shahryar M.
2015-01-01
Background The clinical assessment of lean body mass (LBM) is challenging in obese children. A sex-specific predictive equation for LBM derived from anthropometric data was recently validated in children. Aim The purpose of this study was to independently validate these predictive equations in the obese paediatric population. Subjects and methods Obese subjects aged 4–21 were analysed retrospectively. Predicted LBM (LBMp) was calculated using equations previously developed in children. Measured LBM (LBMm) was derived from dual-energy x-ray absorptiometry. Agreement was expressed as [(LBMm-LBMp)/LBMm] with 95% limits of agreement. Results Of 310 enrolled patients, 195 (63%) were females. The mean age was 11.8 ± 3.4 years and mean BMI Z-score was 2.3 ± 0.4. The average difference between LBMm and LBMp was −0.6% (−17.0%, 15.8%). Pearson’s correlation revealed a strong linear relationship between LBMm and LBMp (r=0.97, p<0.01). Conclusion This study validates the use of these clinically-derived sex-specific LBM predictive equations in the obese paediatric population. Future studies should use these equations to improve the ability to accurately classify LBM in obese children. PMID:26287383
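The agreement statistic used above, [(LBMm-LBMp)/LBMm] with 95% limits of agreement, can be computed as in this sketch; the LBM values are hypothetical, not the study's measurements:

```python
import statistics

def limits_of_agreement(measured, predicted):
    """Percent difference (measured - predicted)/measured, with 95% limits
    of agreement computed as mean difference +/- 1.96 SD."""
    diffs = [100.0 * (m - p) / m for m, p in zip(measured, predicted)]
    mean = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical measured (DXA) vs. predicted LBM values in kg.
lbm_m = [30.0, 42.0, 25.0, 55.0, 38.0]
lbm_p = [31.0, 41.0, 26.0, 54.0, 39.0]
bias, lower, upper = limits_of_agreement(lbm_m, lbm_p)
```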
The prognostic significance of lymphatics in colorectal liver metastases.
Muralidharan, Vijayaragavan; Nguyen, Linh; Banting, Jonathan; Christophi, Christopher
2014-01-01
Background. Colorectal Cancer (CRC) is the most common form of cancer diagnosed in Australia across both genders. Approximately, 40%-60% of patients with CRC develop metastasis, the liver being the most common site. Almost 70% of CRC mortality can be attributed to the development of liver metastasis. This study examines the pattern and density of lymphatics in colorectal liver metastases (CLM) as predictors of survival following hepatic resection for CLM. Methods. Patient tissue samples were obtained from the Victorian Cancer Biobank. Immunohistochemistry was used to examine the spatial differences in blood and lymphatic vessel densities between different regions within the tumor (CLM) and surrounding host tissue. Lymphatic vessel density (LVD) was assessed as a potential prognostic marker. Results. Patients with low lymphatic vessel density in the tumor centre, tumor periphery, and adjacent normal liver demonstrated a significant disease-free survival advantage compared to patients with high lymphatic vessel density (P = 0.01, P > 0.01, and P = 0.05, resp.). Lymphatic vessel density in the tumor centre and periphery and adjacent normal liver was an accurate predictive marker of disease-free survival (P = 0.05). Conclusion. Lymphatic vessel density in CLM appears to be an accurate predictor of recurrence and disease-free survival.
NASA Technical Reports Server (NTRS)
Aumann, Hartmut H.; Manning, Evan
2008-01-01
Retrieval skill quantifies the ability of retrievals from one sounder to improve on the best forecast relative to retrievals from another sounder. This is summarized using a Retrieval Anomaly Skill Score (RASS), defined as corr(retrieved − background, truth − background) × sqrt(f), where f is the ratio of accepted retrievals to possible retrievals. Charts show various features of the RASS and comparisons with other methods of assessing retrievals.
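A minimal sketch of the RASS computation on synthetic data; the toy retrieval model, noise levels, and variable names are assumptions for illustration:

```python
import numpy as np

def rass(retrieved, truth, background, n_possible):
    """Retrieval Anomaly Skill Score: correlation of retrieved and true
    anomalies relative to the background (forecast), scaled by the square
    root of the retrieval yield f = accepted / possible."""
    r_anom = np.asarray(retrieved) - np.asarray(background)
    t_anom = np.asarray(truth) - np.asarray(background)
    corr = np.corrcoef(r_anom, t_anom)[0, 1]
    f = len(r_anom) / n_possible
    return corr * np.sqrt(f)

# Synthetic accepted retrievals: the retrieval tracks truth more closely
# than the background forecast does (e.g., temperatures in K).
rng = np.random.default_rng(1)
truth = rng.normal(280.0, 5.0, 80)
background = truth + rng.normal(0.0, 2.0, 80)  # forecast with error
retrieved = truth + rng.normal(0.0, 1.0, 80)   # retrieval closer to truth
score = rass(retrieved, truth, background, n_possible=100)
```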
SHAPE Selection (SHAPES) enrich for RNA structure signal in SHAPE sequencing-based probing data
Poulsen, Line Dahl; Kielpinski, Lukasz Jan; Salama, Sofie R.; Krogh, Anders; Vinther, Jeppe
2015-01-01
Selective 2′ Hydroxyl Acylation analyzed by Primer Extension (SHAPE) is an accurate method for probing of RNA secondary structure. In existing SHAPE methods, the SHAPE probing signal is normalized to a no-reagent control to correct for the background caused by premature termination of the reverse transcriptase. Here, we introduce a SHAPE Selection (SHAPES) reagent, N-propanone isatoic anhydride (NPIA), which retains the ability of SHAPE reagents to accurately probe RNA structure, but also allows covalent coupling between the SHAPES reagent and a biotin molecule. We demonstrate that SHAPES-based selection of cDNA–RNA hybrids on streptavidin beads effectively removes the large majority of background signal present in SHAPE probing data and that sequencing-based SHAPES data contain the same amount of RNA structure data as regular sequencing-based SHAPE data obtained through normalization to a no-reagent control. Moreover, the selection efficiently enriches for probed RNAs, suggesting that the SHAPES strategy will be useful for applications with high-background and low-probing signal such as in vivo RNA structure probing. PMID:25805860
Schaufele, Fred
2013-01-01
Förster resonance energy transfer (FRET) between fluorescent proteins (FPs) provides insights into the proximities and orientations of FPs as surrogates of the biochemical interactions and structures of the factors to which the FPs are genetically fused. As powerful as FRET methods are, technical issues have impeded their broad adoption in the biologic sciences. One hurdle to accurate and reproducible FRET microscopy measurement stems from variable fluorescence backgrounds both within a field and between different fields. Those variations introduce errors into the precise quantification of fluorescence levels on which the quantitative accuracy of FRET measurement is highly dependent. This measurement error is particularly problematic for screening campaigns, since minimal well-to-well variation is necessary to faithfully identify wells with altered values. High content screening depends also upon maximizing the number of cells imaged, which is best achieved by low magnification high throughput microscopy. But low magnification introduces flat-field correction issues that degrade the accuracy of background correction and cause poor reproducibility in FRET measurement. For live cell imaging, fluorescence of the cell culture medium in the fluorescence collection channels of the FPs commonly used for FRET analysis is a major source of background error. These signal-to-noise problems are compounded by the desire to express proteins at biologically meaningful levels that may be only marginally above the strong fluorescence background. Here, techniques are presented that correct for background fluctuations. Accurate calculation of FRET is realized even from images in which a non-flat background is 10-fold higher than the signal. PMID:23927839
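One common way to correct a non-flat background, fitting a low-order polynomial to signal-free pixels and subtracting it, can be sketched in one dimension as below; this is an illustrative technique on simulated data, not necessarily the paper's exact correction procedure:

```python
import numpy as np

# Simulated 1-D intensity profile: a smooth, non-flat background roughly
# 10x larger than the signal, plus a small fluorescent "cell" signal.
x = np.arange(200)
background_true = 500.0 + 2.0 * x - 0.008 * x ** 2   # slowly varying
signal_true = np.zeros(200)
signal_true[90:110] = 60.0                           # ~10x below background
profile = background_true + signal_true

# Fit the background with a quadratic using only pixels outside the cell
# mask, then subtract the fitted curve everywhere.
mask = np.ones(200, dtype=bool)
mask[85:115] = False                                 # exclude the cell region
coeffs = np.polyfit(x[mask], profile[mask], deg=2)
corrected = profile - np.polyval(coeffs, x)
```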
Reliable Gene Expression Measurements from Fine Needle Aspirates of Pancreatic Tumors
Anderson, Michelle A.; Brenner, Dean E.; Scheiman, James M.; Simeone, Diane M.; Singh, Nalina; Sikora, Matthew J.; Zhao, Lili; Mertens, Amy N.; Rae, James M.
2010-01-01
Background and aims: Biomarker use for pancreatic cancer diagnosis has been impaired by a lack of samples suitable for reliable quantitative RT-PCR (qRT-PCR). Fine needle aspirates (FNAs) from pancreatic masses were studied to define potential causes of RNA degradation and develop methods for accurately measuring gene expression. Methods: Samples from 32 patients were studied. RNA degradation was assessed by using a multiplex PCR assay for varying lengths of glyceraldehyde-3-phosphate dehydrogenase, and effects on qRT-PCR were determined by using a 150-bp and an 80-bp amplicon for RPS6. Potential causes of and methods to circumvent RNA degradation were studied by using FNAs from a pancreatic cancer xenograft. Results: RNA extracted from pancreatic mass FNAs was extensively degraded. Fragmentation was related to needle bore diameter and could not be overcome by alterations in aspiration technique. Multiplex PCR for glyceraldehyde-3-phosphate dehydrogenase could distinguish samples that were suitable for qRT-PCR. The use of short PCR amplicons (<100 bp) provided reliable gene expression analysis from FNAs. When appropriate samples were used, the assay was highly reproducible for gene copy number with minimal (0.0003 or about 0.7% of total) variance. Conclusions: The degraded properties of endoscopic FNAs markedly affect the accuracy of gene expression measurements. Our novel approach to designate specimens “informative” for qRT-PCR allowed accurate molecular assessment for the diagnosis of pancreatic diseases. PMID:20709792
London, Michael; Larkum, Matthew E; Häusser, Michael
2008-11-01
Synaptic information efficacy (SIE) is a statistical measure to quantify the efficacy of a synapse. It measures how much information is gained, on average, about the output spike train of a postsynaptic neuron if the input spike train is known. It is a particularly appropriate measure for assessing the input-output relationship of neurons receiving dynamic stimuli. Here, we compare the SIE of simulated synaptic inputs measured experimentally in layer 5 cortical pyramidal neurons in vitro with the SIE computed from a minimal model constructed to fit the recorded data. We show that even with a simple model that is far from perfect in predicting the precise timing of the output spikes of the real neuron, the SIE can still be accurately predicted. This arises from the ability of the model to predict output spikes influenced by the input more accurately than those driven by the background current. This indicates that in this context, some spikes may be more important than others. Lastly, we demonstrate another way in which mutual information can be useful for evaluating the quality of a model: measuring the mutual information between the model's output and the neuron's output. The SIE could thus be a useful tool for assessing the quality of single-neuron models in preserving the input-output relationship, a property that becomes crucial when we start connecting these reduced models to construct complex realistic neuronal networks.
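As a toy illustration of the information-theoretic idea behind the SIE, a plugin estimate of the mutual information between two discretized (binned) spike sequences can be computed as follows; real SIE estimation for spike trains is considerably more elaborate, and the toy sequences here are assumptions for illustration:

```python
from math import log2
from collections import Counter

def mutual_information(xs, ys):
    """Plugin mutual information (bits) between two equal-length discrete
    sequences, from their empirical joint and marginal distributions."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) )
        mi += (c / n) * log2(c * n / (px[x] * py[y]))
    return mi

# Toy binned spike trains: output copies input -> MI equals the 1-bit
# input entropy; an independent output -> MI of zero.
inp = [0, 1] * 50
out = list(inp)
mi_identical = mutual_information(inp, out)

indep = [0, 1, 1, 0] * 25
mi_indep = mutual_information(inp, indep)
```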
Guidelines for the Reporting of Treatment Trials for Alcohol Use Disorders
Witkiewitz, Katie; Finney, John W.; Harris, Alex H.S; Kivlahan, Daniel R.; Kranzler, Henry R.
2015-01-01
Background: The primary goals in conducting clinical trials of treatments for alcohol use disorders (AUDs) are to identify efficacious treatments and determine which treatments are most efficacious for which patients. Accurate reporting of study design features and results is imperative to enable readers of research reports to evaluate to what extent a study has achieved these goals. Guidance on the quality of clinical trial reporting has evolved substantially over the past two decades, primarily through the publication and widespread adoption of the Consolidated Standards of Reporting Trials (CONSORT) statement. However, there is room to improve the adoption of those standards in reporting the design and findings of treatment trials for AUD. Methods: Narrative review of guidance on reporting quality in AUD treatment trials. Results: Despite improvements in the reporting of results of treatment trials for AUD over the past two decades, many published reports provide insufficient information on design or methods. Conclusions: The reporting of alcohol treatment trial design, analysis, and results requires improvement in four primary areas: (1) trial registration, (2) procedures for recruitment and retention, (3) procedures for randomization and intervention design considerations, and (4) statistical methods used to assess treatment efficacy. Improvements in these areas and the adoption of reporting standards by authors, reviewers, and editors are critical to an accurate assessment of the reliability and validity of treatment effects. Continued developments in this area are needed to move AUD treatment research forward via systematic reviews and meta-analyses that maximize the utility of completed studies. PMID:26259958
Vibrio trends in the ecology of the Venice lagoon.
Rahman, Mohammad Shamsur; Martino, Maria Elena; Cardazzo, Barbara; Facco, Pierantonio; Bordin, Paola; Mioni, Renzo; Novelli, Enrico; Fasolato, Luca
2014-04-01
Vibrio is a very diverse genus that is responsible for different human and animal diseases. The accurate identification of Vibrio at the species level is important to assess the risks related to public health and diseases caused by aquatic organisms. The ecology of Vibrio spp., together with their genetic background, represents an important key for species discrimination and evolution. Thus, analyses of population structure and ecology association are necessary for reliable characterization of bacteria and to investigate whether bacterial species are going through adaptation processes. In this study, a population of Vibrionaceae was isolated from shellfish of the Venice lagoon and analyzed in depth to study its structure and distribution in the environment. A multilocus sequence analysis (MLSA) was developed on the basis of four housekeeping genes. Both molecular and biochemical approaches were used for species characterization, and the results were compared to assess the consistency of the two methods. In addition, strain ecology and the association between genetic information and environment were investigated through statistical models. The phylogenetic and population analyses achieved good species clustering, while biochemical identification was demonstrated to be imprecise. In addition, this study provided a fine-scale overview of the distribution of Vibrio spp. in the Venice lagoon, and the results highlighted a preferential association of the species toward specific ecological variables. These findings support the use of MLSA for taxonomic studies and demonstrate the need to consider environmental information to obtain broader and more accurate bacterial characterization.
PMID:24487545
Self-Assessment in Coursework Essays.
ERIC Educational Resources Information Center
Longhurst, Nigel; Norton, Lin S.
1997-01-01
Self-assessments of coursework essays were compared with tutor grades for 67 college students. Students could accurately assess their overall essay grades and could give an overall rank for deep processing, but when judging essays on individual criteria they were not so accurate when compared to tutor evaluations. (SLD)
2013-01-01
Background A major hindrance to the development of high yielding biofuel feedstocks is the ability to rapidly assess large populations for fermentable sugar yields. Whilst recent advances have outlined methods for the rapid assessment of biomass saccharification efficiency, none take into account the total biomass, or the soluble sugar fraction of the plant. Here we present a holistic high-throughput methodology for assessing sweet Sorghum bicolor feedstocks at 10 days post-anthesis for total fermentable sugar yields including stalk biomass, soluble sugar concentrations, and cell wall saccharification efficiency. Results A mathematical method for assessing whole S. bicolor stalks using the fourth internode from the base of the plant proved to be an effective high-throughput strategy for assessing stalk biomass, soluble sugar concentrations, and cell wall composition and allowed calculation of total stalk fermentable sugars. A high-throughput method for measuring soluble sucrose, glucose, and fructose using partial least squares (PLS) modelling of juice Fourier transform infrared (FTIR) spectra was developed. The PLS prediction was shown to be highly accurate with each sugar attaining a coefficient of determination (R²) of 0.99 with a root mean squared error of prediction (RMSEP) of 11.93, 5.52, and 3.23 mM for sucrose, glucose, and fructose, respectively, which constitutes an error of <4% in each case. The sugar PLS model correlated well with gas chromatography–mass spectrometry (GC-MS) and brix measures. Similarly, a high-throughput method for predicting enzymatic cell wall digestibility using PLS modelling of FTIR spectra obtained from S. bicolor bagasse was developed. The PLS prediction was shown to be accurate with an R² of 0.94 and RMSEP of 0.64 μg·mg DW⁻¹·h⁻¹.
Conclusions This methodology has been demonstrated as an efficient and effective way to screen large biofuel feedstock populations for biomass, soluble sugar concentrations, and cell wall digestibility simultaneously, allowing a total fermentable yield calculation. It unifies and simplifies previous screening methodologies to produce a holistic assessment of biofuel feedstock potential. PMID:24365407
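The RMSEP figures quoted above follow the standard definition: the root mean squared difference between predicted and reference values over a validation set. A minimal sketch (the function and variable names are ours, not the paper's):

```python
import math

def rmsep(predicted, observed):
    """Root mean squared error of prediction over a validation set."""
    if len(predicted) != len(observed):
        raise ValueError("prediction and reference sets must match in length")
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# e.g. hypothetical sucrose predictions (mM) against reference assay values
print(rmsep([102.0, 98.5, 110.2], [100.0, 99.0, 112.0]))
```

An RMSEP of 11.93 mM for sucrose is then directly comparable to the typical concentration range, which is how the paper arrives at its <4% relative error claim.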
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhen, X; Chen, H; Zhou, L
2014-06-15
Purpose: To propose and validate a novel and accurate deformable image registration (DIR) scheme to facilitate dose accumulation among treatment fractions of high-dose-rate (HDR) gynecological brachytherapy. Method: We have developed a method to adapt DIR algorithms to gynecologic anatomies with HDR applicators by incorporating a segmentation step and a point-matching step into an existing DIR framework. In the segmentation step, a random walks algorithm is used to accurately segment and remove the applicator region (AR) in the HDR CT image. A semi-automatic seed point generation approach is developed to obtain the incremented foreground and background point sets to feed the random walks algorithm. In the subsequent point-matching step, a feature-based thin-plate spline-robust point matching (TPS-RPM) algorithm is employed for AR surface point matching. With the resulting mapping, a DVF characteristic of the deformation between the two AR surfaces is generated by B-spline approximation, which serves as the initial DVF for the following Demons DIR between the two AR-free HDR CT images. Finally, the calculated DVF via Demons combined with the initial one serves as the final DVF to map doses between HDR fractions. Results: The segmentation and registration accuracy were quantitatively assessed on nine clinical HDR cases from three gynecological cancer patients. The quantitative results as well as the visual inspection of the DIR indicate that our proposed method can suppress the interference of the applicator with the DIR algorithm, and accurately register HDR CT images as well as deform and add interfractional HDR doses. Conclusions: We have developed a novel and robust DIR scheme that can perform registration between HDR gynecological CT images and yield accurate registration results. This new DIR scheme has potential for accurate interfractional HDR dose accumulation.
This work is supported in part by the National Natural Science Foundation of China (nos. 30970866 and 81301940).
List, Susan M; Starks, Nykole; Baum, John; Greene, Carmine; Pardo, Scott; Parkes, Joan L; Schachner, Holly C; Cuddihy, Robert
2011-01-01
Background This study evaluated performance and product labeling of CONTOUR® USB, a new blood glucose monitoring system (BGMS) with integrated diabetes management software and a universal serial bus (USB) port, in the hands of untrained lay users and health care professionals (HCPs). Method Subjects and HCPs tested subject's finger stick capillary blood in parallel using CONTOUR USB meters; deep finger stick blood was tested on a Yellow Springs Instruments (YSI) glucose analyzer for reference. Duplicate results by both subjects and HCPs were obtained to assess system precision. System accuracy was assessed according to International Organization for Standardization (ISO) 15197:2003 guidelines [within ±15 mg/dl of mean YSI results (samples <75 mg/dl) and ±20% (samples ≥75 mg/dl)]. Clinical accuracy was determined by Parkes error grid analysis. Subject labeling comprehension was assessed by HCP ratings of subject proficiency. Key system features and ease-of-use were evaluated by subject questionnaires. Results All subjects who completed the study (N = 74) successfully performed blood glucose measurements, connected the meter to a laptop computer, and used key features of the system. The system was accurate; 98.6% (146/148) of subject results and 96.6% (143/148) of HCP results exceeded ISO 15197:2003 criteria. All subject and HCP results were clinically accurate (97.3%; zone A) or associated with benign errors (2.7%; zone B). The majority of subjects rated features of the BGMS as “very good” or “excellent.” Conclusions CONTOUR USB exceeded ISO 15197:2003 system performance criteria in the hands of untrained lay users. Subjects understood the product labeling, found the system easy to use, and successfully performed blood glucose testing. PMID:22027308
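The ISO 15197:2003 accuracy criterion applied above reduces to a simple predicate: a meter result passes if it lies within ±15 mg/dl of the reference for reference values below 75 mg/dl, or within ±20% of the reference otherwise. A minimal sketch (the function name is ours):

```python
def meets_iso_15197_2003(meter_mgdl, reference_mgdl):
    """True if a meter reading satisfies the ISO 15197:2003 accuracy
    criterion against the reference (e.g. YSI) value, both in mg/dl."""
    if reference_mgdl < 75:
        return abs(meter_mgdl - reference_mgdl) <= 15
    return abs(meter_mgdl - reference_mgdl) <= 0.20 * reference_mgdl
```

The study's "98.6% (146/148) of subject results" figure is the fraction of paired readings for which this predicate is true.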
Mondal, Suman B; Gao, Shengkui; Zhu, Nan; Habimana-Griffin, LeMoyne; Akers, Walter J; Liang, Rongguang; Gruev, Viktor; Margenthaler, Julie; Achilefu, Samuel
2017-07-01
The inability to visualize the patient and surgical site directly limits the use of current near-infrared fluorescence-guided surgery systems for real-time sentinel lymph node biopsy and tumor margin assessment. We evaluated an optical see-through goggle augmented imaging and navigation system (GAINS) for near-infrared, fluorescence-guided surgery. Tumor-bearing mice injected with a near-infrared cancer-targeting agent underwent fluorescence-guided tumor resection. Female Yorkshire pigs received hind leg intradermal indocyanine green injection and underwent fluorescence-guided popliteal lymph node resection. Four breast cancer patients received 99mTc-sulfur colloid and indocyanine green retroareolarly before undergoing sentinel lymph node biopsy using radioactive tracking and fluorescence imaging. Three other breast cancer patients received indocyanine green retroareolarly before undergoing standard-of-care partial mastectomy, followed by fluorescence imaging of the resected tumor and tumor cavity for margin assessment. Using near-infrared fluorescence from the dyes, the optical see-through GAINS accurately identified all mouse tumors, pig lymphatics, and four pig popliteal lymph nodes with high signal-to-background ratio. In 4 human breast cancer patients, 11 sentinel lymph nodes were identified with a detection sensitivity of 86.67 ± 0.27% for radioactive tracking and 100% for GAINS. Tumor margin status was accurately predicted by GAINS in all three patients, including clear margins in patients 1 and 2 and positive margins in patient 3, as confirmed by paraffin-embedded section histopathology. The optical see-through GAINS prototype enhances near-infrared fluorescence-guided surgery for sentinel lymph node biopsy and tumor margin assessment in breast cancer patients without disrupting the surgical workflow in the operating room.
Brown, Meghan A; Sampson, Elizabeth L; Jones, Louise
2013-01-01
Background: For end-of-life dementia patients, palliative care offers a better quality of life than continued aggressive or burdensome medical interventions. To provide the best care options to dementia sufferers, validated, reliable, sensitive, and accurate prognostic tools to identify end-of-life dementia stages are necessary. Aim: To identify accurate prognosticators of mortality in elderly advanced dementia patients consistently reported in the literature. Design: Systematic literature review. Data sources: PubMed, Embase, and PsycINFO databases were searched up to September 2012. Reference lists of included studies were also searched. Inclusion criteria were studies measuring factors specifically related to 6-month outcome in patients diagnosed with dementia in any residential or health-care setting. Results: Seven studies met the inclusion criteria, five of which were set in the United States and two in Israel. Methodology and prognostic outcomes varied greatly between the studies. All but one study found that Functional Assessment Staging phase 7c, currently widely used to assess hospice admission eligibility in the United States, was not a reliable predictor of 6-month mortality. The most common prognostic variables identified related to nutrition/nourishment, or eating habits, followed by increased risk on dementia severity scales and comorbidities. Conclusions: Although the majority of studies agreed that the Functional Assessment Staging 7c criterion was not a reliable predictor of 6-month mortality, we found a lack of prognosticator concordance across the literature. Further studies are essential to identify reliable, sensitive, and specific prognosticators, which can be applied to the clinical setting and allow increased availability of palliative care to dementia patients. PMID:23175514
Low, See-Wei; Pasha, Ahmed K; Howe, Carol L; Lee, Kwan S; Suryanarayana, Prakash G
2018-01-01
Background Accurate determination of right ventricular ejection fraction (RVEF) is challenging because of the unique geometry of the right ventricle. Tricuspid annular plane systolic excursion (TAPSE) and fractional area change (FAC) are commonly used echocardiographic quantitative estimates of RV function. Cardiac MRI (CMRI) has emerged as the gold standard for assessment of RVEF. We sought to summarise the available data on correlation of TAPSE and FAC with CMRI-derived RVEF and to compare their accuracy. Methods We searched PubMed, EMBASE, Web of Science, CINAHL, ClinicalTrials.gov and the Cochrane Library databases for studies that assessed the correlation of TAPSE or FAC with CMRI-derived RVEF. Data from each study selected were pooled and analysed to compare the correlation coefficient of TAPSE and FAC with CMRI-derived RVEF. Subgroup analysis was performed on patients with pulmonary hypertension. Results Analysis of data from 17 studies with a total of 1280 patients revealed that FAC had a higher correlation with CMRI-derived RVEF compared with TAPSE (0.56 vs 0.40, P=0.018). In patients with pulmonary hypertension, there was no statistical difference in the mean correlation coefficient of FAC and TAPSE with CMRI-derived RVEF (0.57 vs 0.46, P=0.16). Conclusions FAC provides a more accurate estimate of RV systolic function (RVSF) compared with TAPSE. Adoption of FAC as a routine tool for the assessment of RVSF should be considered, especially since it is also an independent predictor of morbidity and mortality. Further studies will be needed to compare other methods of echocardiographic measurement of RV function. PMID:29387425
NASA Astrophysics Data System (ADS)
Lei, Hebing; Yao, Yong; Liu, Haopeng; Tian, Yiting; Yang, Yanfu; Gu, Yinglong
2018-06-01
An accurate algorithm combining Gram-Schmidt orthonormalization and least-squares ellipse fitting is proposed, which can be used for phase extraction from two or three interferograms. The DC term of the background intensity is suppressed by a subtraction operation on three interferograms or by high-pass filtering on two interferograms. Performing Gram-Schmidt orthonormalization on the pre-processed interferograms corrects the phase shift error and yields a general ellipse form. Then the background intensity error and the residual error can be compensated by the least-squares ellipse fitting method, after which the phase can be extracted rapidly. The algorithm can cope with two or three interferograms with environmental disturbance, low fringe number or small phase shifts. The accuracy and effectiveness of the proposed algorithm are verified by both numerical simulations and experiments.
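The core Gram-Schmidt step can be sketched as follows for two DC-suppressed interferograms: normalize the first frame, subtract its projection from the second, and take the pixel-wise arctangent of the resulting quadrature pair. This is an illustrative reimplementation under our own naming, not the authors' code, and it omits their ellipse-fitting correction stage:

```python
import math

def gs_phase(i1, i2):
    """Wrapped phase (up to a global sign ambiguity) from two
    background-suppressed interferograms, given as equal-length
    flat pixel lists with non-identical fringe patterns."""
    n1 = math.sqrt(sum(v * v for v in i1))
    u1 = [v / n1 for v in i1]                   # first orthonormal frame
    proj = sum(a * b for a, b in zip(u1, i2))
    r = [b - proj * a for a, b in zip(u1, i2)]  # remove the common component
    n2 = math.sqrt(sum(v * v for v in r))
    u2 = [v / n2 for v in r]                    # quadrature frame
    return [math.atan2(b, a) for a, b in zip(u1, u2)]
```

By construction u1 and u2 are orthonormal, so they behave as cosine/sine fringe pairs and the arctangent recovers the wrapped phase; this is what makes the method insensitive to the (unknown) phase shift between frames.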
Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz; Gray, Darren J.; Verweij, Jaco J.; Clements, Archie C. A.; Gomes, Santina J.; Traub, Rebecca; McCarthy, James S.
2016-01-01
Background Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and intensity of infection, there is particular need for improved techniques for assessment of infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays to determine prevalence and intensity of intestinal parasite infections, and compare them to standard microscopy. Methodology/Principal Findings Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subject to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp., and Trichuris trichiura, and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis, and Strongyloides stercoralis. Samples were also subject to sodium nitrate flotation for identification and quantification of STH eggs, and zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%; two: 30.9% vs. 12.9%; three: 7.6% vs. 0.4%; four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher intensity infections.
Conclusions/Significance Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and Giardia compared to microscopy, especially in samples exhibiting polyparasitism. The superior performance of multiplex PCR to detect polyparasitism and more accurately determine infection intensity suggests that it is a more appropriate technique for use in epidemiologic studies and for monitoring large-scale intervention trials. PMID:26820626
Little, Stephen H.; Pirat, Bahar; Kumar, Rahul; Igo, Stephen R.; McCulloch, Marti; Hartley, Craig J.; Xu, Jiaqiong; Zoghbi, William A.
2012-01-01
OBJECTIVES Our goal was to prospectively compare the accuracy of real-time three-dimensional (3D) color Doppler vena contracta (VC) area and two-dimensional (2D) VC diameter in an in vitro model and in the clinical assessment of mitral regurgitation (MR) severity. BACKGROUND Real-time 3D color Doppler allows direct measurement of VC area and may be more accurate for assessment of MR than the conventional VC diameter measurement by 2D color Doppler. METHODS Using a circulatory loop with an incorporated imaging chamber, various pulsatile flow rates of MR were driven through 4 differently sized orifices. In a clinical study of patients with at least mild MR, regurgitation severity was assessed quantitatively using Doppler-derived effective regurgitant orifice area (EROA), and semiquantitatively as recommended by the American Society of Echocardiography. We describe a step-by-step process to accurately identify the 3D-VC area and compare that measure against known orifice areas (in vitro study) and EROA (clinical study). RESULTS In vitro, 3D-VC area demonstrated the strongest correlation with known orifice area (r = 0.92, p < 0.001), whereas 2D-VC diameter had a weak correlation with orifice area (r = 0.56, p = 0.01). In a clinical study of 61 patients, 3D-VC area correlated with Doppler-derived EROA (r = 0.85, p < 0.001); the relation was stronger than for 2D-VC diameter (r = 0.67, p < 0.001). The advantage of 3D-VC area over 2D-VC diameter was more pronounced in eccentric jets (r = 0.87, p < 0.001 vs. r = 0.6, p < 0.001, respectively) and in moderate-to-severe or severe MR (r = 0.80, p < 0.001 vs. r = 0.18, p = 0.4, respectively). CONCLUSIONS Measurement of VC area is feasible with real-time 3D color Doppler and provides a simple parameter that accurately reflects MR severity, particularly in eccentric and clinically significant MR where geometric assumptions may be challenging. PMID:19356505
How Can We Assess Innovative Structures and Programs?
ERIC Educational Resources Information Center
Shuy, Roger W.
Preoccupation with the science of educational evaluation has led educators to overlook the art of evaluation, the ability to make quick and accurate assessments, usually without statistical props. This paper suggests common-sense steps to ensure accurate assessment of innovative structures. (1) Know the nature and purpose of innovation. An urban…
Efficient alignment-free DNA barcode analytics
Kuksa, Pavel; Pavlovic, Vladimir
2009-01-01
Background In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations which model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectrum) for barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows for accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci distinguishing barcodes of different sample groups. Results New alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running time improvements over the state-of-the-art methods. Conclusion Our results show that newly developed alignment-free methods for DNA barcoding can efficiently and with high accuracy identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding. PMID:19900305
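The spectrum representation described above reduces each barcode to counts of its length-k fragments, after which any vector similarity can be applied without sequence alignment. A minimal sketch with a cosine kernel (the choice of k and of similarity measure here is illustrative, not the paper's exact configuration):

```python
import math
from collections import Counter

def spectrum(seq, k=4):
    """Counts of all overlapping k-mers in a DNA barcode sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_similarity(a, b):
    """Cosine similarity between two k-mer count vectors."""
    dot = sum(count * b[kmer] for kmer, count in a.items())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm

s1 = "ACGTACGTACGTTTGA"
s2 = "ACGTACGTACGGTTGA"   # near-identical barcode, one substitution
s3 = "TTTTGGGGCCCCAAAA"   # unrelated sequence
print(cosine_similarity(spectrum(s1), spectrum(s2)))
print(cosine_similarity(spectrum(s1), spectrum(s3)))
```

Because the representation is a fixed-length count vector, classification and clustering reduce to standard vector operations, which is the source of the speed advantage over alignment-based scoring.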
2013-01-01
Background Perturbations in intestinal microbiota composition have been associated with a variety of gastrointestinal tract-related diseases. The alleviation of symptoms has been achieved using treatments that alter the gastrointestinal tract microbiota toward that of healthy individuals. Identifying differences in microbiota composition through the use of 16S rRNA gene hypervariable tag sequencing has profound health implications. Current computational methods for comparing microbial communities are usually based on multiple alignments and phylogenetic inference, making them time consuming and requiring exceptional expertise and computational resources. As sequencing data rapidly grows in size, simpler analysis methods are needed to meet the growing computational burdens of microbiota comparisons. Thus, we have developed a simple, rapid, and accurate method, independent of multiple alignments and phylogenetic inference, to support microbiota comparisons. Results We create a metric, called compression-based distance (CBD) for quantifying the degree of similarity between microbial communities. CBD uses the repetitive nature of hypervariable tag datasets and well-established compression algorithms to approximate the total information shared between two datasets. Three published microbiota datasets were used as test cases for CBD as an applicable tool. Our study revealed that CBD recaptured 100% of the statistically significant conclusions reported in the previous studies, while achieving a decrease in computational time required when compared to similar tools without expert user intervention. Conclusion CBD provides a simple, rapid, and accurate method for assessing distances between gastrointestinal tract microbiota 16S hypervariable tag datasets. PMID:23617892
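The abstract does not spell out the CBD formula, but compression-based distances of this family are typically built from an off-the-shelf compressor: compress each dataset alone and concatenated, and measure how much the concatenation saves. A sketch of the closely related normalized compression distance using zlib (our choice of compressor and normalization, offered only to illustrate the idea, not the authors' exact metric):

```python
import zlib

def csize(data):
    """Compressed size in bytes, a practical proxy for information content."""
    return len(zlib.compress(data, 9))

def compression_distance(x, y):
    """Normalized compression distance: near 0 when the two byte strings
    share most of their information, near 1 when they are unrelated."""
    cx, cy, cxy = csize(x), csize(y), csize(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
```

One caveat of this particular sketch: zlib only finds matches within a 32 KB window, so for large 16S hypervariable tag datasets a long-range compressor would be a more faithful stand-in for the shared-information estimate.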
NASA Astrophysics Data System (ADS)
White, R. D.; Cocks, D.; Boyle, G.; Casey, M.; Garland, N.; Konovalov, D.; Philippa, B.; Stokes, P.; de Urquijo, J.; González-Magaña, O.; McEachran, R. P.; Buckman, S. J.; Brunger, M. J.; Garcia, G.; Dujko, S.; Petrovic, Z. Lj
2018-05-01
Accurate modelling of electron transport in plasmas, plasma-liquid and plasma-tissue interactions requires (i) the existence of accurate and complete sets of cross-sections, and (ii) an accurate treatment of electron transport in these gaseous and soft-condensed phases. In this study we present progress towards the provision of self-consistent electron-biomolecule cross-section sets representative of tissue, including water and THF, by comparison of calculated transport coefficients with those measured using a pulsed-Townsend swarm experiment. Water–argon mixtures are used to assess the self-consistency of the electron-water vapour cross-section set proposed in de Urquijo et al (2014 J. Chem. Phys. 141 014308). Modelling of electron transport in liquids and soft-condensed matter is considered through appropriate generalisations of Boltzmann’s equation to account for spatial-temporal correlations and screening of the electron potential. The ab initio formalism is applied to electron transport in atomic liquids and compared with available experimental swarm data for these noble liquids. Issues on the applicability of the ab initio formalism for krypton are discussed and addressed through consideration of the background energy of the electron in liquid krypton. The presence of self-trapping (into bubble/cluster states/solvation) in some liquids requires a reformulation of the governing Boltzmann equation to account for the combined localised–delocalised nature of the resulting electron transport. A generalised Boltzmann equation is presented which is highlighted to produce dispersive transport observed in some liquid systems.
2010-01-01
Background Accurate identification is necessary to discriminate harmless environmental Yersinia species from the food-borne pathogens Yersinia enterocolitica and Yersinia pseudotuberculosis and from the group A bioterrorism plague agent Yersinia pestis. In order to circumvent the limitations of current phenotypic and PCR-based identification methods, we aimed to assess the usefulness of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) protein profiling for accurate and rapid identification of Yersinia species. As a first step, we built a database of 39 different Yersinia strains representing 12 different Yersinia species, including 13 Y. pestis isolates representative of the Antiqua, Medievalis and Orientalis biotypes. The organisms were deposited on the MALDI-TOF plate after appropriate ethanol-based inactivation, and a protein profile was obtained within 6 minutes for each of the Yersinia species. Results When compared with a 3,025-profile database, every Yersinia species yielded a unique protein profile and was unambiguously identified. In the second step of analysis, environmental and clinical isolates of Y. pestis (n = 2) and Y. enterocolitica (n = 11) were compared to the database and correctly identified. In particular, Y. pestis was unambiguously identified at the species level, and MALDI-TOF was able to successfully differentiate the three biotypes. Conclusion These data indicate that MALDI-TOF can be used as a rapid and accurate first-line method for the identification of Yersinia isolates. PMID:21073689
Intertester reliability of the acceptable noise level.
Gordon-Hickey, Susan; Adams, Elizabeth; Moore, Robert; Gaal, Ashley; Berry, Katie; Brock, Sommer
2012-01-01
The acceptable noise level (ANL) serves to accurately predict the listener's likelihood of success with amplification. It has been proposed as a pre-hearing aid fitting protocol for hearing aid selection and counseling purposes. The ANL is a subjective measure of the listener's ability to accept background noise. Measurement of ANL relies on the tester and listener to follow the instructions set forth. To date, no research has explored the reliability of ANL as measured across clinicians or testers. To examine the intertester reliability of ANL. A descriptive quasi-experimental reliability study was completed. ANL was measured for one group of listeners by three testers. Three participants served as testers. Each tester was familiar with basic audiometry. Twenty-five young adults with normal hearing served as listeners. Each tester was stationed in a laboratory with the needed equipment. Listeners were instructed to report to these laboratories in a random order provided by the experimenters. The testers assessed most comfortable listening level (MCL) and background noise level (BNL) for all 25 listeners. Intraclass correlation coefficients were significant and revealed that MCL, BNL, and ANLs are reliable across testers. Additionally, one-way ANOVAs for MCL, BNL, and ANL were not significant. These findings indicate that MCL, BNL, and ANL do not differ significantly when measured by different testers. If the ANL instruction set is accurately followed, ANL can be reliably measured across testers, laboratories, and clinics. Intertester reliability of ANL allows for comparison across ANLs measured by different individuals. Findings of the present study indicate that tester reliability can be ruled out as a factor contributing to the disparity of mean ANLs reported in the literature. American Academy of Audiology.
NASA Astrophysics Data System (ADS)
Ren, Lei; Hartnett, Michael
2017-02-01
Accurate forecasting of coastal surface currents has become economically important over the past twenty years due to marine activities such as marine renewable energy and fish farms in coastal regions. Advanced oceanographic observation systems such as satellites and radars can provide many parameters of interest, such as surface currents and waves, with fine spatial resolution in near real time. To enhance modelling capability, data assimilation (DA) techniques, which combine the available measurements with hydrodynamic models, have been used in oceanography since the 1990s. Assimilating measurements into a hydrodynamic model draws the model background states toward the observation trajectory, yielding more accurate forecasting information. Galway Bay is an open, wind-dominated water body on which two coastal radars are deployed. An efficient and easy-to-implement sequential DA algorithm named Optimal Interpolation (OI) was used to blend radar surface current data into a three-dimensional Environmental Fluid Dynamics Code (EFDC) model. Two empirical parameters, horizontal correlation length and DA cycle length (CL), are inherent within OI. No guidance has previously been published regarding selection of appropriate values of these parameters or how sensitive OI DA is to variations in their values. Detailed sensitivity analysis was performed on both of these parameters and results are presented. The appropriate DA CL value was determined by minimising the Root-Mean-Square Error (RMSE) between radar data and model background states. Analysis was performed to evaluate the assimilation index (AI) of using an OI DA algorithm in the model. The AI of the half-day forecast mean vectors' directions was over 50% in the best assimilation model. The ability of OI to improve model forecasts was also assessed and is reported upon.
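At its core, Optimal Interpolation nudges each background state toward the observation with a weight set by the assumed background and observation error variances. A scalar sketch of that update (real implementations use the horizontal correlation length to spread the increment over neighbouring grid points; the names and numbers below are ours, not from the paper):

```python
def oi_analysis(background, observation, var_background, var_observation):
    """Scalar optimal-interpolation update: the analysis is the
    error-variance-weighted compromise between the model background
    state and the observed (e.g. radar) value."""
    gain = var_background / (var_background + var_observation)
    return background + gain * (observation - background)

# a background surface-current estimate of 0.30 m/s, radar observing 0.50 m/s,
# with the radar trusted more than the model (smaller error variance)
print(oi_analysis(0.30, 0.50, var_background=0.04, var_observation=0.01))
```

When the observation error variance shrinks toward zero the gain approaches 1 and the analysis follows the radar data; when it dominates, the model background is left nearly untouched, which is why the DA cycle length and correlation length tuning reported above matters.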
Desktop publishing and validation of custom near visual acuity charts.
Marran, Lynn; Liu, Lei; Lau, George
2008-11-01
Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
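The physical letter sizes behind such charts follow from simple trigonometry: a 20/20 Sloan optotype subtends 5 arcmin at the test distance, and each Snellen line scales that angle by its denominator. A sketch assuming a 40 cm near test distance (the function is ours, illustrating the geometry rather than the authors' layout code):

```python
import math

def optotype_height_mm(snellen_denominator, distance_mm=400.0):
    """Physical optotype height for a 20/x line viewed at distance_mm,
    based on the 5-arcmin angular subtense of a 20/20 letter."""
    arcmin = 5.0 * snellen_denominator / 20.0
    theta = math.radians(arcmin / 60.0)
    return 2.0 * distance_mm * math.tan(theta / 2.0)

# the paper's six charts form a 0.1 log-unit progression, 20/20 to 20/63
for denom in (20, 25, 32, 40, 50, 63):
    print(f"20/{denom}: {optotype_height_mm(denom):.3f} mm")
```

Printed sizes that deviate less than 0.05 log units from these targets would satisfy the ISO criterion the authors use for physical equivalence.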
[Study of emission spectroscopy of OH radicals in pulsed corona discharge].
Wei, Bo; Luo, Zhong-Yang; Xu, Fei; Zhao, Lei; Gao, Xiang; Cen, Ke-Fa
2010-02-01
In the present paper, OH radicals generated by pulsed corona discharge in humidified air, N2 and Ar in a needle-plate reactor were measured by emission spectroscopy. From analysis of the emission spectra, the influence of pulse peak voltage and frequency on OH radical generation was investigated in the three background gases. The influence of gas humidity on OH radical generation was studied under accurately controlled humidity changes, and the distribution of OH radicals in the electric field was examined in each of the background gases (humidified air, N2 and Ar). The experiments show that the output of OH radicals grows as the pulse peak voltage and frequency increase, but the influence of gas humidity on OH radical generation by pulsed corona discharge depends on the discharge background: the governing behaviour changes with the background gas. As humidity in the background gas increases, the amount of OH radicals increases in air, increases at first and then decreases in N2, and decreases at first and then increases in Ar. The distribution of OH radicals shows a trend of decreasing from the needle electrode toward its surroundings.
Expected background in the LZ experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kudryavtsev, Vitaly A.
2015-08-17
The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs, with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron-recoil and nuclear-recoil rates in the detector. Both internal (from the target material) and external (from detector components and the surrounding environment) backgrounds are considered. Very efficient suppression of the background rate is achieved with an outer liquid scintillator veto, a liquid xenon skin and fiducialisation. Based on current measurements of the radioactivity of different materials, it is shown that LZ can reduce the total background for a WIMP search to about 2 events in 1000 live days for a 5.6-tonne fiducial mass.
Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering.
Oliynyk, Andriy; Bonifazzi, Claudio; Montani, Fernando; Fadiga, Luciano
2012-08-08
Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, accurate and fast online sorting remains a challenging problem. Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or no human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-means clustering, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of the resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single-neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from the premotor cortex of Macaque monkeys. The results of these tests showed excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise.
The performance of our method is competitive with respect to other robust spike sorting algorithms. This new software provides neuroscience laboratories with a new tool for fast and robust online classification of single neuron activity. This feature could become crucial in situations when online spike detection from multiple electrodes is paramount, such as in human clinical recordings or in brain-computer interfaces.
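The core of such a pipeline, SVD-based feature extraction followed by fuzzy C-means clustering, can be sketched in a few lines. This is a minimal illustration, not the FSPS implementation; the synthetic "spikes", the cluster count c=2, and the fuzziness exponent m=2 are all assumptions.

```python
import numpy as np

def svd_features(waveforms, k=3):
    """Project mean-centered spike waveforms onto the first k right
    singular vectors (equivalent to the leading principal components)."""
    centered = waveforms - waveforms.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:k].T

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and the membership
    matrix U, where U[i, j] is the degree to which spike i belongs to cluster j."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Synthetic data: two spike shapes (different amplitudes) plus noise
rng = np.random.default_rng(1)
t = np.linspace(-1, 1, 40)
template = np.exp(-t**2 / 0.05)
spikes = np.vstack([a * template + rng.normal(0, 0.1, 40)
                    for a in [1.0] * 50 + [2.0] * 50])
feats = svd_features(spikes)
_, U = fuzzy_cmeans(feats, c=2)
labels = U.argmax(axis=1)   # hard labels from maximum membership
```

FSPS additionally aligns waveforms at high resolution and chooses the number of features and clusters automatically; here both are fixed by hand.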
Characterizing the Background Corona with SDO/AIA
NASA Technical Reports Server (NTRS)
Napier, Kate; Alexander, Caroline; Winebarger, Amy
2014-01-01
Characterizing the nature of the solar coronal background would enable scientists to more accurately determine plasma parameters, and may lead to a better understanding of the coronal heating problem. Because scientists study the 3D structure of the Sun in 2D, any line of sight includes both foreground and background material, and thus the issue of background subtraction arises. By investigating the intensity values in and around an active region, using multiple wavelengths collected from the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory (SDO) over an eight-hour period, this project aims to characterize the background as smooth or structured. Different methods were employed to measure the true coronal background and create minimum-intensity images. These were then investigated for the presence of structure. The background images created were found to contain long-lived structures, including coronal loops, that were still present in all of the wavelengths: 131, 171, 193, 211, and 335 Å. The intensity profiles across the active region indicate that the background is much more structured than previously thought.
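One common way to build such minimum-intensity background images is a per-pixel minimum over the co-aligned time series. A toy numpy sketch (synthetic frames, not AIA data) shows why persistent structures survive this operation, consistent with the finding above:

```python
import numpy as np

def min_background(frames):
    """Per-pixel minimum over a stack of co-aligned images: a simple
    estimate of the steady background along each line of sight."""
    return np.min(np.stack(frames), axis=0)

# Toy frames: a persistent bright 'loop' (row 3) plus transient brightenings
rng = np.random.default_rng(1)
base = np.ones((8, 8))
base[3] = 5.0
frames = [base + rng.uniform(0, 2, base.shape) for _ in range(10)]
bg = min_background(frames)
print(bg[3].mean() > bg[0].mean())  # → True: the long-lived structure survives
```

Transient brightenings are removed because each pixel only needs to be faint in one frame, but anything present in every frame, such as a long-lived loop, remains in the minimum image.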
Holi, Matti Mikael; Pelkonen, Mirjami; Karlsson, Linnea; Tuisku, Virpi; Kiviruusu, Olli; Ruuttu, Titta; Marttunen, Mauri
2008-01-01
Background: Accurate assessment of suicidality is of major importance. We aimed to evaluate trained clinicians' ability to assess suicidality against a structured assessment made by trained raters. Method: Treating clinicians classified 218 adolescent psychiatric outpatients suffering from a depressive mood disorder into three classes: 1, no suicidal ideation; 2, suicidal ideation without suicidal acts; 3, suicidal or self-harming acts. This classification was compared with a classification with identical content derived from the Kiddie Schedule for Affective Disorders and Schizophrenia (K-SADS-PL) made by trained raters. Convergence was assessed by kappa and weighted kappa tests. Results: The clinicians' classifications were concurrent with the K-SADS evaluation for 85% of class 1 (no suicidal ideation), 50% of class 2 (suicidal ideation), and 10% of class 3 (suicidal acts) cases (χ² = 37.1, df = 4, p < 0.001). Weighted kappa for the agreement of the measures was 0.335 (CI = 0.198–0.471, p < 0.0001). The clinicians under-detected suicidal and self-harm acts, but over-detected suicidal ideation. Conclusion: There was only modest agreement between the trained clinicians' suicidality evaluation and the K-SADS evaluation, especially concerning suicidal or self-harming acts. We suggest wider use of structured scales in clinical and research settings to improve reliable detection of adolescents with suicidality. PMID:19116040
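Weighted kappa, the agreement statistic used above, down-weights near-miss disagreements between ordinal classes relative to distant ones. A minimal numpy sketch with made-up ratings (not the study data):

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_classes, weights="linear"):
    """Cohen's weighted kappa for two raters over ordinal classes 0..n-1."""
    O = np.zeros((n_classes, n_classes))          # observed agreement matrix
    for i, j in zip(rater_a, rater_b):
        O[i, j] += 1
    O /= O.sum()
    E = np.outer(O.sum(axis=1), O.sum(axis=0))    # chance-expected matrix
    idx = np.arange(n_classes)
    W = np.abs(idx[:, None] - idx[None, :]).astype(float)
    if weights == "quadratic":
        W = W ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Illustrative ratings (class 0 = no ideation, 1 = ideation, 2 = acts)
clin = [0, 1, 1, 2, 0, 1, 0, 2, 1, 0]
kads = [0, 1, 0, 1, 0, 2, 0, 2, 1, 0]
print(round(weighted_kappa(clin, kads, 3), 3))
```

Perfect agreement yields 1.0; values in the 0.2-0.4 range, as reported in the study, indicate only fair-to-moderate agreement.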
Donnell, Deborah; Komárek, Arnošt; Omelka, Marek; Mullis, Caroline E.; Szekeres, Greg; Piwowar-Manning, Estelle; Fiamma, Agnes; Gray, Ronald H.; Lutalo, Tom; Morrison, Charles S.; Salata, Robert A.; Chipato, Tsungai; Celum, Connie; Kahle, Erin M.; Taha, Taha E.; Kumwenda, Newton I.; Karim, Quarraisha Abdool; Naranbhai, Vivek; Lingappa, Jairam R.; Sweat, Michael D.; Coates, Thomas; Eshleman, Susan H.
2013-01-01
Background: Accurate methods of HIV incidence determination are critically needed to monitor the epidemic and determine the population-level impact of prevention trials. One such trial, Project Accept, a Phase III, community-randomized trial, evaluated the impact of enhanced, community-based voluntary counseling and testing on population-level HIV incidence. The primary endpoint of the trial was based on a single, cross-sectional, post-intervention HIV incidence assessment. Methods and Findings: Test performance of HIV incidence determination was evaluated for 403 multi-assay algorithms (MAAs) that included the BED capture immunoassay (BED-CEIA) alone, an avidity assay alone, and combinations of these assays at different cutoff values with and without CD4 and viral load testing, on samples from seven African cohorts (5,325 samples from 3,436 individuals with known duration of HIV infection [1 month to >10 years]). The mean window period (the average time individuals appear positive for a given algorithm) and the performance of these MAAs in estimating incidence (in terms of bias and variance) were evaluated in three simulated epidemic scenarios (stable, emerging and waning). The power of different test methods to detect a 35% reduction in incidence in the matched communities of Project Accept was also assessed. An MAA was identified that included BED-CEIA, the avidity assay, CD4 cell count, and viral load; it had a window period of 259 days, accurately estimated HIV incidence in all three epidemic settings, and provided sufficient power to detect an intervention effect in Project Accept. Conclusions: In a Southern African setting, HIV incidence estimates and intervention effects can be accurately estimated from cross-sectional surveys using an MAA. The improved accuracy in cross-sectional incidence testing that an MAA provides is a powerful tool for HIV surveillance and program evaluation. PMID:24236054
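The practical use of a window period is the standard cross-sectional incidence estimator: recent infections divided by the person-time at risk implied by the window. The sketch below is the simplest form of that estimator, ignoring adjustments such as the false-recent rate that real analyses include; only the 259-day window period comes from the abstract, and the survey counts are hypothetical.

```python
def incidence_estimate(n_recent, n_negative, window_days=259.0):
    """Cross-sectional incidence: MAA-recent infections divided by the
    person-years at risk implied by the mean window period."""
    person_years = n_negative * (window_days / 365.25)
    return n_recent / person_years

# Hypothetical survey: 40 MAA-recent results among 4,000 HIV-negative adults
print(round(100 * incidence_estimate(40, 4000), 2))  # → 1.41 (% per year)
```

A longer window period means each survey captures more recent infections, which is why the 259-day MAA provided more power than shorter-window assays to detect an intervention effect.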
Developing a dengue forecast model using machine learning: A case study in China
Zhang, Qin; Wang, Li; Xiao, Jianpeng; Zhang, Qingying; Luo, Ganfeng; Li, Zhihao; He, Jianfeng; Zhang, Yonghui; Ma, Wenjun
2017-01-01
Background: In China, dengue remains an important public health issue, with expanded areas and increased incidence in recent years. Accurate and timely forecasts of dengue incidence in China are still lacking. We aimed to use state-of-the-art machine learning algorithms to develop an accurate predictive model of dengue. Methodology/Principal findings: Weekly dengue cases, Baidu search queries and climate factors (mean temperature, relative humidity and rainfall) during 2011–2014 in Guangdong were gathered. A dengue search index was constructed for developing the predictive models in combination with climate factors. The observed year and week were also included in the models to control for long-term trend and seasonality. Several machine learning algorithms, including the support vector regression (SVR) algorithm, a step-down linear regression model, the gradient boosted regression tree algorithm (GBM), a negative binomial regression model (NBM), the least absolute shrinkage and selection operator (LASSO) linear regression model and a generalized additive model (GAM), were used as candidate models to predict dengue incidence. Performance and goodness of fit of the models were assessed using root-mean-square error (RMSE) and R-squared measures. The residuals of the models were examined using autocorrelation and partial autocorrelation function analyses to check the validity of the models. The models were further validated using dengue surveillance data from five other provinces. The epidemics during the last 12 weeks and the peak of the 2014 large outbreak were accurately forecasted by the SVR model selected by a cross-validation technique. Moreover, the SVR model had the consistently smallest prediction error rates for tracking the dynamics of dengue and forecasting the outbreaks in other areas in China. Conclusion and significance: The proposed SVR model achieved superior performance in comparison with the other forecasting techniques assessed in this study.
The findings can help the government and community respond early to dengue epidemics. PMID:29036169
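A lag-feature forecasting setup in the spirit of the model above can be sketched in pure numpy, using kernel ridge regression as a closed-form stand-in for SVR (both are kernel-based regressors; the study's actual features, tuning, and SVR implementation differ). The weekly series, lag count, and hyperparameters below are all synthetic assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit_predict(X_tr, y_tr, X_te, gamma=0.1, lam=1e-2):
    """Kernel ridge regression: solve (K + lam*I) alpha = y, then predict."""
    K = rbf_kernel(X_tr, X_tr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(K)), y_tr)
    return rbf_kernel(X_te, X_tr, gamma) @ alpha

# Toy weekly series standing in for dengue counts: seasonal signal + noise
rng = np.random.default_rng(0)
weeks = np.arange(208)
cases = 50 + 30 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 3, 208)

lags = 4                       # the previous 4 weeks predict the current week
X = np.column_stack([cases[k:k + len(cases) - lags] for k in range(lags)])
y = cases[lags:]

split = 170                    # chronological split: no shuffling for time series
mu, sd = X[:split].mean(0), X[:split].std(0)
Xs = (X - mu) / sd
y_mean = y[:split].mean()
pred = krr_fit_predict(Xs[:split], y[:split] - y_mean, Xs[split:]) + y_mean
rmse = float(np.sqrt(((y[split:] - pred) ** 2).mean()))
```

The study compared candidate models by RMSE and R-squared on held-out surveillance data; the same RMSE computation applies here.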
TRIPPy: Python-based Trailed Source Photometry
NASA Astrophysics Data System (ADS)
Fraser, Wesley C.; Alexandersen, Mike; Schwamb, Megan E.; Marsset, Michael E.; Pike, Rosemary E.; Kavelaars, JJ; Bannister, Michele T.; Benecchi, Susan; Delsanti, Audrey
2016-05-01
TRIPPy (TRailed Image Photometry in Python) uses a pill-shaped aperture, a rectangle described by three parameters (trail length, angle, and radius) to improve photometry of moving sources over that done with circular apertures. It can generate accurate model and trailed point-spread functions from stationary background sources in sidereally tracked images. Appropriate aperture correction provides accurate, unbiased flux measurement. TRIPPy requires numpy, scipy, matplotlib, Astropy (ascl:1304.002), and stsci.numdisplay; emcee (ascl:1303.002) and SExtractor (ascl:1010.064) are optional.
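Geometrically, the pill aperture is a rectangle capped by two semicircles. As an illustration of the shape only (not TRIPPy's actual API), a pixel mask can be built from the distance of each pixel to the central trail segment; the image size and aperture parameters below are arbitrary.

```python
import numpy as np

def pill_mask(shape, x0, y0, length, radius, angle_deg):
    """Boolean mask of a pill aperture: every pixel whose center lies
    within `radius` of a trail segment of given length and angle."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    t = np.deg2rad(angle_deg)
    dx, dy = np.cos(t) * length / 2, np.sin(t) * length / 2
    ax, ay, bx, by = x0 - dx, y0 - dy, x0 + dx, y0 + dy
    # Project each pixel onto the segment, clamping to the endpoints
    seg2 = (bx - ax) ** 2 + (by - ay) ** 2 + 1e-12
    u = np.clip(((xx - ax) * (bx - ax) + (yy - ay) * (by - ay)) / seg2, 0, 1)
    d2 = (xx - ax - u * (bx - ax)) ** 2 + (yy - ay - u * (by - ay)) ** 2
    return d2 <= radius ** 2

mask = pill_mask((50, 50), 25, 25, length=10, radius=4, angle_deg=30)
# Pixel count approximates the analytic area 2*r*L + pi*r**2 ≈ 130.3
print(mask.sum())
```

Summing the image over such a mask gives the trailed-source flux; the aperture correction TRIPPy applies then accounts for light falling outside the aperture.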
Assessment of Telemedicine in Surgical Education and Patient Care
Demartines, Nicolas; Mutter, Didier; Vix, Michel; Leroy, Joël; Glatz, Dieter; Rösel, Fritz; Harder, Felix; Marescaux, Jacques
2000-01-01
Objective: To analyze the value of teleconferencing for patient care and surgical education by assessing the activity of an international academic network. Summary Background Data: The uses of telemedicine include teleeducation, training, and consulting, and surgical teams are now involved, sharing diagnostic information and opinions without the need for travel. However, the value of telematics in surgery remains to be assessed. Methods: During a 2-year period, weekly surgical teleconferences were held among six university hospitals in four European countries. To assess the accuracy of telediagnosis for surgical cases, 60 randomly selected cases were analyzed by a panel of surgeons. Participants’ opinions were analyzed by questionnaire. Results: Seventy teleconferences (50 lectures and 271 case presentations) were held. Ninety-five of the 114 participants (83.3%) completed the final questionnaire. Eighty-six percent rated the surgical activity as good or excellent, 75.7% rated the scientific level as good or excellent, 55.8% rated the daily clinical activity as good or excellent, and 28.4% rated the manual surgical technique as good or excellent. The target organ was identified in all cases; the organ structure and pathology were considered well defined in 93.3%, and the fine structure in 58.3%. Diagnosis was accurate in 17 cases (28.3%), probable in 25 (41.7%), possible but uncertain in 16 (26.7%), and not possible in 2 cases (3.3%). Discussion among the remote sites increased the rate of valuable therapeutic advice from 55% of cases before the discussion to 95% after the discussion. Eighty-six percent of the surgeons expressed satisfaction with telematics for medical education and patient care. Conclusions: Participant satisfaction was high, transmission of clinical documents was accurate, and the opportunity to discuss case documentation and management significantly improved diagnostic potential, resulting in an accuracy rate of up to 95%. Teleeducation and teleconsultation in surgery appear to be beneficial. PMID:10674622
Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan
2015-01-01
Background: Research on the long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and the geographic distribution of seismic activity. Methods: We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey’s ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: an 8.4 magnitude earthquake that hit southern Peru in 2001. Results and conclusions: Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our approach allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters. PMID:26090999
Hurks, Petra PM; Aldenkamp, Albert P; van der Spek, Erik D; Rauterberg, GWM; Vles, Johan SH; Hendriksen, Jos GM
2016-01-01
Background: A computer-based game, named Timo’s Adventure, was developed to assess specific cognitive functions (eg, attention, planning, and working memory), time perception, and reward mechanisms in young school-aged children. The game consists of 6 mini-games embedded in a story line and includes fantasy elements to enhance motivation. Objective: The aim of this study was to investigate the validity of Timo’s Adventure in normally developing children and in children with attention-deficit/hyperactivity disorder (ADHD). Methods: A total of 96 normally developing children aged 4-8 years and 40 children with ADHD were assessed using the game. Clinical validity was investigated by examining the effects of age on performance within the normally developing children, as well as performance differences between the healthy controls and the ADHD group. Results: Our analyses in the normally developing children showed developmental effects; that is, older children made fewer inhibition mistakes (r=−.33, P=.001), had faster (and therefore better) reaction times (r=−.49, P<.001), and were able to produce time intervals more accurately than younger children (ρ=.35, P<.001). Discriminant analysis showed that Timo’s Adventure accurately classified most children as belonging to the ADHD group or the normally developing group: 78% (76/97) of the children were correctly classified. Within groups, 72% (41/57) of the children in the control group and 88% (35/40) of the children in the ADHD group were correctly classified. Sensitivity (0.89) and specificity (0.69) of Timo’s Adventure were satisfactory. Conclusions: Computer-based games seem to be a valid tool to assess specific strengths and weaknesses in young children with ADHD. PMID:27658428
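Sensitivity and specificity follow from the reported classification counts via the standard 2x2 definitions. Note that the values computed directly from the counts (0.88, 0.72) differ slightly from the quoted 0.89 and 0.69, suggesting the published figures used a slightly different tabulation:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts from the discriminant analysis: 35/40 ADHD and 41/57 controls correct
sens, spec = sens_spec(tp=35, fn=5, tn=41, fp=16)
print(round(sens, 2), round(spec, 2))  # → 0.88 0.72
```

The asymmetry (high sensitivity, lower specificity) means the game rarely misses ADHD but flags a sizeable fraction of typically developing children.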
Application of neuro-fuzzy methods to gamma spectroscopy
NASA Astrophysics Data System (ADS)
Grelle, Austin L.
Nuclear non-proliferation activities are an essential part of national security, both domestic and abroad. The safety of the public in densely populated environments such as urban areas or large events can be compromised if devices using special nuclear materials are present. The prompt and accurate detection of these materials is therefore an important topic of research, in which the identification of normal conditions is also important. In gamma-ray spectroscopy, these conditions are identified as the radiation background, which, though affected by a multitude of factors, is ever present. Accurate identification of the background is therefore important in nuclear non-proliferation activities. With this in mind, a method has been developed to utilize aggregate background data to predict the background of a location through the use of an Artificial Neural Network (ANN). After being trained on background data, the ANN is presented with nearby relevant gamma-ray spectroscopy data, as identified by a Fuzzy Inference System, to create a predicted background spectrum to compare to a measured spectrum. If a significant deviation exists between the predicted and measured data, the method alerts the user so that a more thorough investigation can take place. The research herein focused on data from an urban setting, in which 28 false positives were observed out of a total of 987 measurements, representing a 2.84% error rate. The method in its current configuration therefore shows a relatively high rate of false positives; however, there are promising steps that can be taken to further reduce this error. With this in mind, the method stands as a potentially significant tool in urban nuclear non-proliferation activities.
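The final decision step, comparing a predicted background spectrum to a measurement and alerting on significant deviation, can be sketched as follows. The ANN itself is omitted (a Poisson draw stands in for its output), and the Poisson-scaled residual statistic and 4-sigma threshold are illustrative assumptions, not the thesis's actual test.

```python
import numpy as np

def spectrum_deviates(predicted, measured, sigma=4.0):
    """Alert if the measured spectrum shows a net excess over the predicted
    background, scaled by Poisson counting errors (sqrt of expected counts)."""
    resid = (measured - predicted) / np.sqrt(np.maximum(predicted, 1.0))
    z = resid.sum() / np.sqrt(len(resid))
    return z > sigma

rng = np.random.default_rng(2)
predicted = rng.poisson(100, 128).astype(float)   # stand-in for the ANN output
normal = rng.poisson(predicted).astype(float)     # background-only measurement
source = normal.copy()
source[40:48] += 120                              # photopeak-like excess
print(spectrum_deviates(predicted, normal), spectrum_deviates(predicted, source))
```

Raising the threshold trades false positives for missed sources, which is the tuning problem behind the false-positive rate discussed above.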
Cock, Don; Adams, Iain C; Ibbetson, Adrian B; Baugh, Phil
2006-01-01
Background: The development of an instrument accurately assessing service quality in the GP Exercise Referral Scheme (ERS) industry could potentially inform scheme organisers of the factors that affect adherence rates, leading to the implementation of strategic interventions aimed at reducing client drop-out. Methods: A modified version of the SERVQUAL instrument was designed for use in the ERS setting and subsequently piloted amongst 27 ERS clients. Results: Test-retest correlations were calculated via Pearson's r or Spearman's rho, depending on whether the variables were normally distributed, and showed a significant (mean r = 0.957, SD = 0.02, p < 0.05; mean rho = 0.934, SD = 0.03, p < 0.05) relationship between all items within the questionnaire. In addition, satisfactory internal consistency was demonstrated via Cronbach's α. Furthermore, clients responded favourably towards the usability, wording and applicability of the instrument's items. Conclusion: REFERQUAL shows promise as a suitable tool for future evaluation of service quality within the ERS community. Future research should further assess the validity and reliability of this instrument through the use of confirmatory factor analysis to scrutinise the proposed dimensional structure. PMID:16725021
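The reliability statistics reported here, test-retest correlation and Cronbach's α, can be computed with a few lines of numpy. The scores below are made-up illustrations, not REFERQUAL data.

```python
import numpy as np

def pearson_r(x, y):
    return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

def spearman_rho(x, y):
    # Rank transform (ties broken by position, adequate for illustration)
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson_r(rank(np.asarray(x)), rank(np.asarray(y)))

def cronbach_alpha(items):
    """items: (n_respondents, n_items) matrix of questionnaire scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

test1 = [4, 5, 3, 4, 2, 5, 4]          # item scores at first administration
test2 = [4, 5, 3, 5, 2, 5, 4]          # same item at retest
print(round(pearson_r(test1, test2), 3))  # → 0.945

items = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2], [4, 4, 5]])
print(round(cronbach_alpha(items), 3))    # → 0.904
```

Values of α above roughly 0.7 are conventionally taken as satisfactory internal consistency, which is the benchmark the study's Cronbach's α result is judged against.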
Emotional reactivity assessment of healthy elderly with an emotion-induction procedure.
Fajula, Claire; Bonin-Guillaume, Sylvie; Jouve, Elisabeth; Blin, Olivier
2013-01-01
BACKGROUND/STUDY CONTEXT: No emotion-induction procedure is clearly recommended for assessing emotional reactivity in the elderly. This study aimed to validate an emotional reactivity procedure in healthy older adults. Nineteen healthy elders (age range: 66-91 years) were compared with 19 education- and sex-matched young adults (age range: 20-33 years) using a cross-sectional design. The main outcome measure was the evaluation of emotional reactivity to commercial film excerpts used as stimuli (joy, anger, fear, sadness, disgust, or neutral state) according to Philippot's procedure, using a 5-point questionnaire assessing 10 emotion dimensions (Differential Emotions Scale, DES). In the elderly sample, the targeted emotions of fear, disgust, anger, and sadness were significantly induced relative to baseline. The global emotional reactivity to each film showed that the elderly subjects rated the DES in the same manner as the young adults, but with significantly higher global intensity for the excerpts inducing fear, anger, disgust, and sadness. The Philippot procedure is accurate for studying emotional reactivity in the healthy elderly. Its simplicity and rapidity make it suitable for emotion studies in different elderly populations.
Raj, S; Sharma, V L; Singh, A J; Goel, S
2016-01-01
Background. Health information on websites should be reliable and accurate to enable the community to make informed decisions. This study was done to assess the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were used on the search engines Google and Yahoo. Out of 50 websites (25 from each search engine), after exclusion, 32 websites were evaluated. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n = 13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n = 16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on the LIDA score. Only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites were of average quality, especially in terms of usability and reliability, and were written at high readability levels. Efforts are needed to develop health information websites which can help the general population in informed decision making.
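The readability indices used in this study are computed directly from word, sentence, and syllable counts. The formulas below are the standard published ones; the page counts are hypothetical.

```python
import math

def flesch_scores(words, sentences, syllables):
    """FRES and FKGL from raw counts, using the standard published formulas."""
    asl = words / sentences        # average sentence length
    asw = syllables / words        # average syllables per word
    fres = 206.835 - 1.015 * asl - 84.6 * asw
    fkgl = 0.39 * asl + 11.8 * asw - 15.59
    return round(fres, 1), round(fkgl, 1)

def smog_grade(polysyllables, sentences):
    """SMOG grade from the count of 3+ syllable words and the sentence count."""
    return round(1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291, 1)

# Hypothetical page: 1200 words, 60 sentences, 2040 syllables, 180 polysyllables
print(flesch_scores(1200, 60, 2040))  # → (42.7, 12.3)
print(smog_grade(180, 60))            # → 13.0
```

A sixth-grade target corresponds to FKGL around 6 and FRES in the 80s; the hypothetical page above, like most sites in the study, reads well above that level.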
Sánchez, Carolina Ramírez; Taurino, Antonietta; Bozzini, Benedetto
2016-01-01
This paper reports the quantitative assessment of the oxygen reduction reaction (ORR) electrocatalytic activity of electrodeposited Mn/polypyrrole (PPy) nanocomposites in alkaline aqueous solutions, based on the Rotating Disk Electrode (RDE) method and accompanied by structural characterizations relevant to the establishment of structure-function relationships. The characterization of the Mn/PPy films addresses: (i) morphology, as assessed by Field-Emission Scanning Electron Microscopy (FE-SEM) and Atomic Force Microscopy (AFM); (ii) local electrical conductivity, as measured by Scanning Probe Microscopy (SPM); and (iii) molecular structure, as accessed by Raman Spectroscopy; these data provide the background against which the electrocatalytic activity can be rationalised. For comparison, the properties of Mn/PPy are gauged against those of graphite, PPy, and polycrystalline Pt (poly-Pt). Because the literature lacks accepted protocols for precise catalytic-activity measurement at a poly-Pt electrode in alkaline solution using the RDE methodology, we have also worked towards an intralaboratory benchmark by identifying some of the time-consuming parameters which drastically affect the reliability and repeatability of the measurement. PMID:28042491
Coaching: a new model for academic and career achievement
Deiorio, Nicole M.; Carney, Patricia A.; Kahl, Leslie E.; Bonura, Erin M.; Juve, Amy Miller
2016-01-01
Background: Individualized education is emerging as an innovative model for physician training. This requires faculty coaching to guide learners’ achievements in academic performance, competency development, and career progression. In addition, coaching can foster self-reflection and self-monitoring using a data-guided approach to support lifelong learning. Context: Coaching differs from mentoring or advising, and its application in medical education is novel. Because of this, definitions of the concept and the constructs of coaching as applied to medical education are needed to accurately assess the coaching relationship and coaching processes. These can then be linked to learner outcomes to inform how coaching serves as a modifier of academic and competency achievement and career satisfaction. Innovation: We developed definitions and constructs for academic coaching in medical education based on a review of the existing education and non-education coaching literature. These constructs focus on 1) establishing relationship principles, 2) conducting learner assessments, 3) developing and implementing an action plan, and 4) assessing results and revising plans accordingly. Implication: Coaching is emerging as an important construct in the context of medical education. This article lays the vital groundwork needed for evaluation of coaching programs aimed at producing outstanding physicians. PMID:27914193
Horliana, Anna Carolina Ratto Tempestini; Chambrone, Leandro; Foz, Adriana Moura; Artese, Hilana Paula Carillo; Rabelo, Mariana de Sousa; Pannuti, Cláudio Mendes; Romito, Giuseppe Alexandre
2014-01-01
Background: To date, there is no compilation of evidence-based information associating bacteremia with periodontal procedures. This systematic review aims to assess the magnitude, duration, prevalence and nature of bacteremia caused by periodontal procedures. Study Design: Systematic review. Types of Studies Reviewed: The MEDLINE, EMBASE and LILACS databases were searched in duplicate through August 2013 without language restriction. Observational studies were included if blood samples were collected before, during or after periodontal procedures in patients with periodontitis. Methodological quality was assessed in duplicate using the modified Newcastle-Ottawa scale (NOS). Results: The search strategy identified 509 potentially eligible articles, of which nine were included. Only four studies demonstrated high methodological quality, whereas five were of medium or low methodological quality. The study characteristics were considered too heterogeneous to conduct a meta-analysis. Among 219 analyzed patients, 106 (49.4%) had positive bacteremia. The most frequent bacteria were S. viridans, A. actinomycetemcomitans, P. gingivalis, M. micros and species of Streptococcus and Actinomyces, although the identification methods of the microbiologic assays differed among studies. Clinical Implications: Although half of the patients presented positive bacteremia after periodontal procedures, accurate results regarding the magnitude, duration and nature of bacteremia could not be confidently assessed. PMID:24870125
Wavelet-based system identification of short-term dynamic characteristics of arterial baroreflex.
Kashihara, Koji; Kawada, Toru; Sugimachi, Masaru; Sunagawa, Kenji
2009-01-01
The assessment of arterial baroreflex function in cardiovascular diseases requires quantitative evaluation of dynamic and static baroreflex properties because of the frequent modulation of baroreflex properties with unstable hemodynamics. The purpose of this study was to identify the dynamic baroreflex properties from transient changes of step pressure inputs with background noise during a short-duration baroreflex test in anesthetized rabbits with isolated carotid sinuses, using a modified wavelet-based time-frequency analysis. The proposed analysis was able to identify the transfer function of baroreflex as well as static properties from the transient input-output responses under normal [gain at 0.04 Hz from carotid sinus pressure (CSP) to arterial pressure (n = 8); 0.29 +/- 0.05 at low (40-60 mmHg), 1.28 +/- 0.12 at middle (80-100 mmHg), and 0.38 +/- 0.07 at high (120-140 mmHg) CSP changes] and pathophysiological [gain in control vs. phenylbiguanide (n = 8); 0.32 +/- 0.07 vs. 0.39 +/- 0.09 at low, 1.39 +/- 0.15 vs. 0.59 +/- 0.09 (p < 0.01) at middle, and 0.35 +/- 0.04 vs. 0.15 +/- 0.02 (p < 0.01) at high CSP changes] conditions. Subsequently, we tested the proposed wavelet-based method under closed-loop baroreflex responses; the simulation study indicates that it may be applicable to clinical situations for accurate assessment of dynamic baroreflex function. In conclusion, the dynamic baroreflex property to various pressure inputs could be simultaneously extracted from the step responses with background noise.
Impact of cause of death adjudication on the results of the European prostate cancer screening trial
Walter, Stephen D; de Koning, Harry J; Hugosson, Jonas; Talala, Kirsi; Roobol, Monique J; Carlsson, Sigrid; Zappa, Marco; Nelen, Vera; Kwiatkowski, Maciej; Páez, Álvaro; Moss, Sue; Auvinen, Anssi
2017-01-01
Background: The European Randomised Study of Prostate Cancer Screening has shown a 21% relative reduction in prostate cancer mortality at 13 years. The causes of death can be misattributed, particularly in elderly men with multiple comorbidities, and therefore accurate assessment of the underlying cause of death is crucial for valid results. To address potential unreliability of end-point assessment, and its possible impact on mortality results, we analysed the study outcome adjudication data in six countries. Methods: Latent class statistical models were formulated to compare the accuracy of individual adjudicators, and to assess whether accuracy differed between the trial arms. We used the model to assess whether correcting for adjudication inaccuracies might modify the study results. Results: There was some heterogeneity in adjudication accuracy of causes of death, but no consistent differential accuracy by trial arm. Correcting the estimated screening effect for misclassification did not alter the estimated mortality effect of screening. Conclusions: Our findings were consistent with earlier reports on the European screening trial. Observer variation, while demonstrably present, is unlikely to have materially biased the main study results. A bias in assigning causes of death that might have explained the mortality reduction by screening can be effectively ruled out. PMID:27855442
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porcella, D.B.; Bowie, G.L.; Campbell, C.L.
The Ecosystem Assessment Model (EAM) of the Cooling Lake Assessment Methodology was applied to the extensive ecological field data collected at Lake Norman, North Carolina by Duke Power Company to evaluate its capability to simulate lake ecosystems and the ecological effects of steam electric power plants. The EAM provided simulations over a five-year verification period that behaved as expected based on a one-year calibration. Major state variables of interest to utilities and regulatory agencies are: temperature, dissolved oxygen, and fish community variables. In qualitative terms, temperature simulation was very accurate, dissolved oxygen simulation was accurate, and fish prediction was reasonably accurate. The need for more accurate fisheries data, collected at monthly intervals with non-destructive sampling techniques, was identified.
Separating astrophysical sources from indirect dark matter signals
Siegal-Gaskins, Jennifer M.
2015-01-01
Indirect searches for products of dark matter annihilation and decay face the challenge of identifying an uncertain and subdominant signal in the presence of uncertain backgrounds. Two valuable approaches to this problem are (i) using analysis methods which take advantage of different features in the energy spectrum and angular distribution of the signal and backgrounds and (ii) more accurately characterizing backgrounds, which allows for more robust identification of possible signals. These two approaches are complementary and can be significantly strengthened when used together. I review the status of indirect searches with gamma rays using two promising targets, the Inner Galaxy and the isotropic gamma-ray background. For both targets, uncertainties in the properties of backgrounds are a major limitation to the sensitivity of indirect searches. I then highlight approaches which can enhance the sensitivity of indirect searches using these targets. PMID:25304638
Morales Piga, Antonio; Alonso Ferreira, Verónica; Villaverde-Hueso, Ana
2011-01-01
Recent years have seen an unprecedented increase in knowledge and understanding of the biochemical disturbances involved in constitutional bone disorders. Recognition of a genetic background as the common cause of these diseases prompted the replacement of the term «constitutional» with «genetic» in referring to them. Understanding the physiopathological bases, by identifying the altered metabolic pathways as well as their regulatory and control systems, favours an earlier and more accurate diagnosis based on interdisciplinary collaboration. Although clinical and radiological assessment remains crucial in the study of these disorders, the diagnosis is ever more often achieved by molecular and genetic analysis. Elucidation of the underlying damaged molecular mechanisms offers potentially useful targets for therapeutic research into these complex and often disabling diseases. 2010 Elsevier España, S.L. All rights reserved.
Costing interventions in primary care.
Kernick, D
2000-02-01
Against a background of increasing demands on limited resources, studies that relate benefits of health interventions to the resources they consume will be an important part of any decision-making process in primary care, and an accurate assessment of costs will be an important part of any economic evaluation. Although there is no such thing as a gold standard cost estimate, there are a number of basic costing concepts that underlie any costing study. How costs are derived and combined will depend on the assumptions that have been made in their derivation. It is important to be clear what assumptions have been made and why in order to maintain consistency across comparative studies and prevent inappropriate conclusions being drawn. This paper outlines some costing concepts and principles to enable primary care practitioners and researchers to have a basic understanding of costing exercises and their pitfalls.
Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui
2016-12-09
Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in the early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological autofluorescence background upon UV-Vis excitation and severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi response and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can lead to relatively large uncertainty in the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, with a ratio change of 3.56 per unit change in pHi over the range 3.0-7.0 and a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
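The self-ratiometric readout described above amounts to dividing the pH-sensitive 475 nm channel by the 645 nm reference channel and inverting a calibration curve. A minimal sketch, assuming a linear calibration with the quoted sensitivity of 3.56 ratio units per pH unit; the linear form and the intercept are hypothetical stand-ins for a measured calibration.

```python
import numpy as np

def ratio_to_ph(i_475, i_645, sensitivity=3.56, intercept=3.0):
    """Convert the 475 nm (pH-responsive) and 645 nm (reference) emission
    intensities into a pHi estimate through a linear calibration.
    The sensitivity of 3.56 ratio units per pH unit echoes the figure
    quoted above; the linear form and the intercept are hypothetical,
    standing in for a measured calibration curve."""
    ratio = np.asarray(i_475, dtype=float) / np.asarray(i_645, dtype=float)
    return intercept + ratio / sensitivity
```

Because both channels come from the same probe under the same excitation, the ratio cancels probe-concentration and excitation-intensity variations, which is the point of a ratiometric design.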
Visual and Vestibular Determinants of Perceived Eye-Level
NASA Technical Reports Server (NTRS)
Cohen, Malcolm Martin
2003-01-01
Both gravitational and optical sources of stimulation combine to determine the perceived elevations of visual targets. The ways in which these sources of stimulation combine with one another in operational aeronautical environments are critical for pilots to make accurate judgments of the relative altitudes of other aircraft and of their own altitude relative to the terrain. In a recent study, my colleagues and I required eighteen observers to set visual targets at their apparent horizon while they experienced various levels of Gz in the human centrifuge at NASA-Ames Research Center. The targets were viewed in darkness and also against specific background optical arrays that were oriented at various angles with respect to the vertical; target settings were lowered as Gz was increased; this effect was reduced when the background optical array was visible. Also, target settings were displaced in the direction that the background optical array was pitched. Our results were attributed to the combined influences of otolith-oculomotor mechanisms that underlie the elevator illusion and visual-oculomotor mechanisms (optostatic responses) that underlie the perceptual effects of viewing pitched optical arrays that comprise the background. In this paper, I present a mathematical model that describes the independent and combined effects of Gz intensity and the orientation and structure of background optical arrays; the model predicts quantitative deviations from normal accurate perceptions of target localization under a variety of conditions. Our earlier experimental results and the mathematical model are described in some detail, and the effects of viewing specific optical arrays under various gravitational-inertial conditions encountered in aeronautical environments are discussed.
NASA Astrophysics Data System (ADS)
Alsharrah, Saad A.; Bruce, David A.; Bouabid, Rachid; Somenahalli, Sekhar; Corcoran, Paul A.
2015-10-01
The use of remote sensing techniques to extract vegetation cover information for the assessment and monitoring of land degradation in arid environments has gained increased interest in recent years. However, such a task can be challenging, especially for medium-spatial resolution satellite sensors, due to soil background effects and the distribution and structure of perennial desert vegetation. In this study, we utilised Pleiades high-spatial resolution, multispectral (2m) and panchromatic (0.5m) imagery and focused on mapping small shrubs and low-lying trees using three classification techniques: 1) vegetation indices (VI) threshold analysis, 2) pre-built object-oriented image analysis (OBIA), and 3) a developed vegetation shadow model (VSM). We evaluated the success of each approach using a root of the sum of the squares (RSS) metric, which incorporated field data as control and three error metrics relating to commission, omission, and percent cover. Results showed that optimum VI performers returned good vegetation cover estimates at certain thresholds, but failed to accurately map the distribution of the desert plants. Using the pre-built IMAGINE Objective OBIA approach, we improved the vegetation distribution mapping accuracy, but this came at the cost of over classification, similar to results of lowering VI thresholds. We further introduced the VSM which takes into account shadow for further refining vegetation cover classification derived from VI. The results showed significant improvements in vegetation cover and distribution accuracy compared to the other techniques. We argue that the VSM approach using high-spatial resolution imagery provides a more accurate representation of desert landscape vegetation and should be considered in assessments of desertification.
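The RSS evaluation metric combines three error terms into one score. A sketch under common remote-sensing definitions of commission, omission, and percent-cover error; the paper's exact formulations may differ.

```python
import numpy as np

def rss_score(pred, truth):
    """Root-sum-of-squares of three error terms -- commission error,
    omission error, and percent-cover error -- for a predicted
    vegetation mask against a field-control mask (common definitions;
    the paper's exact formulations may differ). Lower is better."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    commission = (pred & ~truth).sum() / max(pred.sum(), 1)   # false detections
    omission = (~pred & truth).sum() / max(truth.sum(), 1)    # missed plants
    cover_err = abs(pred.mean() - truth.mean())               # cover-fraction bias
    return float(np.sqrt(commission ** 2 + omission ** 2 + cover_err ** 2))
```

Folding the three errors into one number is what lets the study rank VI thresholding, OBIA, and the VSM on a single scale; a method can score well on cover fraction yet poorly on distribution, and the RSS penalizes both.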
Epidemiology of tobacco use and dependence in adults in a poor peri-urban community in Lima, Peru
2012-01-01
Background Tobacco smoking is an important public health concern worldwide leading to both chronic disease and early death. In Latin America, smoking prevalence is estimated at approximately 30% and prior studies suggest that the prevalence in Peru is 22% to 38%. We sought to determine the prevalence of daily smoking in a poor peri-urban community in Lima, Peru. Methods We conducted a cross-sectional survey in a random sample of adults ≥40 years of age living in Pampas de San Juan de Miraflores, Lima, Peru. We asked participants to respond to a survey that included questions on sociodemographics, tobacco use and dependence. Results We enrolled 316 participants. Average monthly household income was ≤ 400 USD and nearly all homes had running water, sewage, and electricity. Most individuals had not completed high school. Smoking prevalence was 16% overall, yet daily smoking prevalence was 1.9%. Former daily smokers comprised 3.8% of current nonsmokers and 9.1% of current occasional smokers. Average scores for the Fagerstrom Test for Nicotine Dependence for daily smokers and occasional smokers were 1.5 and 0, respectively. Conclusions Daily use of tobacco is uncommon among adults in peri-urban communities of Lima, Peru, unlike their counterparts in Lima and other Latin American capital cities. Tobacco dependence is also low. Hence, efforts aimed at primary prevention are of utmost importance in these communities. This study provides an accurate baseline using an internationally recognized assessment tool (Global Adult Tobacco Survey), allowing for accurate assessment of tobacco control interventions over time. PMID:22429737
2012-01-01
Background Multiplex cytometric bead assays (CBAs) have a number of advantages over ELISA for antibody testing, but little information is available on standardization and validation of antibody CBAs to multiple Plasmodium falciparum antigens. The present study set out to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Methods Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs multiplex testing, plasma dilution, optimal buffer, and the number of beads required were assessed for CBA testing, and results from CBA vs. ELISA testing were compared. Results Optimal antigen amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens for CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinylalcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria endemic areas and less background reactivity for blank samples than ELISA. Conclusion With optimization, CBA may be the preferred method of testing for antibodies to P. falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA. PMID:23259607
Adaptations in humans for assessing physical strength from the voice
Sell, Aaron; Bryant, Gregory A.; Cosmides, Leda; Tooby, John; Sznycer, Daniel; von Rueden, Christopher; Krauss, Andre; Gurven, Michael
2010-01-01
Recent research has shown that humans, like many other animals, have a specialization for assessing fighting ability from visual cues. Because it is probable that the voice contains cues of strength and formidability that are not available visually, we predicted that selection has also equipped humans with the ability to estimate physical strength from the voice. We found that subjects accurately assessed upper-body strength in voices taken from eight samples across four distinct populations and language groups: the Tsimane of Bolivia, Andean herder-horticulturalists and United States and Romanian college students. Regardless of whether raters were told to assess height, weight, strength or fighting ability, they produced similar ratings that tracked upper-body strength independent of height and weight. Male voices were more accurately assessed than female voices, which is consistent with ethnographic data showing a greater tendency among males to engage in violent aggression. Raters extracted information about strength from the voice that was not supplied from visual cues, and were accurate with both familiar and unfamiliar languages. These results provide, to our knowledge, the first direct evidence that both men and women can accurately assess men's physical strength from the voice, and suggest that estimates of strength are used to assess fighting ability. PMID:20554544
Planets as background noise sources in free space optical communications
NASA Technical Reports Server (NTRS)
Katz, J.
1986-01-01
Background noise generated by planets is the dominant noise source in most deep-space direct-detection optical communications systems. Earlier approximate analyses of this problem are based on simplified blackbody calculations and can yield results that are inaccurate by up to an order of magnitude. Various other factors that must be taken into account to obtain a more accurate estimate of the noise magnitude, such as the phase angle and the actual spectral dependence of the planet's albedo, are examined.
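The "simplified blackbody" term underlying the earlier analyses is the Planck spectral radiance; a minimal sketch of that term alone. As the abstract notes, a realistic planetary estimate would further weight this (and the reflected solar component) by the wavelength-dependent albedo and the phase angle.

```python
import math

H_PLANCK = 6.62607015e-34  # Planck constant, J s
C_LIGHT = 2.99792458e8     # speed of light, m/s
K_BOLTZ = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength, temperature):
    """Blackbody spectral radiance B(lambda, T) in W sr^-1 m^-3 --
    the simplified term the earlier analyses relied on. A realistic
    planetary noise estimate would weight this (plus the reflected
    solar component) by the wavelength-dependent albedo and the
    phase-angle geometry."""
    a = 2.0 * H_PLANCK * C_LIGHT ** 2 / wavelength ** 5
    x = H_PLANCK * C_LIGHT / (wavelength * K_BOLTZ * temperature)
    return a / (math.exp(x) - 1.0)
```

For a roughly 300 K planetary surface this spectrum peaks near 10 μm, which is why thermal emission and reflected sunlight must be treated separately in the noise budget.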
The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.
Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan
2018-06-01
Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software code developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
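The underlying statistic is the Rényi divergence between distributions; a bare sketch for the discrete case. The paper's cluster-radius procedure builds on this statistic but involves localisation-specific machinery not reproduced here.

```python
import math

def renyi_divergence(p, q, alpha=2.0):
    """Renyi divergence D_alpha(P || Q) between two discrete
    distributions; alpha=1 recovers the Kullback-Leibler limit. The
    paper's cluster-radius procedure builds on this statistic, but the
    localisation-specific machinery is not reproduced here."""
    if alpha == 1.0:  # KL limit
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1.0)
```

Choosing alpha tunes how strongly dense regions are weighted relative to sparse ones, which is one reason the statistic tolerates a uniform background better than counting-based measures.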
NASA Astrophysics Data System (ADS)
Iwano, K.; Iwamoto, A.; Asahina, T.; Yamanoi, K.; Arikawa, Y.; Nagatomo, H.; Nakai, M.; Norimatsu, T.; Azechi, H.
2017-07-01
Infrared (IR) heating processes have been studied to form a deuterium layer in an inertial confinement fusion target. To understand the relationship between the IR intensity and the fuel layering time constant, we have developed a new method to assess the IR intensity during irradiation. In our method, a glass flask acting as a dummy target is filled with liquid hydrogen (LH2) and is then irradiated with 2-μm light. The IR intensity is subsequently calculated from the time constant of the LH2 evaporation rate. Although LH2 evaporation is also caused by the heat inflow from the surroundings and by the background heat, the evaporation rate due to IR heating can be accurately determined by acquiring the time constant with and without irradiation. The experimentally measured IR intensity is 0.66 mW/cm2, which agrees well with a value estimated by considering the IR photon energy balance. Our results suggest that the present method can be used to measure the IR intensity inside a cryogenic system during IR irradiation of laser fusion targets.
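The intensity recovery can be sketched as an energy balance: the evaporation rate measured with irradiation, minus the rate measured without, isolates the IR contribution. The latent heat, rates, and area below are illustrative, not the paper's values.

```python
def ir_intensity(rate_on, rate_off, latent_heat, area):
    """Absorbed IR intensity (W/cm^2) inferred from liquid-hydrogen
    evaporation rates (g/s) measured with and without irradiation.
    Subtracting the no-irradiation rate removes the background heat
    inflow, as the abstract describes. latent_heat: J/g; area:
    illuminated area in cm^2 (all numbers here are hypothetical)."""
    return latent_heat * (rate_on - rate_off) / area

# Illustrative numbers chosen to land near the reported 0.66 mW/cm^2;
# 446 J/g is the approximate latent heat of vaporization of LH2.
intensity = ir_intensity(rate_on=2.48e-4, rate_off=1.00e-4,
                         latent_heat=446.0, area=100.0)
```

The on/off subtraction is the key design choice: any steady heat leak from the surroundings cancels, so only the IR-driven evaporation remains.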
Steiner, John F.; Ho, P. Michael; Beaty, Brenda L.; Dickinson, L. Miriam; Hanratty, Rebecca; Zeng, Chan; Tavel, Heather M.; Havranek, Edward P.; Davidson, Arthur J.; Magid, David J.; Estacio, Raymond O.
2009-01-01
Background Although many studies have identified patient characteristics or chronic diseases associated with medication adherence, the clinical utility of such predictors has rarely been assessed. We attempted to develop clinical prediction rules for adherence with antihypertensive medications in two health care delivery systems. Methods and Results Retrospective cohort studies of hypertension registries in an inner-city health care delivery system (N = 17176) and a health maintenance organization (N = 94297) in Denver, Colorado. Adherence was defined by acquisition of 80% or more of antihypertensive medications. A multivariable model in the inner-city system found that adherent patients (36.3% of the total) were more likely than non-adherent patients to be older, white, married, and acculturated in US society, to have diabetes or cerebrovascular disease, not to abuse alcohol or controlled substances, and to be prescribed less than three antihypertensive medications. Although statistically significant, all multivariate odds ratios were 1.7 or less, and the model did not accurately discriminate adherent from non-adherent patients (C-statistic = 0.606). In the health maintenance organization, where 72.1% of patients were adherent, significant but weak associations existed between adherence and older age, white race, the lack of alcohol abuse, and fewer antihypertensive medications. The multivariate model again failed to accurately discriminate adherent from non-adherent individuals (C-statistic = 0.576). Conclusions Although certain socio-demographic characteristics or clinical diagnoses are statistically associated with adherence to refills of antihypertensive medications, a combination of these characteristics is not sufficiently accurate to allow clinicians to predict whether their patients will be adherent with treatment. PMID:20031876
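The C-statistic reported above (0.576 and 0.606, where 0.5 is chance) is the probability that a randomly chosen adherent patient receives a higher predicted score than a randomly chosen non-adherent one; a minimal sketch of the statistic itself:

```python
def c_statistic(scores, labels):
    """C-statistic (area under the ROC curve): fraction of
    adherent/non-adherent pairs in which the adherent patient (label 1)
    has the higher predicted score, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A value of 1.0 means perfect discrimination; values near 0.5, as in both cohorts here, mean the model's ranking of patients is barely better than a coin flip, which is why the authors conclude the prediction rules lack clinical utility.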
Ezra, D; Mellington, F; Cugnoni, H; Westcott, M
2005-01-01
Background and objectives: Annual attendances at the accident and emergency (A&E) department of St Bartholomew's and The Royal London NHS Trust exceed 100 000, of which 6% are ophthalmic. This study evaluated the accuracy of eye referrals from A&E senior house officers (SHOs) and emergency nurse practitioners (ENPs), and the impact any inaccuracies may have had on out of hours work. Methods: Over a four week period a record of all referrals from the A&E department was made. The doctor receiving the referral made a note of clinical variables as reported by the referring clinician. When the patient was subsequently reviewed by an ophthalmologist, a record was again made of these findings, and any discrepancies were recorded. Results: A total of 67 patients were recruited. ENPs were found to be consistently more accurate than SHOs in every aspect of the assessment, most notably in visual acuity (p = 0.0029) and provisional diagnosis (p = 0.012). Furthermore, had the examination findings been accurate, 58% of all SHO referrals seen after hours would have been triaged to the next available clinic, but only 10% of ENP referrals could have been seen at the next clinic session (p = 0.027). Conclusion: This study found ENPs to be more accurate than A&E SHOs in history taking, recording visual acuity, describing ocular anatomy, and making provisional diagnoses. A significant reduction in out of hours ophthalmic workload may be achieved in the authors' unit if ENPs were to see all eye emergencies. PMID:16189030
Verifying Three-Dimensional Skull Model Reconstruction Using Cranial Index of Symmetry
Kung, Woon-Man; Chen, Shuo-Tsung; Lin, Chung-Hsiang; Lu, Yu-Mei; Chen, Tzu-Hsuan; Lin, Muh-Shi
2013-01-01
Background Difficulty exists in scalp adaptation for cranioplasty with customized computer-assisted design/manufacturing (CAD/CAM) implant in situations of excessive wound tension and sub-cranioplasty dead space. To solve this clinical problem, the CAD/CAM technique should include algorithms to reconstruct a depressed contour to cover the skull defect. Satisfactory CAM-derived alloplastic implants are based on highly accurate three-dimensional (3-D) CAD modeling. Thus, it is quite important to establish a symmetrically regular CAD/CAM reconstruction prior to depressing the contour. The purpose of this study is to verify the aesthetic outcomes of CAD models with regular contours using cranial index of symmetry (CIS). Materials and methods From January 2011 to June 2012, decompressive craniectomy (DC) was performed for 15 consecutive patients in our institute. 3-D CAD models of skull defects were reconstructed using commercial software. These models were checked in terms of symmetry by CIS scores. Results CIS scores of CAD reconstructions were 99.24±0.004% (range 98.47–99.84). CIS scores of these CAD models were statistically significantly greater than 95%, identical to 99.5%, but lower than 99.6% (p<0.001, p = 0.064, p = 0.021 respectively, Wilcoxon matched pairs signed rank test). These data evidenced the highly accurate symmetry of these CAD models with regular contours. Conclusions CIS calculation is beneficial to assess aesthetic outcomes of CAD-reconstructed skulls in terms of cranial symmetry. This enables further accurate CAD models and CAM cranial implants with depressed contours, which are essential in patients with difficult scalp adaptation. PMID:24204566
Perceived Intelligence Is Associated with Measured Intelligence in Men but Not Women
Kleisner, Karel; Chvátalová, Veronika; Flegr, Jaroslav
2014-01-01
Background The ability to accurately assess the intelligence of other persons finds its place in everyday social interaction and should have important evolutionary consequences. Methodology/Principal Findings We used static facial photographs of 40 men and 40 women to test the relationship between measured IQ, perceived intelligence, and facial shape. Both men and women were able to accurately evaluate the intelligence of men by viewing facial photographs. In addition to general intelligence, figural and fluid intelligence showed a significant relationship with perceived intelligence, but again, only in men. No relationship between perceived intelligence and IQ was found for women. We used geometric morphometrics to determine which facial traits are associated with the perception of intelligence, as well as with intelligence as measured by IQ testing. Faces that are perceived as highly intelligent are rather prolonged with a broader distance between the eyes, a larger nose, a slight upturn to the corners of the mouth, and a sharper, pointing, less rounded chin. By contrast, the perception of lower intelligence is associated with broader, more rounded faces with eyes closer to each other, a shorter nose, declining corners of the mouth, and a rounded and massive chin. By contrast, we found no correlation between morphological traits and real intelligence measured with IQ test, either in men or women. Conclusions These results suggest that a perceiver can accurately gauge the real intelligence of men, but not women, by viewing their faces in photographs; however, this estimation is possibly not based on facial shape. Our study revealed no relation between intelligence and either attractiveness or face shape. PMID:24651120
See, Rachel L.; Driscoll, Virginia D.; Gfeller, Kate; Kliethermes, Stephanie; Oleson, Jacob
2013-01-01
Background Cochlear implant (CI) users have difficulty perceiving some intonation cues in speech and melodic contours because of poor frequency selectivity in the cochlear implant signal. Objectives To assess perceptual accuracy of normal hearing (NH) children and pediatric CI users on speech intonation (prosody), melodic contour, and pitch ranking, and to determine potential predictors of outcomes. Hypothesis Does perceptual accuracy for speech intonation or melodic contour differ as a function of auditory status (NH, CI), perceptual category (falling vs. rising intonation/contour), pitch perception, or individual differences (e.g., age, hearing history)? Method NH and CI groups were tested on recognition of falling intonation/contour vs. rising intonation/contour presented in both spoken and melodic (sung) conditions. Pitch ranking was also tested. Outcomes were correlated with variables of age, hearing history, HINT, and CNC scores. Results The CI group was significantly less accurate than the NH group in spoken (CI, M=63.1 %; NH, M=82.1%) and melodic (CI, M=61.6%; NH, M=84.2%) conditions. The CI group was more accurate in recognizing rising contour in the melodic condition compared with rising intonation in the spoken condition. Pitch ranking was a significant predictor of outcome for both groups in falling intonation and rising melodic contour; age at testing and hearing history variables were not predictive of outcomes. Conclusions Children with CIs were less accurate than NH children in perception of speech intonation, melodic contour, and pitch ranking. However, the larger pitch excursions of the melodic condition may assist in recognition of the rising inflection associated with the interrogative form. PMID:23442568
Characterizing the True Background Corona with SDO/AIA
NASA Technical Reports Server (NTRS)
Napier, Kate; Winebarger, Amy; Alexander, Caroline
2014-01-01
Characterizing the nature of the solar coronal background would enable scientists to determine plasma parameters more accurately, and may lead to a better understanding of the coronal heating problem. Because the Sun's 3D structure is studied in 2D projection, any line of sight includes both foreground and background material, and thus the issue of background subtraction arises. By investigating the intensity values in and around an active region, using multiple wavelengths collected by the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory (SDO) over an eight-hour period, this project aims to characterize the background as smooth or structured. Different methods were employed to measure the true coronal background and create minimum intensity images, which were then examined for the presence of structure. The background images created were found to contain long-lived structures, including coronal loops, that were still present in all of the wavelengths examined: 193, 171, 131, and 211 Angstroms. The intensity profiles across the active region indicate that the background is much more structured than previously thought.
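A minimum intensity image of the kind described can be sketched as a pixelwise minimum over the time series; this is a simplified stand-in for the background-estimation methods the project compares.

```python
import numpy as np

def min_background(stack):
    """Pixelwise minimum over a stack of co-aligned images
    (shape: n_frames x ny x nx) -- one simple way to build a
    minimum-intensity background estimate from a time series.
    Anything surviving the minimum persisted across every frame."""
    return np.min(np.asarray(stack), axis=0)

# Toy 3-frame, 2x2-pixel series; real input would be the eight-hour
# AIA sequence in each wavelength band.
bg = min_background([[[5, 2], [3, 4]],
                     [[1, 9], [3, 0]],
                     [[2, 2], [7, 4]]])
```

Structure that remains in such a minimum image, like the long-lived loops reported above, cannot be explained by transient brightenings, which is what makes the minimum a useful probe of the "true" background.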
Predicting Airport Screening Officers' Visual Search Competency With a Rapid Assessment.
Mitroff, Stephen R; Ericson, Justin M; Sharpe, Benjamin
2018-03-01
Objective The study's objective was to assess a new personnel selection and assessment tool for aviation security screeners. A mobile app was modified to create a tool, and the question was whether it could predict professional screeners' on-job performance. Background A variety of professions (airport security, radiology, the military, etc.) rely on visual search performance-being able to detect targets. Given the importance of such professions, it is necessary to maximize performance, and one means to do so is to select individuals who excel at visual search. A critical question is whether it is possible to predict search competency within a professional search environment. Method Professional searchers from the USA Transportation Security Administration (TSA) completed a rapid assessment on a tablet-based X-ray simulator (XRAY Screener, derived from the mobile technology app Airport Scanner; Kedlin Company). The assessment contained 72 trials that were simulated X-ray images of bags. Participants searched for prohibited items and tapped on them with their finger. Results Performance on the assessment significantly related to on-job performance measures for the TSA officers such that those who were better XRAY Screener performers were both more accurate and faster at the actual airport checkpoint. Conclusion XRAY Screener successfully predicted on-job performance for professional aviation security officers. While questions remain about the underlying cognitive mechanisms, this quick assessment was found to significantly predict on-job success for a task that relies on visual search performance. Application It may be possible to quickly assess an individual's visual search competency, which could help organizations select new hires and assess their current workforce.
Doan, Bich-Thuy; Latorre Ossa, Heldmuth; Jugé, Lauriane; Gennisson, Jean-Luc; Tanter, Mickaël; Scherman, Daniel; Chabot, Guy G.; Mignet, Nathalie
2013-01-01
Background and Objectives. To determine the most appropriate technique for tumour follow-up in experimental therapeutics, we compared ultrasound (US) and magnetic resonance imaging (MRI) to characterize ectopic and orthotopic colon carcinoma models. Methods. CT26 tumours were implanted subcutaneously (s.c.) in Balb/c mice for the ectopic model or into the caecum for the orthotopic model. Tumours were evaluated by histology, spectrofluorescence, MRI, and US. Results. Histology of CT26 tumours showed homogeneously dispersed cancer cells and blood vessels. Visualization of the vascular network using labelled albumin showed that CT26 tumours were highly vascularized and disorganized. MRI allowed high-resolution and accurate 3D tumour measurements and provided additional anatomical and functional information. Noninvasive US imaging allowed good delineation of tumours despite a hypoechogenic signal. Monitoring of tumour growth with US could be accomplished as early as 5 days after implantation, with a shorter acquisition time (<5 min) compared to MRI. Conclusion. MRI and US afforded excellent noninvasive imaging techniques to accurately follow tumour growth of ectopic and orthotopic CT26 tumours. These two techniques can be appropriately used for tumour treatment follow-up, with a preference for US imaging due to its short acquisition time and simplicity of use. PMID:23936648
Wind Speed Perception and Risk
Agdas, Duzgun; Webster, Gregory D.; Masters, Forrest J.
2012-01-01
Background How accurately do people perceive extreme wind speeds and how does that perception affect the perceived risk? Prior research on human–wind interaction has focused on comfort levels in urban settings or knock-down thresholds. No systematic experimental research has attempted to assess people's ability to estimate extreme wind speeds and perceptions of their associated risks. Method We exposed 76 people to 10, 20, 30, 40, 50, and 60 mph (4.5, 8.9, 13.4, 17.9, 22.3, and 26.8 m/s) winds in randomized orders and asked them to estimate wind speed and the corresponding risk they felt. Results Multilevel modeling showed that people were accurate at lower wind speeds but overestimated wind speeds at higher levels. Wind speed perceptions mediated the direct relationship between actual wind speeds and perceptions of risk (i.e., the greater the perceived wind speed, the greater the perceived risk). The number of tropical cyclones people had experienced moderated the strength of the actual–perceived wind speed relationship; consequently, mediation was stronger for people who had experienced fewer storms. Conclusion These findings provide a clearer understanding of wind and risk perception, which can aid development of public policy solutions toward communicating the severity and risks associated with natural disasters. PMID:23226230
MRI in T staging of rectal cancer: How effective is it?
Mulla, MG; Deb, R; Singh, R
2010-01-01
Background: Rectal cancer constitutes about one-third of all gastrointestinal (GI) tract tumors. Because of the high recurrence rates (30%) in rectal cancer, it is vitally important to accurately stage these tumours preoperatively so that appropriate surgical resection can be undertaken. MRI is the ideal technique for the preoperative staging of these tumours. Aim: To determine the accuracy of local T staging of rectal cancer with MRI, using histopathological staging as the gold standard. Materials and Methods: Forty consecutive patients admitted with rectal cancer over a period of 18 months were included in this retrospective study. MRI scans were performed prior to surgery in all patients on 1.5T scanners. Two radiologists with a special interest in gastrointestinal imaging reported all images. Two dedicated histopathologists reported the histology slides. The accuracy of preoperative local MRI T staging was assessed by comparison with postoperative histopathological staging. Results: There was agreement between MRI and histopathology (TNM) staging in 12 patients (30%). The sensitivity and specificity of MRI for T staging were 89% and 67%, respectively. The circumferential resection margin (CRM) status was accurately staged in 94.1% of the patients. Conclusions: Preoperative staging with MRI is sensitive in identifying CRM involvement, which is the main factor affecting the outcome of surgery. PMID:20607023
High-fidelity real-time maritime scene rendering
NASA Astrophysics Data System (ADS)
Shyu, Hawjye; Taczak, Thomas M.; Cox, Kevin; Gover, Robert; Maraviglia, Carlos; Cahill, Colin
2011-06-01
The ability to simulate authentic engagements using real-world hardware is increasingly important. For rendering maritime environments, scene generators must be capable of rendering radiometrically accurate scenes with correct temporal and spatial characteristics. When the simulation is used as input to real-world hardware or human observers, the scene generator must operate in real-time. This paper introduces a novel, real-time scene generation capability for rendering radiometrically accurate scenes of backgrounds and targets in maritime environments. The new model is an optimized and parallelized version of the US Navy CRUISE_Missiles rendering engine. It was designed to accept environmental descriptions and engagement geometry data from external sources, render a scene, transform the radiometric scene using the electro-optical response functions of a sensor under test, and output the resulting signal to real-world hardware. This paper reviews components of the scene rendering algorithm, and details the modifications required to run this code in real-time. A description of the simulation architecture and interfaces to external hardware and models is presented. Performance assessments of the frame rate and radiometric accuracy of the new code are summarized. This work was completed in FY10 under Office of Secretary of Defense (OSD) Central Test and Evaluation Investment Program (CTEIP) funding and will undergo a validation process in FY11.
Gandola, Emanuele; Antonioli, Manuela; Traficante, Alessio; Franceschini, Simone; Scardi, Michele; Congestri, Roberta
2016-05-01
Toxigenic cyanobacteria are one of the main health risks associated with water resources worldwide, as their toxins can affect humans and fauna exposed via drinking water, aquaculture and recreation. Microscopy monitoring of cyanobacteria in water bodies and massive growth systems is a routine operation for cell abundance and growth estimation. Here we present ACQUA (Automated Cyanobacterial Quantification Algorithm), a new fully automated image analysis method designed for filamentous genera in bright-field microscopy. A pre-processing algorithm has been developed to highlight filaments of interest from background signals due to other phytoplankton and dust. A spline-fitting algorithm has been designed to recombine interrupted and crossing filaments in order to perform accurate morphometric analysis and to extract the surface pattern information of highlighted objects. In addition, 17 specific pattern indicators have been developed and used as input data for a machine-learning algorithm dedicated to the recognition of five widespread toxic or potentially toxic filamentous genera in freshwater: Aphanizomenon, Cylindrospermopsis, Dolichospermum, Limnothrix and Planktothrix. The method was validated using freshwater samples from three Italian volcanic lakes, comparing automated vs. manual results. ACQUA proved to be a fast and accurate tool to rapidly assess freshwater quality and to characterize cyanobacterial assemblages in aquatic environments. Copyright © 2016 Elsevier B.V. All rights reserved.
Integration of Network Biology and Imaging to Study Cancer Phenotypes and Responses.
Tian, Ye; Wang, Sean S; Zhang, Zhen; Rodriguez, Olga C; Petricoin, Emanuel; Shih, Ie-Ming; Chan, Daniel; Avantaggiati, Maria; Yu, Guoqiang; Ye, Shaozhen; Clarke, Robert; Wang, Chao; Zhang, Bai; Wang, Yue; Albanese, Chris
2014-01-01
Ever growing "omics" data and continuously accumulated biological knowledge provide an unprecedented opportunity to identify molecular biomarkers and their interactions that are responsible for cancer phenotypes that can be accurately defined by clinical measurements such as in vivo imaging. Since signaling or regulatory networks are dynamic and context-specific, systematic efforts to characterize such structural alterations must effectively distinguish significant network rewiring from random background fluctuations. Here we introduced a novel integration of network biology and imaging to study cancer phenotypes and responses to treatments at the molecular systems level. Specifically, Differential Dependence Network (DDN) analysis was used to detect statistically significant topological rewiring in molecular networks between two phenotypic conditions, and in vivo Magnetic Resonance Imaging (MRI) was used to more accurately define phenotypic sample groups for such differential analysis. We applied DDN to analyze two distinct phenotypic groups of breast cancer and study how genomic instability affects the molecular network topologies in high-grade ovarian cancer. Further, FDA-approved arsenic trioxide (ATO) and the ND2-SmoA1 mouse model of Medulloblastoma (MB) were used to extend our analyses of combined MRI and Reverse Phase Protein Microarray (RPMA) data to assess tumor responses to ATO and to uncover the complexity of therapeutic molecular biology.
Veeraraghavan, Harini; Dashevsky, Brittany Z; Onishi, Natsuko; Sadinski, Meredith; Morris, Elizabeth; Deasy, Joseph O; Sutton, Elizabeth J
2018-03-19
We present a segmentation approach that combines GrowCut (GC) with cancer-specific multi-parametric Gaussian Mixture Model (GCGMM) to produce accurate and reproducible segmentations. We evaluated GCGMM using a retrospectively collected 75 invasive ductal carcinoma with ERPR+ HER2- (n = 15), triple negative (TN) (n = 9), and ER-HER2+ (n = 57) cancers with variable presentation (mass and non-mass enhancement) and background parenchymal enhancement (mild and marked). Expert delineated manual contours were used to assess the segmentation performance using Dice coefficient (DSC), mean surface distance (mSD), Hausdorff distance, and volume ratio (VR). GCGMM segmentations were significantly more accurate than GrowCut (GC) and fuzzy c-means clustering (FCM). GCGMM's segmentations and the texture features computed from those segmentations were the most reproducible compared with manual delineations and other analyzed segmentation methods. Finally, random forest (RF) classifier trained with leave-one-out cross-validation using features extracted from GCGMM segmentation resulted in the best accuracy for ER-HER2+ vs. ERPR+/TN (GCGMM 0.95, expert 0.95, GC 0.90, FCM 0.92) and for ERPR + HER2- vs. TN (GCGMM 0.92, expert 0.91, GC 0.77, FCM 0.83).
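The segmentation comparisons above are scored with the Dice similarity coefficient (DSC), a standard overlap measure that is simple to compute. Below is a minimal sketch; the binary masks are synthetic examples, not data from the study.

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A intersect B| / (|A| + |B|)."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    denom = seg.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(seg, ref).sum() / denom

# Two partially overlapping 6x6 squares on a 10x10 grid.
a = np.zeros((10, 10), dtype=bool); a[2:8, 2:8] = True    # 36 px
b = np.zeros((10, 10), dtype=bool); b[4:10, 4:10] = True  # 36 px
# Overlap is the 4x4 region [4:8, 4:8] = 16 px -> DSC = 32/72
print(round(dice_coefficient(a, b), 3))  # → 0.444
```

A DSC of 1.0 means the automated and expert contours coincide exactly; values near 0 mean almost no overlap.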
Destination memory accuracy and confidence in younger and older adults.
Johnson, Tara L; Jefferson, Susan C
2018-01-01
Background/Study Context: Nascent research on destination memory (remembering to whom we tell particular information) suggested that older adults have deficits in destination memory and are more confident in inaccurate responses than younger adults. This study assessed the effects of age, attentional resources, and mental imagery on destination memory accuracy and confidence in younger and older adults. Using a computer format, participants told facts to pictures of famous people in one of four conditions (control, self-focus, refocus, imagery). Older adults had lower destination memory accuracy than younger adults, driven by a higher level of false alarms. Whereas younger adults were more confident in accurate answers, older adults were more confident in inaccurate answers. Accuracy across participants was lowest when attention was directed internally but significantly improved when mental imagery was used. Importantly, the age-related differences in false alarms and high-confidence inaccurate answers disappeared when imagery was used. Older adults are more likely than younger adults to commit destination memory errors and are less accurate in related confidence judgments. Furthermore, the use of associative memory strategies may help improve destination memory across age groups, improve the accuracy of confidence judgments in older adults, and decrease age-related destination memory impairment, particularly in young-old adults.
NASA Astrophysics Data System (ADS)
Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung
2018-06-01
SPECT plays an important role in peptide receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate, because reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity, and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolutions of Lu-177 with and without an added uniform background were comparable to those of Tc-99m in the axial, radial, and tangential directions. The system sensitivity measured for Lu-177 was almost three times lower than that of Tc-99m.
Benchmark model correction of monitoring system based on Dynamic Load Test of Bridge
NASA Astrophysics Data System (ADS)
Shi, Jing-xian; Fan, Jiang
2018-03-01
Structural health monitoring (SHM) is an active field of research aimed at assessing bridge safety and reliability, an assessment that must be carried out on the basis of an accurate finite element simulation. A bridge finite element model simplifies the structural section form, support conditions, material properties, and boundary conditions on the basis of the design and construction drawings, and from this yields the calculation model and its results. However, a finite element model established according to the design documents and specifications cannot fully reflect the true state of the bridge, so the model must be updated to obtain a more accurate one. Taking the Da-guan river crossing of the Ma-Zhao highway in Yunnan province as the study case, we performed a dynamic load test and found that the impact coefficient of the theoretical model of the bridge differed greatly from the coefficient obtained in the actual test, and that the pattern of variation also differed. The calculation model was therefore adjusted according to the actual situation to obtain the correct natural frequencies of the bridge. The revised impact coefficient showed that the modified finite element model is closer to the real state of the structure, providing a basis for correction of the finite element model.
Antigen detection based on background fluorescence quenching immunochromatographic assay.
Chen, Xiangjun; Xu, Yangyang; Yu, Jinsheng; Li, Jiutong; Zhou, Xuelei; Wu, Chuanyong; Ji, Qiuliang; Ren, Yuan; Wang, Liqun; Huang, Zhengyi; Zhuang, Hanling; Piao, Long; Head, Richard; Wang, Yajie; Lou, Jiatao
2014-09-02
Gold immunochromatographic assay (GICA) has been in use for quite a while, but in the vast majority of applications it is only qualitative. A fast, simple, and quantitative GICA is needed for better medicine. In the current study, we established a novel quantitative GICA based on fluorescence quenching and nitrocellulose membrane background signals, called background fluorescence quenching immunochromatographic assay (bFQICA). Using the model analyte alpha-fetoprotein (AFP), the present study assessed the performance of bFQICA in numerous assay aspects. With serial dilutions of the international AFP standard, standard curves for the calculation of AFP concentration were successfully established. At 10 and 100 ng mL(-1) of the international AFP standard, assay variability was defined with coefficients of variation of 10.4% and 15.2%, respectively. For samples with an extended range of AFP levels, bFQICA was able to detect AFP at as low as 1 ng mL(-1). Fluorescence in bFQICA strips stayed constant over months. A good correlation between the results from bFQICA and from a well-established Roche electrochemiluminescence immunoassay was observed in 27 serum samples (r=0.98, p<0.001). In conclusion, our study has demonstrated distinctive features of bFQICA over conventional GICA, including utilization of a unique fluorescence ratio between nitrocellulose membrane background and specific signals (F1/F2) to ensure accurate measurements, combined qualitative and quantitative capabilities, and exceptionally high sensitivity for detection of very low levels of antigens. All of these features could make bFQICA attractive as a model for antigen-antibody complex based GICA, and could promote bFQICA to a broad range of applications for investigation of a variety of diseases. Copyright © 2014 Elsevier B.V. All rights reserved.
Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.
Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J
2010-07-27
Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is often used to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the quantized correlation coefficient (QCC), which is much less dependent on the amount of signal. This involves discretization of the data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of background noise on the statistic, which then focuses properly on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used; QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
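The QCC construction lends itself to a compact sketch: rank each replicate into quantiles, collapse the low (background) quantiles onto a single level, then take the Pearson correlation of the quantized values. The quantile count and merge point below are illustrative parameters; the published method selects the merge point from the data itself.

```python
import numpy as np

def qcc(x, y, n_quantiles=20, background_quantiles=15):
    """Illustrative quantized correlation: map each replicate to quantile
    ranks, collapse the lowest (background) quantiles onto one level,
    then take the Pearson correlation of the quantized values.
    Simplification: the fixed merge point here stands in for the
    data-driven merging procedure of the published QCC."""
    def quantize(v):
        v = np.asarray(v, dtype=float)
        ranks = np.argsort(np.argsort(v))          # 0 .. n-1
        q = np.floor(n_quantiles * ranks / len(v))  # quantile index 0 .. 19
        return np.where(q < background_quantiles, 0.0, q)
    return np.corrcoef(quantize(x), quantize(y))[0, 1]

# Two synthetic "replicates": mostly background noise plus a small set
# of reproducibly enriched probes.
rng = np.random.default_rng(1)
signal = rng.normal(5.0, 1.0, size=50)
rep1 = np.concatenate([rng.normal(size=1000), signal + rng.normal(scale=0.2, size=50)])
rep2 = np.concatenate([rng.normal(size=1000), signal + rng.normal(scale=0.2, size=50)])
r = qcc(rep1, rep2)
assert -1.0 <= r <= 1.0
```

Because the merged background probes all map to one level, uncorrelated background noise contributes little to the statistic, which is the intuition behind QCC's robustness.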
Characterization of in-flight performance of ion propulsion systems
NASA Astrophysics Data System (ADS)
Sovey, James S.; Rawlin, Vincent K.
1993-06-01
In-flight measurements of ion propulsion performance, ground test calibrations, and diagnostic performance measurements were reviewed. It was found that accelerometers provided the most accurate in-flight thrust measurements compared with four other methods that were surveyed. An experiment has also demonstrated that pre-flight alignment of the thrust vector was sufficiently accurate so that gimbal adjustments and use of attitude control thrusters were not required to counter disturbance torques caused by thrust vector misalignment. The effects of facility background pressure, facility enhanced charge-exchange reactions, and contamination on ground-based performance measurements are also discussed. Vacuum facility pressures for inert-gas ion thruster life tests and flight qualification tests will have to be less than 2 mPa to ensure accurate performance measurements.
René de Cotret, Laurent P; Siwick, Bradley J
2017-07-01
The general problem of background subtraction in ultrafast electron powder diffraction (UEPD) is presented, with a focus on diffraction patterns obtained from materials of moderately complex structure, which contain many overlapping peaks and effectively no scattering-vector regions that can be considered exclusively background. We compare the performance of background subtraction algorithms based on discrete and dual-tree complex wavelet transforms (DTCWT) when applied to simulated UEPD data on the M1-R phase transition in VO2 with a time-varying background. We find that the DTCWT approach is capable of extracting intensities that are accurate to better than 2% across the whole range of scattering vector simulated, effectively independent of delay time. A Python package is available.
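The wavelet baseline idea can be illustrated without the dual-tree machinery: iteratively replace the signal by a low-pass approximation and keep the pointwise minimum, so sharp diffraction peaks are rejected while the slowly varying background survives. In this sketch a block-averaging (Haar-like) approximation stands in for the wavelet transform; function names and parameters are illustrative, not from the published package.

```python
import numpy as np

def haar_approximation(x, level):
    """Crude multi-level low-pass approximation: average over blocks of
    2**level samples, then upsample back to the original length."""
    n = len(x)
    block = 2 ** level
    pad = (-n) % block
    xp = np.pad(np.asarray(x, dtype=float), (0, pad), mode="edge")
    means = xp.reshape(-1, block).mean(axis=1)
    return np.repeat(means, block)[:n]

def iterative_baseline(signal, level=5, n_iter=30):
    """Iterative baseline estimate: at each pass, low-pass the current
    estimate and clip it so it never rises above the data. Peaks are
    progressively rejected; the slow background remains."""
    baseline = np.asarray(signal, dtype=float).copy()
    for _ in range(n_iter):
        approx = haar_approximation(baseline, level)
        baseline = np.minimum(baseline, approx)
    return baseline

# Two sharp "diffraction peaks" riding on a slowly decaying background.
q = np.linspace(0.0, 10.0, 512)
background = 5.0 * np.exp(-q / 6.0)
peaks = np.exp(-((q - 3.0) ** 2) / 0.01) + np.exp(-((q - 6.0) ** 2) / 0.01)
est = iterative_baseline(background + peaks)
```

The DTCWT variant replaces the block average with a shift-invariant complex wavelet approximation, which follows a smooth background more faithfully, but the clip-and-iterate structure is the same.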
Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael
2008-01-01
Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829; the c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were also similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only.
Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
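The c statistic quoted above is the probability that a randomly chosen case received a higher predicted risk than a randomly chosen non-case (ties counted as one half). A minimal sketch, using made-up risks rather than NHANES data:

```python
def c_statistic(risk, event):
    """Concordance (c statistic): fraction of case/non-case pairs in
    which the case received the higher predicted risk; ties count 1/2."""
    cases = [r for r, e in zip(risk, event) if e]
    controls = [r for r, e in zip(risk, event) if not e]
    pairs = concordant = 0.0
    for rc in cases:
        for rn in controls:
            pairs += 1.0
            if rc > rn:
                concordant += 1.0
            elif rc == rn:
                concordant += 0.5
    return concordant / pairs

# Hypothetical predicted risks and observed events (1 = CVD event).
risk = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
event = [1, 1, 0, 1, 0, 0]
# 8 of 9 case/non-case pairs are concordant -> 8/9
print(round(c_statistic(risk, event), 3))  # → 0.889
```

A c statistic of 0.5 is chance-level discrimination; values near 0.83, as reported for women above, indicate that the model orders cases ahead of non-cases most of the time.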
Objective Integrated Assessment of Functional Outcomes in Reduction Mammaplasty
Passaro, Ilaria; Malovini, Alberto; Faga, Angela; Toffola, Elena Dalla
2013-01-01
Background: The aim of our study was an objective, integrated assessment of the functional outcomes of reduction mammaplasty. Methods: The study involved 17 women undergoing reduction mammaplasty from March 2009 to June 2011. Each patient was assessed before surgery and 2 months postoperatively with an original combination of 4 subjective and objective assessment methods: a physiatric clinical examination, the Roland Morris Disability Questionnaire, the Berg Balance Scale, and a static force platform analysis. Results: All of the tests yielded multiple statistically significant, mutually consistent outcomes demonstrating a significant improvement in functional status following reduction mammaplasty. Surgical correction of breast hypertrophy achieved both spinal pain relief and recovery of performance status in everyday life tasks, owing to a functional rearrangement of muscular posture with consistent sparing of antigravity muscle activity. Pain reduction in turn reduced antalgic stiffness and improved the spinal range of motion. In our sample, the improvement of the spinal range of motion in flexion matched a similar improvement in extension. Recovery of a more favorable postural pattern, with reduction of the anterior imbalance, was demonstrated by static force stabilometry; postoperatively, all of our patients narrowed the gap between their actual body barycenter and the ideal one. The static force platform assessment also consistently confirmed the effectiveness of an accurate clinical examination of functional impairment from breast hypertrophy. Conclusions: The static force platform assessment might help the clinician support the diagnosis of functional impairment from breast hypertrophy with objectively based data. PMID:25289256
Synthesis of Survey Questions That Accurately Discriminate the Elements of the TPACK Framework
ERIC Educational Resources Information Center
Jaikaran-Doe, Seeta; Doe, Peter Edward
2015-01-01
A number of validated survey instruments for assessing technological pedagogical content knowledge (TPACK) do not accurately discriminate between the seven elements of the TPACK framework particularly technological content knowledge (TCK) and technological pedagogical knowledge (TPK). By posing simple questions that assess technological,…
Self-Assessment and Continuing Professional Development: The Canadian Perspective
ERIC Educational Resources Information Center
Silver, Ivan; Campbell, Craig; Marlow, Bernard; Sargeant, Joan
2008-01-01
Introduction: Several recent studies highlight that physicians are not very accurate at assessing their competence in clinical domains when compared to objective measures of knowledge and performance. Instead of continuing to try to train physicians to be more accurate self-assessors, the research suggests that physicians will benefit from…
Foresight begins with FMEA. Delivering accurate risk assessments.
Passey, R D
1999-03-01
If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.
Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization
NASA Astrophysics Data System (ADS)
Li, Li
2018-03-01
In order to extract targets from complex backgrounds more quickly and accurately, and to further improve the detection of defects, a dual-threshold segmentation method using Arimoto entropy based on chaotic bee colony optimization was proposed. First, the single-threshold selection method based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then, the intermediate variables in the formulae for Arimoto entropy dual-threshold selection were calculated by recursion to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on the tent map. A fast search for the two optimal thresholds was achieved using the improved bee colony optimization algorithm, markedly accelerating the search. A large number of experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation, and multi-threshold segmentation using reciprocal gray entropy, the proposed method segments targets more quickly and accurately, with superior segmentation quality. It proves to be a fast and effective method for image segmentation.
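The dual-threshold criterion can be sketched directly: pick the threshold pair that maximizes the summed Arimoto entropies of the three gray-level classes. The entropy form and order parameter below are one common convention and the search is brute force; the paper's recursive intermediate terms and chaotic bee colony exist precisely to avoid this exhaustive scan.

```python
import numpy as np

def arimoto_entropy(p, alpha=0.7):
    """Arimoto entropy of a normalized probability vector, in one common
    form: H_a(p) = a/(1-a) * ((sum p_i**a)**(1/a) - 1), a != 1."""
    p = p[p > 0]
    return alpha / (1.0 - alpha) * (np.sum(p ** alpha) ** (1.0 / alpha) - 1.0)

def dual_threshold_arimoto(image, alpha=0.7, bins=64):
    """Brute-force search for (t1, t2) maximizing the summed Arimoto
    entropies of the three gray-level classes."""
    hist, edges = np.histogram(image, bins=bins, range=(0.0, 256.0))
    p = hist / hist.sum()
    best, best_t = -np.inf, (edges[1], edges[2])
    for i in range(1, bins - 1):
        for j in range(i + 1, bins):
            score, ok = 0.0, True
            for c in (p[:i], p[i:j], p[j:]):
                w = c.sum()
                if w == 0.0:
                    ok = False
                    break
                score += arimoto_entropy(c / w, alpha)
            if ok and score > best:
                best, best_t = score, (edges[i], edges[j])
    return best_t

# Synthetic tri-modal image: dark background, mid-gray target, bright defects.
rng = np.random.default_rng(2)
img = np.concatenate([rng.normal(40, 5, 4000),
                      rng.normal(120, 5, 4000),
                      rng.normal(200, 5, 4000)]).clip(0, 255)
t1, t2 = dual_threshold_arimoto(img)
assert 0.0 < t1 < t2 < 256.0
```

The exhaustive scan is O(bins^2) entropy evaluations per image, which is exactly the cost that recursion over intermediate sums and the bee colony search are designed to cut down.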
Characterization of xenon ion and neutral interactions in a well-characterized experiment
NASA Astrophysics Data System (ADS)
Patino, Marlene I.; Wirz, Richard E.
2018-06-01
Interactions between fast ions and slow neutral atoms are commonly dominated by charge-exchange and momentum-exchange collisions, which are important to understanding and simulating the performance and behavior of many plasma devices. To investigate these interactions, this work developed a simple, well-characterized experiment that accurately measures the behavior of high energy xenon ions incident on a background of xenon neutral atoms. By using well-defined operating conditions and a simple geometry, these results serve as canonical data for the development and validation of plasma models and models of neutral beam sources that need to ensure accurate treatment of angular scattering distributions of charge-exchange and momentum-exchange ions and neutrals. The energies used in this study (~1.5 keV) are relevant for electric propulsion devices and can be used to improve models of ion-neutral interactions in the plume. By comparing these results to both analytical and computational models of ion-neutral interactions, we discovered the importance of (1) accurately treating the differential cross-sections for momentum-exchange and charge-exchange collisions over a large range of neutral background pressures and (2) properly considering commonly overlooked interactions, such as ion-induced electron emission from nearby surfaces and neutral-neutral ionization collisions.
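As a back-of-envelope companion to such measurements, the fraction of beam ions that charge-exchange while crossing a uniform neutral background follows the usual attenuation law. The cross section and density below are rough illustrative magnitudes, not values from the experiment.

```python
import math

def cex_fraction(n_neutral_m3, sigma_m2, path_m):
    """Fraction of beam ions undergoing charge exchange along a path L
    through a uniform neutral background: 1 - exp(-n * sigma * L)."""
    return 1.0 - math.exp(-n_neutral_m3 * sigma_m2 * path_m)

# Illustrative magnitudes only: a xenon CEX cross section of order
# 5e-19 m^2 near keV energies and a laboratory-scale neutral density.
f = cex_fraction(n_neutral_m3=1e18, sigma_m2=5e-19, path_m=0.5)
assert 0.0 < f < 1.0
```

This single-collision attenuation estimate is also why facility background pressure matters in the ion-thruster ground tests described elsewhere in this collection: raising the neutral density proportionally raises n·sigma·L and hence the charge-exchange fraction.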
A high-quality annotated transcriptome of swine peripheral blood
USDA-ARS?s Scientific Manuscript database
Background: High throughput gene expression profiling assays of peripheral blood are widely used in biomedicine, as well as in animal genetics and physiology research. Accurate, comprehensive, and precise interpretation of such high throughput assays relies on well-characterized reference genomes an...
40 CFR 51.363 - Quality assurance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... test, the evaporative system tests, and emission control component checks (as applicable); (vi...) A check of the Constant Volume Sampler flow calibration; (5) A check for the optimization of the... selection, and power absorption; (9) A check of the system's ability to accurately detect background...
WATER QUALITY CRITERIA DOCUMENTS
Background
Water quality standards and criteria are the foundation for a wide range of programs under the Clean Water Act. Specifically, section 304(a)(1) of the Clean Water Act requires EPA to develop criteria for water quality that accurately re...
NASA Astrophysics Data System (ADS)
Dimitroulopoulou, C.; Ashmore, M. R.; Terry, A. C.
2017-02-01
Health effects of air pollution on individuals depend on their personal exposure, but few modelling tools are available which can predict how the distribution of personal exposures within a city will change in response to policies to reduce emissions both indoors and outdoors. We describe a new probabilistic modelling framework (INDAIR-2/EXPAIR), which provides predictions of the personal exposure frequency distribution (PEFD) across a city to assess the effects of both reduced emissions from home sources and reduced roadside concentrations on population exposure. The model uses a national time activity database, which gives the percentage of each population group in different residential and non-residential micro-environments, and links this, for the home, to predictions of concentrations from a three-compartment model, and for non-residential microenvironments to empirical indoor/outdoor ratios. This paper presents modelled PEFDs for NO2 in the city of Leicester, for children, the elderly, and office workers, comparing results in different seasons and on different days of the week. While the mean NO2 population exposure was close to, or below, the urban background concentration, the 95th percentile of the PEFD was well above it. The relationship between both the mean and 95th percentile of the PEFD and urban background concentrations was strongly influenced by air exchange rate. The 24 h mean PEFD showed relatively small differences between the population groups, with both removal of home sources and reductions of roadside concentrations on roads with a high traffic density having similar effects in reducing mean exposure. In contrast, the 1 h maximum of the PEFD was significantly higher for children and the elderly than for office workers, and showed a much greater response to reduced home emissions in these groups.
The results demonstrate the importance of understanding the dynamics of NO2 exposure at a population level within different groups, if the benefits of policy interventions are to be accurately assessed.
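The core bookkeeping behind such exposure models is a time-weighted average of concentrations across microenvironments. A minimal sketch of that calculation, with illustrative time fractions and concentrations (not INDAIR-2/EXPAIR outputs):

```python
# Time-weighted microenvironmental exposure: sum of (time fraction x
# concentration) over microenvironments. Values below are illustrative.

def time_weighted_exposure(time_fractions, concentrations):
    """Personal exposure as a time-weighted average across
    microenvironments; time fractions must sum to 1."""
    if abs(sum(time_fractions) - 1.0) > 1e-9:
        raise ValueError("time fractions must sum to 1")
    return sum(f * c for f, c in zip(time_fractions, concentrations))

# Illustrative NO2 case: 60% of time at home (15 ug/m3), 30% in an
# office (20 ug/m3), 10% commuting near traffic (60 ug/m3).
exposure = time_weighted_exposure([0.6, 0.3, 0.1], [15.0, 20.0, 60.0])
# -> 21.0 ug/m3
```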
Ondigo, Bartholomew N; Park, Gregory S; Gose, Severin O; Ho, Benjamin M; Ochola, Lyticia A; Ayodo, George O; Ofulla, Ayub V; John, Chandy C
2012-12-21
Multiplex cytometric bead assays (CBA) have a number of advantages over ELISA for antibody testing, but little information is available on standardization and validation of antibody CBA to multiple Plasmodium falciparum antigens. The present study was designed to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs. multiplex testing, plasma dilution, optimal buffer, and number of beads required were assessed for CBA testing, and results from CBA vs. ELISA testing were compared. Optimal amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens for CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinylalcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria endemic areas and less background reactivity for blank samples than ELISA. With optimization, CBA may be the preferred method of testing for antibodies to P.
falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA.
ERIC Educational Resources Information Center
Garcia-Rea, Elizabeth A.; LePage, James P.
2010-01-01
With the high number of homeless, there is a critical need for rapid and accurate assessment of quality of life to assess program outcomes. The World Health Organization's WHOQOL-100 has demonstrated promise in accurately assessing quality-of-life in this population. However, its length may make large scale use impractical for working with a…
NASA Astrophysics Data System (ADS)
Bezur, L.; Marshall, J.; Ottaway, J. M.
A square-wave wavelength modulation system, based on a rotating quartz chopper with four quadrants of different thicknesses, has been developed and evaluated as a method for automatic background correction in carbon furnace atomic emission spectrometry. Accurate background correction is achieved for the residual black body radiation (Rayleigh scatter) from the tube wall and Mie scatter from particles generated by a sample matrix and formed by condensation of atoms in the optical path. Intensity modulation caused by overlap at the edges of the quartz plates and by the divergence of the optical beam at the position of the modulation chopper has been investigated and is likely to be small.
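The background-correction principle described above amounts to subtracting an off-line estimate of the background from the on-line signal. A minimal sketch, assuming a locally linear background; the function name and intensity values are illustrative, not the authors' instrument readings:

```python
# Two-point background correction as used in wavelength-modulation
# emission spectrometry: the background under the analyte line is
# estimated from intensities measured slightly off-line on either side
# and subtracted. Numbers are illustrative.

def background_corrected(line_intensity, off_left, off_right):
    """Subtract the mean off-line intensity (assumed linear
    background) from the on-line intensity."""
    return line_intensity - 0.5 * (off_left + off_right)

signal = background_corrected(1200.0, 300.0, 340.0)  # -> 880.0
```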
Environmental contamination due to shale gas development.
Annevelink, M P J A; Meesters, J A J; Hendriks, A J
2016-04-15
Shale gas development potentially contaminates both air and water compartments. To assist governmental decision-making on future explorations, we reviewed scattered information on activities, emissions and concentrations related to shale gas development. We compared concentrations from monitoring programmes to quality standards as a first indication of environmental risks. Emissions could not be estimated accurately because of incomparable and insufficient data. Air and water concentrations range widely. Poor wastewater treatment posed the highest risk, with concentrations exceeding both Natural Background Values (NBVs) by a factor of 1000-10,000 and Lowest Quality Standards (LQSs) by a factor of 10-100. Concentrations of salts, metals, volatile organic compounds (VOCs) and hydrocarbons exceeded aquatic ecotoxicological water standards. Future research must focus on measuring aerial and aquatic emissions of toxic chemicals, generalisation of experimental setups and measurement techniques, and further human and ecological risk assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
Remotely Sensed Information and Field Data are both Essential to Assess Biodiversity CONDITION!
NASA Astrophysics Data System (ADS)
Sparrow, B.; Schaefer, M.; Scarth, P.; Phinn, S. R.; Christensen, R.; Lowe, A. J.; O'Neill, S.; Thurgate, N.; Wundke, D.
2015-12-01
Over the past year the TERN Ausplots facility has hosted a process to define biodiversity condition in an Australian continental context, and has conducted a wide collaborative process to determine which environmental attributes must be measured to accurately inform on biodiversity condition. A major output from this work was the acknowledgement that good quality data from both remotely sensed sources and good quality field-collected data are essential to provide the best possible information on biodiversity condition. This poster details some background to the project, the assessment of which attributes to measure, and whether they are sourced primarily from field-based or remotely sensed measures. It then provides three examples of ways in which the combination of data types produces a superior output, one example for each of the three cornerstone areas of condition: structure, function and composition.
Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats
2005-01-01
Background Considering the width and importance of using Multicellular Tumor Spheroids (MTS) in oncology research, size determination of MTSs by an accurate and fast method is essential. In the present study an effective, fast and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested in MTSs of three different cell lines. Frozen-section autoradiography and Hematoxylin and Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time-efficient, to be more precise than the traditional methods, and to be applicable to MTSs of different cell lines. Furthermore, the results of image analysis showed high correspondence to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948
Sediment Transport from Urban, Urbanizing, and Rural Areas in Johnson County, Kansas, 2006-08
Lee, Casey J.
2013-01-01
1. Studies have commonly illustrated that erosion and sediment transport from construction sites is extensive, typically 10-100 times background levels. 2. However, to our knowledge, the effects of construction and urbanization have rarely been assessed (1) since erosion and sediment controls have been required at construction sites, and (2) at watershed (5-65 mi2) scales, primarily because of the difficulty of characterizing sediment loads in small basins. Studies (such as that of Trimble, 1999) illustrate how large changes in surface erosion may not result in substantive changes in downstream sediment loads, because of sediment deposition on land surfaces, on floodplains, and in stream channels. 3. Improved technology (in-situ turbidity sensors) provides an independent surrogate of sediment concentration that is more accurate at estimating sediment concentrations and loads than instantaneous streamflow.
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years, due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems, in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability, may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work is due to the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
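Kalman-filter bias elimination of the kind referred to above is often implemented as a scalar filter tracking the observed-minus-forecast difference. A minimal sketch under that assumption; the noise variances and data are illustrative, not the authors' configuration:

```python
# Scalar Kalman filter for forecast bias correction. The hidden state
# is the slowly varying systematic bias, observed through a series of
# observed-minus-forecast differences (innovations). The process and
# observation noise variances q and r are illustrative assumptions.

def kalman_bias_estimates(innovations, q=0.01, r=1.0):
    """Track a random-walk forecast bias from innovation values."""
    x, p = 0.0, 1.0          # initial bias estimate and its variance
    estimates = []
    for y in innovations:
        p = p + q            # predict: bias assumed to random-walk
        k = p / (p + r)      # Kalman gain
        x = x + k * (y - x)  # update with the new innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# With a constant true bias of 2.0, the estimate converges toward 2.0.
est = kalman_bias_estimates([2.0] * 50)
```

The corrected forecast is then the raw model output plus the current bias estimate, which removes the systematic error while leaving the random component to the model.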
Role of Molecular Dynamics and Related Methods in Drug Discovery.
De Vivo, Marco; Masetti, Matteo; Bottegoni, Giovanni; Cavalli, Andrea
2016-05-12
Molecular dynamics (MD) and related methods are close to becoming routine computational tools for drug discovery. Their main advantage is in explicitly treating structural flexibility and entropic effects. This allows a more accurate estimate of the thermodynamics and kinetics associated with drug-target recognition and binding, and better algorithms and hardware architectures continue to increase their use. Here, we review the theoretical background of MD and enhanced sampling methods, focusing on free-energy perturbation, metadynamics, steered MD, and other methods most consistently used to study drug-target binding. We discuss unbiased MD simulations that nowadays allow the observation of unsupervised ligand-target binding, assessing how these approaches help optimize target affinity and drug residence time toward improved drug efficacy. Further issues discussed include allosteric modulation and the role of water molecules in ligand binding and optimization. We conclude by calling for more prospective studies to attest to these methods' utility in discovering novel drug candidates.
Building and Activating Students' Background Knowledge: It's What They Already Know That Counts
ERIC Educational Resources Information Center
Fisher, Douglas; Frey, Nancy; Lapp, Diane
2012-01-01
Students enter the middle grades with varying amounts of background knowledge. Teachers must assess student background knowledge for gaps or misconceptions and then provide instruction to build on that base. This article discusses effective strategies for assessing and developing students' background knowledge so they can become independent…
Fortier, Véronique; Levesque, Ives R
2018-06-01
Phase processing impacts the accuracy of quantitative susceptibility mapping (QSM). Techniques for phase unwrapping and background removal have been proposed and demonstrated mostly in brain. In this work, phase processing was evaluated in the context of large susceptibility variations (Δχ) and negligible signal, in particular for susceptibility estimation using the iterative phase replacement (IPR) algorithm. Continuous Laplacian, region-growing, and quality-guided unwrapping were evaluated. For background removal, Laplacian boundary value (LBV), projection onto dipole fields (PDF), sophisticated harmonic artifact reduction for phase data (SHARP), variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP), regularization enabled sophisticated harmonic artifact reduction for phase data (RESHARP), and 3D quadratic polynomial field removal were studied. Each algorithm was quantitatively evaluated in simulation and qualitatively in vivo. Additionally, IPR-QSM maps were produced to evaluate the impact of phase processing on the susceptibility in the context of large Δχ with negligible signal. Quality-guided unwrapping was the most accurate technique, whereas continuous Laplacian performed poorly in this context. All background removal algorithms tested resulted in important phase inaccuracies, suggesting that techniques used for brain do not translate well to situations where large Δχ and no or low signal are expected. LBV produced the smallest errors, followed closely by PDF. Results suggest that quality-guided unwrapping should be preferred, with PDF or LBV for background removal, for QSM in regions with large Δχ and negligible signal. This reduces the susceptibility inaccuracy introduced by phase processing. Accurate background removal remains an open question. Magn Reson Med 79:3103-3113, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
Bowyer, A E; Hillarp, A; Ezban, M; Persson, P; Kitchen, S
2016-07-01
Essentials Validated assays are required to precisely measure factor IX (FIX) activity in FIX products. N9-GP and two other FIX products were assessed in various coagulation assay systems at two sites. Large variations in FIX activity measurements were observed for N9-GP using some assays. One-stage and chromogenic assays accurately measuring FIX activity for N9-GP were identified. Background Measurement of factor IX activity (FIX:C) with activated partial thromboplastin time-based one-stage clotting assays is associated with a large degree of interlaboratory variation in samples containing glycoPEGylated recombinant FIX (rFIX), i.e. nonacog beta pegol (N9-GP). Validation and qualification of specific assays and conditions are necessary for the accurate assessment of FIX:C in samples containing N9-GP. Objectives To assess the accuracy of various one-stage clotting and chromogenic assays for measuring FIX:C in samples containing N9-GP as compared with samples containing rFIX or plasma-derived FIX (pdFIX) across two laboratory sites. Methods FIX:C, in severe hemophilia B plasma spiked with a range of concentrations (from very low, i.e. 0.03 IU mL-1, to high, i.e. 0.90 IU mL-1) of N9-GP, rFIX (BeneFIX), and pdFIX (Mononine), was determined at two laboratory sites with 10 commercially available one-stage clotting assays and two chromogenic FIX:C assays. Assays were performed with a plasma calibrator and different analyzers. Results A high degree of variation in FIX:C measurement was observed for one-stage clotting assays for N9-GP as compared with rFIX or pdFIX. Acceptable N9-GP recovery was observed in the low-concentration to high-concentration samples tested with one-stage clotting assays using SynthAFax or DG Synth, or with chromogenic FIX:C assays. Similar patterns of FIX:C measurement were observed at both laboratory sites, with minor differences probably being attributable to the use of different analyzers.
Conclusions These results suggest that, of the reagents tested, FIX:C in N9-GP-containing plasma samples can be most accurately measured with one-stage clotting assays using SynthAFax or DG Synth, or with chromogenic FIX:C assays. © 2016 International Society on Thrombosis and Haemostasis.
Wan, Xiaomin; Peng, Liubao; Li, Yuanjian
2015-01-01
Background In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, 1) the least squares method and 2) the graphical method; and two recently proposed methods, by 3) Hoyle and Henley and 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. Methods A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. Results All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more bias was identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate estimate of uncertainty than the Hoyle and Henley method. Conclusions The traditional methods should not be preferred because of their considerable overestimation. When the Weibull distribution was used for the fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased than the Hoyle and Henley method. PMID:25803659
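For context, once a parametric survival model has been fitted by any of the methods compared above, mean survival follows directly from the fitted parameters. A sketch for the Weibull case, with illustrative parameter values:

```python
# Mean survival time of a fitted Weibull(shape k, scale lam) model:
# mean = lam * Gamma(1 + 1/k). Parameter values are illustrative.
import math

def weibull_mean_survival(shape, scale):
    """Mean of a Weibull distribution with the given shape and scale."""
    return scale * math.gamma(1.0 + 1.0 / shape)

# shape = 1 reduces to the exponential distribution, mean = scale
m = weibull_mean_survival(1.0, 24.0)  # -> 24.0
```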
NASA Astrophysics Data System (ADS)
Yadav, Deepti; Arora, M. K.; Tiwari, K. C.; Ghosh, J. K.
2016-04-01
Hyperspectral imaging is a powerful tool in the field of remote sensing and has been used for many applications such as mineral detection, detection of landmines, and target detection. Major issues in target detection using HSI are spectral variability, noise, small target size, huge data dimensions, high computation cost, and complex backgrounds. Many of the popular detection algorithms do not work for difficult targets, such as small or camouflaged objects, and may result in high false-alarm rates. Thus, target/background discrimination is a key issue, and analyzing a target's behaviour in realistic environments is therefore crucial for the accurate interpretation of hyperspectral imagery. A limitation of using standard spectral libraries to study a target's spectral behaviour is that the library targets were measured under environmental conditions different from those of the application. This study instead uses spectral data of the same targets collected at the time of HSI image acquisition. This paper analyzes target spectra so that each target can be spectrally distinguished from a mixture of spectral data. An artificial neural network (ANN) has been used to identify the spectral ranges for reducing the data, and its efficacy in improving target detection is verified. The ANN results propose discriminating band ranges for the targets; these ranges were then used to perform target detection with four popular spectral-matching detection algorithms. The results of the algorithms were analyzed using ROC curves to evaluate the effectiveness of the ranges suggested by the ANN, relative to the full spectrum, for detection of the desired targets. In addition, a comparative assessment of the algorithms is also performed using ROC curves.
Dual-wavelength digital holographic imaging with phase background subtraction
NASA Astrophysics Data System (ADS)
Khmaladze, Alexander; Matz, Rebecca L.; Jasensky, Joshua; Seeley, Emily; Holl, Mark M. Banaszak; Chen, Zhan
2012-05-01
Three-dimensional digital holographic microscopic phase imaging of objects that are thicker than the wavelength of the imaging light is ambiguous and results in phase wrapping. In recent years, several unwrapping methods that employ two or more wavelengths were introduced. These methods compare the phase information obtained from each of the wavelengths and extend the range of unambiguous height measurements. A straightforward dual-wavelength phase imaging method is presented which allows for a flexible tradeoff between the maximum height of the sample and the amount of noise the method can tolerate. For highly accurate phase measurements, phase unwrapping of objects with heights greater than the beat (synthetic) wavelength (i.e. the product of the two wavelengths divided by their difference) can be achieved. Consequently, three-dimensional measurements of a wide variety of biological systems and microstructures become technically feasible. Additionally, an effective method of removing phase background curvature based on slowly varying polynomial fitting is proposed. This method allows accurate volume measurements of several small objects within the same image frame.
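The beat (synthetic) wavelength defined above, the product of the two wavelengths divided by their difference, can be computed directly; the wavelengths below are illustrative choices, not the authors':

```python
# Synthetic (beat) wavelength for dual-wavelength phase imaging:
# lam1 * lam2 / |lam1 - lam2|. It sets the unambiguous height range.
# The two wavelengths below are illustrative.

def synthetic_wavelength(lam1, lam2):
    """Beat wavelength used for dual-wavelength phase unwrapping."""
    return lam1 * lam2 / abs(lam1 - lam2)

# e.g. 633 nm and 532 nm give a beat wavelength of ~3334 nm (~3.33 um),
# far longer than either source wavelength.
beat_nm = synthetic_wavelength(633.0, 532.0)
```

The closer the two wavelengths, the longer the beat wavelength and the taller the samples that can be measured without wrapping, at the cost of amplified phase noise, which is the tradeoff the abstract refers to.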
Interstellar cyanogen and the temperature of the cosmic microwave background radiation
NASA Technical Reports Server (NTRS)
Roth, Katherine C.; Meyer, David M.; Hawkins, Isabel
1993-01-01
We present the results of a recently completed effort to determine the amount of CN rotational excitation in five diffuse interstellar clouds for the purpose of accurately measuring the temperature of the cosmic microwave background radiation (CMBR). In addition, we report a new detection of emission from the strongest hyperfine component of the 2.64 mm CN rotational transition (N = 1-0) in the direction toward HD 21483. We have used this result in combination with existing emission measurements toward our other stars to correct for local excitation effects within diffuse clouds which raise the measured CN rotational temperature above that of the CMBR. After making this correction, we find a weighted mean value of T(CMBR) = 2.729 (+0.023, -0.031) K. This temperature is in excellent agreement with the new COBE measurement of 2.726 +/- 0.010 K (Mather et al., 1993). Our result, which samples the CMBR far from the near-Earth environment, attests to the accuracy of the COBE measurement and reaffirms the cosmic nature of this background radiation. From the observed agreement between our CMBR temperature and the COBE result, we conclude that corrections for local CN excitation based on millimeter emission measurements provide an accurate adjustment to the measured rotational excitation.
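A weighted mean of the kind quoted above is conventionally an inverse-variance weighting of the individual measurements. A sketch with illustrative values (not the paper's per-star data):

```python
# Inverse-variance weighted mean: each measurement is weighted by
# 1/sigma^2, and the combined standard error is 1/sqrt(sum of weights).
# The temperatures and uncertainties below are illustrative.

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard error."""
    weights = [1.0 / s ** 2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    return mean, wsum ** -0.5

mean, err = weighted_mean([2.74, 2.72, 2.73], [0.05, 0.03, 0.04])
```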
Instrumental background in balloon-borne gamma-ray spectrometers and techniques for its reduction
NASA Technical Reports Server (NTRS)
Gehrels, N.
1985-01-01
Calculations of the instrumental background in balloon-borne gamma-ray spectrometers are presented. The calculations are based on newly available interaction cross sections and new analytic techniques, and are the most detailed and accurate published to date. Results compare well with measurements made in the 20 keV to 10 MeV energy range by the Goddard Low Energy Gamma-ray Spectrometer (LEGS). The principal components of the continuum background in spectrometers with Ge detectors and thick active shields are: (1) elastic scattering of atmospheric neutrons on the Ge nuclei; (2) aperture flux of atmospheric and cosmic gamma rays; (3) beta decays of unstable nuclides produced by nuclear interactions of atmospheric protons and neutrons with Ge nuclei; and (4) shield leakage of atmospheric gamma rays. The improved understanding of these components leads to several recommended techniques for reducing the background.
Miller, Thomas Martin; de Wet, Wouter C.; Patton, Bruce W.
2015-10-28
In this study, a computational assessment of the variation in terrestrial neutron and photon background from extraterrestrial sources is presented. The motivation of this assessment is to evaluate the practicality of developing a tool or database to estimate background in real time (or near-real time) during an experimental measurement, or even to predict the background for future measurements. The extraterrestrial source focused on during this assessment is naturally occurring galactic cosmic rays (GCRs). The MCNP6 transport code was used to perform the computational assessment. However, the GCR source available in MCNP6 was not used. Rather, models developed and maintained by NASA were used to generate the GCR sources. The largest variation in both neutron and photon background spectra was found to be caused by changes in elevation on Earth's surface, which can be as large as an order of magnitude. All other perturbations produced background variations on the order of a factor of 3 or less. The most interesting finding was that ~80% and 50% of terrestrial background neutrons and photons, respectively, are generated by interactions in Earth's surface and other naturally occurring and man-made objects near a detector of particles from extraterrestrial sources and their progeny created in Earth's atmosphere. In conclusion, this assessment shows that it will be difficult to estimate the terrestrial background from extraterrestrial sources without a good understanding of a detector's surroundings. Therefore, estimating or predicting background in a measurement environment such as a mobile random search will be difficult.
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across the experimental conditions. Methods We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial-dilution qPCR experiments is used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
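The double delta CT model mentioned above quantifies relative expression as 2^-ddCT, where ddCT compares the target-vs-reference CT difference between the treated and control groups. A minimal sketch of that calculation (not the pcr package's API, and with illustrative CT values):

```python
# Double delta CT: relative expression = 2**(-ddCT), with
# ddCT = (CT_target - CT_ref)_treated - (CT_target - CT_ref)_control.
# CT values below are illustrative.

def double_delta_ct(ct_target_treated, ct_ref_treated,
                    ct_target_control, ct_ref_control):
    """Relative expression of a target gene by the 2^-ddCT method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# One cycle less (relative to the reference gene) in the treated group
# corresponds to a two-fold increase in expression.
fold = double_delta_ct(24.0, 18.0, 25.0, 18.0)  # -> 2.0
```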
Nguyen, Dat Tien; Park, Kang Ryoung
2016-07-21
With higher demand from users, surveillance systems are currently being designed to provide more information about the observed scene, such as the appearance of objects, types of objects, and other information extracted from detected objects. Although the recognition of gender of an observed human can be easily performed using human perception, it remains a difficult task when using computer vision system images. In this paper, we propose a new human gender recognition method that can be applied to surveillance systems based on quality assessment of human areas in visible light and thermal camera images. Our research is novel in the following two ways: First, we utilize the combination of visible light and thermal images of the human body for a recognition task based on quality assessment. We propose a quality measurement method to assess the quality of image regions so as to remove the effects of background regions in the recognition system. Second, by combining the features extracted using the histogram of oriented gradient (HOG) method and the measured qualities of image regions, we form a new image feature, called the weighted HOG (wHOG), which is used for efficient gender recognition. Experimental results show that our method produces more accurate estimation results than the state-of-the-art recognition method that uses human body images.
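The quality-weighting idea can be sketched as scaling each region's descriptor by its quality score before concatenation, so that background-dominated regions contribute little. This is an illustrative stand-in for the combination step, not the authors' exact wHOG formulation:

```python
# Quality-weighted descriptor: each per-region feature vector (e.g. a
# HOG histogram) is scaled by a quality score in [0, 1] and the scaled
# vectors are concatenated. Features and scores are illustrative.
import numpy as np

def weighted_descriptor(region_features, quality_scores):
    """Concatenate per-region feature vectors, each scaled by its
    quality score; low-quality (background-heavy) regions are
    down-weighted."""
    return np.concatenate([q * f for f, q in
                           zip(region_features, quality_scores)])

regions = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
w = weighted_descriptor(regions, [1.0, 0.5])  # -> [1.0, 2.0, 1.5, 2.0]
```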
Kolandaivelu, Aravindan; Zviman, Menekhem M.; Castro, Valeria; Lardo, Albert C.; Berger, Ronald D.; Halperin, Henry R.
2010-01-01
Background Failure to achieve properly localized, permanent tissue destruction is a common cause of arrhythmia recurrence after cardiac ablation. Current methods of assessing lesion size and location during cardiac radiofrequency ablation are unreliable or not suited for repeated assessment during the procedure. MRI thermography could be used to delineate permanent ablation lesions because tissue heating above 50°C is the cause of permanent tissue destruction during radiofrequency ablation. However, image artifacts caused by cardiac motion, the ablation electrode, and radiofrequency ablation currently pose a challenge to MRI thermography in the heart. In the current study, we sought to demonstrate the feasibility of MRI thermography during cardiac ablation. Methods and Results An MRI-compatible electrophysiology catheter and filtered radiofrequency ablation system was used to perform ablation in the left ventricle of 6 mongrel dogs in a 1.5-T MRI system. Fast gradient-echo imaging was performed before and during radiofrequency ablation, and thermography images were derived from the preheating and postheating images. Lesion extent by thermography was within 20% of the gross pathology lesion. Conclusions MR thermography appears to be a promising technique for monitoring lesion formation and may allow for more accurate placement and titration of ablation, possibly reducing arrhythmia recurrences. PMID:20657028
Raj, S.; Sharma, V. L.; Singh, A. J.; Goel, S.
2016-01-01
Background. Health information available on websites should be reliable and accurate so that the community can make informed decisions. This study was done to assess the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words “Health” and “Information” were used on the search engines “Google” and “Yahoo.” Out of 50 websites (25 from each search engine), after exclusion, 32 websites were evaluated. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n = 13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n = 16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on the LIDA score. Only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites had average quality, especially in terms of usability and reliability, and were written at high readability levels. Efforts are needed to develop health information websites which can help the general population in informed decision making. PMID:27119025
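The readability formulas named above are standard and can be computed directly from word, sentence, and syllable counts. A minimal sketch (the counts used below are hypothetical):

```python
import math

def flesch_reading_ease(words, sentences, syllables):
    """FRES: higher scores mean easier text (60-70 ~ plain English)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def fk_grade_level(words, sentences, syllables):
    """FKGL: approximate US school grade needed to understand the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog_index(polysyllables, sentences):
    """SMOG: grade estimate from words of 3+ syllables in a sample of sentences."""
    return 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

# Hypothetical counts for a short health-information page:
score = flesch_reading_ease(words=1200, sentences=80, syllables=1900)
grade = fk_grade_level(words=1200, sentences=80, syllables=1900)
```

A FKGL well above 6 would fail the sixth-grade recommendation cited in the study.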
Bignulin, Sara; Falleti, Edmondo; Cmet, Sara; Cappello, Dario; Cussigh, Annarosa; Lenisa, Ilaria; Dissegna, Denis; Pugliese, Fabio; Vivarelli, Cinzia; Fabris, Carlo; Fabris, Carlo; Toniutto, Pierluigi
2016-01-01
Background and rationale. Acoustic radiation force impulse (ARFI) is a non-invasive tool used in the evaluation of liver fibrosis in HCV-positive immune-competent patients. This study aimed to assess the accuracy of ARFI in discriminating liver transplanted patients with different graft fibrosis severity and to verify whether ARFI, eventually combined with non-invasive biochemical tests, could spare liver biopsies. This prospective study included 51 HCV-positive liver transplanted patients who consecutively underwent annual liver biopsy concomitantly with ARFI and the blood chemistry measurements needed to calculate several non-invasive liver fibrosis tests. Overall, ARFI showed an AUC of 0.885 in discriminating between patients without or with significant fibrosis (Ishak score 0-2 vs. 3-6). Using a cut-off of 1.365 m/s, ARFI possesses a negative predictive value of 100% in identifying patients without significant fibrosis. The AUC for Fibrotest was 0.848 in discriminating patients with Ishak fibrosis score 0-2 vs. 3-6. The combined assessment of ARFI and Fibrotest did not improve the results obtained by ARFI alone. ARFI measurement in HCV-positive liver transplanted patients can be considered an easy and accurate non-invasive tool for identifying patients with a benign course of HCV recurrence.
Little Association Between Wellness Policies and School-Reported Nutrition Practices
Lucarelli, Jennifer F.; Alaimo, Katherine; Belansky, Elaine S.; Mang, Ellen; Miles, Richard; Kelleher, Deanne K.; Bailey, Deborah; Drzal, Nicholas B.; Liu, Hui
2017-01-01
Background The Child Nutrition and WIC Reauthorization Act of 2004 mandated written school wellness policies. Little evidence exists to evaluate the impact of such policies. This study assessed the quality (comprehensiveness of topics addressed and strength of wording) of wellness policies and the agreement between written district-level policies and school-reported nutrition policies and practices in 48 low-income Michigan school districts participating in the School Nutrition Advances Kids study. Method Written wellness policy quality was assessed using the School Wellness Policy Evaluation Tool. School nutrition policies and practices were assessed using the School Environment and Policy Survey. Analysis of variance determined differences in policy quality, and Fisher’s exact test examined agreement between written policies and school-reported practices. Results Written wellness policies contained ambiguous language and addressed few practices, indicating low comprehensiveness and strength. Most districts adopted model wellness policy templates without modification, and the template used was the primary determinant of policy quality. Written wellness policies often did not reflect school-reported nutrition policies and practices. Conclusions School health advocates should avoid assumptions that written wellness policies accurately reflect school practices. Encouraging policy template customization and stronger, more specific language may enhance wellness policy quality, ensure consistency between policy and practice, and enhance implementation of school nutrition initiatives. PMID:25249567
Towards the Application of Fuzzy Logic for Developing a Novel Indoor Air Quality Index (FIAQI)
JAVID, Allahbakhsh; HAMEDIAN, Amir Abbas; GHARIBI, Hamed; SOWLAT, Mohammad Hossein
2016-01-01
Background: In the past few decades, Indoor Air Pollution (IAP) has become a primary concern, to the point that it is increasingly believed to be of equal or greater importance to human health than ambient air pollution. However, due to the lack of comprehensive indices for the integrated assessment of indoor air quality (IAQ), we aimed to develop a novel Fuzzy-Based Indoor Air Quality Index (FIAQI) to bridge the existing gap in this area. Methods: We based our index on fuzzy logic, which enables us to overcome the limitations of traditional methods applied to develop environmental quality indices. Fifteen parameters, including the criteria air pollutants, volatile organic compounds, and bioaerosols, were included in the FIAQI, due mainly to their significant health effects. Weighting factors were assigned to the parameters based on the medical evidence available in the literature on their health effects. The final FIAQI consisted of 108 rules. In order to demonstrate the performance of the index, data were intentionally generated to cover a variety of quality levels. In addition, a sensitivity analysis was conducted to assess the validity of the index. Results: The FIAQI appears to be a comprehensive tool for classifying IAQ and produces accurate results. Conclusion: The index appears useful and reliable for authorities to use in assessing indoor air quality. PMID:27114985
Particle radiation near the orbit of the Vacuum Wake Shield
NASA Technical Reports Server (NTRS)
Bering, Edgar A., III; Ignatiev, Alex
1990-01-01
The particle populations that are expected to inflict the most damage on thin film materials grown on the vacuum Wake Shield Facility (WSF) are ions and energetic neutral atoms with energies in the range of 100 eV to 20 keV. The production of films that have an order of magnitude fewer defects than are now available requires that the 1-keV particle flux be kept lower than 1000 particles/(sq cm s sr keV) (assuming a reasonable spectral shape). WSF will be flown on orbits with an inclination of 28 deg at altitudes of 300-700 km. Because of the background counting rate produced by the approximately 100 MeV trapped protons in the inner belt, obtaining accurate measurements of the particles of interest is very difficult. The quiet-time background fluxes of the relevant particles are not presently known. At times of magnetic activity, fluxes of 0.1-17 keV O(+) ions as great as 10 million ions/(sq cm s sr keV) have been observed flowing out of the ionosphere at these latitudes. It appears that instrumentation for detailed assessment is essential for the proof-of-concept flight(s) and that real-time monitoring of low-energy ion and energetic neutral radiation will be required for the production flights.
Zhao, Gang; Tan, Wei; Hou, Jiajia; Qiu, Xiaodong; Ma, Weiguang; Li, Zhixin; Dong, Lei; Zhang, Lei; Yin, Wangbao; Xiao, Liantuan; Axner, Ove; Jia, Suotang
2016-01-25
A methodology for calibration-free wavelength modulation spectroscopy (CF-WMS) that is based upon an extensive empirical description of the wavelength-modulation frequency response (WMFR) of a DFB laser is presented. An assessment of the WMFR of a DFB laser by the use of an etalon confirms that it consists of two parts: a 1st harmonic component with an amplitude that is linear with the sweep and a nonlinear 2nd harmonic component with a constant amplitude. Simulations show that, among the various factors that affect the line shape of a background-subtracted peak-normalized 2f signal, such as concentration, phase shifts between intensity modulation and frequency modulation, and WMFR, only the last factor has a decisive impact. Based on this, and to avoid the impractical use of an etalon, a novel method to pre-determine the parameters of the WMFR by fitting to a background-subtracted peak-normalized 2f signal has been developed. The accuracy of the new scheme to determine the WMFR is demonstrated and compared with that of conventional methods in CF-WMS by detection of trace acetylene. The results show that the new method provides a four times smaller fitting error than the conventional methods and retrieves concentration more accurately.
Epidemiology of Occupational Accidents in Iran Based on Social Security Organization Database
Mehrdad, Ramin; Seifmanesh, Shahdokht; Chavoshi, Farzaneh; Aminian, Omid; Izadi, Nazanin
2014-01-01
Background: Today, occupational accidents are one of the most important problems in the industrial world. Due to the lack of appropriate systems for registration and reporting, there are no accurate statistics on occupational accidents worldwide, especially in developing countries. Objectives: The aim of this study is an epidemiological assessment of occupational accidents in Iran. Materials and Methods: Information on available occupational accidents in the Social Security Organization was extracted from accident reporting and registration forms. In this cross-sectional study, gender, age, economic activity, type of accident, and injured body part in 22158 registered accidents during 2008 were described. Results: The occupational accident rate was 253 per 100,000 workers in 2008. 98.2% of injured workers were men. The mean age of injured workers was 32.07 ± 9.12 years. The highest percentage belonged to the 25-34 years age group. In our study, most of the accidents occurred in the basic metals industry, electrical and non-electrical machines, and the construction industry. Falls from height and crush injuries were the most prevalent accidents. The upper and lower extremities were the most commonly injured body parts. Conclusion: Due to the high rate of accidents in the metal and construction industries, engineering controls, the use of appropriate protective equipment, and worker safety training seem necessary. PMID:24719699
NASA Astrophysics Data System (ADS)
Schweitzer, S.; Kirchengast, G.; Proschek, V.
2011-10-01
LEO-LEO infrared-laser occultation (LIO) is a new occultation technique between Low Earth Orbit (LEO) satellites, which applies signals in the short wave infrared spectral range (SWIR) within 2 μm to 2.5 μm. It is part of the LEO-LEO microwave and infrared-laser occultation (LMIO) method that enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and altitude levels from microwave signals and profiles of greenhouse gases and further variables such as line-of-sight wind speed from simultaneously measured LIO signals. Due to the novelty of the LMIO method, detailed knowledge of atmospheric influences on LIO signals and of their suitability for accurate trace species retrieval did not yet exist. Here we discuss these influences, assessing effects from refraction, trace species absorption, aerosol extinction and Rayleigh scattering in detail, and addressing clouds, turbulence, wind, scattered solar radiation and terrestrial thermal radiation as well. We show that the influence of refractive defocusing, foreign species absorption, aerosols and turbulence is observable, but can be rendered small to negligible by use of the differential transmission principle with a close frequency spacing of LIO absorption and reference signals within 0.5%. The influences of Rayleigh scattering and terrestrial thermal radiation are found negligible. Cloud-scattered solar radiation can be observable under bright-day conditions, but this influence can be made negligible by a close time spacing (within 5 ms) of interleaved laser-pulse and background signals. Cloud extinction loss generally blocks SWIR signals, except for very thin or sub-visible cirrus clouds, which can be addressed by retrieving a cloud layering profile and exploiting it in the trace species retrieval. Wind can have a small influence on the trace species absorption, which can be made negligible by using a simultaneously retrieved or a moderately accurate background wind speed profile.
We conclude that the set of SWIR channels proposed for implementing the LMIO method (Kirchengast and Schweitzer, 2011) provides adequate sensitivity to accurately retrieve eight trace species of key importance to climate and atmospheric chemistry (H2O, CO2, 13CO2, C18OO, CH4, N2O, O3, CO) in the upper troposphere/lower stratosphere region outside clouds under all atmospheric conditions. Two further species (HDO, H218O) can be retrieved in the upper troposphere.
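The differential transmission principle can be illustrated with Beer-Lambert attenuation: broadband losses (aerosol extinction, defocusing) affect the closely spaced absorption and reference channels nearly equally and cancel in the ratio, leaving only the trace-gas absorption. The optical depths below are hypothetical:

```python
import math

# Hypothetical optical depths along the occultation path
alpha_gas = 0.30        # trace-gas absorption, present only in the on-line channel
alpha_broadband = 0.85  # aerosol/defocusing loss, common to both channels

t_abs = math.exp(-(alpha_gas + alpha_broadband))  # on-line (absorption) channel
t_ref = math.exp(-alpha_broadband)                # nearby reference channel

# Ratioing the channels cancels the common broadband term (Beer-Lambert),
# so the retrieved optical depth equals the trace-gas absorption alone.
retrieved = -math.log(t_abs / t_ref)
```

This is only a sketch of the cancellation idea; the actual LMIO retrieval operates on measured signal profiles.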
Richard-Davis, Gloria; Whittemore, Brianna; Disher, Anthony; Rice, Valerie Montgomery; Lenin, Rathinasamy B; Dollins, Camille; Siegel, Eric R; Eswaran, Hari
2018-01-01
Objective: Increased mammographic breast density is a well-established risk factor for breast cancer development, regardless of age or ethnic background. The current gold standard for categorizing breast density consists of a radiologist estimation of percent density according to the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) criteria. This study compares paired qualitative interpretations of breast density on digital mammograms with quantitative measurement of density using Hologic’s Food and Drug Administration–approved R2 Quantra volumetric breast density assessment tool. Our goal was to find the best cutoff value of Quantra-calculated breast density for stratifying patients accurately into high-risk and low-risk breast density categories. Methods: Screening digital mammograms from 385 subjects, aged 18 to 64 years, were evaluated. These mammograms were interpreted by a radiologist using the ACR’s BI-RADS density method, and had quantitative density measured using the R2 Quantra breast density assessment tool. The appropriate cutoff for breast density–based risk stratification using Quantra software was calculated using manually determined BI-RADS scores as a gold standard, in which scores of D3/D4 denoted high-risk densities and D1/D2 denoted low-risk densities. Results: The best cutoff value for risk stratification using Quantra-calculated breast density was found to be 14.0%, yielding a sensitivity of 65%, specificity of 77%, and positive and negative predictive values of 75% and 69%, respectively. Under bootstrap analysis, the best cutoff value had a mean ± SD of 13.70% ± 0.89%. Conclusions: Our study is the first to publish on a North American population that assesses the accuracy of the R2 Quantra system at breast density stratification. 
Quantitative breast density measures will improve accuracy and reliability of density determination, assisting future researchers to accurately calculate breast cancer risks associated with density increase. PMID:29511356
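Cutoff selection of this kind can be sketched by scanning candidate thresholds against the BI-RADS truth labels and maximizing Youden's J (sensitivity + specificity − 1). The data below are synthetic and illustrative only, not the study's actual procedure or values:

```python
def confusion_at_cutoff(densities, high_risk, cutoff):
    """Sensitivity and specificity of 'density >= cutoff => high risk',
    scored against manually assigned truth labels (D3/D4 = high risk)."""
    tp = sum(d >= cutoff and h for d, h in zip(densities, high_risk))
    fn = sum(d < cutoff and h for d, h in zip(densities, high_risk))
    tn = sum(d < cutoff and not h for d, h in zip(densities, high_risk))
    fp = sum(d >= cutoff and not h for d, h in zip(densities, high_risk))
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(densities, high_risk, candidates):
    """Candidate cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    return max(candidates,
               key=lambda c: sum(confusion_at_cutoff(densities, high_risk, c)) - 1)

# Synthetic percent densities with hypothetical BI-RADS truth labels
densities = [5.0, 8.0, 10.0, 12.0, 16.0, 20.0, 25.0, 30.0]
high_risk = [False, False, False, False, True, True, True, True]
cutoff = best_cutoff(densities, high_risk, [10.0, 14.0, 18.0])
```

Bootstrapping the same selection over resampled patients, as the study did, would yield a distribution (mean ± SD) for the chosen cutoff.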
A Novel Electronic Data Collection System for Large-Scale Surveys of Neglected Tropical Diseases
King, Jonathan D.; Buolamwini, Joy; Cromwell, Elizabeth A.; Panfel, Andrew; Teferi, Tesfaye; Zerihun, Mulat; Melak, Berhanu; Watson, Jessica; Tadesse, Zerihun; Vienneau, Danielle; Ngondi, Jeremiah; Utzinger, Jürg; Odermatt, Peter; Emerson, Paul M.
2013-01-01
Background Large cross-sectional household surveys are common for measuring indicators of neglected tropical disease control programs. As an alternative to standard paper-based data collection, we utilized novel paperless technology to collect data electronically from over 12,000 households in Ethiopia. Methodology We conducted a needs assessment to design an Android-based electronic data collection and management system. We then evaluated the system by reporting results of a pilot trial and from comparisons of two large-scale surveys, one with traditional paper questionnaires and the other with tablet computers, including accuracy, person-time days, and costs incurred. Principal Findings The electronic data collection system met core functions in household surveys and overcame constraints identified in the needs assessment. Pilot data recorders took 264 sec (standard deviation (SD) 152 sec) and 260 sec (SD 122 sec) per person registered to complete household surveys using paper and tablets, respectively (P = 0.77). Data recorders felt a lack of connection with the interviewee during the first days using electronic devices, but preferred to collect data electronically in future surveys. Electronic data collection saved time by giving results immediately, obviating the need for double data entry and cross-correcting. The proportion of identified data entry errors in disease classification did not differ between the two data collection methods. Geographic coordinates collected using the tablets were more accurate than coordinates transcribed on a paper form. The cost of the equipment required for electronic data collection was approximately the same as the cost incurred for data entry of questionnaires, and repeated use of the electronic equipment may increase cost savings. Conclusions/Significance Conducting a needs assessment and pilot testing allowed the design to specifically match the functionality required for surveys.
Electronic data collection using an Android-based technology was suitable for a large-scale health survey, saved time, provided more accurate geo-coordinates, and was preferred by recorders over standard paper-based questionnaires. PMID:24066147
Measuring physical activity during pregnancy
2011-01-01
Background Currently, little is known about physical activity patterns in pregnancy, with prior estimates predominantly based on subjective assessment measures that are prone to error. Given the increasing obesity rates and the importance of physical activity in pregnancy, we evaluated the relationship and agreement between subjective and objective physical activity assessment tools to inform researchers and clinicians on the optimal assessment of physical activity in pregnancy. Methods 48 pregnant women between 26-28 weeks gestation were recruited. The Yamax pedometer and Actigraph accelerometer were worn for 5-7 days under free living conditions and thereafter the International Physical Activity Questionnaire (IPAQ) was completed. IPAQ and pedometer estimates of activity were compared to the more robust and accurate accelerometer data. Results Of the 48 women recruited, 30 completed the study (mean age: 33.6 ± 4.7 years; mean BMI: 31.2 ± 5.1 kg/m2) and 18 were excluded (failure to wear [n = 8] and incomplete data [n = 10]). The accelerometer and pedometer correlated significantly on estimation of daily steps (ρ = 0.69, p < 0.01) and had good absolute agreement with low systematic error (mean difference: 505 ± 1498 steps/day). Accelerometer and IPAQ estimates of total, light, and moderate Metabolic Equivalent minutes per day (MET·min·day⁻¹) were not significantly correlated and there was poor absolute agreement. Relative to the accelerometer, the IPAQ underpredicted daily total METs (105.76 ± 259.13 MET·min·day⁻¹) and light METs (255.55 ± 128.41 MET·min·day⁻¹) and overpredicted moderate METs (−112.25 ± 166.41 MET·min·day⁻¹). Conclusion Compared with the accelerometer, the pedometer appears to provide a reliable estimate of physical activity in pregnancy, whereas the subjective IPAQ measure performed less accurately in this setting. Future research measuring activity in pregnancy should optimally encompass objective measures of physical activity.
Trial Registration Australian New Zealand Clinical Trial Registry Number: ACTRN12608000233325. Registered 7/5/2008. PMID:21418609
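The agreement statistics used in this study, a rank correlation (ρ) and the mean paired difference as systematic error, can be sketched as follows. This minimal implementation omits tie handling and is illustrative only:

```python
def ranks(xs):
    """Ranks 1..n by value (no tie handling in this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    """Spearman rank correlation via 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

def mean_difference(x, y):
    """Systematic error between paired methods (Bland-Altman style)."""
    return sum(a - b for a, b in zip(x, y)) / len(x)
```

Applied to paired daily step counts from two devices, a high ρ with a small mean difference would indicate the good absolute agreement reported above.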
2012-01-01
Background In myocardial perfusion scintigraphy (MPS), typically a stress and a rest study is performed. If the stress study is considered normal, there is no need for a subsequent rest study. The aim of the study was to determine whether nuclear medicine technologists are able to assess the necessity of a rest study. Methods Gated MPS using a 2-day 99mTc protocol for 121 consecutive patients were studied. Visual interpretation by 3 physicians was used as gold standard for determining the need for a rest study based on the stress images. All nuclear medicine technologists performing MPS had to review 82 training cases of stress MPS images with comments regarding the need for rest studies, and thereafter a test consisting of 20 stress MPS images. After passing this test, the nuclear medicine technologist in charge of a stress MPS study assessed whether a rest study was needed or not or if he/she was uncertain and wanted to consult a physician. After that, the physician in charge interpreted the images and decided whether a rest study was required or not. Results The nuclear medicine technologists and the physicians in clinical routine agreed in 103 of the 107 cases (96%) for which the technologists felt certain regarding the need for a rest study. In the remaining 14 cases the technologists were uncertain, i.e. wanted to consult a physician. The agreement between the technologists and the physicians in clinical routine was very good, resulting in a kappa value of 0.92. There was no statistically significant difference in the evaluations made by technologists and physicians (P = 0.617). Conclusions The nuclear medicine technologists were able to accurately determine whether a rest study was necessary. There was very good agreement between nuclear medicine technologists and physicians in the assessment of the need for a rest study. If the technologists can make this decision, the effectiveness of the nuclear medicine department will improve. PMID:22947251
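The kappa statistic reported above (Cohen's kappa) corrects observed agreement between two raters for the agreement expected by chance. A minimal sketch for a square contingency table (the counts in the example are hypothetical, not the study's data):

```python
def cohen_kappa(table):
    """Cohen's kappa for a square agreement table.

    table[i][j] = number of cases rated category i by rater A
                  and category j by rater B.
    """
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement: product of each rater's marginal proportions
    p_chance = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical 2x2 table (rest study needed: yes/no) for two raters
kappa = cohen_kappa([[45, 5], [5, 45]])
```

Values above roughly 0.8 are conventionally described as very good agreement, consistent with the 0.92 reported here.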
Frizzelle, Brian G; Evenson, Kelly R; Rodriguez, Daniel A; Laraia, Barbara A
2009-01-01
Background Health researchers have increasingly adopted the use of geographic information systems (GIS) for analyzing environments in which people live and how those environments affect health. One aspect of this research that is often overlooked is the quality and detail of the road data and whether or not it is appropriate for the scale of analysis. Many readily available road datasets, both public domain and commercial, contain positional errors or generalizations that may not be compatible with highly accurate geospatial locations. This study examined the accuracy, completeness, and currency of four readily available public and commercial sources for road data (North Carolina Department of Transportation, StreetMap Pro, TIGER/Line 2000, TIGER/Line 2007) relative to a custom road dataset which we developed and used for comparison. Methods and Results A custom road network dataset was developed to examine associations between health behaviors and the environment among pregnant and postpartum women living in central North Carolina in the United States. Three analytical measures were developed to assess the comparative accuracy and utility of four publicly and commercially available road datasets and the custom dataset in relation to participants' residential locations over three time periods. The exclusion of road segments and positional errors in the four comparison road datasets resulted in between 5.9% and 64.4% of respondents lying farther than 15.24 meters from their nearest road, the distance of the threshold set by the project to facilitate spatial analysis. Agreement, using a Pearson's correlation coefficient, between the customized road dataset and the four comparison road datasets ranged from 0.01 to 0.82. Conclusion This study demonstrates the importance of examining available road datasets and assessing their completeness, accuracy, and currency for their particular study area. 
This paper serves as an example for assessing the feasibility of readily available commercial or public road datasets, and outlines the steps by which an improved custom dataset for a study area can be developed. PMID:19409088
Assessment of the foot and ankle in elite athletes.
Schon, Lew C
2009-06-01
An accurate assessment of the foot and ankle problem in elite athletes is the foundation of a treatment plan and prognosis. The special pressures of professional sports, where managers, agents, and lawyers may be involved, make a thorough assessment especially critical for sound decision-making. Evaluation includes taking a history of the acute and chronic condition, including mechanism, physical sensation at injury, compensatory stresses, and general medical review. The athlete is assessed physically in several different ways, including comprehensive focal examination and alignment in static and dynamic nonweight-bearing and weight-bearing modes. This comprehensive process is essential to accurate assessment.
Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe
2013-08-01
Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. Rapid progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would be beneficial to most of the calcium imaging research field. A method for estimating background-subtracted fluorescence transients that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Sexually Transmitted Diseases: A Selective, Annotated Bibliography.
ERIC Educational Resources Information Center
Planned Parenthood Federation of America, Inc., New York, NY. Education Dept.
This document contains a reference sheet and an annotated bibliography concerned with sexually transmitted diseases (STD). The reference sheet provides a brief, accurate overview of STDs which includes both statistical and background information. The bibliography contains 83 entries, listed alphabetically, that deal with STDs. Books and articles…
ERIC Educational Resources Information Center
Riedl, Richard
1986-01-01
Describes a student magazine publishing project in which the participating junior high school students accessed the information utility, CompuServe, to gather current and accurate background information for their magazine articles. Student use of CompuServe is described, and the value and costs of using CompuServe are discussed. (MBR)
Inter-observer variation in the diagnosis of neurologic abnormalities in the horse
USDA-ARS?s Scientific Manuscript database
Background – The diagnosis of EPM relies heavily on the clinical examination. The accurate identification of neurologic signs during a clinical examination is critical to the interpretation of laboratory results. Objective – To investigate the level of agreement between board-certified veterinary in...
Arora, Harendra; Martinelli, Susan M.
2017-01-01
Background: The Accreditation Council for Graduate Medical Education's Next Accreditation System requires residency programs to semiannually submit composite milestone data on each resident's performance. This report describes and evaluates a new assessment review procedure piloted in our departmental Clinical Competency Committee (CCC) semi-annual meeting in June 2016. Methods: A modified Delphi technique was utilized to develop key performance indicators (KPI) linking milestone descriptors to clinical practice. In addition, the CCC identified six specific milestone sub-competencies that would be prescored with objective data prior to the meeting. Each resident was independently placed on the milestones by 3 different CCC faculty members. Milestone placement data of the same cohort of 42 residents (Clinical Anesthesia Years 1–3) were collected to calculate inter-rater reliability of the assessment procedures before and after the implemented changes. A survey was administrated to collect CCC feedback on the new procedure. Results: The procedure assisted in reducing meeting time from 8 to 3.5 hours. Survey of the CCC members revealed positive perception of the procedure. Higher inter-rater reliability of the milestone placement was obtained using the implemented KPIs (Intraclass correlation coefficient [ICC] single measure range: before=.53–.94, after=.74–.98). Conclusion: We found the new assessment procedure beneficial to the efficiency and transparency of the assessment process. Further improvement of the procedure involves refinement of KPIs and additional faculty development on KPIs to allow non-CCC faculty to provide more accurate resident evaluations. PMID:29766033
A statistical assessment of population trends for data deficient Mexican amphibians
Quintero, Esther; Thessen, Anne E.; Arias-Caballero, Paulina; Ayala-Orozco, Bárbara
2014-01-01
Background. Mexico has the world's fifth largest amphibian fauna and the second highest number of threatened amphibian species. About 10% of Mexican amphibians lack enough data to be assigned to a risk category by the IUCN, so in this paper we test a statistical tool that, in the absence of specific demographic data, can assess a species' risk of extinction and population trend, and help identify which variables increase its vulnerability. Recent studies have demonstrated that the risk of species decline depends on both extrinsic and intrinsic traits; including both when assessing extinction risk should therefore yield a more accurate assessment of threats. Methods. We harvested data from the Encyclopedia of Life (EOL) and the published literature for Mexican amphibians, and used these data to assess the population trend of some of the Mexican species assigned to the Data Deficient category of the IUCN using Random Forests, a machine learning method that predicts complex processes and identifies the most important variables accounting for the predictions. Results. Our results show that most of the data deficient Mexican amphibians we examined have decreasing population trends. We found that Random Forests is a solid way to identify species with decreasing population trends when no demographic data are available. Moreover, we point to the most important variables that make species more vulnerable to extinction. This exercise is a valuable first step in assigning conservation priorities for poorly known species. PMID:25548736
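The Random Forests approach described above can be miniaturized to show its core idea: bootstrap resampling plus majority voting over weak learners. The sketch below bags decision stumps over made-up two-trait "species" data (hypothetical feature names and values; the authors used a full Random Forest implementation, not this toy):

```python
import random

def stump_fit(X, y):
    # exhaustive search for the (feature, threshold, sign) split
    # with the lowest misclassification count on this sample
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            for sign in (1, -1):
                pred = [1 if sign * (row[f] - t) > 0 else 0 for row in X]
                err = sum(p != yi for p, yi in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    return best[1:]

def stump_predict(model, row):
    f, t, sign = model
    return 1 if sign * (row[f] - t) > 0 else 0

def forest_fit(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    models = []
    n = len(X)
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        models.append(stump_fit([X[i] for i in idx], [y[i] for i in idx]))
    return models

def forest_predict(models, row):
    votes = sum(stump_predict(m, row) for m in models)
    return 1 if 2 * votes > len(models) else 0       # majority vote

# made-up traits: [range-restriction score, fecundity]; label 1 = declining
X = [[0.1, 5.0], [0.2, 4.0], [0.15, 4.5], [0.25, 5.5],
     [0.8, 1.0], [0.9, 2.0], [0.85, 1.5], [0.95, 0.5]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
models = forest_fit(X, y)
```

Real Random Forests additionally grow deep trees on random feature subsets and derive variable-importance scores from them, which is what lets the method point to the traits driving vulnerability.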
Validation study of a web-based assessment of functional recovery after radical prostatectomy
2010-01-01
Background Good clinical care of prostate cancer patients after radical prostatectomy depends on careful assessment of post-operative morbidities, yet physicians do not always judge patient symptoms accurately. Logistical problems associated with using paper questionnaires limit their use in the clinic. We have implemented a web interface ("STAR") for patient-reported outcomes after radical prostatectomy. Methods We analyzed data from the first 9 months of clinical implementation to evaluate the validity of the STAR questionnaire for assessing functional outcomes following radical prostatectomy. We assessed response rate, internal consistency within domains, and the association between survey responses and known predictors of sexual and urinary function, including age, time from surgery, nerve-sparing status and co-morbidities. Results Of 1581 men sent an invitation to complete the instrument online, 1235 responded, for a response rate of 78%. Cronbach's alpha was 0.84, 0.86 and 0.97 for bowel, urinary and sexual function, respectively. All known predictors of sexual and urinary function were significantly associated with survey responses in the hypothesized direction. Conclusions We found that web-based assessment of functional recovery after radical prostatectomy is practical and feasible. The instrument demonstrated excellent psychometric properties, suggesting that validity is maintained when questions are transferred from paper to electronic format and when patients give responses that they know will be seen by their doctor and added to their clinical record. As such, our system allows ready implementation of patient-reported outcomes into routine clinical practice. PMID:20687938
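The internal-consistency statistic used above, Cronbach's alpha, is computed directly from per-item scores: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A sketch with invented response data (not the STAR data):

```python
from statistics import variance

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item,
    all lists covering the same respondents in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent total
    item_var = sum(variance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# two perfectly consistent items -> alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

Items that move together inflate the variance of the totals relative to the summed item variances, which is what pushes alpha toward 1.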
NASA Astrophysics Data System (ADS)
Mazidi, Hesam; Nehorai, Arye; Lew, Matthew D.
2018-02-01
In single-molecule (SM) super-resolution microscopy, the complexity of a biological structure, high molecular density, and a low signal-to-background ratio (SBR) may lead to imaging artifacts without a robust localization algorithm. Moreover, engineered point spread functions (PSFs) for 3D imaging pose difficulties due to their intricate features. We develop a Robust Statistical Estimation algorithm, called RoSE, that enables joint estimation of the 3D location and photon counts of SMs accurately and precisely using various PSFs under conditions of high molecular density and low SBR.
Predictive Monitoring for Improved Management of Glucose Levels
Reifman, Jaques; Rajaraman, Srinivasan; Gribok, Andrei; Ward, W. Kenneth
2007-01-01
Background Recent developments and expected near-future improvements in continuous glucose monitoring (CGM) devices provide opportunities to couple them with mathematical forecasting models to produce predictive monitoring systems for early, proactive glycemia management of diabetes mellitus patients before glucose levels drift to undesirable levels. This article assesses the feasibility of data-driven models to serve as the forecasting engine of predictive monitoring systems. Methods We investigated the capabilities of data-driven autoregressive (AR) models to (1) capture the correlations in glucose time-series data, (2) make accurate predictions as a function of prediction horizon, and (3) be made portable from individual to individual without any need for model tuning. The investigation is performed by employing CGM data from nine type 1 diabetic subjects collected over a continuous 5-day period. Results With CGM data serving as the gold standard, AR model-based predictions of glucose levels assessed over nine subjects with Clarke error grid analysis indicated that, for a 30-minute prediction horizon, individually tuned models yield 97.6 to 100.0% of data in the clinically acceptable zones A and B, whereas cross-subject, portable models yield 95.8 to 99.7% of data in zones A and B. Conclusions This study shows that, for a 30-minute prediction horizon, data-driven AR models provide sufficiently accurate and clinically acceptable estimates of glucose levels for timely, proactive therapy and should be considered as the modeling engine for predictive monitoring of patients with type 1 diabetes mellitus. It also suggests that AR models can be made portable from individual to individual with minor performance penalties, while greatly reducing the burden associated with model tuning and data collection for model development. PMID:19885110
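The autoregressive idea above can be sketched in a few lines: fit AR coefficients by least squares on past samples, then iterate the model forward over the prediction horizon. This toy fits an AR(2) to a synthetic, noise-free series (the study's models are fit to real, noisy CGM data and are not this simple):

```python
def fit_ar2(series):
    """Least-squares AR(2) fit via the 2x2 normal equations."""
    rows = [(series[t - 1], series[t - 2], series[t])
            for t in range(2, len(series))]
    s11 = sum(a * a for a, b, y in rows)
    s12 = sum(a * b for a, b, y in rows)
    s22 = sum(b * b for a, b, y in rows)
    r1 = sum(a * y for a, b, y in rows)
    r2 = sum(b * y for a, b, y in rows)
    det = s11 * s22 - s12 * s12
    return ((s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det)

def predict(series, coeffs, steps):
    """Iterate the fitted model forward over the prediction horizon."""
    hist = list(series)
    for _ in range(steps):
        hist.append(coeffs[0] * hist[-1] + coeffs[1] * hist[-2])
    return hist[len(series):]

# synthetic series generated by x_t = 1.5 x_{t-1} - 0.6 x_{t-2}
x = [100.0, 102.0]
for _ in range(48):
    x.append(1.5 * x[-1] - 0.6 * x[-2])
a, b = fit_ar2(x[:40])   # recovers the generating coefficients
```

With CGM samples every few minutes, a 30-minute horizon corresponds to iterating the recursion several steps ahead, which is where prediction error grows fastest.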
Lederer, Philip; Shiraishi, Ray W.; Wadonda-Kabondo, Nellie; Date, Anand; Matatiyo, Blackson; Dokubo, E. Kainne
2017-01-01
Abstract Background. Awareness of human immunodeficiency virus (HIV) status among all people with HIV is critical for epidemic control. We aimed to assess accurate knowledge of HIV status, defined as concordance with serosurvey test results from the 2010 Malawi Demographic Health Survey (MDHS), and to identify risk factors for seropositivity among adults (aged 15–49) reporting a most recently negative test within 12 months. Methods. Data were analyzed from the 2010 MDHS. A logistic regression model was constructed to determine factors independently associated with HIV seropositivity after a recently negative test. All analyses controlled for the survey’s complex design. Results. A total of 11 649 adults tested for HIV during this MDHS reported ever being sexually active. Among these, HIV seroprevalence was 12.0%, but only 61.7% had accurate knowledge of their status. Forty percent (40.3%; 95% confidence interval [CI], 36.8–43.8) of seropositive respondents reported a most recently negative test. Of those reporting that this negative test was within 12 months (n = 3630), seroprevalence was 7.2% for women (95% CI, 5.7–9.2), 5.2% for men (95% CI, 3.9–6.9), higher in the South, and higher in rural areas for men. Women with higher education and men in the richest quintile were at higher risk. More than 1 lifetime union was significantly associated with recent HIV infection, whereas never being married was significantly protective. Conclusions. Self-reported HIV status based on prior test results can underestimate seroprevalence. These results highlight the need for posttest risk assessment and support for people who test negative for HIV and repeat testing in people at high risk for HIV infection. PMID:28480233
Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L.; Page, Terry L.; Bhuva, Bharat; Broadie, Kendal
2016-01-01
Background Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. New Method The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. Results The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 hours) are comparable to traditional manual experiments, while minimizing experimenter involvement. Comparison with Existing Methods The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ~$500US, making it affordable to a wide range of investigators. Conclusions This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. 
PMID:26703418
2011-01-01
Background Integration of genomic variation with phenotypic information is an effective approach for uncovering genotype-phenotype associations. This requires an accurate identification of the different types of variation in individual genomes. Results We report the integration of the whole genome sequence of a single Holstein Friesian bull with data from single nucleotide polymorphism (SNP) and comparative genomic hybridization (CGH) array technologies to determine a comprehensive spectrum of genomic variation. The performance of resequencing SNP detection was assessed by combining SNPs that were identified to be either in identity by descent (IBD) or in copy number variation (CNV) with results from SNP array genotyping. Coding insertions and deletions (indels) were found to be enriched for size in multiples of 3 and were located near the N- and C-termini of proteins. For larger indels, a combination of split-read and read-pair approaches proved to be complementary in finding different signatures. CNVs were identified on the basis of the depth of sequenced reads, and by using SNP and CGH arrays. Conclusions Our results provide high resolution mapping of diverse classes of genomic variation in an individual bovine genome and demonstrate that structural variation surpasses sequence variation as the main component of genomic variability. Better accuracy of SNP detection was achieved with little loss of sensitivity when algorithms that implemented mapping quality were used. IBD regions were found to be instrumental for calculating resequencing SNP accuracy, while SNP detection within CNVs tended to be less reliable. CNV discovery was affected dramatically by platform resolution and coverage biases. The combined data for this study showed that at a moderate level of sequencing coverage, an ensemble of platforms and tools can be applied together to maximize the accurate detection of sequence and structural variants. PMID:22082336
Armstrong, Ian S; Hoffmann, Sandra A
2016-11-01
Quantitative single photon emission computed tomography (SPECT) shows potential in a number of clinical applications, and several vendors now provide software and hardware solutions that allow 'SUV-SPECT' to mirror metrics used in PET imaging. This brief technical report assesses the accuracy of activity concentration measurements using a new algorithm, 'xSPECT', from Siemens Healthcare. SPECT/CT data were acquired from a uniform cylinder with 5, 10, 15 and 20 s/projection and a NEMA image quality phantom with 25 s/projection. The NEMA phantom had hot spheres filled with an 8 : 1 activity concentration relative to the background compartment. Reconstructions were performed using parameters defined by manufacturer presets available with the algorithm. The accuracy of activity concentration measurements was assessed. A dose calibrator-camera cross-calibration factor (CCF) was derived from the uniform phantom data. In uniform phantom images, a positive bias was observed, ranging from ∼6% in the lower-count images to ∼4% in the higher-count images. On the basis of the higher-count data, a CCF of 0.96 was derived. As expected, considerable negative bias was measured in the NEMA spheres using region mean values, whereas positive bias was measured in the four largest NEMA spheres. Nonmonotonically increasing recovery curves for the hot spheres suggested the presence of Gibbs edge enhancement from resolution modelling. Sufficiently accurate activity concentration measurements can easily be made on images reconstructed with the xSPECT algorithm without a CCF. However, the use of a CCF is likely to improve accuracy further. A manual conversion of voxel values into SUV should be possible, provided that the patient weight, injected activity and time between injection and imaging are all known accurately.
Cleary, Jane; Daniells, Suzie; Okely, Anthony D; Batterham, Marijka; Nicholls, Jessie
2008-01-01
Bioelectrical impedance equations are frequently used by food and nutrition professionals to estimate percent fat mass in overweight and obese children. However, it is not known whether they are accurate for such children, as they have been primarily developed for children of varying body weights. The aim of this cross-sectional study was to evaluate the predictive validity of four previously published prediction equations developed for the pediatric population, among a sample of overweight and obese children. Thirty overweight or obese children (mean age=7.57+/-1.28 years) underwent measurement of fat mass, percent fat mass, and fat-free mass using dual-energy x-ray absorptiometry (DEXA) and bioelectrical impedance analysis (BIA). Impedance values from the BIA were entered into the four prediction equations and Pearson correlations used to determine the significance of associations between each of the BIA prediction equations and DEXA for percent fat mass, fat mass, and fat-free mass. For percent fat mass, paired t tests were used to assess differences between the methods and the technique of Bland and Altman was used to determine bias and error. Results showed that the mean percent fat mass as determined by DEXA for this age group was 40.79%. In comparison with other BIA prediction equations, the Schaefer equation had the closest mean value of 41.98%, and was the only equation not to significantly differ from the DEXA (P=0.121). This study suggests that the Schaefer equation is the only accurate BIA prediction equation for assessing percent fat mass in this sample of overweight and obese children from primarily white backgrounds.
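The Bland and Altman technique used above reduces to the mean difference between paired methods (the bias) and 95% limits of agreement at bias ± 1.96 standard deviations of the differences. A sketch with invented paired percent-fat-mass values (hypothetical numbers, not the study's data):

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)   # sample SD of the differences
    return bias, bias - half_width, bias + half_width

dexa = [40.8, 38.2, 44.1, 41.5]   # hypothetical %FM by DEXA
bia  = [41.8, 37.2, 45.1, 40.5]   # hypothetical %FM by BIA
print(bland_altman(dexa, bia))
```

A near-zero bias with wide limits of agreement, as here, is exactly the pattern that a paired t test alone would miss: the methods agree on average but can disagree substantially for an individual child.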
Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions
Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei
2015-01-01
Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019
Mandigout, Stéphane; Lacroix, Justine; Ferry, Béatrice; Vuillerme, Nicolas; Compagnat, Maxence; Daviet, Jean-Christophe
2017-12-01
Background In the subacute stroke phase, the monitoring of ambulatory activity and activities of daily life with wearable sensors may have relevant clinical applications. Do current commercially available wearable activity trackers allow us to objectively assess the energy expenditure of these activities? The objective of the present study was to compare the energy expenditure evaluated by indirect calorimetry during the course of a scenario consisting of everyday activities while estimating the energy expenditure using several commercialised wearable sensors in post-stroke patients (less than six months since stroke). Method Twenty-four patients (age 68.2 ± 13.9; post-stroke delay 34 ± 25 days) voluntarily participated in this study. Each patient underwent a scenario of various everyday tasks (transfer, walking, etc.). During the implementation, patients wore 14 wearable sensors (Armband, Actigraph GT3X, Actical, pedometer) to obtain an estimate of the energy expenditure. The actual energy expenditure was concurrently determined by indirect calorimetry. Results Except for the Armband worn on the non-plegic side, the results of our study show a significant difference between the energy expenditure values estimated by the various sensors and the actual energy expenditure when the scenario is considered as a whole. Conclusion The present results suggest that, for a series of everyday tasks, the wearable sensors underestimate the actual energy expenditure values in post-stroke patients in the subacute phase and are therefore not accurate. Several factors are likely to confound the results: types of activity, prediction equations, the position of the sensor and the hemiplegia side.
A molecular computational model improves the preoperative diagnosis of thyroid nodules
2012-01-01
Background Thyroid nodules with indeterminate cytological features on fine needle aspiration (FNA) cytology have a 20% risk of thyroid cancer. The aim of the current study was to determine the diagnostic utility of an 8-gene assay to distinguish benign from malignant thyroid neoplasm. Methods The mRNA expression level of 9 genes (KIT, SYNGR2, C21orf4, Hs.296031, DDI2, CDH1, LSM7, TC1, NATH) was analysed by quantitative PCR (q-PCR) in 93 FNA cytological samples. To evaluate the diagnostic utility of all the genes analysed, we assessed the area under the curve (AUC) for each gene individually and in combination. BRAF exon 15 status was determined by pyrosequencing. An 8-gene computational model (Neural Network Bayesian Classifier) was built and a multiple-variable analysis was then performed to assess the correlation between the markers. Results The AUC for each significant marker ranged between 0.625 and 0.900, thus all the significant markers, alone and in combination, can be used to distinguish between malignant and benign FNA samples. The classifier made up of KIT, CDH1, LSM7, C21orf4, DDI2, TC1, Hs.296031 and BRAF had a predictive power of 88.8%. It proved to be useful for risk stratification of the most critical cytological group of the indeterminate lesions for which there is the greatest need of accurate diagnostic markers. Conclusion The genetic classification obtained with this model is highly accurate at differentiating malignant from benign thyroid lesions and might be a useful adjunct in the preoperative management of patients with thyroid nodules. PMID:22958914
Tai, Dean C.S.; Wang, Shi; Cheng, Chee Leong; Peng, Qiwen; Yan, Jie; Chen, Yongpeng; Sun, Jian; Liang, Xieer; Zhu, Youfu; Rajapakse, Jagath C.; Welsch, Roy E.; So, Peter T.C.; Wee, Aileen; Hou, Jinlin; Yu, Hanry
2014-01-01
Background & Aims There is increasing need for accurate assessment of liver fibrosis/cirrhosis. We aimed to develop qFibrosis, a fully-automated assessment method combining quantification of histopathological architectural features, to address unmet needs in core biopsy evaluation of fibrosis in chronic hepatitis B (CHB) patients. Methods qFibrosis was established as a combined index based on 87 parameters of architectural features. Images acquired from 25 Thioacetamide-treated rat samples and 162 CHB core biopsies were used to train and test qFibrosis and to demonstrate its reproducibility. qFibrosis scoring was analyzed employing Metavir and Ishak fibrosis staging as standard references, and collagen proportionate area (CPA) measurement for comparison. Results qFibrosis faithfully and reliably recapitulates Metavir fibrosis scores, as it can identify differences between all stages in both animal samples (p <0.001) and human biopsies (p <0.05). It is robust to sampling size, allowing for discrimination of different stages in samples of different sizes (area under the curve (AUC): 0.93–0.99 for animal samples: 1–16 mm2; AUC: 0.84–0.97 for biopsies: 10–44 mm in length). qFibrosis can significantly predict staging underestimation in suboptimal biopsies (<15 mm) and under- and over-scoring by different pathologists (p <0.001). qFibrosis can also differentiate between Ishak stages 5 and 6 (AUC: 0.73, p = 0.008), suggesting the possibility of monitoring intra-stage cirrhosis changes. Best of all, qFibrosis demonstrates superior performance to CPA on all counts. Conclusions qFibrosis can improve fibrosis scoring accuracy and throughput, thus allowing for reproducible and reliable analysis of efficacies of anti-fibrotic therapies in clinical research and practice. PMID:24583249
Particle Streak Anemometry: A New Method for Proximal Flow Sensing from Aircraft
NASA Astrophysics Data System (ADS)
Nichols, T. W.
Accurate sensing of relative air flow direction from fixed-wing small unmanned aircraft (sUAS) is challenging with existing multi-hole pitot-static and vane systems. Sub-degree direction accuracy is generally not available on such systems, and disturbances to the local flow field, induced by the airframe, introduce an additional error source. An optical imaging approach to making a relative air velocity measurement with high directional accuracy is presented. Optical methods offer the capability to make a proximal measurement in undisturbed air outside of the local flow field without the need to place sensors on vulnerable probes extended ahead of the aircraft. Current imaging flow analysis techniques for laboratory use rely on relatively thin imaged volumes, sophisticated hardware, and intensity thresholding in low-background conditions. A new method is derived and assessed using a particle streak imaging technique that can be implemented with low-cost commercial cameras and illumination systems, and can function in imaged volumes of arbitrary depth with complex background signal. The new technique, referred to as particle streak anemometry (PSA) to differentiate it from particle streak velocimetry, which makes a field measurement rather than a single bulk-flow measurement, utilizes a modified Canny edge-detection algorithm with connected-component analysis and principal component analysis to detect streak ends in complex imaging conditions. A linear solution for the air velocity direction is then implemented with a random sample consensus (RANSAC) solution approach. A single-degree-of-freedom non-linear, non-convex optimization problem is then solved for the air speed through an iterative approach. The technique was tested through simulation and wind tunnel tests, yielding angular accuracies under 0.2 degrees, superior to the performance of existing commercial systems. Air speed error standard deviations varied from 1.6 to 2.2 m/s depending on the techniques of implementation.
While air speed sensing is secondary to accurate flow direction measurement, the air speed results were in line with commercial pitot static systems at low speeds.
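The RANSAC step described above — hypothesizing a direction from sampled streaks and keeping the one most streaks agree with — can be sketched in 2D. This toy version (a simplification, not the paper's full linear 3D formulation) scores a candidate unit direction by how many streak vectors fall within an angular tolerance, then averages the inliers:

```python
import math
import random

def _unit(v):
    n = math.hypot(v[0], v[1])
    return (v[0] / n, v[1] / n)

def dominant_direction(vectors, tol_deg=5.0, iters=50, seed=0):
    """RANSAC-style estimate of the axis most streak vectors share.
    Directions are treated as axial (a streak and its reverse agree)."""
    rng = random.Random(seed)
    units = [_unit(v) for v in vectors]
    cos_tol = math.cos(math.radians(tol_deg))
    best, best_inliers = None, []
    for _ in range(iters):
        cand = rng.choice(units)                 # hypothesis from one streak
        inliers = [u for u in units
                   if abs(cand[0] * u[0] + cand[1] * u[1]) >= cos_tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = cand, inliers
    # refine: sign-align the inliers to the winning hypothesis and average
    sx = sy = 0.0
    for u in best_inliers:
        s = 1.0 if best[0] * u[0] + best[1] * u[1] >= 0 else -1.0
        sx += s * u[0]
        sy += s * u[1]
    return _unit((sx, sy))

# toy streak vectors: six roughly horizontal streaks plus two spurious edges
streaks = [(1.0, 0.02), (0.9, -0.01), (1.1, 0.03), (0.95, 0.0),
           (0.0, 1.0), (1.0, 0.01), (0.98, -0.02), (-0.5, 0.5)]
d = dominant_direction(streaks)
```

The consensus step is what makes the approach robust to complex backgrounds: spurious edges that survive streak detection simply fail to gather inliers.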
Lecerf, G; Fessy, M H; Philippot, R; Massin, P; Giraud, F; Flecher, X; Girard, J; Mertl, P; Marchetti, E; Stindel, E
2009-05-01
BACKGROUND AND OBJECTIVE: Femoral offset is supposed to influence the results of hip replacement, but little is known about the accurate method of measurement and the true effect of offset modifications. This article is a collection of independent anatomic, radiological and clinical works whose purpose is to assess knowledge of the implications of femoral offset for preoperative templating and total hip arthroplasty. There is a strong correlation between femoral offset, abductor lever arm and hip abductor strength. Hip lateralization is independent of the femoral endomedullary characteristics. The abductor lever arm is highly correlated to the gluteus medius activation angle. There were correlations between femoral offset and endomedullary shape. The hip center was high and medial for a stovepipe metaphysis, while it was lower and lateralized for a champagne-flute upper femur. A study comparing femoral offset measured by X-ray and CT scan in 50 patients demonstrated that plain radiography underestimates offset measurement. 2D templating cannot appreciate the rotation of the lower limb. Taking into account the horizontal plane is essential to obtain proper 3D planning of the femoral offset. A randomized study was designed to compare femoral offset measurements after hip resurfacing and total hip arthroplasty. This study underlined that hip resurfacing reduced the femoral offset, while hip replacement increased offset. However, the reduction of femoral offset after hip resurfacing does not affect function. A pilot study was designed to assess the results of 120 hip arthroplasties with a modular femoral neck. This study showed that the use of a modular collar ensures an easier restoration of the femoral offset. A cohort of high offset stems (Lubinus 117 degrees) was retrospectively assessed. The survival rate was slightly lower than that of the standard design reported in the Swedish register.
Finally, the measurement of offset and leg length was assessed with the help of computer assistance. The software changed the initial schedule (obtained by templating) in 29% of cases. Therefore, femoral offset restoration is essential to improve function and longevity of hip arthroplasty. CT scan is more accurate than plain radiography to assess femoral offset. Hip resurfacing decreases offset without effect on function. A modular neck and computer assistance may improve intraoperative calculation and reproduction of femoral offset. Increasing offset with a standard cemented design may decrease long-term fixation. Level IV: retrospective or historical series.
2012-01-01
Background The short inversion time inversion recovery (STIR) black-blood technique has been used to visualize myocardial edema, and thus to differentiate acute from chronic myocardial lesions. However, some cardiovascular magnetic resonance (CMR) groups have reported variable image quality, and hence the diagnostic value of STIR in routine clinical practice has been put into question. The aim of our study was to analyze image quality and diagnostic performance of STIR using a set of pulse sequence parameters dedicated to edema detection, and to discuss possible factors that influence image quality. We hypothesized that STIR imaging is an accurate and robust way of detecting myocardial edema in non-selected patients with acute myocardial infarction. Methods Forty-six consecutive patients with acute myocardial infarction underwent CMR (day 4.5, +/- 1.6) including STIR for the assessment of myocardial edema and late gadolinium enhancement (LGE) for quantification of myocardial necrosis. Thirty of these patients underwent a follow-up CMR at approximately six months (195 +/- 39 days). Both STIR and LGE images were evaluated separately on a segmental basis for image quality as well as for presence and extent of myocardial hyper-intensity, with both visual and semi-quantitative (threshold-based) analysis. LGE was used as a reference standard for localization and extent of myocardial necrosis (acute) or scar (chronic). Results Image quality of STIR images was rated as diagnostic in 99.5% of cases. At the acute stage, the sensitivity and specificity of STIR to detect infarcted segments on visual assessment was 95% and 78% respectively, and on semi-quantitative assessment was 99% and 83%, respectively. STIR differentiated acutely from chronically infarcted segments with a sensitivity of 95% by both methods and with a specificity of 99% by visual assessment and 97% by semi-quantitative assessment. 
The extent of hyper-intense areas on acute STIR images was 85% larger than those on LGE images, with a larger myocardial salvage index in reperfused than in non-reperfused infarcts (p = 0.035). Conclusions STIR with appropriate pulse sequence settings is accurate in detecting acute myocardial infarction (MI) and distinguishing acute from chronic MI with both visual and semi-quantitative analysis. Due to its unique technical characteristics, STIR should be regarded as an edema-weighted rather than a purely T2-weighted technique. PMID:22455461
Image-derived input function with factor analysis and a-priori information.
Simončič, Urban; Zanotti-Fregonara, Paolo
2015-02-01
Quantitative PET studies often require the cumbersome and invasive procedure of arterial cannulation to measure the input function. This study sought to minimize the number of necessary blood samples by developing a factor-analysis-based image-derived input function (IDIF) methodology for dynamic PET brain studies. IDIF estimation was performed as follows: (a) carotid and background regions were segmented manually on an early PET time frame; (b) blood-weighted and tissue-weighted time-activity curves (TACs) were extracted with factor analysis; (c) factor analysis results were denoised and scaled using the voxels with the highest blood signal; (d) using population data and one blood sample at 40 min, the whole-blood TAC was estimated from postprocessed factor analysis results; and (e) the parent concentration was finally estimated by correcting the whole-blood curve with measured radiometabolite concentrations. The methodology was tested using data from 10 healthy individuals imaged with [(11)C](R)-rolipram. The accuracy of IDIFs was assessed against full arterial sampling by comparing the area under the curve of the input functions and by calculating the total distribution volume (VT). The shape of the image-derived whole-blood TAC matched the reference arterial curves well, and the whole-blood areas under the curve were accurately estimated (mean error 1.0±4.3%). The relative Logan-VT error was -4.1±6.4%. Compartmental modeling and spectral analysis gave less accurate VT results compared with the Logan method. A factor-analysis-based IDIF for [(11)C](R)-rolipram brain PET studies that relies on a single blood sample and population data can be used for accurate quantification of Logan-VT values.
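The Logan graphical analysis used above to compute VT can be sketched on synthetic data. Everything below (one-tissue-compartment kinetics, the rate constants, the bi-exponential input function) is an illustrative assumption, not data from the study:

```python
import numpy as np

# One-tissue-compartment simulation (all values are illustrative assumptions)
K1, k2 = 0.1, 0.05            # -> true V_T = K1 / k2 = 2.0
t = np.linspace(0, 90, 5401)  # minutes
dt = t[1] - t[0]

# Synthetic arterial input function: bi-exponential decay after a bolus
Cp = 10.0 * np.exp(-0.3 * t) + 1.0 * np.exp(-0.01 * t)

# Tissue time-activity curve from dC/dt = K1*Cp - k2*C (Euler integration)
C = np.zeros_like(t)
for i in range(1, len(t)):
    C[i] = C[i - 1] + dt * (K1 * Cp[i - 1] - k2 * C[i - 1])

# Logan plot: int(C)/C versus int(Cp)/C becomes linear with slope V_T
intC = np.cumsum(C) * dt
intCp = np.cumsum(Cp) * dt
late = t > 40                 # fit only the late, linear portion
x = intCp[late] / C[late]
y = intC[late] / C[late]
VT, intercept = np.polyfit(x, y, 1)
print(round(VT, 2))           # ≈ 2.0
```

For a one-tissue tracer the Logan relation is exact, so the fitted slope recovers the true VT up to numerical integration error; for multi-compartment tracers the plot only becomes linear asymptotically.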
Biological Markers for Pulpal Inflammation: A Systematic Review
Galicia, Johnah C.; Peters, Ove A.
2016-01-01
Background and Objective Pulpitis is mainly caused by an opportunistic infection of the pulp space with commensal oral microorganisms. Depending on the state of inflammation, different treatment regimes are currently advocated. Predictable vital pulp therapy depends on accurate determination of the pulpal status that will allow repair to occur. The role of several players of the host response in pulpitis is well documented: cytokines, proteases, inflammatory mediators, growth factors, antimicrobial peptides and others contribute to pulpal defense mechanisms; these factors may serve as biomarkers that indicate the status of the pulp. Therefore, the aim of this systematic review was to evaluate the presence of biomarkers in pulpitis. Methods The electronic databases of MEDLINE, EMBASE, Scopus and other sources were searched for English and non-English articles published through February 2015. Two independent reviewers extracted information regarding study design, tissue or analyte used, outcome measures, results and conclusions for each article. The quality of the included studies was assessed using a modification of the Newcastle-Ottawa Scale. Results and Conclusions From the initial 847 publications evaluated, a total of 57 articles were included in this review. In general, irreversible pulpitis was associated with differential expression of various biomarkers compared with normal controls. These biomarkers were significantly expressed not only in pulp tissue, but also in gingival crevicular fluid that can be collected non-invasively, and in dentin fluid that can be analyzed without extirpating the entire pulpal tissue. Such data may then be used to accurately differentiate diseased from healthy pulp tissue. The interplay of pulpal biomarkers and their potential use for a more accurate and biologically based diagnostic tool in endodontics is envisaged. PMID:27898727
Tucker, Natalia S.; Cyr, Amy E.; Ademuyiwa, Foluso O.; Tabchy, Adel; George, Krystl; Sharma, Piyush; Jin, Linda X.; Sanati, Souzan; Aft, Rebecca; Gao, Feng; Margenthaler, Julie A.; Gillanders, William E.
2016-01-01
Objective To assess the performance characteristics of axillary ultrasound (AUS) for accurate exclusion of clinically significant axillary lymph node (ALN) disease. Background Sentinel lymph node biopsy (SLNB) is currently the standard of care for staging the axilla in patients with clinical T1–T2, N0 breast cancer. AUS is a noninvasive alternative to SLNB for staging the axilla. Methods Patients were identified using a prospectively maintained database. Sensitivity, specificity, and negative and positive predictive values (NPV, PPV) were calculated by comparing AUS findings to pathology results. Multivariate analyses were performed to identify patient and/or tumor characteristics associated with false-negative (FN) AUS. A blinded review of FN and matched true-negative cases was performed by two independent medical oncologists to compare treatment recommendations and actual treatment received. Recurrence-free survival was described using Kaplan-Meier product limit methods. Results 647 patients with clinical T1–T2, N0 breast cancer underwent AUS between January 2008 and March 2013. AUS had a sensitivity of 70%, NPV of 84%, and PPV of 56% for the detection of ALN disease. For detection of clinically significant disease (> 2.0 mm), AUS had a sensitivity of 76% and NPV of 89%. FN AUS did not significantly impact adjuvant medical decision making. Patients with FN AUS had recurrence-free survival equivalent to that of patients with pathologic N0 disease. Conclusions AUS accurately excludes clinically significant ALN disease in patients with clinical T1–T2, N0 breast cancer. AUS may be an alternative to SLNB in these patients, where axillary surgery is no longer considered therapeutic and predictors of tumor biology are increasingly used to make adjuvant therapy decisions. PMID:26779976
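The metrics quoted in abstracts like this one (sensitivity, specificity, NPV, PPV) all derive from a 2x2 confusion matrix. The counts in this sketch are invented, chosen only to reproduce a roughly 70% sensitivity, and are not the study's data:

```python
# 2x2 diagnostic metrics; counts are invented for illustration only
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # P(test+ | disease+)
        "specificity": tn / (tn + fp),   # P(test- | disease-)
        "ppv": tp / (tp + fp),           # P(disease+ | test+)
        "npv": tn / (tn + fn),           # P(disease- | test-)
    }

m = diagnostics(tp=70, fp=55, fn=30, tn=200)
print(m["sensitivity"])    # 0.7
print(round(m["npv"], 2))  # 0.87
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with disease prevalence in the studied cohort, which is why NPV is the headline number when the question is safe exclusion of disease.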
The accuracy of assessment of walking distance in the elective spinal outpatients setting.
Okoro, Tosan; Qureshi, Assad; Sell, Beulah; Sell, Philip
2010-02-01
Self-reported walking distance is a clinically relevant measure of function. The aim of this study was to define patient accuracy and understand factors that might influence perceived walking distance in an elective spinal outpatient setting. This was a prospective cohort study: 103 patients were asked to perform one test of distance estimation and two tests of functional distance perception using pre-measured landmarks. Standard spine-specific outcomes included the patient-reported claudication distance, Oswestry disability index (ODI), Low Back Outcome Score (LBOS), visual analogue score (VAS) for leg and back, and other measures. There were over-estimators and under-estimators. Overall, accuracy to within 9.14 metres (10 yards) was poor, at only 5% for distance estimation and 40% for the two tests of functional distance perception. Distance estimation: actual distance 111 m, mean response 245 m (95% CI 176.3-314.7); functional test 1: actual distance 29.2 m, mean response 71.7 m (95% CI 53.6-88.9); functional test 2: actual distance 19.6 m, mean response 47.4 m (95% CI 35.02-59.95). Surprisingly, patients over 60 years of age (n = 43) were twice as accurate on each test compared with those under 60 (n = 60) (average 70% overestimation compared with 140%; p = 0.06). Patients in social class I (n = 18) were more accurate than those in classes II-V (n = 85). There was a positive correlation between poor accuracy and increasing MZD (Pearson's correlation coefficient 0.250; p = 0.012); ODI, LBOS and the other parameters measured showed no correlation. Subjective distance perception and estimation are poor in this population. Patients over 60 and those with a professional background are more accurate but still poor.
Reproducibility of Regional Pulse Wave Velocity in Healthy Subjects
Lee, Nak Bum
2009-01-01
Background/Aims Despite the clinical importance and widespread use of pulse wave velocity (PWV), there are no standards for pulse sensors or for system requirements to ensure accurate pulse wave measurement. We assessed the reproducibility of PWV values using a newly developed PWV measurement system. Methods The system used in this study was the PP-1000, which simultaneously provides regional PWV values from arteries at four different sites (carotid, femoral, radial, and dorsalis pedis). Seventeen healthy male subjects without any cardiovascular disease participated in this study. Two observers performed two consecutive measurements in the same subject in random order. To evaluate the reproducibility of the system, two sets of analyses (within-observer and between-observer) were performed. Results The means±SD of PWV for the aorta, arm, and leg were 7.0±1.48, 8.43±1.14, and 8.09±0.98 m/s as measured by observer A and 6.76±1.00, 7.97±0.80, and 7.97±0.72 m/s by observer B, respectively. Between-observer differences for the aorta, arm, and leg were 0.14±0.62, 0.18±0.84, and 0.07±0.86 m/s, respectively, and the correlation coefficients were high, especially for aortic PWV (r=0.93). All the measurements showed significant correlation coefficients, ranging from 0.94 to 0.99. Conclusions The PWV measurement system used in this study provides accurate analysis results with high reproducibility. It is necessary to provide an accurate algorithm for the detection of additional features such as flow wave, reflection wave, and dicrotic notch from a pulse waveform. PMID:19270477
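Regional PWV as described in the abstract above is, at its core, an arterial path length divided by the pulse transit time between two measurement sites. The path length and foot-to-foot arrival times below are illustrative assumptions, not values from the PP-1000 system:

```python
# Foot-to-foot PWV: path length between sites divided by pulse transit time.
# The path length and arrival times below are illustrative assumptions.
def pwv(path_length_m, t_proximal_s, t_distal_s):
    return path_length_m / (t_distal_s - t_proximal_s)

aortic = pwv(0.5, 0.012, 0.083)  # e.g. a carotid-to-femoral segment
print(round(aortic, 2))          # 7.04 m/s
```

This also makes clear why reproducibility hinges on the waveform-feature detection the authors mention: a few milliseconds of error in locating the pulse foot shifts the computed velocity appreciably.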
Marine atmospheric effects on electro-optical systems performance
NASA Astrophysics Data System (ADS)
Richter, Juergen H.; Hughes, Herbert G.
1990-09-01
For the past twelve years, a coordinated tri-service effort has been underway in the United States Department of Defense to provide an atmospheric effects assessment capability for existing and planned electro-optical (EO) systems. This paper reviews the exploratory development effort in the US Navy. A key responsibility for the Navy was the development of marine aerosol models. An initial model, the Navy Aerosol Model (NAM), was developed, tested, and transitioned into LOWTRAN 6. A more comprehensive model, the Navy Oceanic Vertical Aerosol Model (NOVAM), has been formulated and is presently undergoing comprehensive evaluation and testing. Marine aerosols and their extinction properties are only one important factor in EO systems performance assessment. For many EO systems applications, an accurate knowledge of marine background radiances is required in addition to considering the effects of the intervening atmosphere. Accordingly, a capability was developed to estimate the apparent sea surface radiance for different sea states and meteorological conditions. Also, an empirical relationship was developed which directly relates apparent mean sea temperature to calculated mean sky temperature. In situ measurements of relevant environmental parameters are essential for real-time EO systems performance assessment. Direct measurement of slant-path extinction would be most desirable. This motivated a careful investigation of lidar (light detection and ranging) techniques, including improvements to single-ended lidar profile inversion algorithms and development of new lidar techniques such as double-ended and dual-angle configurations. It was concluded that single-ended, single-frequency lidars cannot be used to infer slant-path extinction with the accuracy necessary to make meaningful performance assessments. Other lidar configurations may find limited application in model validation and research efforts.
No technique has emerged yet which could be considered ready for shipboard implementation. A shipboard real-time performance assessment system was developed and named PREOS (Performance and Range for EO Systems). PREOS has been incorporated into the Navy's Tactical Environmental Support System (TESS). The present version of PREOS is a first step in accomplishing the complex task of real-time systems performance assessment. Improved target and background models are under development and will be incorporated into TESS when tested and validated. A reliable assessment capability can be used to develop Tactical Decision Aids (TDAs). TDAs permit optimum selection or combination of sensors and estimation of a ship's own vulnerability against hostile systems.
The cosmic gamma-ray background from Type Ia supernovae
NASA Technical Reports Server (NTRS)
The, Lih-Sin; Leising, Mark D.; Clayton, Donald D.
1993-01-01
We present an improved calculation of the cumulative gamma-ray spectrum of Type Ia supernovae during the history of the universe. We follow Clayton & Ward (1975) in using a few Friedmann models and two simple histories of the average galaxian nucleosynthesis rate, but we improve their calculation by modeling the gamma-ray scattering in detailed numerical models of SN Ia's. The results confirm that near 1 MeV the SN Ia background may dominate, and that it is potentially observable, with high scientific importance. A very accurate measurement of the cosmic background spectrum between 0.1 and 1.0 MeV may reveal the turn-on time and the evolution of the rate of Type Ia supernova nucleosynthesis in the universe.
42 CFR 412.610 - Assessment schedule.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... An inpatient rehabilitation facility must maintain all patient assessment data sets completed on... patient assessment data. The encoded patient assessment data must accurately reflect the patient's...
42 CFR 412.610 - Assessment schedule.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... An inpatient rehabilitation facility must maintain all patient assessment data sets completed on... patient assessment data. The encoded patient assessment data must accurately reflect the patient's...
42 CFR 412.610 - Assessment schedule.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... An inpatient rehabilitation facility must maintain all patient assessment data sets completed on... patient assessment data. The encoded patient assessment data must accurately reflect the patient's...
42 CFR 412.610 - Assessment schedule.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... An inpatient rehabilitation facility must maintain all patient assessment data sets completed on... patient assessment data. The encoded patient assessment data must accurately reflect the patient's...
A 3D image sensor with adaptable charge subtraction scheme for background light suppression
NASA Astrophysics Data System (ADS)
Shin, Jungsoon; Kang, Byongmin; Lee, Keechang; Kim, James D. K.
2013-02-01
We present a 3D ToF (time-of-flight) image sensor with an adaptive charge subtraction scheme for background light suppression. The proposed sensor can alternately capture a high-resolution color image and a high-quality depth map in each frame. In depth mode, the sensor requires a sufficiently long integration time for accurate depth acquisition, but saturation will occur under strong background illumination. We propose to divide the integration time into N sub-integration times adaptively. In each sub-integration time, our sensor captures an image without saturation and subtracts the background charge to keep the pixel from saturating. The subtraction results are accumulated over the N steps, yielding a final image, free of background illumination, for the full integration time. Experimental results with our own ToF sensor show high background suppression performance. We also propose an in-pixel storage and column-level subtraction circuit for chip-level implementation of the proposed method. We believe the proposed scheme will enable 3D sensors to be used in outdoor environments.
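The accumulation idea in the abstract above can be sketched numerically. The rates, saturation level, and N below are made-up values, chosen only so that a single full-length integration would clip while N sub-integrations do not:

```python
# Made-up illustration: rates in arbitrary charge units per unit time
full_T = 1.0                        # total integration time
N = 8                               # number of sub-integrations
sat = 100.0                         # per-integration saturation level
signal_rate, bg_rate = 50.0, 600.0  # modulated signal vs background light

# A single full-length integration clips at the saturation level:
naive = min((signal_rate + bg_rate) * full_T, sat)  # -> 100.0 (saturated)

# Adaptive scheme: N short integrations, each followed by subtraction of the
# (assumed known) background charge, with the residuals accumulated
acc = 0.0
for _ in range(N):
    raw = (signal_rate + bg_rate) * (full_T / N)    # 81.25 < sat: no clipping
    acc += raw - bg_rate * (full_T / N)             # keep only the signal charge
print(acc)  # 50.0 = signal_rate * full_T
```

The single integration returns a clipped, useless value, while the accumulated residuals recover the full signal charge; choosing N adaptively amounts to making each sub-integration short enough that `raw` stays below `sat`.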
Estimation of channel parameters and background irradiance for free-space optical link.
Khatoon, Afsana; Cowley, William G; Letzepis, Nick; Giggenbach, Dirk
2013-05-10
Free-space optical communication can experience severe fading due to optical scintillation in long-range links. Channel estimation is also corrupted by background and electrical noise. Accurate estimation of channel parameters and the scintillation index (SI) depends on complete removal of the background irradiance. In this paper, we propose three different methods to remove the background irradiance from channel samples: the minimum-value (MV), mean-power (MP), and maximum-likelihood (ML) based methods. The MV and MP methods do not require knowledge of the scintillation distribution; the ML-based method assumes gamma-gamma scintillation, but it can easily be modified to accommodate other distributions. Each estimator's performance is evaluated from low- to high-SI conditions using both simulation data and experimental measurements. The MV and MP methods have much lower complexity than the ML-based method; however, the ML-based method shows better SI and background-irradiance estimation performance.
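A minimal sketch of the minimum-value (MV) idea, substituting a lognormal fade model for the paper's gamma-gamma distribution purely for convenience (everything here is a synthetic assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic channel samples: a lognormal fade (standing in for gamma-gamma,
# purely as an assumption) plus a constant background irradiance b
b = 0.5
fades = rng.lognormal(mean=0.0, sigma=0.4, size=200_000)
samples = fades + b

def scint_index(x):
    # SI = variance / mean^2 of the received irradiance
    return x.var() / x.mean() ** 2

# MV-style estimate: the optical signal is non-negative, so the smallest
# observed sample upper-bounds the background level
b_hat = samples.min()

si_raw = scint_index(samples)           # biased low by the DC background
si_corr = scint_index(samples - b_hat)  # after MV background removal
si_true = scint_index(fades)            # what we are trying to recover
# si_raw < si_true: the constant background inflates the mean and so
# suppresses the apparent SI; subtracting b_hat moves the estimate back
# toward (here slightly past) the true value.
```

The overshoot visible in `si_corr` is the known weakness of the MV approach: the minimum sample still contains some residual signal, so the background is slightly overestimated, which is one motivation for the MP and ML alternatives.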
Teachers' literacy-related knowledge and self-perceptions in relation to preparation and experience.
Spear-Swerling, Louise; Brucker, Pamela Owen; Alfano, Michael P
2005-12-01
After rating their own literacy-related knowledge in three areas (knowledge about reading/reading development, phonemic awareness/phonics, and morpheme awareness/structural analysis), graduate teacher-education students completed five tasks intended to measure their actual disciplinary knowledge in these areas. Teachers with high levels of prior background (i.e., course preparation and experience) rated themselves as significantly more knowledgeable than did low-background teachers in all areas; high-background participants also significantly outperformed low-background participants on all tasks. However, even high-background teachers scored well below ceiling on the tasks. Regression analyses indicated that teachers' self-perceptions and knowledge were positively influenced by both level of preparation and teaching experience, although the influences on teachers' knowledge differed by task. Teachers had some accurate perceptions of their own knowledge, especially in the area of phonics. Results suggest that differentiating levels of preparation may be useful in studying teacher knowledge, and also support the notion of a substantial gap between research on reading and teacher preparation in reading.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-06
... accurate energy information because of design changes will violate 16 CFR 305.4. 2. Comparative Information Background: Under EPCA, the Commission may require disclosure of comparative energy consumption information... meet the [ENERGY STAR 3.0] spec.''). Comments: Commenters generally favored including comparative...
Affective Prosody Labeling in Youths with Bipolar Disorder or Severe Mood Dysregulation
ERIC Educational Resources Information Center
Deveney, Christen M.; Brotman, Melissa A.; Decker, Ann Marie; Pine, Daniel S.; Leibenluft, Ellen
2012-01-01
Background: Accurate identification of nonverbal emotional cues is essential to successful social interactions, yet most research is limited to emotional face expression labeling. Little research focuses on the processing of emotional prosody, or tone of verbal speech, in clinical populations. Methods: Using the Diagnostic Analysis of Nonverbal…
Predictors of Student Retention in Colleges of Agriculture.
ERIC Educational Resources Information Center
Dyer, James E.; Breja, Lisa M.; Wittler, Penny S. Haase
The primary purpose of this study was to identify those factors that most accurately predict a student's intention to complete a degree in a college of agriculture. Specific research objectives were to identify similarities and differences of college of agriculture freshmen from predominantly urban backgrounds, as compared to those in an…
The GO4KIDDS Brief Adaptive Scale
ERIC Educational Resources Information Center
Perry, Adrienne; Taheri, Azin; Ting, Victoria; Weiss, Jonathan
2015-01-01
Background: Accurate measurement of adaptive behaviour is important in both clinical and research contexts. While several good clinical measures exist, as well as brief research measures for adults with intellectual disability, there is need for a brief and efficient measure for research with children and youth. We present preliminary psychometric…
Perceived Credibility and Eyewitness Testimony of Children with Intellectual Disabilities
ERIC Educational Resources Information Center
Henry, L.; Ridley, A.; Perry, J.; Crane, L.
2011-01-01
Background: Although children with intellectual disabilities (ID) often provide accurate witness testimony, jurors tend to perceive their witness statements to be inherently unreliable. Method: The current study explored the free recall transcripts of child witnesses with ID who had watched a video clip, relative to those of typically developing…
Factors Influencing Science Content Accuracy in Elementary Inquiry Science Lessons
ERIC Educational Resources Information Center
Nowicki, Barbara L.; Sullivan-Watts, Barbara; Shim, Minsuk K.; Young, Betty; Pockalny, Robert
2013-01-01
Elementary teachers face increasing demands to engage children in authentic science process and argument while simultaneously preparing them with knowledge of science facts, vocabulary, and concepts. This reform is particularly challenging due to concerns that elementary teachers lack adequate science background to teach science accurately. This…
Process and Outcome Study of Multidisciplinary Prosthetic Treatment for Velopharyngeal Dysfunction
ERIC Educational Resources Information Center
Sell, Debbie; Mars, Michael; Worrell, Emma
2006-01-01
Background: A prosthetic approach to velopharyngeal dysfunction (VPD) is not new. However, a collaborative interdisciplinary team approach by a speech-and-language therapist, dental specialist and maxillofacial technician, including accurate fitting using nasendoscopy, has provided an opportunity to define the clinical care pathway, and audit the…
Background/Question/Methods Solar radiation is a significant environmental driver that impacts the quality and resilience of terrestrial and aquatic habitats, yet its spatiotemporal variations are complicated to model accurately at high resolution over large, complex watersheds. ...
The Salutogenic Wellness Promotion Scale for Older Adults
ERIC Educational Resources Information Center
Becker, Craig M.; Chaney, Beth H.; Shores, Kindal; Glascoff, Mary
2015-01-01
Background: From 1990 to 2050, the population aged 60 years and older will rise from 9% to 21%. Healthy aging initiatives are vital to promote individual, community, and global well-being during this transformation. Accurate health measurement tools are needed to successfully guide strategies and plot progress. The validated multi-dimensional…
2010-09-01
…response equipment. After the hardware and software infrastructure is complete, the focus will shift to creating soundscapes over headphones… Background sounds will emulate a range of conditions, from quiet deserts to busy urban streets. Accurate portrayals of military soundscapes and listening…
Intellectual Assessment of Children from Culturally Diverse Backgrounds.
ERIC Educational Resources Information Center
Armour-Thomas, Eleanor
1992-01-01
Examines assumptions and premises of standardized tests of mental ability and reviews extant theories and research on intellectual functioning of children from culturally different backgrounds. Discusses implications of these issues and perspectives for new directions for intellectual assessment for children from culturally different backgrounds.…
RenderView: physics-based multi- and hyperspectral rendering using measured background panoramics
NASA Astrophysics Data System (ADS)
Talcott, Denise M.; Brown, Wade W.; Thomas, David J.
2003-09-01
As part of the survivability engineering process it is necessary to accurately model and visualize vehicle signatures in the multi- or hyperspectral bands of interest. The signature at a given wavelength is a function of the surface optical properties, reflection of the background and, in the thermal region, the emission of thermal radiation. Currently, it is difficult to obtain and utilize background models of sufficient fidelity compared with the vehicle models. In addition, the background models create an additional layer of uncertainty in estimating the vehicle's signature. Therefore, to meet exacting rendering requirements we have developed RenderView, which incorporates the full bidirectional reflectance distribution function (BRDF). Instead of using a modeled background, we have incorporated a measured, calibrated background panoramic image to provide the high-fidelity background interaction. Uncertainty in the background signature is reduced to the error in the measurement, which is considerably smaller than the uncertainty inherent in a modeled background. RenderView utilizes a number of different descriptions of the BRDF, including the Sandford-Robertson model. In addition, it provides complete conservation of energy with off-axis sampling. A description of RenderView will be presented along with a methodology developed for collecting background panoramics. Examples of RenderView output and of the background panoramics will be presented along with our approach to handling the solar irradiance problem.
2014-01-01
Background Community participation is mandatory in the prevention of dengue outbreaks. Taking public views into account is crucial to guide more effective planning and quicker community participation in prevention campaigns. This study aims to assess community perceptions of the Madeira population in order to explore their involvement in the control of Aedes aegypti and reinforce health-educational planning. Due to the lack of accurate methodologies for measuring perception, a new tool to assess the community's perceptions was built. Methods A cross-sectional survey was performed in the island's A. aegypti-infested area, exploring residents' perceptions regarding the most critical community behaviour, A. aegypti source reduction, and their domestic breeding sites. A novel tool defining five essential topics that underlie awareness of and accession to source reduction was built, herein called Essential-Perception (EP) analysis. Results Of 1276 individuals, 1182 completed the questionnaire (92.6%). EP-Score analysis revealed that the community's perceptions were scarce, inconsistent, and possibly incorrect. Most of the population (99.6%) did not completely understand the five essential topics explored. An average of 54.2% of residents only partially understood each essential topic, revealing inconsistencies in their understanding. Each resident apparently believed in an average of four false assumptions or myths. A significant association (p < 0.001) was found between the EP-Score level and the domestic presence of breeding sites, supporting the validity of the EP analysis. A. aegypti breeding sites, consisting of décor/leisure containers, presented an atypical pattern of infestation compared with dengue-prone regions. Conclusions The studied population was not prepared to be fully engaged in dengue prevention. Evidence suggests that the EP methodology was efficient and accurate in assessing community perception and its compliance with practices.
Moreover, it suggested a list of myths that could persist in the community. This is the first study reporting an aegypti-entomological pattern and community’s perception in a developed dengue-prone region. Tailored messages considering findings of this study are recommended to be used in future campaigns in order to more effectively impact the community perception and behaviour. PMID:24428823
Jones, Andria Q; Dewey, Catherine E; Doré, Kathryn; Majowicz, Shannon E; McEwen, Scott A; Waltner-Toews, David
2006-01-01
Background Exposure assessment is typically the greatest weakness of epidemiologic studies of disinfection by-products (DBPs) in drinking water, which largely stems from the difficulty of obtaining accurate data on individual-level water consumption patterns and activity. Thus, surrogate measures for such waterborne exposures are commonly used. Little attention, however, has been directed towards formal validation of these measures. Methods We conducted a study in the City of Hamilton, Ontario (Canada) in 2001–2002 to assess the accuracy of two surrogate measures of home water source: (a) urban/rural status as assigned using residential postal codes, and (b) mapping of residential postal codes to municipal water systems within a Geographic Information System (GIS). We then assessed the accuracy of a commonly used surrogate measure of an individual's actual drinking water source, namely, their home water source. Results The surrogates for home water source provided good classification of residents served by municipal water systems (approximately 98% predictive value) but did not perform well in classifying those served by private water systems (average: 63.5% predictive value). More importantly, we found that home water source was a poor surrogate measure of individuals' actual drinking water source(s), being associated with high misclassification errors. Conclusion This study demonstrated substantial misclassification errors associated with a surrogate measure commonly used in studies of drinking water disinfection by-products. Further, the limited accuracy of two surrogate measures of an individual's home water source urges caution regarding their use in exposure classification methodology. While these surrogates are inexpensive and convenient, they should not be substituted for direct collection of accurate data pertaining to the subjects' waterborne disease exposure.
In instances where such surrogates must be used, estimation of the misclassification and its subsequent effects are recommended for the interpretation and communication of results. Our results also lend support for further investigation into the quantification of the exposure misclassification associated with these surrogate measures, which would provide useful estimates for consideration in interpretation of waterborne disease studies. PMID:16729887
Evaluation of a national pharmacy‐based syndromic surveillance system
Muchaal, PK; Parker, S; Meganath, K; Landry, L; Aramini, J
2015-01-01
Background Traditional public health surveillance provides accurate information but is typically not timely. New early warning systems leveraging timely electronic data are emerging, but the public health value of such systems is still largely unknown. Objective To assess the timeliness and accuracy of pharmacy sales data for both respiratory and gastrointestinal infections and to determine its utility in supporting the surveillance of gastrointestinal illness. Methods To assess timeliness, a prospective and retrospective analysis of data feeds was used to compare the chronological characteristics of each data stream. To assess accuracy, Ontario antiviral prescriptions were compared to confirmed cases of influenza and cases of influenza-like illness (ILI) from August 2009 to January 2015, and Nova Scotia sales of respiratory over-the-counter (OTC) products were compared to laboratory reports of respiratory pathogen detections from January 2014 to March 2015. Enteric outbreak data (2011-2014) from Nova Scotia were compared to sales of gastrointestinal products for the same time period. To assess utility, pharmacy sales of gastrointestinal products were monitored across Canada to detect unusual increases, and reports were disseminated to the provinces and territories once a week between December 2014 and March 2015, after which a follow-up evaluation survey of stakeholders was conducted. Results Ontario prescriptions of antivirals between 2009 and 2015 correlated closely with the onset dates and magnitude of confirmed influenza cases. Nova Scotia sales of respiratory OTC products correlated with increases in non-influenza respiratory pathogens in the community. There were no definitive correlations identified between the occurrence of enteric outbreaks and the sales of gastrointestinal OTCs in Nova Scotia.
Evaluation of national monitoring showed no significant increases in sales of gastrointestinal products that could be linked to outbreaks that included more than one province or territory. Conclusion Monitoring of pharmacy-based drug prescriptions and OTC sales can provide a timely and accurate complement to traditional respiratory public health surveillance activities, but the initial evaluation did not show that tracking gastrointestinal-related OTCs was of value in identifying an enteric disease outbreak in more than one province or territory during the study period. PMID:29769953
The EPIC-MOS Particle-Induced Background Spectra
NASA Technical Reports Server (NTRS)
Kuntz, K. D.; Snowden, S. L.
2007-01-01
In order to analyse diffuse emission that fills the field of view, one must accurately characterize the instrumental backgrounds. For the XMM-Newton EPIC instrument these backgrounds include a temporally variable "quiescent" component, as well as the strongly variable soft proton contamination. We have characterized the spectral and spatial response of the EPIC detectors to these background components and have developed tools to remove these backgrounds from observations. The "quiescent" component was characterized using a combination of the filter-wheel-closed data and a database of unexposed-region data. The soft proton contamination was characterized by differencing images and spectra taken during flared and flare-free intervals. After application of our modeled backgrounds, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear spectral evidence of solar wind charge exchange (SWCX) emission. Using a large sample of blank-sky data, we show that strong magnetospheric SWCX emission requires elevated solar wind fluxes; observations through the densest part of the magnetosheath are not necessarily strongly contaminated with SWCX emission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Daniel E.; Hornback, Donald Eric; Johnson, Jeffrey O.
This report summarizes the findings of a two-year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first-year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Church, J; Slaughter, D; Norman, E
Error rates in a cargo screening system such as the Nuclear Car Wash [1-7] depend on the standard deviation of the background radiation count rate. Because the Nuclear Car Wash is an active interrogation technique, the radiation signal for fissile material must be detected above a background count rate consisting of cosmic, ambient, and neutron-activated radiations. It was suggested previously [1,6] that the background variance could be substantially larger than expected from counting statistics, and the corresponding negative repercussions for the sensitivity of the system were shown. Therefore, to assure the most accurate estimation of the variation, experiments have been performed to quantify components of the actual variance in the background count rate, including variations in generator power, irradiation time, and container contents. The background variance is determined by these experiments to be a factor of 2 smaller than values assumed in previous analyses, resulting in substantially improved projections of system performance for the Nuclear Car Wash.
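The report's central point, that projected sensitivity scales with the background standard deviation, follows from elementary counting statistics. A minimal, generic illustration (not the report's actual analysis or numbers):

```python
import math

def min_detectable_signal(background_var, k=1.645):
    """Smallest signal count distinguishable from background at a
    one-sided confidence level set by k (1.645 ~ 95%): the decision
    threshold scales with the background standard deviation."""
    return k * math.sqrt(background_var)

# Halving the assumed background variance tightens the detection
# threshold by a factor of sqrt(2):
ratio = min_detectable_signal(100.0) / min_detectable_signal(50.0)
```

This is why a measured background variance a factor of 2 smaller than previously assumed translates directly into improved projected system performance.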
Human adaptations for the visual assessment of strength and fighting ability from the body and face
Sell, Aaron; Cosmides, Leda; Tooby, John; Sznycer, Daniel; von Rueden, Christopher; Gurven, Michael
2008-01-01
Selection in species with aggressive social interactions favours the evolution of cognitive mechanisms for assessing physical formidability (fighting ability or resource-holding potential). The ability to accurately assess formidability in conspecifics has been documented in a number of non-human species, but has not been demonstrated in humans. Here, we report tests supporting the hypothesis that the human cognitive architecture includes mechanisms that assess fighting ability—mechanisms that focus on correlates of upper-body strength. Across diverse samples of targets that included US college students, Bolivian horticulturalists and Andean pastoralists, subjects in the US were able to accurately estimate the physical strength of male targets from photos of their bodies and faces. Hierarchical linear modelling shows that subjects were extracting cues of strength that were largely independent of height, weight and age, and that corresponded most strongly to objective measures of upper-body strength—even when the face was all that was available for inspection. Estimates of women's strength were less accurate, but still significant. These studies are the first empirical demonstration that, for humans, judgements of strength and judgements of fighting ability not only track each other, but accurately track actual upper-body strength. PMID:18945661
2014-01-01
Background Integrating rehabilitation services through wearable systems has the potential to accurately assess the type, intensity, duration, and quality of movement necessary for procuring key outcome measures. Objectives This review aims to explore wearable accelerometry-based technology (ABT) capable of assessing mobility-related functional activities intended for rehabilitation purposes in community settings for neurological populations. In this review, we focus on the accuracy of ABT-based methods, types of outcome measures, and the implementation of ABT in non-clinical settings for rehabilitation purposes. Data sources Cochrane, PubMed, Web of Knowledge, EMBASE, and IEEE Xplore. The search strategy covered three main areas, namely wearable technology, rehabilitation, and setting. Study selection Potentially relevant studies were categorized as systems either evaluating methods or outcome parameters. Methods Methodological qualities of studies were assessed by two customized checklists, depending on their categorization and rated independently by three blinded reviewers. Results Twelve studies involving ABT met the eligibility criteria, of which three studies were identified as having implemented ABT for rehabilitation purposes in non-clinical settings. From the twelve studies, seven studies achieved high methodological quality scores. These studies were not only capable of assessing the type, quantity, and quality measures of functional activities, but could also distinguish healthy from non-healthy subjects and/or address disease severity levels. Conclusion While many studies support ABT’s potential for telerehabilitation, few actually utilized it to assess mobility-related functional activities outside laboratory settings. To generate more appropriate outcome measures, there is a clear need to translate research findings and novel methods into practice. PMID:24625308
EVALUATING DISABILITY OVER DISCRETE PERIODS OF TIME
Gill, Thomas M.; Gahbauer, Evelyne A.
2009-01-01
Background To advance the field of disability assessment, additional developmental work is needed. The objective of this study was to determine the potential value of participant recall when evaluating disability over discrete periods of time. Methods We studied 491 residents of greater New Haven, Connecticut who were aged 76 years or older. Participants completed a comprehensive assessment that included several new questions on disability in four essential activities of daily living (bathing, dressing, transferring, and walking). Participants were also assessed for disability in the same activities during monthly telephone interviews before and after the comprehensive assessment. Chronic disability was defined as a new disability that was present for at least three consecutive months. Results We found that up to half of the incident disability episodes, which would otherwise have been missed, can be ascertained if participants are asked to recall whether they have had disability “at any time” since the prior assessment; that these disability episodes, which are ascertained by participant recall, confer high risk for the subsequent development of chronic disability, with an adjusted hazard ratio of 2.5 (95% confidence interval: 1.1, 5.8); and that participant recall for the absence of disability becomes increasingly inaccurate as the duration of the assessment interval increases, with 2.2%, 6.0%, 6.9% and 9.1% of participants having inaccurate recall at 1, 3, 6, and 12 months, respectively. Conclusions Our results demonstrate both the promise and limitations of participant recall and suggest that additional strategies are needed to more completely and accurately ascertain the occurrence of disability among older persons. PMID:18559633
GC-Content Normalization for RNA-Seq Data
2011-01-01
Background Transcriptome sequencing (RNA-Seq) has become the assay of choice for high-throughput studies of gene expression. However, as is the case with microarrays, major technology-related artifacts and biases affect the resulting expression measures. Normalization is therefore essential to ensure accurate inference of expression levels and subsequent analyses thereof. Results We focus on biases related to GC-content and demonstrate the existence of strong sample-specific GC-content effects on RNA-Seq read counts, which can substantially bias differential expression analysis. We propose three simple within-lane gene-level GC-content normalization approaches and assess their performance on two different RNA-Seq datasets, involving different species and experimental designs. Our methods are compared to state-of-the-art normalization procedures in terms of bias and mean squared error for expression fold-change estimation and in terms of Type I error and p-value distributions for tests of differential expression. The exploratory data analysis and normalization methods proposed in this article are implemented in the open-source Bioconductor R package EDASeq. Conclusions Our within-lane normalization procedures, followed by between-lane normalization, reduce GC-content bias and lead to more accurate estimates of expression fold-changes and tests of differential expression. Such results are crucial for the biological interpretation of RNA-Seq experiments, where downstream analyses can be sensitive to the supplied lists of genes. PMID:22177264
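The EDASeq package implements regression-based within-lane normalizations; as a rough illustration of the underlying idea (stratifying genes by GC content and equalizing bin-level count summaries), here is a deliberately simplified sketch that is not the package's actual algorithm:

```python
import numpy as np

def gc_bin_normalize(counts, gc, n_bins=10):
    """Simplified within-lane GC normalization: group genes into
    GC-content bins (by quantile) and rescale each bin so that its
    median count matches the overall median. A reduced sketch of the
    bin-and-equalize idea, not EDASeq's regression-based methods."""
    counts = np.asarray(counts, dtype=float)
    edges = np.quantile(gc, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(gc, edges[1:-1]), 0, n_bins - 1)
    overall = np.median(counts[counts > 0])
    out = counts.copy()
    for b in range(n_bins):
        sel = idx == b
        if not np.any(sel):
            continue
        pos = counts[sel][counts[sel] > 0]
        if pos.size and np.median(pos) > 0:
            # rescale this GC bin so its median matches the lane median
            out[sel] = counts[sel] * (overall / np.median(pos))
    return out
```

After such a within-lane step, a between-lane normalization (e.g., on library size) would follow, as the abstract describes.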
NASA Astrophysics Data System (ADS)
Qin, Ruogu; Xu, Jeff; Xu, Ronald; Kim, Chulhong; Wang, Lihong V.
2010-02-01
Background: Clinical ultrasound (US) uses ultrasonic scattering contrast to characterize subcutaneous anatomic structures. Photoacoustic (PA) imaging detects the functional properties of thick biological tissue with high optical contrast. In the case of image-guided cancer ablation therapy, simultaneous US and PA imaging can be useful for intraoperative assessment of tumor boundaries and ablation margins. In this regard, accurate co-registration between imaging modalities and high sensitivity to cancer cells are important. Methods: We synthesized poly-lactic-co-glycolic acid (PLGA) microbubbles (MBs) and nanobubbles (NBs) encapsulating India ink or indocyanine green (ICG). Multiple tumor simulators were fabricated by entrapping ink MBs or NBs at various concentrations in gelatin phantoms for simultaneous US and PA imaging. MBs and NBs were also conjugated with CC49 antibody to target TAG-72, a human glycoprotein complex expressed in many epithelial-derived cancers. Results: Accurate co-registration and intensity correlation were observed in US and PA images of MB and NB tumor simulators. MBs and NBs conjugated with CC49 effectively bound to over-expressed TAG-72 in LS174T colon cancer cell cultures. ICG was also encapsulated in MBs and NBs for the potential to integrate US, PA, and fluorescence imaging. Conclusions: Multifunctional MBs and NBs can potentially be used as a general contrast agent for multimodal intraoperative imaging of tumor boundaries and therapeutic margins.
Dispatcher Recognition of Stroke Using the National Academy Medical Priority Dispatch System
Buck, Brian H; Starkman, Sidney; Eckstein, Marc; Kidwell, Chelsea S; Haines, Jill; Huang, Rainy; Colby, Daniel; Saver, Jeffrey L
2009-01-01
Background Emergency Medical Dispatchers (EMDs) play an important role in optimizing stroke care if they are able to accurately identify calls regarding acute cerebrovascular disease. This study was undertaken to assess the diagnostic accuracy of the current national protocol guiding dispatcher questioning of 911 callers to identify stroke, QA Guide v 11.1 of the National Academy Medical Priority Dispatch System (MPDS). Methods We identified all Los Angeles Fire Department paramedic transports of patients to UCLA Medical Center during the 12 month period from January to December 2005 in a prospectively maintained database. Dispatcher-assigned MPDS codes for each of these patient transports were abstracted from the paramedic run sheets and compared to final hospital discharge diagnosis. Results Among 3474 transported patients, 96 (2.8%) had a final diagnosis of stroke or transient ischemic attack. Dispatchers assigned a code of potential stroke to 44.8% of patients with a final discharge diagnosis of stroke or TIA. Dispatcher identification of stroke showed a sensitivity of 0.41, specificity of 0.96, positive predictive value of 0.45, and negative predictive value of 0.95. Conclusions Dispatcher recognition of stroke calls using the widely employed MPDS algorithm is suboptimal, with failure to identify more than half of stroke patients as likely stroke. Revisions to the current national dispatcher structured interview and complaint identification algorithm for stroke may facilitate more accurate recognition of stroke by EMDs. PMID:19390065
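The four reported accuracy measures are the standard functions of a 2×2 dispatch-code-versus-discharge-diagnosis table. A brief sketch (the example counts are illustrative, not the study's exact table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a 2x2 table of
    test result (dispatcher code) versus truth (discharge diagnosis)."""
    sensitivity = tp / (tp + fn)   # fraction of true strokes flagged
    specificity = tn / (tn + fp)   # fraction of non-strokes not flagged
    ppv = tp / (tp + fp)           # P(stroke | flagged as stroke)
    npv = tn / (tn + fn)           # P(no stroke | not flagged)
    return sensitivity, specificity, ppv, npv

# Illustrative counts only -- not the study's exact table
sens, spec, ppv, npv = diagnostic_metrics(tp=40, fp=48, fn=56, tn=3330)
```

Note that with a low-prevalence condition (2.8% here), even high specificity yields a modest positive predictive value, which is consistent with the pattern the study reports.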
Chiò, A; Logroscino, G; Traynor, BJ; Collins, J; Simeone, JC; Goldstein, LA; White, LA
2014-01-01
Background Amyotrophic lateral sclerosis (ALS) is relatively rare, yet the economic and social burden is substantial. Having accurate incidence and prevalence estimates would facilitate efficient allocation of healthcare resources. Objective To provide a comprehensive and critical review of the epidemiologic literature on ALS. Methods MEDLINE and EMBASE (1995–2011) databases of population-based studies on ALS incidence and prevalence reporting quantitative data were analyzed. Data extracted included study location and time, design and data sources, case ascertainment methods, and incidence and/or prevalence rates. Medians and inter-quartile ranges (IQRs) were calculated, and ALS case estimates derived using 2010 population estimates. Results In all, 37 articles met inclusion criteria. In Europe, the median (IQR) incidence rate (/100,000 population) was 2.08 (1.47–2.43), corresponding to an estimated 15,355 (10,852–17,938) cases. Median (IQR) prevalence (/100,000 population) was 5.40 (4.06–7.89), or 39,863 (29,971–58,244) prevalent cases. Conclusions Disparity in rates among ALS incidence and prevalence studies may be due to differences in study design or true variations in population demographics, such as age, and geography, including environmental factors and genetic predisposition. Additional large-scale studies that use standardized case ascertainment methods are needed to more accurately assess the true global burden of ALS. PMID:23860588
Nanavati, Tania; Seemaladinne, Nirupama; Regier, Michael; Yossuck, Panitan; Pergami, Paola
2015-01-01
Background Neonatal hypoxic ischemic encephalopathy (HIE) is a major cause of mortality, morbidity, and long-term neurological deficits. Despite the availability of neuroimaging and neurophysiological testing, tools for accurate early diagnosis and prediction of developmental outcome are still lacking. The goal of this study was to determine if combined use of magnetic resonance imaging (MRI) and electroencephalography (EEG) findings could support outcome prediction. Methods We retrospectively reviewed records of 17 HIE neonates, classified brain MRI and EEG findings based on severity, and assessed clinical outcome up to 48 months. We determined the relation between MRI/EEG findings and clinical outcome. Results We demonstrated a significant relationship between MRI findings and clinical outcome (Fisher’s exact test, p = 0.017). EEG provided no additional information about the outcome beyond that contained in the MRI score. The statistical model for outcome prediction based on random forests suggested that EEG readings at 24 hours and 72 hours could be important variables for outcome prediction, but this needs to be investigated further. Conclusion Caution should be used when discussing prognosis for neonates with mild-to-moderate HIE based on early MR imaging and EEG findings. A robust, quantitative marker of HIE severity that allows for accurate prediction of long-term outcome, particularly for mild-to-moderate cases, is still needed. PMID:25862075
Strong, Vivian E.; Selby, Luke V.; Sovel, Mindy; Disa, Joseph J.; Hoskins, William; DeMatteo, Ronald; Scardino, Peter; Jaques, David P.
2015-01-01
Background Studying surgical secondary events is an evolving effort with no current established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database based on grade of event and required intervention to begin prospectively recording and analyzing all surgical secondary events (SSE). Study Design Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 operations that were randomly selected to examine the quality and reliability of the data. Results 1,498 of 4,284 operations during the 3rd quarter of 2008 were audited. 79% (N=1,180) of the operations did not have a secondary event while 21% (N=318) of operations had an identified event. 91% (1,365) of operations were correctly entered into the SSE database. 97% (129/133) of missed secondary events were Grades I and II. Three Grade III (2%) and one Grade IV (1%) secondary events were missed. There were no missed Grade V secondary events. Conclusion Grades III–IV events are more accurately collected than Grades I–II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research. PMID:25319579
An IDEA for Short Term Outbreak Projection: Nearcasting Using the Basic Reproduction Number
Fisman, David N.; Hauck, Tanya S.; Tuite, Ashleigh R.; Greer, Amy L.
2013-01-01
Background Communicable disease outbreaks of novel or existing pathogens threaten human health around the globe. It would be desirable to rapidly characterize such outbreaks and develop accurate projections of their duration and cumulative size even when limited preliminary data are available. Here we develop a mathematical model to aid public health authorities in tracking the expansion and contraction of outbreaks with explicit representation of factors (other than population immunity) that may slow epidemic growth. Methodology The Incidence Decay and Exponential Adjustment (IDEA) model is a parsimonious function that uses the basic reproduction number R0, along with a discounting factor, to project the growth of outbreaks using only basic epidemiological information (e.g., daily incidence counts). Principal Findings Compared to simulated data, IDEA provides highly accurate estimates of total size and duration for a given outbreak when R0 is low or moderate, and also identifies turning points or new waves. When tested with an outbreak of pandemic influenza A (H1N1), the model generates estimated incidence at the (i+1)th serial interval, using data from the ith serial interval, to within an average of 20% of actual incidence. Conclusions and Significance This model for communicable disease outbreaks provides rapid assessments of outbreak growth and public health interventions. Further evaluation in the context of real-world outbreaks will establish the utility of IDEA as a tool for front-line epidemiologists. PMID:24391797
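As published, the IDEA model expresses incidence at serial interval t as I(t) = (R0 / (1 + d)^t)^t, with discount factor d ≥ 0. A minimal sketch of the projection function:

```python
def idea_incidence(R0, d, t):
    """IDEA model: projected incident cases at serial interval t,
    I(t) = (R0 / (1 + d)**t) ** t.

    d >= 0 is the discount factor; d = 0 reduces to pure exponential
    growth at rate R0 per serial interval, while d > 0 makes growth
    slow and eventually contract, giving a finite outbreak.
    """
    return (R0 / (1.0 + d) ** t) ** t
```

With R0 = 2 and no discounting, incidence at interval 3 is 2³ = 8; any positive d yields a smaller value, and summing I(t) over t gives a finite cumulative outbreak size.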
Internal Flows in Free Drops (IFFD)
NASA Technical Reports Server (NTRS)
Trinh, E. H.; Sadhal, Satwindar S.; Thomas, D. A.; Crouch, R. K.
1998-01-01
Within the framework of an Earth-based research task investigating the internal flows within freely levitated drops, a low-gravity technology development experiment has been designed and carried out within the NASA Glovebox facility during the STS-83 and STS-94 Shuttle flights (MSL-1 mission). The goal was narrowly defined as the assessment of the capabilities of a resonant single-axis ultrasonic levitator to stably position free drops in the Shuttle environment with the precision required for the detailed measurement of internal flows. The results of this entirely crew-operated investigation indicate that the approach is fundamentally sound, but also that the ultimate stability of the positioning is highly dependent on the residual acceleration characteristic of the spacecraft and, to a certain extent, on the initial deployment of the drop. The principal results are: the measured dependence of the residual drop rotation and equilibrium drop shape on the ultrasonic power level, the experimental evaluation of the typical drop translational stability in a realistic low-gravity environment, and the semi-quantitative evaluation of background internal flows within quasi-isothermal drops. Based on these results, we conclude that the successful design of a full-scale microgravity experiment is possible and would allow accurate measurement of thermocapillary flows within transparent drops. The need has been demonstrated, however, for the capability to accurately deploy the drop, for a quiescent environment, and for precise mechanical adjustments of the levitator.
Gfeller, Kate; Jiang, Dingfeng; Oleson, Jacob; Driscoll, Virginia; Olszewski, Carol; Knutson, John F.; Turner, Christopher; Gantz, Bruce
2011-01-01
Background Cochlear implants (CI) are effective in transmitting salient features of speech, especially in quiet, but current CI technology is not well suited to transmitting key musical structures (e.g., melody, timbre). It is possible, however, that sung lyrics, which are commonly heard in real-world music, may provide acoustical cues that support better music perception. Objective The purpose of this study was to examine how accurately adults who use CIs (n=87) and those with normal hearing (NH) (n=17) are able to recognize real-world music excerpts based upon musical and linguistic (lyrics) cues. Results CI recipients were significantly less accurate than NH listeners on recognition of real-world music with or, in particular, without lyrics; however, CI recipients whose devices transmitted acoustic plus electric stimulation were more accurate than CI recipients reliant upon electric stimulation alone (particularly on items without linguistic cues). Recognition by CI recipients improved as a function of linguistic cues. Methods Participants were tested on melody recognition of complex melodies (pop, country, classical styles). Results were analyzed as a function of: hearing status and history, device type (electric only or acoustic plus electric stimulation), musical style, linguistic and musical cues, speech perception scores, cognitive processing, music background, age, and in relation to self-report on listening acuity and enjoyment. Age at time of testing was negatively correlated with recognition performance. Conclusions These results have practical implications regarding successful participation of CI users in music-based activities that include recognition and accurate perception of real-world songs (e.g., reminiscence, lyric analysis, listening for enjoyment). PMID:22803258
A novel background field removal method for MRI using projection onto dipole fields (PDF).
Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi
2011-11-01
For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
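The PDF method fits the background field inside the ROI as a least-squares combination of fields generated by dipoles outside the ROI, then subtracts that fit. The full method uses 3-D dipole kernels; the toy 1-D sketch below (invented basis and names) only illustrates the fit-inside-then-subtract-everywhere structure:

```python
import numpy as np

def remove_background_lstsq(field, basis, roi_mask):
    """Toy 1-D analogue of PDF-style background removal: fit the
    measured field inside the ROI with basis fields generated by
    sources outside the ROI (ordinary least squares), then subtract
    the fitted background everywhere.

    field    : (N,) measured total field
    basis    : (N, K) columns = fields of candidate external sources
    roi_mask : (N,) boolean, True inside the region of interest
    """
    A = basis[roi_mask]                  # fit restricted to the ROI
    b = field[roi_mask]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    background = basis @ coeffs          # background estimate everywhere
    return field - background            # local-field estimate
```

The approximate orthogonality observation in the abstract is what makes this decomposition well posed: fields of internal dipoles are poorly represented by the external-dipole basis, so they survive the subtraction as the local field.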
Yoon, S; Pak, M-J; Park, S; Yoo, J; Ha, W-H; Jang, H-K; Kim, J K
2014-12-01
(32)P measurements of urine samples and internal dose assessments were conducted for workers in life science laboratories. A procedure for sample pre-treatment was established and validation was performed to exclude interference and to detect (32)P levels accurately. The detection conditions for Cherenkov radiation were evaluated and the accuracy of Cherenkov radiation measurements validated. The analytical and measurement procedures were applied to urine samples collected from 11 workers from life sciences laboratories. The results of the measurements generally indicated very low background radiation levels, but daily urine samples from two workers were above the minimum detectable activity. The (32)P concentrations for two of the workers were 29.3 ± 10.4 Bq•d(-1) and 24.1 ± 11.8 Bq•d(-1), respectively, at intake levels of 4.12 kBq and 2.61 kBq. The effective doses for these two workers were 4.6 μSv and 2.9 μSv. Overall, the results indicate very low levels of radioactivity, except for cases related to specific working conditions.
Schilling, Katherine; Applegate, Rachel
2012-01-01
Objectives and Background: Libraries are increasingly called upon to demonstrate student learning outcomes and the tangible benefits of library educational programs. This study reviewed and compared the efficacy of traditionally used measures for assessing library instruction, examining the benefits and drawbacks of assessment measures and exploring the extent to which knowledge, attitudes, and behaviors actually paralleled demonstrated skill levels. Methods: An overview of recent literature on the evaluation of information literacy education addressed these questions: (1) What evaluation measures are commonly used for evaluating library instruction? (2) What are the pros and cons of popular evaluation measures? (3) What are the relationships between measures of skills versus measures of attitudes and behavior? Research outcomes were used to identify relationships between measures of attitudes, behaviors, and skills, which are typically gathered via attitudinal surveys, written skills tests, or graded exercises. Results and Conclusions: Results provide useful information about the efficacy of instructional evaluation methods, including showing significant disparities between attitudes, skills, and information usage behaviors. This information can be used by librarians to implement the most appropriate evaluation methods for measuring important variables that accurately demonstrate students' attitudes, behaviors, or skills. PMID:23133325
Assessment of a head-mounted miniature monitor
NASA Technical Reports Server (NTRS)
Hale, J. P., II
1992-01-01
Two experiments were conducted to assess the capabilities and limitations of the Private Eye, a miniature, head-mounted monitor. The first experiment compared the Private Eye with a cathode ray tube (CRT) and hard copy in both a constrained and an unconstrained work envelope. The task was a simulated maintenance and assembly task that required frequent reference to the displayed information. A main effect of presentation media indicated faster placement times using the CRT as compared with hard copy. There were no significant differences between the Private Eye and either the CRT or hard copy for identification, placement, or total task times. The goal of the second experiment was to determine the effects of various local visual parameters on the ability of the user to accurately perceive the information on the Private Eye. The task was an interactive video game. No significant performance differences were found under bright or dark ambient illumination, or with visually simple or complex task backgrounds. Glare reflected off the bezel surrounding the monitor did degrade performance. It was concluded that this head-mounted, miniature monitor could serve a useful role for in situ operations, especially in microgravity environments.
Understanding auditory distance estimation by humpback whales: a computational approach.
Mercado, E; Green, S R; Schneider, J N
2008-02-01
Ranging, the ability to judge the distance to a sound source, depends on the presence of predictable patterns of attenuation. We measured long-range sound propagation in coastal waters to assess whether humpback whales might use frequency degradation cues to range singing whales. Two types of neural networks, a multi-layer and a single-layer perceptron, were trained to classify recorded sounds by distance traveled based on their frequency content. The multi-layer network successfully classified received sounds, demonstrating that the distorting effects of underwater propagation on frequency content provide sufficient cues to estimate source distance. Normalizing received sounds with respect to ambient noise levels increased the accuracy of distance estimates by single-layer perceptrons, indicating that familiarity with background noise can potentially improve a listening whale's ability to range. To assess whether frequency patterns predictive of source distance were likely to be perceived by whales, recordings were pre-processed using a computational model of the humpback whale's peripheral auditory system. Although signals processed with this model contained less information than the original recordings, neural networks trained with these physiologically based representations estimated source distance more accurately, suggesting that listening whales should be able to range singers using distance-dependent changes in frequency content.
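The single-layer perceptron used in the study can be sketched with the classic learning rule; the band-energy features and data below are invented for illustration and are not the paper's recordings or preprocessing:

```python
import numpy as np

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Single-layer perceptron with the classic error-driven update,
    trained to classify feature vectors into two classes (0/1)."""
    rng = np.random.default_rng(0)
    w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append bias input
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            pred = 1 if Xb[i] @ w > 0 else 0
            w += lr * (y[i] - pred) * Xb[i]      # update only on errors
    return w

# Invented band-energy features: "far" sounds (class 1) have an
# attenuated high-frequency band, mimicking distance-dependent
# frequency degradation in underwater propagation.
rng = np.random.default_rng(1)
low_band = rng.normal(1.0, 0.05, (40, 1))        # similar in both classes
high_near = rng.normal(1.0, 0.05, (20, 1))
high_far = rng.normal(0.3, 0.05, (20, 1))
X = np.vstack([np.hstack([low_band[:20], high_near]),
               np.hstack([low_band[20:], high_far])])
y = np.array([0] * 20 + [1] * 20)
w = train_perceptron(X, y)
```

Because a single-layer perceptron can only learn linearly separable distinctions, normalizing received sounds against ambient noise (as the study did) effectively makes the distance-dependent spectral cues closer to separable in the feature space.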
Periodontal disease and metabolic syndrome: A qualitative critical review of their association
Watanabe, Keiko; Cho, Yale D.
2014-01-01
Background Metabolic syndrome (MetS) is a conglomerate of several physical conditions/diseases that, as a group, increases the risk of mortality resulting from development of type 2 diabetes mellitus (T2DM) and cardiovascular diseases (CVD). These conditions/diseases include glucose intolerance/insulin resistance, hypertension, obesity, and dyslipidemia. The results from epidemiological studies suggest that there is an association between MetS and periodontitis; it is therefore important to understand the current status of the association and a possible contribution of periodontitis to MetS. Objective This review will qualitatively analyze published papers on the association of MetS and periodontitis/periodontal disease to clarify the current status of the association and suggest future directions for studies which may unravel the causal relationship between them. Results Of 309 papers related to MetS and periodontitis, 26 are original research papers that investigated the relationship/association between periodontal disease and MetS. Criteria used to assess periodontitis and MetS as well as overall study designs and patient recruitment criteria varied greatly among these studies. Conclusion All these studies demonstrated a positive association between periodontal disease and MetS. However, due to the heterogeneity of criteria to assess periodontitis and MetS and also the paucity of longitudinal studies, it is difficult to determine the relative contribution of periodontitis to MetS. Age and the number of positive components of MetS appear to strengthen the relationship; however, the incidence of each disease entity increases with ageing. Thus, mechanistic studies are also necessary to unravel the inter-relationship between periodontitis and MetS. In this regard, the use of animal models will be helpful, as they are more uniform in genetic background and have minimal confounding factors.
Finally, development of an accurate, quantitative assessment of gingival inflammation is necessary in order to determine the influence of periodontal disease on the development of MetS and its components. PMID:24880501
Unstable matter and the 1-10 MeV gamma-ray background
NASA Technical Reports Server (NTRS)
Daly, Ruth A.
1988-01-01
The spectrum of photons produced by an unstable particle which decayed while the universe was young is calculated. This spectrum is compared to that of the 1-10 MeV shoulder, a feature of the high-energy, extragalactic gamma-ray background, whose origin has not yet been determined. The calculated spectrum contains two parameters which are adjusted to obtain a maximal fit to the observed spectrum; the fit thus obtained is accurate to the 99 percent confidence level. The implications for the mass, lifetime, initial abundance, and branching ratio of the unstable particle are discussed.
Put the Family Back in Family Health History: A Multiple-Informant Approach.
Lin, Jielu; Marcum, Christopher S; Myers, Melanie F; Koehly, Laura M
2017-05-01
An accurate family health history is essential for individual risk assessment. This study uses a multiple-informant approach to examine whether family members have consistent perceptions of shared familial risk for four common chronic conditions (heart disease, Type 2 diabetes, high cholesterol, and hypertension) and whether accounting for inconsistency in family health history reports leads to more accurate risk assessment. In 2012-2013, individual and family health histories were collected from 127 adult informants of 45 families in the Greater Cincinnati Area. Pedigrees were linked within each family to assess inter-informant (in)consistency regarding common biological family members' health histories. An adjusted risk assessment based on pooled pedigrees of multiple informants was evaluated to determine whether it could more accurately identify individuals affected by common chronic conditions, using self-reported disease diagnoses as a validation criterion. Analysis was completed in 2015-2016. Inter-informant consistency in family health history reports was 54% for heart disease, 61% for Type 2 diabetes, 43% for high cholesterol, and 41% for hypertension. Compared with the unadjusted risk assessment, the adjusted risk assessment correctly identified an additional 7%-13% of the individuals who had been diagnosed, with a ≤2% increase in cases that were predicted to be at risk but had not been diagnosed. Considerable inconsistency exists in individual knowledge of their family health history. Accounting for such inconsistency can, nevertheless, lead to a more accurate genetic risk assessment tool. A multiple-informant approach is potentially powerful when coupled with technology to support clinical decisions. Published by Elsevier Inc.
Childhood Chronic Physical Aggression Associates with Adult Cytokine Levels in Plasma
Provençal, Nadine; Suderman, Matthew J.; Vitaro, Frank; Szyf, Moshe; Tremblay, Richard E.
2013-01-01
Background An increasing number of animal and human studies are indicating that inflammation is associated with behavioral disorders including aggression. This study investigates the association between chronic physical aggression during childhood and plasma cytokine levels in early adulthood. Methodology/Principal Findings Two longitudinal studies were used to select males on a chronic physical aggression trajectory from childhood to adolescence (n = 7) and a control group from the same background (n = 25). Physical aggression was assessed yearly by teachers from childhood to adolescence and plasma levels of 10 inflammatory cytokines were assessed at age 26 and 28 years. Compared to the control group, males on a chronic physical aggression trajectory from childhood to adolescence had consistently lower plasma levels of five cytokines: lower pro-inflammatory interleukins IL-1α (T(28.7) = 3.48, P = 0.002) and IL-6 (T(26.9) = 3.76, P = 0.001), lower anti-inflammatory interleukin IL-4 (T(27.1) = 4.91, P = 0.00004) and IL-10 (T(29.8) = 2.84, P = 0.008) and lower chemokine IL-8 (T(26) = 3.69, P = 0.001). The plasma levels of four cytokines accurately predicted aggressive and control group membership for all subjects. Conclusions/Significance Physical aggression of boys during childhood is a strong predictor of reduced plasma levels of cytokines in early adulthood. The causal and physiological relations underlying this association should be further investigated since animal data suggest that some cytokines such as IL-6 and IL-1β play a causal role in aggression. PMID:23922720
Exposure assessment of diesel bus emissions.
Yip, Maricela; Madl, Pierre; Wiegand, Aaron; Hofmann, Werner
2006-12-01
The goal of this study was to measure ultrafine particle concentrations with diameters less than 1 μm emitted by diesel buses and to assess the resulting human exposure levels. The study was conducted at the Woolloongabba Busway station in Brisbane, Australia in the winter months of 2002, during which temperature inversions frequently occurred. Most buses that utilize the station are fuelled by diesel, the exhaust of which contains a significant quantity of particulate matter. Passengers waiting at the station are exposed to these particles emitted from the buses. During the course of this study, a passenger census was conducted based on video surveillance, yielding person-by-person waiting-time data. Furthermore, a bus census revealed accurate information about the total number of diesel versus Compressed Natural Gas (CNG) powered buses. Background (outside of the bus station) and platform measurements of ultrafine particle number size distributions were made to determine ambient aerosol concentrations. Particle number exposure concentrations ranged from 10% to 40-60% of bus-related exhaust fumes. This changes dramatically when considering particle mass exposure concentrations, where most passengers are exposed to about 50-80% of exhaust fumes. The data obtained are valuable for comparison with similar studies, since previous work has shown that exhaust emissions cause cancer in laboratory animals. It was assumed that significant differences between platform and background distributions were due to bus emissions which, combined with passenger waiting times, yielded an estimate of passenger exposure to ultrafine particles from diesel buses. From an exposure point of view, the Busway station analyzed resembles a street canyon. 
Although the detected exhaust particle concentration at the outbound platform is found to be in the picogram range, exposure increases with the time passengers spend on the platform along with their breathing frequency.
ERIC Educational Resources Information Center
Pistone, Nancy
This handbook is a guide to help educators and administrators with the decisions they face in the design of an arts assessment. The guide is divided into two broad parts: Part 1: "Background for Thoughtful Arts Education Assessment"; and Part 2: "Assessment Design in Action." The guide includes: (1) a brief background on the…
Unterrainer, Marcus; Vettermann, Franziska; Brendel, Matthias; Holzgreve, Adrien; Lifschitz, Michael; Zähringer, Matthias; Suchorska, Bogdana; Wenter, Vera; Illigens, Ben M; Bartenstein, Peter; Albert, Nathalie L
2017-12-01
PET with O-(2-[18F]fluoroethyl)-L-tyrosine (18F-FET) has reached increasing clinical significance for patients with brain neoplasms. For quantification of standard PET-derived parameters such as the tumor-to-background ratio, the background activity is assessed using a region of interest (ROI) or volume of interest (VOI) in unaffected brain tissue. However, there is no standardized approach regarding the assessment of the background reference. Therefore, we evaluated the intra- and inter-reader variability of commonly applied approaches for clinical 18F-FET PET reading. The background activity of 20 18F-FET PET scans was independently evaluated by 6 readers using (i) a simple 2D-ROI, (ii) a spherical VOI with 3.0 cm diameter, and (iii) a VOI consisting of crescent-shaped ROIs; each in the contralateral, non-affected hemisphere including white and gray matter, in line with the European Association of Nuclear Medicine (EANM) and German guidelines. To assess intra-reader variability, each scan was evaluated 10 times by each reader. The coefficient of variation (CoV) was assessed for determination of intra- and inter-reader variability. In a second step, the best method was refined by instructions for a guided background activity assessment and validated on 10 further scans. Compared to the other approaches, the crescent-shaped VOIs revealed the most stable results, with the lowest intra-reader variability (median CoV 1.52%, spherical VOI 4.20%, 2D-ROI 3.69%; p < 0.001) and inter-reader variability (median CoV 2.14%, spherical VOI 4.02%, 2D-ROI 3.83%; p = 0.001). Using the guided background assessment, both intra-reader variability (median CoV 1.10%) and inter-reader variability (median CoV 1.19%) could be reduced even further. The commonly applied methods for background activity assessment show different variability, which might hamper 18F-FET PET quantification and comparability in multicenter settings. 
The proposed background activity assessment using a (guided) crescent-shaped VOI allows minimization of both intra- and inter-reader variability and might facilitate comprehensive methodological standardization of amino acid PET which is of interest in the light of the anticipated EANM technical guidelines.
Integration of neutron time-of-flight single-crystal Bragg peaks in reciprocal space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Arthur J; Joergensen, Mads; Wang, Xiaoping
2014-01-01
The intensities of single-crystal Bragg peaks obtained by mapping neutron time-of-flight event data into reciprocal space and integrating in various ways are compared. These include spherical integration with a fixed radius, ellipsoid fitting and integration of the peak intensity, and one-dimensional peak-profile fitting. In comparison to intensities obtained by integrating in real detector histogram space, integrating in reciprocal space results in better agreement factors and more accurate atomic parameters. Furthermore, structure refinement using integrated intensities from one-dimensional profile fitting is demonstrated to be more accurate than simple peak-minus-background integration.
Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming
2015-07-01
We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.
Meta-Analysis of the Effects of Early Education Interventions on Cognitive and Social Development
ERIC Educational Resources Information Center
Camilli, Gregory; Vargas, Sadako; Ryan, Sharon; Barnett, W. Steven
2010-01-01
Background/Context: There is much current interest in the impact of early childhood education programs on preschoolers and, in particular, on the magnitude of cognitive and affective gains. Purpose/Objective/Research Question/Focus of Study: Because this new segment of public education requires significant funding, accurate descriptions are…
Teenagers' Web Questions Compared with a Sexuality Curriculum: An Exploration
ERIC Educational Resources Information Center
Goldman, Juliette D. G.; McCutchen, Lisa E.
2012-01-01
Background: Teenagers need information about their changing bodies. Many young people do not receive adequate or accurate puberty/sexuality education from their parents or school, so many teenagers are going online to have their sexuality questions answered. Purpose: This research examines teenagers' web questions on sexuality, and an example of…
A Meta-Analysis of Adult-Rated Child Personality and Academic Performance in Primary Education
ERIC Educational Resources Information Center
Poropat, Arthur E.
2014-01-01
Background: Personality is reliably associated with academic performance, but personality measurement in primary education can be problematic. Young children find it difficult to accurately self-rate personality, and dominant models of adult personality may be inappropriate for children. Aims: This meta-analysis was conducted to determine the…
ERIC Educational Resources Information Center
Kern, Ben D.; Graber, Kim C.; Shen, Sa; Hillman, Charles H.; McLoughlin, Gabriella
2018-01-01
Background: Socioeconomic status (SES) is the most accurate predictor of academic performance in US schools. Third-grade reading is highly predictive of high school graduation. Chronic physical activity (PA) is shown to improve cognition and academic performance. We hypothesized that school-based PA opportunities (recess and physical education)…
Prevalence and Severity of Voice and Swallowing Difficulties in Mitochondrial Disease
ERIC Educational Resources Information Center
Read, Jennifer L.; Whittaker, Roger G.; Miller, Nick; Clark, Sue; Taylor, Robert; McFarland, Robert; Turnbull, Douglass
2012-01-01
Background: Mutations of mitochondrial DNA (mtDNA) cause a broad spectrum of clinical phenotypes. Anecdotal evidence suggests that voice and swallow problems are a common feature of these diseases. Aims: To characterize accurately the prevalence and severity of voice and swallow problems in a large cohort of patients with mitochondrial disease.…
General Knowledge Monitoring as a Predictor of In-Class Exam Performance
ERIC Educational Resources Information Center
Hartwig, Marissa K.; Was, Chris A.; Isaacson, Randy M.; Dunlosky, John
2012-01-01
Background: Current theories of self-regulated learning predict a positive link between student monitoring accuracy and performance: students who more accurately monitor their knowledge of a particular set of materials are expected to more effectively regulate their subsequent study of those materials, which in turn should lead to higher test…
Evaluation of Autism-Related Health Information on the Web
ERIC Educational Resources Information Center
Grant, Nicole; Rodger, Sylvia; Hoffmann, Tammy
2015-01-01
Background: The Internet is a frequently accessed source of information for parents of a child with autism. To help parents make informed decisions about treatment options, websites should contain accurate information. This study aimed to evaluate the quality of information in a sample of autism-relevant websites. Materials and Methods:…
ERIC Educational Resources Information Center
Summers, Van; Molis, Michelle R.
2004-01-01
Listeners with normal-hearing sensitivity recognize speech more accurately in the presence of fluctuating background sounds, such as a single competing voice, than in unmodulated noise at the same overall level. These performance differences are greatly reduced in listeners with hearing impairment, who generally receive little benefit from…
The Impact of Weight Perception on the Health Behaviors of College Students
ERIC Educational Resources Information Center
Osborn, Jessica; Naquin, Mildred; Gillan, Wynn; Bowers, Ashley
2016-01-01
Background: Obesity has links to numerous health problems. Having an accurate perception of one's own weight is an important aspect of maintaining an appropriate weight. Purpose: The purpose of this study was to examine relationships among perceived body weight, actual body weight, body satisfaction, and selected health behaviors. Methods: The…
Depth-color fusion strategy for 3-D scene modeling with Kinect.
Camplani, Massimo; Mantecon, Tomas; Salgado, Luis
2013-12-01
Low-cost depth cameras, such as Microsoft Kinect, have completely changed the world of human-computer interaction through controller-free gaming applications. Depth data provided by the Kinect sensor presents several noise-related problems that have to be tackled to improve the accuracy of the depth data, thus obtaining more reliable game control platforms and broadening its applicability. In this paper, we present a depth-color fusion strategy for 3-D modeling of indoor scenes with Kinect. Accurate depth and color models of the background elements are iteratively built and used to detect moving objects in the scene. Kinect depth data is processed with an innovative adaptive joint-bilateral filter that efficiently combines depth and color by analyzing an edge-uncertainty map and the detected foreground regions. Results show that the proposed approach efficiently tackles the main Kinect data problems: distance-dependent depth maps, spatial noise, and temporal random fluctuations are dramatically reduced; object depth boundaries are refined; and nonmeasured depth pixels are interpolated. Moreover, a robust depth and color background model and accurate moving-object silhouettes are generated.
Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images
Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu
2013-01-01
With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and performs reliably in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior to existing thresholding methods. PMID:23525856
Information, history and background on the development and maintenance of NOx and SOx (ecological criteria) assessment. There are 2 separate sites that have the background and history on the individual NOx/SOx health assessments.
Image Segmentation Using Minimum Spanning Tree
NASA Astrophysics Data System (ADS)
Dewi, M. P.; Armiati, A.; Alvini, S.
2018-04-01
This research aims to segment digital images. Segmentation separates an object from the background so that the object can be processed for other purposes. Along with the development of digital image processing applications, the segmentation process becomes increasingly necessary. The segmented image should be accurate, because subsequent processing depends on interpreting the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method separates an object from the background, converting the image to a binary image: the object of interest is set to white, while the background is black, or vice versa.
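The MST-based cut sketched below is one minimal way to realize the approach described in this abstract, assuming SciPy's sparse-graph routines; the 4-connected grid, the +1 weight offset, and the brightness-based foreground rule are choices of this illustration, not details taken from the article:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_segment(img, cut_thresh):
    """Binarize a grayscale image by cutting heavy edges of its grid-graph MST."""
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    rows, cols, weights = [], [], []
    # 4-connected grid graph; edge weight = intensity difference (+1 so that
    # zero-difference edges remain representable in the sparse matrix)
    flat = img.ravel().astype(int)
    for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
        rows.append(a.ravel())
        cols.append(b.ravel())
        weights.append(np.abs(flat[a.ravel()] - flat[b.ravel()]) + 1)
    g = csr_matrix((np.concatenate(weights),
                    (np.concatenate(rows), np.concatenate(cols))),
                   shape=(h * w, h * w))
    mst = minimum_spanning_tree(g)
    mst.data[mst.data > cut_thresh + 1] = 0  # cut edges with raw diff > cut_thresh
    mst.eliminate_zeros()
    n_comp, labels = connected_components(mst, directed=False)
    labels = labels.reshape(h, w)
    # call the component with the highest mean intensity the object (white)
    means = [img[labels == k].mean() for k in range(n_comp)]
    return (labels == int(np.argmax(means))).astype(np.uint8) * 255
```

With a uniform bright square on a dark background, the only heavy MST edge is the one bridging object and background, so cutting it yields exactly two components and a clean binary mask.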
Song, Zhixin; Tang, Wenzhong; Shan, Baoqing
2017-10-01
Evaluating heavy metal pollution status and ecological risk in river sediments is a complex task, requiring consideration of contaminant pollution levels as well as the effects of biological processes within the river system. There are currently no simple or low-cost approaches to heavy metal assessment in river sediments. Here, we introduce a system of assessment for the pollution status of heavy metals in river sediments, using measurements of Cd in the Shaocun River sediments as a case study. This system can be used to identify high-risk zones of the river that should be given more attention. First, we evaluated the pollution status of Cd in the river sediments based on their total Cd content and calculated a risk assessment using local geochemical background values at various sites along the river. We then used both acetic acid and ethylenediaminetetraacetic acid (EDTA) to extract the fractions of Cd in the sediments, and used diffusive gradients in thin films (DGT) to evaluate the bioavailability of Cd; thus, DGT provided a measure of potentially bioavailable Cd concentrations in the sediments. Last, we measured Cd contents in plant tissue collected at the same sites for comparison with our other measures. A Pearson's correlation analysis showed that Cd-Plant correlated significantly with Cd-HAc (r = 0.788, P < 0.01), Cd-EDTA (r = 0.925, P < 0.01), Cd-DGT (r = 0.976, P < 0.01), and Cd-Total (r = 0.635, P < 0.05). We demonstrate that this system of assessment is a useful means of assessing heavy metal pollution status and ecological risk in river sediments. Copyright © 2017 Elsevier Ltd. All rights reserved.
Radiographic evaluation of BFX acetabular component position in dogs.
Renwick, Alasdair; Gemmill, Toby; Pink, Jonathan; Brodbelt, David; McKee, Malcolm
2011-07-01
To assess the reliability of radiographic measurement of angle of lateral opening (ALO) and angle of version of BFX acetabular cups. In vitro radiographic study. BFX cups (24, 28, and 32 mm). Total hip replacement constructs (cups, 17 mm femoral head and a #7 CFX stem) were mounted on an inclinometer. Ventrodorsal radiographs were obtained with ALO varying between 21° and 70° and inclination set at 0°, 10°, 20°, and 30°. Radiographs were randomized using a random sequence generator. Three observers blinded to the radiograph order assessed ALO using 3 methods: (1) an ellipse method based on trigonometry; (2) using a measurement from the center of the femoral head to the truncated surface of the cup; (3) by visual estimation using a reference chart. Version was measured by assessing the ventral edge of the truncated surface. ALO methods 2 and 3 were accurate and precise to within 10° and were significantly more accurate and precise than method 1 (P < .001). All methods were significantly less accurate with increasing inclination. Version measurement was accurate and precise to within 7° with 0-20° of inclination, but significantly less accurate with 30° of inclination. Methods 2 and 3, but not method 1, were sufficiently accurate and precise to be clinically useful. Version measurement was clinically useful when inclination was ≤ 20°. © Copyright 2011 by The American College of Veterinary Surgeons.
Henry, Luke C.; Elbin, RJ; Collins, Michael W.; Marchetti, Gregory; Kontos, Anthony P.
2016-01-01
Background Previous research estimates that the majority of athletes with sport-related concussion (SRC) will recover between 7–10 days following injury. This short, temporal window of recovery is predominately based on symptom resolution and cognitive improvement, and does not accurately reflect recent advances to the clinical assessment model. Objective To characterize SRC recovery at 1-week post-injury time intervals on symptom, neurocognitive, and vestibular-oculomotor outcomes, and examine gender differences on SRC recovery time. Methods A prospective, repeated measures design was used to examine the temporal resolution of neurocognitive, symptom, and vestibular-oculomotor impairment in 66 subjects (16.5 ± 1.9 years, range 14–23, 64% male) with SRC. Results Recovery time across all outcomes was between 21–28 days post SRC for most athletes. Symptoms demonstrated the greatest improvement in the first 2 weeks, while neurocognitive impairment lingered across various domains up to 28 days post SRC. Vestibular-oculomotor decrements also resolved between one to three weeks post injury. There were no gender differences in neurocognitive recovery. Males were more likely to be asymptomatic by the fourth week and reported less vestibular-oculomotor impairment than females at weeks 1 and 2. Conclusion When utilizing the recommended “comprehensive” approach for concussion assessment, recovery time for SRC is approximately three to four weeks, which is longer than the commonly reported 7–14 days. Sports medicine clinicians should use a variety of complementing assessment tools to capture the heterogeneity of SRC. PMID:26445375
NASA Astrophysics Data System (ADS)
Clancy, R. T.; Wolff, M. J.; Malin, M. C.; Cantor, B. A.
2010-12-01
MARCI UV band imaging photometry within (260 nm) and outside (320 nm) the Hartley ozone band absorption supports daily global mapping of Mars ozone column abundances. Key retrieval issues include accurate UV radiometric calibrations, detailed specifications of surface and atmospheric background reflectance (surface albedo, atmospheric Rayleigh and dust scattering/absorption), and simultaneous cloud retrievals. The implementation of accurate radiative transfer (RT) treatments of these processes has been accomplished (Wolff et al., 2010) such that daily global mapping retrievals for Mars ozone columns have been completed for the 2006-2010 period of MARCI global imaging. Ozone retrievals are most accurate for high column abundances associated with mid-to-high latitude regions during fall, winter, and spring seasons. We present a survey of these MARCI ozone column retrievals versus season, latitude, longitude, and year.
Afghan National Army: DOD Has Taken Steps to Remedy Poor Management of Vehicle Maintenance Program
2016-07-01
contract and program were designed to promote the accurate assessment of Afghan vehicle maintenance needs, contractor performance, and cost containment; (2) the U.S. government provided effective management and oversight of contractor performance; and (3) the contract met its program objectives... maintenance, (2) underestimated the cost of spare parts, and (3) established performance metrics that did not accurately assess contractor performance or
Can blind persons accurately assess body size from the voice?
Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka
2016-04-01
Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. © 2016 The Author(s).
Non-stationary background intensity and Caribbean seismic events
NASA Astrophysics Data System (ADS)
Valmy, Larissa; Vaillant, Jean
2014-05-01
We consider seismic risk calculation based on models with non-stationary background intensity. The aim is to improve predictive strategies in the framework of seismic risk assessment, using models that best describe the seismic activity in the Caribbean arc. Appropriate statistical methods are required for analyzing the volumes of data collected. The focus is on calculating earthquake occurrence probabilities and analyzing the spatiotemporal evolution of these probabilities. The main modeling tool is point process theory, which takes into account the history prior to a given date. Thus, the seismic event conditional intensity is expressed by means of the background intensity and the self-exciting component. This intensity can be interpreted as the expected event rate per unit time and/or surface. The most popular intensity model in seismology is the ETAS (Epidemic Type Aftershock Sequence) model, introduced and then generalized by Ogata [2, 3]. We extended this model and performed a comparison of different probability density functions for the triggered event times [4]. We illustrate our model using the CDSA (Centre de Données Sismiques des Antilles) catalog [1], which contains more than 7000 seismic events that occurred in the Lesser Antilles arc. Statistical tools for testing the stationarity of the background intensity and for dynamical segmentation are presented. [1] Bengoubou-Valérius M., Bazin S., Bertil D., Beauducel F. and Bosson A. (2008). CDSA: a new seismological data center for the French Lesser Antilles, Seismol. Res. Lett., 79 (1), 90-102. [2] Ogata Y. (1998). Space-time point-process models for earthquake occurrences, Annals of the Institute of Statistical Mathematics, 50 (2), 379-402. [3] Ogata, Y. (2011). Significant improvements of the space-time ETAS model for forecasting of accurate baseline seismicity, Earth, Planets and Space, 63 (3), 217-229. [4] Valmy L. and Vaillant J. (2013). 
Statistical models in seismology: Lesser Antilles arc case, Bull. Soc. géol. France, 2013, 184 (1), 61-67.
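The ETAS conditional intensity named in this abstract has a standard closed form: a background rate plus an Omori-Utsu aftershock term for each past event. A minimal sketch follows; the parameter values are illustrative placeholders, not fitted values from the CDSA catalog:

```python
import math

def etas_intensity(t, mu, events, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity at time t: background rate mu plus one
    Omori-Utsu decay term per past event (t_i, magnitude m_i) with t_i < t.

    lambda(t) = mu + sum_i K * exp(alpha * (m_i - m0)) / (t - t_i + c)**p
    """
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate
```

With no past events the intensity reduces to the background rate mu; each triggered term raises the rate just after an event and decays as a power law of elapsed time.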
Štrbac, Snežana; Kašanin Grubin, Milica; Vasić, Nebojša
2017-11-30
The main objective of this paper is to evaluate how the choice of different background values may affect the assessment of anthropogenic heavy metal pollution in sediments from the Tisza River (Serbia). The second objective is to underline the significance of using geochemical background values when establishing quality criteria for sediment. The enrichment factor (EF), geoaccumulation index (Igeo), pollution load index (PLI), and potential ecological risk index (PERI) were calculated using different background values. Three geochemical methods (average metal concentrations in the continental crust, average metal concentrations in shale, and average metal concentrations in non-contaminated core sediment samples) and two statistical methods (the delineation method and principal component analysis) were used for calculating background values. It can be concluded that the resulting assessment of pollution status can depend more on the choice of background values than on the index/factor chosen. The best option for assessing potential river sediment contamination is to compare the measured concentrations of the analyzed elements with concentrations in mineralogically and texturally comparable, uncontaminated core sediment samples. Geochemical background values should be taken into account when establishing quality criteria for soils, sediments, and waters. Due to the complexity of the local lithology, it is recommended that environmental monitoring and assessment include selection of appropriate background values to gain understanding of the geochemistry and potential sources of pollution in a given environment.
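The first three indices named in this abstract have standard closed forms, which makes the paper's point concrete: every one of them takes a background value as input, so the choice of background directly scales the result. A minimal sketch (classification thresholds and the reference element are the analyst's choice, not values from this paper):

```python
import math

def igeo(c_sample, c_background):
    """Geoaccumulation index (Mueller): Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(c_metal, c_ref, b_metal, b_ref):
    """EF = (C_metal / C_ref)_sample / (B_metal / B_ref)_background,
    normalized to a conservative reference element (commonly Al or Fe)."""
    return (c_metal / c_ref) / (b_metal / b_ref)

def pli(c_samples, c_backgrounds):
    """Pollution load index: geometric mean of contamination factors Cn/Bn."""
    cfs = [c / b for c, b in zip(c_samples, c_backgrounds)]
    return math.prod(cfs) ** (1.0 / len(cfs))
```

Doubling the assumed background value lowers Igeo by exactly one unit and halves each contamination factor, which is why the background choice can dominate the index choice.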
Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.
Heredia, Nicholas J
2018-01-01
Digital PCR is a valuable tool for quantifying next-generation sequencing (NGS) libraries precisely and accurately. Accurate quantification enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing under- and overloading errors. It also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves the sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.
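Droplet digital PCR quantification generally rests on Poisson statistics: because a droplet can hold more than one template copy, the mean copies per droplet is recovered from the fraction of positive droplets as -ln(1 - p). A sketch of that correction, assuming a nominal ~0.85 nL droplet volume typical of common ddPCR systems (an assumption of this sketch, not a value from this entry):

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
    """Poisson-corrected ddPCR concentration in copies per microliter.
    p is the fraction of positive droplets; -ln(1 - p) is the estimated
    mean number of template copies per droplet."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / droplet_volume_ul

conc = ddpcr_concentration(5000, 15000)  # roughly 477 copies/uL
```

The correction matters most at high positive fractions, where naive counting of positive droplets would substantially underestimate the library concentration.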
Accurate Modeling of the Terrestrial Gamma-Ray Background for Homeland Security Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandness, Gerald A.; Schweppe, John E.; Hensley, Walter K.
2009-10-24
The Pacific Northwest National Laboratory has developed computer models to simulate the use of radiation portal monitors to screen vehicles and cargo for the presence of illicit radioactive material. The gamma radiation emitted by the vehicles or cargo containers must often be measured in the presence of a relatively large gamma-ray background, mainly due to the presence of potassium, uranium, and thorium (and progeny isotopes) in the soil and surrounding building materials. This large background often limits the detection sensitivity for items of interest and must be modeled accurately when analyzing homeland security scenarios. Calculations of the expected gamma-ray emission from a disk of soil and asphalt were made using the Monte Carlo transport code MCNP and were compared to measurements made at a seaport with a high-purity germanium detector. Analysis revealed that the energy spectrum of the measured background could not be reproduced unless the model included gamma rays coming from the ground out to distances of at least 300 m. The contribution from beyond about 50 m was primarily due to gamma rays that scattered in the air before entering the detectors rather than passing directly from the ground to the detectors. These skyshine gamma rays contribute tens of percent to the total gamma-ray spectrum, primarily at energies below a few hundred keV. The techniques developed to efficiently calculate the contributions from a large soil disk and a large air volume in a Monte Carlo simulation are described, and the implications of skyshine for portal monitoring applications are discussed.
Extracting information in spike time patterns with wavelets and information theory.
Lopes-dos-Santos, Vítor; Panzeri, Stefano; Kayser, Christoph; Diamond, Mathew E; Quian Quiroga, Rodrigo
2015-02-01
We present a new method to assess the information carried by temporal patterns in spike trains. The method first performs a wavelet decomposition of the spike trains, then uses Shannon information to select a subset of coefficients carrying information, and finally assesses timing information in terms of decoding performance: the ability to identify the presented stimuli from spike train patterns. We show that the method allows: 1) a robust assessment of the information carried by spike time patterns even when this is distributed across multiple time scales and time points; 2) an effective denoising of the raster plots that improves the estimate of stimulus tuning of spike trains; and 3) an assessment of the information carried by temporally coordinated spikes across neurons. Using simulated data, we demonstrate that the Wavelet-Information (WI) method performs better and is more robust to spike time-jitter, background noise, and sample size than well-established approaches, such as principal component analysis, direct estimates of information from digitized spike trains, or a metric-based method. Furthermore, when applied to real spike trains from monkey auditory cortex and from rat barrel cortex, the WI method allows extracting larger amounts of spike timing information. Importantly, the fact that the WI method incorporates multiple time scales makes it robust to the choice of partly arbitrary parameters such as temporal resolution, response window length, number of response features considered, and the number of available trials. These results highlight the potential of the proposed method for accurate and objective assessments of how spike timing encodes information.
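The first stage of the method described here, a wavelet decomposition of binned spike trains, can be sketched with a plain Haar transform. This is only the decomposition step; the paper's Shannon-information coefficient selection and the decoder are omitted, and the power-of-two trace length is an assumption of this sketch:

```python
import numpy as np

def haar_level(x):
    """One level of the Haar wavelet transform: pairwise approximation
    and detail coefficients of a binned spike train (even length)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_full(x):
    """Full Haar decomposition of a power-of-two-length trace; returns
    all detail coefficients plus the final approximation as one feature
    vector per trial, spanning coarse to fine time scales."""
    coeffs = []
    a = np.asarray(x, dtype=float)
    while len(a) > 1:
        a, d = haar_level(a)
        coeffs.append(d)
    coeffs.append(a)
    return np.concatenate(coeffs)
```

Because the transform is orthonormal it preserves the energy of the trace, so coefficients at every scale can be compared on a common footing when selecting the informative subset.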
Gibson, Juliet F; Huang, Jing; Liu, Kristina J; Carlson, Kacie R; Foss, Francine; Choi, Jaehyuk; Edelson, Richard; Hussong, Jerry W.; Mohl, Ramsey; Hill, Sally; Girardi, Sally
2016-01-01
Background Accurate quantification of malignant cells in the peripheral blood of patients with cutaneous T cell lymphoma (CTCL) is important for early detection, prognosis, and monitoring disease burden. Objective Determine the spectrum of current clinical practices; critically evaluate elements of current ISCL B1 and B2 staging criteria; and assess the potential role of TCR-Vβ analysis by flow cytometry. Methods We assessed current clinical practices by survey, and performed a retrospective analysis of 161 patients evaluated at Yale (2011-2014) to compare the sensitivity, specificity, PPV, and NPV of parameters for ISCL B2 staging. Results There was heterogeneity in clinical practices among institutions. ISCL B1 criteria did not capture five Yale cohort patients with immunophenotypic abnormalities who later progressed. TCR-Vβ testing was more specific than PCR and aided diagnosis in detecting clonality, but was of limited benefit in quantification of tumor burden. Limitations Because of limited follow-up involving a single center, further investigation will be necessary to conclude whether our proposed diagnostic algorithm is of general clinical benefit. Conclusion We propose further study of “modified B1 criteria”: CD4/CD8 ratio ≥5, %CD4+/CD26- ≥ 20%, %CD4+/CD7- ≥ 20%, with evidence of clonality. TCR-Vβ testing should be considered in future diagnostic and staging algorithms. PMID:26874819
Martinez, Stella M.; Foucher, Juliette; Combis, Jean-Marc; Métivier, Sophie; Brunetto, Maurizia; Capron, Dominique; Bourlière, Marc; Bronowicki, Jean-Pierre; Dao, Thong; Maynard-Muet, Marianne; Lucidarme, Damien; Merrouche, Wassil; Forns, Xavier; de Lédinghen, Victor
2012-01-01
Background/Aims Liver stiffness (LS) measurement by means of transient elastography (TE) is accurate to predict fibrosis stage. The effect of antiviral treatment and virologic response on LS was assessed and compared with untreated patients with chronic hepatitis C (CHC). Methods TE was performed at baseline, and at weeks 24, 48, and 72 in 515 patients with CHC. Results 323 treated (62.7%) and 192 untreated patients (37.3%) were assessed. LS experienced a significant decline in treated patients and remained stable in untreated patients at the end of study (P<0.0001). The decline was significant for patients with baseline LS ≥ 7.1 kPa (P<0.0001 and P=0.03, for LS ≥9.5 and ≥7.1 kPa vs lower values, respectively). Sustained virological responders and relapsers had a significant LS improvement whereas a trend was observed in nonresponders (mean percent change −16%, −10% and −2%, for SVR, RR and NR, respectively, P=0.03 for SVR vs NR). In multivariate analysis, high baseline LS (P<0.0001) and ALT levels, antiviral therapy and non-1 genotype were independent predictors of LS improvement. Conclusions LS decreases during and after antiviral treatment in patients with CHC. The decrease is significant in sustained responders and relapsers (particularly in those with high baseline LS) and suggests an improvement in liver damage. PMID:23082200
NASA Astrophysics Data System (ADS)
Xu, Ronald X.; Xu, Jeff S.; Huang, Jiwei; Tweedle, Michael F.; Schmidt, Carl; Povoski, Stephen P.; Martin, Edward W.
2010-02-01
Background: Accurate assessment of tumor boundaries and intraoperative detection of therapeutic margins are important oncologic principles for minimal recurrence rates and improved long-term outcomes. However, many existing cancer imaging tools are based on preoperative image acquisition and do not provide real-time intraoperative information that supports critical decision-making in the operating room. Method: Poly lactic-co-glycolic acid (PLGA) microbubbles (MBs) and nanobubbles (NBs) were synthesized by a modified double emulsion method. The MB and NB surfaces were conjugated with CC49 antibody to target TAG-72 antigen, a human glycoprotein complex expressed in many epithelial-derived cancers. Multiple imaging agents were encapsulated in MBs and NBs for multimodal imaging. Both one-step and multi-step cancer targeting strategies were explored. Active MBs/NBs were also fabricated for therapeutic margin assessment in cancer ablation therapies. Results: The multimodal contrast agents and the cancer-targeting strategies were tested on tissue simulating phantoms, LS174 colon cancer cell cultures, and cancer xenograft nude mice. Concurrent multimodal imaging was demonstrated using fluorescence and ultrasound imaging modalities. Technical feasibility of using active MBs and portable imaging tools such as ultrasound for intraoperative therapeutic margin assessment was demonstrated in a biological tissue model. Conclusion: The cancer-specific multimodal contrast agents described in this paper have the potential for intraoperative detection of tumor boundaries and therapeutic margins.
Assemblathon 2: evaluating de novo methods of genome assembly in three vertebrate species
2013-01-01
Background The process of generating raw genome sequence data continues to become cheaper, faster, and more accurate. However, assembly of such data into high-quality, finished genome sequences remains challenging. Many genome assembly tools are available, but they differ greatly in terms of their performance (speed, scalability, hardware requirements, acceptance of newer read technologies) and in their final output (composition of assembled sequence). More importantly, it remains largely unclear how to best assess the quality of assembled genome sequences. The Assemblathon competitions are intended to assess current state-of-the-art methods in genome assembly. Results In Assemblathon 2, we provided a variety of sequence data to be assembled for three vertebrate species (a bird, a fish, and a snake). This resulted in a total of 43 submitted assemblies from 21 participating teams. We evaluated these assemblies using a combination of optical map data, Fosmid sequences, and several statistical methods. From over 100 different metrics, we chose ten key measures by which to assess the overall quality of the assemblies. Conclusions Many current genome assemblers produced useful assemblies, containing a significant representation of their genes and overall genome structure. However, the high degree of variability between the entries suggests that there is still much room for improvement in the field of genome assembly and that approaches which work well in assembling the genome of one species may not necessarily work well for another. PMID:23870653
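Among contiguity metrics of the kind evaluated in such assessments, N50 is one of the most widely reported: the contig length at which half of the total assembly span is contained in contigs of that length or longer. A sketch with made-up contig lengths (this is one generic metric, not one of the ten key measures the entry refers to):

```python
def n50(contig_lengths):
    """N50: sort contigs longest first and accumulate lengths until
    half of the total span is reached; return that contig's length."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2.0
    running = 0
    for n in lengths:
        running += n
        if running >= half:
            return n

example = n50([80, 70, 50, 40, 30, 20])  # total 290, half 145 -> 70
```

N50 rewards contiguity only, which is one reason assessments like this one combine it with independent evidence such as optical maps and Fosmid sequences.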
Simulation-based Mastery Learning Improves Cardiac Auscultation Skills in Medical Students
McGaghie, William C.; Cohen, Elaine R.; Kaye, Marsha; Wayne, Diane B.
2010-01-01
Background Cardiac auscultation is a core clinical skill. However, prior studies show that trainee skills are often deficient and that clinical experience is not a proxy for competence. Objective To describe a mastery model of cardiac auscultation education and evaluate its effectiveness in improving bedside cardiac auscultation skills. Design Untreated control group design with pretest and posttest. Participants Third-year students who received a cardiac auscultation curriculum and fourth-year students who did not. Intervention A cardiac auscultation curriculum consisting of a computer tutorial and a cardiac patient simulator. All third-year students were required to meet or exceed a minimum passing score (MPS) set by an expert panel at posttest. Measurements Diagnostic accuracy with simulated heart sounds and actual patients. Results Trained third-year students (n = 77) demonstrated significantly higher cardiac auscultation accuracy compared to untrained fourth-year students (n = 31) in assessment of simulated heart sounds (93.8% vs. 73.9%, p < 0.001) and with real patients (81.8% vs. 75.1%, p = 0.003). USMLE scores correlated modestly with a computer-based multiple choice assessment using simulated heart sounds but not with bedside skills on real patients. Conclusions A cardiac auscultation curriculum consisting of deliberate practice with a computer-based tutorial and a cardiac patient simulator resulted in improved assessment of simulated heart sounds and more accurate examination of actual patients. PMID:20339952
2012-01-01
Background Hyperpolarised helium MRI (He3 MRI) is a new technique that enables imaging of the air distribution within the lungs, allowing accurate determination of the ventilation distribution in vivo. The technique has the disadvantages of requiring an expensive helium isotope, complex apparatus, and moving the patient to a compatible MRI scanner. Electrical impedance tomography (EIT) is a non-invasive bedside technique that allows constant monitoring of lung impedance, which depends on changes in air space capacity in the lung. We used He3 MRI measurements of ventilation distribution as the gold standard for assessment of EIT. Methods Seven rats were ventilated in supine, prone, left and right lateral positions with 70% helium/30% oxygen for EIT measurements and pure helium for He3 MRI. The same ventilator and settings were used for both measurements. Image dimensions, geometric centre and global inhomogeneity index were calculated. Results EIT images were smaller, of lower resolution, and contained less anatomical detail than those from He3 MRI. However, both methods could measure position-induced changes in lung ventilation, as assessed by the geometric centre. The global inhomogeneity indices were comparable between the techniques. Conclusion EIT is a suitable technique for monitoring ventilation distribution and inhomogeneity, as assessed by comparison with He3 MRI. PMID:22966835
Security Applications Of Computer Motion Detection
NASA Astrophysics Data System (ADS)
Bernat, Andrew P.; Nelan, Joseph; Riter, Stephen; Frankel, Harry
1987-05-01
An important area of application of computer vision is the detection of human motion in security systems. This paper describes the development of a computer vision system which can detect and track human movement across the international border between the United States and Mexico. Because of the wide range of environmental conditions, this application represents a stringent test of computer vision algorithms for motion detection and object identification. The desired output of this vision system is accurate, real-time locations for individual aliens and accurate statistical data as to the frequency of illegal border crossings. Because most detection and tracking routines assume rigid body motion, which is not characteristic of humans, new algorithms capable of reliable operation in our application are required. Furthermore, most current detection and tracking algorithms assume a uniform background against which motion is viewed - the urban environment along the US-Mexican border is anything but uniform. The system works in three stages: motion detection, object tracking and object identification. We have implemented motion detection using simple frame differencing, maximum likelihood estimation, mean and median tests and are evaluating them for accuracy and computational efficiency. Due to the complex nature of the urban environment (background and foreground objects consisting of buildings, vegetation, vehicles, wind-blown debris, animals, etc.), motion detection alone is not sufficiently accurate. Object tracking and identification are handled by an expert system which takes shape, location and trajectory information as input and determines if the moving object is indeed representative of an illegal border crossing.
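The simple frame-differencing stage mentioned in this entry can be sketched as a thresholded absolute difference against a reference frame; the threshold and the toy frames below are illustrative, and a real system would layer the likelihood tests, tracking, and expert-system reasoning described above on top of this:

```python
import numpy as np

def motion_mask(frame, background, threshold=25):
    """Frame differencing: flag pixels whose absolute grey-level
    difference from the reference frame exceeds a threshold.
    Widened to int16 so uint8 subtraction cannot wrap around."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

bg = np.zeros((4, 4), dtype=np.uint8)   # toy empty reference frame
fr = bg.copy()
fr[1:3, 1:3] = 200                      # a bright 2x2 "moving object"
mask = motion_mask(fr, bg)
```

As the entry notes, in a non-uniform urban scene such a mask fires on wind-blown debris and vegetation as readily as on people, which is why the differencing output is only the first of the three stages.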
NASA Astrophysics Data System (ADS)
Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong
2017-08-01
Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. brain, such as the air-tissue interface. In the vicinity of the air-tissue boundary, e.g. skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing information on the local field, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated from external sources can be effectively removed to get a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions are preserved in the brain mask. Shadow artifacts due to strong susceptibility variations in the derived QSM maps could also be largely eliminated using the R-SHARP method, leading to more accurate QSM reconstruction.
Galievsky, Victor A; Stasheuski, Alexander S; Krylov, Sergey N
2017-10-17
The limit-of-detection (LOD) in analytical instruments with fluorescence detection can be improved by reducing the noise of the optical background. Efficiently reducing optical background noise in systems with a spectrally nonuniform background requires complex optimization of the emission filter, the main element of spectral filtration. Here, we introduce a filter-optimization method which utilizes an expression for the signal-to-noise ratio (SNR) as a function of (i) all noise components (dark, shot, and flicker), (ii) the emission spectrum of the analyte, (iii) the emission spectrum of the optical background, and (iv) the transmittance spectrum of the emission filter. In essence, the noise components and the emission spectra are determined experimentally and substituted into the expression. This leaves a single variable, the transmittance spectrum of the filter, which is optimized numerically by maximizing the SNR. Maximizing the SNR provides an accurate way of filter optimization, while a previously used approach based on maximizing the signal-to-background ratio (SBR) is an approximation that can lead to much poorer LOD, specifically in the detection of fluorescently labeled biomolecules. The proposed filter-optimization method will be an indispensable tool for developing new and improving existing fluorescence-detection systems aiming at ultimately low LOD.
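The idea of treating the transmittance spectrum as the single free variable and maximizing SNR can be sketched numerically. The sketch below uses a binary per-bin filter, a simplified dark/shot/flicker noise model, and a greedy search; the noise coefficients and spectra are invented, and this is not the authors' exact expression or algorithm:

```python
import numpy as np

def snr(mask, s, b, dark=5.0, flicker=0.01):
    """SNR for a binary transmittance mask over spectral bins: signal S,
    shot noise ~ sqrt(S + B), plus dark and flicker (proportional to the
    collected background B) terms. Coefficients are illustrative."""
    S = float(np.sum(s[mask]))
    B = float(np.sum(b[mask]))
    noise = np.sqrt(dark ** 2 + (S + B) + (flicker * B) ** 2)
    return S / noise

def greedy_filter(s, b):
    """Greedily open the spectral bin that most improves SNR; stop when
    no single additional bin helps. A sketch of numerical optimisation
    of the transmittance spectrum."""
    mask = np.zeros(len(s), dtype=bool)
    best = 0.0
    improved = True
    while improved:
        improved = False
        for i in np.where(~mask)[0]:
            trial = mask.copy()
            trial[i] = True
            val = snr(trial, s, b)
            if val > best:
                best, mask, improved = val, trial, True
    return mask, best

s = np.array([10.0, 0.0, 0.0, 10.0])   # analyte emission per bin (toy)
b = np.array([1.0, 50.0, 50.0, 1.0])   # background emission per bin (toy)
mask, best = greedy_filter(s, b)       # keeps only the low-background bins
```

With these toy spectra the search passes transmitting bins where the analyte dominates and rejects the high-background bins, mirroring why SNR maximization can outperform simple SBR maximization when background noise is not uniform.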
Visual assessment of CPR quality during pediatric cardiac arrest: does point of view matter?
Jones, Angela; Lin, Yiqun; Nettel-Aguirre, Alberto; Gilfoyle, Elaine; Cheng, Adam
2015-05-01
In many clinical settings, providers rely on visual assessment when delivering feedback on CPR quality. Little is known about the accuracy of visual assessment of CPR quality. We aimed to determine how accurate pediatric providers are in their visual assessment of CPR quality and to identify the optimal position relative to the patient for accurate CPR assessment. We videotaped high-quality CPR (based on 2010 American Heart Association guidelines) and 3 variations of poor quality CPR in a simulated resuscitation, filmed from the foot, head and the side of the manikin. Participants watched 12 videos and completed a questionnaire to assess CPR quality. One hundred and twenty-five participants were recruited. The overall accuracy of visual assessment of CPR quality was 65.6%. Accuracy was better from the side (70.8%) and foot (68.8%) of the bed when compared to the head of the bed (57.2%; p<0.001). The side was the best position for assessing depth (p<0.001). Rate assessment was equivalent between positions (p=0.58). The side and foot of the bed were superior to the head when assessing chest recoil (p<0.001). Factors associated with increased accuracy in visual assessment of CPR quality included recent CPR course completion (p=0.034) and involvement in more cardiac arrests as a team member (p=0.003). Healthcare providers struggle to accurately assess the quality of CPR using visual assessment. If visual assessment is being used, providers should stand at the side of the bed.
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.
2016-10-01
The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.
ERIC Educational Resources Information Center
Reschly, Daniel J.; And Others
Findings from the Iowa Assessment Project are examined regarding the assessment and use of information on adaptive behavior and sociocultural background in decisions about students with mild mental retardation. Background aspects reviewed include terminology regarding mild retardation; research, litigation, and legislation on the topic during the…
Ferlaino, Michael; Rogers, Mark F.; Shihab, Hashem A.; Mort, Matthew; Cooper, David N.; Gaunt, Tom R.; Campbell, Colin
2018-01-01
Background Small insertions and deletions (indels) have a significant influence in human disease and, in terms of frequency, they are second only to single nucleotide variants as pathogenic mutations. As the majority of mutations associated with complex traits are located outside the exome, it is crucial to investigate the potential pathogenic impact of indels in non-coding regions of the human genome. Results We present FATHMM-indel, an integrative approach to predict the functional effect, pathogenic or neutral, of indels in non-coding regions of the human genome. Our method exploits various genomic annotations in addition to sequence data. When validated on benchmark data, FATHMM-indel significantly outperforms CADD and GAVIN, state of the art models in assessing the pathogenic impact of non-coding variants. FATHMM-indel is available via a web server at indels.biocompute.org.uk. Conclusions FATHMM-indel can accurately predict the functional impact and prioritise small indels throughout the whole non-coding genome. PMID:28985712
Cortical membrane potential signature of optimal states for sensory signal detection
McGinley, Matthew J.; David, Stephen V.; McCormick, David A.
2015-01-01
The neural correlates of optimal states for signal detection task performance are largely unknown. One hypothesis holds that optimal states exhibit tonically depolarized cortical neurons with enhanced spiking activity, such as occur during movement. We recorded membrane potentials of auditory cortical neurons in mice trained on a challenging tone-in-noise detection task while assessing arousal with simultaneous pupillometry and hippocampal recordings. Arousal measures accurately predicted multiple modes of membrane potential activity, including: rhythmic slow oscillations at low arousal, stable hyperpolarization at intermediate arousal, and depolarization during phasic or tonic periods of hyper-arousal. Walking always occurred during hyper-arousal. Optimal signal detection behavior and sound-evoked responses, at both sub-threshold and spiking levels, occurred at intermediate arousal when pre-decision membrane potentials were stably hyperpolarized. These results reveal a cortical physiological signature of the classically-observed inverted-U relationship between task performance and arousal, and that optimal detection exhibits enhanced sensory-evoked responses and reduced background synaptic activity. PMID:26074005
Use of a vision model to quantify the significance of factors effecting target conspicuity
NASA Astrophysics Data System (ADS)
Gilmore, M. A.; Jones, C. K.; Haynes, A. W.; Tolhurst, D. J.; To, M.; Troscianko, T.; Lovell, P. G.; Parraga, C. A.; Pickavance, K.
2006-05-01
When designing camouflage it is important to understand how the human visual system processes the information to discriminate the target from the background scene. A vision model has been developed to compare two images and detect differences in local contrast in each spatial frequency channel. Observer experiments are being undertaken to validate this vision model so that the model can be used to quantify the relative significance of different factors affecting target conspicuity. Synthetic imagery can be used to design improved camouflage systems. The vision model is being used to compare different synthetic images to understand what features in the image are important to reproduce accurately and to identify the optimum way to render synthetic imagery for camouflage effectiveness assessment. This paper will describe the vision model and summarise the results obtained from the initial validation tests. The paper will also show how the model is being used to compare different synthetic images and discuss future work plans.
NASA Astrophysics Data System (ADS)
Mooney, P.; Morgan, L.
2015-08-01
In recent years there has been increased interest from researchers in investigating and understanding the characteristics and backgrounds of citizens who contribute to Volunteered Geographic Information (VGI) and Citizen Science (CS) projects. Much of the reluctance of stakeholders such as National Mapping Agencies, Environmental Ministries, etc. to use data and information generated and collected by VGI and CS projects stems from the lack of knowledge and understanding about who these contributors are. As they are drawn from the crowd, there is a sense of the unknown about these citizens. Consequently, there are justifiable concerns about these citizens' ability to collect, generate and manage high-quality and accurate spatial, scientific and environmental data and information. This paper provides a meta-review of some of the key literature in the domain of VGI and CS to assess whether these concerns are well founded and what efforts are ongoing to improve our understanding of the crowd.
Synthesis of urban greenhouse gas emission estimates from the Indianapolis Flux Experiment (INFLUX)
NASA Astrophysics Data System (ADS)
Turnbull, J. C.; Davis, K. J.; Deng, A.; Lauvaux, T.; Miles, N. L.; Richardson, S.; Sarmiento, D. P.; Wu, K.; Brewer, A.; Hardesty, R. M.; McKain, K.; Sweeney, C.; Gurney, K. R.; Liang, J.; O'Keeffe, D.; Patarasuk, R.; Cambaliza, M. O. L.; Harvey, R. M.; Heimburger, A. M. F.; Shepson, P. B.; Karion, A.; Lopez-Coto, I.; Prasad, K.; Whetstone, J. R.
2016-12-01
The Indianapolis Flux Experiment (INFLUX) is testing the boundaries of our ability to use atmospheric measurements to quantify urban greenhouse gas (GHG) emissions. The project brings together high-resolution (in both space and time) inventory assessments, a multi-year record of in situ CO2, CH4 and CO from tower-based and aircraft-based atmospheric measurements along with a complementary suite of 35 trace gases and isotopes from flasks collected at the same sites, and atmospheric modelling. Together, these provide high-accuracy, high-resolution, continuous monitoring of emissions of GHGs from the city. Here we synthesize the results to date, and demonstrate broad agreement amongst city-wide emission rates determined from the various top-down and bottom-up methods. We highlight the areas where ongoing efforts are reducing uncertainties in the overall flux estimation, including accurate representation of atmospheric transport, partitioning of GHG source types and the influence of background atmospheric GHG mole fractions.
Validating Machine Learning Algorithms for Twitter Data Against Established Measures of Suicidality
2016-01-01
Background One of the leading causes of death in the United States (US) is suicide, and new methods of assessment are needed to track its risk in real time. Objective Our objective is to validate the use of machine learning algorithms for Twitter data against empirically validated measures of suicidality in the US population. Methods Using a machine learning algorithm, the Twitter feeds of 135 Mechanical Turk (MTurk) participants were compared with validated, self-report measures of suicide risk. Results Our findings show that people who are at high suicidal risk can be easily differentiated from those who are not by machine learning algorithms, which correctly classify clinically significant suicide risk in 92% of cases (sensitivity: 53%, specificity: 97%, positive predictive value: 75%, negative predictive value: 93%). Conclusions Machine learning algorithms are efficient in differentiating people who are at a suicidal risk from those who are not. Evidence for suicidality can be measured in nonclinical populations using social media data. PMID:27185366
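The rates reported above follow directly from a 2×2 confusion matrix. A minimal sketch, using hypothetical counts chosen only to illustrate how sensitivity, specificity, PPV, and NPV are derived (they are not the study's raw data):

```python
# Sketch: deriving sensitivity, specificity, PPV, and NPV from a
# confusion matrix. The counts below are hypothetical illustrations,
# not the study's actual data.

def classification_metrics(tp, fn, fp, tn):
    """Return (sensitivity, specificity, PPV, NPV) as fractions."""
    sensitivity = tp / (tp + fn)   # true positives among actual positives
    specificity = tn / (tn + fp)   # true negatives among actual negatives
    ppv = tp / (tp + fp)           # precision: correct positive calls
    npv = tn / (tn + fn)           # correct negative calls
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for illustration only
sens, spec, ppv, npv = classification_metrics(tp=9, fn=8, fp=3, tn=115)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```

With these assumed counts the overall accuracy works out to (9 + 115) / 135 ≈ 92%, the same order as the figure quoted in the abstract.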
Arthroscopic approach and anatomy of the hip
Aprato, Alessandro; Giachino, Matteo; Masse, Alessandro
2016-01-01
Summary Background Hip arthroscopy has gained popularity among the orthopedic community, and a precise assessment of indications, techniques and results is continually being refined. Methods In this chapter the principal standard entry portals for the central and peripheral compartments are discussed. The description starts from the superficial landmarks for portal placement and continues with the deep layers. For each entry point an illustration of the main structures encountered is provided, and the principal structures at risk for the different portals are accurately examined. The articular anatomical description is carried out from the arthroscope's point of view and sub-divided into central and peripheral compartments. The two compartments are systematically analyzed and the accessible articular areas for each portal explained. Moreover, some anatomical variations that can be found in the normal hip are reported. Conclusion The anatomical knowledge of the hip joint, along with a precise notion of the structures encountered with the arthroscope, is an essential requirement for a secure and successful surgery. Level of evidence: V. PMID:28066735
Persky, Susan; Kaphingst, Kimberly A.; Allen, Vincent C.; Senay, Ibrahim
2013-01-01
Background Communication of lung cancer risk information between providers and African-American patients occurs in a context marked by race-based health disparities. Purpose A controlled experiment assessed whether perceived physician race influenced African-American patients’ (n=127) risk perception accuracy following the provision of objective lung cancer risk information. Methods Participants interacted with a virtual reality-based, simulated physician who provided personalized cancer risk information. Results Participants who interacted with a racially discordant virtual doctor were less accurate in their risk perceptions at post-test than those who interacted with a concordant virtual doctor, F(1,94)=4.02, p=.048. This effect was amplified among current smokers. Effects were not mediated by trust in the provider, engagement with the health care system, or attention during the encounter. Conclusions The current study demonstrates that African-American patients’ perceptions of a doctor’s race are sufficient to independently impact their processing of lung cancer risk information. PMID:23389688
Veyrat, Colette; Larrazet, Fabrice; Pellerin, Denis
2005-10-01
There is renewed interest in isovolumic contraction (IC) in tissue Doppler echocardiography of the myocardial walls, which is revisited in this editorial with new regional velocity data. The aims are to recall traditional background information and to emphasize the need to master the rapidly evolving tissue Doppler procedures for the accurate display of brief IC. IC, a preejectional component of great physiologic interest, is very demanding in terms of ultrasound technology. The onset and end of its motion velocities should be unambiguously defined versus the QRS complex and ejection wall motion. This is a prerequisite for exploiting the new information as guidance toward new therapeutic strategies from a practical viewpoint. However, IC preload dependence should be kept in mind, because of its limited potential for contractility studies. Finally, when only duration measurements are made in the assessment of ventricular dyssynchrony, regional preejectional duration is the pertinent tool to single out the onset of ejection local wall motion.
Kiel, Elizabeth J.; Buss, Kristin A.
2011-01-01
Early social withdrawal and protective parenting predict a host of negative outcomes, warranting examination of their development. Mothers’ accurate anticipation of their toddlers’ fearfulness may facilitate transactional relations between toddler fearful temperament and protective parenting, leading to these outcomes. Currently, we followed 93 toddlers (42 female; on average 24.76 months) and their mothers (9% underrepresented racial/ethnic backgrounds) over 3 years. We gathered laboratory observation of fearful temperament, maternal protective behavior, and maternal accuracy during toddlerhood and a multi-method assessment of children’s social withdrawal and mothers’ self-reported protective behavior at kindergarten entry. When mothers displayed higher accuracy, toddler fearful temperament significantly related to concurrent maternal protective behavior and indirectly predicted kindergarten social withdrawal and maternal protective behavior. These results highlight the important role of maternal accuracy in linking fearful temperament and protective parenting, which predict further social withdrawal and protection, and point to toddlerhood for efforts of prevention of anxiety-spectrum outcomes. PMID:21537895
Effects of a Leisure Programme on Quality of Life and Stress of Individuals with ASD
ERIC Educational Resources Information Center
Garcia-Villamisar, D. A.; Dattilo, J.
2010-01-01
Background: Even though there is research demonstrating a positive relationship between leisure participation and the two constructs of quality of life and stress reduction, current conceptualisation of leisure as a contributor to quality of life is limited. In addition, in spite of improvements in accurate diagnosis of autism spectrum disorder…
ERIC Educational Resources Information Center
Gill, Michele Gregoire; Hoffman, Bobby
2009-01-01
Background/Context: Although teachers' core instructional beliefs are difficult to accurately measure, they provide a framework for understanding the thinking that underlies important curricular and pedagogical decisions made in the classroom. Previous research has primarily used self-report to study teacher beliefs, but self-report is better for…
ERIC Educational Resources Information Center
Heo, K. H.; Squires, J.; Yovanoff, P.
2008-01-01
Background: Accurate and efficient developmental screening measures are critical for early identification of developmental problems; however, few reliable and valid tests are available in Korea as well as other countries outside the USA. The Ages and Stages Questionnaires (ASQ) was chosen for study with young children in Korea. Methods: The ASQ…
Accurate Interpretation of the 12-Lead ECG Electrode Placement: A Systematic Review
ERIC Educational Resources Information Center
Khunti, Kirti
2014-01-01
Background: Coronary heart disease (CHD) patients require monitoring through ECGs; the 12-lead electrocardiogram (ECG) is considered to be the non-invasive gold standard. Examples of incorrect treatment because of inaccurate or poor ECG monitoring techniques have been reported in the literature. The findings that only 50% of nurses and less than…
ERIC Educational Resources Information Center
Anderson, Karen L.; Goldstein, Howard
2004-01-01
Children typically learn in classroom environments that have background noise and reverberation that interfere with accurate speech perception. Amplification technology can enhance the speech perception of students who are hard of hearing. Purpose: This study used a single-subject alternating treatments design to compare the speech recognition…
ERIC Educational Resources Information Center
Gygi, Brian; Shafiro, Valeriy
2013-01-01
Purpose: Previously, Gygi and Shafiro (2011) found that when environmental sounds are semantically incongruent with the background scene (e.g., horse galloping in a restaurant), they can be identified more accurately by young normal-hearing listeners (YNH) than sounds congruent with the scene (e.g., horse galloping at a racetrack). This study…
NASA Astrophysics Data System (ADS)
Fukami, Christine S.; Sullivan, Amy P.; Ryan Fulgham, S.; Murschell, Trey; Borch, Thomas; Smith, James N.; Farmer, Delphine K.
2016-07-01
Particle-into-Liquid Samplers (PILS) have become a standard aerosol collection technique, and are widely used in both ground and aircraft measurements in conjunction with off-line ion chromatography (IC) measurements. Accurate and precise background samples are essential to account for gas-phase components not efficiently removed and any interference in the instrument lines, collection vials or off-line analysis procedures. For aircraft sampling with PILS, backgrounds are typically taken with in-line filters to remove particles prior to sample collection once or twice per flight, with more numerous backgrounds taken on the ground. Here, we use data collected during the Front Range Air Pollution and Photochemistry Éxperiment (FRAPPÉ) to demonstrate that not only are multiple background filter samples essential to attain a representative background, but also that the chemical background signals do not follow the Gaussian statistics typically assumed. Instead, the background signals for all chemical components analyzed from 137 background samples (taken from ∼78 total sampling hours over 18 flights) follow a log-normal distribution, meaning that the typical approaches of averaging background samples and/or assuming a Gaussian distribution cause an over-estimation of the background - and thus an underestimation of sample concentrations. Our approach of deriving backgrounds from the peak of the log-normal distribution results in detection limits of 0.25, 0.32, 3.9, 0.17, 0.75 and 0.57 μg m-3 for sub-micron aerosol nitrate (NO3-), nitrite (NO2-), ammonium (NH4+), sulfate (SO42-), potassium (K+) and calcium (Ca2+), respectively. The difference in backgrounds calculated from assuming a Gaussian distribution versus a log-normal distribution was most extreme for NH4+, resulting in a background that was 1.58× that determined from fitting a log-normal distribution.
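The over-estimation described above can be demonstrated in a few lines: for a log-normal distribution the arithmetic mean always sits above the distribution's peak (mode), so a naive average over-states the background relative to the peak-derived value. A sketch with arbitrary, assumed parameters:

```python
# Sketch: for log-normally distributed background signals, the naive
# average (implicitly assuming Gaussian statistics) sits above the
# distribution's peak, over-stating the background. The parameters
# here are arbitrary illustrative assumptions, not FRAPPÉ values.
import math
import random

random.seed(0)
mu, sigma = 0.0, 0.8  # parameters of ln(signal); assumed for illustration
samples = [random.lognormvariate(mu, sigma) for _ in range(137)]

naive_background = sum(samples) / len(samples)  # mean, as if Gaussian
peak_background = math.exp(mu - sigma ** 2)     # mode of the log-normal pdf

print(f"naive mean {naive_background:.2f} vs log-normal peak {peak_background:.2f}")
```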
Duhé, Abby F.; Gilmore, L. Anne; Burton, Jeffrey H.; Martin, Corby K.; Redman, Leanne M.
2016-01-01
Background Infant formula is a major source of nutrition for infants, with over half of all infants in the United States consuming infant formula exclusively or in combination with breast milk. The energy in infant powdered formula is derived from the powder and not the water, making it necessary to develop methods that can accurately estimate the amount of powder used prior to reconstitution. Objective To assess the use of the Remote Food Photography Method (RFPM) to accurately estimate the weight of infant powdered formula before reconstitution among the standard serving sizes. Methods For each serving size (1-scoop, 2-scoop, 3-scoop, and 4-scoop), a set of seven test bottles and photographs were prepared including the recommended gram weight of powdered formula of the respective serving size by the manufacturer, three bottles and photographs containing 15%, 10%, and 5% less powdered formula than recommended, and three bottles and photographs containing 5%, 10%, and 15% more powdered formula than recommended (n=28). Ratio estimates of the test photographs as compared to standard photographs were obtained using standard RFPM analysis procedures. The ratio estimates and the United States Department of Agriculture (USDA) data tables were used to generate food and nutrient information to provide the RFPM estimates. Statistical Analyses Performed Equivalence testing using the two one-sided t-test (TOST) approach was used to determine equivalence between the actual gram weights and the RFPM estimated weights for all samples, within each serving size, and within under-prepared and over-prepared bottles. Results For all bottles, the gram weights estimated by the RFPM were within 5% equivalence bounds with a slight under-estimation of 0.05 g (90% CI [−0.49, 0.40]; p<0.001) and mean percent error ranging between 0.32% and 1.58% among the four serving sizes.
Conclusion The maximum observed mean error was an overestimation of 1.58% of powdered formula by the RFPM under controlled laboratory conditions indicating that the RFPM accurately estimated infant powdered formula. PMID:26947889
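The TOST logic described above can be sketched as a confidence-interval check: equivalence is declared when the 90% CI of the mean difference lies entirely inside the equivalence bounds. The data, bounds, and large-sample z value below are illustrative assumptions, not the study's:

```python
# Sketch of TOST equivalence via the 90% CI (two 5% one-sided tests).
# Uses a large-sample normal approximation (z = 1.645) rather than the
# t distribution; the difference data and bounds are hypothetical.
import statistics

def tost_equivalent(diffs, lower, upper, z=1.645):
    """True if the 90% CI of the mean difference lies inside [lower, upper]."""
    n = len(diffs)
    mean = statistics.fmean(diffs)
    sem = statistics.stdev(diffs) / n ** 0.5
    ci_low, ci_high = mean - z * sem, mean + z * sem
    return lower < ci_low and ci_high < upper

# Hypothetical estimated-minus-actual weight differences (grams)
diffs = [-0.3, 0.1, -0.2, 0.0, -0.1, 0.2, -0.05]
print(tost_equivalent(diffs, lower=-0.45, upper=0.45))
```

Tightening the bounds shrinks the equivalence region, so the same data can fail the test; in the study, the 5% equivalence bounds scale with each serving size's recommended gram weight.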
A rater training protocol to assess team performance.
Eppich, Walter; Nannicelli, Anna P; Seivert, Nicholas P; Sohn, Min-Woong; Rozenfeld, Ranna; Woods, Donna M; Holl, Jane L
2015-01-01
Simulation-based methodologies are increasingly used to assess teamwork and communication skills and provide team training. Formative feedback regarding team performance is an essential component. While effective use of simulation for assessment or training requires accurate rating of team performance, examples of rater-training programs in health care are scarce. We describe our rater training program and report interrater reliability during phases of training and independent rating. We selected an assessment tool shown to yield valid and reliable results and developed a rater training protocol with an accompanying rater training handbook. The rater training program was modeled after previously described high-stakes assessments in the setting of 3 facilitated training sessions. Adjacent agreement was used to measure interrater reliability between raters. Nine raters with a background in health care and/or patient safety evaluated team performance of 42 in-situ simulations using post-hoc video review. Adjacent agreement increased from the second training session (83.6%) to the third training session (85.6%) when evaluating the same video segments. Adjacent agreement for the rating of overall team performance was 78.3%, which was added for the third training session. Adjacent agreement was 97% 4 weeks posttraining and 90.6% at the end of independent rating of all simulation videos. Rater training is an important element in team performance assessment, and providing examples of rater training programs is essential. Articulating key rating anchors promotes adequate interrater reliability. In addition, using adjacent agreement as a measure allows differentiation between high- and low-performing teams on video review. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.
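Adjacent agreement, the reliability measure used above, simply counts rating pairs that fall within one scale point of each other. A sketch assuming a hypothetical 1-5 scale and made-up ratings (not the study's data):

```python
# Sketch (assumed rating scale): "adjacent agreement" counts two raters
# as agreeing when their ordinal ratings differ by at most one point.
def adjacent_agreement(rater_a, rater_b, tolerance=1):
    """Percent of paired ratings within `tolerance` points of each other."""
    hits = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return 100.0 * hits / len(rater_a)

# Hypothetical 1-5 ratings for ten simulation video segments
a = [3, 4, 2, 5, 3, 4, 1, 2, 4, 3]
b = [3, 5, 2, 3, 3, 4, 2, 2, 5, 1]
print(adjacent_agreement(a, b))  # 8 of 10 pairs within one point: 80.0
```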
EGASP: the human ENCODE Genome Annotation Assessment Project
Guigó, Roderic; Flicek, Paul; Abril, Josep F; Reymond, Alexandre; Lagarde, Julien; Denoeud, France; Antonarakis, Stylianos; Ashburner, Michael; Bajic, Vladimir B; Birney, Ewan; Castelo, Robert; Eyras, Eduardo; Ucla, Catherine; Gingeras, Thomas R; Harrow, Jennifer; Hubbard, Tim; Lewis, Suzanna E; Reese, Martin G
2006-01-01
Background We present the results of EGASP, a community experiment to assess the state-of-the-art in genome annotation within the ENCODE regions, which span 1% of the human genome sequence. The experiment had two major goals: the assessment of the accuracy of computational methods to predict protein coding genes; and the overall assessment of the completeness of the current human genome annotations as represented in the ENCODE regions. For the computational prediction assessment, eighteen groups contributed gene predictions. We evaluated these submissions against each other based on a 'reference set' of annotations generated as part of the GENCODE project. These annotations were not available to the prediction groups prior to the submission deadline, so that their predictions were blind and an external advisory committee could perform a fair assessment. Results The best methods had at least one gene transcript correctly predicted for close to 70% of the annotated genes. Nevertheless, the multiple transcript accuracy, taking into account alternative splicing, reached only approximately 40% to 50% accuracy. At the coding nucleotide level, the best programs reached an accuracy of 90% in both sensitivity and specificity. Programs relying on mRNA and protein sequences were the most accurate in reproducing the manually curated annotations. Experimental validation shows that only a very small percentage (3.2%) of the selected 221 computationally predicted exons outside of the existing annotation could be verified. Conclusion This is the first such experiment in human DNA, and we have followed the standards established in a similar experiment, GASP1, in Drosophila melanogaster. We believe the results presented here contribute to the value of ongoing large-scale annotation projects and should guide further experimental methods when being scaled up to the entire human genome sequence. PMID:16925836
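Nucleotide-level sensitivity and specificity of the kind reported above can be computed by comparing per-base coding labels between a prediction and the reference annotation; note that in the gene-prediction literature "specificity" conventionally denotes precision. A sketch over toy intervals (assumed for illustration, not GENCODE data):

```python
# Sketch: coding-nucleotide sensitivity and specificity for gene
# prediction assessment, computed over per-base coding labels.
# The exon intervals are toy assumptions, not real annotations.
def coding_positions(exons):
    """Expand [(start, end), ...] half-open exon intervals to a set of bases."""
    return {i for start, end in exons for i in range(start, end)}

annotated = coding_positions([(100, 200), (300, 350)])   # reference annotation
predicted = coding_positions([(100, 190), (305, 360)])   # a program's prediction

tp = len(annotated & predicted)
sensitivity = tp / len(annotated)   # fraction of annotated coding bases recovered
specificity = tp / len(predicted)   # fraction of predicted bases that are annotated
print(round(sensitivity, 3), round(specificity, 3))
```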
Road Extraction from AVIRIS Using Spectral Mixture and Q-Tree Filter Techniques
NASA Technical Reports Server (NTRS)
Gardner, Margaret E.; Roberts, Dar A.; Funk, Chris; Noronha, Val
2001-01-01
Accurate road location and condition information are of primary importance in road infrastructure management. Additionally, spatially accurate and up-to-date road networks are essential in ambulance and rescue dispatch in emergency situations. However, accurate road infrastructure databases do not exist for vast areas, particularly in areas with rapid expansion. Currently, the US Department of Transportation (USDOT) expends great effort in field Global Positioning System (GPS) mapping and condition assessment to meet these informational needs. This methodology, though effective, is both time-consuming and costly, because every road within a DOT's jurisdiction must be field-visited to obtain accurate information. Therefore, the USDOT is interested in identifying new technologies that could help meet road infrastructure informational needs more effectively. Remote sensing provides one means by which large areas may be mapped with a high standard of accuracy and is a technology with great potential in infrastructure mapping. The goal of our research is to develop accurate road extraction techniques using high spatial resolution, fine spectral resolution imagery. Additionally, our research will explore the use of hyperspectral data in assessing road quality. Finally, this research aims to define the spatial and spectral requirements for remote sensing data to be used successfully for road feature extraction and road quality mapping. Our findings will facilitate the USDOT in assessing remote sensing as a new resource in infrastructure studies.
Miller, Joseph D; Slipchenko, Mikhail N; Meyer, Terrence R
2011-07-04
Hybrid femtosecond/picosecond coherent anti-Stokes Raman scattering (fs/ps CARS) offers accurate thermometry at kHz rates for combustion diagnostics. In high-temperature flames, selection of probe-pulse characteristics is key to simultaneously optimizing signal-to-nonresonant-background ratio, signal strength, and spectral resolution. We demonstrate a simple method for enhancing signal-to-nonresonant-background ratio by using a narrowband Lorentzian filter to generate a time-asymmetric probe pulse with full-width-half-maximum (FWHM) pulse width of only 240 fs. This allows detection within just 310 fs after the Raman excitation for eliminating nonresonant background while retaining 45% of the resonant signal at 2000 K. The narrow linewidth is comparable to that of a time-symmetric sinc2 probe pulse with a pulse width of ~2.4 ps generated with a conventional 4-f pulse shaper. This allows nonresonant-background-free, frequency-domain vibrational spectroscopy at high temperature, as verified using comparisons to a time-dependent theoretical fs/ps CARS model.
Genetic background effects in quantitative genetics: gene-by-system interactions.
Sardi, Maria; Gasch, Audrey P
2018-04-11
Proper cell function depends on networks of proteins that interact physically and functionally to carry out physiological processes. Thus, it seems logical that the impact of sequence variation in one protein could be significantly influenced by genetic variants at other loci in a genome. Nonetheless, the importance of such genetic interactions, known as epistasis, in explaining phenotypic variation remains a matter of debate in genetics. Recent work from our lab revealed that genes implicated from an association study of toxin tolerance in Saccharomyces cerevisiae show extensive interactions with the genetic background: most implicated genes, regardless of allele, are important for toxin tolerance in only one of two tested strains. The prevalence of background effects in our study adds to other reports of widespread genetic-background interactions in model organisms. We suggest that these effects represent many-way interactions with myriad features of the cellular system that vary across classes of individuals. Such gene-by-system interactions may influence diverse traits and require new modeling approaches to accurately represent genotype-phenotype relationships across individuals.
Advantages of new cardiovascular risk-assessment strategies in high-risk patients with hypertension.
Ruilope, Luis M; Segura, Julian
2005-10-01
Accurate assessment of cardiovascular disease (CVD) risk in patients with hypertension is important when planning appropriate treatment of modifiable risk factors. The causes of CVD are multifactorial, and hypertension seldom exists as an isolated risk factor. Classic models of risk assessment are more accurate than a simple counting of risk factors, but they are not generalizable to all populations. In addition, the risk associated with hypertension is graded, continuous, and independent of other risk factors, and this is not reflected in classic models of risk assessment. This article is intended to review both classic and newer models of CVD risk assessment. MEDLINE was searched for articles published between 1990 and 2005 that contained the terms cardiovascular disease, hypertension, or risk assessment. Articles describing major clinical trials, new data about cardiovascular risk, or global risk stratification were selected for review. Some patients at high long-term risk for CVD events (eg, patients aged <50 years with multiple risk factors) may go untreated because they do not meet the absolute risk-intervention threshold of 20% risk over 10 years with the classic model. Recognition of the limitations of classic risk-assessment models led to new guidelines, particularly those of the European Society of Hypertension-European Society of Cardiology. These guidelines view hypertension as one of many risk and disease factors that require treatment to decrease risk. These newer guidelines include a more comprehensive range of risk factors and more finely graded blood pressure ranges to stratify patients by degree of risk. Whether they accurately predict CVD risk in most populations is not known. Evidence from the Valsartan Antihypertensive Long-term Use Evaluation (VALUE) study, which stratified patients by several risk and disease factors, highlights the predictive value of some newer CVD risk assessments. 
Modern risk assessments, which include blood pressure along with a wide array of modifiable risk factors, may be more accurate than classic models for CVD risk prediction.
Manifold structure preservative for hyperspectral target detection
NASA Astrophysics Data System (ADS)
Imani, Maryam
2018-05-01
A nonparametric method termed manifold structure preservative (MSP) is proposed in this paper for hyperspectral target detection. MSP transforms the feature space of the data to maximize the separation between target and background signals. Moreover, it minimizes the reconstruction error of targets and preserves the topological structure of the data in the projected feature space. MSP does not need to assume any distribution for the target and background data, so it can achieve accurate results in real scenarios by avoiding unreliable assumptions. The proposed MSP detector is compared to several popular detectors, and experiments on synthetic data and two real hyperspectral images indicate its superior target-detection ability.
Background oriented schlieren in a density stratified fluid.
Verso, Lilly; Liberzon, Alex
2015-10-01
Non-intrusive quantitative fluid density measurement methods are essential in stratified flow experiments. Digital imaging has led to synthetic schlieren methods in which the variations of the index of refraction are reconstructed computationally. In this study, an extension to one of these methods, called background oriented schlieren, is proposed. The extension enables an accurate reconstruction of the density field in stratified liquid experiments. Typically, the experiments are performed with the light source and background pattern on one side of a transparent vessel and the camera on the opposite side. The multimedia imaging through air-glass-water-glass-air leads to an additional aberration that destroys the reconstruction. A two-step calibration and image remapping transform are the key components that correct the images through the stratified media and provide non-intrusive, full-field density measurements of transparent liquids.
Comparison of Model Prediction with Measurements of Galactic Background Noise at L-Band
NASA Technical Reports Server (NTRS)
LeVine, David M.; Abraham, Saji; Kerr, Yann H.; Wilson, Willam J.; Skou, Niels; Sobjaerg, S.
2004-01-01
The spectral window at L-band (1.413 GHz) is important for passive remote sensing of surface parameters such as soil moisture and sea surface salinity that are needed to understand the hydrological cycle and ocean circulation. Radiation from celestial (mostly galactic) sources is strong in this window and an accurate accounting for this background radiation is often needed for calibration. Modern radio astronomy measurements in this spectral window have been converted into a brightness temperature map of the celestial sky at L-band suitable for use in correcting passive measurements. This paper presents a comparison of the background radiation predicted by this map with measurements made with several modern L-band remote sensing radiometers. The agreement validates the map and the procedure for locating the source of down-welling radiation.
ON THE PROPER USE OF THE REDUCED SPEED OF LIGHT APPROXIMATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gnedin, Nickolay Y., E-mail: gnedin@fnal.gov
I show that the reduced speed of light (RSL) approximation, when used properly (i.e., as originally designed, only for local sources but not for the cosmic background), remains a highly accurate numerical method for modeling cosmic reionization. Simulated ionization and star formation histories from the “Cosmic Reionization on Computers” project are insensitive to the adopted value of the RSL for as long as that value does not fall below about 10% of the true speed of light. A recent claim of the failure of the RSL approximation in the Illustris reionization model appears to be due to the effective speed of light being reduced in the equation for the cosmic background as well, and hence illustrates the importance of maintaining the correct speed of light in modeling the cosmic background.
Ptosis assessment spectacles: a new method of measuring lid position and movement in children.
Khandwala, Mona; Dey, Sarju; Harcourt, Cassie; Wood, Clive; Jones, Carole A
2011-01-01
Accurate assessment of eyelid position and movement is vital in planning the surgical correction of ptosis. Conventional measurements taken using a millimeter ruler are considered the gold standard, although in young children this can be a difficult procedure. The authors have designed ptosis assessment spectacles with a measuring millimeter scale marked on the center of the lens to facilitate accurate assessment of eyelid position and function in children. The purpose of the study was to assess the accuracy and reproducibility of eyelid measurement using these ptosis assessment spectacles. Fifty-two children aged 2-12 years were recruited in this study. Each child underwent 2 sets of measurements. The first was undertaken by an ophthalmologist in the conventional manner using a ruler, and the second set made with ptosis assessment spectacles. On each occasion the palpebral aperture, skin crease, and levator function were recorded in millimeters. A verbal analog scale was used to assess parent satisfaction with each method. Clinically acceptable reproducibility was shown with the ruler and the spectacles for all measurements: palpebral aperture, skin crease, and levator function. Parents significantly preferred the glasses for measurement, as compared with the ruler (p < 0.05). The spectacles are as accurate as conventional methods of measurement, but are easier to use. Children tolerate these spectacles well, and most parents preferred them to the ruler.
Reducing DRIFT backgrounds with a submicron aluminized-mylar cathode
NASA Astrophysics Data System (ADS)
Battat, J. B. R.; Daw, E.; Dorofeev, A.; Ezeribe, A. C.; Fox, J. R.; Gauvreau, J.-L.; Gold, M.; Harmon, L.; Harton, J.; Lafler, R.; Landers, J.; Lauer, R. J.; Lee, E. R.; Loomba, D.; Lumnah, A.; Matthews, J.; Miller, E. H.; Mouton, F.; Murphy, A. St. J.; Paling, S. M.; Phan, N.; Sadler, S. W.; Scarff, A.; Schuckman, F. G.; Snowden-Ifft, D.; Spooner, N. J. C.; Walker, D.
2015-09-01
Background events in the DRIFT-IId dark matter detector, mimicking potential WIMP signals, are predominantly caused by alpha decays on the central cathode in which the alpha particle is completely or partially absorbed by the cathode material. We installed a 0.9 μm thick aluminized-mylar cathode as a way to reduce the probability of producing these backgrounds. We study three generations of cathode (wire, thin-film, and radiologically clean thin-film) with a focus on the ratio of background events to alpha decays. Two independent methods of measuring the absolute alpha decay rate are used to ensure an accurate result, and agree to within 10%. Using alpha range spectroscopy, we measure the radiologically cleanest cathode version to have a contamination of 3.3±0.1 ppt 234U and 73±2 ppb 238U. This cathode reduces the probability of producing an RPR from an alpha decay by a factor of 70±20 compared to the original stainless steel wire cathode. First results are presented from a texturized version of the cathode, intended to be even more transparent to alpha particles. These efforts, along with other background reduction measures, have resulted in a drop in the observed background rate from 500/day to 1/day. With the recent implementation of full-volume fiducialization, these remaining background events are identified, allowing for background-free operation.
Gearing, Robin E; Lizardi, Dana
2009-09-01
Religion impacts suicidality. One's degree of religiosity can potentially serve as a protective factor against suicidal behavior. To accurately assess risk of suicide, it is imperative to understand the role of religion in suicidality. PsycINFO and MEDLINE databases were searched for published articles on religion and suicide between 1980 and 2008. Epidemiological data on suicidality across four religions, and the influence of religion on suicidality are presented. Practice guidelines are presented for incorporating religiosity into suicide risk assessment. Suicide rates and risk and protective factors for suicide vary across religions. It is essential to assess for degree of religious commitment and involvement to accurately identify suicide risk.
NASA Astrophysics Data System (ADS)
Mittelstaedt, Eric; Davaille, Anne; van Keken, Peter E.; Gracias, Nuno; Escartin, Javier
2010-10-01
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ~5%-7%, with the method most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck'09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ~10°C-15°C effluent reach ~5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
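The core of both DFV stages is a cross-correlation that finds the displacement best aligning two fields (pixel intensities in stage one, deformation vectors in stage two). A minimal sketch of that step in Python/NumPy, assuming an FFT-based formulation with periodic boundaries (the function name and details are illustrative, not the paper's implementation):

```python
import numpy as np

def xcorr_shift(a, b):
    """Estimate the integer pixel shift of image b relative to image a
    via FFT-based cross-correlation (periodic boundaries assumed)."""
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    corr = np.fft.ifft2(np.conj(A) * B).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak indices to signed shifts (wrap-around)
    return np.array([p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)])
```

In a full PIV pipeline this step would run on many small interrogation windows to build a deformation-vector field; the second DFV stage then cross-correlates those vector fields between consecutive PIV results to recover the flow velocity.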
Background Characterization for Thermal Ion Release Experiments with 224Ra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwong, H.; /Stanford U., Phys. Dept.; Rowson, P.
The Enriched Xenon Observatory for neutrinoless double beta decay uses 136Ba identification as a means for verifying the decay's occurrence in 136Xe. A current challenge is the release of Ba ions from the Ba extraction probe, and one possible solution is to heat the probe to high temperatures to release the ions. The investigation of this method requires a characterization of the alpha decay background in our test apparatus, which uses a 228Th source that produces 224Ra daughters, the ionization energies of which are similar to those of Ba. For this purpose, we ran a background count with our apparatus maintained at a vacuum, and then three counts with the apparatus filled with Xe gas. We were able to match up our alpha spectrum in vacuum with the known decay scheme of 228Th, while the spectrum in xenon gas had too many unresolved ambiguities for an accurate characterization. We also found that the alpha decays occurred at a near-zero rate both in vacuum and in xenon gas, which indicates that the rate was determined by 228Th decays. With these background measurements, we can in the future make a more accurate measurement of the temperature dependency of the ratio of ions to neutral atoms released from the hot surface of the probe, which may lead to a successful method of Ba ion release.
Alternate Assessment Manual for the Arizona Student Achievement Program
ERIC Educational Resources Information Center
Arizona Department of Education, 2005
2005-01-01
The Alternate Assessment Code of Ethics informs school personnel involved in alternate assessments of ethical, nondiscriminatory assessment practices and underscores the diligence necessary to provide accurate assessment data for instructional decision-making. The importance of commitment and adherence to the Alternate Assessment Code of Ethics by…
An experimental method for the assessment of color simulation tools.
Lillo, Julio; Alvaro, Leticia; Moreira, Humberto
2014-07-22
The Simulcheck method for evaluating the accuracy of color simulation tools in relation to dichromats is described and used to test three color simulation tools: Variantor, Coblis, and Vischeck. A total of 10 dichromats (five protanopes, five deuteranopes) and 10 normal trichromats participated in the current study. Simulcheck includes two psychophysical tasks: the Pseudoachromatic Stimuli Identification task and the Minimum Achromatic Contrast task. The Pseudoachromatic Stimuli Identification task allows determination of the two chromatic angles (h_uv values) that generate a minimum response in the yellow–blue opponent mechanism and, consequently, pseudoachromatic stimuli (greens or reds). The Minimum Achromatic Contrast task requires the selection of the gray background that produces minimum contrast (near zero change in the achromatic mechanism) for each pseudoachromatic stimulus selected in the previous task (L_R values). Results showed important differences in the colorimetric transformations performed by the three evaluated simulation tools and their accuracy levels. Vischeck simulation accurately implemented the algorithm of Brettel, Viénot, and Mollon (1997). Only Vischeck appeared accurate (similarity in h_uv and L_R values between real and simulated dichromats) and, consequently, could render reliable color selections. It is concluded that Simulcheck is a consistent method because it provided an equivalent pattern of results for h_uv and L_R values irrespective of the stimulus set used to evaluate a simulation tool. Simulcheck was also considered valid because real dichromats provided expected h_uv and L_R values when performing the two psychophysical tasks included in this method. © 2014 ARVO.
Accuracy and consistency of weights provided by home bathroom scales
2013-01-01
Background Self-reported body weight is often used for calculation of Body Mass Index because it is easy to collect. Little is known about sources of error introduced by using bathroom scales to measure weight at home. The objective of this study was to evaluate the accuracy and consistency of digital versus dial-type bathroom scales commonly used for self-reported weight. Methods Participants brought functioning bathroom scales (n = 18 dial-type, n = 43 digital-type) to a central location. Trained researchers assessed accuracy and consistency using certified calibration weights at 10 kg, 25 kg, 50 kg, 75 kg, 100 kg, and 110 kg. Data also were collected on frequency of calibration, age and floor surface beneath the scale. Results All participants reported using their scale on hard surface flooring. Before calibration, all digital scales displayed 0, but dial scales displayed a mean absolute initial weight of 0.95 (1.9 SD) kg. Digital scales accurately weighed test loads whereas dial-type scale weights differed significantly (p < 0.05). Imprecision of dial scales was significantly greater than that of digital scales at all weights (p < 0.05). Accuracy and precision did not vary by scale age. Conclusions Digital home bathroom scales provide sufficiently accurate and consistent weights for public health research. Reminders to zero scales before each use may further improve accuracy of self-reported weight. PMID:24341761
Gutiérrez, J. J.; Russell, James K.
2016-01-01
Background. Cardiopulmonary resuscitation (CPR) feedback devices are being increasingly used. However, current accelerometer-based devices overestimate chest displacement when CPR is performed on soft surfaces, which may lead to insufficient compression depth. Aim. To assess the performance of a new algorithm for measuring compression depth and rate based on two accelerometers in a simulated resuscitation scenario. Materials and Methods. Compressions were provided to a manikin on two mattresses, foam and sprung, with and without a backboard. One accelerometer was placed on the chest and the second at the manikin's back. Chest displacement and mattress displacement were calculated from the spectral analysis of the corresponding acceleration every 2 seconds and subtracted to compute the actual sternal-spinal displacement. Compression rate was obtained from the chest acceleration. Results. Median unsigned error in depth was 2.1 mm (4.4%). Error was 2.4 mm in the foam and 1.7 mm in the sprung mattress (p < 0.001). Error was 3.1/2.0 mm and 1.8/1.6 mm with/without backboard for foam and sprung, respectively (p < 0.001). Median error in rate was 0.9 cpm (1.0%), with no significant differences between test conditions. Conclusion. The system provided accurate feedback on chest compression depth and rate on soft surfaces. Our solution compensated for mattress displacement, avoiding overestimation of compression depth when CPR is performed on soft surfaces. PMID:27999808
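The compensation idea described above (chest displacement minus mattress displacement, each recovered from its accelerometer by spectral analysis) can be sketched as follows. This is a minimal illustration under my own assumptions; the function names, frequency band, and windowing are not the authors' algorithm:

```python
import numpy as np

def displacement_from_accel(acc, fs, fmin=0.5, fmax=20.0):
    """Recover displacement from acceleration by double integration in the
    frequency domain: X(f) = -A(f) / (2*pi*f)**2, restricted to a band so
    that low-frequency drift does not dominate."""
    A = np.fft.rfft(acc)
    f = np.fft.rfftfreq(len(acc), d=1.0 / fs)
    X = np.zeros_like(A)
    band = (f >= fmin) & (f <= fmax)
    X[band] = -A[band] / (2 * np.pi * f[band]) ** 2
    return np.fft.irfft(X, len(acc))

def sternal_spinal_depth(chest_acc, back_acc, fs):
    """Subtract mattress displacement (back sensor) from chest displacement
    (chest sensor) to estimate actual sternal-spinal compression depth."""
    return displacement_from_accel(chest_acc, fs) - displacement_from_accel(back_acc, fs)
```

Dividing by -(2*pi*f)^2 in the frequency domain is double integration of the acceleration; the band limit plays the role of the high-pass filtering that time-domain double integration would otherwise need to suppress drift.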
Mahon, Jeffrey L.; Beam, Craig A.; Marcovina, Santica M.; Boulware, David C.; Palmer, Jerry P.; Winter, William E.; Skyler, Jay S.; Krischer, Jeffrey P.
2018-01-01
Background Detection of below-threshold first-phase insulin release or FPIR (1 + 3 minute insulin concentrations during an intravenous glucose tolerance test [IVGTT]) is important in type 1 diabetes prediction and prevention studies including the TrialNet Oral Insulin Prevention Trial. We assessed whether an insulin immunoenzymometric assay (IEMA) could replace the less practical but current standard of a radioimmunoassay (RIA) for FPIR. Methods One hundred thirty-three islet autoantibody positive relatives of persons with type 1 diabetes underwent 161 IVGTTs. Insulin concentrations were measured by both assays in 1056 paired samples. A rule classifying FPIR (below-threshold, above-threshold, uncertain) by the IEMA was derived and validated against FPIR by the RIA. Results The insulin IEMA-based rule accurately classified below- and above-threshold FPIRs by the RIA in 110/161 (68%) IVGTTs, but was uncertain in 51/161 (32%) tests for which FPIR by RIA is needed. An uncertain FPIR by the IEMA was more likely among below-threshold vs above-threshold FPIRs by the RIA (64% [30/47] vs. 18% [21/114], respectively; p < 0.05). Conclusions An insulin IEMA for FPIR in subjects at risk for type 1 diabetes accurately determined below- and above-threshold FPIRs in 2/3 of tests relative to the current standard of the insulin RIA, but could not reliably classify the remaining FPIRs. TrialNet is limiting the insulin RIA for FPIR to the latter given the practical advantages of the more specific IEMA. PMID:21843518
Billi, Fabrizio; Benya, Paul; Kavanaugh, Aaron; Adams, John; Ebramzadeh, Edward; McKellop, Harry
2012-02-01
Numerous studies indicate highly crosslinked polyethylenes reduce the wear debris volume generated by hip arthroplasty acetabular liners. This, in turn, requires new methods to isolate and characterize them. We describe a method for extracting polyethylene wear particles from bovine serum typically used in wear tests and for characterizing their size, distribution, and morphology. Serum proteins were completely digested using an optimized enzymatic digestion method that prevented the loss of the smallest particles and minimized their clumping. Density-gradient ultracentrifugation was designed to remove contaminants and recover the particles without filtration, depositing them directly onto a silicon wafer. This provided uniform distribution of the particles and high contrast against the background, facilitating accurate, automated, morphometric image analysis. The accuracy and precision of the new protocol were assessed by recovering and characterizing particles from wear tests of three types of polyethylene acetabular cups (no crosslinking and 5 Mrads and 7.5 Mrads of gamma irradiation crosslinking). The new method demonstrated important differences in the particle size distributions and morphologic parameters among the three types of polyethylene that could not be detected using prior isolation methods. The new protocol overcomes a number of limitations, such as loss of nanometer-sized particles and artifactual clumping, among others. The analysis of polyethylene wear particles produced in joint simulator wear tests of prosthetic joints is a key tool to identify the wear mechanisms that produce the particles and predict and evaluate their effects on periprosthetic tissues.
Myocardial strains from 3D displacement encoded magnetic resonance imaging
2012-01-01
Background The ability to measure and quantify myocardial motion and deformation provides a useful tool to assist in the diagnosis, prognosis and management of heart disease. The recent development of magnetic resonance imaging methods, such as harmonic phase analysis of tagging and displacement encoding with stimulated echoes (DENSE), make detailed non-invasive 3D kinematic analyses of human myocardium possible in the clinic and for research purposes. A robust analysis method is required, however. Methods We propose to estimate strain using a polynomial function which produces local models of the displacement field obtained with DENSE. Given a specific polynomial order, the model is obtained as the least squares fit of the acquired displacement field. These local models are subsequently used to produce estimates of the full strain tensor. Results The proposed method is evaluated on a numerical phantom as well as in vivo on a healthy human heart. The evaluation showed that the proposed method produced accurate results and showed low sensitivity to noise in the numerical phantom. The method was also demonstrated in vivo by assessment of the full strain tensor and to resolve transmural strain variations. Conclusions Strain estimation within a 3D myocardial volume based on polynomial functions yields accurate and robust results when validated on an analytical model. The polynomial field is capable of resolving the measured material positions from the in vivo data, and the obtained in vivo strains values agree with previously reported myocardial strains in normal human hearts. PMID:22533791
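The strain-estimation step described above (a local polynomial least-squares model of the displacement field, differentiated to obtain the strain tensor) can be illustrated with a 2D toy version; the paper works with full 3D DENSE displacement fields, and the function below and its names are my own sketch, not the authors' code:

```python
import numpy as np

def strain_from_displacement(xs, ys, u, v, order=2):
    """Fit each displacement component with a 2D polynomial (least squares),
    then evaluate the analytic gradient at the origin to build the
    infinitesimal strain tensor e = 0.5 * (grad u + grad u^T)."""
    # full 2D polynomial basis of the given order
    cols = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    M = np.column_stack([xs**i * ys**j for i, j in cols])
    cu, *_ = np.linalg.lstsq(M, u, rcond=None)
    cv, *_ = np.linalg.lstsq(M, v, rcond=None)

    def grad_at_origin(c):
        # at (0,0) only the linear terms contribute to the gradient
        dx = sum(ci for ci, ij in zip(c, cols) if ij == (1, 0))
        dy = sum(ci for ci, ij in zip(c, cols) if ij == (0, 1))
        return dx, dy

    dudx, dudy = grad_at_origin(cu)
    dvdx, dvdy = grad_at_origin(cv)
    exy = 0.5 * (dudy + dvdx)
    return np.array([[dudx, exy], [exy, dvdy]])
```

Because the polynomial is differentiated analytically, the strain estimate inherits the noise-smoothing of the least-squares fit rather than amplifying noise as finite differences of raw displacements would.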
Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.
2016-01-01
Background Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures towards achieving symmetrization and a satisfactory patient outcome. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods A review of the published English literature dating from 1950 to 2015 using databases, such as PubMed, Medline, Web of Science, and EMBASE, was undertaken. Results Since Bouman’s first description of the water displacement method, a range of volumetric assessment techniques has been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have been unreliable, difficult to execute, and of limited practicability. The introduction of 3D surface imaging has revolutionized the field due to its ease of use, speed, accuracy, and reliability, but its widespread use has been limited by its high cost and a lack of high-level evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool has been elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as web-based 3D surface imaging programs, 4D imaging, and 3D printing, appear promising. PMID:27047788
Terrestrial laser scanning-based bridge structural condition assessment : InTrans project reports.
DOT National Transportation Integrated Search
2016-05-01
Objective, accurate, and fast assessment of a bridge's structural condition is critical to the timely assessment of safety risks. Current practices for bridge condition assessment rely on visual observations and manual interpretation of reports a...
Assessment of distraction from erotic stimuli by nonerotic interference.
Anderson, Alex B; Hamilton, Lisa Dawn
2015-01-01
Distraction from erotic cues during sexual encounters is a major contributor to sexual difficulties in men and women. Being able to assess distraction in studies of sexual arousal will help clarify underlying contributions to sexual problems. The current study aimed to identify the most accurate assessment of distraction from erotic cues in healthy men (n = 29) and women (n = 38). Participants were assigned to a no distraction, low distraction, or high distraction condition. Distraction was induced using an auditory distraction task presented during the viewing of an erotic video. Attention to erotic cues was assessed using three methods: a written quiz, a visual quiz, and a self-reported distraction measure. Genital and psychological sexual responses were also measured. Self-reported distraction and written quiz scores most accurately represented the level of distraction present, while self-reported distraction also corresponded with a decrease in genital arousal. Findings support the usefulness of self-report measures in conjunction with a brief quiz on the erotic material as the most accurate and sensitive ways to simply measure experimentally-induced distraction. Insight into distraction assessment techniques will enable evaluation of naturally occurring distraction in patients suffering from sexual problems.
Assessment of validity with polytrauma Veteran populations.
Bush, Shane S; Bass, Carmela
2015-01-01
Veterans with polytrauma have suffered injuries to multiple body parts and organ systems, including the brain. The injuries can generate a triad of physical, neurologic/cognitive, and emotional symptoms. Accurate diagnosis is essential for the treatment of these conditions and for fair allocation of benefits. To accurately diagnose polytrauma disorders and their related problems, clinicians take into account the validity of reported history and symptoms, as well as clinical presentations. The purpose of this article is to describe the assessment of validity with polytrauma Veteran populations. Review of scholarly and other relevant literature and clinical experience are utilized. A multimethod approach to validity assessment that includes objective, standardized measures increases the confidence that can be placed in the accuracy of self-reported symptoms and physical, cognitive, and emotional test results. Due to the multivariate nature of polytrauma and the multiple disciplines that play a role in diagnosis and treatment, an ideal model of validity assessment with polytrauma Veteran populations utilizes neurocognitive, neurological, neuropsychiatric, and behavioral measures of validity. An overview of these validity assessment approaches as applied to polytrauma Veteran populations is presented. Veterans, the VA, and society are best served when accurate diagnoses are made.
NASA Astrophysics Data System (ADS)
Hoffman, Joseph Loris
1999-11-01
This study examined the information-seeking strategies and science content understandings learners developed as a result of using on-line resources in the University of Michigan Digital Library and on the World Wide Web. Eight pairs of sixth grade students from two teachers' classrooms were observed during inquiries for astronomy, ecology, geology, and weather, and a final transfer task assessed learners' capabilities at the end of the school year. Data included video recordings of students' screen activity and conversations, journals and completed activity sheets, final artifacts, and semi-structured interviews. Learners' information-seeking strategies included activities related to asking, planning, tool usage, searching, assessing, synthesizing, writing, and creating. Analysis of data found a majority of learners posed meaningful, open-ended questions, used technological tools appropriately, developed pertinent search topics, were thoughtful in queries to the digital library, browsed sites purposefully to locate information, and constructed artifacts with novel formats. Students faced challenges when planning activities, assessing resources, and synthesizing information. Possible explanations were posed linking pedagogical practices with learners' growth and use of inquiry strategies. Data from classroom-lab video and teacher interviews showed varying degrees of student scaffolding: development and critique of initial questions, utilization of search tools, use of journals for reflection on activities, and requirements for final artifacts. Science content understandings included recalling information, offering explanations, articulating relationships, and extending explanations. A majority of learners constructed partial understandings limited to information recall and simple explanations, and these occasionally contained inaccurate conceptualizations. Web site design features had some influence on the construction of learners' content understandings.
Analysis of data suggests sites with high quality general design, navigation, and content helped to foster the construction of broad and accurate understandings, while context and interactivity had less impact. However, student engagement with inquiry strategies had a greater impact on the construction of understandings. Gaining accurate and in-depth understandings from on-line resources is a complex process for young learners. Teachers can support students by helping them engage in all phases of the information-seeking process, locate useful information with prescreened resources, build background understanding with off-line instruction, and process new information deeply through extending writing and conversation.
Oladele, Edward Adekola; Ormond, Louise; Adeyemi, Olusegun; Patrick, David; Okoh, Festus; Oresanya, Olusola Bukola; Valadez, Joseph J.
2012-01-01
Background In Nigeria, 30% of child deaths are due to malaria. The National Malaria Control Program of Nigeria (NMCP) during 2009 initiated a program to improve the quality of paediatric malaria services delivered in health facilities (HF). This study reports a rapid approach used to assess the existing quality of services in Jigawa state at decentralised levels of the health system. Methods NMCP selected Lot Quality Assurance Sampling (LQAS) to identify the variation in HF service quality among Senatorial Districts (SD). LQAS was selected because it was affordable and could be used by local health workers (HW) in a population-based survey. NMCP applied a 2-stage LQAS using a structured Rapid Health Facility Assessment (R-HFA) tool to identify high and low performing SD for specified indicators. Findings LQAS identified variations in HF performance (n = 21) and enabled resources to be targeted to address priorities. All SD exhibited deficient essential services, supplies and equipment. Only 9.7% of HF had Artemisinin-based Combination Therapies and other first-line treatments for childhood illnesses. No SD and few HF exhibited adequate HW performance for the assessment, treatment or counselling of sick children. Using the IMCI algorithm, 17.5% of HW assessed the child’s vaccination status, 46.8% assessed nutritional status, and 65.1% assessed children for dehydration. Only 5.1% of HW treatments were appropriate for the assessment. Exit interviews revealed that 5.1% of caregivers knew their children’s illness, and only 19.9% could accurately describe how to administer the prescribed drug. Conclusion This R-HFA, using LQAS principles, is a rapid, simple tool for assessing malaria services and can be used at scale. It identified technical deficiencies that could be corrected by improved continuing medical education, targeted supervision, and recurrent R-HFA assessments of the quality of services. PMID:23028519
ERIC Educational Resources Information Center
Hall, Tracey E.; Baker, Scott
This paper provides background information on school reform and describes efforts to implement an assessment system for students with disabilities in 12 nongraded primary classrooms. Background information briefly covers the school restructuring movement, the history of nongraded primary education, alternative assessment strategies which focus on…
USDA-ARS?s Scientific Manuscript database
Background: Accurate determination of food-borne pathogen serotype and genotype information is important for disease surveillance and outbreak source tracking. E. coli serotype O157:H7 and non-O157 of Shiga toxin-producing E. coli (STEC) serogroups, including O26, O45, O103, O111, O121, O145 (top ...
ERIC Educational Resources Information Center
Almeida, Renita A.; Dickinson, J. Edwin; Maybery, Murray T.; Badcock, Johanna C.; Badcock, David R.
2010-01-01
The Embedded Figures Test (EFT) requires detecting a shape within a complex background and individuals with autism or high Autism-spectrum Quotient (AQ) scores are faster and more accurate on this task than controls. This research aimed to uncover the visual processes producing this difference. Previously we developed a search task using radial…
ERIC Educational Resources Information Center
Springer, Kristen W.; Sheridan, Jennifer; Kuo, Daphne; Carnes, Molly
2007-01-01
Objective: Child maltreatment has been linked to negative adult health outcomes; however, much past research includes only clinical samples of women, focuses exclusively on sexual abuse and/or fails to control for family background and childhood characteristics, both potential confounders. Further research is needed to obtain accurate,…
Veronika Leitold; Michael Keller; Douglas C Morton; Bruce D Cook; Yosio E Shimabukuro
2015-01-01
Background: Carbon stocks and fluxes in tropical forests remain large sources of uncertainty in the global carbon budget. Airborne lidar remote sensing is a powerful tool for estimating aboveground biomass, provided that lidar measurements penetrate dense forest vegetation to generate accurate estimates of surface topography and canopy heights. Tropical forest areas...
ERIC Educational Resources Information Center
Scott, Robert A.
Which model provides a more accurate picture of future job responsibilities and professional standing for middle-level administrators in colleges and universities: the robot, or the reinsman who has ability, courage, and control but stays in the background? Competing forces are described in this speech: the institution's desire for stability and…
ERIC Educational Resources Information Center
Timmermans, Anneke C.; Kuyper, Hans; Werf, Greetje
2015-01-01
Background: In several tracked educational systems, realizing optimal placements in classes in the first year of secondary education depends on the accuracy of teacher expectations. Aims: The aim of this study was to investigate between-teacher differences in their expectations regarding the academic aptitude of their students. Sample: The sample…
ERIC Educational Resources Information Center
Tanaka, James W.; Wolf, Julie M.; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S.; South, Mikle; McPartland, James C.; Kaiser, Martha D.; Schultz, Robert T.
2012-01-01
Background: Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret…
Simultaneous Inversion of UXO Parameters and Background Response
2012-03-01
Abstract (fragment): …demonstrated an ability to accurately recover dipole parameters using the simultaneous inversion method. Numerical modeling code for solving Maxwell’s… Subject terms: magnetics.
Automated bone age assessment of older children using the radius
NASA Astrophysics Data System (ADS)
Tsao, Sinchai; Gertych, Arkadiusz; Zhang, Aifeng; Liu, Brent J.; Huang, Han K.
2008-03-01
The Digital Hand Atlas in Assessment of Skeletal Development is a large-scale Computer Aided Diagnosis (CAD) project for automating the process of grading Skeletal Development of children from 0-18 years of age. It includes a complete collection of 1,400 normal hand X-rays of children between 0 and 18 years of age. Bone Age Assessment is used as an index of skeletal development for detection of growth pathologies that can be related to endocrine, malnutrition and other disease types. Previous work at the Image Processing and Informatics Lab (IPILab) allowed the bone age CAD algorithm to accurately assess bone age of children from 1 to 16 (male) or 14 (female) years of age using the Phalanges as well as the Carpal Bones. At the older ages (16 [male] or 14 [female] to 19 years of age) the Phalanges as well as the Carpal Bones are fully developed and do not provide well-defined features for accurate bone age assessment. Therefore integration of the Radius Bone as a region of interest (ROI) is greatly needed and will significantly improve the ability to accurately assess the bone age of older children. Preliminary studies show that an integrated Bone Age CAD that utilizes the Phalanges, Carpal Bones and Radius forms a robust method for automatic bone age assessment throughout the entire age range (1-19 years of age).
Accurate Land Company, Inc., Acadia Subdivision, Plat 1 and Plat 2
The EPA is providing notice of an Administrative Penalty Assessment in the form of an Expedited Storm Water Settlement Agreement against Accurate Land Company, Inc., a business located at 12035 University Ave., Suite 100, Clive, IA 50235, for alleged violations.
Identifying walking trips from GPS and accelerometer data in adolescent females
Rodriguez, Daniel; Cho, GH; Elder, John; Conway, Terry; Evenson, Kelly R; Ghosh-Dastidar, Bonnie; Shay, Elizabeth; Cohen, Deborah A; Veblen-Mortenson, Sarah; Pickrell, Julie; Lytle, Leslie
2013-01-01
Background: Studies that have combined accelerometers and global positioning systems (GPS) to identify walking have done so in carefully controlled conditions. This study tested algorithms for identifying walking trips from accelerometer and GPS data in free-living conditions. The study also assessed the accuracy of the locations where walking occurred compared to what participants reported in a diary. Methods: A convenience sample of high school females was recruited (N=42) in 2007. Participants wore a GPS unit and an accelerometer, and recorded their out-of-school travel for six days. Split-sample validation was used to examine agreement in the daily and total number of walking trips with Kappa statistics and count regression models, while agreement in locations visited by walking was examined with geographic information systems. Results: Agreement varied based on the parameters of the algorithm, with algorithms exhibiting moderate to substantial agreement with self-reported daily (Kappa = 0.33–0.48) and weekly (Kappa = 0.41–0.64) walking trips. Comparison of reported locations reached by walking and GPS data suggests that reported locations are accurate. Conclusions: The use of GPS and accelerometers is promising for assessing the number of walking trips and the walking locations of adolescent females. PMID:21934163
Assessment of oxygen supplementation during air travel.
Cramer, D.; Ward, S.; Geddes, D.
1996-01-01
BACKGROUND: The aim of this study was to simulate an in flight environment at sea level with a fractional inspired concentration of oxygen (FiO2) of 0.15 to determine how much supplemental oxygen was needed to restore a subject's oxygen saturation (SaO2) to 90% or to the level previously attained when breathing room air (FiO2 of 0.21). METHODS: Three groups were selected with normal, obstructive, and restrictive lung function. Using a sealed body plethysmograph an environment with an FiO2 of 0.15 was created and mass spectrometry was used to monitor the FiO2. Supplemental oxygen was administered to the patient by nasal cannulae. SaO2 was continuously monitored and recorded at an FiO2 of 0.21, 0.15, and 0.15 + supplemental oxygen. RESULTS: When given 2 l/m of supplemental oxygen all patients in the 15% environment returned to a similar SaO2 value as that obtained using the 21% oxygen environment. One patient with airways obstruction needed 3 l/m of supplemental oxygen to raise his SaO2 above 90%. CONCLUSIONS: This technique, which simulates an aircraft environment, enables an accurate assessment to be made of supplemental oxygen requirements. PMID:8711658
NASA Astrophysics Data System (ADS)
Castillo-López, Elena; Dominguez, Jose Antonio; Pereda, Raúl; de Luis, Julio Manuel; Pérez, Ruben; Piña, Felipe
2017-10-01
Accurate determination of water depth is indispensable in multiple aspects of civil engineering (dock construction, dikes, submarine outfalls, trench control, etc.). Accuracy in bathymetric information is highly dependent on the atmospheric correction applied to the imagery, so the most appropriate type of atmospheric correction for depth estimation must be determined. The reduction of effects such as glint and cross-track illumination in homogeneous shallow-water areas improves the results of the depth estimations. The aim of this work is to assess the best atmospheric correction method for the estimation of depth in shallow waters, considering that reflectance values cannot be greater than 1.5% because otherwise the background would not be seen. This paper addresses the use of hyperspectral imagery for quantitative bathymetric mapping and explores one of the most common problems when attempting to extract depth information in conditions of variable water types and bottom reflectances. The current work assesses the accuracy of some classical bathymetric algorithms (Polcyn-Lyzenga, Philpot, Benny-Dawson, Hamilton, principal component analysis) when four different atmospheric correction methods are applied and water depth is derived. No atmospheric correction is valid for all types of coastal waters, but in heterogeneous shallow water the 6S atmospheric correction model offers good results.
Zimmermann, Nicolle; Cardoso, Caroline de Oliveira; Trentini, Clarissa Marceli; Grassi-Oliveira, Rodrigo; Fonseca, Rochele Paz
2015-01-01
Executive functions are involved in a series of human neurological and psychiatric disorders. For this reason, appropriate assessment tools with age and education adjusted norms for symptom diagnosis are necessary. Objective To present normative data for adults (19-75 year-olds; with five years of education or more) on the Modified Wisconsin Card Sorting Test (MWCST), Stroop color and word test and Digit Span test. Age and education effects were investigated. Methods Three samples were formed after inclusion criteria and data analysis: MWCST (n=124); Digit Span (n=123), and Stroop test (n=158). Groups were divided into young (19-39), middle-aged (40-59) and older (60-75) participants with five to eight years of education and nine years of education or more. Two-way ANOVA and ANCOVA analyses were used. Results Education effects were found in most variables of the three tasks. An age effect was only found on color naming and color-word naming speed from the Stroop test. No interactions were detected. Conclusion In countries with heterogeneous educational backgrounds, the use of stratified norms by education to assess at least some components of executive functions is essential for an ethical and accurate cognitive diagnosis. PMID:29213953
Burns, Ryan; Hannon, James C.; Brusseau, Timothy A.; Shultz, Barry; Eisenman, Patricia
2013-01-01
Background. Previous research suggests that use of BMI as a screening tool to assess health in youth has limitations. Valid alternative measures to assess body composition are needed to accurately identify children who are aerobically fit, which is an indicator of health status. The purpose of this study was to examine the associations between select anthropometric measures and cardiorespiratory fitness test performance in middle-school students. Methods. Participants included 134 students (65 boys and 69 girls) recruited from the 6th, 7th, and 8th grades. Anthropometric measures consisted of BMI, waist circumference (WC), waist-to-height ratio (WHtR), and percent body fat estimated from two-site skinfolds (%BF-SKF), as well as the hand-held OMRON BIA device (%BF-BIA). Cardiorespiratory fitness tests included the one-mile run and PACER test. Data were collected on four separate testing days during the students' physical education classes. Results. There were statistically significant moderate correlations between the %BF estimations, WHtR, and cardiorespiratory fitness test scores in both genders (P < .001). BMI at best displayed only weak correlations with the cardiorespiratory fitness test scores. Conclusions. The results suggest that alternative measures such as %BF-SKF, %BF-BIA, and WHtR may be more valid indicators of youth aerobic fitness, supporting their use in preference to BMI. PMID:23533727
Li, Wylie Wai Yee; Lam, Wendy Wing Tak; Shun, Shiow-Ching; Lai, Yeur-Hur; Law, Wai-Lun; Poon, Jensen; Fielding, Richard
2013-01-01
Background: Accurate assessment of unmet supportive care needs is essential for optimal cancer patient care. This study used confirmatory factor analysis (CFA) to test the known factor structures of the short form of the Supportive Care Needs Survey (SCNS-SF34) in Hong Kong and Taiwan Chinese patients diagnosed with colorectal cancer (CRC). Methods: 360 Hong Kong and 263 Taiwanese Chinese CRC patients completed the Chinese version of the SCNS-SF34. Comparative measures (patient satisfaction, anxiety, depression, and symptom distress) tested convergent validity, while known group differences were examined to test discriminant validity. Results: The original 5-factor and recent 4-factor models of the SCNS demonstrated poor data fit using CFA in both the Hong Kong and Taiwan samples. A modified five-factor model with correlated residuals subsequently demonstrated acceptable fit in both samples. Correlations demonstrated convergent and divergent validity, and known group differences were observed. Conclusions: While the five-factor model demonstrated a better fit for data from Chinese colorectal cancer patients, some of the items within its domains overlapped, suggesting item redundancy. The five-factor model showed good psychometric properties in these samples, but the results also suggest that current conceptualizations of unmet supportive care needs are inadequate. PMID:24146774
Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël
2016-01-01
Background: Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. Purpose: To assess the feasibility of using online software to collect quality patient information. Methods: The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients’ perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Results: Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Conclusions: Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties. PMID:27069272
Elissen, Arianne M J; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk
2015-05-01
To support providers and commissioners in accurately assessing their local populations' health needs, this study produces an overview of Dutch predictive risk models for health care, focusing specifically on the type, combination and relevance of included determinants for achieving the Triple Aim (improved health, better care experience, and lower costs). We conducted a mixed-methods study combining document analyses, interviews and a Delphi study. Predictive risk models were identified based on a web search and expert input. Participating in the study were Dutch experts in predictive risk modelling (interviews; n=11) and experts in healthcare delivery, insurance and/or funding methodology (Delphi panel; n=15). Ten predictive risk models were analysed, comprising 17 unique determinants. Twelve were considered relevant by experts for estimating community health needs. Although some compositional similarities were identified between models, the combination and operationalisation of determinants varied considerably. Existing predictive risk models provide a good starting point, but optimally balancing resources and targeting interventions on the community level will likely require a more holistic approach to health needs assessment. Development of additional determinants, such as measures of people's lifestyle and social network, may require policies pushing the integration of routine data from different (healthcare) sources. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Mittler, Jessica N.; Landon, Bruce E.; Zaslavsky, Alan M.; Cleary, Paul D.
2011-01-01
Background: Medicare beneficiaries' awareness of Medicare managed care plans is critical for realizing the potential benefits of coverage choices. Objectives: To assess the relationships of the number of Medicare risk plans, managed care penetration, and stability of plans in an area with traditional Medicare beneficiaries' awareness of the program. Research Design: Cross-sectional analysis of Medicare Current Beneficiary Survey data about beneficiaries' awareness and knowledge of Medicare managed care plan availability. Logistic regression models were used to assess the relationships between awareness and market characteristics. Subjects: Traditional Medicare beneficiaries (n = 3,597) who had never been enrolled in Medicare managed care, but had at least one plan available in their area in 2002, excluding beneficiaries under 65, receiving Medicaid, or with end stage renal disease. Measures: Traditional Medicare beneficiaries' knowledge of Medicare managed care plans in general and in their area. Results: Having more Medicare risk plans available was significantly associated with greater awareness, and having an intermediate number of plans (2-4) was significantly associated with more accurate knowledge of Medicare risk plan availability than was having fewer or more plans. Conclusions: Medicare may have more success engaging consumers in choice and capturing the benefits of plan competition by more actively selecting and managing the plan choice set. PMID:22340776
EEG Frequency Changes Prior to Making Errors in an Easy Stroop Task
Atchley, Rachel; Klee, Daniel; Oken, Barry
2017-01-01
Background: Mind-wandering is a form of off-task attention that has been associated with negative affect and rumination. The goal of this study was to assess potential electroencephalographic markers of task-unrelated thought, or mind-wandering state, as related to error rates during a specialized cognitive task. We used EEG to record frontal frequency band activity while participants completed a Stroop task that was modified to induce boredom, task-unrelated thought, and therefore mind-wandering. Methods: A convenience sample of 27 older adults (50–80 years) completed a computerized Stroop matching task. Half of the Stroop trials were congruent (word/color match), and the other half were incongruent (mismatched). Behavioral data and EEG recordings were assessed. EEG analysis focused on the 1-s epochs prior to stimulus presentation in order to compare trials followed by correct versus incorrect responses. Results: Participants made errors on 9% of incongruent trials. There were no errors on congruent trials. There was a decrease in alpha and theta band activity during the epochs followed by error responses. Conclusion: Although replication of these results is necessary, these findings suggest that potential mind-wandering, as evidenced by errors, can be characterized by a decrease in alpha and theta activity compared to on-task, accurate performance periods. PMID:29163101
Milner, Danny A.; Valim, Clarissa; Luo, Robert; Playforth, Krupa B.; Kamiza, Steve; Molyneux, Malcolm E.; Seydel, Karl B.; Taylor, Terrie E.
2012-01-01
Background: The conventional clinical case definition of cerebral malaria (CM) is imprecise, but specificity is improved by a definitive clinical feature such as retinopathy or by confirming sequestration of parasites in a post-mortem examination of the brain. A full autopsy is often not possible, since it is costly and may encounter resistance from the deceased's family. Methods: We have assessed the use of a cytological smear of brain tissue, obtained post-mortem by supraorbital sampling, for the purpose of quantifying cerebral sequestration in children with fatal malaria in Blantyre, Malawi. We have compared this method to histological quantification of parasites at autopsy. Results: The number of parasites present on cytological smears correlated with the proportion of vessels parasitized as assessed by histology of fixed and stained brain tissue. Use of cytological results in addition to the standard clinical case definition increases the specificity of the clinical case definition alone from 48.3% to 100% with a minimal change in sensitivity. Conclusions: Post-mortem supraorbital sampling of brain tissue improves the specificity of the diagnosis of fatal cerebral malaria and provides accurate quantitative estimates of cerebral sequestration. This tool can be of great value in clinical, pathogenetic, and epidemiological research studies on cerebral malaria. PMID:22291197
Ben Mansour, Khaireddine; Rezzoug, Nasser; Gorce, Philippe
2015-10-01
The purpose of this paper was to determine which types of inertial sensors and which advocated locations should be used for reliable and accurate gait event detection and temporal parameter assessment in normal adults. In addition, we aimed to remove the ambiguity found in the literature concerning the definition of the initial contact (IC) from the lumbar accelerometer. Acceleration and angular velocity data were gathered from the lumbar region and the distal edge of each shank. These data were evaluated against an instrumented treadmill and an optoelectronic system during five treadmill speed sessions. For the lumbar accelerometer, the peak of the anteroposterior component was the most accurate for IC detection. Similarly, the valley that followed the peak of the vertical component was the most precise for terminal contact (TC) detection. Results based on ANOVA and Tukey tests showed that the set of inertial methods was suitable for temporal gait assessment and gait event detection in able-bodied subjects. For gait event detection, an exception was found with the shank accelerometer: it was suitable for temporal parameter assessment, despite the high root mean square error on the detection of IC (RMSE-IC) and TC (RMSE-TC). The shank gyroscope was found to be as accurate as the kinematic method, since the statistical tests revealed no significant difference between the two techniques in the RMSE of all gait events and temporal parameters. The lumbar and shank accelerometers were the most accurate alternatives to the shank gyroscope for gait event detection and temporal parameter assessment, respectively. Copyright © 2015. Published by Elsevier B.V.
Quality assessment of color images based on the measure of just noticeable color difference
NASA Astrophysics Data System (ADS)
Chou, Chun-Hsien; Hsu, Yun-Hsiang
2014-01-01
Accurate assessment of the quality of color images is an important step in many image processing systems that convey visual information of the reproduced images. An accurate objective image quality assessment (IQA) method is expected to give results that agree closely with subjective assessment. To assess the quality of color images, many approaches simply apply a gray-scale quality metric to each of the three color channels of the color image, neglecting the correlation among the three channels. In this paper, a metric for assessing the quality of color images is proposed, in which the model of variable just-noticeable color difference (VJNCD) is employed to estimate the visibility thresholds of distortion inherent in each color pixel. With the estimated visibility thresholds, the proposed metric measures the average perceptible distortion in terms of the quantized distortion according to a perceptual error map similar to that defined by the National Bureau of Standards (NBS) for converting the color difference enumerated by CIEDE2000 to an objective score of perceptual quality. The perceptual error map in this case is designed for each pixel according to the visibility threshold estimated by the VJNCD model. The performance of the proposed metric is verified by assessing the test images in the LIVE database, and is compared with those of many well-known IQA metrics. Experimental results indicate that the proposed metric is an effective IQA method that can accurately predict the quality of color images in terms of the correlation between objective scores and subjective evaluation.
Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin
2011-03-01
Noise characterization through estimation of the noise power spectrum (NPS) is a central component of the evaluation of digital x-ray systems. Extensive work has been conducted to achieve accurate and precise measurement of the NPS. One approach to improving the accuracy of the NPS measurement is to reduce the statistical variance of the NPS results by involving more data samples. However, this method is based on the assumption that the noise in a radiographic image arises from stochastic processes. In practical data, artifacts are always superimposed on the stochastic noise as low-frequency background trends and prevent accurate NPS estimation. The purpose of this study was to investigate an appropriate background detrending technique to improve the accuracy of NPS estimation for digital x-ray systems. To find the optimal background detrending technique, four methods for artifact removal were quantitatively studied and compared: (1) subtraction of a low-pass-filtered version of the image, (2) subtraction of a 2-D first-order fit to the image, (3) subtraction of a 2-D second-order polynomial fit to the image, and (4) subtraction of two uniform exposure images. In addition, background trend removal was separately applied within the original region of interest or its partitioned sub-blocks for all four methods. The performance of the background detrending techniques was compared according to the statistical variance of the NPS results and the suppression of the low-frequency systematic rise. Among the four methods, subtraction of a 2-D second-order polynomial fit to the image was most effective in suppressing the low-frequency systematic rise and reducing variance in the NPS estimate for the authors' digital x-ray system. Subtraction of a low-pass-filtered version of the image increased the NPS variance above low-frequency components because of the side-lobe effects of the frequency response of the boxcar filtering function. Subtracting two uniform exposure images gave the worst result in terms of the smoothness of the NPS curve, although it was effective in suppressing the low-frequency systematic rise. Subtraction of a 2-D first-order fit to the image was also effective for background detrending, but less so than subtraction of a 2-D second-order polynomial fit for the authors' digital x-ray system. As a result of this study, the authors verified that it is necessary and feasible to obtain a better NPS estimate by appropriate background trend removal. Subtraction of a 2-D second-order polynomial fit to the image was the most appropriate technique for background detrending, without consideration of processing time.
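The detrending step identified above as most effective, subtracting a least-squares 2-D second-order polynomial fit from the region of interest before computing the NPS, can be sketched as follows. This is a minimal illustration assuming a NumPy environment; the function names `detrend_poly2` and `nps_2d` and the normalization convention are illustrative choices, not taken from the paper.

```python
import numpy as np

def detrend_poly2(img):
    """Subtract a least-squares 2-D second-order polynomial fit from an image ROI."""
    ny, nx = img.shape
    y, x = np.mgrid[0:ny, 0:nx].astype(float)
    # Design matrix for a full second-order 2-D polynomial: 1, x, y, xy, x^2, y^2.
    A = np.stack([np.ones_like(x), x, y, x * y, x**2, y**2], axis=-1).reshape(-1, 6)
    coef, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    trend = (A @ coef).reshape(ny, nx)
    return img - trend

def nps_2d(roi, pixel_pitch=1.0):
    """2-D noise power spectrum of a detrended ROI: |FFT|^2 scaled by pixel area / N."""
    ny, nx = roi.shape
    f = np.fft.fftshift(np.fft.fft2(detrend_poly2(roi)))
    return (np.abs(f) ** 2) * (pixel_pitch ** 2) / (nx * ny)
```

Because the quadratic trend lies inside the fitted model space, the subtraction removes it exactly and leaves only the stochastic noise, which is the property the background-detrending comparison in the abstract relies on.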
Almajwal, Ali M; Williams, Peter G; Batterham, Marijka J
2011-07-01
To assess the accuracy of resting energy expenditure (REE) measurement in a sample of overweight and obese Saudi males, using the BodyGem device (BG) with whole-room calorimetry (WRC) as a reference, and to evaluate the accuracy of predictive equations. Thirty-eight subjects (mean ± SD: age 26.8 ± 3.7 years, body mass index 31.0 ± 4.8) were recruited during the period from 5 February 2007 to 28 March 2008. Resting energy expenditure was measured using WRC and the BG device, and also calculated using 7 prediction equations. Mean differences, bias, percent bias (%bias), accurate estimation, underestimation and overestimation were calculated. Repeated measures with the BG were not significantly different (accurate prediction: 81.6%; %bias 1.1 ± 6.3, p>0.24), with limits of agreement ranging from +242 to -200 kcal. Resting energy expenditure measured by BG was significantly less than WRC values (accurate prediction: 47.4%; %bias: 11.0 ± 14.6, p = 0.0001), with unacceptably wide limits of agreement. The Harris-Benedict, Schofield and World Health Organization equations were the most accurate, estimating REE within 10% of measured REE, but none seems appropriate for predicting the REE of individuals. There was poor agreement between the REE measured by WRC compared to BG or predictive equations. The BG assessed REE accurately in 47.4% of the subjects on an individual level.
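The Harris-Benedict equation named above among the most accurate predictors can be applied directly. The sketch below uses the classic coefficients (kcal/day; weight in kg, height in cm, age in years) together with the "within 10% of measured REE" accuracy criterion the abstract uses; the function names are illustrative, not from the study.

```python
def harris_benedict_ree(weight_kg, height_cm, age_yr, male=True):
    """Classic Harris-Benedict resting energy expenditure estimate (kcal/day)."""
    if male:
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.775 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

def accurate_estimation(predicted, measured):
    """Validation criterion used in REE studies: prediction within ±10% of measured REE."""
    return abs(predicted - measured) <= 0.10 * measured
```

For example, a 27-year-old male weighing 90 kg at 175 cm gives roughly 2000 kcal/day, which would count as an "accurate estimation" only if the calorimetry value fell within about ±200 kcal of it.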
42 CFR 412.610 - Assessment schedule.
Code of Federal Regulations, 2010 CFR
2010-10-01
... section. (e) Accuracy of the patient assessment data. The encoded patient assessment data must accurately... instrument record retention. An inpatient rehabilitation facility must maintain all patient assessment data sets completed on Medicare Part A fee-for-service patients within the previous 5 years and Medicare...
Tudrej, Benoit V.; Heintz, Anne-Laure; Rehman, Michaela B.; Marcelli, Daniel; Ingrand, Pierre; Binder, Philippe
2017-01-01
Background: Most adolescents consult their general practitioner (GP) for common reasons, somatic or administrative, but many of them have hidden feelings of distress. Objectives: To assess the immediate impact of ‘ordinary’ consultations on feelings of distress among adolescents, to compare adolescents experiencing difficulties (D) to those with no difficulties (N), and to analyse how accurately GPs assess the impact of their consultation on adolescents’ feelings. Methods: GPs were randomly selected from two non-contiguous French administrative areas between April and June 2006. Fifty-three GPs gave two questionnaires to the first 10 to 15 adolescents aged 12 to 20 seen in consultation, one before the consultation and the other afterwards. Adolescents rated different aspects of their well-being and indicated where they would seek help if they had problems. A GP questionnaire assessed how well GPs estimated their impact on the adolescent’s feeling of well-being. Results: Six hundred and sixty-five adolescents were assessed. They reported feeling better about their health, being able to talk, having someone to talk to or to confide in, and feeling understood. The D group (n = 147) felt significantly better compared to the N group (n = 518). GPs tended to underestimate this improvement, especially regarding adolescents in the D group feeling better about their health. Conclusions: Consulting a GP generates increased well-being among adolescents, especially those experiencing difficulties. GPs tend to underestimate the positive impact they may have. Further studies are needed to explore whether this benefit persists over time. PMID:28714758
Raman, Subha V.; Sahu, Anurag; Merchant, Ali Z.; Louis, Louis B.; Firstenberg, Michael S.; Sun, Benjamin
2009-01-01
Background: Left ventricular assist devices (LVADs) provide a bridge to recovery or heart transplantation, but require serial assessment. Echocardiographic approaches may be limited by device artifact and acoustic window. Cardiovascular computed tomography (CCT) provides noninvasive imaging of LVADs, yet no study has evaluated CCT’s impact on clinical care. We evaluated the diagnostic findings and clinical impact of CCT for noninvasive assessment of patients with LVADs. Methods: CCT examinations performed between 2005 and 2008 in patients with LVADs were identified. Acquisitions were completed on the identical 64 detector-row scanner with intravenous contrast administration; electrocardiographic gating was used in patients with pulsatile devices, while peripheral pulse gating was used in patients with continuous-flow devices. Comparison was made between CCT results and 30-day outcomes, including echocardiographic and intraoperative findings. Results: Thirty-two CCT examinations from 28 patients were reviewed. Indications included evaluation of low cardiac output symptoms, assessment of cannula position, low flow reading on the LVAD, and surgical planning. CCT identified critical findings in 6 patients, including thrombosis and inlet cannula malposition, all confirmed intraoperatively; one case of intra-LVAD thrombus was missed by CCT. Using intraoperative findings as the gold standard, CCT’s sensitivity was 85% and specificity was 100%. Echocardiographic LVAD evaluation did not correlate with findings on CCT (kappa = −0.29, 95% CI −0.73 to 0.13). Conclusions: This preliminary observational cohort study indicates that noninvasive imaging of LVADs using CCT is feasible and accurate. CCT warrants consideration in the initial evaluation of symptomatic patients with LVADs. PMID:19782594
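The sensitivity and specificity figures reported against the intraoperative gold standard follow from the standard confusion-matrix definitions; a minimal sketch (the counts in the usage note are illustrative, not the study's tabulated data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    if tp + fn == 0 or tn + fp == 0:
        raise ValueError("need at least one positive and one negative case")
    return tp / (tp + fn), tn / (tn + fp)
```

For instance, 17 true positives with 3 missed cases and no false positives among 50 negatives would yield sensitivity 0.85 and specificity 1.00.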
Herrod, Pjj; Cox, M; Keevil, H; Smith, Kje; Lund, J N
2018-04-01
Background and aims: Late recognition of sepsis and consequent death remains a problem. To address this, the National Institute for Health and Care Excellence has published updated guidance recommending the use of the Quick Sequential Organ Failure Assessment (Q-SOFA) score when assessing patients at risk of sepsis, following the publication of the Third International Consensus Definitions for Sepsis and Septic Shock. The trauma from major surgery produces a systemic inflammatory response syndrome (SIRS) postoperatively as part of its natural history, which may falsely trigger scoring systems. We aimed to assess the accuracy of the Q-SOFA and SIRS criteria as recommended scores for early detection of sepsis and septic complications in the first 48 hours after colorectal cancer surgery. Methods: We reviewed all elective major colorectal operations in a single centre during a 12-month period from prospectively maintained electronic records. Results: One hundred and thirty-nine patients were included in this study. In all, 29 patients developed postoperative infective complications in hospital. Nineteen patients triggered on SIRS without developing infective complications, while 42 patients triggered on Q-SOFA with no infective complications. The area under the ROC curve was 0.52 for Q-SOFA and 0.67 for SIRS. Discussion: Q-SOFA appears to perform little better than a coin toss at identifying postoperative sepsis after colorectal cancer resection and is inferior to the SIRS criteria. More work is required to assess whether a combination of scoring criteria, biochemical markers and automated tools could increase accurate detection of postoperative infection and trigger early intervention.
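The "coin toss" comparison refers to the area under the ROC curve, for which 0.5 is chance-level discrimination, so Q-SOFA's 0.52 sits barely above that baseline. AUC can be computed via the rank-sum (Mann-Whitney) formulation; a sketch assuming binary outcome labels (1 = infective complication), with illustrative data rather than the study's:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    if not pos or not neg:
        raise ValueError("need both positive and negative cases")
    # Count score pairs where the positive case outranks the negative; ties count 0.5.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A score that is constant across all patients yields an AUC of exactly 0.5, the coin-toss value the discussion invokes.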
Interpretation of Brain CT Scans in the Field by Critical Care Physicians in a Mobile Stroke Unit
Zakariassen, Erik; Lindner, Thomas; Nome, Terje; Bache, Kristi G.; Røislien, Jo; Gleditsch, Jostein; Solyga, Volker; Russell, David; Lund, Christian G.
2017-01-01
ABSTRACT BACKGROUND AND PURPOSE In acute stroke, thromboembolism or spontaneous hemorrhage abruptly reduces blood flow to a part of the brain. To limit necrosis, rapid radiological identification of the pathological mechanism must be conducted to allow the initiation of targeted treatment. The aim of the Norwegian Acute Stroke Prehospital Project is to determine if anesthesiologists, trained in prehospital critical care, may accurately assess cerebral computed tomography (CT) scans in a mobile stroke unit (MSU). METHODS In this pilot study, 13 anesthesiologists assessed unselected acute stroke patients with a cerebral CT scan in an MSU. The scans were simultaneously available by teleradiology to the receiving hospital and the on‐call radiologist. CT scan interpretation focused on the radiological diagnosis of acute stroke and contraindications for thrombolysis. The aim of this study was to determine inter‐rater agreement between the pre‐ and in‐hospital radiological assessments. A neuroradiologist evaluated all CT scans retrospectively. Inter‐rater agreement was analyzed with Cohen's kappa. RESULTS Fifty‐one cerebral CT scans from the MSU were included. Inter‐rater agreement between prehospital anesthesiologists and the in‐hospital on‐call radiologists was excellent for radiological selection for thrombolysis (kappa = 0.87). Prehospital CT scans were conducted in a median of 10 minutes (7 and 14 minutes) in the MSU, and a median of 39 minutes (31 and 48 minutes) before arrival at the receiving hospital. CONCLUSION This pilot study shows that anesthesiologists trained in prehospital critical care may effectively assess cerebral CT scans in an MSU and determine if there are radiological contraindications for thrombolysis. PMID:28766306
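Cohen's kappa, used here for inter-rater agreement, corrects raw agreement for the agreement expected by chance alone. A self-contained sketch with hypothetical thrombolysis-eligibility decisions (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on paired labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    fa, fb = Counter(rater_a), Counter(rater_b)
    # Expected agreement if both raters labelled independently at their
    # own marginal rates.
    expected = sum(fa[c] * fb.get(c, 0) for c in fa) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical thrombolysis-eligibility calls (1 = eligible), not study data:
prehospital = [1, 1, 0, 1, 0, 1, 1, 0]
in_hospital = [1, 1, 0, 1, 0, 1, 0, 0]
print(round(cohens_kappa(prehospital, in_hospital), 2))  # → 0.75
```

A value of 1.0 is perfect agreement, 0 is chance-level; the study's 0.87 is conventionally classed as excellent.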
2014-01-01
Background and purpose It is difficult to evaluate glenoid component periprosthetic radiolucencies in total shoulder arthroplasties (TSAs) using plain radiographs. This study was performed to evaluate whether computed tomography (CT) using a specific patient position in the CT scanner provides a better method for assessing radiolucencies in TSA. Methods Following TSA, 11 patients were CT scanned in a lateral decubitus position with maximum forward flexion, which aligns the glenoid orientation with the axis of the CT scanner. Follow-up CT scanning is part of our routine patient care. Glenoid component periprosthetic lucency was assessed according to the Molé score and compared to routine plain radiographs by 5 observers. Results The protocol almost completely eliminated metal artifacts in the CT images and allowed accurate assessment of periprosthetic lucency of the glenoid fixation. Positioning of the patient within the CT scanner as described was possible for all 11 patients. A radiolucent line was identified in 54 of the 55 observed CT scans and osteolysis was identified in 25 observations. The average radiolucent line Molé score was 3.4 (SD 2.7) points with plain radiographs and 9.5 (SD 0.8) points with CT scans (p = 0.001). The mean intra-observer variance was lower in the CT scan group than in the plain radiograph group (p = 0.001). Interpretation The CT scan protocol we used is of clinical value in routine assessment of glenoid periprosthetic lucency after TSA. The technique improves the ability to detect and monitor radiolucent lines and, therefore, possibly also implant loosening. PMID:24286563
A systematic approach to adnexal masses discovered on ultrasound: the ADNEx MR scoring system.
Sadowski, Elizabeth A; Robbins, Jessica B; Rockall, Andrea G; Thomassin-Naggara, Isabelle
2018-03-01
Adnexal lesions are a common occurrence in radiology practice and imaging plays a crucial role in triaging women appropriately. Current trends toward early detection and characterization have increased the need for accurate imaging assessment of adnexal lesions prior to treatment. Ultrasound is the first-line imaging modality for assessing adnexal lesions; however, approximately 20% of lesions are incompletely characterized after ultrasound evaluation. Secondary assessment with MR imaging using the ADNEx MR Scoring System has been demonstrated as highly accurate in the characterization of adnexal lesions and in excluding ovarian cancer. This review will address the role of MR imaging in further assessment of adnexal lesions discovered on US, and the utility of the ADNEx MR Scoring System.
Benjamin, Sara E; Neelon, Brian; Ball, Sarah C; Bangdiwala, Shrikant I; Ammerman, Alice S; Ward, Dianne S
2007-01-01
Background Few assessment instruments have examined the nutrition and physical activity environments in child care, and none are self-administered. Given the emerging focus on child care settings as a target for intervention, a valid and reliable measure of the nutrition and physical activity environment is needed. Methods To measure inter-rater reliability, 59 child care center directors and 109 staff completed the self-assessment concurrently, but independently. Three weeks later, a repeat self-assessment was completed by a sub-sample of 38 directors to assess test-retest reliability. To assess criterion validity, a researcher-administered environmental assessment was conducted at 69 centers and was compared to a self-assessment completed by the director. A weighted kappa test statistic and percent agreement were calculated to assess agreement for each question on the self-assessment. Results For inter-rater reliability, kappa statistics ranged from 0.20 to 1.00 across all questions. Test-retest reliability of the self-assessment yielded kappa statistics that ranged from 0.07 to 1.00. The inter-quartile kappa statistic ranges for inter-rater and test-retest reliability were 0.45 to 0.63 and 0.27 to 0.45, respectively. When percent agreement was calculated, questions ranged from 52.6% to 100% for inter-rater reliability and 34.3% to 100% for test-retest reliability. Kappa statistics for validity ranged from -0.01 to 0.79, with an inter-quartile range of 0.08 to 0.34. Percent agreement for validity ranged from 12.9% to 93.7%. Conclusion This study provides estimates of criterion validity, inter-rater reliability and test-retest reliability for an environmental nutrition and physical activity self-assessment instrument for child care. Results indicate that the self-assessment is a stable and reasonably accurate instrument for use with child care interventions. 
We therefore recommend the Nutrition and Physical Activity Self-Assessment for Child Care (NAP SACC) instrument to researchers and practitioners interested in conducting healthy weight intervention in child care. However, a more robust, less subjective measure would be more appropriate for researchers seeking an outcome measure to assess intervention impact. PMID:17615078
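The weighted kappa reported above gives partial credit to near-misses between ordinal ratings rather than treating every disagreement as total. A minimal sketch of linearly weighted kappa with hypothetical 3-point ratings (the exact weighting scheme used by the study is not stated here, so linear weights are an assumption):

```python
from collections import Counter

def weighted_kappa(a, b, categories):
    """Linearly weighted kappa for ordinal ratings: disagreements are
    penalized in proportion to their distance on the category scale."""
    idx = {c: i for i, c in enumerate(categories)}
    k, n = len(categories), len(a)
    w = lambda i, j: abs(i - j) / (k - 1)   # 0 = agree, 1 = maximal miss
    observed = sum(w(idx[x], idx[y]) for x, y in zip(a, b)) / n
    fa, fb = Counter(a), Counter(b)
    expected = sum(fa[x] * fb[y] * w(idx[x], idx[y])
                   for x in fa for y in fb) / n ** 2
    return 1 - observed / expected

# Hypothetical 3-point ratings (e.g. director vs. researcher), not study data:
director = [1, 2, 3, 2, 1, 3]
researcher = [1, 2, 2, 2, 1, 3]
print(round(weighted_kappa(director, researcher, categories=[1, 2, 3]), 2))  # → 0.8
```

The same function with all weights equal to 0/1 reduces to ordinary (unweighted) Cohen's kappa.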
An assessment of RELAP5-3D using the Edwards-O'Brien Blowdown problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomlinson, E.T.; Aumiller, D.L.
1999-07-01
The RELAP5-3D (version bt) computer code was used to assess the United States Nuclear Regulatory Commission's Standard Problem 1 (Edwards-O'Brien Blowdown Test). The RELAP5-3D standard installation problem based on the Edwards-O'Brien Blowdown Test was modified to model the appropriate initial conditions and to represent the proper location of the instruments present in the experiment. The results obtained using the modified model are significantly different from those of the original calculation, indicating the need to model the experimental conditions accurately if an accurate assessment of the calculational model is to be obtained.
2017-09-01
ERDC/CHL TR-17-15, Strategic Environmental Research and Development Program: Develop Accurate Methods for Characterizing and...current environments. This research will provide more accurate methods for assessing contaminated sediment stability for many DoD and Environmental... Executive Summary Objective: The proposed research goal is to develop laboratory methods
NASA Astrophysics Data System (ADS)
Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei
2018-02-01
For ship target detection in cluttered infrared image sequences, a robust detection method is put forward, based on a probabilistic single Gaussian model of the sea background in the Fourier domain. The amplitude spectrum sequence at each frequency point of the pure seawater images in the Fourier domain, being more stable than the gray-value sequence of each background pixel in the spatial domain, is modeled as a Gaussian distribution. Next, a probability-weighted matrix is built based on the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, the false-alarm points are removed using ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with other methods.
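The core of the scheme can be sketched compactly: fit a per-frequency Gaussian to the amplitude spectra of pure-seawater frames, then flag frequency points of a new frame that deviate strongly from that model. This simplified sketch on synthetic data omits the probability-weighted matrix and the shape-based false-alarm removal described above:

```python
import numpy as np

def background_model(sea_frames):
    """Per-frequency Gaussian model (mean, std) of the amplitude spectra
    of pure-seawater frames; axis 0 indexes time."""
    spectra = np.abs(np.fft.fft2(sea_frames, axes=(-2, -1)))
    return spectra.mean(axis=0), spectra.std(axis=0) + 1e-9

def foreground_mask(frame, mu, sigma, k=4.0):
    """Flag frequency points deviating more than k sigma from background."""
    return np.abs(np.abs(np.fft.fft2(frame)) - mu) > k * sigma

rng = np.random.default_rng(0)
sea = rng.normal(size=(20, 32, 32))      # synthetic seawater-only frames
mu, sigma = background_model(sea)
target = rng.normal(size=(32, 32))
target[10:14, 10:14] += 8.0              # bright ship-like patch
mask = foreground_mask(target, mu, sigma)
print(mask.sum(), "anomalous frequency points")
```

The bright patch concentrates energy in the low-frequency coefficients, which is where the mask fires; a frame of pure background produces few or no flags at the same threshold.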
Modulated Raman Spectroscopy for Enhanced Cancer Diagnosis at the Cellular Level
De Luca, Anna Chiara; Dholakia, Kishan; Mazilu, Michael
2015-01-01
Raman spectroscopy is emerging as a promising and novel biophotonics tool for non-invasive, real-time diagnosis of tissue and cell abnormalities. However, the presence of a strong fluorescence background is a key issue that can detract from the use of Raman spectroscopy in routine clinical care. The review summarizes the state-of-the-art methods to remove the fluorescence background and explores recent achievements to address this issue obtained with modulated Raman spectroscopy. This innovative approach can be used to extract the Raman spectral component from the fluorescence background and improve the quality of the Raman signal. We describe the potential of modulated Raman spectroscopy as a rapid, inexpensive and accurate clinical tool to detect the presence of bladder cancer cells. Finally, in a broader context, we show how this approach can greatly enhance the sensitivity of integrated Raman spectroscopy and microfluidic systems, opening new prospects for portable higher throughput Raman cell sorting. PMID:26110401
On the proper use of the reduced speed of light approximation
Gnedin, Nickolay Y.
2016-12-07
I show that the Reduced Speed of Light (RSL) approximation, when used properly (i.e. as originally designed - only for the local sources but not for the cosmic background), remains a highly accurate numerical method for modeling cosmic reionization. Simulated ionization and star formation histories from the "Cosmic Reionization On Computers" (CROC) project are insensitive to the adopted value of the reduced speed of light for as long as that value does not fall below about 10% of the true speed of light. Here, a recent claim of the failure of the RSL approximation in the Illustris reionization model appears to be due to the effective speed of light being reduced in the equation for the cosmic background too, and, hence, illustrates the importance of maintaining the correct speed of light in modeling the cosmic background.
Can muon-induced backgrounds explain the DAMA data?
NASA Astrophysics Data System (ADS)
Klinger, Joel; Kudryavtsev, Vitaly A.
2016-05-01
We present an accurate simulation of the muon-induced background in the DAMA/LIBRA experiment. Muon sampling underground has been performed using the MUSIC/MUSUN codes, and subsequent interactions in the rock around the DAMA/LIBRA detector cavern and the experimental setup, including shielding, have been simulated with GEANT4.9.6. In total we simulate the equivalent of 20 years of muon data. We have calculated the total muon-induced neutron flux in the DAMA/LIBRA detector cavern as Φμ,n = 1.0 × 10⁻⁹ cm⁻² s⁻¹, which is consistent with other simulations. After selecting events which satisfy the DAMA/LIBRA signal criteria, our simulation predicts 3.49 × 10⁻⁵ cpd/kg/keV, which accounts for less than 0.3% of the DAMA/LIBRA modulation amplitude. We conclude from our work that muon-induced backgrounds are unable to contribute to the observed signal modulation.
Kuligowski, J; Quintás, G; Garrigues, S; de la Guardia, M
2010-03-15
A new background correction method for the on-line coupling of gradient liquid chromatography and Fourier transform infrared spectrometry has been developed. It is based on the use of a point-to-point matching algorithm that compares the absorption spectra of the sample data set with those of a previously recorded reference data set in order to select an appropriate reference spectrum. The spectral range used for the point-to-point comparison is selected with minimal user interaction, thus considerably facilitating the application of the whole method. The background correction method has been successfully tested on a chromatographic separation of four nitrophenols running acetonitrile (0.08%, v/v TFA):water (0.08%, v/v TFA) gradients with compositions ranging from 35 to 85% (v/v) acetonitrile, giving accurate results for both baseline-resolved and overlapped peaks. Copyright (c) 2009 Elsevier B.V. All rights reserved.
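The reference-selection step can be sketched as a point-to-point comparison over a user-chosen spectral window, followed by subtraction of the best-matching eluent spectrum. A toy illustration (the spectra and window below are invented, not from the paper):

```python
import numpy as np

def select_reference(sample_spectrum, reference_set, window):
    """Pick the reference spectrum closest to the sample within a chosen
    spectral window (point-to-point absolute difference)."""
    lo, hi = window
    d = np.abs(reference_set[:, lo:hi] - sample_spectrum[lo:hi]).sum(axis=1)
    return reference_set[np.argmin(d)]

# Toy spectra: three eluent backgrounds at different gradient compositions.
refs = np.array([[0.1] * 10, [0.5] * 10, [0.9] * 10])
sample = np.array([0.48] * 5 + [1.2] * 5)   # analyte absorbs in second half
background = select_reference(sample, refs, window=(0, 5))
corrected = sample - background
print(corrected.round(2))
```

Matching only inside the analyte-free window (the first five points here) lets the algorithm pick the background recorded at the closest gradient composition before subtracting it everywhere.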
A depth enhancement strategy for kinect depth image
NASA Astrophysics Data System (ADS)
Quan, Wei; Li, Hua; Han, Cheng; Xue, Yaohong; Zhang, Chao; Hu, Hanping; Jiang, Zhengang
2018-03-01
Kinect is a motion sensing input device which is widely used in computer vision and other related fields. However, there is much inaccurate depth data in Kinect depth images, even with Kinect v2. In this paper, an algorithm is proposed to enhance Kinect v2 depth images. According to the principle of its depth measurement, the foreground and the background are considered separately. For the background, holes are filled according to the depth data in the neighborhood. For the foreground, a filling algorithm based on the color image, taking both spatial and color information into account, is proposed. An adaptive joint bilateral filtering method is used to reduce noise. Experimental results show that the processed depth images have clean backgrounds and clear edges. The results are better than those of traditional strategies. The method can be applied in 3D reconstruction to preprocess depth images in real time and obtain accurate results.
DNDO Report: Predicting Solar Modulation Potentials for Modeling Cosmic Background Radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behne, Patrick Alan
The modeling of the detectability of special nuclear material (SNM) at ports and border crossings requires accurate knowledge of the background radiation at those locations. Background radiation originates from two main sources, cosmic and terrestrial. Cosmic background is produced by high-energy galactic cosmic rays (GCR) entering the atmosphere and inducing a cascade of particles that eventually impact the earth’s surface. The solar modulation potential represents one of the primary inputs to modeling cosmic background radiation. Usoskin et al. formally define solar modulation potential as “the mean energy loss [per unit charge] of a cosmic ray particle inside the heliosphere…” Modulation potential, a function of elevation, location, and time, shares an inverse relationship with cosmic background radiation. As a result, radiation detector thresholds require adjustment to account for differing background levels, caused partly by differing solar modulations. Failure to do so can result in higher rates of false positives and failed detection of SNM for low and high levels of solar modulation potential, respectively. This study focuses on solar modulation's time dependence, and seeks the best method to predict modulation for future dates using Python. To address the task of predicting future solar modulation, we utilize both non-linear least squares sinusoidal curve fitting and cubic spline interpolation. This material will be published in transactions of the ANS winter meeting of November 2016.
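Once a solar-cycle period is fixed, a sinusoidal fit becomes linear in its coefficients and needs no iterative solver; this is a deliberate simplification of the non-linear least-squares fitting mentioned above. A sketch on synthetic data (the 11-year period and the modulation values in MV are assumptions for illustration):

```python
import numpy as np

PERIOD = 11.0  # years; approximate solar-cycle length (assumption)

def fit_modulation(t, phi):
    """Least-squares fit phi(t) ≈ c0 + c1·sin(wt) + c2·cos(wt). With the
    period fixed, the model is linear in c, so ordinary lstsq suffices."""
    w = 2 * np.pi / PERIOD
    design = lambda x: np.column_stack(
        [np.ones_like(x), np.sin(w * x), np.cos(w * x)])
    coef, *_ = np.linalg.lstsq(design(t), phi, rcond=None)
    return coef, lambda x: design(x) @ coef

# Synthetic modulation-potential series (values invented):
t = np.linspace(2000.0, 2016.0, 200)
phi = 650.0 + 300.0 * np.sin(2 * np.pi * t / PERIOD)
coef, predict = fit_modulation(t, phi)
print(coef.round(1))                              # offset, sin, cos terms
print(round(float(predict(np.array([2020.0]))[0]), 1))  # extrapolated value
```

Fitting the period itself, as in the study, would require a genuinely non-linear solver; cubic spline interpolation is the complementary approach for dates inside the observed range.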
Pool, Robert; Montgomery, Catherine M.; Morar, Neetha S.; Mweemba, Oliver; Ssali, Agnes; Gafos, Mitzy; Lees, Shelley; Stadler, Jonathan; Nunn, Andrew; Crook, Angela; Hayes, Richard; McCormack, Sheena
2010-01-01
Background Accurate data on adherence and sexual behaviour are crucial in microbicide (and other HIV-related) research. In the absence of a “gold standard”, the collection of such data relies largely on participant self-reporting. The Microbicides Development Programme has developed a mixed method/triangulation model for generating more accurate data on adherence and sexual behaviour. Methodology/Principal Findings Data were collected from a random subsample of 725 women using structured case record form (CRF) interviews, coital diaries (CD) and in-depth interviews (IDI). Returned used and unused gel applicators were counted and additional data collected through focus group discussions and ethnography. The model is described in detail in a companion paper [1]. When CRF, CD and IDI are compared, there is some inconsistency with regard to reporting of sexual behaviour, gel or condom use in more than half. Inaccuracies are least prevalent in the IDI and most prevalent in the CRF, where participants tend to under-report frequency of sex and gel and condom use. Women reported more sex, gel and condom use than their partners. IDI data on adherence match the applicator-return data more closely than the CRF. The main reasons for inaccuracies are participant forgetfulness, interviewer error, desirability bias, and problems with the definition and delineation of key concepts (e.g. “sex act”). Most inaccuracies were unintentional and could be rectified during data collection. Conclusions/Significance The CRF – the main source of self-report data on behaviour and adherence in many studies – was the least accurate with regard to measuring sexual behaviour, gel and condom use. This has important implications for the use of structured questionnaires for the collection of data on sexual behaviour and adherence. Integrating in-depth interviews and triangulation into clinical trials could increase the richness and accuracy of behavioural and adherence data. PMID:20657774
French, Dustin D; Gill, Manjot; Mitchell, Christopher; Jackson, Kathryn; Kho, Abel; Bryar, Paul J
2016-01-01
Background Visual acuity is the primary measure used in ophthalmology to determine how well a patient can see. Visual acuity for a single eye may be recorded in multiple ways for a single patient visit (eg, Snellen vs. Jäger units vs. font print size), and be recorded for either distance or near vision. Capturing the best documented visual acuity (BDVA) of each eye in an individual patient visit is an important step for making electronic ophthalmology clinical notes useful in research. Objective Currently, there is limited methodology for capturing BDVA in an efficient and accurate manner from electronic health record (EHR) notes. We developed an algorithm to detect BDVA for right and left eyes from defined fields within electronic ophthalmology clinical notes. Methods We designed an algorithm to detect the BDVA from defined fields within 295,218 ophthalmology clinical notes with visual acuity data present. A total of 5668 unique responses were identified, and an algorithm was developed to map all of the unique responses to a structured list of Snellen visual acuities. Results Visual acuity was captured from a total of 295,218 ophthalmology clinical notes during the study dates. The algorithm identified all visual acuities in the defined visual acuity section for each eye and returned a single BDVA for each eye. A clinician chart review of 100 random patient notes showed 99% accuracy in detecting BDVA from these records and a 1% observed error. Conclusions Our algorithm successfully captures best documented Snellen distance visual acuity from ophthalmology clinical notes and transforms a variety of inputs into a structured Snellen equivalent list. Our work, to the best of our knowledge, represents the first attempt at capturing visual acuity accurately from large numbers of electronic ophthalmology notes. Use of this algorithm can benefit research groups interested in assessing visual acuity for patient-centered outcomes.
All codes used for this study are currently available, and will be made available online at https://phekb.org. PMID:27146002
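The mapping step can be sketched as a lookup from free-text acuity strings to Snellen-equivalent denominators, keeping the best (numerically smallest) value per eye. The table below is a tiny hypothetical stand-in for the study's 5668-entry mapping; the numeric codes for count fingers (CF) and hand motion (HM) are illustrative conventions, not taken from the paper:

```python
# Tiny hypothetical stand-in for the study's free-text-to-Snellen mapping.
SNELLEN_DENOMINATOR = {
    "20/20": 20, "20/40": 40, "20/70": 70, "20/200": 200,
    "cf": 800,   # count fingers (illustrative code)
    "hm": 1000,  # hand motion (illustrative code)
}

def best_documented_va(responses):
    """Best documented visual acuity for one eye: the smallest Snellen
    denominator among all recognized responses, else None."""
    mapped = [SNELLEN_DENOMINATOR[r.strip().lower()]
              for r in responses if r.strip().lower() in SNELLEN_DENOMINATOR]
    return min(mapped, default=None)

print(best_documented_va(["20/40", "20/20", "CF"]))  # → 20
print(best_documented_va(["illegible"]))             # → None
```

Normalizing case and whitespace before lookup is what collapses thousands of raw variants onto a short structured list.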
Horsley, Alex; Macleod, Kenneth; Gupta, Ruchi; Goddard, Nick; Bell, Nicholas
2014-01-01
Background The Innocor device contains a highly sensitive photoacoustic gas analyser that has been used to perform multiple breath washout (MBW) measurements using very low concentrations of the tracer gas SF6. Use in smaller subjects has been restricted by the requirement for a gas analyser response time of <100 ms, in order to ensure accurate estimation of lung volumes at rapid ventilation rates. Methods A series of previously reported and novel enhancements were made to the gas analyser to produce a clinically practical system with a reduced response time. An enhanced lung model system, capable of delivering highly accurate ventilation rates and volumes, was used to assess in vitro accuracy of functional residual capacity (FRC) volume calculation and the effects of flow and gas signal alignment on this. Results The 10–90% rise time was reduced from 154 to 88 ms. In an adult/child lung model, accuracy of volume calculation was −0.9 to 2.9% for all measurements, including those with a ventilation rate of 30/min and FRC of 0.5 L; for the un-enhanced system, accuracy deteriorated at higher ventilation rates and smaller FRC. In a separate smaller lung model (ventilation rate 60/min, FRC 250 ml, tidal volume 100 ml), mean accuracy of FRC measurement for the enhanced system was −0.95% (range −3.8 to 2.0%). Error sensitivity to flow and gas signal alignment was increased by ventilation rate, smaller FRC and slower analyser response time. Conclusion The Innocor analyser can be enhanced to reliably generate highly accurate FRC measurements down to volumes as low as those simulating infant lung settings. Signal alignment is a critical factor. With these enhancements, the Innocor analyser exceeds key technical component recommendations for MBW apparatus. PMID:24892522
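FRC calculation from a multiple breath washout rests on a tracer mass balance: all tracer washed out must have come from the lung volume present at the start. A toy sketch of that balance (numbers invented, not Innocor data):

```python
def frc_from_washout(expired_volumes, tracer_fracs, c_start, c_end):
    """Tracer mass balance: the tracer volume washed out equals
    FRC * (c_start - c_end), so FRC = tracer_out / (c_start - c_end)."""
    tracer_out = sum(v * c for v, c in zip(expired_volumes, tracer_fracs))
    return tracer_out / (c_start - c_end)

# Toy numbers: two 0.5 L breaths wash a lung charged to a 0.2% SF6
# fraction down to ~0%.
frc = frc_from_washout([0.5, 0.5], [0.0012, 0.0008],
                       c_start=0.002, c_end=0.0)
print(round(frc, 3))  # → 0.5  (litres)
```

Because the expired tracer volume is the integral of flow times concentration, any misalignment between the flow and gas signals biases `tracer_out` directly, which is why signal alignment dominates the error analysis above.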
Perceived Versus Objective Breast Cancer Risk in Diverse Women
Fehniger, Julia; Livaudais-Toman, Jennifer; Karliner, Leah; Kerlikowske, Karla; Tice, Jeffrey A.; Quinn, Jessica; Ozanne, Elissa
2014-01-01
Background: Prior research suggests that women do not accurately estimate their risk for breast cancer. Estimating and informing women of their risk is essential for tailoring appropriate screening and risk reduction strategies. Methods: Data were collected for BreastCARE, a randomized controlled trial designed to evaluate a PC-tablet based intervention providing multiethnic women and their primary care physicians with tailored information about breast cancer risk. We included women ages 40–74 visiting general internal medicine primary care clinics at one academic practice and one safety net practice who spoke English, Spanish, or Cantonese, and had no personal history of breast cancer. We collected baseline information regarding risk perception and concern. Women were categorized as high risk (vs. average risk) if their family history met criteria for referral to genetic counseling or if they were in the top 5% of risk for their age based on the Gail or Breast Cancer Surveillance Consortium Model (BCSC) breast cancer risk model. Results: Of 1,261 participants, 25% (N=314) were classified as high risk. More average risk than high risk women had correct risk perception (72% vs. 18%); 25% of both average and high risk women reported being very concerned about breast cancer. Average risk women with correct risk perception were less likely to be concerned about breast cancer (odds ratio [OR]=0.3; 95% confidence interval [CI]=0.2–0.4) while high risk women with correct risk perception were more likely to be concerned about breast cancer (OR=5.1; 95% CI=2.7–9.6). Conclusions: Many women did not accurately perceive their risk for breast cancer. Women with accurate risk perception had an appropriate level of concern about breast cancer. Improved methods of assessing and informing women of their breast cancer risk could motivate high risk women to apply appropriate prevention strategies and allay unnecessary concern among average risk women. PMID:24372085
ADER discontinuous Galerkin schemes for general-relativistic ideal magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Fambri, F.; Dumbser, M.; Köppel, S.; Rezzolla, L.; Zanotti, O.
2018-07-01
We present a new class of high-order accurate numerical algorithms for solving the equations of general-relativistic ideal magnetohydrodynamics in curved space-times. In this paper, we assume the background space-time to be given and static, i.e. we make use of the Cowling approximation. The governing partial differential equations are solved via a new family of fully discrete and arbitrary high-order accurate path-conservative discontinuous Galerkin (DG) finite-element methods combined with adaptive mesh refinement and time accurate local time-stepping. In order to deal with shock waves and other discontinuities, the high-order DG schemes are supplemented with a novel a posteriori subcell finite-volume limiter, which makes the new algorithms as robust as classical second-order total-variation diminishing finite-volume methods at shocks and discontinuities, but also as accurate as unlimited high-order DG schemes in smooth regions of the flow. We show the advantages of this new approach by means of various classical two- and three-dimensional benchmark problems on fixed space-times. Finally, we present a performance and accuracy comparisons between Runge-Kutta DG schemes and ADER high-order finite-volume schemes, showing the higher efficiency of DG schemes.
ADER discontinuous Galerkin schemes for general-relativistic ideal magnetohydrodynamics
NASA Astrophysics Data System (ADS)
Fambri, F.; Dumbser, M.; Köppel, S.; Rezzolla, L.; Zanotti, O.
2018-03-01
We present a new class of high-order accurate numerical algorithms for solving the equations of general-relativistic ideal magnetohydrodynamics in curved spacetimes. In this paper we assume the background spacetime to be given and static, i.e. we make use of the Cowling approximation. The governing partial differential equations are solved via a new family of fully-discrete and arbitrary high-order accurate path-conservative discontinuous Galerkin (DG) finite-element methods combined with adaptive mesh refinement and time accurate local timestepping. In order to deal with shock waves and other discontinuities, the high-order DG schemes are supplemented with a novel a-posteriori subcell finite-volume limiter, which makes the new algorithms as robust as classical second-order total-variation diminishing finite-volume methods at shocks and discontinuities, but also as accurate as unlimited high-order DG schemes in smooth regions of the flow. We show the advantages of this new approach by means of various classical two- and three-dimensional benchmark problems on fixed spacetimes. Finally, we present a performance and accuracy comparisons between Runge-Kutta DG schemes and ADER high-order finite-volume schemes, showing the higher efficiency of DG schemes.
2011-01-01
Background Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and into a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. Methods We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. Results We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome, and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that decision curve analysis can be applied to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, with a few modifications to the original calculation procedure. Conclusions We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application. PMID:21696604
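The net benefit at a threshold probability p_t weighs true positives against false positives by the odds p_t/(1−p_t). A sketch of the standard treated-side net benefit; the "overall" combination shown is one plausible formalization of the concept described above, not necessarily the paper's exact definition:

```python
def net_benefit(tp, fp, n, p_t):
    """Net benefit of treating model-positives at threshold probability p_t:
    benefit of true positives minus harm of false positives, the harm
    weighted by the odds p_t / (1 - p_t)."""
    return tp / n - (fp / n) * p_t / (1 - p_t)

def overall_net_benefit(tp, fp, fn, tn, p_t):
    """Treated-side net benefit plus the symmetric net benefit of
    withholding treatment from model-negatives (one plausible combination)."""
    n = tp + fp + fn + tn
    nb_untreated = tn / n - (fn / n) * (1 - p_t) / p_t
    return net_benefit(tp, fp, n, p_t) + nb_untreated

# Hypothetical 2x2 classification at a 20% threshold probability:
print(round(net_benefit(tp=30, fp=20, n=200, p_t=0.2), 3))                  # → 0.125
print(round(overall_net_benefit(tp=30, fp=20, fn=10, tn=140, p_t=0.2), 3))  # → 0.625
```

Sweeping p_t over a range and plotting the net benefit against it yields the decision curve itself.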
Pregnancy and Neonatal Diabetes Outcomes in Remote Australia (PANDORA) study
2013-01-01
Background Diabetes in pregnancy carries an increased risk of adverse pregnancy outcomes for both the mother and foetus, but it also provides an excellent early opportunity for intervention in the life course for both mother and baby. In the context of the escalating epidemic of chronic diseases among Indigenous Australians, it is vital that this risk is reduced as early as possible in the life course of the individual. The aims of the PANDORA Study are to: (i) accurately assess rates of diabetes in pregnancy in the Northern Territory (NT) of Australia, where 38% of babies are born to Indigenous mothers; (ii) assess demographic, clinical, biochemical, anthropometric, socioeconomic and early life development factors that may contribute to key maternal and neonatal birth outcomes associated with diabetes in pregnancy; and (iii) monitor relevant post-partum clinical outcomes for both the mothers and their babies. Methods/Design Eligible participants are all NT women with diabetes in pregnancy aged 16 years and over. Information collected includes: standard antenatal clinical information, diagnosis and management of diabetes in pregnancy, socio-economic status, standard clinical birth information (delivery, gestational age, birth weight, adverse antenatal and birth outcomes). Cord blood is collected at the time of delivery and detailed neonatal anthropometric measurements performed within 72 hours of birth. Information will also be collected regarding maternal post-partum glucose tolerance and cardio-metabolic risk factor status, breastfeeding and growth of the baby up to 2 years post-partum in the first instance. Discussion This study will accurately document rates and outcomes of diabetes in pregnancy in the NT of Australia, including the high-risk Indigenous Australian population. The results of this study should contribute to policy and clinical guidelines with the goal of reducing the future risk of obesity and diabetes in both mothers and their offspring. 
PMID:24289168
Molina-López, Rafael A.; Casal, Jordi; Darwich, Laila
2011-01-01
Background Morbidity studies complement the understanding of hazards to raptors by identifying natural or anthropogenic factors. Descriptive epidemiological studies of wildlife have become an important source of information about hazards to wildlife populations. Moreover, data referenced to the overall wild population could provide a more accurate assessment of the potential impact of morbidity/mortality causes in populations of wild birds. Methodology/Principal Findings The present study described the morbidity causes of hospitalized wild raptors and their incidence in the wild populations, through a long-term retrospective study conducted at a wildlife rehabilitation centre of Catalonia (1995–2007). Importantly, Seasonal Cumulative Incidences (SCI) were calculated using estimates of the wild population in the region, and trend analyses were applied across the different years. A total of 7021 birds was analysed: 7 species of Strigiformes (n = 3521) and 23 of Falconiformes (n = 3500). The main causes of morbidity were trauma (49.5%), mostly in the Falconiformes, and orphaned/young birds (32.2%), mainly in the Strigiformes. During wintering periods, the largest morbidity incidence was observed in Accipiter gentilis due to gunshot wounds and in Tyto alba due to vehicle trauma. Within the breeding season, Falco tinnunculus (orphaned/young category) and Bubo bubo (electrocution and metabolic disorders) represented the most affected species. Cases due to orphaned/young, infectious/parasitic diseases, electrocution and unknown trauma tended to increase over the years. By contrast, cases due to undetermined cause, vehicle trauma and captivity decreased throughout the study period. Interestingly, gunshot injuries remained constant during the study period. 
Conclusions/Significance Frequencies of morbidity causes, calculated as the proportion of each cause relative to the total number of admitted cases, allowed a qualitative assessment of hazards for the studied populations. However, cumulative incidences based on the estimated wild raptor population provided a more accurate approach to the potential ecological impact of the morbidity causes in the wild populations. PMID:21966362
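The distinction drawn above between the two metrics can be sketched numerically. The counts and population estimate below are hypothetical, purely to contrast a proportion of admissions with a cumulative incidence referenced to an estimated wild population:

```python
# Hypothetical seasonal admission counts by morbidity cause (not study data)
admissions = {
    "trauma": 120,
    "orphaned_young": 80,
    "gunshot": 15,
}
estimated_wild_population = 5000  # hypothetical regional population estimate

total_admissions = sum(admissions.values())

for cause, cases in admissions.items():
    proportion = cases / total_admissions          # share of admitted cases
    incidence = cases / estimated_wild_population  # cases per wild bird
    print(f"{cause}: {proportion:.1%} of admissions, "
          f"cumulative incidence {incidence:.2%} of estimated population")
```

A cause can dominate admissions yet still represent a small incidence in the wild population (and vice versa), which is why the authors argue the population-referenced measure better reflects ecological impact.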
2011-01-01
Background The emergence and massive spread of bluetongue in Western Europe during 2006-2008 had disastrous consequences for sheep and cattle production and confirmed the ability of Palaearctic Culicoides (Diptera: Ceratopogonidae) to transmit the virus. Some aspects of Culicoides ecology, especially host-seeking and feeding behaviors, remain insufficiently described due to the difficulty of collecting them directly on a bait animal, the most reliable method to evaluate biting rates. Our aim was to compare typical animal-baited traps (drop trap and direct aspiration) to both a new sticky cover trap and a UV-light/suction trap (the most commonly used method to collect Culicoides). Methods/results Collections were made from 1.45 hours before sunset to 1.45 hours after sunset in June/July 2009 at an experimental sheep farm (INRA, Nouzilly, Western France), with 3 replicates of a 4 sites × 4 traps randomized Latin square using one sheep per site. Collected Culicoides individuals were sorted morphologically to species and sex, and to physiological stage for females. Sibling species were identified using a molecular assay. A total of 534 Culicoides belonging to 17 species was collected. Abundance was maximal in the drop trap (232 females and 4 males from 10 species), whereas diversity was highest in the UV-light/suction trap (136 females and 5 males from 15 species). Significant between-trap differences in abundance and parity rates were observed. Conclusions Only the direct aspiration collected exclusively host-seeking females, despite a concern that human manipulation may influence estimation of the biting rate. The sticky cover trap accurately assessed the biting rate of abundant species even though it might act as an interception trap. The drop trap collected the highest abundance of Culicoides and may have caught individuals attracted not by the sheep but by its structure. 
Finally, abundances obtained using the UV-light/suction trap did not accurately estimate the Culicoides biting rate. PMID:21707980
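The 4 sites × 4 traps Latin-square rotation described above can be sketched as follows. The trap names follow the abstract, but the cyclic construction and the single-evening-per-row layout are assumptions for illustration; the study used 3 randomized replicates of such a square:

```python
# Minimal sketch of a 4x4 Latin-square trap rotation: over four evenings,
# each trap type occupies each site exactly once.
traps = ["drop", "aspiration", "sticky_cover", "uv_suction"]
n = len(traps)

# evening e, site s -> traps[(s + e) % n]: a cyclic Latin square (assumed
# construction; the study randomized its squares)
schedule = [[traps[(site + evening) % n] for site in range(n)]
            for evening in range(n)]

for evening, row in enumerate(schedule, start=1):
    print(f"evening {evening}: " +
          ", ".join(f"site {s + 1}={t}" for s, t in enumerate(row)))

# Verify the Latin-square property: each trap appears once per evening
# (rows) and once per site (columns).
for row in schedule:
    assert sorted(row) == sorted(traps)
for site in range(n):
    column = [schedule[evening][site] for evening in range(n)]
    assert sorted(column) == sorted(traps)
```

The Latin-square layout balances trap type against both site and evening, so between-trap comparisons of abundance are not confounded by a trap always sampling the same location or date.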