Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous change on a wide variety of temporal and spatial scales. These changes significantly affect human and natural activities. Maintaining a spatial database updated with these changes allows better monitoring of the Earth’s resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery and aerial photographs, have proven to be suitable and reliable data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are computed; then, different change/no-change thresholding algorithms are applied to these indices to better estimate the statistical parameters of the two categories; finally, the indices are integrated into a multisource CD fusion process that generates a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The results are then evaluated by means of a quality control analysis as well as complementary graphical representations. The suggested methodology has also proven efficient at identifying the change detection index with the highest contribution. PMID:22737023
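The index-then-threshold step described above can be sketched as follows. This is an illustrative simplification (an absolute-difference change index plus an iterative midpoint threshold), not the authors' exact indices or thresholding algorithms.

```python
# Hypothetical sketch of a change-index + change/no-change thresholding step.

def change_index(img_t1, img_t2):
    """Absolute-difference change index between two co-registered images."""
    return [abs(a - b) for a, b in zip(img_t1, img_t2)]

def iterative_threshold(values, eps=1e-6):
    """Refine a threshold as the midpoint of the two class means (Ridler-Calvard style)."""
    t = sum(values) / len(values)
    while True:
        low = [v for v in values if v <= t]
        high = [v for v in values if v > t]
        if not low or not high:
            return t
        t_new = 0.5 * (sum(low) / len(low) + sum(high) / len(high))
        if abs(t_new - t) < eps:
            return t_new
        t = t_new

# toy single-band images: two pixels change sharply between dates
img_t1 = [10, 12, 11, 10, 50, 52, 11, 10]
img_t2 = [11, 12, 10, 11, 90, 95, 12, 11]
idx = change_index(img_t1, img_t2)
t = iterative_threshold(idx)
mask = [v > t for v in idx]   # change/no-change map
```

In a full multisource fusion, several such masks (one per index) would be combined into a single CD result.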
NASA Astrophysics Data System (ADS)
Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham
2018-01-01
Structural health monitoring consists of using sensors integrated within structures, together with algorithms, to perform load monitoring, damage detection, damage location, damage sizing and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology to cluster data according to 13 different load conditions (pitch angles), perform the OBS, and detect six different damage states induced in a cumulative way. 
The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
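The Q index mentioned above measures how far a new measurement falls outside the baseline PCA subspace. A minimal linear sketch (the paper uses hierarchical *nonlinear* PCA; here a plain 1-component PCA on 2-D strain features stands in, with all data values invented for illustration):

```python
import math

def principal_axis(data):
    """Mean and leading eigenvector of the 2x2 covariance of 2-D points."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / n
    syy = sum((y - my) ** 2 for _, y in data) / n
    sxy = sum((x - mx) * (y - my) for x, y in data) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # angle of leading eigenvector
    return (mx, my), (math.cos(theta), math.sin(theta))

def q_index(point, mean, axis):
    """Q statistic: squared residual distance from the 1-D PCA subspace."""
    dx, dy = point[0] - mean[0], point[1] - mean[1]
    proj = dx * axis[0] + dy * axis[1]
    return (dx - proj * axis[0]) ** 2 + (dy - proj * axis[1]) ** 2

# baseline strain features lie near the line y = 2x (toy pristine condition)
baseline = [(i, 2 * i + 0.1 * (-1) ** i) for i in range(10)]
mean, axis = principal_axis(baseline)
healthy = q_index((5, 10.0), mean, axis)   # consistent with the baseline pattern
damaged = q_index((5, 14.0), mean, axis)   # strain pattern has shifted
```

A damage alarm would compare Q against a threshold derived from the baseline distribution.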
NASA Astrophysics Data System (ADS)
Schmidt, S.; Heyns, P. S.; de Villiers, J. P.
2018-02-01
In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on it, to detect changes in the machine condition. Information from optimised machine-condition and operating-condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is then combined with statistical methods for automatic fault detection and localisation and for fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic approach.
Building change detection via a combination of CNNs using only RGB aerial imageries
NASA Astrophysics Data System (ADS)
Nemoto, Keisuke; Hamaguchi, Ryuhei; Sato, Masakazu; Fujita, Aito; Imaizumi, Tomoyuki; Hikosaka, Shuhei
2017-10-01
Building change information extracted from remote sensing imagery is important for various applications such as urban management and marketing planning. The goal of this work is to develop a methodology for automatically capturing building changes from remote sensing imagery. Recent studies have addressed this goal by exploiting 3-D information as a proxy for building height. In contrast, because in practice it is expensive or impossible to prepare 3-D information, we do not rely on 3-D data but use only RGB aerial imagery. Instead, we employ deep convolutional neural networks (CNNs) to extract effective features and improve change detection accuracy in RGB remote sensing imagery. We consider two aspects of building change detection: building detection and subsequent change detection. Our proposed methodology was tested on several areas that differ in characteristics such as dominant building types and brightness values. Across all tested areas, the proposed method provides good results for changed objects, with recall values over 75% under a strict overlap requirement of over 50% intersection-over-union (IoU). When the IoU threshold was relaxed to over 10%, recall values were over 81%. We conclude that the use of CNNs enables accurate detection of building changes without employing 3-D information.
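The recall figures quoted above depend on an intersection-over-union (IoU) overlap criterion between detected and reference footprints. For axis-aligned rectangles IoU can be computed as below (a generic helper, not the paper's evaluation code):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

exact = iou((0, 0, 2, 2), (0, 0, 2, 2))    # identical footprints -> 1.0
partial = iou((0, 0, 2, 2), (1, 1, 3, 3))  # corner overlap -> 1/7
```

A detection counts as a hit at the "strict" setting only when IoU exceeds 0.5, so the relaxed 0.1 threshold naturally yields higher recall.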
Image Change Detection via Ensemble Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Benjamin W; Vatsavai, Raju
2013-01-01
The concept of geographic change detection is relevant in many areas. Changes in geography can reveal much information about a particular location. For example, analysis of changes in geography can identify regions of population growth, change in land use, and potential environmental disturbance. A common way to perform change detection is to use a simple method such as differencing to detect regions of change. Though simple, these techniques are often of limited applicability. Recently, the use of machine learning methods such as neural networks for change detection has been explored with great success. In this work, we explore the use of ensemble learning methodologies for detecting changes in bitemporal synthetic aperture radar (SAR) images. Ensemble learning uses a collection of weak machine learning classifiers to create a stronger classifier which has higher accuracy than the individual classifiers in the ensemble. The strength of the ensemble lies in the fact that the individual classifiers form a mixture of experts in which the final classification made by the ensemble classifier is calculated from the outputs of the individual classifiers. Our methodology leverages this aspect of ensemble learning by training collections of weak decision-tree-based classifiers to identify regions of change in SAR images collected over a region in the Staten Island, New York area during Hurricane Sandy. Preliminary studies show that the ensemble method has approximately 11.5% higher change detection accuracy than an individual classifier.
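The mixture-of-experts idea can be sketched as majority voting over weak threshold classifiers ("stumps"); the thresholds below are illustrative stand-ins, not the paper's trained decision trees.

```python
def make_stump(threshold):
    """Weak learner: votes 'change' when the pixel difference exceeds its threshold."""
    return lambda diff: diff > threshold

def ensemble_predict(stumps, diff):
    """Strong classifier: majority vote over the weak learners."""
    votes = sum(1 for s in stumps if s(diff))
    return votes > len(stumps) // 2

# five weak learners with different (hypothetical) sensitivities
stumps = [make_stump(t) for t in (5, 10, 15, 20, 25)]
strong_change = ensemble_predict(stumps, 30)  # all five stumps agree
weak_change = ensemble_predict(stumps, 12)    # only two stumps agree
```

Real ensembles (bagging, boosting) weight and train the weak learners on data, but the final decision is combined from individual outputs in the same spirit.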
NASA Astrophysics Data System (ADS)
Burrell, A. L.; Evans, J. P.; Liu, Y.
2017-12-01
Dryland degradation is an issue of international significance as dryland regions play a substantial role in global food production. Remotely sensed data provide the only long-term, large-scale record of changes within dryland ecosystems. The Residual Trend (RESTREND) method is applied to satellite observations to detect dryland degradation. Whilst effective in most cases, the RESTREND method can fail to identify degraded pixels if the relationship between vegetation and precipitation has broken down as a result of severe or rapid degradation. This study presents an extended version of the RESTREND methodology that incorporates the Breaks For Additive Season and Trend (BFAST) method to identify step changes in the time series that relate to significant structural changes in the ecosystem, e.g. land use changes. When applied to Australia, this new methodology, termed Time Series Segmentation and Residual Trend analysis (TSS-RESTREND), detected degradation in 5.25% of pixels, compared to only 2.0% for RESTREND alone. The modified methodology was then assessed in two regions with known histories of degradation, where it was found to accurately capture both the timing and directionality of ecosystem change.
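The residual-trend idea, plus the step-change scan that TSS-RESTREND adds, can be sketched as follows. This is an assumed simplification (ordinary least squares plus a brute-force mean-jump scan, with invented toy data), not the published algorithm:

```python
def ols(x, y):
    """Least-squares fit y ≈ a + b·x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def best_break(res):
    """Candidate step change: the split maximizing the jump in residual means."""
    best_i, best_jump = None, 0.0
    for i in range(2, len(res) - 1):
        jump = abs(sum(res[i:]) / (len(res) - i) - sum(res[:i]) / i)
        if jump > best_jump:
            best_i, best_jump = i, jump
    return best_i

# toy series: vegetation index tracks precipitation, then drops at t = 5
precip = [300, 320, 280, 310, 290, 305, 315, 285, 295, 300]
vi     = [0.30, 0.32, 0.28, 0.31, 0.29, 0.20, 0.21, 0.18, 0.19, 0.20]
a, b = ols(precip, vi)
res = [v - (a + b * p) for p, v in zip(precip, vi)]  # precipitation-adjusted residuals
bp = best_break(res)  # index where the vegetation-precipitation link shifted
```

Plain RESTREND would look only at the linear trend of `res`; segmenting at `bp` first is what lets the step degradation be seen.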
The characteristics and interpretability of land surface change and implications for project design
Sohl, Terry L.; Gallant, Alisa L.; Loveland, Thomas R.
2004-01-01
The need for comprehensive, accurate information on land-cover change has never been greater. While remotely sensed imagery affords the opportunity to provide information on land-cover change over large geographic expanses at a relatively low cost, the characteristics of land-surface change bring into question the suitability of many commonly used methodologies. Algorithm-based methodologies to detect change generally cannot provide the same level of accuracy as the analyses done by human interpreters. Results from the Land Cover Trends project, a cooperative venture that includes the U.S. Geological Survey, Environmental Protection Agency, and National Aeronautics and Space Administration, have shown that land-cover conversion is a relatively rare event, occurs locally in small patches, varies geographically and temporally, and is spectrally ambiguous. Based on these characteristics of change and the type of information required, manual interpretation was selected as the primary means of detecting change in the Land Cover Trends project. Mixtures of algorithm-based detection and manual interpretation may often prove to be the most feasible and appropriate design for change-detection applications. Serious examination of the expected characteristics and measurability of change must be considered during the design and implementation phase of any change analysis project.
Tregidgo, Daniel J; West, Sarah E; Ashmore, Mike R
2013-11-01
Citizen science is having an increasing influence on environmental monitoring as its advantages become recognised. However, methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition along transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. Copyright © 2013 Elsevier Ltd. All rights reserved.
Oil Spill Detection: Past and Future Trends
NASA Astrophysics Data System (ADS)
Topouzelis, Konstantinos; Singha, Suman
2016-08-01
In the last 15 years, the detection of oil spills by satellite means has moved from experimental to operational. What has really changed is satellite image availability: from the "no data" age of the late 1990s we have moved, 15 years on, to the age of the "Sentinels", with an abundance of data. Whether from large accidents related to offshore oil exploration and production activity or from illegal discharges from tankers, oil on the sea surface can now be regularly monitored over European waters. National and transnational organizations (e.g. the European Maritime Safety Agency's 'CleanSeaNet' service) routinely use SAR imagery to detect oil thanks to its all-weather, day-and-night imaging capability. However, over all these years the scientific detection methodology has remained relatively constant. From manual analysis to fully automatic detection methodologies, no significant contribution has been published in recent years, and certainly none has dramatically changed the rules of detection. At the same time, although the overall accuracy of the methodology is questioned, the four main classification steps (dark area detection, feature extraction, statistical database creation, and classification) are continuously improving. In recent years, researchers have proposed the use of polarimetric SAR data for oil spill detection and characterization, although the utilization of Pol-SAR data for this purpose remains questionable due to the lack of verified datasets and the low spatial coverage of Pol-SAR data. The present paper points out the drawbacks of oil spill detection in recent years and focuses on the bottlenecks of oil spill detection methodologies. Solutions based on data availability, management and analysis are also proposed. Moreover, an ideal detection system is discussed with regard to satellite imagery and in situ observations using different scales and sensors.
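The four classification steps named above can be sketched end-to-end on a toy backscatter profile. All thresholds, features and the final rule here are illustrative assumptions standing in for the statistical-database step, not an operational detector:

```python
import statistics

def dark_mask(img, k=1.0):
    """Dark area detection: pixels darker than mean − k·std of the scene."""
    mu, sd = statistics.mean(img), statistics.pstdev(img)
    return [v < mu - k * sd for v in img]

def features(img, mask):
    """Feature extraction for the detected dark area: size and contrast."""
    dark = [v for v, m in zip(img, mask) if m]
    bg = [v for v, m in zip(img, mask) if not m]
    if not dark:
        return {"area": 0, "contrast": 0.0}
    return {"area": len(dark),
            "contrast": statistics.mean(bg) - statistics.mean(dark)}

def classify(feat, min_area=3, min_contrast=20.0):
    """Toy rule standing in for the statistical-database lookup + classifier."""
    return feat["area"] >= min_area and feat["contrast"] >= min_contrast

# 1-D toy SAR backscatter: a low-backscatter (dark) patch amid bright sea clutter
backscatter = [100, 98, 102, 40, 38, 42, 41, 99, 101, 100]
mask = dark_mask(backscatter)
feat = features(backscatter, mask)
is_slick_candidate = classify(feat)
```

Real systems add geometric and textural features and must separate slicks from look-alikes (low-wind areas, algal blooms), which is where the classification step earns its keep.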
Landsat change detection can aid in water quality monitoring
NASA Technical Reports Server (NTRS)
Macdonald, H. C.; Steele, K. F.; Waite, W. P.; Shinn, M. R.
1977-01-01
Comparison between Landsat-1 and -2 imagery of Arkansas provided evidence of significant land use changes during the 1972-75 time period. Analysis of Arkansas historical water quality information has shown conclusively that whereas point source pollution generally can be detected by use of water quality data collected by state and federal agencies, sampling methodologies for nonpoint source contamination attributable to surface runoff are totally inadequate. The expensive undertaking of monitoring all nonpoint sources for numerous watersheds can be lessened by implementing Landsat change detection analyses.
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
How Knowledge Organizations Work: The Case of Detectives
ERIC Educational Resources Information Center
Gottschalk, Petter; Holgersson, Stefan; Karlsen, Jan Terje
2009-01-01
Purpose: The purpose of this paper is to conceptualize detectives in police investigations as knowledge workers. Design/methodology/approach: The paper is based on a literature review covering knowledge organizations, police organizations, police investigations, and detectives as knowledge workers. Findings: The paper finds that the changing role…
This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...
NASA Astrophysics Data System (ADS)
Lieberman, Robert; Kwong, Heston; Liu, Brent; Huang, H. K.
2009-02-01
The chest x-ray radiological features of tuberculosis patients are well documented, and the radiological features that change in response to successful pharmaceutical therapy can be followed with longitudinal studies over time. The patients can also be classified as either responsive or resistant to pharmaceutical therapy based on clinical improvement. We have retrospectively collected time series chest x-ray images of 200 patients diagnosed with tuberculosis receiving the standard pharmaceutical treatment. Computer algorithms can be created to utilize image texture features to assess the temporal changes in the chest x-rays of the tuberculosis patients. This methodology provides a framework for a computer-assisted detection (CAD) system that may provide physicians with the ability to detect poor treatment response earlier in pharmaceutical therapy. Early detection allows physicians to respond with more timely treatment alternatives and improved outcomes. Such a system has the potential to increase treatment efficacy for millions of patients each year.
NASA Technical Reports Server (NTRS)
Potter, Christopher S.
2013-01-01
Landsat satellite imagery was analyzed to generate a detailed record of 10 years of vegetation disturbance and regrowth for Pacific coastal areas of Marin and San Francisco Counties. The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) methodology, a transformation of Tasseled-Cap data space, was applied to detect changes in perennial coastal shrubland, woodland, and forest cover from 1999 to 2009. Results showed several principal points of interest within which extensive contiguous areas of similar LEDAPS vegetation change (either disturbed or restored) were detected. Regrowth was delineated in forest areas of the Point Reyes National Seashore (PRNS) burned in the 1995 Vision Fire. LEDAPS-detected disturbance patterns on Inverness Ridge, PRNS, in areas with observed dieback of tanoak and bay laurel trees were consistent with defoliation by sudden oak death (Phytophthora ramorum). LEDAPS regrowth pixels were detected over much of the predominantly grassland/herbaceous cover of the Olema Valley ranchland near PRNS. Extensive restoration of perennial vegetation cover on Crissy Field, Baker Beach, and the Lobos Creek dunes in San Francisco was also identified. Based on these examples, the LEDAPS methodology should be capable of fulfilling much of the need for continual, low-cost monitoring of emerging changes to coastal ecosystems.
Change Detection Analysis of Water Pollution in Coimbatore Region using Different Color Models
NASA Astrophysics Data System (ADS)
Jiji, G. Wiselin; Devi, R. Naveena
2017-12-01
The data acquired through remote sensing satellites furnish facts about land and water at varying resolutions and have been widely used for several change detection studies. Although many change detection methodologies and techniques already exist, new ones continue to emerge. Existing change detection techniques exploit images that are either in gray scale or in the RGB color model. In this paper we introduce additional color models for performing change detection for water pollution. The polluted lakes are classified, post-classification change detection techniques are applied to the RGB images, and the results are analysed to determine whether changes exist. Furthermore, RGB images obtained after classification, when converted to either of the two color models YCbCr and YIQ, are found to produce the same results as the RGB model images. Thus it can be concluded that other color models such as YCbCr and YIQ can be used as substitutes for the RGB color model when analysing change detection with regard to water pollution.
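For reference, an RGB→YCbCr conversion of the kind compared above is typically the ITU-R BT.601 transform. Shown here in its full-range (JFIF) form for 8-bit channels; the abstract does not state which variant the authors used, so this is an assumption:

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 full-range RGB -> YCbCr for 8-bit channel values."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b          # luma
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128.0  # blue-difference chroma
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128.0  # red-difference chroma
    return y, cb, cr

white = rgb_to_ycbcr(255, 255, 255)
black = rgb_to_ycbcr(0, 0, 0)
```

Because the transform is an invertible linear map of RGB, class labels assigned per pixel are preserved, which is consistent with the paper's finding that YCbCr and YIQ reproduce the RGB results.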
NASA Astrophysics Data System (ADS)
Serra, Roger; Lopez, Lautaro
2018-05-01
Different approaches to the detection of damage based on dynamic measurement of structures have appeared in the last decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy or flexibility. Wavelet analysis has also been used to detect the abnormalities in modal shapes induced by damage. However, the majority of previous work was done with signals uncorrupted by noise. Moreover, the damage influence on each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy that copes with noisy signals while, at the same time, extracting the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, their capacity to detect and localize damage is analyzed in different cases. The comparison is carried out by simulating the oscillations of a cantilever steel beam with and without a defect as a numerical case. The proposed methodology proved to outperform classical methods when dealing with noisy signals.
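The underlying wavelet idea can be illustrated with single-level Haar detail coefficients: a smooth mode shape yields near-constant details, while a local abnormality shows up as an outlier at its location. This is a minimal stand-in (invented data, one Haar level), not the authors' combined modal transform:

```python
import statistics

def haar_details(shape):
    """Single-level Haar wavelet detail coefficients of a sampled mode shape."""
    return [(shape[2 * i] - shape[2 * i + 1]) / 2 for i in range(len(shape) // 2)]

# pristine mode shape: a straight ramp; damage: a small local deviation at x = 8
shape = [i / 15 for i in range(16)]
shape[8] += 0.05

d = haar_details(shape)
med = statistics.median(d)  # typical detail level of the undamaged shape
suspect = max(range(len(d)), key=lambda i: abs(d[i] - med))  # most anomalous pair
```

Detail pair `suspect` maps back to samples `2*suspect` and `2*suspect + 1`, localizing the defect; with noise present, combining evidence across several mode shapes (as the paper does) is what keeps such outliers distinguishable.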
NASA Technical Reports Server (NTRS)
Potter, Christopher
2013-01-01
The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) methodology was applied to detect changes in perennial vegetation cover at marshland sites in Northern California reported to have undergone restoration between 1999 and 2009. Results showed extensive contiguous areas of restored marshland plant cover at 10 of the 14 sites selected. Gains from either woody shrub cover and/or recovery of herbaceous cover that remains productive and evergreen on a year-round basis could be mapped from the image results. However, LEDAPS may not be highly sensitive to changes in wetlands that have been restored mainly with seasonal herbaceous cover (e.g., vernal pools), due to the ephemeral nature of the plant greenness signal. Based on this evaluation, the LEDAPS methodology would be capable of fulfilling a pressing need for consistent, continual, low-cost monitoring of changes in marshland ecosystems of the Pacific Flyway.
Variance change point detection for fractional Brownian motion based on the likelihood ratio test
NASA Astrophysics Data System (ADS)
Kucharczyk, Daniel; Wyłomańska, Agnieszka; Sikora, Grzegorz
2018-01-01
Fractional Brownian motion is one of the main stochastic processes used to describe the long-range dependence phenomenon in self-similar processes. It appears that for many real time series, characteristics of the data change significantly over time. Such behaviour can be observed in many applications, including physical and biological experiments. In this paper, we present a new technique for critical change point detection for cases where the data under consideration are driven by fractional Brownian motion with a time-changed diffusion coefficient. The proposed methodology is based on the likelihood ratio approach and represents an extension of a similar methodology used for Brownian motion, a process with independent increments. We also propose a statistical test for assessing the significance of the estimated critical point. In addition, an extensive simulation study is provided to test the performance of the proposed method.
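The likelihood-ratio scan for a variance change point can be sketched for the simpler independent-increment (Brownian) case the paper extends: for each candidate split, compare the two-segment Gaussian log-likelihood with the single-segment one. This is an assumed simplification with toy data, not the fBm-based test itself:

```python
import math

def seg_loglik(x):
    """Gaussian log-likelihood of a zero-mean segment at its MLE variance."""
    n = len(x)
    var = sum(v * v for v in x) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def change_point(x, min_seg=5):
    """Candidate variance change point maximizing the log-likelihood ratio."""
    base = seg_loglik(x)
    best_k, best_lr = None, 0.0
    for k in range(min_seg, len(x) - min_seg):
        lr = seg_loglik(x[:k]) + seg_loglik(x[k:]) - base
        if lr > best_lr:
            best_k, best_lr = k, lr
    return best_k, best_lr

# deterministic toy increments: small fluctuations, 5x larger after t = 20
x = [0.1 * (-1) ** i for i in range(20)] + [0.5 * (-1) ** i for i in range(20)]
cp, lr = change_point(x)
```

The significance test proposed in the paper would then decide whether `lr` is large enough to reject the no-change hypothesis; for fBm the segment likelihoods must account for correlated increments, which is the paper's contribution.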
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data, and it was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised-risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
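The key design choice above, windows defined by case counts rather than by calendar time, can be illustrated with a toy scan: a window of w consecutive cases is suspicious when those cases fall in an unusually short time span. This is an illustrative simplification (no likelihood or significance machinery), not the EUROCAT implementation:

```python
def shortest_span(case_times, window_cases):
    """Shortest time span containing `window_cases` consecutive cases,
    and the index of the first case in that window."""
    spans = [(case_times[i + window_cases - 1] - case_times[i], i)
             for i in range(len(case_times) - window_cases + 1)]
    return min(spans)

# estimated-conception times (arbitrary units), roughly regular with a
# burst of cases around t = 50
times = [0, 6, 12, 18, 24, 30, 36, 42, 48, 50, 51, 52, 53, 60, 66]
span, start = shortest_span(times, 5)
```

In the full method the observed span is compared against its null distribution (originally by Monte Carlo, later via a lookup table) to attach a p-value to the cluster.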
NASA Technical Reports Server (NTRS)
Macdonald, H.; Steele, K. (Principal Investigator); Waite, W.; Rice, R.; Shinn, M.; Dillard, T.; Petersen, C.
1977-01-01
The author has identified the following significant results. Comparison between LANDSAT 1 and 2 imagery of Arkansas provided evidence of significant land use changes during the 1972-75 time period. Analysis of Arkansas historical water quality information has shown conclusively that whereas point source pollution generally can be detected by use of water quality data collected by state and federal agencies, sampling methodologies for nonpoint source contamination attributable to surface runoff are totally inadequate. The expensive undertaking of monitoring all nonpoint sources for numerous watersheds can be lessened by implementing LANDSAT change detection analyses.
Brown, Christopher J; O'Connor, Mary I; Poloczanska, Elvira S; Schoeman, David S; Buckley, Lauren B; Burrows, Michael T; Duarte, Carlos M; Halpern, Benjamin S; Pandolfi, John M; Parmesan, Camille; Richardson, Anthony J
2016-04-01
Climate change is shifting species' distribution and phenology. Ecological traits, such as mobility or reproductive mode, explain variation in observed rates of shift for some taxa. However, estimates of relationships between traits and climate responses could be influenced by how responses are measured. We compiled a global data set of 651 published marine species' responses to climate change, from 47 papers on distribution shifts and 32 papers on phenology change. We assessed the relative importance of two classes of predictors of the rate of change: ecological traits of the responding taxa and methodological approaches for quantifying biological responses. Methodological differences explained 22% of the variation in range shifts, more than the 7.8% of the variation explained by ecological traits. For phenology change, methodological approaches accounted for 4% of the variation in measurements, whereas 8% of the variation was explained by ecological traits. Our ability to predict responses from traits was hindered by poor representation of species from the tropics, where temperature isotherms are moving most rapidly. Thus, the mean rate of distribution change may be underestimated by this and other global syntheses. Our analyses indicate that methodological approaches should be explicitly considered when designing, analysing and comparing results among studies. To improve climate impact studies, we recommend that (1) reanalyses of existing time series state how the existing data sets may limit the inferences about possible climate responses; (2) qualitative comparisons of species' responses across different studies be limited to studies with similar methodological approaches; (3) meta-analyses of climate responses include methodological attributes as covariates; and (4) new time series be designed to include the detection of early warnings of change or ecologically relevant change. 
Greater consideration of methodological attributes will improve the accuracy of analyses that seek to quantify the role of climate change in species' distribution and phenology changes. © 2015 John Wiley & Sons Ltd.
Waring, Mike; Bielfeldt, Stephan; Mätzold, Katja; Wilhelm, Klaus-Peter
2013-02-01
Chronic wounds require frequent dressing changes. Adhesive dressings used for this indication can be damaging to the stratum corneum, particularly in the elderly where the skin tends to be thinner. Understanding the level of damage caused by dressing removal can aid dressing selection. This study used a novel methodology that applied a stain to the skin and measured the intensity of that stain after repeated application and removal of a series of different adhesive types. Additionally, a traditional method of measuring skin barrier damage (transepidermal water loss) was also undertaken and compared with the staining methodology. The staining methodology and measurement of transepidermal water loss differentiated the adhesive dressings, showing that silicone adhesives caused least trauma to the skin. The staining methodology was shown to be as effective as transepidermal water loss in detecting damage to the stratum corneum and was shown to detect disruption of the barrier earlier than the traditional technique. © 2012 John Wiley & Sons A/S.
Souli, Maria P.; Klonos, Panagiotis; Fragopoulou, Adamantia F.; Mavragani, Ifigeneia V.; Pateras, Ioannis S.; Kostomitsopoulos, Nikolaos; Margaritis, Lukas H.; Zoumpoulis, Pavlos; Kaklamanis, Loukas; Kletsas, Dimitris; Gorgoulis, Vassilis G.; Kyritsis, Apostolos; Pissis, Polycarpos; Georgakilas, Alexandros G.
2017-01-01
The dielectric properties of biological tissues can contribute non-invasively to a better characterization and understanding of the structural properties and physiology of living organisms. The question we asked is whether changes induced by an endogenous or exogenous cellular stress can be detected non-invasively in the form of a dielectric response, e.g., an AC conductivity switch in the broadband frequency spectrum. This study constitutes the first methodological approach to the detection of environmental stress-induced damage in mammalian tissues by means of broadband dielectric spectroscopy (BDS) at frequencies of 1–10⁶ Hz. Firstly, we used non-ionizing (NIR) and ionizing radiation (IR) as typical environmental stresses. Specifically, rats were exposed either to digital enhanced cordless telecommunication (DECT) radio frequency electromagnetic radiation or to γ-radiation, respectively. The other type of stress, usually characterized by high genomic instability, was the pathophysiological state of human cancer (lung and prostate). Analyzing the results of isothermal dielectric measurements provided information on the tissues' water fraction. In most cases, our methodology proved sufficient to detect structural changes, especially in the case of IR and malignancy. Useful stress-specific dielectric response patterns were detected and correlated with each type of stress. Our results point towards the development of a dielectric-based methodology for better understanding, in a relatively non-invasive way, the biological and structural changes effected by radiation and by developing lung or prostate cancer, which are often associated with genomic instability. PMID:28420124
PWAS EMIS-ECIS Active Carbon Filter Residual Life Estimation Methodology
2013-09-23
change in the EMIS spectrum. This method is similar to the full width at half maximum (FWHM) method implemented in the fiber Bragg grating (FBG), where...the intensity of the light reflected by the FBG at the half-peak frequency is used to detect the strain change in the FBG. 4 W911NF-11-1-0210...A brief
Clustering approaches to feature change detection
NASA Astrophysics Data System (ADS)
G-Michael, Tesfaye; Gunzburger, Max; Peterson, Janet
2018-05-01
The automated detection of changes occurring between multi-temporal images is of significant importance in a wide range of medical, environmental, and safety settings, among many others. The use of k-means clustering is explored as a means of detecting objects added to a scene. The silhouette score for the clustering is used to define the optimal number of clusters. For simple images having a limited number of colors, new objects can be detected by examining the change in the optimal number of clusters between the original and modified images. For more complex images, new objects may need to be identified by examining the relative areas covered by corresponding clusters in the original and modified images. Which method is preferable depends on the composition and range of colors present in the images. In addition to describing the clustering and change detection methodology of our proposed approach, we provide some simple illustrations of its application.
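The silhouette-based selection of the cluster count described above can be sketched with scikit-learn. The synthetic three-colour "image" and all numbers below are illustrative assumptions, not data from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def optimal_k(pixels, k_range=range(2, 7), seed=0):
    """Return the cluster count with the highest silhouette score."""
    best_k, best_s = None, -1.0
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(pixels)
        s = silhouette_score(pixels, labels)
        if s > best_s:
            best_k, best_s = k, s
    return best_k

# Synthetic "image": three well-separated colour blobs in a 3-channel space.
rng = np.random.default_rng(0)
img = np.vstack([rng.normal(c, 2.0, (200, 3)) for c in (0.0, 60.0, 120.0)])
print(optimal_k(img))  # well-separated blobs -> 3
```

Comparing this optimal count before and after modification is the paper's first detection route; for complex scenes one would instead compare the relative areas of matched clusters.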
Real-time 3D change detection of IEDs
NASA Astrophysics Data System (ADS)
Wathen, Mitch; Link, Norah; Iles, Peter; Jinkerson, John; Mrstik, Paul; Kusevic, Kresimir; Kovats, David
2012-06-01
Road-side bombs are a real and continuing threat to soldiers in theater. CAE USA recently developed a prototype Volume based Intelligence Surveillance Reconnaissance (VISR) sensor platform for IED detection. This vehicle-mounted, prototype sensor system uses a high data rate LiDAR (1.33 million range measurements per second) to generate a 3D mapping of roadways. The mapped data is used as a reference to generate real-time change detection on future trips on the same roadways. The prototype VISR system is briefly described. The focus of this paper is the methodology used to process the 3D LiDAR data, in real-time, to detect small changes on and near the roadway ahead of a vehicle traveling at moderate speeds with sufficient warning to stop the vehicle at a safe distance from the threat. The system relies on accurate navigation equipment to geo-reference the reference run and the change-detection run. Since it was recognized early in the project that detection of small changes could not be achieved with accurate navigation solutions alone, a scene alignment algorithm was developed to register the reference run with the change detection run prior to applying the change detection algorithm. Good success was achieved in simultaneous real time processing of scene alignment plus change detection.
Change detection from remotely sensed images: From pixel-based to object-based approaches
NASA Astrophysics Data System (ADS)
Hussain, Masroor; Chen, Dongmei; Cheng, Angela; Wei, Hui; Stanley, David
2013-06-01
The appetite for up-to-date information about the earth's surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. Data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques, utilizing remotely sensed data, have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditional pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on spectral values and largely ignore spatial context. This is followed by a review of object-based change detection techniques. Finally, there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of the different techniques are compared. The exponential increase in image data volume, the multiplication of sensors, and the associated challenges for the development of change detection techniques are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.
Evaluation of methodology for detecting/predicting migration of forest species
Dale S. Solomon; William B. Leak
1996-01-01
Available methods for analyzing migration of forest species are evaluated, including simulation models, remeasured plots, resurveys, pollen/vegetation analysis, and age/distance trends. Simulation models have provided some of the most drastic estimates of species changes due to predicted changes in global climate. However, these models require additional testing...
Characterization of normality of chaotic systems including prediction and detection of anomalies
NASA Astrophysics Data System (ADS)
Engler, Joseph John
Accurate prediction and control pervade domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic or random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to follow random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short-term predictions, given that the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short-term predictions. Detection of normality in deterministically chaotic systems is critical in understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed.
The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. Presented is an overview of chaotic systems with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology which utilizes Lyapunov exponents to facilitate longer term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These novel methodologies are then demonstrated on applications such as wind energy, cyber security and classification of social networks.
An Approach to V&V of Embedded Adaptive Systems
NASA Technical Reports Server (NTRS)
Liu, Yan; Yerramalla, Sampath; Fuller, Edgar; Cukic, Bojan; Gururajan, Srikaruth
2004-01-01
Rigorous Verification and Validation (V&V) techniques are essential for high assurance systems. Lately, the performance of some of these systems is enhanced by embedded adaptive components in order to cope with environmental changes. Although the ability to adapt is appealing, it actually poses a problem in terms of V&V. Since uncertainties induced by environmental changes have a significant impact on system behavior, the applicability of conventional V&V techniques is limited. In safety-critical applications such as flight control systems, the mechanisms of change must be observed, diagnosed, accommodated and well understood prior to deployment. In this paper, we propose a non-conventional V&V approach suitable for online adaptive systems. We apply our approach to an intelligent flight control system that employs a particular type of Neural Networks (NN) as the adaptive learning paradigm. The presented methodology consists of a novelty detection technique and online stability monitoring tools. The novelty detection technique is based on Support Vector Data Description, which detects novel (abnormal) data patterns. The online stability monitoring tools, based on Lyapunov stability theory, detect unstable learning behavior in neural networks. Case studies based on a high fidelity simulator of NASA's Intelligent Flight Control System demonstrate a successful application of the presented V&V methodology.
Comparison of power curve monitoring methods
NASA Astrophysics Data System (ADS)
Cambron, Philippe; Masson, Christian; Tahan, Antoine; Torres, David; Pelletier, Francis
2017-11-01
Performance monitoring is an important aspect of operating wind farms. This can be done through the power curve monitoring (PCM) of wind turbines (WT). In the past years, important work has been conducted on PCM. Various methodologies have been proposed, each one with interesting results. However, it is difficult to compare these methods because they have been developed using their respective data sets. The objective of the present work is to compare some of the proposed PCM methods using common data sets. The metric used to compare the PCM methods is the time needed to detect a change in the power curve. Two power curve models are covered to establish the effect the model type has on the monitoring outcomes. Each model was tested with two control charts. Other methodologies and metrics proposed in the literature for power curve monitoring, such as areas under the power curve and the use of statistical copulas, have also been covered. Results demonstrate that model-based PCM methods are more reliable at detecting a performance change than other methodologies and that the effectiveness of the control chart depends on the type of shift observed.
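One model-based PCM variant of the kind compared above, a binned power-curve model whose residuals feed a control chart, can be sketched as follows. The turbine numbers, the 10 % degradation, and the CUSUM parameters are hypothetical; the paper's own models and charts may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic SCADA-like data: cubic power curve plus noise, with a 10 %
# performance loss introduced at sample 300 (all numbers invented).
wind = rng.uniform(4.0, 12.0, 600)
power = 100.0 * (wind / 12.0) ** 3 + rng.normal(0.0, 1.0, 600)
power[300:] *= 0.90

# Reference model: method-of-bins power curve fitted on the healthy period.
bins = np.arange(4.0, 12.5, 0.5)
idx = np.digitize(wind, bins)
curve = np.array([power[:300][idx[:300] == i].mean() for i in range(1, len(bins))])
residuals = power - curve[idx - 1]

def cusum_alarm(res, k=0.5, h=8.0):
    """One-sided CUSUM on standardized residuals; index of first alarm, or -1."""
    z = (res - res[:300].mean()) / res[:300].std()
    s = 0.0
    for i, zi in enumerate(z):
        s = max(0.0, s - zi - k)  # accumulate evidence of a power drop
        if s > h:
            return i
    return -1

alarm = cusum_alarm(residuals)
print(alarm)  # alarm shortly after the change at sample 300
```

The detection-delay metric used in the paper corresponds here to `alarm - 300`, the number of samples between the shift and the chart signalling it.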
Multi-stage methodology to detect health insurance claim fraud.
Johnson, Marina Evrim; Nagarur, Nagen
2016-09-01
Healthcare costs in the US, as well as in other countries, increase rapidly due to demographic, economic, social, and legal changes. This increase in healthcare costs impacts both government and private health insurance systems. Fraudulent behaviors of healthcare providers and patients have become a serious burden to insurance systems by bringing unnecessary costs. Insurance companies thus develop methods to identify fraud. This paper proposes a new multistage methodology for insurance companies to detect fraud committed by providers and patients. The first three stages aim at detecting abnormalities among providers, services, and claim amounts. Stage four then integrates the information obtained in the previous three stages into an overall risk measure. Subsequently, a decision tree based method in stage five computes risk threshold values. The final decision stating whether the claim is fraudulent is made by comparing the risk value obtained in stage four with the risk threshold value from stage five. The research methodology performs well on real-world insurance data.
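As a toy sketch of the staged idea above (abnormality scores aggregated into an overall risk measure, with a decision-tree-derived threshold), the following uses entirely synthetic scores and a simple mean aggregation; the paper's actual stages are more elaborate.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)

# Hypothetical per-claim abnormality scores from stages 1-3
# (provider, service, claim amount); higher means more unusual.
n = 1000
fraud = rng.random(n) < 0.1                      # 10 % fraudulent claims (assumed)
scores = rng.normal(fraud[:, None] * 1.5, 1.0, (n, 3))

risk = scores.mean(axis=1)                       # stage 4: aggregate risk measure
tree = DecisionTreeClassifier(max_depth=1, random_state=0).fit(risk[:, None], fraud)
threshold = tree.tree_.threshold[0]              # stage 5: learned risk cut-off
flagged = risk > threshold                       # final decision per claim
```

A depth-1 tree simply learns a single cut on the risk axis, which mirrors the paper's comparison of the stage-4 risk value against a stage-5 threshold.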
Absorption into fluorescence. A method to sense biologically relevant gas molecules
NASA Astrophysics Data System (ADS)
Strianese, Maria; Varriale, Antonio; Staiano, Maria; Pellecchia, Claudio; D'Auria, Sabato
2011-01-01
In this work we present an innovative optical sensing methodology based on the use of biomolecules as molecular gating nano-systems. Here, as an example, we report on the detection of analytes related to climate change. In particular, we focused our attention on the detection of nitric oxide (NO) and oxygen (O2). Our methodology builds on the possibility of modulating the excitation intensity of a fluorescent probe used as a transducer by means of a sensor molecule, used as a filter, whose absorption is strongly affected by the binding of the analyte of interest. The two simple conditions that have to be fulfilled for the method to work are: (a) the absorption spectrum of the sensor placed inside the cuvette, and acting as the recognition element for the analyte of interest, should strongly change upon the binding of the analyte and (b) the fluorescent dye transducer should exhibit an excitation band which overlaps with one or more absorption bands of the sensor. The absorption band of the sensor affected by the binding of the specific analyte should overlap with the excitation band of the transducer. The high sensitivity of fluorescence detection combined with the use of proteins as highly selective sensors makes this method a powerful basis for the development of a new generation of analytical assays. Proof-of-principle results showing that cytochrome c peroxidase (CcP) for NO detection and myoglobin (Mb) for O2 detection can be successfully used by exploiting our new methodology are reported. The proposed technology can be easily expanded to the determination of different target analytes.
Risk Metrics for Android (trademark) Devices
2017-02-01
allows for easy distribution of malware. This report surveys malware distribution methodologies, then describes current work being done to determine the...given a standard weight of wi = 1. Two data sets were used for testing this methodology. Because the authors are Chinese, they chose to download apps...Order Analysis excels at handling non-obfuscated apps, but may not be able to detect malware that employs encryption or dynamically changes its payload
Practical identification of moisture sources in building assemblies using infrared thermography
NASA Astrophysics Data System (ADS)
McIntosh, Gregory B.; Colantonio, Antonio
2015-05-01
Water, in its various phases, in any environment other than desert (hot or cold) conditions, is the single most destructive element that causes deterioration of materials and failure of building assemblies. It is the key element present in the formation of mold and fungi that lead to indoor air quality problems. Water is the primary element that needs to be managed in buildings to ensure human comfort, health and safety. Under the right thermodynamic conditions, the detection of moisture in its various states is possible through the use of infrared thermography for a large variety of building assemblies and materials. The difficulty is that moisture is transient and mobile from one environment to another via air movement, vapor pressure or phase change. Building materials and enclosures provide both repositories and barriers to this moisture movement. In real life, steady-state conditions do not exist for moisture within building materials and enclosures. Thus the detection of moisture is in a constant state of transition. Sometimes you will see it and sometimes you will not. Understanding the limitations at the time of inspection will go a long way toward mitigating unsatisfied clients or difficult litigation. Moisture can be detected by IRT via three physical mechanisms: latent heat absorption or release during phase change; a change in conductive heat transfer; and a change in thermal capacitance. Complicating all three mechanisms are variable temperature differentials and variable mass airflow on, through and around the surfaces being inspected. Building enclosures come in variable assembly types and are designed to perform differently in different environmental regions. Sources of moisture accumulation will vary for different environmental conditions. Detection methodologies will change for each assembly type in different ambient environments.
This paper examines methodologies for detecting the presence of moisture and determining the various sources from which it accumulates in building assemblies. The end objective of IRT-based moisture detection inspections is not just to identify that moisture is present but to determine its extent and source. Accurate assessment of the source(s) and root cause of the moisture is critical to the development of a permanent solution to the problem.
Towards Comprehensive Variation Models for Designing Vehicle Monitoring Systems
NASA Technical Reports Server (NTRS)
McAdams, Daniel A.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes in a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. This crucial roadblock makes their implementation in real vehicles (e.g., helicopter transmissions and aircraft engines) difficult, making their operation costly and unreliable. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. Using such models, we develop a methodology to account for design and manufacturing variations, and explore the changes in the vibration response to determine its stochastic nature. We explore the potential of the methodology using a nonlinear cam-follower model, where the spring stiffness values are assumed to follow a normal distribution. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle monitoring systems.
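The probabilistic idea above, propagating a design or manufacturing tolerance through to a vibration feature, can be illustrated with a single-degree-of-freedom Monte Carlo sketch. The mass, stiffness mean, and tolerance below are assumptions for illustration, not the paper's cam-follower parameters.

```python
import numpy as np

rng = np.random.default_rng(6)

# Monte Carlo propagation of a manufacturing tolerance: spring stiffness k
# drawn from a normal distribution, natural frequency f = sqrt(k/m)/(2*pi).
m = 0.5                                   # assumed follower mass (kg)
k = rng.normal(2.0e4, 1.0e3, 10_000)      # stiffness: mean 20 kN/m, sd 5 %
f = np.sqrt(k / m) / (2 * np.pi)          # resulting natural-frequency samples

print(f.mean(), f.std())                  # distribution of the vibration feature
```

The spread of `f` is the point: a monitoring threshold set from the nominal frequency alone would false-alarm on healthy units sitting in the tails of this distribution.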
NASA Astrophysics Data System (ADS)
Camacho-Navarro, Jhonatan; Ruiz, Magda; Villamizar, Rodolfo; Mujica, Luis; Moreno-Beltrán, Gustavo; Quiroga, Jabid
2017-05-01
Continuous monitoring for damage detection in structural assessment comprises implementation of low cost equipment and efficient algorithms. This work describes the stages involved in the design of a methodology with high feasibility for use in continuous damage assessment. Specifically, an algorithm based on a data-driven approach using principal component analysis, with acquired signals pre-processed by means of cross-correlation functions, is discussed. A carbon steel pipe section and a laboratory tower were used as test structures in order to demonstrate the feasibility of the methodology to detect abrupt changes in the structural response when damage occurs. Two damage cases are studied: a crack and a leak, one for each structure, respectively. Experimental results show that the methodology is promising for the continuous monitoring of real structures.
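A minimal sketch of that indicator chain (cross-correlation pre-processing, a PCA baseline model, and a squared-prediction-error damage index) on synthetic burst signals rather than the pipe or tower data; the signal parameters and the 3-component model are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 256)
ref = np.sin(2 * np.pi * 20 * t) * np.hanning(256)   # reference excitation burst

def feature(signal):
    """Pre-process a measured signal into a cross-correlation feature vector."""
    return np.correlate(signal, ref, mode="full")

# Healthy responses: the reference plus noise; "damaged" responses have a
# shifted resonance (22 Hz instead of 20 Hz) as a stand-in for a crack/leak.
healthy = np.array([feature(ref + rng.normal(0, 0.05, 256)) for _ in range(50)])
damaged = np.array([feature(np.sin(2 * np.pi * 22 * t) * np.hanning(256)
                            + rng.normal(0, 0.05, 256)) for _ in range(10)])

# Data-driven baseline: PCA on healthy features; damage is indicated by the
# squared prediction error (SPE) of new observations against that model.
mu = healthy[:40].mean(axis=0)
_, _, Vt = np.linalg.svd(healthy[:40] - mu, full_matrices=False)
P = Vt[:3]                                           # retained principal directions

def spe(X):
    r = (X - mu) - (X - mu) @ P.T @ P
    return np.sum(r ** 2, axis=1)

print(spe(healthy[40:]).mean() < spe(damaged).mean())
```

An abrupt jump of the SPE above its baseline spread is what would flag the crack or leak in continuous operation.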
Tene, A; Tobin, B; Dyckmans, J; Ray, D; Black, K; Nieuwenhuis, M
2011-03-01
A thinning experiment stand at Avoca, Ballinvalley, on the east coast of the Republic of Ireland was used to test a developed methodology aimed at monitoring drought stress, based on the analysis of growth rings obtained by coring. The stand incorporated six plots representing three thinning regimes (light, moderate and heavy) and was planted in the spring of 1943 on a brown earth soil. Radial growth (early- and latewood) was measured for the purpose of this study. A multidisciplinary approach was used to assess historic tree response to climate: specifically, the application of statistical tools such as principal component and canonical correlation analysis to dendrochronology, stable isotopes, ring density proxy, blue reflectance and forest biometrics. Results showed that radial growth was a good proxy for monitoring changes to moisture deficit, while maximum density and blue reflectance were appropriate for assessing changes in accumulated temperature for the growing season. Rainfall also influenced radial growth changes but not significantly, and was a major factor in stable carbon and oxygen discrimination, mostly in the latewood formation phase. Stable oxygen isotope analysis was more accurate than radial growth analysis in drought detection, as it helped detect drought signals in both early- and latewood while radial growth analysis only detected the drought signal in earlywood. Many studies have shown that tree rings provide vital information for marking past climatic events. This work provides a methodology to better identify and understand how commonly measured tree proxies relate to environmental parameters, and can best be used to characterize and pinpoint drought events (variously described using parameters such as like moisture deficit, accumulated temperature, rainfall and potential evaporation).
Jan Seibert; Jeffrey J. McDonnell
2010-01-01
The effect of land-use or land-cover change on stream runoff dynamics is not fully understood. In many parts of the world, forest management is the major land-cover change agent. While the paired catchment approach has been the primary methodology used to quantify such effects, it is only possible for small headwater catchments where there is uniformity in...
NASA Astrophysics Data System (ADS)
De Ridder, Simon; Vandermarliere, Benjamin; Ryckebusch, Jan
2016-11-01
A framework based on generalized hierarchical random graphs (GHRGs) for the detection of change points in the structure of temporal networks has recently been developed by Peel and Clauset (2015 Proc. 29th AAAI Conf. on Artificial Intelligence). We build on this methodology and extend it to also include the versatile stochastic block models (SBMs) as a parametric family for reconstructing the empirical networks. We use five different techniques for change point detection on prototypical temporal networks, including empirical and synthetic ones. We find that none of the considered methods can consistently outperform the others when it comes to detecting and locating the expected change points in empirical temporal networks. With respect to the precision and the recall of the results of the change points, we find that the method based on a degree-corrected SBM has better recall properties than other dedicated methods, especially for sparse networks and smaller sliding time window widths.
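The paper's detectors operate on GHRG and SBM likelihoods, but the underlying idea of scoring every candidate split of the observation sequence can be illustrated on a scalar summary of the network (here a synthetic edge-density series; the densities and the shift location are invented).

```python
import numpy as np

def change_point(x):
    """Index splitting x where a two-segment constant-mean model maximally
    reduces squared error (a simple offline change-point estimate)."""
    n = len(x)
    best_i, best_gain = None, -np.inf
    total = np.sum((x - x.mean()) ** 2)
    for i in range(2, n - 2):
        sse = (np.sum((x[:i] - x[:i].mean()) ** 2)
               + np.sum((x[i:] - x[i:].mean()) ** 2))
        if total - sse > best_gain:
            best_i, best_gain = i, total - sse
    return best_i

# Per-snapshot edge density of a temporal network, with a structural
# change (densification) after snapshot 60.
rng = np.random.default_rng(4)
density = np.concatenate([rng.normal(0.10, 0.01, 60),
                          rng.normal(0.16, 0.01, 60)])
cp = change_point(density)
```

The network-aware methods replace the squared-error score with the fitted GHRG/SBM log-likelihood of each segment, but the search over candidate split points is the same.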
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel; Carr, Christina; Pettit, Erin C.
We apply a fully autonomous icequake detection methodology to a single day of high-sample-rate (200 Hz) seismic network data recorded from the terminus of Taylor Glacier, Antarctica, that temporally coincided with a brine release episode near Blood Falls (May 13, 2014). We demonstrate a statistically validated procedure to assemble waveforms triggered by icequakes into populations of clusters linked by intra-event waveform similarity. Our processing methodology implements a noise-adaptive power detector coupled with a complete-linkage clustering algorithm and a noise-adaptive correlation detector. This detector chain reveals a population of 20 multiplet sequences that includes ~150 icequakes and produces zero false alarms on the concurrent, diurnally variable noise. Our results are very promising for identifying changes in background seismicity associated with the presence or absence of brine release episodes. We thereby suggest that our methodology could be applied to longer time periods to establish a brine-release monitoring program for Blood Falls based on icequake detections.
NASA Astrophysics Data System (ADS)
Kanawade, Rajesh; Stelzle, Florian; Schmidt, Michael
This paper presents a novel methodology for early detection of clinical shock by monitoring hemodynamic changes using a diffuse reflectance measurement technique. A detailed prototype of the reflectance measurement system and the data analysis technique for hemodynamic monitoring were developed in our laboratory. Real-time in-vivo measurements were taken from the index finger. This study demonstrates preliminary results of real-time monitoring of reduced/oxy-hemoglobin changes during clogging and unclogging of blood flow in the fingertip. The obtained results were verified against the values of a pulse oximeter connected to the tip of the same index finger.
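Recovering reduced- and oxy-hemoglobin concentration changes from reflectance at two wavelengths reduces, in the modified Beer-Lambert picture, to solving a small linear system. The extinction coefficients below are illustrative round numbers (not tabulated values), chosen only so that deoxy-hemoglobin absorbs more at 660 nm and oxy-hemoglobin more at 940 nm, as is physically the case.

```python
import numpy as np

# Modified Beer-Lambert sketch: absorbance changes dA at two wavelengths
# relate linearly to concentration changes dc of [HbO2, Hb].
E = np.array([[0.15, 0.32],    # 660 nm: [HbO2, Hb] extinction (illustrative)
              [0.29, 0.18]])   # 940 nm

# Hypothetical clogging episode: oxy-Hb falls, deoxy-Hb rises.
true_dc = np.array([-0.5, 0.8])
dA = E @ true_dc               # what the reflectance system would measure

dc = np.linalg.solve(E, dA)    # invert the 2x2 system to recover [dHbO2, dHb]
print(dc)
```

Tracking `dc` over time during cuff occlusion and release is the kind of curve the prototype compares against the pulse oximeter reading.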
NASA Astrophysics Data System (ADS)
Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël
2016-05-01
The process of measuring and analysing the data from a distributed sensor network all over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique where the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis for data-driven modelling and self-organising maps for a two-level clustering under the principle of local density. At the end, the methodology is experimentally demonstrated and results show that all the damages were detectable and identifiable.
Multi-Temporal Classification and Change Detection Using Uav Images
NASA Astrophysics Data System (ADS)
Makuti, S.; Nex, F.; Yang, M. Y.
2018-05-01
In this paper different methodologies for the classification and change detection of UAV image blocks are explored. UAVs are not only the cheapest platforms for image acquisition but also the easiest platforms to operate in repeated data collections over a changing area such as a building construction site. Two change detection techniques have been evaluated in this study: the pre-classification and the post-classification algorithms. These methods are based on three main steps: feature extraction, classification and change detection. A set of state-of-the-art features has been used in the tests: colour features (HSV), textural features (GLCM) and 3D geometric features. For classification purposes a Conditional Random Field (CRF) has been used: the unary potential was determined using the Random Forest algorithm while the pairwise potential was defined by the fully connected CRF. In the performed tests, different feature configurations and settings have been considered to assess the performance of these methods in such a challenging task. Experimental results showed that the post-classification approach outperforms the pre-classification change detection method. This was analysed using overall accuracy: the post-classification approach achieved an accuracy of up to 62.6 %, while the pre-classification change detection achieved 46.5 %. These results represent a first useful indication for future works and developments.
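The post-classification route can be sketched with a plain Random Forest in place of the paper's full CRF pipeline: classify each epoch with the same model, then flag pixels whose predicted labels disagree. The two-class labels, the 3-band features standing in for HSV/GLCM cues, and all numbers are synthetic assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

def features(labels):
    """Hypothetical 3-band per-pixel features (stand-ins for HSV/GLCM cues)."""
    return rng.normal(labels[:, None] * 2.0, 0.4, (labels.size, 3))

y1 = (rng.random(2000) < 0.3).astype(int)   # epoch 1: 30 % "built" pixels
y2 = y1.copy()
y2[rng.random(2000) < 0.2] = 1              # epoch 2: new construction appears
X1, X2 = features(y1), features(y2)

# Post-classification change detection: one classifier, two epochs,
# change wherever the predicted labels disagree.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X1, y1)
change = clf.predict(X1) != clf.predict(X2)
```

Note that errors from both classification passes propagate into the change map, which is why the per-epoch classification accuracy dominates the overall change-detection accuracy reported in the paper.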
Bissessor, Liselle; Wilson, Janet; McAuliffe, Gary; Upton, Arlo
2017-06-16
Trichomonas vaginalis (TV) prevalence varies among different communities and peoples. The availability of robust molecular platforms for the detection of TV has advanced diagnosis; however, molecular tests are more costly than phenotypic methodologies, and testing all urogenital samples is expensive. We recently replaced culture methods with the Aptima Trichomonas vaginalis nucleic acid amplification test, performed on specific request and as reflex testing by the laboratory, and have audited this change. Data were collected from August 2015 (microbroth culture and microscopy) and August 2016 (Aptima TV assay), including referrer, testing volumes, results and test cost estimates. In August 2015, 10,299 vaginal swabs, and in August 2016, 2,189 specimens (urogenital swabs and urines), were tested. The positivity rate went from 0.9% to 5.3%, and overall more TV infections were detected in 2016. The number needed to test and the cost for one positive TV result respectively were 111 and $902.55 in 2015, and 19 and $368.92 in 2016. Request volumes and positivity rates differed among referrers. The methodology change was associated with higher overall detection of TV, and reductions in the number needed to test and the cost for one TV diagnosis. Our audit suggests that there is room for improvement in TV test requesting in our community.
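The number-needed-to-test figures follow directly from the reported positivity rates, since the number needed to test for one positive result is the reciprocal of the positivity rate:

```python
# NNT for one positive = 1 / positivity rate, using the audit's reported
# rates (0.9 % in 2015, 5.3 % in 2016); this reproduces the figures 111 and 19.
nnt_2015 = round(1 / 0.009)
nnt_2016 = round(1 / 0.053)
print(nnt_2015, nnt_2016)  # -> 111 19
```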
Fozooni, Tahereh; Ravan, Hadi; Sasan, Hosseinali
2017-12-01
Due to their unique properties, such as programmability, ligand-binding capability, and flexibility, nucleic acids can serve as analytes and/or recognition elements for biosensing. To improve the sensitivity of nucleic acid-based biosensing and hence the detection of a few copies of a target molecule, different modern amplification methodologies, namely target- and signal-based amplification strategies, have already been developed. These recent signal amplification technologies, which are capable of amplifying the signal intensity without changing the target's copy number, have resulted in fast, reliable, and sensitive methods for nucleic acid detection. Working in cell-free settings, researchers have been able to optimize a variety of complex and quantitative methods suitable for deployment in live-cell conditions. In this study, a comprehensive review of signal amplification technologies for the detection of nucleic acids is provided. We classify the signal amplification methodologies into enzymatic and non-enzymatic strategies with a primary focus on the methods that enable us to shift away from in vitro detection to in vivo imaging. Finally, the future challenges and limitations of detection in cellular conditions are discussed.
Detection and mapping of delays in early cortical folding derived from in utero MRI
NASA Astrophysics Data System (ADS)
Habas, Piotr A.; Rajagopalan, Vidya; Scott, Julia A.; Kim, Kio; Roosta, Ahmad; Rousseau, Francois; Barkovich, A. James; Glenn, Orit A.; Studholme, Colin
2011-03-01
Understanding human brain development in utero and detecting cortical abnormalities related to specific clinical conditions is an important area of research. In this paper, we describe and evaluate methodology for detection and mapping of delays in early cortical folding from population-based studies of fetal brain anatomies imaged in utero. We use a general linear modeling framework to describe spatiotemporal changes in curvature of the developing brain and explore the ability to detect and localize delays in cortical folding in the presence of uncertainty in estimation of the fetal age. We apply permutation testing to examine which regions of the brain surface provide the most statistical power to detect a given folding delay at a given developmental stage. The presented methodology is evaluated using MR scans of fetuses with normal brain development and gestational ages ranging from 20.57 to 27.86 weeks. This period is critical in early cortical folding and the formation of the primary and secondary sulci. Finally, we demonstrate a clinical application of the framework for detection and localization of folding delays in fetuses with isolated mild ventriculomegaly.
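The general-linear-model-plus-permutation-test idea described above can be sketched for a single surface region; the ages, the 1.5-week delay, the curvature slope, and the group sizes below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic cohort: 40 fetuses; the last 10 have a hypothetical 1.5-week
# folding delay, so their curvature matches a younger age.
age = rng.uniform(20.5, 27.9, 40)                       # gestational age (weeks)
delay = np.array([0.0] * 30 + [1.5] * 10)
curv = 0.08 * (age - delay) + rng.normal(0, 0.05, 40)   # regional mean curvature

# GLM: curvature ~ intercept + age + group; permutation test on the
# group coefficient gives a nonparametric p-value for the delay effect.
X = np.column_stack([np.ones(40), age, (delay > 0).astype(float)])

def group_coef(y, Xd):
    return np.linalg.lstsq(Xd, y, rcond=None)[0][2]

obs = group_coef(curv, X)
null = []
for _ in range(500):
    Xp = X.copy()
    Xp[:, 2] = rng.permutation(X[:, 2])   # shuffle group labels under H0
    null.append(group_coef(curv, Xp))
p = np.mean(np.abs(np.array(null)) >= np.abs(obs))
```

Repeating this test per surface location, with age uncertainty folded into the model, is what lets the study map which regions have the most power to detect a given delay.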
NASA Technical Reports Server (NTRS)
Potter, Christopher S.
2014-01-01
The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) methodology was applied to detect changes in forest vegetation cover for areas burned by wildfires in the Sierra Nevada Mountains of California between the periods 1975-1979 and 1995-1999. Results for areas burned by wildfire between 1995 and 1999 confirmed the importance of regrowing forest vegetation over 17% of the combined burned areas. A notable fraction (12%) of the entire 5-km (unburned) buffer area outside the 1995-1999 fire perimeters showed decline in forest cover, whereas regrowing forest areas covered only 3% of all the 1995-1999 buffer areas combined. Areas burned by wildfire between 1975 and 1979 confirmed the importance of disturbed (or declining evergreen) vegetation covering 13% of the combined 1975-1979 burned areas. Based on comparison of these results to ground-based survey data, the LEDAPS methodology should be capable of fulfilling much of the need for consistent, low-cost monitoring of changes due to climate and biological factors in western forest regrowth following stand-replacing disturbances.
Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts
2015-01-01
Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...
Post-Disaster Damage Assessment Through Coherent Change Detection on SAR Imagery
NASA Astrophysics Data System (ADS)
Guida, L.; Boccardo, P.; Donevski, I.; Lo Schiavo, L.; Molinari, M. E.; Monti-Guarnieri, A.; Oxoli, D.; Brovelli, M. A.
2018-04-01
Damage assessment is a fundamental step to support emergency response and recovery activities in a post-earthquake scenario. In recent years, UAVs and satellite optical imagery were applied to assess major structural damages before technicians could reach the areas affected by the earthquake. However, bad weather conditions may harm the quality of these optical assessments, thus limiting the practical applicability of these techniques. In this paper, the application of Synthetic Aperture Radar (SAR) imagery is investigated and a novel approach to SAR-based damage assessment is presented. Coherent Change Detection (CCD) algorithms on multiple interferometrically pre-processed SAR images of the area affected by the seismic event are exploited to automatically detect potential damages to buildings and other physical structures. As a case study, the 2016 Central Italy earthquake involving the cities of Amatrice and Accumoli was selected. The main contribution of this research is the integration of a complex process, requiring the coordination of a variety of methods and tools, into a unitary framework, which allows end-to-end application of the approach from SAR data pre-processing to result visualization in a Geographic Information System (GIS). A prototype of this pipeline was implemented, and the outcomes of this methodology were validated through an extended comparison with traditional damage assessment maps, created through photo-interpretation of high resolution aerial imagery. The results indicate that the proposed methodology is able to perform damage detection with a good level of accuracy, as most of the detected points of change are concentrated around highly damaged buildings.
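The core of coherent change detection is the sample-coherence estimator between two co-registered complex SAR acquisitions; the sketch below shows this generic CCD building block (not the authors' exact pipeline), with the window values invented for illustration.

```python
import cmath, math

def coherence(s1, s2):
    """Sample coherence magnitude between two co-registered complex SAR
    pixel windows. Near 1: scene unchanged; near 0: potential damage."""
    num = sum(a * b.conjugate() for a, b in zip(s1, s2))
    den = math.sqrt(sum(abs(a) ** 2 for a in s1) *
                    sum(abs(b) ** 2 for b in s2))
    return abs(num) / den if den else 0.0

# Hypothetical 16-pixel windows: an unchanged scene, and one where half
# the pixels picked up a phase disturbance between acquisitions.
before = [cmath.exp(1j * 0.1 * k) for k in range(16)]
unchanged = list(before)
disturbed = [p * cmath.exp(1j * (0.7 if k >= 8 else 0.0))
             for k, p in enumerate(before)]
print(round(coherence(before, unchanged), 3),  # 1.0 (no change)
      round(coherence(before, disturbed), 3))  # < 1 (candidate damage)
```

Thresholding such coherence maps over a city yields the candidate damage points that are then compared against photo-interpreted damage maps.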
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon
2015-01-01
This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
Assessment of Data Fusion Algorithms for Earth Observation Change Detection Processes.
Molina, Iñigo; Martinez, Estibaliz; Morillo, Carmen; Velasco, Jesus; Jara, Alvaro
2016-09-30
In this work a parametric multi-sensor Bayesian data fusion approach and a Support Vector Machine (SVM) are used for a Change Detection problem. For this purpose two sets of SPOT5-PAN images have been used, which are in turn used for the calculation of Change Detection Indices (CDIs). For minimizing radiometric differences, a methodology based on zonal "invariant features" is suggested. The choice of one or another CDI for a change detection process is a subjective task, as each CDI is probably more or less sensitive to certain types of changes. Likewise, this idea might be employed to create and improve a "change map", which can be accomplished by means of the CDI's informational content. For this purpose, information metrics such as the Shannon Entropy and "Specific Information" have been used to weight the change and no-change categories contained in a certain CDI and thus introduced into the Bayesian information fusion algorithm. Furthermore, the parameters of the probability density functions (pdf's) that best fit the involved categories have also been estimated. Conversely, these considerations are not necessary for mapping procedures based on the discriminant functions of an SVM. This work has confirmed the capabilities of the probabilistic information fusion procedure under these circumstances.
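How an entropy-based weighting of change indices might enter a fusion step can be sketched as follows; the index names, the change/no-change priors, and the "lower entropy gets more weight" rule are all illustrative assumptions, not the paper's exact formulation.

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a discrete change/no-change distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical change/no-change priors for three change detection
# indices (CDIs); in practice these come from thresholding each index.
cdi_priors = {"diff": [0.1, 0.9], "ratio": [0.3, 0.7], "ndvi": [0.5, 0.5]}

# One simple convention: a more decisive index (lower entropy over its
# two categories) receives a larger fusion weight.
raw = {k: 1.0 - shannon_entropy(p) for k, p in cdi_priors.items()}
total = sum(raw.values())
weights = {k: v / total for k, v in raw.items()}

# Fuse hypothetical per-pixel change scores from each index.
pixel_scores = {"diff": 0.8, "ratio": 0.6, "ndvi": 0.9}
fused = sum(weights[k] * pixel_scores[k] for k in weights)
print({k: round(v, 3) for k, v in weights.items()}, round(fused, 3))
```

Note how the maximally uncertain index ([0.5, 0.5], entropy of one bit) contributes nothing to the fused score under this convention.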
Hassan, Sedky H A; Van Ginkel, Steven W; Kim, Sung-Min; Yoon, Sung-Hwan; Joo, Jin-Ho; Shin, Beom-Soo; Jeon, Byong-Hun; Bae, Wookeun; Oh, Sang-Eun
2010-08-01
A novel toxicity detection methodology based on sulfur-oxidizing bacteria (SOB) has been developed for the rapid and reliable detection of toxic chemicals in water. The methodology exploits the ability of SOB to oxidize sulfur particles in the presence of oxygen to produce sulfuric acid. The reaction results in an increase in electrical conductivity (EC) and a decrease in pH. The assay is based on the inhibition of SOB in the presence of toxic chemicals, measured through changes in EC and pH. We found that the SOB biosensor can detect toxic chemicals, such as heavy metals and CN-, in the 5-2000 ppb range. One bacterium was isolated from an SOB biosensor, and the 16S rRNA gene of the bacterial strain has 99% and 96% sequence similarity to Acidithiobacillus sp. ORCS6 and Acidithiobacillus caldus DSM 8584, respectively. The isolate was identified as A. caldus SMK. The SOB biosensor is ideally suited for monitoring toxic chemicals in water, having the advantages of high sensitivity and quick detection.
Bustos, Alejandro; Rubio, Higinio; Castejón, Cristina; García-Prada, Juan Carlos
2018-03-06
Efficient maintenance is a key consideration in railway transport systems, especially in high-speed trains, in order to avoid accidents with catastrophic consequences. In this sense, having a method that allows for the early detection of defects in critical elements, such as the bogie mechanical components, is crucial for increasing the availability of rolling stock and reducing maintenance costs. The main contribution of this work is the proposal of a methodology that, based on classical signal processing techniques, provides a set of parameters for the fast identification of the operating state of a critical mechanical system. With this methodology, the vibratory behaviour of a very complex mechanical system is characterised through variable inputs, which allows for the detection of possible changes in the mechanical elements. This methodology is applied to a real high-speed train in commercial service, with the aim of studying the vibratory behaviour of the train (specifically, the bogie) before and after a maintenance operation. The results obtained with this methodology demonstrated the usefulness of the new procedure and revealed reductions of between 15% and 45% in the spectral power of selected Intrinsic Mode Functions (IMFs) after the maintenance operation.
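The before/after spectral-power comparison reported above can be sketched as follows. For brevity, a synthetic 40 Hz sinusoid plus noise stands in for an EMD-derived IMF, and all signal parameters (sampling rate, amplitudes, noise level) are hypothetical; a real application would first decompose the vibration record into IMFs.

```python
import cmath, math, random

def spectral_power(x):
    """Total spectral power via a direct DFT; by Parseval's theorem this
    equals the sum of squared samples."""
    n = len(x)
    spec = [sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                for t in range(n)) for f in range(n)]
    return sum(abs(c) ** 2 for c in spec) / n

random.seed(0)
# Stand-in for one IMF: a 40 Hz bogie-related component plus noise,
# with a smaller amplitude after the (hypothetical) maintenance.
fs, n = 400, 400
before = [1.0 * math.sin(2 * math.pi * 40 * t / fs) + random.gauss(0, 0.2)
          for t in range(n)]
after = [0.8 * math.sin(2 * math.pi * 40 * t / fs) + random.gauss(0, 0.2)
         for t in range(n)]
drop = 1 - spectral_power(after) / spectral_power(before)
print(f"spectral power reduction: {drop:.0%}")
```

With a 20% amplitude reduction the power drop lands around a third, i.e. inside the 15-45% band the study reports for its selected IMFs.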
EMD-Based Methodology for the Identification of a High-Speed Train Running in a Gear Operating State
García-Prada, Juan Carlos
2018-01-01
Efficient maintenance is a key consideration in railway transport systems, especially in high-speed trains, in order to avoid accidents with catastrophic consequences. In this sense, having a method that allows for the early detection of defects in critical elements, such as the bogie mechanical components, is crucial for increasing the availability of rolling stock and reducing maintenance costs. The main contribution of this work is the proposal of a methodology that, based on classical signal processing techniques, provides a set of parameters for the fast identification of the operating state of a critical mechanical system. With this methodology, the vibratory behaviour of a very complex mechanical system is characterised through variable inputs, which allows for the detection of possible changes in the mechanical elements. This methodology is applied to a real high-speed train in commercial service, with the aim of studying the vibratory behaviour of the train (specifically, the bogie) before and after a maintenance operation. The results obtained with this methodology demonstrated the usefulness of the new procedure and revealed reductions of between 15% and 45% in the spectral power of selected Intrinsic Mode Functions (IMFs) after the maintenance operation. PMID:29509690
NASA Astrophysics Data System (ADS)
Jang, Sunyoung; Jaszczak, R. J.; Tsui, B. M. W.; Metz, C. E.; Gilland, D. R.; Turkington, T. G.; Coleman, R. E.
1998-08-01
The purpose of this work was to evaluate lesion detectability with and without nonuniform attenuation compensation (AC) in myocardial perfusion SPECT imaging in women, using an anthropomorphic phantom and receiver operating characteristic (ROC) methodology. Breast attenuation causes artifacts in reconstructed images and may increase the difficulty of diagnosis in myocardial perfusion imaging in women. The null hypothesis tested using the ROC study was that nonuniform AC does not change lesion detectability in myocardial perfusion SPECT imaging in women. The authors used a filtered backprojection (FBP) reconstruction algorithm and Chang's (1978) single iteration method for AC. In conclusion, with the authors' proposed myocardial defect model, nuclear medicine physicians demonstrated no significant difference for the detection of the anterior wall defect; however, a greater accuracy for the detection of the inferior wall defect was observed without nonuniform AC than with it (P-value=0.0034). Medical physicists did not demonstrate any statistically significant difference in defect detection accuracy with or without nonuniform AC in the female phantom.
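ROC methodology summarizes detectability as the area under the ROC curve, which can be computed from reader confidence ratings with the Wilcoxon statistic. The ratings below are invented for illustration (a toy version of the "without AC more accurate for the inferior wall" pattern), not the study's data.

```python
def auc(present, absent):
    """Empirical ROC area (Wilcoxon statistic): the probability that a
    lesion-present case is rated higher than a lesion-absent case,
    counting ties as one half."""
    wins = sum((p > a) + 0.5 * (p == a) for p in present for a in absent)
    return wins / (len(present) * len(absent))

# Hypothetical 5-point confidence ratings for inferior-wall defects,
# read with and without nonuniform attenuation compensation (AC).
with_ac = auc([4, 5, 3, 4, 5, 2], [1, 2, 2, 3, 1, 2])
without_ac = auc([5, 5, 4, 4, 5, 3], [1, 2, 2, 3, 1, 2])
print(round(with_ac, 3), round(without_ac, 3))
```

An AUC of 0.5 corresponds to guessing and 1.0 to perfect separation; significance testing of the AUC difference is then done with paired ROC methods.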
Methodology for Evaluating Raw Material Changes to RSRM Elastomeric Insulation Materials
NASA Technical Reports Server (NTRS)
Mildenhall, Scott D.; McCool, Alex (Technical Monitor)
2001-01-01
The Reusable Solid Rocket Motor (RSRM) uses asbestos and silicon dioxide filled acrylonitrile butadiene rubber (AS-NBR) as the primary internal insulation to protect the case from heat. During the course of the RSRM Program, several changes have been made to the raw materials and processing of the AS-NBR elastomeric insulation material. These changes have been primarily caused by raw materials becoming obsolete. In addition, some process changes have been implemented that were deemed necessary to improve the quality and consistency of the AS-NBR insulation material. Each change has been evaluated using unique test efforts customized to determine the potential impacts of the specific raw material or process change. Following the evaluations, the various raw material and process changes were successfully implemented with no detectable effect on the performance of the AS-NBR insulation. This paper will discuss some of the raw material and process changes evaluated, the methodology used in designing the unique test plans, and the general evaluation results. A summary of the change history of RSRM AS-NBR internal insulation is also presented.
I. Arismendi; S. L. Johnson; J. B. Dunham
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical...
Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.
Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V
2008-12-22
A novel polarimetry methodology for phase-sensitive measurements in single-reflection geometry is proposed for applications in optical transduction-based biological sensing. The methodology uses alternating step-like chopper-based mechanical phase modulation for the orthogonal s- and p-polarizations of light reflected from the sensing interface, and the extraction of phase information at different harmonics of the modulation. We show that even under a relatively simple experimental arrangement, the methodology provides a phase-measurement resolution as low as 0.007 deg. We also examine the proposed approach using Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high refractive index Si. The detection limit for Si-based TIR is estimated as 10^(-5) in terms of Refractive Index Units (RIU). SPR geometry offers a much stronger phase response due to a much sharper phase characteristic. With a detection limit of 3.2x10^(-7) RIU, the proposed methodology provides one of the best sensitivities among phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup, and noise immunity as a result of high-stability modulation.
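Extracting phase information at harmonics of a modulation frequency is classic synchronous (lock-in style) detection; the sketch below shows the generic technique, with the sampling rate, modulation frequency, and phase offset chosen as assumptions rather than taken from the paper.

```python
import math

def harmonic(signal, fs, f, k=1):
    """Amplitude and phase of the k-th harmonic of the modulation
    frequency f in `signal` (sampled at fs), via synchronous detection."""
    n = len(signal)
    i = sum(s * math.cos(2 * math.pi * k * f * t / fs)
            for t, s in enumerate(signal)) * 2 / n
    q = sum(s * math.sin(2 * math.pi * k * f * t / fs)
            for t, s in enumerate(signal)) * 2 / n
    return math.hypot(i, q), math.atan2(q, i)

# Hypothetical detector signal: modulation at 1 kHz whose phase offset
# carries the sensing information (the quantity the polarimeter tracks).
fs, f, phase = 100000, 1000.0, 0.25
sig = [math.cos(2 * math.pi * f * t / fs - phase) for t in range(1000)]
amp, ph = harmonic(sig, fs, f)
print(round(amp, 3), round(ph, 3))  # amplitude ~1, phase offset ~0.25 rad
```

Integrating over an integer number of modulation cycles, as here, is what gives synchronous detection its noise immunity: uncorrelated noise averages toward zero in both quadratures.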
NASA Astrophysics Data System (ADS)
Chakraborty, S.; Banerjee, A.; Gupta, S. K. S.; Christensen, P. R.; Papandreou-Suppappola, A.
2017-12-01
Multitemporal observations acquired frequently by satellites with short revisit periods, such as the Moderate Resolution Imaging Spectroradiometer (MODIS), are an important source for modeling land cover. Due to the inherent seasonality of the land cover, harmonic modeling reveals hidden state parameters characteristic of it, which are used in classifying different land cover types and in detecting changes due to natural or anthropogenic factors. In this work, we use an eight-day MODIS composite to create a Normalized Difference Vegetation Index (NDVI) time series of ten years. Improved hidden parameter estimates of the nonlinear harmonic NDVI model are obtained using the Particle Filter (PF), a sequential Monte Carlo estimator. The nonlinear estimation based on the PF is shown to improve parameter estimation for different land cover types compared to existing techniques that use the Extended Kalman Filter (EKF), which relies on linearization of the harmonic model. As these parameters are representative of a given land cover, their applicability in near real-time detection of land cover change is also studied by formulating a metric that captures parameter deviation due to change. The detection methodology is evaluated by treating change as a rare-class problem. This approach is shown to detect change with minimum delay. Additionally, the degree of change within the change perimeter is non-uniform. By clustering the deviation in parameters due to change, this spatial variation in change severity is effectively mapped and validated against high spatial resolution change maps of the given regions.
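A minimal sketch of particle filtering for a harmonic NDVI model follows. It uses a single harmonic with no phase term, a synthetic three-year series, and invented constants (noise level, particle count, jitter), so it illustrates the sequential Monte Carlo idea rather than reproducing the paper's estimator.

```python
import math, random

random.seed(1)
T = 46  # ~eight-day composites per year

def ndvi(t, mean, amp):
    """One-harmonic seasonal NDVI model (phase term omitted for brevity)."""
    return mean + amp * math.cos(2 * math.pi * t / T)

# Synthetic 3-year NDVI series; the "true" parameters are hypothetical.
true_mean, true_amp, sigma = 0.5, 0.3, 0.02
obs = [ndvi(t, true_mean, true_amp) + random.gauss(0, sigma)
       for t in range(3 * T)]

# Sequential importance resampling over the hidden parameters (mean, amp).
N = 500
parts = [(random.uniform(0, 1), random.uniform(0, 0.6)) for _ in range(N)]
for t, y in enumerate(obs):
    w = [math.exp(-0.5 * ((y - ndvi(t, m, a)) / sigma) ** 2)
         for m, a in parts]
    if sum(w):
        parts = random.choices(parts, weights=w, k=N)
    # Small jitter keeps the static parameters from collapsing.
    parts = [(m + random.gauss(0, 0.002), a + random.gauss(0, 0.002))
             for m, a in parts]

est_mean = sum(m for m, _ in parts) / N
est_amp = sum(a for _, a in parts) / N
print(round(est_mean, 2), round(est_amp, 2))
```

A change metric in the spirit of the abstract would then track the deviation of these posterior parameter estimates from their historical values for each pixel.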
Assessment of Data Fusion Algorithms for Earth Observation Change Detection Processes
Molina, Iñigo; Martinez, Estibaliz; Morillo, Carmen; Velasco, Jesus; Jara, Alvaro
2016-01-01
In this work a parametric multi-sensor Bayesian data fusion approach and a Support Vector Machine (SVM) are used for a Change Detection problem. For this purpose two sets of SPOT5-PAN images have been used, which are in turn used for the calculation of Change Detection Indices (CDIs). For minimizing radiometric differences, a methodology based on zonal “invariant features” is suggested. The choice of one or another CDI for a change detection process is a subjective task, as each CDI is probably more or less sensitive to certain types of changes. Likewise, this idea might be employed to create and improve a “change map”, which can be accomplished by means of the CDI’s informational content. For this purpose, information metrics such as the Shannon Entropy and “Specific Information” have been used to weight the change and no-change categories contained in a certain CDI and thus introduced into the Bayesian information fusion algorithm. Furthermore, the parameters of the probability density functions (pdf’s) that best fit the involved categories have also been estimated. Conversely, these considerations are not necessary for mapping procedures based on the discriminant functions of an SVM. This work has confirmed the capabilities of the probabilistic information fusion procedure under these circumstances. PMID:27706048
Climate Change Detection and Attribution of Infrared Spectrum Measurements
NASA Technical Reports Server (NTRS)
Phojanamongkolkij, Nipa; Parker, Peter A.; Mlynczak, Martin G.
2012-01-01
Climate change occurs when the Earth's energy budget changes due to natural or possibly anthropogenic forcings. These forcings cause the climate system to adjust, resulting in a new climate state that is warmer or cooler than the original. The key question is how to detect and attribute climate change. The inference of infrared spectral signatures of climate change has been discussed in the literature for nearly 30 years. Pioneering work in the 1980s noted that distinct spectral signatures would be evident in changes in the infrared radiance emitted by the Earth and its atmosphere, and that these could be observed from orbiting satellites. Since then, a number of other studies have advanced the concept of spectral signatures of climate change. Today the concept of using spectral signatures to identify and attribute atmospheric composition change is firmly accepted and is the foundation of the Climate Absolute Radiance and Refractivity Observatory (CLARREO) satellite mission being developed at NASA. In this work, we present an overview of the current climate change detection concept using climate model calculations as surrogates for climate change. Future research that improves the methodology for achieving this concept will be valuable to society.
Rate change detection of frequency modulated signals: developmental trends.
Cohen-Mimran, Ravit; Sapir, Shimon
2011-08-26
The aim of this study was to examine developmental trends in rate change detection of auditory rhythmic signals (repetitive sinusoidally frequency modulated tones). Two groups of children (9-10 years old and 11-12 years old) and one group of young adults performed a rate change detection (RCD) task using three types of stimuli. The rate of stimulus modulation was either constant (CR), raised by 1 Hz in the middle of the stimulus (RR1), or raised by 2 Hz in the middle of the stimulus (RR2). Performance on the RCD task improved significantly with age. Also, the different stimuli showed different developmental trajectories. When the RR2 stimulus was used, results showed adult-like performance by the age of 10 years, but when the RR1 stimulus was used, performance continued to improve beyond 12 years of age. Rate change detection of repetitive sinusoidally frequency modulated tones thus shows protracted development beyond the age of 12 years. Given evidence for abnormal processing of auditory rhythmic signals in neurodevelopmental conditions, such as dyslexia, the present methodology might help delineate the nature of these conditions.
An Investigation of Automatic Change Detection for Topographic Map Updating
NASA Astrophysics Data System (ADS)
Duncan, P.; Smit, J.
2012-08-01
Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated for detecting changes combines image classification with spatial analysis and is focussed on urban landscapes. The major data inputs into this study are high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-oriented approaches hold more promise. Even in the case of object-oriented image classification, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie in a hybrid approach of pixel-based and object-oriented techniques.
NASA Astrophysics Data System (ADS)
Stone, Dáithí A.; Hansen, Gerrit
2016-09-01
Despite being a well-established research field, the detection and attribution of observed climate change to anthropogenic forcing is not yet provided as a climate service. One reason for this is the lack of a methodology for performing tailored detection and attribution assessments on a rapid time scale. Here we develop such an approach, based on the translation of quantitative analysis into the "confidence" language employed in recent Assessment Reports of the Intergovernmental Panel on Climate Change. While its systematic nature necessarily ignores some nuances examined in detailed expert assessments, the approach nevertheless goes beyond most detection and attribution studies in considering contributors to building confidence such as errors in observational data products arising from sparse monitoring networks. When compared against recent expert assessments, the results of this approach closely match those of the existing assessments. Where there are small discrepancies, these variously reflect ambiguities in the details of what is being assessed, reveal nuances or limitations of the expert assessments, or indicate limitations of the accuracy of the sort of systematic approach employed here. Deployment of the method on 116 regional assessments of recent temperature and precipitation changes indicates that existing rules of thumb concerning the detectability of climate change ignore the full range of sources of uncertainty, most particularly the importance of adequate observational monitoring.
ERIC Educational Resources Information Center
Valasek, Mark A.; Repa, Joyce J.
2005-01-01
In recent years, real-time polymerase chain reaction (PCR) has emerged as a robust and widely used methodology for biological investigation because it can detect and quantify very small amounts of specific nucleic acid sequences. As a research tool, a major application of this technology is the rapid and accurate assessment of changes in gene…
Microseismic techniques for avoiding induced seismicity during fluid injection
Matzel, Eric; White, Joshua; Templeton, Dennise; ...
2014-01-01
The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.
Support Vector Machines for Multitemporal and Multisensor Change Detection in a Mining Area
NASA Astrophysics Data System (ADS)
Hecheltjen, Antje; Waske, Bjorn; Thonfeld, Frank; Braun, Matthias; Menz, Gunter
2010-12-01
Long-term change detection often implies the challenge of incorporating multitemporal data from different sensors. Most of the conventional change detection algorithms are designed for bi-temporal datasets from the same sensor and detect only the existence of changes. The labeling of change areas remains a difficult task. To overcome such drawbacks, much attention has been given lately to algorithms arising from machine learning, such as Support Vector Machines (SVMs). While SVMs have been applied successfully for land cover classifications, the exploitation of this approach for change detection is still in its infancy. A few studies have already proven the applicability of SVMs for bi- and multitemporal change detection using data from one sensor only. In this paper we demonstrate the application of SVMs for multitemporal and multisensor change detection. Our study site covers lignite open pit mining areas in the German state of North Rhine-Westphalia. The dataset consists of bi-temporal Landsat data and multi-temporal ERS SAR data covering two time slots (2001 and 2009). The SVM classification is conducted using the IDL program imageSVM. Change is deduced from one time slot to the next, resulting in two change maps. In contrast to approaches based on post-classification comparison, change detection is treated here as a specific classification problem: changes are directly classified from a layer stack of the two years. To reduce the number of change classes, we created a change mask using the magnitude of Change Vector Analysis (CVA). Training data were selected for different change classes (e.g. forest to mining or mining to agriculture) as well as for the no-change classes (e.g. agriculture). Subsequently, they were divided into two independent sets for training the SVMs and for accuracy assessment, respectively. Our study shows the applicability of SVMs to classifying changes directly, and the proposed method yielded a change map of reclaimed and active mines.
The use of ERS SAR data, however, did not add to the accuracy compared to Landsat data alone. A great advantage compared to other change detection approaches is the labeled change maps, which are a direct output of the methodology. Our approach also overcomes the drawback of post-classification comparison, namely the propagation of classification inaccuracies.
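The CVA-based change mask mentioned above can be sketched as follows; the band values and the threshold are hypothetical, and in practice the pixels that pass the mask would then be handed to the SVM for labeling into specific change classes.

```python
import math

def cva_magnitude(pix_t1, pix_t2):
    """Change Vector Analysis magnitude: the Euclidean norm of the
    spectral difference vector between two dates for one pixel."""
    return math.sqrt(sum((b2 - b1) ** 2 for b1, b2 in zip(pix_t1, pix_t2)))

# Hypothetical 6-band reflectances for three pixels at two dates.
t1 = [(0.1, 0.2, 0.15, 0.4, 0.3, 0.2),    # forest, unchanged
      (0.1, 0.2, 0.15, 0.4, 0.3, 0.2),    # forest -> open-pit mine
      (0.2, 0.25, 0.3, 0.35, 0.3, 0.25)]  # agriculture, unchanged
t2 = [(0.11, 0.21, 0.14, 0.39, 0.31, 0.2),
      (0.35, 0.4, 0.45, 0.5, 0.55, 0.5),
      (0.21, 0.24, 0.3, 0.36, 0.3, 0.26)]

threshold = 0.1  # a hypothetical cut-off; chosen empirically in practice
mask = [cva_magnitude(a, b) > threshold for a, b in zip(t1, t2)]
print(mask)  # only the forest -> mine pixel exceeds the threshold
```

Masking first keeps the subsequent SVM training set focused on genuine from-to transitions instead of radiometric noise.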
A Mode-Shape-Based Fault Detection Methodology for Cantilever Beams
NASA Technical Reports Server (NTRS)
Tejada, Arturo
2009-01-01
An important goal of NASA's Integrated Vehicle Health Management (IVHM) program is to develop and verify methods and technologies for fault detection in critical airframe structures. A particularly promising new technology under development at NASA Langley Research Center is distributed Bragg fiber optic strain sensors. These sensors can be embedded in, for instance, aircraft wings to continuously monitor surface strain during flight. Strain information can then be used in conjunction with well-known vibrational techniques to detect faults due to changes in the wing's physical parameters or to the presence of incipient cracks. To verify the benefits of this technology, the Formal Methods Group at NASA LaRC has proposed the use of formal verification tools such as PVS. The verification process, however, requires knowledge of the physics and mathematics of the vibrational techniques and a clear understanding of the particular fault detection methodology. This report presents a succinct review of the physical principles behind the modeling of vibrating structures such as cantilever beams (the natural model of a wing). It also reviews two different classes of fault detection techniques and proposes a particular detection method for cracks in wings, which is amenable to formal verification. A prototype implementation of these methods using Matlab scripts is also described and related to the fundamental theoretical concepts.
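Vibrational techniques of this kind start from the Euler-Bernoulli natural frequencies of a uniform cantilever. The sketch below shows how a loss of stiffness shifts those frequencies; the beam dimensions and the crude "5% drop in EI" crack model are assumptions for illustration, not the report's method.

```python
import math

# First three roots of the cantilever frequency equation
# cos(bL)cosh(bL) = -1.
BETA_L = [1.8751, 4.6941, 7.8548]

def natural_freqs(E, I, rho, A, L):
    """Euler-Bernoulli natural frequencies (Hz) of a uniform cantilever."""
    return [(bl ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))
            for bl in BETA_L]

# Hypothetical aluminium strip: 0.5 m long, 30 mm x 3 mm cross-section.
E, rho, L = 69e9, 2700.0, 0.5
b, h = 0.03, 0.003
A, I = b * h, b * h ** 3 / 12
healthy = natural_freqs(E, I, rho, A, L)
# Model a crack crudely as a uniform 5% reduction in bending stiffness EI.
damaged = natural_freqs(0.95 * E, I, rho, A, L)
shifts = [1 - d / hf for hf, d in zip(healthy, damaged)]
print([round(f, 1) for f in healthy], [round(s * 100, 2) for s in shifts])
```

Because frequency scales with the square root of stiffness, a 5% EI loss shifts every mode down by about 2.5%; a localized crack would instead shift modes unevenly, which is what mode-shape-based methods exploit.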
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved ambiguity. This is partly because the assessment of detection dog effectiveness remains entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs depend on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and an assistor for the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when considering their relative contributions, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
Assessment of SRS ambient air monitoring network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, K.; Jannik, T.
Three methodologies have been used to assess the effectiveness of the existing ambient air monitoring system in place at the Savannah River Site in Aiken, SC. Effectiveness was measured using two metrics that have been utilized in previous quantifications of air-monitoring network performance: frequency of detection (a measurement of how frequently a minimum number of samplers within the network detect an event) and network intensity (a measurement of how consistent each sampler within the network is at detecting events). In addition to determining the effectiveness of the current system, the objective of performing this assessment was to determine what, if any, changes could make the system more effective. Methodologies included 1) the Waite method of determining sampler distribution, 2) the CAP88-PC annual dose model, and 3) a puff/plume transport model used to predict air concentrations at sampler locations. Comparison of data collected from air samplers at SRS in 2015 with predicted data resulting from these methodologies determined that the frequency of detection for the current system is 79.2%, with sampler efficiencies ranging from 5% to 45% and a mean network intensity of 21.5%. One of the air monitoring stations had an efficiency of less than 10% and detected releases during just one sampling period of the entire year, adding little to the overall network intensity. By moving or removing this sampler, the mean network intensity increased to about 23%. Further work in increasing the network intensity and simulating accident scenarios to further test the ambient air system at SRS is planned.
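The two network metrics can be illustrated on a toy detection matrix. The definitions coded here are one reading of the abstract's descriptions (with the minimum sampler count set to one, and network intensity taken as mean sampler efficiency), and the detection data are invented.

```python
# Rows: release events; columns: samplers; True = sampler detected event.
# Hypothetical detections for 6 events across 4 samplers.
detections = [
    [True,  True,  False, False],
    [True,  False, True,  False],
    [False, True,  True,  False],
    [True,  True,  True,  True ],
    [False, False, False, False],
    [True,  False, False, False],
]

min_samplers = 1
events = len(detections)
samplers = len(detections[0])

# Frequency of detection: share of events caught by >= min_samplers.
freq = sum(sum(row) >= min_samplers for row in detections) / events

# Sampler efficiency: share of events each sampler catches; network
# intensity is taken here as the mean efficiency across samplers.
eff = [sum(row[j] for row in detections) / events for j in range(samplers)]
intensity = sum(eff) / samplers
print(round(freq, 3), [round(e, 2) for e in eff], round(intensity, 3))
```

Note how the weakest sampler (one detection in six events) drags the mean intensity down while barely affecting the frequency of detection, mirroring the low-efficiency station discussed in the assessment.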
Spotlight SAR interferometry for terrain elevation mapping and interferometric change detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eichel, P.H.; Ghiglia, D.C.; Jakowatz, C.V. Jr.
1996-02-01
In this report, we employ an approach quite different from any previous work; we show that a new methodology leads to a simpler and clearer understanding of the fundamental principles of SAR interferometry. This methodology also allows implementation of an important collection mode that has not been demonstrated to date. Specifically, we introduce the following six new concepts for the processing of interferometric SAR (INSAR) data: (1) processing using spotlight mode SAR imaging (allowing ultra-high resolution), as opposed to conventional strip-mapping techniques; (2) derivation of the collection geometry constraints required to avoid decorrelation effects in two-pass INSAR; (3) derivation of maximum likelihood estimators for phase difference and the change parameter employed in interferometric change detection (ICD); (4) processing for the two-pass case wherein the platform ground tracks make a large crossing angle; (5) a robust least-squares method for two-dimensional phase unwrapping, formulated as a solution to Poisson's equation instead of using traditional path-following techniques; and (6) the existence of a simple linear scale factor that relates phase differences between two SAR images to terrain height. We present both theoretical analysis and numerous examples employing real SAR collections to demonstrate the innovations listed above.
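Items (3) and (6) above rest on the interferometric phase difference between co-registered complex pixels and a linear phase-to-height conversion. The sketch below shows only that core relationship; the numeric scale factor is invented for illustration (in practice it depends on wavelength, slant range, look angle, and baseline of the collection).

```python
import cmath

def interferometric_phase(s1, s2):
    """Phase difference between co-registered complex SAR pixels,
    i.e. the argument of s1 * conj(s2)."""
    return cmath.phase(s1 * s2.conjugate())

def phase_to_height(dphi, scale_m_per_rad):
    """Apply a linear scale factor relating phase to terrain height.
    The scale factor here is an assumed constant for illustration."""
    return dphi * scale_m_per_rad

p1 = cmath.rect(1.0, 0.8)   # pixel from pass 1 (unit magnitude, phase 0.8 rad)
p2 = cmath.rect(1.0, 0.3)   # same pixel from pass 2 (phase 0.3 rad)
dphi = interferometric_phase(p1, p2)
print(round(phase_to_height(dphi, scale_m_per_rad=10.0), 1))  # 5.0
```

With real data the phase must first be unwrapped (the report's Poisson least-squares formulation) before the linear conversion is meaningful.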
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using the signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
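A POD curve of the kind mentioned above is commonly modeled as a sigmoid in flaw size. The sketch below is a generic logistic POD, not the model fitted in the paper; the parameter values (midpoint size, slope) are invented for illustration.

```python
import math

def pod(crack_size_mm, a50=1.0, slope=4.0):
    """Logistic probability of detection as a function of crack size.
    a50 is the size detected 50% of the time (assumed value)."""
    return 1.0 / (1.0 + math.exp(-slope * (crack_size_mm - a50)))

print(round(pod(1.0), 2))   # 0.5: the 50% detectable size by construction
print(pod(2.0) > 0.9)       # True: larger cracks are almost always detected
```

In reliability practice the quantity of interest is often a90/95 (the size detected 90% of the time at 95% confidence), read off such a fitted curve.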
Multitemporal spatial pattern analysis of Tulum's tropical coastal landscape
NASA Astrophysics Data System (ADS)
Ramírez-Forero, Sandra Carolina; López-Caloca, Alejandra; Silván-Cárdenas, José Luis
2011-11-01
The tropical coastal landscape of Tulum in Quintana Roo, Mexico has high ecological, economic, social and cultural value, providing environmental and tourism services at global, national, regional and local levels. The landscape of the area is heterogeneous and presents random fragmentation patterns. In recent years, tourism in the region has increased, promoting an accelerated expansion of hotel, transportation and recreation infrastructure that alters this complex landscape. It is important to understand the environmental dynamics through temporal changes in spatial patterns and to propose better management of this ecological area to the authorities. This paper addresses a multi-temporal analysis of land cover changes from 1993 to 2000 in Tulum using Thematic Mapper data acquired by Landsat-5. Two independent methodologies were applied for the analysis of changes in the landscape and for the definition of fragmentation patterns. First, the Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) algorithm was used to detect and localize land cover change/no-change areas. Second, post-classification change detection was evaluated using the Support Vector Machine (SVM) algorithm. Landscape metrics were calculated from the results of IR-MAD and SVM. The analysis of the metrics indicated, among other things, a higher fragmentation pattern along roadways.
Stone, Daithi A.; Hansen, Gerrit
2015-11-21
Despite being a well-established research field, the detection and attribution of observed climate change to anthropogenic forcing is not yet provided as a climate service. One reason for this is the lack of a methodology for performing tailored detection and attribution assessments on a rapid time scale. Here we develop such an approach, based on the translation of quantitative analysis into the "confidence" language employed in recent Assessment Reports of the Intergovernmental Panel on Climate Change. While its systematic nature necessarily ignores some nuances examined in detailed expert assessments, the approach nevertheless goes beyond most detection and attribution studies in considering contributors to building confidence, such as errors in observational data products arising from sparse monitoring networks. When compared against recent expert assessments, the results of this approach closely match those of the existing assessments. Where there are small discrepancies, these variously reflect ambiguities in the details of what is being assessed, reveal nuances or limitations of the expert assessments, or indicate limitations of the accuracy of the sort of systematic approach employed here. Deployment of the method on 116 regional assessments of recent temperature and precipitation changes indicates that existing rules of thumb concerning the detectability of climate change ignore the full range of sources of uncertainty, most particularly the importance of adequate observational monitoring.
Lockie, Robert G; Farzad, Jalilvand; Orjalo, Ashley J; Giuliano, Dominic V; Moreno, Matthew R; Wright, Glenn A
2017-02-01
Lockie, RG, Jalilvand, F, Orjalo, AJ, Giuliano, DV, Moreno, MR, and Wright, GA. A methodological report: Adapting the 505 change-of-direction speed test specific to American football. J Strength Cond Res 31(2): 539-547, 2017-The 505 involves a 10-m sprint past a timing gate, followed by a 180° change-of-direction (COD) performed over 5 m. This methodological report investigated an adapted 505 (A505) designed to be football-specific by changing the distances to 10 and 5 yd. Twenty-five high school football players (6 linemen [LM]; 8 quarterbacks, running backs, and linebackers [QB/RB/LB]; 11 receivers and defensive backs [R/DB]) completed the A505 and 40-yd sprint. The difference between A505 and 0 to 10-yd time determined the COD deficit for each leg. In a follow-up session, 10 subjects completed the A505 again and 10 subjects completed the 505. Reliability was analyzed by t-tests to determine between-session differences, typical error (TE), and coefficient of variation. Test usefulness was examined via TE and smallest worthwhile change (SWC) differences. Pearson's correlations calculated relationships between the A505 and 505, and A505 and COD deficit with the 40-yd sprint. A 1-way analysis of variance (p ≤ 0.05) derived between-position differences in the A505 and COD deficit. There were no between-session differences for the A505 (p = 0.45-0.76; intraclass correlation coefficient = 0.84-0.95; TE = 2.03-4.13%). Additionally, the A505 was capable of detecting moderate performance changes (SWC0.5 > TE). The A505 correlated with the 505 and 40-yard sprint (r = 0.58-0.92), suggesting the modified version assessed similar qualities. Receivers and defensive backs were faster than LM in the A505 for both legs, and right-leg COD deficit. Quarterbacks, running backs, and linebackers were faster than LM in the right-leg A505. The A505 is reliable, can detect moderate performance changes, and can discriminate between football position groups.
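The two computations at the heart of the report above are simple arithmetic: the COD deficit is the A505 time minus the 0-10 yd sprint split, and a test is deemed "useful" when the smallest worthwhile change (SWC) exceeds the typical error (TE). The sketch below uses invented times, not the study's data.

```python
def cod_deficit(a505_time, ten_yd_split):
    """Time cost attributable to the 180-degree direction change:
    A505 time minus the 0-10 yd split from the 40-yd sprint."""
    return a505_time - ten_yd_split

def is_useful(typical_error, swc):
    """A test can detect a real change when the smallest worthwhile
    change exceeds the typical error (SWC > TE)."""
    return swc > typical_error

# Hypothetical athlete: 2.45 s A505, 1.70 s 0-10 yd split.
print(round(cod_deficit(2.45, 1.70), 2))               # 0.75
# Moderate SWC taken as 0.5 x between-subject SD (assumed SD = 0.16 s).
print(is_useful(typical_error=0.05, swc=0.5 * 0.16))   # True
```

The SWC0.5 > TE check printed here mirrors the "capable of detecting moderate performance changes" criterion stated in the abstract.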
Zimmermann, Boris; Kohler, Achim
2014-01-01
Background It is imperative to have reliable and timely methodologies for analysis and monitoring of seed plants in order to determine climate-related plant processes. Moreover, the impact of environment on plant fitness is predominantly based on studies of female functions, while the contribution of male gametophytes is mostly ignored due to missing data on pollen quality. We explored the use of infrared spectroscopy of pollen for an inexpensive and rapid characterization of plants. Methodology The study was based on measurement of pollen samples by two Fourier transform infrared techniques: single reflectance attenuated total reflectance and transmission measurement of sample pellets. The experimental set, with a total of 813 samples, included five pollination seasons and 300 different plant species belonging to all principal spermatophyte clades (conifers, monocotyledons, eudicots, and magnoliids). Results The spectroscopic-based methodology enables detection of phylogenetic variations, including the separation of confamiliar and congeneric species. Furthermore, the methodology enables measurement of phenotypic plasticity by the detection of inter-annual variations within the populations. The spectral differences related to environment and taxonomy are interpreted biochemically, specifically variations of pollen lipids, proteins, carbohydrates, and sporopollenins. The study shows large variations of absolute content of nutrients for congeneric species pollinating in the same environmental conditions. Moreover, a clear correlation between the carbohydrate-to-protein ratio and pollination strategy has been detected. An infrared spectral database covering the biochemical variation among the range of species, climates and biogeographies will significantly improve comprehension of plant-environment interactions, including the impact of global climate change on plant communities. PMID:24748390
Use of uterine electromyography to diagnose term and preterm labor
LUCOVNIK, MIHA; KUON, RUBEN J.; CHAMBLISS, LINDA R.; MANER, WILLIAM L.; SHI, SHAO-QING; SHI, LEILI; BALDUCCI, JAMES; GARFIELD, ROBERT E.
2011-01-01
Current methodologies to assess the process of labor, such as tocodynamometry or intrauterine pressure catheters, fetal fibronectin, cervical length measurement and digital cervical examination, have several major drawbacks. They only measure the onset of labor indirectly and do not detect cellular changes characteristic of true labor. Consequently, their predictive values for term or preterm delivery are poor. Uterine contractions are a result of the electrical activity within the myometrium. Measurement of uterine electromyography (EMG) has been shown to detect contractions as accurately as the currently used methods. In addition, changes in cell excitability and coupling required for effective contractions that lead to delivery are reflected in changes of several EMG parameters. Use of uterine EMG can help to identify patients in true labor better than any other method presently employed in the clinic. PMID:21241260
Automatic food intake detection based on swallowing sounds.
Makeyev, Oleksandr; Lopez-Meyer, Paulo; Schuckers, Stephanie; Besio, Walter; Sazonov, Edward
2012-11-01
This paper presents a novel fully automatic food intake detection methodology, an important step toward objective monitoring of ingestive behavior. The aim of such monitoring is to improve our understanding of eating behaviors associated with obesity and eating disorders. The proposed methodology consists of two stages. First, acoustic detection of swallowing instances based on mel-scale Fourier spectrum features and classification using support vector machines is performed. Principal component analysis and a smoothing algorithm are used to improve swallowing detection accuracy. Second, the frequency of swallowing is used as a predictor for detection of food intake episodes. The proposed methodology was tested on data collected from 12 subjects with various degrees of adiposity. Average accuracies of >80% and >75% were obtained for intra-subject and inter-subject models correspondingly with a temporal resolution of 30s. Results obtained on 44.1 hours of data with a total of 7305 swallows show that detection accuracies are comparable for obese and lean subjects. They also suggest feasibility of food intake detection based on swallowing sounds and potential of the proposed methodology for automatic monitoring of ingestive behavior. Based on a wearable non-invasive acoustic sensor the proposed methodology may potentially be used in free-living conditions.
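The two-stage structure described above can be sketched compactly. Note the stand-ins: the paper classifies swallows with support vector machines on mel-scale Fourier spectrum features, whereas this toy uses a bare energy threshold per window; the window counts and thresholds are likewise assumptions, chosen only to show how stage 2 turns swallowing frequency into intake episodes.

```python
def detect_swallows(window_energies, threshold=0.5):
    """Stage 1 stand-in: per-window swallow/non-swallow decisions.
    (The paper uses an SVM on mel-scale spectrum features here.)"""
    return [e > threshold for e in window_energies]

def detect_intake(swallow_flags, windows_per_epoch=10, min_swallows=3):
    """Stage 2: flag food intake in each epoch (e.g. a 30 s span)
    when the swallowing frequency exceeds a threshold."""
    epochs = [swallow_flags[i:i + windows_per_epoch]
              for i in range(0, len(swallow_flags), windows_per_epoch)]
    return [sum(ep) >= min_swallows for ep in epochs]

energies = [0.1, 0.7, 0.8, 0.2, 0.9, 0.1, 0.6, 0.1, 0.1, 0.2,  # eating
            0.1, 0.2, 0.1, 0.1, 0.3, 0.1, 0.2, 0.1, 0.1, 0.1]  # resting
flags = detect_swallows(energies)
print(detect_intake(flags))  # [True, False]
```

The first epoch contains four detected swallows and is flagged as intake; the second contains none and is not.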
Tanaka, Yoshiyuki; Mizoe, Genki; Kawaguchi, Tomohiro
2015-01-01
This paper proposes a simple diagnostic methodology for assessing proprioceptive/kinesthetic sensation using a robotic device. The ability to perceive virtual frictional forces is examined as subjects operate the robotic device by hand at a uniform slow velocity along a virtual straight/circular path. Experimental results with healthy subjects demonstrate that the percentage of correct answers on the designed perceptual tests changes with the motion direction as well as with the arm configuration and the human force manipulability (HFM) measure. The proposed methodology could be applied to the early detection of neuromuscular/neurological disorders.
Baty, Florent; Klingbiel, Dirk; Zappa, Francesco; Brutsche, Martin
2015-12-01
Alternative splicing is an important component of tumorigenesis. The recent advent of exon array technology enables the detection of alternative splicing at a genome-wide scale. The analysis of high-throughput alternative splicing is not yet standard, and methodological developments are still needed. We propose a novel statistical approach, Dually Constrained Correspondence Analysis (DCCA), for the detection of splicing changes in exon array data. Using this methodology, we investigated the genome-wide alteration of alternative splicing in patients with non-small cell lung cancer treated with bevacizumab/erlotinib. Splicing candidates reveal a series of genes related to carcinogenesis (SFTPB), cell adhesion (STAB2, PCDH15, HABP2), tumor aggressiveness (ARNTL2), apoptosis, proliferation and differentiation (PDE4D, FLT3, IL1R2), cell invasion (ETV1), as well as tumor growth (OLFM4, FGF14), tumor necrosis (AFF3) or tumor suppression (TUSC3, CSMD1, RHOBTB2, SERPINB5), with indication of known alternative splicing in a majority of genes. DCCA facilitates the identification of putative biologically relevant alternative splicing events in high-throughput exon array data. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Y. C.; Shih, H. Y.; Chen, J. Y.; Tan, W. J.; Chen, Y. F.
2013-07-01
An optically detectable gas sensor based on the high surface sensitivity of functionalized polyethylenimine/starch In0.15Ga0.85N/GaN strained semiconductor multiple quantum wells (MQWs) has been developed. Due to the excellent piezoelectricity of the MQWs, the change of surface charges caused by chemical interaction can introduce a strain and induce an internal field. In turn, it tilts the energy levels of the MQWs and modifies the optical properties. Through the measurement of the changes in photoluminescence as well as Raman scattering spectra under different concentrations of carbon dioxide gas, we demonstrate the feasibility and high sensitivity of the sensors derived from our methodology.
NASA Technical Reports Server (NTRS)
Muller, Dagmar; Krasemann, Hajo; Brewin, Robert J. W.; Deschamps, Pierre-Yves; Doerffer, Roland; Fomferra, Norman; Franz, Bryan A.; Grant, Mike G.; Groom, Steve B.; Melin, Frederic;
2015-01-01
The Ocean Colour Climate Change Initiative intends to provide a long-term time series of ocean colour data and investigate the detectable climate impact. A reliable and stable atmospheric correction procedure is the basis for ocean colour products of the necessary high quality. In order to guarantee an objective selection from a set of four atmospheric correction processors, the common validation strategy of comparisons between in-situ and satellite derived water leaving reflectance spectra, is extended by a ranking system. In principle, the statistical parameters such as root mean square error, bias, etc. and measures of goodness of fit, are transformed into relative scores, which evaluate the relationship of quality dependent on the algorithms under study. The sensitivity of these scores to the selected database has been assessed by a bootstrapping exercise, which allows identification of the uncertainty in the scoring results. Although the presented methodology is intended to be used in an algorithm selection process, this paper focusses on the scope of the methodology rather than the properties of the individual processors.
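The bootstrapping exercise mentioned above can be illustrated in miniature: resample the in-situ/satellite matchup database with replacement and recompute the statistic that feeds the score, giving a spread that reflects the sensitivity of the ranking to the selected database. The score transform itself and the matchup values below are assumptions for illustration, not the Climate Change Initiative definitions.

```python
import random

def rmse(pairs):
    """Root mean square error over (in-situ, satellite) matchup pairs."""
    return (sum((s - i) ** 2 for i, s in pairs) / len(pairs)) ** 0.5

def bootstrap_rmse(pairs, n_boot=500, seed=1):
    """Recompute RMSE on resampled-with-replacement matchup databases."""
    random.seed(seed)
    scores = []
    for _ in range(n_boot):
        sample = [random.choice(pairs) for _ in pairs]
        scores.append(rmse(sample))
    return scores

# Hypothetical water-leaving reflectance matchups (in-situ, satellite).
pairs = [(0.010, 0.012), (0.020, 0.019), (0.015, 0.016), (0.030, 0.027)]
scores = bootstrap_rmse(pairs)
lo, hi = min(scores), max(scores)
print(lo <= rmse(pairs) <= hi)  # full-sample RMSE lies within the spread
```

Repeating this for each atmospheric correction processor shows whether the score differences between processors exceed the database-induced uncertainty.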
A Cooperative IDS Approach Against MPTCP Attacks
2017-06-01
physical testbeds in order to present a methodology that allows distributed IDSs (DIDS) to cooperate in a manner that permits effective detection of...reconstruct MPTCP subflows and detect malicious content. Next, we build physical testbeds in order to present a methodology that allows distributed IDSs...hypotheses on a more realistic testbed environment. • Developing a methodology to incorporate multiple IDSs, real and virtual, to be able to detect cross
Protein detection by Simple Western™ analysis.
Harris, Valerie M
2015-01-01
ProteinSimple© has taken a well-known protein detection method, the western blot, and revolutionized it. The Simple Western™ system uses capillary electrophoresis to identify and quantitate a protein of interest. ProteinSimple© provides multiple detection apparatuses (Wes, Sally Sue, or Peggy Sue) that are suggested to save scientists valuable time by allowing the researcher to prepare the protein sample, load it along with the necessary antibodies and substrates, and walk away. Within 3-5 h, the protein will be separated by size or charge, immunodetection of the target protein will be accurately quantitated, and results will be made immediately available. Using the Peggy Sue instrument, one study recently examined changes in MAPK signaling proteins in the sex-determining stage of gonadal development. Here the methodology is described.
Damage detection of engine bladed-disks using multivariate statistical analysis
NASA Astrophysics Data System (ADS)
Fang, X.; Tang, J.
2006-03-01
The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, particularly, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference or mistuning), operating condition change and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. The non-model-based damage detection is achieved by analyzing the change between response features of the healthy structure and of the damaged one. We facilitate such comparison by incorporating Hotelling's T2 statistic, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
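The PCA-plus-T² scheme described above can be sketched directly: fit PCA on baseline (healthy) response features, then compute Hotelling's T² of new observations in the retained principal subspace and declare damage when it exceeds a control limit. Dimensions, the number of retained components, and the empirical 99% control limit below are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(200, 6))   # baseline feature vectors

# Fit PCA on the healthy data: center, eigendecompose the covariance.
mean = healthy.mean(axis=0)
cov = np.cov(healthy - mean, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1][:3]            # keep 3 principal components
components, variances = eigvec[:, order], eigval[order]

def t_squared(x):
    """Hotelling's T^2 of one observation in the principal subspace."""
    score = (x - mean) @ components
    return float(np.sum(score**2 / variances))

baseline_t2 = np.array([t_squared(x) for x in healthy])
limit = np.quantile(baseline_t2, 0.99)          # empirical control limit

# A grossly shifted response along the first principal direction.
damaged = mean + 8.0 * components[:, 0]
print(t_squared(damaged) > limit)               # True: damage declared
```

The confidence level enters through the control limit: here it is an empirical 99% quantile of the baseline T² values, so about 1% of healthy responses would trigger a false declaration.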
Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.
2003-01-01
Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. 
We also believe that this methodological framework has wide application to problems in animal ecology concerning metapopulation and community dynamics.
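The colony-site dynamics above are governed by conditional probabilities of local extinction and colonization, with imperfect detection handled by the robust-design estimators. The toy simulation below is not the authors' estimation framework; it only illustrates the underlying Markov process, with all parameter values (extinction eps, colonization gamma, detection p) invented.

```python
import random

def step(occupied, eps, gamma, rng):
    """One season of colony dynamics: occupied sites go extinct with
    probability eps; empty sites are colonized with probability gamma."""
    return [(rng.random() >= eps) if occ else (rng.random() < gamma)
            for occ in occupied]

def observe(occupied, p, rng):
    """Presence-absence data: occupied sites are detected with prob p,
    so raw detections understate true occupancy."""
    return [occ and rng.random() < p for occ in occupied]

rng = random.Random(42)
sites = [True] * 50 + [False] * 50
for _ in range(20):                      # run toward quasi-equilibrium
    sites = step(sites, eps=0.2, gamma=0.3, rng=rng)
occupancy = sum(sites) / len(sites)

# Expected equilibrium occupancy is gamma / (gamma + eps) = 0.6 here.
print(abs(occupancy - 0.6) < 0.2)        # True
obs = observe(sites, p=0.8, rng=rng)
print(sum(obs) <= sum(sites))            # True: detection misses colonies
```

The gap between `sum(obs)` and `sum(sites)` is precisely why the second approach in the paper estimates detection probability jointly with extinction and colonization.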
NASA Astrophysics Data System (ADS)
Canu, Michael; Duque, Mauricio; de Hosson, Cécile
2017-01-01
Engineering students on control courses lack a deep understanding of equilibrium and stability, which are crucial concepts in this discipline. Several studies have shown that students find it difficult to understand simple familiar or academic static equilibrium cases, as well as dynamic ones from mechanics, even if they know the discipline's criteria and formulae. Our aim is to study the impact of a specific and innovative classroom session containing well-chosen situations that address students' misconceptions. We propose an example of an Active Learning experiment, based both on the Didactical Engineering methodology and on Conceptual Fields Theory, that aims at promoting a conceptual change in students. The chosen methodology allows, at the same time, a proper design of the student learning activities and accurate monitoring of the students' reasoning during the tasks, and provides an internal tool for evaluating the session's efficiency. Although the expected initial conceptual change was detected, another activity would be required to reinforce it.
A Comprehensive Guide for Performing Sample Preparation and Top-Down Protein Analysis
Padula, Matthew P.; Berry, Iain J.; O′Rourke, Matthew B.; Raymond, Benjamin B.A.; Santos, Jerran; Djordjevic, Steven P.
2017-01-01
Methodologies for the global analysis of proteins in a sample, or proteome analysis, have been available since 1975 when Patrick O'Farrell published the first paper describing two-dimensional gel electrophoresis (2D-PAGE). This technique allowed the resolution of single protein isoforms, or proteoforms, into single 'spots' in a polyacrylamide gel, allowing the quantitation of changes in a proteoform's abundance to ascertain changes in an organism's phenotype when conditions change. In pursuit of the comprehensive profiling of the proteome, significant advances in technology have made the identification and quantitation of intact proteoforms from complex mixtures of proteins more routine, allowing analysis of the proteome from the 'Top-Down'. However, the number of proteoforms detected by Top-Down methodologies such as 2D-PAGE or mass spectrometry has not significantly increased since O'Farrell's paper when compared to Bottom-Up, peptide-centric techniques. This article explores and explains the numerous methodologies and technologies available to analyse the proteome from the Top-Down with a strong emphasis on the necessity to analyse intact proteoforms as a better indicator of changes in biology and phenotype. We arrive at the conclusion that the complete and comprehensive profiling of an organism's proteome is still, at present, beyond our reach but the continuing evolution of protein fractionation techniques and mass spectrometry brings comprehensive Top-Down proteome profiling closer. PMID:28387712
Durán, Gema M; Contento, Ana M; Ríos, Ángel
2013-11-01
Based on the highly sensitive fluorescence change of water-soluble CdSe/ZnS core-shell quantum dots (QDs) in the presence of the herbicide paraquat (PQ), a simple, rapid and reproducible methodology was developed to selectively determine PQ in water samples. The methodology enabled the use of a simple pretreatment procedure based on the water solubilization of CdSe/ZnS QDs with hydrophilic heterobifunctional thiol ligands, such as 3-mercaptopropionic acid (3-MPA), using microwave irradiation. The resulting water-soluble QDs exhibit strong fluorescence emission at 596 nm with high and reproducible photostability. The proposed analytical method thus satisfies the need for a simple, sensitive and rapid methodology to determine residues of paraquat in water samples, as required by the increasingly strict regulations for health protection introduced in recent years. The sensitivity of the method, expressed as a detection limit, was as low as 3.0 ng L(-1). The linear range was from 10 to 5×10(3) ng L(-1). RSD values in the range of 71-102% were obtained. The analytical applicability of the proposed method was demonstrated by analyzing water samples of different provenance. Copyright © 2013 Elsevier B.V. All rights reserved.
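A detection limit like the 3.0 ng L(-1) reported above is conventionally derived from a linear calibration: fit signal versus concentration, then take LOD = 3 × (standard deviation of the blank) / |slope|. The sketch below shows that recipe with an invented quenching-style calibration; the slope, blank noise, and standards are assumptions, not the paper's data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def detection_limit(sigma_blank, slope):
    """LOD = 3 * sd of the blank signal / |calibration slope|."""
    return 3.0 * sigma_blank / abs(slope)

conc = [0.0, 10.0, 50.0, 100.0]       # ng/L, hypothetical PQ standards
signal = [100.0, 98.0, 90.0, 80.0]    # fluorescence quenched as PQ rises
slope, intercept = fit_line(conc, signal)
print(round(detection_limit(sigma_blank=0.1, slope=slope), 2))  # 1.5
```

The negative slope reflects quenching: fluorescence decreases as paraquat concentration increases, and only its magnitude enters the LOD.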
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature-based approach that applies existing statistical methods and new and existing topological methods to high-resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large-area, high-resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large-scale analysis of change.
Spectral anomaly methods for aerial detection using KUT nuisance rejection
NASA Astrophysics Data System (ADS)
Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.
2015-06-01
This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges relative to ground-based detection. For one, larger and more rapid background fluctuations are typical, due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in background count rate, as well as spectral changes due to increased gamma-ray scatter when detecting at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization detailed here covers the adaptation of the spectral windows to aerial detection of a particular set of target sources and the tailoring to the specific detectors. The methodology and results are also shown for background rejection optimized for aerial gamma-ray detection using potassium, uranium and thorium (KUT) nuisance rejection. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection from altitude changes and geographically induced steps, such as at land-water interfaces.
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and is probabilistic in character. Its practical application requires a change in the rainfall models accepted as input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II model rainfall, are no longer sufficient. A methodology for standardized rainfall hyetographs that takes into consideration the temporal dynamics of local storm rainfall is therefore urgently needed. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic estimation of the necessary capacity of retention reservoirs.
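The classification step can be sketched as follows. This is a hypothetical illustration assuming dimensionless mass curves and plain k-means, not the authors' exact clustering procedure:

```python
import numpy as np

def standardize_hyetograph(depths, n_points=10):
    """Dimensionless mass curve: cumulative fraction of total rainfall
    depth, resampled at n_points equal fractions of the event duration."""
    cum = np.cumsum(depths) / np.sum(depths)
    t = np.linspace(0.0, 1.0, len(depths))
    return np.interp(np.linspace(0.0, 1.0, n_points), t, cum)

def kmeans(X, k=2, n_iter=50):
    """Plain k-means with farthest-point initialization; the cluster
    centroids serve as the synthetic (characteristic) hyetographs."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels
```

Standardizing removes event depth and duration, so clustering groups events purely by temporal shape (front-loaded vs. back-loaded, etc.).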
Quantifying Standing Dead Tree Volume and Structural Loss with Voxelized Terrestrial Lidar Data
NASA Astrophysics Data System (ADS)
Popescu, S. C.; Putman, E.
2017-12-01
Standing dead trees (SDTs) are an important forest component and impact a variety of ecosystem processes, yet the carbon pool dynamics of SDTs are poorly constrained in terrestrial carbon cycling models. The ability to model wood decay and carbon cycling in relation to detectable changes in tree structure and volume over time would greatly improve such models. The overall objective of this study was to provide automated aboveground volume estimates of SDTs and automated procedures to detect, quantify, and characterize structural losses over time with terrestrial lidar data. The specific objectives of this study were: 1) develop an automated SDT volume estimation algorithm providing accurate volume estimates for trees scanned in dense forests; 2) develop an automated change detection methodology to accurately detect and quantify SDT structural loss between subsequent terrestrial lidar observations; and 3) characterize the structural loss rates of pine and oak SDTs in southeastern Texas. A voxel-based volume estimation algorithm, "TreeVolX", was developed and incorporates several methods designed to robustly process point clouds of varying quality levels. The algorithm operates on horizontal voxel slices by segmenting the slice into distinct branch or stem sections then applying an adaptive contour interpolation and interior filling process to create solid reconstructed tree models (RTMs). TreeVolX estimated large and small branch volume with an RMSE of 7.3% and 13.8%, respectively. A voxel-based change detection methodology was developed to accurately detect and quantify structural losses and incorporated several methods to mitigate the challenges presented by shifting tree and branch positions as SDT decay progresses. The volume and structural loss of 29 SDTs, composed of Pinus taeda and Quercus stellata, were successfully estimated using multitemporal terrestrial lidar observations over elapsed times ranging from 71 - 753 days. 
Pine and oak structural loss rates were characterized by estimating the amount of volumetric loss occurring in 20 equal-interval height bins of each SDT. Results showed that large pine snags exhibited more rapid structural loss in comparison to medium-sized oak snags in this study.
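The voxel-based volume idea can be reduced to a toy sketch: occupy voxels with point-cloud returns, count distinct occupied voxels, and multiply by the voxel volume. TreeVolX itself does much more (slice segmentation, contour interpolation, interior filling to form solid models); this fragment only illustrates the counting step, and the function name is hypothetical:

```python
import numpy as np

def occupied_voxel_volume(points, voxel=0.02):
    """Crude volume estimate from an (n, 3) point cloud: the number of
    distinct occupied voxels times the volume of one voxel (cubic metres
    when coordinates are in metres)."""
    occupied = np.unique(np.floor(points / voxel).astype(int), axis=0)
    return occupied.shape[0] * voxel ** 3
```

Differencing two such voxelizations of the same tree at different dates gives a first-order estimate of structural loss, which is the quantity tracked over the 71-753 day observation windows.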
Enhancement of snow cover change detection with sparse representation and dictionary learning
NASA Astrophysics Data System (ADS)
Varade, D.; Dikshit, O.
2014-11-01
Sparse representation and decoding is often used for denoising images and for compressing images with respect to their inherent features. In this paper, we adopt a methodology incorporating sparse representation of a snow cover change map using a K-SVD-trained dictionary and sparse decoding to enhance the change map. Pixels often falsely characterized as "changes" are eliminated using this approach. The preliminary change map was generated using differenced NDSI or S3 maps for Resourcesat-2 and Landsat-8 OLI imagery, respectively. These maps are extracted into patches for compressed sensing using the Discrete Cosine Transform (DCT) to generate an initial dictionary, which is trained by the K-SVD approach. The trained dictionary is used for sparse coding of the change map using the Orthogonal Matching Pursuit (OMP) algorithm. The reconstructed change map incorporates a greater degree of smoothing and represents the features (snow cover changes) with better accuracy. The enhanced change map is segmented using k-means to discriminate between changed and non-changed pixels. The segmented enhanced change map is compared, first with the difference of Support Vector Machine (SVM) classified NDSI maps, and second with reference data generated as a mask by visual interpretation of the two input images. The methodology is evaluated using multi-spectral datasets from Resourcesat-2 and Landsat-8. The k-hat statistic is computed to determine the accuracy of the proposed approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, Daithi A.; Hansen, Gerrit
Despite being a well-established research field, the detection and attribution of observed climate change to anthropogenic forcing is not yet provided as a climate service. One reason for this is the lack of a methodology for performing tailored detection and attribution assessments on a rapid time scale. Here we develop such an approach, based on the translation of quantitative analysis into the “confidence” language employed in recent Assessment Reports of the Intergovernmental Panel on Climate Change. While its systematic nature necessarily ignores some nuances examined in detailed expert assessments, the approach nevertheless goes beyond most detection and attribution studies in considering contributors to building confidence such as errors in observational data products arising from sparse monitoring networks. When compared against recent expert assessments, the results of this approach closely match those of the existing assessments. Where there are small discrepancies, these variously reflect ambiguities in the details of what is being assessed, reveal nuances or limitations of the expert assessments, or indicate limitations of the accuracy of the sort of systematic approach employed here. Deployment of the method on 116 regional assessments of recent temperature and precipitation changes indicates that existing rules of thumb concerning the detectability of climate change ignore the full range of sources of uncertainty, most particularly the importance of adequate observational monitoring.
Suzuki, Masahiko; Mitoma, Hiroshi; Yoneyama, Mitsuru
2017-01-01
Long-term and objective monitoring is necessary for full assessment of the condition of patients with Parkinson's disease (PD). Recent advances in biotechnology have seen the development of various types of wearable (body-worn) sensor systems. By using accelerometers and gyroscopes, these devices can quantify motor abnormalities, including decreased activity and gait disturbances, as well as nonmotor signs, such as sleep disturbances and autonomic dysfunctions in PD. This review discusses methodological problems inherent in wearable devices. Until now, analysis of the mean values of motion-induced signals on a particular day has been widely applied in the clinical management of PD patients. On the other hand, the reliability of these devices to detect various events, such as freezing of gait and dyskinesia, has been less than satisfactory. Quantification of disease-specific changes rather than nonspecific changes is necessary.
Analysis and Implementation of Methodologies for the Monitoring of Changes in Eye Fundus Images
NASA Astrophysics Data System (ADS)
Gelroth, A.; Rodríguez, D.; Salvatelli, A.; Drozdowicz, B.; Bizai, G.
2011-12-01
We present a support system for change detection in fundus images of the same patient taken at different time intervals. This process is useful for monitoring pathologies lasting for long periods of time, as ophthalmologic pathologies usually are. We propose a flow of preprocessing, processing and postprocessing applied to a set of images, selected from a public database, that present pathological progression. A test interface was developed to select the images to be compared, apply the different methods developed, and display the results. We measure the system performance in terms of sensitivity, specificity and computation time. We have obtained good results: higher than 84% for the first two parameters, and processing times lower than 3 seconds for 512x512 pixel images. For the specific case of detection of changes associated with bleeding, the system responds with sensitivity and specificity over 98%.
NASA Astrophysics Data System (ADS)
Molinario, G.; Baraldi, A.; Altstatt, A. L.; Nackoney, J.
2011-12-01
The University of Maryland has been a USAID Central Africa Regional Program for the Environment (CARPE) cross-cutting partner for many years, providing remote-sensing-derived information on forest cover and forest cover changes in support of CARPE's objectives of diminishing forest degradation, forest loss and biodiversity loss resulting from poor or nonexistent land use planning strategies. Together with South Dakota State University, Congo Basin-wide maps have been provided that map forest cover loss at a maximum of 60 m resolution, using Landsat imagery and higher resolution imagery for algorithm training and validation. However, to better meet the needs within the CARPE Landscapes, which call for higher resolution, more accurate land cover change maps, UMD has been exploring the use of the SIAM automatic spectral-rule classifier together with pan-sharpened Landsat data (15 m resolution) and Very High Resolution imagery from various sources. The pilot project is being developed in collaboration with the African Wildlife Foundation in the Maringa Lopori Wamba CARPE Landscape. If successful, this methodology will make the creation of high resolution change maps faster and easier, making it accessible to other entities in the Congo Basin that need accurate land cover and land use change maps in order, for example, to create sustainable land use plans, conserve biodiversity and resources, and prepare Reducing Emissions from forest Degradation and Deforestation (REDD) Measurement, Reporting and Verification (MRV) projects. The paper describes the need for higher resolution land cover change maps that focus on forest change dynamics such as the cycling between primary forest, secondary forest, agriculture and other expanding and intensifying land uses in the Maringa Lopori Wamba CARPE Landscape in the Equateur Province of the Democratic Republic of Congo.
The methodology uses the SIAM automatic spectral-rule classifier for remote sensing imagery, together with pan-sharpened Landsat imagery at 15 m resolution and Very High Resolution imagery from different sensors, obtained from the Department of Defense database that was recently opened to NASA and its Earth Observation partners. Particular emphasis is placed on the detection of agricultural fields and their expansion into primary forests or intensification in secondary forests and fallow fields, as this is the primary driver of deforestation in this area. Fields in this area are also very small and irregular in shape, often partly obscured by neighboring forest canopy, hence the technical challenge of correctly detecting them and tracking them through time. Finally, the potential for using this methodology in other regions where information on land cover changes is needed for land use sustainability planning is also addressed.
Wave-Based Algorithms and Bounds for Target Support Estimation
2015-05-15
vector electromagnetic formalism in [5]. This theory leads to three main variants of the optical theorem detector, in particular, three alternative...further expands the applicability for transient pulse change detection of ar- bitrary nonlinear-media and time-varying targets [9]. This report... electromagnetic methods a new methodology to estimate the minimum convex source region and the (possibly nonconvex) support of a scattering target from knowledge of
Buildings Change Detection Based on Shape Matching for Multi-Resolution Remote Sensing Imagery
NASA Astrophysics Data System (ADS)
Abdessetar, M.; Zhong, Y.
2017-09-01
Buildings change detection has the ability to quantify the temporal effect, on urban areas, for urban evolution studies or damage assessment in disaster cases. In this context, change analysis might involve the utilization of the available satellite images with different resolutions for quick responses. In this paper, to avoid the resampling artifacts and salt-and-pepper effects of traditional methods, building change detection based on shape matching is proposed for multi-resolution remote sensing images. Since an object's shape can be extracted from remote sensing imagery and the shapes of corresponding objects in multi-scale images are similar, it is practical to detect building changes in multi-scale imagery using shape analysis. The proposed methodology can therefore deal with different pixel sizes when identifying new and demolished buildings in urban areas using geometric properties of the objects of interest. After rectifying the desired multi-date and multi-resolution images by image-to-image registration with an optimal RMS value, object-based image classification is performed to extract building shapes from the images. Next, Centroid-Coincident Matching is conducted on the extracted building shapes, based on the Euclidean distance between shape centroids (from shape T0 to shape T1 and vice versa), in order to define corresponding building objects. New and demolished buildings are then identified where the obtained distances are greater than the RMS value (no match at the same location).
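The centroid-matching rule described above can be sketched as follows; a minimal illustration assuming centroids have already been extracted from the classified building shapes (the function name and dictionary keys are ours):

```python
import numpy as np

def match_buildings(cent_t0, cent_t1, rms):
    """Bidirectional nearest-centroid matching: a T0 building with no T1
    centroid within `rms` is flagged demolished; a T1 building with no
    T0 centroid within `rms` is flagged new."""
    def unmatched(a, b):
        # pairwise Euclidean distances between the two centroid sets
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return [i for i in range(len(a)) if d[i].min() > rms]
    return {"demolished": unmatched(cent_t0, cent_t1),
            "new": unmatched(cent_t1, cent_t0)}
```

Running the match in both directions is what distinguishes demolitions (T0 centroid with no T1 partner) from new constructions (T1 centroid with no T0 partner).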
Calibration methodology for proportional counters applied to yield measurements of a neutron burst.
Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo
2014-01-01
This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
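The core of such a calibration can be reduced to a simple charge-division estimate; a deliberately simplified sketch (the paper's statistical model also propagates pile-up and counting statistics, which are omitted here):

```python
def burst_yield(total_charge, mean_event_charge, efficiency):
    """Estimate the number of source neutrons in a burst: the accumulated
    charge divided by the calibrated mean charge per detected neutron
    gives the detected count, which is then scaled by the absolute
    detection efficiency of the moderated counter."""
    detected = total_charge / mean_event_charge
    return detected / efficiency
```

The mean charge per detected event is the quantity obtained from the pulse-mode calibration of the counter.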
DeRocco, Vanessa; Anderson, Trevor; Piehler, Jacob; Erie, Dorothy A; Weninger, Keith
2010-11-01
To enable studies of conformational changes within multimolecular complexes, we present a simultaneous, four-color single molecule fluorescence methodology implemented with total internal reflection illumination and camera-based, wide-field detection. We further demonstrate labeling histidine-tagged proteins noncovalently with Tris-nitrilotriacetic acid (Tris-NTA)-conjugated dyes to achieve single molecule detection. We combine these methods to colocalize the mismatch repair protein MutSα on DNA while monitoring MutSα-induced DNA bending using Förster resonance energy transfer (FRET) and to monitor assembly of membrane-tethered SNARE protein complexes.
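The FRET readout used to monitor MutSα-induced DNA bending follows the standard distance dependence; a one-line reminder, not code from the paper:

```python
def fret_efficiency(r, r0):
    """Foerster resonance energy transfer efficiency for donor-acceptor
    distance r and Foerster radius r0: E = 1 / (1 + (r/r0)**6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)
```

Bending of the DNA shortens the donor-acceptor distance and therefore raises the measured efficiency, which is what makes FRET a conformational-change reporter.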
3D/4D multiscale imaging in acute lymphoblastic leukemia cells: visualizing dynamics of cell death
NASA Astrophysics Data System (ADS)
Sarangapani, Sreelatha; Mohan, Rosmin Elsa; Patil, Ajeetkumar; Lang, Matthew J.; Asundi, Anand
2017-06-01
Quantitative phase detection is a new methodology that provides quantitative information on cellular morphology to monitor cell status, drug response and toxicity. In this paper, the morphological changes in acute leukemia cells treated with chitosan were detected using d'Bioimager, a robust imaging system. A quantitative phase image of the cells was obtained with numerical analysis. Results show that the average area and optical volume of the chitosan-treated cells are significantly reduced compared with the control cells, which reveals the effect of chitosan on the cancer cells. These results indicate that d'Bioimager can be used as a non-invasive imaging alternative for measuring the morphological changes of living cells in real time.
Saneiro, Mar; Salmeron-Majadas, Sergio
2014-01-01
We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expression and body movements that conform to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. These data altogether are to train data mining algorithms that serve to automatically identify changes in the learners' affective states when dealing with cognitive tasks which help to provide emotional personalized support. PMID:24892055
Real time testing of intelligent relays for synchronous distributed generation islanding detection
NASA Astrophysics Data System (ADS)
Zhuang, Davy
As electric power systems continue to grow to meet ever-increasing energy demand, their security, reliability, and sustainability requirements also become more stringent. The deployment of distributed energy resources (DER), including generation and storage, in conventional passive distribution feeders gives rise to integration problems involving protection and unintentional islanding. Distributed generators need to be disconnected for safety reasons when islanded or isolated from the main feeder, as distributed generator islanding may create hazards to utility and third-party personnel, and possibly damage the distribution system infrastructure, including the distributed generators. This thesis compares several key performance indicators of a newly developed intelligent islanding detection relay against islanding detection devices currently used by the industry. The intelligent relay employs multivariable analysis and data mining methods to arrive at decision trees that contain both the protection handles and the settings. A test methodology is developed to assess the performance of these intelligent relays in a real-time simulation environment using a generic model based on a real-life distribution feeder. The methodology demonstrates the applicability and potential advantages of the intelligent relay by running a large number of tests reflecting a multitude of system operating conditions. The testing indicates that the intelligent relay often outperforms the frequency, voltage and rate-of-change-of-frequency relays currently used for islanding detection, while respecting the islanding detection time constraints imposed by standing distributed generator interconnection guidelines.
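One of the baseline devices the intelligent relay is compared against, the rate-of-change-of-frequency (ROCOF) relay, reduces to a simple rule; an illustrative sketch with a made-up threshold, not the tested relay settings:

```python
def rocof_trips(freq_samples, dt, threshold=1.0):
    """Return True if the absolute rate of change of frequency between
    any two consecutive samples exceeds `threshold` (Hz/s) - the classic
    passive islanding-detection criterion."""
    return any(abs(f1 - f0) / dt > threshold
               for f0, f1 in zip(freq_samples, freq_samples[1:]))
```

When a distributed generator islands with a load mismatch, the local frequency drifts rapidly and the criterion fires; the decision-tree relay replaces this single handle with many jointly mined ones.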
Dobrovolsky, Vasily N; Revollo, Javier; Petibone, Dayton M; Heflich, Robert H
2017-01-01
The Pig-a assay is being developed as an in vivo gene mutation assay for regulatory safety assessments. The assay is based on detecting mutation in the endogenous Pig-a gene of treated rats by using flow cytometry to measure changes in cell surface markers of peripheral blood cells. Here we present a methodology for demonstrating that phenotypically mutant rat T-cells identified by flow cytometry contain mutations in the Pig-a gene, an important step for validating the assay. In our approach, the mutant phenotype T-cells are sorted into individual wells of 96-well plates and expanded into clones. Subsequent sequencing of genomic DNA from the expanded clones confirms that the Pig-a assay detects exactly what it claims to detect-cells with mutations in the endogenous Pig-a gene. In addition, determining the spectra of Pig-a mutations provides information for better understanding the mutational mechanism of compounds of interest. Our methodology of combining phenotypic antibody labeling, magnetic enrichment, sorting, and single-cell clonal expansion can be used in genotoxicity/mutagenicity studies and in other general immunotoxicology research requiring identification, isolation, and expansion of extremely rare subpopulations of T-cells.
Select Methodology for Validating Advanced Satellite Measurement Systems
NASA Technical Reports Server (NTRS)
Larar, Allen M.; Zhou, Daniel K.; Liu, Xi; Smith, William L.
2008-01-01
Advanced satellite sensors are tasked with improving global measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring capability, and environmental change detection. Measurement system validation is crucial to achieving this goal and maximizing research and operational utility of resultant data. Field campaigns including satellite under-flights with well calibrated FTS sensors aboard high-altitude aircraft are an essential part of the validation task. This presentation focuses on an overview of validation methodology developed for assessment of high spectral resolution infrared systems, and includes results of preliminary studies performed to investigate the performance of the Infrared Atmospheric Sounding Interferometer (IASI) instrument aboard the MetOp-A satellite.
NASA Astrophysics Data System (ADS)
Berrocoso, M.; Fernandez-Ros, A.; Prates, G.; Martin, M.; Hurtado, R.; Pereda, J.; Garcia, M. J.; Garcia-Cañada, L.; Ortiz, R.; Garcia, A.
2012-04-01
Surface deformation has been an essential parameter for tracking the onset and evolution of the eruptive process on the island of El Hierro (October 2011), as well as for forecasting changes in seismic and volcanic activity during the crisis period. From GNSS-GPS observations, the reactivation was detected early by analyzing changes in the deformation relative to the regional geodynamics of El Hierro. Surface deformation changes were detected before the occurrence of seismic activity using the station FRON (GRAFCAN). The evolution of the process has been studied through the analysis of time series of topocentric coordinates and of the variation of the distance between stations on the island of El Hierro (GRAFCAN station; IGN network; and UCA-CSIC points) and the LPAL-IGS station on the island of La Palma. In this work the main methodologies and their results are shown: •The location (and its changes) of the lithospheric pressure source, obtained by applying the Mogi model. •Kalman filtering of high-frequency time series, used to make the forecasts issued for volcanic emergency management. •Correlations between the deformation at the different GPS stations and their relationship with seismovolcanic settings.
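The Mogi point-source model mentioned above has a closed form for surface displacement; a standard textbook sketch (elastic half-space, point pressure source), not the authors' inversion code:

```python
import numpy as np

def mogi_surface_displacement(r, depth, dV, nu=0.25):
    """Radial and vertical surface displacement of a Mogi point source:
    u_r = (1 - nu) * dV * r / (pi * (r^2 + d^2)^(3/2))
    u_z = (1 - nu) * dV * d / (pi * (r^2 + d^2)^(3/2))
    with r the horizontal distance from the source axis, d the source
    depth, dV the source volume change and nu Poisson's ratio."""
    r = np.asarray(r, dtype=float)
    R3 = (r ** 2 + depth ** 2) ** 1.5
    c = (1.0 - nu) * dV / np.pi
    return c * r / R3, c * depth / R3
```

Inverting these expressions against GPS-measured displacements at several stations is what locates the pressure source and tracks its migration.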
Torsional Ultrasound Sensor Optimization for Soft Tissue Characterization
Melchor, Juan; Muñoz, Rafael; Rus, Guillermo
2017-01-01
Torsion mechanical waves have the capability to characterize the shear stiffness moduli of soft tissue. Under this hypothesis, a computational methodology is proposed to design and optimize a piezoelectric-based transmitter and receiver that generate and measure the response of torsional ultrasonic waves. The procedure is divided into two steps: (i) a finite element method (FEM) model is developed to obtain the transmitted and received waveforms, as well as the resonance frequency, of a preliminary geometry validated against a simplified semi-analytical model; and (ii) a probabilistic design optimality criterion, based on an inverse problem estimating the robust probability of detection (RPOD), is applied to maximize the detection of pathology defined in terms of changes in shear stiffness. The study compares design options in two separate models, for transmission and contact, respectively. The main contribution of this work is a framework establishing the forward, inverse and optimization procedures used to choose an appropriate set of transducer parameters. This methodological framework may be generalizable to other applications. PMID:28617353
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROIs). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks that deals with the difficulties caused by non-uniform heating, which affects the low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology, based on multi-resolution analysis, is robust to low ROI contrast and non-uniform heating, and includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because the thermal behavior is well modeled by smooth Gaussian contours. The Gaussian scale is used to analyze details in the image through multi-resolution analysis, avoiding problems of low contrast and non-uniform heating as well as manual selection of the Gaussian window size. Finally, local edge detection provides a good estimate of the ROI boundaries. The resulting methodology for ROI extraction performs as well as or better than the dedicated algorithms proposed in the state of the art.
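The local-correlation step can be sketched as a normalized cross-correlation of the thermogram with a Gaussian template; a slow reference implementation for illustration only (the paper's pipeline adds Gaussian scale analysis and edge detection on top, and the function names are ours):

```python
import numpy as np

def gaussian_template(size=9, sigma=2.0):
    """size x size Gaussian window used as the ROI template."""
    ax = np.arange(size) - size // 2
    return np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * sigma ** 2))

def roi_score_map(img, size=9, sigma=2.0):
    """Normalized cross-correlation of each size x size window with the
    Gaussian template; high scores mark candidate ROI centres."""
    g = gaussian_template(size, sigma)
    gz = (g - g.mean()) / g.std()
    h, w = img.shape
    out = np.zeros((h - size + 1, w - size + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            p = img[i:i + size, j:j + size]
            s = p.std()
            # zero-mean, unit-variance windows make the score invariant
            # to the slowly varying offset caused by non-uniform heating
            out[i, j] = ((p - p.mean()) / s * gz).mean() if s > 1e-12 else 0.0
    return out
```

Because each window is standardized before correlation, a low-contrast hot spot scores as highly as a bright one, which is the stated robustness property.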
NASA Astrophysics Data System (ADS)
Kranz, Olaf; Lang, Stefan; Schoepfer, Elisabeth
2017-09-01
Mining natural resources serves fundamental societal needs or commercial interests, but it may well turn into a driver of violence and regional instability. In this study, very high resolution (VHR) optical stereo satellite data are analysed to monitor processes and changes in one of the largest artisanal and small-scale mining sites in the Democratic Republic of the Congo, which is among the world's wealthiest countries in exploitable minerals. To identify the subtle structural changes, the applied methodological framework employs object-based change detection (OBCD) on optical VHR data and generated digital surface models (DSM). Results show that the DSM-based change detection approach enhances the assessment gained from 2D analysis alone by providing valuable information about changes in surface structure and volume. Land cover changes as analysed by OBCD reveal an increase in bare soil area at a rate of 47% between April 2010 and September 2010, followed by a significant decrease of 47.5% until March 2015. Beyond that, DSM differencing enabled the characterisation of small-scale features such as pits and excavations. The presented Earth observation (EO)-based monitoring of mineral exploitation aims at a better understanding of the relations between resource extraction and conflict, thus providing relevant information for potential mitigation strategies and peace building.
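The DSM-differencing step can be sketched as follows; a minimal illustration with an arbitrary noise threshold, not the study's processing chain:

```python
import numpy as np

def dsm_volume_change(dsm_t0, dsm_t1, cell_area, min_dz=0.5):
    """Net volume change between two co-registered digital surface
    models: height differences below the noise threshold `min_dz`
    (metres) are ignored, the rest are summed and scaled by the ground
    area of one raster cell."""
    dz = dsm_t1 - dsm_t0
    dz = np.where(np.abs(dz) < min_dz, 0.0, dz)
    return float(dz.sum() * cell_area)
```

Negative cell differences correspond to excavations and pits, positive ones to spoil heaps, so the sign pattern of `dz` is what characterises the small-scale mining features.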
Alotaibi, Madawi; Long, Toby; Kennedy, Elizabeth; Bavishi, Siddhi
2014-01-01
The purpose of this study was to review published research on the use of the Gross Motor Function Measure (GMFM-88) and (GMFM-66) as outcome measures to determine if these tools detect changes in gross motor function in children with cerebral palsy (CP) undergoing interventions. A comprehensive literature search was conducted using Medline and PubMed to identify studies published from January 2000 through January 2011 that reported the accuracy of the GMFM-88 and GMFM-66 in measuring change over time in children with CP undergoing interventions. The keywords used for the search were "GMFM" and "CP". Two of the authors (M.A. and S.B.) reviewed the titles and abstracts found in the databases. The methodological quality of the studies was assessed using the Critical Review Form-Quantitative Studies. Of 62 papers initially identified, 21 studies fulfilled the inclusion criteria. These articles comprise three longitudinal studies, six randomized controlled trials, four repeated-measures designs, six pre-post test designs, a case series and one non-randomized prospective study. The included studies were generally of moderate to high methodological quality. The studies included children from a wide age range of 10 months to 16 years. According to the National Health and Medical Research Council, the study designs were level II, III-2, III-3 and IV. The review suggests that the GMFM-88 and GMFM-66 are useful as outcome measures to detect changes in gross motor function in children with CP undergoing interventions. Implications for Rehabilitation Accurate measurement of change in gross motor skill acquisition is important to determine the effectiveness of intervention programs in children with cerebral palsy (CP). The Gross Motor Function Measure (GMFM-88 and GMFM-66) are common tools used by rehabilitation specialists to measure gross motor function in children with CP.
The GMFM appears to be an effective outcome tool for measuring change in gross motor function according to a small number of randomized controlled studies utilizing convenience samples.
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is usually defined by qualitative assessment (full, flat, ptotic) or by estimates, such as volume or distances between reference points, that cannot describe it reliably. We quantitatively describe breast shape with two parameters derived from a statistical methodology termed principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). The methodology was tested by two operators on a preoperative and postoperative surgical case and in a test-retest setting. The first two principal components derived from PCA are able to characterize the shape of the breasts included in the dataset. The test-retest demonstrated that different operators obtain very similar PCA values. The system is also able to identify major changes between the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test-retest validation. Once developed and after further validation, this methodology could be employed as a tool for outcome evaluation, auditing, and benchmarking.
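The underlying computation can be sketched with stand-in data: flattened shape vectors are mean-centred and the first two principal components are read off an SVD. The dataset dimensions and random values below are invented, not the study's scans:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dataset: 20 breast surface scans, each flattened to a
# 300-dimensional vector of aligned 3-D vertex coordinates (100 points x 3).
shapes = rng.normal(size=(20, 300))

# PCA via SVD of the mean-centred data matrix.
centred = shapes - shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ Vt[:2].T        # principal components 1 and 2 per scan

# Each scan is now a point (PC1, PC2) on a Cartesian plane, as in the paper.
print(scores.shape)  # (20, 2)
```

Because singular values are returned in descending order, PC1 captures the largest share of shape variance and PC2 the next largest, which is what makes the two-parameter description compact.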
Weight status and the perception of body image in men
Gardner, Rick M
2014-01-01
Understanding the role of body size in relation to the accuracy of body image perception in men is an important topic because of the implications for avoiding and treating obesity, and it may serve as a potential diagnostic criterion for eating disorders. The early research on this topic produced mixed findings. About one-half of the early studies showed that obese men overestimated their body size, with the remaining half providing accurate estimates. Later, improvements in research technology and methodology provided a clearer indication of the role of weight status in body image perception. Research in our laboratory has also produced diverse findings, including that obese subjects sometimes overestimate their body size. However, when examining our findings across several studies, obese subjects had about the same level of accuracy in estimating their body size as normal-weight subjects. Studies in our laboratory also permitted the separation of sensory and nonsensory factors in body image perception. In all but one instance, no differences were found between the ability of obese and normal-weight subjects to detect overall changes in body size. Importantly, however, obese subjects are better at detecting changes in their body size when the image is distorted to be too thin as compared to too wide. Both obese and normal-weight men require about a 3%–7% change in the width of their body size in order to detect the change reliably. Correlations between a range of body mass index values and body size estimation accuracy indicated no relationship between these variables. Numerous studies in other laboratories asked men to place their body size into discrete categories, ranging from thin to obese. Researchers found that overweight and obese men underestimate their weight status, and that men are less accurate in their categorizations than are women.
Cultural influences have been found to be important, with body size underestimations occurring in cultures where a larger body is found to be desirable. Methodological issues are reviewed with recommendations for future studies. PMID:25114606
Review of methodology and technology available for the detection of extrasolar planetary systems
NASA Technical Reports Server (NTRS)
Tarter, J. C.; Black, D. C.; Billingham, J.
1985-01-01
Four approaches exist for the detection of extrasolar planets. In the only direct method, the planet is imaged at some wavelength in a manner which makes it possible to differentiate its own feeble luminosity (internal energy source plus reflected starlight) from that of the nearby host star. The three indirect methods involve the detection of a planetary-mass companion on the basis of the observable effects it has on the host star. A search is conducted for regular, periodic changes in the stellar spatial motion (astrometric method), in the Doppler shifts of stellar line spectra (spectroscopic method), or in the apparent total stellar luminosity (photometric method). Details regarding the approaches employed for implementing the considered methods are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... DEPARTMENT OF COMMERCE International Trade Administration Methodological Change for Implementation..., the Department of Commerce (``the Department'') will implement a methodological change to reduce... administrative reviews involving merchandise from the PRC and Vietnam. Methodological Change In antidumping duty...
Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki
2014-01-01
Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
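The paper's case model and distance measure are not reproduced here; the following is a generic sketch of distance-based outlier flagging in a multi-dimensional index space, with invented hospital-case numbers and a deliberately injected anomaly:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical financial indices for 30 hospital cases (e.g. profitability
# and income-structure ratios), with one anomalous case appended.
cases = rng.normal(loc=0.0, scale=1.0, size=(30, 3))
cases = np.vstack([cases, [8.0, 8.0, 8.0]])  # injected outlier, index 30

# Mean Euclidean distance from each case to all other cases.
d = np.linalg.norm(cases[:, None, :] - cases[None, :, :], axis=2)
mean_dist = d.sum(axis=1) / (len(cases) - 1)

# Flag cases whose mean distance exceeds a simple cut-off (assumed rule).
cut = mean_dist.mean() + 3 * mean_dist.std()
outliers = np.where(mean_dist > cut)[0]
print(outliers)  # [30]
```

Cases whose profitability or income structure differs markedly from their diagnosis group sit far from the bulk of cases in this space and so receive large mean distances.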
Using State Estimation Residuals to Detect Abnormal SCADA Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Chen, Yousu; Huang, Zhenyu
2010-04-30
Detection of abnormal supervisory control and data acquisition (SCADA) data is critically important for safe and secure operation of modern power systems. In this paper, a methodology for abnormal SCADA data detection based on state estimation residuals is presented. Preceded by a brief overview of outlier detection methods and bad SCADA data detection for state estimation, the framework of the proposed methodology is described. Instead of using original SCADA measurements as the bad data sources, the residuals calculated from the results of the state estimator are used as the input for the outlier detection algorithm. The BACON algorithm is applied to the outlier detection task. The IEEE 118-bus system is used as a test base to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
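BACON itself iteratively grows an outlier-free basic subset; as a minimal stand-in, here is the 3-σ baseline that the paper compares against, applied to synthetic residuals (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical state-estimation residuals: mostly Gaussian noise, with a
# few gross errors standing in for bad SCADA measurements.
residuals = rng.normal(0.0, 1.0, size=200)
residuals[[10, 50, 120]] += np.array([9.0, -8.0, 10.0])

# Classic 3-sigma rule on the residuals; note that the gross errors also
# inflate the estimated sigma, which is a weakness BACON is designed to avoid.
mu, sigma = residuals.mean(), residuals.std()
flagged = np.where(np.abs(residuals - mu) > 3 * sigma)[0]
print(sorted(flagged))
```

Working on residuals rather than raw measurements, as the paper proposes, means each value is already referenced against the state estimator's physically consistent prediction, so gross errors stand out more cleanly.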
Benedek, C; Descombes, X; Zerubia, J
2012-01-01
In this paper, we introduce a new probabilistic method which integrates building extraction with change detection in remotely sensed image pairs. A global optimization process attempts to find the optimal configuration of buildings, considering the observed data, prior knowledge, and interactions between the neighboring building parts. We present methodological contributions in three key issues: 1) We implement a novel object-change modeling approach based on Multitemporal Marked Point Processes, which simultaneously exploits low-level change information between the time layers and object-level building description to recognize and separate changed and unaltered buildings. 2) To answer the challenges of data heterogeneity in aerial and satellite image repositories, we construct a flexible hierarchical framework which can create various building appearance models from different elementary feature-based modules. 3) To simultaneously ensure the convergence, optimality, and computational complexity constraints raised by the increased data quantity, we adopt the quick Multiple Birth and Death optimization technique for change detection purposes, and propose a novel nonuniform stochastic object birth process which generates relevant objects with higher probability based on low-level image features.
Object-Based Change Detection Using High-Resolution Remotely Sensed Data and GIS
NASA Astrophysics Data System (ADS)
Sofina, N.; Ehlers, M.
2012-08-01
High resolution remotely sensed images provide current, detailed, and accurate information for large areas of the Earth's surface which can be used for change detection analyses. Conventional methods of image processing permit detection of changes by comparing remotely sensed multitemporal images. However, for performing a successful analysis it is desirable to take images from the same sensor which should be acquired at the same time of season, at the same time of day, and, for electro-optical sensors, in cloudless conditions. Thus, a change detection analysis could be problematic, especially for sudden catastrophic events. A promising alternative is the use of vector-based maps containing information about the original urban layout, which can be related to a single image obtained after the catastrophe. The paper describes a methodology for an object-based search for destroyed buildings as a consequence of a natural or man-made catastrophe (e.g., earthquakes, flooding, civil war). The analysis is based on remotely sensed and vector GIS data. It includes three main steps: (i) generation of features describing the state of buildings; (ii) classification of building conditions; and (iii) data import into a GIS. One of the proposed features is a newly developed 'Detected Part of Contour' (DPC). Additionally, several features based on the analysis of textural information corresponding to the investigated vector objects are calculated. The method is applied to remotely sensed images of areas that have been subjected to an earthquake. The results show the high reliability of the DPC feature as an indicator for change.
NASA Astrophysics Data System (ADS)
Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan
2018-03-01
False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, in this paper a false alarm aware methodology is presented to reduce the false alarm rate while the detection rate remains undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of the false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and the performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is expandable to any pair of detection algorithms which have different false alarm sources.
Optimization of a Viability PCR Method for the Detection of Listeria monocytogenes in Food Samples.
Agustí, Gemma; Fittipaldi, Mariana; Codony, Francesc
2018-06-01
Rapid detection of Listeria and other microbial pathogens in food is an essential part of quality control and is critical for ensuring the safety of consumers. Culture-based methods for detecting foodborne pathogens are time-consuming, laborious and cannot detect viable but non-culturable microorganisms, whereas viability PCR methodology provides quick results, is able to detect viable but non-culturable cells, and allows for easier handling of large numbers of samples. Although the most critical point in using the viability PCR technique is achieving the complete exclusion of dead-cell amplification signals, many improvements are being introduced to overcome this limitation. In the present work, the yield of dead-cell DNA neutralization was enhanced by incorporating two new sample treatment strategies: a tube change combined with a double light treatment. This procedure was successfully tested using artificially contaminated food samples, showing improved neutralization of dead-cell DNA.
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect that the existing database will portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
Van Pamel, Anton; Brett, Colin R; Lowe, Michael J S
2014-12-01
Improving the ultrasound inspection capability for coarse-grained metals remains of longstanding interest and is expected to become increasingly important for next-generation electricity power plants. Conventional ultrasonic A-, B-, and C-scans have been found to suffer from strong background noise caused by grain scattering, which can severely limit the detection of defects. However, in recent years, array probes and full matrix capture (FMC) imaging algorithms have unlocked exciting possibilities for improvements. To improve and compare these algorithms, we must rely on robust methodologies to quantify their performance. This article proposes such a methodology to evaluate the detection performance of imaging algorithms. For illustration, the methodology is applied to some example data using three FMC imaging algorithms: the total focusing method (TFM), phase-coherent imaging (PCI), and decomposition of the time-reversal operator with multiple scattering filter (DORT MSF). However, it is important to note that this is solely to illustrate the methodology; this article does not attempt the broader investigation of different cases that would be needed to compare the performance of these algorithms in general. The methodology considers the statistics of detection, presenting the detection performance as probability of detection (POD) and probability of false alarm (PFA). A test sample of coarse-grained nickel super alloy, manufactured to represent materials used for future power plant components and containing some simple artificial defects, is used to illustrate the method on the candidate algorithms. The data are captured in pulse-echo mode using 64-element array probes at center frequencies of 1 and 5 MHz. In this particular case, it turns out that all three algorithms are shown to perform very similarly when comparing their flaw detection capabilities.
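A POD/PFA evaluation of this kind amounts to sweeping a detection threshold over image amplitudes at known flaw locations versus flaw-free (grain noise) locations. The amplitude distributions below are invented for illustration, not measured data:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical image amplitudes (arbitrary linear units): background
# grain-noise pixels and pixels at known artificial defect locations.
noise = rng.normal(1.0, 0.3, size=1000)     # background speckle
flaws = rng.normal(2.5, 0.4, size=40)       # responses at seeded defects

def pod_pfa(threshold):
    pod = np.mean(flaws >= threshold)       # probability of detection
    pfa = np.mean(noise >= threshold)       # probability of false alarm
    return pod, pfa

for t in (1.5, 2.0):
    pod, pfa = pod_pfa(t)
    print(f"threshold {t}: POD={pod:.2f}, PFA={pfa:.3f}")
```

Sweeping the threshold traces out an ROC-style curve; an algorithm that suppresses grain noise better separates the two distributions and achieves higher POD at any fixed PFA.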
Bearing damage assessment using Jensen-Rényi Divergence based on EEMD
NASA Astrophysics Data System (ADS)
Singh, Jaskaran; Darpe, A. K.; Singh, S. P.
2017-03-01
An Ensemble Empirical Mode Decomposition (EEMD) and Jensen-Rényi divergence (JRD) based methodology is proposed for the degradation assessment of rolling element bearings using vibration data. The EEMD decomposes vibration signals into a set of intrinsic mode functions (IMFs). A systematic methodology to select IMFs that are sensitive and closely related to the fault is proposed in the paper. The change in probability distribution of the energies of the sensitive IMFs is measured through the JRD, which acts as a damage identification parameter. Evaluation of the JRD with sensitive IMFs makes it largely unaffected by changes and fluctuations in operating conditions. Further, an algorithm based on Chebyshev's inequality is applied to the JRD to identify exact points of change in bearing health and remove outliers. The identified change points are investigated for fault classification as possible locations where specific defect initiation could have taken place. For fault classification, two new parameters are proposed: 'α value' and Probable Fault Index, which together classify the fault. To standardize the degradation process, a Confidence Value parameter is proposed to quantify the bearing degradation value in a range of zero to unity. A simulation study is first carried out to demonstrate the robustness of the proposed JRD parameter under variable operating conditions of load and speed. The proposed methodology is then validated on experimental data (seeded defect data and accelerated bearing life test data). The first validation, on two different vibration datasets (inner/outer) obtained from seeded defect experiments, demonstrates the effectiveness of the JRD parameter in detecting a change in health state as the severity of fault changes. The second validation is on two accelerated life tests. The results demonstrate the proposed approach as a potential tool for bearing performance degradation assessment.
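A minimal sketch of the JRD computation on IMF energy distributions follows; the Rényi order (α = 2), equal weights, and the energy values are illustrative choices, not taken from the paper:

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    # Rényi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha)
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(dists, weights=None, alpha=2.0):
    # JRD = H_alpha(weighted mixture) - weighted mean of the H_alpha values
    dists = np.asarray(dists, dtype=float)
    if weights is None:
        weights = np.full(len(dists), 1.0 / len(dists))
    mix = weights @ dists
    entropies = np.array([renyi_entropy(p, alpha) for p in dists])
    return renyi_entropy(mix, alpha) - weights @ entropies

# Hypothetical normalised IMF energy distributions: a healthy baseline and a
# later snapshot where energy has migrated towards fault-related IMFs.
healthy = [0.50, 0.30, 0.15, 0.05]
faulty  = [0.10, 0.15, 0.30, 0.45]

print(jensen_renyi([healthy, healthy]))  # identical distributions -> 0.0
print(jensen_renyi([healthy, faulty]))   # divergence grows as energy shifts
```

Tracking this divergence between a baseline energy distribution and each new snapshot yields a scalar that rises as the bearing degrades, which is the role the JRD plays in the proposed methodology.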
Attributing Changing Rates of Temperature Record Breaking to Anthropogenic Influences
NASA Astrophysics Data System (ADS)
King, Andrew D.
2017-11-01
Record-breaking temperatures attract attention from the media, so understanding how and why the rate of record breaking is changing may be useful in communicating the effects of climate change. A simple methodology designed for estimating the anthropogenic influence on rates of record breaking in a given time series is proposed here. The frequency of hot and cold record-breaking temperature occurrences is shown to be changing due to the anthropogenic influence on the climate. Using ensembles of model simulations with and without human-induced forcings, it is demonstrated that the effect of climate change on global record-breaking temperatures can be detected as far back as the 1930s. On local scales, a climate change signal is detected more recently at most locations. The anthropogenic influence on the increased occurrence of hot record-breaking temperatures is clearer than it is for the decreased occurrence of cold records. The approach proposed here could be applied in rapid attribution studies of record extremes to quantify the influence of climate change on the rate of record breaking in addition to the climate anomaly being studied. This application is demonstrated for the global temperature record of 2016 and the Central England temperature record in 2014.
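The core statistical idea can be sketched with a toy simulation: in a stationary series the n-th year is a hot record with probability 1/n, while an imposed warming trend inflates the count. The trend size, noise level, and ensemble size below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def count_hot_records(series):
    # A year sets a hot record if it exceeds every preceding year.
    running_max = np.maximum.accumulate(series)
    return 1 + int(np.sum(series[1:] > running_max[:-1]))

years = 100
# 2000 synthetic century-long annual temperature anomaly series (degC).
stationary = rng.normal(0.0, 0.5, size=(2000, years))
warming = stationary + np.linspace(0.0, 2.0, years)  # assumed +2 degC/century

m_stat = np.mean([count_hot_records(s) for s in stationary])
m_warm = np.mean([count_hot_records(s) for s in warming])
print(m_stat)  # ~ sum_{n=1..100} 1/n, i.e. about 5.2 records per century
print(m_warm)  # clearly more records under the imposed trend
```

Comparing observed record counts against the stationary expectation (or against forced and unforced model ensembles, as in the paper) is what allows the anthropogenic contribution to the record-breaking rate to be detected.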
NASA Astrophysics Data System (ADS)
Drzewiecki, Wojciech
2017-12-01
We evaluated the performance of nine machine learning regression algorithms and their ensembles for sub-pixel estimation of impervious area coverage from Landsat imagery. The accuracy of imperviousness mapping at individual time points was assessed based on RMSE, MAE and R2. These measures were also used for the assessment of imperviousness change intensity estimations. The applicability for detection of relevant changes in impervious area coverage at the sub-pixel level was evaluated using overall accuracy, F-measure and ROC Area Under Curve. The results proved that the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates. Stochastic gradient boosting of regression trees (GBM) may also be considered for this purpose. However, the Random Forest algorithm is endorsed for both imperviousness change detection and mapping of its intensity. In all applications the heterogeneous model ensembles performed at least as well as the best individual models or better. They may be recommended for improving the quality of sub-pixel imperviousness and imperviousness change mapping. The study also revealed limitations of the investigated methodology for detection of subtle changes of imperviousness inside the pixel. None of the tested approaches was able to reliably classify changed and non-changed pixels if the relevant change threshold was set at one or three percent. Also, for the five percent change threshold most of the algorithms did not ensure that the accuracy of the change map is higher than the accuracy of a random classifier. For the threshold of relevant change set at ten percent all approaches performed satisfactorily.
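The per-date accuracy measures named above are standard regression metrics; as a sketch with invented imperviousness fractions rather than Landsat-derived estimates:

```python
import numpy as np

# Hypothetical reference vs estimated sub-pixel imperviousness fractions.
y_true = np.array([0.00, 0.10, 0.35, 0.60, 0.80, 1.00])
y_pred = np.array([0.05, 0.12, 0.30, 0.55, 0.85, 0.95])

err = y_pred - y_true
rmse = np.sqrt(np.mean(err**2))                                # root mean square error
mae = np.mean(np.abs(err))                                     # mean absolute error
r2 = 1 - np.sum(err**2) / np.sum((y_true - y_true.mean())**2)  # coefficient of determination

print(f"RMSE={rmse:.3f} MAE={mae:.3f} R2={r2:.3f}")
```

For change detection the same per-pixel differences are thresholded into changed/non-changed classes, which is where overall accuracy, F-measure and ROC AUC come in.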
BATSE gamma-ray burst line search. 2: Bayesian consistency methodology
NASA Technical Reports Server (NTRS)
Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.
1994-01-01
We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
Hedrick, S C; Rothman, M L; Chapko, M; Inui, T S; Kelly, J R; Ehreth, J
1991-01-01
The Adult Day Health Care Evaluation Study was developed in response to a congressional mandate to study the medical efficacy and cost effectiveness of the Adult Day Health Care (ADHC) effort in the Department of Veterans Affairs (VA). Four sites providing ADHC in VA facilities are participating in an ongoing randomized controlled trial. Three years of developmental work prior to the study addressed methodological issues that were problematic in previous studies. This developmental work resulted in the methodological approaches described here: (1) a patient recruitment process that actively recruits and screens all potential candidates using empirically developed admission criteria based on predictors of nursing home placement in VA; (2) the selection and development of measures of medical efficacy that assess a wide range of patient and caregiver outcomes with sufficient sensitivity to detect small but clinically important changes; and (3) methods for detailed, accurate, and efficient measurement of utilization and costs of health care within and outside VA. These approaches may be helpful to other researchers and may advance the methodological sophistication of long-term care program evaluation. PMID:1991678
NASA Astrophysics Data System (ADS)
Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros
2017-10-01
The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in |0.4| and |0.18| mGal changes in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.
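The forward-modelling step can be illustrated at order-of-magnitude level by treating the displaced mass as a point source. All figures below are invented; real modelling would distribute the mass and account for GRACE's spatial filtering and noise:

```python
# Order-of-magnitude forward model of the gravity change caused by a
# transferred landslide mass, treated as a single point mass.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def delta_g_microgal(mass_kg, dist_m):
    # attraction of a point mass at distance dist_m, converted to microGal
    return G * mass_kg / dist_m**2 * 1e8

# Hypothetical slide: 20 km^3 of rock (density 2500 kg/m^3) moved,
# observed from a satellite-altitude distance of ~450 km.
mass = 20e9 * 2500.0
print(f"{delta_g_microgal(mass, 450e3):.1f} microGal")  # 1.6 microGal
```

Comparing such a predicted signal against the expected noise level of the monthly gravity field solutions is the essence of the detectability assessment described above.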
Combined Volatolomics for Monitoring of Human Body Chemistry
Broza, Yoav Y.; Zuri, Liat; Haick, Hossam
2014-01-01
Analysis of volatile organic compounds (VOCs) is a promising approach for non-invasive, fast and potentially inexpensive diagnostics. Here, we present a new methodology for profiling the body chemistry by using the volatile fraction of molecules in various body fluids. Using mass spectrometry and a cross-reactive nanomaterial-based sensor array, we demonstrate that simultaneous VOC detection from breath and skin provides complementary, non-correlated information on the body's volatile metabolite profile. Eventually, with further wide population validation studies, such a methodology could provide more accurate monitoring of pathological changes compared to the information provided by a single body fluid. The qualitative and quantitative methods presented here offer a variety of options for novel mapping of the metabolic properties of complex organisms, including humans. PMID:24714440
Anomaly Detection Techniques for the Condition Monitoring of Tidal Turbines
2014-09-29
particularly beneficial to this industry. This paper explores the use of the CRISP-DM data mining process model for identifying key trends within...within tidal turbines with limited historical data. Using the CRISP-DM data mining methodology (Wirth & Hipp, 2000), key relationships between...indicate a change in the response of the system, indicating the possible onset of a fault. 1.2.1. CRISP-DM The CRISP-DM (Cross-Industry Standard
Ramallo, I Ayelen; García, Paula; Furlan, Ricardo L E
2015-11-01
A dual readout autographic assay to detect acetylcholinesterase inhibitors present in complex matrices adsorbed on reversed-phase or normal-phase thin-layer chromatography plates is described. Enzyme gel entrapment with an amphiphilic copolymer was used for assay development. The effects of substrate and enzyme concentrations, pH, incubation time, and incubation temperature on the sensitivity and the detection limit of the assay were evaluated. Experimental design and response surface methodology were used to optimize conditions with a minimum number of experiments. The assay allowed the detection of 0.01% w/w of physostigmine both in a spiked Sonchus oleraceus L. extract chromatographed on normal phase and in a spiked Pimenta racemosa (Mill.) J.W. Moore leaf essential oil chromatographed on reversed phase. Finally, the reversed-phase thin-layer chromatography assay was applied to reveal the presence of an inhibitor in Cymbopogon citratus (DC.) Stapf essential oil. The developed assay is able to detect acetylcholinesterase inhibitors present in complex matrices chromatographed on normal-phase or reversed-phase thin-layer chromatography plates. The detection limit for physostigmine on both normal and reversed phase was 1×10(-4) μg. The results can be read by a change in color and/or a change in fluorescence. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning...to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM...classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers
Land-Cover Trends of the Southern California Mountains Ecoregion
Soulard, Christopher E.; Raumann, Christian G.; Wilson, Tamara S.
2007-01-01
This report presents an assessment of land-use and land-cover (LU/LC) change in the Southern California Mountains ecoregion for the period 1973-2001. The Southern California Mountains is one of 84 Level-III ecoregions as defined by the U.S. Environmental Protection Agency (EPA). Ecoregions have served as a spatial framework for environmental resource management, denoting areas that contain a geographically distinct assemblage of biotic and abiotic phenomena including geology, physiography, vegetation, climate, soils, land use, wildlife, and hydrology. The established Land Cover Trends methodology generates estimates of change for ecoregions using a probability sampling approach and change-detection analysis of thematic land-cover images derived from Landsat satellite imagery.
Laser-Induced Population Inversion in Rhodamine 6G for Lysozyme Oligomer Detection.
Hanczyc, Piotr; Sznitko, Lech
2017-06-06
Fluorescence spectroscopy is a common method for detecting amyloid fibrils in which organic fluorophores are used as markers that exhibit an increase in quantum yield upon binding. However, most of the dyes exhibit enhanced emission only when bound to mature fibrils, and significantly weaker signals are obtained in the presence of amyloid oligomers. In the concept of population inversion, a laser is used as an excitation source to keep the major fraction of molecules in the excited state to create the pathways for the occurrence of stimulated emission. In the case of the proteins, the conformational changes lead to the self-ordering and thus different light scattering conditions that can influence the optical signatures of the generated light. Using this methodology, we show it is possible to optically detect amyloid oligomers using commonly available staining dyes in which population inversion can be induced. The results indicate that rhodamine 6G molecules are complexed with oligomers, and using a laser-assisted methodology, weakly emissive states can be detected. Significant spectral red-shifting of rhodamine 6G dispersed with amyloid oligomers and a notable difference determined by comparison of spectra of the fibrils suggest the existence of specific dye aggregates around the oligomer binding sites. This approach can provide new insights into intermediate oligomer states that are believed to be responsible for toxic seeding in neurodegeneration diseases.
Automated Health Alerts Using In-Home Sensor Data for Embedded Health Assessment
Guevara, Rainer Dane; Rantz, Marilyn
2015-01-01
We present an example of unobtrusive, continuous monitoring in the home for the purpose of assessing early health changes. Sensors embedded in the environment capture behavior and activity patterns. Changes in patterns are detected as potential signs of changing health. We first present results of a preliminary study investigating 22 features extracted from in-home sensor data. A 1-D alert algorithm was then implemented to generate health alerts to clinicians in a senior housing facility. Clinicians analyze each alert and provide a rating on the clinical relevance. These ratings are then used as ground truth for training and testing classifiers. Here, we present the methodology for four classification approaches that fuse multisensor data. Results are shown using embedded sensor data and health alert ratings collected on 21 seniors over nine months. The best results show similar performance for two techniques, where one approach uses only domain knowledge and the second uses supervised learning for training. Finally, we propose a health change detection model based on these results and clinical expertise. The system of in-home sensors and algorithms for automated health alerts provides a method for detecting health problems very early so that early treatment is possible. This method of passive in-home sensing alleviates compliance issues. PMID:27170900
Finite element model updating and damage detection for bridges using vibration measurement.
DOT National Transportation Integrated Search
2013-12-01
In this report, the results of a study on developing a damage detection methodology based on Statistical Pattern Recognition are presented. This methodology uses a new damage sensitive feature developed in this study that relies entirely on modal ...
Using State Estimation Residuals to Detect Abnormal SCADA Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Chen, Yousu; Huang, Zhenyu
2010-06-14
Detection of manipulated supervisory control and data acquisition (SCADA) data is critically important for the safe and secure operation of modern power systems. In this paper, a methodology of detecting manipulated SCADA data based on state estimation residuals is presented. A framework of the proposed methodology is described. Instead of using original SCADA measurements as the bad data sources, the residuals calculated based on the results of the state estimator are used as the input for the outlier detection process. The BACON algorithm is applied to detect outliers in the state estimation residuals. The IEEE 118-bus system is used as a test case to evaluate the effectiveness of the proposed methodology. The accuracy of the BACON method is compared with that of the 3-σ method for the simulated SCADA measurements and residuals.
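The comparison above can be sketched with a simplified univariate BACON pass set against the 3-σ rule. The subset size `m` and cutoff `c` below are illustrative assumptions, not the paper's settings, and the full BACON algorithm uses a chi-square/t-based cutoff rather than a fixed multiplier:

```python
import statistics

def bacon_outliers(residuals, m=4, c=3.0, max_iter=20):
    """Simplified univariate BACON: grow a clean basic subset from the
    points nearest the median, then flag anything that never fits it."""
    med = statistics.median(residuals)
    # initial basic subset: the m points closest to the median
    order = sorted(range(len(residuals)), key=lambda i: abs(residuals[i] - med))
    subset = set(order[:m])
    for _ in range(max_iter):
        vals = [residuals[i] for i in subset]
        mu = statistics.fmean(vals)
        sd = statistics.stdev(vals) or 1e-12
        new = {i for i, r in enumerate(residuals) if abs(r - mu) <= c * sd}
        if new == subset:
            break
        subset = new
    return [i for i in range(len(residuals)) if i not in subset]

def three_sigma_outliers(residuals):
    """Classical 3-sigma rule on the global mean and standard deviation."""
    mu = statistics.fmean(residuals)
    sd = statistics.stdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r - mu) > 3 * sd]
```

A single gross residual illustrates why the comparison matters: the outlier inflates the global standard deviation enough that the 3-σ rule misses it (masking), while the clean-subset statistics used by BACON still expose it.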
18 CFR 342.4 - Other rate changing methodologies.
Code of Federal Regulations, 2012 CFR
2012-04-01
18 CFR § 342.4 (Conservation of Power and Water Resources; Federal Energy Regulatory Commission, Rate Methodologies and Procedures): Other rate changing methodologies. (a) Cost-of-service rates. A carrier may...
Real-time Microseismic Processing for Induced Seismicity Hazard Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzel, Eric M.
Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.
NASA Astrophysics Data System (ADS)
Luján, José M.; Bermúdez, Vicente; Guardiola, Carlos; Abbad, Ali
2010-10-01
In-cylinder pressure measurement has historically been used for off-line combustion diagnosis, but online application for real-time combustion control has become of great interest. This work considers low computing-cost methods for analysing the instant variation of the chamber pressure, directly obtained from the electric signal provided by a traditional piezoelectric sensor. The methods presented are based on the detection of sudden changes in the chamber pressure, which are amplified by the pressure derivative, and which are due to thermodynamic phenomena within the cylinder. Signal analysis tools both in time and in time-frequency domains are used for detecting the start of combustion, the end of combustion and the heat release peak. Results are compared with classical thermodynamic analysis and validated in several turbocharged diesel engines.
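As a toy illustration of the time-domain idea (not the authors' algorithm), a second difference of sampled cylinder pressure amplifies sudden slope changes such as the start of combustion; the threshold is an assumed tuning parameter:

```python
def detect_pressure_events(pressure, dt, threshold):
    """Flag sudden changes in sampled pressure by thresholding the jump in
    the first-difference derivative; a crude stand-in for the paper's
    time/time-frequency analysis (illustrative only)."""
    deriv = [(pressure[i + 1] - pressure[i]) / dt
             for i in range(len(pressure) - 1)]
    # the second difference amplifies abrupt slope changes
    jumps = [abs(deriv[i + 1] - deriv[i]) for i in range(len(deriv) - 1)]
    # return positions (in the derivative sequence) exceeding the threshold
    return [i + 1 for i, j in enumerate(jumps) if j > threshold]
```

A ramp whose slope abruptly changes from 1 to 7 units per sample is flagged at the kink, while the smooth segments on either side are not.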
Detection of generalized synchronization using echo state networks
NASA Astrophysics Data System (ADS)
Ibáñez-Soria, D.; Garcia-Ojalvo, J.; Soria-Frisch, A.; Ruffini, G.
2018-03-01
Generalized synchronization between coupled dynamical systems is a phenomenon of relevance in applications that range from secure communications to physiological modelling. Here, we test the capabilities of reservoir computing and, in particular, echo state networks for the detection of generalized synchronization. A nonlinear dynamical system consisting of two coupled Rössler chaotic attractors is used to generate temporal series consisting of time-locked generalized synchronized sequences interleaved with unsynchronized ones. Correctly tuned, echo state networks are able to efficiently discriminate between unsynchronized and synchronized sequences even in the presence of relatively high levels of noise. Compared to other state-of-the-art techniques of synchronization detection, the online capabilities of the proposed Echo State Network based methodology make it a promising choice for real-time applications aiming to monitor dynamical synchronization changes in continuous signals.
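The reservoir dynamics underlying such a detector can be sketched in a few lines. The echo state (fading memory) property, enforced here by crudely scaling the weights by their row-sum norm (an assumption standing in for true spectral-radius normalization), is what makes the network state a function of the driving signal rather than of its initial condition:

```python
import math
import random

def make_reservoir(n, spectral_scale=0.9, seed=0):
    """Random reservoir weights, scaled so the infinity norm is below one;
    this guarantees fading memory but is stricter than the usual
    spectral-radius condition (a simplification of this sketch)."""
    rng = random.Random(seed)
    w = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    norm = max(sum(abs(x) for x in row) for row in w)
    return [[spectral_scale * x / norm for x in row] for row in w]

def step(w, w_in, state, u):
    """One leaky-free reservoir update with tanh nonlinearity."""
    n = len(state)
    return [math.tanh(sum(w[i][j] * state[j] for j in range(n)) + w_in[i] * u)
            for i in range(n)]

def run(w, w_in, state, inputs):
    for u in inputs:
        state = step(w, w_in, state, u)
    return state
```

Because tanh is 1-Lipschitz and the weight norm is below one, two trajectories started from very different states but driven by the same input converge geometrically, which is exactly the property that lets a trained readout compare signals rather than initial conditions.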
Vibration-Based Data Used to Detect Cracks in Rotating Disks
NASA Technical Reports Server (NTRS)
Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Martin, Richard E.; Baaklini, George Y.
2004-01-01
Rotor health monitoring and online damage detection are increasingly gaining the interest of aircraft engine manufacturers. This is primarily due to the fact that there is a necessity for improved safety during operation as well as a need for lower maintenance costs. Applied techniques for the damage detection and health monitoring of rotors are essential for engine safety, reliability, and life prediction. Recently, the United States set the ambitious goal of reducing the fatal accident rate for commercial aviation by 80 percent within 10 years. In turn, NASA, in collaboration with the Federal Aviation Administration, other Federal agencies, universities, and the airline and aircraft industries, responded by developing the Aviation Safety Program. This program provides research and technology products needed to help the aerospace industry achieve their aviation safety goal. The Nondestructive Evaluation (NDE) Group of the Optical Instrumentation Technology Branch at the NASA Glenn Research Center is currently developing propulsion-system-specific technologies to detect damage prior to catastrophe under the propulsion health management task. Currently, the NDE group is assessing the feasibility of utilizing real-time vibration data to detect cracks in turbine disks. The data are obtained from radial blade-tip clearance and shaft-clearance measurements made using capacitive or eddy-current probes. The concept is based on the fact that disk cracks distort the strain field within the component. This, in turn, causes a small deformation in the disk's geometry as well as a possible change in the system's center of mass. The geometric change and the center of mass shift can be indirectly characterized by monitoring the amplitude and phase of the first harmonic (i.e., the 1X component) of the vibration data. Spin pit experiments and full-scale engine tests have been conducted while monitoring for crack growth with this detection methodology.
Even so, published data are extremely limited, and the basic foundation of the methodology has not been fully studied. The NDE group is working on developing this foundation on the basis of theoretical modeling as well as experimental data by using the newly constructed subscale spin system shown in the preceding photograph. This, in turn, involved designing an optimal subscale disk that was meant to represent a full-scale turbine disk; conducting finite element analyses of undamaged and damaged disks to define the disk's deformation and the resulting shift in center of mass; and creating a rotordynamic model of the complete disk and shaft assembly to confirm operation beyond the first critical speed for the subscale experimental setup. The finite element analysis data, defining the center of mass shift due to disk damage, are shown. As an example, the change in the center of mass for a disk spinning at 8000 rpm with a 0.963-in. notch was 1.3 × 10⁻⁴ in. The actual vibration response of an undamaged disk as well as the theoretical response of a cracked disk is shown. Experiments with cracked disks are continuing, and new approaches for analyzing the captured vibration data are being developed to better detect damage in a rotor. In addition, the subscale spin system is being used to test the durability and sensitivity of new NDE sensors that focus on detecting localized damage. This is designed to supplement the global response of the crack-detection methodology described here.
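Extracting the amplitude and phase of the first harmonic from sampled clearance data can be sketched as a single-bin discrete Fourier sum over whole revolutions. This is an illustrative stand-in for the actual instrumentation pipeline; `samples_per_rev` is assumed to be known from a once-per-revolution reference:

```python
import cmath
import math

def first_harmonic(signal, samples_per_rev):
    """Amplitude and phase of the once-per-revolution component via a
    single-bin discrete Fourier sum; assumes the record spans an integer
    number of revolutions so spectral leakage vanishes."""
    n = len(signal)
    acc = sum(x * cmath.exp(-2j * math.pi * i / samples_per_rev)
              for i, x in enumerate(signal))
    acc *= 2.0 / n  # scale so a pure cosine of amplitude A returns A
    return abs(acc), cmath.phase(acc)
```

A synthetic signal of amplitude 3 and phase 0.5 rad sampled 64 times per revolution over 10 revolutions is recovered to within floating-point error, which is the quantity one would track over time to watch for a mass-shift-induced change.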
Census Cities Project and atlas of urban and regional change
NASA Technical Reports Server (NTRS)
Wray, J. R.
1970-01-01
The research design and imagery utilization for urban applications of remote sensing are reviewed, including the combined use of sensor and census data and aircraft and spacecraft sensor platforms. The related purposes of the Census Cities Project are elucidated: (1) to assess the role of remote sensors on high altitude platforms for comparative study of urban areas; (2) to detect changes in selected U.S. urban areas between the 1970 census and the time of launching of an earth-orbiting sensor platform prior to next census; (3) to test the satellite sensor platform utility to monitor urban change and serve as a control for sensor image interpretation; (4) to design an information system for incorporating graphic sensor data with census-type data gathered by traditional techniques; (5) to identify and to design user-oriented end-products or information services; and (6) to ascertain what organizational capability would be needed to provide such services on a continuing basis. A need to develop not only a spatial data information system, but also a methodology for detecting and interpreting change is implied.
Caillaud, Amandine; de la Iglesia, Pablo; Darius, H Taiana; Pauillac, Serge; Aligizaki, Katerina; Fraga, Santiago; Chinain, Mireille; Diogène, Jorge
2010-06-14
Ciguatera fish poisoning (CFP) occurs mainly when humans ingest finfish contaminated with ciguatoxins (CTXs). The complexity and variability of such toxins have made it difficult to develop reliable methods to routinely monitor CFP with specificity and sensitivity. This review aims to describe the methodologies available for CTX detection, including those based on the toxicological, biochemical, chemical, and pharmaceutical properties of CTXs. Selecting any of these methodological approaches for routine monitoring of ciguatera may be dependent upon the applicability of the method. However, identifying a reference validation method for CTXs is a critical and urgent issue, and is dependent upon the availability of certified CTX standards and the coordinated action of laboratories. Reports of CFP cases in European hospitals have been described in several countries, and are mostly due to travel to CFP endemic areas. Additionally, the recent detection of the CTX-producing tropical genus Gambierdiscus in the eastern Atlantic Ocean of the northern hemisphere and in the Mediterranean Sea, as well as the confirmation of CFP in the Canary Islands and possibly in Madeira, constitute other reasons to study the onset of CFP in Europe [1]. The question of the possible contribution of climate change to the distribution of toxin-producing microalgae and ciguateric fish is raised. The impact of ciguatera onset on European Union (EU) policies will be discussed with respect to EU regulations on marine toxins in seafood. Critical analysis and availability of methodologies for CTX determination is required for a rapid response to suspected CFP cases and to conduct sound CFP risk analysis.
Caillaud, Amandine; de la Iglesia, Pablo; Darius, H. Taiana; Pauillac, Serge; Aligizaki, Katerina; Fraga, Santiago; Chinain, Mireille; Diogène, Jorge
2010-01-01
Ciguatera fish poisoning (CFP) occurs mainly when humans ingest finfish contaminated with ciguatoxins (CTXs). The complexity and variability of such toxins have made it difficult to develop reliable methods to routinely monitor CFP with specificity and sensitivity. This review aims to describe the methodologies available for CTX detection, including those based on the toxicological, biochemical, chemical, and pharmaceutical properties of CTXs. Selecting any of these methodological approaches for routine monitoring of ciguatera may be dependent upon the applicability of the method. However, identifying a reference validation method for CTXs is a critical and urgent issue, and is dependent upon the availability of certified CTX standards and the coordinated action of laboratories. Reports of CFP cases in European hospitals have been described in several countries, and are mostly due to travel to CFP endemic areas. Additionally, the recent detection of the CTX-producing tropical genus Gambierdiscus in the eastern Atlantic Ocean of the northern hemisphere and in the Mediterranean Sea, as well as the confirmation of CFP in the Canary Islands and possibly in Madeira, constitute other reasons to study the onset of CFP in Europe [1]. The question of the possible contribution of climate change to the distribution of toxin-producing microalgae and ciguateric fish is raised. The impact of ciguatera onset on European Union (EU) policies will be discussed with respect to EU regulations on marine toxins in seafood. Critical analysis and availability of methodologies for CTX determination is required for a rapid response to suspected CFP cases and to conduct sound CFP risk analysis. PMID:20631873
Detection of Cyanotoxins During Potable Water Treatment
USDA-ARS?s Scientific Manuscript database
In 2007, the U.S. EPA listed three cyanobacterial toxins on the CCL3 containment priority list for potable drinking waters. This paper describes all methodologies used for detection of these toxins, and assesses each on a cost/benefit basis. Methodologies for microcystin, cylindrospermopsin, and a...
Quantifying Biomass and Bare Earth Changes from the Hayman Fire Using Multi-temporal Lidar
NASA Astrophysics Data System (ADS)
Stoker, J. M.; Kaufmann, M. R.; Greenlee, S. K.
2007-12-01
Small-footprint multiple-return lidar data collected in the Cheesman Lake property prior to the 2002 Hayman fire in Colorado provided an excellent opportunity to evaluate Lidar as a tool to predict and analyze fire effects on both soil erosion and overstory structure. Re-measuring this area and applying change detection techniques allowed for analyses at a high level of detail. Our primary objectives focused on the use of change detection techniques using multi-temporal lidar data to: (1) evaluate the effectiveness of change detection to identify and quantify areas of erosion or deposition caused by post-fire rain events and rehab activities; (2) identify and quantify areas of biomass loss or forest structure change due to the Hayman fire; and (3) examine effects of pre-fire fuels and vegetation structure derived from lidar data on patterns of burn severity. While we were successful in identifying areas where changes occurred, the original error bounds on the variation in actual elevations made it difficult, if not misleading, to quantify volumes of material changed on a per-pixel basis. In order to minimize these variations in the two datasets, we investigated several correction and co-registration methodologies. The lessons learned from this project highlight the need for a high level of flight planning and understanding of errors in a lidar dataset in order to correctly estimate and report quantities of vertical change. Directly measuring vertical change using only lidar without ancillary information can provide errors that could make quantifications confusing, especially in areas with steep slopes.
3D change detection at street level using mobile laser scanning point clouds and terrestrial images
NASA Astrophysics Data System (ADS)
Qin, Rongjun; Gruen, Armin
2014-04-01
Automatic change detection and geo-database updating in the urban environment are difficult tasks. There has been much research on detecting changes with satellite and aerial images, but studies have rarely been performed at the street level, which is complex in its 3D geometry. Contemporary geo-databases include 3D street-level objects, which demand frequent data updating. Terrestrial images provide rich texture information for change detection, but change detection with terrestrial images from different epochs sometimes faces problems with illumination changes, perspective distortions and unreliable 3D geometry caused by the lack of performance of automatic image matchers, while mobile laser scanning (MLS) data acquired from different epochs provide accurate 3D geometry for change detection, but are very expensive to acquire periodically. This paper proposes a new method for change detection at street level by using a combination of MLS point clouds and terrestrial images: the accurate but expensive MLS data acquired from an early epoch serve as the reference, and terrestrial images or photogrammetric images captured from an image-based mobile mapping system (MMS) at a later epoch are used to detect the geometrical changes between different epochs. The method will automatically mark the possible changes in each view, which provides a cost-efficient method for frequent data updating. The methodology is divided into several steps. In the first step, the point clouds are recorded by the MLS system and processed, with data cleaned and classified by semi-automatic means. In the second step, terrestrial images or mobile mapping images at a later epoch are taken and registered to the point cloud, and then point clouds are projected on each image by a weighted window based z-buffering method for view dependent 2D triangulation.
In the next step, stereo pairs of the terrestrial images are rectified and re-projected between each other to check the geometrical consistency between point clouds and stereo images. Finally, an over-segmentation based graph cut optimization is carried out, taking into account the color, depth and class information to compute the changed area in the image space. The proposed method is invariant to light changes, robust to small co-registration errors between images and point clouds, and can be applied straightforwardly to 3D polyhedral models. This method can be used for 3D street data updating, city infrastructure management and damage monitoring in complex urban scenes.
The Many Hazards of Trend Evaluation
NASA Astrophysics Data System (ADS)
Henebry, G. M.; de Beurs, K.; Zhang, X.; Kimball, J. S.; Small, C.
2014-12-01
Given the awareness in the scientific community of global scale drivers such as population growth, globalization, and climatic variation and change, many studies seek to identify temporal patterns in data that may be plausibly related to one or more aspect of global change. Here we explore two questions: "What constitutes a trend in a time series?" and "How can a trend be misinterpreted?" There are manifold hazards—both methodological and psychological—in detecting a trend, quantifying its magnitude, assessing its significance, identifying probable causes, and evaluating the implications of the trend. These hazards can combine to elevate the risk of misinterpreting the trend. In contrast, evaluation of multiple trends within a biogeophysical framework can attenuate the risk of misinterpretation. We review and illustrate these hazards and demonstrate the efficacy of an approach using multiple indicators detecting significant trends (MIDST) applied to time series of remote sensing data products.
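One widely used indicator of monotone trend that a MIDST-style multiple-indicator analysis could draw on is the Mann-Kendall test. The sketch below uses the normal approximation without the tie correction (a simplifying assumption) and is not the authors' implementation:

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic and its normal-approximation z-score
    (no tie correction; illustrative only)."""
    n = len(series)
    # S counts concordant minus discordant pairs
    s = sum((x2 > x1) - (x2 < x1)
            for i, x1 in enumerate(series) for x2 in series[i + 1:])
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z
```

A strictly increasing series of length 10 gives S = 45 and a z-score well above the 1.96 two-sided 5% threshold, while a constant series gives no trend; in practice one would apply such a test jointly with other indicators, as the abstract argues, rather than trusting any single significant z-score.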
Super-Sensitive and Robust Biosensors from Supported Polymer Bilayers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paxton, Walter F.
2015-09-01
Biological organisms are potentially the most sensitive and selective biological detection systems known, yet we are currently severely limited in our ability to exploit biological interactions in sensory devices, due in part to the limited stability of biological systems and derived materials. This proposal addresses an important aspect of integrating biological sensory materials in a solid state device. If successful, such technology could enable entirely new classes of robust biosensors that could be miniaturized and deployed in the field. The critical aims of the proposed work were 1) the calibration of a more versatile approach to measuring pH; 2) the use of this method to monitor pH changes caused by the light-induced pumping of protons across vesicles with bacteriorhodopsin integrated into the membranes (either polymer or lipid); 3) the preparation of bilayer assemblies on platinum surfaces; 4) the enhanced detection of light-induced pH changes driven by bR-loaded supported bilayers. I have developed a methodology that may enable this at interfaces and developed a methodology to characterize the functionality of bilayer membranes with reconstituted membrane proteins. The integrity of the supported bilayer films, however, must be optimized prior to the full realization of the work originally envisioned in the original proposal. Nevertheless, the work performed on this project and the encouraging results it has produced have demonstrated that these goals are challenging yet within reach.
NASA Astrophysics Data System (ADS)
Levesque, M.
Artificial satellites, and particularly space junk, drift continuously from their known orbits. In the surveillance-of-space context, they must be observed frequently to ensure that the corresponding orbital parameter database entries are up-to-date. Autonomous ground-based optical systems are periodically tasked to observe these objects, calculate the difference between their predicted and real positions and update object orbital parameters. The real satellite positions are provided by the detection of the satellite streaks in the astronomical images specifically acquired for this purpose. This paper presents the image processing techniques used to detect and extract the satellite positions. The methodology includes several processing steps including: image background estimation and removal, star detection and removal, an iterative matched filter for streak detection, and finally false alarm rejection algorithms. This detection methodology is able to detect very faint objects. Simulated data were used to evaluate the methodology's performance and determine the sensitivity limits where the algorithm can perform detection without false alarm, which is essential to avoid corruption of the orbital parameter database.
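A toy version of the first pipeline stages (background removal followed by a matched filter for a bright linear feature) might look like the following. The horizontal-only template and the threshold factor `k` are simplifying assumptions: the real detector iterates the matched filter over streak orientations and lengths and adds star removal and false-alarm rejection.

```python
import statistics

def detect_streak(image, template_len=3, k=5.0):
    """Median background removal, then a horizontal box matched filter;
    returns (row, col) positions where the filter response is high."""
    flat = [v for row in image for v in row]
    bg = statistics.median(flat)
    resid = [[v - bg for v in row] for row in image]
    sigma = statistics.pstdev(flat) or 1e-12
    hits = []
    for r, row in enumerate(resid):
        for c in range(len(row) - template_len + 1):
            # correlate with a unit box template of length template_len
            score = sum(row[c + i] for i in range(template_len))
            if score > k * sigma:
                hits.append((r, c))
    return hits
```

On a dark 8×8 frame with a faint horizontal streak in one row, only template positions overlapping at least two streak pixels exceed the threshold, so the detections cluster along the streak and nowhere else.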
An Adaptive Database Intrusion Detection System
ERIC Educational Resources Information Center
Barrios, Rita M.
2011-01-01
Intrusion detection is difficult to accomplish when attempting to employ current methodologies when considering the database and the authorized entity. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…
2000-02-01
HIDS Program: Power Drive Train Crack Detection Diagnostics and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and Experiences
Hess, Andrew; Chin, Harrison; Hardman, William
...continuing program to evaluate helicopter diagnostic, prognostic, and ... deployed engine monitoring systems in fixed wing aircraft, notably on the A...
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. 
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
Technical Aspects for the Creation of a Multi-Dimensional Land Information System
NASA Astrophysics Data System (ADS)
Ioannidis, Charalabos; Potsiou, Chryssy; Soile, Sofia; Verykokou, Styliani; Mourafetis, George; Doulamis, Nikolaos
2016-06-01
The complexity of modern urban environments and civil demands for fast, reliable and affordable decision-making requires not only a 3D Land Information System, which tends to replace traditional 2D LIS architectures, but also the need to address the time and scale parameters, that is, the 3D geometry of buildings in various time instances (4th dimension) at various levels of detail (LoDs - 5th dimension). This paper describes and proposes solutions for technical aspects that need to be addressed for the 5D modelling pipeline. Such solutions include the creation of a 3D model, the application of a selective modelling procedure between various time instances and at various LoDs, enriched with cadastral and other spatial data, and a procedural modelling approach for the representation of the inner parts of the buildings. The methodology is based on automatic change detection algorithms for spatial-temporal analysis of the changes that took place in subsequent time periods, using dense image matching and structure from motion algorithms. The selective modelling approach allows a detailed modelling only for the areas where spatial changes are detected. The procedural modelling techniques use programming languages for the textual semantic description of a building; they require the modeller to describe its part-to-whole relationships. Finally, a 5D viewer is developed, in order to tackle existing limitations that accompany the use of global systems, such as the Google Earth or the Google Maps, as visualization software. An application based on the proposed methodology in an urban area is presented and it provides satisfactory results.
Methodology for stereoscopic motion-picture quality assessment
NASA Astrophysics Data System (ADS)
Voronov, Alexander; Vatolin, Dmitriy; Sumin, Denis; Napadovsky, Vyacheslav; Borisov, Alexey
2013-03-01
Creating and processing stereoscopic video imposes additional quality requirements related to view synchronization. In this work we propose a set of algorithms for detecting typical stereoscopic-video problems, which appear owing to imprecise setup of capture equipment or incorrect postprocessing. We developed a methodology for analyzing the quality of S3D motion pictures and for revealing their most problematic scenes. We then processed 10 modern stereo films, including Avatar, Resident Evil: Afterlife and Hugo, and analyzed changes in S3D-film quality over the years. This work presents real examples of common artifacts (color and sharpness mismatch, vertical disparity and excessive horizontal disparity) in the motion pictures we processed, as well as possible solutions for each problem. Our results enable improved quality assessment during the filming and postproduction stages.
Structural Damage Detection Using Changes in Natural Frequencies: Theory and Applications
NASA Astrophysics Data System (ADS)
He, K.; Zhu, W. D.
2011-07-01
A vibration-based method that uses changes in natural frequencies of a structure to detect damage has advantages over conventional nondestructive tests in detecting various types of damage, including loosening of bolted joints, using minimum measurement data. Two major challenges associated with applications of the vibration-based damage detection method to engineering structures are addressed: accurate modeling of structures and the development of a robust inverse algorithm to detect damage, which are defined as the forward and inverse problems, respectively. To resolve the forward problem, new physics-based finite element modeling techniques are developed for fillets in thin-walled beams and for bolted joints, so that complex structures can be accurately modeled with a reasonable model size. To resolve the inverse problem, a logistic function transformation is introduced to convert the constrained optimization problem to an unconstrained one, and a robust iterative algorithm using a trust-region method, called the Levenberg-Marquardt method, is developed to accurately detect the locations and extent of damage. The new methodology can ensure global convergence of the iterative algorithm in solving under-determined system equations and deal with damage detection problems with relatively large modeling error and measurement noise. The vibration-based damage detection method is applied to various structures including lightning masts, a space frame structure and one of its components, and a pipeline. The exact locations and extent of damage can be detected in numerical simulation, where there is no modeling error or measurement noise. The locations and extent of damage can also be successfully detected in experiments.
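The logistic transformation and damped iteration described above can be sketched on a toy scalar problem. The forward model below (a natural frequency falling with damage extent) is an invented illustration, not the paper's finite element model, and the damping update is a minimal stand-in for a full Levenberg-Marquardt trust-region scheme:

```python
import numpy as np

# Damage extent d is physically constrained to (0, 1). The logistic
# transformation d = 1 / (1 + exp(-u)) lets an unconstrained variable u
# be optimized instead, as in the paper's inverse problem.
def to_unconstrained(d):
    return np.log(d / (1.0 - d))

def to_constrained(u):
    return 1.0 / (1.0 + np.exp(-u))

def residual(u, measured_freq):
    # Toy forward model (an assumption for illustration only): damage
    # extent d lowers the natural frequency as f = 10 * sqrt(1 - 0.5*d).
    d = to_constrained(u)
    return 10.0 * np.sqrt(1.0 - 0.5 * d) - measured_freq

def levenberg_marquardt(u, measured_freq, lam=1e-3, iters=100):
    """Minimal scalar Levenberg-Marquardt: damped Gauss-Newton steps
    with an adaptive damping factor lam."""
    for _ in range(iters):
        r = residual(u, measured_freq)
        h = 1e-6
        j = (residual(u + h, measured_freq) - r) / h   # numerical Jacobian
        step = -j * r / (j * j + lam)
        if abs(residual(u + step, measured_freq)) < abs(r):
            u, lam = u + step, lam * 0.5               # accept step, trust more
        else:
            lam *= 10.0                                # reject step, trust less
    return u

# Recover a known damage extent from its simulated frequency measurement
true_d = 0.3
f_meas = 10.0 * np.sqrt(1.0 - 0.5 * true_d)
u_hat = levenberg_marquardt(to_unconstrained(0.05), f_meas)
d_hat = to_constrained(u_hat)
```

Because the optimization runs in u, any iterate maps back to a physically admissible d in (0, 1), which is the point of the transformation.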
Detection of traffic incidents using nonlinear time series analysis
NASA Astrophysics Data System (ADS)
Fragkou, A. D.; Karakasidis, T. E.; Nathanail, E.
2018-06-01
In this study, we present results of the application of nonlinear time series analysis to traffic data for incident detection. More specifically, we analyze daily volume records of Attica Tollway (Greece) collected from sensors at various locations. The analysis was performed using the Recurrence Plot (RP) and Recurrence Quantification Analysis (RQA) methods on the volume data of the lane closest to the median. The results show that it is possible to identify, through the abrupt change in the dynamics of the system revealed by RPs and RQA, the occurrence of incidents on the freeway and to differentiate them from recurrent traffic congestion. The proposed methodology could be of interest for big data traffic analysis.
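The recurrence plot and one basic RQA measure can be sketched as follows; the phase-space embedding step and the richer RQA measure set used in such studies are omitted for brevity, and the traffic series here are synthetic:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence plot of a scalar series: R[i, j] = 1 when
    |x[i] - x[j]| <= eps (no embedding, for brevity)."""
    x = np.asarray(x, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

def recurrence_rate(R):
    """RQA measure: fraction of recurrent points off the main diagonal."""
    n = R.shape[0]
    off = R.sum() - n           # the diagonal is trivially recurrent
    return off / (n * n - n)

# A steady traffic series is highly recurrent; an abrupt change in the
# dynamics (an incident) breaks recurrence between the two regimes.
steady = np.ones(50) + 0.01 * np.sin(np.arange(50))
incident = np.concatenate([np.ones(25), 3 * np.ones(25)])
rr_steady = recurrence_rate(recurrence_matrix(steady, eps=0.05))
rr_incident = recurrence_rate(recurrence_matrix(incident, eps=0.05))
```

The drop in the recurrence rate for the second series mirrors how an incident shows up as a sudden loss of recurrence in the RP.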
Birefringence imaging in biological tissue using polarization sensitive optical coherent tomography
De Boer, Johannes F.; Milner, Thomas E.; Nelson, J. Stuart
2001-01-01
Employing a low-coherence Michelson interferometer, two-dimensional images of optical birefringence in turbid samples are measured as a function of depth. Polarization-sensitive detection of the signal formed by interference of backscattered light from the sample with light from a mirror or reference plane in the reference arm, which defines a reference optical path length, gives the optical phase delay between light propagating along the fast and slow axes of the birefringent sample. Images showing the change in birefringence in response to irradiation of the sample are produced as an example of the detection apparatus and methodology. The technique allows rapid, noncontact diagnostic imaging of tissue or other samples for various medical or materials procedures.
Technologic developments in the field of photonics for the detection of urinary bladder cancer.
Palmer, Scott; Sokolovski, Sergei G; Rafailov, Edik; Nabi, Ghulam
2013-12-01
Bladder cancer is a common cause of morbidity and mortality worldwide in an aging population. Each year, thousands of people, mostly men, are diagnosed with this disease, but many of them present too late to receive optimal treatment. As with all cancers, early diagnosis of bladder cancer significantly improves the efficacy of therapy and increases survival and recurrence-free survival rates. Ongoing research has identified many limitations in the sensitivity of standard diagnostic procedures for detecting early-stage tumors and precancerous changes. The consequences of this are often tumor progression and increased tumor burden, leading to a decrease in patient quality of life and a vast increase in treatment costs. The necessity for improved early detection of bladder cancer has spurred on research into novel methods that use a wide range of biological and photonic phenomena. This review will broadly discuss standard detection methodologies and their major limitations before covering novel photonic techniques for early tumor detection and staging, assessing their diagnostic accuracy for flat and precancerous changes. We will do so in the context of both cystoscopic examination and the screening of voided urine and will also touch on the concept of using photonic technology as a surgical tool for tumor ablation. Copyright © 2013 Elsevier Inc. All rights reserved.
Temperature and heat wave trends in northwest Mexico
NASA Astrophysics Data System (ADS)
Martínez-Austria, Polioptro F.; Bandala, Erick R.; Patiño-Gómez, Carlos
2016-02-01
Increase in temperature extremes is one of the main expected impacts of climate change, as well as one of the first signs of its occurrence. Nevertheless, results emerging from General Circulation Models, while sufficient for large scales, are not enough for forecasting local trends; hence, the IPCC has called for local studies based on on-site data. Indeed, climate extremes are expected to be detected much earlier than changes in climate averages. Heat waves are among the most important and least studied climate extremes, and even their very definition remains controversial. This paper discusses the observed changes in temperature trends and heat waves in northwestern Mexico, one of the most vulnerable regions of the country. The climate records of two locations in the region are analyzed using three different methodologies: Mexicali City in the state of Baja California, one of the cities with the most extreme climate in Mexico, and the Yaqui River basin in the state of Sonora. Results showed clear trends of temperature increase and heat wave occurrence in both study zones with all three methodologies. As a result, some policy-making suggestions are included in order to increase the adaptability of the studied regions to climate change, particularly in relation to heat wave occurrence.
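Since the abstract notes that the very definition of a heat wave is controversial, here is a minimal sketch under one common definition (a run of at least three consecutive days with maximum temperature above a fixed threshold); the threshold, run length and temperature values are illustrative assumptions, not the paper's methodology:

```python
import numpy as np

def heat_wave_days(tmax, threshold, min_len=3):
    """Count days belonging to heat waves, defined here as runs of at
    least `min_len` consecutive days with Tmax above `threshold`."""
    hot = np.asarray(tmax) > threshold
    total, run = 0, 0
    for h in hot:
        run = run + 1 if h else 0
        if run == min_len:
            total += min_len      # the run just qualified: count its first days
        elif run > min_len:
            total += 1
    return total

# Daily Tmax (deg C). Runs above 40: [42,43,44] (3 days, counts),
# [45,46] (2 days, too short), [41,41,41,41] (4 days, counts).
tmax = [38, 42, 43, 44, 37, 45, 46, 38, 41, 41, 41, 41]
days = heat_wave_days(tmax, threshold=40)
```

Trend analysis would then be run on the yearly counts produced by such a definition, which is why the choice of threshold and run length matters.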
Nixon, Annabel; Doll, Helen; Kerr, Cicely; Burge, Russel; Naegeli, April N
2016-02-19
Regulatory guidance recommends anchor-based methods for interpretation of treatment effects measured by PRO endpoints. Methodological pros and cons of patient global ratings of change vs. patient global ratings of concept have been discussed but empirical evidence in support of either approach is lacking. This study evaluated the performance of patient global ratings of change and patient global ratings of concept for interpreting patient stability and patient improvement. Patient global ratings of change and patient global ratings of concept were included in a psychometric validation study of an osteoporosis-targeted PRO instrument (the OPAQ-PF) to assess its ability to detect change and to derive responder definitions. 144 female osteoporosis patients with (n = 37) or without (n = 107) a recent (within 6 weeks) fragility fracture completed the OPAQ-PF and global items at baseline, 2 weeks (no recent fracture), and 12 weeks (recent fracture) post-baseline. Results differed between the two methods. Recent fracture patients reported more improvement while patients without recent fracture reported more stability on ratings of change than ratings of concept. However, correlations with OPAQ-PF score change were stronger for ratings of concept than ratings of change (both groups). Effect sizes for OPAQ-PF score change increased consistently with level of change in ratings of concept but inconsistently with ratings of change, with the mean AUC for prediction of a one-point change being 0.72 vs. 0.56. This study provides initial empirical support for methodological and regulatory recommendations to use patient global ratings of concept rather than ratings of change when interpreting change captured by PRO instruments in studies evaluating treatment effects. These findings warrant being confirmed in a purpose-designed larger scale analysis.
NASA Astrophysics Data System (ADS)
Pradeep, K. R.; Thomas, A. M.; Basker, V. T.
2018-03-01
Structural health monitoring (SHM) is an essential component of futuristic civil, mechanical and aerospace structures. It detects damage in a system or gives warning about the degradation of a structure by evaluating performance parameters. This is achieved by the integration of sensors and actuators into the structure. A study of the damage detection process in a piezoelectric sensor and actuator integrated sandwich cantilever beam is carried out in this paper. A possible skin-core debond at the root of the cantilever beam is simulated and compared with the undamaged case. The beam is actuated using piezoelectric actuators and performance differences are evaluated using polyvinylidene fluoride (PVDF) sensors. The methodology utilized is the voltage/strain response of the damaged versus undamaged beam under transient actuation. A finite element model of the piezo-beam is simulated in ANSYS™ using an 8-noded coupled field element, whose nodal degrees of freedom are translations in the x and y directions and voltage. An aluminium sandwich beam with a length of 800 mm, a core thickness of 22.86 mm and a skin thickness of 0.3 mm is considered. The skin-core debond is simulated in the model as unmerged nodes. The reduction in the fundamental frequency of the damaged beam is found to be negligible, but the voltage response of the PVDF sensor under transient excitation shows a clearly visible change indicating the debond. A piezoelectric-based damage detection system is an effective tool for damage detection in aerospace and civil structural systems with inaccessible or critical locations, and it enables online monitoring as its power requirement is minimal.
A dynamic social systems model for considering structural factors in HIV prevention and detection
Latkin, Carl; Weeks, Margaret; Glasman, Laura; Galletly, Carol; Albarracin, Dolores
2010-01-01
We present a model for HIV-related behaviors that emphasizes the dynamic and social nature of the structural factors that influence HIV prevention and detection. Key structural dimensions of the model include resources, science and technology, formal social control, informal social influences and control, social interconnectedness, and settings. These six dimensions can be conceptualized on macro, meso, and micro levels. Given the inherent complexity of structural factors and their interrelatedness, HIV prevention interventions may focus on different levels and dimensions. We employ a systems perspective to describe the interconnected and dynamic processes of change among social systems and their components. The topics of HIV testing and safer injection facilities are analyzed using this structural framework. Finally, we discuss methodological issues in the development and evaluation of structural interventions for HIV prevention and detection. PMID:20838871
Geomorphic and landform survey of Northern Appennine Range (NAR)
NASA Technical Reports Server (NTRS)
Marino, C. M. (Principal Investigator); Zilioli, E.
1977-01-01
The author has identified the following significant results. An approach to landslide hazard detection was developed through the analysis of satellite imagery (LANDSAT 2) showing many landslide areas that occur on marine silts and clays in the northern Appennine Range in Italy. A landslide risk score was assigned to large areas by narrowing and extending well-defined areas whose behavior and reflectivity variation were due to upper-surface changes. Results show that this methodology allows the evolution pattern of clay outflows to be distinguished.
Identification of flood-rich and flood-poor periods in flood series
NASA Astrophysics Data System (ADS)
Mediero, Luis; Santillán, David; Garrote, Luis
2015-04-01
Recently, a general concern about non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, which can lead to different results depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks over threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified in most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
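The Mann-Kendall test mentioned above can be sketched in a few lines; this version omits the tie correction, and the flood series is illustrative:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and the
    normal-approximation Z score (tie correction omitted)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs across all i < j
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series of n values attains the maximum S = n*(n-1)/2
floods = [10, 12, 15, 19, 23, 30, 31, 40]
s, z = mann_kendall(floods)
```

Because S is built from the full set of pairs, truncating the series at a flood-rich or flood-poor end changes which pairs enter the sum, which is exactly the sensitivity to the chosen period that the abstract discusses.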
NASA Astrophysics Data System (ADS)
Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.
2018-04-01
Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
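The Jensen-Shannon distance underlying the clustering can be sketched as follows; the proximity-graph construction and the iMA-Net memetic algorithm are not reproduced, and the toy word-frequency vectors are illustrative:

```python
import numpy as np

def js_distance(p, q):
    """Jensen-Shannon distance (square root of the JS divergence,
    log base 2) between two word-frequency distributions."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0          # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Identical word-frequency profiles sit at distance 0; profiles with no
# words in common sit at the maximum distance 1 (with log base 2).
a = [10, 5, 0, 1]
b = [10, 5, 0, 1]
c = [0, 0, 7, 0]
d_same = js_distance(a, b)
d_disjoint = js_distance(a, c)
```

Pairs of plays whose distance falls below a chosen threshold would then be joined by an edge in the proximity graph on which community detection is run.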
Endoscopic vs. tactile evaluation of subgingival calculus.
Osborn, Joy B; Lenton, Patricia A; Lunos, Scott A; Blue, Christine M
2014-08-01
Endoscopic technology has been developed to facilitate imagery for use during diagnostic and therapeutic phases of periodontal care. The purpose of this study was to compare the level of subgingival calculus detection using a periodontal endoscope with that of a conventional tactile explorer in periodontitis subjects. A convenience sample of 26 subjects with moderate periodontitis in at least 2 quadrants was recruited from the University of Minnesota School of Dentistry to undergo quadrant scaling and root planing. One quadrant from each subject was randomized for tactile calculus detection alone and the other quadrant for tactile detection plus the Perioscope™ (Perioscopy Inc., Oakland, Calif.). A calculus index scored 0 to 3 was recorded at baseline and at 2 post-scaling and root planing visits. Sites where calculus was detected at visit 1 were retreated. T-tests were used to determine within-subject differences between Perioscope™ and tactile measures, and changes in measures between visits. Significantly more calculus was detected using the Perioscope™ vs. the tactile explorer for all 3 subject visits (p<0.005). Mean changes (reductions) in calculus detection from baseline to visit 1 were statistically significant for both the Perioscope™ and tactile quadrants (p<0.0001). However, the further reduction in calculus detection from visit 1 to visit 2 was only significant for the Perioscope™ quadrant (p<0.025), indicating that this methodology was able to detect calculus more precisely at this visit. It was concluded that the addition of a visual component to calculus detection via the Perioscope™ was most helpful in the re-evaluation phase of periodontal therapy. Copyright © 2014 The American Dental Hygienists' Association.
Implementation of efficient trajectories for an ultrasonic scanner using chaotic maps
NASA Astrophysics Data System (ADS)
Almeda, A.; Baltazar, A.; Treesatayapun, C.; Mijarez, R.
2012-05-01
Typical ultrasonic methodology for nondestructive scanning evaluation uses systematic scanning paths. In many cases, this approach is time inefficient and consumes considerable energy and computational power. Here, a methodology for the scanning of defects using an ultrasonic echo-pulse scanning technique combined with chaotic trajectory generation is proposed. This is implemented in a Cartesian coordinate robotic system developed in our lab. To cover the entire search area, a chaotic function and a proposed mirror mapping were incorporated. To improve detection probability, our proposed scanning methodology is complemented with a probabilistic approach to discontinuity detection. The developed methodology was found to be more efficient than traditional ones used to localize and characterize hidden flaws.
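A chaotic trajectory generator of the kind described can be sketched with the logistic map; the paper's specific chaotic function and mirror mapping are not reproduced, so this is only an illustrative stand-in:

```python
import numpy as np

def chaotic_scan_points(n, x0=0.3, y0=0.7, r=4.0):
    """Generate n scan positions in the unit square from two logistic
    maps. At r = 4 the logistic map is chaotic, so successive positions
    jump irregularly over the search area instead of following a
    systematic raster path."""
    xs, ys = [], []
    x, y = x0, y0
    for _ in range(n):
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

# 500 probe positions for the scanner to visit, all inside the unit square
xs, ys = chaotic_scan_points(500)
```

In a flaw search, scanning would stop as soon as an echo indicates a discontinuity, which is where the time savings over a full raster come from.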
Field validation of protocols developed to evaluate in-line mastitis detection systems.
Kamphuis, C; Dela Rue, B T; Eastwood, C R
2016-02-01
This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128×10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270×10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives. 
However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM, setting a maximum number of 10 milkings for the time window to detect a CM episode, and presentation of sensitivity for a larger range of false alerts per 1,000 milkings replacing minimum performance targets. The recommended refinements are discussed with suggested changes to the original protocols. The information presented is intended to inform further debate toward achieving international agreement on standard protocols to evaluate performance of in-line mastitis-detection systems. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
Quantum Dots Applied to Methodology on Detection of Pesticide and Veterinary Drug Residues.
Zhou, Jia-Wei; Zou, Xue-Mei; Song, Shang-Hong; Chen, Guan-Hua
2018-02-14
The pesticide and veterinary drug residues brought by large-scale agricultural production have become one of the key issues in the fields of food safety and environmental ecological security. It is necessary to develop rapid, sensitive, qualitative and quantitative methodology for the detection of pesticide and veterinary drug residues. As one of the achievements of nanoscience, quantum dots (QDs) have been widely used in the detection of pesticide and veterinary drug residues. In these methodological studies, the QD signal types used include fluorescence, chemiluminescence, electrochemical luminescence, photoelectrochemistry, etc. QDs can also be assembled into sensors with different materials, such as QD-enzyme, QD-antibody, QD-aptamer, and QD-molecularly imprinted polymer sensors. Many achievements in the detection of pesticide and veterinary drug residues have been obtained from different combinations of these signals and sensors; they are summarized in this paper to provide a reference for the application of QDs in the detection of pesticide and veterinary drug residues.
NASA Astrophysics Data System (ADS)
García, Alicia; Berrocoso, Manuel; Marrero, José M.; Fernández-Ros, Alberto; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramón
2014-06-01
The 2011 volcanic unrest at El Hierro Island illustrated the need for a Volcanic Alert System (VAS) specifically designed for the management of volcanic crises developing after long repose periods. The VAS comprises the monitoring network, the software tools for analysis of the monitoring parameters, the Volcanic Activity Level (VAL) management, and the assessment of hazard. The VAS presented here focuses on phenomena related to moderate eruptions, and on potentially destructive volcano-tectonic earthquakes and landslides. We introduce a set of new data analysis tools, aimed to detect data trend changes, as well as spurious signals related to instrumental failure. When data-trend changes and/or malfunctions are detected, a watchdog is triggered, issuing a watch-out warning (WOW) to the Monitoring Scientific Team (MST). The changes in data patterns are then translated by the MST into a VAL that is easy to use and understand by scientists, technicians, and decision-makers. Although the VAS was designed specifically for the unrest episodes at El Hierro, the methodologies may prove useful at other volcanic systems.
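A data-trend-change watchdog of the kind described can be sketched with a one-sided CUSUM detector; the VAS's actual analysis tools are not detailed in this abstract, so the statistic, drift and threshold below are illustrative assumptions:

```python
import numpy as np

def cusum_watchdog(x, baseline, drift=0.5, threshold=5.0):
    """One-sided CUSUM on a monitoring series: returns the index at
    which an upward trend change would trigger a watch-out warning
    (WOW), or -1 if the statistic never crosses the threshold."""
    s = 0.0
    for i, v in enumerate(np.asarray(x, dtype=float)):
        # accumulate excess over the baseline, allowing `drift` of slack
        s = max(0.0, s + (v - baseline - drift))
        if s > threshold:
            return i
    return -1

# Flat monitoring signal with a trend change (e.g. accelerating
# seismicity) beginning at index 30
signal = np.concatenate([np.zeros(30), 0.2 * np.arange(30)])
alarm = cusum_watchdog(signal, baseline=0.0)
```

The drift term keeps instrument noise from raising alarms, while a sustained trend accumulates quickly, which matches the abstract's goal of separating real trend changes from spurious signals.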
The effect of the rate of hydrostatic pressure depressurization on cells in culture.
Tworkoski, Ellen; Glucksberg, Matthew R; Johnson, Mark
2018-01-01
Changes in hydrostatic pressure, at levels as low as 10 mm Hg, have been reported in some studies to alter cell function in vitro; however, other studies have found no detectable changes using similar methodologies. We here investigate the hypothesis that the rate of depressurization, rather than elevated hydrostatic pressure itself, may be responsible for these reported changes. Hydrostatic pressure (100 mm Hg above atmospheric pressure) was applied to bovine aortic endothelial cells (BAECs) and PC12 neuronal cells using pressurized gas for periods ranging from 3 hours to 9 days, and then the system was either slowly (~30 minutes) or rapidly (~5 seconds) depressurized. Cell viability, apoptosis, proliferation, and F-actin distribution were then assayed. Our results did not show significant differences between rapidly and slowly depressurized cells that would explain differences previously reported in the literature. Moreover, we found no detectable effect of elevated hydrostatic pressure (with slow depressurization) on any measured variables. Our results do not confirm the findings of other groups that modest increases in hydrostatic pressure affect cell function, but we are not able to explain their findings.
Leaf Movements of Indoor Plants Monitored by Terrestrial LiDAR
Herrero-Huerta, Mónica; Lindenbergh, Roderik; Gard, Wolfgang
2018-01-01
Plant leaf movement is induced by some combination of different external and internal stimuli. Detailed geometric characterization of such movement is expected to improve understanding of these mechanisms. Terrestrial LiDAR (TLiDAR) is an innovative, non-invasive sensor system well suited to analyzing plant movement: because it uses an active sensor, it is independent of light conditions and can obtain accurate point clouds at high spatial and temporal resolution. In this study, a movement parameterization approach for leafy plants based on TLiDAR is introduced. For this purpose, two Calathea roseopicta plants were scanned in an indoor environment during two full days, one day in natural light conditions and the other in darkness. The methodology to estimate leaf movement is based on segmenting individual leaves using an octree-based 3D grid and monitoring the changes in their orientation by Principal Component Analysis. Additionally, canopy variations of the plant as a whole were characterized by a convex-hull approach. As a result, 9 leaves in plant 1 and 11 leaves in plant 2 were automatically detected, with a global accuracy of 93.57 and 87.34%, respectively, compared to a manual detection. Regarding plant 1, in natural light conditions, the average displacement of the leaves between 7.00 a.m. and 12.30 p.m. was 3.67 cm, as estimated using so-called deviation maps; the maximum displacement was 7.92 cm. In addition, the orientation changes of each leaf within a day were analyzed. The maximum variation in the vertical angle was 69.6° from 12.30 to 6.00 p.m. In darkness, the displacements were smaller and showed a different orientation pattern. The canopy volume of plant 1 changed more in the morning (4.42 dm3) than in the afternoon (2.57 dm3). The results of plant 2 largely confirmed those of the first plant and were added to check the robustness of the methodology. 
The results show how to quantify leaf orientation variation and leaf movements along a day at mm accuracy in different light conditions. This confirms the feasibility of the proposed methodology to robustly analyse leaf movements. PMID:29527217
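The PCA-based orientation monitoring can be sketched as follows: the eigenvector of the point-cloud covariance with the smallest eigenvalue approximates the leaf-plane normal, and its angle to the vertical tracks orientation changes. The synthetic point cloud below stands in for a segmented TLiDAR leaf:

```python
import numpy as np

def leaf_orientation(points):
    """Leaf-plane normal via PCA: the eigenvector of the point-cloud
    covariance with the smallest eigenvalue is normal to the best-fit
    plane; the other two eigenvectors span the leaf surface."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = np.cov(centered.T)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    normal = vecs[:, 0]
    # angle between the normal and the vertical (z) axis, in degrees
    angle = np.degrees(np.arccos(np.clip(abs(normal[2]), 0.0, 1.0)))
    return normal, angle

# Synthetic horizontal leaf: points in the z = 0 plane plus tiny noise,
# so the recovered normal should be (nearly) vertical
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
flat_leaf = np.column_stack([xy, 1e-4 * rng.standard_normal(200)])
normal, angle = leaf_orientation(flat_leaf)
```

Tracking this angle for each segmented leaf across successive scans gives the within-day orientation curves reported in the abstract.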
NASA Astrophysics Data System (ADS)
Van Den Hoek, J.
2014-12-01
Relationships between environmental change and armed conflict have long been studied. Sometimes referred to as 'warfare' or 'conflict' ecology, much of this scholarship has come in response to local-level perceptions of landscape or livelihood changes that result from regional armed conflict. However, such studies have, first, typically focused on spatiotemporally acute and readily detectable environmental change, like deforestation, to the exclusion of protracted and more subtle environmental changes, like agricultural degradation; second, been limited to situational conflicts or circumstances, thereby inhibiting broader theoretical development; and, third, often only considered the environmental consequences rather than the environmental or climatic circumstances that may contribute to conflict. As a result, there is little opportunity for methodological or theoretical cohesion between studies. In this presentation, I synthesize findings from three case studies examining the interrelationships between agricultural change and armed conflict in the semi-arid landscapes of northwest Pakistan, Palestine, and southern Syria. Using coarse through very high resolution remotely sensed imagery, socio-economic and demographic data, conflict databases, open-source programming, and building on theoretical underpinnings of political ecology and conflict studies, I present methods and modeling approaches that aid in overcoming data scarcity and disparity between scales of analysis and integrate environmental and conflict data in spatiotemporally explicit ways. Results from these case studies illuminate the interrelationships between both protracted and acute agricultural change and armed conflict, and have broad relevance for understanding the means by which environment, conflict, and livelihoods are linked, a nexus that will only become tighter with the advance of global climate change.
A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.
Janols, Rebecka; Lindgren, Helena
2017-01-01
A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology, and two case studies applying the methodology were conducted. During and between group sessions, the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behaviour change and persuasive technology guided the data collection, the data analyses, and the design of the behaviour change system. The methodology places strong emphasis on the target group's participation in the design process. The aspects brought forward relate to behaviour change strategies defined in the persuasive technology literature, and their dynamics are associated with needs and motivation as defined in the behaviour change literature. It was concluded that the methodology aids the integration of theories into a participatory action research design process and supports the analysis and motivation of design choices.
Response-Guided Community Detection: Application to Climate Index Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bello, Gonzalo; Angus, Michael; Pedemane, Navya
Discovering climate indices, i.e., time series that summarize spatiotemporal climate patterns, is a key task in the climate science domain. In this work, we approach this task as a problem of response-guided community detection; that is, identifying communities in a graph associated with a response variable of interest. To this end, we propose a general strategy for response-guided community detection that explicitly incorporates information on the response variable during the community detection process, and introduce a graph representation of spatiotemporal data that leverages information from multiple variables. We apply our proposed methodology to the discovery of climate indices associated with seasonal rainfall variability. Our results suggest that our methodology is able to capture the underlying patterns known to be associated with the response variable of interest and to improve its predictability compared with existing methodologies for data-driven climate index discovery and official forecasts.
75 FR 46942 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... employers. Should any needed methodological changes be identified, NIOSH will submit a request for modification to OMB. If no substantive methodological changes are required, the phase II study will proceed and... complete the questionnaire on the web or by telephone at that time.) Assuming no methodological changes...
Soil moisture retrieval from Sentinel-1 and MODIS synergy
NASA Astrophysics Data System (ADS)
Gao, Qi; Zribi, Mehrez; Escorihuela, Maria Jose; Baghdadi, Nicolas
2017-04-01
This study presents two methodologies for retrieving soil moisture from SAR remote sensing data. The study is based on Sentinel-1 data in the VV polarization, over a site in Urgell, Catalunya (Spain). In both methodologies, which use change detection techniques, preprocessed radar data are combined with normalized difference vegetation index (NDVI) auxiliary data to estimate the mean soil moisture at a resolution of 1 km. By modeling the relationship between the backscatter difference and NDVI, the soil moisture at a specific NDVI value is retrieved. The first algorithm was previously developed over West Africa (Zribi et al., 2014) from ERS scatterometer data to estimate soil water status; in this study, it is adapted to Sentinel-1 data and takes into account the high repetitiveness of the data to optimize the inversion approach. A second, new method is based on the backscatter difference between two adjacent Sentinel-1 acquisition dates as a function of NDVI: with smaller vegetation change between dates, the backscatter difference is more sensitive to soil moisture. The proposed methodologies have been validated against ground measurements in two demonstration fields, with an RMS error of about 0.05 (in volumetric moisture), and coherence between soil moisture variations and rainfall events is observed. Soil moisture maps at 1 km resolution are generated for the study area. The results demonstrate the potential of Sentinel-1 data for the retrieval of soil moisture at 1 km or even finer resolution.
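The inversion step described above can be illustrated with the classic change-detection scaling used in scatterometer-based approaches; the exact formulation in the study is not given here, and the dry/wet reference levels (which would in practice be estimated per NDVI class) and the moisture bounds are illustrative assumptions:

```python
import numpy as np

def soil_moisture_change_detection(sigma_db, sigma_dry_db, sigma_wet_db,
                                   mv_min=0.0, mv_max=0.4):
    """Scale the observed backscatter (dB) between per-pixel dry and wet
    reference levels, then map the result linearly onto a volumetric
    soil moisture range; references and bounds are illustrative."""
    scaling = (sigma_db - sigma_dry_db) / (sigma_wet_db - sigma_dry_db)
    return mv_min + np.clip(scaling, 0.0, 1.0) * (mv_max - mv_min)
```

For example, a backscatter of -12 dB halfway between a -16 dB dry and a -8 dB wet reference maps to 0.2 m³/m³ with the default bounds.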
Kurosawa, Tatsuo; Watanabe, Mitsuo
2016-12-01
Glycosylation profiles significantly change during oncogenesis, and aberrant glycosylation can be used as a cancer biomarker in clinical settings. Different glycoforms can be separately detected using lectin affinity electrophoresis and lectin array-based methods. However, most methodologies and procedures require experienced technicians to perform the assays and expertise to interpret the results. To apply glycomarkers in clinical practice, a robust assay system with an easy-to-use workflow is required. Wako's μTASWako i30, a fully automated immunoanalyzer, was developed for in vitro diagnostics based on microfluidic technology. It utilizes the principles of the liquid-phase binding assay, where immunoreactions are performed in a liquid phase, and the electrokinetic analyte transport assay. Capillary electrophoresis on a microfluidic chip has enabled the detection of different glycoform types of alpha-fetoprotein (AFP), a serum biomarker for hepatocellular carcinoma. AFP with altered glycosylation can be separated during electrophoresis based on its reactivity to Lens culinaris agglutinin. The glycoform AFP-L3 has been reported to be more specific for hepatocellular carcinoma. This assay system provides high sensitivity and rapid results within 9 min. The test results for the ratio of AFP-L3 to total AFP using the μTASWako i30 correlate with those of the conventional methodology. The μTASWako assay system and its technology can be utilized for glycosylation analysis in the postgenomic era. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Montereale-Gavazzi, Giacomo; Roche, Marc; Lurton, Xavier; Degrendele, Koen; Terseleer, Nathan; Van Lancker, Vera
2018-06-01
To characterize seafloor substrate type, seabed mapping, and particularly multibeam echosounding, is increasingly used. Yet the utilisation of repetitive MBES-borne backscatter surveys to monitor the environmental status of the seafloor remains limited. Methodological frameworks are often missing; such frameworks should comprise a suite of change detection procedures similar to those developed in the terrestrial sciences. In this study, pre-classification, ensemble and post-classification approaches were tested on an eight km2 study site within a Habitat Directive Area in the Belgian part of the North Sea. In this area, gravel beds with epifaunal assemblages were observed. Flourishing of the fauna is constrained by overtopping with sand or increased turbidity levels, which could result from anthropogenic activities. Monitoring of the gravel-to-sand ratio was hence put forward as an indicator of good environmental status. Seven acoustic surveys were undertaken from 2004 to 2015. The methods allowed quantifying temporal trends and patterns of change of the main substrate classes identified in the study area, namely fine to medium homogeneous sand, medium sand with bioclastic detritus, and medium to coarse sand with gravel. Results indicated that, considering the entire study area and the entire time series, the gravel-to-sand ratio fluctuated but was overall stable. Nonetheless, when only the biodiversity hotspots were considered, net losses and a gradual trend, indicative of potential smothering, were captured by the ensemble and post-classification approaches, respectively. Additionally, a two-dimensional morphological analysis based on the bathymetric data suggested a loss of profile complexity from 2004 to 2015. Causal relationships with natural and anthropogenic stressors are yet to be established. The methodologies presented and discussed are repeatable and can be applied to broad-scale geographical extents, provided that broad-scale time series datasets become available.
Rivera, Alba; Larrosa, Nieves; Mirelis, Beatriz; Navarro, Ferran
2014-02-01
β-lactam antimicrobial agents are frequently used to treat infections caused by Enterobacteriaceae. The main mechanism of resistance to these antibiotics is the production of certain enzymes, collectively named β-lactamases. Due to their substrate profile and their epidemiological implications, the most clinically important β-lactamases are extended-spectrum β-lactamases, class C β-lactamases and carbapenemases. Phenotypic detection of these enzymes may be complicated and is based on the use of specific inhibitors of each β-lactamase and on the loss of activity on some β-lactam indicators. Various international committees postulate that it is no longer necessary to interpret the susceptibility results or determine the mechanism of resistance. Several critics disagree, however, and consider that susceptibility results should be interpreted until more data are available on the clinical efficacy of treatment with β-lactams. Given these methodological difficulties and constant changes in the interpretation criteria, we consider that training and external quality controls are essential to keep updated in this field. For learning purposes, these external quality controls should always be accompanied by a review of the results and methodology used, and the analysis of errors. In this paper we review and contextualize all the aspects related to the detection and interpretation of these β-lactamases. Copyright © 2014 Elsevier España, S.L. All rights reserved.
Malleability of Attitudes or Malleability of the IAT?
Han, H. Anna; Czellar, Sandor; Olson, Michael A.; Fazio, Russell H.
2009-01-01
In the current set of experiments, we establish, and explore the consequences of, the imprecision that characterizes the attribute response labels typically employed in the Implicit Association Test (IAT). In Experiment 1, we demonstrate the malleability of the IAT, as conventionally implemented. IAT scores are shown to be influenced by perspective mindsets induced by an unrelated preceding task. Then, we explore how the malleability of the IAT can lead to the inference that attitude change has occurred even when there is very good reason to believe it has not (Experiment 2), and conversely, how it can obscure the detection of attitude change when such change is indeed likely to have occurred (Experiment 3). We provide conceptual explanations for these discrepancies and suggest methodological improvements to enhance the specificity of IAT measures. PMID:20401162
Use of remote sensing for land use policy formulation
NASA Technical Reports Server (NTRS)
1987-01-01
The overall objectives and strategies of the Center for Remote Sensing remain to provide a center for excellence for multidisciplinary scientific expertise to address land-related global habitability and earth observing systems scientific issues. Specific research projects that were underway during the final contract period include: digital classification of coniferous forest types in Michigan's northern lower peninsula; a physiographic ecosystem approach to remote classification and mapping; land surface change detection and inventory; analysis of radiant temperature data; and development of methodologies to assess possible impacts of man's changes of land surface on meteorological parameters. Significant progress in each of the five project areas has occurred. Summaries on each of the projects are provided.
Ultracompact vibrometry measurement with nanometric accuracy using optical feedback
NASA Astrophysics Data System (ADS)
Jha, Ajit; Azcona, Francisco; Royo, Santiago
2015-05-01
The nonlinear dynamics of a semiconductor laser with optical feedback (OF), combined with direct current modulation of the laser, is demonstrated to suffice for the measurement of subwavelength changes in the position of a vibrating object. So far, classical Optical Feedback Interferometry (OFI) has been used to measure the vibration of an object provided its amplitude is greater than half the emission wavelength, with the resolution of the measurement limited to some tenths of the wavelength after processing. We present here a methodology that takes advantage of the combination of two different phenomena: continuous wave frequency modulation (CWFM), induced by direct modulation of the laser, and the nonlinear dynamics inside the laser cavity subject to optical self-injection (OSI). The proposed methodology shows how to detect vibration amplitudes smaller than half the emission wavelength with resolutions well beyond λ/2, extending the typical performance of OFI setups to very small amplitudes. A detailed mathematical model and simulation results are presented to support the proposed methodology, showing its ability to perform such displacement measurements at frequencies in the MHz range, depending upon the modulation frequency. This approach makes the technique a suitable candidate for, among other applications, economical laser-based ultrasound measurements, with applications in nondestructive testing of materials (thickness, flaws, density, stresses). The simulation results confirm the figures of merit of the proposed approach, namely detection of vibration amplitudes below λ/2 with resolutions in the nanometer range.
Unsupervised change detection in a particular vegetation land cover type using spectral angle mapper
NASA Astrophysics Data System (ADS)
Renza, Diego; Martinez, Estibaliz; Molina, Iñigo; Ballesteros L., Dora M.
2017-04-01
This paper presents a new unsupervised change detection methodology for multispectral images applied to specific land covers. The proposed method compares each image against a reference spectrum, obtained from the spectral signature of the type of coverage one wants to detect. The method has been tested using multispectral images (SPOT5) of the Community of Madrid (Spain), and multispectral images (Quickbird) of an area over Indonesia that was impacted by the December 26, 2004 tsunami; here, the tests have focused on the detection of changes in vegetation. The image comparison is obtained by applying the Spectral Angle Mapper between the reference spectrum and each multitemporal image. Then, a threshold is applied to produce a single change image corresponding to the vegetation zones. The results for each multitemporal image are combined through an exclusive-or (XOR) operation that selects vegetation zones that have changed over time. Finally, the derived results were compared against a supervised method based on classification with a Support Vector Machine. Furthermore, the NDVI-differencing and Spectral Angle Mapper techniques were selected as unsupervised methods for comparison purposes. The main novelty of the method is the detection of changes in a specific land cover type (vegetation); therefore, for comparison purposes, the best scenario is to compare it with methods that also aim to detect changes in that cover type. This is the main reason for selecting the NDVI-based method and the post-classification method (SVM implemented in a standard software tool). To evaluate the improvements from using a reference spectrum vector, the results are also compared with the basic SAM method. For the SPOT5 image, the overall accuracy was 99.36% and the κ index was 90.11%; for the Quickbird image, the overall accuracy was 97.5% and the κ index was 82.16%.
Finally, the precision results of the method are comparable to those of a supervised method, supported by low detection of false positives and false negatives, along with a high overall accuracy and a high kappa index. On the other hand, the execution times were comparable to those of unsupervised methods of low computational load.
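The per-date SAM comparison and XOR combination described above can be sketched as follows (NumPy; the threshold value and the layout of images as flat pixel-by-band arrays are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def spectral_angle(pixels, ref):
    """Spectral angle (radians) between each pixel spectrum and a
    reference spectrum; pixels has shape (n_pixels, n_bands)."""
    dot = pixels @ ref
    norms = np.linalg.norm(pixels, axis=1) * np.linalg.norm(ref)
    cos = np.clip(dot / norms, -1.0, 1.0)
    return np.arccos(cos)

def change_mask(img_t1, img_t2, ref, threshold):
    """Threshold each date's SAM map into a cover mask for the
    reference land cover, then XOR the masks: pixels matching the
    reference cover at one date but not the other are flagged."""
    mask1 = spectral_angle(img_t1, ref) < threshold
    mask2 = spectral_angle(img_t2, ref) < threshold
    return np.logical_xor(mask1, mask2)
```

With a vegetation reference spectrum, a pixel close to the reference at the first date but far from it at the second would be flagged as changed vegetation.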
Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun
2016-10-04
The development of novel detection methodologies in electrochemiluminescence (ECL) aptasensor fields with simplicity and ultrasensitivity is essential for constructing biosensing architectures. Herein, a facile, specific, and sensitive methodology was developed for quantitative detection of microcystin-LR (MC-LR), based on three-dimensional boron and nitrogen codoped graphene hydrogels (BN-GHs) that amplify the steric hindrance effect between the aptamer and target analytes. The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembly hydrothermal method and then applied as the Ru(bpy)3(2+) immobilization platform for further loading the biomolecule aptamers, owing to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of a conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, so that this facile methodology could serve as the basis for an exquisitely sensitive assay. With MC-LR as a model, the novel ECL biosensor showed high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials could play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing fields.
Method for detection of a few pathogenic bacteria and determination of live versus dead cells
NASA Astrophysics Data System (ADS)
Horikawa, Shin; Chen, I.-Hsuan; Du, Songtao; Liu, Yuzhe; Wikle, Howard C.; Suh, Sang-Jin; Barbaree, James M.; Chin, Bryan A.
2016-05-01
This paper presents a method for detection of a few pathogenic bacteria and determination of live versus dead cells. The method combines wireless phage-coated magnetoelastic (ME) biosensors and a surface-scanning detector, enabling real-time monitoring of the growth of specific bacteria in a nutrient broth. The ME biosensor used in this investigation is composed of a strip-shaped ME resonator upon which an engineered bacteriophage is coated to capture a pathogen of interest. E2 phage with high binding affinity for Salmonella Typhimurium was used as a model study. The specificity of E2 phage has been reported to be 1 in 10^5 background bacteria. The phage-coated ME biosensors were first exposed to a low-concentration Salmonella suspension to capture roughly 300 cells on the sensor surface. When the growth of Salmonella in the broth occurs, the mass of the biosensor increases, which results in a decrease in the biosensor's resonant frequency. Monitoring of this mass-induced resonant frequency change allows for real-time detection of the presence of Salmonella. Detection of a few bacteria is also possible by growing them to a sufficient number. The surface-scanning detector was used to measure the resonant frequency changes of 25 biosensors sequentially, in an automated manner, as a function of time. This methodology offers direct, real-time detection, quantification, and viability determination of specific bacteria. The rate of the sensor's resonant frequency change was found to depend largely on the number of initially bound cells and the efficiency of cell growth.
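The detection principle above rests on mass loading lowering the resonant frequency. A minimal sketch using the first-order approximation Δf ≈ -(f0/2M)·Δm, which is commonly applied to such resonators but is not stated explicitly in the abstract; the values in the usage example are purely illustrative:

```python
def frequency_shift(f0_hz, sensor_mass_g, added_mass_g):
    """First-order mass-loading approximation for a magnetoelastic
    resonator: a small added mass dm lowers the resonant frequency by
    roughly -(f0 / 2M) * dm; values here are illustrative, not the
    paper's measured figures."""
    return -f0_hz * added_mass_g / (2.0 * sensor_mass_g)
```

Under this approximation, a 1 μg mass gain on a 1 mg sensor resonating at 1 MHz shifts the frequency by about -500 Hz.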
NASA Astrophysics Data System (ADS)
Sharma, Archie; Corona, Enrique; Mitra, Sunanda; Nutter, Brian S.
2006-03-01
Early detection of structural damage to the optic nerve head (ONH) is critical in diagnosis of glaucoma, because such glaucomatous damage precedes clinically identifiable visual loss. Early detection of glaucoma can prevent progression of the disease and consequent loss of vision. Traditional early detection techniques involve observing changes in the ONH through an ophthalmoscope. Stereo fundus photography is also routinely used to detect subtle changes in the ONH. However, clinical evaluation of stereo fundus photographs suffers from inter- and intra-subject variability. Even the Heidelberg Retina Tomograph (HRT) has not been found to be sufficiently sensitive for early detection. A semi-automated algorithm for quantitative representation of the optic disc and cup contours by computing accumulated disparities in the disc and cup regions from stereo fundus image pairs has already been developed using advanced digital image analysis methodologies. A 3-D visualization of the disc and cup is achieved assuming camera geometry. High correlation among computer-generated and manually segmented cup to disc ratios in a longitudinal study involving 159 stereo fundus image pairs has already been demonstrated. However, clinical usefulness of the proposed technique can only be tested by a fully automated algorithm. In this paper, we present a fully automated algorithm for segmentation of optic cup and disc contours from corresponding stereo disparity information. Because this technique does not involve human intervention, it eliminates subjective variability encountered in currently used clinical methods and provides ophthalmologists with a cost-effective and quantitative method for detection of ONH structural damage for early detection of glaucoma.
In situ photoacoustic characterization for porous silicon growing: Detection principles
NASA Astrophysics Data System (ADS)
Ramirez-Gutierrez, C. F.; Castaño-Yepes, J. D.; Rodriguez-García, M. E.
2016-05-01
There are a few methodologies for monitoring the in-situ formation of porous silicon (PS); one of them is photoacoustics. Previous works that reported the use of photoacoustics to study PS formation did not provide a physical explanation of the origin of the signal. In this paper, a physical explanation of the origin of the photoacoustic signal during PS etching is provided: the incident modulated radiation and changes in the reflectance are taken as thermal sources. A useful methodology is also proposed to determine the etching rate, porosity, and refractive index of a PS film through the determination of the sample thickness using scanning electron microscopy images. This method was developed by carrying out two different experiments using the same anodization conditions. The first experiment consisted of growing samples with different etching times to prove the periodicity of the photoacoustic signal, while the second considered growing samples using three different laser wavelengths correlated with the period of the photoacoustic signal. The latter experiment showed that the period of the photoacoustic signal is proportional to the laser wavelength.
Deciphering dynamics of clathrin-mediated endocytosis in a living organism
Heidotting, Spencer P.; Huber, Scott D.
2016-01-01
Current understanding of clathrin-mediated endocytosis (CME) dynamics is based on detection and tracking of fluorescently tagged clathrin coat components within cultured cells. Because of technical limitations inherent to detection and tracking of single fluorescent particles, CME dynamics is not characterized in vivo, so the effects of mechanical cues generated during development of multicellular organisms on formation and dissolution of clathrin-coated structures (CCSs) have not been directly observed. Here, we use growth rates of fluorescence signals obtained from short CCS intensity trace fragments to assess CME dynamics. This methodology does not rely on determining the complete lifespan of individual endocytic assemblies. Therefore, it allows for real-time monitoring of spatiotemporal changes in CME dynamics and is less prone to errors associated with particle detection and tracking. We validate the applicability of this approach to in vivo systems by demonstrating the reduction of CME dynamics during dorsal closure of Drosophila melanogaster embryos. PMID:27458134
Detection of isotype switch rearrangement in bulk culture by PCR.
Max, E E; Mills, F C; Chu, C
2001-05-01
When a B lymphocyte changes from synthesizing IgM to synthesizing IgG, IgA, or IgE, this isotype switch is generally accompanied by a unique DNA rearrangement. The protocols in this unit describe two polymerase chain reaction (PCR)-based strategies for detecting switch rearrangements in bulk culture. The first involves direct PCR across the switch junctions, providing the opportunity for characterizing the recombination products by nucleotide sequence analysis; however, because of characteristics inherent to the PCR methodology this strategy cannot easily be used as a quantitative assay for recombination. A support protocol details the preparation of the 5' Su PCR probe for this protocol. The second basic protocol describes a method known as digestion-circularization PCR (DCPCR) that is more amenable to quantitation but yields no information on structure of the recombination products. Both techniques should be capable of detecting reciprocal deletion circles as well as functional recombination products remaining on the expressed chromosome.
Godino-Llorente, J I; Gómez-Vilda, P
2004-02-01
It is well known that vocal and voice diseases do not necessarily cause perceptible changes in the acoustic voice signal. Acoustic analysis is a useful tool for diagnosing voice diseases, being a complementary technique to other methods based on direct observation of the vocal folds by laryngoscopy. In the present paper, two neural-network-based classification approaches applied to the automatic detection of voice disorders are studied. The structures studied are the multilayer perceptron and learning vector quantization, fed with short-term vectors calculated according to the well-known Mel-frequency cepstral coefficient (MFCC) parameterization. The paper shows that these architectures allow the detection of voice disorders--including glottic cancer--under highly reliable conditions. Within this context, the learning vector quantization methodology proved more reliable than the multilayer perceptron architecture, yielding 96% frame accuracy under similar working conditions.
Investigative change detection: identifying new topics using lexicon-based search
NASA Astrophysics Data System (ADS)
Hintz, Kenneth J.
2002-08-01
In law enforcement there is much textual data which needs to be searched in order to detect new threats. A new methodology which can be applied to this need is the automatic searching of the contents of documents from known sources to construct a lexicon of words used by that source. When analyzing future documents, the occurrence of words which have not been lexiconized are indicative of the introduction of a new topic into the source's lexicon which should be examined in its context by an analyst. A system analogous to this has been built and used to detect Fads and Categories on web sites. Fad refers to the first appearance of a word not in the lexicon; Category refers to the repeated appearance of a Fad word and the exceeding of some frequency or spatial occurrence metric indicating a permanence to the Category.
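The lexicon-based Fad/Category logic described above can be sketched as follows (the class name, tokenizer, and repeat-count threshold are illustrative assumptions; the paper's system may use different frequency and spatial-occurrence metrics):

```python
import re
from collections import Counter

class LexiconChangeDetector:
    """Flags words absent from a source's historical lexicon ('Fad')
    and promotes a Fad to a 'Category' once its repeat count reaches a
    threshold; terminology follows the paper, thresholds are illustrative."""

    def __init__(self, category_threshold=3):
        self.lexicon = set()          # words seen in known-source documents
        self.fad_counts = Counter()   # occurrences of out-of-lexicon words
        self.category_threshold = category_threshold

    def build_lexicon(self, documents):
        """Lexiconize the contents of documents from a known source."""
        for doc in documents:
            self.lexicon.update(self._tokens(doc))

    def analyze(self, document):
        """Return (fads, categories) found in a new document: fads are
        first appearances of unknown words; categories are unknown words
        whose cumulative count has reached the threshold."""
        fads, categories = [], []
        for word in self._tokens(document):
            if word not in self.lexicon:
                self.fad_counts[word] += 1
                if self.fad_counts[word] == 1:
                    fads.append(word)
                elif self.fad_counts[word] >= self.category_threshold:
                    categories.append(word)
        return fads, categories

    @staticmethod
    def _tokens(text):
        return re.findall(r"[a-z']+", text.lower())
```

An analyst would then examine each flagged word in its document context, as the paper suggests.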
Quesada, Jose Antonio; Melchor, Inmaculada; Nolasco, Andreu
2017-05-26
The analysis of spatio-temporal patterns of disease or death in urban areas has been developed mainly from the ecological-study approach. These designs may have limitations, such as the ecological fallacy and instability with few cases. The objective of this study was to apply the point process methodology, as a complement to that of aggregated data, to study HIV/AIDS mortality in men in the city of Alicante (Spain). A case-control study of residents in the city during the period 2004-2011 was designed. Cases were men who died from HIV/AIDS, and controls represented the general population, matched by age to cases. The risk surfaces of death over the city were estimated using the log-risk function of intensities, and their temporal variations over the two periods were contrasted. Significant high-risk areas of death from HIV/AIDS, coinciding with the most deprived areas of the city, were detected. No significant spatial change in the areas at risk between the periods studied was detected. The point process methodology is a useful tool for analysing patterns of death from HIV/AIDS in urban areas.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... comment the submission of additional information concerning the methodological changes for the digital... additional information concerning the methodological changes suggested in the comments by Mr. Shumate for the...-loss. The Commission is requesting a detailed description of the methodological changes that would be...
18 CFR 301.5 - Changes in Average System Cost methodology.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Changes in Average System Cost methodology. 301.5 Section 301.5 Conservation of Power and Water Resources FEDERAL ENERGY... ACT § 301.5 Changes in Average System Cost methodology. (a) The Administrator, at his or her...
NASA Astrophysics Data System (ADS)
Hyer, E. J.; Peterson, D. A.; Curtis, C. A.; Schmidt, C. C.; Hoffman, J.; Prins, E. M.
2014-12-01
The Fire Locating and Monitoring of Burning Emissions (FLAMBE) system converts satellite observations of thermally anomalous pixels into spatially and temporally continuous estimates of smoke release from open biomass burning. This system currently processes data from a constellation of 5 geostationary and 2 polar-orbiting sensors. Additional sensors, including NPP VIIRS and the imager on the Korea COMS-1 geostationary satellite, will soon be added. This constellation experiences schedule changes and outages of various durations, making the set of available scenes for fire detection highly variable on an hourly and daily basis. Adding to the complexity, the latency of the satellite data is variable between and within sensors. FLAMBE shares with many fire detection systems the goal of detecting as many fires as possible as early as possible, but the FLAMBE system must also produce a consistent estimate of smoke production with minimal artifacts from the changing constellation. To achieve this, NRL has developed a system of asynchronous processing and cross-calibration that permits satellite data to be used as it arrives, while preserving the consistency of the smoke emission estimates. This talk describes the asynchronous data ingest methodology, including latency statistics for the constellation. We also provide an overview and show results from the system we have developed to normalize multi-sensor fire detection for consistency.
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus reveals more about the condition of the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring are analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to accommodate additional dimension reduction methods. In this work, the incorporation of Laplacian Eigenmap and Principal Component Analysis (PCA) is explored; the latter algorithm, named PCA-Dev, is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated, and improved results are reported by benchmarking against other machine learning algorithms.
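The deviation-based idea behind a PCA health indicator can be sketched as follows: fit PCA to healthy-state data and use the reconstruction error of a new sample as its abnormality score. This is a minimal illustration on synthetic data under assumed thresholds, not the authors' DM-EVD or PCA-Dev implementation.

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA on healthy-state data X (n_samples x n_features)."""
    mu = X.mean(axis=0)
    # Eigen-decomposition of the covariance matrix (eigh: ascending order).
    vals, vecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    W = vecs[:, ::-1][:, :k]          # top-k principal directions
    return mu, W

def deviation(x, mu, W):
    """Reconstruction error of one sample: distance to the PCA subspace."""
    xc = x - mu
    recon = W @ (W.T @ xc)
    return float(np.linalg.norm(xc - recon))

rng = np.random.default_rng(0)
# Healthy data lies near a 1-D line in 3-D feature space.
t = rng.normal(size=(200, 1))
healthy = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(200, 3))
mu, W = pca_fit(healthy, k=1)

d_ok = deviation(np.array([1.0, 2.0, -1.0]), mu, W)   # on the healthy subspace
d_bad = deviation(np.array([1.0, -2.0, 1.0]), mu, W)  # off-subspace: degraded
```

A sample far from the healthy subspace gets a large deviation score, which is the abnormality-detection signal such methods threshold.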
Detecting Biosphere Anomaly Hotspots
NASA Astrophysics Data System (ADS)
Guanche-Garcia, Yanira; Mahecha, Miguel; Flach, Milan; Denzler, Joachim
2017-04-01
The amount of satellite remote sensing measurements now available allows data-driven methods to be applied to the investigation of environmental processes. The detection of anomalies or abnormal events is crucial for monitoring the Earth system and analyzing their impacts on ecosystems and society. By means of a combination of statistical methods, this study proposes an intuitive and efficient methodology to detect areas that present hotspots of anomalies, i.e., higher levels of abnormal or extreme events, or more severe phases, during our historical records. Biosphere variables from a preliminary version of the Earth System Data Cube developed within the CAB-LAB project (http://earthsystemdatacube.net/) have been used in this study. This database comprises several atmosphere and biosphere variables spanning 11 years (2001-2011) at 8-day temporal resolution and 0.25° global spatial resolution. In this study, we have used 10 variables that measure the biosphere. The methodology applied to detect abnormal events follows the intuitive idea that anomalies are time steps that are not well represented by a previously estimated statistical model [1]. We combine Autoregressive Moving Average (ARMA) models with a distance metric such as the Mahalanobis distance to detect abnormal events in multiple biosphere variables. In a first step, we pre-treat the variables by removing the seasonality and normalizing them locally (μ=0, σ=1). Additionally, we have regionalized the area of study into subregions of similar climate conditions using the Köppen climate classification. For each climate region and variable we have selected the best ARMA parameters by means of a Bayesian criterion. We have then obtained the residuals by comparing the fitted models with the original data.
To detect the extreme residuals from the 10 variables, we have computed the Mahalanobis distance to the data's mean (Hotelling's T²), which takes the covariance matrix of the joint distribution into account. The proposed methodology has been applied to different areas around the globe. The results show that the method is able to detect historic events and also provides a useful tool for defining sensitive regions. This method and these results have been developed within the framework of the BACI project (http://baci-h2020.eu/), which aims to integrate Earth Observation data to monitor the Earth system and assess the impacts of terrestrial changes. [1] V. Chandola, A. Banerjee and V. Kumar. Anomaly detection: a survey. ACM Computing Surveys (CSUR), vol. 41, no. 3, 2009. [2] P. Mahalanobis. On the generalised distance in statistics. Proceedings of the National Institute of Sciences, vol. 2, pp. 49-55, 1936.
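The residual-plus-distance pipeline described above can be sketched in a few lines: fit a simple autoregressive model per variable, collect the residuals, and score each time step by its Mahalanobis distance to the residual mean. This is a deliberately simplified stand-in (an AR(1) fitted by least squares instead of full ARMA order selection by a Bayesian criterion), and all data are synthetic.

```python
import numpy as np

def ar1_residuals(x):
    """Least-squares AR(1) fit; return one-step-ahead prediction residuals."""
    phi = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
    return x[1:] - phi * x[:-1]

def mahalanobis(R):
    """Distance of each joint residual vector to the mean (Hotelling-style)."""
    mu = R.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(R, rowvar=False))
    d = R - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

rng = np.random.default_rng(1)
n = 300
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):                       # two AR(1) "biosphere" series
    x1[t] = 0.7 * x1[t - 1] + rng.normal()
    x2[t] = 0.5 * x2[t - 1] + rng.normal()
x1[200] += 8.0                              # injected abnormal event
R = np.column_stack([ar1_residuals(x1), ar1_residuals(x2)])
dist = mahalanobis(R)
anomaly = int(np.argmax(dist))              # residual index 199 <-> x[200]
```

The injected event produces a residual that the fitted model cannot explain, so its Mahalanobis distance dominates the series.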
Time Series UAV Image-Based Point Clouds for Landslide Progression Evaluation Applications
Al-Rawabdeh, Abdulla; Moussa, Adel; Foroutan, Marzieh; El-Sheimy, Naser; Habib, Ayman
2017-10-18
Landslides are major and constantly changing threats to urban landscapes and infrastructure. It is essential to detect and capture landslide changes regularly. Traditional methods for monitoring landslides are time-consuming, costly, dangerous, and the quality and quantity of the data are sometimes unable to meet the necessary requirements of geotechnical projects. This motivates the development of more automatic and efficient remote sensing approaches for landslide progression evaluation. Automatic change detection involving low-altitude unmanned aerial vehicle image-based point clouds, although proven, is relatively unexplored, and little research has been done in terms of accounting for volumetric changes. In this study, a methodology for automatically deriving change displacement rates in a horizontal direction, based on comparisons between landslide scarps extracted from multiple time periods, has been developed. Compared with the iterative closest projected point (ICPP) registration method, the developed method takes full advantage of automated geometric measuring, leading to fast processing. The proposed approach easily processes a large number of images from different epochs and enables the creation of registered image-based point clouds without the use of extensive ground control point information or further processing such as interpretation and image correlation. The produced results are promising for use in the field of landslide research. PMID:29057847
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobi, Robert
2007-03-28
This Topical Report (#6 of 9) consists of the figures 3.6-13 to (and including) 3.6-18 (and appropriate figure captions) that accompany the Final Technical Progress Report entitled: “Innovative Methodology for Detection of Fracture-Controlled Sweet Spots in the Northern Appalachian Basin” for DOE/NETL Award DE-AC26-00NT40698.
Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal
2015-07-01
Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Changes in fat oxidation in response to various regimes of high intensity interval training (HIIT).
Astorino, Todd Anthony; Schubert, Matthew M
2018-01-01
Increased whole-body fat oxidation (FOx) has been consistently demonstrated in response to moderate intensity continuous exercise training. Completion of high intensity interval training (HIIT) and its more intense form, sprint interval training (SIT), has also been reported to increase FOx in different populations. An explanation for this increase in FOx is primarily peripheral adaptations via improvements in mitochondrial content and function. However, studies examining changes in FOx are less common in response to HIIT or SIT than those determining increases in maximal oxygen uptake which is concerning, considering that FOx has been identified as a predictor of weight gain and glycemic control. In this review, we explored physiological and methodological issues underpinning existing literature concerning changes in FOx in response to HIIT and SIT. Our results show that completion of interval training increases FOx in approximately 50% of studies, with the frequency of increased FOx higher in response to studies using HIIT compared to SIT. Significant increases in β-HAD, citrate synthase, fatty acid binding protein, or FAT/CD36 are likely responsible for the greater FOx seen in these studies. We encourage scientists to adopt strict methodological procedures to attenuate day-to-day variability in FOx, which is dramatic, and develop standardized procedures for assessing FOx, which may improve detection of changes in FOx in response to HIIT.
Effectively identifying user profiles in network and host metrics
NASA Astrophysics Data System (ADS)
Murphy, John P.; Berk, Vincent H.; Gregorio-de Souza, Ian
2010-04-01
This work presents a collection of methods used to effectively identify users of computer systems based on their particular usage of software and the network. Not only are we able to identify individual computer users by their behavioral patterns, we are also able to detect significant deviations in their typical computer usage over time, or compared to a group of their peers. For instance, most people have a small and relatively unique selection of regularly visited websites, certain email services, daily work hours, and typical preferred applications for mandated tasks. We argue that these habitual patterns are sufficiently specific to identify fully anonymized network users. We demonstrate that with only a modest data collection capability, profiles of individual computer users can be constructed so as to uniquely identify a profiled user from among their peers. As time progresses and habits or circumstances change, the methods presented update each profile so that changes in user behavior can be reliably detected over both abrupt and gradual time frames, without losing the ability to identify the profiled user. The primary benefit of our methodology is that it allows one to efficiently detect deviant behaviors, such as subverted user accounts or organizational policy violations. Thanks to their relative robustness, these techniques can be used in scenarios with very diverse data collection capabilities and data privacy requirements. In addition to behavioral change detection, the generated profiles can also be compared against pre-defined examples of known adversarial patterns.
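A minimal sketch of the profile-matching idea: represent habitual behaviour as a frequency vector over visited sites and applications, then identify a session by cosine similarity to stored profiles. All names and counts below are hypothetical, and this is one simple similarity measure, not the paper's method.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse frequency vectors (dicts)."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

profiles = {                                  # hypothetical learned profiles
    "alice": {"news.example": 9, "mail": 5, "ide": 20},
    "bob": {"video.example": 12, "mail": 2, "game": 7},
}
session = {"ide": 15, "mail": 4, "news.example": 6}   # anonymised session
best = max(profiles, key=lambda name: cosine(profiles[name], session))
```

A session whose best similarity falls below all stored profiles, or whose best match changes abruptly, would be the kind of deviation the paper's methods flag.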
Papadimitropoulos, Adam; Rovithakis, George A; Parisini, Thomas
2007-07-01
In this paper, the problem of fault detection in mechanical systems performing linear motion under the action of friction phenomena is addressed. The friction effects are modeled through the dynamic LuGre model. The proposed architecture is built upon an online neural network (NN) approximator, which requires only the system's position and velocity. The friction internal state is not assumed to be available for measurement. The neural fault detection methodology is analyzed with respect to its robustness and sensitivity properties. Rigorous fault detectability conditions and upper bounds for the detection time are also derived. Extensive simulation results showing the effectiveness of the proposed methodology are provided, including a real case study on an industrial actuator.
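The underlying residual-based detection logic can be sketched very simply: predict the next velocity with a healthy nominal model and declare a fault when the prediction residual exceeds a robustness threshold. Here a fixed viscous-friction model stands in for the paper's online NN approximator, and the LuGre dynamics are not modelled; all parameters are illustrative.

```python
def detect_fault(velocities, inputs, dt=0.01, mass=1.0, b=0.5, threshold=0.3):
    """Return the first time index at which the residual exceeds threshold."""
    for k in range(1, len(velocities)):
        # One-step prediction with viscous friction only (healthy model).
        v_hat = velocities[k-1] + dt * (inputs[k-1] - b * velocities[k-1]) / mass
        if abs(velocities[k] - v_hat) > threshold:
            return k
    return None

# Simulated run: the plant matches the healthy model until an abrupt extra
# friction force (the fault) appears at step 50.
dt, m, b = 0.01, 1.0, 0.5
u = [1.0] * 100
v = [0.0]
for k in range(1, 100):
    fault = 50.0 if k >= 50 else 0.0
    v.append(v[-1] + dt * (u[k-1] - b * v[-1] - fault) / m)
k_detect = detect_fault(v, u, dt=dt, mass=m, b=b)
```

The threshold plays the role of the robustness bound: residuals below it are attributed to model uncertainty, residuals above it to a fault.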
Effects-Based Operations in the Cyber Domain
2017-05-03
as the joint targeting methodology. The description that Batschelet gave of the traditional targeting methodology included a process of "Decide, Detect...technology, requires new planning and methodology to fight back. This paper evaluates current Department of Defense doctrine to look at ways to conduct...developing its cyber tactics, techniques, and procedures, which includes various targeting methodologies, such as the use of effects-based
Label-Free Aptasensors for the Detection of Mycotoxins
Rhouati, Amina; Catanante, Gaelle; Nunes, Gilvanda; Hayat, Akhtar; Marty, Jean-Louis
2016-01-01
Various methodologies have been reported in the literature for the qualitative and quantitative monitoring of mycotoxins in food and feed samples. Based on their enhanced specificity, selectivity and versatility, bio-affinity assays have inspired many researchers to develop sensors by exploring bio-recognition phenomena. However, a significant problem in the fabrication of these devices is that most biomolecules do not generate an easily measurable signal upon binding to the target analytes, so signal-generating labels are required to perform the measurements. In this context, aptamers have emerged as a potential and attractive bio-recognition element for designing label-free aptasensors for various target analytes. Contrary to other bioreceptor-based approaches, aptamer-based assays rely on antigen-binding-induced conformational changes or oligomerization states rather than binding-assisted changes in adsorbed mass or charge. This review focuses on current label-free conformational switching design strategies, with a particular emphasis on applications in the detection of mycotoxins. PMID:27999353
NASA Technical Reports Server (NTRS)
Buchanan, Vanessa D.; Woods, Brenton; Harper, Susana A.; Beeson, Harold D.; Perez, Horacio; Ryder, Valerie; Tapia, Alma S.; Pedley, Michael D.
2017-01-01
NASA-STD-6001B states "all nonmetals tested in accordance with NASA-STD-6001 should be retested every 10 years or as required by the responsible program/project." The retesting of materials helps ensure the most accurate data are used in material selection. Manufacturer formulas and processes can change over time, sometimes without an update to product number and material information. Material performance in certain NASA-STD-6001 tests can be particularly vulnerable to these changes, such as material offgas (Test 7). In addition, Test 7 analysis techniques at NASA White Sands Test Facility were dramatically enhanced in the early 1990s, resulting in improved detection capabilities. Low level formaldehyde identification was improved again in 2004. Understanding the limitations in offgas analysis data prior to 1990 puts into question the validity and current applicability of that data. Case studies on Super Koropon (Registered trademark) and Aeroglaze (Registered trademark) topcoat highlight the importance of material retesting.
Ferreira, Martiña; Blanco, Lucía; Garrido, Alejandro; Vieites, Juan M; Cabado, Ana G
2013-05-01
The toxic effects of the organotin compounds (OTCs) monobutyltin (MBT), dibutyltin (DBT), and tributyltin (TBT) were evaluated in vitro in a neuroblastoma human cell line. Mechanisms of cell death, apoptosis versus necrosis, were studied by using several markers: inhibition of cell viability and proliferation, F-actin, and mitochondrial membrane potential changes as well as reactive oxygen species (ROS) production and DNA fragmentation. The most toxic effects were detected with DBT and TBT even at very low concentrations (0.1-1 μM). In contrast, MBT induced lighter cytotoxic changes at the higher doses tested. None of the studied compounds stimulated propidium iodide uptake, although the most toxic chemical, TBT, caused lactate dehydrogenase release at the higher concentrations tested. These findings suggest that in neuroblastoma, OTC-induced cytotoxicity involves different pathways depending on the compound, concentration, and incubation time. A screening method for DBT and TBT quantification based on cell viability loss was developed, allowing a fast detection alternative to complex methodology.
Detection of white matter lesion regions in MRI using SLIC0 and convolutional neural network.
Diniz, Pedro Henrique Bandeira; Valente, Thales Levi Azevedo; Diniz, João Otávio Bandeira; Silva, Aristófanes Corrêa; Gattass, Marcelo; Ventura, Nina; Muniz, Bernardo Carvalho; Gasparetto, Emerson Leandro
2018-04-19
White matter lesions are non-static brain lesions that have a prevalence rate of up to 98% in the elderly population. Because they may be associated with several brain diseases, it is important that they are detected as soon as possible. Magnetic Resonance Imaging (MRI) provides three-dimensional data with the possibility to detect and emphasize contrast differences in soft tissues, providing rich information about human soft tissue anatomy. However, the amount of data these images provide is far too large for manual analysis and interpretation, representing a difficult and time-consuming task for specialists. This work presents a computational methodology capable of detecting regions of white matter lesions of the brain in MRI of the FLAIR modality. The techniques highlighted in this methodology are SLIC0 clustering for candidate segmentation and convolutional neural networks for candidate classification. The methodology proposed here consists of four steps: (1) image acquisition, (2) image preprocessing, (3) candidate segmentation and (4) candidate classification. The methodology was applied to 91 magnetic resonance images provided by DASA, and achieved an accuracy of 98.73%, specificity of 98.77% and sensitivity of 78.79% with 0.005 false positives, without any false-positive reduction technique, in detection of white matter lesion regions. This demonstrates the feasibility of analyzing brain MRI using SLIC0 and convolutional neural network techniques to successfully detect white matter lesion regions. Copyright © 2018. Published by Elsevier B.V.
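The segment-then-classify pipeline can be illustrated with a toy stand-in: SLIC-style superpixel segmentation is essentially a localized k-means over intensity and position, and a simple mean-intensity rule stands in for the CNN classifier. Nothing here reproduces the paper's actual SLIC0 or network; the image is synthetic.

```python
import numpy as np

def slic_like(img, k=4, iters=10, compactness=0.1):
    """Tiny SLIC-style clustering: k-means over (intensity, y, x) features."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([img.ravel(),
                      compactness * ys.ravel() / h,
                      compactness * xs.ravel() / w], axis=1)
    centers = feats[np.linspace(0, len(feats) - 1, k).astype(int)]
    for _ in range(iters):
        d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():          # skip empty clusters
                centers[j] = feats[labels == j].mean(0)
    return labels.reshape(h, w)

# Synthetic FLAIR-like slice: dark background with one bright "lesion" patch.
img = np.zeros((32, 32))
img[10:16, 20:26] = 1.0
labels = slic_like(img, k=4)
# Stand-in for the CNN stage: call a candidate cluster a lesion if its mean
# intensity is high.
lesion_labels = [j for j in range(4)
                 if (labels == j).any() and img[labels == j].mean() > 0.5]
```

In the real pipeline each superpixel patch would be fed to a trained CNN rather than thresholded on intensity.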
Risk assessment methodology applied to counter IED research & development portfolio prioritization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel W; O' Brien, David A; Zerkle, David K
2009-01-01
In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model using an Excel spreadsheet containing weights for the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.
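The spreadsheet aggregation step can be sketched as a weighted scoring of quad charts. The criteria names, weights and scores below are hypothetical, and a flat weighted sum is a simplification of the ELM/INFTree evidential-reasoning model the report actually couples.

```python
# Hypothetical criteria weights, loosely following the program's five goals.
WEIGHTS = {"deter": 0.15, "predict": 0.20, "detect": 0.30,
           "defeat": 0.20, "mitigate": 0.15}

def rank(quad_charts):
    """Aggregate weighted subjective merit scores and rank descending."""
    scored = [(sum(WEIGHTS[c] * s for c, s in chart["scores"].items()),
               chart["name"]) for chart in quad_charts]
    return [name for score, name in sorted(scored, reverse=True)]

charts = [
    {"name": "standoff-detector",
     "scores": {"deter": 3, "predict": 2, "detect": 9,
                "defeat": 4, "mitigate": 5}},
    {"name": "jammer",
     "scores": {"deter": 6, "predict": 1, "detect": 2,
                "defeat": 9, "mitigate": 4}},
]
ranking = rank(charts)
```

The point of such an aggregate score is exactly what the abstract describes: putting very different technologies on one common, comparable scale.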
76 FR 55804 - Dicamba; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... Considerations A. Analytical Enforcement Methodology Adequate enforcement methodologies, Methods I and II--gas chromatography with electron capture detection (GC/ECD), are available to enforce the tolerance expression. The...
Qualitative and Quantitative Analysis of Histone Deacetylases in Kidney Tissue Sections.
Ververis, Katherine; Marzully, Selly; Samuel, Chrishan S; Hewitson, Tim D; Karagiannis, Tom C
2016-01-01
Fluorescent microscope imaging technologies are increasing in their applications and are being used on a wide scale. However, methods to quantify fluorescence intensity are often not utilized; perhaps because the result can be seen immediately, quantification of the data may not seem necessary. There are, however, several reasons to quantify fluorescent images, including removing potential observer bias and gaining, through quantification of large numbers of images, the statistical power to detect subtle changes in experiments. In addition, discrete localization of a protein that may not be detectable by eye can be identified without selection bias. Such data are useful for measuring the levels of HDAC enzymes within cells, in order to develop more effective HDAC inhibitor compounds against multiple disease states. Hence, we discuss a methodology devised to analyze fluorescent images using ImageJ to measure the mean fluorescence intensity of the 11 metal-dependent HDAC enzymes, using murine kidney tissue sections as an example.
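The core measurement here, mean fluorescence intensity over signal pixels, is simple to state in code. This is a toy stand-in for the ImageJ workflow with an illustrative 3x3 image and an assumed background threshold.

```python
def mean_fluorescence(pixels, threshold=10):
    """Mean intensity of pixels above a background threshold."""
    signal = [p for row in pixels for p in row if p > threshold]
    return sum(signal) / len(signal) if signal else 0.0

# Toy 3x3 image: three signal pixels (120, 200, 160) over dark background.
image = [[0, 0, 120],
         [0, 200, 160],
         [0, 0, 0]]
mfi = mean_fluorescence(image)
```

Averaging many such measurements across sections is what gives the statistical power to detect the subtle intensity changes mentioned above.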
Planetary Transmission Diagnostics
NASA Technical Reports Server (NTRS)
Lewicki, David G. (Technical Monitor); Samuel, Paul D.; Conroy, Joseph K.; Pines, Darryll J.
2004-01-01
This report presents a methodology for detecting and diagnosing gear faults in the planetary stage of a helicopter transmission. This diagnostic technique is based on the constrained adaptive lifting algorithm. The lifting scheme, developed by Wim Sweldens of Bell Labs, is a time domain, prediction-error realization of the wavelet transform that allows for greater flexibility in the construction of wavelet bases. Classic lifting analyzes a given signal using wavelets derived from a single fundamental basis function. A number of researchers have proposed techniques for adding adaptivity to the lifting scheme, allowing the transform to choose from a set of fundamental bases the basis that best fits the signal. This characteristic is desirable for gear diagnostics as it allows the technique to tailor itself to a specific transmission by selecting a set of wavelets that best represent vibration signals obtained while the gearbox is operating under healthy-state conditions. However, constraints on certain basis characteristics are necessary to enhance the detection of local wave-form changes caused by certain types of gear damage. The proposed methodology analyzes individual tooth-mesh waveforms from a healthy-state gearbox vibration signal that was generated using the vibration separation (synchronous signal-averaging) algorithm. Each waveform is separated into analysis domains using zeros of its slope and curvature. The bases selected in each analysis domain are chosen to minimize the prediction error, and constrained to have the same-sign local slope and curvature as the original signal. The resulting set of bases is used to analyze future-state vibration signals and the lifting prediction error is inspected. The constraints allow the transform to effectively adapt to global amplitude changes, yielding small prediction errors. 
However, local wave-form changes associated with certain types of gear damage are poorly adapted, causing a significant change in the prediction error. The constrained adaptive lifting diagnostic algorithm is validated using data collected from the University of Maryland Transmission Test Rig and the results are discussed.
NASA Astrophysics Data System (ADS)
Heathfield, D.; Walker, I. J.; Grilliot, M. J.
2016-12-01
The recent emergence of terrestrial laser scanning (TLS) and unmanned aerial systems (UAS) as mapping platforms in geomorphology research has allowed for expedited acquisition of high spatial and temporal resolution, three-dimensional topographic datasets. TLS provides dense 3D 'point cloud' datasets that require careful acquisition strategies and appreciable post-processing to produce accurate digital elevation models (DEMs). UAS provide overlapping nadir and oblique imagery that can be analysed using Structure from Motion (SfM) photogrammetry software to provide accurate, high-resolution orthophoto mosaics and accurate digital surface models (DSMs). Both methods yield centimeter to decimeter scale accuracy, depending on various hardware and field acquisition considerations (e.g., camera resolution, flight height, on-site GNSS control, etc.). Combined, the UAS-SfM workflow provides a comparable and more affordable alternative to the more expensive TLS or aerial LiDAR methods. This paper compares and contrasts SfM and TLS survey methodologies and related workflow costs and benefits as used to quantify and examine seasonal beach-dune erosion and recovery processes at a site (Calvert Island) on British Columbia's central coast in western Canada. Seasonal SfM- and TLS-derived DEMs were used to quantify spatial patterns of surface elevation change, geomorphic responses, and related significant sediment volume changes. Cluster maps of positive (depositional) and negative (erosional) change are analysed to detect and interpret the geomorphic and sediment budget responses following an erosive water level event during the winter 2016 season (Oct. 2015 - Apr. 2016). Vantage cameras also provided qualitative data on the frequency and magnitude of environmental drivers (e.g., tide, wave, wind forcing) of erosion and deposition events during the observation period. In addition, we evaluate the costs, time expenditures, and accuracy considerations for both SfM and TLS methodologies.
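The change-detection arithmetic behind comparing seasonal DEMs can be sketched as a DEM of difference with a level-of-detection (LoD) threshold: elevation changes smaller than the survey accuracy are discarded, and the rest are summed into deposition and erosion volumes. The grids and threshold below are illustrative, not the study's data.

```python
def volume_change(dem_t0, dem_t1, cell_area=1.0, lod=0.05):
    """Return (deposition, erosion) volumes above the detection threshold."""
    dep = ero = 0.0
    for r0, r1 in zip(dem_t0, dem_t1):
        for z0, z1 in zip(r0, r1):
            dz = z1 - z0
            if dz > lod:                 # surface raised: deposition
                dep += dz * cell_area
            elif dz < -lod:              # surface lowered: erosion
                ero += -dz * cell_area
    return dep, ero

before = [[1.0, 1.0],
          [1.0, 1.0]]
after = [[1.5, 1.0],
         [0.8, 1.02]]                    # the 0.02 m change is below the LoD
dep, ero = volume_change(before, after)
```

Choosing the LoD from the combined accuracy of the two surveys (TLS or SfM) is what separates real geomorphic change from survey noise.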
A new method for measuring mechanical properties of laryngeal mucosa.
Hemler, R J; Wieneke, G H; van Riel, A M; Lebacq, J; Dejonckere, P H
2001-03-01
A study of the effect of exogenous hazardous agents or conditions on the mechanical characteristics of vocal fold mucosa should meet three methodological criteria. 1) The outer surface of the mucosa should be exposed to the agent or condition while the inner surface is exposed to a physiological environment. 2) Even slight changes in mechanical characteristics should be detected. 3) The applied strain should be within physiological ranges. To date, no such method has been described in the literature. A method meeting the listed criteria is proposed and evaluated here.
Aghajari, Rozita; Azadbakht, Azadeh
2018-04-15
A streptomycin-specific aptamer was used as a receptor molecule for ultrasensitive quantitation of streptomycin. The glassy carbon (GC) electrode was modified with palladium nanoparticles decorated on chitosan-carbon nanotube (PdNPs/CNT/Chi) and an aminated aptamer against streptomycin. Modification of the sensing interface was characterized by scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDS), wavelength-dispersive X-ray spectroscopy (WDX), cyclic voltammetry (CV), and electrochemical impedance spectroscopy (EIS). The methodology applied for designing the proposed biosensor is based on target-induced conformational changes of the streptomycin-specific aptamer, leading to a detectable signal change. Sensing experiments were performed in the streptomycin concentration range from 0.1 to 1500 nM in order to evaluate the sensor response as a function of streptomycin concentration. Based on the results, the charge transfer resistance (Rct) values increased proportionally with streptomycin content. The limit of detection was found to be as low as 18 pM. The superior selectivity and affinity of the aptamer/PdNPs/CNT/Chi modified electrode for streptomycin recognition make it favorable for versatile applications such as streptomycin analysis in real samples. Copyright © 2018 Elsevier Inc. All rights reserved.
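The calibration arithmetic behind a limit of detection can be sketched as a linear fit of the Rct response versus log concentration, with LOD estimated by the common 3·σ_blank/slope rule. The responses, blank noise, and the 3σ convention below are illustrative assumptions, not the paper's data (which reported 18 pM).

```python
import math

def linfit(x, y):
    """Ordinary least-squares line fit: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [1.0, 10.0, 100.0, 1000.0]             # nM (hypothetical standards)
rct = [105.0, 210.0, 315.0, 420.0]            # ohms, linear in log10(conc)
slope, intercept = linfit([math.log10(c) for c in conc], rct)

sigma_blank = 2.0                             # assumed blank noise (ohms)
lod_log = 3 * sigma_blank / slope             # LOD in log10-concentration units
```

The steeper the calibration slope relative to the blank noise, the lower the detectable concentration, which is why signal-amplifying nanomaterials improve the LOD.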
GIS applied to location of fires detection towers in domain area of tropical forest.
Eugenio, Fernando Coelho; Rosa Dos Santos, Alexandre; Fiedler, Nilton Cesar; Ribeiro, Guido Assunção; da Silva, Aderbal Gomes; Juvanhol, Ronie Silva; Schettino, Vitor Roberto; Marcatti, Gustavo Eduardo; Domingues, Getúlio Fonseca; Alves Dos Santos, Gleissy Mary Amaral Dino; Pezzopane, José Eduardo Macedo; Pedra, Beatriz Duguy; Banhos, Aureo; Martins, Lima Deleon
2016-08-15
In most countries, the loss of biodiversity caused by fires is worrying. In this sense, fire detection towers are crucial for rapid identification of fire outbreaks and can also be used in environmental inspection, biodiversity monitoring, telecommunications, telemetry and other applications. Currently, the methodologies for allocating fire detection towers over large areas are numerous, complex and not standardized by government supervisory agencies. Therefore, this study proposes and evaluates different methodologies to find the best locations to install fire detection towers, considering the topography, risk areas, conservation units and heat spots. Geographic Information Systems (GIS) techniques and unaligned stratified systematic sampling were used to implement and evaluate 9 methods for allocating fire detection towers. Among the methods evaluated, the C3 method was chosen, represented by 140 fire detection towers, with coverage of: a) 67% of the study area, b) 73.97% of the areas with high risk, c) 70.41% of the areas with very high risk, d) 70.42% of the conservation units and e) 84.95% of the heat spots in 2014. The proposed methodology can be adapted to areas in other countries. Copyright © 2016 Elsevier B.V. All rights reserved.
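Coverage figures like the ones reported (e.g., 84.95% of heat spots) come down to a point-in-radius test over candidate tower sites. A minimal sketch with hypothetical coordinates and a hypothetical visibility radius (a real GIS analysis would also account for terrain visibility):

```python
import math

def coverage(towers, spots, radius):
    """Fraction of heat spots within the visibility radius of any tower."""
    covered = sum(1 for s in spots
                  if any(math.dist(s, t) <= radius for t in towers))
    return covered / len(spots)

towers = [(0.0, 0.0), (10.0, 10.0)]           # hypothetical tower sites (km)
spots = [(1.0, 1.0), (9.0, 9.5), (20.0, 20.0), (0.0, 3.0)]
frac = coverage(towers, spots, radius=5.0)    # 3 of 4 spots covered
```

Comparing this fraction across candidate tower layouts is the kind of evaluation used to choose among the 9 allocation methods.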
Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.
Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania
2016-04-01
The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
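The sensitivity of detected AR levels to gene length and sequencing depth, noted above, is easy to see with an RPKM-style normalisation (a generic metagenomics formula, not necessarily the study's exact metric):

```python
def rpkm(mapped_reads, gene_length_bp, total_reads):
    """Reads per kilobase of gene per million sequenced reads."""
    return mapped_reads / (gene_length_bp / 1e3) / (total_reads / 1e6)

# The same 50 mapped reads imply very different abundances once the gene
# length and library size are accounted for.
a = rpkm(50, gene_length_bp=1000, total_reads=1_000_000)
b = rpkm(50, gene_length_bp=2000, total_reads=10_000_000)
```

Without such normalisation, raw read counts from samples with different depths or from genes of different lengths are not comparable, which is one of the quantification pitfalls the study examines.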
Bourantas, Christos V; Papafaklis, Michail I; Athanasiou, Lambros; Kalatzis, Fanis G; Naka, Katerina K; Siogkas, Panagiotis K; Takahashi, Saeko; Saito, Shigeru; Fotiadis, Dimitrios I; Feldman, Charles L; Stone, Peter H; Michalis, Lampros K
2013-09-01
To develop and validate a new methodology that allows accurate 3-dimensional (3-D) coronary artery reconstruction using standard, simple angiographic and intravascular ultrasound (IVUS) data acquired during routine catheterisation enabling reliable assessment of the endothelial shear stress (ESS) distribution. Twenty-two patients (22 arteries: 7 LAD; 7 LCx; 8 RCA) who underwent angiography and IVUS examination were included. The acquired data were used for 3-D reconstruction using a conventional method and a new methodology that utilised the luminal 3-D centreline to place the detected IVUS borders and anatomical landmarks to estimate their orientation. The local ESS distribution was assessed by computational fluid dynamics. In corresponding consecutive 3 mm segments, lumen, plaque and ESS measurements in the 3-D models derived by the centreline approach were highly correlated to those derived from the conventional method (r>0.98 for all). The centreline methodology had a 99.5% diagnostic accuracy for identifying segments exposed to low ESS and provided similar estimations to the conventional method for the association between the change in plaque burden and ESS (centreline method: slope= -1.65%/Pa, p=0.078; conventional method: slope= -1.64%/Pa, p=0.084; p =0.69 for difference between the two methodologies). The centreline methodology provides geometrically correct models and permits reliable ESS computation. The ability to utilise data acquired during routine coronary angiography and IVUS examination will facilitate clinical investigation of the role of local ESS patterns in the natural history of coronary atherosclerosis.
Lubelchek, Ronald J.; Max, Blake; Sandusky, Caroline J.; Hota, Bala; Barker, David E.
2009-01-01
Introduction To explore whether an assay change was responsible for an increasing proportion of patients with undetectable HIV viral loads at our urban HIV clinic, we selected highly stable patients, examining their viral loads before and after changing assays. We compared the proportion with detectable viremia during RT-PCR vs. bDNA periods. Methodology/Principal Findings We selected patients with ≥1 viral loads assessed during both RT-PCR and bDNA periods. We included patients with stable CD4 counts, excluding patients with viral loads ≥1,000 copies/ml or any significant changes in therapy. Out of 4500 clinic patients, 419 patients (1588 viral loads) were included. 39% of viral loads were reported as detectable by RT-PCR vs. 5% reported as detectable by bDNA. The mean coefficient of variation was higher before vs. after the assay change. We found an odds ratio of 16.7 for having a viral load >75 copies/ml during the RT-PCR vs. bDNA periods. Discussion These data support previous reports, suggesting that bDNA may more reliably discriminate between viral suppression and low-level viremia in stable patients on therapy. Low-level viremia, noted more with RT-PCR, may promote unneeded testing, while differences in viral load reliability may impact antiretroviral trial and quality assurance endpoints. Commonly used plasma separator tubes may differentially affect RT-PCR and bDNA results. PMID:19547711
A new scenario-based approach to damage detection using operational modal parameter estimates
NASA Astrophysics Data System (ADS)
Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.
2017-09-01
In this paper a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of experimentally determined, mass-normalized mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations that are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.
Etheridge, Thomas J.; Boulineau, Rémi L.; Herbert, Alex; Watson, Adam T.; Daigaku, Yasukazu; Tucker, Jem; George, Sophie; Jönsson, Peter; Palayret, Matthieu; Lando, David; Laue, Ernest; Osborne, Mark A.; Klenerman, David; Lee, Steven F.; Carr, Antony M.
2014-01-01
Development of single-molecule localization microscopy techniques has allowed nanometre scale localization accuracy inside cells, permitting the resolution of ultra-fine cell structure and the elucidation of crucial molecular mechanisms. Application of these methodologies to understanding processes underlying DNA replication and repair has been limited to defined in vitro biochemical analysis and prokaryotic cells. In order to expand these techniques to eukaryotic systems, we have further developed a photo-activated localization microscopy-based method to directly visualize DNA-associated proteins in unfixed eukaryotic cells. We demonstrate that motion blurring of fluorescence due to protein diffusivity can be used to selectively image the DNA-bound population of proteins. We designed and tested a simple methodology and show that it can be used to detect changes in DNA binding of a replicative helicase subunit, Mcm4, and the replication sliding clamp, PCNA, between different stages of the cell cycle and between distinct genetic backgrounds. PMID:25106872
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including its principle, manipulation, and analysis. Applications of AFM in food science and technology research are then reported, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide insightful knowledge of food properties, and AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodologies for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and enlarge their applications. The AFM results could greatly improve food processing and storage technologies.
Rapid Detection of Ebola Virus with a Reagent-Free, Point-of-Care Biosensor
Baca, Justin T.; Severns, Virginia; Lovato, Debbie; Branch, Darren W.; Larson, Richard S.
2015-01-01
Surface acoustic wave (SAW) sensors can rapidly detect Ebola antigens at the point-of-care without the need for added reagents, sample processing, or specialized personnel. This preliminary study demonstrates SAW biosensor detection of the Ebola virus in a concentration-dependent manner. The detection limit with this methodology is below the average level of viremia detected on the first day of symptoms by PCR. We observe a log-linear sensor response for highly fragmented Ebola viral particles, with a detection limit corresponding to 1.9 × 104 PFU/mL prior to virus inactivation. We predict greatly improved sensitivity for intact, infectious Ebola virus. This point-of-care methodology has the potential to detect Ebola viremia prior to symptom onset, greatly enabling infection control and rapid treatment. This biosensor platform is powered by disposable AA batteries and can be rapidly adapted to detect other emerging diseases in austere conditions. PMID:25875186
Towards improved NDE and SHM methodologies incorporating nonlinear structural features
NASA Astrophysics Data System (ADS)
Chillara, Vamshi Krishna
Ultrasound is widely employed in Nondestructive Evaluation (NDE) and Structural Health Monitoring (SHM) applications to detect and characterize damage/defects in materials. In particular, ultrasonic guided waves are considered a foremost candidate for in-situ monitoring applications. Conventional ultrasonic techniques rely on changes/discontinuities in linear elastic material properties, namely the Young's modulus and shear modulus, to detect damage. On the other hand, nonlinear ultrasonic techniques that rely on micro-scale nonlinear material/structural behavior have proven to be sensitive to damage-induced microstructural changes that precede macro-scale damage and are hence capable of early damage detection. The goal of this thesis is to investigate the capabilities of nonlinear guided waves --- a fusion of nonlinear ultrasonic techniques with guided wave methodologies for early damage detection. To that end, the thesis focuses on two important aspects of the problem: 1. Wavemechanics - deals with ultrasonic guided wave propagation in nonlinear waveguides; 2. Micromechanics - deals with correlating ultrasonic response with micro-scale nonlinear material behavior. For the development of efficient NDE and SHM methodologies that incorporate nonlinear structural features, a detailed understanding of the above aspects is indispensable. In this thesis, the wavemechanics aspect of the problem is dealt with from both theoretical and numerical standpoints. A generalized theoretical framework is developed to study higher harmonic guided waves in plates. This framework was employed to study second harmonic guided waves in pipes using a large-radius asymptotic approximation. Second harmonic guided waves in plates are studied from a numerical standpoint. Theoretical predictions are validated and some key aspects of higher harmonic generation in waveguides are outlined.
Finally, second harmonic guided waves in plates with inhomogeneous and localized nonlinearities are studied and some important aspects of guided wave mode selection are addressed. The other part of the work focused on developing a micromechanics-based understanding of ultrasonic higher harmonic generation. Three important aspects of micro-scale material behavior, namely tension-compression asymmetry, shear-normal coupling and deformation induced asymmetry, are identified and their role in ultrasonic higher harmonic generation is discussed. Tension-compression asymmetry is identified to cause second (even) harmonic generation in materials. Then, shear-normal coupling is identified to cause generation of secondary waves of different polarity than the primary waves. In addition, deformation induced anisotropy due to the presence of residual stress/strain and its contribution to ultrasonic higher harmonic generation is qualitatively discussed. Also, the tension-compression asymmetry in the material is quantified using an energy based measure. The above measure is employed to develop a homogenization based approach amenable to multi-scale analysis to correlate microstructure with ultrasonic higher harmonic generation. Finally, experimental investigations concerning third harmonic SH wave generation in plates are carried out and the effect of load and temperature changes on nonlinear ultrasonic measurements is discussed in the context of SHM. It was found that while nonlinear ultrasound is sensitive to micro-scale damage, the relative nonlinearity parameter may not always be the best measure to quantify the nonlinearity, as it is subject to spurious effects from changes in environmental factors such as loads and temperature.
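The relative nonlinearity parameter referred to above is conventionally estimated from the received spectrum as β′ ∝ A2/A1² for the second harmonic (or A3/A1³ for the third). A minimal second-harmonic sketch, assuming a clean, evenly sampled waveform and a known fundamental frequency (the synthetic signal is illustrative, not thesis data):

```python
import numpy as np

def relative_nonlinearity(signal, fs, f0):
    """Relative nonlinearity parameter beta' ~ A2 / A1**2, estimated
    from FFT amplitudes at the fundamental f0 and second harmonic 2*f0."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    a1 = spec[np.argmin(np.abs(freqs - f0))]       # fundamental amplitude
    a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]   # second-harmonic amplitude
    return a2 / a1**2

# Synthetic 1 s waveform with a weak second harmonic
fs, f0, n = 10_000.0, 100.0, 10_000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.01 * np.sin(2 * np.pi * 2 * f0 * t)
beta = relative_nonlinearity(x, fs, f0)
```

Because β′ depends on the ratio of spectral amplitudes, any environmental effect that shifts A1 and A2 differently (load, temperature) perturbs the measure, which is the caveat raised above.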
NASA Technical Reports Server (NTRS)
Bundick, W. Thomas
1990-01-01
A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
Global-Context Based Salient Region Detection in Nature Images
NASA Astrophysics Data System (ADS)
Bao, Hong; Xu, De; Tang, Yingjun
Visual saliency detection provides an alternative methodology to image description in many applications such as adaptive content delivery and image retrieval. One of the main aims of visual attention in computer vision is to detect and segment the salient regions in an image. In this paper, we employ matrix decomposition to detect salient objects in nature images. To efficiently eliminate high-contrast noise regions in the background, we integrate global context information into saliency detection. Therefore, the most salient region can be easily selected as the one that is globally most isolated. The proposed approach intrinsically provides an alternative methodology for modeling attention, with low implementation complexity. Experiments show that our approach achieves much better performance than the existing state-of-the-art methods.
NASA Astrophysics Data System (ADS)
Scott, R.; Entwistle, N. S.
2017-12-01
Gravel bed rivers and their associated wider systems present an ideal subject for development and improvement of rapid monitoring tools, with features dynamic enough to evolve within relatively short-term timescales. For detecting and quantifying topographical evolution, UAV based remote sensing has manifested as a reliable, low cost, and accurate means of topographic data collection. Here we present some validated methodologies for detection of geomorphic change at resolutions down to 0.05 m, building on the work of Wheaton et al. (2009) and Milan et al. (2007), to generate mesh-based and point-cloud comparison data to produce a reliable picture of topographic evolution. Results are presented for the River Glen, Northumberland, UK. Recent channel avulsion and floodplain interaction, resulting in damage to flood defence structures, make this site a particularly suitable case for application of geomorphic change detection methods, with the UAV platform at its centre. We compare multi-temporal, high-resolution point clouds derived from SfM processing, cross referenced with aerial LiDAR data, over a 1.5 km reach of the watercourse. Changes detected included bank erosion, bar and splay deposition, vegetation stripping and incipient channel avulsion. Utilisation of the topographic data for numerical modelling, carried out using CAESAR-Lisflood, predicted avulsion of the main channel, resulting in erosion of, and potentially complete circumvention of, the original channel and flood levees. A subsequent UAV survey highlighted topographic change and reconfiguration of the local sedimentary conveyor as we predicted with preliminary modelling. The combined monitoring and modelling approach has allowed probable future geomorphic configurations to be predicted, permitting more informed implementation of channel and floodplain management strategies.
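The Wheaton et al. (2009)-style thresholding that underlies such DEM-of-Difference (DoD) comparisons can be sketched as follows; the grids, survey uncertainties and t-value are illustrative, not the River Glen data:

```python
import numpy as np

def dod_change(dem_new, dem_old, sigma_new, sigma_old, t=1.96):
    """DEM-of-Difference change detection: elevation differences are
    retained only where they exceed a propagated-uncertainty threshold
    (t times the combined standard error of the two surveys)."""
    dod = dem_new - dem_old
    min_detectable = t * np.sqrt(sigma_new**2 + sigma_old**2)
    return np.where(np.abs(dod) > min_detectable, dod, 0.0)

# Two toy 3x3 DEM grids with 0.05 m survey uncertainty on each epoch
old = np.zeros((3, 3))
new = np.array([[0.30, 0.00, 0.00],
                [0.00, 0.05, 0.00],    # below the detection limit
                [0.00, 0.00, -0.40]])  # erosion signal
change = dod_change(new, old, 0.05, 0.05)
```

With 0.05 m uncertainty per survey the minimum detectable change is about 0.14 m at 95% confidence, so the 0.05 m cell is discarded as noise while the 0.30 m deposition and 0.40 m erosion cells are kept.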
Sports-related brain injuries: connecting pathology to diagnosis.
Pan, James; Connolly, Ian D; Dangelmajer, Sean; Kintzing, James; Ho, Allen L; Grant, Gerald
2016-04-01
Brain injuries are becoming increasingly common in athletes and represent an important diagnostic challenge. Early detection and management of brain injuries in sports are of utmost importance in preventing chronic neurological and psychiatric decline. These types of injuries incurred during sports are referred to as mild traumatic brain injuries, which represent a heterogeneous spectrum of disease. The most dramatic manifestation of chronic mild traumatic brain injuries is termed chronic traumatic encephalopathy, which is associated with profound neuropsychiatric deficits. Because chronic traumatic encephalopathy can only be diagnosed by postmortem examination, new diagnostic methodologies are needed for early detection and amelioration of disease burden. This review examines the pathology driving changes in athletes participating in high-impact sports and how this understanding can lead to innovations in neuroimaging and biomarker discovery.
16S rRNA beacons for bacterial monitoring during human space missions.
Larios-Sanz, Maia; Kourentzi, Katerina D; Warmflash, David; Jones, Jeffrey; Pierson, Duane L; Willson, Richard C; Fox, George E
2007-04-01
Microorganisms are unavoidable in space environments and their presence has, at times, been a source of problems. Concerns about disease during human space missions are particularly important considering the significant changes the immune system incurs during spaceflight and the history of microbial contamination aboard the Mir space station. Additionally, these contaminants may have adverse effects on instrumentation and life-support systems. A sensitive, highly specific system to detect, characterize, and monitor these microbial populations is essential. Herein we describe a monitoring approach that uses 16S rRNA targeted molecular beacons to successfully detect several specific bacterial groupings. This methodology will greatly simplify in-flight monitoring by minimizing sample handling and processing. We also address and provide solutions to target accessibility problems encountered in hybridizations that target 16S rRNA.
Data mining of atmospheric parameters associated with coastal earthquakes
NASA Astrophysics Data System (ADS)
Cervone, Guido
Earthquakes are natural hazards that pose a serious threat to society and the environment. A single earthquake can claim thousands of lives, cause damages for billions of dollars, destroy natural landmarks and render large territories uninhabitable. Studying earthquakes, and the processes that govern their occurrence, is of fundamental importance to protect lives, properties and the environment. Recent studies have shown that anomalous changes in land, ocean and atmospheric parameters occur prior to earthquakes. The present dissertation introduces an innovative methodology and its implementation to identify anomalous changes in atmospheric parameters associated with large coastal earthquakes. Possible geophysical mechanisms are discussed in view of the close interaction between the lithosphere, the hydrosphere and the atmosphere. The proposed methodology is a multi-strategy data mining approach which combines wavelet transformations, evolutionary algorithms, and statistical analysis of atmospheric data to analyze possible precursory signals. One-dimensional wavelet transformations and statistical tests are employed to identify significant singularities in the data, which may correspond to anomalous peaks due to the earthquake preparatory processes. Evolutionary algorithms and other localized search strategies are used to analyze the spatial and temporal continuity of the anomalies detected over a large area (about 2000 km2), to discriminate signals that are most likely associated with earthquakes from those due to other, mostly atmospheric, phenomena. Only statistically significant singularities occurring within a very short time of each other, and which track a rigorous geometrical path related to the geological properties of the epicentral area, are considered to be associated with a seismic event. A program called CQuake was developed to implement and validate the proposed methodology.
CQuake is a fully automated, real-time, semi-operational system developed to study precursory signals associated with earthquakes. CQuake can be used for the retrospective analysis of past earthquakes and for detecting early-warning information about impending events. Using CQuake, more than 300 earthquakes have been analyzed. In the case of coastal earthquakes with magnitude larger than 5.0, prominent anomalies are found up to two weeks prior to the main event. In the case of earthquakes occurring away from the coast, no strong anomaly is detected. The identified anomalies provide a potentially reliable means to mitigate earthquake risks in the future and can be used to develop a fully operational forecasting system.
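The wavelet/statistical screening step can be illustrated with a much-simplified stand-in for CQuake's detector: first-level Haar detail coefficients flagged when they exceed a robust (MAD-based) threshold. The series, threshold factor and function name are illustrative only:

```python
import numpy as np

def haar_singularities(x, k=5.0):
    """Flag candidate singularities in a 1-D series: first-level Haar
    detail coefficients exceeding k robust standard deviations, where
    the robust sigma is estimated from the median absolute deviation."""
    d = (x[1::2] - x[::2]) / np.sqrt(2.0)          # Haar detail coefficients
    mad = np.median(np.abs(d - np.median(d)))
    sigma = 1.4826 * mad if mad > 0 else 1e-12     # robust sigma estimate
    return np.flatnonzero(np.abs(d) > k * sigma)   # indices of flagged pairs

# Smooth series with one abrupt jump between samples 40 and 41
x = np.sin(np.linspace(0, 3, 100))
x[41:] += 2.0
anomalies = haar_singularities(x)                  # flags pair index 20
```

Each flagged index identifies a pair of consecutive samples; the full methodology then checks whether flagged anomalies cluster in space and time before attributing them to a seismic preparatory process.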
Brown, Matt A; Bishnoi, Ram J; Dholakia, Sara; Velligan, Dawn I
2016-01-20
Recent failures to detect efficacy in clinical trials investigating pharmacological treatments for schizophrenia raise concerns regarding the potential contribution of methodological shortcomings to this research. This review examines two key methodological issues currently suspected of hampering schizophrenia drug development: 1) limitations on the translational utility of preclinical development models, and 2) methodological challenges posed by increased placebo effects. Recommendations for strategies to address these methodological issues are provided.
ERIC Educational Resources Information Center
Conley-Ware, Lakita D.
2010-01-01
This research addresses a real world cyberspace problem, where currently no cross industry standard methodology exists. The goal is to develop a model for identification and detection of vulnerabilities and threats of cyber-crime or cyber-terrorism where cyber-technology is the vehicle to commit the criminal or terrorist act (CVCT). This goal was…
NASA Astrophysics Data System (ADS)
Alegre, D. M.; Koroishi, E. H.; Melo, G. P.
2015-07-01
This paper presents a methodology for the detection and localization of faults using state observers. State observers can reconstruct states that are not measured, or values at points of difficult access in the system. Faults can thus be detected at these points without knowledge of their measurements, and can be tracked through the reconstruction of their states. In this paper the methodology is applied to a system representing a simplified model of a vehicle. In this model the chassis of the car was represented by a flat plate, which was divided into finite plate elements (Kirchhoff plates); in addition, the car suspension (springs and dampers) was considered. A test rig was built and the developed methodology was used to detect and locate faults in this system. In the analyses performed, the idea is to use a system with a specific fault and then use the state observers to locate it, checking for a quantitative variation in the parameter of the system that caused the fault. The computational simulations were performed in MATLAB.
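A residual-based sketch of the idea, using a discrete-time Luenberger observer: the observer reconstructs the unmeasured states, and a fault is declared when the output residual grows beyond a threshold. All matrices, the observer gain and the fault model below are illustrative, not the paper's vehicle model:

```python
import numpy as np

# Plant x_{k+1} = A x_k + B u_k, output y_k = C x_k (illustrative matrices)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.2]])           # observer gain (assumed, not designed here)

x = np.zeros((2, 1))                   # true state
x_hat = np.zeros((2, 1))               # observer estimate
threshold, fault_step = 0.05, 60
alarms = []
for k in range(100):
    u = np.array([[1.0]])
    y = C @ x
    if k >= fault_step:                # inject an additive sensor fault
        y = y + 0.5
    r = (y - C @ x_hat).item()         # output residual
    if abs(r) > threshold:
        alarms.append(k)
    x_hat = A @ x_hat + B @ u + L * r  # observer update (corrected by residual)
    x = A @ x + B @ u                  # plant update (noise-free for clarity)

first_alarm = alarms[0] if alarms else None
```

Before the fault the residual is zero because the observer tracks the plant exactly; at step 60 the residual jumps to the fault magnitude, so the alarm fires at exactly the fault onset. With measurement noise, the threshold would instead be set from the residual's nominal statistics.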
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the necessary rapid product development life-cycles associated with the competitive connected health industry. Objective The aim of this study was to apply a structured HCD methodology to the development of a smartphone app that was to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study will provide prospective designers a detailed description of the application of a HCD methodology. Methods A 3-phase methodology was applied. In the first phase, a descriptive “use case” was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app and how various actors would interact with it and in what context. A working app prototype and a user manual were then developed based on this feedback and were subjected to a rigorous usability inspection. Further changes were made both to the interface and support documentation. The now advanced prototype was exposed to user testing by end users, where further design recommendations were made. Results Combined expert and end-user analysis of the comprehensive use case originally identified 21 problems with the system interface; only 3 of these problems were observed in user testing, implying that 18 problems were eliminated between phases 1 and 3.
Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users shows the system requires low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). Conclusions From our observation of older adults’ interactions with smartphone interfaces, there were some recurring themes. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. PMID:28559227
NASA Astrophysics Data System (ADS)
Masselink, Loes; Baartman, Jantiene; Verbesselt, Jan; Borchardt, Peter
2017-04-01
Kyrgyzstan has a long history of nomadic lifestyle in which pastures play an important role. However, currently the pastures are subject to severe grazing-induced degradation. Deteriorating levels of biomass, palatability and biodiversity reduce the pastures' productivity. To counter this and introduce sustainable pasture management, up-to-date information regarding the ecological conditions of the pastures is essential. This research aimed to investigate the potential of a remote sensing-based methodology to detect changing ecological pasture conditions in the Kara-Unkur watershed, Kyrgyzstan. The relations between Vegetation Indices (VIs) from Landsat ETM+ images and biomass, palatability and species richness field data were investigated. Both simple and multiple linear regression (MLR) analyses, including terrain attributes, were applied. Subsequently, trends of these three pasture conditions were mapped using time series analysis. The results show that biomass is most accurately estimated by a model including the Modified Soil Adjusted Vegetation Index (MSAVI) and a slope factor (R2 = 0.65, F = 0.0006). Regarding palatability, a model including the Enhanced Vegetation Index (EVI), Northness Index, Near Infrared (NIR) and Red band was most accurate (R2 = 0.61, F = 0.0160). Species richness was most accurately estimated by a model including Topographic Wetness Index (TWI), Eastness Index and estimated biomass (R2 = 0.81, F = 0.0028). Subsequent trend analyses of all three estimated ecological pasture conditions presented very similar trend patterns. Despite the need for a more robust validation, this study confirms the high potential of a remote sensing based methodology to detect changing ecological pasture conditions.
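The MSAVI term of the best biomass model can be computed directly from the NIR and Red bands (the self-adjusting MSAVI2 form), and the MLR fit itself is ordinary least squares. A sketch with made-up reflectances, slopes and biomass values, not the study's data:

```python
import numpy as np

def msavi(nir, red):
    """Modified Soil Adjusted Vegetation Index (self-adjusting MSAVI2
    form), computed per pixel from NIR and Red surface reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

# Fit a biomass model b = w0 + w1*MSAVI + w2*slope by least squares
# (all sample values are illustrative)
vi = msavi([0.45, 0.50, 0.60, 0.30], [0.10, 0.08, 0.05, 0.15])
slope = np.array([5.0, 12.0, 8.0, 20.0])
biomass = np.array([120.0, 150.0, 210.0, 60.0])
X = np.column_stack([np.ones_like(vi), vi, slope])
w, *_ = np.linalg.lstsq(X, biomass, rcond=None)
```

MSAVI is zero for bare soil (NIR equal to Red) and rises toward 1 with denser vegetation, which is what makes it a usable biomass predictor; applying the fitted coefficients to each image date yields the per-pixel condition estimates on which the trend analysis operates.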
Monitoring tropical forest degradation using time series analysis of Landsat and Sentinel-2 data
NASA Astrophysics Data System (ADS)
Bullock, E.; Woodcock, C. E.
2017-12-01
Tropical forest loss is expected to contribute 5 to 15% of anthropogenic carbon emissions in the coming century. The wide range of expected emissions is indicative of the large uncertainties that exist in the terrestrial carbon cycle. Total carbon loss from forest conversion consists of loss from deforestation plus loss from degradation. There have been significant improvements in the ability to relate plot-level estimates of carbon stocks to remote sensing-derived calculations of deforestation to estimate total carbon emissions from forest loss. These approaches, however, have been limited in their ability to assess the magnitude, extent, and overall impact of forest degradation. The causes of tropical degradation include selective logging, fuel wood collection, fires, and the development of forest plantations. This study demonstrates a newly developed methodology for detecting subtle changes in forest structure and condition using time series analysis of Landsat and Sentinel-2 data. The research shows how the ability to detect small changes in forest biomass, in addition to changes in forest composition, can be improved by incorporating historical context and multi-sensor data fusion. Results are demonstrated from two climatically unique tropical forests in Thailand and Brazil.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hixson, Kim K.; Adkins, Joshua N.; Baker, Scott E.
2006-11-03
Yersinia pestis, the causative agent of plague, is listed by the CDC as a level A select pathogen. To better enable detection, intervention and treatment of Y. pestis infections, it is necessary to understand its protein expression under conditions that promote or inhibit virulence. To this end, we have utilized a novel combination of the accurate mass and time tag methodology of mass spectrometry and clustering analysis using OmniViz™ to compare the protein abundance changes of 992 identified proteins under four growth conditions. Temperature and Ca2+ concentration were used to trigger virulence associated protein expression fundamental to the low calcium response. High-resolution liquid chromatography and electrospray ionization mass spectrometry were utilized to determine protein identity and abundance on the genome-wide level. The cluster analyses revealed, in a rapid visual platform, the reproducibility of the current method as well as relevant protein abundance changes of expected and novel proteins relating to a specific growth condition and sub-cellular location. Using this method, 89 proteins were identified as having a similar abundance change profile to 29 known virulence associated proteins, providing additional biomarker candidates for future detection and vaccine development strategies.
Photonic Crystal Structures with Tunable Structure Color as Colorimetric Sensors
Wang, Hui; Zhang, Ke-Qin
2013-01-01
Colorimetric sensing, which transduces environmental changes into visible color changes, provides a simple yet powerful detection mechanism that is well-suited to the development of low-cost and low-power sensors. A new approach in colorimetric sensing exploits the structural color of photonic crystals (PCs) to create environmentally-influenced color-changeable materials. PCs are composed of periodic dielectrics or metallo-dielectric nanostructures that affect the propagation of electromagnetic waves (EM) by defining the allowed and forbidden photonic bands. Simultaneously, an amazing variety of naturally occurring biological systems exhibit iridescent color due to the presence of PC structures throughout multi-dimensional space. In particular, some kinds of the structural colors in living organisms can be reversibly changed in reaction to external stimuli. Based on the lessons learned from natural photonic structures, some specific examples of PCs-based colorimetric sensors are presented in detail to demonstrate their unprecedented potential in practical applications, such as the detection of temperature, pH, ionic species, solvents, vapor, humidity, pressure and biomolecules. The combination of the nanofabrication technique, useful design methodologies inspired by biological systems and colorimetric sensing will lead to substantial developments in low-cost, miniaturized and widely deployable optical sensors. PMID:23539027
Surveillance Systems for Waterborne Protozoa Past, Present and Future
OVERVIEW I. Brief introduction to waterborne Cryptosporidium Historical perspective on detecting Cryptosporidium Current detection methodologies II. US EPA’s waterborne protozoan research program Detecting, typing, and tracking sources of Cryptosporidium contami...
Georgoulas, George; Georgopoulos, Voula C; Stylios, Chrysostomos D
2006-01-01
This paper proposes a novel integrated methodology to extract features from and classify speech sounds, with the intent of detecting the possible existence of a speech articulation disorder in a speaker. Articulation, in effect, is the specific and characteristic way in which an individual produces speech sounds. A methodology to process the speech signal, extract features and finally classify the signal to detect articulation problems in a speaker is presented. The use of support vector machines (SVMs) for the classification of speech sounds and the detection of articulation disorders is introduced. The proposed method is implemented on a data set where different sets of features and different schemes of SVMs are tested, leading to satisfactory performance.
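The abstract does not specify the SVM configuration used; as a minimal sketch of the classification step (a linear SVM trained by sub-gradient descent on the hinge loss, with entirely synthetic stand-in feature vectors), one might write:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimal linear SVM trained by sub-gradient descent on the
    L2-regularized hinge loss; labels y must be in {-1, +1}."""
    rng = np.random.default_rng(0)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) < 1:        # inside the margin: hinge active
                w += lr * (y[i] * X[i] - lam * w)
                b += lr * y[i]
            else:                                 # outside: regularizer only
                w -= lr * lam * w
    return w, b

# Toy 2-D "feature vectors" standing in for acoustic features of two sound classes.
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.5]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

Real articulation-disorder detection would use cepstral or wavelet features and typically a kernelized SVM, but the decision rule sign(w·x + b) is the same.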
Warren, Megan R; Sangiamo, Daniel T; Neunuebel, Joshua P
2018-03-01
An integral component in the assessment of vocal behavior in groups of freely interacting animals is the ability to determine which animal is producing each vocal signal. This process is facilitated by using microphone arrays with multiple channels. Here, we made important refinements to a state-of-the-art microphone array based system used to localize vocal signals produced by freely interacting laboratory mice. Key changes to the system included increasing the number of microphones as well as refining the methodology for localizing and assigning vocal signals to individual mice. We systematically demonstrate that the improvements in the methodology for localizing mouse vocal signals led to an increase in the number of signals detected as well as the number of signals accurately assigned to an animal. These changes facilitated the acquisition of larger and more comprehensive data sets that better represent the vocal activity within an experiment. Furthermore, this system will allow more thorough analyses of the role that vocal signals play in social communication. We expect that such advances will broaden our understanding of social communication deficits in mouse models of neurological disorders. Copyright © 2018 Elsevier B.V. All rights reserved.
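Microphone-array localization of the kind described above typically begins with time-difference-of-arrival estimates between channel pairs. A small sketch (synthetic signals, not the authors' system) estimates the inter-channel delay from the cross-correlation peak:

```python
import numpy as np

def estimate_delay(sig_a, sig_b):
    """Estimate the integer-sample delay of sig_b relative to sig_a
    from the peak of their full cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)

rng = np.random.default_rng(1)
call = rng.standard_normal(256)                   # stand-in for a vocal signal
mic1 = np.concatenate([call, np.zeros(20)])
mic2 = np.concatenate([np.zeros(7), call, np.zeros(13)])   # arrives 7 samples later
delay = estimate_delay(mic1, mic2)
```

Given pairwise delays and the known microphone geometry, the source position is then found by least-squares intersection of the implied hyperbolae; multiplying the delay by the speed of sound over the sampling rate converts it to a path-length difference.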
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies, and we suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.
PCB congener analysis with Hall electrolytic conductivity detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edstrom, R.D.
1989-01-01
This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260 and 1262, along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomus saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the fish samples collected from the lower James River and lower Chesapeake Bay during the fall of 1985.
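A retention index computed against multiple bracketing markers, as described above, amounts to linear interpolation between the two reference peaks that straddle the unknown. The marker index values and retention times below are hypothetical, not the paper's calibration:

```python
def retention_index(t_x, markers):
    """Linear retention index of a peak eluting at time t_x, interpolated
    between bracketing reference markers given as (index, time) pairs."""
    markers = sorted(markers, key=lambda m: m[1])
    for (i_lo, t_lo), (i_hi, t_hi) in zip(markers, markers[1:]):
        if t_lo <= t_x <= t_hi:
            return i_lo + (i_hi - i_lo) * (t_x - t_lo) / (t_hi - t_lo)
    raise ValueError("peak not bracketed by the marker set")

# Hypothetical marker set (index value, retention time in minutes)
# and an unknown congener eluting at 14.2 min.
markers = [(100, 5.0), (200, 12.0), (300, 20.0)]
ri = retention_index(14.2, markers)   # 200 + 100 * (2.2 / 8.0) = 227.5
```

Using several markers rather than a single reference compound is what gives the method its precision: interpolation errors stay local to each bracket.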
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne datasets with ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. Such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.
Evaluation of an integrated graphical display to promote acute change detection in ICU patients
Anders, Shilo; Albert, Robert; Miller, Anne; Weinger, Matthew B.; Doig, Alexa K.; Behrens, Michael; Agutter, Jim
2012-01-01
Objective: The purpose of this study was to evaluate ICU nurses' ability to detect patient change using an integrated graphical information display (IGID) versus a conventional tabular ICU patient information display (i.e., electronic chart). Design: Using participants from two different sites, we conducted a repeated-measures simulator-based experiment to assess ICU nurses' ability to detect abnormal patient variables using a novel IGID versus a conventional tabular information display. Patient scenarios and display presentations were fully counterbalanced. Measurements: We measured percent correct detection of abnormal patient variables, nurses' perceived workload (NASA-TLX), and display usability ratings. Results: 32 ICU nurses (87% female, median age of 29 years, and median ICU experience of 2.5 years) using the IGID detected more abnormal variables compared to the tabular display [F(1,119)=13.0, p < 0.05]. There was a significant main effect of site [F(1,119)=14.2], with development-site participants doing better. There were no significant differences in nurses' perceived workload. The IGID was rated as more usable than the conventional display [F(1,60)=31.7]. Conclusion: Overall, nurses reported more of the important physiological information with the novel IGID than with the tabular display. Moreover, the finding of site differences may reflect local influences in work practice and involvement in the iterative display design methodology. Information displays developed using user-centered design should accommodate the full diversity of the intended user population across use sites. PMID:22534099
Detection and avoidance of errors in computer software
NASA Technical Reports Server (NTRS)
Kinsler, Les
1989-01-01
The acceptance test errors of a computer software project were examined to determine whether the errors could have been detected or avoided in earlier phases of development. GROAGSS (Gamma Ray Observatory Attitude Ground Support System) was selected as the software project to be examined. The development of the software followed the standard Flight Dynamics Software Development methods. GROAGSS was developed between August 1985 and April 1989. The project is approximately 250,000 lines of code, of which approximately 43,000 lines are reused from previous projects. GROAGSS had a total of 1715 Change Report Forms (CRFs) submitted during the entire development and testing. These changes contained 936 errors. Of these 936 errors, 374 were found during acceptance testing. These acceptance test errors were first categorized by method of avoidance, including: more clearly written requirements; detailed review; code reading; structural unit testing; and functional system integration testing. The errors were later broken down in terms of effort to detect and correct, class of error, and probability that the prescribed detection method would be successful. These determinations were based on Software Engineering Laboratory (SEL) documents and interviews with the project programmers. A summary of the results of the categorizations is presented. It was concluded that the number of programming errors present at the beginning of acceptance testing can be significantly reduced. The results of the existing development methodology are examined for ways of improvement. A basis is provided for the definition of a new development/testing paradigm. Monitoring of the new scheme will objectively determine its effectiveness at avoiding and detecting errors.
Chikayama, Eisuke; Suto, Michitaka; Nishihara, Takashi; Shinozaki, Kazuo; Hirayama, Takashi; Kikuchi, Jun
2008-01-01
Background: Metabolic phenotyping has become an important ‘bird's-eye-view’ technology which can be applied to higher organisms, such as model plant and animal systems, in the post-genomics and proteomics era. Although genotyping technology has expanded greatly over the past decade, metabolic phenotyping has languished due to the difficulty of ‘top-down’ chemical analyses. Here, we describe a systematic NMR methodology for stable isotope labeling and analysis of metabolite mixtures in plant and animal systems. Methodology/Principal Findings: The analysis method includes a stable isotope labeling technique for use in living organisms; a systematic method for simultaneously identifying a large number of metabolites by using a newly developed HSQC-based metabolite chemical shift database combined with heteronuclear multidimensional NMR spectroscopy; Principal Components Analysis; and a visualization method using a coarse-grained overview of the metabolic system. The database contains more than 1000 1H and 13C chemical shifts corresponding to 142 metabolites measured under identical physicochemical conditions. Using the stable isotope labeling technique, we systematically detected >450 HSQC peaks in each 13C-HSQC spectrum derived from the model plant (Arabidopsis T87 cultured cells) and the invertebrate animal model (Bombyx mori). Furthermore, for the first time, efficient 13C labeling has allowed reliable signal assignment using analytical separation techniques such as 3D HCCH-COSY spectra in higher-organism extracts. Conclusions/Significance: Overall physiological changes could be detected and categorized in relation to a critical developmental phase change in B. mori by coarse-grained representations in which the organization of metabolic pathways related to a specific developmental phase was visualized on the basis of constituent changes of 56 identified metabolites.
Based on the observed development-dependent intensity changes of the 13C atoms in the 56 identified 13C-HSQC signals, we determined the changes in metabolic networks that are associated with energy and nitrogen metabolism. PMID:19030231
Quantifying Structural and Compositional Changes in Forest Cover in NW Yunnan, China
NASA Astrophysics Data System (ADS)
Hakkenberg, C.
2012-12-01
NW Yunnan, China is a region renowned for high levels of biodiversity, endemism and genetically distinct refugial plant populations. It is also a focal area for China's national reforestation efforts, such as the Natural Forest Protection Program (NFPP), intended to control erosion in the Upper Yangtze watershed. As part of a larger project to investigate the role of reforestation programs in facilitating the emergence of increasingly species-rich forest communities on a previously degraded and depauperate land mosaic in montane SW China, this study uses a series of Landsat TM images to quantify the spatial pattern and rate of structural and compositional change in forests recovering from medium- to large-scale disturbances in the area over the past 25 years. Beyond the fundamental need to assess the outcomes of one of the world's largest reforestation programs, this research offers approaches to confronting two critical methodological issues: (1) techniques for characterizing subtle changes in the nature of vegetation cover, and (2) reducing change detection uncertainty due to persistent cloud cover and shadow. To address difficulties in accurately assessing the structure and composition of vegetative regrowth, a biophysical model was parameterized with over 300 ground-truthed canopy cover assessment points to determine the pattern and rate of long-term vegetation changes. To combat pervasive shadow and cloud cover, an interactive generalized additive model (GAM) based on topographic and spatial predictors was used to overcome some of the constraints of satellite image analysis in Himalayan regions characterized by extreme topography and extensive cloud cover during the summer monsoon. The change detection is assessed for accuracy using ground-truthed observations in a variety of forest cover types and topographic positions. Results indicate effectiveness in reducing the areal extent of unclassified regions and increasing total change detection accuracy.
In addition to quantifying forest cover change in this section of NW Yunnan, the analysis attempts to qualify that change - distinguishing among distinct disturbance histories and post-recovery successional pathways.
Spectral Target Detection using Schroedinger Eigenmaps
NASA Astrophysics Data System (ADS)
Dorado-Munoz, Leidy P.
Applications of optical remote sensing processes include environmental monitoring, military monitoring, meteorology, mapping, surveillance, etc. Many of these tasks include the detection of specific objects or materials, usually few or small, which are surrounded by other materials that clutter the scene and hide the relevant information. This target detection process has lately been boosted by the use of hyperspectral imagery (HSI), since its high spectral dimension provides the more detailed spectral information that is desirable in data exploitation. Typical spectral target detectors rely on statistical or geometric models to characterize the spectral variability of the data. However, in many cases these parametric models do not fit HSI data well, which impacts the detection performance. On the other hand, non-linear transformation methods, mainly based on manifold learning algorithms, have shown potential for use in HSI transformation, dimensionality reduction and classification. In target detection, non-linear transformation algorithms are used as preprocessing techniques that transform the data to a more suitable lower-dimensional space, where the statistical or geometric detectors are applied. One of these non-linear manifold methods is the Schroedinger Eigenmaps (SE) algorithm, which has been introduced as a technique for semi-supervised classification. The core tool of the SE algorithm is the Schroedinger operator, which includes a potential term that encodes prior information about the materials present in a scene and enables the embedding to be steered in convenient directions in order to cluster similar pixels together. A completely novel target detection methodology based on the SE algorithm is proposed for the first time in this thesis. The proposed methodology does not just include the transformation of the data to a lower-dimensional space, but also includes the definition of a detector that capitalizes on the theory behind SE.
The fact that target pixels and similar pixels are clustered in a predictable region of the low-dimensional representation is used to define a decision rule that distinguishes target pixels from the rest of the pixels in a given image. In addition, a knowledge propagation scheme is used to combine spectral and spatial information as a means to propagate the "potential constraints" to nearby points. The propagation scheme is introduced to reinforce weak connections and improve the separability between most of the target pixels and the background. Experiments using different HSI data sets are carried out in order to test the proposed methodology. The assessment is performed from a quantitative and qualitative point of view, and by comparing the SE-based methodology against two other detection methodologies that use linear/non-linear algorithms as transformations and the well-known Adaptive Coherence/Cosine Estimator (ACE) detector. Overall results show that the SE-based detector outperforms the other two detection methodologies, which indicates the usefulness of the SE transformation in spectral target detection problems.
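The ACE detector used as the baseline above has a standard closed form: the squared cosine between a pixel and the target signature after whitening by the background covariance. A sketch with entirely synthetic data (not the thesis imagery) might be:

```python
import numpy as np

def ace(x, s, mu, cov_inv):
    """Adaptive Coherence/Cosine Estimator score for pixel x, given target
    signature s, background mean mu and inverse background covariance."""
    xc, sc = x - mu, s - mu
    return (sc @ cov_inv @ xc) ** 2 / ((sc @ cov_inv @ sc) * (xc @ cov_inv @ xc))

rng = np.random.default_rng(0)
bg = rng.standard_normal((500, 4))                 # background pixels, 4 "bands"
mu = bg.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(bg, rowvar=False))
target = mu + np.array([3.0, 0.0, 0.0, 0.0])       # hypothetical target signature
score_target = ace(target + 0.05 * rng.standard_normal(4), target, mu, cov_inv)
score_bg = float(np.mean([ace(p, target, mu, cov_inv) for p in bg[:100]]))
```

Because the score is a squared cosine it lies in [0, 1], and a noisy copy of the target scores near 1 while average background pixels score much lower; detection then reduces to thresholding the score map.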
Monitoring of Progressive Damage in Buildings Using Laser Scan Data
NASA Astrophysics Data System (ADS)
Puente, I.; Lindenbergh, R.; Van Natijne, A.; Esposito, R.; Schipper, R.
2018-05-01
Vulnerability of buildings to natural and man-induced hazards has become a main concern for our society. Ensuring their serviceability, safety and sustainability is of vital importance and the main reason for setting up monitoring systems to detect damages at an early stage. In this work, a method is presented for detecting changes from laser scan data, where no registration between different epochs is needed. To show the potential of the method, a case study of a laboratory test carried out at the Stevin laboratory of Delft University of Technology was selected. The case study was a quasi-static cyclic pushover test on a two-story high unreinforced masonry structure designed to simulate damage evolution caused by cyclic loading. During the various phases, we analysed the behaviour of the masonry walls by monitoring the deformation of each masonry unit. First a plane is fitted to the selected wall point cloud, consisting of one single terrestrial laser scan, using Principal Component Analysis (PCA). Second, the segmentation of individual elements is performed. Then deformations with respect to this plane model, for each epoch and specific element, are determined by computing their corresponding rotation and cloud-to-plane distances. The validation of the changes detected within this approach is done by comparison with traditional deformation analysis based on co-registered TLS point clouds between two or more epochs of building measurements. Initial results show that the sketched methodology is indeed able to detect changes at the mm level while avoiding 3D point cloud registration, which is a main issue in computer vision and remote sensing.
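The plane-fitting step described above can be sketched with ordinary PCA: the plane normal is the direction of least variance of the centered point cloud, and signed cloud-to-plane distances follow by projection onto that normal. The synthetic "wall scan" below (assumed units of meters, 1 mm scanner noise) is illustrative, not the laboratory data:

```python
import numpy as np

def fit_plane_pca(points):
    """Fit a plane to an N x 3 point cloud via PCA: returns the centroid and
    the unit normal (the singular vector of least variance)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

def point_plane_distances(points, centroid, normal):
    """Signed cloud-to-plane distances by projection onto the normal."""
    return (points - centroid) @ normal

rng = np.random.default_rng(0)
# Synthetic wall: the z = 0 plane sampled with ~1 mm measurement noise.
wall = np.column_stack([rng.uniform(0, 5, 1000),
                        rng.uniform(0, 3, 1000),
                        rng.normal(0, 0.001, 1000)])
c, n = fit_plane_pca(wall)
d = point_plane_distances(wall, c, n)
```

Per-element rotations, as in the paper, would then come from comparing each segmented masonry unit's fitted normal against the wall-plane normal between load steps; the key property exploited here is that the distances are computed within a single scan, so no inter-epoch registration is needed.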
Ong, Chengsi; Lee, Jan Hau; Leow, Melvin K S; Puthucheary, Zudin A
2017-09-01
Evidence suggests that critically ill children develop muscle wasting, which could affect outcomes. Muscle ultrasound has been used to track muscle wasting and association with outcomes in critically ill adults but not children. This review aims to summarize methodological considerations of muscle ultrasound, structural findings, and possibilities for its application in the assessment of nutrition and functional outcomes in critically ill children. Medline, Embase, and CINAHL databases were searched up until April 2016. Articles describing skeletal muscle ultrasound in children and critically ill adults were analyzed qualitatively for details on techniques and findings. Thickness and cross-sectional area of various upper and lower body muscles have been studied to quantify muscle mass and detect muscle changes. The quadriceps femoris muscle is one of the most commonly measured muscles due to its relation to mobility and is sensitive to changes over time. However, the margin of error for quadriceps thickness is too wide to reliably detect muscle changes in critically ill children. Muscle size and its correlation with strength and function also have not yet been studied in critically ill children. Echogenicity, used to detect compromised muscle structure in neuromuscular disease, may be another property worth studying in critically ill children. Muscle ultrasound may be useful in detecting muscle wasting in critically ill children but has not been shown to be sufficiently reliable in this population. Further study of the reliability and correlation with functional outcomes and nutrition intake is required before muscle ultrasound is routinely employed in critically ill children.
Brunoni, André R; Tadini, Laura; Fregni, Felipe
2010-03-03
There have been many changes in clinical trials methodology since the introduction of lithium and the beginning of the modern era of psychopharmacology in 1949. The nature and importance of these changes have not been fully addressed to date. As methodological flaws in trials can lead to false-negative or false-positive results, the objective of our study was to evaluate the impact of methodological changes in psychopharmacology clinical research over the past 60 years. We performed a systematic review from 1949 to 2009 of the MEDLINE and Web of Science electronic databases, and a hand search of high-impact journals, for studies of major drugs including chlorpromazine, clozapine, risperidone, lithium, fluoxetine and lamotrigine. All controlled studies published within 100 months after the first trial of each drug were included. Ninety-one studies met our inclusion criteria. We analyzed the major changes in abstract reporting, study design, participants' assessment and enrollment, methodology and statistical analysis. Our results showed that the methodology of psychiatric clinical trials changed substantially, with quality gains in abstract reporting, results reporting, and statistical methodology. Recent trials use more informed consent, periods of washout, the intention-to-treat approach and parametric tests. Placebo use remains high and unchanged over time. The quality of psychopharmacological clinical trials has changed significantly in most of the aspects we analyzed. There was significant improvement in quality of reporting and internal validity. These changes have increased study efficiency; however, there is room for improvement in some aspects, such as rating scales, diagnostic criteria and better trial reporting. Therefore, despite the advancements observed, there are still several areas that can be improved in psychopharmacology clinical trials.
NASA Astrophysics Data System (ADS)
Jin, Chenhao; Li, Jingcheng; Jang, Shinae; Sun, Xiaorong; Christenson, Richard
2015-03-01
Structural health monitoring has drawn significant attention in the past decades, with numerous methodologies and applications for civil structural systems. Although many researchers have developed analytical and experimental damage detection algorithms through vibration-based methods, these methods are not widely accepted for practical structural systems because of their sensitivity to uncertain environmental and operational conditions. The primary environmental factor that influences the structural modal properties is temperature. The goal of this article is to analyze the natural frequency-temperature relationships and detect structural damage in the presence of operational and environmental variations using a modal-based method. For this purpose, correlations between natural frequency and temperature are analyzed to select proper independent variables and inputs for a multiple linear regression model and a neural network model. In order to capture the changes in natural frequency, confidence intervals for detecting damage are generated for both models. A long-term structural health monitoring system was installed on an in-service highway bridge located in Meriden, Connecticut to obtain vibration and environmental data. Experimental testing results show that the variability of measured natural frequencies due to temperature is captured, and the temperature-induced changes in natural frequencies have been accounted for prior to the establishment of the threshold in the damage warning system. This novel approach is applicable to structural health monitoring systems and helpful for assessing the performance of the structure for bridge management and maintenance.
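The regression-plus-confidence-interval idea can be sketched as follows. All coefficients, noise levels and thresholds below are invented for illustration (they are not the Meriden bridge values), and one temperature predictor stands in for the paper's multiple regressors:

```python
import numpy as np

# Synthetic monitoring data: natural frequency drifts slightly with temperature.
rng = np.random.default_rng(42)
temp = rng.uniform(-5, 35, 300)                          # deg C
freq = 2.50 - 0.002 * temp + rng.normal(0, 0.005, 300)   # Hz

# Linear regression of frequency on temperature via least squares.
A = np.column_stack([np.ones_like(temp), temp])
coef, *_ = np.linalg.lstsq(A, freq, rcond=None)
resid = freq - A @ coef
sigma = resid.std(ddof=2)

def is_anomalous(t, f):
    """Damage warning: frequency outside the ~99% band (+/- 2.58 sigma)
    around the temperature-predicted value."""
    return abs(f - (coef[0] + coef[1] * t)) > 2.58 * sigma

healthy = is_anomalous(20.0, 2.50 - 0.002 * 20.0)   # on the regression line
damaged = is_anomalous(20.0, 2.40)                  # ~60 mHz drop at 20 deg C
```

The point of regressing out temperature first is visible here: a 60 mHz frequency drop is unambiguous against a 5 mHz residual scatter, whereas it would be buried in the raw seasonal variation.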
2013-01-01
Background: We have recently reported on the changes in plasma free amino acid (PFAA) profiles in lung cancer patients and the efficacy of a PFAA-based, multivariate discrimination index for the early detection of lung cancer. In this study, we aimed to verify the usefulness and robustness of PFAA profiling for detecting lung cancer using new test samples. Methods: Plasma samples were collected from 171 lung cancer patients and 3849 controls without apparent cancer. PFAA levels were measured by high-performance liquid chromatography (HPLC)-electrospray ionization (ESI)-mass spectrometry (MS). Results: High reproducibility was observed both in the changes in the PFAA profiles of the lung cancer patients and in the discriminating performance for lung cancer patients compared to previously reported results. Furthermore, multivariate discriminating functions obtained in previous studies clearly distinguished the lung cancer patients from the controls based on the area under the receiver-operator characteristic curve (AUC of ROC = 0.731 to 0.806), strongly suggesting the robustness of the methodology for clinical use. Moreover, the results suggested that the combinatorial use of this classifier and tumor markers improves the clinical performance of the tumor markers. Conclusions: These findings suggest that PFAA profiling, which involves a relatively simple plasma assay and imposes a low physical burden on subjects, has great potential for improving the early detection of lung cancer. PMID:23409863
Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
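Higher-order moments of the kind used in the first approach are straightforward to compute. The sketch below uses synthetic stand-ins for daily stream-temperature records (all values illustrative) to show how a decade with a hot-tailed regime raises skewness relative to a symmetric one:

```python
import numpy as np

def skewness(x):
    """Third standardized moment (population form)."""
    z = (x - x.mean()) / x.std()
    return float((z ** 3).mean())

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a normal distribution)."""
    z = (x - x.mean()) / x.std()
    return float((z ** 4).mean() - 3.0)

rng = np.random.default_rng(0)
decade_a = rng.normal(12.0, 1.5, 3650)                    # symmetric regime, deg C
decade_b = np.concatenate([rng.normal(12.0, 1.5, 3300),
                           rng.normal(18.0, 1.0, 350)])   # ~10% hot-tail anomalies
shift_in_skew = skewness(decade_b) - skewness(decade_a)
```

The mean of the two decades differs by well under 1 deg C, yet the skewness shift is large; this is exactly the kind of regime change that statistics of central tendency fail to capture.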
Lirio, R B; Dondériz, I C; Pérez Abalo, M C
1992-08-01
The methodology of Receiver Operating Characteristic curves based on the signal detection model is extended to evaluate the accuracy of two-stage diagnostic strategies. A computer program is developed for the maximum likelihood estimation of parameters that characterize the sensitivity and specificity of two-stage classifiers according to this extended methodology. Its use is briefly illustrated with data collected in a two-stage screening for auditory defects.
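For a serial two-stage screen in which only stage-1 positives proceed to the second stage, the overall operating point combines in closed form, which is the quantity the extended ROC methodology characterizes. A sketch with hypothetical operating points (not the paper's estimates for the auditory screening):

```python
def two_stage_accuracy(se1, sp1, se2, sp2):
    """Overall sensitivity/specificity of a serial two-stage classifier
    in which only stage-1 positives are passed to stage 2."""
    sensitivity = se1 * se2                   # must be flagged by both stages
    specificity = sp1 + (1.0 - sp1) * sp2     # cleared by either stage
    return sensitivity, specificity

# Hypothetical cheap first screen followed by a more specific confirmatory test.
se, sp = two_stage_accuracy(0.95, 0.80, 0.90, 0.95)
# se = 0.855, sp = 0.99
```

The trade-off is visible immediately: serial testing raises specificity (0.80 to 0.99) at the cost of sensitivity (0.95 to 0.855), and sweeping the stage thresholds traces out the two-stage ROC curve that the maximum likelihood procedure fits.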
Usefulness of MLPA in the detection of SHOX deletions.
Funari, Mariana F A; Jorge, Alexander A L; Souza, Silvia C A L; Billerbeck, Ana E C; Arnhold, Ivo J P; Mendonca, Berenice B; Nishi, Mirian Y
2010-01-01
SHOX haploinsufficiency causes a wide spectrum of short stature phenotypes, such as Leri-Weill dyschondrosteosis (LWD) and disproportionate short stature (DSS). SHOX deletions are responsible for approximately two thirds of cases of isolated haploinsufficiency; therefore, it is important to determine the most appropriate methodology for the detection of gene deletions. In this study, three methodologies for the detection of SHOX deletions were compared: fluorescence in situ hybridization (FISH), microsatellite analysis and multiplex ligation-dependent probe amplification (MLPA). Forty-four patients (8 LWD and 36 DSS) were analyzed. The cosmid LLNOYCO3'M'34F5 was used as a probe for the FISH analysis, and the microsatellite analyses were performed using three intragenic microsatellite markers. MLPA was performed using commercial kits. Twelve patients (8 LWD and 4 DSS) had deletions in the SHOX area detected by MLPA, and 2 patients generated discordant results with the other methodologies. In the first case, the deletion was not detected by FISH. In the second case, both FISH and microsatellite analyses were unable to identify the intragenic deletion. In conclusion, MLPA was more sensitive, less expensive and less laborious; therefore, it should be used as the initial molecular method for the detection of SHOX gene deletions. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Svejkosky, Joseph
The spectral signatures of vehicles in hyperspectral imagery exhibit temporal variations due to the preponderance of surfaces with material properties that display non-Lambertian bi-directional reflectance distribution functions (BRDFs). These temporal variations are caused by changing illumination conditions, changing sun-target-sensor geometry, changing road surface properties, and changing vehicle orientations. To quantify these variations and determine their relative importance in a sub-pixel vehicle reacquisition and tracking scenario, a hyperspectral vehicle BRDF sampling experiment was conducted in which four vehicles were rotated at different orientations and imaged over a six-hour period. The hyperspectral imagery was calibrated using novel in-scene methods and converted to reflectance imagery. The resulting BRDF sampled time-series imagery showed a strong vehicle level BRDF dependence on vehicle shape in off-nadir imaging scenarios and a strong dependence on vehicle color in simulated nadir imaging scenarios. The imagery also exhibited spectral features characteristic of sampling the BRDF of non-Lambertian targets, which were subsequently verified with simulations. In addition, the imagery demonstrated that the illumination contribution from vehicle adjacent horizontal surfaces significantly altered the shape and magnitude of the vehicle reflectance spectrum. The results of the BRDF sampling experiment illustrate the need for a target vehicle BRDF model and detection scheme that incorporates non-Lambertian BRDFs. A new detection algorithm called Eigenvector Loading Regression (ELR) is proposed that learns a hyperspectral vehicle BRDF from a series of BRDF measurements using regression in a lower dimensional space and then applies the learned BRDF to make test spectrum predictions. 
In cases of non-Lambertian vehicle BRDF, this detection methodology performs favorably when compared to subspace detection algorithms and graph-based detection algorithms that do not account for the target BRDF. The algorithms are compared using a test environment in which observed spectral reflectance signatures from the BRDF sampling experiment are implanted into aerial hyperspectral imagery containing large numbers of vehicles.
Estimating HIV incidence and detection rates from surveillance data.
Posner, Stephanie J; Myers, Leann; Hassig, Susan E; Rice, Janet C; Kissinger, Patricia; Farley, Thomas A
2004-03-01
Markov models that incorporate HIV test information can increase precision in estimates of new infections and permit the estimation of detection rates. The purpose of this study was to assess the functioning of a Markov model for estimating new HIV infections and HIV detection rates in Louisiana using surveillance data. We expanded a discrete-time Markov model by accounting for the change in AIDS case definition made by the Centers for Disease Control and Prevention in 1993. The model was applied to quarterly HIV/AIDS surveillance data reported in Louisiana from 1981 to 1996 for various exposure and demographic subgroups. When modeling subgroups defined by exposure categories, we adjusted for the high proportion of missing exposure information among recent cases. We ascertained sensitivity to changes in various model assumptions. The model was able to produce results consistent with other sources of information in the state. Estimates of new infections indicated a transition of the HIV epidemic in Louisiana from (1) predominantly white men and men who have sex with men to (2) women, blacks, and high-risk heterosexuals. The model estimated that 61% of all HIV/AIDS cases were detected and reported by 1996, while half of all HIV/non-AIDS cases had yet to be detected. Sensitivity analyses demonstrated that the model was robust to several uncertainties. In general, the methodology provided a useful and flexible alternative for estimating infection and detection trends using data from a U.S. surveillance program. Its use for estimating current infection will need further exploration to address assumptions related to newer treatments.
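The discrete-time Markov mechanics described above can be sketched in a few lines; the three states and all transition probabilities below are illustrative stand-ins, not the fitted Louisiana values:

```python
import numpy as np

# Illustrative three-state discrete-time Markov model with quarterly steps:
# state 0 = infected, undetected; state 1 = detected HIV (non-AIDS); state 2 = AIDS.
# Transition probabilities are hypothetical, not the paper's estimates.
P = np.array([
    [0.90, 0.08, 0.02],   # undetected -> stays, is detected, or progresses to AIDS
    [0.00, 0.95, 0.05],   # detected HIV -> stays or progresses to AIDS
    [0.00, 0.00, 1.00],   # AIDS treated as absorbing in this sketch
])

def project(new_infections, quarters):
    """Propagate cohorts of new infections through the chain and return
    the expected number of people in each state at every quarter."""
    state = np.zeros(3)
    history = []
    for q in range(quarters):
        state = state @ P                       # one quarter of transitions
        if q < len(new_infections):
            state[0] += new_infections[q]       # newly infected enter undetected
        history.append(state.copy())
    return np.array(history)

# 100 new infections per quarter for 8 quarters, projected over 12 quarters.
traj = project(new_infections=[100.0] * 8, quarters=12)
detected_share = traj[-1, 1:].sum() / traj[-1].sum()
```

Fitting such a model to reported HIV and AIDS case counts (rather than simulating forward as here) is what yields the incidence and detection-rate estimates.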
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables. That is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed; reasonably good approximate prediction equations can be developed using the methodology described here.
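The interplay between false alarm probability, goal quantity, and measurement capability can be illustrated with the standard normal-theory detection statistic used in materials accounting. This is a generic textbook sketch, not PNL's model; the numerical inputs are hypothetical:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def phi_inv(p, lo=-10.0, hi=10.0):
    """Inverse normal CDF by bisection (plenty accurate for a sketch)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def detection_probability(goal_quantity, sigma, false_alarm_prob):
    """P(detect): the alarm threshold is set so the false alarm probability
    equals false_alarm_prob, and a diversion of goal_quantity shifts the
    (normally distributed) accounting statistic by that amount."""
    threshold = phi_inv(1.0 - false_alarm_prob) * sigma
    return 1.0 - phi((threshold - goal_quantity) / sigma)

# E.g. an 8 kg goal quantity, 2 kg measurement uncertainty, 5% false alarms:
p_d = detection_probability(goal_quantity=8.0, sigma=2.0, false_alarm_prob=0.05)
```

Varying sigma (measurement capability) or the goal quantity in this function reproduces the qualitative sensitivities reported above: both dominate the achievable probability of detection.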
Sequencing CYP2D6 for the detection of poor-metabolizers in post-mortem blood samples with tramadol.
Fonseca, Suzana; Amorim, António; Costa, Heloísa Afonso; Franco, João; Porto, Maria João; Santos, Jorge Costa; Dias, Mário
2016-08-01
Tramadol concentrations and analgesic effect are dependent on the CYP2D6 enzymatic activity. It is well known that some genetic polymorphisms are responsible for the variability in the expression of this enzyme and in the individual drug response. The detection of allelic variants described as non-functional can be useful to explain some circumstances of death in the study of post-mortem cases with tramadol. A Sanger sequencing methodology was developed for the detection of genetic variants that cause absent or reduced CYP2D6 activity, such as *3, *4, *6, *8, *10 and *12 alleles. This methodology, as well as the GC/MS method for the detection and quantification of tramadol and its main metabolites in blood samples was fully validated in accordance with international guidelines. Both methodologies were successfully applied to 100 post-mortem blood samples and the relation between toxicological and genetic results evaluated. Tramadol metabolism, expressed as its metabolites concentration ratio (N-desmethyltramadol/O-desmethyltramadol), has been shown to be correlated with the poor-metabolizer phenotype based on genetic characterization. It was also demonstrated the importance of enzyme inhibitors identification in toxicological analysis. According to our knowledge, this is the first study where a CYP2D6 sequencing methodology is validated and applied to post-mortem samples, in Portugal. The developed methodology allows the data collection of post-mortem cases, which is of primordial importance to enhance the application of these genetic tools to forensic toxicology and pathology. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
Applicability of the issue under research is preconditioned by the need of practical pedagogics to expand methodological and methodical tools of contemporary didactics. The purpose of the article is to detect the methodological core of reflection as a form of thinking and to provide insight thereunto on the basis of systematic attributes of the…
A changing climate: impacts on human exposures to O3 using an integrated modeling methodology
Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...
In situ photoacoustic characterization for porous silicon growing: Detection principles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramirez-Gutierrez, C. F.; Licenciatura en Ingeniería Física, Facultad de Ingeniería, Universidad Autónoma de Querétaro, C. P. 76010 Querétaro, Qro.; Castaño-Yepes, J. D.
There are a few methodologies for monitoring the in-situ formation of porous silicon (PS); one of them is photoacoustics. Previous works reporting the use of photoacoustics to study PS formation do not provide a physical explanation of the origin of the signal. In this paper, a physical explanation of the origin of the photoacoustic signal during PS etching is provided. The incident modulated radiation and changes in the reflectance are taken as thermal sources. A useful methodology is also proposed to determine the etching rate, porosity, and refractive index of a PS film by determining the sample thickness using scanning electron microscopy images. This method was developed by carrying out two different experiments using the same anodization conditions. The first experiment consisted of growing samples with different etching times to prove the periodicity of the photoacoustic signal, while the second grew samples using three different laser wavelengths that are correlated with the period of the photoacoustic signal. The latter experiment showed that the period of the photoacoustic signal is proportional to the laser wavelength.
Noninvasive detection of diabetes mellitus
NASA Astrophysics Data System (ADS)
Eppstein, Jonathan A.; Bursell, Sven-Erik
1992-05-01
Recent advances in fluorescence spectroscopy of the lens reveal the potential of a non-invasive device and methodology to sensitively measure changes in the lens of the eye associated with diabetes mellitus. The system relies on the detection of the spectrum of fluorescence emitted from a selected volume (approximately 1/10 mm3) of the lens of living human subjects using low power excitation illumination from monochromatic light sources. The sensitivity of this technique is based on the measurement of the fluorescence intensity in a selected region of the fluorescence spectrum and normalization of this fluorescence with respect to attenuation (scattering and absorption) of the incident excitation light. The amplitude of the unshifted Rayleigh line, measured as part of the fluorescence spectrum, is used as a measure of the attenuation of the excitation light in the lens. Using this methodology we have demonstrated that the normalized lens fluorescence provides a more sensitive discrimination between diabetic and non-diabetic lenses than more conventional measurements of fluorescence intensity from the lens. The existing instrumentation will be described as well as the proposed design for a commercial version of the instrument expected to be ready for FDA trials by late 1992. The results from clinical measurements are used to describe a relationship between normalized lens fluorescence and hemoglobin A1c levels in diabetic patients.
Fail-Safe Design for Large Capacity Lithium-Ion Battery Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G. H.; Smith, K.; Ireland, J.
2012-07-15
A fault leading to a thermal runaway in a lithium-ion battery is believed to grow over time from a latent defect. Significant efforts have been made to detect lithium-ion battery safety faults to proactively facilitate actions minimizing subsequent losses. Scaling up a battery greatly changes the thermal and electrical signals of a system developing a defect and its consequent behaviors during fault evolution. In a large-capacity system such as a battery for an electric vehicle, detecting a fault signal and confining the fault locally in the system are extremely challenging. This paper introduces a fail-safe design methodology for large-capacity lithium-ion battery systems. Analysis using an internal short circuit response model for multi-cell packs is presented that demonstrates the viability of the proposed concept for various design parameters and operating conditions. Locating a faulty cell in a multiple-cell module and determining the status of the fault's evolution can be achieved using signals easily measured from the electric terminals of the module. A methodology is introduced for electrical isolation of a faulty cell from the healthy cells in a system to prevent further electrical energy feed into the fault. Experimental demonstration is presented supporting the model results.
Turner, Andrew D.; Higgins, Cowan; Davidson, Keith; Veszelovszki, Andrea; Payne, Daniel; Hungerford, James; Higman, Wendy
2015-01-01
Regular occurrence of brevetoxin-producing toxic phytoplankton in commercial shellfishery areas poses a significant risk to shellfish consumer health. Brevetoxins and their causative toxic phytoplankton are more limited in their global distribution than most marine toxins impacting commercial shellfisheries. On the other hand, trends in climate change could conceivably lead to increased risk posed by these toxins in UK waters. A request was made by UK food safety authorities to examine these toxins more closely to aid possible management strategies, should they pose a threat in the future. At the time of writing, brevetoxins have been detected in the Gulf of Mexico, the Southeast US coast and in New Zealand waters, where regulatory levels for brevetoxins in shellfish have existed for some time. This paper reviews evidence concerning the prevalence of brevetoxins and brevetoxin-producing phytoplankton in the UK, together with testing methodologies. Chemical, biological and biomolecular methods are reviewed, including recommendations for further work to enable effective testing. Although the focus here is on the UK, from a strategic standpoint many of the topics discussed will also be of interest in other parts of the world since new and emerging marine biotoxins are of global concern. PMID:25775421
Evaluating markers for the early detection of cancer: overview of study designs and methods.
Baker, Stuart G; Kramer, Barnett S; McIntosh, Martin; Patterson, Blossom H; Shyr, Yu; Skates, Steven
2006-01-01
The field of cancer biomarker development has been evolving rapidly. New developments both in the biologic and statistical realms are providing increasing opportunities for evaluation of markers for both early detection and diagnosis of cancer. To review the major conceptual and methodological issues in cancer biomarker evaluation, with an emphasis on recent developments in statistical methods together with practical recommendations. We organized this review by type of study: preliminary performance, retrospective performance, prospective performance and cancer screening evaluation. For each type of study, we discuss methodologic issues, provide examples and discuss strengths and limitations. Preliminary performance studies are useful for quickly winnowing down the number of candidate markers; however their results may not apply to the ultimate target population, asymptomatic subjects. If stored specimens from cohort studies with clinical cancer endpoints are available, retrospective studies provide a quick and valid way to evaluate performance of the markers or changes in the markers prior to the onset of clinical symptoms. Prospective studies have a restricted role because they require large sample sizes, and, if the endpoint is cancer on biopsy, there may be bias due to overdiagnosis. Cancer screening studies require very large sample sizes and long follow-up, but are necessary for evaluating the marker as a trigger of early intervention.
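The preliminary-performance studies discussed above typically rank candidate markers by discrimination measures such as the empirical ROC area and sensitivity at a fixed specificity. A minimal sketch of both, using hypothetical marker values rather than any data from the review:

```python
def auc(marker_cases, marker_controls):
    """Empirical AUC: probability that a randomly chosen case has a higher
    marker value than a randomly chosen control (ties count one half)."""
    wins = 0.0
    for x in marker_cases:
        for y in marker_controls:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(marker_cases) * len(marker_controls))

def sensitivity_at_specificity(marker_cases, marker_controls, specificity):
    """Sensitivity at the empirical threshold achieving the requested
    specificity, classifying values above the threshold as positive."""
    cutoffs = sorted(marker_controls)
    k = int(round(specificity * len(cutoffs))) - 1
    threshold = cutoffs[max(k, 0)]
    return sum(x > threshold for x in marker_cases) / len(marker_cases)

# Hypothetical marker levels in cases and controls:
cases = [3.1, 2.8, 4.0, 3.5, 2.2]
controls = [1.0, 2.5, 1.8, 2.0, 2.9]
area = auc(cases, controls)
```

The pairwise-comparison form of the AUC is exactly the Mann-Whitney U statistic rescaled, which is why it is robust to monotone transformations of the marker.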
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nm-sized, hyperspectral analysis methodology of combined X-rays/Raman/IR probes in BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification
DOT National Transportation Integrated Search
2011-04-29
For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems imagecapture capabilities, quantifying the performance of multiple AUT...
NASA Astrophysics Data System (ADS)
Tsao, Sinchai; Wilkins, Bryce; Page, Kathleen A.; Singh, Manbir
2012-03-01
A novel MRI protocol has been developed to investigate the differential effects of glucose or fructose consumption on whole-brain functional brain connectivity. A previous study has reported a decrease in the fMRI blood oxygen level dependent (BOLD) signal of the hypothalamus following glucose ingestion, but due to technical limitations, was restricted to a single slice covering the hypothalamus, and thus unable to detect whole-brain connectivity. In another previous study, a protocol was devised to acquire whole-brain fMRI data following food intake, but only after restricting image acquisition to an MR sampling or repetition time (TR) of 20 s, making the protocol unsuitable to detect functional connectivity above 0.025 Hz. We have successfully implemented a continuous 36-min, 40 contiguous slices, whole-brain BOLD acquisition protocol on a 3T scanner with TR = 4.5 s to ensure detection of up to 0.1 Hz frequencies for whole-brain functional connectivity analysis. Human data were acquired first with ingestion of water only, followed by a glucose or fructose drink within the scanner, without interrupting the scanning. Whole-brain connectivity was analyzed using standard correlation methodology in the 0.01-0.1 Hz range. The correlation coefficient differences between fructose and glucose ingestion among targeted regions were converted to t-scores using the water-only correlation coefficients as a null condition. Results show a dramatic increase in the hypothalamic connectivity to the hippocampus, amygdala, insula, caudate and the nucleus accumbens for fructose over glucose. As these regions are known to be key components of the feeding and reward brain circuits, these results suggest a preference for fructose ingestion.
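The correlation-and-t-score step can be sketched as follows. Everything here is synthetic: the "region" time series are random signals with a shared component, and the glucose/fructose correlation values are hypothetical, standing in for the real measured coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

def pearson_corr(ts_a, ts_b):
    """Pearson correlation between two region time series (band-pass
    filtering to 0.01-0.1 Hz is assumed to have happened upstream)."""
    a = (ts_a - ts_a.mean()) / ts_a.std()
    b = (ts_b - ts_b.mean()) / ts_b.std()
    return float(np.mean(a * b))

def fisher_z(r):
    """Variance-stabilizing Fisher z-transform of a correlation."""
    return float(np.arctanh(r))

# Water-only null condition: repeated correlations between two synthetic
# signals sharing a common component (n = 480 samples, 36 min at TR = 4.5 s).
n = 480
shared = rng.standard_normal(n)
water = [pearson_corr(shared + rng.standard_normal(n),
                      shared + rng.standard_normal(n)) for _ in range(20)]

# Hypothetical observed region-pair correlations under the two drinks:
r_glucose, r_fructose = 0.30, 0.55

# t-score of the fructose-glucose difference against the water null spread.
z_diff = fisher_z(r_fructose) - fisher_z(r_glucose)
t_score = z_diff / np.std([fisher_z(r) for r in water], ddof=1)
```

Using the water condition's z-transformed correlations as the null spread is what turns a raw correlation difference into a statistic comparable across region pairs.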
Determination of Stent Frame Displacement After Endovascular Aneurysm Sealing.
van Veen, Ruben; van Noort, Kim; Schuurmann, Richte C L; Wille, Jan; Slump, Cornelis H; de Vries, Jean-Paul P M
2018-02-01
To describe and validate a new methodology for visualizing and quantifying 3-dimensional (3D) displacement of the stent frames of the Nellix endosystem after endovascular aneurysm sealing (EVAS). The 3D positions of the stent frames were registered to 5 fixed anatomical landmarks on the post-EVAS computed tomography (CT) scans, facilitating comparison of the position and shape of the stent frames between consecutive follow-up scans. Displacement of the proximal and distal ends of the stent frames, the entire stent frame trajectories, as well as changes in distance between the stent frames were determined for 6 patients with >5-mm displacement and 6 patients with <5-mm displacement at 1-year follow-up. The measurements were performed by 2 independent observers; the intraclass correlation coefficient (ICC) was used to determine interobserver variability. Three types of displacement were identified: displacement of the proximal and/or distal end of the stent frames, lateral displacement of one or both stent frames, and stent frame buckling. The ICC ranged from good (0.750) to excellent (0.958). No endoleak or migration was detected in the 12 patients on conventional CT angiography at 1 year. However, of the 6 patients with >5-mm displacement on the 1-year CT as determined by the new methodology, 2 went on to develop a type Ia endoleak in longer follow-up, and displacement progressed to >15 mm for 2 other patients. No endoleak or progressive displacement was appreciated for the patients with <5-mm displacement. The sac anchoring principle of the Nellix endosystem may result in several types of displacement that have not been observed during surveillance of regular endovascular aneurysm repairs. The presented methodology allows precise 3D determination of the Nellix endosystems and can detect subtle displacement better than standard CT angiography. 
Displacement >5 mm on the 1-year CT scans reconstructed with the new methodology may forecast impaired sealing and anchoring of the Nellix endosystem.
Toward Failure Modeling In Complex Dynamic Systems: Impact of Design and Manufacturing Variations
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; McAdams, Daniel A.; Clancy, Daniel (Technical Monitor)
2001-01-01
When designing vehicle vibration monitoring systems for aerospace devices, it is common to use well-established models of vibration features to determine whether failures or defects exist. Most of the algorithms used for failure detection rely on these models to detect significant changes during a flight environment. In actual practice, however, most vehicle vibration monitoring systems are corrupted by high rates of false alarms and missed detections. Research conducted at the NASA Ames Research Center has determined that a major reason for the high rates of false alarms and missed detections is the numerous sources of statistical variations that are not taken into account in the modeling assumptions. In this paper, we address one such source of variations, namely, those caused during the design and manufacturing of rotating machinery components that make up aerospace systems. We present a novel way of modeling the vibration response by including design variations via probabilistic methods. The results demonstrate initial feasibility of the method, showing great promise in developing a general methodology for designing more accurate aerospace vehicle vibration monitoring systems.
The Design of a Quantitative Western Blot Experiment
Taylor, Sean C.; Posch, Anton
2014-01-01
Western blotting is a technique that has been in practice for more than three decades that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
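The quantitative interpretation described above reduces to a simple calculation: normalize each target band's densitometric signal to its lane's loading control, then ratio the normalized values between sample and reference. A sketch with hypothetical, background-subtracted readings (arbitrary units):

```python
def fold_change(target_density, loading_density,
                target_density_ref, loading_density_ref):
    """Fold change in protein expression between a sample lane and a
    reference lane, with each target band normalized to its own lane's
    loading control (e.g. a housekeeping protein or total protein stain)."""
    norm_sample = target_density / loading_density
    norm_ref = target_density_ref / loading_density_ref
    return norm_sample / norm_ref

# Hypothetical densitometry values: the sample lane was loaded more heavily
# (larger loading-control signal), which normalization corrects for.
fc = fold_change(target_density=5200, loading_density=1300,
                 target_density_ref=2100, loading_density_ref=1050)
# sample: 5200/1300 = 4.0; reference: 2100/1050 = 2.0; fold change = 2.0
```

The normalization step is the crux: without it, unequal lane loading is indistinguishable from a genuine expression change, which is one reason the referenced workflow emphasizes validated loading controls and signals within the detector's linear range.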
Methodology for the passive detection and discrimination of chemical and biological aerosols
NASA Astrophysics Data System (ADS)
Marinelli, William J.; Shokhirev, Kirill N.; Konno, Daisei; Rossi, David C.; Richardson, Martin
2013-05-01
The standoff detection and discrimination of aerosolized biological and chemical agents has traditionally been addressed through LIDAR approaches, but sensor systems using these methods have yet to be deployed. We discuss the development and testing of an approach to detect these aerosols using the deployed base of passive infrared hyperspectral sensors used for chemical vapor detection. The detection of aerosols requires the inclusion of downwelling sky and upwelling ground radiation in the description of the radiative transfer process. The wavelength- and size-dependent ratio of absorption to scattering provides much of the discrimination capability. The approach to the detection of aerosols utilizes much of the same phenomenology employed in vapor detection; however, the sensor system must acquire information on non-line-of-sight sources of radiation contributing to the scattering process. We describe the general methodology developed to detect chemical or biological aerosols, including justifications for the simplifying assumptions that enable the development of a real-time sensor system. Mie scattering calculations, aerosol size distribution dependence, and the angular dependence of the scattering on the aerosol signature will be discussed. This methodology will then be applied to two test cases: the ground level release of a biological aerosol (BG) and a nonbiological confuser (kaolin clay) as well as the debris field resulting from the intercept of a cruise missile carrying a thickened VX warhead. A field measurement, conducted at the Utah Test and Training Range will be used to illustrate the issues associated with the use of the method.
ERIC Educational Resources Information Center
Macy, Barry A.; Mirvis, Philip H.
1982-01-01
A standardized methodology for identifying, defining, and measuring work behavior and performance rather than production, and a methodology that estimates the costs and benefits of work innovation are presented for assessing organizational effectiveness and program costs versus benefits in organizational change programs. Factors in a cost-benefit…
Rana, Muhit; Balcioglu, Mustafa; Robertson, Neil M.; Hizir, Mustafa Salih; Yumak, Sumeyra
2017-01-01
The EPA's recommended maximum allowable level of inorganic mercury in drinking water is 2 ppb (10 nM). To our knowledge, the most sensitive colorimetric mercury sensor reported to date has a limit of detection (LOD) of 800 pM. Here, we report an instrument-free and highly practical colorimetric methodology, which enables detection of as low as 2 ppt (10 pM) of mercury and/or silver ions with the naked eye using a gold nanoprobe. Synthesis of the nanoprobe costs less than $1.42, which is enough to perform 200 tests in a microplate; less than a penny for each test. We have demonstrated the detection of inorganic mercury from water, soil and urine samples. The assay takes about four hours and the color change is observed within minutes after the addition of the last required element of the assay. The nanoprobe is highly programmable which allows for the detection of mercury and/or silver ions separately or simultaneously by changing only a single parameter of the assay. This highly sensitive approach for the visual detection relies on the combination of the signal amplification features of the hybridization chain reaction with the plasmonic properties of the gold nanoparticles. Considering that heavy metal ion contamination of natural resources is a major challenge and routine environmental monitoring is needed, yet time-consuming, this colorimetric approach may be instrumental for on-site heavy metal ion detection. Since the color transition can be measured in a variety of formats including using the naked eye, a simple UV-Vis spectrophotometer, or recording using mobile phone apps for future directions, our cost-efficient assay and method have the potential to be translated into the field. PMID:28451261
Evaluation of Incident Detection Methodologies
DOT National Transportation Integrated Search
1999-10-01
Original Report Date: October 1998. The detection of freeway incidents is an essential element of an area's traffic management system. Incidents need to be detected and handled as promptly as possible to minimize delay to the public. Various algorith...
Class imbalance in unsupervised change detection - A diagnostic analysis from urban remote sensing
NASA Astrophysics Data System (ADS)
Leichtle, Tobias; Geiß, Christian; Lakes, Tobia; Taubenböck, Hannes
2017-08-01
Automatic monitoring of changes on the Earth's surface is an intrinsic capability and simultaneously a persistent methodological challenge in remote sensing, especially regarding imagery with very-high spatial resolution (VHR) and complex urban environments. In order to enable a high level of automation, the change detection problem is solved in an unsupervised way to alleviate efforts associated with collection of properly encoded prior knowledge. In this context, this paper systematically investigates the nature and effects of class distribution and class imbalance in an unsupervised binary change detection application based on VHR imagery over urban areas. For this purpose, a diagnostic framework for sensitivity analysis of a large range of possible degrees of class imbalance is presented, which is of particular importance with respect to unsupervised approaches where the content of images and thus the occurrence and the distribution of classes are generally unknown a priori. Furthermore, this framework can serve as a general technique to evaluate model transferability in any two-class classification problem. The applied change detection approach is based on object-based difference features calculated from VHR imagery and subsequent unsupervised two-class clustering using k-means, genetic k-means and self-organizing map (SOM) clustering. The results from two test sites with different structural characteristics of the built environment demonstrated that classification performance is generally worse in imbalanced class distribution settings, while best results were reached in balanced or close to balanced situations. Regarding suitable accuracy measures for evaluating model performance in imbalanced settings, this study revealed that the Kappa statistic shows a significant response to class distribution, while the true skill statistic was largely insensitive to imbalanced classes.
In general, the genetic k-means clustering algorithm achieved the most robust results with respect to class imbalance while the SOM clustering exhibited a distinct optimization towards a balanced distribution of classes.
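The unsupervised two-class clustering step can be sketched as follows. This is a generic k-means illustration over synthetic, deliberately imbalanced difference features (950 unchanged objects, 50 changed), not the authors' genetic k-means or SOM variants:

```python
import numpy as np

rng = np.random.default_rng(42)

def kmeans_two_class(features, n_iter=50):
    """Minimal k-means (k = 2) over per-object difference features, seeded
    with the lowest- and highest-magnitude objects; the cluster with the
    larger mean feature magnitude is labeled 'change'."""
    norms = np.linalg.norm(features, axis=1)
    centers = features[[norms.argmin(), norms.argmax()]].astype(float)
    for _ in range(n_iter):
        # Assign each object to its nearest center, then update the centers.
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = features[labels == k].mean(axis=0)
    change_cluster = int(np.argmax(np.linalg.norm(centers, axis=1)))
    return labels == change_cluster

# Imbalanced synthetic scene: small difference features for unchanged
# objects, large ones for changed objects (4 features per object).
no_change = rng.normal(0.0, 0.3, size=(950, 4))
change = rng.normal(2.0, 0.3, size=(50, 4))
feats = np.vstack([no_change, change])
mask = kmeans_two_class(feats)
```

With well-separated clusters this works even at a 19:1 imbalance; the paper's diagnostic point is that as the clusters overlap, k-means' implicit preference for similarly sized clusters degrades performance in exactly such imbalanced settings.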
Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego
2010-11-01
Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. This methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information. For this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, taking an average computational time of 1.67 s with a standard deviation of 0.14 s. On the other hand, the segmentation algorithm rendered an average common-area overlap between automated segmentations and true OD regions of 86%. The average computational time was 5.69 s with a standard deviation of 0.54 s. Moreover, a discussion on advantages and disadvantages of the models more generally used for OD segmentation is also presented in this paper.
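The circular boundary approximation can be illustrated with a bare-bones Circular Hough Transform over edge points: each edge point votes for all circle centers and radii it could lie on, and the accumulator peak gives the circle. This sketch omits the morphological preprocessing and is not the authors' implementation; the synthetic circle below stands in for a detected OD edge:

```python
import numpy as np

def circular_hough(edge_points, radii, shape):
    """Vote in a (cy, cx, r) accumulator for every edge point, candidate
    radius and sampled angle; return the best-scoring circle."""
    h, w = shape
    acc = np.zeros((h, w, len(radii)), dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    for (x, y) in edge_points:
        for ri, r in enumerate(radii):
            cx = np.rint(x - r * np.cos(thetas)).astype(int)
            cy = np.rint(y - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            # np.add.at handles repeated (cy, cx) indices correctly.
            np.add.at(acc, (cy[ok], cx[ok], ri), 1)
    iy, ix, ri = np.unravel_index(acc.argmax(), acc.shape)
    return int(ix), int(iy), radii[ri]

# Synthetic circular edge (stand-in for the OD boundary) centered at
# (60, 40) with radius 25 in a 100 x 120 image:
t = np.linspace(0.0, 2.0 * np.pi, 120, endpoint=False)
edges = [(60.0 + 25.0 * np.cos(a), 40.0 + 25.0 * np.sin(a)) for a in t]
cx0, cy0, r0 = circular_hough(edges, radii=[20, 25, 30], shape=(100, 120))
```

Only the correct radius concentrates all votes at one center cell, which is why the accumulator peak recovers both the center and the radius simultaneously.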
NASA Astrophysics Data System (ADS)
Sakellariou, J. S.; Fassois, S. D.
2006-11-01
A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
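The model-based detection idea — identify a model of the healthy structure, then test whether its prediction residuals grow on new vibration data — can be sketched with a simple least-squares AR model standing in for the stochastic OE model. The signals and the stiffness-to-frequency scaling below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_ar(signal, order):
    """Least-squares AR(order) fit; returns coefficients and residual variance."""
    X = np.column_stack([signal[order - i - 1:len(signal) - i - 1]
                         for i in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs, float(np.var(y - X @ coeffs))

def damage_indicator(healthy, test, order=4):
    """Residual variance of the healthy-state model applied to the test
    signal, relative to its variance on the healthy signal; ratios well
    above 1 flag a change in the structure's dynamics."""
    coeffs, var_h = fit_ar(healthy, order)
    X = np.column_stack([test[order - i - 1:len(test) - i - 1]
                         for i in range(order)])
    return float(np.var(test[order:] - X @ coeffs)) / var_h

# Synthetic single-mode responses sampled at 20 Hz; a 20% stiffness loss
# scales the natural frequency by sqrt(0.8) in this idealized picture.
t = np.arange(1000) * 0.05
healthy = np.sin(2 * np.pi * 3.0 * t) + 0.01 * rng.standard_normal(t.size)
damaged = (np.sin(2 * np.pi * 3.0 * np.sqrt(0.8) * t)
           + 0.01 * rng.standard_normal(t.size))
F = damage_indicator(healthy, damaged)
```

In the paper's full scheme this variance ratio feeds a formal statistical hypothesis test, and the geometric method then localizes and quantifies the stiffness change; the sketch captures only the detection step.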
Yu, Alexander C; Cimino, James J
2011-04-01
Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Recall and interclass correlation coefficient. Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. 
Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p<0.05). Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. Copyright © 2011 Elsevier Inc. All rights reserved.
Yu, Alexander C.; Cimino, James J.
2012-01-01
Objective Most existing controlled terminologies can be characterized as collections of terms, wherein the terms are arranged in a simple list or organized in a hierarchy. These kinds of terminologies are considered useful for standardizing terms and encoding data and are currently used in many existing information systems. However, they suffer from a number of limitations that make data reuse difficult. Relatively recently, it has been proposed that formal ontological methods can be applied to some of the problems of terminological design. Biomedical ontologies organize concepts (embodiments of knowledge about biomedical reality) whereas terminologies organize terms (what is used to code patient data at a certain point in time, based on the particular terminology version). However, the application of these methods to existing terminologies is not straightforward. The use of these terminologies is firmly entrenched in many systems, and what might seem to be a simple option of replacing these terminologies is not possible. Moreover, these terminologies evolve over time in order to suit the needs of users. Any methodology must therefore take these constraints into consideration, hence the need for formal methods of managing changes. Along these lines, we have developed a formal representation of the concept-term relation, around which we have also developed a methodology for management of terminology changes. The objective of this study was to determine whether our methodology would result in improved retrieval of data. Design Comparison of two methods for retrieving data encoded with terms from the International Classification of Diseases (ICD-9-CM), based on their recall when retrieving data for ICD-9-CM terms whose codes had changed but which had retained their original meaning (code change). Measurements Recall and interclass correlation coefficient. 
Results Statistically significant differences were detected (p<0.05) with the McNemar test for two terms whose codes had changed. Furthermore, when all the cases are combined in an overall category, our method also performs statistically significantly better (p < 0.05). Conclusion Our study shows that an ontology-based ICD-9-CM data retrieval method that takes into account the effects of terminology changes performs better on recall than one that does not in the retrieval of data for terms whose codes had changed but which retained their original meaning. PMID:21262390
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, for whom the notion of an agent or actor role is usually used. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is a domain-specific methodology with its origin in accountancy systems. The result of these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of the property rights to an economic resource, or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
Ge, Jing; Zhang, Guoping
2015-01-01
Advanced intelligent methodologies could help detect and predict diseases from EEG signals in cases where manual analysis is inefficient or unavailable, for instance in epileptic seizure detection and prediction. The diversity and evolution of epileptic seizures make the underlying disease very difficult to detect and identify. Fortunately, the determinism and nonlinearity of a time series can characterize state changes. The literature indicates that Delay Vector Variance (DVV) can examine nonlinearity to gain insight into EEG signals, but very limited work has addressed a quantitative DVV approach. Hence, the outcomes of quantitative DVV should be evaluated for detecting epileptic seizures. The objective was to develop a new epileptic seizure detection method based on quantitative DVV. This new method employed an improved delay vector variance (IDVV) to extract the nonlinearity value as a distinct feature. A multi-kernel strategy was then proposed in the extreme learning machine (ELM) network to provide precise disease detection and prediction. The nonlinearity proved more sensitive than energy and entropy. An overall recognition accuracy of 87.5% and an overall forecasting accuracy of 75.0% were achieved. The proposed IDVV and multi-kernel ELM based method was feasible and effective for epileptic EEG detection, and the newly proposed method is therefore important for practical applications.
Damage detection of civil infrastructures with piezoelectric oscillator sensors
NASA Astrophysics Data System (ADS)
Roh, Y. R.; Kim, D. Y.; Park, S. H.; Yun, C. B.
2006-03-01
Much research has been reported on the condition monitoring of civil infrastructures by means of piezoelectric sensors. Most of it makes use of the impedance change of the piezoelectric device in relation to the creation of internal damage in the structure. Impedance measurement is a well-accepted method in the piezoelectric sensor area and has been proven by many authors to be useful for civil structure diagnosis. However, it normally requires sophisticated equipment and analysis technology. For more general and widespread application of the piezoelectric diagnosis tool, a new methodology is needed to overcome the limitations of impedance measurement. This paper presents the feasibility of a piezoelectric oscillator sensor to detect damage in civil infrastructures. The oscillator sensor is composed of an electronic feedback oscillator circuit and a piezoelectric thickness-mode vibrator to be attached to the structure of interest. Damage to the structure causes a change in the impedance spectrum of the structure, which results in a corresponding change of its resonant frequency. The oscillator sensor can instantly detect this frequency change in a very simple manner. The feasibility of the piezoelectric oscillator sensor was verified in this work with a sample aluminum plate in which artificial cracks of different depths were imposed in sequence. The validity of the measurement was confirmed by comparing the experimental data with the results of finite element analyses of the cracked plate. The performance of the oscillator sensor was also compared with that of its conventional counterpart, impedance measurement, to demonstrate the superiority of the oscillator sensor.
NASA Astrophysics Data System (ADS)
Kishore, G. V. K.; Kumar, Anish; Rajkumar, K. V.; Purnachandra Rao, B.; Pramanik, Debabrata; Kapoor, Komal; Jha, Sanjay Kumar
2017-12-01
The paper presents a new methodology for detection and evaluation of mild steel (MS) can material embedded into oxide dispersion strengthened (ODS) steel tubes by the magnetic Barkhausen emission (MBE) technique. The high-frequency MBE measurements (125 Hz sweep frequency and 70-200 kHz analyzing frequency) are found to be very sensitive for detecting the presence of MS on the surface of the ODS steel tube. However, due to the shallow depth of information from the high-frequency MBE measurements, they cannot be used to evaluate the thickness of the embedded MS. The low-frequency MBE measurements (0.5 Hz sweep frequency and 2-20 kHz analyzing frequency) indicate the presence of two MBE RMS voltage peaks corresponding to the MS and the ODS steel. The ratio of the two peaks changes with the thickness of the MS and hence can be used to measure the thickness of the MS layer.
Improved detection of radioactive material using a series of measurements
NASA Astrophysics Data System (ADS)
Mann, Jenelle
The goal of this project is to develop improved algorithms for the detection of radioactive sources that have low signal compared to background. The detection of low-signal sources is of interest in national security applications where the source may have weak ionizing radiation emissions, is heavily shielded, or the counting time is short (such as portal monitoring). Traditionally, to distinguish signal from background, the decision threshold (y*) is calculated by taking a long background count and limiting the false positive error (alpha error) to 5%. This method has some problems: the background is constantly changing due to natural environmental fluctuations, and large amounts of data taken as the detector continuously scans are not utilized. Rather than looking at a single measurement, this work investigates a series of N measurements and develops an appropriate decision rule for exceeding the decision threshold n times in a series of N. This methodology is investigated for rectangular, triangular, sinusoidal, Poisson, and Gaussian distributions.
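The n-of-N decision rule can be sketched with a simple binomial calculation. This is a stdlib sketch under the assumption of independent measurements; the 5% single-measurement false-alarm rate and N = 10 are illustrative values, not the project's actual parameters.

```python
import math

def series_false_alarm_prob(p_single, N, n):
    """P(at least n of N independent background measurements exceed y*),
    where each exceeds the single-measurement threshold with prob p_single."""
    return sum(math.comb(N, k) * p_single**k * (1 - p_single)**(N - k)
               for k in range(n, N + 1))

def required_n(p_single, N, alpha):
    """Smallest n such that demanding n-of-N exceedances keeps the
    series-level false-alarm probability at or below alpha."""
    for n in range(1, N + 1):
        if series_false_alarm_prob(p_single, N, n) <= alpha:
            return n
    return None

# Example: 5% single-measurement false-alarm rate, series of N = 10.
n = required_n(0.05, 10, 0.05)
print(n, series_false_alarm_prob(0.05, 10, n))
```

With these illustrative numbers, a single measurement exceeding y* is a weak indicator (it happens 40% of the time in pure background over 10 measurements), whereas requiring at least 3 exceedances in 10 drives the series-level false-alarm probability to roughly 1%.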
NASA Astrophysics Data System (ADS)
Rambaldi, Marcello; Filimonov, Vladimir; Lillo, Fabrizio
2018-03-01
Given a stationary point process, an intensity burst is defined as a short time period during which the number of counts is larger than the typical count rate. It might signal a local nonstationarity or the presence of an external perturbation to the system. In this paper we propose a procedure for the detection of intensity bursts within the Hawkes process framework. By using a model selection scheme we show that our procedure can be used to detect intensity bursts when both their occurrence time and their total number is unknown. Moreover, the initial time of the burst can be determined with a precision given by the typical interevent time. We apply our methodology to the midprice change in foreign exchange (FX) markets showing that these bursts are frequent and that only a relatively small fraction is associated with news arrival. We show lead-lag relations in intensity burst occurrence across different FX rates and we discuss their relation with price jumps.
Damage identification via asymmetric active magnetic bearing acceleration feedback control
NASA Astrophysics Data System (ADS)
Zhao, Jie; DeSmidt, Hans; Yao, Wei
2015-04-01
A Floquet-based damage detection methodology for cracked rotor systems is developed and demonstrated on a shaft-disk system. This approach utilizes measured changes in the system natural frequencies to estimate the severity and location of shaft structural cracks during operation. In the damage detection algorithm, an initial guess is obtained by the least-squares method and the damage parameter vector is then refined iteratively through eigenvector updating. An Active Magnetic Bearing is introduced to break the symmetry of the rotor system, and the tuning range of proper stiffness/virtual mass gains is studied. The system model is built with the energy method, and the equations of motion are derived by applying the assumed modes method and the Lagrange principle. In addition, the crack model is based on the Strain Energy Release Rate (SERR) concept from fracture mechanics. Finally, the method is synthesized via harmonic balance, and numerical examples for a shaft/disk system demonstrate its effectiveness in detecting both the location and severity of the structural damage.
HYPNOTIC TACTILE ANESTHESIA: Psychophysical and Signal-Detection Analyses
Tataryn, Douglas J.; Kihlstrom, John F.
2017-01-01
Two experiments that studied the effects of hypnotic suggestions on tactile sensitivity are reported. Experiment 1 found that suggestions for anesthesia, as measured by both traditional psychophysical methods and signal detection procedures, were linearly related to hypnotizability. Experiment 2 employed the same methodologies in an application of the real-simulator paradigm to examine the effects of suggestions for both anesthesia and hyperesthesia. Significant effects of hypnotic suggestion on both sensitivity and bias were found in the anesthesia condition but not for the hyperesthesia condition. A new bias parameter, C′, indicated that much of the bias found in the initial analyses was artifactual, a function of changes in sensitivity across conditions. There were no behavioral differences between reals and simulators in any of the conditions, though analyses of postexperimental interviews suggested the 2 groups had very different phenomenal experiences. PMID:28230465
Landsat-based trend analysis of lake dynamics across northern permafrost regions
Nitze, Ingmar; Grosse, Guido; Jones, Benjamin M.; Arp, Christopher D.; Ulrich, Mathias; Federov, Alexander; Veremeeva, Alexandra
2017-01-01
Lakes are a ubiquitous landscape feature in northern permafrost regions. They have a strong impact on carbon, energy and water fluxes and can be quite responsive to climate change. Monitoring lake change in northern high latitudes, at a sufficiently accurate spatial and temporal resolution, is crucial for understanding the underlying processes driving lake change. To date, lake change studies in permafrost regions have been based on a variety of different sources, image acquisition periods, single snapshots, and localized analyses, which hinders the comparison of different regions. Here we present a methodology, based on machine-learning classification of robust trends of multi-spectral indices of Landsat data (TM, ETM+, OLI) and object-based lake detection, to analyze and compare the individual, local and regional lake dynamics of four different study sites (Alaska North Slope, Western Alaska, Central Yakutia, Kolyma Lowland) in the northern permafrost zone from 1999 to 2014. Regional patterns of lake area change on the Alaska North Slope (-0.69%), in Western Alaska (-2.82%), and in the Kolyma Lowland (-0.51%) include increases due to thermokarst lake expansion but are dominated by lake area losses due to catastrophic lake drainage events. In contrast, Central Yakutia showed a remarkable increase in lake area of 48.48%, likely resulting from warmer and wetter climate conditions over the latter half of the study period. Within all study regions, variability in lake dynamics was associated with differences in permafrost characteristics, landscape position (i.e. upland vs. lowland), and surface geology. With the global availability of Landsat data and a consistent methodology for processing input data derived from robust trends of multi-spectral indices, we demonstrate the transferability, scalability and consistency of lake change analysis within the northern permafrost region.
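The "robust trends of multi-spectral indices" step can be illustrated with a Theil-Sen slope estimator, a common robust trend statistic. This is a minimal stdlib sketch with made-up NDVI values; the study's actual trend method, index choice, and thresholds are not specified here.

```python
from statistics import median

def theil_sen_slope(years, values):
    """Robust trend estimate: the median of all pairwise slopes.
    Insensitive to isolated outliers such as cloud- or snow-contaminated
    observations that would distort an ordinary least-squares fit."""
    slopes = [(values[j] - values[i]) / (years[j] - years[i])
              for i in range(len(years)) for j in range(i + 1, len(years))]
    return median(slopes)

# Illustrative annual peak-summer NDVI for one pixel, 1999-2014,
# with a gradual decline and one contaminated-year outlier (2004).
years = list(range(1999, 2015))
ndvi = [0.62, 0.61, 0.60, 0.60, 0.59, 0.10, 0.58, 0.57,
        0.56, 0.56, 0.55, 0.54, 0.53, 0.53, 0.52, 0.51]
slope = theil_sen_slope(years, ndvi)
print(round(slope, 4))  # a small negative trend despite the 2004 outlier
```

Applied per pixel across index time series, such robust slopes form the feature layer that a subsequent classifier (machine-learning based, in the study) can use to separate stable, expanding, and drained lake areas.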
NASA Astrophysics Data System (ADS)
Mills, R. T.; Kumar, J.; Hoffman, F. M.; Hargrove, W. W.; Spruce, J.
2011-12-01
Variations in vegetation phenology, the annual temporal pattern of leaf growth and senescence, can be a strong indicator of ecological change or disturbance. However, phenology is also strongly influenced by seasonal, interannual, and long-term trends in climate, making the identification of changes in forest ecosystems a challenge. Forest ecosystems are vulnerable to extreme weather events, insect and disease attacks, wildfire, harvesting, and other land use change. The normalized difference vegetation index (NDVI), a remotely sensed measure of greenness, provides a proxy for phenology. NDVI for the conterminous United States (CONUS) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 250 m resolution was used in this study to develop phenological signatures of ecological regimes called phenoregions. By applying a quantitative data mining technique to the NDVI measurements for every eight days over the entire MODIS record, annual maps of phenoregions were developed. This geospatiotemporal cluster analysis technique employs high performance computing resources, enabling analysis of such very large data sets. The technique produces a prescribed number of prototypical phenological states to which every location belongs in any year. Analysis of the shifts among phenological states yields information about responses to interannual climate variability and, more importantly, changes in ecosystem health due to disturbances. Moreover, a large change in the phenological states occupied by a single location over time indicates a significant disturbance or ecological shift. This methodology has been applied to the identification of various forest disturbance events, including wildfire, tree mortality due to Mountain Pine Beetle, other insect infestations and diseases, and extreme events such as storms and hurricanes in the U.S. Results from the analysis of phenological state dynamics are presented, along with disturbance and validation data.
Polyion selective polymeric membrane-based pulstrode as a detector in flow-injection analysis.
Bell-Vlasov, Andrea K; Zajda, Joanna; Eldourghamy, Ayman; Malinowska, Elzbieta; Meyerhoff, Mark E
2014-04-15
A method for the detection of polyions using fully reversible polyion selective polymeric membrane type pulstrodes as detectors in a flow-injection analysis (FIA) system is examined. The detection electrode consists of a plasticized polymeric membrane doped with 10 wt % of tridodecylmethylammonium-dinonylnaphthalene sulfonate (TDMA/DNNS) ion-exchanger salt. The pulse sequence used involves a short (1 s) galvanostatic pulse, an open-circuit pulse (0.5 s) during which the EMF of the cell is measured, and a longer (15 s) potentiostatic pulse to return the membrane to its original chemical composition. It is shown that total pulse sequence times can be optimized to yield reproducible real-time detection of injected samples of protamine and heparin at up to 20 samples/h. Further, it is shown that the same membrane detector can be employed for FIA detection of both polycations at levels ≥10 μg/mL and polyanions at levels of ≥40 μg/mL by changing the direction of the galvanostatic pulse. The methodology described may also be applicable in the detection of polyionic species at low levels in other flowing configurations, such as in liquid chromatography and capillary electrophoresis.
Detecting the manipulation of digital clinical records in dental practice.
Díaz-Flores-García, V; Labajo-González, E; Santiago-Sáez, A; Perea-Pérez, B
2017-11-01
Radiography provides many advantages in the diagnosis and management of dental conditions. However, dental X-ray images may be subject to manipulation with malicious intent using easily accessible computer software. In this study, we sought to evaluate dentists' ability to identify a manipulated dental X-ray image, when compared with the original, using a variant of the methodology described by Visser and Kruger. Sixty-six dentists were invited to participate and evaluate 20 intraoral dental X-ray images, 10 originals and 10 modified, manipulated using Adobe Photoshop to simulate fillings, root canal treatments, etc. Participating dentists correctly identified the manipulated image in 56% of cases, 6% higher than chance and 10% more than in the study by Visser and Kruger. Malicious changes to dental X-ray images may go unnoticed even by experienced dentists. Professionals must be aware of the legal consequences of such changes, and a system for the detection/validation of radiographic images should be created. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
A novel method for measurement of MR fluid sedimentation and its experimental verification
NASA Astrophysics Data System (ADS)
Roupec, J.; Berka, P.; Mazůrek, I.; Strecker, Z.; Kubík, M.; Macháček, O.; Taheri Andani, M.
2017-10-01
This article presents a novel sedimentation measurement technique based on quantifying the changes in magnetic flux density when the magnetorheological fluid (MRF) passes through the air gap of a magnetic circuit. As a result of its increased iron content, the sedimented portion of the sample displays a higher magnetic conductivity than the unsedimented area, which contains fewer iron particles. The data analysis and evaluation methodology is elaborated along with an example set of measurements, which are compared against visual observations and available data in the literature. Experiments indicate that, unlike the existing methods, the new technique is able to accurately generate the complete curves of the sedimentation profile in long-term sedimentation. The proposed method successfully detects the area with the tightest particle configuration near the bottom (the 'cake' layer). It also addresses the issue of an unclear boundary developing between the carrier fluid and the sediment (mudline) during an accelerated sedimentation process, improves the sensitivity of sedimentation detection, and accurately measures the changes in particle concentration with high resolution.
Detection of melting by X-ray imaging at high pressure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Li; Weidner, Donald J.
2014-06-15
The occurrence of partial melting at elevated pressure and temperature is documented in real time through measurement of volume strain induced by a fixed temperature change. Here we present the methodology for measuring volume strains to one part in 10⁻⁴ for mm³-sized samples in situ as a function of time during a step in temperature. By calibrating the system for sample thermal expansion at temperatures lower than the solidus, the onset of melting can be detected when the melting volume increase is of comparable size to the thermal expansion induced volume change. We illustrate this technique with a peridotite sample at 1.5 GPa during partial melting. The Re capsule is imaged with a CCD camera at 20 frames/s. Temperature steps of 100 K induce volume strains that triple with melting. The analysis relies on image comparison for strain determination, and the thermal inertia of the sample is clearly seen in the time history of the volume strain. Coupled with a thermodynamic model of the melting, we infer that we identify melting with 2 vol.% melt.
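The detection logic described above reduces to a simple comparison: flag melting when the measured volume strain per temperature step exceeds what the calibrated thermal expansion alone predicts. The expansivity, step size, and tolerance factor below are illustrative assumptions, not values from the paper.

```python
def expected_expansion_strain(alpha_v, dT):
    """Volume strain predicted from the calibrated volumetric thermal
    expansivity alpha_v (1/K) over a temperature step dT (K)."""
    return alpha_v * dT

def melt_detected(measured_strain, alpha_v, dT, tolerance=2.0):
    """Flag melting when the measured volume strain exceeds the
    calibrated thermal-expansion strain by a chosen factor."""
    return measured_strain > tolerance * expected_expansion_strain(alpha_v, dT)

# Illustrative numbers: alpha_v ~ 3e-5 / K, a 100 K temperature step.
alpha_v, dT = 3e-5, 100.0
thermal = expected_expansion_strain(alpha_v, dT)  # purely thermal strain
print(melt_detected(3 * thermal, alpha_v, dT))    # strain tripled: melt flagged
print(melt_detected(1.05 * thermal, alpha_v, dT)) # normal expansion: no melt
```

The abstract's observation that strains "triple with melting" is exactly the kind of excess over the sub-solidus calibration that such a comparison isolates.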
Intelligent-based Structural Damage Detection Model
NASA Astrophysics Data System (ADS)
Lee, Eric Wai Ming; Yu, Kin Fung
2010-05-01
This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure, thereby changing the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location in the structure. Simulated data are employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study demonstrate the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples are noise-contaminated.
Renal Carcinogenesis, Tumor Heterogeneity, and Reactive Oxygen Species: Tactics Evolved
Shanmugasundaram, Karthigayan
2016-01-01
Abstract Significance: The number of kidney cancers is growing 3–5% each year due to unknown etiologies. Intra- and inter-tumor mediators increase oxidative stress and drive tumor heterogeneity. Recent Advances: Technology advancement in state-of-the-art instrumentation and methodologies allows researchers to detect and characterize global landscaping modifications in genes, proteins, and pathophysiology patterns at the single-cell level. Critical Issues: We postulate that the sources of reactive oxygen species (ROS) and their activation within subcellular compartments will change over a timeline of tumor evolvement and contribute to tumor heterogeneity. Therefore, the complexity of intracellular changes within a tumor and ROS-induced tumor heterogeneity coupled to the advancement of detecting these events globally are limited at the level of data collection, organization, and interpretation using software algorithms and bioinformatics. Future Directions: Integrative and collaborative research, combining the power of numbers with careful experimental design, protocol development, and data interpretation, will translate cancer biology and therapeutics to a heightened level or leave the abundant raw data as stagnant and underutilized. Antioxid. Redox Signal. 25, 685–701. PMID:27287984
76 FR 62632 - NARA Records Reproduction Fees
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-11
... methodology for creating and changing records reproduction fees, to remove records reproduction fees found in... add the methodology for creating and changing records reproduction fees, to remove records...
French, Deborah; Smith, Andrew; Powers, Martin P; Wu, Alan H B
2011-08-17
Binding of a ligand to the epidermal growth factor receptor (EGFR) stimulates various intracellular signaling pathways resulting in cell cycle progression, proliferation, angiogenesis and apoptosis inhibition. KRAS is involved in signaling pathways including RAF/MAPK and PI3K, and mutations in this gene result in constitutive activation of these pathways, independent of EGFR activation. Seven mutations in codons 12 and 13 of KRAS comprise around 95% of the observed human mutations, rendering monoclonal antibodies against EGFR (e.g. cetuximab and panitumumab) useless in treatment of colorectal cancer. KRAS mutation testing by two different methodologies was compared: Sanger sequencing and the AutoGenomics INFINITI® assay, on DNA extracted from colorectal cancers. Of the 29 colorectal tumor samples tested, 28 were concordant between the two methodologies for the KRAS mutations detected in both assays; the INFINITI® assay detected a mutation in one sample that was indeterminate by Sanger sequencing and by a third methodology, single-nucleotide primer extension. This study indicates the utility of the AutoGenomics INFINITI® methodology in a clinical laboratory setting where technical expertise or access to equipment for DNA sequencing does not exist. Copyright © 2011 Elsevier B.V. All rights reserved.
Vanegas, Fernando; Weiss, John; Gonzalez, Felipe
2018-01-01
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne datasets with ground-based datasets. Finally, we present relevant results on the correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remotely sense different variables in different applications, such as, in this case, plant pest surveillance. Such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
Auditing as part of the terminology design life cycle.
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology's concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert's manual review on portions of the concepts with a high likelihood of errors.
NASA Astrophysics Data System (ADS)
Ali-Alvarez, S.; Ferdinand, P.; Magne, S.; Nogueira, R. P.
2013-04-01
Corrosion of reinforcing bar (rebar) in concrete structures is a major issue in civil engineering works, and its detection and monitoring remain a challenge for applied research. In this work, we present a new methodology for corrosion detection in reinforced concrete structures that combines Fiber Bragg Grating (FBG) sensors with the electrochemical and physical properties of rebar in a simplified assembly. Tests in electrolytic solutions and in concrete were performed for both pitting and general corrosion. The proposed Structural Health Monitoring (SHM) methodology constitutes a direct corrosion measurement, potentially useful for implementing or improving Condition-Based Maintenance (CBM) programs for civil engineering concrete structures.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become ubiquitous, and with that ubiquity illicit events have grown, making security an essential concern in any computing environment. Many methodologies have been created to identify such events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The approach addresses anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a timeframe suited to the application.
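As a rough illustration of the statistical idea (not the authors' actual NetFlow model), a minimal anomaly detector can flag time windows whose flow counts deviate strongly from the series mean:

```python
import statistics

def flag_anomalies(flow_counts, z_thresh=3.0):
    """Flag time windows whose flow count deviates from the series
    mean by more than z_thresh standard deviations."""
    mu = statistics.mean(flow_counts)
    sigma = statistics.pstdev(flow_counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(flow_counts)
            if abs(c - mu) / sigma > z_thresh]

# hypothetical flows-per-minute exported by a NetFlow collector
counts = [120, 130, 125, 118, 122, 900, 127, 121]
print(flag_anomalies(counts, z_thresh=2.0))  # flags the 900-flow spike: [5]
```

A production system would compute such statistics per service, per time-of-day, and over sliding windows, but the thresholding principle is the same.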
ERIC Educational Resources Information Center
Lin, Angel
2013-01-01
Contemporary TESOL methodologies have been characterized by compartmentalization of languages in the classroom. However, recent years have seen the beginning signs of paradigmatic change in TESOL methodologies that indicate a move toward plurilingualism. In this article, the author draws on the case of Hong Kong to illustrate how, in the past four…
... estimating the gestational age of a newborn. These methodological changes prevent the direct comparison of trends prior ... high of 12.8 percent in 2006. A methodological change caused a sharp decline from 2006 to ...
Intercomparison of mid latitude storm diagnostics (IMILAST)
NASA Astrophysics Data System (ADS)
Neu, U.
2009-04-01
Diagnostics of observed extratropical storms and projections of their future changes are a key issue, e.g., for insurance companies, risk management, and adaptation planning. Storm-associated damages are among the highest losses due to natural disasters in the mid-latitudes. Knowledge of the future variability and change in extratropical cyclone frequency, intensity, and track locations is therefore crucial for strategic planning and for minimizing disaster impacts. Future changes in the total number of storms might be small, but major signals could occur in characteristics of the cyclone life cycle such as intensity, lifetime, and track location. The quantification of such trends is not independent of the storm-track detection methodologies applied to observational data and models. Differences in cyclone characteristics obtained using different methods on a single data set may be as large as, or even exceed, the differences between results derived from different data sets using a single methodology. Moreover, the metrics used are particularly sensitive, so scientific studies may reach seemingly contradictory results based on the same datasets. For users of storm track analyses and projections, such results are very difficult to interpret. It would therefore be very helpful if the research community provided a kind of "handbook" containing definitions and descriptions of the available identification and tracking schemes, as well as of the parameters used to quantify cyclone activity. No single optimum or standard scheme can be expected to fulfill all needs. Rather, proper knowledge of the advantages and restrictions of the different schemes is needed to provide a synthesis of results, rather than puzzling the scientific community and the general public with apparently contradicting statements.
The IMILAST project aims to provide a systematic intercomparison of different methodologies and a comprehensive assessment of all types of uncertainty inherent in mid-latitude storm tracking, by comparing the different methodologies on data of different temporal and spatial resolution and over limited areas, for both cyclone identification and cyclone tracking.
Trend Change Detection in NDVI Time Series: Effects of Inter-Annual Variability and Methodology
NASA Technical Reports Server (NTRS)
Forkel, Matthias; Carvalhais, Nuno; Verbesselt, Jan; Mahecha, Miguel D.; Neigh, Christopher S.R.; Reichstein, Markus
2013-01-01
Changing trends in ecosystem productivity can be quantified using satellite observations of the Normalized Difference Vegetation Index (NDVI). However, trend estimates from NDVI time series differ substantially depending on the analyzed satellite dataset, the corresponding spatiotemporal resolution, and the applied statistical method. Here we compare the performance of a wide range of trend estimation methods and demonstrate that performance decreases with increasing inter-annual variability in the NDVI time series. Trend slope estimates based on annually aggregated time series, or on a seasonal-trend model, perform better than methods that remove the seasonal cycle of the time series. A breakpoint detection analysis reveals that an overestimation of breakpoints in NDVI trends can result in wrong or even opposite trend estimates. Based on our results, we give practical recommendations for applying trend methods to long-term NDVI time series. In particular, we apply and compare the different methods on NDVI time series in Alaska, where both greening and browning trends have been previously observed, and quantify the multi-method uncertainty of NDVI trends through the application of the different trend estimation methods. Our results indicate that greening NDVI trends in Alaska are more spatially and temporally prevalent than browning trends. We also show that detected breakpoints in NDVI trends tend to coincide with large fires. Overall, our analyses demonstrate that seasonal trend methods need to be made more robust against inter-annual variability to quantify changing trends in ecosystem productivity with higher accuracy.
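The recommended strategy of aggregating to annual means before fitting a trend can be sketched as follows; the 12-samples-per-year layout and the synthetic greening series are assumptions for illustration, not the paper's data:

```python
import math
import statistics

def annual_trend(ndvi, per_year=12):
    """Aggregate a monthly NDVI series to annual means, then fit an
    ordinary least-squares slope (NDVI change per year)."""
    years = [ndvi[i:i + per_year] for i in range(0, len(ndvi), per_year)]
    means = [statistics.mean(y) for y in years]
    t = list(range(len(means)))
    t_bar, m_bar = statistics.mean(t), statistics.mean(means)
    return (sum((ti - t_bar) * (mi - m_bar) for ti, mi in zip(t, means))
            / sum((ti - t_bar) ** 2 for ti in t))

# synthetic greening series: seasonal cycle + 0.01/yr linear trend
series = [0.4 + 0.2 * math.sin(2 * math.pi * m / 12) + 0.01 * (m / 12)
          for m in range(10 * 12)]
print(round(annual_trend(series), 4))  # recovers the 0.01/yr trend
```

Averaging over whole years cancels the seasonal cycle exactly, which is why this route is less sensitive to inter-annual noise than deseasonalizing the monthly series first.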
Radhakrishnan, Rajiv; Kiluk, Brian D; Tsai, Jack
2016-03-01
Cognitive remediation (CR) has been found to improve cognitive performance among adults with schizophrenia in randomized controlled trials (RCTs). However, improvements in cognitive performance are often observed in the control groups of RCTs as well. There has been no comprehensive examination of change in control groups for CR, which may inform trial methodology and improve our understanding of measured outcomes for cognitive remediation. In this meta-analysis, we calculated pre-post change in cognitive test performance within the control groups of 32 CR trials (n = 794 participants) published between 1970 and 2011, and examined the association between pre-post change and sample size, duration of treatment, type of control group, and participants' age, intelligence, duration of illness, and psychiatric symptoms. Results showed that control groups in CR trials exhibited small effect-size changes (Cohen's d = 0.12 ± 0.16) in cognitive test performance over the trial duration. Study characteristics associated with pre-post change included participant age and sample size. These findings suggest that attention to change in control groups may help improve detection of cognitive remediation effects in schizophrenia.
A new methodology for monitoring wood fluxes in rivers using a ground camera: Potential and limits
NASA Astrophysics Data System (ADS)
Benacchio, Véronique; Piégay, Hervé; Buffin-Bélanger, Thomas; Vaudor, Lise
2017-02-01
Ground imagery, which produces large amounts of valuable data at high frequencies, is increasingly used by fluvial geomorphologists to survey and understand processes. While such technology provides immense quantities of information, it can be challenging to analyze and requires automatization and associated development of new methodologies. This paper presents a new approach to automate the processing of image analysis to monitor wood delivery from the upstream Rhône River (France). The Génissiat dam is used as an observation window; all pieces of wood coming from the catchment are trapped here, hence a wood raft accumulates over time. In 2011, we installed an Axis 211W camera to acquire oblique images of the reservoir every 10 min with the goal of automatically detecting a wood raft area, in order to transform it to wood weight (t) and flux (t/d). The methodology we developed is based on random forest classification to detect the wood raft surface over time, which provided a good classification rate of 97.2%. Based on 14 mechanical wood extractions that included weight of wood removed each time, conducted during the survey period, we established a relationship between wood weight and wood raft surface area observed just before the extraction (R2 = 0.93). We found that using such techniques to continuously monitor wood flux is difficult because the raft undergoes very significant changes through time in terms of density, with a very high interday and intraday variability. Misclassifications caused by changes in weather conditions can be mitigated as well as errors from variation in pixel resolution (owing to camera position or window size), but a set of effects on raft density and mobility must still be explored (e.g., dam operation effects, wind on the reservoir surface). At this stage, only peak flow contribution to wood delivery can be well calculated, but determining an accurate, continuous series of wood flux is not possible. 
Several recommendations are made in terms of maximizing the potential benefit of such monitoring.
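The area-to-weight conversion step can be illustrated with a plain least-squares fit; the (area, weight) pairs below are synthetic stand-ins, not the Génissiat measurements:

```python
def fit_line(xs, ys):
    """Least-squares fit y = a*x + b relating raft surface area (m^2)
    to extracted wood weight (t)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    a = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
    b = y_bar - a * x_bar
    return a, b

# hypothetical (area m^2, weight t) pairs from successive extractions
areas = [500, 1200, 2100, 3300, 4000]
weights = [12, 30, 55, 80, 101]
a, b = fit_line(areas, weights)
print(f"weight ~= {a:.3f} * area + {b:.2f}")
est = a * 2500 + b  # estimated weight for a 2500 m^2 raft
```

In practice the residual scatter of such a fit reflects exactly the density variability the paper identifies as the main obstacle to a continuous wood-flux series.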
NASA Astrophysics Data System (ADS)
Schweier, C.; Markus, M.; Steinle, E.
2004-04-01
Catastrophic events like strong earthquakes can cause heavy losses of life and economic value. More efficient reconnaissance techniques could help reduce the loss of life, as many victims die after, rather than during, the event. A basic prerequisite for improving rescue teams' work is better planning of the measures, which is only possible on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe University aims at developing a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, where most losses occur. In this paper, an approach for damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e., digital surface models (DSM) acquired by scanning an area with pulsed laser light. To date, no laserscanning-derived DSMs of areas that suffered earthquake damage were available to the authors. It was therefore necessary to simulate such data for the development of the damage detection methodology. In this paper, two different methodologies for simulating the data are presented. The first method is to create CAD models of undamaged buildings based on their construction plans and artificially alter them as if they had suffered serious damage. A laserscanning data set is then simulated from these models, which can be compared with real laserscanning data acquired of the buildings (in intact state). The other approach is to use measurements of actually damaged buildings and simulate their intact state.
It is possible to model the geometrical structure of these damaged buildings based on digital photography taken after the event by evaluating the images with photogrammetrical methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
42 CFR 405.504 - Determining prevailing charges.
Code of Federal Regulations, 2012 CFR
2012-10-01
...-farm business sector labor productivity. (3) If there is no methodological change, CMS publishes a... there are any other MEI methodological changes, they are published in the Federal Register with an...
Corrosion detection in steel-reinforced concrete using a spectroscopic technique
NASA Astrophysics Data System (ADS)
Garboczi, E. J.; Stutzman, P. E.; Wang, S.; Martys, N. S.; Hassan, A. M.; Duthinh, D.; Provenzano, V.; Chou, S. G.; Plusquellic, D. F.; Surek, J. T.; Kim, S.; McMichael, R. D.; Stiles, M. D.
2014-02-01
Detecting the early corrosion of steel that is embedded in reinforced concrete (rebar) is a goal that would greatly facilitate the inspection and measurement of corrosion in the US physical infrastructure. Since 2010, the National Institute of Standards and Technology (NIST) has been working on a large project to develop an electromagnetic (EM) probe that detects the specific corrosion products via spectroscopic means. Several principal iron corrosion products, such as hematite and goethite, are antiferromagnetic at field temperatures. At a given applied EM frequency, which depends on temperature, these compounds undergo a unique absorption resonance that identifies the presence of these particular iron corrosion products. The frequency of the resonances tends to be on the order of 100 GHz or higher, so transmitting EM waves through the cover concrete and back out again at a detectable level has been challenging. NIST has successfully detected these two iron corrosion products, and is developing equipment and methodologies that will be capable of penetrating the typical 50 mm of cover concrete in the field. The novel part of this project is the detection of specific compounds, rather than only geometrical changes in rebar cross-section. This method has the potential of providing an early-corrosion probe for steel in reinforced concrete, and for other applications where steel is covered by various layers and coatings.
Evolvement of Uniformity and Volatility in the Stressed Global Financial Village
Kenett, Dror Y.; Raddant, Matthias; Lux, Thomas; Ben-Jacob, Eshel
2012-01-01
Background In the current era of strong worldwide market couplings, the global financial village has become highly prone to systemic collapses, events that can rapidly sweep throughout the entire village. Methodology/Principal Findings We present a new methodology to assess and quantify inter-market relations. The approach is based on the correlations between the market index, the index volatility, the market Index Cohesive Force, and the meta-correlations (correlations between the intra-correlations). We investigated the relations between six important world markets (U.S., U.K., Germany, Japan, China, and India) from January 2000 until December 2010. We found that while the developed "western" markets (U.S., U.K., Germany) are highly correlated, the interdependencies between these markets and the developing "eastern" markets (India and China) are volatile, with noticeable maxima at times of global world events. The Japanese market switches "identity": it alternates between periods of high meta-correlations with the "western" markets and periods when it behaves more like the "eastern" markets. Conclusions/Significance The methodological framework presented here provides a way to quantify the evolution of interdependencies in the global market, evaluate a world financial network, and quantify changes in world inter-market relations. Such changes can be used as precursors to the agitation of the global financial village. Hence, the new approach can help develop a sensitive "financial seismograph" to detect early signs of global financial crises so they can be treated before they develop into worldwide events. PMID:22347444
NASA Astrophysics Data System (ADS)
Duffy, James P.; Pratt, Laura; Anderson, Karen; Land, Peter E.; Shutler, Jamie D.
2018-01-01
Seagrass ecosystems are highly sensitive to environmental change. They are also in global decline and under threat from a variety of anthropogenic factors. There is now an urgency to establish robust monitoring methodologies so that changes in seagrass abundance and distribution in these sensitive coastal environments can be understood. Typical monitoring approaches have included remote sensing from satellites and airborne platforms, ground based ecological surveys and snorkel/scuba surveys. These techniques can suffer from temporal and spatial inconsistency, or are very localised making it hard to assess seagrass meadows in a structured manner. Here we present a novel technique using a lightweight (sub 7 kg) drone and consumer grade cameras to produce very high spatial resolution (∼4 mm pixel⁻¹) mosaics of two intertidal sites in Wales, UK. We present a full data collection methodology followed by a selection of classification techniques to produce coverage estimates at each site. We trialled three classification approaches of varying complexity to investigate and illustrate the differing performance and capabilities of each. Our results show that unsupervised classifications perform better than object-based methods in classifying seagrass cover. We also found that the more sparsely vegetated of the two meadows studied was more accurately classified: it had a lower root mean squared deviation (RMSD) between observed and classified coverage (9-9.5%) compared to a more densely vegetated meadow (RMSD 16-22%). Furthermore, we examine the potential to detect other biotic features, finding that lugworm mounds can be detected visually at coarser resolutions such as 43 mm pixel⁻¹, whereas smaller features such as cockle shells within seagrass require finer grained data (<17 mm pixel⁻¹).
Intimate Partner Violence, 1993-2010
... appendix table 2 for standard errors. *Due to methodological changes, use caution when comparing 2006 NCVS criminal ...
Mesquita, D P; Dias, O; Amaral, A L; Ferreira, E C
2009-04-01
In recent years, a great deal of attention has been focused on research into activated sludge processes, where the solid-liquid separation phase is frequently considered critical due to the different problems that severely affect the compaction and settling of the sludge. Bearing that in mind, image analysis routines were developed in the Matlab environment, allowing the identification and characterization of microbial aggregates and protruding filaments in eight different wastewater treatment plants over a combined period of 2 years. Monitoring the activated sludge contents allowed the detection of bulking events, proving that the developed image analysis methodology is adequate for continuous examination of morphological changes in microbial aggregates and subsequent estimation of the sludge volume index, and a feasible method for the continuous monitoring of activated sludge systems and the identification of disturbances.
Early Warning Signals of Ecological Transitions: Methods for Spatial Patterns
Brock, William A.; Carpenter, Stephen R.; Ellison, Aaron M.; Livina, Valerie N.; Seekell, David A.; Scheffer, Marten; van Nes, Egbert H.; Dakos, Vasilis
2014-01-01
A number of ecosystems can exhibit abrupt shifts between alternative stable states. Because of their important ecological and economic consequences, recent research has focused on devising early warning signals for anticipating such abrupt ecological transitions. In particular, theoretical studies show that changes in spatial characteristics of the system could provide early warnings of approaching transitions. However, the empirical validation of these indicators lags behind their theoretical development. Here, we summarize a range of currently available spatial early warning signals, suggest potential null models to interpret their trends, and apply them to three simulated spatial data sets of systems undergoing an abrupt transition. In addition to providing a step-by-step methodology for applying these signals to spatial data sets, we propose a statistical toolbox that may be used to help detect approaching transitions in a wide range of spatial data. We hope that our methodology together with the computer codes will stimulate the application and testing of spatial early warning signals on real spatial data. PMID:24658137
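Two of the simplest spatial indicators, spatial variance and a lag-1 spatial autocorrelation (a Moran's-I-like statistic), can be computed on a raw grid snapshot as sketched below; the grids are toy examples, and a rise of either indicator across successive snapshots would be the warning signal:

```python
def spatial_variance(grid):
    """Variance of cell values over a 2-D grid (list of rows)."""
    vals = [v for row in grid for v in row]
    mu = sum(vals) / len(vals)
    return sum((v - mu) ** 2 for v in vals) / len(vals)

def lag1_spatial_corr(grid):
    """Correlation between each cell and its right/down neighbours,
    a Moran's-I-like indicator of rising spatial memory."""
    vals = [v for row in grid for v in row]
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    num, pairs = 0.0, 0
    rows, cols = len(grid), len(grid[0])
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((0, 1), (1, 0)):
                if i + di < rows and j + dj < cols:
                    num += (grid[i][j] - mu) * (grid[i + di][j + dj] - mu)
                    pairs += 1
    return (num / pairs) / var if var else 0.0

smooth = [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
noisy  = [[1, 4, 1, 4], [4, 1, 4, 1], [1, 4, 1, 4], [4, 1, 4, 1]]
# patchy/slow systems show positive correlation, noise shows negative
print(lag1_spatial_corr(smooth), lag1_spatial_corr(noisy))
```

Interpreting trends in these statistics still requires the null models the authors propose, since spatial heterogeneity alone can mimic an approaching transition.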
NASA Astrophysics Data System (ADS)
Meng, Xuelian
Urban land-use research is a key component in analyzing the interactions between human activities and environmental change. Researchers have conducted many experiments to classify urban or built-up land, forest, water, agriculture, and other land-use and land-cover types. Separating residential land uses from other land uses within urban areas, however, has proven to be surprisingly troublesome. Although high-resolution images have recently become more available for land-use classification, an increase in spatial resolution does not guarantee improved classification accuracy from traditional classifiers, due to the accompanying increase in class complexity. This research presents an approach to detect and separate residential land uses at the building scale directly from remotely sensed imagery to enhance urban land-use analysis. Specifically, the proposed methodology applies a multi-directional ground filter to generate a bare ground surface from lidar data, then utilizes a morphology-based building detection algorithm to identify buildings from lidar and aerial photographs, and finally separates residential buildings using a supervised C4.5 decision tree analysis based on seven selected building land-use indicators. Successful execution of this study produces three independent methods, each corresponding to a step of the methodology: lidar ground filtering, building detection, and building-based object-oriented land-use classification. Furthermore, this research provides a prototype as one of the few early explorations of building-based land-use analysis, with successful separation of more than 85% of residential buildings in an experiment on an 8.25 km² study site located in Austin, Texas.
NASA Astrophysics Data System (ADS)
Walicka, A.; Jóźków, G.; Borkowski, A.
2018-05-01
Fluvial transport is an important aspect of hydrological and geomorphological studies. Knowledge of the movement parameters of different-size fractions is essential in many applications, such as exploring watercourse changes, calculating river bed parameters, or investigating the frequency and nature of weather events. Traditional techniques used for fluvial transport investigations do not provide any information about the long-term horizontal movement of rocks. This information can be gained by means of terrestrial laser scanning (TLS). However, this is a complex issue consisting of several stages of data processing. In this study, a methodology for segmenting individual rocks from a TLS point cloud is proposed as the first step of a semi-automatic algorithm for detecting the movement of individual rocks. The proposed algorithm is executed in two steps. First, the point cloud is classified as rocks or background using only geometrical information. Second, the DBSCAN algorithm is executed iteratively on the points classified as rocks until only one rock is detected in each segment. The number of rocks in each segment is determined using principal component analysis (PCA) and a simple derivative method for peak detection. As a result, several segments corresponding to individual rocks are formed. Numerical tests were executed on two test samples, and the results of the semi-automatic segmentation were compared to those acquired by manual segmentation. The proposed methodology successfully segmented 76% and 72% of the rocks in test sample 1 and test sample 2, respectively.
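A loose sketch of the iterative-splitting idea follows. It is not the authors' implementation: a naive single-linkage grouping stands in for DBSCAN, and a simple spatial-extent heuristic stands in for the PCA/peak-count test that decides whether a segment still holds more than one rock.

```python
def cluster(points, eps):
    """Single-linkage grouping: points closer than eps join a cluster
    (a simple stand-in for DBSCAN)."""
    clusters = []
    for p in points:
        hits = [c for c in clusters
                if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps ** 2
                       for q in c)]
        merged = [p]
        for c in hits:
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)
    return clusters

def segment_rocks(points, eps, max_extent):
    """Iteratively re-cluster with a smaller eps any segment whose
    spatial extent suggests it still contains more than one rock
    (a crude proxy for the PCA/peak-detection test)."""
    out = []
    for c in cluster(points, eps):
        xs = [p[0] for p in c]
        ys = [p[1] for p in c]
        extent = max(max(xs) - min(xs), max(ys) - min(ys))
        if extent > max_extent and eps > 0.2:
            out.extend(segment_rocks(c, eps / 2, max_extent))
        else:
            out.append(c)
    return out

# two hypothetical rocks ~1.5 m apart, sampled sparsely (x, y in m)
rock_a = [(0.0, 0.0), (0.2, 0.1), (0.4, 0.0)]
rock_b = [(1.9, 0.0), (2.1, 0.1), (2.3, 0.0)]
segments = segment_rocks(rock_a + rock_b, eps=2.0, max_extent=1.0)
print(len(segments))  # 2
```

With the initial eps of 2.0 both rocks merge into one segment; the extent check triggers a re-clustering at eps = 1.0, which separates them, mirroring the paper's "split until one rock per segment" loop.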
Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals
Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios
2017-01-01
Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and the presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in the behavioral patterns of vessels (outliers) as they are tracked by an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation shows promising performance for the proposed methodology. PMID:29312449
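The reconstruction-error principle behind the approach can be demonstrated with a linear stand-in for a stacked autoencoder: project each track's feature vector onto its first principal component (found here by power iteration) and score it by how poorly it reconstructs. The vessel features below are hypothetical, and a real deep autoencoder would replace the linear projection with learned nonlinear encodings.

```python
def principal_axis(data, iters=100):
    """First principal component via power iteration (pure Python);
    acts as a one-unit linear autoencoder's weight vector."""
    d = len(data[0])
    mu = [sum(x[i] for x in data) / len(data) for i in range(d)]
    centered = [[x[i] - mu[i] for i in range(d)] for x in data]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(sum(c[i] * v[i] for i in range(d)) * c[j]
                 for c in centered) for j in range(d)]
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return mu, v

def reconstruction_errors(data):
    """Encode each feature vector to 1-D, decode it back, and report
    the residual; large residuals mark behaviour the model cannot
    explain (the outlier score)."""
    mu, v = principal_axis(data)
    errs = []
    for x in data:
        c = [xi - mi for xi, mi in zip(x, mu)]
        z = sum(ci * vi for ci, vi in zip(c, v))   # encode
        rec = [z * vi for vi in v]                 # decode
        errs.append(sum((ci - ri) ** 2
                        for ci, ri in zip(c, rec)) ** 0.5)
    return errs

# hypothetical (speed kn, heading-change rad/min) features per track;
# normal traffic varies in speed, the last track manoeuvres erratically
tracks = [(8.0, 0.1), (9.5, 0.12), (11.0, 0.09),
          (12.5, 0.11), (14.0, 0.1), (10.0, 3.0)]
errs = reconstruction_errors(tracks)
print(errs.index(max(errs)))  # 5
```

In the paper the density-based clustering plays a similar role to this threshold on the residual: points that no dense cluster of encodings can absorb are declared outliers.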
NASA Astrophysics Data System (ADS)
Wigal, Sharon B.; Polzonetti, Chiara M.; Stehli, Annamarie; Gratton, Enrico
2012-12-01
The beneficial effects of pharmacotherapy on children with attention-deficit hyperactivity disorder (ADHD) are well documented. We use near-infrared spectroscopy (NIRS) methodology to determine the reorganization of brain neurovascular properties following medication treatment. Twenty-six children with ADHD (ages six through 12) participated in a modified laboratory school protocol to monitor treatment response with lisdexamfetamine dimesylate (LDX; Vyvanse, Shire US Inc.). All children refrained from taking medication for at least two weeks (washout period). To detect neurovascular reorganization, we measured changes in the synchronization of oxy- (HbO2) and deoxyhemoglobin (HHb) waves between the two frontal lobes. Participants without medication displayed an average baseline HbO2 phase difference of about −7° and an HHb phase difference of about 240°. This phase synchronization index changed after pharmacological intervention: the average HbO2 phase difference shifted to 280° after the first medication and to 242° after medication optimization, while the average HHb phase difference shifted first to 186° and then, after medication optimization, to 120°. In agreement with the findings of White et al. and Varela et al., we associate the phase synchronization differences of brain hemodynamics in children with ADHD with lobe-specific hemodynamic reorganization of HbO2 and HHb oscillations following medication status.
Quality Control Methodology Of A Surface Wind Observational Database In North Eastern North America
NASA Astrophysics Data System (ADS)
Lucio-Eceiza, Etor E.; Fidel González-Rouco, J.; Navarro, Jorge; Conte, Jorge; Beltrami, Hugo
2016-04-01
This work summarizes the design and application of a Quality Control (QC) procedure for an observational surface wind database located in North Eastern North America. The database consists of 526 sites (486 land stations and 40 buoys) with varying resolutions of hourly, 3-hourly and 6-hourly data, compiled from three different source institutions with uneven measurement units and changing measuring procedures, instrumentation and heights. The records span from 1953 to 2010. The QC process is composed of different phases focused either on problems related to the providing source institutions or on measurement errors. The first phases deal with problems often related to data recording and management: (1) a compilation stage covering the detection of typographical errors, decoding problems, site displacements and the unification of institutional practices; (2) detection of erroneous duplications of data sequences within a station or among different ones; (3) detection of physically unrealistic data measurements. The last phases focus on instrumental errors: (4) problems related to low variability, placing particular emphasis on the detection of unrealistically low wind speed records with the help of regional references; (5) erroneous records related to high variability; (6) standardization of wind speed biases due to changing measurement heights, detection of wind speed biases on weekly to monthly timescales, and homogenization of wind direction records. As a result, around 1.7% of wind speed records and 0.4% of wind direction records have been deleted, a combined total of 1.9% of removed records. Additionally, around 15.9% of wind speed records and 2.4% of wind direction data have also been corrected.
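Two of the instrumental checks (physically unrealistic values and low-variability, stuck-sensor runs) can be sketched as range and constant-run tests; the thresholds below (a 75 m/s ceiling, a 24-record constant run) are illustrative assumptions, not the values used in the study:

```python
def qc_flags(speeds, max_speed=75.0, min_run=24):
    """Flag physically unrealistic wind speeds (negative or above a
    plausible ceiling) and suspiciously constant runs, which often
    indicate a stuck or faulty instrument."""
    flags = set()
    run_start = 0
    for i, v in enumerate(speeds):
        if v < 0 or v > max_speed:
            flags.add(i)
        if i and speeds[i] != speeds[i - 1]:
            if i - run_start >= min_run:
                flags.update(range(run_start, i))
            run_start = i
    if len(speeds) - run_start >= min_run:
        flags.update(range(run_start, len(speeds)))
    return sorted(flags)

# hourly series: one negative spike plus a 30-hour stuck-at-zero run
series = [3.0, 4.1, -2.0, 5.2] + [0.0] * 30 + [2.5, 3.3]
print(len(qc_flags(series)))  # 31 flagged records
```

Real QC would apply such tests per station with regional references, as the study does, but the flag-then-review structure is the same.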
NASA Astrophysics Data System (ADS)
Greef, Charles; Petropavlovskikh, Viatcheslav; Nilsen, Oyvind; Khattatov, Boris; Plam, Mikhail; Gardner, Patrick; Hall, John
2008-04-01
Small non-coding RNA sequences have recently been discovered as unique identifiers of certain bacterial species, raising the possibility that they can be used as highly specific Biowarfare Agent detection markers in automated field deployable integrated detection systems. Because they are present in high abundance they could allow genomic based bacterial species identification without the need for pre-assay amplification. Further, a direct detection method would obviate the need for chemical labeling, enabling a rapid, efficient, high sensitivity mechanism for bacterial detection. Surface Plasmon Resonance enhanced Common Path Interferometry (SPR-CPI) is a potentially market disruptive, high sensitivity dual technology that allows real-time direct multiplex measurement of biomolecule interactions, including small molecules, nucleic acids, proteins, and microbes. SPR-CPI measures differences in phase shift of reflected S and P polarized light under Total Internal Reflection (TIR) conditions at a surface, caused by changes in refractive index induced by biomolecular interactions within the evanescent field at the TIR interface. The measurement is performed on a microarray of discrete 2-dimensional areas functionalized with biomolecule capture reagents, allowing simultaneous measurement of up to 100 separate analytes. The optical beam encompasses the entire microarray, allowing a solid state detector system with no scanning requirement. Output consists of simultaneous voltage measurements proportional to the phase differences resulting from the refractive index changes from each microarray feature, and is automatically processed and displayed graphically or delivered to a decision making algorithm, enabling a fully automatic detection system capable of rapid detection and quantification of small nucleic acids at extremely sensitive levels. 
Proof-of-concept experiments on model systems and cell culture samples have demonstrated utility of the system, and efforts are in progress for full development and deployment of the device. The technology has broad applicability as a universal detection platform for BWA detection, medical diagnostics, and drug discovery research, and represents a new class of instrumentation as a rapid, high sensitivity, label-free methodology.
Burgués, Javier; Jiménez-Soto, Juan Manuel; Marco, Santiago
2018-07-12
The limit of detection (LOD) is a key figure of merit in chemical sensing. However, estimating this figure of merit is hindered by the non-linear calibration curves characteristic of semiconductor gas sensor technologies such as metal oxide (MOX) sensors, gasFETs, and thermoelectric sensors. Additionally, chemical sensors suffer from cross-sensitivities and temporal stability problems. Applying the International Union of Pure and Applied Chemistry (IUPAC) recommendations for univariate LOD estimation to non-linear semiconductor gas sensors is not straightforward because of the strong statistical requirements of the IUPAC methodology (linearity, homoscedasticity, normality). Here, we propose a methodological approach to LOD estimation through linearized calibration models. As an example, the methodology is applied to the detection of low concentrations of carbon monoxide using MOX gas sensors in a scenario where the main source of error is the presence of uncontrolled levels of humidity. Copyright © 2018 Elsevier B.V. All rights reserved.
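The linearized-calibration route to a univariate LOD can be sketched as follows. This is a minimal illustration of the generic IUPAC-style rule LOD = k·s_blank/slope (k = 3.3) applied after linearization; the function name and inputs are illustrative assumptions, not the paper's exact procedure:

```python
import statistics

def lod_linearized(concentrations, responses, blank_responses, k=3.3):
    """Estimate a limit of detection from a linearized calibration curve.

    Fits response = a + b * concentration by ordinary least squares, then
    applies the common rule LOD = k * s_blank / slope (k = 3.3 per the
    usual IUPAC convention). Assumes `responses` have already been
    linearized (e.g. by a suitable transform of the raw sensor signal).
    """
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concentrations)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concentrations, responses))
    slope = sxy / sxx                      # least-squares slope b
    s_blank = statistics.stdev(blank_responses)  # blank-signal noise
    return k * s_blank / slope
```

With a perfectly linear calibration of slope 2 and blank noise of 0.1, this yields an LOD of 3.3 × 0.1 / 2 = 0.165 concentration units.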
Breast cancer statistics and prediction methodology: a systematic review and analysis.
Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal
2015-01-01
Breast cancer is a menacing cancer, primarily affecting women. Research into detecting breast cancer at an early stage is ongoing, as the prospects for a cure are brightest in the early stages. This study has two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies that can be helpful for early-stage detection of breast cancer, based on previous studies. Breast cancer incidence and mortality statistics for the UK, US, India and Egypt were considered for this study. The findings show that overall mortality rates in the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms. It provides a strong bridge toward improving the classification and detection accuracy of breast cancer data.
Harte, Richard; Quinlan, Leo R; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; ÓLaighin, Gearóid
2017-05-30
Design processes such as human-centered design (HCD), which involve the end user throughout the product development and testing process, can be crucial in ensuring that a product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of HCD can often conflict with the rapid product development life cycles demanded by the competitive connected health industry. The aim of this study was to apply a structured HCD methodology to the development of a smartphone app to be used within a connected health fall risk detection system. Our methodology utilizes so-called discount usability engineering techniques to minimize the burden on resources during development and maintain a rapid pace of development. This study provides prospective designers with a detailed description of the application of an HCD methodology. A 3-phase methodology was applied. In the first phase, a descriptive "use case" was developed by the system designers and analyzed by both expert stakeholders and end users. The use case described the use of the app, how various actors would interact with it, and in what context. A working app prototype and a user manual were then developed based on this feedback and subjected to a rigorous usability inspection. Further changes were made to both the interface and the support documentation. The now-advanced prototype then underwent testing by end users, who made further design recommendations. Combined expert and end-user analysis of the comprehensive use case originally identified 21 problems with the system interface; only 3 of these problems were observed during user testing, implying that 18 problems were eliminated between phases 1 and 3.
Satisfactory ratings were obtained during validation testing by both experts and end users, and final testing by users showed that the system imposes low mental, physical, and temporal demands according to the NASA Task Load Index (NASA-TLX). From our observation of older adults' interactions with smartphone interfaces, some recurring themes emerged. Clear and relevant feedback as the user attempts to complete a task is critical. Feedback should include pop-ups, sound tones, color or texture changes, or icon changes to indicate that a function has been completed successfully, such as for the connection sequence. For text feedback, clear and unambiguous language should be used so as not to create anxiety, particularly when it comes to saving data. Warning tones or symbols, such as caution symbols or shrill tones, should only be used if absolutely necessary. Our HCD methodology, designed and implemented based on the principles of the International Organization for Standardization (ISO) 9241-210 standard, produced a functional app interface within a short production cycle, which is now suitable for use by older adults in long-term clinical trials. ©Richard Harte, Leo R Quinlan, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Gearóid ÓLaighin. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 30.05.2017.
Ma, Zhanshan Sam
2018-05-01
Relatively little progress has been made in methodology for differentiating between healthy and diseased microbiomes beyond comparing microbial community diversities with traditional species richness or the Shannon index. Network analysis has increasingly been called on for the task, but most currently available microbiome datasets only allow the construction of simple species correlation networks (SCNs). The main results of SCN analysis are a series of network properties, such as network degree and modularity, but the metrics for these network properties often produce inconsistent evidence. We propose a simple new network property, the P/N ratio, defined as the ratio of the number of positive links to the number of negative links in the microbial SCN. We postulate that the P/N ratio should reflect the balance between facilitative and inhibitive interactions among microbial species, possibly one of the most important changes occurring in diseased microbiomes. We tested our hypothesis with five datasets representing five major human microbiome sites and discovered that the P/N ratio exhibits contrasting differences between healthy and diseased microbiomes. It may be harnessed as an in silico biomarker for detecting disease-associated changes in the human microbiome, and may play an important role in personalized diagnosis of human microbiome-associated diseases.
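The P/N ratio of a species correlation network can be computed directly from an abundance table. A minimal sketch, assuming Pearson correlation and a hypothetical ±0.6 threshold for declaring a link; the paper's actual network-construction criteria (correlation measure, significance filtering) may differ:

```python
from itertools import combinations
import math

def pn_ratio(abundances, threshold=0.6):
    """P/N ratio of a species correlation network (SCN).

    abundances: dict mapping species name -> list of abundances across
    samples. An edge is positive if Pearson r > threshold and negative
    if r < -threshold; the ratio is (# positive links) / (# negative links).
    """
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                        sum((b - my) ** 2 for b in y))
        return num / den if den else 0.0

    pos = neg = 0
    for s1, s2 in combinations(abundances, 2):
        r = pearson(abundances[s1], abundances[s2])
        if r > threshold:
            pos += 1
        elif r < -threshold:
            neg += 1
    return pos / neg if neg else float("inf")
```

For three species where A and B co-vary and C varies inversely to both, the network has one positive and two negative links, giving a P/N ratio of 0.5.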
Automated Corrosion Detection Program
2001-10-01
More detailed explanations of the methodology development can be found in Hidden Corrosion Detection Technology Assessment, a paper presented at ... Detection Program, a paper presented at the Fourth Joint DoD/FAA/NASA Conference on Aging Aircraft, 2000. AS&M PULSE. The PULSE system, developed ... selection can be found in The Evaluation of Hidden Corrosion Detection Technologies on the Automated Corrosion Detection Program, a paper presented
Repetitive deliberate fires: Development and validation of a methodology to detect series.
Bruenisholz, Eva; Delémont, Olivier; Ribaux, Olivier; Wilson-Wilde, Linzi
2017-08-01
The detection of repetitive deliberate fire events is challenging and still often ineffective due to a case-by-case approach. A previous study provided a critical review of the situation and an analysis of the main challenges. That study suggested that the intelligence process, integrating forensic data, could be a valid framework for systematic follow-up and analysis, provided it is adapted to the specificities of repetitive deliberate fires. In the current manuscript, a specific methodology to detect deliberate fire series, i.e. fires set by the same perpetrators, is presented and validated. It is based on case profiles built from specific elements previously identified. The method was validated using a dataset of approximately 8000 deliberate fire events collected over 12 years in a Swiss state. Twenty possible series were detected, including 6 of the 9 known series. These results are very promising and pave the way for systematic implementation of this methodology in an intelligence framework, while demonstrating the need for, and benefit of, increased collection of specific forensic information to strengthen the value of links between cases. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Lin, H.; Zhang, X.; Wu, X.; Tarnas, J. D.; Mustard, J. F.
2018-04-01
Quantitative analysis of hydrated minerals from hyperspectral remote sensing data is fundamental for understanding Martian geologic processes. Because of the difficulty of selecting endmembers from hyperspectral images, a sparse unmixing algorithm has previously been proposed for application to CRISM data of Mars. However, this becomes challenging when the endmember library grows large. Here, we propose a new methodology, termed Target Transformation Constrained Sparse Unmixing (TTCSU), to accurately detect hydrous minerals on Mars. A new version of the target transformation technique proposed in our recent work was used to obtain potential detections from CRISM data. Sparse unmixing, constrained with these detections as prior information, was applied to CRISM single-scattering albedo images, which were calculated using a Hapke radiative transfer model. This methodology increases the success rate of the automatic endmember selection of sparse unmixing and yields more accurate abundances. CRISM images of Southwest Melas Chasma, an area that has been well analyzed previously, were used to validate our methodology in this study. The sulfate jarosite was detected in Southwest Melas Chasma; its distribution is consistent with previous work and its abundance is comparable. More validations will be done in our future work.
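The constrained-unmixing idea, restricting the spectral library to endmembers flagged by a prior detection step and then fitting nonnegative abundances, can be sketched as below. The projected-gradient solver and the `selected` index list (standing in for target-transformation output) are illustrative assumptions, not the TTCSU implementation:

```python
def constrained_unmix(spectrum, library, selected, iters=5000, lr=0.01):
    """Sketch of detection-constrained unmixing.

    Restricts `library` (list of endmember spectra) to the indices in
    `selected` (mimicking prior detections from a target-transformation
    step), then solves the nonnegative least-squares problem
    min_x ||sum_j x_j * e_j - spectrum||^2, x >= 0, by projected gradient.
    """
    A = [library[j] for j in selected]   # m selected endmembers
    m, n = len(A), len(spectrum)
    x = [1.0 / m] * m                    # initial abundance estimates
    for _ in range(iters):
        # per-band residual of the current reconstruction
        r = [sum(A[j][i] * x[j] for j in range(m)) - spectrum[i]
             for i in range(n)]
        for j in range(m):
            g = sum(A[j][i] * r[i] for i in range(n))  # gradient component
            x[j] = max(0.0, x[j] - lr * g)             # project onto x >= 0
    return x
```

With a synthetic spectrum mixed as 0.3·e0 + 0.7·e1 from a three-endmember library, restricting the fit to the two true endmembers recovers abundances close to (0.3, 0.7).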
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on the observation of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
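The 90/95 POD criterion for hit-miss data can be checked with an exact binomial argument: if the true POD were only 90%, how likely is the observed number of hits? A minimal sketch (the function name and interface are illustrative, not DOEPOD's):

```python
from math import comb

def confidence_pod_exceeds(hits, trials, pod=0.90):
    """One-sided confidence that the true POD exceeds `pod`.

    Uses the exact binomial (Clopper-Pearson style) argument: computes
    P(X >= hits | p = pod) for X ~ Binomial(trials, pod) and returns
    1 minus that tail probability as the demonstrated confidence level.
    """
    tail = sum(comb(trials, k) * pod ** k * (1 - pod) ** (trials - k)
               for k in range(hits, trials + 1))
    return 1.0 - tail
```

For example, 29 hits in 29 trials demonstrates POD > 90% at better than 95% confidence (since 0.9^29 < 0.05), while 28 of 28 falls just short; this is the familiar 29-of-29 rule for the 90/95 criterion.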
Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria
2013-01-01
The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor-active compounds. The GC-O technique is already widely used for the evaluation of food aromas, and its application in environmental fields is increasing, thus moving odor emission assessment from solely olfactometric evaluations to characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potential of GC-O is described, highlighting the improvements this methodology offers relative to other conventional approaches used for odor detection, such as sensor-based, sensorial and traditional gas chromatographic methods. The paper also examines the different fields of application of GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities, odorous compounds emitted by materials, and medical applications. PMID:24316571
Larivière, Dominic; Tremblay, Mélodie; Durand-Jézéquel, Myriam; Tolmachev, Sergei
2012-04-01
This article describes a robust methodology using a combination of instrumental design (high matrix interface, HMI), sample dilution and internal standardization for the quantification of beryllium (Be) in various digested autopsy tissues using inductively coupled plasma mass spectrometry. The applicability of rhodium as a proper internal standard for Be was demonstrated in three types of biological matrices (i.e., femur, hair, and lung tissues). Using HMI, it was possible to achieve an instrumental detection limit and sensitivity of 0.6 ng L(-1) and 157 cps L ng(-1), respectively. The resilience of the HMI setup to high-salt matrices was also highlighted using a bone-mimicking solution ([Ca(2+)] = 26 to 1,400 mg L(-1)), providing a 14-fold increase in tolerance and a 2.7-fold decrease in method detection limit compared to optimized experimental conditions obtained without the HMI configuration. The precision of the methodology for detecting low levels of Be in autopsy samples was demonstrated using hair and blood certified reference materials. Be concentrations ranging from 0.015 to 255 μg kg(-1) were measured in autopsy samples obtained from the U.S. Transuranium and Uranium Registries using the presented methodology.
Empirical constrained Bayes predictors accounting for non-detects among repeated measures.
Moore, Reneé H; Lyles, Robert H; Manatunga, Amita K
2010-11-10
When the prediction of subject-specific random effects is of interest, constrained Bayes predictors (CB) have been shown to reduce the shrinkage of the widely accepted Bayes predictor while still maintaining desirable properties, such as optimizing mean-square error subsequent to matching the first two moments of the random effects of interest. However, occupational exposure and other epidemiologic (e.g. HIV) studies often present a further challenge because data may fall below the measuring instrument's limit of detection. Although methodology exists in the literature to compute Bayes estimates in the presence of non-detects (Bayes(ND)), CB methodology has not been proposed in this setting. By combining methodologies for computing CBs and Bayes(ND), we introduce two novel CBs that accommodate an arbitrary number of observable and non-detectable measurements per subject. Based on application to real data sets (e.g. occupational exposure, HIV RNA) and simulation studies, these CB predictors are markedly superior to the Bayes predictor and to alternative predictors computed using ad hoc methods in terms of meeting the goal of matching the first two moments of the true random effects distribution. Copyright © 2010 John Wiley & Sons, Ltd.
An empirical, integrated forest biomass monitoring system
NASA Astrophysics Data System (ADS)
Kennedy, Robert E.; Ohmann, Janet; Gregory, Matt; Roberts, Heather; Yang, Zhiqiang; Bell, David M.; Kane, Van; Hughes, M. Joseph; Cohen, Warren B.; Powell, Scott; Neeti, Neeti; Larrue, Tara; Hooper, Sam; Kane, Jonathan; Miller, David L.; Perkins, James; Braaten, Justin; Seidl, Rupert
2018-02-01
The fate of live forest biomass is largely controlled by growth and disturbance processes, both natural and anthropogenic. Thus, biomass monitoring strategies must characterize both the biomass of the forests at a given point in time and the dynamic processes that change it. Here, we describe and test an empirical monitoring system designed to meet those needs. Our system uses a mix of field data, statistical modeling, remotely-sensed time-series imagery, and small-footprint lidar data to build and evaluate maps of forest biomass. It ascribes biomass change to specific change agents, and attempts to capture the impact of uncertainty in methodology. We find that:
• A common image framework for biomass estimation and for change detection allows for consistent comparison of both state and change processes controlling biomass dynamics.
• Regional estimates of total biomass agree well with those from plot data alone.
• The system tracks biomass densities up to 450-500 Mg ha-1 with little bias, but begins underestimating true biomass as densities increase further.
• Scale considerations are important. Estimates at the 30 m grain size are noisy, but agreement at broad scales is good. Further investigation to determine the appropriate scales is underway.
• Uncertainty from methodological choices is evident, but much smaller than uncertainty based on choice of allometric equation used to estimate biomass from tree data.
• In this forest-dominated study area, growth and loss processes largely balance in most years, with loss processes dominated by human removal through harvest. In years with substantial fire activity, however, overall biomass loss greatly outpaces growth.
Taken together, our methods represent a unique combination of elements foundational to an operational landscape-scale forest biomass monitoring program.
Surveillance Systems for Waterborne Protozoa: Beyond Method 1623
1. Brief introduction to waterborne Cryptosporidium and Giardia Historical perspective on detecting Cryptosporidium and Giardia Current detection methodologies 2. US EPA’s waterborne protozoan research program Building a “Protoz...
Auditing as Part of the Terminology Design Life Cycle
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
Objective To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Design Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology’s concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. Results A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Conclusion Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert’s manual review on portions of the concepts with a high likelihood of errors. PMID:16929044
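The first partitioning step described above, dividing a terminology's concepts into areas by their role sets, can be sketched as a simple grouping operation. The data layout below is a hypothetical simplification of a real terminology, used only to show the idea:

```python
from collections import defaultdict

def build_area_taxonomy(concepts):
    """Group concepts into 'areas' by their exact set of roles.

    `concepts` maps a concept name to an iterable of role names. The
    returned dict maps a frozenset of roles (the area's signature) to
    the list of concepts exhibiting exactly those roles, mirroring the
    first partitioning step of the area-taxonomy approach.
    """
    areas = defaultdict(list)
    for name, roles in concepts.items():
        areas[frozenset(roles)].append(name)
    return dict(areas)
```

Each resulting area is structurally uniform; a further pass would split multi-rooted areas into singly-rooted partial-areas (p-areas) using the IS-A hierarchy, which this sketch omits.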
Detecting Role Errors in the Gene Hierarchy of the NCI Thesaurus
Min, Hua; Cohen, Barry; Halper, Michael; Oren, Marc; Perl, Yehoshua
2008-01-01
Gene terminologies are playing an increasingly important role in the ever-growing field of genomic research. While errors in large, complex terminologies are inevitable, gene terminologies are even more susceptible to them due to the rapid growth of genomic knowledge and the nature of its discovery. It is therefore very important to establish quality-assurance protocols for such genomic-knowledge repositories. Different kinds of terminologies oftentimes require auditing methodologies adapted to their particular structures. In light of this, an auditing methodology tailored to the characteristics of the NCI Thesaurus’s (NCIT’s) Gene hierarchy is presented. The Gene hierarchy is of particular interest to the NCIT’s designers due to the primary role of genomics in current cancer research. This multiphase methodology focuses on detecting role-errors, such as missing roles or roles with incorrect or incomplete target structures, occurring within that hierarchy. The methodology is based on two kinds of abstraction networks, called taxonomies, that highlight the role distribution among concepts within the IS-A (subsumption) hierarchy. These abstract views tend to highlight portions of the hierarchy having a higher concentration of errors. The errors found during an application of the methodology are reported. Hypotheses pertaining to the efficacy of our methodology are investigated. PMID:19221606
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure-critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways, and therefore their very existence identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of the adequacy of the POD estimates. Risks are further increased by the wide range of statistical methodologies used for determining POD capability. Receiver/Relative Operating Characteristics (ROC) display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on hit-miss data to reveal the framework of interrelationships between the ROC display, simple binomial, logistic regression, and Bayes' rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural, and validation requirements for establishing credible POD estimates.
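One of the methodologies named above, logistic regression on hit-miss data, can be sketched as a maximum-likelihood fit of a log-logistic POD curve. The parameterization POD(a) = 1/(1 + exp(-(b0 + b1·ln a))) and the plain gradient-ascent fitting loop are illustrative; production POD software uses more careful optimization and adds confidence-bound machinery:

```python
from math import exp, log

def fit_pod_logistic(flaw_sizes, hits, iters=20000, lr=0.05):
    """Fit POD(a) = 1 / (1 + exp(-(b0 + b1 * ln a))) to hit-miss data.

    `flaw_sizes` are positive flaw sizes; `hits` are 0/1 outcomes.
    Maximizes the Bernoulli log-likelihood by gradient ascent on the
    two coefficients (b0, b1) and returns them.
    """
    b0, b1 = 0.0, 1.0
    xs = [log(a) for a in flaw_sizes]
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, hits):
            p = 1.0 / (1.0 + exp(-(b0 + b1 * x)))  # predicted POD
            g0 += y - p                            # d(loglik)/d b0
            g1 += (y - p) * x                      # d(loglik)/d b1
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1
```

On any hit-miss dataset where detection improves with flaw size, the fitted slope b1 comes out positive, giving a monotonically increasing POD curve.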
Arendowski, Adrian; Nizioł, Joanna; Ruman, Tomasz
2018-04-01
A new methodology applicable to both high-resolution laser desorption/ionization mass spectrometry and mass spectrometry imaging of amino acids is presented. A matrix-assisted laser desorption/ionization-type target containing monoisotopic cationic 109Ag nanoparticles (109AgNPs) was used for rapid mass spectrometry measurements of 11 amino acids of different chemical properties. Amino acids were tested directly over a 100,000-fold concentration range, from 100 μg/mL to 1 ng/mL, which equates to 50 ng to 500 fg of amino acid per measurement spot. The limit of detection values obtained suggest that the presented method/target system is among the fastest and most sensitive in laser mass spectrometry. Mass spectrometry imaging of spots of human blood plasma spiked with amino acids showed their surface distribution, allowing optimization of quantitative measurements. Copyright © 2018 John Wiley & Sons, Ltd.
Kumar Khanna, Vinod
2007-01-01
The current status and research trends of detection techniques for DNA-based analysis, such as DNA fingerprinting, sequencing, biochips and allied fields, are examined. An overview of the main detectors is presented vis-à-vis these DNA operations. The biochip method is explained, the role of micro- and nanoelectronic technologies in biochip realization is highlighted, various optical and electrical detection principles employed in biochips are indicated, and the operational mechanisms of these detection devices are described. Although a diversity of biochips for diagnostic and therapeutic applications has been demonstrated in research laboratories worldwide, only some of these chips have entered the clinical market, and more chips are awaiting commercialization. The necessity of tagging is eliminated in refractive-index-change-based devices, but the basic flaw of the indirect nature of most detection methodologies can only be overcome by generic and/or reagentless DNA sensors such as the conductance-based approach and the DNA-single electron transistor (DNA-SET) structure. Devices of the electrical detection-based category are expected to pave the way for next-generation DNA chips. The review provides comprehensive coverage of the detection technologies for DNA fingerprinting, sequencing and related techniques, encompassing a variety of methods from the primitive art to the state of the art, as well as promising methods for the future.
Corbett, Andrea M; Francis, Karen; Chapman, Ysanne
2007-04-01
Identifying a methodology to guide a study that aims to enhance service delivery can be challenging. Participatory action research offers a solution to this challenge, as it both informs and is informed by critical social theory. In addition, using a feminist lens helps establish this approach as a suitable methodology for changing practice. This methodology embraces empowerment, self-determination and the facilitation of agreed change as central tenets that guide the research process. Drawing on the work of Foucault, Freire, Habermas, and Maguire, this paper explicates the philosophical assumptions underpinning critical social theory and outlines how feminist influences are complementary in exploring the processes and applications of nursing research that seeks to embrace change.
Huertas, César S; Carrascosa, L G; Bonnal, S; Valcárcel, J; Lechuga, L M
2016-04-15
Alternative splicing of mRNA precursors enables cells to generate different protein outputs from the same gene depending on their developmental or homeostatic status. Its deregulation is strongly linked to disease onset and progression. Current methodologies for monitoring alternative splicing demand elaborate procedures and often have difficulty discerning between closely related isoforms, e.g. due to cross-hybridization during their detection. Herein, we report a general methodology using a Surface Plasmon Resonance (SPR) biosensor for label-free monitoring of alternative splicing events in real time, without any cDNA synthesis or PCR amplification requirements. We applied this methodology to RNA isolated from HeLa cells for the quantification of alternatively spliced isoforms of the Fas gene, involved in cancer progression through regulation of programmed cell death. We demonstrate that our methodology is isoform-specific, with virtually no cross-hybridization, achieving limits of detection (LODs) in the picomolar (pM) range. Similar results were obtained for the detection of the BCL-X gene mRNA isoforms. The results were independently validated by RT-qPCR, with excellent concordance in the determination of isoform ratios. The simplicity and robustness of this biosensor technology can greatly facilitate the exploration of alternative splicing biomarkers in disease diagnosis and therapy. Copyright © 2015 Elsevier B.V. All rights reserved.
Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat
2016-05-15
Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps. In step 1, complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its frequency content. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and extract features for MCI detection. In step 3, an analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI and 19 control patients. Copyright © 2016 Elsevier B.V. All rights reserved.
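The permutation entropy used in step 2 can be illustrated with a minimal Bandt-Pompe implementation; the order and delay defaults are illustrative and do not reproduce the paper's exact parameter choices:

```python
from math import log, factorial

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (Bandt-Pompe).

    Counts the ordinal patterns of length `order` occurring in the
    signal (with the given embedding `delay`) and returns the Shannon
    entropy of their distribution, normalized to [0, 1] by log(order!).
    """
    counts = {}
    n = len(signal) - (order - 1) * delay
    for i in range(n):
        window = tuple(signal[i + j * delay] for j in range(order))
        # ordinal pattern: ranks of the window's values
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * log(c / total) for c in counts.values())
    return h / log(factorial(order))
```

A monotonic signal contains a single ordinal pattern and yields zero entropy; irregular signals yield values approaching 1, which is what makes PE useful as a complexity feature for the sub-bands.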
Identification of anomalous motion of thunderstorms using daily rainfall fields
NASA Astrophysics Data System (ADS)
Moral, Anna del; Llasat, María del Carmen; Rigo, Tomeu
2017-03-01
Most adverse weather phenomena in Catalonia (northeast Iberian Peninsula) are caused by convective events, which can produce heavy rain, large hailstones, strong winds, lightning and/or tornadoes. These thunderstorms usually follow well-marked paths. However, their trajectories can change sharply at any given time, completely departing from the path previously followed. Furthermore, some thunderstorms split or merge with each other, creating new formations with different behaviour. In order to identify the potentially anomalous movements that some thunderstorms make, this paper presents a two-step methodology using a database of 8 years of daily rainfall fields for the Catalonia region (2008-2015). First, it classifies daily rainfall fields into days with "no rain", "non-potentially convective rain" and "potentially convective rain", based on daily accumulated precipitation and extension thresholds. Second, it categorises convective structures within the rainfall fields and identifies their main features, distinguishing whether there were any anomalous thunderstorm movements in each case. This methodology was applied to the 2008-2015 period, and the main climatic features of convective and non-convective days were obtained. The methodology can be exported to other regions that lack the radar-based algorithms needed to detect convective cells but have a good rain gauge network in place.
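The first classification step can be sketched as a simple threshold rule over a daily rainfall field. The threshold values below are illustrative placeholders, not the paper's calibrated precipitation and extension thresholds:

```python
def classify_rainfall_day(field, wet_mm=1.0, conv_mm=35.0, min_frac=0.02):
    """Classify a daily rainfall field into one of three day types.

    `field` is a list of per-cell daily accumulations (mm). A day is
    "no rain" if no cell reaches `wet_mm`; otherwise it is "potentially
    convective rain" when the fraction of cells at or above the intense
    accumulation `conv_mm` reaches the extension threshold `min_frac`,
    and "non-potentially convective rain" otherwise. All three threshold
    values are illustrative, not the study's calibrated ones.
    """
    wet = [v for v in field if v >= wet_mm]
    if not wet:
        return "no rain"
    convective_frac = sum(1 for v in field if v >= conv_mm) / len(field)
    return ("potentially convective rain" if convective_frac >= min_frac
            else "non-potentially convective rain")
```

A field of zeros is classified as "no rain"; a field with widespread light rain but no intense cells is "non-potentially convective rain"; a field where a sufficient fraction of cells show intense accumulation is "potentially convective rain".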
Developing an ethical code for engineers: the discursive approach.
Lozano, J Félix
2006-04-01
From the Hippocratic Oath on, deontological codes and other professional self-regulation mechanisms have been used to legitimize and identify professional groups. New technological challenges and, above all, changes in the socioeconomic environment require adaptable codes that can respond to new demands. We assume that ethical codes for professionals should not simply focus on regulative functions, but must also consider ideological and educative functions. Any adaptations should take into account both content (values, norms and recommendations) and the drafting process itself. In this article we propose a process for developing a professional ethical code for an official professional association (Colegio Oficial de Ingenieros Industriales de Valencia, COIIV), starting from the philosophical assumptions of discursive ethics but adapting them to critical hermeneutics. Our proposal is based on the Integrity Approach rather than the Compliance Approach. A process aiming to produce an effective ethical document that fulfils regulative and ideological functions requires a participative, dialogical and reflexive methodology. This process must respond to moral exigencies as well as demands for efficiency and professional effectiveness. In addition to the methodological proposal, we present our experience of producing an ethical code for the industrial engineers' association in Valencia (Spain), where this methodology was applied, and we evaluate the problems detected and its future potential.
NASA Thermographic Inspection of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott
2004-01-01
As the use of advanced composite materials continues to increase in the aerospace community, the need for a quantitative, rapid, in situ inspection technology has become a critical concern throughout the industry. In many applications it is necessary to monitor changes in these materials over an extended period of time to determine the effects of various load conditions. Additionally, the detection and characterization of defects such as delaminations, is of great concern. This paper will present the application of infrared thermography to characterize various composite materials and show the advantages of different heat source types. Finally, various analysis methodologies used for quantitative material property characterization will be discussed.
Children's illness drawings and asthma symptom awareness.
Gabriels, R L; Wamboldt, M Z; McCormick, D R; Adams, T L; McTaggart, S R
2000-01-01
This study examines the relationship between children's abilities to perceive their symptoms of asthma via several previously researched subjective and objective procedures compared with their performance on a standardized children's drawing task and scale criteria. Results indicated that girls verbalized significantly more emotions about their drawings and were better able to detect airflow changes in their small airways than boys. The Gabriels Asthma Perception Drawing Scales (GAPDS) is a promising clinical tool for assessing children's perceptions and emotions about asthma via nonverbal methods. Varying methods of measuring asthma symptom awareness are not highly correlated; thus, more than one methodology is appropriate for use with children.
Nanoscale displacement sensing using microfabricated variable-inductance planar coils
NASA Astrophysics Data System (ADS)
Coskun, M. Bulut; Thotahewa, Kasun; Ying, York-Sing; Yuce, Mehmet; Neild, Adrian; Alan, Tuncay
2013-09-01
Microfabricated spiral inductors were employed for nanoscale displacement detection, suitable for use in implantable pressure sensor applications. We developed a variable inductor sensor consisting of two coaxially positioned planar coils connected in series to a measurement circuit. The devices were characterized by varying the air gap between the coils hence changing the inductance, while a Colpitts oscillator readout was used to obtain corresponding frequencies. Our approach shows significant advantages over existing methodologies combining a displacement resolution of 17 nm and low hysteresis (0.15%) in a 1 × 1 mm2 device. We show that resolution could be further improved by shrinking the device's lateral dimensions.
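The Colpitts readout described above maps the gap-dependent inductance onto an oscillation frequency through the standard tank-circuit resonance relation, in which the inductor sees the series combination of the two feedback capacitors. A minimal sketch with arbitrary example component values (not the device's actual parameters):

```python
import math

def colpitts_frequency(L, C1, C2):
    """Resonant frequency of a Colpitts tank: the inductor resonates with
    the series combination of the two feedback capacitors C1 and C2."""
    C_eff = C1 * C2 / (C1 + C2)
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C_eff))

# Illustrative values only: a 1 uH coil with 100 pF feedback capacitors.
f0 = colpitts_frequency(1.0e-6, 100e-12, 100e-12)
# If the gap-dependent inductance drops by 10%, the frequency rises.
f1 = colpitts_frequency(0.9e-6, 100e-12, 100e-12)
```

A change in the air gap between the coils changes the mutual coupling and hence the total inductance, so the displacement appears directly as a frequency shift in the oscillator output.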
Phylogenomics of plant genomes: a methodology for genome-wide searches for orthologs in plants
Conte, Matthieu G; Gaillard, Sylvain; Droc, Gaetan; Perin, Christophe
2008-01-01
Background Gene ortholog identification is now a major objective for mining the increasing amount of sequence data generated by complete or partial genome sequencing projects. Comparative and functional genomics urgently need a method for ortholog detection to improve gene function inference and to aid in the identification of conserved or divergent genetic pathways between several species. As gene functions change during evolution, reconstructing the evolutionary history of genes should be a more accurate way to differentiate orthologs from paralogs. Phylogenomics takes into account phylogenetic information from high-throughput genome annotation and is the most straightforward way to infer orthologs. However, procedures for automatic detection of orthologs are still scarce and suffer from several limitations. Results We developed a procedure for ortholog prediction between Oryza sativa and Arabidopsis thaliana. Firstly, we established an efficient method to cluster A. thaliana and O. sativa full proteomes into gene families. Then, we developed an optimized phylogenomics pipeline for ortholog inference. We validated the full procedure using test sets of orthologs and paralogs to demonstrate that our method outperforms pairwise methods for ortholog predictions. Conclusion Our procedure achieved a high level of accuracy in predicting ortholog and paralog relationships. Phylogenomic predictions for all validated gene families in both species were easily achieved and we can conclude that our methodology outperforms similarly based methods. PMID:18426584
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
NASA Astrophysics Data System (ADS)
Ruecker, Gernot; Schroeder, Wilfrid; Lorenz, Eckehard; Kaiser, Johannes; Caseiro, Alexandre
2016-04-01
According to recent research, black carbon has the second strongest effect on the Earth's climate system after carbon dioxide. In high northern latitudes, industrial gas flares are an important source of black carbon, especially in winter. This is particularly relevant for the relatively fast climate change observed in the Arctic, since deposition of black carbon changes the albedo of snow and ice, leading to a positive feedback cycle. Here we explore gas flare detection and Fire Radiative Power (FRP) retrievals from the German FireBird TET-1 and BIRD Hotspot Recognition Systems (HSRS), the VIIRS sensor on board the S-NPP satellite, and the MODIS sensor, using temporally near-coincident data acquisitions. The comparison is based on level 2 products developed for fire detection for the different sensors; in the case of S-NPP VIIRS we use two products: the new VIIRS 750 m algorithm based on MODIS Collection 6, and the 350 m algorithm based on the VIIRS mid-infrared I (imaging) band, which offers higher resolution but no FRP retrievals. Results indicate that the higher-resolution FireBird sensors offer the best detection capabilities, though their level 2 product shows false alarms, followed by the VIIRS 350 m and 750 m algorithms; MODIS has the lowest detection rate. Preliminary FRP retrievals show good agreement between the FireBird and VIIRS algorithms. Given that most gas flaring is at the detection limit for medium- to coarse-resolution spaceborne sensors, and hence measurement errors may be high, our results indicate that a quantitative evaluation of gas flaring using these sensors is feasible. The results will be used to develop a gas flare detection algorithm for Sentinel-3, and a similar methodology will be employed to validate the capacity of Sentinel-3 to detect and characterize small high-temperature sources such as gas flares.
Fritz, Megan L; DeYonke, Alexandra M; Papanicolaou, Alexie; Micinski, Stephen; Westbrook, John; Gould, Fred
2018-01-01
Adaptation to human-induced environmental change has the potential to profoundly influence the genomic architecture of affected species. This is particularly true in agricultural ecosystems, where anthropogenic selection pressure is strong. Heliothis virescens primarily feeds on cotton in its larval stages, and US populations have been declining since the widespread planting of transgenic cotton, which endogenously expresses proteins derived from Bacillus thuringiensis (Bt). No physiological adaptation to Bt toxin has been found in the field, so adaptation in this altered environment could involve (i) shifts in host plant selection mechanisms to avoid cotton, (ii) changes in detoxification mechanisms required for cotton-feeding vs. feeding on other hosts or (iii) loss of resistance to previously used management practices including insecticides. Here, we begin to address whether such changes occurred in H. virescens populations between 1997 and 2012, as Bt-cotton cultivation spread through the agricultural landscape. For our study, we produced an H. virescens genome assembly and used this in concert with a ddRAD-seq-enabled genome scan to identify loci with significant allele frequency changes over the 15-year period. Genetic changes at a previously described H. virescens insecticide target of selection were detectable in our genome scan and increased our confidence in this methodology. Additional loci were also detected as being under selection, and we quantified the selection strength required to elicit observed allele frequency changes at each locus. Potential contributions of genes near loci under selection to adaptive phenotypes in the H. virescens cotton system are discussed. © 2017 John Wiley & Sons Ltd.
Lowland extirpation of anuran populations on a tropical mountain
Campos-Cerqueira, Marconi; Aide, T. Mitchell
2017-01-01
Background Climate change and infectious diseases threaten animal and plant species, even in natural and protected areas. To cope with these changes, species may acclimate, adapt, move or decline. Here, we test for shifts in anuran distributions in the Luquillo Mountains (LM), a tropical montane forest in Puerto Rico, by comparing species distributions from historical (1931–1989) and current data (2015/2016). Methods Historical data, which included different methodologies, were gathered through the Global Biodiversity Information Facility (GBIF) and published literature, and the current data were collected using acoustic recorders along three elevational transects. Results In the recordings, we detected the 12 native frog species known to occur in LM. Over a span of ∼25 years, two species have become extinct and four species suffered extirpation in lowland areas. As a consequence, low elevation areas in the LM (<300 m) have lost at least six anuran species. Discussion We hypothesize that these extirpations are due to the effects of climate change and infectious diseases, which are restricting many species to higher elevations and a much smaller area. Land use change is not responsible for these changes because LM has been a protected reserve for the past 80 years. However, previous studies indicate that (1) climate change has increased temperatures in Puerto Rico, and (2) Batrachochytrium dendrobatidis (Bd) was found in 10 native species and early detection of Bd coincides with anuran declines in the LM. Our study confirms the general impressions of amphibian population extirpations at low elevations, and corroborates the levels of threat assigned by IUCN. PMID:29158987
2013-01-01
Background Predictive tools are already being implemented to assist in Emergency Department bed management by forecasting the expected total volume of patients. Yet these tools are unable to detect and diagnose when estimates fall short. Early detection of hotspots, that is, subpopulations of patients presenting in unusually high numbers, would help authorities to manage limited health resources and communicate effectively about emerging risks. We evaluate an anomaly detection tool that signals when, and in what way, Emergency Departments in 18 hospitals across the state of Queensland, Australia, are significantly exceeding their forecasted patient volumes. Methods The tool in question is an adaptation of the Surveillance Tree methodology initially proposed in Sparks and Okugami (IntStatl 1:2–24, 2010) for the monitoring of vehicle crashes. The methodology was trained on presentations to 18 Emergency Departments across Queensland over the period 2006 to 2008. Artificial increases were added to simulated, in-control counts for these data to evaluate the tool's sensitivity, timeliness and diagnostic capability. The results were compared with those from a univariate control chart. The tool was then applied to data from 2009, the year of the H1N1 (or 'Swine Flu') pandemic. Results The Surveillance Tree method was found to be at least as effective as a univariate, exponentially weighted moving average (EWMA) control chart when increases occurred in a subgroup of the monitored population. The method has advantages over the univariate control chart in that it allows for the monitoring of multiple disease groups while still allowing control of the overall false alarm rate. It is also able to detect changes in the makeup of the Emergency Department presentations, even when the total count remains unchanged. Furthermore, the Surveillance Tree method provides diagnostic information useful for service improvements or disease management.
Conclusions Multivariate surveillance provides a useful tool in the management of hospital Emergency Departments by not only efficiently detecting unusually high numbers of presentations, but by providing information about which groups of patients are causing the increase. PMID:24313914
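The univariate EWMA control chart used as the comparison baseline can be sketched as follows. This is a minimal version that assumes the in-control mean and standard deviation are known; the smoothing constant and limit width are illustrative defaults, not values from the study:

```python
import math

def ewma_alarms(counts, mu, sigma, lam=0.2, L=3.0):
    """Flag time points where the EWMA of daily presentation counts
    exceeds its asymptotic upper control limit.

    mu, sigma -- in-control mean and standard deviation (assumed known)
    lam       -- EWMA smoothing constant
    L         -- control-limit width in sigma units
    """
    # asymptotic upper control limit of the EWMA statistic
    ucl = mu + L * sigma * math.sqrt(lam / (2.0 - lam))
    z = mu  # the EWMA statistic starts at the in-control mean
    alarms = []
    for x in counts:
        z = lam * x + (1.0 - lam) * z  # exponentially weighted update
        alarms.append(z > ucl)
    return alarms
```

Unlike the Surveillance Tree, this chart monitors only the total count, which is exactly the limitation the multivariate method addresses: a shift confined to one patient subgroup can leave the total nearly unchanged.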
NASA Astrophysics Data System (ADS)
Szatmári, Gábor; Laborczi, Annamária; Takács, Katalin; Pásztor, László
2017-04-01
Knowledge about soil organic carbon (SOC) baselines and changes, and the detection of hot spots vulnerable to SOC losses and gains under climate change and changed land management, is still fairly limited. The Global Soil Partnership (GSP) has therefore been requested to develop a global SOC mapping campaign by 2017. The GSP's concept builds on official national data sets; hence, a bottom-up (country-driven) approach is pursued. The elaborated Hungarian methodology meets the general GSOC17 specifications provided by the GSP. The input data for the GSOC17@HU mapping approach comprise legacy soil databases, as well as environmental covariates related to the main soil-forming factors, such as climate, organisms, relief and parent material. Nowadays, digital soil mapping (DSM) relies heavily on the assumption that a soil property of interest can be modelled as the sum of a deterministic and a stochastic component, which can be treated and modelled separately; we adopted this assumption in our methodology. In practice, multiple regression techniques are commonly used to model the deterministic part. However, these global (and usually linear) models often oversimplify the complex and non-linear relationships involved, which has a crucial effect on the resulting soil maps. We therefore integrated machine learning algorithms (namely random forest and quantile regression forest) into the methodology, supposing them to be more suitable for the problem at hand. This approach enabled us to model the GSOC17 soil properties in forms as complex and non-linear as the soil itself. Furthermore, it enabled us to model and assess the uncertainty of the results, which is highly relevant for decision making. A geostatistical approach was used to model the stochastic part of the spatial variability of the soil properties of interest. We created the GSOC17@HU map at 1 km grid resolution according to the GSP's specifications.
The map contributes to the GSP's GSOC17 proposals, as well as to the development of a global soil information system under GSP Pillar 4 on soil data and information. We wrote the associated code (in the R software environment) so that it can be improved, specialized and applied for further uses. Hence, it opens the door to countrywide maps with higher grid resolution for SOC (or other soil-related properties) using the advanced methodology, and to supporting SOC-related (or other soil-related) country-level decision making. Our paper will present the soil mapping methodology itself, the resulting GSOC17@HU map, and some conclusions drawn from the experience and their implications for further uses. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
Zhang, Liding; Wei, Qiujiang; Han, Qinqin; Chen, Qiang; Tai, Wenlin; Zhang, Jinyang; Song, Yuzhu; Xia, Xueshan
2018-01-01
Shigella is an important food-borne zoonotic bacterial pathogen that can cause clinically severe diarrhea, and there is an urgent need for a specific, sensitive, and rapid methodology for its detection. In this study, loop-mediated isothermal amplification combined with a magnetic immunocapture assay (IC-LAMP) was first developed for the detection of Shigella in pure culture, artificial milk, and clinical stool samples. The method exhibited a detection limit of 8.7 CFU/mL. Compared with polymerase chain reaction, IC-LAMP is sensitive, specific, and reliable for monitoring Shigella. Additionally, IC-LAMP is more convenient, efficient, and rapid than ordinary LAMP, as it more efficiently enriches pathogen cells without extraction of genomic DNA. Under isothermal conditions, the amplification curves and the green fluorescence were detected within 30 min in the presence of genomic DNA template. The overall analysis time was approximately 1 h, including enrichment and lysis of the bacterial cells, a significantly shorter detection time. Therefore, the IC-LAMP methodology described here is potentially useful for the efficient detection of Shigella in various samples. PMID:29467730
NASA Astrophysics Data System (ADS)
Pinales, J. C.; Graber, H. C.; Hargrove, J. T.; Caruso, M. J.
2016-02-01
Previous studies have demonstrated the ability to detect and classify marine hydrocarbon films with spaceborne synthetic aperture radar (SAR) imagery. The dampening effect of hydrocarbon discharges on small surface capillary-gravity waves renders the ocean surface "radar dark" compared with standard wind-roughened ocean surfaces. Given the scope and impact of events like the Deepwater Horizon oil spill, the need for improved, automated and expedient monitoring of hydrocarbon-related marine anomalies has become a pressing and complex issue for governments and the extraction industry. The research presented here describes the development, training, and utilization of an algorithm that detects marine oil spills in an automated, semi-supervised manner, utilizing X-, C-, or L-band SAR data as the primary input. Ancillary datasets include related radar-borne variables (incidence angle, etc.), environmental data (wind speed, etc.) and textural descriptors. Shapefiles produced by an experienced human analyst served as targets (validation) during the training portion of the investigation. Training and testing datasets were chosen for development and assessment of algorithm effectiveness, as well as for determining optimal conditions for oil detection in SAR data. The algorithm detects oil spills by following a three-step methodology: object detection, feature extraction, and classification. Previous oil spill detection and classification methodologies such as machine learning algorithms, artificial neural networks (ANN), and multivariate classification methods like partial least squares-discriminant analysis (PLS-DA) are evaluated and compared. Statistical, transform, and model-based image texture techniques, commonly used for object mapping directly or as inputs for more complex methodologies, are explored to determine optimal textures for an oil spill detection system. The influence of the ancillary variables is explored, with a particular focus on the role of strong vs. weak wind forcing.
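The first stage of such a pipeline, detecting "radar dark" objects, can be caricatured with a simple scene-statistics threshold. This is an illustrative sketch only: operational detectors use adaptive, windowed thresholds, and the factor `k` below is an arbitrary placeholder:

```python
import statistics

def dark_patch_mask(sigma0, k=1.5):
    """Flag 'radar dark' pixels whose backscatter lies more than k standard
    deviations below the scene mean (object-detection step only).

    sigma0 -- flat list of backscatter values for the scene
    k      -- arbitrary threshold factor, for illustration
    """
    mu = statistics.fmean(sigma0)
    sd = statistics.pstdev(sigma0)
    threshold = mu - k * sd
    return [v < threshold for v in sigma0]
```

The flagged pixels would then feed the feature-extraction and classification stages, where texture descriptors and ancillary data (wind speed, incidence angle) separate oil from look-alikes such as low-wind areas.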
A methodology for collecting valid software engineering data
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Weiss, David M.
1983-01-01
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
NASA Astrophysics Data System (ADS)
Arav, Reuma; Filin, Sagi
2016-06-01
Airborne laser scans are an optimal tool for describing geomorphological features in natural environments. However, detecting such phenomena is challenging, as they are embedded in the topography, tend to blend into their surroundings and leave only a subtle signature within the data. Most object-recognition studies address mainly urban environments and follow a general pipeline in which the data are partitioned into segments with uniform properties. These approaches are restricted to man-made domains and can handle only a limited set of features that conform to well-defined geometric forms. As natural environments present a more complex set of features, interpretation of such data remains largely manual. In this paper, we propose a data-aware detection scheme that is not bound to specific domains or shapes. We pose the recognition question as an energy optimization problem, solved by variational means. Our approach, based on the level-set method, geometrically characterizes local surfaces within the data and uses these characteristics as a potential field for the minimization. The main advantage is that it allows topological changes of the evolving curves, such as merging and breaking. We demonstrate the proposed methodology on the detection of collapse sinkholes.
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
Qin, Guoxing; Zhao, Shulin; Huang, Yong; Jiang, Jing; Liu, Yi-Ming
2013-08-15
In this article, we report a gold nanoparticle (AuNP) sensing platform based on chemiluminescence resonance energy transfer (CRET) for light-on detection of biomolecules. In designing this CRET-based biosensing platform, the aptamer was first covalently labeled with a chemiluminescent reagent, N-(4-aminobutyl)-N-ethylisoluminol (ABEI). The ABEI-labeled aptamer was then hybridized with AuNP-functionalized ssDNA complementary to the aptamer, yielding the aptasensor. CRET between ABEI and the AuNPs in the aptasensor quenched the CL of ABEI. In the presence of a target analyte, the analyte formed a complex with the aptamer, releasing the ABEI-aptamer from the AuNP surface and restoring the CL of ABEI. To test this design, a thrombin (used as a model analyte) aptasensor was prepared and evaluated. The results indicate that the proposed approach is simple and provides a linear range of 50-550 pM for thrombin detection with a detection limit of 15 pM. This new methodology can easily be extended to assay other biomolecules by simply changing the recognition sequence of the substrate aptamer. Copyright © 2013 Elsevier B.V. All rights reserved.
Pender, Alexandra; Garcia-Murillas, Isaac; Rana, Sareena; Cutts, Rosalind J; Kelly, Gavin; Fenwick, Kerry; Kozarewa, Iwanka; Gonzalez de Castro, David; Bhosle, Jaishree; O'Brien, Mary; Turner, Nicholas C; Popat, Sanjay; Downward, Julian
2015-01-01
Droplet digital PCR (ddPCR) can be used to detect low frequency mutations in oncogene-driven lung cancer. The range of KRAS point mutations observed in NSCLC necessitates a multiplex approach to efficient mutation detection in circulating DNA. Here we report the design and optimisation of three discriminatory ddPCR multiplex assays investigating nine different KRAS mutations using PrimePCR™ ddPCR™ Mutation Assays and the Bio-Rad QX100 system. Together these mutations account for 95% of the nucleotide changes found in KRAS in human cancer. Multiplex reactions were optimised on genomic DNA extracted from KRAS mutant cell lines and tested on DNA extracted from fixed tumour tissue from a cohort of lung cancer patients without prior knowledge of the specific KRAS genotype. The multiplex ddPCR assays had a limit of detection of better than 1 mutant KRAS molecule in 2,000 wild-type KRAS molecules, which compared favourably with a limit of detection of 1 in 50 for next generation sequencing and 1 in 10 for Sanger sequencing. Multiplex ddPCR assays thus provide a highly efficient methodology to identify KRAS mutations in lung adenocarcinoma.
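The quantitative side of ddPCR rests on Poisson statistics over droplets: templates partition randomly, so the mean copies per droplet is recovered from the fraction of positive droplets. A minimal sketch of this standard correction (not the assay-specific analysis used in the paper):

```python
import math

def copies_per_droplet(positive, total):
    """Poisson correction used in digital PCR: the mean number of template
    copies per droplet, inferred from the fraction of positive droplets."""
    p = positive / total
    return -math.log(1.0 - p)

def mutant_fraction(mut_pos, wt_pos, total):
    """Fractional abundance of mutant template, computed from
    mutant-positive and wild-type-positive droplet counts out of `total`
    droplets (assuming the two targets are read in separate channels)."""
    lam_mut = copies_per_droplet(mut_pos, total)
    lam_wt = copies_per_droplet(wt_pos, total)
    return lam_mut / (lam_mut + lam_wt)
```

Because a handful of mutant-positive droplets remains countable even when half the droplets are saturated with wild-type template, this partitioning is what makes fractions on the order of 1 in 2,000 resolvable.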
NASA Astrophysics Data System (ADS)
Micheletti, Natan; Tonini, Marj; Lane, Stuart N.
2017-02-01
Acquisition of high-density point clouds using terrestrial laser scanners (TLSs) has become commonplace in geomorphic science. The derived point clouds are often interpolated onto regular grids, and the grids compared to detect change (i.e. erosion and deposition/advancement movements). This procedure is necessary for some applications (e.g. digital terrain analysis), but it inevitably discards potentially valuable information contained within the point clouds. In the present study, an alternative methodology for geomorphological analysis and feature detection from point clouds is proposed. It rests on Density-Based Spatial Clustering of Applications with Noise (DBSCAN), applied to TLS data for a rock glacier front slope in the Swiss Alps. The proposed method allows movements to be detected and isolated directly from the point clouds, so that the accuracy of the subsequent volume computations depends only on the actual registered distance between points. We demonstrate that these volumes are more conservative than those computed by traditional DEM comparison. The results are illustrated for the summer of 2015, a season of enhanced geomorphic activity associated with exceptionally high temperatures.
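The clustering step rests on DBSCAN, which groups points lying in dense neighbourhoods and marks isolated points as noise. A minimal, self-contained version of the algorithm is sketched below; the `eps` and `min_pts` parameters are arbitrary here, and real point-cloud work would use an optimized, spatially indexed implementation:

```python
import math

def dbscan(points, eps=1.0, min_pts=4):
    """Label each point with a cluster id (0, 1, ...) or -1 for noise."""
    n = len(points)

    def neighbors(i):
        # brute-force eps-neighbourhood (includes the point itself)
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]

    labels = [None] * n
    cid = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1  # noise (may later be claimed as a border point)
            continue
        labels[i] = cid  # i is a core point: start a new cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cid  # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb = neighbors(j)
            if len(nb) >= min_pts:  # j is also a core point: expand
                queue.extend(nb)
        cid += 1
    return labels
```

Applied to the displacement points between two epochs, each cluster isolates one coherent movement (an erosion or advancement feature) while scattered registration noise is discarded.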
Reagentless, Structure-Switching, Electrochemical Aptamer-Based Sensors
NASA Astrophysics Data System (ADS)
Schoukroun-Barnes, Lauren R.; Macazo, Florika C.; Gutierrez, Brenda; Lottermoser, Justine; Liu, Juan; White, Ryan J.
2016-06-01
The development of structure-switching, electrochemical, aptamer-based sensors over the past ~10 years has led to a variety of reagentless sensors capable of analytical detection in a range of sample matrices. The crux of this methodology is the coupling of target-induced conformation changes of a redox-labeled aptamer with electrochemical detection of the resulting altered charge transfer rate between the redox molecule and electrode surface. Using aptamer recognition expands the highly sensitive detection ability of electrochemistry to a range of previously inaccessible analytes. In this review, we focus on the methods of sensor fabrication and how sensor signaling is affected by fabrication parameters. We then discuss recent studies addressing the fundamentals of sensor signaling as well as quantitative characterization of the analytical performance of electrochemical aptamer-based sensors. Although the limits of detection of reported electrochemical aptamer-based sensors do not often reach that of gold-standard methods such as enzyme-linked immunosorbent assays, the operational convenience of the sensor platform enables exciting analytical applications that we address. Using illustrative examples, we highlight recent advances in the field that impact important areas of analytical chemistry. Finally, we discuss the challenges and prospects for this class of sensors.
DOT National Transportation Integrated Search
2014-06-01
The objective of this project focused on the development of a hybrid nondestructive testing and evaluation (NDT&E) methodology that combines the benefits of microwave NDT and thermography into one new technique. In this way, unique features of both N...
Zopf, Agnes; Raim, Roman; Danzer, Martin; Niklas, Norbert; Spilka, Rita; Pröll, Johannes; Gabriel, Christian; Nechansky, Andreas; Roucka, Markus
2015-03-01
The detection of KRAS mutations in codons 12 and 13 is critical for anti-EGFR therapy strategies; however, only those methodologies with high sensitivity, specificity, and accuracy as well as the best cost and turnaround balance are suitable for routine daily testing. Here we compared the performance of compact sequencing using the novel hybcell technology with 454 next-generation sequencing (454-NGS), Sanger sequencing, and pyrosequencing, using an evaluation panel of 35 specimens. A total of 32 mutations and 10 wild-type cases were reported using 454-NGS as the reference method. Specificity ranged from 100% for Sanger sequencing to 80% for pyrosequencing. Sanger sequencing and hybcell-based compact sequencing achieved a sensitivity of 96%, whereas pyrosequencing had a sensitivity of 88%. Accuracy was 97% for Sanger sequencing, 85% for pyrosequencing, and 94% for hybcell-based compact sequencing. Quantitative results were obtained for 454-NGS and hybcell-based compact sequencing data, resulting in a significant correlation (r = 0.914). Whereas pyrosequencing and Sanger sequencing were not able to detect multiple mutated cell clones within one tumor specimen, 454-NGS and the hybcell-based compact sequencing detected multiple mutations in two specimens. Our comparison shows that the hybcell-based compact sequencing is a valuable alternative to state-of-the-art methodologies used for detection of clinically relevant point mutations.
Improving detection probabilities for pests in stored grain.
Elmouttie, David; Kiermeier, Andreas; Hamilton, Grant
2010-12-01
The presence of insects in stored grain is a significant problem for grain farmers, bulk grain handlers and distributors worldwide. Inspection of bulk grain commodities is essential to detect pests and thereby to reduce the risk of their presence in exported goods. It has been well documented that insect pests cluster in response to factors such as microclimatic conditions within bulk grain. Statistical sampling methodologies for grain, however, have typically considered pests and pathogens to be homogeneously distributed throughout grain commodities. In this paper, a sampling methodology is demonstrated that accounts for the heterogeneous distribution of insects in bulk grain. It is shown that failure to account for the heterogeneous distribution of pests may lead to overestimates of the capacity for a sampling programme to detect insects in bulk grain. The results indicate the importance of the proportion of grain that is infested in addition to the density of pests within the infested grain. It is also demonstrated that the probability of detecting pests in bulk grain increases as the number of subsamples increases, even when the total volume or mass of grain sampled remains constant. This study underlines the importance of considering an appropriate biological model when developing sampling methodologies for insect pests. Accounting for a heterogeneous distribution of pests leads to a considerable improvement in the detection of pests over traditional sampling models. Copyright © 2010 Society of Chemical Industry.
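The effect of subsample number at constant total mass can be made concrete with a simple two-level model (a hypothetical sketch, not the paper's formulation): a fraction `p` of the grain is infested, and insect counts inside infested grain are Poisson with density `lam`:

```python
# Illustrative heterogeneous-infestation model (assumed parameters, not from the paper):
# a fraction p of the bulk is infested; insects within infested grain follow a
# Poisson distribution with lam insects per kg.
import math

def detection_prob(n_subsamples, total_kg, p=0.1, lam=2.0):
    """P(detect at least one insect) when n equal subsamples totalling total_kg are drawn."""
    m = total_kg / n_subsamples
    # A subsample scores a detection if it lands in an infested patch (prob p)
    # AND contains at least one insect (Poisson: 1 - exp(-lam * m)).
    p_hit = p * (1.0 - math.exp(-lam * m))
    return 1.0 - (1.0 - p_hit) ** n_subsamples

for n in (1, 5, 20):
    print(n, round(detection_prob(n, total_kg=5.0), 3))
```

Under this model, many small subsamples outperform one large sample of the same total mass, because each independent draw is a fresh chance of landing in an infested patch, mirroring the paper's conclusion.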
Scoping review of response shift methods: current reporting practices and recommendations.
Sajobi, Tolulope T; Brahmbatt, Ronak; Lix, Lisa M; Zumbo, Bruno D; Sawatzky, Richard
2018-05-01
Response shift (RS) has been defined as a change in the meaning of an individual's self-evaluation of his/her health status and quality of life. Several statistical model- and design-based methods have been developed to test for RS in longitudinal data. We reviewed the uptake of these methods in patient-reported outcomes (PRO) literature. CINAHL, EMBASE, Medline, ProQuest, PsycINFO, and Web of Science were searched to identify English-language articles about RS published until 2016. Data on year and country of publication, PRO measure adopted, RS detection method, type of RS detected, and testing of underlying model assumptions were extracted from the included articles. Of the 1032 articles identified, 101 (9.8%) were included in the study. While 54.5% of the articles reported on the Then-test, 30.7% reported on Oort's or Schmitt's structural equation modeling (SEM) procedure. Newer RS detection methods, such as relative importance analysis and random forest regression, have been used less frequently. Less than 25% reported on testing the assumptions underlying the adopted RS detection method(s). Despite rapid methodological advancements in RS research, this review highlights the need for further research about RS detection methods for complex longitudinal data and standardized reporting guidelines.
The Swift/BAT Hard X-ray Transient Monitor: A Status Report
NASA Astrophysics Data System (ADS)
Krimm, Hans A.; Bloom, J. S.; Markwardt, C.; Miller-Jones, J.; Gehrels, N.; Kennea, J. A.; Holland, S.; Sivakoff, G. R.; Swift/BAT Team
2013-04-01
The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. This monitor was first announced at the 2006 HEAD meeting. Seven years later, it continues to operate and provides near real-time light curves of more than 900 astrophysical sources. The BAT observes ~75% of the sky each day with a 3-sigma detection sensitivity of 7 mCrab for a full-day observation and a time resolution as fine as 64 seconds. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of archival light curves spanning nearly seven years. The primary interface for the BAT transient monitor is a public web page. Since February 2005, 223 sources have been detected in the monitor, 142 of them persistent and 81 detected only in outburst. From 2006-2013, fourteen new sources have been discovered by the BAT transient monitor. We will describe the methodology of the transient monitor, present a summary of its statistics, and discuss the detection of known and newly discovered sources.
The Swift/BAT Hard X-ray Transient Monitor: A Status Report
NASA Astrophysics Data System (ADS)
Krimm, Hans A.; Swift/BAT Team
2011-09-01
The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. This monitor was first announced at the 2006 HEAD meeting. Five years later, it continues to operate and provides near real-time light curves of more than 900 astrophysical sources. The BAT observes 75% of the sky each day with a 3-sigma detection sensitivity of 7 mCrab for a full-day observation and a time resolution as fine as 64 seconds. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of archival light curves spanning nearly seven years. The primary interface for the BAT transient monitor is a public web page. Since February 2005, 172 sources have been detected in the monitor, 89 of them persistent and 83 detected only in outburst. From 2006-2011, nine new sources have been discovered by the BAT transient monitor. We will describe the methodology of the transient monitor, present a summary of its statistics, and discuss the detection of known and newly discovered sources.
Development of a mercury detection kit based on melamine-functionalized gold nanoparticles.
Liu, Guoyan; Ren, Huipeng; Guan, Yuyu; Dai, Ronghua; Chai, Chunyan
2015-01-01
A fast and simple mercury detection kit was developed based on melamine-functionalized gold nanoparticles (GNPs). The detection kit contained reagent 1 (GNPs), reagent 2 (melamine), a reaction cuvette with four separated cells, a colorimetric card and a plastic pipette. The GNPs were prepared by a citrate reduction of HAuCl4. A proper amount of melamine was applied to functionalize the GNPs. The complex reaction took place in the presence of Hg(2+) in the test samples, leading to the combination of Hg(2+) with the C=N group of melamine located on the surface of the GNPs. This reaction destabilized the gold colloid, and aggregation of the GNPs occurred. Different color changes (from claret-red to lilac, purple and plum) were displayed with different concentrations of Hg(2+) in the test samples, making it easy and convenient to determine the amount of mercury ion with the naked eye. The advantages of this methodology are as follows: a short detection time (within 10 min), high specificity (no significant interference upon adding a certain amount of Cu(2+), Pb(2+), Zn(2+), Mg(2+), Cd(2+) and Fe(2+)), high sensitivity with a detection limit of 0.01 mg L(-1), easy operation and practical on-site use.
The Swift/BAT Hard X-ray Transient Monitor
NASA Technical Reports Server (NTRS)
Krimm, H. A.; Holland, S. T.; Corbet, R.H.D.; Pearlman, A. B.; Romano, P.; Kennea, J. A.; Bloom, J. S.; Barthelmy, S. D.; Baumgartner, W. H.; Cummings, J. R.;
2013-01-01
The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. The BAT observes 88% of the sky each day with a detection sensitivity of 5.3 mCrab for a full-day observation and a time resolution as fine as 64 seconds. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of light curves of more than 900 sources spanning over eight years. The primary interface for the BAT transient monitor is a public web page. Since 2005 February, 242 sources have been detected in the monitor, 149 of them persistent and 93 detected only in outburst. Among these sources, 16 were previously unknown and discovered in the transient monitor. In this paper, we discuss the methodology and the data processing and filtering for the BAT transient monitor and review its sensitivity and exposure. We provide a summary of the source detections and classify them according to the variability of their light curves. Finally, we review all new BAT monitor discoveries and present basic data analysis and interpretations for those sources with previously unpublished results.
Comparison of non-O157 Shiga toxin-producing E. coli detection systems
USDA-ARS?s Scientific Manuscript database
Category: methodology improvements Objective: To identify strengths and weaknesses of commercial Shiga toxin-producing E. coli detection systems and kits in a side by side fashion. Experimental Design: Three commercial Shiga toxin-producing E. coli detection tests (BAX, GDS, and GeneDisc) and two t...
A Comparison of Bias Correction Adjustments for the DETECT Procedure
ERIC Educational Resources Information Center
Nandakumar, Ratna; Yu, Feng; Zhang, Yanwei
2011-01-01
DETECT is a nonparametric methodology to identify the dimensional structure underlying test data. The associated DETECT index, "D_max," denotes the degree of multidimensionality in data. Conditional covariances (CCOV) are the building blocks of this index. In specifying population CCOVs, the latent test composite θ_TT…
Microbiology--Detection, Occurrence, and Removal of Viruses.
ERIC Educational Resources Information Center
Berg, Gerald
1978-01-01
Presents a literature review of the detection, survival, and destruction of viruses in wastewater sludges, covering publications of 1976-77. This review includes these topics: (1) detection methodology; (2) viruses in sludge, shellfish, and aerosols; and (3) indicators of viruses. A list of 59 references is also presented. (HM)
USDA-ARS?s Scientific Manuscript database
Analytical methodology to detect ricin in food matrices is important because of the potential use of foodborne ricin as a terrorist weapon. Monoclonal antibodies (mAbs) that bind ricin were used for both capture and detection in sandwich enzyme-linked immunosorbent assay (ELISA) and electrochemilumi...
Zhi, Lihua; Zeng, Xiaofan; Wang, Hao; Hai, Jun; Yang, Xiangliang; Wang, Baodui; Zhu, Yanhong
2017-07-18
The development of sensitive and reliable methods to monitor the presence of mercuric ions in cells and organisms is of great importance to biological research and biomedical applications. In this work, we propose a strategy to construct a solar-driven nanoprobe using a 3D Au@MoS2 heterostructure as a photocatalyst and rhodamine B (RB) as a fluorescent and color change reporter molecule for monitoring Hg2+ in living cells and animals. The sensing mechanism is based on the photoinduced formation of gold amalgam in the 3D Au@MoS2 heterostructure under visible light illumination. This formation remarkably inhibits the photocatalytic activity of the heterostructure toward RB decomposition. As a result, "OFF-ON" fluorescence and color change are produced. Such characteristics enable this new sensing platform to sensitively and selectively detect Hg2+ in water by fluorescence and colorimetric methods. The detection limits of the fluorescence assay and colorimetric assay are 0.22 and 0.038 nM for Hg2+, respectively; these values are well below the acceptable limits in drinking water standards (10 nM). For the first time, such a photocatalysis-based sensing platform is successfully used to monitor Hg2+ in live cells and mice. Our work therefore opens a promising photocatalysis-based analysis methodology for highly sensitive and selective in vivo Hg2+ bioimaging studies.
ERIC Educational Resources Information Center
Macintyre, Peter D.; Legatto, James Jason
2011-01-01
Willingness to communicate (WTC) can be conceptualized as changing from moment to moment, as opportunities for second-language communication arise. In this study we present an idiodynamic methodology for studying rapid changes in WTC. The methodology consists of recording responses from six young adult, female speakers to second-language…
Diffractive interference optical analyzer (DiOPTER)
NASA Astrophysics Data System (ADS)
Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.
2016-03-01
This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOpter). The setup consists of a laser, a polarizer, a transparent diffraction grating and Si-photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In these setups, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that under appropriate conditions, the measurement sensitivity of the sensor can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6×10^-7 RIU was achieved in glass. This work focuses on devices with an integrated sample well, made on low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission mode configuration. In a transmission configuration, we were able to achieve an LoD of 4×10^-4 RIU, which is sufficient to explore several applications in food quality testing and related fields. We are envisioning the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.
NASA Astrophysics Data System (ADS)
Hammi, A.; Placidi, L.; Weber, D. C.; Lomax, A. J.
2018-01-01
To exploit the full potential of proton therapy, accurate and on-line methods to verify the patient positioning and the proton range during the treatment are desirable. Here we propose and validate an innovative technique for determining patient misalignment uncertainties through the use of a small number of low-dose, carefully selected proton pencil beams (‘range probes’, RP) with sufficient energy that their residual Bragg peak (BP) position and shape can be measured on exit. Since any change of the patient orientation in relation to these beams will result in changes of the density heterogeneities through which they pass, our hypothesis is that patient misalignments can be deduced from measured changes in Bragg curve (BC) shape and range. As such, a simple and robust methodology has been developed that estimates average proton range and range dilution of the detected residual BC, in order to locate range probe positions with optimal prediction power for detecting misalignments. The validation of this RP-based approach has been split into two phases. First, we retrospectively investigate its potential to detect translational patient misalignments under real clinical conditions. Second, we test it for determining rotational errors of an anthropomorphic phantom that was systematically rotated using an in-house developed high-precision motion stage. Simulations of RPs in these two scenarios show that this approach could potentially predict translational errors to lower than 1.5 mm and rotational errors to smaller than 1° using only three or five RP positions, respectively.
Exotic mosquito threats require strategic surveillance and response planning.
Webb, Cameron E; Doggett, Stephen L
2016-12-14
Mosquito-borne diseases caused by endemic pathogens such as Ross River, Barmah Forest and Murray Valley encephalitis viruses are an annual concern in New South Wales (NSW), Australia. More than a dozen mosquito species have been implicated in the transmission of these pathogens, with each mosquito occupying a specialised ecological niche that influences their habitat associations, host feeding preferences and the environmental drivers of their abundance. The NSW Arbovirus Surveillance and Mosquito Monitoring Program provides an early warning system for potential outbreaks of mosquito-borne disease by tracking annual activity of these mosquitoes and their associated pathogens. Although the program will effectively track changes in local mosquito populations that may increase with a changing climate, urbanisation and wetland rehabilitation, it will be less effective with current surveillance methodologies at detecting or monitoring changes in exotic mosquito threats, where different surveillance strategies need to be used. Exotic container-inhabiting mosquitoes such as Aedes aegypti and Ae. albopictus pose a threat to NSW because they are nuisance-biting pests and vectors of pathogens such as dengue, chikungunya and Zika viruses. International movement of humans and their belongings have spread these mosquitoes to many regions of the world. In recent years, these two mosquitoes have been detected by the Australian Government Department of Agriculture and Water Resources at local airports and seaports. To target the detection of these exotic mosquitoes, new trapping technologies and networks of surveillance locations are required. Additionally, incursions of these mosquitoes into urban areas of the state will require strategic responses to minimise substantial public health and economic burdens to local communities.
Pattern-based IP block detection, verification, and variability analysis
NASA Astrophysics Data System (ADS)
Ahmad Ibrahim, Muhamad Asraf Bin; Muhsain, Mohamad Fahmi Bin; Kamal Baharin, Ezni Aznida Binti; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe
2018-03-01
The goal of a foundry partner is to deliver high quality silicon product to its customers on time. There is an assumed trust that the silicon will yield, function and perform as expected when the design fits all the sign-off criteria. The use of Intellectual Property (IP) blocks is very common today and provides the customer with pre-qualified and optimized functions for their design, thus shortening the design cycle. There are many methods by which an IP block can be generated and placed within a layout. Even with the most careful methods and following of guidelines comes the responsibility of sign-off checking. A foundry needs to detect where these IP blocks have been placed and look for any violations. This includes DRC-clean modifications to the IP block, which may or may not be intentional. Using a pattern-based approach to detect all IP blocks used provides the foundry advanced capabilities to analyze them further for any kind of changes which could void the OPC and process window optimizations. Having any changes in an IP block could cause functionality changes or even failures. This also opens the foundry to legal and cost issues while at the same time forcing re-spins of the design. In this publication, we discuss the methodology we have employed to avoid process issues and tape-out errors while at the same time reducing our manual work and improving the turnaround time. We are also able to use our pattern analysis to improve our OPC optimizations when modifications are encountered which have not been seen before.
ERIC Educational Resources Information Center
Holveck, Susan E.
2012-01-01
This mixed methods study was designed to compare the effect of using an inquiry teaching methodology and a more traditional teaching methodology on the learning gains of students who were taught a five-week conceptual change unit on density. Seventh graders (N = 479) were assigned to five teachers who taught the same unit on density using either a…
Detection of adverse events in general surgery using the "Trigger Tool" methodology.
Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro
2015-02-01
Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE detected by the "Global Trigger Tool" methodology in surgical patients, their characteristics and the tool's validity. Retrospective, observational study on patients admitted to a general surgery department who underwent a surgical operation in a tertiary-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the "Global Trigger Tool" methodology. Once an AE was identified, a harm category was assigned, including the grade to which the AE could have been avoided and its relation to the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient; 56.2% were deemed preventable and 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted "Global Trigger Tool" methodology has proven highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
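The validity figures reported above follow the standard screening-test definitions; a small sketch with hypothetical confusion-matrix counts (not the study's actual case numbers) makes the arithmetic concrete:

```python
# Standard screening-test measures from confusion-matrix counts.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts, chosen only for illustration (not the study's data).
m = screening_metrics(tp=86, fp=11, fn=14, tn=94)
print({k: round(v, 3) for k, v in m.items()})
```

Sensitivity and specificity characterize the trigger tool itself, while PPV and NPV additionally depend on the AE prevalence in the reviewed records.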
Public Relations Telephone Surveys: Avoiding Methodological Debacles.
ERIC Educational Resources Information Center
Stone, Gerald C.
1996-01-01
Reports that a study revealed a serious methodological flaw in interviewer bias in telephone surveys. States that most surveys, using standard detection measures, would not find the defect, but outcomes were so misleading that a campaign using the results would be doomed. Warns about practitioner telephone surveys; suggests special precautions if…
Feng, Jie; Yee, Rebecca; Zhang, Shuo; Tian, Lili; Shi, Wanliang; Zhang, Wen-Hong; Zhang, Ying
2018-01-01
Antibiotic-resistant bacteria have caused huge concerns and demand innovative approaches for their prompt detection. Current antimicrobial susceptibility tests (AST) rely on the growth of the organisms, which takes 1-2 days for fast-growing organisms and several weeks for slow-growing organisms. Here, we show for the first time the utility of the SYBR Green I/propidium iodide (PI) viability assay for rapidly identifying antibiotic resistance in less than 30 min for major, antibiotic-resistant, fast-growing bacteria, such as Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, and Acinetobacter baumannii for bactericidal and bacteriostatic agents, and in 16 h for extremely rapid detection of drug resistance for isoniazid and pyrazinamide in slow-growing Mycobacterium tuberculosis. The SYBR Green I/PI assay generated rapid and robust results in concordance with traditional AST methods. This novel growth-independent methodology changes the concept of the current growth-based AST and may revolutionize current drug susceptibility testing for all cells of prokaryotic and eukaryotic origin and, subject to further clinical validation, may play a major role in saving lives and improving patient outcomes.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Donner, R. V.; Marwan, N.; Breitenbach, S. F. M.; Rehfeld, K.; Kurths, J.
2015-05-01
The Asian monsoon system is an important tipping element in Earth's climate with a large impact on human societies in the past and present. In light of the potentially severe impacts of present and future anthropogenic climate change on Asian hydrology, it is vital to understand the forcing mechanisms of past climatic regime shifts in the Asian monsoon domain. Here we use novel recurrence network analysis techniques for detecting episodes with pronounced non-linear changes in Holocene Asian monsoon dynamics recorded in speleothems from caves distributed throughout the major branches of the Asian monsoon system. A newly developed multi-proxy methodology explicitly considers dating uncertainties with the COPRA (COnstructing Proxy Records from Age models) approach and allows for detection of continental-scale regime shifts in the complexity of monsoon dynamics. Several epochs are characterised by non-linear regime shifts in Asian monsoon variability, including the periods around 8.5-7.9, 5.7-5.0, 4.1-3.7, and 3.0-2.4 ka BP. The timing of these regime shifts is consistent with known episodes of Holocene rapid climate change (RCC) and high-latitude Bond events. Additionally, we observe a previously rarely reported non-linear regime shift around 7.3 ka BP, a timing that matches the typical 1.0-1.5 ky return intervals of Bond events. A detailed review of previously suggested links between Holocene climatic changes in the Asian monsoon domain and the archaeological record indicates that, in addition to previously considered longer-term changes in mean monsoon intensity and other climatic parameters, regime shifts in monsoon complexity might have played an important role as drivers of migration, pronounced cultural changes, and the collapse of ancient human societies.
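At the core of recurrence network analysis is the recurrence matrix R_ij = Θ(ε − ||x_i − x_j||), reinterpreted as the adjacency matrix of a network. A minimal sketch on a toy series (illustrative only; the authors' multi-proxy, COPRA-based machinery is far more involved):

```python
# Minimal sketch of the recurrence-matrix step behind recurrence network analysis
# (toy example; not the authors' code, and eps here is an arbitrary assumption).
import numpy as np

def recurrence_network(x, eps):
    """Adjacency A_ij = 1 when states i and j are closer than eps (self-loops removed)."""
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distances between 1-D states
    A = (dist < eps).astype(int)            # thresholded recurrence matrix
    np.fill_diagonal(A, 0)                  # drop the trivial diagonal -> network
    return A

series = np.sin(np.linspace(0, 4 * np.pi, 100))  # toy stand-in for a proxy record
A = recurrence_network(series, eps=0.1)
# Network measures computed on A in sliding windows (e.g. transitivity) are what
# flag episodes of changed dynamical complexity, i.e. candidate regime shifts.
print("links:", A.sum() // 2)
```

Real applications embed the proxy record in a higher-dimensional state space and propagate dating uncertainty before thresholding, but the adjacency construction is the same.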
Design and Development of Nanostructured Surfaces for Enhanced Optical Sensing
NASA Astrophysics Data System (ADS)
Santiago Cordoba, Miguel A.
At smaller size regimes, materials' physicochemical properties change with respect to bulk analogs. In the case of metal nanoparticles like gold or silver, specific wavelengths of light can induce a coherent oscillation of their conduction electrons, generating an optical field confined to the nanoparticle surface. This phenomenon is termed surface plasmon, and has been used as an enhancing mechanism in optical sensing, allowing the detection of foreign materials at small concentrations. The goal of this dissertation is to develop nanostructured materials relying on surface plasmons that can be combined with different optical sensing platforms in order to enhance current detection limits. Initially, we focus on the development of surfactant free, stimuli responsive nanoparticle thin films, which undergo an active release when exposed to a stimulus such as a change in pH. These nanoparticle thin films provide faster analyte particle transport and direct electronic coupling with the analyte molecule, all without attenuating the evanescent wave from the optical transducer to the particle. These stimuli responsive nanostructured substrates are tested within a surface enhanced Raman platform for the detection of biomolecular probes at sub-nanomolar concentrations and μL sample sizes. Furthermore, the developed nanosubstrates can be patterned, providing a versatile nanoparticle thin film for multiplexing analysis, offering a substantial advantage over conventional surface based nanoparticle detection methods. Our results encouraged further optimization of light-matter interactions in optical detection platforms. It is for that reason that this dissertation evolves towards confined optical systems. Particularly, whispering gallery microcavities confine electromagnetic waves - at high volumes - at the boundary of a dielectric resonator.
In this dissertation, we examined the sensitivity of whispering gallery modes combining optical microcavities with plasmonic nanoparticles in analogy to a "nanoantenna". First, our hybrid methodology is tested by analyzing the resonant wavelength displacement of a whispering gallery mode cavity upon perturbation with a gold nanoparticle layer containing a model protein. Next, we developed a real-time optical sensing platform relying on whispering gallery microcavities and surface plasmons, and then tested it for the detection of a model protein at fM concentration (less than 1000 protein molecules). Finally, this plasmonic-photonic coupling process involving whispering gallery modes is studied via a self-referenced methodology relying on the mode splitting of a whispering gallery resonance. Specifically, we studied the mode splitting evolution of a resonant whispering gallery microcavity as a function of gold nanoparticle adherence with varying diameters. Mode splitting increases as the localized surface plasmon wavelength of the nanoparticle approaches the spectral line of the whispering gallery mode. Plasmonic-photonic coupling observed in this study provides a novel alternative to achieve single particle detection using mode splitting, as well as understanding optimization of particle size for plasmonic-photonic coupling. The study described herein opens a new way to optimize current optical sensing technology, enabling not only the detection of an analyte, but also the execution of fundamental studies of analyte interactions at ultralow concentrations.
Aly, Mariam; Yonelinas, Andrew P
2012-01-01
Subjective experience indicates that mental states are discrete, in the sense that memories and perceptions readily come to mind in some cases, but are entirely unavailable to awareness in others. However, a long history of psychophysical research has indicated that the discrete nature of mental states is largely epiphenomenal and that mental processes vary continuously in strength. We used a novel combination of behavioral methodologies to examine the processes underlying perception of complex images: (1) analysis of receiver operating characteristics (ROCs), (2) a modification of the change-detection flicker paradigm, and (3) subjective reports of conscious experience. These methods yielded converging results showing that perceptual judgments reflect the combined, yet functionally independent, contributions of two processes available to conscious experience: a state process of conscious perception and a strength process of knowing; processes that correspond to recollection and familiarity in long-term memory. In addition, insights from the perception experiments led to the discovery of a new recollection phenomenon in a long-term memory change detection paradigm. The apparent incompatibility between subjective experience and theories of cognition can be understood within a unified state-strength framework that links consciousness to cognition across the domains of perception and memory.
Monitoring trail conditions: New methodological considerations
Marion, Jeffrey L.; Leung, Yu-Fai; Nepal, Sanjay K.
2006-01-01
The U.S. National Park Service (NPS) accommodates nearly 300 million visitors per year, visitation that has the potential to produce negative effects on fragile natural and cultural resources. The policy guidance from the NPS Management Policies recognizes the legitimacy of providing opportunities for public enjoyment of parks while acknowledging the need for managers to “seek ways to avoid, or to minimize to the greatest degree practicable, adverse impacts on park resources and values” (NPS 2001). Thus, relative to visitor use, park managers must evaluate the types and extents of resource impacts associated with recreational activities, and determine to what extent they are unacceptable and constitute impairment. Visitor impact monitoring programs can assist managers in making objective evaluations of impact acceptability and impairment and in selecting effective impact management practices by providing quantitative documentation of the types and extent of recreation-related impacts on natural resources. Monitoring programs are explicitly authorized in Section 4.1 of the Management Policies: “Natural systems in the national park system, and the human influences upon them, will be monitored to detect change. The Service will use the results of monitoring and research to understand the detected change and to develop appropriate management actions.”
Hétu, Sébastien; Luo, Yi; D’Ardenne, Kimberlee; Lohrenz, Terry
2017-01-01
As models of shared expectations, social norms play an essential role in our societies. Since our social environment is changing constantly, our internal models of it also need to change. In humans, there is mounting evidence that neural structures such as the insula and the ventral striatum are involved in detecting norm violation and updating internal models. However, because of methodological challenges, little is known about the possible involvement of midbrain structures in detecting norm violation and updating internal models of our norms. Here, we used high-resolution cardiac-gated functional magnetic resonance imaging and a norm adaptation paradigm in healthy adults to investigate the role of the substantia nigra/ventral tegmental area (SN/VTA) complex in tracking signals related to norm violation that can be used to update internal norms. We show that the SN/VTA codes for the norm’s variance prediction error (PE) and norm PE with spatially distinct regions coding for negative and positive norm PE. These results point to a common role played by the SN/VTA complex in supporting both simple reward-based and social decision making. PMID:28981876
Rear-end vision-based collision detection system for motorcyclists
NASA Astrophysics Data System (ADS)
Muzammel, Muhammad; Yusoff, Mohd Zuki; Meriaudeau, Fabrice
2017-05-01
In many countries, the motorcyclist fatality rate is much higher than that of other vehicle drivers. Among many other factors, motorcycle rear-end collisions are also contributing to these biker fatalities. To increase the safety of motorcyclists and minimize their road fatalities, this paper introduces a vision-based rear-end collision detection system. The binary road detection scheme contributes significantly to reduce the negative false detections and helps to achieve reliable results even though shadows and different lane markers are present on the road. The methodology is based on Harris corner detection and Hough transform. To validate this methodology, two types of dataset are used: (1) self-recorded datasets (obtained by placing a camera at the rear end of a motorcycle) and (2) online datasets (recorded by placing a camera at the front of a car). This method achieved 95.1% accuracy for the self-recorded dataset and gives reliable results for the rear-end vehicle detections under different road scenarios. This technique also performs better for the online car datasets. The proposed technique's high detection accuracy using a monocular vision camera coupled with its low computational complexity makes it a suitable candidate for a motorbike rear-end collision detection system.
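The Hough-transform step underlying lane-marker and vehicle detection can be sketched in plain NumPy. This is a generic voting implementation over a synthetic edge map, not the authors' pipeline; the image size and the vertical test line are invented for illustration.

```python
import numpy as np

def hough_lines(edge_points, shape, n_theta=180):
    """Accumulate votes in (rho, theta) space for a set of edge pixels.
    Peaks in the accumulator correspond to straight lines such as lane
    markers; rho is offset by the image diagonal so indices stay >= 0."""
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    acc = np.zeros((2 * diag, n_theta), dtype=np.int32)
    for y, x in edge_points:
        # One vote per theta: rho = x*cos(theta) + y*sin(theta)
        rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    return acc, thetas, diag

# Hypothetical edge map: a perfect vertical line at x = 20 in a 50x50 image.
pts = [(y, 20) for y in range(50)]
acc, thetas, diag = hough_lines(pts, (50, 50))
rho_idx, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
print(rho_idx - diag, np.rad2deg(thetas[theta_idx]))  # peak at rho=20, theta=0
```

A real system would first produce the edge points with a corner/edge detector (the paper uses Harris corners) and then threshold the accumulator rather than taking a single argmax.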
Estepp, Justin R.; Christensen, James C.
2015-01-01
The passive brain-computer interface (pBCI) framework has been shown to be a very promising construct for assessing cognitive and affective state in both individuals and teams. There is a growing body of work that focuses on solving the challenges of transitioning pBCI systems from the research laboratory environment to practical, everyday use. An interesting issue is what impact methodological variability may have on the ability to reliably identify (neuro)physiological patterns that are useful for state assessment. This work aimed at quantifying the effects of methodological variability in a pBCI design for detecting changes in cognitive workload. Specific focus was directed toward the effects of replacing electrodes over dual sessions (thus inducing changes in placement, electromechanical properties, and/or impedance between the electrode and skin surface) on the accuracy of several machine learning approaches in a binary classification problem. In investigating these methodological variables, it was determined that the removal and replacement of the electrode suite between sessions does not impact the accuracy of a number of learning approaches when trained on one session and tested on a second. This finding was confirmed by comparing to a control group for which the electrode suite was not replaced between sessions. This result suggests that sensors (both neurological and peripheral) may be removed and replaced over the course of many interactions with a pBCI system without affecting its performance. Future work on multi-session and multi-day pBCI system use should seek to replicate this (lack of) effect between sessions in other tasks, temporal time courses, and data analytic approaches while also focusing on non-stationarity and variable classification performance due to intrinsic factors. PMID:25805963
Systematic review of smartphone-based passive sensing for health and wellbeing.
Cornet, Victor P; Holden, Richard J
2018-01-01
To review published empirical literature on the use of smartphone-based passive sensing for health and wellbeing. A systematic review of the English language literature was performed following PRISMA guidelines. Papers indexed in computing, technology, and medical databases were included if they were empirical, focused on health and/or wellbeing, involved the collection of data via smartphones, and described the utilized technology as passive or requiring minimal user interaction. Thirty-five papers were included in the review. Studies were performed around the world, with samples of up to 171 (median n = 15) representing individuals with bipolar disorder, schizophrenia, depression, older adults, and the general population. The majority of studies used the Android operating system and an array of smartphone sensors, most frequently capturing accelerometry, location, audio, and usage data. Captured data were usually sent to a remote server for processing but were shared with participants in only 40% of studies. Reported benefits of passive sensing included accurately detecting changes in status, behavior change through feedback, and increased accountability in participants. Studies reported facing technical, methodological, and privacy challenges. Studies in the nascent area of smartphone-based passive sensing for health and wellbeing demonstrate promise and invite continued research and investment. Existing studies suffer from weaknesses in research design, lack of feedback and clinical integration, and inadequate attention to privacy issues. Key recommendations relate to developing passive sensing strategies matching the problem at hand, using personalized interventions, and addressing methodological and privacy challenges. As evolving passive sensing technology presents new possibilities for health and wellbeing, additional research must address methodological, clinical integration, and privacy issues. Doing so depends on interdisciplinary collaboration between informatics and clinical experts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lafata, K; Ren, L; Cai, J
2016-06-15
Purpose: To develop a methodology based on digitally-reconstructed-fluoroscopy (DRF) to quantitatively assess target localization accuracy of lung SBRT, and to evaluate using both a dynamic digital phantom and a patient dataset. Methods: For each treatment field, a 10-phase DRF is generated based on the planning 4DCT. Each frame is pre-processed with a morphological top-hat filter, and corresponding beam apertures are projected to each detector plane. A template-matching algorithm based on cross-correlation is used to detect the tumor location in each frame. Tumor motion relative to the beam aperture is extracted in the superior-inferior direction based on each frame’s impulse response to the template, and the mean tumor position (MTP) is calculated as the average tumor displacement. The DRF template coordinates are then transferred to the corresponding MV-cine dataset, which is retrospectively filtered as above. The treatment MTP is calculated within each field’s projection space, relative to the DRF-defined template. The field’s localization error is defined as the difference between the DRF-derived MTP (planning) and the MV-cine-derived MTP (delivery). A dynamic digital phantom was used to assess the algorithm’s ability to detect intra-fractional changes in patient alignment, by simulating different spatial variations in the MV-cine and calculating the corresponding change in MTP. Inter- and intra-fractional variation, IGRT accuracy, and filtering effects were investigated on a patient dataset. Results: Phantom results demonstrated a high accuracy in detecting both translational and rotational variation. The lowest localization error of the patient dataset was achieved at each fraction’s first field (mean=0.38mm), with Fx3 demonstrating a particularly strong correlation between intra-fractional motion-caused localization error and treatment progress. Filtering significantly improved tracking visibility in both the DRF and MV-cine images.
Conclusion: We have developed and evaluated a methodology to quantify lung SBRT target localization accuracy based on digitally-reconstructed-fluoroscopy. Our approach may be useful in potentially reducing treatment margins to optimize lung SBRT outcomes. R01-184173.
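The cross-correlation template-matching step can be illustrated with a minimal NumPy sketch. The 32x32 frame, the Gaussian "tumor" template, and the blob location are hypothetical; a production tracker would use FFT-based correlation rather than this brute-force window scan.

```python
import numpy as np

def match_template(frame, template):
    """Return the (row, col) of the peak normalized cross-correlation
    between the template and every same-size window of the frame."""
    th, tw = template.shape
    t = template - template.mean()
    t_energy = (t ** 2).sum()
    best, best_rc = -np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum() * t_energy)
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best

# Hypothetical frame: low-amplitude noise with a Gaussian blob added at (12, 7).
rng = np.random.default_rng(0)
g = np.exp(-0.5 * (np.arange(5) - 2) ** 2)
template = np.outer(g, g)
frame = rng.normal(0.0, 0.05, (32, 32))
frame[12:17, 7:12] += template
loc, score = match_template(frame, template)
print(loc)
```

Normalization by both window and template energy is what makes the score robust to the brightness changes a top-hat filter does not remove.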
A study and evaluation of image analysis techniques applied to remotely sensed data
NASA Technical Reports Server (NTRS)
Atkinson, R. J.; Dasarathy, B. V.; Lybanon, M.; Ramapriyan, H. K.
1976-01-01
An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.
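The Karhunen-Loeve transform mentioned for data compression is, in practice, a projection onto the leading eigenvectors of the band covariance matrix. A minimal sketch with made-up 4-band data, in which two latent signals carry nearly all of the variance:

```python
import numpy as np

def klt_compress(pixels, k):
    """Project multispectral pixels (n x bands) onto the top-k
    eigenvectors of their covariance matrix (the Karhunen-Loeve
    transform); returns scores, basis, mean, and explained variance."""
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    cov = np.cov(centered, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues ascending
    basis = vecs[:, ::-1][:, :k]          # top-k principal components
    scores = centered @ basis
    explained = vals[::-1][:k].sum() / vals.sum()
    return scores, basis, mu, explained

# Hypothetical 4-band scene driven by two latent signals plus tiny noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 2))
mixing = np.array([[1.0, 0.2, 0.9, 0.1],
                   [0.1, 1.0, 0.2, 0.8]])
data = latent @ mixing + rng.normal(0, 0.01, (500, 4))
scores, basis, mu, explained = klt_compress(data, 2)
print(scores.shape, round(explained, 3))
```

Keeping only the top components halves the data volume here while retaining essentially all of the variance, which is the compression effect the report explored on its test section.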
Statistical data mining of streaming motion data for fall detection in assistive environments.
Tasoulis, S K; Doukas, C N; Maglogiannis, I; Plagianakos, V P
2011-01-01
The analysis of human motion data is interesting for the purpose of activity recognition or emergency event detection, especially in the case of elderly or disabled people living independently in their homes. Several techniques have been proposed for identifying such distress situations using either motion, audio or video sensors on the monitored subject (wearable sensors) or the surrounding environment. The output of such sensors is data streams that require real time recognition, especially in emergency situations, thus traditional classification approaches may not be applicable for immediate alarm triggering or fall prevention. This paper presents a statistical mining methodology that may be used for the specific problem of real time fall detection. Visual data captured from the user's environment, using overhead cameras along with motion data are collected from accelerometers on the subject's body and are fed to the fall detection system. The paper includes the details of the stream data mining methodology incorporated in the system along with an initial evaluation of the achieved accuracy in detecting falls.
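A minimal stand-in for the streaming detection idea is an online z-score test against exponentially weighted running moments. The window length, threshold, and accelerometer trace below are invented for illustration and are far simpler than the paper's statistical mining methodology.

```python
import numpy as np

def detect_falls(accel_mag, window=50, k=6.0):
    """Online fall-candidate detector: flag samples whose magnitude
    deviates from an exponentially weighted running mean by more than
    k running standard deviations. O(1) memory per sample, so it suits
    data streams that cannot be stored for batch classification."""
    alarms = []
    mean, var, n = 0.0, 1.0, 0
    for i, x in enumerate(accel_mag):
        if n >= window and abs(x - mean) > k * np.sqrt(var):
            alarms.append(i)
        a = 1.0 / window                      # smoothing factor
        mean = (1 - a) * mean + a * x
        var = (1 - a) * var + a * (x - mean) ** 2
        n += 1
    return alarms

# Hypothetical stream: ~1 g resting magnitude with an impact spike at t=300.
rng = np.random.default_rng(5)
stream = 1.0 + 0.02 * rng.standard_normal(600)
stream[300] = 3.5                             # fall impact transient
alarms = detect_falls(stream)
print(alarms)
```

A real system, as in the paper, would fuse such accelerometer evidence with visual features before triggering an alarm.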
ERIC Educational Resources Information Center
Rosencwaig, Allan
1982-01-01
Thermal features of and beneath the surface of a sample can be detected and imaged with a thermal-wave microscope. Various methodologies for the excitation and detection of thermal waves are discussed, and several applications, primarily in microelectronics, are presented. (Author)
Dhir, Somdutta; Pacurar, Mircea; Franklin, Dino; Gáspári, Zoltán; Kertész-Farkas, Attila; Kocsor, András; Eisenhaber, Frank; Pongor, Sándor
2010-11-01
SBASE is a project initiated to detect known domain types and predict domain architectures using sequence similarity searching (Simon et al., Protein Seq Data Anal, 5:39-42, 1992; Pongor et al., Nucl. Acids Res. 21:3111-3115, 1992). The current approach uses a curated collection of domain sequences - the SBASE domain library - and standard similarity search algorithms, followed by postprocessing based on simple statistics of the domain similarity network (http://hydra.icgeb.trieste.it/sbase/). It is especially useful in detecting rare, atypical examples of known domain types which are sometimes missed even by more sophisticated methodologies. This approach does not require multiple alignment or machine learning techniques, and can be a useful complement to other domain detection methodologies. This article gives an overview of the project history as well as of the concepts and principles developed within the project.
Chorny, Joseph A; Frye, Teresa C; Fisher, Beth L; Remmers, Carol L
2018-03-23
The primary high-risk human papillomavirus (hrHPV) assays in the United States are the cobas (Roche) and the Aptima (Hologic). The cobas assay detects hrHPV by DNA analysis while the Aptima detects messenger RNA (mRNA) oncogenic transcripts. As the Aptima assay identifies oncogenic expression, it should have a lower rate of hrHPV and genotype detection. The Kaiser Permanente Regional Reference Laboratory in Denver, Colorado, changed its hrHPV assay from the cobas to the Aptima assay. The rates of hrHPV detection and genotyping were compared over successive six-month periods. The overall hrHPV detection rates by the two platforms were similar (9.5% versus 9.1%) and not statistically different. For genotyping, the HPV 16 rate by the cobas was 1.6% and by the Aptima it was 1.1%. These differences were statistically different, with the Aptima detecting nearly one-third fewer HPV 16 infections. With HPV 18 and HPV 18/45, there was a slightly higher detection rate of HPV 18/45 by the Aptima platform (0.5% versus 0.9%) and this was statistically significant. While HPV 16 represents a low percentage of hrHPV infections, it was detected significantly less often by the Aptima assay compared to the cobas assay. This has been previously reported, although not highlighted. Given the test methodologies, one would expect the Aptima to detect less HPV 16. This difference appears to be mainly due to a significantly increased number of non-oncogenic HPV 16 infections detected by the cobas test, as there were no differences in HPV 16 detection rates in the high-grade squamous intraepithelial lesions, indicating that the two tests have similar sensitivities for oncogenic HPV 16.
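Rate comparisons like these are commonly checked with a two-proportion z-test. A sketch with hypothetical counts chosen to mirror the reported 1.6% vs. 1.1% HPV 16 rates (the abstract does not give the actual denominators, so the sample sizes below are assumptions):

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 160/10000 positives on cobas vs. 110/10000 on Aptima.
z, p = two_proportion_z(160, 10000, 110, 10000)
print(round(z, 2), p < 0.05)
```

With denominators of this magnitude the 0.5-point rate gap is comfortably significant, consistent with the abstract's statement.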
Methylsorb: a simple method for quantifying DNA methylation using DNA-gold affinity interactions.
Sina, Abu Ali Ibn; Carrascosa, Laura G; Palanisamy, Ramkumar; Rauf, Sakandar; Shiddiky, Muhammad J A; Trau, Matt
2014-10-21
The analysis of DNA methylation is becoming increasingly important both in the clinic and also as a research tool to unravel key epigenetic molecular mechanisms in biology. Current methodologies for the quantification of regional DNA methylation (i.e., the average methylation over a region of DNA in the genome) rely largely on comprehensive DNA sequencing methodologies, which tend to be expensive, tedious, and time-consuming for many applications. Herein, we report an alternative DNA methylation detection method referred to as "Methylsorb", which is based on the inherent affinity of DNA bases to the gold surface (i.e., the trend of the affinity interactions is adenine > cytosine ≥ guanine > thymine). Since the degree of gold-DNA affinity interaction is highly sequence dependent, it provides a new capability to detect DNA methylation by simply monitoring the relative adsorption of bisulfite-treated DNA sequences onto a gold chip. Because the selective physical adsorption of DNA fragments to gold enables a direct read-out of regional DNA methylation, the current requirement for DNA sequencing is obviated. To demonstrate the utility of this method, we present data on the regional methylation status of two CpG clusters located in the EN1 and MIR200B genes in MCF7 and MDA-MB-231 cells. The methylation status of these regions was obtained from the change in relative mass on the gold surface with respect to relative adsorption of an unmethylated DNA source, and this was detected using surface plasmon resonance (SPR) in a label-free and real-time manner. We anticipate that the simplicity of this method, combined with the high level of accuracy for identifying the methylation status of cytosines in DNA, could find broad application in biology and diagnostics.
Wan, Boyong; Small, Gary W
2011-01-21
A novel synthetic data generation methodology is described for use in the development of pattern recognition classifiers that are employed for the automated detection of volatile organic compounds (VOCs) during infrared remote sensing measurements. The approach used is passive Fourier transform infrared spectrometry implemented in a downward-looking mode on an aircraft platform. A key issue in developing this methodology in practice is the need for example data that can be used to train the classifiers. To replace the time-consuming and costly collection of training data in the field, this work implements a strategy for taking laboratory analyte spectra and superimposing them on background spectra collected from the air. The resulting synthetic spectra can be used to train the classifiers. This methodology is tested by developing classifiers for ethanol and methanol, two prevalent VOCs in wide industrial use. The classifiers are successfully tested with data collected from the aircraft during controlled releases of ethanol and during a methanol release from an industrial facility. For both ethanol and methanol, missed detections in the aircraft data are in the range of 4 to 5%, with false positive detections ranging from 0.1 to 0.3%.
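The core data-augmentation idea, superimposing laboratory analyte signatures on field background spectra, can be sketched as follows. The Gaussian analyte band, wavenumber grid, and scaling strengths are all hypothetical stand-ins for real spectra.

```python
import numpy as np

def make_synthetic(backgrounds, analyte_absorbance, strengths):
    """Superimpose a scaled analyte signature on measured background
    spectra (Beer-Lambert-style, in absorbance units) to create labeled
    training spectra without costly field release experiments."""
    out = []
    for bg in backgrounds:
        for s in strengths:           # s = 0 gives background-only examples
            out.append(bg + s * analyte_absorbance)
    return np.array(out)

# Hypothetical inputs: 3 airborne background spectra and one Gaussian
# analyte absorption band centered at 1050 cm^-1.
rng = np.random.default_rng(2)
wavenumber = np.linspace(900, 1200, 256)
backgrounds = 0.5 + 0.05 * rng.standard_normal((3, 256))
analyte = np.exp(-0.5 * ((wavenumber - 1050) / 8.0) ** 2)
train = make_synthetic(backgrounds, analyte, strengths=[0.0, 0.1, 0.2])
print(train.shape)
```

The zero-strength copies serve as the "analyte absent" class, so a classifier can be trained on the synthetic set and then applied to real aircraft data.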
Transmission line relay mis-operation detection based on time-synchronized field data
Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen
2015-05-04
In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or detection of relay mis-operation. As such, it can be a source of very useful information to support the system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation is done using the field data recorded by digital fault recorders and protective relays. The test data included several hundred event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
Sazonov, Edward S; Makeyev, Oleksandr; Schuckers, Stephanie; Lopez-Meyer, Paulo; Melanson, Edward L; Neuman, Michael R
2010-03-01
Our understanding of etiology of obesity and overweight is incomplete due to lack of objective and accurate methods for monitoring of ingestive behavior (MIB) in the free-living population. Our research has shown that frequency of swallowing may serve as a predictor for detecting food intake, differentiating liquids and solids, and estimating ingested mass. This paper proposes and compares two methods of acoustical swallowing detection from sounds contaminated by motion artifacts, speech, and external noise. Methods based on mel-scale Fourier spectrum, wavelet packets, and support vector machines are studied considering the effects of epoch size, level of decomposition, and lagging on classification accuracy. The methodology was tested on a large dataset (64.5 h with a total of 9966 swallows) collected from 20 human subjects with various degrees of adiposity. Average weighted epoch-recognition accuracy for intravisit individual models was 96.8%, which resulted in 84.7% average weighted accuracy in detection of swallowing events. These results suggest high efficiency of the proposed methodology in separation of swallowing sounds from artifacts that originate from respiration, intrinsic speech, head movements, food ingestion, and ambient noise. The recognition accuracy was not related to body mass index, suggesting that the methodology is suitable for obese individuals.
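The mel-scale spectral features used for swallowing-sound classification can be illustrated by constructing a standard triangular mel filterbank. The sampling rate, FFT size, filter count, and the noise "epoch" below are assumptions, not the paper's settings.

```python
import numpy as np

def mel_filterbank(n_filters, n_fft, sr):
    """Triangular filters equally spaced on the mel scale, used to reduce
    a magnitude spectrum to perceptually spaced band energies."""
    def hz_to_mel(f):
        return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m):
        return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for k in range(l, c):               # rising edge of triangle i
            fb[i - 1, k] = (k - l) / max(c - l, 1)
        for k in range(c, r):               # falling edge of triangle i
            fb[i - 1, k] = (r - k) / max(r - c, 1)
    return fb

# Hypothetical epoch: log band energies of one noise frame.
sr, n_fft = 8000, 512
fb = mel_filterbank(20, n_fft, sr)
rng = np.random.default_rng(3)
spectrum = np.abs(np.fft.rfft(rng.standard_normal(n_fft)))
features = np.log(fb @ spectrum + 1e-10)
print(features.shape)
```

Feature vectors like this, one per epoch, would then be fed to a classifier such as the support vector machines the paper evaluates.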
Furukawa, Emi; Shimabukuro, Shizuka; Alsop, Brent; Tripp, Gail
2017-09-25
Most research on motivational processes in attention deficit hyperactivity disorder (ADHD) has been undertaken in Western Europe and North America. The extent to which these findings apply to other cultural groups is unclear. The current study evaluated the behavioral sensitivity of Japanese children with and without ADHD to changing reward availability. Forty-one school-aged children, 19 diagnosed with DSM-IV ADHD, completed a signal-detection task in which correct discriminations between two stimuli were associated with different reinforcement frequencies. The response alternative associated with the higher rate of reinforcement switched twice during the task without warning. Both groups of children developed an initial bias toward the more frequently reinforced response alternative. When the reward contingencies switched, the response allocation (bias) of the control group children followed suit. The response bias scores of the children with ADHD did not, suggesting impaired tracking of reward availability over time. Japanese children with ADHD adjust their behavioral responses to changing reinforcer availability less than their typically developing peers. This is not explained by poor attention to task or a lack of sensitivity to reward. The current results are consistent with altered sensitivity to changing reward contingencies identified in non-Japanese samples of children with ADHD. Irrespective of their country of origin, children with ADHD will likely benefit from behavioral expectations and reinforcement contingencies being made explicit together with high rates of reinforcement for appropriate behaviors.
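The bias analysis described here comes from signal detection theory. Below is a sketch of the standard discriminability (d') and criterion computation on hypothetical hit/false-alarm counts before and after a contingency switch; the study's own bias estimator may differ, and these numbers are not its data.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, fas, crs):
    """d' (discriminability) and c (response bias) from a signal-detection
    count table, with a 0.5 log-linear correction for extreme cells."""
    h = (hits + 0.5) / (hits + misses + 1)
    f = (fas + 0.5) / (fas + crs + 1)
    z = NormalDist().inv_cdf
    d_prime = z(h) - z(f)
    criterion = -0.5 * (z(h) + z(f))   # negative c = liberal responding
    return d_prime, criterion

# Hypothetical blocks: before vs. after the reinforcement switch.
d1, c1 = sdt_measures(hits=45, misses=5, fas=20, crs=30)
d2, c2 = sdt_measures(hits=30, misses=20, fas=8, crs=42)
print(round(d1, 2), round(c1, 2), round(c2, 2))
```

A shift in c across blocks (here from liberal toward conservative) is the signature of response allocation tracking the changed reward contingency; the finding above is that this shift was attenuated in the ADHD group.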
Dreger, Mathias; Leung, Bo Wah; Brownlee, George G; Deng, Tao
2009-01-01
We describe a method for studying quantitative changes in accessibility of surface lysine residues of the PB1 subunit of the influenza RNA polymerase as a result of association with the PA subunit to form a PB1-PA heterodimer. Our method combines two established methods: (i) the chemical modification of surface lysine residues of native proteins by N-hydroxysuccinimidobiotin (NHS-biotin) and (ii) the stable isotope labeling of amino acids in cell culture (SILAC) followed by tryptic digestion and mass spectrometry. By linking the chemical modification with the SILAC methodology for the first time, we obtain quantitative data on chemical modification allowing subtle changes in accessibility to be described. Five regions in the PB1 monomer showed altered reactivity to NHS-biotin when compared with the [PB1-PA] heterodimer. Mutational analysis of residues in two such regions—at K265 and K481 of PB1, which were about three- and twofold, respectively, less accessible to biotinylation in the PB1-PA heterodimer compared with the PB1 monomer, demonstrated that both K265 and K481 were crucial for polymerase function. This novel assay of quantitative profiling of biotinylation patterns (Q-POP assay) highlights likely conformational changes at important functional sites, as observed here for PB1, and may provide information on protein–protein interaction interfaces. The Q-POP assay should be a generally applicable approach and may detect novel functional sites suitable for targeting by drugs. PMID:19517532
An Empirical Research Study of the Efficacy of Two Plagiarism-Detection Applications
ERIC Educational Resources Information Center
Hill, Jacob D.; Page, Elaine Fetyko
2009-01-01
This article describes a study of the two most popular plagiarism-detection software platforms available on today's market--Turnitin (http://www.turnitin.com/static/index.html) and SafeAssign (http://www.safeassign.com/). After a brief discussion of plagiarism's relevance to librarians, the authors examine plagiarism-detection methodology and…
An advanced LC-MS (Q-TOF) technique for the detection of amino acids in atmospheric aerosols
Methodology for detection of native (underivitized) amino acids in atmospheric aerosols has been developed. This article describes the use of LC-MS (Q-TOF) and microwave-assisted gas phase hydrolysis for detection of free and combined amino acids in aerosols collected in a Southe...
Vaz, Marcela C. M.; Rocha-Santos, Teresa A. P.; Rocha, Rui J. M.; Lopes, Isabel; Pereira, Ruth; Duarte, Armando C.; Rubec, Peter J.; Calado, Ricardo
2012-01-01
Cyanide fishing is a method employed to capture marine fish alive on coral reefs. They are shipped to markets for human consumption in Southeast Asia, as well as to supply the marine aquarium trade worldwide. Although several techniques can be used to detect cyanide in reef fish, there is still no testing method that can be used to survey the whole supply chain. Most methods for cyanide detection are time-consuming and require the sacrifice of the sampled fish. Thiocyanate anion (SCN−) is a metabolite produced by the main metabolic pathway for cyanide anion (CN−) detoxification. Our study employed an optical fiber (OF) methodology (analytical time <6 min) to detect SCN− in a non-invasive and non-destructive manner. Our OF methodology is able to detect trace levels (>3.16 µg L−1) of SCN− in seawater. Given that marine fish exposed to cyanide excrete SCN− in the urine, elevated levels of SCN− present in the seawater holding live reef fish indicate that the surveyed specimens were likely exposed to cyanide. In our study, captive-bred clownfish (Amphiprion clarkii) pulse exposed for 60 s to either 12.5 or 25 mg L−1 of CN− excreted up to 6.96±0.03 and 9.84±0.03 µg L−1 of SCN−, respectively, during the 28 days following exposure. No detectable levels of SCN− were recorded in the water holding control organisms not exposed to CN−, or in synthetic seawater lacking fish. While further research is necessary, our methodology can allow a rapid detection of SCN− in the holding water and can be used as a screening tool to indicate if live reef fish were collected with cyanide. PMID:22536375
Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G
2013-02-26
The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.
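The coverage-based CNV idea above can be sketched with a Poisson-Gamma (i.e., negative binomial) simulation of per-window read counts, which reproduces the overdispersion the authors observed relative to a plain Poisson. The window count, dispersion, amplification factor, and quantile cutoffs are illustrative assumptions, and the empirical-quantile flagging is a crude stand-in for the paper's fitted hierarchical models:

```python
import math
import random
import statistics

random.seed(7)

def poisson(lam):
    # Knuth's method: count uniform draws until their product falls below e^-lam.
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def nb_coverage(mean, shape, n_windows):
    # Poisson-Gamma mixture = negative binomial: a Gamma-distributed rate per
    # window, Poisson counts given the rate. Overdispersed relative to Poisson.
    return [poisson(random.gammavariate(shape, mean / shape))
            for _ in range(n_windows)]

def flag_cnv(counts, low_q=0.01, high_q=0.99):
    # Flag windows whose coverage falls outside empirical quantiles
    # (too low -> deletion candidate, too high -> amplification candidate).
    ranked = sorted(counts)
    lo = ranked[int(low_q * len(ranked))]
    hi = ranked[int(high_q * len(ranked)) - 1]
    return [i for i, c in enumerate(counts) if c < lo or c > hi]

coverage = nb_coverage(mean=50, shape=5, n_windows=2000)
# Spike in a hypothetical 5x amplification over five windows.
coverage[100:105] = [poisson(250) for _ in range(5)]
flagged = flag_cnv(coverage)
print(f"var={statistics.pvariance(coverage):.0f} vs mean={statistics.mean(coverage):.0f}; "
      f"{len(flagged)} windows flagged")
```

The simulated variance comes out far above the mean, which is exactly the deviation from the Poisson assumption that motivates the hierarchical models; the amplified windows land well above the upper quantile and are flagged.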
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-06-01
The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To overcome this failure problem, we propose a design methodology for topological change, co-simulated with software. In the proposed design methodology, the designer checks P_axial, P_cr, P_failure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.
C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component
NASA Astrophysics Data System (ADS)
Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.
2018-02-01
The failure of tractor components and their replacement has now become very common in India because of re-cycling, re-sale, and duplication. To overcome this failure problem, we propose a design methodology for topological change, co-simulated with software. In the proposed design methodology, the designer checks P_axial, P_cr, P_failure, and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
ERIC Educational Resources Information Center
Garske, Steven Ray
2010-01-01
Backsourcing is the act of an organization changing an outsourcing relationship through insourcing, vendor change, or elimination of the outsourced service. This study discovered numerous problematic outsourcing manipulations conducted by suppliers, and identified backsourcing methodologies to correct these manipulations across multiple supplier…
Kamal, Noreen; Fels, Sidney
2013-01-01
Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) a laboratory study on a medium-fidelity prototype; and 4) a field study on the high-fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system that linked to external third-party databases and was deployed to users via an interactive website.
Modified Methodology for Projecting Coastal Louisiana Land Changes over the Next 50 Years
Hartley, Steve B.
2009-01-01
The coastal Louisiana landscape is continually undergoing geomorphologic changes (in particular, land loss); however, after the 2005 hurricane season, the changes were intensified because of Hurricanes Katrina and Rita. The amount of land loss caused by the 2005 hurricane season was 42 percent (562 km2) of the total land loss (1,329 km2) that was projected for the next 50 years in the Louisiana Coastal Area (LCA), Louisiana Ecosystem Restoration Study. The purpose of this study is to provide information on potential changes to coastal Louisiana by using a revised LCA study methodology. In the revised methodology, we used classified Landsat TM satellite imagery from 1990, 2001, 2004, and 2006 to calculate the 'background' or ambient land-water change rates but divided the Louisiana coastal area differently on the basis of (1) geographic regions ('subprovinces') and (2) specific homogeneous habitat types. Defining polygons by subprovinces (1, Pontchartrain Basin; 2, Barataria Basin; 3, Vermilion/Terrebonne Basins; and 4, the Chenier Plain area) allows for a specific erosion rate to be applied to that area. Further subdividing the provinces by habitat type allows for specific erosion rates for a particular vegetation type to be applied. Our modified methodology resulted in 24 polygons rather than the 183 that were used in the LCA study; further, actively managed areas and the CWPPRA areas were not masked out and dealt with separately as in the LCA study. This revised methodology assumes that erosion rates for habitat types by subprovince are under the influence of similar environmental conditions (sediment depletion, subsidence, and saltwater intrusion). Background change rates for three time periods (1990-2001, 1990-2004, and 1990-2006) were calculated by taking the difference in water or land over each time period and dividing it by the time interval. This calculation gives an annual change rate for each polygon per time period.
Change rates for each time period were then used to compute the projected change in each subprovince and habitat type over 50 years by using the same compound rate functions used in the LCA study. The resulting maps show projected land changes based on the revised methodology and inclusion of damage by Hurricanes Katrina and Rita. Comparison of projected land change values between the LCA study and this study shows that this revised methodology - that is, using a reduced polygon subset (reduced from 183 to 24) based on habitat type and subprovince - can be used as a quick projection of land loss.
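The rate-and-projection arithmetic described above (annual change as the land-area difference divided by the interval, then a compound-rate function over 50 years) can be sketched as follows. The polygon areas and dates are hypothetical, and converting the annual rate to a fraction of current area is one plausible reading of the LCA-style compound function, not the study's exact formula:

```python
def annual_rate(area_start, area_end, years):
    # Ambient land-change rate: land-area difference divided by the interval.
    return (area_end - area_start) / years  # km2 per year

def compound_projection(area_now, area_start, area_end, years_obs, years_proj):
    # One plausible compound-rate form: express the annual rate as a fraction
    # of current area, then compound it over the projection horizon.
    fractional_rate = annual_rate(area_start, area_end, years_obs) / area_now
    return area_now * (1.0 + fractional_rate) ** years_proj

# Hypothetical polygon: 1200 km2 of land in 1990, 1100 km2 in 2006.
rate = annual_rate(1200.0, 1100.0, 2006 - 1990)
projected = compound_projection(1100.0, 1200.0, 1100.0, 16, 50)
print(f"annual rate: {rate:.2f} km2/yr; projected area after 50 yr: {projected:.0f} km2")
```

In the study this calculation is repeated per polygon (24 of them, by subprovince and habitat type) and per baseline period, which is what makes the reduced polygon set a quick projection tool.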
A strategic planning methodology for aircraft redesign
NASA Astrophysics Data System (ADS)
Romli, Fairuz Izzuddin
Due to a progressive market shift to a customer-driven environment, the influence of engineering changes on the product's market success is becoming more prominent. This situation affects many long lead-time product industries including aircraft manufacturing. Derivative development has been the key strategy for many aircraft manufacturers to survive the competitive market and this trend is expected to continue in the future. Within this environment of design adaptation and variation, the main market advantages are often gained by the fastest aircraft manufacturers to develop and produce their range of market offerings without any costly mistakes. This realization creates an emphasis on the efficiency of the redesign process, particularly on the handling of engineering changes. However, most activities involved in the redesign process are supported either inefficiently or not at all by the current design methods and tools, primarily because they have been mostly developed to improve original product development. In view of this, the main goal of this research is to propose an aircraft redesign methodology that will act as a decision-making aid for aircraft designers in the change implementation planning of derivative developments. The proposed method, known as Strategic Planning of Engineering Changes (SPEC), combines the key elements of the product redesign planning and change management processes. Its application is aimed at reducing the redesign risks of derivative aircraft development, improving the detection of possible change effects propagation, increasing the efficiency of the change implementation planning and also reducing the costs and the time delays due to the redesign process. To address these challenges, four research areas have been identified: baseline assessment, change propagation prediction, change impact analysis and change implementation planning. 
Based on the established requirements for the redesign planning process, several methods and tools that are identified within these research areas have been abstracted and adapted into the proposed SPEC method to meet the research goals. The proposed SPEC method is shown to be promising in improving the overall efficiency of the derivative aircraft planning process through two notional aircraft system redesign case studies that are presented in this study.
Documentation and Detection of Colour Changes of Bas Relieves Using Close Range Photogrammetry
NASA Astrophysics Data System (ADS)
Malinverni, E. S.; Pierdicca, R.; Sturari, M.; Colosi, F.; Orazi, R.
2017-05-01
The digitization of complex buildings, findings or bas relieves can strongly facilitate the work of archaeologists, mainly for in-depth analysis tasks. However, while new visualization techniques ease the study phase, a classical naked-eye approach for determining changes or surface alteration carries several drawbacks. The research work described in these pages is aimed at providing experts with a workflow for the evaluation of alterations (e.g. color decay or surface alterations), allowing a more rapid and objective monitoring of monuments. More specifically, a pipeline of work has been tested in order to evaluate the color variation between surfaces acquired at different epochs. The introduction of reliable change detection tools in the archaeological domain is needed; in fact, the most widespread practice among archaeologists and practitioners is to perform a traditional monitoring of surfaces that consists of three main steps: production of a hand-made map based on a subjective analysis, selection of a sub-set of regions of interest, and removal of small portions of surface for in-depth analysis conducted in the laboratory. To overcome this risky and time-consuming process, a digital automatic change detection procedure represents a turning point. To do so, automatic classification has been carried out according to two approaches: a pixel-based and an object-based method. Pixel-based classification aims to identify the classes by means of the spectral information provided by each pixel belonging to the original bands. The object-based approach operates on sets of pixels (objects/regions) grouped together by means of an image segmentation technique. The methodology was tested by studying the bas-relieves of a temple located in Peru, named Huaca de la Luna. Although the data sources were collected with unplanned surveys, the workflow proved to be a valuable solution, useful for understanding the main changes over time.
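The pixel-based branch of the comparison above can be illustrated with a minimal image-differencing sketch on two co-registered grayscale grids; the pixel values and threshold are hypothetical, not from the Huaca de la Luna study, and the object-based approach would instead first segment the image into regions:

```python
# Minimal pixel-based change detection: difference two co-registered
# grayscale images and threshold each pixel into change / no-change.
# Values and the threshold are illustrative only.

def change_mask(img_t1, img_t2, threshold):
    return [[abs(a - b) > threshold for a, b in zip(r1, r2)]
            for r1, r2 in zip(img_t1, img_t2)]

t1 = [[120, 122, 118],
      [121, 119, 120],
      [118, 120, 121]]
t2 = [[121, 122, 60],   # strongly altered pixel (e.g., color decay)
      [120, 119, 121],
      [119, 58, 120]]   # second altered pixel

mask = change_mask(t1, t2, threshold=30)
changed = sum(cell for row in mask for cell in row)
print(f"{changed} changed pixels")  # 2 changed pixels in this toy example
```

An object-based variant would group pixels into segments first and then compare per-segment statistics, which suppresses the isolated-pixel noise that makes purely pixel-wise maps speckled.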
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rood, Arthur S.; Sondrup, A. Jeffrey; Ritter, Paul D.
A methodology to quantify the performance of an air monitoring network in terms of frequency of detection has been developed. The methodology utilizes an atmospheric transport model to predict air concentrations of radionuclides at the samplers for a given release time and duration. Frequency of detection is defined as the fraction of “events” that result in a detection at either a single sampler or network of samplers. An “event” is defined as a release of finite duration that begins on a given day and hour of the year from a facility with the potential to emit airborne radionuclides. Another metric of interest is the network intensity, which is defined as the fraction of samplers in the network that have a positive detection for a given event. The frequency of detection methodology allows for evaluation of short-term releases that include effects of short-term variability in meteorological conditions. The methodology was tested using the U.S. Department of Energy Idaho National Laboratory (INL) Site ambient air monitoring network consisting of 37 low-volume air samplers in 31 different locations covering a 17,630 km² region. Releases from six major INL facilities distributed over an area of 1,435 km² were modeled and included three stack sources and eight ground-level sources. A Lagrangian Puff air dispersion model (CALPUFF) was used to model atmospheric transport. The model was validated using historical 125Sb releases and measurements. Relevant one-week release quantities from each emission source were calculated based on a dose of 1.9 × 10⁻⁴ mSv at a public receptor (0.01 mSv assuming the release persists over a year). Important radionuclides considered include 241Am, 137Cs, 238Pu, 239Pu, 90Sr, and tritium. Results show the detection frequency is over 97.5% for the entire network considering all sources and radionuclides. Network intensities ranged from 3.75% to 62.7%.
Evaluation of individual samplers indicated some samplers were poorly situated and added little to the overall effectiveness of the network. As a result, using the frequency of detection methods, optimum sampler placements were simulated that could substantially improve the performance and efficiency of the network.
Rood, Arthur S.; Sondrup, A. Jeffrey; Ritter, Paul D.
2016-04-01
A methodology to quantify the performance of an air monitoring network in terms of frequency of detection has been developed. The methodology utilizes an atmospheric transport model to predict air concentrations of radionuclides at the samplers for a given release time and duration. Frequency of detection is defined as the fraction of “events” that result in a detection at either a single sampler or network of samplers. An “event” is defined as a release of finite duration that begins on a given day and hour of the year from a facility with the potential to emit airborne radionuclides. Another metric of interest is the network intensity, which is defined as the fraction of samplers in the network that have a positive detection for a given event. The frequency of detection methodology allows for evaluation of short-term releases that include effects of short-term variability in meteorological conditions. The methodology was tested using the U.S. Department of Energy Idaho National Laboratory (INL) Site ambient air monitoring network consisting of 37 low-volume air samplers in 31 different locations covering a 17,630 km² region. Releases from six major INL facilities distributed over an area of 1,435 km² were modeled and included three stack sources and eight ground-level sources. A Lagrangian Puff air dispersion model (CALPUFF) was used to model atmospheric transport. The model was validated using historical 125Sb releases and measurements. Relevant one-week release quantities from each emission source were calculated based on a dose of 1.9 × 10⁻⁴ mSv at a public receptor (0.01 mSv assuming the release persists over a year). Important radionuclides considered include 241Am, 137Cs, 238Pu, 239Pu, 90Sr, and tritium. Results show the detection frequency is over 97.5% for the entire network considering all sources and radionuclides. Network intensities ranged from 3.75% to 62.7%.
Evaluation of individual samplers indicated some samplers were poorly situated and added little to the overall effectiveness of the network. As a result, using the frequency of detection methods, optimum sampler placements were simulated that could substantially improve the performance and efficiency of the network.
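The two network metrics defined in the record above follow directly from a matrix of per-event sampler results; the event data here are hypothetical:

```python
# detections[e][s] is True if modeled event e produced a detection at sampler s.
detections = [
    [True,  False, False, True],
    [False, False, False, False],
    [True,  True,  True,  False],
]

def detection_frequency(detections):
    """Fraction of events detected by at least one sampler in the network."""
    return sum(any(row) for row in detections) / len(detections)

def network_intensity(event_row):
    """Fraction of samplers with a positive detection for one event."""
    return sum(event_row) / len(event_row)

print(detection_frequency(detections))             # 2 of 3 events detected
print([network_intensity(r) for r in detections])  # [0.5, 0.0, 0.75]
```

In the study, each row of such a matrix comes from a CALPUFF transport run for one release start time; the sampler-level columns are also what identify poorly situated samplers, since a sampler that is rarely the detecting one adds little to the network.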
77 FR 7109 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-10
... assay (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six... loss of the only commercially available antigen-detection ELISA filovirus testing facility. Currently... current methodology (ELISA) used to test NHP liver samples. This cost determines the amount of the user...
ERIC Educational Resources Information Center
Azevedo, Roger; Moos, Daniel C.; Johnson, Amy M.; Chauncey, Amber D.
2010-01-01
Self-regulated learning (SRL) with hypermedia environments involves a complex cycle of temporally unfolding cognitive and metacognitive processes that impact students' learning. We present several methodological issues related to treating SRL as an event and strengths and challenges of using online trace methodologies to detect, trace, model, and…
Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F
2011-04-01
This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)] followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linear dynamic ranges (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained with the former, under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.
Alberdi-Cedeño, Jon; Ibargoitia, María L; Cristillo, Giovanna; Sopelana, Patricia; Guillén, María D
2017-04-15
The possibilities offered by a new methodology to determine minor components in edible oils are described. This is based on immersion of a solid-phase microextraction fiber of PDMS/DVB into the oil matrix, followed by Gas Chromatography/Mass Spectrometry. It enables characterization and differentiation of edible oils in a simple way, without either solvents or sample modification. This methodology allows simultaneous identification and quantification of sterols, tocols, hydrocarbons of different natures, fatty acids, esters, monoglycerides, fatty amides, aldehydes, ketones, alcohols, epoxides, furans, pyrans and terpenic oxygenated derivatives. The broad information provided by this methodology is useful for different areas of interest such as nutritional value, oxidative stability, technological performance, quality, processing, safety and even the prevention of fraudulent practices. Furthermore, for the first time, certain fatty amides, gamma- and delta-lactones of high molecular weight, and other aromatic compounds such as some esters derived from cinnamic acid have been detected in edible oils. Copyright © 2016 Elsevier Ltd. All rights reserved.
Yang, Zhen; Wang, Huanhuan; Guo, Pengfei; Ding, Yuanyuan; Lei, Chong; Luo, Yongsong
2018-06-01
Cardiac biomarkers (CBs) are substances that appear in the blood when the heart is damaged or stressed. Measurements of the level of CBs can be used in the course of diagnostics or in monitoring the health of at-risk persons. A multi-region bio-analytical system (MRBAS) based on magnetoimpedance (MI) changes was proposed for ultrasensitive simultaneous detection of the CBs myoglobin (Mb) and C-reactive protein (CRP). The microfluidic device was designed and developed using standard microfabrication techniques, with its different regions pre-modified with specific antibodies for specified detection. Mb and CRP antigens, attached to commercial Dynabead labels at selected concentrations, were trapped in different detection regions. The MI response of the triple sensitive element was carefully evaluated in the initial state and in the presence of biomarkers. The results showed that the MI-based bio-sensing system had high selectivity and sensitivity for detection of CBs. Compared with the control region, ultrasensitive detection of CRP and Mb was accomplished with detection limits of 1.0 pg/mL and 0.1 pg/mL, respectively. The linear detection range comprised a low-concentration area and a high-concentration area: 1 pg/mL-10 ng/mL and 10-100 ng/mL for CRP, and 0.1 pg/mL-1 ng/mL and 1 ng/mL-80 ng/mL for Mb. The measurement technique presented here provides a new methodology for rapid testing of multi-target biomolecules.
Review of current neutron detection systems for emergency response
Mukhopadhyay, Sanjoy; Maurer, Richard; Guss, Paul; ...
2014-09-05
Neutron detectors are utilized in a myriad of applications—from safeguarding special nuclear materials (SNM) to determining lattice spacing in soft materials. The transformational changes taking place in neutron detection and imaging techniques in the last few years are largely being driven by the global shortage of helium-3 (3He). This article reviews the status of neutron sensors used specifically for SNM detection in radiological emergency response. These neutron detectors must be highly efficient, be rugged, have fast electronics to measure neutron multiplicity, and be capable of measuring direction of the neutron sources and possibly image them with high spatial resolution. Neutron detection is an indirect physical process: neutrons react with nuclei in materials to initiate the release of one or more charged particles that produce electric signals that can be processed by the detection system. Therefore, neutron detection requires conversion materials as active elements of the detection system; these materials may include boron-10 (10B), lithium-6 (6Li), and gadolinium-157 (157Gd), to name a few, but the number of materials available for neutron detection is limited. However, in recent years, pulse-shape-discriminating plastic scintillators, scintillators made of helium-4 (4He) under high pressure, pillar and trench semiconductor diodes, and exotic semiconductor neutron detectors made from uranium oxide and other materials have widely expanded the parameter space in neutron detection methodology. In this article we will pay special attention to semiconductor-based neutron sensors. Finally, modern microfabricated nanotubes covered inside with neutron converter materials and with very high aspect ratios for better charge transport will be discussed.
Review of current neutron detection systems for emergency response
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sanjoy; Maurer, Richard; Guss, Paul; Kruschwitz, Craig
2014-09-01
Neutron detectors are used in a myriad of applications—from safeguarding special nuclear materials (SNM) to determining lattice spacing in soft materials. The transformational changes taking place in neutron detection and imaging techniques in the last few years are largely being driven by the global shortage of helium-3 (3He). This article reviews the status of neutron sensors used specifically for SNM detection in radiological emergency response. These neutron detectors must be highly efficient, be rugged, have fast electronics to measure neutron multiplicity, and be capable of measuring direction of the neutron sources and possibly image them with high spatial resolution. Neutron detection is an indirect physical process: neutrons react with nuclei in materials to initiate the release of one or more charged particles that produce electric signals that can be processed by the detection system. Therefore, neutron detection requires conversion materials as active elements of the detection system; these materials may include boron-10 (10B), lithium-6 (6Li), and gadolinium-157 (157Gd), to name a few, but the number of materials available for neutron detection is limited. However, in recent years, pulse-shape-discriminating plastic scintillators, scintillators made of helium-4 (4He) under high pressure, pillar and trench semiconductor diodes, and exotic semiconductor neutron detectors made from uranium oxide and other materials have widely expanded the parameter space in neutron detection methodology. In this article we will pay special attention to semiconductor-based neutron sensors. Modern microfabricated nanotubes covered inside with neutron converter materials and with very high aspect ratios for better charge transport will be discussed.
Ayaydin, Ferhan; Kotogány, Edit; Ábrahám, Edit; Horváth, Gábor V
2017-01-01
Deepening our knowledge of the regulation of the plant cell division cycle depends on techniques that allow for the enrichment of cell populations in defined cell cycle phases. Synchronization of cell division can be achieved using different plant tissues; however, well-established cell suspension cultures provide a large amount of biological sample for further analyses. Here, we describe the methodology for the establishment, propagation, and analysis of a Medicago sativa suspension culture that can be used for efficient synchronization of cell division. A novel 5-ethynyl-2'-deoxyuridine (EdU)-based method is used to estimate the fraction of cells entering the DNA synthesis phase of the cell cycle, and we also demonstrate the changes in the phosphorylation level of the Medicago sativa retinoblastoma-related protein (MsRBR1) during cell cycle progression.
[Prenatal Diagnosis: Evolution of Clinical Indications and Society in the Past 30 Years].
Sagredo, José Miguel García
2014-01-01
Here we report the results of prenatal diagnosis at the Hospital Universitario Ramón y Cajal from its opening in 1979 until 2010, establishing a parallel between the different methodologies for screening and prenatal diagnosis, clinical indications, and demographic changes. The results show how the indications varied as the structure of the population changed, shifts made possible by advances in screening and prenatal diagnosis methods. This demonstrates, once again, how procedures evolve with technology and adapt to demography. This evolution has made prenatal diagnosis more effective: more precise clinical indications made it possible to detect the same number of fetuses with chromosomal abnormalities while performing fewer invasive procedures, optimizing prenatal diagnosis, saving resources and personnel and, above all, avoiding unnecessary fetal losses.
Detection of early changes in lung cell cytology by flow-systems analysis techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinkamp, J.A.; Hansen, K.M.; Wilson, J.S.
1976-12-01
This report summarizes results of continuing experiments to develop cytological and biochemical indicators for estimating damage to respiratory cells in test animals exposed by inhalation to toxic agents associated with nonnuclear energy production, the specific goal being the application of advanced multiparameter flow-systems technologies to the detection of early atypical cellular changes in lung epithelium. Normal Syrian hamster lung cell samples composed of macrophages, leukocytes, ciliated columnar cells, and epithelial cells were stained with fluorescent dyes specific for different biochemical parameters and were analyzed in liquid suspension as they flowed through a chamber intersecting a laser beam of exciting light. Multiple sensors measured the total or two-color fluorescence and light scatter on a cell-by-cell basis. Cellular parameters proportional to optical measurements (i.e., cell size, DNA content, total protein, nonspecific esterase activity, nuclear and cytoplasmic diameters) were displayed as frequency distribution histograms. Lung cell samples were also separated according to various cytological parameters and identified microscopically. The basic operating features of the methodology are discussed briefly, along with specific examples of preliminary results illustrating the initial characterization of exfoliated pulmonary cells from normal hamsters. As the flow technology is adapted further to the analysis of respiratory cells, measurements of changes in physical and biochemical properties as a function of exposure to toxic agents will be performed.
Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.
Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio
2018-01-01
Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification, where they compete with classical immunoaffinity techniques. They provide a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides, typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics, particularly for non-model species, since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop an MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database-underrepresented species Eriobotrya japonica Lindl.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2011-01-01
The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
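The 90/95 POD criterion can be made concrete with an exact binomial (Clopper-Pearson) lower confidence bound on the detection probability at a given flaw size. The sketch below is our illustration of that standard calculation, not the DOEPOD procedure itself:

```python
from math import comb

def pod_lower_bound(hits: int, trials: int, confidence: float = 0.95) -> float:
    """Exact (Clopper-Pearson) lower confidence bound on probability of detection.

    Solves sum_{i=hits..trials} C(trials,i) p^i (1-p)^(trials-i) = 1 - confidence
    for p by bisection; returns 0.0 when there are no hits.
    """
    if hits == 0:
        return 0.0
    alpha = 1.0 - confidence

    def upper_tail(p: float) -> float:
        # P(X >= hits) for X ~ Binomial(trials, p); increasing in p
        return sum(comb(trials, i) * p**i * (1 - p)**(trials - i)
                   for i in range(hits, trials + 1))

    lo, hi = 0.0, 1.0
    for _ in range(100):  # bisection; interval shrinks by 2^-100
        mid = (lo + hi) / 2
        if upper_tail(mid) < alpha:
            lo = mid  # tail probability too small: the bound lies higher
        else:
            hi = mid
    return lo

# 29 hits in 29 trials just clears the 90/95 criterion; one miss does not.
print(pod_lower_bound(29, 29) >= 0.90)  # True
print(pod_lower_bound(28, 29) >= 0.90)  # False
```

With no misses the bound reduces to the closed form `0.05 ** (1 / n)`, which is where the classical "29 of 29 detections" demonstration for 90/95 POD comes from.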
Automatic detection of freezing of gait events in patients with Parkinson's disease.
Tripoliti, Evanthia E; Tzallas, Alexandros T; Tsipouras, Markos G; Rigas, George; Bougia, Panagiota; Leontiou, Michael; Konitsiotis, Spiros; Chondrogiorgi, Maria; Tsouli, Sofia; Fotiadis, Dimitrios I
2013-04-01
The aim of this study is to detect freezing of gait (FoG) events in patients suffering from Parkinson's disease (PD) using signals received from wearable sensors (six accelerometers and two gyroscopes) placed on the patients' body. For this purpose, an automated methodology has been developed which consists of four stages. In the first stage, missing values due to signal loss or degradation are replaced; then (second stage) low-frequency components of the raw signal are removed. In the third stage, the entropy of the raw signal is calculated. Finally (fourth stage), four classification algorithms have been tested (Naïve Bayes, Random Forests, Decision Trees and Random Tree) in order to detect the FoG events. The methodology has been evaluated using several different configurations of sensors in order to identify the set of sensors which produces optimal FoG episode detection. Signals were recorded from five healthy subjects, five patients with PD who presented the symptom of FoG, and six patients who suffered from PD but did not present FoG events. The signals included 93 FoG events with 405.6 s total duration. The results indicate that the proposed methodology is able to detect FoG events with 81.94% sensitivity, 98.74% specificity, 96.11% accuracy and 98.6% area under curve (AUC) using the signals from all sensors and the Random Forests classification algorithm. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
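The first three stages of such a pipeline can be sketched as follows. This is a simplified, hypothetical illustration (window sizes and bin counts are our assumptions, not the authors' parameters); the fourth stage would feed the per-window entropies, alongside other features, to one of the classifiers named above:

```python
import math

def impute(signal):
    """Stage 1: replace missing samples (None) with the last valid value."""
    out, last = [], 0.0
    for x in signal:
        last = x if x is not None else last
        out.append(last)
    return out

def remove_low_freq(signal, window=5):
    """Stage 2: subtract a moving average to suppress low-frequency drift."""
    half = window // 2
    detrended = []
    for i, x in enumerate(signal):
        seg = signal[max(0, i - half):i + half + 1]
        detrended.append(x - sum(seg) / len(seg))
    return detrended

def window_entropy(window_vals, bins=8):
    """Stage 3: Shannon entropy of the amplitude histogram of one window."""
    lo, hi = min(window_vals), max(window_vals)
    if hi == lo:
        return 0.0  # constant window carries no information
    counts = [0] * bins
    for x in window_vals:
        idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(window_vals)
    return -sum(c / n * math.log2(c / n) for c in counts if c)
```

A quiescent limb yields near-zero entropy while tremulous, irregular motion yields high entropy, which is what makes the entropy feature discriminative for FoG episodes.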
Rapid quantitation of neuraminidase inhibitor drug resistance in influenza virus quasispecies.
Lackenby, Angie; Democratis, Jane; Siqueira, Marilda M; Zambon, Maria C
2008-01-01
Emerging resistance of influenza viruses to neuraminidase inhibitors is a concern, both in surveillance of global circulating strains and in treatment of individual patients. Current methodologies to detect resistance rely on the use of cultured virus, thus taking time to complete or lacking the sensitivity to detect mutations in viral quasispecies. Methodology for rapid detection of clinically meaningful resistance is needed to assist individual patient management and to track the transmission of resistant viruses in the community. We have developed a pyrosequencing methodology to detect and quantitate influenza neuraminidase inhibitor resistance mutations in cultured virus and directly in clinical material. Our assays target polymorphisms associated with drug resistance in the neuraminidase genes of human influenza A H1N1 as well as human and avian H5N1 viruses. Quantitation can be achieved using viral RNA extracted directly from respiratory or tissue samples, thus eliminating the need for virus culture and allowing the assay of highly pathogenic viruses such as H5N1 without high containment laboratory facilities. Antiviral-resistant quasispecies are detected and quantitated accurately when present in the total virus population at levels as low as 10%. Pyrosequencing is a real-time assay; therefore, results can be obtained within a clinically relevant timeframe and provide information capable of informing individual patient or outbreak management. Pyrosequencing is ideally suited for early identification of emerging antiviral resistance in human and avian influenza infection and is a useful tool for laboratory surveillance and pandemic preparedness.
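The quantitation step amounts to estimating the mutant fraction from the relative signal at the polymorphic position and comparing it against the reported 10% detection floor. The sketch below is illustrative only; the signal values, function names, and calibration are hypothetical, not the published assay:

```python
def mutant_fraction(mutant_signal: float, wildtype_signal: float) -> float:
    """Fraction of the virus population carrying the resistance mutation,
    estimated from relative peak intensities at the polymorphic position."""
    total = mutant_signal + wildtype_signal
    if total == 0:
        raise ValueError("no signal at the polymorphic position")
    return mutant_signal / total

def flag_resistant(mutant_signal: float, wildtype_signal: float,
                   floor: float = 0.10) -> bool:
    """Report resistance when the mutant quasispecies is at or above the
    quantitation floor (10% of the total population, per the abstract)."""
    return mutant_fraction(mutant_signal, wildtype_signal) >= floor

print(flag_resistant(12, 88))  # True  - 12% mutant, above the floor
print(flag_resistant(5, 95))   # False - 5% mutant, below the floor
```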
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bri Rolston
2005-06-01
Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, across both the black hat and white hat communities.
Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.
ERIC Educational Resources Information Center
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-01-01
Purpose: Case-study methodology studying change is often used in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable…
Lykins, Amy D; Meana, Marta; Kambe, Gretchen
2006-10-01
As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images while their eye movements were tracked during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.
Footprint Map Partitioning Using Airborne Laser Scanning Data
NASA Astrophysics Data System (ADS)
Xiong, B.; Oude Elberink, S.; Vosselman, G.
2016-06-01
Nowadays many cities and countries are creating 3D building models for better daily management and smarter decision making. The newly created 3D models are required to be consistent with existing 2D footprint maps, so the 2D maps are usually combined with height data for the task of 3D reconstruction. Many buildings are composed of parts that are discontinuous over height. Building parts can be reconstructed independently and combined into a complete building. Therefore, most state-of-the-art work on 3D building reconstruction first decomposes a footprint map into parts. However, those works usually alter the footprint maps for easier partitioning and cannot detect building parts that lie fully inside the footprint polygon. To solve these problems, we introduce two methodologies, one more dependent on height data and the other more dependent on footprints. We also experimentally evaluate the two methodologies and compare their advantages and disadvantages. The experiments use Airborne Laser Scanning (ALS) data and two vector maps, one at 1:10,000 scale and another at 1:500 scale.
NASA Technical Reports Server (NTRS)
Volponi, Al; Simon, Donald L. (Technical Monitor)
2008-01-01
A key technological concept for producing reliable engine diagnostics and prognostics exploits the benefits of fusing sensor data, information, and/or processing algorithms. This report describes the development of a hybrid engine model for a propulsion gas turbine engine, which is the result of fusing two diverse modeling methodologies: a physics-based model approach and an empirical model approach. The report describes the process and methods involved in deriving and implementing a hybrid model configuration for a commercial turbofan engine. Among the intended uses for such a model is to enable real-time, on-board tracking of engine module performance changes and engine parameter synthesis for fault detection and accommodation.
Sormani, Maria Pia
2017-03-01
Multiple sclerosis is a highly heterogeneous disease; the quantitative assessment of disease progression is problematic for many reasons, including the lack of objective methods to measure disability and the long follow-up times needed to detect relevant and stable changes. For these reasons, prognostic markers, markers of response to treatment, and surrogate endpoints are crucial in multiple sclerosis research. The aim of this report is to clarify some basic definitions and methodological issues about baseline factors to be considered prognostic markers or markers of response to treatment, and to define the dynamic role that variables must have to be considered surrogate markers in relation to specific treatments.
NASA Astrophysics Data System (ADS)
Martinis, Sandro; Clandillon, Stephen; Twele, André; Huber, Claire; Plank, Simon; Maxant, Jérôme; Cao, Wenxi; Caspard, Mathilde; May, Stéphane
2016-04-01
Optical and radar satellite remote sensing have proven to provide essential crisis information in case of natural disasters, humanitarian relief activities and civil security issues in a growing number of cases through mechanisms such as the Copernicus Emergency Management Service (EMS) of the European Commission or the International Charter 'Space and Major Disasters'. The aforementioned programs and initiatives make use of satellite-based rapid mapping services aimed at delivering reliable and accurate crisis information after natural hazards. Although these services are increasingly operational, they need to be continuously updated and improved through research and development (R&D) activities. The principal objective of ASAPTERRA (Advancing SAR and Optical Methods for Rapid Mapping), the ESA-funded R&D project being described here, is to improve, automate and, hence, speed-up geo-information extraction procedures in the context of natural hazards response. This is performed through the development, implementation, testing and validation of novel image processing methods using optical and Synthetic Aperture Radar (SAR) data. The methods are mainly developed based on data of the German radar satellites TerraSAR-X and TanDEM-X, the French satellite missions Pléiades-1A/1B as well as the ESA missions Sentinel-1/2 with the aim to better characterize the potential and limitations of these sensors and their synergy. The resulting algorithms and techniques are evaluated in real case applications during rapid mapping activities. The project is focussed on three types of natural hazards: floods, landslides and fires. Within this presentation an overview of the main methodological developments in each topic is given and demonstrated in selected test areas. 
The following developments are presented in the context of flood mapping: a fully automated Sentinel-1 based processing chain for detecting open flood surfaces; a method for the improved detection of flooded vegetation in Sentinel-1 data using Entropy/Alpha decomposition, unsupervised Wishart classification, and object-based post-classification; as well as semi-automatic approaches for extracting inundated areas and flood traces in rural and urban areas from VHR and HR optical imagery using machine learning techniques. Methodological developments related to fires are the implementation of fast and robust methods for mapping burnt scars using change detection procedures based on SAR (Sentinel-1, TerraSAR-X) and HR optical (e.g. SPOT, Sentinel-2) data, as well as the extraction of 3D surface and volume change information from Pléiades stereo-pairs. In the context of landslides, fast and transferable change detection procedures based on SAR (TerraSAR-X) and optical (SPOT) data, as well as methods for extracting the extent of landslides based only on polarimetric VHR SAR (TerraSAR-X) data, are presented.
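As a minimal illustration of the kind of SAR change detection referenced above (our sketch, not the project's algorithms), a per-pixel log-ratio change index over pre- and post-event backscatter can be thresholded automatically, e.g. with Otsu's method:

```python
import math

def log_ratio(pre, post, eps=1e-6):
    """Per-pixel change index: |log(post/pre)| is large where backscatter changed."""
    return [abs(math.log((b + eps) / (a + eps))) for a, b in zip(pre, post)]

def otsu_threshold(values, bins=64):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return lo
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / (hi - lo) * bins), bins - 1)] += 1
    total = len(values)
    sum_all = sum(i * c for i, c in enumerate(counts))
    best_t, best_var, w0, sum0 = lo, -1.0, 0, 0.0
    for i in range(bins - 1):
        w0 += counts[i]          # pixels below candidate threshold
        if w0 == 0:
            continue
        w1 = total - w0          # pixels above candidate threshold
        if w1 == 0:
            break
        sum0 += i * counts[i]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var = between_var
            best_t = lo + (i + 1) * (hi - lo) / bins
    return best_t

# Toy scene: the last three pixels lost backscatter after the event (e.g. a burnt scar).
pre = [1.0, 1.1, 0.9, 1.0, 1.0, 1.0]
post = [1.0, 1.0, 1.0, 0.2, 0.25, 0.2]
index = log_ratio(pre, post)
threshold = otsu_threshold(index)
change_mask = [v > threshold for v in index]  # True marks changed pixels
```

The log-ratio index is preferred over a plain difference for SAR because it is robust to the multiplicative speckle noise of radar imagery.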
T-Pattern Analysis and Cognitive Load Manipulation to Detect Low-Stake Lies: An Exploratory Study.
Diana, Barbara; Zurloni, Valentino; Elia, Massimiliano; Cavalera, Cesare; Realdon, Olivia; Jonsson, Gudberg K; Anguera, M Teresa
2018-01-01
Deception has evolved to become a fundamental aspect of human interaction. Despite the prolonged efforts in many disciplines, there has been no definite finding of a univocally "deceptive" signal. This work proposes an approach to deception detection combining cognitive load manipulation and T-pattern methodology with the objective of: (a) testing the efficacy of the dual-task procedure in enhancing differences between truth tellers and liars in a low-stakes situation; (b) exploring the efficacy of T-pattern methodology in discriminating truthful reports from deceitful ones in a low-stakes situation; (c) setting the experimental design and procedure for subsequent research. We manipulated cognitive load to enhance differences between truth tellers and liars, because of the low-stakes lies involved in our experiment. We conducted an experimental study with a convenience sample of 40 students. We carried out a first analysis on the behaviors' frequencies coded through the observation software, using SPSS (22). The aim was to describe the shape and characteristics of the behavior distributions and explore differences between groups. Datasets were then analyzed with Theme 6.0 software, which detects repeated patterns (T-patterns) of coded events (non-verbal behaviors) that regularly or irregularly occur within a period of observation. A descriptive analysis of T-pattern frequencies was carried out to explore differences between groups. An in-depth analysis of more complex patterns was performed to get qualitative information on the behavior structure expressed by the participants. Results show that the dual-task procedure enhances differences observed between liars and truth tellers with T-pattern methodology; moreover, T-pattern detection reveals a higher variety and complexity of behavior in truth tellers than in liars.
These findings support the combination of cognitive load manipulation and T-pattern methodology for deception detection in low-stakes situations, suggesting the testing of directional hypotheses on a larger probabilistic sample of the population. PMID:29551986
Kalinina, A M; Ipatov, P V; Kaminskaya, A K; Kushunina, D V
2015-01-01
To study the efficiency of a methodology for the active detection of coronary heart disease (CHD) and cerebrovascular diseases (CVD) during medical examination and to determine the need and possible ways of its improvement. The medical examinations of 19.4 million people (94.6% of all the citizens who had undergone medical examinations in all the regions of Russia in 2013) were analyzed, and the methodological aspects of identification of the circulatory diseases (CDs) that were induced by coronary and cerebral vessel atherosclerosis and had common risk factors, primarily CHD and CVD, were assessed. The medical examinations revealed 2,915,445 cases of CDs or suspected CDs; a clinical diagnosis was established in 57.2% of these, the remainder being suspected diseases whose diagnosis required further specification. The medical examinations revealed hypertension in more than 770,000 cases, CHD in 232,000, and CVD in 146,000. The proportion of stable angina pectoris among all angina cases was much higher at a young age (25.6%) than at middle (15.6%) and elderly (11.3%) ages. Brachiocephalic artery stenoses were detected in almost 13,000 cases. According to the official health statistics, within the years preceding the introduction of large-scale medical examinations, there was a slight rise in new CD cases among the adult population of Russia, which was more significant in 2013 (according to the preliminary data) than in 2012. The methodology for the active detection of CDs through a two-step medical examination, which is used during a follow-up, makes it possible to substantially increase detection rates for CDs. A need for better quality and completeness of diagnostic examination in real practice has been shown.
Monoclonal antibody technologies and rapid detection assays
USDA-ARS?s Scientific Manuscript database
Novel methodologies and screening strategies will be outlined on the use of hybridoma technology for the selection of antigen specific monoclonal antibodies. The development of immunoassays used for diagnostic detection of prions and bacterial toxins will be discussed and examples provided demonstr...
Research Methodology on Language Development from a Complex Systems Perspective
ERIC Educational Resources Information Center
Larsen-Freeman, Diane; Cameron, Lynne
2008-01-01
Changes to research methodology motivated by the adoption of a complexity theory perspective on language development are considered. The dynamic, nonlinear, and open nature of complex systems, together with their tendency toward self-organization and interaction across levels and timescales, requires changes in traditional views of the functions…
Change--how to remove the fear, resentment, and resistance.
Weitz, A J
1995-11-01
This article introduces active learning, which is an innovative education methodology for the workplace classroom. It is used to help people remove their fear, resentment, and resistance to the change process itself. Active learning makes education more effective compared with the predominantly used traditional lecture-type teaching methodology.
Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent
2015-01-01
The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
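The building block of this framework, continuous residence times at a receiver, can be sketched as follows. This is a minimal illustration under our own assumptions (hypothetical timestamps and gap threshold); the survival analysis applied to the resulting durations is not shown:

```python
def continuous_residence_times(detections, max_gap):
    """Split a sequence of detection timestamps (e.g. seconds) at one receiver
    into continuous residence times: a new residence starts whenever the gap
    between consecutive detections exceeds max_gap."""
    if not detections:
        return []
    times = sorted(detections)
    durations, start, prev = [], times[0], times[0]
    for t in times[1:]:
        if t - prev > max_gap:
            durations.append(prev - start)  # close the current residence
            start = t                       # open a new one
        prev = t
    durations.append(prev - start)
    return durations

# Detections at one receiver: two visits separated by a long absence.
stamps = [0, 60, 120, 3600, 3660]
print(continuous_residence_times(stamps, max_gap=600))  # [120, 60]
```

Increasing `max_gap` merges short excursions out of detection range into a single larger-timescale residence, which is precisely the timescale knob the survival analysis of the paper operates on.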
ERIC Educational Resources Information Center
Taft, Laritza M.
2010-01-01
In its report "To Err is Human", The Institute of Medicine recommended the implementation of internal and external voluntary and mandatory automatic reporting systems to increase detection of adverse events. Knowledge Discovery in Databases (KDD) allows the detection of patterns and trends that would be hidden or less detectable if analyzed by…
A Conceptual Framework for Detecting Cheating in Online and Take-Home Exams
ERIC Educational Resources Information Center
D'Souza, Kelwyn A.; Siegfeldt, Denise V.
2017-01-01
Selecting the right methodology to use for detecting cheating in online exams requires considerable time and effort due to a wide variety of scholarly publications on academic dishonesty in online education. This article offers a cheating detection framework that can serve as a guideline for conducting cheating studies. The necessary theories and…
Salary Equity: Detecting Sex Bias in Salaries among College and University Professors.
ERIC Educational Resources Information Center
Pezzullo, Thomas R., Ed.; Brittingham, Barbara E., Ed.
Sex bias in college faculty salaries is examined in this book. Part 1 contains the following four chapters on the use of multiple regression to detect and estimate sex bias in salaries: "The Assessment of Salary Equity: A Methodology, Alternatives, and a Dilemma" (Thomas R. Pezzullo and Barbara E. Brittingham); "Detection of Sex-Related Salary…
Data collection and analysis strategies for phMRI.
Mandeville, Joseph B; Liu, Christina H; Vanduffel, Wim; Marota, John J A; Jenkins, Bruce G
2014-09-01
Although functional MRI traditionally has been applied mainly to study changes in task-induced brain function, evolving acquisition methodologies and improved knowledge of signal mechanisms have increased the utility of this method for studying responses to pharmacological stimuli, a technique often dubbed "phMRI". The proliferation of higher magnetic field strengths and the use of exogenous contrast agent have boosted detection power, a critical factor for successful phMRI due to the restricted ability to average multiple stimuli within subjects. Receptor-based models of neurovascular coupling, including explicit pharmacological models incorporating receptor densities and affinities and data-driven models that incorporate weak biophysical constraints, have demonstrated compelling descriptions of phMRI signal induced by dopaminergic stimuli. This report describes phMRI acquisition and analysis methodologies, with an emphasis on data-driven analyses. As an example application, statistically efficient data-driven regressors were used to describe the biphasic response to the mu-opioid agonist remifentanil, and antagonism using dopaminergic and GABAergic ligands revealed modulation of the mesolimbic pathway. Results illustrate the power of phMRI as well as our incomplete understanding of mechanisms underlying the signal. Future directions are discussed for phMRI acquisitions in human studies, for evolving analysis methodologies, and for interpretative studies using the new generation of simultaneous PET/MRI scanners. This article is part of the Special Issue Section entitled 'Neuroimaging in Neuropharmacology'. Copyright © 2014 Elsevier Ltd. All rights reserved.
Women's experiences of continuous fetal monitoring - a mixed-methods systematic review.
Crawford, Alexandra; Hayes, Dexter; Johnstone, Edward D; Heazell, Alexander E P
2017-12-01
Antepartum stillbirth is often preceded by detectable signs of fetal compromise, including changes in fetal heart rate and movement. It is hypothesized that continuous fetal monitoring could detect these signs more accurately and objectively than current forms of fetal monitoring and allow for timely intervention. This systematic review aimed to explore available evidence on women's experiences of continuous fetal monitoring to investigate its acceptability before clinical implementation and to inform clinical studies. Systematic searching of four electronic databases (Embase, PsycINFO, MEDLINE and CINAHL), using key terms defined by initial scoping searches, identified a total of 35 studies. Following title and abstract screening by two independent researchers, five studies met the inclusion criteria. Studies were not excluded based on language, methodology or quality assessment. An integrative methodology was used to synthesize qualitative and quantitative data together. Forms of continuous fetal monitoring used included Monica AN24 monitors (n = 4) and phonocardiography (n = 1). Four main themes were identified: practical limitations of the device, negative emotions, positive perceptions, and device implementation. Continuous fetal monitoring was reported to have high levels of participant satisfaction and was preferred by women to intermittent cardiotocography. This review suggests that continuous fetal monitoring is accepted by women. However, it has also highlighted both the paucity and heterogeneity of current studies and suggests that further research should be conducted into women's experiences of continuous fetal monitoring before such devices can be used clinically. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.
2016-01-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
Fluorescent sensor based models for the detection of environmentally-related toxic heavy metals.
Rasheed, Tahir; Bilal, Muhammad; Nabeel, Faran; Iqbal, Hafiz M N; Li, Chuanlong; Zhou, Yongfeng
2018-02-15
The quest for industrial and biotechnological revolution has contributed to increasing environmental contamination worldwide. The controlled or uncontrolled release of hazardous pollutants from various industrial sectors is one of the key problems facing humanity. Among these pollutants, the adverse influences of lead, cadmium, and mercury on human health are well known, causing reproductive, neurological, endocrine, and cardiovascular disorders, among others. Even when present at low concentrations, most of these toxic heavy metals pose noteworthy toxicological concerns. Despite notable efforts by various regulatory authorities, the increase in the concentration of these toxic heavy metals in the environment is of serious concern, so real-time monitoring is urgently required. This necessitates the exploration of novel and efficient probes for the recognition of these toxic agents. Among the various methodologies adopted for tailoring such probes, those based on changes in spectral properties are generally preferred for the ease they bring to the recognition process. Accordingly, a promising modality has emerged in the form of ratiometric and colorimetric monitoring of these toxic agents. Herein, we review fluorescent sensor based models and their potential to address the detection of hazardous pollutants for a cleaner environment. Second, recent advances regarding small-molecule and rhodamine-based fluorescent sensors and ratiometric and colorimetric probes are discussed. Information is also given on the photoinduced electron transfer (PET) mechanism, the chelation-enhanced fluorescence (CHEF) effect and the spirocyclic ring-opening mechanism. Copyright © 2017 Elsevier B.V. All rights reserved.
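The ratiometric principle mentioned above can be illustrated with a small sketch. It assumes a textbook 1:1 binding isotherm with no additional instrument scaling factor, which is the simplest form of ratiometric calibration; the function names and all numeric values are hypothetical, not from the review.

```python
def intensity_ratio(i_bound: float, i_free: float) -> float:
    """Ratio of emission intensities at two bands. Because both bands scale
    together with probe concentration and lamp power, those fluctuations
    cancel in the ratio (self-referencing read-out)."""
    return i_bound / i_free

def concentration_from_ratio(r: float, r_min: float, r_max: float, kd: float) -> float:
    """Invert a 1:1 binding isotherm: [M] = Kd * (R - Rmin) / (Rmax - R),
    where Rmin/Rmax are the ratios of the metal-free and metal-saturated
    probe. Simplified: no band-specific scaling factor is included."""
    return kd * (r - r_min) / (r_max - r)

# Drift in overall intensity leaves the ratio, and the inferred [M], unchanged:
r = intensity_ratio(300.0, 150.0)
r_drifted = intensity_ratio(300.0 * 0.8, 150.0 * 0.8)
assert abs(r - r_drifted) < 1e-12
print(concentration_from_ratio(r, r_min=0.5, r_max=4.0, kd=1e-6))
```

This built-in drift rejection is the practical reason ratiometric probes are preferred over single-wavelength intensity read-outs for field monitoring.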
Kloefkorn, Heidi E; Pettengill, Travis R; Turner, Sara M F; Streeter, Kristi A; Gonzalez-Rothi, Elisa J; Fuller, David D; Allen, Kyle D
2017-03-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns.
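The frame-rate effect reported above (temporal errors at low fps for both AGATHA and manual digitization) follows directly from quantization: a gait event is only observed at the next captured frame. The sketch below illustrates this; it is not the AGATHA code, and the event times are hypothetical.

```python
import math

def observed_duration(t_on: float, t_off: float, fps: float) -> float:
    """Onset and offset of a gait event (e.g., stance) only register at the
    next captured frame, so durations measured from video are quantized to
    the frame period 1/fps."""
    on = math.ceil(t_on * fps) / fps
    off = math.ceil(t_off * fps) / fps
    return off - on

# True stance: 0.1234 s -> 0.2468 s (duration 0.1234 s).
true_dur = 0.2468 - 0.1234
for fps in (1000, 125, 50):
    err = abs(observed_duration(0.1234, 0.2468, fps) - true_dur)
    # Each endpoint can be off by up to one frame period, bounding the error.
    print(f"{fps:4d} fps: error {err * 1000:.1f} ms (frame period {1000 / fps:.1f} ms)")
```

The error bound of roughly one frame period per endpoint is consistent with the abstract's finding that temporal accuracy degrades at low frame rates while spatial variables are unaffected.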
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.
1994-10-01
The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.
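A deliberately simplified sketch of how a calculational bias derived from benchmark criticals feeds into a safety limit is given below. Real CSE practice uses formal statistical methods (e.g., tolerance limits per consensus standards), not a raw standard deviation, and the function name and all numbers here are hypothetical.

```python
import statistics

def upper_subcritical_limit(benchmark_keff: list, admin_margin: float = 0.05) -> float:
    """Illustrative bias treatment: bias = mean(calculated k_eff for benchmark
    critical experiments) - 1.0. Only a negative (non-conservative) bias is
    credited. USL = 1.0 + bias - bias_uncertainty - administrative margin."""
    bias = statistics.mean(benchmark_keff) - 1.0
    bias = min(bias, 0.0)                   # never take credit for a positive bias
    unc = statistics.stdev(benchmark_keff)  # crude stand-in for bias uncertainty
    return 1.0 + bias - unc - admin_margin

# Hypothetical calculated k_eff values for five benchmark criticals:
keffs = [0.998, 1.001, 0.995, 0.999, 0.997]
print(round(upper_subcritical_limit(keffs), 4))
```

The abstract's demand for more benchmark criticals maps directly onto this sketch: more benchmarks within the range of applicability tighten the uncertainty term and justify the stated area of applicability.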
DOT National Transportation Integrated Search
1976-04-01
The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...
DOT National Transportation Integrated Search
2014-07-01
This report presents a vibration-based damage-detection methodology that is capable of effectively capturing crack growth near connections and crack re-initiation of retrofitted connections. The proposed damage detection algorithm...
2017-09-01
The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence
Stockham, Braden E.
...forensic science resources, law enforcement methodologies and procedures, and basic investigative training. In order to determine if these changes would…
Poulsen, Nicklas N; Pedersen, Morten E; Østergaard, Jesper; Petersen, Nickolaj J; Nielsen, Christoffer T; Heegaard, Niels H H; Jensen, Henrik
2016-09-20
Detection of immune responses is important in the diagnosis of many diseases. For example, the detection of circulating autoantibodies against double-stranded DNA (dsDNA) is used in the diagnosis of Systemic Lupus Erythematosus (SLE). It is, however, difficult to reach satisfactory sensitivity, specificity, and accuracy with established assays. Also, existing methodologies for quantification of autoantibodies are challenging to transfer to a point-of-care setting. Here we present the use of flow-induced dispersion analysis (FIDA) for rapid (minutes) measurement of autoantibodies against dsDNA. The assay is based on Taylor dispersion analysis (TDA) and is fully automated with the use of standard capillary electrophoresis (CE) based equipment employing fluorescence detection. It is robust toward matrix effects as demonstrated by the direct analysis of samples composed of up to 85% plasma derived from human blood samples, and it allows for flexible exchange of the DNA sequences used to probe for the autoantibodies. Plasma samples from SLE positive patients were analyzed using the new FIDA methodology as well as by standard indirect immunofluorescence and solid-phase immunoassays. Interestingly, the patient antibodies bound DNA sequences with different affinities, suggesting pronounced heterogeneity among autoantibodies produced in SLE. The FIDA based methodology is a new approach for autoantibody detection and holds promise for being used for patient stratification and monitoring of disease activity.
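FIDA builds on Taylor dispersion analysis, where the width of the dispersed sample peak (the "taylorgram") yields an apparent diffusion coefficient, and binding of an antibody to the labeled DNA probe shows up as a change in apparent size. The sketch below uses the textbook Taylor-Aris and Stokes-Einstein relations; the capillary dimensions and peak widths are hypothetical, and this is not the authors' processing code.

```python
import math

def diffusion_from_taylorgram(t_r: float, sigma_t: float, r_c: float) -> float:
    """Taylor-Aris relation for a taylorgram: D = r_c^2 * t_r / (24 * sigma_t^2),
    with residence time t_r (s), temporal peak standard deviation sigma_t (s),
    and capillary radius r_c (m). Valid in the Taylor regime (long residence
    time, laminar flow)."""
    return (r_c ** 2) * t_r / (24.0 * sigma_t ** 2)

def hydrodynamic_radius(d: float, temp_k: float = 298.15, eta: float = 8.9e-4) -> float:
    """Stokes-Einstein: R_h = k_B * T / (6 * pi * eta * D); default viscosity
    is that of water at 25 C."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temp_k / (6.0 * math.pi * eta * d)

# Hypothetical run: 37.5 um capillary radius, 120 s residence, 4 s peak sigma.
d = diffusion_from_taylorgram(t_r=120.0, sigma_t=4.0, r_c=37.5e-6)
print(f"apparent D  = {d:.3e} m^2/s")
print(f"apparent Rh = {hydrodynamic_radius(d) * 1e9:.2f} nm")
```

In a binding assay, complex formation slows diffusion, so a broader peak (larger sigma_t) translates into a larger apparent hydrodynamic radius, which is the signal FIDA exploits.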
The Swift-BAT Hard X-Ray Transient Monitor
NASA Technical Reports Server (NTRS)
Krimm, H. A.; Holland, S. T.; Corbet, R. H. D.; Pearlman, A. B.; Romano, P.; Kennea, J. A.; Bloom, J. S.; Barthelmy, S. D.; Baumgartner, W. H.; Cummings, J. R.;
2013-01-01
The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. The BAT observes 88% of the sky each day with a detection sensitivity of 5.3 mCrab for a full-day observation and a time resolution as fine as 64 s. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of light curves of more than 900 sources spanning over eight years. The primary interface for the BAT transient monitor is a public Web site. Between 2005 February 12 and 2013 April 30, 245 sources have been detected in the monitor, 146 of them persistent and 99 detected only in outburst. Among these sources, 17 were previously unknown and were discovered in the transient monitor. In this paper, we discuss the methodology and the data processing and filtering for the BAT transient monitor and review its sensitivity and exposure. We provide a summary of the source detections and classify them according to the variability of their light curves. Finally, we review all new BAT monitor discoveries. For the new sources that are previously unpublished, we present basic data analysis and interpretations.
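The outburst-detection idea, flagging days on which a source's flux rises significantly above its quiescent level, can be shown with a toy example. This is not the actual BAT pipeline (which works in mCrab units with per-bin statistical errors); the robust baseline choice and all flux values are invented for illustration.

```python
import statistics

def flag_outbursts(daily_flux: list, n_sigma: float = 5.0) -> list:
    """Toy transient-monitor logic: estimate a quiescent baseline with the
    median and a robust scatter with the scaled median absolute deviation
    (MAD), so the outburst itself does not inflate the threshold, then flag
    days whose flux rises n_sigma above the baseline."""
    med = statistics.median(daily_flux)
    mad = statistics.median([abs(f - med) for f in daily_flux])
    scatter = 1.4826 * mad or 1.0   # MAD -> Gaussian-equivalent sigma; guard zero
    return [i for i, f in enumerate(daily_flux) if f > med + n_sigma * scatter]

quiet = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]
curve = quiet + [9.0, 12.0] + quiet   # a two-day outburst in the middle
print(flag_outbursts(curve))
```

Using the median/MAD rather than mean/standard deviation matters here: a bright outburst would otherwise drag both the baseline and the scatter upward and mask itself.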
Jeffrey Yang, Y; Haught, Roy C; Goodrich, James A
2009-06-01
Accurate detection and identification of natural or intentional contamination events in a drinking water pipe is critical to drinking water supply security and health risk management. To use conventional water quality sensors for the purpose, we have explored a real-time event adaptive detection, identification and warning (READiw) methodology and examined it using pilot-scale pipe flow experiments of 11 chemical and biological contaminants each at three concentration levels. The tested contaminants include pesticide and herbicides (aldicarb, glyphosate and dicamba), alkaloids (nicotine and colchicine), E. coli in terrific broth, biological growth media (nutrient broth, terrific broth, tryptic soy broth), and inorganic chemical compounds (mercuric chloride and potassium ferricyanide). First, through adaptive transformation of the sensor outputs, contaminant signals were enhanced and background noise was reduced in time-series plots leading to detection and identification of all simulated contamination events. The improved sensor detection threshold was 0.1% of the background for pH and oxidation-reduction potential (ORP), 0.9% for free chlorine, 1.6% for total chlorine, and 0.9% for chloride. Second, the relative changes calculated from adaptively transformed residual chlorine measurements were quantitatively related to contaminant-chlorine reactivity in drinking water. We have shown that based on these kinetic and chemical differences, the tested contaminants were distinguishable in forensic discrimination diagrams made of adaptively transformed sensor measurements.
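The notion of an adaptive transform that enhances contaminant signals over background can be sketched with a simple moving-baseline relative change. This is a hedged illustration of the general idea, not the published READiw algorithm; the smoothing constant and chlorine readings are hypothetical.

```python
def relative_change(series: list, alpha: float = 0.05) -> list:
    """Track the slowly varying background with an exponentially weighted
    moving average and report each reading's relative departure from it.
    Abrupt contaminant-induced shifts stand out, while slow drift is
    absorbed into the baseline."""
    out = []
    baseline = series[0]
    for x in series:
        out.append((x - baseline) / baseline)
        baseline = (1 - alpha) * baseline + alpha * x   # slow adaptation
    return out

# Residual chlorine being consumed by a reactive contaminant (hypothetical):
chlorine = [1.00, 1.01, 0.99, 1.00, 0.70, 0.65, 0.66]
changes = relative_change(chlorine)
print([round(c, 3) for c in changes])
```

The magnitude of the relative chlorine drop is the kind of quantity the abstract relates to contaminant-chlorine reactivity, which is what makes different contaminants separable in the forensic discrimination diagrams.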
THE SWIFT/BAT HARD X-RAY TRANSIENT MONITOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krimm, H. A.; Holland, S. T.; Corbet, R. H. D.
2013-11-01
The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. The BAT observes 88% of the sky each day with a detection sensitivity of 5.3 mCrab for a full-day observation and a time resolution as fine as 64 s. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of light curves of more than 900 sources spanning over eight years. The primary interface for the BAT transient monitor is a public Web site. Between 2005 February 12 and 2013 April 30, 245 sources have been detected in the monitor, 146 of them persistent and 99 detected only in outburst. Among these sources, 17 were previously unknown and were discovered in the transient monitor. In this paper, we discuss the methodology and the data processing and filtering for the BAT transient monitor and review its sensitivity and exposure. We provide a summary of the source detections and classify them according to the variability of their light curves. Finally, we review all new BAT monitor discoveries. For the new sources that are previously unpublished, we present basic data analysis and interpretations.
Wang, Yi; Wang, Yan; Ma, Aijing; Li, Dongxun; Luo, Lijuan; Liu, Dongxin; Hu, Shoukui; Jin, Dong; Liu, Kai; Ye, Changyun
2015-12-03
Here, a novel model of loop-mediated isothermal amplification (LAMP), termed multiple inner primers-LAMP (MIP-LAMP), was devised and successfully applied to detect Listeria monocytogenes. A set of 10 specific MIP-LAMP primers, which recognized 14 different regions of the target gene, was designed to target a sequence in the hlyA gene. The MIP-LAMP assay efficiently amplified the target element within 35 min at 63 °C and was evaluated for sensitivity and specificity. The templates were specifically amplified in the presence of genomic DNA from L. monocytogenes. The limit of detection (LoD) of the MIP-LAMP assay was 62.5 fg/reaction using purified L. monocytogenes DNA. The LoD for DNA isolated from serial dilutions of L. monocytogenes cells in buffer and in milk corresponded to 2.4 CFU and 24 CFU, respectively. The amplified products were analyzed by real-time monitoring of changes in turbidity, and visualized by adding Loop Fluorescent Detection Reagent (FD), or as a ladder-like banding pattern on gel electrophoresis. A total of 48 pork samples were investigated for L. monocytogenes by the novel MIP-LAMP method, and the diagnostic accuracy was shown to be 100% when compared to the culture-biotechnical method. In conclusion, the MIP-LAMP methodology was demonstrated to be a reliable, sensitive and specific tool for rapid detection of L. monocytogenes strains.
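An LoD from a serial dilution series, as reported above, is commonly read off as the lowest input at which all replicates amplify. The sketch below shows that simple read-out; it is not the authors' procedure (formal LoD estimation typically uses probit regression), and the dilution values and replicate calls are hypothetical.

```python
def limit_of_detection(dilution_results: list):
    """dilution_results: (cfu_per_reaction, [bool amplification call per
    replicate]) pairs, ordered from most to least concentrated. Returns the
    lowest input at which every replicate amplified, or None if even the
    highest input failed."""
    lod = None
    for cfu, calls in dilution_results:
        if all(calls):
            lod = cfu           # all replicates positive: LoD is at least this low
        else:
            break               # first partial/negative level ends the search
    return lod

# Hypothetical ten-fold dilution series with triplicate calls:
series = [(2400, [True] * 3), (240, [True] * 3), (24, [True] * 3),
          (2.4, [True, True, False])]
print(limit_of_detection(series))
```

Under this read-out, a level with even one failed replicate does not qualify, which is why reported LoDs are usually one dilution step above the first level showing dropouts.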