Sample records for subset SBAS method

  1. An advanced algorithm for deformation estimation in non-urban areas

    NASA Astrophysics Data System (ADS)

    Goel, Kanika; Adam, Nico

    2012-09-01

    This paper presents an advanced differential SAR interferometry stacking algorithm for high resolution deformation monitoring in non-urban areas, with a focus on distributed scatterers (DSs). Techniques such as the Small Baseline Subset Algorithm (SBAS) have been proposed for processing DSs. SBAS makes use of small baseline differential interferogram subsets. Singular value decomposition (SVD), i.e., L2 norm minimization, is applied to link independent subsets separated by large baselines. However, the interferograms used in SBAS are multilooked using a rectangular window to reduce phase noise caused, for instance, by temporal decorrelation, resulting in a loss of resolution and the superposition of topography and deformation signals from different objects. Moreover, these have to be individually phase unwrapped, which can be especially difficult in natural terrains. An improved deformation estimation technique is presented here which exploits high resolution SAR data and is suitable for rural areas. The implemented method makes use of small baseline differential interferograms and incorporates object adaptive spatial phase filtering and residual topography removal for accurate phase and coherence estimation, while preserving the high resolution provided by modern satellites. This is followed by retrieval of deformation via the SBAS approach, wherein the phase inversion is performed using an L1 norm minimization, which is more robust to the phase unwrapping errors typically encountered in non-urban areas. Meter resolution TerraSAR-X data of an underground gas storage reservoir in Germany is used to demonstrate the effectiveness of this newly developed technique in rural areas.
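
    The contrast between the classical SVD-based (L2) inversion and the L1 norm alternative described above can be illustrated on a toy small-baseline network. Everything below (the dates, the interferogram pairs, the injected unwrapping error, and the IRLS scheme used to approximate the L1 solution) is an invented sketch, not the authors' implementation:

```python
import numpy as np

# Toy small-baseline network (invented, not the paper's data): 5 acquisition
# dates, 9 interferograms. Each row of A sums the phase increments between
# the consecutive dates spanned by that interferogram.
pairs = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (1, 4), (2, 4), (3, 4)]
n_dates = 5
A = np.zeros((len(pairs), n_dates - 1))
for i, (m, s) in enumerate(pairs):
    A[i, m:s] = 1.0

true_incr = np.array([2.0, -1.0, 0.5, 1.5])   # true phase increments (rad)
obs = A @ true_incr
obs[1] += 6.0                                  # gross unwrapping error on pair (0, 2)

# L2 solution: least squares via the pseudo-inverse (SVD), as in classic SBAS.
l2 = np.linalg.pinv(A) @ obs

# L1 solution: iteratively reweighted least squares (IRLS) approximating
# min ||A x - obs||_1, which is far more robust to unwrapping outliers.
x = l2.copy()
for _ in range(50):
    w = 1.0 / np.maximum(np.abs(A @ x - obs), 1e-6)  # L1 weights
    Aw = A * w[:, None]
    x = np.linalg.solve(Aw.T @ A, Aw.T @ obs)

print("L2 max error:", np.abs(l2 - true_incr).max())
print("L1 max error:", np.abs(x - true_incr).max())
```

    The single outlier corrupts every increment of the L2 solution, whereas the reweighting drives its influence out of the L1 solution almost entirely.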

  2. Land subsidence in Tianjin for 2015 to 2016 revealed by the analysis of Sentinel-1A with SBAS-InSAR

    NASA Astrophysics Data System (ADS)

    Guo, Jiming; Hu, Jiyuan; Li, Bing; Zhou, Lv; Wang, Wei

    2017-04-01

    It has been suggested that Tianjin, China, has significant land subsidence due to excessive groundwater extraction. Although presently under control, land subsidence around the Tianjin suburbs in recent years should not be ignored. However, existing research on this subsidence is based on traditional synthetic aperture radar satellite images whose time spans mainly predate 2012. An advanced time-series method, namely the small baseline subset (SBAS) technique, is applied to a total of 27 Sentinel-1A images over Tianjin acquired between May 31, 2015, and May 13, 2016, to derive the subsidence magnitude and distribution of Tianjin. Furthermore, overall and quantitative validations of the SBAS-derived results are implemented. First, the overall subsidence distribution derived by SBAS is compared with the 2015 annual report of land subsidence in Tianjin, which shows the same subsidence trend and distribution. Then, 44 benchmarks and 2 continuously operating reference station datasets, i.e., CH01 and XQYY, are processed to provide a specific validation of the SBAS-derived Sentinel-1A results. Finally, an interpretation is given from the two aspects of groundwater extraction and the geological structures of the surrounding Wangqingtuo settlement funnel area.

  3. Monitoring of Surface Subsidence of the Mining Area Based on Sbas

    NASA Astrophysics Data System (ADS)

    Zhu, Y.; Zhou, S.; Zang, D.; Lu, T.

    2018-05-01

    This paper collected 7 scenes of L-band PALSAR radar data over a mine in Fengcheng city, Jiangxi province, and used the Small Baseline Subset (SBAS) method to invert the surface subsidence of the mine. Interferometric pairs with baselines of less than 800 m were chosen to form a set of short-baseline differential interferograms; pixels whose average coherence coefficient was at least 0.3 were selected as high-coherence point targets; and the singular value decomposition (SVD) method was used to calculate the deformation phase sequence at these points. The accumulated settlement of the study area over different periods was thereby obtained, reflecting the evolution of ground surface settlement in the area. The results show that SBAS technology overcomes the decorrelation problem of the traditional D-InSAR technique, yields a deformation field of the mining surface that is continuous in the time dimension, and displays the characteristics of mining-induced ground settlement in different periods, thus improving the accuracy and reliability of the monitoring results.
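
    A minimal sketch of the two processing steps named in this abstract (selecting pixels with average coherence of at least 0.3, then a pixel-wise SVD inversion and accumulation of settlement). The array sizes, interferogram network and phase values are invented for illustration and are not the study's data:

```python
import numpy as np

# Illustrative shapes and data only; not the authors' processing chain.
rng = np.random.default_rng(1)
n_ifg, rows, cols = 6, 50, 50
coherence = rng.uniform(0.0, 1.0, size=(n_ifg, rows, cols))  # coherence maps

# Keep pixels whose average coherence is >= 0.3, as stated in the abstract.
mean_coh = coherence.mean(axis=0)
mask = mean_coh >= 0.3

# SBAS-style inversion at the selected pixels: A maps phase increments
# between consecutive acquisitions to interferogram phases; the minimum-norm
# solution comes from the pseudo-inverse (SVD).
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
n_dates = 5
A = np.zeros((len(pairs), n_dates - 1))
for i, (m, s) in enumerate(pairs):
    A[i, m:s] = 1.0

phases = rng.normal(0.0, 1.0, size=(len(pairs), int(mask.sum())))  # toy phases
increments = np.linalg.pinv(A) @ phases
cumulative = np.vstack([np.zeros(int(mask.sum())), np.cumsum(increments, axis=0)])
print(mask.sum(), "pixels kept; cumulative series shape:", cumulative.shape)
```
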

  4. Saponin-based adjuvants induce cross-presentation in dendritic cells by intracellular lipid body formation

    PubMed Central

    den Brok, Martijn H.; Büll, Christian; Wassink, Melissa; de Graaf, Annemarie M.; Wagenaars, Jori A.; Minderman, Marthe; Thakur, Mayank; Amigorena, Sebastian; Rijke, Eric O.; Schrier, Carla C.; Adema, Gosse J.

    2016-01-01

    Saponin-based adjuvants (SBAs) are being used in animal and human (cancer) vaccines, as they induce protective cellular immunity. Their adjuvant potency depends on inflammasome activation and enhanced antigen cross-presentation by dendritic cells (DCs), but how antigen cross-presentation is induced is not clear. Here we show that SBAs uniquely induce intracellular lipid bodies (LBs) in the CD11b+ DC subset in vitro and in vivo. Using genetic and pharmacological interference in models for vaccination and in situ tumour ablation, we demonstrate that LB induction is causally related to the saponin-dependent increase in cross-presentation and T-cell activation. These findings link adjuvant activity to LB formation, aid the application of SBAs as a cancer vaccine component, and will stimulate development of new adjuvants enhancing T-cell-mediated immunity. PMID:27819292

  5. SBAS-InSAR analysis of surface deformation at Mauna Loa and Kilauea volcanoes in Hawaii

    USGS Publications Warehouse

    Casu, F.; Lanari, Riccardo; Sansosti, E.; Solaro, G.; Tizzani, Pietro; Poland, M.; Miklius, Asta

    2009-01-01

    We investigate the deformation of Mauna Loa and Kīlauea volcanoes, Hawai'i, by exploiting the advanced differential Synthetic Aperture Radar Interferometry (InSAR) technique referred to as the Small BAseline Subset (SBAS) algorithm. In particular, we present time series of line-of-sight (LOS) displacements derived from SAR data acquired by the ASAR instrument, on board the ENVISAT satellite, from the ascending (track 93) and descending (track 429) orbits between 2003 and 2008. For each coherent pixel of the radar images we compute time-dependent surface displacements as well as the average LOS deformation rate. Our results quantify, in space and time, the complex deformation of Mauna Loa and Kīlauea volcanoes. The derived InSAR measurements are compared to continuous GPS data to assess the quality of the SBAS-InSAR products. © 2009 IEEE.
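
    The average LOS deformation rate mentioned above is, per coherent pixel, essentially a linear fit to the displacement time series. A toy illustration with invented dates and displacements (not the ENVISAT measurements):

```python
import numpy as np

# Hypothetical LOS displacement time series at one coherent pixel:
# dates in decimal years, displacements in mm (negative = away from satellite).
t = np.array([2003.0, 2003.5, 2004.2, 2005.1, 2006.0, 2007.3, 2008.0])
d = np.array([0.0, -3.1, -7.4, -12.8, -18.0, -25.9, -30.2])

# Average LOS rate = slope of a linear least-squares fit.
rate, offset = np.polyfit(t, d, 1)
print(round(rate, 1), "mm/yr")  # -6.0 mm/yr
```
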

  6. Evaluating Three Insar Time-Series Methods to Assess Creep Motion, Case Study: Masouleh Landslide in North Iran

    NASA Astrophysics Data System (ADS)

    Mirzaee, S.; Motagh, M.; Akbari, B.; Wetzel, H. U.; Roessner, S.

    2017-05-01

    Masouleh is one of the ancient cities located in a high mountainous area of Gilan province in northern Iran. The region is threatened by a hazardous landslide, last activated in 1998, which caused 32 deaths and 45 injuries. Significant temporal decorrelation caused by dense vegetation cover within the landslide area makes the use of Synthetic Aperture Radar Interferometry (InSAR) for monitoring landslide movement very challenging. In this paper, we investigate the capability of three InSAR time-series techniques for evaluating creep motion on the Masouleh landslide: Persistent Scatterer Interferometry (PSI), Small BAseline Subset (SBAS) and SqueeSAR. The analysis uses a dataset of 33 TerraSAR-X images in SpotLight (SL) mode covering a period of 15 months between June 2015 and September 2016. Results show the superior capability of the SqueeSAR method, compared to the two other techniques, for assessing landslide movement. The final numbers of scatterers detected in the landslide body by PSI and SBAS are about 70 and 120 respectively, while SqueeSAR detects about 345. The coherence of interferograms improved by about 37% for SqueeSAR as compared to SBAS. The same rate of displacement was observed in those regions where all the methods were able to detect scatterers. Maximum rates of displacement detected by the SqueeSAR technique in the northern edge, the older part and the younger part of the landslide body are about -39, -65 and -22 mm/yr, respectively.

  7. Deformation Estimation In Non-Urban Areas Exploiting High Resolution SAR Data

    NASA Astrophysics Data System (ADS)

    Goel, Kanika; Adam, Nico

    2012-01-01

    Advanced techniques such as the Small Baseline Subset Algorithm (SBAS) have been developed for terrain motion mapping in non-urban areas, with a focus on extracting information from distributed scatterers (DSs). SBAS uses small baseline differential interferograms (to limit the effects of geometric decorrelation), and these are typically multilooked to reduce phase noise, resulting in a loss of resolution. Various error sources, e.g. phase unwrapping errors, topographic errors, temporal decorrelation and atmospheric effects, also affect the interferometric phase. The aim of our work is improved deformation monitoring in non-urban areas exploiting high resolution SAR data. The paper provides technical details and a processing example of a newly developed technique which incorporates an adaptive spatial phase filtering algorithm for accurate high resolution differential interferometric stacking, followed by deformation retrieval via the SBAS approach, where we perform the phase inversion using a more robust L1 norm minimization.

  8. Intermittent Small Baseline Subset (ISBAS) InSAR of rural and vegetated terrain: a new method to monitor land motion applied to peatlands in Wales, UK

    NASA Astrophysics Data System (ADS)

    Cigna, Francesca; Rawlins, Barry G.; Jordan, Colm J.; Sowter, Andrew; Evans, Christopher D.

    2014-05-01

    It is well known that the success of multi-interferometric Synthetic Aperture Radar (SAR) methods such as Persistent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS) is controlled not only by the availability of data, but also by local topography and land cover. Locations with sufficient temporal phase stability and coherence are typically limited to either built-up, urban areas or areas of exposed bedrock. Whilst conventional PSI and SBAS approaches have limited potential to monitor surface motions in areas where few (or zero) scatterers or coherent targets exist, the newly developed Intermittent SBAS (ISBAS) technique (Sowter et al. 2013) can fill the gap by providing a more complete picture of ground movement in rural and vegetated regions. ISBAS is a small baseline, multi-look, coherent target method which considers the intermittent coherence of rural areas and can work over a wide range of land cover classes including agriculture and grassland. Building upon a nationwide study that the British Geological Survey (BGS) undertook to assess the feasibility of InSAR techniques to monitor the landmass of Great Britain (Cigna et al. 2013), we identified a rural region in North Wales, where conventional SBAS and PSI approaches are unlikely to succeed, as an appropriate target area to evaluate the efficacy of ISBAS. According to the UK Land Cover Map 2007 (LCM2007) from the Centre for Ecology & Hydrology (CEH), this area is dominated by improved and acid grassland, heather, bog and coniferous woodland, which are likely to result in extremely low PSI or SBAS point densities and sparse coverage of monitoring results. We employed 53 ERS-1/2 C-band (5.3 GHz frequency) SAR scenes acquired in descending mode between 1993 and 2000, which were made available to BGS via ESA Category 1 project id.13543. In the framework of the Glastir Monitoring & Evaluation Programme (Emmett et al. 2013), funded by the Welsh Government, we processed these using ISBAS over a 4,460 km2 region of interest. Using thresholds for perpendicular and temporal baselines of 200 m and 4 years respectively, a total of 300 small baseline interferograms were generated and good network redundancy was obtained. Average temporal coherence of the processed scenes was rather low, with only ~4% of the processed area showing values exceeding 0.25 (hence suitable for an SBAS analysis); most of the region revealed very low coherence, especially over areas of peat, grass, forest and heather. Processing with ISBAS allowed us to consider the intermittent behaviour of rural scatterers, dramatically improving the point density and achieving areal coverage of around 99%. This increased the total number of monitored points by a factor of 25. The greatest improvement in point density was achieved for coniferous woodland, which showed ISBAS/SBAS ratios exceeding 300 and densities increasing up to 150 points/km2 with ISBAS. Bog, acid grassland and dwarf shrub heath showed densities increasing from 2-10 to 150-160 points/km2 when using ISBAS with respect to conventional SBAS. It is worth noting that, despite intermittence and reliance on only a temporal subset of interferograms, the vast majority of the ISBAS points showed velocity standard deviations lower than 1.0-1.5 mm/yr; hence the good quality of the estimated ground motion rates was retained using ISBAS and intermittently coherent targets. Geological interpretation, analysis and further discussion of the results in relation to changes in surface elevation of blanket peat are presented by Rawlins et al. 2014 (cf. BG2.3/SSS6.6).

    REFERENCES
    Cigna, F., Bateson, L., Jordan, C., & Dashwood, C. (2013). Nationwide monitoring of geohazards in Great Britain with InSAR: feasibility mapping based on ERS-1/2 and ENVISAT imagery. IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 672-675.
    Emmett, B. E. & the GMEP team (2013). Glastir Monitoring & Evaluation Programme. First Year Annual Report to Welsh Government (Contract reference: C147/2010/11). NERC Centre for Ecology & Hydrology (CEH Project: NEC04780), pp. 393.
    Rawlins, B. G., Cigna, F., Jordan, C., Sowter, A., & Evans, C. (2014). Monitoring changes in surface elevation of blanket peat and other land cover types using a novel InSAR processing technique. EGU 2014, Session BG2.3/SSS6.6.
    Sowter, A., Bateson, L., Strange, P., Ambrose, K., & Syafiudin, M. (2013). DInSAR estimation of land motion using intermittent coherence with application to the South Derbyshire and Leicestershire coalfield. Remote Sensing Letters, 4, 979-987.
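
    The small-baseline network construction with the thresholds stated in this record (perpendicular baseline of 200 m, temporal baseline of 4 years) can be sketched as follows; the acquisition dates and baselines below are randomly invented, not the ERS-1/2 values used in the study:

```python
import numpy as np
from itertools import combinations

# Invented acquisition dates (decimal years) and perpendicular baselines (m).
rng = np.random.default_rng(2)
n = 20
dates = np.sort(rng.uniform(1993.0, 2000.0, n))
bperp = rng.uniform(-600.0, 600.0, n)   # w.r.t. an arbitrary reference orbit

# Keep every pair within both thresholds: 200 m perpendicular, 4 yr temporal.
pairs = [(i, j) for i, j in combinations(range(n), 2)
         if abs(bperp[i] - bperp[j]) <= 200.0 and (dates[j] - dates[i]) <= 4.0]
print(len(pairs), "small-baseline interferograms")
```

    In a real ISBAS/SBAS run, each retained pair would then be interferometrically processed; the thresholds trade decorrelation against network redundancy.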

  9. Monitoring and inversion on land subsidence over mining area with InSAR technique

    USGS Publications Warehouse

    Wang, Y.; Zhang, Q.; Zhao, C.; Lu, Z.; Ding, X.

    2011-01-01

    Wulanmulun town, located in Inner Mongolia, hosts some of the main mining areas of the Shendong Company, such as the Shangwan and Bulianta coal mines, and has been suffering serious mine collapse as underground mining proceeds. We use ALOS/PALSAR data to extract land deformation over these regions, applying the Small Baseline Subset (SBAS) method. We then compared the InSAR results with the underground mining activities and found high correlations between them. Lastly, we applied the distributed dislocation (Okada) model to invert for the mine collapse mechanism. © 2011 Society of Photo-Optical Instrumentation Engineers (SPIE).

  10. Real-Time Single Frequency Precise Point Positioning Using SBAS Corrections

    PubMed Central

    Li, Liang; Jia, Chun; Zhao, Lin; Cheng, Jianhua; Liu, Jianxu; Ding, Jicheng

    2016-01-01

    Real-time single frequency precise point positioning (PPP) is a promising technique for high-precision navigation with sub-meter or even centimeter-level accuracy, because of its convenience and low cost. The navigation performance of single frequency PPP heavily depends on the real-time availability and quality of correction products for satellite orbits and clocks. Satellite-based augmentation systems (SBAS) provide such correction products in real time, but they are intended for wide area differential positioning at the 1 meter precision level. By imposing constraints on the ionospheric error, we have developed a real-time single frequency PPP method that fully utilizes SBAS correction products. The proposed PPP method is tested with static and kinematic data, respectively. The static experimental results show that the position accuracy of the proposed PPP method can reach decimeter level, an improvement of at least 30% compared with the traditional SBAS method. Positioning convergence of the proposed PPP method can be achieved within 636 epochs at most in static mode. In the kinematic experiment, the position accuracy of the proposed PPP method is improved by at least 20 cm relative to the SBAS method. Furthermore, the proposed PPP method achieves decimeter-level convergence within 500 s in kinematic mode. PMID:27517930
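
    At its core, using SBAS products in single-frequency positioning means correcting each pseudorange with the augmented satellite clock correction and the interpolated ionospheric grid delay before the PPP filter runs. A deliberately simplified sketch with made-up numbers; real SBAS message decoding (per the DO-229 MOPS) and the paper's ionospheric constraint are far more involved:

```python
# Illustrative single-frequency pseudorange correction (all values invented).
c = 299_792_458.0                # speed of light, m/s

raw_pseudorange = 22_345_678.90  # m, measured code pseudorange
sat_clock_bias = 1.2e-4          # s, broadcast clock + SBAS fast/long-term corr.
iono_delay = 4.3                 # m, interpolated from the SBAS ionospheric grid
tropo_delay = 2.4                # m, from a standard troposphere model

# Corrected observable fed to the positioning filter.
corrected = raw_pseudorange + c * sat_clock_bias - iono_delay - tropo_delay
print(round(corrected, 2))
```
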

  11. The 2010 slow slip event and secular motion at Kilauea, Hawai`i inferred from TerraSAR-X InSAR data

    USGS Publications Warehouse

    Chen, Jingyi; Zebker, Howard A.; Segall, Paul; Miklius, Asta

    2014-01-01

    We present here a Small BAseline Subset (SBAS) algorithm to extract both transient and secular ground deformation on the order of millimeters in the presence of tropospheric noise on the order of centimeters, when the transient is of short duration and known time and the background deformation is smooth in time. We applied this algorithm to study the 2010 slow slip event as well as the secular motion of Kīlauea's south flank using 49 TerraSAR-X images. We also estimate the tropospheric delay variation relative to a given reference pixel using an InSAR SBAS approach. We compare the InSAR SBAS solution for both ground deformation and tropospheric delays with existing GPS measurements and confirm that the ground deformation signal and tropospheric noise in InSAR data are successfully separated. We observe that the coastal region on the south side of the Hilina Pali moves at a higher background rate than the region on the north side of the Pali. We also conclude that the 2010 SSE displacement is mainly horizontal and that the maximum magnitude of its vertical component is less than 5 mm.

  12. Small baseline subsets approach of DInSAR for investigating land surface deformation along the high-speed railway

    NASA Astrophysics Data System (ADS)

    Rao, Xiong; Tang, Yunwei

    2014-11-01

    Land surface deformation evidently exists along a newly-built high-speed railway in southeast China. In this study, we utilize the Small BAseline Subset (SBAS)-Differential Synthetic Aperture Radar Interferometry (DInSAR) technique to detect land surface deformation along the railway. 40 Cosmo-SkyMed satellite images were selected to analyze the spatial distribution and velocity of the deformation in the study area. 88 image pairs with high coherence were first chosen with an appropriate threshold and used to derive the deformation velocity map and its variation in the time series; this result provides information for orbit correction and ground control point (GCP) selection in the following steps. Then, more image pairs were selected to tighten the constraint in the time dimension and to improve the final result by decreasing the phase unwrapping error; 171 combinations of SAR pairs were ultimately selected. Reliable GCPs were re-selected according to the previously derived deformation velocity map, the orbital residual error was rectified using these GCPs, and nonlinear deformation components were estimated, producing a more accurate surface deformation velocity map. Precise geodetic leveling work was carried out in the meantime. We compared the leveling result with the geocoded SBAS product using the nearest neighbour method: the mean error and standard deviation of the error were 0.82 mm and 4.17 mm, respectively. This result demonstrates the effectiveness of the DInSAR technique for monitoring land surface deformation, which can reliably support decisions on high-speed railway project design, construction, operation and maintenance.
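
    The leveling validation step described above (nearest-neighbour matching of benchmarks to geocoded SBAS points, then the mean and standard deviation of the differences) can be sketched as follows, with invented coordinates and rates in place of the real leveling and SBAS data:

```python
import numpy as np

# Invented data: geocoded SBAS points and leveling benchmarks in a local
# metric frame, with rates in mm/yr.
rng = np.random.default_rng(3)
sbas_xy = rng.uniform(0, 1000, size=(500, 2))   # SBAS point coordinates (m)
sbas_vel = rng.normal(-5, 3, size=500)          # SBAS rates (mm/yr)
lev_xy = rng.uniform(0, 1000, size=(30, 2))     # leveling benchmark coordinates
lev_vel = rng.normal(-5, 3, size=30)            # leveling rates (mm/yr)

# Nearest-neighbour match: for each benchmark, take the closest SBAS point.
dists = np.linalg.norm(sbas_xy[None, :, :] - lev_xy[:, None, :], axis=2)
nearest = dists.argmin(axis=1)

# Error statistics of SBAS minus leveling, as reported in the abstract.
err = sbas_vel[nearest] - lev_vel
print("mean error:", err.mean(), "std:", err.std())
```
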

  13. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    NASA Astrophysics Data System (ADS)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    The SENTINEL-1 (S1) mission is designed to provide operational capability for continuous mapping of the Earth thanks to its two polar-orbiting satellites (SENTINEL-1A and B) performing C-band synthetic aperture radar (SAR) imaging. It is characterized by enhanced revisit frequency, coverage and reliability for operational services and applications requiring long SAR data time series. Moreover, SENTINEL-1 is specifically oriented to interferometric applications, with stringent requirements on attitude and orbit accuracy, and it is intrinsically characterized by small spatial and temporal baselines. Consequently, SENTINEL-1 data are particularly suitable for exploitation through advanced interferometric techniques such as the well-known DInSAR algorithm referred to as the Small BAseline Subset (SBAS), which allows the generation of deformation time series and displacement velocity maps. In this work we present an advanced interferometric processing chain, based on the Parallel SBAS (P-SBAS) approach, for the massive processing of S1 Interferometric Wide Swath (IWS) data, aimed at generating deformation time series in an efficient, automatic and systematic way. This DInSAR chain is designed to exploit distributed computing infrastructures, and more specifically Cloud Computing environments, to properly deal with the storage and processing of huge S1 datasets. In particular, since S1 IWS data are acquired with the innovative Terrain Observation with Progressive Scans (TOPS) mode, we could benefit from the structure of S1 data, which are composed of bursts that can be considered as separate acquisitions. The processing is thus intrinsically parallelizable with respect to such independent input data, and we exploited this coarse-granularity parallelization strategy in the majority of the steps of the SBAS processing chain.
    Moreover, we also implemented more sophisticated parallelization approaches, exploiting both multi-node and multi-core programming techniques. Cloud Computing environments currently make available large collections of computing and storage resources that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at very large scale in reduced time. This also allows us to deal with the problems connected to the use of the S1 P-SBAS chain in operational contexts, related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result, we performed a large spatial scale SBAS analysis of Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms and finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.
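
    The coarse-granularity, burst-level parallelization described above maps naturally onto a worker pool: each TOPS burst is an independent input that a separate worker can process. A schematic sketch, in which process_burst is a hypothetical stand-in for the real per-burst interferometric processing:

```python
from multiprocessing import Pool

def process_burst(burst_id: int) -> str:
    # Placeholder for the real per-burst work (coregistration, interferogram
    # formation, etc.); here it just reports completion.
    return f"burst-{burst_id}-done"

if __name__ == "__main__":
    # Bursts are independent, so they can be dispatched to parallel workers.
    with Pool(processes=4) as pool:
        results = pool.map(process_burst, range(12))
    print(len(results), "bursts processed")
```

    The P-SBAS chain additionally uses multi-node and finer multi-core strategies; this sketch only illustrates the independence of the burst-level inputs.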

  14. A new implementation of full resolution SBAS-DInSAR processing chain for the effective monitoring of structures and infrastructures

    NASA Astrophysics Data System (ADS)

    Bonano, Manuela; Buonanno, Sabatino; Ojha, Chandrakanta; Berardino, Paolo; Lanari, Riccardo; Zeni, Giovanni; Manunta, Michele

    2017-04-01

    The advanced DInSAR technique referred to as the Small BAseline Subset (SBAS) algorithm has largely demonstrated its effectiveness in carrying out multi-scale and multi-platform surface deformation analyses relevant to both natural and man-made hazards. Thanks to its capability to generate displacement maps and long-term deformation time series at both regional (low resolution) and local (full resolution) spatial scales, it provides insight into the spatial and temporal patterns of localized displacements of single buildings and infrastructures over extended urban areas, with a key role in supporting risk mitigation and preservation activities. The extensive application of the multi-scale SBAS-DInSAR approach in many scientific contexts has gone hand in hand with the development of new SAR satellite missions, characterized by different frequency bands, spatial resolutions, revisit times and ground coverage. This has led to the generation of huge DInSAR data stacks that must be efficiently handled, processed and archived, with a strong impact on both the data storage and the computational requirements needed for generating full resolution SBAS-DInSAR results. Accordingly, innovative and effective solutions for the automatic processing of massive SAR data archives and for the operational management of the derived SBAS-DInSAR products need to be designed and implemented, exploiting the high efficiency (in terms of portability, scalability and computing performance) of new ICT methodologies. In this work, we present a novel parallel implementation of the full resolution SBAS-DInSAR processing chain, aimed at investigating localized displacements affecting single buildings and infrastructures over very large urban areas, relying on parallelization strategies with different granularity levels.
    The image granularity level is applied in most steps of the SBAS-DInSAR processing chain and exploits multiprocessor systems with distributed memory. Moreover, in some computationally heavy processing steps, Graphics Processing Units (GPUs) are exploited for blocks working on a pixel-by-pixel basis, requiring strong modifications to some key parts of the sequential full resolution SBAS-DInSAR processing chain. GPU processing is implemented by efficiently exploiting parallel processing architectures (such as CUDA) to increase computing performance, in terms of optimization of the available GPU memory as well as reduction of the Input/Output operations on the GPU and of the whole processing time for specific blocks with respect to the corresponding sequential implementation, which is particularly critical in the presence of huge DInSAR datasets. Moreover, to efficiently handle the massive amount of DInSAR measurements provided by the new generation SAR constellations (CSK and Sentinel-1), we adopt a proper re-design strategy aimed at the robust assimilation of the full resolution SBAS-DInSAR results into the web-based GeoNode platform of the Spatial Data Infrastructure, thus allowing the efficient management, analysis and integration of the interferometric results with different data sources.

  15. Residual settlements detection of ocean reclaimed lands with multi-platform SAR time series and SBAS technique: a case study of Shanghai Pudong International Airport

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Yang, Tianliang; Zhao, Qing; Pepe, Antonio; Dong, Hongbin; Sun, Zhibin

    2017-09-01

    Shanghai Pudong International Airport is one of the three major international airports in China. The airport is located at the Yangtze estuary, a sensitive belt of the sea-land interaction region. The majority of the buildings and facilities in the airport are built on ocean-reclaimed land and silt tidal flats, so residual ground settlement may occur after the completion of airport construction. The current status of the ground settlement of the airport, and whether it is within a safe range, therefore needs to be investigated. In order to continuously monitor the ground settlement of the airport, two Synthetic Aperture Radar (SAR) time series, acquired by the X-band TerraSAR-X (TSX) and TanDEM-X (TDX) sensors from December 2009 to December 2010 and from April 2013 to July 2015, were analyzed with the SBAS technique. We first obtained a ground deformation measurement for each SAR subset. Both measurements show that obvious ground subsidence occurred at the airport, especially at the second runway, the second terminal, the sixth cargo area and the eighth apron. The maximum vertical ground deformation rates of both SAR subset measurements were greater than -30 mm/year, while the cumulative ground deformations reached up to -30 mm and -35 mm, respectively. After generating the SBAS-retrieved ground deformation for each SAR subset, we performed a joint analysis combining the time series of each common coherent point by applying a geotechnical model. The results show that three concentrated areas of ground deformation existed in the airport, mainly distributed over the sixth cargo area, the fifth apron and the fourth apron; the maximum vertical cumulative ground subsidence was more than -70 mm. In addition, by analyzing the combined time series of four selected points, we found that the ground deformation rates of the points located at the second runway, the third runway and the second terminal decreased progressively over time, indicating that the foundations around these points gradually stabilized.

  16. Application of Time Series Insar Technique for Deformation Monitoring of Large-Scale Landslides in Mountainous Areas of Western China

    NASA Astrophysics Data System (ADS)

    Qu, T.; Lu, P.; Liu, C.; Wan, H.

    2016-06-01

    Western China is very susceptible to landslide hazards; as a result, landslide detection and early warning are of great importance. This work employs the SBAS (Small Baseline Subset) InSAR technique for detection and monitoring of large-scale landslides that occurred in Li County, Sichuan Province, Western China. The time series InSAR analysis is performed using descending scenes acquired in TerraSAR-X StripMap mode since 2014 to obtain the spatial distribution of surface displacements of this giant landslide. The time series results identify a distinct deformation zone on the landslide body, with a rate of up to 150 mm/yr. The deformation acquired by the SBAS technique is validated against inclinometers from several in-situ monitoring boreholes. The integration of InSAR time series displacements and ground-based monitoring data helps provide reliable data support for the forecasting and monitoring of large-scale landslides.

  17. Influence of Birth Preparedness, Decision-Making on Location of Birth and Assistance by Skilled Birth Attendants among Women in South-Western Uganda

    PubMed Central

    Kabakyenga, Jerome K.; Östergren, Per-Olof; Turyakira, Eleanor; Pettersson, Karen Odberg

    2012-01-01

    Introduction Assistance by skilled birth attendants (SBAs) during childbirth is one of the strategies aimed at reducing maternal morbidity and mortality in low-income countries. However, the relationship between birth preparedness, decision-making on location of birth and assistance by skilled birth attendants in this context is not well studied. The aim of this study was to assess the influence of birth preparedness practices and of decision-making on location of birth on assistance by SBAs among women in south-western Uganda. Methods Community survey methods were used to identify 759 recently delivered women from 120 villages in rural Mbarara district. Interviewer-administered questionnaires were used to collect data. Logistic regression analyses were conducted to assess the relationship between birth preparedness, decision-making on location of birth and assistance by SBAs. Results 35% of the women had been prepared for childbirth, and the prevalence of assistance by SBAs in the sample was 68%. The final decision regarding location of birth was made by the woman herself (36%), the woman with her spouse (56%) or the woman with a relative/friend (8%). The relationships between birth preparedness, women's decision-making on location of birth in consultation with spouse/friends/relatives and choosing assistance by SBAs were statistically significant, and this persisted after adjusting for possible confounders (OR 1.5, 95% CI: 1.0–2.4 and OR 4.4, 95% CI: 3.0–6.7, respectively). Education, household assets and birth preparedness showed a clear synergistic effect on the relationship between the decision-maker on location of birth and assistance by SBAs. Other factors showing statistically significant relationships with assistance by SBAs were ANC attendance, parity and residence. Conclusion Women's decision-making on location of birth in consultation with spouse/friends/relatives and birth preparedness showed a significant effect on choosing assistance by SBAs at birth.
Education and household assets ownership showed a synergistic effect on the relationship between the decision-maker and assistance by SBAs. PMID:22558214

  18. Integration of MODIS data and Short Baseline Subset (SBAS) technique for land subsidence monitoring in Datong, China

    NASA Astrophysics Data System (ADS)

    Zhao, Chao-ying; Zhang, Qin; Yang, Chengsheng; Zou, Weibao

    2011-07-01

    Datong, located in the north of Shanxi Province, is one of China's best-known old coal-mining areas. Serious problems such as land subsidence, ground fissures, mining collapse and earthquake hazards have affected this area for a long time, causing significant damage to buildings and roads. In order to monitor and mitigate these natural and man-made hazards, the Short Baseline Subset (SBAS) InSAR technique is applied to ten Envisat ASAR scenes to detect surface deformation over an area of thousands of square kilometers. Then, five MODIS scenes are used to check the atmospheric effects on the InSAR interferograms. Finally, nine nonlinear cumulative land subsidence results between September 2004 and February 2008 are obtained. Based on the deformation data, three kinds of land subsidence are clearly detected, caused by mining extraction, underground water withdrawal and construction of new economic zones, respectively. The annual mean subsidence velocity reaches 1 to 4 cm/year in the different subsidence areas. A newly designed high-speed railway (HSR) with speeds of 350 km/h will cross the Datong hi-tech zone, so special measures should be taken for the long-term safety of this project. In addition, two other subsidence regions need further investigation to mitigate such hazards.

  19. Analysis of deformation patterns through advanced DINSAR techniques in Istanbul megacity

    NASA Astrophysics Data System (ADS)

    Balik Sanli, F.; Calò, F.; Abdikan, S.; Pepe, A.; Gorum, T.

    2014-09-01

    As a result of Turkey's economic growth and heavy migration from rural areas, Istanbul has experienced a high urbanization rate, with severe impacts on the environment in terms of pressure on natural resources, land-cover changes and uncontrolled sprawl. As a consequence, the city has become extremely vulnerable to natural and man-made hazards inducing ground deformation phenomena that threaten buildings and infrastructure and often cause significant socio-economic losses. The detection and monitoring of such deformation patterns is therefore of primary importance for hazard and risk assessment as well as for the design and implementation of effective mitigation strategies. The aim of this work is to analyze the spatial distribution and temporal evolution of deformation affecting the Istanbul metropolitan area by exploiting advanced Differential SAR Interferometry (DInSAR) techniques. In particular, we apply the Small BAseline Subset (SBAS) approach to a dataset of 43 TerraSAR-X images acquired between November 2010 and June 2012 along descending orbits, with an 11-day revisit time and a 3 m × 3 m spatial resolution. The SBAS processing allowed us to remotely detect and monitor subsidence patterns over the whole urban area as well as to provide detailed information at the scale of the single building. Such SBAS measurements, effectively integrated with ground-based monitoring data and thematic maps, allow us to explore the relationship between the detected deformation phenomena and urbanization, contributing to improved urban planning and management.

  20. Accelerated Training of Skilled Birth Attendants in a Marginalized Population on the Thai-Myanmar Border: A Multiple Methods Program Evaluation

    PubMed Central

    White, Adrienne Lynne; Min, Thaw Htwe; Gross, Mechthild M.; Kajeechiwa, Ladda; Thwin, May Myo; Hanboonkunupakarn, Borimas; Than, Hla Hla; Zin, Thet Wai; Rijken, Marcus J.; Hoogenboom, Gabie; McGready, Rose

    2016-01-01

    Background To evaluate a skilled birth attendant (SBA) training program in a neglected population on the Thai-Myanmar border, we used multiple methods to show that refugee and migrant health workers can be given effective training in their own environment to become SBAs and teachers of SBAs. The loss of SBAs through resettlement to third countries necessitated urgent training of available workers to meet local needs. Methods and Findings All results were obtained from student records of theory grades and clinical log books. Qualitative evaluation of both the SBA and teacher programs was obtained using semi-structured interviews with supervisors and teachers. We also reviewed perinatal indicators over an eight-year period, starting prior to the first training program until after the graduation of the fourth cohort of SBAs. Results Four SBA training programs scheduled between 2009 and 2015 resulted in 79/88 (90%) of students successfully completing a training program of 250 theory hours and 625 supervised clinical hours. All 79 students were able to: achieve pass grades on theory examination (median 80%, range [70–89]); obtain the required clinical experience within twelve months; achieve clinical competence to provide safe care during childbirth. In 2010–2011, five experienced SBAs completed a train-the-trainer (TOT) program and went on to facilitate further training programs. Perinatal indicators within Shoklo Malaria Research Unit (SMRU), such as place of birth, maternal and newborn outcomes, showed no significant differences before and after introduction of training or following graduate deployment in the local maternity units. Confidence, competence and teamwork emerged from qualitative evaluation by senior SBAs working with and supervising students in the clinics. Conclusions We demonstrate that in resource-limited settings or in marginalized populations, it is possible to accelerate training of skilled birth attendants to provide safe maternity care. 
Education needs to be tailored to local needs to ensure evidence-based care of women and their families. PMID:27711144

  1. A time series deformation estimation in the NW Himalayas using SBAS InSAR technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Venkataraman, G.

    2012-12-01

    A time series study of land deformation in the north-western Himalayan region is presented. Synthetic aperture radar (SAR) interferometry (InSAR) is an important tool for measuring land displacement caused by different geological processes [1]. Frequent spatial and temporal decorrelation in the Himalayan region is a strong impediment to precise deformation estimation using the conventional interferometric SAR approach. In such cases, advanced DInSAR approaches such as PSInSAR and the Small Baseline Subset (SBAS) technique can be used to estimate Earth surface deformation. The SBAS technique [2] is a DInSAR approach which uses twelve or more repeat SAR acquisitions, combined into properly chosen subsets, to generate DInSAR interferograms via the two-pass interferometric approach; it ultimately yields mean deformation velocity maps and displacement time series. Here, the SBAS algorithm has been used for time series deformation estimation in the NW Himalayan region. ENVISAT ASAR IS2 swath data from 2003 to 2008 have been used to quantify slow deformation. The Himalayan region is a very active tectonic belt and active orogeny plays a significant role in the land deformation process [3]. The geomorphology of the region is unique and reacts adversely to climate change, bringing landslides and subsidence. Settlements on the hill slopes are prone to landslides, landslips, rockslides and soil creep. These hazards have hampered the overall progress of the region: they obstruct roads and the flow of traffic, break communication, block flowing streams to create temporary reservoirs, and bring down large amounts of soil cover, adding enormous silt and gravel to the streams. It has been observed that average deformation varies from -30.0 mm/year to 10 mm/year in the NW Himalayan region. References [1] Massonnet, D., Feigl, K.L., Rossi, M. and Adragna, F. 
    (1994) Radar interferometry mapping of deformation in the year after the Landers earthquake. Nature, 369, 227-230. [2] Berardino, P., Fornaro, G., Lanari, R., Sansosti, E. (2002). A new algorithm for surface deformation monitoring based on small baseline differential SAR interferograms. IEEE Transactions on Geoscience and Remote Sensing, 40(11), 2375-2383. [3] Geological Survey of India (GSI), (1999) Inventory of the Himalayan glaciers. Special publication, vol. 34, pp. 165-168. [4] Chen, C.W., and Zebker, H.A., (2000). Network approaches to two-dimensional phase unwrapping: intractability and two new algorithms. Journal of the Optical Society of America A, 17, 401-414.
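
The SBAS inversion of [2] can be viewed as a least-squares problem linking small-baseline interferograms to inter-acquisition velocities, solved via an SVD-based pseudoinverse. A minimal sketch with synthetic acquisition dates, pairs, and a constant subsidence rate (all values illustrative, not from this study):

```python
import numpy as np

# Acquisition dates (days) and small-baseline interferogram pairs (i, j), i < j.
t = np.array([0.0, 35.0, 70.0, 140.0, 175.0])            # 5 acquisitions
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]

# Simulate unwrapped interferometric displacements (cm) for a constant
# subsidence rate (noise-free, purely for illustration).
true_rate = -1.2 / 365.0                                  # cm/day
d_obs = np.array([true_rate * (t[j] - t[i]) for i, j in pairs])

# SBAS design matrix: unknowns are mean velocities between adjacent
# acquisitions; interferogram (i, j) integrates velocity over [t_i, t_j].
dt = np.diff(t)                                           # interval lengths
A = np.zeros((len(pairs), len(dt)))
for k, (i, j) in enumerate(pairs):
    A[k, i:j] = dt[i:j]

# SVD-based pseudoinverse (minimum-norm least squares), as in SBAS.
v = np.linalg.pinv(A) @ d_obs                             # cm/day per interval

# Integrate to a cumulative displacement time series at acquisition dates.
disp = np.concatenate([[0.0], np.cumsum(v * dt)])
print(np.round(disp, 3))
```

With a connected interferogram network, the inversion recovers the velocity between each pair of adjacent acquisitions; disconnected subsets would make `A` rank-deficient, which is exactly the case the minimum-norm (SVD) solution of [2] is designed to handle.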

  2. Deriving a time series of 3D glacier motion to investigate interactions of a large mountain glacial system with its glacial lake: Use of Synthetic Aperture Radar Pixel Offset-Small Baseline Subset technique

    NASA Astrophysics Data System (ADS)

    Li, Jia; Li, Zhi-wei; Wu, Li-xin; Xu, Bing; Hu, Jun; Zhou, Yu-shan; Miao, Ze-lang

    2018-04-01

    We investigated the interactions of Lake Merzbacher with the Southern Inylchek Glacier (Central Tien Shan) using the Synthetic Aperture Radar (SAR) Pixel Offset-Small Baseline Subset (PO-SBAS) technique to derive a time series of three-dimensional (3D) glacier motion. The measurements of 3D glacier velocity were ∼17% more precise than in a previous study that did not use the SBAS estimation. The velocities of the glacier were up to 58 cm/day east, 70 cm/day north, and 113 cm/day vertically. Combining these data with Landsat images indicated that movement of the glacier is sensitive to changes in Lake Merzbacher. Specifically, the entry of more lake water into the glacier during the ablation season increased englacial ablation due to thermal erosion. Moreover, ice calving begins when the lake water gradually lifts the ice dam, and calving can cause greater loss of glacier mass than normal ablation. As the distributary accelerates to replenish the frontal mass loss, the mass loss further intensifies. A time series of the vertical velocity indicates that the glacier tongue has a huge englacial cavity, and we suggest that the lake outburst is directly related to the cracking of this cavity. Bursting of the lake triggers a mini-surge at the glacier tongue: the vertical velocity at the ice dam was ∼+60 cm/day before the lake outburst and ∼-113 cm/day afterwards. After drainage of the lake, flow velocities at the distributary do not decrease sharply, because the pre-drainage mass loss needs to be replenished by fast flow. Based on comparisons with previous measurements, our results indicate that the lake had an increasing influence on the glacier from 2005 to 2009. This study demonstrates that a time series of 3D glacier motion based on the PO-SBAS technique is effective for assessing the dynamics of a mountain glacial system and its interactions with its glacial lake.
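
The 3D decomposition behind pixel-offset analyses of this kind can be sketched as a small least-squares problem: range (line-of-sight) and azimuth offsets from two viewing geometries are inverted for east/north/up motion. The geometry, headings, and motion below are synthetic, and the sign convention is one common right-looking choice (conventions vary between processors), so this is an illustration of the principle rather than the paper's exact processing chain:

```python
import numpy as np

def los_row(inc_deg, head_deg):
    """Projection of (E, N, U) motion onto the SAR line of sight."""
    inc, head = np.radians(inc_deg), np.radians(head_deg)
    return np.array([-np.sin(inc) * np.cos(head),
                      np.sin(inc) * np.sin(head),
                      np.cos(inc)])

def azi_row(head_deg):
    """Projection onto the along-track (azimuth) direction; no U sensitivity."""
    head = np.radians(head_deg)
    return np.array([-np.sin(head), -np.cos(head), 0.0])

# Ascending (heading -13 deg) and descending (heading 193 deg) tracks,
# both at 34 deg incidence: four observations, three unknowns.
G = np.vstack([los_row(34, -13), azi_row(-13),
               los_row(34, 193), azi_row(193)])

d_true = np.array([0.58, 0.70, -1.13])   # E, N, U motion (synthetic units/day)
obs = G @ d_true                          # simulated offset measurements
d_hat, *_ = np.linalg.lstsq(G, obs, rcond=None)
print(np.round(d_hat, 3))                 # recovers the 3D motion
```

Applying this inversion pixel-by-pixel to a stack of small-baseline offset pairs, then linking the results in time as in SBAS, is the essence of a PO-SBAS-type time series.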

  3. Provider perspectives on the enabling environment required for skilled birth attendance: a qualitative study in western Nepal.

    PubMed

    Morgan, Alison; Jimenez Soto, Eliana; Bhandari, Gajananda; Kermode, Michelle

    2014-12-01

    In Nepal, where difficult geography and an under-resourced health system contribute to poor health care access, the government has increased the number of trained skilled birth attendants (SBAs) and posted them in newly constructed birthing centres attached to peripheral health facilities that are available to women 24 h a day. This study describes their views on their enabling environment. Qualitative methods included semi-structured interviews with 22 SBAs within Palpa district, a hill district in the Western Region of Nepal; a focus group discussion with ten SBA trainees, and in-depth interviews with five key informants. Participants identified the essential components of an enabling environment as: relevant training; ongoing professional support; adequate infrastructure, equipment and drugs; and timely referral pathways. All SBAs who practised alone felt unable to manage obstetric complications because quality management of life-threatening complications requires the attention of more than one SBA. Maternal health guidelines should account for the provision of an enabling environment in addition to the deployment of SBAs. In Nepal, referral systems require strengthening, and the policy of posting SBAs alone, in remote clinics, needs to be reconsidered to achieve the goal of reducing maternal deaths through timely management of obstetric complications. © 2014 John Wiley & Sons Ltd.

  4. Accelerated Training of Skilled Birth Attendants in a Marginalized Population on the Thai-Myanmar Border: A Multiple Methods Program Evaluation.

    PubMed

    White, Adrienne Lynne; Min, Thaw Htwe; Gross, Mechthild M; Kajeechiwa, Ladda; Thwin, May Myo; Hanboonkunupakarn, Borimas; Than, Hla Hla; Zin, Thet Wai; Rijken, Marcus J; Hoogenboom, Gabie; McGready, Rose

    2016-01-01

    To evaluate a skilled birth attendant (SBA) training program in a neglected population on the Thai-Myanmar border, we used multiple methods to show that refugee and migrant health workers can be given effective training in their own environment to become SBAs and teachers of SBAs. The loss of SBAs through resettlement to third countries necessitated urgent training of available workers to meet local needs. All results were obtained from student records of theory grades and clinical log books. Qualitative evaluation of both the SBA and teacher programs was obtained using semi-structured interviews with supervisors and teachers. We also reviewed perinatal indicators over an eight-year period, starting prior to the first training program until after the graduation of the fourth cohort of SBAs. Four SBA training programs scheduled between 2009 and 2015 resulted in 79/88 (90%) of students successfully completing a training program of 250 theory hours and 625 supervised clinical hours. All 79 students were able to: achieve pass grades on theory examination (median 80%, range [70-89]); obtain the required clinical experience within twelve months; achieve clinical competence to provide safe care during childbirth. In 2010-2011, five experienced SBAs completed a train-the-trainer (TOT) program and went on to facilitate further training programs. Perinatal indicators within Shoklo Malaria Research Unit (SMRU), such as place of birth, maternal and newborn outcomes, showed no significant differences before and after introduction of training or following graduate deployment in the local maternity units. Confidence, competence and teamwork emerged from qualitative evaluation by senior SBAs working with and supervising students in the clinics. We demonstrate that in resource-limited settings or in marginalized populations, it is possible to accelerate training of skilled birth attendants to provide safe maternity care. 
Education needs to be tailored to local needs to ensure evidence-based care of women and their families.

  5. Multi Temporal Interferometry as Tool for Urban Landslide Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Vicari, A.; Colangelo, G.; Famiglietti, N.; Cecere, G.; Stramondo, S.; Viggiano, D.

    2017-12-01

    Advanced Synthetic Aperture Radar Differential Interferometry (A-DInSAR) comprises Multi Temporal Interferometry (MTI) techniques suitable for monitoring slow-kinematics deformation phenomena. A-DInSAR methodologies include both coherence-based approaches, such as the Small Baseline Subset (SBAS) (Berardino et al., 2002; Lanari et al., 2004), and Persistent/Permanent Scatterers (PS) (Ferretti et al., 2001). Such techniques are capable of providing wide-area coverage (thousands of km2) and precise (mm-cm resolution), spatially dense information (from hundreds to thousands of measurement points/km2) on ground surface deformation. SBAS and PS have been applied to the town of Stigliano (MT) in the Basilicata Region (Southern Italy), where the social center was destroyed after the reactivation of a known landslide. The comparison of results has shown that these techniques are equivalent in terms of coherent areas and displacement patterns, although slightly different velocity values for individual points (-5/-25 mm/y for PS vs. -5/-15 mm/y for SBAS) have been pointed out. The differences are probably due to the scattering properties of the ground surface (e.g. Lauknes et al., 2010). Furthermore, a Robotic Explorer Total Monitoring Station (Leica Nova TM50), which measures distance with 0.6 mm resolution, has been installed on the crown of the landslide body. In particular, 20 points corresponding to those identified through the satellite techniques have been chosen, with a sampling time of 15 minutes. The displacement values obtained are in agreement with the results of the MTI analysis, showing that these techniques could be a useful tool in early-warning situations.

  6. Barriers to using skilled birth attendants’ services in mid- and far-western Nepal: a cross-sectional study

    PubMed Central

    2013-01-01

    Background Skilled birth attendants (SBAs) provide important interventions that improve maternal and neonatal health and reduce maternal and neonatal mortality. However, utilization and coverage of services by SBAs remain poor, especially in rural and remote areas of Nepal. This study examined the characteristics associated with utilization of SBA services in mid- and far-western Nepal. Methods This cross-sectional study examined three rural and remote districts of mid- and far-western Nepal (i.e., Kanchanpur, Dailekh and Bajhang), representing three ecological zones (southern plains [Tarai], hill and mountain, respectively) with low utilization of services by SBAs. Enumerators assisted a total of 2,481 women. All respondents had delivered a baby within the past 12 months. We used bivariate and multivariate analyses to assess the association between antenatal and delivery care visits and the women’s background characteristics. Results Fifty-seven percent of study participants had completed at least four antenatal care visits and 48% delivered their babies with the assistance of SBAs. Knowing the danger signs of pregnancy and delivery (e.g., premature labor, prolonged labor, breech delivery, postpartum hemorrhage, severe headache) associated positively with four or more antenatal care visits (OR = 1.71; 95% CI: 1.41-2.07). Living less than 30 min from a health facility associated positively with increased use of both antenatal care (OR = 1.44; 95% CI: 1.18-1.77) and delivery services (OR = 1.25; CI: 1.03-1.52). Four or more antenatal care visits was a determining factor for the utilization of SBAs. Conclusions Less than half of the women in our study delivered babies with the aid of SBAs, indicating a need to increase utilization of such services in rural and remote areas of Nepal. Distance from health facilities and inadequate transportation pose major barriers to the utilization of SBAs. 
Providing women with transportation funds before they go to a facility for delivery and managing transportation options will increase service utilization. Moreover, SBA utilization associates positively with women’s knowledge of pregnancy danger signs, wealth quintile, and completed antenatal care visits. Nepal’s health system must develop strategies that generate demand for SBAs and also reduce financial, geographic and cultural barriers to such services. PMID:24365039
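
The odds ratios with 95% confidence intervals reported in analyses like this one follow, in the unadjusted case, from a 2x2 table via the Wald interval on the log odds ratio. A minimal sketch with synthetic counts (not the study's data; the adjusted ORs in the abstract come from multivariate logistic regression, which this does not reproduce):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Synthetic example: SBA-assisted delivery by ANC attendance (hypothetical).
print(odds_ratio_ci(200, 150, 120, 180))
```

A CI excluding 1.0, as here, corresponds to the "statistically significant association" language used in the abstract.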

  7. Use of the Stanford Brief Activity Survey for physical activity assessment in postpartum Latinas: a validation study of a linguistically translated Spanish version.

    PubMed

    Joseph, Rodney P; Ainsworth, Barbara E; Vega-López, Sonia; Keller, Colleen S

    2014-01-01

    This study assessed the concurrent validity of the English and a linguistic Spanish translation of the Stanford Brief Activity Survey (SBAS) with pedometer-measured physical activity (PA) among postpartum Latinas. Latinas (n = 97) completed the SBAS in either English (n = 47) or Spanish (n = 50) and wore pedometers 7 days at three different assessment periods. The English version demonstrated significant trends (p < .01) for differentiating aerobic walking steps (AWS) and aerobic walking time (AWT) across SBAS intensity categories at two of the three assessment periods. The Spanish version showed marginally significant trends for differentiating AWS (p = .048) and AWT (p = .052) across SBAS intensity categories at only one assessment period. The English version of the SBAS is effective in assessing PA status among Latinas; however, the Spanish version indicates a need for research to further explore cultural and linguistic adaptations of the SBAS.

  8. Use of an electronic Partograph: feasibility and acceptability study in Zanzibar, Tanzania.

    PubMed

    Litwin, Lindsay Elizabeth; Maly, Christina; Khamis, Asma Ramadan; Hiner, Cyndi; Zoungrana, Jérémie; Mohamed, Khadija; Drake, Mary; Machaku, Michael; Njozi, Mustafa; Muhsin, Salhiya Ali; Kulindwa, Yusuph K; Gomez, Patricia P

    2018-05-09

    The ePartogram is a tablet-based application developed to improve care for women in labor by addressing documented challenges in partograph use. The application is designed to provide real-time decision support, improve data entry, and increase access to information for appropriate labor management. This study's primary objective was to evaluate the feasibility and acceptability of ePartogram use in resource-constrained clinical settings. The ePartogram was introduced at three facilities in Zanzibar, Tanzania. Following 3 days of training, skilled birth attendants (SBAs) were observed for 2 weeks using the ePartogram to monitor laboring women. During each observed shift, data collectors used a structured observation form to document SBA comfort, confidence, and ability to use the ePartogram. Results were analyzed by shift. Short interviews, conducted with SBAs (n = 82) after each of their first five ePartogram-monitored labors, detected differences over time. After the observation period, in-depth interviews were conducted (n = 15). A thematic analysis of interview transcripts was completed. Observations of 23 SBAs using the ePartogram to monitor 103 women over 84 shifts showed that the majority of SBAs (87-91%) completed each of four fundamental ePartogram tasks-registering a client, entering first and subsequent measurements, and navigating between screens-with ease or increasing ease on their first shift; this increased to 100% by the fifth shift. Nearly all SBAs (93%) demonstrated confidence and all SBAs demonstrated comfort in using the ePartogram by the fifth shift. SBAs expressed positive impressions of the ePartogram and found it efficient and easy to use, beginning with first client use. SBAs noted the helpfulness of auditory reminders (indicating that measurements were due) and visual alerts (signaling abnormal measurements). SBAs expressed confidence in their ability to interpret and act on these reminders and alerts. 
It is feasible and acceptable for SBAs to use the ePartogram to support labor management and care. With structured training and support during initial use, SBAs quickly became competent and confident in ePartogram use. Qualitative findings revealed that SBAs felt the ePartogram improved timeliness of care and supported decision-making. These findings point to the ePartogram's potential to improve quality of care in resource-constrained labor and delivery settings.

  9. ROLES OF TRADITIONAL BIRTH ATTENDANTS AND PERCEPTIONS ON THE POLICY DISCOURAGING HOME DELIVERY IN COASTAL KENYA.

    PubMed

    Wanyua, S; Kaneko, S; Karama, M; Makokha, A; Ndemwa, M; Kisule, A; Changoma, M; Goto, K; Shimada, M

    2014-03-01

    To describe the roles of Traditional Birth Attendants (TBAs), to determine the perceptions of TBAs and Skilled Birth Attendants (SBAs) towards the policy discouraging home delivery by TBAs, and to establish the working relationship between TBAs and SBAs in Kwale, Kenya. Community-based cross-sectional study. Mwaluphamba, Kinango and Golini locations of Kwale County, Kenya. Fifty-eight participants were involved in the study. Interviews were conducted with 22 TBAs and 8 SBAs, and 3 FGDs with 28 TBAs were carried out in July 2012. Roles of TBAs, policy awareness and support, and the working relationship between TBAs and SBAs. Before delivery, the main role of TBAs was checking the position of the baby in the womb (86%), while during delivery, the main role was stomach massage (64%). However, the majority (95%) of the TBAs did not provide any care after delivery. All SBAs and 59% of TBAs were aware of the policy, while 88% of SBAs and 36% of TBAs supported it. The working relationship between TBAs and SBAs mainly involved the referral of women to health facilities (HFs). Sometimes TBAs accompanied women to the HF, offering emotional support until after delivery. TBAs in Kwale have a big role to play, especially during the pregnancy and delivery periods. Awareness and support of the policy, as well as collaboration between SBAs and TBAs, should be enhanced in Kwale.

  10. Lower lattice thermal conductivity in SbAs than As or Sb monolayers: a first-principles study.

    PubMed

    Guo, San-Dong; Liu, Jiang-Tao

    2017-12-06

    Phonon transport in group-VA element (As, Sb and Bi) monolayer semiconductors has been widely investigated in theory, and, of them, monolayer Sb (antimonene) has recently been synthesized. In this work, phonon transport in monolayer SbAs is investigated with a combination of first-principles calculations and the linearized phonon Boltzmann equation. It is found that the lattice thermal conductivity of monolayer SbAs is lower than those of both monolayer As and Sb, and the corresponding sheet thermal conductance is 28.8 W K^-1 at room temperature. To understand why the lattice thermal conductivity of monolayer SbAs is lower than those of monolayer As and Sb, the group velocities and phonon lifetimes of monolayer As, SbAs and Sb are calculated. The calculated results show that the group velocities of monolayer SbAs lie between those of monolayer As and Sb, but that the phonon lifetimes of SbAs are smaller than those of both monolayer As and Sb. Hence, the low lattice thermal conductivity of monolayer SbAs is attributed to its very small phonon lifetimes. Unexpectedly, the ZA branch contributes very little to the total thermal conductivity, only 2.4%, which is obviously different from monolayer As and Sb, where its contribution is very large. This can be explained by the very small phonon lifetimes of the ZA branch of monolayer SbAs. The lower lattice thermal conductivity of monolayer SbAs compared to that of monolayer As or Sb can be understood through the alloying of As (Sb) with Sb (As), which should introduce phonon point-defect scattering. We also consider isotope and size effects on the lattice thermal conductivity. It is found that isotope scattering has a negligible effect, while reducing the characteristic length below 30 nm decreases the lattice thermal conductivity by about 47%. 
These results may offer perspectives on tuning the lattice thermal conductivity by the mixture of multiple elements for applications of thermal management and thermoelectricity, and motivate further experimental efforts to synthesize monolayer SbAs.
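
In the single-mode relaxation-time solution of the linearized phonon Boltzmann equation commonly used in such first-principles studies (stated here in its standard textbook form, not as a detail confirmed by the abstract), the lattice thermal conductivity is a sum over phonon modes, which makes the reasoning above explicit: with group velocities intermediate between As and Sb, the lower conductivity of SbAs must come from the smaller lifetimes.

```latex
\kappa^{\alpha\beta} \;=\; \frac{1}{N\,V} \sum_{\lambda} C_{\lambda}\, v_{\lambda}^{\alpha}\, v_{\lambda}^{\beta}\, \tau_{\lambda}
```

Here λ = (q, ν) runs over phonon wave vectors and branches, C_λ is the mode heat capacity, v_λ the group velocity, τ_λ the phonon lifetime, V the unit-cell volume and N the number of sampled q-points; alloy point-defect, isotope, and boundary scattering all act by shortening τ_λ.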

  11. Clinical audit to enhance safe practice of skilled birth attendants for the fetus with nuchal cord: evidence from a refugee and migrant cohort

    PubMed Central

    2014-01-01

    Background Current evidence for optimal management of a fetal nuchal cord detected after the head has birthed supports techniques that avoid ligation of the umbilical cord circulation. Routine audit found frequent unsafe management of nuchal cord by skilled birth attendants (SBAs) in migrant and refugee birth centres on the Thai-Burmese border. Method The audit cycle was used to enhance safe practice by SBAs for the fetus with nuchal cord. In the three birth centres, the action phase of the audit cycle was initially carried out by the doctor responsible for the site. Six months later, a registered midwife, present six days per week for three months in one birth facility, encouraged SBAs to facilitate birth with an intact umbilical circulation for nuchal cord. Rates of cord ligation before birth were recorded over a 24-month period (1-July-2011 to 30-June-2013), and in-depth interviews and a knowledge survey of the SBAs took place three months after the registered midwife's departure. Results The proportion of births with nuchal cord ligation declined significantly over the four six-month periods: from 15.9% (178/1123) before the action phase of the audit cycle, to 11.1% (107/966) during the action phase with the doctors, to 2.4% (28/1182) with the registered midwife, to 0.9% (9/999) from three to nine months after the departure of the registered midwife (p < 0.001, linear trend). Significant improvements in safe practice were observed at all three SMRU birth facilities. Knowledge of fetal nuchal cord amongst SBAs was sub-optimal and associated with fear and worry despite improved practice. The support of a registered midwife increased the confidence of SBAs. Conclusion The audit cycle and registered midwife interprofessional learning for SBAs led to a significant improvement in safe practice for the fetus with nuchal cord. 
The authors would encourage this type of learning in organizations with birth facilities on the Thai-Burmese border and in other similar resource limited settings with SBAs. PMID:24552462

  12. Utility of the Stanford Brief Activity Survey for physical activity assessment in postpartum Latinas: A validation study of a linguistically translated Spanish version

    PubMed Central

    Joseph, Rodney P.; Ainsworth, Barbara E.; Vega-López, Sonia; Keller, Colleen S.

    2014-01-01

    This study assessed the concurrent validity of the English and a linguistic Spanish translation of the Stanford Brief Activity Survey (SBAS) with pedometer measured physical activity (PA) among postpartum Latinas. Latinas (n = 97) completed the SBAS in either English (n = 47) or Spanish (n = 50) and wore pedometers seven days at three different assessment periods. The English version demonstrated significant trends (p < .01) for differentiating aerobic walking steps (AWS) and aerobic walking time (AWT) across SBAS intensity categories at two of the three assessment periods. The Spanish version showed marginally significant trends for differentiating AWS (p = .048) and AWT (p = .052) across SBAS intensity categories at only one assessment period. The English version of the SBAS is effective in assessing PA status among Latinas; however, the Spanish version indicates a need for research to further explore cultural and linguistic adaptations of the SBAS. PMID:25239211
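
The validity statistic used in this study is the Spearman rank correlation between SBAS intensity category and pedometer output. A self-contained sketch with synthetic data (no SciPy dependency; none of the numbers are the study's measurements):

```python
import numpy as np

def rankdata(x):
    """Ranks (1-based), with ties assigned their average rank."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x, kind="stable")
    ranks = np.empty(len(x))
    sx = x[order]
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and sx[j + 1] == sx[i]:
            j += 1                                  # extend over the tie group
        ranks[order[i:j + 1]] = (i + j) / 2 + 1     # average rank for the group
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical example: SBAS intensity class vs. aerobic walking time (min/wk).
sbas_category = [1, 2, 2, 3, 4, 5, 3, 1]
awt_minutes   = [5, 30, 25, 60, 90, 150, 45, 10]
print(round(spearman(sbas_category, awt_minutes), 2))
```

Spearman is the natural choice here because the SBAS yields ordered categories rather than interval-scaled scores, so only monotone association is assumed.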

  13. The SBAS Sentinel-1 Surveillance service for automatic and systematic generation of Earth surface displacement within the GEP platform.

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; De Luca, Claudio; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana

    2017-04-01

    The Geohazards Exploitation Platform (GEP) is an ESA activity of the Earth Observation (EO) ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. GEP aims at providing both on-demand processing services for scientific users of the geohazards community and an integration platform for new EO data analysis processors dedicated to scientists and other expert users. In the remote sensing scenario, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has literally flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). Moreover, the S1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use for the development of tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, Sentinel-1 satellites can be exploited to build up operational services for the generation of advanced interferometric products that can be very useful within risk management and natural hazard monitoring scenarios. Accordingly, in this work we present the activities carried out for the development, integration, and deployment of the SBAS Sentinel-1 Surveillance service of CNR-IREA within the GEP platform. This service is based on a parallel implementation of the SBAS approach, referred to as P-SBAS, able to run effectively in large distributed computing infrastructures (grid and cloud) and to allow for efficient computation of large SAR data sequences with advanced DInSAR approaches. In particular, the Surveillance service developed on the GEP platform consists of the systematic and automatic processing of Sentinel-1 data over selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. 
We built up a system that is automatically triggered by every new S1 acquisition over the AoI, once it is available in the S1 catalogue. Then, taking advantage of the SBAS results generated by previous runs of the service, the system processes the new acquisitions only, thus saving storage space and computing time and finally generating an updated SBAS time series. The same P-SBAS processor underlying the Surveillance service is also available through the GEP as a standard on-demand DInSAR service, thus allowing the scientific community to generate S1 SBAS time series over areas not covered by the Surveillance service itself. It is worth noting that the SBAS Sentinel-1 Surveillance service on GEP represents the core of the EPOSAR service, which will deliver S1 displacement time series of the Earth's surface on a regular basis for the European Plate Observing System (EPOS) Research Infrastructure community. In particular, the main goal of EPOSAR is to contribute, with advanced techniques and methods that have already demonstrated their effectiveness and relevance, to investigating the physical processes controlling earthquakes, volcanic eruptions and unrest episodes, as well as those driving tectonics and Earth surface dynamics.
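The incremental update logic described in this record can be sketched in a few lines of Python. This is an illustrative mock-up, not the GEP/EPOSAR implementation: the function names (`query_catalogue`, `update_time_series`) and the stubbed catalogue contents are invented, and the actual P-SBAS processing is reduced to extending a list of epochs.

```python
from datetime import date

def query_catalogue(aoi, after):
    """Stub for an S1 catalogue query: returns acquisition dates newer
    than `after` over the area of interest (hard-coded here)."""
    archive = [date(2017, 1, 3), date(2017, 1, 9), date(2017, 1, 15)]
    return [d for d in archive if d > after]

def update_time_series(state, aoi):
    """Process only acquisitions not seen in previous runs, reusing the
    stored SBAS state, as the Surveillance service does."""
    new_epochs = query_catalogue(aoi, after=state["last_epoch"])
    if not new_epochs:
        return state  # nothing to do until the next trigger
    state["epochs"].extend(new_epochs)   # stand-in for the P-SBAS update
    state["last_epoch"] = new_epochs[-1]
    return state

state = {"epochs": [date(2016, 12, 28)], "last_epoch": date(2016, 12, 28)}
state = update_time_series(state, aoi="Campi Flegrei")
print(len(state["epochs"]))  # 4
```

The point of the design is that each trigger touches only the delta since the last run, which is what keeps storage and computing costs bounded as the archive grows.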

  14. Validity and reliability of two brief physical activity questionnaires among Spanish-speaking individuals of Mexican descent

    PubMed Central

    2014-01-01

Background Mexican Americans are the largest minority group in the US and suffer disproportionate rates of diseases related to the lack of physical activity (PA). Since many of these Mexican Americans are Spanish-speaking, it is important to validate a Spanish-language physical activity assessment tool that can be used in epidemiology as well as clinical practice. This study explored the utility of two Spanish-translated physical activity questionnaires, the Stanford Brief Activity Survey (SBAS) and the Rapid Assessment of Physical Activity (RAPA), for use among Spanish-speaking Mexican Americans. Methods Thirty-four participants (13 M, 21 F; 37.6 ± 9.5 y) completed each of the two PA surveys twice, one week apart. During that week 31 participants also wore an ActiGraph GT1M accelerometer for 7 days to objectively measure PA. Minutes of moderate and vigorous PA (MVPA) were determined from the accelerometer data using Freedson and Matthews cut points. Results Validity, determined by Spearman correlation coefficients between questionnaire scores and minutes of ActiGraph-measured MVPA, was 0.38 and 0.45 for the SBAS and RAPA, respectively. Test-retest reliability was 0.61 for the SBAS and 0.65 for the RAPA. Sensitivity and specificity were 0.60 and 0.47 for the SBAS, and 0.73 and 0.75 for the RAPA. Participants who were classified as meeting the 2008 National Physical Activity Guidelines by the RAPA engaged in significantly (p < 0.05) more minutes of MVPA than those who were not, while there were no significant differences in minutes of MVPA classified by the SBAS. Conclusions The SBAS and the RAPA are both reasonably valid measures for quickly assessing PA and determining compliance with the PA guidelines in Spanish-speaking Mexican Americans. Although the two questionnaires had comparable reliability, the RAPA was better able to distinguish between those who met and did not meet National PA Guidelines. PMID:24410978
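The sensitivity and specificity figures reported here follow directly from a 2x2 classification table (questionnaire vs. accelerometer guideline status). The underlying counts are not given in the abstract, so the ones below are hypothetical, chosen only to reproduce the RAPA figures:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = true positives / all actually active;
    specificity = true negatives / all actually inactive."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts consistent with the reported RAPA values (0.73, 0.75):
sens, spec = sens_spec(tp=11, fn=4, tn=12, fp=4)
print(round(sens, 2), round(spec, 2))  # 0.73 0.75
```

Here "positive" means classified as meeting the 2008 PA Guidelines, with accelerometer-measured MVPA as the reference standard.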

  15. The storm-time assessment of GNSS-SBAS performance within low latitude African region using a testbed-like platform

    NASA Astrophysics Data System (ADS)

    Abe, O. E.; Paparini, C.; Ngaya, R. H.; Otero Villamide, X.; Radicella, S. M.; Nava, B.

    2017-09-01

A Satellite Based Augmentation System (SBAS) is designed to improve Global Navigation Satellite Systems (GNSS) in terms of integrity, accuracy, availability and continuity. The main limitation to SBAS performance optimization is the ionosphere, and this is most critical at low latitudes. During geomagnetically disturbed periods, the role of storm-time winds is important because they modify the atmospheric composition toward low latitudes. An index of ionospheric disturbance, the relative percentage deviation of the vertical Total Electron Content (TEC) from the quiet-time level (DvTEC), was evaluated at each station to study the positive and negative phases of the geomagnetic storms. The rate of change of TEC index (ROTI) over all the GNSS stations was estimated to evaluate equatorial ionospheric gradients and irregularities. The study shows that positive deviations are more frequent than negative ones. The availability map, which is the mean of the combined Vertical Protection Level (VPL) and Horizontal Protection Level (HPL), is used to assess SBAS performance. The cases of moderate and minor storms studied during July and October 2013 showed that SBAS performance during disturbed periods depends on the local time at which the storm occurs, on geographic longitude, and on other phenomena that need further study. Of the seven geomagnetic storms considered, three were associated with good SBAS performance exceeding the monthly average of the availability map, three reduced the system performance below the monthly average, and one had no effect on SBAS performance with respect to the monthly average. The present study indicates that ROTI is a better proxy than geomagnetic indices for assessing storm-time effects on GNSS-SBAS performance.
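The two ionospheric indices used in this record have simple definitions that can be sketched as follows. DvTEC is the percentage deviation of vertical TEC from a quiet-time reference; ROTI is the standard deviation of the rate of TEC change (ROT) over a short window. The sampling interval and window length below (30 s, 5 min) are common choices, not values stated in the abstract, and the TEC series is synthetic:

```python
import numpy as np

def dvtec(vtec, vtec_quiet):
    """Relative percentage deviation of vertical TEC from the quiet level."""
    return 100.0 * (vtec - vtec_quiet) / vtec_quiet

def roti(stec, dt_min=0.5, window=10):
    """ROTI: std of the rate of TEC (TECU/min) over `window` samples
    (10 samples at 30 s spacing = a 5-minute window)."""
    rot = np.diff(stec) / dt_min
    n = len(rot) // window
    return np.array([rot[i*window:(i+1)*window].std() for i in range(n)])

t = np.arange(0, 60, 0.5)               # one hour, 30 s sampling (minutes)
stec = 20 + 0.01 * t + 0.5 * np.sin(t)  # synthetic slant TEC (TECU)
print(dvtec(vtec=25.0, vtec_quiet=20.0))  # 25.0, i.e. a +25% deviation
print(roti(stec).shape)
```

High ROTI values flag the equatorial irregularities that degrade SBAS ionospheric corrections, which is why the record proposes ROTI over global geomagnetic indices.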

  16. High Resolution Deformation Time Series Estimation for Distributed Scatterers Using Terrasar-X Data

    NASA Astrophysics Data System (ADS)

    Goel, K.; Adam, N.

    2012-07-01

In recent years, several SAR satellites such as TerraSAR-X, COSMO-SkyMed and Radarsat-2 have been launched. These satellites provide high resolution data suitable for sophisticated interferometric applications. With shorter repeat cycles, smaller orbital tubes and higher bandwidths, deformation time series analysis of distributed scatterers (DSs) is now supported by a practical data basis. Techniques for exploiting DSs in non-urban (rural) areas include the Small Baseline Subset Algorithm (SBAS). However, it involves spatial phase unwrapping, and phase unwrapping errors are typically encountered in rural areas and are difficult to detect. In addition, the SBAS technique involves rectangular multilooking of the differential interferograms to reduce phase noise, resulting in a loss of resolution and the superposition of different objects on the ground. In this paper, we introduce a new approach for deformation monitoring with a focus on DSs, wherein there is no need to unwrap the differential interferograms and the deformation is mapped at object resolution. It is based on a robust, object-adaptive parameter estimation using single-look differential interferograms, where the local tilts of deformation velocity and local slopes of residual DEM in the range and azimuth directions are estimated. We present here the technical details and a processing example of this newly developed algorithm.
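The key idea of estimating deformation parameters from wrapped (never unwrapped) interferograms can be illustrated with a single-pixel periodogram: velocity and residual DEM height are found by maximizing temporal coherence over a search grid, so 2π ambiguities never need resolving. This is a strongly simplified sketch of that principle, not the paper's object-adaptive algorithm; the geometry values (wavelength, range, incidence angle) and baselines are illustrative:

```python
import numpy as np

lam, R, theta = 0.031, 600e3, np.deg2rad(35)   # wavelength (m), range (m), incidence
dt = np.array([0.03, 0.09, 0.12, 0.21, 0.30])  # temporal baselines (years)
bperp = np.array([40., -85., 120., -30., 60.]) # perpendicular baselines (m)

def model_phase(v, dh):
    """Interferometric phase from linear deformation v and DEM error dh."""
    return (4*np.pi/lam) * (dt*v + bperp*dh/(R*np.sin(theta)))

# Simulate wrapped phases for v = 5 mm/yr, dh = 2 m.
phi = np.angle(np.exp(1j * model_phase(0.005, 2.0)))

# Grid search: the (v, dh) pair maximizing temporal coherence wins.
v_grid = np.linspace(-0.02, 0.02, 81)
h_grid = np.linspace(-10, 10, 81)
coh = np.array([[np.abs(np.mean(np.exp(1j*(phi - model_phase(v, h)))))
                 for h in h_grid] for v in v_grid])
iv, ih = np.unravel_index(coh.argmax(), coh.shape)
print(round(v_grid[iv], 4), round(h_grid[ih], 2))  # recovers 0.005 and 2.0
```

Because coherence is computed on complex phasors, the wrapped and unwrapped phases give the same objective, which is exactly why no spatial unwrapping is required.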

  17. Factors affecting deliveries attended by skilled birth attendants in Bangladesh.

    PubMed

    Kibria, Gulam Muhammed Al; Ghosh, Swagata; Hossen, Shakir; Barsha, Rifath Ara Alam; Sharmeen, Atia; Uddin, S M Iftekhar

    2017-01-01

The presence of skilled birth attendants (SBAs) is crucial in childbirth to reduce the maternal mortality ratio (MMR) and to achieve the maternal mortality target of the United Nations' Sustainable Development Goals (SDGs). The aim of this study was to investigate the factors related to childbirths attended by SBAs in Bangladesh. Data from the Bangladesh Demographic and Health Survey (2014 BDHS) were analyzed. Logistic regression was applied to calculate crude odds ratios (CORs), adjusted odds ratios (AORs), 95% confidence intervals (CIs), and p-values. In Bangladesh, 35.9% of deliveries were attended by SBAs, and 44.2% of those women received at least one antenatal check-up by a skilled provider. Deliveries by SBAs were less than 50% of total deliveries in all divisions except Khulna. Known pregnancy complications (AOR: 1.2; 95% CI: 1.1-1.4), higher level of education in both women (AOR: 1.7; 95% CI: 1.2-2.3) and their husbands (AOR: 1.8; 95% CI: 1.3-2.4), receiving antenatal care (ANC) by a skilled provider during the pregnancy period (AOR: 1.5; 95% CI: 1.1-2.1), and higher wealth quintiles (AOR: 3.4; 95% CI: 2.5-4.7) were all significantly associated with an increased likelihood of a delivery by SBAs (p < 0.05). In contrast, women living in rural areas (AOR: 0.7; 95% CI: 0.6-0.8) and the Sylhet Division (AOR: 0.4; 95% CI: 0.3-0.5) were less likely to be delivered by SBAs. To achieve the target of the Government of Bangladesh - 50% of deliveries to be attended by SBAs - it is important to increase ANC services and awareness programs in all seven divisions of Bangladesh. Special focus on rural areas is also required to meet this target. A new study should be conducted to explore the unexamined factors associated with the presence of SBAs during childbirth.
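The distinction between the crude and adjusted odds ratios reported here is that the latter condition on other covariates. The study does this with multivariable logistic regression; a simpler way to see the same idea is Mantel-Haenszel adjustment across strata, sketched below on invented counts (not BDHS data), with wealth as the hypothetical confounder:

```python
# Each stratum's 2x2 table: (SBA & ANC, SBA & no ANC, no SBA & ANC, no SBA & no ANC)
strata = {"poorer":  (30, 40, 70, 160),
          "wealthier": (120, 60, 80, 90)}

def crude_or(tables):
    """Odds ratio from the pooled table, ignoring the stratifying covariate."""
    a = sum(t[0] for t in tables); b = sum(t[1] for t in tables)
    c = sum(t[2] for t in tables); d = sum(t[3] for t in tables)
    return (a*d) / (b*c)

def mh_or(tables):
    """Mantel-Haenszel odds ratio: a within-stratum (adjusted) estimate."""
    num = sum(a*d/(a+b+c+d) for a, b, c, d in tables)
    den = sum(b*c/(a+b+c+d) for a, b, c, d in tables)
    return num / den

tables = list(strata.values())
print(round(crude_or(tables), 2), round(mh_or(tables), 2))  # 2.5 2.03
```

The gap between the two estimates (2.5 vs. about 2.0) is the confounding that adjustment removes; in the logistic model, the AOR is simply exp of the fitted coefficient.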

  18. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain, which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which were acquired over a large area of Southern California (US) that extends for about 90,000 km2. This input dataset was processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which permits accounting for possible regional trends not easily detectable by DInSAR and referring the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus permitting the extension of DInSAR analyses to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.
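The ascending/descending combination step mentioned above reduces, per pixel, to a small linear system: each LOS displacement is the projection of the (east, up) motion onto the look vector, and two viewing geometries suffice to invert for both components (north motion is neglected, as is standard for this decomposition). The incidence angles below are illustrative ENVISAT-like values, not the paper's processing parameters:

```python
import numpy as np

theta_asc, theta_dsc = np.deg2rad(23), np.deg2rad(23)  # incidence angles

# Projection rows [east, up]: the east component enters with opposite
# sign for ascending vs. descending right-looking geometries.
A = np.array([[-np.sin(theta_asc), np.cos(theta_asc)],
              [ np.sin(theta_dsc), np.cos(theta_dsc)]])

d_true = np.array([0.02, -0.05])     # 2 cm eastward, 5 cm subsidence (m)
d_los = A @ d_true                   # what each track would measure
d_est = np.linalg.solve(A, d_los)    # invert the 2x2 system per pixel
print(np.round(d_est, 3))            # recovers [0.02, -0.05]
```

With equal incidence angles the system is well conditioned for east and up, but nearly blind to north-south motion, which is why that component is dropped.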

  19. Barriers to using skilled birth attendants' services in mid- and far-western Nepal: a cross-sectional study.

    PubMed

    Choulagai, Bishnu; Onta, Sharad; Subedi, Narayan; Mehata, Suresh; Bhandari, Gajananda P; Poudyal, Amod; Shrestha, Binjwala; Mathai, Matthews; Petzold, Max; Krettek, Alexandra

    2013-12-23

    Skilled birth attendants (SBAs) provide important interventions that improve maternal and neonatal health and reduce maternal and neonatal mortality. However, utilization and coverage of services by SBAs remain poor, especially in rural and remote areas of Nepal. This study examined the characteristics associated with utilization of SBA services in mid- and far-western Nepal. This cross-sectional study examined three rural and remote districts of mid- and far-western Nepal (i.e., Kanchanpur, Dailekh and Bajhang), representing three ecological zones (southern plains [Tarai], hill and mountain, respectively) with low utilization of services by SBAs. Enumerators assisted a total of 2,481 women. All respondents had delivered a baby within the past 12 months. We used bivariate and multivariate analyses to assess the association between antenatal and delivery care visits and the women's background characteristics. Fifty-seven percent of study participants had completed at least four antenatal care visits and 48% delivered their babies with the assistance of SBAs. Knowing the danger signs of pregnancy and delivery (e.g., premature labor, prolonged labor, breech delivery, postpartum hemorrhage, severe headache) associated positively with four or more antenatal care visits (OR = 1.71; 95% CI: 1.41-2.07). Living less than 30 min from a health facility associated positively with increased use of both antenatal care (OR = 1.44; 95% CI: 1.18-1.77) and delivery services (OR = 1.25; CI: 1.03-1.52). Four or more antenatal care visits was a determining factor for the utilization of SBAs. Less than half of the women in our study delivered babies with the aid of SBAs, indicating a need to increase utilization of such services in rural and remote areas of Nepal. Distance from health facilities and inadequate transportation pose major barriers to the utilization of SBAs. 
Providing women with transportation funds before they go to a facility for delivery and managing transportation options will increase service utilization. Moreover, SBA utilization associates positively with women's knowledge of pregnancy danger signs, wealth quintile, and completed antenatal care visits. Nepal's health system must develop strategies that generate demand for SBAs and also reduce financial, geographic and cultural barriers to such services.
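Figures like "OR = 1.44; 95% CI: 1.18-1.77" in this record come from a 2x2 table via the cross-product odds ratio with a Wald interval on the log scale. The counts below are illustrative, not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio ad/bc with a 95% Wald confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# a: lives <30 min from facility & >=4 ANC visits; b: <30 min & <4 visits;
# c, d: the same split for women living >=30 min away (hypothetical counts).
or_, lo, hi = odds_ratio_ci(a=420, b=300, c=580, d=600)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.45 1.2 1.75
```

An interval that excludes 1.0, as here, corresponds to the "associated positively" statements in the abstract.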

  20. Ground deformation monitoring using small baseline DInSAR technique: A case study in Taiyuan City from 2003 to 2009

    USGS Publications Warehouse

    Wu, H.-A.; Zhang, Y.-H.; Chen, X.-Y.; Lu, T.; Du, J.; Sun, Z.-H.; Sun, G.-T.

    2011-01-01

DInSAR techniques based on time series of SAR images, such as the permanent scatterers (PS) method, the small baseline subsets (SBAS) method and the coherent targets (CT) method, have become very popular in recent years for monitoring slow ground deformation. By taking advantage of the PS and CT methods, in this paper the small baseline DInSAR technique is used to investigate the ground deformation of Taiyuan City, Shanxi Province, from 2003 to 2009 using 23 ENVISAT ASAR images. The experimental results demonstrate that: (1) during this period, four significant subsidence centers developed in Taiyuan, namely Xiayuan, Wujiabu, Xiaodian and Sunjiazhai. The largest subsidence center is Sunjiazhai, with an average subsidence rate of -77.28 mm/a; (2) the subsidence of the old center Wanbolin has slowed down, and the subsidence in the northern region has stopped, with some areas even rebounding; (3) the change of subsidence centers indicates that the control measures of "closing wells and reducing exploitation" taken by the Taiyuan government have achieved initial effects; (4) the experimental results have been validated with leveling data, with an accuracy of 2.90 mm, which shows that the small baseline DInSAR technique can be used to monitor urban ground deformation.
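The core inversion step shared by small-baseline methods like the one in this record (and described for SBAS in the first record above) can be sketched compactly: each unwrapped interferogram equals the sum of mean velocities over the acquisition intervals it spans, and the resulting linear system is solved with the SVD (L2 minimum-norm), which also links otherwise disconnected subsets. Times, pairs and velocities below are synthetic:

```python
import numpy as np

t = np.array([0.0, 0.3, 0.6, 1.0, 1.4])       # acquisition times (years)
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3), (3, 4)]  # small-baseline pairs

# Design matrix: row k holds the interval lengths spanned by pair k.
B = np.zeros((len(pairs), len(t) - 1))
for k, (i, j) in enumerate(pairs):
    B[k, i:j] = np.diff(t)[i:j]

v_true = np.array([10., 10., -5., 0.])        # mm/yr per interval
dphi = B @ v_true                              # simulated unwrapped interferograms
v_est = np.linalg.pinv(B) @ dphi               # SVD-based (minimum-norm L2) inversion
print(np.allclose(v_est, v_true))              # True
```

With this pair set the system has full rank, so the recovery is exact; with disjoint baseline subsets, the pseudoinverse returns the minimum-norm solution, which is precisely the role SVD plays in the classical SBAS algorithm.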

  1. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    PubMed

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

The purpose of this study was to evaluate the reproducibility of a new software-based analysis system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema and to compare it to visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers using visual interpretation (VI) and a software-based analysis system (SBAS). For SBAS, PMOD version 3.4 (PMOD Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung, lobe ratio of counts and ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for count/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CT in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.

  2. Community and provider perceptions of traditional and skilled birth attendants providing maternal health care for pastoralist communities in Kenya: a qualitative study.

    PubMed

    Byrne, Abbey; Caulfield, Tanya; Onyo, Pamela; Nyagero, Josephat; Morgan, Alison; Nduba, John; Kermode, Michelle

    2016-03-01

    Kenya has a high burden of maternal and newborn mortality. Consequently, the Government of Kenya introduced health system reforms to promote the availability of skilled birth attendants (SBAs) and proscribed deliveries by traditional birth attendants (TBAs). Despite these changes, only 10% of women from pastoralist communities are delivered by an SBA in a health facility, and the majority are delivered by TBAs at home. The aim of this study is to better understand the practices and perceptions of TBAs and SBAs serving the remotely located, semi-nomadic, pastoralist communities of Laikipia and Samburu counties in Kenya, to inform the development of an SBA/TBA collaborative care model. This descriptive qualitative study was undertaken in 2013-14. We conducted four focus group discussions (FGDs) with TBAs, three with community health workers, ten with community women, and three with community men. In-depth interviews were conducted with seven SBAs and eight key informants. Topic areas covered were: practices and perceptions of SBAs and TBAs; rewards and challenges; managing obstetric complications; and options for SBA/TBA collaboration. All data were translated, transcribed and thematically analysed. TBAs are valued and accessible members of their communities who adhere to traditional practices and provide practical and emotional support to women during pregnancy, delivery and post-partum. Some TBA practices are potentially harmful to women e.g., restricting food intake during pregnancy, and participants recognised that TBAs are unable to manage obstetric complications. SBAs are acknowledged as having valuable technical skills and resources that contribute to safe and clean deliveries, especially in the event of complications, but there is also a perception that SBAs mistreat women. Both TBAs and SBAs identified a range of challenges related to their work, and instances of mutual respect and informal collaborations between SBAs and TBAs were described. 
These findings clearly indicate that an SBA/TBA collaborative model of care consistent with Kenyan Government policy is a viable proposition. The transition from traditional birth to skilled birth attendance among the pastoralist communities of Laikipia and Samburu is going to be a gradual one, and an interim collaborative model is likely to increase the proportion of SBA assisted deliveries, improve obstetric outcomes, and facilitate the transition.

  3. Structured Programming on the 380-Z.

    ERIC Educational Resources Information Center

    Horton, Graeme

    1983-01-01

    With the production of SBAS, a combination of language and machine allowing for a disciplined and error-free approach to teaching programming is available. Instructional strategies for use with SBAS and basic concepts (decisions, repetitions/iterations, actions/processes) are discussed. Sample flow charts and program listings are provided. (JN)

  4. Trauma-Focused Early Intensive Cognitive Behavioral Intervention (TF-EICBI) in children and adolescent survivors of suicide bombing attacks (SBAs). A preliminary study.

    PubMed

    Leor, Agnes; Dolberg, Orna T; Eshel, Shira Pagorek; Yagil, Yaron; Schreiber, Shaul

    2013-01-01

To describe and evaluate the impact of an early intervention (Trauma-Focused Early Intensive Cognitive Behavioral Intervention, TF-EICBI) in children and adolescents who were victims of suicide bombing attacks (SBAs) in Israel. Description of an intervention and preliminary experience in its use. An acute trauma center of a Child and Adolescent Psychiatric Unit in a Department of Psychiatry of a university-affiliated medical center. Ten children and adolescents who were victims of SBAs and underwent early intervention (EIG) were compared to 11 adolescent victims who received no intervention (NEIG). The EIG included all the child and adolescent survivors of various SBAs that had occurred during 1 year who presented to our hospital after the TF-EICBI was implemented (June 2001). The NEIG comprised all adolescent girls <18 years of age at follow-up who survived one SBA (at the "Dolphinarium" Discotheque) before the TF-EICBI was available. At the time of the 1-year post-SBA follow-up, all 21 subjects were assessed with the Structured Clinical Interview for DSM-III-R Axis I Disorders (SCID) and the Child Behavior Checklist (CBCL). One (10 percent) EIG subject and four (36.4 percent) NEIG subjects had post-traumatic stress disorder. The mean CBCL total score and most of the mean CBCL behavior problem scores were significantly higher (p < 0.021) among the NEIG members. The intervention was effective in preventing and lowering mental morbidity in children and adolescents after SBAs.

  5. Validity and reliability of two brief physical activity questionnaires among Spanish-speaking individuals of Mexican descent.

    PubMed

    Vega-López, Sonia; Chavez, Adrian; Farr, Kristin J; Ainsworth, Barbara E

    2014-01-13

Mexican Americans are the largest minority group in the US and suffer disproportionate rates of diseases related to the lack of physical activity (PA). Since many of these Mexican Americans are Spanish-speaking, it is important to validate a Spanish-language physical activity assessment tool that can be used in epidemiology as well as clinical practice. This study explored the utility of two Spanish-translated physical activity questionnaires, the Stanford Brief Activity Survey (SBAS) and the Rapid Assessment of Physical Activity (RAPA), for use among Spanish-speaking Mexican Americans. Thirty-four participants (13 M, 21 F; 37.6 ± 9.5 y) completed each of the two PA surveys twice, one week apart. During that week 31 participants also wore an ActiGraph GT1M accelerometer for 7 days to objectively measure PA. Minutes of moderate and vigorous PA (MVPA) were determined from the accelerometer data using Freedson and Matthews cut points. Validity, determined by Spearman correlation coefficients between questionnaire scores and minutes of ActiGraph-measured MVPA, was 0.38 and 0.45 for the SBAS and RAPA, respectively. Test-retest reliability was 0.61 for the SBAS and 0.65 for the RAPA. Sensitivity and specificity were 0.60 and 0.47 for the SBAS, and 0.73 and 0.75 for the RAPA. Participants who were classified as meeting the 2008 National Physical Activity Guidelines by the RAPA engaged in significantly (p < 0.05) more minutes of MVPA than those who were not, while there were no significant differences in minutes of MVPA classified by the SBAS. The SBAS and the RAPA are both reasonably valid measures for quickly assessing PA and determining compliance with the PA guidelines in Spanish-speaking Mexican Americans. Although the two questionnaires had comparable reliability, the RAPA was better able to distinguish between those who met and did not meet National PA Guidelines.

  6. Unsupervised SBAS-DInSAR Processing of Space-borne SAR data for Earth Surface Displacement Time Series Generation

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

During the last 25 years, Differential Synthetic Aperture Radar Interferometry (DInSAR) has played an important role in understanding the Earth's surface deformation and its dynamics. In particular, the large collections of SAR data acquired by a number of space-borne missions (ERS, ENVISAT, ALOS, RADARSAT, TerraSAR-X, COSMO-SkyMed) have pushed toward the development of advanced DInSAR techniques for monitoring the temporal evolution of ground displacements with a high spatial density. Moreover, the advent of the Copernicus Sentinel-1 (S1) constellation is providing a further increase in the SAR data flow available to the Earth science community, thanks to its global coverage strategy and free, open-access data policy. Therefore, managing and storing such a huge amount of data, processing it in an efficient way, and maximizing the exploitation of the available archives are becoming high-priority issues. In this work we present some recent advances in the DInSAR field for dealing with the effective exploitation of present and future SAR data archives. In particular, an efficient parallel SBAS implementation (namely P-SBAS) that benefits from high performance computing is proposed. Then, the P-SBAS migration to the emerging Cloud Computing paradigm is shown, together with extensive tests carried out in the Amazon Elastic Compute Cloud (EC2) infrastructure. Finally, the integration of the P-SBAS processing chain within the ESA Geohazards Exploitation Platform (GEP), for setting up operational on-demand and systematic web tools, open to every user, aimed at automatically processing stacks of SAR data for the generation of SBAS displacement time series, is also illustrated. A number of experimental results obtained by using the ERS, ENVISAT and S1 data in areas characterized by volcanic, seismic and anthropogenic phenomena will be shown.
This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  7. 3D displacement time series in the Afar rift zone computed from SAR phase and amplitude information

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manconi, Andrea

    2013-04-01

Large and rapid deformations, such as those caused by earthquakes, eruptions, and landslides, cannot be fully measured by using standard DInSAR applications. Indeed, the phase information often degrades and some areas of the interferograms are affected by high fringe rates, leading to difficulties in the phase unwrapping and/or to complete loss of coherence due to significant misregistration errors. This limitation can be overcome by exploiting the SAR image amplitude information instead of the phase, and by calculating the Pixel-Offset (PO) field from SAR image pairs, for both the range and azimuth directions. Moreover, it is possible to combine the PO results by following the same rationale as the SBAS technique, to finally retrieve offset-based deformation time series. This technique, named PO-SBAS, retrieves the deformation field in areas affected by very large displacements at an accuracy that, for ENVISAT data, corresponds to 30 cm and 15 cm for range and azimuth, respectively [1]. Moreover, the combination of SBAS and PO-SBAS time series can help to better study and model deformation phenomena characterized by spatial and temporal heterogeneities [2]. The Dabbahu rift segment of the Afar depression has been active since 2005, when a 2.5 km3 dyke intrusion and hundreds of earthquakes marked the onset of a rifting episode which continues to date. The ENVISAT satellite has repeatedly imaged the Afar depression since 2003, generating a large SAR archive. In this work, we study the Afar rift region deformations by using both the phase and amplitude information of several sets of SAR images acquired from ascending and descending ENVISAT tracks. We combined sets of small baseline interferograms through the SBAS algorithm, and we generated both ground deformation maps and time series along the satellite Line-Of-Sight (LOS). In areas where the deformation gradient causes loss of coherence, we retrieve the displacement field through the amplitude information.
Furthermore, we could also retrieve the full 3D deformation field by considering the North-South displacement component obtained from the azimuth PO information. The combination of SBAS and PO-SBAS information makes it possible to better retrieve and constrain the full deformation field due to repeated intrusions and fault movements, as well as the magma movements from individual magma chambers. [1] Casu, F., A. Manconi, A. Pepe and R. Lanari, 2011. Deformation time-series generation in areas characterized by large displacement dynamics: the SAR amplitude Pixel-Offset SBAS technique, IEEE Transactions on Geoscience and Remote Sensing. [2] Manconi, A. and F. Casu, 2012. Joint analysis of displacement time series retrieved from SAR phase and amplitude: impact on the estimation of volcanic source parameters, Geophysical Research Letters, doi:10.1029/2012GL052202.
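The pixel-offset measurement underlying PO-SBAS rests on locating the peak of the cross-correlation between amplitude patches of two acquisitions. The toy sketch below works at integer-pixel precision on a synthetic patch (real PO processing oversamples the correlation surface for subpixel accuracy, which is how the quoted ~1/30-of-a-pixel-class accuracies are reached):

```python
import numpy as np

rng = np.random.default_rng(1)
master = rng.random((64, 64))                   # synthetic amplitude patch
shift = (3, 5)                                  # true (row, col) offset
slave = np.roll(master, shift, axis=(0, 1))     # circularly shifted copy

# Cross-correlate via the FFT (circular correlation, matching np.roll).
xc = np.fft.ifft2(np.fft.fft2(master).conj() * np.fft.fft2(slave)).real
est = np.unravel_index(xc.argmax(), xc.shape)   # correlation peak location
print(int(est[0]), int(est[1]))                 # 3 5
```

Because the measurement uses amplitude only, it keeps working where high fringe rates destroy the interferometric phase, which is exactly the regime this record targets.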

  8. Dynamics of Kilauea's Magmatic System Imaged Using a Joint Analysis of Geodetic and Seismic Data

    NASA Astrophysics Data System (ADS)

    Wauthier, C.; Roman, D. C.; Poland, M. P.; Fukushima, Y.; Hooper, A. J.

    2012-12-01

Nowadays, Interferometric Synthetic Aperture Radar (InSAR) is commonly used to study a wide range of active volcanic areas. InSAR provides high-spatial-resolution measurements of surface deformation with centimeter-scale accuracy. At Kilauea Volcano, Hawai'i, InSAR shows complex processes that are not well constrained by GPS data (which have relatively poor spatial resolution). However, GPS data have higher temporal resolution than InSAR data, so the two datasets are complementary. To overcome some of the limitations of conventional InSAR, which are mainly induced by temporal decorrelation and by topographic, orbital and atmospheric delays, a Multi-Temporal InSAR (MT-InSAR) approach can be used. MT-InSAR techniques involve the processing of multiple SAR acquisitions over the same area. Two classes of MT-InSAR algorithms are defined: the persistent scatterers (PS) and small baseline (SBAS) methods. Each method is designed for a specific type of scattering mechanism. A PS pixel is a pixel in which a single scatterer dominates, while the contributions from other scatterers are negligible. An SBAS pixel is a pixel that includes distributed scatterers, whose phase shows little decorrelation over short time periods. Here, we apply the StaMPS ("Stanford Method for Persistent Scatterers") technique, which incorporates both a PS and an SBAS approach, to ENVISAT and ALOS datasets acquired from 2003 to 2010 at Kilauea. In particular, we focus our InSAR analysis on the time period before the June 2007 "Father's Day" dike intrusion and eruption, and also incorporate seismic and GPS data in our models. Our goal is to identify any precursors to the Father's Day event within Kilauea's summit magma system, east rift zone, and/or southwest rift zone.
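The PS/SBAS pixel distinction drawn in this record is operationalized, in StaMPS-like processors, by first screening PS candidates with the amplitude dispersion index D_A = sigma_A / mu_A across the image stack; pixels with low D_A behave like stable point scatterers. A minimal sketch on synthetic amplitudes (the ~0.25 threshold is a customary value, not one stated in the abstract):

```python
import numpy as np

rng = np.random.default_rng(7)
n_images, n_pixels = 30, 1000
# Background pixels: Rayleigh-distributed amplitudes (speckle-like).
amps = rng.rayleigh(1.0, (n_images, n_pixels))
# Make the first 50 pixels PS-like: bright and temporally stable.
amps[:, :50] = 10.0 + rng.normal(0, 0.5, (n_images, 50))

d_a = amps.std(axis=0) / amps.mean(axis=0)     # amplitude dispersion per pixel
candidates = np.flatnonzero(d_a < 0.25)        # PS candidate selection
print((d_a[:50] < 0.25).all(), d_a[50:].mean() > 0.4)  # True True
```

Rayleigh speckle has a population D_A near 0.52, so thresholding well below that cleanly separates the stable scatterers; distributed-scatterer pixels failing this test are the ones handled by the SBAS branch.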

  9. Large-area landslide detection and monitoring with ALOS/PALSAR imagery data over Northern California and Southern Oregon, USA

    USGS Publications Warehouse

    Zhao, Chaoying; Lu, Zhong; Zhang, Qin; de la Fuente, Juan

    2012-01-01

    Multi-temporal ALOS/PALSAR images are used to automatically investigate landslide activity over an area of ~ 200 km by ~ 350 km in northern California and southern Oregon. Interferometric synthetic aperture radar (InSAR) deformation images, InSAR coherence maps, SAR backscattering intensity images, and a DEM gradient map are combined to detect active landslides by setting individual thresholds. More than 50 active landslides covering a total area of about 40 km2 are detected. The small baseline subset (SBAS) InSAR method is then applied to retrieve time-series deformation patterns of the individual detected landslides. Down-slope landslide motions observed from adjacent satellite tracks with slightly different radar look angles are used to verify the InSAR results and measurement accuracy. Comparison of the landslide motion with the precipitation record suggests that landslide deformation correlates with rainfall rate, with a lag time of around 1–2 months between the precipitation peak and the maximum landslide displacement. The results will provide new insights into landslide mechanisms in the Pacific Northwest and facilitate development of early warning systems for landslides under abnormal rainfall conditions. Additionally, this method will allow identification of active landslides across broad areas of the Pacific Northwest in an efficient and systematic manner, including remote and heavily vegetated areas that are difficult to inventory by traditional methods.
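The multi-layer thresholding strategy described above can be sketched roughly as follows; the layers, values and thresholds here are invented for illustration, not those used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)

# Toy proxy layers over a 100 x 100 area (values are illustrative):
los_rate = rng.uniform(0, 5, shape)     # |LOS deformation rate|, cm/yr
coherence = rng.uniform(0, 1, shape)    # interferometric coherence
slope = rng.uniform(0, 40, shape)       # DEM gradient, degrees

# Inject one obvious "landslide": fast, coherent motion on a steep slope.
los_rate[40:50, 40:50] = 8.0
coherence[40:50, 40:50] = 0.9
slope[40:50, 40:50] = 25.0

# Candidate mask: every layer must pass its own threshold, mirroring the
# combined-threshold detection described above (thresholds are invented).
mask = (los_rate > 6.0) & (coherence > 0.5) & (slope > 10.0)
print(mask.sum())
```

In the real workflow each detected cluster would then be passed to the SBAS time-series inversion rather than simply counted.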

  10. The Network Structure Underlying the Earth Observation Assessment

    NASA Astrophysics Data System (ADS)

    Vitkin, S.; Doane, W. E. J.; Mary, J. C.

    2017-12-01

    The Earth Observations Assessment (EOA 2016) is a multiyear project designed to assess the effectiveness of civil earth observation data sources (instruments, sensors, models, etc.) on societal benefit areas (SBAs) for the United States. Subject matter experts (SMEs) provided input and scored how data sources inform products, product groups, key objectives, SBA sub-areas, and SBAs in an attempt to quantify the relationships between data sources and SBAs. The resulting data were processed by Integrated Applications Incorporated (IAI) using MITRE's PALMA software to create normalized relative impact scores for each of these relationships. However, PALMA processing obscures the natural network representation of the data. Any network analysis that might identify patterns of interaction among data sources, products, and SBAs is therefore impossible. Collaborating with IAI, we cleaned and recreated a network from the original dataset. Using R and Python we explore the underlying structure of the network and apply frequent itemset mining algorithms to identify groups of data sources and products that interact. We reveal interesting patterns and relationships in the EOA dataset that were not immediately observable from the EOA 2016 report and provide a basis for further exploration of the EOA network dataset.
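A minimal, hypothetical sketch of the frequent itemset mining mentioned above: a single Apriori-style pass counting co-occurring data sources across invented "transactions" (the source names and support threshold are illustrative, not taken from the EOA 2016 data):

```python
from itertools import combinations
from collections import Counter

# Toy transactions: each row lists the data sources that one expert
# linked to a single product (names are illustrative).
transactions = [
    {"MODIS", "Landsat", "GPS"},
    {"MODIS", "Landsat"},
    {"MODIS", "GPS"},
    {"Landsat", "GPS"},
    {"MODIS", "Landsat", "Buoys"},
]

# Count every pair of sources, then keep pairs whose support (fraction of
# transactions containing both) meets the threshold.
min_support = 0.6
pair_counts = Counter(
    pair for t in transactions for pair in combinations(sorted(t), 2)
)
frequent = {p: c / len(transactions)
            for p, c in pair_counts.items()
            if c / len(transactions) >= min_support}
print(frequent)
```

A full Apriori or FP-growth run would extend this to larger itemsets by pruning on the downward-closure property; libraries such as `mlxtend` provide production implementations.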

  11. Generation of deformation time series from SAR data sequences in areas affected by large dynamics: insights from Sierra Negra caldera, Galápagos Islands

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manconi, Andrea; Pepe, Antonio; Lanari, Riccardo

    2010-05-01

    Differential Synthetic Aperture Radar Interferometry (DInSAR) is a remote sensing technique that produces spatially dense deformation maps of the Earth's surface with centimeter accuracy. To this end, the phase difference of SAR image pairs acquired before and after a deformation episode is exploited. This technique, originally applied to investigate single deformation events, has been extended to analyze the temporal evolution of the deformation field through the generation of displacement time series. A well-established approach is the Small BAseline Subset (SBAS) technique (Berardino et al., 2002), whose capability to analyze deformation events at low and full spatial resolution has been widely demonstrated. However, in areas where large and/or rapid deformation phenomena occur, the exploitation of the differential interferograms, and thus of the displacement time series, can be strongly limited by significant misregistration errors and/or very high fringe rates, which make the phase unwrapping step unfeasible. In this work, we propose advances in the generation of deformation time series in areas affected by large deformation dynamics. We present an extension of amplitude-based Pixel-Offset analysis that applies the SBAS strategy, in order to move from the investigation of single (large) deformation events to that of dynamic phenomena. The method has been tested on an ENVISAT SAR data archive (Track 61, Frames 7173-7191) covering the Galápagos Islands, focusing on Sierra Negra caldera, an active volcanic area often characterized by large and rapid deformation events leading to severe image misregistration effects (Yun et al., 2007). 
Moreover, we present a cross-validation of the retrieved deformation estimates comparing our results to continuous GPS measurements and to synthetic deformation obtained by independently modeling the interferometric phase information when available. References: P. Berardino et al., (2002), A new algorithm for Surface Deformation Monitoring based on Small Baseline Differential SAR Interferograms, IEEE Transactions on Geoscience and Remote Sensing, vol. 40, 11, pp. 2375-2383. S-H. Yun et al., (2007), Interferogram formation in the presence of complex and large deformation, Geophys. Res. Lett., vol. 34, L12305.
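The SBAS linking of small baseline interferograms into a displacement time series can be sketched, under simplifying assumptions (a single noise-free pixel, a connected interferogram network, and the L2 inversion of Berardino et al., 2002; all dates, pairs and displacements are invented), as:

```python
import numpy as np

# Acquisition epochs (years) and "true" cumulative displacement (cm) used
# only to synthesize the interferometric observations.
t = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
true_disp = np.array([0.0, 1.0, 2.1, 2.9, 4.0])

# Small-baseline pairs (indices of earlier, later acquisition).
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (2, 4)]
obs = np.array([true_disp[j] - true_disp[i] for i, j in pairs])

# Each unwrapped interferogram measures d_j - d_i; fixing d_0 = 0, the
# unknowns are d_1..d_4. Solve the (possibly rank-deficient) linear system
# in the least-squares sense, as SBAS does via SVD.
A = np.zeros((len(pairs), len(t) - 1))
for k, (i, j) in enumerate(pairs):
    if j > 0:
        A[k, j - 1] += 1.0
    if i > 0:
        A[k, i - 1] -= 1.0
d_hat, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(d_hat)
```

When the network splits into disconnected subsets, the system becomes rank deficient and the SVD-based pseudoinverse returns the minimum-norm solution linking them; a Pixel-Offset-based variant, as in the work above, replaces the phase observations with amplitude-derived offsets.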

  12. Quality assessment of DInSAR deformation measurements in volcanic areas by comparing GPS and SBAS results

    NASA Astrophysics Data System (ADS)

    Bonforte, A.; Casu, F.; de Martino, P.; Guglielmino, F.; Lanari, R.; Manzo, M.; Obrizzo, F.; Puglisi, G.; Sansosti, E.; Tammaro, U.

    2009-04-01

    Differential Synthetic Aperture Radar Interferometry (DInSAR) is a methodology able to measure ground deformation rates and time series over relatively large areas. Several different approaches have been developed over the past few years: they all share the capability to measure deformation over a relatively wide area (say 100 km by 100 km) with a high density of measuring points. For these reasons, DInSAR represents a very useful tool for investigating geophysical phenomena, with particular reference to volcanic areas. As for any measuring technique, knowledge of the attainable accuracy is of fundamental importance. In the case of DInSAR, there are several error sources, such as orbital inaccuracies, phase unwrapping errors, atmospheric artifacts, and effects related to the reference point selection, making it very difficult to define a theoretical error model. A practical way to assess the accuracy is to compare DInSAR results with independent measurements, such as GPS or levelling. Here we present an in-depth comparison between the deformation measurements obtained with the DInSAR technique referred to as the Small BAseline Subset (SBAS) algorithm and those from continuous GPS stations. The selected volcanic test sites are Etna, Vesuvio and Campi Flegrei, in Italy. From continuous GPS data, solutions are computed on the same days the SAR data are acquired, for direct comparison. Moreover, three-dimensional GPS displacement vectors are projected along the radar line of sight of both ascending and descending acquisition orbits. GPS data are then compared with the coherent DInSAR pixels closest to the GPS station. Relevant statistics of the differences between the two measurements are computed and correlated with scene parameters that may affect DInSAR accuracy (altitude, terrain slope, etc.).
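The projection of a 3-D GPS displacement onto the radar line of sight, as described above, can be sketched as follows. The sign conventions and angles below are illustrative only, since the exact LOS convention varies between processors and orbit directions:

```python
import math

# Unit line-of-sight vector for a right-looking SAR, built from the
# incidence angle (from vertical) and satellite heading (degrees).
def los_unit_vector(incidence_deg, heading_deg):
    inc = math.radians(incidence_deg)
    head = math.radians(heading_deg)
    # East, North, Up components of the ground-to-satellite direction;
    # the signs follow one common convention and may differ per processor.
    east = -math.sin(inc) * math.cos(head)
    north = math.sin(inc) * math.sin(head)
    up = math.cos(inc)
    return east, north, up

# Project a 3-D GPS displacement (east, north, up, in cm) onto the LOS.
gps = (1.0, 0.5, -2.0)                     # invented displacement vector
e, n, u = los_unit_vector(23.0, -13.0)     # ERS/ENVISAT-like geometry
d_los = gps[0] * e + gps[1] * n + gps[2] * u
print(round(d_los, 3))
```

Repeating this for ascending and descending LOS vectors gives the two projected GPS series that are compared against the nearest coherent SBAS pixels.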

  13. Unsupervised DInSAR processing chain for multi-scale displacement analysis

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manunta, Michele

    2016-04-01

    Earth Observation techniques can be very helpful for the estimation of several sources of ground deformation due to their large spatial coverage, high resolution and cost effectiveness. In this scenario, Differential Synthetic Aperture Radar Interferometry (DInSAR) is one of the most effective methodologies for its capability to generate spatially dense deformation maps at both global and local spatial scales, with centimeter to millimeter accuracy. DInSAR exploits the phase difference (interferogram) between SAR image pairs gathered at different times, but with the same illumination geometry and from sufficiently close flight tracks, whose separation is typically referred to as the baseline. Among the several DInSAR approaches, the SBAS algorithm is one of the most widely used; it generates displacement time series at a multi-scale level by exploiting a set of small baseline interferograms. SBAS, and DInSAR in general, has benefited from the large availability of spaceborne SAR data collected over the years by several satellite systems, with particular regard to the European ERS and ENVISAT sensors, which acquired SAR images worldwide for approximately 20 years. Moreover, since 2014 the new generation of Copernicus Sentinel satellites has been acquiring data with a short revisit time (12 days) and a global coverage policy, thus flooding the scientific EO community with an unprecedented amount of data. To efficiently manage this amount of data, proper processing facilities (such as those offered by emerging Cloud Computing technologies) have to be used, and novel algorithms aimed at their efficient exploitation have to be developed. 
In this work we present a set of results achieved by exploiting a recently proposed implementation of the SBAS algorithm, namely Parallel-SBAS (P-SBAS), which allows us to effectively process, in an unsupervised way and in a limited time frame, a huge number of SAR images, leading to the generation of interferometric products for both global and local scale displacement analysis. Among several examples, we will show a wide-area SBAS displacement analysis carried out over southern California, in which the whole ascending ENVISAT data set of more than 740 images was fully processed in a Cloud Computing environment in less than 9 hours, leading to the generation of a displacement map of about 150,000 square kilometres. The P-SBAS characteristics also allowed us to integrate the algorithm within the ESA Geohazard Exploitation Platform (GEP), which is based on the use of GRID and Cloud Computing facilities, thus making freely available to the EO community a web tool for massive and systematic generation of interferometric displacement time series. This work has been partially supported by: the Italian MIUR under the RITMARE project; the CNR-DPC agreement and the ESA GEP project.
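The small-baseline pair selection at the heart of SBAS can be sketched with invented acquisitions and arbitrary thresholds:

```python
from itertools import combinations

# Toy acquisition catalogue: (id, time in days, perpendicular baseline in m).
# All values are invented for illustration.
acqs = [("A", 0, 0.0), ("B", 35, 120.0), ("C", 70, -80.0),
        ("D", 350, 900.0), ("E", 385, 820.0)]

# SBAS keeps only interferometric pairs with small temporal and spatial
# separation, which limits decorrelation; thresholds here are arbitrary.
MAX_DAYS, MAX_BPERP = 120, 300.0
pairs = [(a[0], b[0]) for a, b in combinations(acqs, 2)
         if abs(b[1] - a[1]) <= MAX_DAYS and abs(b[2] - a[2]) <= MAX_BPERP]
print(pairs)
```

Note that the selected pairs split into two disconnected groups (A-B-C and D-E); linking such independent subsets is exactly what the SVD step of SBAS handles.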

  14. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  15. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. Now that SAR data have become ubiquitous, the technological and scientific challenge is to maximize the exploitation of this huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performance has been carried out to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. 
This experiment confirms the significant advantage of exploiting the large computational and storage resources of Cloud Computing platforms for large scale DInSAR analysis. The presented Cloud Computing P-SBAS processing chain can be a valuable tool for developing operational services for the EO scientific community related to hazard monitoring and risk prevention and mitigation.
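The fan-out/gather pattern behind the per-step parallelization described above can be sketched in a highly simplified form; the per-patch work below is a trivial stand-in, not an actual interferometric step:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for one independent per-patch processing step of a chain;
# the real P-SBAS workers are far heavier (and typically process-based).
def process_patch(patch_id):
    return patch_id, sum(i * i for i in range(1000))

# Fan the independent patches out across workers, then gather results.
# The same map/reduce pattern scales to multi-node setups when driven by
# a cluster scheduler instead of a local pool.
patches = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(process_patch, patches))
print(len(results))
```

For CPU-bound scientific kernels a `ProcessPoolExecutor` (or MPI across nodes) would be the natural choice; the thread pool here just keeps the sketch self-contained.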

  16. The Surveillance and On-demand Sentinel-1 SBAS Services on the Geohazards Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Zinno, I.; Manunta, M.; Lanari, R.

    2017-12-01

    The Geohazards Exploitation Platform (GEP) is an ESA R&D activity of the EO ground segment to demonstrate the benefit of new technologies for large scale processing of EO data. GEP aims at providing on-demand processing services for specific user needs, as well as systematic processing services addressing the need of the geohazards community for common information layers, and, finally, at integrating newly developed processors for scientists and other expert users. In this context, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). The Sentinel-1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use in the development of automated and systematic tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, the Sentinel-1 satellites can be exploited to build up operational services for the easy and rapid generation of advanced InSAR products useful for risk management and natural hazard monitoring. In this work we present the activities carried out for the development, integration, and deployment of two SBAS Sentinel-1 services of CNR-IREA within the GEP framework, namely the Surveillance and On-demand services. The Surveillance service consists of the systematic and automatic processing of Sentinel-1 data over selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. We built a system that is automatically triggered by every new Sentinel-1 acquisition over the AoI, once it is available in the S1 catalogue. The system then processes only the new acquisitions, thus saving storage space and computing time. 
The processing, which relies on the Parallel version of the SBAS chain (P-SBAS), allows us to effectively perform massive, systematic and automatic analysis of S1 SAR data. It is worth noting that the SBAS Sentinel-1 services on GEP represent the core of the EPOSAR service, which will deliver S1 displacement time series of the Earth's surface to the European Plate Observing System (EPOS) Research Infrastructure community.
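The "process only the new acquisitions" logic described above reduces, in essence, to a set difference between the catalogue and the already-processed record; a minimal sketch with invented scene identifiers:

```python
# Record of already-processed scene IDs and the current catalogue state
# (identifiers are invented, not real Sentinel-1 product names).
processed = {"S1A_20170101", "S1A_20170113"}
catalogue = {"S1A_20170101", "S1A_20170113", "S1A_20170125"}

# On each trigger, act only on the catalogue difference.
new_scenes = sorted(catalogue - processed)
for scene in new_scenes:
    # ... run the incremental SBAS update for this scene only ...
    processed.add(scene)
print(new_scenes)
```

A production system would persist the processed set (e.g. in a database) and append the new acquisition's interferograms to the existing network rather than re-inverting from scratch.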

  17. Quality of intrapartum care by skilled birth attendants in a refugee clinic on the Thai-Myanmar border: a survey using WHO Safe Motherhood Needs Assessment.

    PubMed

    Hoogenboom, Gabie; Thwin, May Myo; Velink, Kris; Baaijens, Marijke; Charrunwatthana, Prakaykaew; Nosten, François; McGready, Rose

    2015-02-05

    Increasing the number of women birthing with skilled birth attendants (SBAs) as one of the strategies to reduce maternal mortality and morbidity must be partnered with a minimum standard of care. This manuscript describes the quality of intrapartum care provided by SBAs in Mae La camp, a low resource, protracted refugee context on the Thai-Myanmar border. In the obstetric department of Shoklo Malaria Research Unit (SMRU) the standardized WHO Safe Motherhood Needs Assessment tool was adapted to the setting and used: to assess the facility; interview SBAs; collect data from maternal records during a one year period (August 2007 - 2008); and observe practice during labour and childbirth. The facility assessment recorded no 'out of stock' or 'out of date' drugs and supplies, equipment was in operating order and necessary infrastructure e.g. a stand-by emergency car, was present. Syphilis testing was not available. SBA interviews established that danger signs and symptoms were recognized except for sepsis and endometritis. All SBAs acknowledged receiving theoretical and 'hands-on' training and regularly attended deliveries. Scores for the essential elements of antenatal care from maternal records were high (>90%) e.g. providing supplements, recording risk factors as well as regular and correct partogram use. Observed good clinical practice included: presence of a support person; active management of third stage; post-partum monitoring; and immediate and correct neonatal care. Observed incorrect practice included: improper controlled cord traction; inadequate hand washing; an episiotomy rate in nulliparous women 49% (34/70) and low rates 30% (6/20) of newborn monitoring in the first hours following birth. Overall observed complications during labour and birth were low with post-partum haemorrhage being the most common in which case the SBAs followed the protocol but were slow to recognize severity and take action. 
In the clinic of SMRU in Mae La refugee camp, SBAs were able to comply with evidence-based guidelines but support to improve quality of care in specific areas is required. The structure of the WHO Safe Motherhood Needs Assessment allowed significant insights into the quality of intrapartum care particularly through direct observation, identifying a clear pathway for quality improvement.

  18. Application of Geostationary GNSS and SBAS Satellites for Studying Ionospheric TEC Disturbances of Geomagnetic and Meteorological Origin

    NASA Astrophysics Data System (ADS)

    Padokhin, A. M.; Kurbatov, G. A.; Yasyukevich, Y.; Yasyukevich, A.

    2017-12-01

    With the development of the GNSS and SBAS constellations, coherent multi-frequency L-band transmissions are now available from a number of geostationary satellites. These signals can be used for ionospheric TEC estimation in the same way as the widely used GPS/GLONASS signals. In this work, we compare noise patterns in TEC estimations based on data from different geostationary satellites: augmentation systems (the Indian GAGAN, European EGNOS and American WAAS) and the Chinese COMPASS/BeiDou navigation system. We show that the noise level in geostationary COMPASS/BeiDou TEC estimations is several times smaller than the noise in SBAS TEC estimations and corresponds to that of GPS/GLONASS at the same elevation angles. We discuss the capabilities of geostationary TEC data for studying ionospheric variability driven by space weather and meteorological sources at different time scales. Analyzing data from IGS/MGEX receivers, we present the geostationary TEC response to X-class solar flares of the current cycle and to moderate and strong geomagnetic storms, including the G4 St. Patrick's Day storm of 2015 and the recent G3 storm at the end of May 2017. We also discuss geostationary TEC disturbances in the near-equatorial ionosphere caused by two SSW events (the minor and major final warmings of the 2015-2016 winter season) as well as the geostationary TEC response to typhoon activity near Taiwan in autumn 2016. Our results show the large potential of geostationary TEC estimations with GNSS and SBAS signals for continuous ionospheric monitoring.
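TEC estimation from coherent dual-frequency signals, whether from geostationary or GPS/GLONASS satellites, commonly relies on the standard geometry-free combination; a minimal sketch using the GPS L1/L2 carrier frequencies and an invented 5.2 m range difference:

```python
# First-order ionospheric delay scales with 1/f^2, so the difference of
# pseudoranges on two frequencies isolates the slant TEC.
F1, F2 = 1575.42e6, 1227.60e6    # Hz, GPS L1 and L2 carriers
K = 40.3                         # m^3 s^-2, first-order ionospheric constant

def slant_tec(p1_m, p2_m):
    """Slant TEC in TECU (1 TECU = 1e16 el/m^2) from pseudoranges in metres."""
    return (F1**2 * F2**2) / (K * (F1**2 - F2**2)) * (p2_m - p1_m) / 1e16

print(round(slant_tec(20_000_000.0, 20_000_005.2), 2))
```

In practice the noisy pseudorange combination is levelled with the much less noisy carrier-phase combination, and receiver/satellite differential code biases must be removed; for a geostationary satellite the fixed geometry removes the elevation-dependent variability seen with MEO satellites.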

  19. Pharmacokinetics and bactericidal activities of one 800-milligram dose versus two 400-milligram doses of intravenously administered pefloxacin in healthy volunteers.

    PubMed Central

    Petitjean, O; Pangon, B; Brion, N; Tod, M; Chaplain, C; Le Gros, V; Louchahi, K; Allouch, P

    1993-01-01

    Pefloxacin pharmacokinetics and serum bactericidal activities (SBA) against Escherichia coli and Staphylococcus aureus were compared after intravenous infusion of either a single 800-mg dose or twice-daily 400-mg doses into 16 healthy volunteers. Plasma pefloxacin concentrations were measured for up to 60 h, and SBAs were determined 1, 12, and 24 h after the start of the infusion. The mean areas under the concentration-versus-time curve for plasma were not different (138 versus 136 h.mg/liter). The mean clearances, volumes of distribution, and half-lives were also comparable. The mean (+/- standard deviation) maximal concentration after the 800-mg infusion was 12.11 +/- 1.35 versus 6.51 +/- 0.73 mg/liter after the first 400-mg infusion and 7.42 +/- 0.76 mg/liter after the second 400-mg infusion. Mean trough concentrations at 24 h were significantly different: 2.77 +/- 0.63 (800 mg) versus 1.93 +/- 0.49 (400 mg twice) mg/liter (P = 0.0007). Mean SBAs against E. coli after 800 mg of pefloxacin were higher than 1/128 (1 h), 1/32 (12 h), and 1/16 (24 h). Mean SBAs against S. aureus under the same conditions were higher than 1/64 (1 h), 1/16 (12 h), and 1/8 (24 h). Mean SBAs at 1 and 12 h were significantly higher after the 800-mg infusion than after the 400-mg infusion but were similar at 24 h for both regimens. Comparison of SBAs according to National Committee for Clinical Laboratory Standards criteria showed a similar adequacy at 24 h for both regimens against both strains. Administration of 800 mg of pefloxacin once a day is bioequivalent to 400 mg twice a day, and bactericidal activity of the 800-mg infusion is not less than that of two 400-mg infusions. PMID:8494368
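As a rough illustration of the pharmacokinetic quantities reported above (area under the concentration-time curve, terminal half-life), a toy calculation on invented concentration data, not the study's measurements:

```python
import math

# Toy concentration-time data (mg/L at h) after a single IV infusion;
# the values are illustrative only.
times = [0.5, 1, 2, 4, 8, 12, 24]
concs = [11.0, 10.2, 8.9, 7.0, 4.4, 2.8, 1.1]

# AUC over the sampled interval by the linear trapezoidal rule.
auc = sum((t2 - t1) * (c1 + c2) / 2
          for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                        zip(times[1:], concs[1:])))

# Terminal half-life from the last two points, assuming log-linear decline.
kel = (math.log(concs[-2]) - math.log(concs[-1])) / (times[-1] - times[-2])
t_half = math.log(2) / kel
print(round(auc, 1), round(t_half, 1))
```

A full analysis would extrapolate AUC to infinity (adding C_last/kel), fit kel over several terminal points, and derive clearance as dose/AUC.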

  20. Analysis of a grid ionospheric vertical delay and its bounding errors over West African sub-Saharan region

    NASA Astrophysics Data System (ADS)

    Abe, O. E.; Otero Villamide, X.; Paparini, C.; Radicella, S. M.; Nava, B.

    2017-02-01

    Investigating the effects of the Equatorial Ionization Anomaly (EIA) ionosphere and space weather on Global Navigation Satellite Systems (GNSS) is crucial, and a key to the successful implementation of a satellite-based augmentation system (SBAS) over the equatorial and low-latitude regions. A possible ionospheric vertical delay (GIVD, Grid Ionospheric Vertical Delay) broadcast at an Ionospheric Grid Point (IGP) and its confidence bound error (GIVE, Grid Ionospheric Vertical Error) are analyzed and compared with the ionospheric vertical delay estimated at a nearby user location over the West African sub-Saharan region. Since the African sub-Saharan ionosphere falls within the EIA region, which is characterized by post-sunset irregularities that, unlike at middle latitudes, can be even stronger during geomagnetically quiet conditions, a reliable ionospheric threat model that accounts for nighttime plasma irregularities is essential for future SBAS users. The study covers the most quiet and most disturbed geomagnetic conditions of October 2013. A specific low-latitude EGNOS-like algorithm, based on a single thin-layer model, was used to simulate the SBAS messages in the study. Our preliminary results indicate that the estimated GIVE detects and protects a potential SBAS user against the sampled ionospheric plasma irregularities over the region, with a steep increase of GIVE to 'Not Monitored' from after local sunset to post-midnight. This corresponds to the onset of the usual ionospheric plasma irregularities in the region. The results further confirm that the effects of geomagnetic storms on the ionosphere are not consistent in affecting GNSS applications over the region. Finally, this paper suggests further work to improve the threat model and thereby enhance the availability of the future SBAS over the African sub-Saharan region.
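The single thin-layer model mentioned above maps slant ionospheric delay to vertical (and back) via an obliquity factor evaluated at the ionospheric pierce point; a minimal sketch using typical, assumed shell parameters:

```python
import math

# Thin-shell geometry: Earth radius and ionospheric layer height (km);
# 350 km is a commonly assumed shell height, not a measured value.
RE_KM, H_KM = 6371.0, 350.0

def mapping_function(elevation_deg):
    """Slant-to-vertical obliquity factor for the single thin-layer model."""
    e = math.radians(elevation_deg)
    sin_zp = RE_KM / (RE_KM + H_KM) * math.cos(e)  # sin(zenith angle at IPP)
    return 1.0 / math.sqrt(1.0 - sin_zp ** 2)

# A slant delay observed at low elevation corresponds to a much smaller
# vertical delay; e.g. an (invented) 10 m slant delay at 15 deg elevation:
vertical = 10.0 / mapping_function(15.0)
print(round(mapping_function(15.0), 3), round(vertical, 3))
```

The SBAS user performs the inverse operation: the broadcast GIVD at surrounding IGPs is interpolated to the pierce point, multiplied by this obliquity factor to get the slant correction, and bounded using the interpolated GIVE.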

  1. Determinants of Skilled Delivery Assistance in a Rural Population: Findings from an HDSS Site of Rural West Bengal, India.

    PubMed

    Ghosh, Saswata; Siddiqui, Md Zakaria; Barik, Anamitra; Bhaumik, Sunil

    2015-11-01

    This study examines the determinants of utilisation of skilled birth attendants (SBAs) amongst 2886 rural women in the state of West Bengal, India, using data from a 2012-2013 survey conducted by the Birbhum Health and Demographic Surveillance System. Multilevel logit regression models were estimated and qualitative investigations conducted to understand the determinants of utilisation of SBAs in rural West Bengal. Among women who delivered their last child during the 3 years preceding the survey, 69.1 % of deliveries were assisted by SBAs, while 30.9 % were home deliveries without any SBA assistance. Multivariate analysis revealed that apart from socio-demographic and economic factors (such as household affluence, women's education, birth order, uptake of comprehensive ANC check-ups, advice regarding danger signs of pregnancy and the household's socio-religious affiliation), supply-side factors, such as the availability of skilled birth attendants in the village and all-weather roads, have a significant effect on seeking skilled assistance. Our findings also show that unobserved factors at the village level independently influence uptake of SBA-assisted delivery. The present findings emphasise that both demand- and supply-side intervention strategies are essential prerequisites to enhance skilled birth attendance. Ample communication is observed at the individual level, but improving community-level outreach and advocacy activities could generate further demand. SBAs can be better integrated by accommodating the socio-religious needs of local communities, such as providing female doctors and doctors with similar socio-religious backgrounds.

  2. A comparison of practices, distributions and determinants of birth attendance in two divisions with highest and lowest skilled delivery attendance in Bangladesh.

    PubMed

    Kibria, Gulam Muhammed Al; Burrowes, Vanessa; Choudhury, Allysha; Sharmeen, Atia; Ghosh, Swagata; Kalbarczyk, Anna

    2018-05-02

    Delivery by skilled birth attendants (SBAs) is strongly recommended to reduce maternal and neonatal mortality. The percentage of births attended by SBAs is low in Bangladesh (42% in 2014), and this rate varies widely by division, from a high of 58% in Khulna to only 27% in Sylhet. Comparing and critically analyzing the practices, distributions and determinants of delivery attendance in the divisions with the highest and lowest SBA attendance could help to understand the differences and to apply the findings from the high-performing division to the low-performing one. The 7th Bangladesh Demographic and Health Survey (BDHS 2014) data were analyzed. After reporting the types of delivery attendants, logistic regression analyses were applied to calculate the odds ratios of determinants of deliveries attended by SBAs. SBAs attended 225 (58.6%) and 128 (27.4%) deliveries in Khulna and Sylhet, respectively. Khulna had higher birth attendance by qualified doctors (42.5%, n = 163) than Sylhet (15.8%, n = 74). Sylhet had higher attendance by traditional attendants (60.8%, n = 285) than Khulna (33.7%, n = 129). In both regions, attendance by community skilled birth attendants (CSBAs) was very low (< 1%). Khulna had higher percentages of women with higher education, husbands with higher education, antenatal care (ANC) visits by SBAs, and higher wealth quintiles than Sylhet. In multivariable analyses, higher education level (adjusted odds ratio (AOR): 8.4; 95% confidence interval (CI): 1.9-36.7), ANC visits (AOR: 3.6; 95% CI: 2.0-6.5), family planning workers' visits (AOR: 3.0; 95% CI: 1.6-5.4), and belonging to the richer (AOR: 2.6; 95% CI: 1.4-5.1) or richest (AOR: 3.8; 95% CI: 1.9-7.6) household wealth quintiles had significant positive associations with deliveries by SBAs in Sylhet. Similarly, ANC visits (AOR: 2.5; 95% CI: 1.4-4.6) and higher wealth quintile (AOR: 4.7; 95% CI: 1.9-11.5) were positive predictors in Khulna. 
Higher education of women and their husbands, greater wealth, and ANC visits were associated with higher SBA utilization in Khulna compared to Sylhet. Improving socioeconomic status, increasing birth-attendant awareness programs, providing ANC services, and family-planning workers' visits could increase the proportion of SBA-attended deliveries in Sylhet Division. The CSBA program should be re-evaluated for both divisions.
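As a generic illustration of the odds-ratio machinery behind figures like those above, here is a simple unadjusted 2x2 calculation with a Woolf confidence interval; the counts are invented, and the study's AORs come from multivariable models, not from a single table:

```python
import math

# Toy 2x2 table: SBA-attended delivery (yes/no) by any ANC visit (yes/no).
#                 SBA yes   SBA no
anc_yes = (180, 120)
anc_no = (45, 155)

a, b = anc_yes
c, d = anc_no

# Odds ratio and 95% CI computed on the log-odds scale (Woolf method).
odds_ratio = (a * d) / (b * c)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

An adjusted odds ratio (AOR) is obtained the same way, but from the exponentiated coefficient of a multivariable logistic regression, so that the other covariates are held fixed.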

  3. Analysis of the Los Angeles Basin ground subsidence with InSAR data by independent component analysis approach

    NASA Astrophysics Data System (ADS)

    Xu, B.

    2017-12-01

    Interferometric Synthetic Aperture Radar (InSAR) has the advantages of high spatial resolution, which enables measurement of line-of-sight (LOS) surface displacements with nearly complete spatial continuity, and a satellite perspective that permits viewing large areas of the Earth's surface quickly and efficiently. However, using InSAR to observe long-wavelength, small-magnitude deformation signals is still significantly limited by various unmodeled error sources, i.e. atmospheric delays, orbit-induced errors, and Digital Elevation Model (DEM) errors. Independent component analysis (ICA) is a probabilistic method for separating linearly mixed signals generated by different underlying physical processes. The signal sources that form the interferograms are statistically independent both in space and in time and can thus be separated by the ICA approach. The seismic behavior of the Los Angeles Basin is active, and the basin has experienced numerous moderate to large earthquakes since the early Pliocene. Hence, understanding the seismotectonic deformation in the Los Angeles Basin is important for analyzing seismic behavior. Compared with tectonic deformation, nontectonic deformation due to groundwater and oil extraction may be mainly responsible for the surface deformation in the Los Angeles Basin. Using the small baseline subset (SBAS) InSAR method, we extracted the surface deformation time series in the Los Angeles Basin over a time span of 7 years (September 27, 2003 - September 25, 2010). Then, we successfully separate the atmospheric noise from the InSAR time series and detect different processes caused by different mechanisms.
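A minimal sketch of ICA-based source separation of the kind described above, using a toy symmetric FastICA on synthetic mixtures (a square wave and a sinusoid standing in for, say, deformation and atmospheric components; the signals and mixing matrix are invented, not the study's InSAR data):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 2000)

# Two independent non-Gaussian sources, linearly mixed into two observations.
S = np.vstack([np.sign(np.sin(t)), np.sin(3.1 * t)])   # true sources (2 x N)
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S             # observed mixtures

# Center and whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# Symmetric FastICA with a tanh nonlinearity: rotate the whitened data
# until the outputs are maximally non-Gaussian, hence independent.
W = rng.standard_normal((2, 2))
for _ in range(300):
    G = np.tanh(W @ Z)
    W = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                                         # re-orthonormalize
S_hat = W @ Z                                          # recovered sources
print(S_hat.shape)
```

ICA recovers the sources only up to permutation and scaling, so matching recovered components to physical processes (atmosphere vs. deformation) requires inspecting their spatial and temporal signatures; `sklearn.decomposition.FastICA` offers a production implementation of the same idea.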

  4. DInSAR time series generation within a cloud computing environment: from ERS to Sentinel-1 scenario

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Elefante, Stefano; Imperatore, Pasquale; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana; Mathot, Emmanuel; Brito, Fabrice; Farres, Jordi; Lengert, Wolfgang

    2013-04-01

    One of the techniques that will strongly benefit from the advent of the Sentinel-1 system is Differential SAR Interferometry (DInSAR), which has been successfully demonstrated to be an effective tool for detecting and monitoring ground displacements with centimetre accuracy. The geoscience communities (volcanology, seismology, ...), as well as those related to hazard monitoring and risk mitigation, make extensive use of the DInSAR technique and will take advantage of the huge amount of SAR data acquired by Sentinel-1. Indeed, such information will permit the generation of Earth's surface displacement maps and time series both over large areas and over long time spans. However, the issue of managing, processing and analysing the large Sentinel data stream is envisaged by the scientific community to be a major bottleneck, particularly during crisis phases. The emerging need to create a common ecosystem in which data, results and processing tools are shared is envisaged to be a successful way to address this problem and to contribute to the spreading of information and knowledge. The Supersites initiative, as well as the ESA SuperSites Exploitation Platform (SSEP) and the ESA Cloud Computing Operational Pilot (CIOP) projects, provide effective answers to this need and are pushing towards the development of such an ecosystem. It is clear that all existing tools for querying, processing and analysing SAR data need not only to be updated to manage the large data stream of the Sentinel-1 satellite, but also to be reorganized to respond quickly to simultaneous and highly demanding user requests, mainly during emergency situations. This translates into the automatic and unsupervised processing of large amounts of data as well as the availability of scalable, widely accessible and high-performance computing capabilities. 
The cloud computing environment makes it possible to achieve all of these objectives, particularly in the case of spike and peak requests for processing resources linked to disaster events. This work aims at presenting a parallel computational model for the widely used DInSAR algorithm known as the Small BAseline Subset (SBAS), which has been implemented within the cloud computing environment provided by the ESA-CIOP platform. This activity has resulted in a scalable, unsupervised, portable, and widely accessible (through a web portal) parallel DInSAR computational tool. The SBAS application algorithm has been redesigned and implemented within a parallel system environment, i.e., in a form that allows us to benefit from multiple processing units. This requires devising a parallel version of the SBAS algorithm and subsequently implementing it, implying additional complexity in algorithm design and efficient multiprocessor programming, with the final aim of parallel performance optimization. Although the presented algorithm has been designed to work with Sentinel-1 data, it can also process other satellite SAR data (ERS, ENVISAT, CSK, TSX, ALOS). Indeed, the performance of the implemented SBAS parallel version has been tested on the full ASAR archive (64 acquisitions) acquired over the Napoli Bay, a volcanic and densely urbanized area in Southern Italy. The full processing - from the raw data download to the generation of DInSAR time series - has been carried out by engaging 4 nodes, each one with 2 cores and 16 GB of RAM, and has taken about 36 hours, compared to about 135 hours for the sequential version. Extensive analyses of other test areas, significant from DInSAR and geophysical viewpoints, will be presented. Finally, a preliminary performance evaluation of the presented approach within the Sentinel-1 scenario will be provided.
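The run-time figures quoted above (4 nodes x 2 cores, 36 hours parallel versus 135 hours sequential) imply a speedup of about 3.75x and a parallel efficiency near 47%. A quick check of that arithmetic:

```python
# Performance figures quoted in the abstract above.
sequential_hours = 135.0
parallel_hours = 36.0
cores = 4 * 2  # 4 nodes, each with 2 cores

speedup = sequential_hours / parallel_hours  # 3.75x
efficiency = speedup / cores                 # ~0.47 (47%)

print(f"speedup {speedup:.2f}x, efficiency {efficiency:.0%}")
```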

  5. Satellite SAR interferometric techniques applied to emergency mapping

    NASA Astrophysics Data System (ADS)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency Mapping can be defined as the "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness for emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Persistent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for emergency management activities. Depending on the Emergency Management phase considered, a distinction can be made between rapid mapping, i.e. the fast provision of geospatial data on the affected area for immediate emergency response, and monitoring mapping, i.e. the detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for rapid and monitoring mapping applications, four main factors have been taken into account: the crisis information extracted, the input data required, the processing time and the expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate-response requirements. The main limiting factor of interferometry is the availability of a suitable SAR acquisition immediately after the event (e.g. the Sentinel-1 mission, characterized by a 6-day revisit time, may not always satisfy the immediate emergency request). 
PSI and SBAS techniques are suitable for producing monitoring maps for risk prevention and mitigation purposes. Nevertheless, multi-temporal techniques require large temporal SAR datasets, i.e. 20 or more images. Since the Sentinel-1 mission has been operational only since April 2014, multi-mission SAR datasets should therefore be exploited to carry out historical analyses.

  6. Surface deformation analysis of the Mauna Loa and Kilauea volcanoes, Hawaii , revealed by InSAR measurements

    NASA Astrophysics Data System (ADS)

    Casu, F.; Poland, M.; Solaro, G.; Tizzani, P.; Miklius, A.; Sansosti, E.; Lanari, R.

    2009-04-01

    The Big Island of Hawaii is home to three historically active volcanoes: Hualālai, on the west side of the island; Mauna Loa, the largest volcano on the planet, which has erupted 39 times since 1832 (most recently in 1984); and Kilauea, which has been in a state of continuous eruption since 1983 from vents on the volcano's east rift zone. Deformation at Kilauea is characterized by summit and rift zone displacements related to magmatic activity and by seaward motion of the south flank caused by slip along a basal decollement. In this work we investigate the deformation affecting the Mauna Loa and Kilauea volcanoes, Hawaii, by exploiting the advanced Interferometric Synthetic Aperture Radar (InSAR) technique referred to as the Small BAseline Subset (SBAS) algorithm. In particular, we present time series of line-of-sight (LOS) displacements derived from SAR data acquired by the ASAR instrument, on board the ENVISAT satellite, from the ascending (track 93, frame 387) and descending (track 429, frame 3213) orbits over the period between 2003 and 2008. For each coherent pixel of the radar images we compute time-dependent surface displacements as well as the average LOS deformation velocity. We also benefit from the use of multi-orbit (ascending and descending) data, which permit us to discriminate the vertical and east-west components of the revealed displacements. The retrieved InSAR measurements are also favourably compared to the continuous GPS data available in the area in order to assess the quality of the SBAS-InSAR products. The presented results show the complex deformation behavior of the investigated volcanoes; moreover, the possibility of inverting the retrieved DInSAR products, in order to model both deep geological structures and magmatic sources, represents a relevant step towards understanding the volcanoes' dynamics.
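The multi-orbit decomposition mentioned above amounts to solving a small linear system: ascending and descending LOS observations differ mainly in the sign of their east-west sensitivity, so two LOS measurements at a pixel constrain the east and up motion (the weak north sensitivity is neglected). A minimal numpy sketch with an assumed incidence angle and invented displacements:

```python
import numpy as np

inc = np.deg2rad(23.0)  # ENVISAT-like incidence angle (assumed value)

# LOS sensitivity to (east, up) motion; the east term flips sign
# between ascending and descending geometries.
G = np.array([
    [-np.sin(inc), np.cos(inc)],   # ascending
    [ np.sin(inc), np.cos(inc)],   # descending
])

# Hypothetical true motion: 10 mm eastward, 30 mm of subsidence.
true_eu = np.array([10.0, -30.0])
d_los = G @ true_eu  # the two simulated LOS observations

# Invert the 2x2 system to recover the east and up components.
east, up = np.linalg.solve(G, d_los)
print(east, up)  # ~10.0, ~-30.0
```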

  7. Factors associated with different types of birth attendants for home deliveries: an analysis of the cross-sectional 2010 South Sudan household survey

    PubMed Central

    Mugo, Ngatho S.; Agho, Kingsley E.; Zwi, Anthony B.; Dibley, Michael J.

    2016-01-01

    Background In South Sudan, birth deliveries attended by unskilled birth attendants put the mothers and their newborns at increased risk of perinatal morbidity and mortality. The aim of this study was to identify factors associated with delivery by unskilled birth attendants or by unassisted delivery. Design We examined data for 2,767 (weighted total) women aged 15–49 years who delivered at home 2 years prior to the South Sudan Household Health Survey 2010. Multinomial logistic regression analyses were used to identify factors associated with delivery by unskilled birth attendants or by unassisted delivery. Results The prevalence of delivery by unskilled birth attendants was 19% [95% confidence interval (CI) 17.0, 20.5], by skilled birth attendants (SBAs) was 45% (95% CI 42.4, 47.0), and by unassisted delivery was 36% (95% CI 34.2, 38.6). After adjusting for potential confounders, the following factors were associated with the increased odds for unassisted delivery or delivery by an unskilled birth attendant: mothers with no schooling, who did not attend antenatal care (ANC) during pregnancy, who had lower quality of ANC services, from poor households, or who had no prior knowledge about obstetric danger signs. Conclusions We found that non-utilization of maternal health care services, such as ANC, was significantly associated with unattended birth delivery or delivery by unskilled health providers. The increased uptake of SBAs at delivery will require easier access to ANC services, health promotion on the importance and benefits of SBAs for delivery, targeting both mothers and their families, and the training and deployment of more SBAs across the country. PMID:27473675

  8. Mapping Changes and Damages in Areas of Conflict: From Archive C-Band SAR Data to New HR X-Band Imagery, Towards the Sentinels

    NASA Astrophysics Data System (ADS)

    Tapete, Deodato; Cigna, Francesca; Donoghue, Daniel N. M.; Philip, Graham

    2015-05-01

    With the recent launch of Sentinel-1A marking a turning point for radar space science, we investigate how to better exploit the opportunities offered by large C-band SAR archives and growing datasets of HR to VHR X-band data to map changes and damages in urban and rural areas affected by conflicts. We implement a dual approach coupling multi-interferogram processing and amplitude change detection to assess the impact of the recent civil war on the city of Homs, Western Syria, and the surrounding semi-arid landscape. More than 280,000 coherent pixels are retrieved from Small BAseline Subset (SBAS) processing of the 8-year-long ENVISAT ASAR IS2 archive, to quantify land subsidence due to pre-war water abstraction in rural areas. Damages in Homs are detected by analysing changes in SAR backscattering (σ0), comparing 3 m resolution StripMap TerraSAR-X pairs from 2009 to 2014. Pre-war alteration is differentiated from war-related damage via operator-driven interpretation of the σ0 patterns.
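Amplitude change detection of the kind described here is commonly implemented as a log-ratio of backscatter between pre- and post-event images. The following toy sketch (invented data, an assumed 3 dB threshold) illustrates the idea, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pre- and post-event backscatter intensity images.
pre = rng.gamma(4.0, 0.25, size=(100, 100))
post = pre.copy()
post[40:60, 40:60] *= 0.2  # simulated damage: strong backscatter drop

# Log-ratio change detection in dB, a common SAR change measure.
log_ratio_db = 10.0 * np.log10(post / pre)
changed = np.abs(log_ratio_db) > 3.0  # 3 dB threshold (assumed)
print(changed.sum())  # 400 pixels flagged
```

The 20x20 damaged patch drops by about 7 dB, well past the threshold, while unchanged pixels have a ratio of exactly 0 dB.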

  9. Monitoring on Xi'an ground fissures deformation with TerraSAR-X data

    USGS Publications Warehouse

    Zhao, C.; Zhang, Q.; Zhu, W.; Lu, Z.

    2012-01-01

    Owing to the fine resolution of the TerraSAR-X data available since 2007, this paper applied six TerraSAR-X scenes (StripMap mode), acquired between 3 December 2009 and 23 March 2010, to detect and monitor the active fissures over the Xi'an region. Three approaches were designed for high-precision detection and monitoring of the Xi'an-Chang'an fissures: small baseline subsets (SBAS) to test the atmospheric effects of differential interferogram pairs stepwise; two-pass differential interferometry with very short perpendicular baselines to generate the whole deformation map over a 44-day interval; and, finally, the corner reflector (CR) technique to closely monitor the relative deformation time series between two CRs installed across two ground fissures. Results showed that TerraSAR-X data are a good choice for small-scale ground fissure detection and monitoring, although special consideration should be given to their strong temporal and baseline decorrelation. Secondly, ground fissures in Xi'an were mostly detected at the junction of stable and deforming regions. Lastly, CR-InSAR showed the potential to monitor relative deformation across fissures with millimeter precision.

  10. 2014-2016 Mt. Etna Ground deformation imaged by SISTEM approach using GPS and SENTINEL-1A/1B TOPSAR data

    NASA Astrophysics Data System (ADS)

    Bonforte, Alessandro; Guglielmino, Francesco; Puglisi, Giuseppe

    2017-04-01

    In the frame of the EC FP7 MED-SUV project (call FP7 ENV.2012.6.4-2), and thanks to the GEO-GSNL initiative, GPS data and SENTINEL-1A/1B TOPSAR data acquired on Mt. Etna between October 2014 and November 2016 were analyzed. The SENTINEL data were combined and integrated with GPS to detail the ground deformation recorded by GPS on Mt. Etna during the last two years of volcanic activity. The Sentinel data were processed with the GAMMA software, using a spectral diversity method and a procedure able to co-register the SENTINEL pairs with extremely high precision (< 0.01 pixel). In order to optimize processing time, a new software architecture based on hypervisor virtualization technology for the x64 versions of Windows was implemented. The DInSAR results were analysed and subsequently used as input for time series analysis with the StaMPS package. On December 28, 2014, eruptive activity resumed at Mt. Etna with fire-fountain activity feeding two lava flows spreading on the eastern and south-western upper flanks of the volcano, producing evident deformation at the summit. GPS displacements and a Sentinel-1A ascending interferogram were computed to image the ground deformation pattern accompanying the eruption. The ground deformation pattern was clearly depicted by the GPS network, mainly affecting the uppermost part of the volcanic edifice, with a strong decay of the deformation, consistent with a very shallow and strong dyke intrusion. The Sentinel-1A SAR data, covering a similar time span, confirmed that most of the displacement is related to the dyke intrusion, and evidenced a local gravity-driven motion of the western wall of the Valle del Bove, probably related to the dyke intrusion. To monitor the subsequent temporal evolution of ground deformation, we performed an A-DInSAR SENTINEL analysis using the Small BAseline Subset (SBAS) approach included in the StaMPS processing package. 
The April 2015 to December 2015 SBAS time series showed volcano inflation, with an uplift of about 28 mm localized in the central and western area of the volcano. Suddenly, in the first days of December 2015, volcanic activity abruptly restarted at the central crater with a strongly explosive eruption; this activity continued, with decreasing intensity, with further episodes at the same crater and then involving, in turn, the other three summit craters of the volcano. On December 8, as the eruptive activity was ending, a seismic swarm affected the uppermost part of the Pernicana fault where it joins the NE Rift. The SBAS time series were then integrated, via the SISTEM algorithm, with the ground displacements measured by two GPS surveys carried out on the NE flank of the volcano at the end of April and in mid-December 2015. The results of this data integration provide a very detailed picture of the ground deformation pattern on the volcano, preceding and accompanying the vigorous eruption and the seismic swarm, in addition to the general inflation of the edifice during the pre-eruptive period. January 2016 to November 2016 is the last period analyzed, characterized by the kinematics of the unstable eastern flank, with displacement involving both the Pernicana fault and the other structures dissecting this sector of the volcano.

  11. How the integration of traditional birth attendants with formal health systems can increase skilled birth attendance.

    PubMed

    Byrne, Abbey; Morgan, Alison

    2011-11-01

    Forty years of safe motherhood programming has demonstrated that isolated interventions will not reduce maternal mortality sufficiently to achieve MDG 5. Although skilled birth attendants (SBAs) can intervene to save lives, traditional birth attendants (TBAs) are often preferred by communities. Considering the value of both TBAs and SBAs, it is important to review strategies for maximizing their respective strengths. To describe mechanisms to integrate TBAs with the health system to increase skilled birth attendance and examine the components of successful integration. A systematic review of interventions linking TBAs and formal health workers, measuring outcomes of skilled birth attendance, referrals, and facility deliveries. Thirty-three articles met the selection criteria. Mechanisms used for integration included training and supervision of TBAs, collaboration skills for health workers, inclusion of TBAs at health facilities, communication systems, and clear definition of roles. Impact on skilled birth attendance depended on selection of TBAs, community participation, and addressing barriers to access. Successful approaches were context-specific. The integration of TBAs with formal health systems increases skilled birth attendance. The greatest impact is seen when TBA integration is combined with complementary actions to overcome context-specific barriers to contact among SBAs, TBAs, and women. Copyright © 2011 International Federation of Gynecology and Obstetrics. Published by Elsevier Ireland Ltd. All rights reserved.

  12. How downstream sub-basins depend on upstream inflows to avoid scarcity: typology and global analysis of transboundary rivers

    NASA Astrophysics Data System (ADS)

    Munia, Hafsa Ahmed; Guillaume, Joseph H. A.; Mirumachi, Naho; Wada, Yoshihide; Kummu, Matti

    2018-05-01

    Countries sharing river basins are often dependent upon water originating outside their boundaries, meaning that without that upstream water, water scarcity may occur, with knock-on implications for water use and management. We develop a formalisation of this concept, drawing on ideas about transitions between regimes from the resilience literature and using water stress and water shortage as indicators of water scarcity. In our analytical framework, dependency occurs if water from upstream is needed to avoid scarcity. This can be diagnosed by comparing the different types of water availability on which a sub-basin relies, in particular local runoff and upstream inflows. At the same time, possible upstream water withdrawals reduce the water available downstream, influencing the latter availability. By developing a framework of scarcity and dependency, we contribute to the understanding of transitions between system regimes. We apply our analytical framework to global transboundary river basins at the scale of sub-basin areas (SBAs). Our results show that 1175 million people live under water stress (42% of the total transboundary population). Surprisingly, the majority (1150 million) of these currently suffer from stress only because of their own excessive water use, and possible water from upstream has no impact on their stress status - i.e. they are not yet dependent on upstream water to avoid stress - although it could still affect the intensity of the stress. At the same time, 386 million people (14%) live in SBAs that can avoid stress owing to water available from upstream and are thus dependent on it. In the case of water shortage, 306 million people (11%) live in SBAs dependent on upstream water to avoid possible shortage. 
The identification of transitions between system regimes sheds light on how SBAs may be affected in the future, potentially contributing to further refined analysis of inter- and intrabasin hydro-political power relations and strategic planning of management practices in transboundary basins.
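The diagnosis described in this abstract - comparing withdrawals against local runoff alone and against local runoff plus upstream inflows - can be sketched as a toy classifier. The 0.4 use-to-availability stress threshold and all numbers below are illustrative assumptions, not the authors' values.

```python
def classify(local_runoff, upstream_inflow, withdrawals, stress_threshold=0.4):
    """Toy sub-basin classification: dependency occurs when upstream
    water is needed to avoid stress. The 0.4 use-to-availability stress
    threshold is a common convention, assumed here for illustration."""
    stress_local = withdrawals / local_runoff > stress_threshold
    stress_total = withdrawals / (local_runoff + upstream_inflow) > stress_threshold
    if not stress_local:
        return "no stress, independent"
    if not stress_total:
        return "stress avoided thanks to upstream inflow (dependent)"
    return "stressed even with upstream water"

# A sub-basin using 8 units against 10 local + 30 upstream units:
print(classify(10.0, 30.0, 8.0))  # stress avoided thanks to upstream inflow (dependent)
```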

  13. Atmospheric corrections in interferometric synthetic aperture radar surface deformation - a case study of the city of Mendoza, Argentina

    NASA Astrophysics Data System (ADS)

    Balbarani, S.; Euillades, P. A.; Euillades, L. D.; Casu, F.; Riveros, N. C.

    2013-09-01

    Differential interferometry is a remote sensing technique that allows studying crustal deformation produced by several phenomena, such as earthquakes, landslides, land subsidence and volcanic eruptions. Advanced techniques, like the small baseline subset (SBAS) approach, exploit series of images acquired by synthetic aperture radar (SAR) sensors over a given time span. Phase propagation delay in the atmosphere is the main systematic error in interferometric SAR measurements. It affects images acquired on different days, or even at different hours of the same day, differently. Consequently, datasets acquired over the same time span by different sensors (or sensor configurations) often give diverging results. Here we processed two datasets acquired from June 2010 to December 2011 by the COSMO-SkyMed satellites. One of them is HH-polarized; the other is VV-polarized and was acquired on different days. As expected, time series computed from these datasets show differences. We attributed these to non-compensated atmospheric artifacts and tried to correct them using ERA-Interim global atmospheric model (GAM) data. With this method, we were able to correct fewer than 50% of the scenes, considering an area where no phase unwrapping errors were detected. We conclude that GAM-based corrections are not sufficient to explain the differences in the computed time series, at least in the processed area of interest. We note that no direct meteorological data were employed for the GAM-based corrections. Further research is needed to understand under what conditions this kind of data can be used.
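The SBAS technique used throughout these records links small-baseline interferograms to a displacement time series through a linear system, classically solved via the SVD (which also handles disconnected subsets with a minimum-norm solution). A minimal numpy sketch on an invented, noise-free interferogram network:

```python
import numpy as np

# Toy network: 5 acquisition dates, 6 small-baseline pairs (master, slave).
n_dates = 5
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]

# Simulated cumulative deformation phase at each date (date 0 = reference).
true_phase = np.array([0.0, 2.0, 3.5, 5.0, 7.0])

# Each interferogram observes the phase difference between its two dates;
# the unknowns are the phases at dates 1..n-1.
A = np.zeros((len(pairs), n_dates - 1))
for i, (m, s) in enumerate(pairs):
    if s > 0:
        A[i, s - 1] += 1.0
    if m > 0:
        A[i, m - 1] -= 1.0
obs = np.array([true_phase[s] - true_phase[m] for m, s in pairs])

# Minimum-norm least-squares solution via the pseudo-inverse (SVD),
# as in the SBAS inversion.
phase_est = np.linalg.pinv(A) @ obs
print(phase_est)  # ~[2.0, 3.5, 5.0, 7.0]
```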

  14. Improved Small Baseline processing by means of CAESAR eigen-interferograms decomposition

    NASA Astrophysics Data System (ADS)

    Verde, Simona; Reale, Diego; Pauciullo, Antonio; Fornaro, Gianfranco

    2018-05-01

    The Component extrAction and sElection SAR (CAESAR) method is a recently proposed technique for the selection and filtering of scattering mechanisms in the multibaseline interferometric SAR framework. Its strength lies in the possibility of selecting and extracting multiple dominant scattering mechanisms, even when they interfere in the same pixel, from the stage of interferogram generation onwards, and of carrying out decorrelation-noise phase filtering. Up to now, the validation of CAESAR has been addressed in the framework of SAR tomography for the model-based detection of Persistent Scatterers (PSs). In this paper we investigate the effectiveness of using CAESAR eigen-interferograms in classical multi-baseline DInSAR processing based on the Small BAseline Subset (SBAS) strategy, typically adopted to extract large-scale distributed deformation and the atmospheric phase screen. Such components are also exploited for the calibration of the full-resolution data for PS or tomographic analysis. Using COSMO-SkyMed (CSK) SAR data, it is demonstrated that dominant scattering component filtering effectively improves the monitoring of distributed, spatially decorrelated areas (e.g. bare soil, rocks, etc.) and brings to light man-made structures with dominant backscattering characteristics embedded in highly temporally decorrelated scenarios, such as isolated asphalt roads and blocks of buildings in non-urban areas. Moreover, it is shown that, thanks to CAESAR's separation of multiple scattering components, the layover mitigation in low-topography eigen-interferograms relieves Phase Unwrapping (PhU) errors in urban areas due to abrupt height variations.
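The core operation behind eigen-interferograms is an eigendecomposition of the pixel covariance matrix estimated over a multilook window: the phases of the leading eigenvector give the filtered interferometric phases of the dominant scattering mechanism. A toy single-pixel sketch with simulated data (an invented phase ramp and noise levels, not the CAESAR implementation):

```python
import numpy as np

rng = np.random.default_rng(2)
n_images, n_looks = 10, 50

# Toy multilook stack for one pixel: a dominant scatterer with a
# linear-in-time phase ramp (0.3 rad per acquisition) plus noise.
signal = np.exp(1j * 0.3 * np.arange(n_images))
noise = 0.3 * (rng.standard_normal((n_images, n_looks))
               + 1j * rng.standard_normal((n_images, n_looks)))
looks = signal[:, None] + noise

# Sample covariance over the looks; the leading eigenvector's phases
# are the filtered eigen-interferogram phases (referenced to image 0).
C = looks @ looks.conj().T / n_looks
w, V = np.linalg.eigh(C)      # eigenvalues in ascending order
dominant = V[:, -1]
filtered_phase = np.angle(dominant * dominant[0].conj())
print(np.round(filtered_phase, 2))
```

The recovered phases should track the 0.3 rad/acquisition ramp despite the decorrelation noise, which is the filtering effect the paper exploits.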

  15. Crustal Deformation along San Andreas Fault System revealed by GPS and Sentinel-1 InSAR

    NASA Astrophysics Data System (ADS)

    Xu, X.; Sandwell, D. T.

    2017-12-01

    We present a crustal deformation velocity map along the San Andreas Fault System obtained by combining measurements from Sentinel-1 Interferometric Synthetic Aperture Radar (InSAR) and Global Positioning System (GPS) velocity models (CGM V1). We assembled 5 tracks of descending Sentinel-1 InSAR data spanning November 2014 to February 2017 and produced 545 interferograms, each of which covers an area of roughly 250 km x 420 km (~60 bursts). These interferograms were unwrapped using SNAPHU [Chen & Zebker, 2002], with the 2Nπ unwrapping ambiguity corrected with a sparse recovery method. We used the coherence-based small baseline subset (SBAS) method [Tong & Schmidt, 2016] together with atmospheric correction by common-point stacking [Tymofyeyeva and Fialko, 2015] to construct deformation time series [Xu et al., 2017]. We then project the horizontal GPS model and the vertical GPS data into the satellite line-of-sight directions separately. We first remove the horizontal GPS model from the InSAR measurements and perform an elevation-dependent atmospheric phase correction. We then compute the discrepancy between the remaining InSAR measurements and the vertical GPS data, interpolate this discrepancy, and remove it from the residual InSAR measurements. Finally, we restore the horizontal GPS model. Preliminary results show that fault creep on the San Jacinto fault, the Elsinore fault, and the San Andreas creeping section is clearly resolved. During the period of drought, the Central Valley of California was subsiding at a high rate (up to 40 cm/yr), while the city of San Jose was uplifting due to recharge, with a Quaternary fault acting as a groundwater barrier. These findings will be reported at the meeting.
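The projection step described here - mapping horizontal and vertical GPS velocities separately into the radar line of sight and removing the horizontal part from the InSAR rate - is a dot product with the LOS unit vector. A minimal sketch with an invented LOS geometry and hypothetical velocities (illustrative values only, not the paper's data):

```python
import numpy as np

# Unit LOS vector in (east, north, up) for a descending, right-looking
# geometry - illustrative values, not a derived Sentinel-1 geometry.
los = np.array([0.6, -0.1, 0.79])
los /= np.linalg.norm(los)

# Hypothetical GPS velocities (mm/yr): horizontal model and vertical data.
v_horizontal = np.array([12.0, -5.0, 0.0])
v_vertical = np.array([0.0, 0.0, -3.0])

# Project each into the line of sight separately, as in the text.
d_h = v_horizontal @ los
d_v = v_vertical @ los

# Removing the horizontal LOS contribution from a (toy) InSAR rate
# isolates the residual to compare against the vertical GPS data.
insar_rate = d_h + d_v + 0.5  # toy InSAR rate with a 0.5 mm/yr residual
residual = insar_rate - d_h - d_v
print(round(residual, 3))
```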

  16. Mapping ground surface deformation using temporarily coherent point SAR interferometry: Application to Los Angeles Basin

    USGS Publications Warehouse

    Zhang, L.; Lu, Zhong; Ding, X.; Jung, H.-S.; Feng, G.; Lee, C.-W.

    2012-01-01

    Multi-temporal interferometric synthetic aperture radar (InSAR) is an effective tool for detecting long-term seismotectonic motion, as it reduces atmospheric artifacts and thereby provides a more precise deformation signal. Commonly used approaches such as the persistent scatterer InSAR (PSInSAR) and small baseline subset (SBAS) algorithms need to resolve the phase ambiguities in interferogram stacks either by searching a predefined solution space or by sparse phase unwrapping methods; however, the efficiency and success of phase unwrapping cannot be guaranteed. We present here an alternative approach - temporarily coherent point (TCP) InSAR (TCPInSAR) - to estimate the long-term deformation rate without the need for phase unwrapping. The proposed approach includes a series of innovations: TCP identification, the TCP network, and the TCP least squares estimator. We apply the proposed method to the Los Angeles Basin in southern California, where structurally active faults are believed capable of generating damaging earthquakes. The analysis is based on 55 interferograms from 32 ERS-1/2 images acquired between Oct. 1995 and Dec. 2000. To evaluate the performance of TCPInSAR on a small set of observations, a test with half of the interferometric pairs was also performed. The retrieved TCPInSAR measurements have been validated by comparison with GPS observations from the Southern California Integrated GPS Network. Our result presents a deformation pattern similar to that shown in past InSAR studies but with a smaller average standard deviation (4.6 mm) compared with GPS observations, indicating that TCPInSAR is a promising alternative for efficiently mapping ground deformation even from a relatively small set of interferograms.

  17. Improved Persistent Scatterer analysis using Amplitude Dispersion Index optimization of dual polarimetry data

    NASA Astrophysics Data System (ADS)

    Esmaeili, Mostafa; Motagh, Mahdi

    2016-07-01

    Time-series analysis of Synthetic Aperture Radar (SAR) data using the two techniques of Small BAseline Subset (SBAS) and Persistent Scatterer Interferometric SAR (PSInSAR) extends the capability of the conventional interferometry technique for deformation monitoring and mitigates many of its limitations. Using dual/quad-polarized data provides an additional source of information with which to further improve the capability of InSAR time-series analysis. In this paper we use dual-polarized data and combine Amplitude Dispersion Index (ADI) optimization of pixels with a phase-stability criterion for PSInSAR analysis. ADI optimization is performed using a Simulated Annealing algorithm to increase the number of Persistent Scatterer Candidates (PSCs). The phase stability of the PSCs is then measured using their temporal coherence to select the final set of pixels for deformation analysis. We evaluate the method on a dataset comprising 17 dual-polarization (HH/VV) SAR images acquired by TerraSAR-X from July 2013 to January 2014 over a subsidence area in Iran and compare the effectiveness of the method for both agricultural and urban regions. The results reveal that using the optimum scattering mechanism decreases the ADI values in urban and non-urban regions. Compared to single-pol data, the use of optimized polarization initially increases the number of PSCs by about three times and improves the final PS density by about 50%, in particular in regions with a high rate of deformation, which suffer from loss of phase stability over time. The classification of PS pixels based on their optimum scattering mechanism revealed that the dominant scattering mechanism of PS pixels in the urban area is double-bounce, while for the non-urban regions (ground surfaces and farmlands) it is mostly single-bounce.
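The Amplitude Dispersion Index that this record optimizes is simply the standard deviation over the mean of a pixel's amplitude time series; pixels with a low ADI (a threshold of ~0.25 is the usual convention, assumed here) become PS candidates. A toy sketch on invented amplitude stacks:

```python
import numpy as np

rng = np.random.default_rng(3)
n_images = 17  # matches the stack size mentioned above

# Simulated amplitude stacks for two pixels: a stable point-like
# scatterer and a distributed, strongly fluctuating one (toy data).
stable = 100.0 + 5.0 * rng.standard_normal(n_images)
distributed = rng.rayleigh(scale=30.0, size=n_images)

def adi(amplitudes):
    """Amplitude Dispersion Index: std/mean of the amplitude series.
    Pixels below ~0.25 (assumed threshold) are the usual PS candidates."""
    return amplitudes.std() / amplitudes.mean()

print(adi(stable) < 0.25, adi(distributed) < 0.25)
```

The stable pixel's ADI is about 0.05, while the Rayleigh-distributed one sits near 0.5, so only the first passes the candidate test.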

  18. Nature-based solutions for hydro-meteorological risk reduction and nutrient removal in the Nordic and Arctic regions

    NASA Astrophysics Data System (ADS)

    Bring, Arvid; Kalantari, Zahra

    2017-04-01

    Natural ecological functions provide essential and fundamental benefits to mankind, but can also be actively employed in nature-based solutions to specific challenges in society. For example, water-related ecosystem services play a role in such societal benefits as flood protection, erosion control, and excess nutrient removal. Ecosystem services may be produced and consumed in different locations, and research has recently attempted to formalize this discrepancy by identifying service providing areas (SPAs), service benefitting areas (SBAs), and service connecting areas (SCAs). However, for water-related services there is a lack of formal evaluation of how SPAs, SBAs, and SCAs relate to hydrological measures such as discharge, flood recurrence, excess nutrient removal, etc. We seek to map SPAs, SBAs and SCAs for a number of key ecosystem services in the Nordic and Arctic region through established ecological definitions (typically based on land use) and to evaluate the findings alongside metrics of hydrological connectivity (river networks), provisioning areas (runoff-generating areas), and benefitting areas (river stretches where water flow is moderated). We make use of extensive GIS analysis using both high-resolution land cover data and river network maps. The results are expected to contribute to identifying how water-related ecosystem services can be employed as nature-based solutions for hydro-meteorological risk reduction and nutrient removal in a changing climate in the Nordic and Arctic regions.

  19. Partial nitritation of raw anaerobic sludge digester liquor by swim-bed and swim-bed activated sludge processes and comparison of their sludge characteristics.

    PubMed

    Qiao, Sen; Kawakubo, Yuki; Koyama, Toichiro; Furukawa, Kenji

    2008-11-01

    This study evaluated the performance of a swim-bed (SB) reactor packed with a novel acrylic fiber carrier (BF) and a swim-bed activated sludge (SBAS) reactor for partial nitritation of anaerobic sludge digester liquor from a municipal wastewater treatment plant. The characteristics of the sludge obtained from the two reactors were also compared. The average conversion rates of ammonium to nitrite were 52.3% and 40.0% in the two reactors, respectively, under relatively high nitrogen loading rates of over 3.0 kg-N/m³/d. The average BOD5 removal efficiencies were 74.3% and 64.4%, respectively. The sludge pellets taken from the SB reactor were approximately three times the size of the seed sludge (229 μm versus 88 μm). This sludge also had relatively high extracellular protein levels, indicating better settling capability compared with the sludge taken from the SBAS reactor. Although the effluent nitrite/ammonium ratios fluctuated in both reactors to some extent, the low dissolved oxygen concentrations (averages of 2.5 versus 0.35 mg/l), low suspended solids (averages of 33.3 versus 33.5 mg/l), and roughly 50% ammonium conversion to nitrite demonstrated the potential of applying the anammox process for nitrogen removal.

  20. Application of Satellite Based Augmentation Systems to Altitude Separation

    NASA Astrophysics Data System (ADS)

    Magny, Jean Pierre

    This paper presents the application of GNSS1, or more precisely of Satellite Based Augmentation Systems (SBAS), to vertical separation for en-route, approach and landing operations. Potential improvements in terms of operational benefit and of safety are described for two main applications. The first is vertical separation between en-route aircraft, which requires a system available across wide areas; SBAS (EGNOS, WAAS, and MSAS) are very well suited for this purpose before GNSS2 becomes available. The second is vertical separation from the ground during approach and landing, for which preliminary design principles of instrument approach procedures and safety issues are presented. Approach and landing phases are the subject of discussions within the ICAO GNSS-P. En-route phases have been listed as GNSS-P future work and by RTCA for the development of new equipment.

  1. Long-term ground deformation patterns of Bucharest using multi-temporal InSAR and multivariate dynamic analyses: a possible transpressional system?

    PubMed Central

    Armaş, Iuliana; Mendes, Diana A.; Popa, Răzvan-Gabriel; Gheorghe, Mihaela; Popovici, Diana

    2017-01-01

    The aim of this exploratory research is to capture spatial evolution patterns in the Bucharest metropolitan area using sets of single polarised synthetic aperture radar (SAR) satellite data and multi-temporal radar interferometry. Three sets of SAR data acquired during the years 1992–2010 from ERS-1/-2 and ENVISAT, and 2011–2014 from TerraSAR-X satellites were used in conjunction with the Small Baseline Subset (SBAS) and persistent scatterers (PS) high-resolution multi-temporal interferometry (InSAR) techniques to provide maps of line-of-sight displacements. The satellite-based remote sensing results were combined with results derived from classical methodologies (i.e., diachronic cartography) and field research to study possible trends in developments over former clay pits, landfill excavation sites, and industrial parks. The ground displacement trend patterns were analysed using several linear and nonlinear models and techniques. Trends based on the estimated ground displacement are characterised by long-term memory, indicated by low-noise Hurst exponents, which in the long term form interesting attractors. We hypothesize these attractors to be tectonic stress fields generated by transpressional movements. PMID:28252103
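The long-term-memory claim above rests on Hurst exponents estimated from the displacement time series. As a generic illustration of such an estimate, the rescaled-range (R/S) method below is a standard estimator, not necessarily the one used in the study, and the synthetic series are assumptions:

```python
import numpy as np

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the Hurst exponent of a series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = [s for s in (8, 16, 32, 64, 128) if s <= n // 2]
    log_s, log_rs = [], []
    for s in sizes:
        rs_vals = []
        for start in range(0, n - s + 1, s):
            chunk = x[start:start + s]
            z = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = z.max() - z.min()                 # range of the profile
            sd = chunk.std()
            if sd > 0:
                rs_vals.append(r / sd)            # rescaled range
        log_s.append(np.log(s))
        log_rs.append(np.log(np.mean(rs_vals)))
    # slope of log(R/S) versus log(window size) approximates H
    return np.polyfit(log_s, log_rs, 1)[0]

rng = np.random.default_rng(1)
white = rng.normal(size=1024)             # uncorrelated noise, H near 0.5
walk = np.cumsum(rng.normal(size=1024))   # persistent random walk, H near 1
h_white, h_walk = hurst_rs(white), hurst_rs(walk)
print(h_white, h_walk)
```

A persistent (long-memory) displacement trend yields an exponent well above that of uncorrelated noise, which is the contrast the abstract appeals to.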

  2. Long-term ground deformation patterns of Bucharest using multi-temporal InSAR and multivariate dynamic analyses: a possible transpressional system?

    PubMed

    Armaş, Iuliana; Mendes, Diana A; Popa, Răzvan-Gabriel; Gheorghe, Mihaela; Popovici, Diana

    2017-03-02

    The aim of this exploratory research is to capture spatial evolution patterns in the Bucharest metropolitan area using sets of single polarised synthetic aperture radar (SAR) satellite data and multi-temporal radar interferometry. Three sets of SAR data acquired during the years 1992-2010 from ERS-1/-2 and ENVISAT, and 2011-2014 from TerraSAR-X satellites were used in conjunction with the Small Baseline Subset (SBAS) and persistent scatterers (PS) high-resolution multi-temporal interferometry (InSAR) techniques to provide maps of line-of-sight displacements. The satellite-based remote sensing results were combined with results derived from classical methodologies (i.e., diachronic cartography) and field research to study possible trends in developments over former clay pits, landfill excavation sites, and industrial parks. The ground displacement trend patterns were analysed using several linear and nonlinear models and techniques. Trends based on the estimated ground displacement are characterised by long-term memory, indicated by low-noise Hurst exponents, which in the long term form interesting attractors. We hypothesize these attractors to be tectonic stress fields generated by transpressional movements.

  3. Spatiotemporal deformation patterns of the Lake Urmia Causeway as characterized by multisensor InSAR analysis.

    PubMed

    Karimzadeh, Sadra; Matsuoka, Masashi; Ogushi, Fumitaka

    2018-04-03

    We present deformation patterns in the Lake Urmia Causeway (LUC) in NW Iran based on data collected from four SAR sensors in the form of interferometric synthetic aperture radar (InSAR) time series. Sixty-eight images from Envisat (2004-2008), ALOS-1 (2006-2010), TerraSAR-X (2012-2013) and Sentinel-1 (2015-2017) were acquired, and 227 filtered interferograms were generated using the small baseline subset (SBAS) technique. The rate of line-of-sight (LOS) subsidence of the LUC peaked at 90 mm/year between 2012 and 2013, mainly due to the loss of most of the water in Lake Urmia. Principal component analysis (PCA) was conducted on 200 randomly selected time series of the LUC, and the results are presented in the form of the three major components. The InSAR scores obtained from the PCA were used in a hydro-thermal model to investigate the dynamics of consolidation settlement along the LUC based on detrended water level and temperature data. The results can be used to establish a geodetic network around the LUC to identify more detailed deformation patterns and to help plan future efforts to reduce the possible costs of damage.
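The PCA step described above can be reproduced in outline with a plain SVD. The synthetic stand-in below for the 200 randomly selected time series (its temporal modes, sizes, and noise level) is an illustrative assumption, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for 200 InSAR time series (rows: pixels, cols: 60
# epochs), mixing a dominant subsidence mode with a weak seasonal mode.
t = np.linspace(0, 1, 60)
mode1 = -90.0 * t                  # linear subsidence (mm over the span)
mode2 = np.sin(2 * np.pi * t)      # weak seasonal-like component
weights = rng.uniform(0, 1, size=(200, 2))
series = weights @ np.vstack([mode1, mode2]) + 0.5 * rng.normal(size=(200, 60))

# PCA via SVD of the mean-centred data matrix
centred = series - series.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = u * s                     # per-pixel component scores
explained = s**2 / np.sum(s**2)    # variance fraction per component
print(explained[:3])
```

The rows of `vt` are the temporal components (the "three major components" of the abstract would be `vt[:3]`), and the per-pixel `scores` play the role of the InSAR scores fed into the subsequent hydro-thermal modelling.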

  4. Mapping slope movements in Alpine environments using TerraSAR-X interferometric methods

    NASA Astrophysics Data System (ADS)

    Barboux, Chloé; Strozzi, Tazio; Delaloye, Reynald; Wegmüller, Urs; Collet, Claude

    2015-11-01

    Mapping slope movements in Alpine environments is an increasingly important task in the context of climate change and natural hazard management. We propose the detection, mapping and inventorying of slope movements using different interferometric methods based on TerraSAR-X satellite images. Differential SAR interferograms (DInSAR), Persistent Scatterer Interferometry (PSI), Short-Baseline Interferometry (SBAS) and a semi-automated texture image analysis are presented and compared in order to determine their contribution to the automatic detection and mapping of slope movements at the various velocity rates encountered in Alpine environments. Investigations are conducted in a study region of about 6 km × 6 km located in the Western Swiss Alps using a unique large data set of 140 DInSAR scenes computed from 51 summer TerraSAR-X (TSX) acquisitions from 2008 to 2012. We found that PSI is able to precisely detect only points moving with velocities below 3.5 cm/yr in the LOS, with a root mean squared error of about 0.58 cm/yr compared to DGPS records. SBAS employed with 11-day summer interferograms increases the range of detectable movements to rates of up to 35 cm/yr in the LOS, with a root mean squared error of 6.36 cm/yr, but inaccurate measurements due to phase unwrapping are already possible for velocity rates larger than 20 cm/yr. With the semi-automated texture image analysis, the rough estimation of velocity rates over an outlined moving zone is accurate for rates of "cm/day", "dm/month" and "cm/month", but due to the decorrelation of yearly TSX interferograms this method fails for the observation of slow movements in the "cm/yr" range.

  5. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    NASA Astrophysics Data System (ADS)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims to improve scientific understanding of the complex mechanisms driving the changes affecting our planet, and to identify and establish interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and made available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems)1 and INSPIRE (the Infrastructure for Spatial Information in the European Community)2. The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration at varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First, there is the semantic difficulty of selecting classification schemes that contain concepts relevant to each thematic area.
Secondly, EuroGEOSS is intended to accommodate a number of existing environmental projects (for example, GEOSS and INSPIRE). This requirement imposes constraints on the selection. Thirdly, the selected classification scheme or group of schemes (if more than one) must be capable of alignment (establishing different kinds of mappings between concepts, hence preserving intact the original knowledge schemes) or merging (the creation of another unique ontology from the original ontological sources) (Pérez-Gómez et al., 2004). Last but not least, there is the issue of including multi-lingual schemes that are based on free, open standards (non-proprietary). Using these selection criteria, we aim to support open and convenient data discovery and exchange for users who speak different languages (particularly the European ones for the broad scopes of EuroGEOSS). In order to support the project, we have developed a solution that employs two classification schemes: the Societal Benefit Areas (SBAs)3: the upper-level environmental categorization developed for the GEOSS project and the GEneral Multilingual Environmental Thesaurus (GEMET)4: a general environmental thesaurus whose conceptual structure has already been integrated with the spatial data themes proposed by the INSPIRE project. The former seems to provide the spatial data keywords relevant to the INSPIRE's Directive (JRC, 2008). In this way, we provide users with a basic set of concepts to support resource description and discovery in the thematic areas while supporting the requirements of INSPIRE and GEOSS. Furthermore, the use of only two classification schemes together with the fact that the SBAs are very general categories while GEMET includes much more detailed, yet still top-level, concepts, makes alignment an achievable task. Alignment was selected over merging because it leaves the existing classification schemes intact and requires only a simple activity of defining mappings from GEMET to the SBAs. 
    In order to accomplish this task we are developing a simple, automated, open-source application to assist thematic experts in defining the mappings between concepts in the two classification schemes. The application will then generate SKOS mappings (exactMatch, closeMatch, broadMatch, narrowMatch, relatedMatch) based on thematic expert selections between the concepts in GEMET and the SBAs (including both the general Societal Benefit Areas and their subcategories). Once these mappings are defined and the SKOS files generated, resource providers will be able to select concepts from either GEMET or the SBAs (or a mixture) to describe their resources, and discovery approaches will support selection of concepts from either classification scheme, also returning results classified using the other scheme. While the focus of our work has been on the SBAs and GEMET, we also plan to provide a method for resource providers to further extend the semantic infrastructure by defining alignments to new classification schemes if these are required to support particular specialized thematic areas that are not covered by GEMET. In this way, the approach is flexible and suited to the general scope of EuroGEOSS, allowing specialists to increase at will the level of semantic quality and specificity of data in the initial infrastructural skeleton of the project. References: Joint Research Centre (JRC), 2008. INSPIRE Metadata Editor User Guide. Pérez-Gómez A., Fernandez-Lopez M., Corcho O., 2004. Ontological Engineering: With Examples from the Areas of Knowledge Management, e-Commerce and the Semantic Web. Springer: London.
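The mapping-generation step described above lends itself to a compact sketch that emits SKOS alignment triples as Turtle. The GEMET concept namespace is real, but the SBA namespace, the concept identifiers, and the expert selections below are placeholders:

```python
# Minimal sketch of SKOS mapping generation between GEMET concepts and
# the GEOSS SBAs. Identifiers and selections are hypothetical examples.
GEMET = "http://www.eionet.europa.eu/gemet/concept/"
SBA = "http://example.org/geoss/sba/"          # placeholder SBA namespace

# (gemet_id, sba_id, mapping property) triples chosen by a thematic expert;
# the properties mirror the SKOS mapping relations named in the text.
expert_selections = [
    ("2885", "ecosystems", "skos:closeMatch"),
    ("1085", "biodiversity", "skos:exactMatch"),
    ("3016", "water", "skos:broadMatch"),
]

def to_turtle(selections):
    """Serialize expert selections as SKOS mapping triples in Turtle."""
    lines = ["@prefix skos: <http://www.w3.org/2004/02/skos/core#> ."]
    for gemet_id, sba_id, prop in selections:
        lines.append(f"<{GEMET}{gemet_id}> {prop} <{SBA}{sba_id}> .")
    return "\n".join(lines)

turtle = to_turtle(expert_selections)
print(turtle)
```

A discovery service could then load the resulting file alongside both schemes, so a query phrased in GEMET terms also returns resources tagged with the mapped SBA (and vice versa).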

  6. Applications of Radar Interferometric Techniques to Assess Natural Hazards and their Controlling Factors

    NASA Astrophysics Data System (ADS)

    Sultan, M.; Becker, R.; Gebremichael, E.; Othman, A.; Emil, M.; Ahmed, M.; Elkadiri, R.; Pankratz, H. G.; Chouinard, K.

    2015-12-01

    Radar interferometric techniques including Persistent Scatterer (PS), Small BAseline Subset (SBAS), and two- and three-pass (differential interferometry) methods were applied to Synthetic Aperture Radar (SAR) datasets. These include the European Space Agency (ESA) ERS-1, ERS-2, Environmental satellite (Envisat), and Phased Array type L-band Synthetic Aperture Radar (PALSAR) to conduct the following: (1) map the spatial distribution of land deformation associated with a wide range of geologic settings, (2) quantify the rates of the observed land deformation, and (3) identify the factors controlling the observed deformation. The research topics/areas include: (1) subsidence associated with sediment compaction in a delta setting (Nile Delta, Egypt), (2) deformation in a rifting setting (Red Sea rifting along the Red Sea coastal zone and proximal basement outcrops in Egypt and Saudi Arabia), (3) deformation associated with salt dome intrusion and the dissolution of sabkha deposits (Jazan area in Saudi Arabia), (4) mass transport associated with debris flows (Jazan area in Saudi Arabia), and (5) deformation preceding, contemporaneous with, or following large earthquakes (in Nepal; magnitude: 7.8; date: April 25, 2015) and medium earthquakes (in Harrat Lunayyir volcanic field, central Saudi Arabia; magnitude: 5.7; date: May 19, 2009). The identification of the factor(s) controlling the observed deformation was attained through spatial correlation of extracted radar velocities with relevant temporal and static ground-based and remotely sensed geological and cultural data sets (e.g., lithology, structure, precipitation, land use, and earthquake location, magnitude, and focal mechanism) in a Geographical Information System (GIS) environment.

  7. Sulfamoylbenzamide Derivatives Inhibit the Assembly of Hepatitis B Virus Nucleocapsids

    PubMed Central

    Campagna, Matthew R.; Liu, Fei; Mao, Richeng; Mills, Courtney; Cai, Dawei; Guo, Fang; Zhao, Xuesen; Ye, Hong; Cuconati, Andrea; Guo, Haitao; Chang, Jinhong; Xu, Xiaodong; Block, Timothy M.

    2013-01-01

    Chronic hepatitis B virus (HBV) infection, a serious public health problem leading to cirrhosis and hepatocellular carcinoma, is currently treated with either pegylated alpha interferon (pegIFN-α) or one of the five nucleos(t)ide analogue viral DNA polymerase inhibitors. However, neither pegIFN-α nor nucleos(t)ide analogues are capable of reliably curing the viral infection. In order to develop novel antiviral drugs against HBV, we established a cell-based screening assay by using an immortalized mouse hepatocyte-derived stable cell line supporting a high level of HBV replication in a tetracycline-inducible manner. Screening of a library consisting of 26,900 small molecules led to the discovery of a series of sulfamoylbenzamide (SBA) derivatives that significantly reduced the amount of cytoplasmic HBV DNA. Structure-activity relationship studies have thus far identified a group of fluorine-substituted SBAs with submicromolar antiviral activity against HBV in human hepatoma cells. Mechanistic analyses reveal that the compounds dose dependently inhibit the formation of pregenomic RNA (pgRNA)-containing nucleocapsids of HBV but not other animal hepadnaviruses, such as woodchuck hepatitis virus (WHV) and duck hepatitis B virus (DHBV). Moreover, heterologous genetic complementation studies of capsid protein, DNA polymerase, and pgRNA between HBV and WHV suggest that HBV capsid protein confers sensitivity to the SBAs. In summary, SBAs represent a novel chemical entity with superior activity and a unique antiviral mechanism and are thus warranted for further development as novel antiviral therapeutics for the treatment of chronic hepatitis B. PMID:23576513

  8. InSAR analysis of the crustal deformation affecting the megacity of Istanbul: the results of the FP7 Marsite Project as a GEO Supersite Initiative

    NASA Astrophysics Data System (ADS)

    Solaro, Giuseppe; Bonano, Manuela; Manzo, Mariarosaria

    2016-04-01

    The North Anatolian Fault (NAF) is one of the most active faults worldwide, extending approximately 1,200 km from Eastern Turkey to the Northern Aegean Sea. During the 20th century, a series of damaging earthquakes occurred along the NAF, generally propagating westward towards Istanbul; the last one occurred in 1999 at Izmit, a city 80 km away from Istanbul. Within this scenario, the FP7 MARsite project (New Directions in Seismic Hazard assessment through Focused Earth Observation in Marmara Supersite), supported by the EU, intends to collect, share and integrate multidisciplinary data (seismologic, geochemical, surveying, satellite, etc.) in order to carry out assessment, mitigation and management of seismic risk in the region of the Sea of Marmara. In the framework of the MARsite project, we performed the analysis and monitoring of the surface deformation affecting the Istanbul megacity by exploiting the large archives of X-band satellite SAR data, made available through the Supersites Initiatives, and by processing them via the advanced multi-temporal and multi-scale InSAR technique known as the Small BAseline Subset (SBAS) approach. In particular, we applied the SBAS technique to a dataset of 101 SAR images acquired by the TerraSAR-X constellation of the German Space Agency (DLR) over descending orbits and spanning the November 2010 - August 2014 time interval. From these images, we generated 312 differential interferograms with a maximum spatial separation (perpendicular baseline) between the acquisition orbits of about 500 m, which were used to generate, via the SBAS approach, the mean deformation velocity map and the corresponding ground deformation time series of the investigated area. The performed InSAR analysis reveals a generalized stability over the Istanbul area, except for some localized displacements related to subsidence and slope instability phenomena. 
In particular, we identified: (i) a displacement pattern at the Istanbul airport, showing a mostly linear deformation trend with a velocity of about 1 cm/yr, although a slowdown is observed starting from early 2014, and (ii) a subsidence phenomenon at the Miniaturk park with a mean velocity of about 1.5 cm/yr. Moreover, benefiting from the recent launch of the C-band Sentinel-1A (S1A) satellite (April 2014), developed within the European Commission Copernicus Programme, we performed a more recent InSAR analysis of the Istanbul megacity and its surroundings by exploiting datasets collected during the October 2014 - December 2015 time interval. In this way, we are able to investigate possible changes over time in the spatial and temporal deformation rates of the detected displacements, with the aim of improving the comprehension of the deformation processes that have already occurred and/or are still occurring in this area. The obtained results, generated within the MARsite project, are expected to be made available to the scientific community through the Geohazards Exploitation Platform (GEP), an ESA platform aimed at supporting the exploitation of satellite EO data for geohazards, in order to be fully compliant with some of the major issues of the Supersites Initiative, as well as to foster InSAR data sharing within a wider scientific community.

  9. The Afar rift zone deformation dynamics retrieved through phase and amplitude SAR data

    NASA Astrophysics Data System (ADS)

    Casu, F.; Pagli, C.; Paglia, L.; Wang, H.; Wright, T. J.; Lanari, R.

    2011-12-01

    The Dabbahu rift segment of the Afar depression has been active since 2005, when a 2.5 km3 dyke intrusion and hundreds of earthquakes marked the onset of a rifting episode which continues to date. Since 2003, the Afar depression has been repeatedly imaged by the ENVISAT satellite, generating a large SAR archive which allows us to study the ongoing deformation processes and the dynamics of magma movements. We combine sets of small baseline interferograms through the advanced DInSAR algorithm referred to as the Small BAseline Subset (SBAS) technique, and we generate both ground deformation maps and time series along the satellite Line-Of-Sight (LOS), with accuracies on the order of 5 mm. The main limitation of DInSAR applications is that large and rapid deformations, such as those caused by dyke intrusions and eruptions in Afar, cannot be fully measured. The phase information often degrades, and some areas of the interferograms are affected by high fringe rates, leading to difficulties in phase unwrapping and/or to complete loss of coherence due to significant misregistration errors. This limitation can be overcome by exploiting the SAR image amplitude information instead of the phase, and by calculating the Pixel-Offset (PO) field of a given SAR image pair in both the range and azimuth directions. Moreover, after computing the POs for each image pair, it is possible to combine them, following the same rationale as the SBAS technique, to retrieve offset-based deformation time series. This technique, named PO-SBAS, permits retrieval of the deformation field in areas affected by very large displacements, with an accuracy that, for ENVISAT data, corresponds to 30 cm and 15 cm for range and azimuth, respectively. In this work, we study the deformations of the Afar rift region by using both the phase and amplitude information of several sets of SAR images acquired from ascending and descending ENVISAT tracks. 
In particular, we use the phase information to construct dense and accurate deformation maps and time series in areas not affected by large displacements, while in areas where the deformation gradient causes loss of coherence we retrieve the displacement field through the amplitude information. This approach allows us to obtain a spatially detailed deformation map of the study area. In addition, by combining ascending and descending data we reconstruct the vertical and East-West components of the deformation field. Furthermore, in areas affected by large deformations, we can also retrieve the full 3D deformation field by using the North-South displacement component obtained from the azimuth PO information. Distinct sources of deformation interact in Afar. Fault movements and magma chamber deflation have accompanied dyke intrusions, but quantifying each contribution to the total deformation has been challenging, partly due to loss of coherence in the central part of the rift. Here we combine the phase and amplitude information in order to retrieve the full deformation field of repeated intrusions. This allows us to better constrain the fault movements that occur as the dyke propagates, as well as the magma movements from individual magma chambers.
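The SBAS rationale invoked above for combining pairwise measurements (interferometric phases or pixel offsets) into time series can be illustrated with a toy least-squares inversion. This is a simplified sketch: classic SBAS solves for inter-epoch velocities, whereas here the epoch displacements are solved for directly, and the epochs, pairs, and values are made up:

```python
import numpy as np

# A network of small-baseline pairs forming two disconnected subsets,
# the situation where SBAS uses SVD (minimum-norm L2) to bridge the gap.
n_epochs = 6
pairs = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]

true = np.array([0.0, 2.0, 5.0, 9.0, 14.0, 20.0])   # displacement per epoch
obs = np.array([true[j] - true[i] for i, j in pairs])  # pairwise measurements

# Design matrix: each pair observes d[j] - d[i]
A = np.zeros((len(pairs), n_epochs))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0

# Minimum-norm least-squares solution via the pseudoinverse (SVD)
d_hat = np.linalg.pinv(A) @ obs
print(d_hat)
```

Within each connected subset the recovered series matches the truth up to a constant (the SVD picks the zero-mean representative per subset), which is exactly why SBAS-type inversions recover relative deformation histories rather than absolute offsets between disconnected subsets.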

  10. From local to national scale DInSAR analysis for the comprehension of Earth's surface dynamics.

    NASA Astrophysics Data System (ADS)

    De Luca, Claudio; Casu, Francesco; Manunta, Michele; Zinno, Ivana; Lanari, Riccardo

    2017-04-01

    Earth Observation techniques can be very helpful for the estimation of several sources of ground deformation thanks to their large spatial coverage, high resolution and cost effectiveness. In this scenario, Differential Synthetic Aperture Radar Interferometry (DInSAR) is one of the most effective methodologies for its capability to generate spatially dense deformation maps with centimeter- to millimeter-level accuracy. DInSAR exploits the phase difference (interferogram) between SAR image pairs relevant to acquisitions gathered at different times, but with the same illumination geometry and from sufficiently close flight tracks, whose separation is typically referred to as the baseline. Among the several available approaches, the SBAS algorithm is one of the most widely used and is aimed at generating displacement time series at a multi-scale level by exploiting a set of small baseline interferograms. SBAS, and DInSAR in general, has benefited from the large availability of spaceborne SAR data collected over the years by several satellite systems, with particular regard to the European ERS and ENVISAT sensors, which acquired SAR images worldwide for approximately 20 years. While the application of SBAS to ERS and ENVISAT data at local scale is widely documented, very few examples in the literature involve those archives in analyses at very large spatial scales. This is mainly due to the required processing power (in terms of CPUs, memory and storage) and the limited availability of automatic processing procedures (unsupervised tools), which are mandatory requirements for obtaining displacement results in a time-effective way. Accordingly, in this work we present a methodology for generating the Vertical and Horizontal (East-West) components of Earth's surface deformation at very large (national/continental) spatial scale. 
In particular, it relies on the availability of a set of SAR data collected over an Area of Interest (AoI), which could be some hundreds of thousands of square kilometers wide, from ascending and descending orbits. The exploited SAR data are processed, on a local basis, through the Parallel SBAS (P-SBAS) approach, thus generating the displacement time series and the corresponding mean deformation velocity maps. Subsequently, starting from the so-generated DInSAR results, the proposed methodology relies on a proper mosaicking procedure to finally retrieve the mean velocity maps of the Vertical and Horizontal (East-West) deformation components relevant to the overall AoI. This technique permits accounting for possible regional trends (tectonic trends) not easily detectable by local-scale DInSAR analyses. We tested the proposed methodology with the ENVISAT ASAR archives acquired, from ascending and descending orbits, over California (US), covering an area of about 100,000 km2. The presented methodology can be easily applied to other SAR satellite data as well. Above all, it is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, which collects data with a global coverage policy and an acquisition mode specifically designed for interferometric applications.
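Deriving the vertical and east-west components from ascending and descending LOS velocities reduces, per pixel, to a small linear system. The sketch below uses a simplified geometry in which the north component is neglected (a common approximation for near-polar orbits); the incidence angles, sign convention, and synthetic velocities are assumptions, not the paper's exact processing:

```python
import numpy as np

# Incidence angles of the two viewing geometries (assumed values)
theta_asc, theta_desc = np.radians(23.0), np.radians(23.0)

# LOS sensitivity model: v_los = cos(theta)*v_up + s*sin(theta)*v_east,
# with opposite east-west sign for the two right-looking geometries.
A = np.array([
    [np.cos(theta_asc), -np.sin(theta_asc)],    # ascending row
    [np.cos(theta_desc), np.sin(theta_desc)],   # descending row
])

# Synthetic pixel: 10 mm/yr subsidence plus 4 mm/yr eastward motion
v_up_true, v_east_true = -10.0, 4.0
v_los = A @ np.array([v_up_true, v_east_true])  # simulated asc/desc LOS rates

# Invert the 2x2 system per pixel to recover the two components
v_up, v_east = np.linalg.solve(A, v_los)
print(v_up, v_east)
```

Applied to every pixel of the mosaicked ascending and descending velocity maps, this inversion yields the vertical and east-west velocity maps described in the abstract.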

  11. Developing an Arctic Observing Network: Looking Beyond Scientific Research as a Driver to Broader Societal Benefits as Drivers

    NASA Astrophysics Data System (ADS)

    Jeffries, M. O.

    2017-12-01

    This presentation will address the first-ever application of the Societal Benefit Areas approach to continuing efforts to develop an integrated pan-Arctic Observing Network. The scientific research community has been calling for an Arctic Observing Network since at least the early years of this century. There is no question of the importance of research-driven observations at a time when rapid changes occurring throughout the Arctic environmental system are affecting people and communities in the Arctic and in regions far from the Arctic. Observations are needed for continued environmental monitoring and change detection; improving understanding of how the system and its components function, and how they are connected to lower-latitude regions; advancing numerical modeling capabilities for forecasting and projection; and developing value-added products and services for people and communities, and for decision- and policymaking. Scientific research is, without question, a benefit to society, but the benefits of Earth observations extend beyond scientific research. Societal Benefit Areas (SBAs) were first described by the international Group on Earth Observations (GEO) and have since been used by USGEO as the basis for its National Earth Observation Assessments. The most recent application of SBAs to Earth observing produced a framework of SBAs, SBA Sub-areas, and Key Objectives required for the completion of a full Earth observing assessment for the Arctic. This framework, described in a report released in June 2017, and a brief history of international efforts to develop an integrated pan-Arctic Observing Network, are the subjects of this presentation.

  12. Factors that influence the provision of intrapartum and postnatal care by skilled birth attendants in low- and middle-income countries: a qualitative evidence synthesis.

    PubMed

    Munabi-Babigumira, Susan; Glenton, Claire; Lewin, Simon; Fretheim, Atle; Nabudere, Harriet

    2017-11-17

    In many low- and middle-income countries women are encouraged to give birth in clinics and hospitals so that they can receive care from skilled birth attendants. A skilled birth attendant (SBA) is a health worker such as a midwife, doctor, or nurse who is trained to manage normal pregnancy and childbirth. (S)he is also trained to identify, manage, and refer any health problems that arise for mother and baby. The skills, attitudes and behaviour of SBAs, and the extent to which they work in an enabling working environment, impact on the quality of care provided. If any of these factors are missing, mothers and babies are likely to receive suboptimal care. To explore the views, experiences, and behaviours of skilled birth attendants and those who support them; to identify factors that influence the delivery of intrapartum and postnatal care in low- and middle-income countries; and to explore the extent to which these factors were reflected in intervention studies. Our search strategies specified key and free text terms related to the perinatal period, and the health provider, and included methodological filters for qualitative evidence syntheses and for low- and middle-income countries. We searched MEDLINE, OvidSP (searched 21 November 2016), Embase, OvidSP (searched 28 November 2016), PsycINFO, OvidSP (searched 30 November 2016), POPLINE, K4Health (searched 30 November 2016), CINAHL, EBSCOhost (searched 30 November 2016), ProQuest Dissertations and Theses (searched 15 August 2013), Web of Science (searched 1 December 2016), World Health Organization Reproductive Health Library (searched 16 August 2013), and World Health Organization Global Health Library for WHO databases (searched 1 December 2016). We included qualitative studies that focused on the views, experiences, and behaviours of SBAs and those who work with them as part of the team. We included studies from all levels of health care in low- and middle-income countries. 
One review author extracted data and assessed study quality, and another review author checked the data. We synthesised data using the best fit framework synthesis approach and assessed confidence in the evidence using the GRADE-CERQual (Confidence in the Evidence from Reviews of Qualitative research) approach. We used a matrix approach to explore whether the factors identified by health workers in our synthesis as important for providing maternity care were reflected in the interventions evaluated in the studies in a related intervention review. We included 31 studies that explored the views and experiences of different types of SBAs, including doctors, midwives, nurses, auxiliary nurses and their managers. The included studies took place in Africa, Asia, and Latin America. Our synthesis pointed to a number of factors affecting SBAs' provision of quality care. The following factors were based on evidence assessed as of moderate to high confidence. Skilled birth attendants reported that they were not always given sufficient training during their education or after they had begun clinical work. Also, inadequate staffing of facilities could increase the workloads of skilled birth attendants, make it difficult to provide supervision and result in mothers being offered poorer care. In addition, SBAs did not always believe that their salaries and benefits reflected their tasks and responsibilities and the personal risks they undertook. Together with poor living and working conditions, these issues were seen to increase stress and to negatively affect family life. Some SBAs also felt that managers lacked capacity and skills, and felt unsupported when their workplace concerns were not addressed. Possible causes of staff shortages in facilities included problems with hiring and assigning health workers to facilities where they were needed; lack of funding; poor management and bureaucratic systems; and low salaries.
Skilled birth attendants and their managers suggested factors that could help recruit, keep, and motivate health workers, and improve the quality of care; these included good-quality housing, allowances for extra work, paid vacations, continuing education, appropriate assessments of their work, and rewards. Skilled birth attendants' ability to provide quality care was also limited by a lack of equipment, supplies, and drugs; blood and the infrastructure to manage blood transfusions; electricity and water supplies; and adequate space and amenities on maternity wards. These factors were seen to reduce SBAs' morale, increase their workload and infection risk, and make them less efficient in their work. A lack of transport sometimes made it difficult for SBAs to refer women on to higher levels of care. In addition, women's negative perceptions of the health system could make them reluctant to accept referral. We identified some other factors that also may have affected the quality of care, which were based on findings assessed as of low or very low confidence. Poor teamwork and lack of trust and collaboration between health workers appeared to negatively influence care. In contrast, good collaboration and teamwork appeared to increase skilled birth attendants' motivation, their decision-making abilities, and the quality of care. Skilled birth attendants' workloads and staff shortages influenced their interactions with mothers. In addition, poor communication undermined trust between skilled birth attendants and mothers. Many factors influence the care that SBAs are able to provide to mothers during childbirth. These include access to training and supervision; staff numbers and workloads; salaries and living conditions; and access to well-equipped, well-organised healthcare facilities with water, electricity, and transport.
Other factors that may play a role include the existence of teamwork and of trust, collaboration, and communication between health workers and with mothers. Skilled birth attendants reported many problems tied to all of these factors.

  13. Knowledge, attitudes and practices of traditional birth attendants in pastoralist communities of Laikipia and Samburu counties, Kenya: a cross-sectional survey

    PubMed Central

    Reeve, Matthew; Onyo, Pamela; Nyagero, Josephat; Morgan, Alison; Nduba, John; Kermode, Michelle

    2016-01-01

    Introduction Current efforts to reduce maternal and newborn mortality focus on promoting institutional deliveries with skilled birth attendants (SBAs), and discouraging deliveries at home attended by traditional birth attendants (TBAs). In rural Kenya, semi-nomadic pastoralist communities are underserved by the formal health system, experience high maternal and neonatal mortality, and rely primarily on TBAs for delivery care, despite Government proscription of TBA-assisted births. This study examined the knowledge, attitude and practices of TBAs serving these communities to assess the potential for collaboration between TBAs and SBAs. Methods A cross-sectional, interviewer-administered survey was conducted among 171 TBAs from Maasai and Samburu pastoralist communities in Laikipia and Samburu counties, Kenya, as part of a larger mixed-methods study in partnership with a local service provider. Results TBAs were relatively elderly (mean age 59.6 years), and attended an average of 5-6 deliveries per year. A minority (22.2%) had received formal training. They provided antenatal, intra-partum and post-partum care. Most TBA care was non-interventionist, but not necessarily consistent with best practice. Most had encountered birth complications, but knowledge regarding management of complications was sub-optimal. Most had previously referred at least one woman to a health facility (80.1%), were key participants in decision making to refer women (96.5%), and had been present at an institutional delivery (54.4%). Conclusion TBAs continue to be key providers of maternal and neonatal healthcare in regions where the formal health system has poor coverage or acceptability. Strengthening existing TBA/SBA collaborations could improve both community links to the formal health system, and the quality of care provided to pastoralist women, while remaining consistent with current Government policy. PMID:28439337

  14. "I was on the way to the hospital but delivered in the bush": Maternal health in Ghana's Upper West Region in the context of a traditional birth attendants' ban.

    PubMed

    Rishworth, Andrea; Dixon, Jenna; Luginaah, Isaac; Mkandawire, Paul; Tampah Prince, Caesar

    2016-01-01

    This study examines perceptions and experiences of mothers, traditional birth attendants (TBA), and skilled birth attendants (SBA) regarding Ghana's recent policy that forbids TBAs from undertaking deliveries and restricts their role to referrals. In the larger context of Ghana's highly underdeveloped and geographically uneven health care system, this study draws on the political ecology of health framework to explore the ways global safe motherhood policy discourses intersect with local socio-cultural and political environments of Ghana's Upper West Region (UWR). This study reveals that futile improvements in maternal health and the continued reliance on TBAs illustrate the government's inability to understand local realities marked by poor access to SBAs or modern health care services. Using focus group discussions (FGDs) (n = 10) and in-depth interviews (IDIs) (n = 48) conducted in Ghana's UWR, the findings suggest that mothers generally perceive TBAs as better placed to conduct deliveries in rural isolated communities, where in most cases no SBAs are present or easily accessible. The results indicate that by adhering to the World Health Organization's guidelines, the local government may be imposing detrimental, unintended consequences on maternal and child health in remote rural locations. In addition, the findings suggest that the new policy has resulted in considerable confusion among TBAs, many of whom remain oblivious or have not been officially notified about the new policy. Furthermore, participant accounts suggest that the new policy is seen as contributing to worsening relations and tensions between TBAs and SBAs, a situation that undermines the delivery of maternal health services in the region. The study concludes by suggesting relevant policy recommendations.

  15. The Subsidence Signature Due To Groundwater Extraction as Inferred from Remote Sensing Data in Mexico City

    NASA Astrophysics Data System (ADS)

    Patel, V.; Chen, J.

    2015-12-01

    Mexico City is facing a severe water shortage; current drought conditions in the city have led to an increase in the demand for groundwater, the pumping of which can cause significant land subsidence. In this study we explored what new information interferometric synthetic aperture radar (InSAR) data collected by the TerraSAR-X satellite could bring to water resource managers in the city so that they can efficiently and sustainably allocate water resources. Previous work done over Mexico City indicates that InSAR can be used to detect deformation due to groundwater pumping. Cabral-Cano et al. (2008) processed InSAR data acquired from ERS between 1996-2000 and from ENVISAT between 2003-2005. They compared the deformation map to geology maps of the region with information obtained by seismic methods. They found a spatial correlation between the land deformation and the presence of young lacustrine clay beds, which indicates that the subsidence was caused by fluid pressure loss in the aquitard. They also concluded that the subsidence, for the most part, had no seasonal variation and continues to occur at near-constant, high rates. TerraSAR-X satellite data is known to be more sensitive to small deformations than the data from satellites used in previous studies in the region because of its frequent revisit cycle, short wavelength, and accurate orbital information. For this project, we derived long sequences of crustal deformation time series from TerraSAR-X data between May 2011 and December 2012 using the Small Baseline Subset (SBAS) method. The resulting time series was then compared to GPS data for calibration and validation. We observed a long-term deformation that was similar to those found in previous studies. The next step in our work is to determine whether the increased sensitivity of the TerraSAR-X data allows us to detect a seasonal deformation pattern over the study area.
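
    The core SBAS time-series inversion used here can be sketched in a few lines. This is a minimal illustration of the classical L2 (SVD/pseudo-inverse) formulation with a hypothetical five-epoch network and a simulated constant subsidence rate; it is not the actual processing chain of the study.

```python
import numpy as np

# Hypothetical acquisition times in years and small-baseline interferogram pairs (i, j), i < j.
t = np.array([0.0, 0.1, 0.25, 0.4, 0.6])
pairs = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]

dt = np.diff(t)                      # intervals between consecutive acquisitions
B = np.zeros((len(pairs), len(dt)))  # SBAS design matrix: each row sums the intervals spanned
for r, (i, j) in enumerate(pairs):
    B[r, i:j] = dt[i:j]

# Simulate unwrapped interferometric displacements for a constant 2 cm/yr subsidence.
v_true = np.full(len(dt), -2.0)      # cm/yr on each interval
d_obs = B @ v_true

# L2 (SVD / pseudo-inverse) inversion for the interval velocities, then integrate.
v_est = np.linalg.pinv(B) @ d_obs
disp = np.concatenate([[0.0], np.cumsum(v_est * dt)])  # displacement at each epoch
```

    Because this toy network is fully connected, the pseudo-inverse recovers the interval velocities exactly; with disconnected subsets, the SVD yields the minimum-norm solution, which is the point of the SBAS formulation.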

  16. Assessment of DInSAR Potential in Simulating Geological Subsurface Structure

    NASA Astrophysics Data System (ADS)

    Fouladi Moghaddam, N.; Rudiger, C.; Samsonov, S. V.; Hall, M.; Walker, J. P.; Camporese, M.

    2013-12-01

    High resolution geophysical surveys, including seismic, gravity and magnetic surveys, provide valuable information about subsurface structure, but they are costly and time consuming, with non-unique and sometimes conflicting interpretations. Several recent studies have examined the application of DInSAR to estimate surface deformation, monitor possible fault reactivation and constrain reservoir dynamic behaviour in geothermal and groundwater fields. The main focus of these studies was to generate a map representing the reservoir extraction-induced deformation. This research study, however, will focus on developing methods to simulate subsurface structuring and identify hidden faults/hydraulic barriers using DInSAR surface observations, as an innovative and cost-effective reconnaissance exploration tool for planning seismic acquisition surveys in geothermal and Carbon Capture and Sequestration regions. By direct integration of various DInSAR datasets with overlapping temporal and spatial coverage, we produce multi-temporal ground deformation maps with high resolution and precision to evaluate the potential of the new multidimensional MSBAS technique (Samsonov & d'Oreye, 2012). The technique is based on the Small Baseline Subset Algorithm (SBAS), modified to account for variation in sensor parameters. It allows the integration of data from sensors with different wave-bands, azimuth and incidence angles, and different spatial and temporal sampling and resolutions. These deformation maps will then be used as input for inverse modelling to simulate the strain history and shallow-depth structure. To achieve the main objective of our research, i.e.
developing a method for coupled InSAR and geophysical observation and a better understanding of subsurface structure, we will compare the DInSAR inverse modelling results with the previously available static structural model, iteratively modifying the DInSAR-derived structural model until it adequately matches in situ observations. The newly developed and modified algorithm will then be applied in another part of the region where subsurface information is limited.
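
    A minimal sketch of the core idea behind a multidimensional SBAS combination: line-of-sight (LOS) rates from different viewing geometries are jointly inverted for vertical and east-west components. The geometry is deliberately simplified (north motion ignored, equal incidence angles for both passes) and all numbers are hypothetical:

```python
import numpy as np

# Simplified two-geometry decomposition. The east-west sensitivity flips sign
# between ascending and descending passes (right-looking SAR).
theta = np.deg2rad(34.0)   # hypothetical incidence angle
G = np.array([
    [np.cos(theta), -np.sin(theta)],   # ascending:  d = up*cos(theta) - east*sin(theta)
    [np.cos(theta),  np.sin(theta)],   # descending: d = up*cos(theta) + east*sin(theta)
])

up_true, east_true = -1.5, 0.4                 # cm/yr, simulated ground truth
d_los = G @ np.array([up_true, east_true])     # the two observed LOS rates

# Joint least-squares inversion for the vertical and east-west rates.
up_est, east_est = np.linalg.lstsq(G, d_los, rcond=None)[0]
```

    With more tracks and sensors, G simply gains rows (one per LOS observation), which is how data with different wave-bands and look angles can be integrated in one inversion.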

  17. Exploitation of the Intermittent SBAS (ISBAS) algorithm with COSMO-SkyMed data for landslide inventory mapping in north-western Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Novellino, A.; Cigna, F.; Sowter, A.; Ramondini, M.; Calcaterra, D.

    2017-03-01

    A large scale study of landslide processes was undertaken by coupling conventional geomorphological field surveys with aerial photographs along with an advanced Interferometric Synthetic Aperture Radar (InSAR) analysis of ground instability in north-western Sicily. COSMO-SkyMed satellite images for the period between 2008 and 2011 were processed using the Intermittent Small BAseline Subset (ISBAS) technique, recently developed at the Department of Civil Engineering of the University of Nottingham. The use of ISBAS allowed the derivation of ground surface displacements across non-urbanized areas, thus overcoming one of the main limitations of conventional interferometric techniques. ISBAS provides ground motion information not only for urban but also for rural, woodland, grassland and agricultural terrains, which cover > 60% of north-western Sicily, thereby improving the slope instability investigation capabilities of InSAR methods, in some cases by a factor of 40. ISBAS ground motion data enabled the updating of the landslide inventory for the areas of Piana degli Albanesi and Marineo (over 130 km2), which encompass a number of active, dormant and inactive landslides according to the pre-existing landslide inventory maps produced through aerial photo-interpretation and local field checks. An average of ∼ 7000 ISBAS pixels km-2 allowed the detection of small displacements in regions difficult to access. In particular, 226 landslides (mainly slides, flows and creep) and four badlands were identified, comprising a total area of 25.3 km2. When compared to the previous landslide inventory maps, 84 phenomena were confirmed, 67 new events were detected and 79 previously mapped events were re-assessed, modifying their typology, boundary and/or state of activity.
Although the InSAR method used here is designed to measure slow velocities and therefore may not detect fast-moving events such as falls and topples, the results for Piana degli Albanesi and Marineo demonstrate the validity of this method to support land management, underlining the time and cost benefits of a combined approach using traditional monitoring procedures and satellite InSAR methods, especially where slow-moving slope movements prevail.
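
    The intermittent-coherence pixel selection that distinguishes ISBAS from conventional SBAS can be sketched as follows. The coherence stack is random synthetic data and the thresholds are hypothetical, chosen only to illustrate how pixels coherent in a subset of the interferograms are retained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coherence stack: n_ifg interferograms over a small pixel grid.
n_ifg, ny, nx = 40, 5, 5
coh = rng.random((n_ifg, ny, nx))

COH_MIN = 0.45    # per-interferogram coherence threshold (hypothetical)
MIN_COUNT = 12    # minimum number of coherent interferograms to keep a pixel

coherent = coh >= COH_MIN        # boolean mask per interferogram and pixel
count = coherent.sum(axis=0)     # how many interferograms each pixel is coherent in

# Conventional SBAS keeps only pixels coherent in (essentially) every
# interferogram; the intermittent variant keeps pixels coherent in a sufficient
# subset and inverts each pixel using just its coherent interferograms.
sbas_mask = count == n_ifg
isbas_mask = count >= MIN_COUNT
```

    Over vegetated or agricultural terrain, where coherence comes and goes, the intermittent mask retains far more pixels than the strict one, which is the effect the abstract describes.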

  18. The Monitoring and Spatial-Temporal Evolution Characteristic Analysis for Land Subsidence in Beijing

    NASA Astrophysics Data System (ADS)

    Zhou, Q.; Zhao, W.; Yu, J.

    2018-05-01

    At present, land subsidence is the main geological disaster in the plain areas of China and one of the most serious hazards restricting sustainable social and economic development; it is also an important component of the national geographic conditions monitoring project. With the development of economy and society, Beijing, the capital of China, has experienced significant population growth in the last few decades, which has led to over-exploitation of groundwater to meet the water demand of more than 20 million inhabitants, especially in densely populated urban areas. However, rainfall and surface runoff cannot satisfy the need for aquifer recharge, which produces land subsidence. As China's political center and a metropolis, Beijing hosts many large constructions, underground traffic projects and a complicated municipal pipeline network, and it is also an important hub of the national railway and highway network; all of these could be threatened by the land subsidence disaster. In this article, twenty ENVISAT Synthetic Aperture Radar (SAR) images acquired between June 2008 and August 2010 and ten TerraSAR images acquired between June 2011 and September 2012 were processed with Small Baseline Subset SAR Interferometry (SBAS-InSAR) techniques to investigate the spatial and temporal patterns of land subsidence in the urban area of Beijing.

  19. Completing the gaps in Kilauea's Father's Day InSAR displacement signature with ScanSAR

    NASA Astrophysics Data System (ADS)

    Bertran Ortiz, A.; Pepe, A.; Lanari, R.; Lundgren, P.; Rosen, P. A.

    2009-12-01

    Currently there are gaps in the known displacement signature obtained with InSAR at Kilauea between 2002 and 2009. InSAR data can be richer than GPS because of denser spatial coverage. However, for modelling rapidly varying and non-steady geophysical events, InSAR is limited by its sparser temporal sampling of the area under study. The ScanSAR mode currently available on several satellites mitigates this effect because the satellite may illuminate a given area more than once within an orbit cycle. The Kilauea displacement graph below from the Istituto per il Rilevamento Elettromagnetico dell'Ambiente (IREA) is a cut in space of the displacement signature obtained from a time series of several stripmap-to-stripmap interferograms. It shows that critical information is missing, especially between 2006 and 2007. The displacement is expected to be non-linear judging from the 2007-2008 displacement signature, so simple interpolation would not suffice. The gap can be filled by incorporating Envisat stripmap-to-ScanSAR interferograms available during that time period. We propose leveraging JPL's new ROI-PAC ScanSAR module to create stripmap-to-ScanSAR interferograms. The new interferograms will be added to the stripmap ones in order to extend the existing stripmap time series generated using the Small BAseline Subset (SBAS) technique. At AGU we will present denser graphs that better capture Kilauea's displacement between 2003 and 2009.

  20. Land Subsidence Monitoring by InSAR Time Series Technique Derived From ALOS-2 PALSAR-2 over Surabaya City, Indonesia

    NASA Astrophysics Data System (ADS)

    Aditiya, A.; Takeuchi, W.; Aoki, Y.

    2017-12-01

    Surabaya is the second largest city in Indonesia and the capital of East Java Province, with rapid population growth and industrialization. Urbanization in a big city can bring potential disasters, either natural or anthropogenic, such as land subsidence and flooding. The pattern of land subsidence needs to be mapped for the purposes of planning and structuring the city, as well as for taking appropriate policy decisions to anticipate and mitigate its impact. This research used the interferometric Synthetic Aperture Radar (InSAR) Small Baseline Subset (SBAS) technique and applied time series analysis to investigate the land subsidence that occurred. The technique includes focusing the SAR data, incorporating the precise orbit, generating interferograms and phase unwrapping using the SNAPHU algorithm. The results show that land subsidence was detected during 2014-2017 over the Surabaya city area using ALOS-2/PALSAR-2 image data. Subsidence was observed in several areas of Surabaya, in particular the northern part, reaching up to ∼2 cm/year. The fastest subsidence occurs in highly populated areas that are vulnerable to flooding and sea level rise. In urban areas we found a correlation between land subsidence and residential or industrial land use, which suggests that the subsidence is mainly caused by ground water consumption for industrial and residential use.

  1. Detecting and Measuring Land Subsidence in Houston-Galveston, Texas using Interferometric Synthetic Aperture Radar (InSAR) and Global Positioning System Data, 2012-2016

    NASA Astrophysics Data System (ADS)

    Reed, A.; Baker, S.

    2016-12-01

    Several cities in the Houston-Galveston (HG) region in Texas have subsided up to 13 feet over several decades due to natural and anthropogenic processes [Yu et al. 2014]. Land subsidence, a gradual sinking of the Earth's surface, is an often human-induced hazard and a major environmental problem expedited by activities such as mining, oil and gas extraction, urbanization and excessive groundwater pumping. We are able to detect and measure subsidence in HG using interferometric synthetic aperture radar (InSAR) and global positioning system (GPS) data. Qu et al. [2015] used ERS, Envisat, and ALOS-1 to characterize subsidence in HG from 1995 to 2011, but a five-year gap in InSAR measurements exists due to a lack of freely available SAR data. We build upon the previous study by comparing subsidence patterns detected by Sentinel-1 data starting in July 2015. We used GMT5SAR to generate a stack of interferograms with perpendicular baselines less than 100 meters and temporal baselines less than 100 days to minimize temporal and spatial decorrelation. We applied short baseline subset (SBAS) time series processing using GIAnT and compared our results with GPS measurements. This work will strengthen land subsidence monitoring systems in HG and, more broadly, aid in the development of effective water resource management policies and strategies.
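
    The small-baseline pair selection described above (perpendicular baselines under 100 m, temporal baselines under 100 days) can be sketched as follows; the acquisition catalogue is hypothetical:

```python
from itertools import combinations

# Hypothetical acquisition catalogue: (day of mission, perpendicular baseline in m).
acquisitions = [
    (0,    0.0),
    (12,  35.0),
    (24, -60.0),
    (48,  80.0),
    (60, -20.0),
    (150, 10.0),
]

MAX_TEMPORAL = 100   # days
MAX_PERP = 100.0     # metres

# Keep only the "small baseline" pairs; every other combination is discarded.
pairs = [
    (i, j)
    for i, j in combinations(range(len(acquisitions)), 2)
    if abs(acquisitions[j][0] - acquisitions[i][0]) < MAX_TEMPORAL
    and abs(acquisitions[j][1] - acquisitions[i][1]) < MAX_PERP
]
```

    Tightening either threshold prunes the interferogram network; the point of the thresholds is to keep only pairs expected to stay coherent, at the cost of network redundancy.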

  2. Time series analysis of Mexico City subsidence constrained by radar interferometry

    NASA Astrophysics Data System (ADS)

    Doin, Marie-Pierre; Lopez-Quiroz, Penelope; Yan, Yajing; Bascou, Pascale; Pinel, Virginie

    2010-05-01

    In Mexico City, subsidence rates reach up to 40 cm/yr, mainly due to soil compaction driven by the over-exploitation of the Mexico Basin aquifer. The Mexico Valley, an endorheic basin surrounded by mountains, was in the past covered by large lakes. After the Spanish conquest, the lakes almost completely disappeared, being progressively replaced by the buildings of the current Mexican capital. The simplified hydrogeologic structure includes a superficial 50 to 300 m thick lacustrine aquitard overlying a thicker aquifer made of alluvial deposits. The aquitard layer plays a crucial role in the subsidence process due to the extremely high compressibility of its clay deposits, separated by a less compressible sand layer in which the biggest buildings of the city are anchored. The aquifer over-exploitation leads to a large scale 30 m depression of its piezometric level, inducing downward water flow in the clays and yielding compaction and subsidence. In order to quantitatively link subsidence to water pumping, Mexico City subsidence needs to be mapped and analyzed through space and time. We map its spatial and temporal patterns by differential radar interferometry, using 38 ENVISAT images acquired between the end of 2002 and the beginning of 2007. We employ both a Permanent Scatterer (PS) and a small baseline (SBAS) approach. The main difficulty consists in the severe unwrapping problems mostly due to the high deformation rate. We develop a specific SBAS approach based on 71 differential interferograms with a perpendicular baseline smaller than 500 m and a temporal baseline smaller than 9 months, forming a redundant network linking all images: (1) To help the unwrapping step, we use the fact that the deformation shape is stable for similar time intervals during the studied period. As a result, a stack of the five best interferograms can be used to reduce the number of fringes in wrapped interferograms.
(2) Based on the redundancy of the interferometric data base, we quantify the unwrapping errors for each pixel and show that they are strongly decreased by iterations in the unwrapping process. (3) Finally, we present a new algorithm for time series analysis that differs from classical SVD decomposition and is best suited to the present data base. Accurate deformation time series are then derived over the metropolitan area of the city with a spatial resolution of 30 × 30 m. We also use the Gamma-PS software on the same data set. The phase differences are unwrapped within small patches with respect to a reference point chosen in each patch, whose phase is in turn unwrapped relative to a reference point common to the whole area of interest. After removing the modelled contribution of the linear displacement rate and DEM error, some residual interferograms, presenting unwrapping errors because of a strong residual orbital ramp or atmospheric phase screen, are spatially unwrapped by a minimum cost-flow algorithm. The next steps are to estimate and remove the residual orbital ramp and to apply a temporal low-pass filter to remove atmospheric contributions. The step by step comparison of the SBAS and PS approaches shows the complementarity of the two methods. The SBAS analysis provides subsidence rates with mm/yr accuracy over the whole basin, together with the non-linear temporal behavior of the subsidence, at the expense of some spatial regularization. The PS method provides locally accurate and punctual deformation rates, but fails in this case to yield a good large-scale map and the non-linear temporal behavior of the subsidence. We conclude that the relative contrast in subsidence between individual buildings and infrastructure must be relatively small, on average of the order of 5 mm/yr.
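
    The redundancy-based unwrapping-error check of point (2) rests on phase loop closure: around any triangle of interferograms, consistently unwrapped phases must sum to (nearly) zero, and an unwrapping error shows up as a multiple of 2π. A toy sketch with hypothetical phase values:

```python
import numpy as np

# Hypothetical unwrapped phases (radians) for three interferograms forming a
# loop over acquisitions 0, 1 and 2.
phi_01, phi_12, phi_02 = 1.3, -0.7, 0.6

def loop_misclosure(p_ij, p_jk, p_ik):
    """Misclosure of the triangle (i, j, k); a nonzero multiple of 2*pi flags an error."""
    return p_ij + p_jk - p_ik

ok = loop_misclosure(phi_01, phi_12, phi_02)                # consistent loop
bad = loop_misclosure(phi_01 + 2 * np.pi, phi_12, phi_02)   # one mis-unwrapped leg

n_jumps = round(bad / (2 * np.pi))   # estimated number of 2*pi jumps in the loop
```

    Counting such non-zero misclosures over all triangles that touch a pixel gives a per-pixel unwrapping-error score, which is the kind of quantity a redundant network makes available.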

  3. SBAS-InSAR analysis of a decade of surface deformation at Mauna Loa (Hawai'i): Preliminary results

    NASA Astrophysics Data System (ADS)

    La Marra, Daniele; Poland, Michael; Miklius, Asta; Acocella, Valerio

    2015-04-01

    The Big Island of Hawai`i consists of five coalesced volcanoes: Hualālai, Mauna Loa, Kilauea, Mauna Kea, and Kohala. Mauna Loa, the largest, has erupted 39 times since 1832, with the last eruption in 1984. The volcano's summit hosts the Moku'aweoweo caldera, from which two volcanic rift zones radiate: the Northeast Rift Zone (NERZ) and the Southwest Rift Zone (SWRZ). These rifts are the reflection of past dike intrusions combined with instability of the SE flank of the volcano, possibly related to slip along a low-angle decollement thrust fault 12-14 km beneath the volcano. Geodetic (InSAR, GPS) and seismic data have been used to characterize recent periods of unrest at Mauna Loa. InSAR studies spanning the period between 2002 and 2005 suggest a magma chamber at 4.7 km depth below the summit, with a radius of 1.1 km, and an 8-km-long dike intrusion at 4 to 8 km depth. These studies, however, are focused on relatively short-term processes (up to a few years), and a longer-term reconstruction of the volcano's evolution is lacking. In this work, we use SAR data, exploiting the SBAS technique, to study deformation of Mauna Loa from 2003 to 2014, and we try to relate this overall evolution to that of the neighboring Kilauea. We use acquisitions from two satellites: ENVISAT (descending track 200 and ascending track 365), spanning from 2003 to the end of 2010, and COSMO-SkyMed (descending and ascending orbits), spanning from 2012 to the middle of 2014. These data are merged with time series data from 24 continuously operating GPS stations, which allows us to calibrate the InSAR SBAS time series. Results show an overall good agreement between the InSAR and GPS time series. The displacement of each portion of the volcano between 2003-2014 has thus been constrained in detail. The summit area of Mauna Loa has undergone long-term inflation from 2003 to 2014, with a peak of about 8 cm of vertical deformation between mid-2004 and mid-2005, especially at the summit.
Part of this deformation may be related to the instability of the flank. We identified several distinct periods with linear deformation behavior. These deformation data are now being used to obtain the best-fit parameters for the sources in each of these periods through analytical modeling. Subsequently, we will incorporate these model parameters into a 3D model that employs the finite element method to include complexities like topography and vertical/lateral heterogeneities.
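
    Analytical source modeling of the kind mentioned above often starts from the Mogi point source. The sketch below evaluates the standard Mogi vertical-displacement formula for an illustrative depth comparable to the summit reservoir cited above; the volume change is hypothetical, not a fitted parameter:

```python
import numpy as np

def mogi_uz(x, y, depth, dV, nu=0.25):
    """Vertical surface displacement of a Mogi point source in an elastic half-space."""
    r2 = x**2 + y**2 + depth**2
    return (1.0 - nu) / np.pi * dV * depth / r2**1.5

depth = 4700.0                     # m, comparable to the summit reservoir depth cited above
dV = 5.0e6                         # m^3 volume change, purely hypothetical
x = np.linspace(-20e3, 20e3, 401)  # surface profile through the source axis
uz = mogi_uz(x, 0.0, depth, dV)

peak = uz.max()                    # uplift peaks directly above the source
```

    Fitting such a model means searching over depth and dV (and source position) to minimize the misfit against the InSAR/GPS displacements; the finite element stage then relaxes the half-space assumptions.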

  4. An Investigation into the Optimal Number of Distractors in Single-Best Answer Exams

    ERIC Educational Resources Information Center

    Kilgour, James M.; Tayyaba, Saadia

    2016-01-01

    In UK medical schools, five-option single-best answer (SBA) questions are the most widely accepted format of summative knowledge assessment. However, writing SBA questions with four effective incorrect options is difficult and time consuming, and consequently, many SBAs contain a high frequency of implausible distractors. Previous research has…

  5. Optimal Divergence-Free Hatch Filter for GNSS Single-Frequency Measurement.

    PubMed

    Park, Byungwoon; Lim, Cheolsoon; Yun, Youngsun; Kim, Euiho; Kee, Changdon

    2017-02-24

    The Hatch filter is a code-smoothing technique that uses the variation of the carrier phase. It can effectively reduce the noise of a pseudo-range with a very simple filter construction, but it occasionally causes an ionosphere-induced error for low-lying satellites. Herein, we propose an optimal single-frequency (SF) divergence-free Hatch filter that uses a satellite-based augmentation system (SBAS) message to reduce the ionospheric divergence and applies the optimal smoothing constant for its smoothing window width. According to the data-processing results, the overall performance of the proposed filter is comparable to that of the dual frequency (DF) divergence-free Hatch filter. Moreover, it can reduce the horizontal error from 57 cm to 37 cm and improve the vertical accuracy of the conventional Hatch filter by 25%. Considering that SF receivers dominate the global navigation satellite system (GNSS) market and that most of these receivers include the SBAS function, the filter suggested in this paper is of great value in that it can make the differential GPS (DGPS) performance of the low-cost SF receivers comparable to that of DF receivers.
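
    The basic (ionosphere-unaware) Hatch filter that this work builds on can be sketched as follows: each epoch, the previous estimate is propagated with the carrier-phase difference and a fraction 1/M of the new pseudo-range is blended in. The toy measurements and the window length here are hypothetical:

```python
import random

def hatch_filter(pseudoranges, phases, window=100):
    """Carrier-smoothed code; `window` is the smoothing constant M in epochs."""
    smoothed = [pseudoranges[0]]
    for k in range(1, len(pseudoranges)):
        m = min(k + 1, window)
        # Propagate the previous estimate with the carrier-phase delta, then
        # blend in a 1/m fraction of the new code measurement.
        predicted = smoothed[-1] + (phases[k] - phases[k - 1])
        smoothed.append(predicted + (pseudoranges[k] - predicted) / m)
    return smoothed

# Toy data: constant true range with +/-3 m code noise and a noise-free carrier
# (the carrier ambiguity is constant, so it cancels in the phase differences).
random.seed(1)
true_range = 20_000_000.0
codes = [true_range + random.uniform(-3.0, 3.0) for _ in range(200)]
carriers = [true_range] * 200

out = hatch_filter(codes, carriers)
smooth_err = abs(out[-1] - true_range)   # far below the raw +/-3 m code noise
```

    The ionospheric problem the paper addresses arises because code and carrier diverge in opposite directions through the ionosphere, so the phase deltas used in `predicted` drift; the proposed divergence-free variant corrects those deltas with SBAS ionospheric information, which is not modeled in this sketch.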

  6. Optimal Divergence-Free Hatch Filter for GNSS Single-Frequency Measurement

    PubMed Central

    Park, Byungwoon; Lim, Cheolsoon; Yun, Youngsun; Kim, Euiho; Kee, Changdon

    2017-01-01

    The Hatch filter is a code-smoothing technique that uses the variation of the carrier phase. It can effectively reduce the noise of a pseudo-range with a very simple filter construction, but it occasionally causes an ionosphere-induced error for low-lying satellites. Herein, we propose an optimal single-frequency (SF) divergence-free Hatch filter that uses a satellite-based augmentation system (SBAS) message to reduce the ionospheric divergence and applies the optimal smoothing constant for its smoothing window width. According to the data-processing results, the overall performance of the proposed filter is comparable to that of the dual frequency (DF) divergence-free Hatch filter. Moreover, it can reduce the horizontal error from 57 cm to 37 cm and improve the vertical accuracy of the conventional Hatch filter by 25%. Considering that SF receivers dominate the global navigation satellite system (GNSS) market and that most of these receivers include the SBAS function, the filter suggested in this paper is of great value in that it can make the differential GPS (DGPS) performance of the low-cost SF receivers comparable to that of DF receivers. PMID:28245584

  7. Monitoring Ground Deformation Using Persistent Scatters Interferometry (PSI) and Small Baselines (SBAS) Techniques Integrated in the ESA RSS Service: The Case Study of Valencia, Rome and South Sardinia

    NASA Astrophysics Data System (ADS)

    Delgado, Manuel J.; Cuccu, Roberto; Rivolta, Giancarlo

    2015-05-01

    This work is focused on the infrastructure monitoring of areas which have experienced significant urbanization and, therefore, also an increase in the exploitation of natural resources. Persistent Scatterer Interferometry (PS-InSAR) and Small Baselines (SBAS) approaches are applied to three study areas for which large datasets of SAR images are available in ascending and descending modes, in order to derive deformation maps of different buildings and infrastructures. The Valencia, Rome and South Sardinia areas have been selected for this study, having experienced an increase in the exploitation of natural resources in parallel with their urban expansion. Moreover, Rome is a very special case, where the Cultural Heritage permeating the city and its surroundings suggests the need for a tool to monitor the stability of the different sites. This work analyses the potential deformation that occurred in these areas during the period 1992 to 2010, by applying Persistent Scatterer Interferometry to ESA ERS SAR and Envisat ASAR data.

  8. Potential Deep Seated Landslide Mapping from Various Temporal Data - Benchmark, Aerial Photo, and SAR

    NASA Astrophysics Data System (ADS)

    Wang, Kuo-Lung; Lin, Jun-Tin; Lee, Yi-Hsuan; Lin, Meei-Ling; Chen, Chao-Wei; Liao, Ray-Tang; Chi, Chung-Chi; Lin, Hsi-Hung

    2016-04-01

    A landslide is not a hazard until human development reaches a high-potential area. This study attempts to map deep-seated landslides before they initiate. The study area, in central Taiwan, has a distinctive geological setting of slate. The bedding strikes mainly northeast and dips 30-75 degrees to the southeast, and several deep-seated landslides triggered by rainfall events have been discovered on dip slopes. Benchmark data from 2002 to 2009 are used in this study; the benchmarks were measured along Highway No. 14B, which was constructed along the mountain ridge. Taiwan lies at the convergence of oceanic and continental plates, and most GPS stations and benchmarks on the island indicate rising mountain elevations, a trend also observed in this area. Some benchmarks, however, are located within landslide areas, where the elevation change is below average and even negative. Aerial photos from 1979 to 2007 are used for orthophoto generation; land-use change over these 30 years is obvious, and enlargement of the river channel is also observed. Both the benchmarks and the aerial photos reveal that landslide potential exists in this area, but the extent of the landslides is difficult to define from them alone. SAR data are therefore employed: DInSAR and SBAS analyses are applied to ALOS/PALSAR data from 2006 to 2010. The DInSAR analysis shows that landslides can be mapped, but the error, likely arising from conditions such as vegetation, clouds and water vapor, is not easy to reduce. To overcome this problem, the SBAS time-series analysis is adopted. The SBAS results for this area show that large deep-seated landslides are readily mapped and that the accuracy of the vertical displacement is reasonable.

  9. Skilled Birth Attendants: who is who? A descriptive study of definitions and roles from nine Sub Saharan African countries.

    PubMed

    Adegoke, Adetoro; Utz, Bettina; Msuya, Sia E; van den Broek, Nynke

    2012-01-01

    Availability of a Skilled Birth Attendant (SBA) during childbirth is a key indicator for MDG5 and a strategy for reducing maternal and neonatal mortality in Africa. There is limited information on how SBAs and their functions are defined. The aim of this study was to map the cadres of health providers considered SBAs in Sub Saharan Africa (SSA), to describe which signal functions of Emergency Obstetric Care (EmOC) they perform, and to assess whether they are legislated to perform these functions. Key personnel in the Ministries of Health, teaching institutions, referral, regional and district hospitals completed structured questionnaires in nine SSA countries in 2009-2011. A total of 21 different cadres of health care providers (HCP) were reported to be SBAs. The type and number of EmOC signal functions reported to be provided varied substantially between cadres and countries. Parenteral antibiotics, uterotonic drugs and anticonvulsants were provided by most SBAs. Removal of retained products of conception and assisted vaginal delivery were the least provided signal functions. Except for the cadres of obstetricians, medical doctors and registered nurse-midwives, there was a lack of clarity regarding the signal functions reported to be performed and whether cadres were legislated to perform them. This was particularly the case for manual removal of the placenta, removal of retained products and assisted vaginal delivery. In some countries, cadres not considered SBAs performed deliveries and provided EmOC signal functions. In other settings, cadres reported to be SBAs were able but not legislated to perform key EmOC signal functions. Comparison of cadres of HCPs reported to be SBAs across countries is difficult because of the lack of standardization in names, training and functions performed. There is a need for countries to develop clear guidelines defining who is an SBA and which EmOC signal functions each cadre of HCP is expected to provide.

  10. Knowledge, attitudes and practices of traditional birth attendants in pastoralist communities of Laikipia and Samburu counties, Kenya: a cross-sectional survey.

    PubMed

    Reeve, Matthew; Onyo, Pamela; Nyagero, Josephat; Morgan, Alison; Nduba, John; Kermode, Michelle

    2016-01-01

    Current efforts to reduce maternal and newborn mortality focus on promoting institutional deliveries with skilled birth attendants (SBAs), and discouraging deliveries at home attended by traditional birth attendants (TBAs). In rural Kenya, semi-nomadic pastoralist communities are underserved by the formal health system, experience high maternal and neonatal mortality, and rely primarily on TBAs for delivery care, despite Government proscription of TBA-assisted births. This study examined the knowledge, attitudes and practices of TBAs serving these communities to assess the potential for collaboration between TBAs and SBAs. A cross-sectional, interviewer-administered survey was conducted among 171 TBAs from Maasai and Samburu pastoralist communities in Laikipia and Samburu counties, Kenya, as part of a larger mixed-methods study in partnership with a local service provider. TBAs were relatively elderly (mean age 59.6 years), and attended an average of 5-6 deliveries per year. A minority (22.2%) had received formal training. They provided antenatal, intra-partum and post-partum care. Most TBA care was non-interventionist, but not necessarily consistent with best practice. Most had encountered birth complications, but knowledge regarding management of complications was sub-optimal. Most had previously referred at least one woman to a health facility (80.1%), were key participants in decision making to refer women (96.5%), and had been present at an institutional delivery (54.4%). TBAs continue to be key providers of maternal and neonatal healthcare in regions where the formal health system has poor coverage or acceptability. Strengthening existing TBA/SBA collaborations could improve both community links to the formal health system, and the quality of care provided to pastoralist women, while remaining consistent with current Government policy.

  11. First principles calculation of two dimensional antimony and antimony arsenide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillai, Sharad Babu, E-mail: sbpillai001@gmail.com; Narayan, Som; Jha, Prafulla K.

    2016-05-23

    This work focuses on the strain dependence of the electronic properties of two dimensional antimony (Sb) material and its alloy with As (SbAs) using density functional theory based first principles calculations. Both systems show indirect bandgap semiconducting character which can be transformed into a direct bandgap material with the application of relatively small strain.

  12. Measuring Surface Subsidence in Wuhan, China with SENTINEL-1 Data Using Psinsar

    NASA Astrophysics Data System (ADS)

    Benattou, M. M.; Balz, T.; Liao, M.

    2018-04-01

    We explore the potential of Sentinel-1 for urban subsidence monitoring. A case study was conducted in Wuhan using Sentinel-1A images acquired from an ascending orbit between 22 June 2015 and 24 April 2017. Our PSInSAR results are compared to a recent study using SBAS. Moreover, in another experiment, only more recent data, comprising 18 images from 7 March 2017 to 14 March 2018, have been processed in order to analyse changes in the subsidence behavior over the study area. In addition, the proposed method (PSInSAR) was used to measure the water height in the East Lake using metallic objects as stable PS points.

  13. Investigating the creeping section of the San Andreas Fault using ALOS PALSAR interferometry

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Wortham, C.; Zebker, H. A.

    2010-12-01

    In recent years, time-series InSAR techniques have been used to study the temporal characteristics of various geophysical phenomena that produce surface deformation, including earthquakes and magma migration in volcanoes. Conventional InSAR and time-series InSAR techniques have also been successfully used to study aseismic creep across faults in urban areas like the Northern Hayward Fault in California [1-3]. However, application of these methods to studying the time-dependent creep across the Central San Andreas Fault using C-band ERS and Envisat radar satellites has resulted in limited success. While these techniques estimate the average long-term far-field deformation rates reliably, creep measurement close to the fault (< 3-4 km) is virtually impossible due to heavy decorrelation at C-band (6 cm wavelength). Shanker and Zebker (2009) [4] used the Persistent Scatterer (PS) time-series InSAR technique to estimate a time-dependent non-uniform creep signal across a section of the creeping segment of the San Andreas Fault. However, the identified PS network was too spatially sparse (about 1 PS per sq. km) to study the temporal characteristics of deformation close to the fault. In this work, we use L-band (24 cm wavelength) SAR data from the PALSAR instrument on-board the ALOS satellite, launched by the Japan Aerospace Exploration Agency (JAXA) in 2006, to study the temporal characteristics of creep across the Central San Andreas Fault. The longer wavelength at L-band improves observed correlation over the entire scene, which significantly increases the ground-area coverage of estimated deformation in each interferogram, but at the cost of decreased sensitivity of the interferometric phase to surface deformation. However, noise levels in our deformation estimates can be decreased by combining information from multiple SAR acquisitions using time-series InSAR techniques. 
We analyze 13 SAR acquisitions spanning the time period from March 2007 to December 2009 using the Small Baseline Subset (SBAS) time-series InSAR technique [3]. We present detailed comparisons of the estimated time series of fault creep as a function of position along the fault, including the locked section around Parkfield, CA. We also present comparisons between the InSAR time series and GPS network observations in the Parkfield region. During these three years of observation, the average fault creep is estimated to be 35 mm/yr. References [1] Bürgmann, R., E. Fielding, and J. Sukhatme, Slip along the Hayward fault, California, estimated from space-based synthetic aperture radar interferometry, Geology, 26, 559-562, 1998. [2] Ferretti, A., C. Prati and F. Rocca, Permanent Scatterers in SAR Interferometry, IEEE Trans. Geosci. Remote Sens., 39, 8-20, 2001. [3] Lanari, R., F. Casu, M. Manzo, and P. Lundgren, Application of SBAS D-InSAR technique to fault creep: A case study of the Hayward Fault, California. Remote Sensing of Environment, 109(1), 20-28, 2007. [4] Shanker, A. P., and H. Zebker, Edgelist phase unwrapping algorithm for time-series InSAR. J. Opt. Soc. Am. A, 37(4), 2010.
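    The SBAS linking step used in this record (a minimum-norm inversion over small-baseline interferograms) can be sketched for a single pixel as follows. This is a hedged illustration, not the authors' processing chain: `sbas_invert` is a hypothetical helper, the dates and phases are synthetic, and atmospheric/topographic terms are ignored.

```python
import numpy as np

def sbas_invert(dates, pairs, dphi):
    """Invert interferogram observations for a displacement time series.

    dates: sorted acquisition times (e.g. decimal years);
    pairs: (i, j) index pairs, i < j, one per interferogram;
    dphi:  unwrapped LOS values, one per interferogram.
    Returns (velocities between consecutive dates, cumulative time series).
    """
    n = len(dates)
    B = np.zeros((len(pairs), n - 1))
    for row, (i, j) in enumerate(pairs):
        for k in range(i, j):
            B[row, k] = dates[k + 1] - dates[k]  # time interval spanned by this pair
    # SVD-based pseudoinverse gives the minimum-norm velocity solution,
    # which is what links otherwise independent small-baseline subsets
    v = np.linalg.pinv(B) @ np.asarray(dphi)
    ts = np.concatenate([[0.0], np.cumsum(v * np.diff(dates))])  # referenced to first date
    return v, ts
```

    With a consistent network (e.g. pairs (0,1), (1,2), (0,2)) the inversion recovers the interval velocities exactly; with disconnected subsets, the pseudoinverse supplies the minimum-norm link between them.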

  14. Application of ALOS and Envisat Data in Improving Multi-Temporal InSAR Methods for Monitoring Damavand Volcano and Landslide Deformation in the Center of Alborz Mountains, North Iran

    NASA Astrophysics Data System (ADS)

    Vajedian, S.; Motagh, M.; Nilfouroushan, F.

    2013-09-01

    InSAR's capacity to detect slow deformation over terrain areas is limited by temporal and geometric decorrelation. Multitemporal InSAR techniques, including Persistent Scatterer InSAR (PS-InSAR) and the Small Baseline Subset (SBAS) approach, have been developed to mitigate these decorrelation problems. Geometric decorrelation in mountainous areas, especially for Envisat images, makes the phase unwrapping process difficult. To address this unwrapping problem, we first modified the phase filtering to make the wrapped phase image as smooth as possible. In addition, a modified unwrapping method has been developed to improve the unwrapping results; it includes removing possible orbital and tropospheric effects. Topographic correction is performed within three-dimensional unwrapping, while orbital and tropospheric corrections are applied after the unwrapping process. To evaluate the effectiveness of our improved method, we tested the proposed algorithm on Envisat and ALOS datasets and compared our results with the recently developed PS software StaMPS. In addition, we used GPS observations to evaluate the modified method. The results indicate that our method improves the estimated deformation significantly.

  15. 76 FR 1595 - Notice of Request for Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-11

    ... collection in support of the program of the Agency's use of supervised bank accounts (SBA). DATES: Comments... Accounts. OMB Number: 0575-0158. Expiration Date of Approval: 04/30/2011 Type of Request: Extension of a... that do not qualify for loans under commercial rates and terms. The Agency use SBAs as a mechanism to...

  16. On the ionospheric impact of recent storm events on satellite-based augmentation systems in middle and low-latitude sectors

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Sparks, Lawrence; Mannucci, Anthony J.; Pi, Xiaoqing

    2003-01-01

    Ionospheric correction algorithms have been characterized extensively for the mid-latitude region of the ionosphere, where benign conditions usually exist. The United States Federal Aviation Administration's (FAA) Wide Area Augmentation System (WAAS) for civil aircraft navigation is focused primarily on the Conterminous United States (CONUS). Other Satellite-Based Augmentation Systems (SBAS) include the European Geostationary Navigation Overlay Service (EGNOS) and the Japanese MTSAT Satellite-based Augmentation System (MSAS). Researchers face a more serious challenge in addressing the ionospheric impact on navigation using SBAS in other parts of the world, such as the South American region or India. At equatorial latitudes, geophysical conditions lead to the so-called Appleton (equatorial) anomaly phenomenon, which results in significantly larger ionospheric range delays and range-delay spatial gradients than are observed in the CONUS or European sectors. In this paper, we use GPS measurements from geomagnetic storm days to perform a quantitative assessment of WAAS-type ionospheric correction algorithms in other parts of the world, namely low-latitude Brazil and mid-latitude Europe. For the study, we access a world-wide network of 400+ dual-frequency GPS receivers.

  17. Interseismic Deformation along the Red River Fault from InSAR Measurements

    NASA Astrophysics Data System (ADS)

    Chen, J.; Li, Z.; Clarke, P. J.

    2017-12-01

    The Red River Fault (RRF) zone is a profound geological discontinuity separating South China from Indochina. Right-lateral movements along this >900 km fault are considered to accommodate the extrusion of SE China. Crustal deformation monitoring at high resolution is key to understanding the present-day mode of deformation in this zone and its interaction with the adjacent regions. This is the first study to measure the interseismic deformation of the entire fault with ALOS-1/2 and Sentinel-1 observations. Nine ascending tracks of ALOS-1 data between 2007 and 2011 are collected from the Alaska Satellite Facility (ASF), four descending tracks of Sentinel-1 data have been acquired every 24 days since October 2014, and ALOS-2 data have been systematically acquired since 2014. The long wavelength (L-band) of ALOS-1/2 and the short temporal baseline of Sentinel-1 ensure good coherence to overcome the limitations of heavy vegetation and variable climate in the region. Stacks of interferograms are generated by our automatic processing chain based on the InSAR Scientific Computing Environment (ISCE) software; ionospheric errors are estimated and corrected using the split-spectrum method (Fattahi et al., IEEE Trans. Geosci. Remote Sens., 2017) and the tropospheric delays are calibrated using the Generic Atmospheric Correction Online Service for InSAR (GACOS: http://ceg-research.ncl.ac.uk/v2/gacos) with high-resolution ECMWF products (Yu et al., J. Geophys. Res., 2017). Time series analysis is performed to determine the interseismic deformation rate of the RRF using the in-house InSAR time series with atmospheric estimation model (InSAR TS + AEM) package based on the Small Baseline Subset (SBAS) algorithm. Our results reveal a decrease of slip rate from north to south. We map the interseismic strain rate field to characterize the deformation patterns and seismic hazard throughout the RRF zone.

  18. High Resolution Digital Surface Model For Production Of Airport Obstruction Charts Using Spaceborne SAR Sensors

    NASA Astrophysics Data System (ADS)

    Oliveira, Henrique; Rodrigues, Marco; Radius, Andrea

    2012-01-01

    Airport Obstruction Charts (AOCs) are graphical representations of natural or man-made obstructions (their locations and heights) around airfields, according to International Civil Aviation Organization (ICAO) Annexes 4, 14 and 15. One of the most important types of data used in AOC production/update tasks is a Digital Surface Model (first reflective surface) of the surveyed area. The development of advanced remote sensing technologies provides the tools for obstruction data acquisition, while Geographic Information Systems (GIS) present the perfect platform for storing and analyzing this type of data, enabling the production of digital AOCs, greatly contributing to the situational awareness of pilots and enhancing the air navigation safety level [1]. Data acquisition corresponding to the first reflective surface can be obtained through the use of Airborne Laser-Scanning and Light Detection and Ranging (ALS/LIDAR) or spaceborne SAR systems. The need to survey broad areas, like the entire territory of a state, makes spaceborne SAR systems the most suitable, in economic and feasibility terms, for performing the monitoring and producing a high-resolution Digital Surface Model (DSM). The high-resolution DSM generation depends on many factors: the available data set, the technique used and the processing parameters. 
To increase the precision and obtain high-resolution products, two techniques are available that use a stack of data: the PS (Permanent Scatterers) technique [2], which uses a large stack of data to identify many stable, coherent targets through multi-temporal analysis, removing the atmospheric contribution and minimizing estimation errors; and the Small Baseline Subset (SBAS) technique ([3],[4]), which relies on the use of small-baseline SAR interferograms and on the application of the singular value decomposition (SVD) method in order to link independent SAR acquisition data sets separated by large baselines, thus increasing the amount of data used in the analysis.
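    The pair-selection step that gives SBAS its name can be sketched as follows. This is an illustrative sketch only: the 350 m perpendicular-baseline and 365-day temporal-baseline thresholds are assumptions for the example, not values from this record.

```python
from itertools import combinations

def select_pairs(acqs, max_perp_m=350.0, max_days=365.0):
    """Select small-baseline interferogram pairs from a set of acquisitions.

    acqs: list of (time_days, perp_baseline_m) tuples, one per acquisition.
    Returns index pairs (i, j), i < j, whose temporal and perpendicular
    baselines both fall under the given thresholds; only these pairs are
    interferometrically combined, limiting decorrelation.
    """
    pairs = []
    for i, j in combinations(range(len(acqs)), 2):
        dt = abs(acqs[j][0] - acqs[i][0])   # temporal baseline
        db = abs(acqs[j][1] - acqs[i][1])   # perpendicular baseline
        if dt <= max_days and db <= max_perp_m:
            pairs.append((i, j))
    return pairs
```

    When the thresholds split the acquisitions into disconnected subsets, the SVD step mentioned above is what links those subsets into one solution.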

  19. Assessing natural hazards in NE Colombia using Sentinel-1 interferometry

    NASA Astrophysics Data System (ADS)

    Olen, Stephanie; Bookhagen, Bodo

    2017-04-01

    The DIGENTI project (Digitaler Entscheidertisch für das Naturgefahrenmanagement auf Basis von Satellitendaten und VGI (Volunteered Geographic Information)) aims to assess the natural hazard threat to the Cesar and La Guajira departments of northeast Colombia as guidance for decision makers and disaster relief workers. As members of the DIGENTI project, we use Sentinel-1 synthetic aperture radar (SAR) interferometry to detect hillslope movements, delineate settlements, and monitor damage to urban areas. Our study area, located in the remote Serranía del Perijá mountain range on the border of Colombia and Venezuela, is mountainous, highly vegetated, and experiences high and spatially variable rainfall (between 1 and 4 m a-1). The remote nature of the region, coupled with the favorable conditions for mass movements and other hillslope instabilities, makes it an ideal location to employ remote sensing techniques to monitor potential natural hazards. In the highly vegetated Serranía del Perijá mountain range, traditional damage proxy mapping is complicated by vegetation-related coherence loss between SAR scenes. Cross-referencing existing maps, we define regions of consistently high coherence as settled or urban areas. Using the spatial extent of settled or urban areas as a mask, we establish an algorithm that uses coherence loss only in these regions as a damage proxy, targeting the urban areas where the local population will be most affected. Outside of settlements, hillslope instabilities and movements are quantified and mapped using a two-pronged approach: (1) Horizontal ground displacement is calculated by dense amplitude cross-correlation using the topsOffsetApp in the InSAR Scientific Computing Environment (ISCE). 
This allows the location, direction, and magnitude of mass movements and hillslope instabilities to be identified and mapped; (2) We use a timeseries of interferograms to quantify vertical ground deformation (e.g., as caused by landsliding) during the Sentinel-1 time window. To do this we employ the small baseline subset (SBAS) technique to create a timeseries of potential landslides in our study area. This technique has the added advantage of overcoming poor coherence between individual InSAR scenes (e.g., caused by vegetation cover). The output of the SBAS analysis will be used to designate natural hazard "hot spots" that will enhance static estimates of factor of safety and landslide risk (e.g., based on digital elevation models). Both the timeseries of horizontal and vertical surface movements and their spatial extent are compared to regional rainfall and vegetation patterns to aid in future natural hazard assessment. Preliminary work is being done to apply these algorithms to other regions with markedly different climate and tectonic settings (NW Argentine Andes and the Arun Valley, Nepal).
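    The urban-masked damage proxy described above can be sketched as a simple array operation. This is a hedged illustration: the function name, the sample coherence values, and the 0.3 coherence-drop threshold are assumptions, not parameters from the DIGENTI project.

```python
import numpy as np

def damage_proxy(coh_pre, coh_co, urban_mask, drop=0.3):
    """Flag likely-damaged urban pixels from interferometric coherence loss.

    coh_pre: coherence of a pre-event interferogram, values in [0, 1];
    coh_co:  coherence of a co-event interferogram over the same grid;
    urban_mask: boolean array marking settled/urban pixels, so that
    vegetation-related coherence loss outside settlements is ignored.
    Returns a boolean map where the coherence drop exceeds the threshold.
    """
    return urban_mask & ((coh_pre - coh_co) > drop)
```

    Restricting the proxy to the urban mask is the key design choice here: outside settlements, coherence loss is dominated by vegetation and says little about damage.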

  20. Monitoring of land subsidence in Ravenna Municipality using two different DInSAR techniques: comparison and discussion of the results.

    NASA Astrophysics Data System (ADS)

    Fiaschi, Simone; Di Martire, Diego; Tessitore, Serena; Achilli, Vladimiro; Ahmed, Ahmed; Borgstrom, Sven; Calcaterra, Domenico; Fabris, Massimo; Ramondini, Massimo; Serpelloni, Enrico; Siniscalchi, Valeria; Floris, Mario

    2015-04-01

    Land subsidence affecting the Ravenna Municipality (Emilia Romagna Region, NE Italy) is one of the best examples of how the exploitation of natural resources can affect the environment and the territory. In fact, the pumping of groundwater and the extraction of gas from both on- and off-shore reservoirs, started in the 1950s, have caused strong land subsidence affecting most of the Emilia Romagna territory, and in particular the Adriatic Sea coastline near Ravenna. In this area the current subsidence rate, although lower than in the past, can reach -2 cm/yr. Local Authorities have monitored this phenomenon over the years with different techniques: spirit levelling, GPS surveys and, more recently, Interferometric Synthetic Aperture Radar (InSAR) techniques, confirming the critical situation of land subsidence risk. In this work, we present the comparison between the results obtained with two different DInSAR techniques applied to the study of land subsidence in the Ravenna territory: the Small Baseline Subset (SBAS) and the Coherent Pixel Technique (CPT). The SBAS processing is performed in the SARscape software and is based on the algorithm of Berardino et al., 2002. This technique relies on the combination of differential interferograms created from stacks of SAR image pairs that have small temporal and perpendicular baselines. By using several interferograms for every single image, it is possible to obtain high spatial coherence, high data density and more effective error reduction. This allows us to obtain mean velocity maps with good data density even over non-urbanized territories. For the CPT we used the SUBsoft processor, based on the algorithm implemented by Mora et al., 2003. CPT is able to extract from a stack of differential interferograms the deformation evolution over wide areas during large time spans. 
The processing scheme is composed of three main steps: a) the generation of the best interferogram set among all the available images of the zone under study; b) the selection of the pixels with reliable phase within the employed interferograms; and c) their phase analysis to calculate, as the main result, their deformation time series within the observation period. For this study, different SAR images have been used: 25 m ground-resolution ERS-1/2 (1992-2000) and ENVISAT (2003-2010), and 3 m ground-resolution TerraSAR-X (2012-2014). The results obtained for each stack of images with the two techniques are validated and compared with the C-GPS time series of more than three benchmark stations. The aim is to test the two InSAR techniques in the monitoring of ground settlements in low-urbanized territories. Furthermore, we have investigated the advantages (data accuracy and density) of using SAR images with higher ground resolution.
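    The mean velocity maps that both techniques produce come down to fitting a linear trend to each pixel's deformation time series. A minimal sketch of that per-pixel fit is given below; the sample epochs and displacements are synthetic, not values from the Ravenna study.

```python
import numpy as np

def mean_velocity(times, series):
    """Least-squares mean LOS velocity of one pixel's time series.

    times:  acquisition epochs in years;
    series: LOS displacements in mm at those epochs.
    Returns the slope of a degree-1 polynomial fit, in mm/yr.
    """
    return float(np.polyfit(times, series, 1)[0])
```

    Applied pixel by pixel over a processed stack, this yields the mean velocity map that is then compared against the C-GPS benchmark time series.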

  1. Monitoring interseismic activity on the Ilan Plain (NE Taiwan) using Small Baseline PS-InSAR, GPS and leveling measurements: partitioning from arc-continent collision and backarc extension

    NASA Astrophysics Data System (ADS)

    Su, Zhe; Hu, Jyr-Ching; Wang, Erchie; Li, Yongsheng; Yang, Yinghui; Wang, Pei-Ling

    2018-01-01

    The Ilan Plain, located in Northeast Taiwan, represents a transition zone between oblique collision (between the Luzon Arc and the Eurasian Plate) and backarc extension (the Okinawa Trough). The mechanism for this abrupt transition from arc-continent collision to backarc extension remains uncertain. We used Global Positioning System (GPS), leveling and multi-interferogram Small Baseline Persistent Scatterer Interferometry (SBAS-PSI) data to monitor the interseismic activity in the basin. A common reference site was selected for the data sets. The horizontal component of GPS and the vertical measurements of the leveling data were converted to line-of-sight (LOS) data and compared with the SBAS-PSI data. The comparison shows that the entire Ilan Plain is undergoing rapid subsidence at a maximum rate of -11 ± 2 mm yr-1 in the LOS direction. We speculate that vertical deformation and anthropogenic activity may play important roles in this deformation. We also performed a joint inversion modeling that combined DInSAR and strong-motion data to constrain the source model of the 2005 Ilan earthquake. The best-fitting model predicts that the Sansing fault caused the 2005 Ilan earthquake. The observed transtensional deformation is dominated by normal faulting with a minor left-lateral strike-slip component. We compared our SBAS-PSI results with the short-term (2005-2009) groundwater level changes. The results indicate that although pumping-induced surface subsidence cannot be excluded, tectonic deformation, including rapid southward movement of the Ryukyu arc and backarc extension of the Okinawa Trough, characterizes the opening of the Ilan Plain. Furthermore, a series of normal and left-lateral strike-slip transtensional faults, including the Choshui and Sansing faults, form a bookshelf-like structure that accommodates the extension of the plain. 
Although situated in a region of complex structural interactions, the Ilan Plain is primarily controlled by extension rather than by shortening. As the massive, pre-existing Philippines-Ryukyu island arc was pierced by the Philippine Sea Plate, the Ilan Plain formed as a remnant backarc basin on the northeastern corner of Taiwan.

  2. Block Copolymer-Based Supramolecular Elastomers with High Extensibility and Large Stress Generation Capability

    NASA Astrophysics Data System (ADS)

    Noro, Atsushi; Hayashi, Mikihiro

    We prepared block copolymer-based supramolecular elastomers with high extensibility and large stress generation capability. Reversible addition fragmentation chain transfer polymerizations were conducted under normal pressure and high pressure to synthesize several large molecular weight polystyrene-b-[poly(butyl acrylate)-co-polyacrylamide]-b-polystyrene (S-Ba-S) block copolymers. Tensile tests revealed that the largest S-Ba-S, with a middle block molecular weight of 3140k, achieved a breaking elongation of over 2000% with a maximum tensile stress of 3.6 MPa and a toughness of 28 MJ/m3, while the reference sample without any middle block hydrogen bonds, polystyrene-b-poly(butyl acrylate)-b-polystyrene with almost the same molecular weight, was merely viscous and not self-standing. Hence, incorporation of hydrogen bonds into a long soft middle block was found to be beneficial to attain high extensibility and large stress generation capability, probably due to the concerted combination of entropic changes and internal potential energy changes originating from the dissociation of multiple hydrogen bonds by elongation. This work was supported by JSPS KAKENHI Grant Numbers 13J02357, 24685035, 15K13785, and 23655213 for M.H. and A.N. A.N. also expresses his gratitude for the Tanaka Rubber Science & Technology Award by the Enokagaku-Shinko Foundation, Japan.

  3. Physical Activity as a Vital Sign: A Systematic Review

    PubMed Central

    Allen, Kelli D.; Ambrose, Kirsten R.; Stiller, Jamie L.; Evenson, Kelly R.; Voisin, Christiane; Hootman, Jennifer M.; Callahan, Leigh F.

    2017-01-01

    Introduction Physical activity (PA) is strongly endorsed for managing chronic conditions, and a vital sign tool (indicator of general physical condition) could alert providers of inadequate PA to prompt counseling or referral. This systematic review examined the use, definitions, psychometric properties, and outcomes of brief PA instruments as vital sign measures, with attention primarily to studies focused on arthritis. Methods Electronic databases were searched for English-language literature from 1985 through 2016 using the terms PA, exercise, vital sign, exercise referral scheme, and exercise counseling. Of the 838 articles identified for title and abstract review, 9 articles qualified for full text review and data extraction. Results Five brief PA measures were identified: Exercise Vital Sign (EVS), Physical Activity Vital Sign (PAVS), Speedy Nutrition and Physical Activity Assessment (SNAP), General Practice Physical Activity Questionnaire (GPPAQ), and Stanford Brief Activity Survey (SBAS). Studies focusing on arthritis were not found. Over 1.5 years of using EVS in a large hospital system, improvements occurred in relative weight loss among overweight patients and reduction in glycosylated hemoglobin among diabetic patients. On PAVS, moderate physical activity of 5 or more days per week versus fewer than 5 days per week was associated with a lower body mass index (−2.90 kg/m2). Compared with accelerometer-defined physical activity, EVS was weakly correlated (r = 0.27), had low sensitivity (27%–59%), and high specificity (74%–89%); SNAP showed weak agreement (κ = 0.12); GPPAQ had moderate sensitivity (46%) and specificity (50%), and SBAS was weakly correlated (r = 0.10–0.28), had poor to moderate sensitivity (18%–67%), and had moderate specificity (58%–79%). Conclusion Few studies have examined a brief physical activity tool as a vital sign measure. 
Initial investigations suggest the promise of these simple and quick assessment tools, and research is needed to test the effects of their use on chronic disease outcomes. PMID:29191260
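
    Most of the brief tools reviewed above reduce to one or two questions whose product is weekly minutes of moderate-or-greater activity, compared against the common 150-minutes-per-week guideline. A hypothetical EVS-style sketch (the function name, categories, and thresholds are illustrative choices, not taken verbatim from any reviewed instrument):

```python
def exercise_vital_sign(days_per_week: int, minutes_per_day: int) -> dict:
    """Illustrative EVS-style screen: weekly minutes of moderate-plus activity."""
    weekly_minutes = days_per_week * minutes_per_day
    if weekly_minutes == 0:
        category = "inactive"
    elif weekly_minutes < 150:  # below the common 150 min/week guideline
        category = "insufficiently active"
    else:
        category = "active"
    return {"weekly_minutes": weekly_minutes, "category": category}
```

    A screen like this takes seconds in clinic, which is what makes the "vital sign" framing plausible in the first place.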

  4. Orogen-Wide InSAR Time Series for Detecting Deformation Sources: The Zagros and Makran of Southern Iran

    NASA Astrophysics Data System (ADS)

    Lohman, R. B.; Barnhart, W. D.

    2011-12-01

    We present interferometric synthetic aperture radar (InSAR) time series maps that span the eastern Zagros (Fars Arc) collisional belt and western Makran accretionary prism of Southern Iran. Given the upcoming availability of large volumes of SAR data from new platforms, such as Sentinel 1 and potentially DESDynI, we explore computationally efficient approaches for extracting deformation time series when the signal of interest is small compared to the level of noise in individual interferograms. We use 12 descending and 2 ascending multi-frame (2-4 frames) Envisat tracks and 2 ascending ALOS tracks spanning 2003-2010 and 2006-2010. We implement a linear inversion, similar to the Small Baseline Subset (SBAS) technique, to derive surface displacements at individual acquisition dates from trees of interferograms with perpendicular baselines less than 350 m for Envisat and 1500 m for ALOS pairs. This spatially extensive dataset allows us to investigate several attributes of interferometry that vary spatially and temporally over large distances, including changes in phase coherence relative to elevation, relief and land use. Through synthetic tests and observed data, we explore various sources of potential error in the calculation of time series, including variable coherence of pixels between interferograms in a single track, ambiguities in phase unwrapping, and orbital ramp estimation over scenes with variable correlated noise structure. We present examples of detected signals with both temporally variable characteristics and small magnitudes, including surface/subsurface salt deformation, aseismic deformation across the Minab-Zendan-Palami strike-slip zone, and subsidence due to hydrocarbon extraction.
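
    A single-pixel version of this kind of SBAS-style linear inversion can be sketched as follows. This is only a minimal illustration with a plain least-squares solve; the study's implementation additionally handles unwrapping ambiguities, orbital ramps and variable coherence, all omitted here:

```python
import numpy as np

def sbas_invert(pairs, phases, n_dates):
    """Least-squares inversion of small-baseline interferogram phases
    (one pixel) for displacement at each acquisition date, relative to
    the first date. pairs[k] = (i, j) with i < j; phases[k] = phi_j - phi_i."""
    A = np.zeros((len(pairs), n_dates - 1))
    for k, (i, j) in enumerate(pairs):
        # Unknowns are displacements at dates 1..n-1 (date 0 is the reference).
        if i > 0:
            A[k, i - 1] = -1.0
        A[k, j - 1] = 1.0
    d, *_ = np.linalg.lstsq(A, np.asarray(phases, dtype=float), rcond=None)
    return np.concatenate([[0.0], d])
```

    With redundant pairs (e.g. interferograms (0,1), (1,2), (2,3) and (0,2)), the least-squares solve averages out inconsistent phase measurements, which is the core idea behind linking small-baseline subsets.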

  5. Potential inundated coastal area estimation in Shanghai with multi-platform SAR and altimetry data

    NASA Astrophysics Data System (ADS)

    Ma, Guanyu; Yang, Tianliang; Zhao, Qing; Kubanek, Julia; Pepe, Antonio; Dong, Hongbin; Sun, Zhibin

    2017-09-01

    As global warming has intensified in recent decades, global sea level is continuously rising. This will cause damage to coastal deltas, which are characterized by low-lying land, dense population, and developed economies. Continuous reclamation of coastal intertidal and wetland areas is making Shanghai, the mega city of the Yangtze River Delta, more vulnerable to sea level rise. In this paper, we investigate the temporal evolution of land subsidence patterns and processes on a stretch of muddy coast located between the Yangtze River Estuary and Hangzhou Bay with differential synthetic aperture radar interferometry (DInSAR) analyses. By exploiting a set of 31 SAR images acquired by ENVISAT/ASAR from February 2007 to May 2010 and a set of 48 SAR images acquired by the COSMO-SkyMed (CSK) sensors from December 2013 to March 2016, coherent point targets as well as land subsidence velocity maps and time series are identified by using the Small Baseline Subset (SBAS) algorithm. With the DInSAR-constrained land subsidence model, we predict the land subsidence trend and the expected cumulative subsidence in 2020, 2025 and 2030. Meanwhile, we use altimetry data and the Ensemble Empirical Mode Decomposition (EEMD) algorithm to obtain the average sea level rise rate in the East China Sea. With the land subsidence predictions, sea level rise predictions, and a high-precision digital elevation model (DEM), we analyze the combined risk of land subsidence and sea level rise on the coastal areas of Shanghai. The potential inundated areas are mapped under different scenarios.
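
    In its simplest form, this kind of scenario analysis compares projected ground elevation against projected sea level. A minimal sketch under the assumption of linear subsidence and sea-level-rise rates (the study itself uses model-based predictions, so this is only a simplified illustration):

```python
def inundated(dem_m, subsidence_mm_yr, slr_mm_yr, years, sea_level0_m=0.0):
    """Flag whether a coastal pixel falls below projected sea level after
    `years`, assuming linear subsidence and sea-level-rise rates."""
    elevation = dem_m - subsidence_mm_yr * years / 1000.0
    sea_level = sea_level0_m + slr_mm_yr * years / 1000.0
    return elevation < sea_level
```

    Applied per DEM pixel, this yields an inundation mask for each scenario (combination of rates and time horizon).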

  6. Discovery and Mechanistic Study of Benzamide Derivatives That Modulate Hepatitis B Virus Capsid Assembly.

    PubMed

    Wu, Shuo; Zhao, Qiong; Zhang, Pinghu; Kulp, John; Hu, Lydia; Hwang, Nicky; Zhang, Jiming; Block, Timothy M; Xu, Xiaodong; Du, Yanming; Chang, Jinhong; Guo, Ju-Tao

    2017-08-15

    Chronic hepatitis B virus (HBV) infection is a global public health problem. Although the currently approved medications can reliably reduce the viral load and prevent the progression of liver diseases, they fail to cure the viral infection. In an effort toward discovery of novel antiviral agents against HBV, a group of benzamide (BA) derivatives that significantly reduced the amount of cytoplasmic HBV DNA were discovered. The initial lead optimization efforts identified two BA derivatives with improved antiviral activity for further mechanistic studies. Interestingly, similar to our previously reported sulfamoylbenzamides (SBAs), the BAs promote the formation of empty capsids through specific interaction with HBV core protein but not other viral and host cellular components. Genetic evidence suggested that both SBAs and BAs inhibited HBV nucleocapsid assembly by binding to the heteroaryldihydropyrimidine (HAP) pocket between core protein dimer-dimer interfaces. However, unlike SBAs, BA compounds uniquely induced the formation of empty capsids that migrated more slowly in native agarose gel electrophoresis from the A36V mutant than from the wild-type core protein. Moreover, we showed that the assembly of chimeric capsids from wild-type and drug-resistant core proteins was susceptible to multiple capsid assembly modulators. Hence, HBV core protein is a dominant antiviral target that may suppress the selection of drug-resistant viruses during core protein-targeting antiviral therapy. Our studies thus indicate that BAs are a chemically and mechanistically unique type of HBV capsid assembly modulator and warrant further development as antiviral agents against HBV. IMPORTANCE HBV core protein plays essential roles in many steps of the viral replication cycle. 
In addition to packaging viral pregenomic RNA (pgRNA) and DNA polymerase complex into nucleocapsids for reverse transcriptional DNA replication to take place, the core protein dimers, existing in several different quaternary structures in infected hepatocytes, participate in and regulate HBV virion assembly, capsid uncoating, and covalently closed circular DNA (cccDNA) formation. It is anticipated that small-molecule core protein assembly modulators may disrupt one or multiple steps of HBV replication, depending on their interaction with the distinct quaternary structures of core protein. The discovery of novel core protein-targeting antivirals, such as the benzamide derivatives reported here, and investigation of their antiviral mechanism may lead to the identification of antiviral therapeutics for the cure of chronic hepatitis B. Copyright © 2017 American Society for Microbiology.

  7. AATR an ionospheric activity indicator specifically based on GNSS measurements

    NASA Astrophysics Data System (ADS)

    Juan, José Miguel; Sanz, Jaume; Rovira-Garcia, Adrià; González-Casado, Guillermo; Ibáñez, D.; Perez, R. Orus

    2018-03-01

    This work reviews an ionospheric activity indicator useful for identifying disturbed periods affecting the performance of Global Navigation Satellite Systems (GNSS). The index is based on the Along Arc TEC Rate (AATR) and can be easily computed from dual-frequency GNSS measurements. The AATR indicator has been assessed over more than one solar cycle (2002-2017) using about 140 receivers distributed worldwide. Results show that it is well correlated with ionospheric activity and, unlike other global indicators linked to geomagnetic activity (e.g. Dst or Ap), it is sensitive to the regional behaviour of the ionosphere and identifies specific effects on GNSS users. Moreover, a dedicated analysis of Satellite Based Augmentation System (SBAS) performance under different ionospheric conditions shows that the AATR indicator is a very suitable means of revealing whether SBAS service availability anomalies are linked to the ionosphere. On this account, the AATR indicator has been selected as the metric to characterise ionospheric operational conditions in the frame of the European Space Agency activities on the European Geostationary Navigation Overlay Service (EGNOS). The AATR index has also been adopted as a standard tool by the International Civil Aviation Organization (ICAO) for joint ionospheric studies in SBAS. In this work we explain how the AATR is computed, paying special attention to cycle-slip detection, which is one of the key issues in the AATR computation and is not fully addressed in other indicators such as the Rate Of change of the TEC Index (ROTI). We then present some of the main conclusions about ionospheric activity that can be extracted from the AATR values during the above-mentioned long-term study. 
These conclusions are: (a) the spatial correlation varies with the MOdified DIP (MODIP), which allows high-, mid- and low-latitude regions to be clearly separated; (b) the large spatial correlation in mid-latitude regions allows the definition of a planetary index, similar to the geomagnetic ones; (c) the seasonal dependency is related to longitude; (d) the variation of the AATR value at different time scales (hourly, daily, seasonal, among others) confirms most of the well-known time dependences of ionospheric events; and (e) there is a clear relationship with space weather events.
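
    In its simplest form, the along-arc TEC-rate idea can be sketched as the RMS of the epoch-to-epoch slant-TEC rate over one satellite arc, with large jumps screened out as cycle slips. The published index additionally applies an ionospheric mapping function and averages over all satellites in view, so the following is only an illustrative simplification with an assumed jump threshold:

```python
import numpy as np

def aatr(stec_tecu, dt_s, slip_threshold_tecu=5.0):
    """Simplified AATR-like statistic for one satellite arc: RMS of the
    epoch-to-epoch slant-TEC rate (TECU/s), discarding epoch pairs whose
    TEC jump exceeds a threshold (a crude cycle-slip screen)."""
    jumps = np.diff(np.asarray(stec_tecu, dtype=float))
    rate = jumps / dt_s                              # TECU per second
    clean = rate[np.abs(jumps) < slip_threshold_tecu]  # drop suspected slips
    return np.sqrt(np.mean(clean**2))
```

    The cycle-slip screen matters because a single uncorrected slip would dominate the RMS, which is exactly the sensitivity issue the abstract notes for indicators such as ROTI.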

  8. An evaluation of exact methods for the multiple subset maximum cardinality selection problem.

    PubMed

    Brusco, Michael J; Köhn, Hans-Friedrich; Steinley, Douglas

    2016-05-01

    The maximum cardinality subset selection problem requires finding the largest possible subset from a set of objects, such that one or more conditions are satisfied. An important extension of this problem is to extract multiple subsets, where the addition of one more object to a larger subset would always be preferred to increases in the size of one or more smaller subsets. We refer to this as the multiple subset maximum cardinality selection problem (MSMCSP). A recently published branch-and-bound algorithm solves the MSMCSP as a partitioning problem. Unfortunately, the computational requirement associated with the algorithm is often enormous, thus rendering the method infeasible from a practical standpoint. In this paper, we present an alternative approach that successively solves a series of binary integer linear programs to obtain a globally optimal solution to the MSMCSP. Computational comparisons of the methods using published similarity data for 45 food items reveal that the proposed sequential method is computationally far more efficient than the branch-and-bound approach. © 2016 The British Psychological Society.
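
    As a toy illustration of the problem structure (not the paper's integer-programming formulation), the following brute-force sketch finds a maximum cardinality subset under an arbitrary feasibility condition and then extracts multiple subsets greedily. The `ok` predicate and the pairwise-threshold condition used in the test are our own illustrative choices:

```python
from itertools import combinations

def max_cardinality_subset(items, ok):
    """Largest subset S of `items` with ok(S) True, by brute force from the
    largest size down (exponential; fine only for small toy instances)."""
    for size in range(len(items), 0, -1):
        for subset in combinations(items, size):
            if ok(subset):
                return list(subset)
    return []

def multiple_subsets(items, ok, k):
    """Greedy sequential extraction of k subsets: each round takes the
    largest feasible subset of the remaining items. This mirrors the
    preference for growing larger subsets first, but unlike the paper's
    sequential binary integer programs it carries no optimality guarantee."""
    remaining, result = list(items), []
    for _ in range(k):
        s = max_cardinality_subset(remaining, ok)
        result.append(s)
        remaining = [x for x in remaining if x not in s]
    return result
```

    For instance, with the condition "all pairwise differences at least 2" over the integers 1-6, the first extracted subset has size 3 and the remaining items happen to form a second feasible subset of size 3.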

  9. Sentinel-1 automatic processing chain for volcanic and seismic areas monitoring within the Geohazards Exploitation Platform (GEP)

    NASA Astrophysics Data System (ADS)

    De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco

    2016-04-01

    The microwave remote sensing scenario is rapidly evolving through the development of new sensor technology for Earth Observation (EO). In particular, Sentinel-1A (S1A) is the first of a constellation of sensors designed to provide a satellite data stream for the European Copernicus program. Sentinel-1A has been specifically designed to provide, over land, Differential Interferometric Synthetic Aperture Radar (DInSAR) products to analyze and investigate Earth's surface displacements. S1A peculiarities include wide ground coverage (250 km swath), C-band operational frequency and short revisit time (which will reduce from 12 to 6 days when the twin system Sentinel-1B is placed in orbit during 2016). These characteristics, together with the global coverage acquisition policy, make the Sentinel-1 constellation extremely suitable for studying and monitoring volcanic and seismic areas worldwide, thus allowing the generation of both ground displacement information with increasing rapidity and new geological understanding. The main acquisition mode over land is the so-called Interferometric Wide Swath (IWS) mode, which is based on the Terrain Observation by Progressive Scans (TOPS) technique and guarantees the mentioned S1A large-coverage characteristics at the expense of non-trivial interferometric processing. Moreover, the satellite spatial coverage and the reduced revisit time will lead to an exponential increase of the data archives which, after the launch of Sentinel-1B, will reach about 3 TB per day. Therefore, the EO scientific community needs, on the one hand, automated and effective DInSAR tools able to address the S1A processing complexity and, on the other hand, the computing and storage capacities to handle the expected large amount of data. It is thus becoming crucial to move processors and tools close to the satellite archives, since the approach of downloading and processing data with in-house computing facilities is no longer efficient. 
To address these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at bringing together data, processing tools and results to make them accessible to the EO scientific community, with particular emphasis on the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of the well-known DInSAR algorithm referred to as the Small BAseline Subset (P-SBAS) within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, aimed at automatically processing S1A data for the generation of SBAS displacement time series. The main characteristics, as well as a number of experimental results obtained by using the implemented web tool, will also be shown. This work is partially supported by: the RITMARE project of the Italian MIUR, the DPC-CNR agreement and the ESA GEP project.

  10. Incorporation Kinetics in Mixed Anion Compound Semiconductor Alloys

    DTIC Science & Technology

    2013-01-01

    …been studied,5-8 but few reports attempting to systematically determine the Sb/As incorporation as a function of growth parameters exist other… droplets are not expected to form. In this case, the surface is always covered in either the anion or segregant such that s + a = 1. The rate of… values were determined in two different ways. In the first case we started by measuring the growth rates with RHEED oscillations under excess group…

  11. Multirisk analysis along the Road 7, Mendoza Province, Argentina

    NASA Astrophysics Data System (ADS)

    Wick, Emmanuel; Baumann, Valérie; Michoud, Clément; Derron, Marc-Henri; Jaboyedoff, Michel; Rune Lauknes, Tom; Marengo, Hugo; Rosas, Mario

    2010-05-01

    The National Road 7 crosses Argentina from east to west, linking Buenos Aires to the Chilean border. This road is an extremely important corridor crossing the Andes Cordillera, but it is exposed to numerous natural hazards, such as rockfalls, debris flows and snow avalanches. The study area is located in the Mendoza Province, between Potrerillos and Las Cuevas at the Chilean border. The main goal of this study is to achieve a regional mapping of geohazard susceptibility along the Road 7 corridor using modern remote sensing and numerical modelling techniques completed by field investigations. The main topics are: - Detection and monitoring of deep-seated gravitational slope deformations by time-series satellite radar interferometry (InSAR) methods. The area of interest is mountainous with almost no vegetation, permitting optimized InSAR processing. Our results are based on applying the small-baseline subset (SBAS) method to a time series of Envisat ASAR images. - Rockfall susceptibility mapping is realized using statistical analysis of the slope angle distribution, including external knowledge of the geology and land cover, to detect the potential source areas (quantitative DEM analysis). The run-outs are assessed with numerical methods based on the shallow-angle method with Conefall. A second propagation is performed using the alpha-beta methodology (3D numerical modelling) with RAS and is compared to the first one. - Debris flow susceptibility mapping is realized using DF-IGAR to detect starting and spreading areas. Slope, flow accumulation, contributive surfaces, plan curvature, and geological and land use datasets are used. The spreading is simulated by a multiple flow algorithm (which rules the path that the debris flow will follow) coupled to an energy-based run-out distance calculation. - Snow avalanche susceptibility mapping is realized using DF-IGAR to map source areas and propagation. 
To detect the source areas, slope, altitude, land use and minimum surfaces are needed. DF-IGAR simulates the spreading by means of the "Perla" methodology. Furthermore, RAS performs the spreading based on the "alpha-beta" method. All these methods are based on Aster and SRTM DEMs (30 m grid) and observations of both optical and radar satellite imagery (Aster, Quickbird, Worldview, Ikonos, Envisat ASAR) and aerial photographs. Several field campaigns were performed to calibrate the regional models with adapted parameters. Susceptibility maps of the entire area for rockfalls, debris flows and snow avalanches at a scale of 1:100'000 were created. These maps and the field investigations are cross-checked to identify and prioritize hotspots. It appears that numerous road sectors are subject to highly active phenomena. Some mitigation works already exist but they are often under-dimensioned, inadequate or neglected. Recommendations for priority and realistic mitigation measures along the identified endangered road sectors are proposed.

  12. Monitoring the Ups and Downs of Sumatra and Java with D-InSAR Time-Series

    NASA Astrophysics Data System (ADS)

    Chaussard, E.; Amelung, F.

    2010-12-01

    We performed, for the first time, a global D-InSAR survey of the Indonesian islands of Sumatra and Java to define locations where deformation is occurring. The goals of this study are 1) to create an inventory of actively deforming volcanic centers and 2) to monitor all types of ground motion. This work provides ground deformation data for previously unmonitored areas and can assist the Indonesian authorities to improve hazard assessment. The D-InSAR survey covers an area of about 500,000 km2, extending over 3000 km along the islands of Sumatra, Java and Bali. We used ALOS data from 45 tracks and more than 1500 granules obtained from the Alaska Satellite Facility (ASF) through the US Government Research Consortium (USGRC). We completed more than 1000 interferograms spanning a period from the end of 2006 to the beginning of 2009. L-band SAR images enable deformation mapping at global scales even in highly vegetated areas where the C-band signal experiences loss of coherence. To identify locations where ground deformation is occurring, we used multiple SAR acquisitions of the same area and performed time series analysis using the Small BAseline Subset (SBAS) method. Interferograms with a maximum spatial baseline of 3000 m were phase-unwrapped and subsequently inverted for the phase with respect to the first acquisition. The temporal coherence of each pixel is computed on the set of interferograms in order to select only pixels with high temporal coherence. The compiled InSAR velocity map reveals the background level of activity of the 84 volcanic centers constituting the Sumatra, Java and Bali volcanic arcs. We identified possible uplift at 6 volcanic centers: Agung (Bali), Lamongan (Java), Lawu (Java), Slamet (Java), Kerinci (Sumatra) and Sinabung (Sumatra). Moreover, we identified subsidence in 5 major cities and 1 coastal area. Subsidence rates range from 6 cm/yr in Medan, the largest city of Sumatra, to more than 15 cm/yr in Jakarta. 
These major subsidence areas are probably due to groundwater extraction needed to support the increasing population and industrial activities.
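
    The pixel-selection step described above, keeping only pixels whose residual phases are consistent across the interferogram set, is commonly computed as the magnitude of the mean complex phasor. A minimal sketch (assuming per-pixel residual phases from the inversion are available):

```python
import numpy as np

def temporal_coherence(residual_phase):
    """Temporal coherence of one pixel from its residual interferometric
    phases (radians) across N interferograms: the magnitude of the mean
    complex phasor. Values near 1 indicate a stable, reliable pixel."""
    return np.abs(np.mean(np.exp(1j * np.asarray(residual_phase, dtype=float))))
```

    Thresholding this value (e.g. keeping pixels above ~0.7, a common but study-dependent choice) removes decorrelated pixels before the velocity map is compiled.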

  13. A strategy for Local Surface Stability Monitoring Using SAR Imagery

    NASA Astrophysics Data System (ADS)

    Kim, J.; Lan, C. W.; Lin, S. Y.; vanGasselt, S.; Yun, H.

    2017-12-01

    To provide sufficient facilities for a growing number of residents, many construction and maintenance projects for infrastructure and buildings are nowadays under way above and below the surface of urban areas. In some cases we have learned that disasters may happen if developments are conducted on unknown or geologically unstable ground or in over-developed areas. To avoid damage caused by such settings, it is essential to perform a regular monitoring scheme to understand ground stability over the whole urban area. Through long-term monitoring, we first aim to observe surface stability over the construction sites. Secondly, we propose to implement automatic extraction and tracking of suspicious unstable areas. To achieve this, we used 12-day-interval C-band Sentinel-1A Synthetic Aperture Radar (SAR) images as the main source for regular monitoring. The Differential Interferometric SAR (D-InSAR) technique was applied to generate interferograms. Together with the accumulation of updated Sentinel-1A SAR images, time series of interferograms were formed accordingly. For the purpose of observing surface stability over known construction sites, the interferograms and the unwrapped products can be used to identify the surface displacement occurring before and after specific events. In addition, Small Baseline Subset (SBAS) and Permanent Scatterers (PS) approaches combining a set of unwrapped D-InSAR interferograms were also applied to derive displacement velocities over long-term periods. For some cases, we conducted ascending- and descending-mode time series analysis to decompose the surface motion into three displacement components and to precisely identify the risk pattern. Regarding the extraction of suspicious unstable areas, we propose to develop an automatic pattern recognition algorithm for the identification of specific fringe patterns involving various potential risks. 
The detected fringes were tracked in the time series of interferograms and overlaid with various GIS layers to find correlations with the environmental elements causing the risks. Taipei City and Taichung City in Taiwan and Ulsan City in Korea were selected to demonstrate the feasibility of the proposed method.

  14. InSAR Time Series to Characterize Landslide Ground Deformations in a Tropical Urban Environment: Focus on Bukavu, East African Rift System (RD Congo).

    NASA Astrophysics Data System (ADS)

    Nobile, A.; d'Oreye, N.; Monsieurs, E.; Dewitte, O.; Kervyn, F.

    2016-12-01

    The western branch of the East African Rift System, in Central Africa, is a region naturally prone to landslides due to factors such as heavy rainfall, tectonic activity and steep topography. In addition, susceptibility to slope instability is expected to increase in the future in response to increasing demographic pressure and land use/land cover changes. The Rift flanks west of Lake Kivu (DRC) are one of the Congolese regions most affected by landslides. Although heavy rainfall periods and earthquakes are the main triggering factors, nothing is known about their potential role in the current dynamics of existing landslides. Here we use InSAR time series to monitor ground deformation associated with large slow-moving landslides that continuously affect highly populated slopes in the city of Bukavu (DRC). Bukavu is located within the Rift, on the southern shore of Lake Kivu, in a tropical environment. Using >100 Cosmo-SkyMed SAR images, acquired between March 2015 and June 2016 with a mean revisit time of 8 days per orbit (ascending and descending), we produce displacement-rate maps and ground deformation time series using different techniques: Persistent Scatterers (PS), Small Baseline Subset (SBAS) and Multidimensional Small Baseline Subset (MSBAS). The three techniques provide similar results in areas with relatively small displacements (a few mm per month). However, in areas where displacements are much larger and where coherence is lost by traditional techniques, MSBAS, which processes the two satellite orbits concurrently and thereby improves temporal resolution, is more efficient: it measures higher ground deformation rates while maintaining coherence. For one specific landslide where intense field mapping was done, the results clearly show the deformation pattern, which divides the landslide into blocks moving at different velocities (up to 20 cm/yr). This pattern is consistent with field observations and possibly related to anthropic activity. 
Furthermore, DGPS measurements, taken at 21 benchmarks in the area during the same period, allow validation of the InSAR results. The combination of InSAR data with rainfall and seismic monitoring, and with field observations, should help us, once longer time series are available, to better understand the mechanisms (both natural and human) that affect this landslide.

  15. Sentinel-1 and ground-based sensors for a continuous monitoring of the Corvara landslide kinematic (South Tirol, Italy)

    NASA Astrophysics Data System (ADS)

    Schlögel, Romy; Darvishi, Mehdi; Cuozzo, Giovanni; Kofler, Christian; Rutzinger, Martin; Zieher, Thomas; Toschi, Isabella; Remondino, Fabio

    2017-04-01

    The Sentinel-1 mission provides Synthetic Aperture Radar (SAR) acquisitions over large areas every 6 days with a spatial resolution of 20 m. This new generation of open-data satellites has enhanced the capability to continuously study Earth surface changes. Over the past two decades, several studies have demonstrated the potential of Differential Synthetic Aperture Radar Interferometry (DInSAR) for detecting and quantifying land surface deformation. DInSAR limitations and challenges are linked to the SAR properties and the field conditions (especially in Alpine environments), which lead to spatial and temporal decorrelation of the SAR signal. High temporal decorrelation can be caused by changes in vegetation (particularly in non-urban areas), atmospheric conditions or high ground surface velocity. In this study, the kinematics of the complex and vegetated Corvara landslide, situated in Val Badia (South Tirol, Italy), are monitored by a network of 3 permanent and 13 monthly Differential Global Positioning System (DGPS) stations. The slope displacement rates are found to be highly unsteady and reach several meters a year. This analysis focuses on evaluating the limitations of Sentinel-1 imagery processed with the Small Baseline Subset (SBAS) technique, in comparison to ground-based measurements, for assessing the landslide kinematics in relation to meteorological conditions. Selecting particular acquisitions, coherence thresholds and unwrapping strategies yields varying results in terms of reliability and accuracy, supporting the understanding of the landslide velocity field. The evolution of the coherence and phase signals is studied according to the changing field conditions and the monitored ground-based displacements. DInSAR deformation maps and residual topographic heights are finally compared with differences of high-resolution Digital Elevation Models at the local scale. 
This research is conducted within the project LEMONADE (http://lemonade.mountainresearch.at) funded by the Euregio Science Fund.

  16. Gravity and InSAR remote sensing of groundwater usage in the Sahel and Horn of Africa

    NASA Astrophysics Data System (ADS)

    Neely, W.; Borsa, A. A.; Burney, J. A.; Devlin, K.

    2016-12-01

    Changes in the Earth's climatic systems influence agro-ecological conditions on local, regional, and global scales. With the world's highest population growth rate, sub-Saharan Africa faces particularly acute concerns regarding food security and resource management. Historical sources of surface water for agricultural production may become less reliable and/or more limited with increased climate variability, and African countries have already begun to depend on more stable sources of groundwater. Expected increases in groundwater usage pose questions about the sustainability of current agricultural practices, which require new sources of information to answer. Given the logistics and costs of implementing in situ networks to monitor regional water security, current remote sensing missions offer an affordable alternative. The Gravity Recovery and Climate Experiment (GRACE) has proven effective in quantifying changes in terrestrial water storage (TWS) at the regional scale using near-monthly gravity measurements from orbit. Using over a decade of measurements, we estimate TWS anomalies in Niger and Ethiopia. These anomalies offer a proxy for hydrological stress, indicating potential targets for additional analysis. We use independent, but complementary, estimates of surface displacements from Interferometric Synthetic Aperture Radar (InSAR) to provide information on local groundwater withdrawal. Using data from ESA's Sentinel-1 mission and JAXA's Advanced Land Observing Satellite (ALOS) missions, we characterize the surface deformation over the past decade in regions of active groundwater pumping using the Small Baseline Subset (SBAS) technique. In particular, we investigate ties of ground motion to known agricultural/industrial land usage near Niamey, Niger and Eastern Oromia, Ethiopia, to better understand how human activity affects available groundwater resources.

  17. High quality InSAR data linked to seasonal change in hydraulic head for an agricultural area in the San Luis Valley, Colorado

    NASA Astrophysics Data System (ADS)

    Reeves, Jessica A.; Knight, Rosemary; Zebker, Howard A.; Schreüder, Willem A.; Shanker Agram, Piyush; Lauknes, Tom R.

    2011-12-01

    In the San Luis Valley (SLV), Colorado, legislation passed in 2004 requires that hydraulic head levels in the confined aquifer system stay within the range experienced in the years 1978-2000. While some measurements of hydraulic head exist, greater spatial and temporal sampling would be very valuable in understanding the behavior of the system. Interferometric synthetic aperture radar (InSAR) data provide fine spatial resolution measurements of Earth surface deformation, which can be related to hydraulic head change in the confined aquifer system. However, change in cm-scale crop structure with time leads to signal decorrelation, resulting in low quality data. Here we apply small baseline subset (SBAS) analysis to InSAR data collected from 1992 to 2001. We are able to show high levels of correlation, denoting high quality data, in areas between the center pivot irrigation circles, where the lack of water results in little surface vegetation. At three well locations we see a seasonal variation in the InSAR data that mimics the hydraulic head data. We use measured values of the elastic skeletal storage coefficient to estimate hydraulic head from the InSAR data. In general the magnitudes of estimated and measured head agree to within the calculated error. However, the errors are unacceptably large due to both errors in the InSAR data and uncertainty in the measured value of the elastic skeletal storage coefficient. We conclude that InSAR is capturing the seasonal head variation, but that further research is required to obtain accurate hydraulic head estimates from the InSAR deformation measurements.
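
    The conversion from InSAR-measured deformation to head change rests on the elastic relation between aquifer-system compaction and head, via the elastic skeletal storage coefficient. A minimal sketch (the function name is ours; the study's error budget and seasonal decomposition are omitted):

```python
def head_change_from_deformation(db_m, skeletal_storage):
    """Estimate confined-aquifer head change (m) from surface deformation,
    using the elastic relation  db = S_ke * dh, where db is compaction (m)
    and S_ke is the dimensionless elastic skeletal storage coefficient."""
    return db_m / skeletal_storage
```

    Because S_ke appears in the denominator, uncertainty in its measured value propagates directly into the head estimate, which is one of the two error sources the abstract identifies.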

  18. Use of NTRIP for optimizing the decoding algorithm for real-time data streams.

    PubMed

    He, Zhanke; Tang, Wenda; Yang, Xuhai; Wang, Liming; Liu, Jihua

    2014-10-10

    As a network transmission protocol, Networked Transport of RTCM via Internet Protocol (NTRIP) is widely used in GPS and Global Orbiting Navigational Satellite System (GLONASS) augmentation systems, such as the Continuous Operational Reference System (CORS), the Wide Area Augmentation System (WAAS) and Satellite Based Augmentation Systems (SBAS). With the deployment of the BeiDou Navigation Satellite System (BDS) to serve the Asia-Pacific region, there are increasing needs for ground monitoring of the BeiDou Navigation Satellite System and for the development of high-precision real-time BeiDou products. This paper aims to optimize the decoding algorithm for NTRIP Client data streams and the user authentication strategies of the NTRIP Caster. The proposed method greatly enhances handling efficiency and significantly reduces data transmission delay compared with the NTRIP implementation of the Federal Agency for Cartography and Geodesy (BKG). Meanwhile, a transcoding method is proposed to facilitate data transformation from the BINary EXchange (BINEX) format to the RTCM format. The transformation scheme thus solves the problem of handling real-time data streams from Trimble receivers within the BeiDou Navigation Satellite System, which was indigenously developed by China.
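
    For context, an NTRIP client subscribes to a caster's data stream with an HTTP-style request naming a mountpoint and carrying Basic authentication. The sketch below builds such a request; the caster address, mountpoint and credentials are hypothetical placeholders, and real clients would send this over a TCP socket and then read the RTCM byte stream from the response body:

```python
# Minimal sketch of the HTTP-based request an NTRIP 2.0 client sends to a
# caster to subscribe to one mountpoint's RTCM stream. Host, mountpoint and
# credentials are invented placeholders.
import base64

def ntrip_request(host, port, mountpoint, user, password):
    cred = base64.b64encode(f"{user}:{password}".encode()).decode()
    return (
        f"GET /{mountpoint} HTTP/1.1\r\n"
        f"Host: {host}:{port}\r\n"
        "Ntrip-Version: Ntrip/2.0\r\n"
        "User-Agent: NTRIP ExampleClient/1.0\r\n"
        f"Authorization: Basic {cred}\r\n"
        "\r\n"
    )

print(ntrip_request("caster.example.org", 2101, "MOUNT1", "user", "pass"))
```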

  19. HBV core protein allosteric modulators differentially alter cccDNA biosynthesis from de novo infection and intracellular amplification pathways.

    PubMed

    Guo, Fang; Zhao, Qiong; Sheraz, Muhammad; Cheng, Junjun; Qi, Yonghe; Su, Qing; Cuconati, Andrea; Wei, Lai; Du, Yanming; Li, Wenhui; Chang, Jinhong; Guo, Ju-Tao

    2017-09-01

    Hepatitis B virus (HBV) core protein assembles viral pre-genomic (pg) RNA and DNA polymerase into nucleocapsids for reverse-transcriptional DNA replication to take place. Several chemotypes of small molecules, including heteroaryldihydropyrimidines (HAPs) and sulfamoylbenzamides (SBAs), have been discovered to allosterically modulate core protein structure and consequently alter the kinetics and pathway of core protein assembly, resulting in the formation of irregularly-shaped core protein aggregates or "empty" capsids devoid of pre-genomic RNA and viral DNA polymerase. Interestingly, in addition to inhibiting nucleocapsid assembly and subsequent viral genome replication, we have now demonstrated that HAPs and SBAs differentially modulate the biosynthesis of covalently closed circular (ccc) DNA from the de novo infection and intracellular amplification pathways by inducing disassembly of nucleocapsids derived from virions as well as of double-stranded DNA-containing progeny nucleocapsids in the cytoplasm. Specifically, the mistimed cuing of nucleocapsid uncoating prevents cccDNA formation during de novo infection of hepatocytes, while transiently accelerating cccDNA synthesis from cytoplasmic progeny nucleocapsids. Our studies indicate that elongation of positive-stranded DNA induces structural changes in nucleocapsids that confer on mature nucleocapsids the ability to bind core protein allosteric modulators (CpAMs) and trigger their disassembly. Understanding the molecular mechanism underlying the dual effects of CpAMs on nucleocapsid assembly and disassembly will facilitate the discovery of novel core protein-targeting antiviral agents that can more efficiently suppress cccDNA synthesis and cure chronic hepatitis B.

  20. Perceptions of users and providers on barriers to utilizing skilled birth care in mid- and far-western Nepal: a qualitative study.

    PubMed

    Onta, Sharad; Choulagai, Bishnu; Shrestha, Binjwala; Subedi, Narayan; Bhandari, Gajananda P; Krettek, Alexandra

    2014-01-01

    Although skilled birth care contributes significantly to the prevention of maternal and newborn morbidity and mortality, utilization of such care is poor in mid- and far-western Nepal. This study explored the perceptions of service users and providers regarding barriers to skilled birth care. We conducted 24 focus group discussions, 12 each with service users and service providers from different health institutions in mid- and far-western Nepal. All discussions examined the perceptions and experiences of service users and providers regarding barriers to skilled birth care and explored possible solutions to overcoming such barriers. Our results determined that major barriers to skilled birth care include inadequate knowledge of the importance of services offered by skilled birth attendants (SBAs), distance to health facilities, unavailability of transport services, and poor availability of SBAs. Other barriers included poor infrastructure, meager services, inadequate information about services/facilities, cultural practices and beliefs, and low prioritization of birth care. Moreover, the tradition of isolating women during and after childbirth decreased the likelihood that women would utilize delivery care services at health facilities. Service users and providers perceived inadequate availability and accessibility of skilled birth care in remote areas of Nepal, and overall utilization of these services was poor. Therefore, training and recruiting locally available health workers, helping community groups establish transport mechanisms, upgrading physical facilities and services at health institutions, and increasing community awareness of the importance of skilled birth care will help bridge these gaps.

  1. Monitoring and modeling of sinkholes affecting the Jordanian coast of the Dead Sea through satellite interferometric techniques

    NASA Astrophysics Data System (ADS)

    Tessari, Giulia; Pasquali, Paolo; Floris, Mario

    2016-04-01

    Differential Interferometric Synthetic Aperture Radar (DInSAR) techniques have been applied to investigate sinkholes affecting the Jordanian coast of the Dead Sea. The Dead Sea is a hypersaline terminal lake located in a pull-apart basin. Most of the area is characterized by highly karstic and fractured rock formations that are connected with faults. Karstic conduits extend from the land into the sea. Since the 1960s, the Dead Sea level has been dropping at an increasing rate: from about 60 cm/yr in the 1970s up to 1 m/yr in the 2000s. From about the mid-1980s, sinkholes appeared more and more frequently over and around the emerged mudflats and salt flats. Strong subsidence and landslides also affect some segments of the coast. Nowadays, several thousand sinkholes attest that the degradation of the Dead Sea coast is worsening. Deformation analysis has been focused on the Ghor Al Haditha area, located in the south-eastern part of the lake coast. SAR data acquired by three different sensors, ERS, ENVISAT and COSMO-SkyMed, have been analysed: 70 ERS images from 1992 to 2009 and 30 ENVISAT images from 2003 to 2010 have been processed. The SBAS technique has been applied to derive surface velocity and displacement maps. Results obtained from the SBAS technique, applied to ERS and ENVISAT data, highlight diffuse subsidence of the entire eastern coast of the Dead Sea. It was not possible to detect single sinkholes because of the resolution of these sensors (about 25 m) and the small size of each event, which generally varies from a few meters to a hundred meters in diameter. Furthermore, SBAS has been applied to 23 COSMO-SkyMed SAR images from December 2011 to May 2013. The high resolution of these data (3 m x 3 m) and the short revisit time provided precise information on the displacement of individual sinkholes beyond the overall subsidence of the coast. A specific sinkhole has been analysed in order to understand its temporal evolution.
This sinkhole reached a total displacement of around 120 mm in 18 months in its central part. On the basis of the results from DInSAR processing, a simplified analytical model has been implemented. The vertical and horizontal components of the surface displacement field obtained from the analysis of SAR images have been used as input data to derive the geometric parameters of the source and, in particular, to estimate the volumetric strain of the phenomenon. The position, dimensions and mechanism of the source have been obtained.
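
    The abstract does not name the simplified analytical model, so the sketch below assumes a Mogi point source (Poisson ratio 0.25) purely as a typical example of relating a volume change at depth to the vertical and radial surface displacements that DInSAR provides; the volume and depth values are invented:

```python
# Illustrative only: a Mogi point source (nu = 0.25) as one common choice of
# simplified analytical model linking a volume change dV at depth to surface
# displacements. The abstract's actual model and parameters are unspecified.
import math

def mogi(dV, depth, r):
    """Vertical and radial surface displacement (m) at radial distance r (m)
    from a point source of volume change dV (m^3) at the given depth (m)."""
    R3 = (r**2 + depth**2) ** 1.5
    uz = 3.0 * dV * depth / (4.0 * math.pi * R3)
    ur = 3.0 * dV * r / (4.0 * math.pi * R3)
    return uz, ur

# A volume loss of 1e3 m^3 at 50 m depth: subsidence directly above the source
uz, ur = mogi(-1e3, 50.0, 0.0)
print(round(uz * 1000, 1), "mm")
```

Inverting such a forward model against the observed vertical and horizontal displacement fields is one way the source position, dimensions and volumetric strain could be estimated.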

  2. A cluster randomized implementation trial to measure the effectiveness of an intervention package aiming to increase the utilization of skilled birth attendants by women for childbirth: study protocol

    PubMed Central

    2014-01-01

    Background Nepal is on track to achieve MDG 5, but there is a huge sub-national disparity, with persistently high maternal mortality in the western and hilly regions. The national priority is to reduce this disparity in order to achieve the goal at the sub-national level. Evidence from developing countries shows that increasing the utilization of skilled attendants at birth is an important means of reducing maternal deaths. Further, there is very low utilization of skilled birth attendants during childbirth in the western and hilly regions of Nepal, which clearly points to barriers to their utilization. There is therefore a need to overcome the identified barriers to increase utilization and thereby decrease maternal mortality. The hypothesis of this study is that, through a package of interventions, the utilization of skilled birth attendants will be increased, improving maternal health in Nepal. Method/Design This study involves a cluster randomized controlled trial involving approximately 5000 pregnant women in 36 clusters. The 18 intervention clusters will receive the following interventions: i) mobilization of family support for pregnant women to reach the health facility, ii) availability of emergency funds for institutional childbirth, iii) availability of transport options to reach a health facility for childbirth, iv) training of health workers in communication skills, v) security provisions enabling SBAs to reach services around the clock through community mobilization; the 18 control clusters will not receive the intervention package. The final evaluation of the intervention is planned to be completed by October 2014. The primary output of this study is the utilization of SBA services. Secondary outputs measure the uptake of antenatal care, postnatal checkups for mother and baby, availability of transportation for childbirth, operation of the emergency fund, improved reception of women at health services, and improved physical security of SBAs.
Discussion The intervention package is designed to increase the utilization of skilled birth attendants by overcoming barriers related to awareness, finance, transport, security, etc. If the package proves effective, the Ministry of Health has committed to scaling it up throughout the country. Trial registration number ISRCTN78892490. PMID:24646123

  3. System, method and apparatus for conducting a keyterm search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A keyterm search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more keyterms. Next, a gleaning model of the query is created. The gleaning model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.
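
    The idea of modeling subsets relationally and scoring them against a model gleaned from the query can be sketched as follows; the specific modeling choices here (a small co-occurrence window, an overlap score) are illustrative assumptions, not the patent's actual construction:

```python
# Hedged sketch of the keyterm-search idea: summarize each database subset by
# a relational model (here, counts of terms co-occurring within a +/-2 word
# window), build a "gleaning" model from the query's keyterms the same way,
# and rank subsets by how well the models overlap.
from collections import Counter

def relational_model(text):
    words = text.lower().split()
    model = Counter()
    for i, w in enumerate(words):
        for n in words[max(0, i - 2):i] + words[i + 1:i + 3]:
            model[(w, n)] += 1
    return model

def score(query_model, subset_model):
    # overlap of contextual relations between query and subset
    return sum(min(c, subset_model[p]) for p, c in query_model.items())

subsets = {"doc1": "engine stall during climb", "doc2": "cabin crew meal service"}
models = {k: relational_model(v) for k, v in subsets.items()}
q = relational_model("engine stall")
ranked = sorted(models, key=lambda k: score(q, models[k]), reverse=True)
print(ranked[0])  # doc1
```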

  4. System, method and apparatus for conducting a phrase search

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W. (Inventor)

    2004-01-01

    A phrase search is a method of searching a database for subsets of the database that are relevant to an input query. First, a number of relational models of subsets of a database are provided. A query is then input. The query can include one or more sequences of terms. Next, a relational model of the query is created. The relational model of the query is then compared to each one of the relational models of subsets of the database. The identifiers of the relevant subsets are then output.

  5. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The upcoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformation related to natural and man-made hazards. Thanks to the global coverage strategy and the 12-day revisit time, together with the free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In this framework, the European Commission has been funding several projects through the GMES and Copernicus programs aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, FP7-DORIS, an advanced GMES downstream service coordinated by the Italian National Research Council (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project (www.doris-project.eu) has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be overcome through the development of modern infrastructures able to efficiently provide computing resources as well as advanced services for big data management, processing and dissemination. 
In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make advanced and innovative tools for DInSAR product generation and exploitation available to a large audience. In particular, CNR is porting the multi-temporal DInSAR technique referred to as the Small Baseline Subset (SBAS) onto the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on presenting the main results achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services based on the effective exploitation of cloud platforms is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013, "SBAS-DINSAR time series generation on cloud computing platforms", IEEE IGARSS Conference, Melbourne (AU), July 2013.

  6. Identification of features in indexed data and equipment therefore

    DOEpatents

    Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA

    2002-04-02

    Embodiments of the present invention provide methods of identifying a feature in an indexed dataset. Such embodiments encompass selecting an initial subset of indices, the initial subset of indices being encompassed by an initial window-of-interest and comprising at least one beginning index and at least one ending index; computing an intensity weighted measure of dispersion for the subset of indices using a subset of responses corresponding to the subset of indices; and comparing the intensity weighted measure of dispersion to a dispersion critical value determined from an expected value of the intensity weighted measure of dispersion under a null hypothesis of no transient feature present. Embodiments of the present invention also encompass equipment configured to perform the methods of the present invention.
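
    One plausible form of the intensity-weighted measure of dispersion described above treats the responses as weights on the index positions within the window of interest, so that a narrow, intense peak yields low dispersion; the patent's exact statistic and critical value may differ from this sketch:

```python
# Sketch of an intensity-weighted measure of dispersion over a window of
# indices: the spread of index positions, weighted by their responses. A
# transient feature concentrates intensity and lowers the dispersion. This
# is one plausible reading of the statistic, not the patent's exact formula.

def weighted_dispersion(indices, responses):
    total = sum(responses)
    mean = sum(i * w for i, w in zip(indices, responses)) / total
    return sum(w * (i - mean) ** 2 for i, w in zip(indices, responses)) / total

# A narrow peak versus a flat baseline over the same window
peak = weighted_dispersion([0, 1, 2, 3, 4], [1, 2, 30, 2, 1])
flat = weighted_dispersion([0, 1, 2, 3, 4], [5, 5, 5, 5, 5])
print(peak < flat)  # True: the peak's dispersion falls below the baseline's
```

Comparing such a statistic to a critical value derived under the no-feature null hypothesis is what turns it into a detection test.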

  7. Transferability of optimally-selected climate models in the quantification of climate change impacts on hydrology

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe

    2016-11-01

    Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.
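
    The abstract does not specify its two optimal-selection methods, but a simple stand-in that captures the goal of covering the ensemble spread is greedy farthest-point selection in the space of climate-model variables; the ensemble values below are invented:

```python
# Illustrative subset selection for a climate-simulation ensemble: greedy
# farthest-point sampling in the space of climate variables (here, made-up
# (delta-T, delta-P) pairs). A stand-in for the paper's unnamed methods.
import math

def farthest_point_subset(points, k):
    chosen = [0]  # start from an arbitrary ensemble member
    while len(chosen) < k:
        best = max(
            (i for i in range(len(points)) if i not in chosen),
            key=lambda i: min(math.dist(points[i], points[c]) for c in chosen),
        )
        chosen.append(best)
    return chosen

# Each point: (delta-T in K, delta-P in %) for one simulation (invented)
ens = [(1.0, 2.0), (1.1, 2.1), (3.0, -5.0), (2.0, 10.0), (1.05, 2.05)]
print(farthest_point_subset(ens, 3))  # picks spread-out members
```

The study's caution applies directly to such schemes: a subset that spans the climate-variable space need not span the spread of the downstream hydrological impacts.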

  8. Adaptive feature selection using v-shaped binary particle swarm optimization.

    PubMed

    Teng, Xuyang; Dong, Hongbin; Zhou, Xiurong

    2017-01-01

    Feature selection is an important preprocessing method in machine learning and data mining. This process can be used not only to reduce the amount of data to be analyzed but also to build models with stronger interpretability based on fewer features. Traditional feature selection methods evaluate the dependency and redundancy of features separately, which leads to a lack of measurement of their combined effect. Moreover, a greedy search considers only the optimization of the current round and thus cannot be a global search. To evaluate the combined effect of different subsets in the entire feature space, an adaptive feature selection method based on V-shaped binary particle swarm optimization is proposed. In this method, the fitness function is constructed using the correlation information entropy. Feature subsets are regarded as individuals in a population, and the feature space is searched using V-shaped binary particle swarm optimization. The above procedure overcomes the hard constraint on the number of features, enables the combined evaluation of each subset as a whole, and improves the search ability of conventional binary particle swarm optimization. The proposed algorithm is an adaptive method with respect to the number of feature subsets. The experimental results show the advantages of optimizing the feature subsets using the V-shaped transfer function and confirm the effectiveness and efficiency of the feature subsets obtained under different classifiers.
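
    The V-shaped transfer function at the heart of this method maps a particle's velocity to a bit-flip probability, so a large velocity magnitude makes flipping the corresponding feature bit likely. A minimal sketch, omitting the fitness function and swarm bookkeeping and using one common variant of the V-shape:

```python
# Sketch of the V-shaped transfer function used in V-shaped binary PSO:
# velocity -> bit-flip probability. The particular V-shape below,
# |v / sqrt(1 + v^2)|, is one common variant; the paper's exact choice and
# the correlation-information-entropy fitness are not reproduced here.
import math
import random

def v_shape(v):
    return abs(v / math.sqrt(1.0 + v * v))  # in [0, 1)

def update_bit(bit, velocity, rng=random):
    """Flip the feature-selection bit with probability v_shape(velocity)."""
    return 1 - bit if rng.random() < v_shape(velocity) else bit

# Zero velocity never flips a bit; a large velocity almost always does
print(v_shape(0.0), round(v_shape(10.0), 3))
```

Because the flip probability is symmetric in the sign of the velocity, V-shaped transfer functions encourage exploration compared with the S-shaped (sigmoid) mapping of conventional binary PSO.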

  9. Adaptive feature selection using v-shaped binary particle swarm optimization

    PubMed Central

    Dong, Hongbin; Zhou, Xiurong

    2017-01-01

    Feature selection is an important preprocessing method in machine learning and data mining. This process can be used not only to reduce the amount of data to be analyzed but also to build models with stronger interpretability based on fewer features. Traditional feature selection methods evaluate the dependency and redundancy of features separately, which leads to a lack of measurement of their combined effect. Moreover, a greedy search considers only the optimization of the current round and thus cannot be a global search. To evaluate the combined effect of different subsets in the entire feature space, an adaptive feature selection method based on V-shaped binary particle swarm optimization is proposed. In this method, the fitness function is constructed using the correlation information entropy. Feature subsets are regarded as individuals in a population, and the feature space is searched using V-shaped binary particle swarm optimization. The above procedure overcomes the hard constraint on the number of features, enables the combined evaluation of each subset as a whole, and improves the search ability of conventional binary particle swarm optimization. The proposed algorithm is an adaptive method with respect to the number of feature subsets. The experimental results show the advantages of optimizing the feature subsets using the V-shaped transfer function and confirm the effectiveness and efficiency of the feature subsets obtained under different classifiers. PMID:28358850

  10. Data-driven confounder selection via Markov and Bayesian networks.

    PubMed

    Häggström, Jenny

    2018-06-01

    To unbiasedly estimate a causal effect on an outcome, unconfoundedness is often assumed. If there is sufficient knowledge of the underlying causal structure, then existing confounder selection criteria can be used to select subsets of the observed pretreatment covariates, X, sufficient for unconfoundedness, if such subsets exist. Here, estimation of these target subsets is considered when the underlying causal structure is unknown. The proposed method is to model the causal structure by a probabilistic graphical model, for example, a Markov or Bayesian network, estimate this graph from observed data and select the target subsets given the estimated graph. The approach is evaluated by simulation both in a high-dimensional setting where unconfoundedness holds given X and in a setting where unconfoundedness only holds given subsets of X. Several common target subsets are investigated and the selected subsets are compared with respect to accuracy in estimating the average causal effect. The proposed method is implemented with existing software that can easily handle high-dimensional data, in terms of large samples and large numbers of covariates. The results from the simulation study show that, if unconfoundedness holds given X, this approach is very successful in selecting the target subsets, outperforming alternative approaches based on random forests and LASSO, and that the subset estimating the target subset containing all causes of the outcome yields the smallest MSE in the average causal effect estimation. © 2017, The International Biometric Society.

  11. The First SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-1)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Claustre, Herve; Ras, Josephine; VanHeukelem, Laurie; Berthon, Jean-Francois; Targa, Cristina; vanderLinde, Dirk; Barlow, Ray; Sessions, Heather

    2001-01-01

    Four laboratories, which had contributed to various aspects of SeaWiFS calibration and validation activities, participated in the first SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-1): Horn Point Laboratory (USA), the Joint Research Centre (Italy), the Laboratoire de Physique et Chimie Marines (France), and the Marine and Coastal Management group (South Africa). The analyses of the data are presented in Chapter 1 and the individual methods of the four groups are presented in Chapters 2-5. The average (or overall) conclusions of the round-robin are derived from 12 in situ stations occupied during a cruise in the Mediterranean Sea, although only 11 stations are used in the analyses. The data set is composed of 12 replicates taken during each sampling opportunity, with 3 replicates going to each of the 4 laboratories. The average (or overall) results from the intercomparison of 15 pigments or pigment associations are as follows (in some cases, data subsets that exclude pigments which were not analyzed by all the laboratories, or that had unusually large variances, are used to exclude a variety of problematic pigments): a) the accuracy of the four methods in determining the concentration of total chlorophyll a is 7.9% (one method did not separate mono- and divinyl chlorophyll a, and if the samples containing significant divinyl chlorophyll a concentrations are ignored, the four methods have an accuracy of 6.7%); b) the accuracy in determining the full set of pigments is 19.1%; c) there is a reduction in accuracy of approximately 12.2% for every decade (factor of 10) decrease in concentration (based on a data subset); d) the precision of the four methods using a data subset is 8.6% (6.2% for an edited subset); e) the repeatability of the four methods using the subset data is 9.2% (7.2% for an edited subset); and f) the reproducibility of the four methods using the subset data is 21.3% (15.0% for an edited subset).

  12. Perceptions of users and providers on barriers to utilizing skilled birth care in mid- and far-western Nepal: a qualitative study

    PubMed Central

    Onta, Sharad; Choulagai, Bishnu; Shrestha, Binjwala; Subedi, Narayan; Bhandari, Gajananda P.; Krettek, Alexandra

    2014-01-01

    Background Although skilled birth care contributes significantly to the prevention of maternal and newborn morbidity and mortality, utilization of such care is poor in mid- and far-western Nepal. This study explored the perceptions of service users and providers regarding barriers to skilled birth care. Design We conducted 24 focus group discussions, 12 each with service users and service providers from different health institutions in mid- and far-western Nepal. All discussions examined the perceptions and experiences of service users and providers regarding barriers to skilled birth care and explored possible solutions to overcoming such barriers. Results Our results determined that major barriers to skilled birth care include inadequate knowledge of the importance of services offered by skilled birth attendants (SBAs), distance to health facilities, unavailability of transport services, and poor availability of SBAs. Other barriers included poor infrastructure, meager services, inadequate information about services/facilities, cultural practices and beliefs, and low prioritization of birth care. Moreover, the tradition of isolating women during and after childbirth decreased the likelihood that women would utilize delivery care services at health facilities. Conclusions Service users and providers perceived inadequate availability and accessibility of skilled birth care in remote areas of Nepal, and overall utilization of these services was poor. Therefore, training and recruiting locally available health workers, helping community groups establish transport mechanisms, upgrading physical facilities and services at health institutions, and increasing community awareness of the importance of skilled birth care will help bridge these gaps. PMID:25119066

  13. Monitoring Volcano Deformation in the Northernmost Andes with ALOS InSAR Time-Series

    NASA Astrophysics Data System (ADS)

    Morales Rivera, A. M.; Amelung, F.

    2014-12-01

    Satellite-based Interferometric Synthetic Aperture Radar (InSAR) is a well-established volcano monitoring tool, providing the opportunity to conduct local and regional surveys to detect and measure volcanic deformation. The signals detected by InSAR on volcanoes can be related to various phenomena, such as volume changes in magmatic reservoirs, compaction of recent deposits, changes in hydrothermal activity, and flank instability. The InSAR time-series method has well-documented examples of these phenomena, including precursory inflation of magma reservoirs months prior to volcanic eruptions, proving its potential for early warning systems. We use data from the ALOS-1 satellite of the Japan Aerospace Exploration Agency (JAXA), which acquired a global L-band data set of nearly 20 acquisitions during 2007-2011, to perform an InSAR time-series analysis using the Small Baseline Subset (SBAS) method. Our analysis covers all of the volcanoes in Colombia, Ecuador, and Peru that are cataloged by the Global Volcanism Program. We present results showing time-dependent ground deformation on and near the volcanoes, and present kinematic models to constrain the characteristics of the magmatic sources for the cases in which the deformation is likely related to changes in magma reservoir pressurization.

  14. Decomposition of metabolic network into functional modules based on the global connectivity structure of reaction graph.

    PubMed

    Ma, Hong-Wu; Zhao, Xue-Ming; Yuan, Ying-Jin; Zeng, An-Ping

    2004-08-12

    Metabolic networks are organized in a modular, hierarchical manner. Methods for a rational decomposition of the metabolic network into relatively independent functional subsets are essential to better understand the modularity and organization principles of a large-scale, genome-wide network. Network decomposition is also necessary for functional analysis of metabolism by pathway analysis methods that are often hampered by the problem of combinatorial explosion due to the complexity of the metabolic network. Decomposition methods proposed in the literature are mainly based on the connection degree of metabolites. To obtain a more reasonable decomposition, the global connectivity structure of metabolic networks should be taken into account. In this work, we use a reaction graph representation of a metabolic network for the identification of its global connectivity structure and for decomposition. A bow-tie connectivity structure similar to that previously discovered for the metabolite graph is found also to exist in the reaction graph. Based on this bow-tie structure, a new decomposition method is proposed, which uses a distance definition derived from the path length between two reactions. A hierarchical classification tree is first constructed from the distance matrix among the reactions in the giant strong component of the bow-tie structure. These reactions are then grouped into different subsets based on the hierarchical tree. Reactions in the IN and OUT subsets of the bow-tie structure are subsequently placed in the corresponding subsets according to a 'majority rule'. Compared with the decomposition methods proposed in the literature, ours is based on combined properties of the global network structure and local reaction connectivity rather than, primarily, on the connection degree of metabolites. The method is applied to decompose the metabolic network of Escherichia coli. Eleven subsets are obtained. 
More detailed investigations of the subsets show that reactions in the same subset are indeed functionally related. The rational decomposition of metabolic networks, and subsequent studies of the subsets, make it easier to understand the inherent organization and functionality of metabolic networks at the modular level. http://genome.gbf.de/bioinformatics/
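
    The core idea of grouping reactions by graph path length can be sketched in miniature: compute shortest-path distances in the reaction graph by breadth-first search, then merge reactions within a distance cutoff into one module (a single-linkage-style grouping, which with a cutoff of 1 reduces to connected components). The toy graph and cutoff below are invented; the paper's hierarchical tree and majority rule are not reproduced:

```python
# Minimal sketch of distance-based decomposition of a reaction graph:
# BFS path lengths + a distance cutoff merging reactions into modules.
# Toy graph and cutoff are illustrative only.
from collections import deque

def bfs_dist(graph, src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def modules(graph, cutoff):
    nodes = sorted(graph)
    parent = {n: n for n in nodes}  # union-find for merging
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u in nodes:
        for v, dv in bfs_dist(graph, u).items():
            if 0 < dv <= cutoff:
                parent[find(u)] = find(v)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), []).append(n)
    return sorted(groups.values())

# Two unconnected reaction chains: r1-r2-r3 and r4-r5
g = {"r1": ["r2"], "r2": ["r1", "r3"], "r3": ["r2"],
     "r4": ["r5"], "r5": ["r4"]}
print(modules(g, 1))  # [['r1', 'r2', 'r3'], ['r4', 'r5']]
```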

  15. Systematic wavelength selection for improved multivariate spectral analysis

    DOEpatents

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets that optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
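
    Steps (1)-(3) of the claimed method can be illustrated with a small sketch. The GA runs are stubbed as pre-selected index sets, the smoother is a simple moving average, and the threshold is arbitrary; none of these stand-ins reflect the patent's actual implementation:

```python
import numpy as np

def combined_count_spectrum(runs, n_wavelengths):
    """Accumulate per-wavelength selection counts over several GA runs
    (step 1), then smooth the combined count spectrum (step 2)."""
    counts = np.zeros(n_wavelengths)
    for selected in runs:                  # each run contributes one count spectrum
        counts[list(selected)] += 1
    kernel = np.ones(3) / 3.0
    return np.convolve(counts, kernel, mode="same")   # 3-point moving average

runs = [{2, 3, 4}, {3, 4, 7}, {3, 8}]      # stand-ins for GA run outputs
smooth = combined_count_spectrum(runs, 10)
threshold = 0.9                            # step 3: chosen threshold count
subset = np.nonzero(smooth >= threshold)[0]
print(subset)                              # wavelengths selected repeatedly
```

    Wavelengths that the GA keeps re-selecting accumulate high counts and survive the threshold; here indices 2-4 do.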

  16. On the reliable and flexible solution of practical subset regression problems

    NASA Technical Reports Server (NTRS)

    Verhaegen, M. H.

    1987-01-01

    A new algorithm for solving subset regression problems is described. The algorithm performs a QR decomposition with a new column-pivoting strategy, which permits subset selection directly from the originally defined regression parameters. This, in combination with a number of extensions of the new technique, makes the method a very flexible tool for analyzing subset regression problems in which the parameters have a physical meaning.
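
    The column-pivoting idea can be approximated with a short sketch. This greedy forward selection, which picks the column most correlated with the current residual, is a simplified stand-in for the paper's new pivoting strategy, not the algorithm itself:

```python
import numpy as np

def greedy_subset_selection(X, y, k):
    """Select k regression columns greedily: at each step choose the
    column most correlated with the current residual, then re-fit on
    the chosen subset -- the spirit of QR column pivoting for subset
    regression, operating directly on the original parameters."""
    n, p = X.shape
    chosen = []
    residual = y.astype(float).copy()
    for _ in range(k):
        remaining = [j for j in range(p) if j not in chosen]
        scores = [abs(X[:, j] @ residual) / (np.linalg.norm(X[:, j]) + 1e-12)
                  for j in remaining]
        chosen.append(remaining[int(np.argmax(scores))])
        coef, *_ = np.linalg.lstsq(X[:, chosen], y, rcond=None)
        residual = y - X[:, chosen] @ coef    # update residual for next pick
    return chosen

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
y = 2.0 * X[:, 1] - 3.0 * X[:, 4] + 0.01 * rng.standard_normal(50)
subset = greedy_subset_selection(X, y, 2)
print(sorted(subset))                         # recovers the two active columns
```

    Because the selection works on the original columns, the chosen indices keep their physical meaning, which is the flexibility the abstract emphasizes.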

  17. Diagnosis of Chronic Kidney Disease Based on Support Vector Machine by Feature Selection Methods.

    PubMed

    Polat, Huseyin; Danaei Mehr, Homay; Cetin, Aydin

    2017-04-01

    Because Chronic Kidney Disease progresses slowly, early detection and effective treatment are the only way to reduce the mortality rate. Machine learning techniques are gaining significance in medical diagnosis because of their ability to classify with high accuracy. The accuracy of classification algorithms depends on the use of correct feature selection algorithms to reduce the dimension of datasets. In this study, the Support Vector Machine classification algorithm was used to diagnose Chronic Kidney Disease. To diagnose the disease, two essential types of feature selection methods, namely wrapper and filter approaches, were chosen to reduce the dimension of the Chronic Kidney Disease dataset. In the wrapper approach, the classifier subset evaluator with the greedy stepwise search engine and the wrapper subset evaluator with the Best First search engine were used. In the filter approach, the correlation feature selection subset evaluator with the greedy stepwise search engine and the filtered subset evaluator with the Best First search engine were used. The results showed that the Support Vector Machine classifier using the filtered subset evaluator with the Best First search engine had the highest accuracy rate (98.5%) in the diagnosis of Chronic Kidney Disease among the selected methods.
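
    A filter-style selection step can be sketched as follows. This generic correlation ranking is only illustrative of the filter family; it is not the CFS or filtered subset evaluator used in the study:

```python
import numpy as np

def correlation_filter(X, y, k):
    """Filter-style feature selection: rank features by absolute Pearson
    correlation with the class label and keep the top k, independently
    of any downstream classifier."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12
    corr = np.abs(Xc.T @ yc) / denom
    return np.argsort(corr)[::-1][:k]

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200).astype(float)          # binary class label
noise = rng.standard_normal((200, 4))              # irrelevant features
informative = y[:, None] + 0.1 * rng.standard_normal((200, 2))
X = np.hstack([noise, informative])                # features 4 and 5 carry the label
top = correlation_filter(X, y, 2)
print(sorted(top))                                 # the two informative features
```

    The reduced feature matrix `X[:, top]` would then be passed to the SVM classifier, which is the pipeline the abstract evaluates.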

  18. Approximate error conjugation gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
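
    The first embodiment can be sketched for a linear tomographic-style system in which "rays" are rows of a least-squares problem. The single gradient step with exact line search below is a deliberate simplification of a full constrained conjugate gradient iteration:

```python
import numpy as np

def subset_gradient_step(A, b, x, rows):
    """One minimization step that approximates the error using only a
    subset of rays (rows): compute the residual and gradient on the
    subset, then take an exact line-search step along the negative
    gradient of the subsampled objective 0.5*||A_s x - b_s||^2."""
    As, bs = A[rows], b[rows]
    r = As @ x - bs                         # approximate residual (subset only)
    g = As.T @ r                            # approximate gradient
    d = -g                                  # descent direction
    alpha = (g @ g) / (d @ (As.T @ (As @ d)) + 1e-12)  # exact line search
    return x + alpha * d

rng = np.random.default_rng(2)
A = rng.standard_normal((200, 5))           # full set of rays
x_true = rng.standard_normal(5)
b = A @ x_true                              # consistent measurements
x = np.zeros(5)
for _ in range(100):
    rows = rng.choice(200, size=60, replace=False)  # selected ray subset
    x = subset_gradient_step(A, b, x, rows)
err = np.linalg.norm(x - x_true)
print(err < 0.05)
```

    The point of the patent is the cost saving: each iteration touches 60 rays instead of all 200, yet the iterates still converge on this consistent system.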

  19. Spatial and Temporal Characteristics of Land Deformation in Northern Saudi Arabia: Inferences from Radar Interferometric Applications

    NASA Astrophysics Data System (ADS)

    Othman, A.; Sultan, M.; Gebremichael, E.; Sefry, S.; Yanar, R.; Alharbi, H.; Albalawi, S.; Emil, M. K.; Pankratz, H. G.

    2016-12-01

    Over the past two decades, land deformation phenomena and related losses in public and private property were reported from the northern part of the Kingdom of Saudi Arabia in the Al Jowf region (100,212 km²; lat: 29.25°N to 30.90°N, long: 37.60°E to 40.70°E). We applied an integrated approach (geotechnical, geological, remote sensing, geodetic, hydrogeological, and GIS) to identify areas affected by these phenomena, quantify the nature and magnitude of deformation, investigate the factors controlling the deformation, and recommend solutions for these problems. We applied a three-fold approach in three different areas (Alisawiyah, Wadi Alsarhan, and Sakaka) to accomplish the following: (1) assess the spatial distribution of land deformation and quantify deformation rates using the InSAR methods Persistent Scatterer Interferometry (PSI) and Small BAseline Subsets (SBAS); (2) generate a GIS database to encompass all relevant data and derived products (e.g., remote sensing, geology, geotechnical, GPS, groundwater extraction rates, distribution of urban areas, etc.); and (3) correlate findings from the InSAR exercise with relevant spatial and temporal datasets in search of causal effects.
Findings revealed the following: (1) high and consistent subsidence rates (5 to 13 mm/yr) from multiple interferometric techniques; (2) subsided areas correlated largely with the distribution of irrigated agricultural land over alluvial and unconfined aquifers (e.g., the Tawil and Jauf aquifers), areas characterized by high and progressively increasing groundwater extraction (1.2 bcm/yr) as evidenced by the satellite-based temporal distribution of irrigated lands (irrigated land area: 1998: 37,737 ha; 2013: 70,869 ha); (3) high subsidence rates (~8 mm/yr) were also detected over urban areas (e.g., Sakaka, Dumat Aljandal, and Tubarjal), the subsidence being caused by disposal of wastewater in the subsurface leading to a rise in water tables, dissolution of substrate rocks/sediments (e.g., carbonates, evaporites, sabkhas), and settling of buildings and infrastructure; (4) presence of numerous sinkholes within the Aruma formation (limestone and dolomite) that appear as incoherent circular domains (diameter range: 20 to 50 m) in the derived radar products within subsiding areas (subsidence rates: up to 6 mm/yr).

  20. InSAR Time Series Analysis and Geophysical Modeling of City Uplift Associated with Geothermal Drillings in Staufen im Breisgau, Germany

    NASA Astrophysics Data System (ADS)

    Motagh, M.; Lubitz, C.

    2014-12-01

    Geothermal energy is of increasing importance as an alternative, environmentally friendly technology for heat management. Direct interaction with the subsurface requires careful implementation, in particular in geologically complex regions. The historical city of Staufen im Breisgau, SW Germany, has attracted national attention as a case of implementation failure with severe consequences, causing debates on the applicability and safety of this sustainable technique. Located at the eastern transition zone between the Upper Rhine Graben and the Schwarzwald massif, the geothermal potential at Staufen is high due to strong temperature gradients. In September 2007, seven boreholes for geothermal probes were drilled up to a depth of 140 m to provide a new heat management system for the city hall. Within five years, an uplift phenomenon reaching more than 40 cm in places was observed in Staufen, and 269 buildings were damaged. Hydro-chemically driven anhydrite-gypsum transformation in the subsurface was identified as the cause, leading to a volume increase that is observable as surface uplift. This process is associated with the geothermal drilling activities, which crossed several groundwater levels. In this work, we summarize and present the findings of spaceborne Synthetic Aperture Radar Interferometry (InSAR) analysis of the uplift in Staufen over five years, from July 2008 through July 2013. By applying the Small Baseline Subset (SBAS) method, we find a localized elliptically shaped deformation field in NE-SW orientation. The area of maximum uplift is located 50 m NNE of the drilling zone. At this location, we observe a cumulative uplift of approx. 13.7 cm ± 0.34 cm (mean value within an area of 30 m by 30 m) from July 2008 to July 2009, which decreased to a cumulative uplift of 3 cm ± 0.25 cm from July 2012 to July 2013. The deceleration can be related to countermeasures applied, such as borehole sealing and groundwater pumping.
The observed ground surface response was compared with regularly performed leveling measurements and shows indications of significant symmetric horizontal motions, which were further investigated through a combined analysis of SAR imagery from ascending and descending orbits. Moreover, the InSAR observations were inverted using geophysical models to derive first-order characteristics of the deformation source at depth.

  1. Linking the Observation of Essential Variables to Societal Benefits

    NASA Astrophysics Data System (ADS)

    Sylak-Glassman, E.

    2017-12-01

    Different scientific communities have established sets of commonly agreed upon essential variables to help coordinate data collection in a variety of Earth observation areas. As an example, the World Meteorological Organization Global Climate Observing System has identified 50 Essential Climate Variables (ECVs), such as sea-surface temperature and carbon dioxide, which are required to monitor the climate and to detect and attribute climate change. In addition to supporting climate science, measuring these ECVs delivers many types of societal benefits, ranging from disaster mitigation to agricultural productivity to human health. While communicating the value of maintaining and improving observational records for these variables has been a challenge, quantifying how the measurement of these ECVs results in the delivery of many different societal benefits may help support their continued measurement. The 2016 National Earth Observation Assessment (EOA 2016) quantified the impact of individual Earth observation systems, sensors, networks, and surveys (Earth observation systems, for short) on the achievement of 217 Federal objectives in 13 societal benefit areas (SBAs). This study will demonstrate the use of the EOA 2016 dataset to show the different Federal objectives and SBAs that are impacted by the Earth observation systems used to measure ECVs. Describing how the measurements from these Earth observation systems are used not only to maintain the climate record but also to meet additional Federal objectives may help make the case for continued measurement of the ECVs. This study will act as a pilot for using the EOA 2016 dataset to map between the measurements required to observe additional sets of variables, such as the Essential Ocean Variables and Essential Biodiversity Variables, and the ability to achieve a variety of societal benefits.

  2. A regional multilevel analysis: can skilled birth attendants uniformly decrease neonatal mortality?

    PubMed

    Singh, Kavita; Brodish, Paul; Suchindran, Chirayath

    2014-01-01

    Globally, 40% of deaths of children under five occur in the very first month of life, with three-quarters of these deaths occurring during the first week of life. Delivery with a skilled birth attendant (SBA) is being promoted as a strategy to reduce neonatal mortality. This study explored whether SBAs had a protective effect against neonatal mortality in three different regions of the world. The analysis pooled data from nine diverse countries for which recent Demographic and Health Survey data were available. Multilevel logistic regression was used to understand the influence of skilled delivery on two outcomes: neonatal mortality during the first week of life and during the first day of life. Control variables included age, parity, education, wealth, residence (urban/rural), geographic region (Africa, Asia, and Latin America/Caribbean), antenatal care, and tetanus immunization. The direction of the effect of skilled delivery on neonatal mortality depended on geographic region. While having an SBA at delivery was protective against neonatal mortality in Latin America/Caribbean, in Asia there was only a protective effect for deaths in the first week of life. In Africa, SBAs were associated with higher neonatal mortality for both outcomes, and the same was true for deaths on the first day of life in Asia. Many women in Africa and Asia deliver at home unless a complication occurs, and thus skilled birth attendants may be seeing more women with complications than their unskilled counterparts. In addition, there are issues with the definition of an SBA, with many attendants in both Africa and Asia not actually having the training and equipment needed to prevent neonatal mortality. Considerable investment is needed in training and health infrastructure to enable these providers to save the youngest lives.

  3. The exploitation of large archives of space-borne C-band SAR data in the framework of FP7-DORIS Project

    NASA Astrophysics Data System (ADS)

    Del Ventisette, Chiara; Ciampalini, Andrea

    2013-04-01

    DORIS (Ground Deformations Risk Scenarios: an Advanced Assessment Service) is an advanced downstream service project within the Seventh Framework Programme of the European Commission. A European team was set up to bring together the most advanced research and technology outcomes in the field of Earth Observation (EO) for the improvement of risk management. The aim of the DORIS project is the development of new methodologies for the detection, mapping, monitoring, and forecasting of ground deformations. DORIS integrates traditional and innovative EO and ground-based (non-EO) data to improve our understanding of the complex phenomena, at different temporal and spatial scales and in various physiographic and environmental settings, that result in ground deformations, including landslides and ground subsidence, for civil protection purposes. One of the goals of the DORIS project is the exploitation of large data archives for geohazard mapping. In this work, the existing ESA Synthetic Aperture Radar (SAR) archives operating in the microwave C-band (data collected by the ERS-1/2 and ENVISAT satellites) were analysed with new algorithms developed to reconstruct long time series (almost 20 years), and the preliminary results obtained are presented. The algorithms are based on the Small BAseline Subset technique (SBAS; developed by CNR-IREA), ERS-ENVISAT Stitching (T.R.E.), the Stable Point Network (SPN; Altamira), and ERS-ENVISAT Interferometric Point Target Analysis (IPTA; Gamma). The potential of these algorithms was evaluated at selected test sites characterized by different ground deformation phenomena (landslides and/or subsidence): i) Central Umbria (Italy); ii) Messina Province (Italy); iii) Rácalmás (Hungary); iv) Silesian Coal Basin (Poland); v) Tramuntana Range (Mallorca, Spain); and vi) St. Moritz (Switzerland). 
The results demonstrate the usefulness of the implemented algorithms, although in some cases there is a loss of coherent points, especially in the most unstable areas.

  4. InSAR to support sustainable urbanization over compacting aquifers: The case of Toluca Valley, Mexico

    NASA Astrophysics Data System (ADS)

    Castellazzi, Pascal; Garfias, Jaime; Martel, Richard; Brouard, Charles; Rivera, Alfonso

    2017-12-01

    This paper illustrates how InSAR alone can be used to delineate potential ground fractures related to aquifer system compaction. An InSAR-derived ground fracturing map of the Toluca Valley, Mexico, is produced and validated through a field campaign. The results are of great interest for supporting sustainable urbanization and show that InSAR processing of open-access Synthetic Aperture Radar (SAR) data from the Sentinel-1 satellites can lead to reliable and cost-effective products directly usable by cities to help decision-making. The Toluca Valley Aquifer (TVA) sustains the water needs of two million inhabitants living within the valley, a growing industry, an intensively irrigated agricultural area, and 38% of the water needs of the megalopolis of Mexico City, located 40 km east of the valley. Ensuring water sustainability and infrastructure integrity, along with supporting the important economic and demographic growth of the region, is a major challenge for water managers and urban developers. This paper presents a long-term analysis of ground fracturing by interpreting 13 years of InSAR-derived ground displacement measurements. Small Baseline Subset (SBAS) and Persistent Scatterer Interferometry (PSI) techniques are applied over three SAR datasets totalling 93 acquisitions from the Envisat, Radarsat-2, and Sentinel-1A satellites, covering the period from 2003 to 2016. Over this period, groundwater level declines of up to 1.6 m/yr, land subsidence of up to 77 mm/yr, and major infrastructure damage are observed. Groundwater level data show highly variable seasonal responses according to their connectivity to recharge areas. However, the trend of groundwater levels consistently ranges from -0.5 to -1.5 m/yr regardless of well location and depth. By analysing the horizontal gradients of vertical land subsidence, we provide a potential ground fracture map to assist future urban development planning in the Toluca Valley.
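
    The horizontal-gradient analysis mentioned at the end can be sketched on a toy raster; the grid values and pixel size below are made up for illustration:

```python
import numpy as np

def horizontal_gradient(subsidence, pixel_size_m):
    """Magnitude of the horizontal gradient of a vertical-displacement
    raster. High gradients flag zones of differential subsidence where
    ground fractures are most likely to develop."""
    gy, gx = np.gradient(subsidence, pixel_size_m)   # d/dy (rows), d/dx (cols)
    return np.hypot(gx, gy)

# Toy 5x5 subsidence raster (mm/yr) with a sharp step: a stable block
# west of a hypothetical fracture and a subsiding block east of it.
field = np.zeros((5, 5))
field[:, 3:] = -60.0
grad = horizontal_gradient(field, pixel_size_m=30.0)
print(grad.argmax(axis=1))    # steepest gradient lines up along the step
```

    On real data, thresholding `grad` yields the candidate fracture polygons that a field campaign can then validate, which is the workflow the abstract describes.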

  5. Surface Deformation in Quetta Valley, Balochistan, Pakistan

    NASA Astrophysics Data System (ADS)

    Huang, J.; Shuhab, K.; Wulamu, A.; Crupa, W.; Khan, A. S.; Kakar, D. M.; Kasi, A.

    2015-12-01

    In February 2011, several ground fissures up to ~1.8 km in length appeared in the Quetta Valley, Balochistan, Pakistan. It is not clear what caused the sudden occurrence of these fissures. The region is tectonically active and bounded to the west by several regional strike-slip faults, including the north-south striking, left-lateral Chaman fault system, which slips at ~10 mm per year. Several large earthquakes have occurred recently in this area; one fatal magnitude (Mw) 6.4 earthquake occurred on October 28, 2008. Some parts of the Quetta Valley are subsiding; GPS data from two stations in Quetta spanning mid-2006 to 2009 recorded subsidence rates of ~10 cm per year. Although subsidence in urban areas is generally attributed to groundwater depletion, it is not clear whether the ground fissures are caused by water withdrawal or related to the tectonics of the region. This study is designed to quantify and assess the source of surface deformation in the Quetta Valley using InSAR, GPS, seismic, and earthquake centroid moment tensor data. To detect and map the spatial-temporal features of the processes that led to the surface deformation, we used two time series: 15 European Remote Sensing (ERS-1/2) satellite images from 1992 to 1999 and 27 ENVISAT images spanning 2003 to 2010. A Differential Interferometric Synthetic Aperture Radar (DInSAR) Small Baseline Subset (SBAS) technique was used to investigate surface deformation. Eleven continuous-GPS stations within the InSAR antenna footprint were compared with the InSAR time series for quality control. Preliminary InSAR results revealed that the areas in and around the fissures are subsiding at 5 cm per year. Five seismic lines totaling ~60 km, acquired in 2003, were used to interpret faults beneath the Holocene alluvium in the Quetta Valley. One of the blind faults is a north-south striking thrust fault mapped north into the Takatu range.
However, a focal mechanism for the 2008 earthquake in this region indicated northwest-southeast strike-slip fault movement.

  6. Deformation signals from InSAR time series analysis related to the 2007 and 2011 east rift zone intrusions at Kilauea Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Baker, S.; Amelung, F.

    2011-12-01

    Located on the Big Island of Hawaii, Kilauea volcano is one of the most active volcanoes on Earth, with continuous eruptive activity since 1983. The eruptive activity is predominantly from the Pu'u O'o vent within the east rift zone, but periodic intrusions occur in the upper east rift zone between the summit and Pu'u O'o. These intrusions occur as dikes, typically accompanied by fissure openings and eruptions of small volumes of lava. Interferometric synthetic aperture radar (InSAR) provides surface displacement measurements showing how the ground moves before, during, and after these intrusions. Given the recent increase in the number of active or planned SAR satellites and the more frequent repeat-pass times, InSAR is proving to be a valuable monitoring tool for volcanic hazards. Using data from the Radarsat-1, Envisat, ALOS, and TerraSAR-X satellites, we generate line-of-sight InSAR time series using the small baseline subset (SBAS) method, which provides dense spatial and temporal coverage at Kilauea covering the 17 June 2007 and 5 March 2011 intrusions. For these two events, the summit caldera area switches from deflation to inflation months to years before both intrusions, and just prior to the intrusions we observe increased rates of inflation accompanied by elevated seismic activity in the upper east rift zone. Observations relating the surface displacement of the intrusions to the response of the summit caldera area provide insight into the shallow magmatic system and its connectivity. By combining InSAR time series with other geophysical data sets (such as seismic or GPS), we obtain more details about the associated hazard and a better understanding of the time-dependent relationship between what we are measuring and the controlling processes at the volcano.
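
    The SBAS inversion at the core of such time-series analyses can be sketched for a single pixel. The epochs, pairs, and phases below are synthetic, and the SVD-based L2 inversion shown is the classical SBAS formulation rather than this study's exact processing chain:

```python
import numpy as np

def sbas_invert(pairs, times, dphi):
    """Minimal single-pixel SBAS inversion: the unknowns are mean
    velocities between consecutive acquisitions, and each small-baseline
    interferogram constrains their time-weighted sum. The SVD-based
    pseudo-inverse returns the minimum-norm (L2) solution, which is
    what links otherwise independent baseline subsets."""
    B = np.zeros((len(pairs), len(times) - 1))
    for i, (m, s) in enumerate(pairs):        # (earlier, later) acquisition indices
        for k in range(m, s):
            B[i, k] = times[k + 1] - times[k] # durations of intervals spanned
    return np.linalg.pinv(B) @ dphi           # velocity per interval

times = np.array([0.0, 0.1, 0.25, 0.4])       # acquisition epochs (years)
v_true = np.array([10.0, 10.0, 10.0])         # steady deformation, mm/yr
pairs = [(0, 1), (1, 2), (0, 2), (2, 3)]      # small-baseline pairs
# synthetic interferogram values (unwrapped, already converted to mm)
dphi = np.array([sum((times[k + 1] - times[k]) * v_true[k] for k in range(m, s))
                 for m, s in pairs])
v = sbas_invert(pairs, times, dphi)
print(np.round(v, 6))                         # recovers the interval velocities
```

    Integrating the recovered velocities over time gives the displacement time series plotted in SBAS studies such as this one.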

  7. A Simple Joint Estimation Method of Residual Frequency Offset and Sampling Frequency Offset for DVB Systems

    NASA Astrophysics Data System (ADS)

    Kwon, Ki-Won; Cho, Yongsoo

    This letter presents a simple joint estimation method for the residual frequency offset (RFO) and sampling frequency offset (SFO) in OFDM-based digital video broadcasting (DVB) systems. The proposed method selects a continual pilot (CP) subset from an unsymmetrically and non-uniformly distributed CP set to obtain an unbiased estimator. Simulation results show that the proposed method using a properly selected CP subset is unbiased and performs robustly.
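
    The underlying estimation problem can be sketched with the textbook OFDM model, in which the inter-symbol phase rotation at pilot subcarrier k is affine in k (intercept from the RFO, slope from the SFO). This assumed model is for illustration only and is not the letter's specific estimator:

```python
import numpy as np

def estimate_rfo_sfo(pilot_idx, phase):
    """Joint least-squares estimate of RFO and SFO from the phase
    rotation measured at continual-pilot subcarriers: fit the line
    phase(k) = intercept + slope * k, where the intercept carries the
    RFO term and the slope carries the SFO term."""
    A = np.vstack([np.ones_like(pilot_idx, dtype=float), pilot_idx]).T
    (intercept, slope), *_ = np.linalg.lstsq(A, phase, rcond=None)
    return intercept, slope

k = np.array([-40.0, -25.0, -10.0, 5.0, 20.0, 45.0])  # unsymmetric CP subset
rfo, sfo = 0.02, 1e-4                 # assumed normalised offsets
phase = 2 * np.pi * (rfo + sfo * k)   # noise-free measured rotations
est_rfo, est_sfo = estimate_rfo_sfo(k, phase)
print(round(est_rfo / (2 * np.pi), 6), round(est_sfo / (2 * np.pi), 9))
```

    With a symmetric pilot set the two terms decouple; the letter's contribution is choosing a CP subset from the asymmetric DVB pilot layout so that the joint estimate stays unbiased.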

  8. Comparison between flowcytometry and immunoperoxidase staining for the enumeration of lymphocyte subsets.

    PubMed

    Dhaliwal, J S; Malar, B; Quck, C K; Sukumaran, K D; Hassan, K

    1991-06-01

    Immunoperoxidase staining was compared with flow cytometry for the enumeration of lymphocyte subsets. The percentages obtained for peripheral blood lymphocytes using immunoperoxidase (CD3 = 76, CD4 = 27.9, B = 10.7, CD4/CD8 = 1.8) differed significantly from those obtained by flow cytometry (CD3 = 65.7, CD4 = 39.4, CD8 = 25.6, B = 16.7, HLA-DR = 11.9, CD4/CD8 = 1.54) for certain subsets (CD3, CD4, B). There was no significant difference in lymphocyte subsets between children and adults using the same method. These differences are probably due to the different methods used to prepare lymphocytes for analysis. Other factors that should also be considered are the presence of the CD4 antigen on monocytes and of CD8 on natural killer cells.

  9. UNCLES: method for the identification of genes differentially consistently co-expressed in a specific subset of datasets.

    PubMed

    Abu-Jamous, Basel; Fa, Rui; Roberts, David J; Nandi, Asoke K

    2015-06-04

    Collective analysis of the increasingly emerging gene expression datasets is required. The recently proposed binarisation of consensus partition matrices (Bi-CoPaM) method can combine clustering results from multiple datasets to identify, in a tuneable manner, the subsets of genes which are consistently co-expressed in all of the provided datasets. However, results validation and parameter setting are issues that complicate the design of such methods. Moreover, although it is common practice to test methods by application to synthetic datasets, the mathematical models used to synthesise such datasets are usually based on approximations which may not always be sufficiently representative of real datasets. Here, we propose an unsupervised method for the unification of clustering results from multiple datasets using external specifications (UNCLES). This method has the ability to identify the subsets of genes consistently co-expressed in a subset of datasets while being poorly co-expressed in another subset of datasets, and to identify the subsets of genes consistently co-expressed in all given datasets. We also propose the M-N scatter plots validation technique and adopt it to set the parameters of UNCLES, such as the number of clusters, automatically. Additionally, we propose an approach for the synthesis of gene expression datasets using real data profiles in a way which combines the ground-truth knowledge of synthetic data with the realistic expression values of real data, and therefore overcomes the problem of faithfulness of synthetic expression data modelling. By application to those datasets, we validate UNCLES while comparing it with other conventional clustering methods and, of particular relevance, biclustering methods. We further validate UNCLES by application to a set of 14 real genome-wide yeast datasets, as it produces focused clusters that conform well to known biological facts. 
Furthermore, in-silico-based hypotheses regarding the function of a few previously unknown genes in those focused clusters are drawn. The UNCLES method, the M-N scatter plots technique, and the expression data synthesis approach will have wide application for the comprehensive analysis of genomic and other sources of multiple complex biological datasets. Moreover, the derived in-silico-based biological hypotheses represent subjects for future functional studies.

  10. Systems and methods for data quality control and cleansing

    DOEpatents

    Wenzel, Michael; Boettcher, Andrew; Drees, Kirk; Kummer, James

    2016-05-31

    A method for detecting and cleansing suspect building automation system data is shown and described. The method includes using processing electronics to automatically determine which of a plurality of error detectors and which of a plurality of data cleansers to use with building automation system data. The method further includes using processing electronics to automatically detect errors in the data and cleanse the data using a subset of the error detectors and a subset of the cleansers.
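
    The detector/cleanser subset idea can be sketched as follows. The "automatic" selection rule and the example detectors are assumptions made for illustration, not the patent's actual logic:

```python
from typing import Callable, List

def cleanse(data: List[float],
            detectors: List[Callable[[float], bool]],
            cleansers: List[Callable[[float], float]]) -> List[float]:
    """Automatically pick which error detector / data cleanser pairs to
    apply, then run only that subset over the data stream. Stub rule:
    a pair is active only if its detector fires on at least one sample."""
    active = [(d, c) for d, c in zip(detectors, cleansers)
              if any(d(x) for x in data)]
    out = []
    for x in data:
        for d, c in active:
            if d(x):
                x = c(x)     # cleanse only samples the detector flags
        out.append(x)
    return out

is_spike = lambda x: abs(x) > 100          # detector: out-of-range reading
clip = lambda x: 100.0 if x > 100 else -100.0
is_nanlike = lambda x: x != x              # detector: NaN (never fires here)
zero_fill = lambda x: 0.0
cleaned = cleanse([21.0, 250.0, 22.5], [is_spike, is_nanlike], [clip, zero_fill])
print(cleaned)
```

    Because the NaN detector never fires on this stream, its cleanser is excluded from the active subset, mirroring the patent's idea of selecting only the detectors and cleansers the data actually needs.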

  11. Two-stage atlas subset selection in multi-atlas based image segmentation.

    PubMed

    Zhao, Tingting; Ruan, Dan

    2015-06-01

    Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs of a large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient value from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. 
The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
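
    The two-stage scheme can be sketched with made-up relevance scores standing in for the registration-based metrics (higher means more similar to the target image):

```python
import numpy as np

def two_stage_select(cheap_scores, expensive_score_fn, m_augmented, n_fusion):
    """Two-stage atlas subset selection, sketched: stage 1 keeps the
    m_augmented best atlases under a cheap relevance metric (low-cost
    registration); stage 2 re-ranks only those survivors with the
    expensive metric (full-fledged registration) and returns the final
    fusion set of size n_fusion."""
    stage1 = np.argsort(cheap_scores)[::-1][:m_augmented]   # cheap pre-screen
    refined = {i: expensive_score_fn(i) for i in stage1}    # costly, but few calls
    stage2 = sorted(refined, key=refined.get, reverse=True)[:n_fusion]
    return sorted(stage2)

cheap = np.array([0.2, 0.9, 0.8, 0.1, 0.7, 0.85])   # made-up preliminary scores
true = {0: 0.1, 1: 0.95, 2: 0.6, 3: 0.2, 4: 0.9, 5: 0.5}  # made-up refined scores
# The expensive metric is evaluated on only 4 of the 6 atlases.
selected = two_stage_select(cheap, lambda i: true[i], m_augmented=4, n_fusion=2)
print(selected)
```

    The computational saving is exactly the difference between evaluating the expensive metric on all atlases versus only on the augmented subset; the paper's inference model sizes that subset so the truly relevant atlases survive stage 1 with high probability.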

  12. Monitoring ground subsidence in Shanghai maglev area using two kinds of SAR data

    NASA Astrophysics Data System (ADS)

    Wu, Jicang; Zhang, Lina; Chen, Jie; Li, Tao

    2012-11-01

    The Shanghai maglev is a very fast means of transport, so the stability of its roadbed is critical. However, ground subsidence is a problem in Shanghai because of the poor geological conditions and human-induced factors. It is therefore necessary to monitor ground subsidence in the area along the Shanghai maglev precisely and frequently. Traditionally, precise levelling is used to survey along the track; it is expensive and time consuming, and can only provide ground subsidence information at sparse benchmarks. Recently, the small baseline differential SAR technique has played a valuable part in monitoring ground subsidence, as it can extract ground subsidence information with high spatial resolution over a wide area. In this paper, L-band ALOS PALSAR data and C-band Envisat ASAR data are used to extract ground subsidence information with the SBAS method in the Shanghai maglev area. The results show that the general patterns of ground subsidence from InSAR processing of the two bands of SAR images are similar. Both results show that there is no significant ground subsidence on the maglev line. Near the railway line, there are a few places with subsidence rates of about -20 mm/yr or even more, such as Chuansha town and the junction of the maglev and Waihuan road.

  13. Biochemical Sensors Using Carbon Nanotube Arrays

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya (Inventor); Cassell, Alan M. (Inventor); Li, Jun (Inventor)

    2011-01-01

    Method and system for detecting the presence of biomolecules in a selected subset, or in each of several selected subsets, in a fluid. Each of an array of two or more carbon nanotubes ("CNTs") is connected at a first CNT end to one or more electronics devices, each of which senses a selected electrochemical signal that is generated when a target biomolecule in the selected subset becomes attached to a functionalized second end of the CNT, which is covalently bonded with a probe molecule. This approach indicates when target biomolecules in the selected subset are present and indicates the presence or absence of target biomolecules in two or more selected subsets. Alternatively, the presence or absence of an analyte can be detected.

  14. Two-stage atlas subset selection in multi-atlas based image segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu

    2015-06-15

    Purpose: Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation, but also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs that arise with large atlas collections of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. Methods: An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with a significant reduction in computation. Compared with the alternative computation reduction method, the proposed scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance.
Conclusions: The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
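The two-stage selection logic described in this record can be sketched as a cheap prefilter followed by an expensive rescoring pass. This is a minimal illustration, not the authors' implementation; the score functions and subset sizes below are toy stand-ins for the preliminary and refined relevance metrics:

```python
def two_stage_select(atlases, cheap_score, full_score, m_augmented, n_fusion):
    # Stage 1: rank all atlases with a low-cost relevance metric and keep
    # the top m_augmented (the "augmented subset").
    stage1 = sorted(atlases, key=cheap_score, reverse=True)[:m_augmented]
    # Stage 2: rescore only the survivors with the expensive
    # (full-registration) metric and keep the top n_fusion.
    return sorted(stage1, key=full_score, reverse=True)[:n_fusion]

# Toy scores: the cheap metric is a noisy version of the full metric.
atlases = list(range(10))
full = {a: a * 0.1 for a in atlases}
cheap = {a: full[a] + (0.25 if a % 3 == 0 else 0.0) for a in atlases}
chosen = two_stage_select(atlases, cheap.get, full.get, m_augmented=5, n_fusion=2)
```

Only the five stage-1 survivors ever incur the expensive scoring, which is the source of the computation reduction.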

  15. Numerical algorithms for scatter-to-attenuation reconstruction in PET: empirical comparison of convergence, acceleration, and the effect of subsets.

    PubMed

    Berker, Yannick; Karp, Joel S; Schulz, Volkmar

    2017-09-01

    The use of scattered coincidences for attenuation correction of positron emission tomography (PET) data has recently been proposed. For practical applications, convergence speeds require further improvement, yet there exists a trade-off between convergence speed and the risk of non-convergence. In this respect, a maximum-likelihood gradient-ascent (MLGA) algorithm and a previously proposed two-branch back-projection (2BP) were evaluated. MLGA was combined with the Armijo step size rule and accelerated using conjugate gradients, Nesterov's momentum method, and data subsets of different sizes. In 2BP, we varied the subset size, an important determinant of convergence speed and computational burden. We used three sets of simulation data to evaluate the impact of a spatial scale factor. The Armijo step size rule allowed 10-fold larger step sizes than native MLGA. Conjugate gradients and Nesterov momentum led to slightly faster, yet non-uniform convergence; improvements were mostly confined to later iterations, possibly due to the non-linearity of the problem. MLGA with data subsets achieved faster, uniform, and predictable convergence, with a speed-up factor equivalent to the number of subsets and no increase in computational burden. By contrast, 2BP computational burden increased linearly with the number of subsets due to repeated evaluation of the objective function, and convergence was limited to the case of many (and therefore small) subsets, which resulted in high computational burden. Possibilities of improving 2BP appear limited. While general-purpose acceleration methods appear insufficient for MLGA, the results suggest that data subsets are a promising way of improving MLGA performance.
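The subset acceleration reported for MLGA can be illustrated with a toy gradient-ascent loop: splitting the data into n subsets yields n parameter updates per pass over the data, which is where the speed-up factor equal to the number of subsets comes from. The Gaussian-mean log-likelihood below is a hypothetical stand-in for the actual PET objective:

```python
def subset_gradient_ascent(data, lr=0.1, n_subsets=4, epochs=20):
    """Toy ordered-subsets gradient ascent on a Gaussian-mean log-likelihood."""
    theta = 0.0
    # Interleaved split so each subset sees a representative slice of the data.
    subsets = [data[i::n_subsets] for i in range(n_subsets)]
    for _ in range(epochs):
        for sub in subsets:
            # Gradient of the log-likelihood w.r.t. the mean, computed on
            # the current subset only.
            grad = sum(x - theta for x in sub) / len(sub)
            theta += lr * grad  # n_subsets updates per pass over the data
    return theta

est = subset_gradient_ascent([1.8, 2.1, 2.0, 1.9, 2.2, 2.0, 1.95, 2.05])
```

Each epoch costs one pass over the data regardless of the number of subsets, so the extra updates come for free, consistent with the "no increase in computational burden" finding.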

  16. Characteristics of CD8+ T cell subsets in Chinese patients with chronic HIV infection during initial ART

    PubMed Central

    2011-01-01

    Background CD8+ T cells may play an important role in protecting against HIV. However, the changes in CD8+ T cell subsets during the early period of ART have not been fully studied. Methods Twenty-one asymptomatic, treatment-naïve HIV-infected patients with CD4+ T cell counts below 350 cells/μl were enrolled in the study. Naïve, central memory (CM), effector memory (EM) and terminally differentiated effector (EMRA) CD8+ T cell subsets, along with their activation and proliferation markers, were evaluated in blood samples collected at baseline and at weeks 2, 4, 8 and 12 of ART. Results Total CD8+ T cells declined, while the naïve and CM subsets tended to increase. Activation levels of all CD8+ T cell subsets except the EMRA subset decreased after ART. However, proliferation levels of total CD8+ T cells and of the EMRA, EM and CM subsets increased during the first 4 weeks of ART and then decreased, whereas the proliferation level of naïve cells decreased after ART. Conclusion The changes in CD8+ T cell subsets during initial ART are complex. Our results present a complete phenotypic picture of CD8+ T cell subsets during initial ART and provide insights into immune status during ART. PMID:21435275

  17. Detection of Nitrogen Content in Rubber Leaves Using Near-Infrared (NIR) Spectroscopy with Correlation-Based Successive Projections Algorithm (SPA).

    PubMed

    Tang, Rongnian; Chen, Xupeng; Li, Chuang

    2018-05-01

    Near-infrared spectroscopy is an efficient, low-cost technology with potential as an accurate method for detecting the nitrogen content of natural rubber leaves. The successive projections algorithm (SPA) is a widely used variable selection method for multivariate calibration, which uses projection operations to select a variable subset with minimum multi-collinearity. However, due to fluctuations in the correlation between variables, high collinearity may still exist among non-adjacent variables of the subset obtained by basic SPA. Based on an analysis of the correlation matrix of the spectral data, this paper proposes a correlation-based SPA (CB-SPA) that applies the successive projections algorithm within regions of consistent correlation. The results show that CB-SPA selects variable subsets with more valuable variables and less multi-collinearity. Meanwhile, models established from the CB-SPA subset outperform those from basic SPA subsets in predicting nitrogen content, in terms of both cross-validation and external prediction. Moreover, CB-SPA is more efficient, as the time cost of its selection procedure is one-twelfth that of basic SPA.
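The projection step at the heart of SPA can be sketched in a few lines: at each round, the remaining candidate columns are projected onto the orthogonal complement of the most recently selected column, so the next pick has minimal collinearity with those already chosen. This is the basic SPA only, not the correlation-based variant proposed in the record:

```python
def spa_select(X, k):
    # X: list of rows (samples x variables). Greedily pick the column with
    # the largest residual norm, then project all remaining columns onto
    # the orthogonal complement of the pick; repeat k times.
    ncols = len(X[0])
    cols = [[row[j] for row in X] for j in range(ncols)]
    chosen = []
    for _ in range(k):
        norms = [sum(v * v for v in c) for c in cols]
        for j in chosen:
            norms[j] = -1.0  # exclude already-selected columns
        j_best = max(range(ncols), key=lambda j: norms[j])
        chosen.append(j_best)
        b = cols[j_best]
        bb = sum(v * v for v in b)
        for j in range(ncols):
            if j in chosen:
                continue
            coef = sum(cols[j][i] * b[i] for i in range(len(b))) / bb
            cols[j] = [cols[j][i] - coef * b[i] for i in range(len(b))]
    return chosen

# Columns 0 and 1 are identical (fully collinear); column 2 is orthogonal.
chosen_vars = spa_select([[1, 1, 0], [1, 1, 0], [0, 0, 1]], 2)
```

With a duplicated column in the data, SPA picks one copy and then skips the other entirely, since its projection residual is zero.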

  18. A non-linear data mining parameter selection algorithm for continuous variables

    PubMed Central

    Razavi, Marianne; Brady, Sean

    2017-01-01

    In this article, we propose a new data mining algorithm that both captures the non-linearity in data and finds the best subset model. To produce an enhanced subset of the original variables, a preferred selection method should have the potential of adding a supplementary level of regression analysis that captures complex relationships in the data via mathematical transformation of the predictors and exploration of synergistic effects of combined variables. The method presented here has the potential to produce an optimal subset of variables, rendering the overall process of model selection more efficient. The algorithm introduces interpretable parameters by transforming the original inputs and also provides a faithful fit to the data. The core objective of this paper is to introduce a new estimation technique for the classical least squares regression framework. This new automatic variable transformation and model selection method can offer an optimal and stable model that minimizes the mean square error and variability, while combining all-possible-subsets selection methodology with the inclusion of variable transformations and interactions. Moreover, the method controls multicollinearity, leading to an optimal set of explanatory variables. PMID:29131829

  19. Using learning automata to determine proper subset size in high-dimensional spaces

    NASA Astrophysics Data System (ADS)

    Seyyedi, Seyyed Hossein; Minaei-Bidgoli, Behrouz

    2017-03-01

    In this paper, we offer a new method called FSLA (Finding the best candidate Subset using Learning Automata), which combines the filter and wrapper approaches for feature selection in high-dimensional spaces. Given the difficulty of dimension reduction in such spaces, FSLA's multi-objective goal is to determine, efficiently, a feature subset that strikes an appropriate trade-off between the learning algorithm's accuracy and its efficiency. First, using an existing weighting function, the feature list is sorted and candidate subsets of different sizes are taken from the top of the list. Then, a learning automaton evaluates the performance of each subset when it is used as the input space of the learning algorithm, estimating its fitness from the algorithm's accuracy and the subset size, which determines the algorithm's efficiency. Finally, FSLA reports the fittest subset as the best choice. We tested FSLA in the framework of text classification; the results confirm its promising performance in attaining the identified goal.

  20. VARIABLE SELECTION FOR QUALITATIVE INTERACTIONS IN PERSONALIZED MEDICINE WHILE CONTROLLING THE FAMILY-WISE ERROR RATE

    PubMed Central

    Gunter, Lacey; Zhu, Ji; Murphy, Susan

    2012-01-01

    For many years, subset analysis has been a popular topic for the biostatistics and clinical trials literature. In more recent years, the discussion has focused on finding subsets of genomes which play a role in the effect of treatment, often referred to as stratified or personalized medicine. Though highly sought after, methods for detecting subsets with altering treatment effects are limited and lacking in power. In this article we discuss variable selection for qualitative interactions with the aim to discover these critical patient subsets. We propose a new technique designed specifically to find these interaction variables among a large set of variables while still controlling for the number of false discoveries. We compare this new method against standard qualitative interaction tests using simulations and give an example of its use on data from a randomized controlled trial for the treatment of depression. PMID:22023676

  1. Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan

    2017-06-01

    Obtaining accurate information on rock mass discontinuities is important for deformation analysis and the evaluation of rock mass stability, but measurements in high and steep zones are difficult to obtain with the traditional compass method. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have therefore gradually become mainstream. In this study, a method based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The point cloud is then segmented into different point subsets. Various parameters, such as the normal, dip direction and dip, can be calculated for each point subset after obtaining the equation of the best-fit plane for that subset. A cluster analysis (where a point subset that satisfies certain conditions forms a cluster) is performed on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system was developed based on this method to extract rock discontinuity points from a 3D point cloud. A comparison with existing software shows that the method is feasible and can serve as a reference for rock mechanics, 3D geological modelling and other related fields.
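Once a point subset has a best-fit plane, its dip direction and dip follow directly from the plane normal. The sketch below uses three points spanning an exact plane as a stand-in for the least-squares fit of a whole point subset:

```python
import math

def plane_orientation(p0, p1, p2):
    # Normal of the plane through three points, converted to the
    # dip direction / dip angle convention used in discontinuity analysis
    # (x = east, y = north, z = up).
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],          # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    if n[2] < 0:                             # make the normal point upward
        n = [-c for c in n]
    norm = math.sqrt(sum(c * c for c in n))
    dip = math.degrees(math.acos(n[2] / norm))            # angle from horizontal
    dip_dir = math.degrees(math.atan2(n[0], n[1])) % 360  # azimuth of steepest descent
    return dip_dir, dip

# A plane z = -x: dips 45 degrees toward the east (azimuth 90).
dd, dip = plane_orientation((0, 0, 0), (1, 0, -1), (0, 1, 0))
```

Clustering the resulting normal vectors (e.g. with FCM, as in the record) then groups planes of similar orientation into discontinuity sets.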

  2. The Cross-Entropy Based Multi-Filter Ensemble Method for Gene Selection.

    PubMed

    Sun, Yingqiang; Lu, Chengbo; Li, Xiaobo

    2018-05-17

    The gene expression profile is characterized by high dimensionality, small sample size, and continuous values, and using gene expression profile data to classify tumor samples is a great challenge. This paper proposes a cross-entropy based multi-filter ensemble (CEMFE) method for microarray data classification. First, multiple filters are applied to the microarray data to obtain several pre-selected feature subsets with different classification abilities, and the top N genes of each subset are merged to form a new data set. Second, a cross-entropy algorithm is used to remove redundant data from this set. Finally, a wrapper method based on forward feature selection is used to select the best feature subset. The experimental results show that the proposed method is more efficient than other gene selection methods and achieves higher classification accuracy with fewer characteristic genes.
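The first CEMFE stage, merging the top-N genes of each filter's ranking, amounts to an order-preserving union. A minimal sketch, with hypothetical gene names:

```python
def union_top_n(rankings, n):
    # rankings: one ordered gene list per filter (best first).
    # Take the top-n of each and merge, preserving first-seen order.
    merged = []
    for ranking in rankings:
        for gene in ranking[:n]:
            if gene not in merged:
                merged.append(gene)
    return merged

merged = union_top_n([["g3", "g1", "g7"],
                      ["g1", "g5", "g2"],
                      ["g7", "g5", "g9"]], n=2)
```

The merged set then goes to the cross-entropy redundancy-removal stage and finally to the forward-selection wrapper.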

  3. Ovarian phagocyte subsets and their distinct tissue distribution patterns.

    PubMed

    Carlock, Colin; Wu, Jean; Zhou, Cindy; Ross, April; Adams, Henry; Lou, Yahuan

    2013-01-01

    Ovarian macrophages, which play critical roles in various ovarian events, are probably derived from multiple lineages. Thus, a systemic classification of their subsets is a necessary first step for determination of their functions. Utilizing antibodies to five phagocyte markers, i.e. IA/IE (major histocompatibility complex class II), F4/80, CD11b (Mac-1), CD11c, and CD68, this study investigated subsets of ovarian phagocytes in mice. Three-color immunofluorescence and flow cytometry, together with morphological observation on isolated ovarian cells, demonstrated complicated phenotypes of ovarian phagocytes. Four macrophage and one dendritic cell subset, in addition to many minor phagocyte subsets, were identified. A dendritic cell-like population with a unique phenotype of CD11c(high)IA/IE⁻F4/80⁻ was also frequently observed. A preliminary age-dependent study showed dramatic increases in IA/IE⁺ macrophages and IA/IE⁺ dendritic cells after puberty. Furthermore, immunofluorescences on ovarian sections showed that each subset displayed a distinct tissue distribution pattern. The pattern for each subset may hint to their role in an ovarian function. In addition, partial isolation of ovarian macrophage subset using CD11b antibodies was attempted. Establishment of this isolation method may have provided us a tool for more precise investigation of each subset's functions at the cellular and molecular levels.

  4. Hybrid spread spectrum radio system

    DOEpatents

    Smith, Stephen F.; Dress, William B.

    2010-02-02

    Systems and methods are described for hybrid spread spectrum radio systems. A method includes modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control an amplification circuit that provides a gain to the signal. Another method includes: modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control a fast hopping frequency synthesizer; and fast frequency hopping the signal with the fast hopping frequency synthesizer, wherein multiple frequency hops occur within a single data-bit time.
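The two claimed modulation methods share one idea: subsets of the pseudo-random code generator's bits drive the gain stage and the hop synthesizer. A toy sketch, using a 16-bit LFSR as the code generator and an arbitrary 2-bit/3-bit split that the patent does not prescribe:

```python
def lfsr_step(state, taps=(16, 14, 13, 11), width=16):
    # Fibonacci LFSR: XOR the tap bits to form the feedback bit.
    fb = 0
    for t in taps:
        fb ^= (state >> (t - 1)) & 1
    return ((state << 1) | fb) & ((1 << width) - 1)

def hybrid_params(state, gains=(0.5, 1.0, 2.0, 4.0), n_channels=8):
    # One subset of the PRNG bits selects the amplifier gain, another
    # the hop channel (illustrative split only).
    gain = gains[state & 0b11]           # low 2 bits -> gain code
    channel = (state >> 2) % n_channels  # next bits -> frequency hop
    return gain, channel

state = 0xACE1  # arbitrary nonzero seed
hops = []
for _ in range(5):
    state = lfsr_step(state)
    hops.append(hybrid_params(state))
```

Because both parameters derive from the same code sequence, a receiver that knows the seed can regenerate the gain and hop schedule in lockstep.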

  5. Selection-Fusion Approach for Classification of Datasets with Missing Values

    PubMed Central

    Ghannad-Rezaie, Mostafa; Soltanian-Zadeh, Hamid; Ying, Hao; Dong, Ming

    2010-01-01

    This paper proposes a new approach based on missing value pattern discovery for classifying incomplete data. This approach is particularly designed for classification of datasets with a small number of samples and a high percentage of missing values where available missing value treatment approaches do not usually work well. Based on the pattern of the missing values, the proposed approach finds subsets of samples for which most of the features are available and trains a classifier for each subset. Then, it combines the outputs of the classifiers. Subset selection is translated into a clustering problem, allowing derivation of a mathematical framework for it. A trade off is established between the computational complexity (number of subsets) and the accuracy of the overall classifier. To deal with this trade off, a numerical criterion is proposed for the prediction of the overall performance. The proposed method is applied to seven datasets from the popular University of California, Irvine data mining archive and an epilepsy dataset from Henry Ford Hospital, Detroit, Michigan (total of eight datasets). Experimental results show that classification accuracy of the proposed method is superior to those of the widely used multiple imputations method and four other methods. They also show that the level of superiority depends on the pattern and percentage of missing values. PMID:20212921
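Grouping samples by their missing-value pattern, the first step of the proposed approach, can be sketched as a dictionary keyed on the indices of present features (the per-group classifier training and output fusion are omitted):

```python
def group_by_missing_pattern(samples):
    # Key each sample by the tuple of feature indices that are present;
    # each group can then train its own classifier on complete columns.
    groups = {}
    for s in samples:
        pattern = tuple(i for i, v in enumerate(s) if v is not None)
        groups.setdefault(pattern, []).append(s)
    return groups

data = [[1.0, None, 3.0], [2.0, None, 1.0], [1.5, 2.5, None]]
groups = group_by_missing_pattern(data)
```

In the paper itself, subset selection is posed as a clustering problem rather than this exact-pattern grouping, so the sketch shows only the simplest degenerate case.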

  6. A hybrid feature selection method using multiclass SVM for diagnosis of erythemato-squamous disease

    NASA Astrophysics Data System (ADS)

    Maryam, Setiawan, Noor Akhmad; Wahyunggoro, Oyas

    2017-08-01

    The diagnosis of erythemato-squamous disease is a complex problem and difficult to detect in dermatology. Besides that, it is a major cause of skin cancer. Data mining implementation in the medical field helps expert to diagnose precisely, accurately, and inexpensively. In this research, we use data mining technique to developed a diagnosis model based on multiclass SVM with a novel hybrid feature selection method to diagnose erythemato-squamous disease. Our hybrid feature selection method, named ChiGA (Chi Square and Genetic Algorithm), uses the advantages from filter and wrapper methods to select the optimal feature subset from original feature. Chi square used as filter method to remove redundant features and GA as wrapper method to select the ideal feature subset with SVM used as classifier. Experiment performed with 10 fold cross validation on erythemato-squamous diseases dataset taken from University of California Irvine (UCI) machine learning database. The experimental result shows that the proposed model based multiclass SVM with Chi Square and GA can give an optimum feature subset. There are 18 optimum features with 99.18% accuracy.

  7. Progress towards achieving and maintaining maternal and neonatal tetanus elimination in the African region.

    PubMed

    Ridpath, Alison Delano; Scobie, Heather Melissa; Shibeshi, Messeret Eshetu; Yakubu, Ahmadu; Zulu, Flint; Raza, Azhar Abid; Masresha, Balcha; Tohme, Rania

    2017-01-01

    Despite the availability of effective tetanus prevention strategies, as of 2016, Maternal and Neonatal Tetanus Elimination (MNTE) has not yet been achieved in 18 countries globally. In this paper, we review the status of MNTE in the World Health Organization African Region (AFR) and provide recommendations for achieving and maintaining MNTE in AFR. As of November 2016, 37 (79%) AFR countries have achieved MNTE, with 10 (21%) countries remaining. DTP3 coverage increased from 52% in 2000 to 76% in 2015. In 2015, coverage with at least 2 doses of tetanus-containing vaccine (TT2+) and the proportion of newborns protected at birth (PAB) were 69% and 77%, compared with 44% and 62% in 2000, respectively. Since 1999, over 79 million women of reproductive age (WRA) have been vaccinated with TT2+ through supplementary immunization activities (SIAs). Despite this progress, only 54% of births were attended by skilled birth attendants (SBAs), only 5 (11%) countries provided the 3 WHO-recommended booster doses to both sexes, and about 5.5 million WRA still need to be reached with SIAs. Coverage disparities still exist between countries that have achieved MNTE and those that have not. In 2015, coverage with DTP3 and PAB were higher in MNTE countries than in those yet to achieve MNTE: 84% vs. 68% and 86% vs. 69%, respectively. Challenges to achieving MNTE in the remaining AFR countries include weak health systems, competing priorities, insufficient funding, insecurity, and sub-optimal neonatal tetanus (NT) surveillance. To achieve and maintain MNTE in AFR, increasing SBA and tetanus vaccination coverage, integrating tetanus vaccination with other opportunities (e.g., polio and measles campaigns, mother and child health days), and providing appropriately spaced booster doses are needed.
Strengthening NT surveillance and conducting serosurveys would ensure appropriate targeting of MNTE activities and high-quality information for validating the achievement and maintenance of elimination.

  8. Progress towards achieving and maintaining maternal and neonatal tetanus elimination in the African region

    PubMed Central

    Ridpath, Alison Delano; Scobie, Heather Melissa; Shibeshi, Messeret Eshetu; Yakubu, Ahmadu; Zulu, Flint; Raza, Azhar Abid; Masresha, Balcha; Tohme, Rania

    2017-01-01

    Despite the availability of effective tetanus prevention strategies, as of 2016, Maternal and Neonatal Tetanus Elimination (MNTE) has not yet been achieved in 18 countries globally. In this paper, we review the status of MNTE in the World Health Organization African Region (AFR) and provide recommendations for achieving and maintaining MNTE in AFR. As of November 2016, 37 (79%) AFR countries have achieved MNTE, with 10 (21%) countries remaining. DTP3 coverage increased from 52% in 2000 to 76% in 2015. In 2015, coverage with at least 2 doses of tetanus-containing vaccine (TT2+) and the proportion of newborns protected at birth (PAB) were 69% and 77%, compared with 44% and 62% in 2000, respectively. Since 1999, over 79 million women of reproductive age (WRA) have been vaccinated with TT2+ through supplementary immunization activities (SIAs). Despite this progress, only 54% of births were attended by skilled birth attendants (SBAs), only 5 (11%) countries provided the 3 WHO-recommended booster doses to both sexes, and about 5.5 million WRA still need to be reached with SIAs. Coverage disparities still exist between countries that have achieved MNTE and those that have not. In 2015, coverage with DTP3 and PAB were higher in MNTE countries than in those yet to achieve MNTE: 84% vs. 68% and 86% vs. 69%, respectively. Challenges to achieving MNTE in the remaining AFR countries include weak health systems, competing priorities, insufficient funding, insecurity, and sub-optimal neonatal tetanus (NT) surveillance. To achieve and maintain MNTE in AFR, increasing SBA and tetanus vaccination coverage, integrating tetanus vaccination with other opportunities (e.g., polio and measles campaigns, mother and child health days), and providing appropriately spaced booster doses are needed.
Strengthening NT surveillance and conducting serosurveys would ensure appropriate targeting of MNTE activities and high-quality information for validating the achievement and maintenance of elimination. PMID:29296159

  9. The GEOSS User Requirement Registry (URR): A Cross-Cutting Service-Oriented Infrastructure Linking Science, Society and GEOSS

    NASA Astrophysics Data System (ADS)

    Plag, H.-P.; Foley, G.; Jules-Plag, S.; Ondich, G.; Kaufman, J.

    2012-04-01

    The Group on Earth Observations (GEO) is implementing the Global Earth Observation System of Systems (GEOSS) as a user-driven service infrastructure responding to the needs of users in nine interdependent Societal Benefit Areas (SBAs) of Earth observations (EOs). GEOSS applies an interdisciplinary scientific approach integrating observations, research, and knowledge in these SBAs in order to enable scientific interpretation of the collected observations and the extraction of actionable information. Using EOs to actually produce these societal benefits means getting the data and information to users, i.e., decision-makers. Thus, GEO needs to know what the users need and how they would use the information. The GEOSS User Requirements Registry (URR) is developed as a service-oriented infrastructure enabling a wide range of users, including science and technology (S&T) users, to express their needs in terms of EOs and to understand the benefits of GEOSS for their fields. S&T communities need to be involved in both the development and the use of GEOSS, and the development of the URR accounts for the special needs of these communities. The GEOSS Common Infrastructure (GCI) at the core of GEOSS includes system-oriented registries enabling users to discover, access, and use EOs and derived products and services available through GEOSS. In addition, the user-oriented URR is a place for the collection, sharing, and analysis of user needs and EO requirements, and it provides means for an efficient dialog between users and providers. The URR is a community-based infrastructure for the publishing, viewing, and analyzing of user-need related information. The data model of the URR has a core of seven relations for User Types, Applications, Requirements, Research Needs, Infrastructure Needs, Technology Needs, and Capacity Building Needs. The URR also includes a Lexicon, a number of controlled vocabularies, and

  10. Multiclass cancer classification using a feature subset-based ensemble from microRNA expression profiles.

    PubMed

    Piao, Yongjun; Piao, Minghao; Ryu, Keun Ho

    2017-01-01

    Cancer classification has been a crucial topic of research in cancer treatment. In the last decade, messenger RNA (mRNA) expression profiles have been widely used to classify different types of cancers. With the discovery of a new class of small non-coding RNAs known as microRNAs (miRNAs), various studies have shown that miRNA expression patterns can also accurately classify human cancers. There is therefore great demand for machine learning approaches that accurately classify various types of cancers using miRNA expression data. In this article, we propose a feature subset-based ensemble method in which each model is learned from a different projection of the original feature space to classify multiple cancers. In our method, feature relevance and redundancy are considered when generating multiple feature subsets, the base classifiers are learned from each independent miRNA subset, and the average posterior probability is used to combine the base classifiers. To test the performance of our method, we used bead-based and sequence-based miRNA expression datasets and conducted 10-fold and leave-one-out cross validations. The experimental results show that the proposed method yields good results and higher prediction accuracy than popular ensemble methods. The Java program, source code, and experimental datasets are freely available at https://sourceforge.net/projects/mirna-ensemble/. Copyright © 2016 Elsevier Ltd. All rights reserved.
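The combination rule in this record, averaging the base classifiers' posterior probabilities over feature subsets, can be sketched as follows; the class labels and probabilities are invented for illustration:

```python
def ensemble_predict(posteriors):
    # posteriors: one dict of class -> probability per base classifier,
    # each trained on a different feature (miRNA) subset.
    classes = posteriors[0].keys()
    avg = {c: sum(p[c] for p in posteriors) / len(posteriors) for c in classes}
    return max(avg, key=avg.get), avg

label, avg = ensemble_predict([
    {"A": 0.6, "B": 0.4},   # classifier on subset 1
    {"A": 0.3, "B": 0.7},   # classifier on subset 2
    {"A": 0.9, "B": 0.1},   # classifier on subset 3
])
```

Averaging posteriors (soft voting) lets a confident classifier outweigh two lukewarm ones, unlike simple majority voting over hard labels.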

  11. Enrichment of Gold in Antimony Matte by Direct Smelting of Refractory Gold Concentrate

    NASA Astrophysics Data System (ADS)

    Yang, Tianzu; Xie, Boyi; Liu, Weifeng; Zhang, Duchao; Chen, Lin

    2018-04-01

    Conventional cyanidation technology achieves low gold recovery when used to process refractory gold concentrate. Based on the geochemical characteristics of gold deposit mineralization, a new method is proposed herein for gold enrichment in antimony matte by smelting of refractory gold concentrate. The effects of the FeO/SiO2 and CaO/SiO2 ratios, smelting temperature, and smelting time on the gold recovery were investigated in detail. The optimum conditions were determined to be FeO/SiO2 ratio of 1.2, CaO/SiO2 ratio of 0.4, smelting temperature of 1200°C, and smelting time of 45 min. The gold content in antimony matte and smelting slag was 96.68 and 1.13 g/t, respectively. The gold, antimony, and arsenic recovery was 97.72%, 26.89%, and 6.56%, respectively, with most of the antimony and arsenic volatilized into dust. Mineral liberation analyzer results showed that the antimony matte mainly consisted of FeS and FeO, with three phases, viz. FeAs, SbAs, and AuSb, embedded between them, indicating that gold was easily enriched with antimony and arsenic during smelting of refractory gold concentrate.

  12. Enrichment of Gold in Antimony Matte by Direct Smelting of Refractory Gold Concentrate

    NASA Astrophysics Data System (ADS)

    Yang, Tianzu; Xie, Boyi; Liu, Weifeng; Zhang, Duchao; Chen, Lin

    2018-06-01

    Conventional cyanidation technology achieves low gold recovery when used to process refractory gold concentrate. Based on the geochemical characteristics of gold deposit mineralization, a new method is proposed herein for gold enrichment in antimony matte by smelting of refractory gold concentrate. The effects of the FeO/SiO2 and CaO/SiO2 ratios, smelting temperature, and smelting time on the gold recovery were investigated in detail. The optimum conditions were determined to be FeO/SiO2 ratio of 1.2, CaO/SiO2 ratio of 0.4, smelting temperature of 1200°C, and smelting time of 45 min. The gold content in antimony matte and smelting slag was 96.68 and 1.13 g/t, respectively. The gold, antimony, and arsenic recovery was 97.72%, 26.89%, and 6.56%, respectively, with most of the antimony and arsenic volatilized into dust. Mineral liberation analyzer results showed that the antimony matte mainly consisted of FeS and FeO, with three phases, viz. FeAs, SbAs, and AuSb, embedded between them, indicating that gold was easily enriched with antimony and arsenic during smelting of refractory gold concentrate.

  13. Usability-driven pruning of large ontologies: the case of SNOMED CT

    PubMed Central

    Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan

    2012-01-01

    Objectives To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Materials and Methods Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Results Graph-traversal heuristics provided high coverage (71–96% of terms in the test sets of discharge summaries) at the expense of subset size (17–51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24–55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Discussion Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Conclusion Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available. PMID:22268217
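The frequency-based filtering step can be sketched as a threshold on per-concept usage counts; the concepts and counts below are made up for illustration, not taken from MEDLINE:

```python
def frequency_filter(subset, freq, min_count):
    # Keep only concepts whose corpus usage count reaches the threshold;
    # rarely used concepts are pruned from the extracted module.
    return [c for c in subset if freq.get(c, 0) >= min_count]

kept = frequency_filter(
    ["heart", "cardiomegaly", "rare_concept"],
    {"heart": 120000, "cardiomegaly": 900, "rare_concept": 2},
    min_count=100,
)
```

This is the mechanism by which the study shrank large graph-traversal subsets to about 10% of SNOMED CT while retaining roughly 80% coverage.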

  14. Optimal Frequency-Domain System Realization with Weighting

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Maghami, Peiman G.

    1999-01-01

    Several approaches are presented to identify an experimental system model directly from frequency response data. The formulation uses a matrix-fraction description as the model structure. Frequency weighting such as exponential weighting is introduced to solve a weighted least-squares problem to obtain the coefficient matrices for the matrix-fraction description. A multi-variable state-space model can then be formed using the coefficient matrices of the matrix-fraction description. Three different approaches are introduced to fine-tune the model using nonlinear programming methods to minimize the desired cost function. The first method uses an eigenvalue assignment technique to reassign a subset of system poles to improve the identified model. The second method deals with the model in the real Schur or modal form, reassigns a subset of system poles, and adjusts the columns (rows) of the input (output) influence matrix using a nonlinear optimizer. The third method also optimizes a subset of poles, but the input and output influence matrices are refined at every optimization step through least-squares procedures.

  15. Method and apparatus for obtaining stack traceback data for multiple computing nodes of a massively parallel computer system

    DOEpatents

    Gooding, Thomas Michael; McCarthy, Patrick Joseph

    2010-03-02

    A data collector for a massively parallel computer system obtains call-return stack traceback data for multiple nodes by retrieving partial call-return stack traceback data from each node, grouping the nodes in subsets according to the partial traceback data, and obtaining further call-return stack traceback data from a representative node or nodes of each subset. Preferably, the partial data is a respective instruction address from each node, nodes having identical instruction addresses being grouped together in the same subset. Preferably, a single node of each subset is chosen and full stack traceback data is retrieved from the call-return stack within the chosen node.
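The grouping step described in the patent abstract (bucket nodes by their reported instruction address, then query one representative per bucket) can be sketched as follows. The node IDs, addresses, and function names here are purely illustrative, not taken from the patent:

```python
from collections import defaultdict

def group_by_partial_traceback(partial_data):
    """Group node IDs by the instruction address each node reported."""
    groups = defaultdict(list)
    for node_id, addr in partial_data.items():
        groups[addr].append(node_id)
    return dict(groups)

def choose_representatives(groups):
    """Pick a single node per subset from which to fetch the full stack."""
    return {addr: nodes[0] for addr, nodes in groups.items()}

# Hypothetical partial data: node ID -> instruction address
partial = {0: 0x4005A0, 1: 0x4005A0, 2: 0x400B10, 3: 0x4005A0}
groups = group_by_partial_traceback(partial)
reps = choose_representatives(groups)
```

With four nodes but only two distinct addresses, full tracebacks need to be retrieved from just two representative nodes, which is the scalability point of the method.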

  16. An enhancement of binary particle swarm optimization for gene selection in classifying cancer classes

    PubMed Central

    2013-01-01

    Background Gene expression data could likely be a momentous help in the progress of proficient cancer diagnoses and classification platforms. Lately, many researchers analyze gene expression data using diverse computational intelligence methods, for selecting a small subset of informative genes from the data for cancer classification. Many computational methods face difficulties in selecting small subsets due to the small number of samples compared to the huge number of genes (high-dimension), irrelevant genes, and noisy genes. Methods We propose an enhanced binary particle swarm optimization to perform the selection of small subsets of informative genes which is significant for cancer classification. Particle speed, rule, and modified sigmoid function are introduced in this proposed method to increase the probability of the bits in a particle’s position to be zero. The method was empirically applied to a suite of ten well-known benchmark gene expression data sets. Results The performance of the proposed method proved to be superior to other previous related works, including the conventional version of binary particle swarm optimization (BPSO) in terms of classification accuracy and the number of selected genes. The proposed method also requires lower computational time compared to BPSO. PMID:23617960
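The abstract states that the enhanced BPSO modifies the sigmoid function to raise the probability of a bit being zero (i.e., a gene being dropped). The exact modification is not given there, so the sketch below uses a standard BPSO bit update with an illustrative `bias` term subtracted from the sigmoid as a stand-in for that idea:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def update_bit(velocity, bias=0.3, rng=random.random):
    """Standard BPSO sets a bit to 1 with probability sigmoid(velocity).
    The illustrative `bias` lowers that probability, increasing the chance
    the bit is 0 (gene excluded), mirroring the enhancement's intent."""
    p_one = max(0.0, sigmoid(velocity) - bias)
    return 1 if rng() < p_one else 0
```

Injecting `rng` makes the stochastic update testable; in a full optimizer this function would be applied to every dimension of every particle's position each iteration.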

  17. An enhanced data visualization method for diesel engine malfunction classification using multi-sensor signals.

    PubMed

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-10-21

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimensional space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine.
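The abstract does not define the feature subset score criterion itself. As an illustration only, the sketch below uses a Fisher-style ratio of between-class to within-class scatter as a stand-in score and keeps the top-k features; the retained columns would then be handed to a t-SNE implementation for 2-D visualization:

```python
def feature_scores(samples, labels):
    """Score each feature by between-class separation over within-class
    spread (a Fisher-style criterion; a stand-in for the paper's FSS)."""
    n_feat = len(samples[0])
    scores = []
    for j in range(n_feat):
        groups = {}
        for x, y in zip(samples, labels):
            groups.setdefault(y, []).append(x[j])
        means = {y: sum(v) / len(v) for y, v in groups.items()}
        grand = sum(means.values()) / len(means)
        between = sum((m - grand) ** 2 for m in means.values())
        within = sum(sum((v - means[y]) ** 2 for v in vals) / len(vals)
                     for y, vals in groups.items())
        scores.append(between / (within + 1e-12))
    return scores

def select_subset(samples, labels, k):
    """Indices of the k highest-scoring features, in ascending order."""
    scores = feature_scores(samples, labels)
    ranked = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return sorted(ranked[:k])
```

On a toy dataset where feature 0 separates the classes and feature 1 is noise, `select_subset(samples, labels, 1)` keeps feature 0.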

  18. An Enhanced Data Visualization Method for Diesel Engine Malfunction Classification Using Multi-Sensor Signals

    PubMed Central

    Li, Yiqing; Wang, Yu; Zi, Yanyang; Zhang, Mingquan

    2015-01-01

    The various multi-sensor signal features from a diesel engine constitute a complex high-dimensional dataset. The non-linear dimensionality reduction method, t-distributed stochastic neighbor embedding (t-SNE), provides an effective way to implement data visualization for complex high-dimensional data. However, irrelevant features can deteriorate the performance of data visualization, and thus, should be eliminated a priori. This paper proposes a feature subset score based t-SNE (FSS-t-SNE) data visualization method to deal with the high-dimensional data that are collected from multi-sensor signals. In this method, the optimal feature subset is constructed by a feature subset score criterion. Then the high-dimensional data are visualized in 2-dimensional space. According to the UCI dataset test, FSS-t-SNE can effectively improve the classification accuracy. An experiment was performed with a large power marine diesel engine to validate the proposed method for diesel engine malfunction classification. Multi-sensor signals were collected by a cylinder vibration sensor and a cylinder pressure sensor. Compared with other conventional data visualization methods, the proposed method shows good visualization performance and high classification accuracy in multi-malfunction classification of a diesel engine. PMID:26506347

  19. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The serious information redundancy in hyperspectral images (HIs) does not improve data analysis accuracy; instead, it consumes expensive computational resources. Consequently, to identify the most useful and valuable information in HIs, and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of a support vector machine (SVM) classifier and the number of selected spectral bands are used to measure the discriminative capability of a band subset. Finally, the band subset with the smallest number of spectral bands that still covers the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results reveal the superiority of the proposed method and indicate that it can considerably reduce data storage costs and efficiently identify a band subset with stable and high classification precision.
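The selection criterion described in the abstract, classifier accuracy traded against band count, can be sketched as a fitness function for a binary band mask. Here `eval_accuracy` is a hypothetical callback standing in for the paper's SVM evaluation, and the penalty weight `alpha` is illustrative:

```python
def band_subset_fitness(band_mask, eval_accuracy, alpha=0.01):
    """Fitness of a binary band-selection vector: reward classification
    accuracy, penalize the number of retained bands. An optimizer such
    as the hybrid GA-GSA would maximize this over candidate masks."""
    n_bands = sum(band_mask)
    if n_bands == 0:
        return float("-inf")  # an empty subset cannot classify anything
    return eval_accuracy(band_mask) - alpha * n_bands
```

Any population-based search (GA, GSA, or their hybrid) can plug this in unchanged; only the mask-update rules differ between the algorithms.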

  20. Comparative analysis of QSAR models for predicting pK(a) of organic oxygen acids and nitrogen bases from molecular structure.

    PubMed

    Yu, Haiying; Kühne, Ralph; Ebert, Ralf-Uwe; Schüürmann, Gerrit

    2010-11-22

    For 1143 organic compounds comprising 580 oxygen acids and 563 nitrogen bases that cover more than 17 orders of experimental pK(a) (from -5.00 to 12.23), the pK(a) prediction performances of ACD, SPARC, and two calibrations of a semiempirical quantum chemical (QC) AM1 approach have been analyzed. The overall root-mean-square errors (rms) for the acids are 0.41, 0.58 (0.42 without ortho-substituted phenols with intramolecular H-bonding), and 0.55 and for the bases are 0.65, 0.70, 1.17, and 1.27 for ACD, SPARC, and both QC methods, respectively. Method-specific performances are discussed in detail for six acid subsets (phenols and aromatic and aliphatic carboxylic acids with different substitution patterns) and nine base subsets (anilines, primary, secondary and tertiary amines, meta/para-substituted and ortho-substituted pyridines, pyrimidines, imidazoles, and quinolines). The results demonstrate an overall better performance for acids than for bases but also a substantial variation across subsets. For the overall best-performing ACD, rms ranges from 0.12 to 1.11 and 0.40 to 1.21 pK(a) units for the acid and base subsets, respectively. With regard to the squared correlation coefficient r², the results are 0.86 to 0.96 (acids) and 0.79 to 0.95 (bases) for ACD, 0.77 to 0.95 (acids) and 0.85 to 0.97 (bases) for SPARC, and 0.64 to 0.87 (acids) and 0.43 to 0.83 (bases) for the QC methods, respectively. Attention is paid to structural and method-specific causes for observed pitfalls. The significant subset dependence of the prediction performances suggests a consensus modeling approach.

  1. Label-free haemogram using wavelength modulated Raman spectroscopy for identifying immune-cell subset

    NASA Astrophysics Data System (ADS)

    Ashok, Praveen C.; Praveen, Bavishna B.; Campbell, Elaine C.; Dholakia, Kishan; Powis, Simon J.

    2014-03-01

    Leucocytes in the blood of mammals form a powerful protective system against a wide range of dangerous pathogens. There are several types of immune cells, each with a specific role in the immune system. The number and type of immune cells alter in the disease state, and identifying the type of immune cell provides information about a person's state of health. Several immune cell subsets are essentially morphologically identical and require external labeling to enable discrimination. Here we demonstrate the feasibility of using Wavelength Modulated Raman Spectroscopy (WMRS) with suitable machine learning algorithms as a label-free method to distinguish between closely related immune cell subsets. Principal Component Analysis (PCA) was performed on WMRS data from single cells, obtained using confocal Raman microscopy, for feature reduction, followed by a Support Vector Machine (SVM) for binary discrimination of the various cell subsets, which yielded an accuracy >85%. The method successfully discriminated between untouched and unfixed purified populations of CD4+CD3+ and CD8+CD3+ T lymphocyte subsets, and CD56+CD3- natural killer cells, with a high degree of specificity. It also proved sensitive enough to identify unique Raman signatures that allow clear discrimination between dendritic cell subsets, comprising CD303+CD45+ plasmacytoid and CD1c+CD141+ myeloid dendritic cells. The results of this study clearly show that WMRS is highly sensitive and can distinguish between cell types that are morphologically identical.

  2. Non-native Speech Perception Training Using Vowel Subsets: Effects of Vowels in Sets and Order of Training

    PubMed Central

    Nishi, Kanae; Kewley-Port, Diane

    2008-01-01

    Purpose Nishi and Kewley-Port (2007) trained Japanese listeners to perceive nine American English monophthongs and showed that a protocol using all nine vowels (fullset) produced better results than one using only the three more difficult vowels (subset). The present study extended the target population to Koreans and examined whether protocols combining the two stimulus sets would provide more effective training. Method Three groups of five Korean listeners were trained on American English vowels for nine days using one of three protocols: fullset only, first three days on subset then six days on fullset, or first six days on fullset then three days on subset. Participants' performance was assessed by pre- and post-training tests, as well as by a mid-training test. Results 1) Fullset training was also effective for Koreans; 2) no advantage was found for the two combined protocols over the fullset-only protocol; and 3) sustained “non-improvement” was observed for training using one of the combined protocols. Conclusions In using subsets for training American English vowels, care should be taken not only in the selection of subset vowels, but also in the order in which subsets are trained. PMID:18664694

  3. Minimally buffered data transfers between nodes in a data communications network

    DOEpatents

    Miller, Douglas R.

    2015-06-23

    Methods, apparatus, and products for minimally buffered data transfers between nodes in a data communications network are disclosed that include: receiving, by a messaging module on an origin node, a storage identifier, an origin data type, and a target data type, the storage identifier specifying application storage containing data, the origin data type describing a data subset contained in the origin application storage, the target data type describing an arrangement of the data subset in application storage on a target node; creating, by the messaging module, origin metadata describing the origin data type; selecting, by the messaging module from the origin application storage in dependence upon the origin metadata and the storage identifier, the data subset; and transmitting, by the messaging module to the target node, the selected data subset for storing in the target application storage in dependence upon the target data type without temporarily buffering the data subset.

  4. Feature selection for the classification of traced neurons.

    PubMed

    López-Cabrera, José D; Lorenzo-Ginori, Juan V

    2018-06-01

    The great availability of computational tools to calculate the properties of traced neurons has led to many descriptors that allow the automated classification of neurons from these reconstructions. This situation makes it necessary to eliminate irrelevant features and to select the most appropriate among them, in order to improve the quality of the resulting classification. The dataset used contains a total of 318 traced neurons, classified by human experts into 192 GABAergic interneurons and 126 pyramidal cells. The features were extracted by means of the L-measure software, one of the most used computational tools in neuroinformatics to quantify traced neurons. We review some current feature selection techniques, such as filter, wrapper, embedded and ensemble methods. The stability of the feature selection methods was measured. For the ensemble methods, several aggregation methods based on different metrics were applied to combine the subsets obtained during the feature selection process. The subsets obtained by applying feature selection methods were evaluated using supervised classifiers, with Random Forest, C4.5, SVM, Naïve Bayes, Knn, Decision Table and the Logistic classifier used as classification algorithms. Feature selection methods of the filter, embedded, wrapper and ensemble types were compared, and the subsets returned were tested in classification tasks with different classification algorithms. The L-measure features EucDistanceSD, PathDistanceSD, Branch_pathlengthAve, Branch_pathlengthSD and EucDistanceAve were present in more than 60% of the selected subsets, which provides evidence of their importance in the classification of these neurons. Copyright © 2018 Elsevier B.V. All rights reserved.
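The ">60% of the selected subsets" criterion above is a simple frequency-based aggregation over the subsets returned by different selectors. A minimal sketch (the feature names reuse the L-measure descriptors mentioned in the abstract; the subsets themselves are invented for illustration):

```python
from collections import Counter

def aggregate_subsets(subsets, threshold=0.6):
    """Keep features selected in at least `threshold` of the candidate
    subsets produced by different feature selection methods."""
    counts = Counter(f for s in subsets for f in set(s))
    cutoff = threshold * len(subsets)
    return sorted(f for f, c in counts.items() if c >= cutoff)

subsets = [["EucDistanceSD", "PathDistanceSD"],
           ["EucDistanceSD", "Branch_pathlengthAve"],
           ["EucDistanceSD", "PathDistanceSD"]]
stable = aggregate_subsets(subsets)
```

Features that survive this vote are the ones whose selection is stable across methods, which is the kind of evidence the study cites for the five descriptors it highlights.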

  5. Collective communications apparatus and method for parallel systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knies, Allan D.; Keppel, David Pardo; Woo, Dong Hyuk

    A collective communication apparatus and method for parallel computing systems. For example, one embodiment of an apparatus comprises a plurality of processor elements (PEs); collective interconnect logic to dynamically form a virtual collective interconnect (VCI) between the PEs at runtime without global communication among all of the PEs, the VCI defining a logical topology between the PEs in which each PE is directly communicatively coupled to only a subset of the remaining PEs; and execution logic to execute collective operations across the PEs, wherein one or more of the PEs receive first results from a first portion of the subset of the remaining PEs, perform a portion of the collective operations, and provide second results to a second portion of the subset of the remaining PEs.

  6. Method and apparatus for wavefront sensing

    DOEpatents

    Bahk, Seung-Whan

    2016-08-23

    A method of measuring characteristics of a wavefront of an incident beam includes obtaining an interferogram associated with the incident beam passing through a transmission mask and Fourier transforming the interferogram to provide a frequency domain interferogram. The method also includes selecting a subset of harmonics from the frequency domain interferogram, individually inverse Fourier transforming each of the subset of harmonics to provide a set of spatial domain harmonics, and extracting a phase profile from each of the set of spatial domain harmonics. The method further includes removing phase discontinuities in the phase profile, rotating the phase profile, and reconstructing a phase front of the wavefront of the incident beam.
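One step of the pipeline above, removing phase discontinuities from an extracted phase profile, can be sketched in one dimension without any FFT machinery. This is a generic unwrapping sketch, not the patent's implementation; it assumes adjacent samples differ by less than pi:

```python
import math

def unwrap_phase(phases):
    """Remove 2*pi jumps from a 1-D phase profile: whenever the step to
    the next sample exceeds pi in magnitude, shift it by a multiple of
    2*pi so the profile becomes continuous."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # wrap step into (-pi, pi]
        out.append(out[-1] + d)
    return out
```

A 2-D phase map, as produced by inverse-transforming each selected harmonic, would apply the same idea row-wise and column-wise.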

  7. Detection and Classification of Pole-Like Objects from Mobile Mapping Data

    NASA Astrophysics Data System (ADS)

    Fukano, K.; Masuda, H.

    2015-08-01

    Laser scanners on a vehicle-based mobile mapping system can capture 3D point-clouds of roads and roadside objects. Since roadside objects have to be maintained periodically, their 3D models are useful for planning maintenance tasks. In our previous work, we proposed a method for detecting cylindrical poles and planar plates in a point-cloud. However, it is often required to further classify pole-like objects into utility poles, streetlights, traffic signals and signs, which are managed by different organizations. In addition, our previous method may fail to extract low pole-like objects, which are often observed in urban residential areas. In this paper, we propose new methods for extracting and classifying pole-like objects. In our method, we robustly extract a wide variety of poles by converting point-clouds into wireframe models and calculating cross-sections between wireframe models and horizontal cutting planes. For classifying pole-like objects, we subdivide a pole-like object into five subsets by extracting poles and planes, and calculate feature values of each subset. Then we apply a supervised machine learning method using feature variables of subsets. In our experiments, our method could achieve excellent results for detection and classification of pole-like objects.

  8. UAVSAR and TerraSAR-X Based InSAR Detection of Localized Subsidence in the New Orleans Area

    NASA Astrophysics Data System (ADS)

    Blom, R. G.; An, K.; Jones, C. E.; Latini, D.

    2014-12-01

    Vulnerability of the US Gulf coast to inundation has received increased attention since hurricanes Katrina and Rita. The compounding effects of sea level rise, wetland loss, and regional and local subsidence make flood protection a difficult challenge, particularly for the New Orleans area. Key to flood protection is precise knowledge of elevations and elevation changes. Analysis of historical and continuing geodetic measurements shows surprising complexity, including locations subsiding more rapidly than considered during planning of hurricane protection and coastal restoration projects. Combining traditional, precise geodetic data with interferometric synthetic aperture radar (InSAR) observations can provide geographically dense constraints on surface deformation. The Gulf Coast environment is challenging for InSAR techniques, especially with systems not designed for interferometry. We use two InSAR-capable systems, the L-band (24 cm wavelength) airborne JPL/NASA UAVSAR and the DLR/EADS Astrium spaceborne TerraSAR X-band (3 cm wavelength) system, and compare results. First, we apply pair-wise InSAR to the longer-wavelength UAVSAR data to detect localized elevation changes from 2009 to 2014 that could impact flood protection infrastructure. We focus on areas on and near flood protection infrastructure to identify changes indicative of subsidence, structural deformation, and/or seepage. The spaceborne TerraSAR X-band SAR system has relatively frequent observations and dense persistent scatterers in urban areas, enabling measurement of very small displacements. We compare L-band UAVSAR results with permanent scatterer (PS-InSAR) and Small Baseline Subset (SBAS) interferometric analyses of a stack composed of 28 TerraSAR X-band images acquired over the same period, allowing us to evaluate results from the different radar frequencies and analysis techniques. Preliminary results indicate subsidence features with a variety of potential causes, ranging from ground water pumping to ground compaction following recent construction. Our overall goal is to enable incorporation of InSAR into the decision-making process via identification and delineation of areas of persistent subsidence, and to provide input that improves monitoring and planning in flood risk areas.

  9. Dynamic deformation of Seguam Island, Alaska, 1992--2008, from multi-interferogram InSAR processing

    USGS Publications Warehouse

    Lee, Chang-Wook; Lu, Zhong; Won, Joong-Sun; Jung, Hyung-Sup; Dzurisin, Daniel

    2013-01-01

    We generated a time-series of ERS-1/2 and ENVISAT interferometric synthetic aperture radar (InSAR) images to study ground surface deformation at Seguam Island from 1992 to 2008. We used the small baseline subset (SBAS) technique to reduce artifacts associated with baseline uncertainties and atmospheric delay anomalies, and processed images from two adjacent tracks to validate our results. Seguam Island comprises the remnants of two late Quaternary calderas, one in the western part of the island and one in the eastern part. The western caldera subsided at a constant rate of ~1.6 cm/yr throughout the study period, while the eastern caldera experienced alternating periods of subsidence and uplift: ~5 cm/yr uplift during January 1993–October 1993 (stage 1), ~1.6 cm/yr subsidence during October 1993–November 1998 (stage 2), ~2.0 cm/yr uplift during November 1998–September 2000 (stage 3), ~1.4 cm/yr subsidence during September 2000–November 2005 (stage 4), and ~0.8 cm/yr uplift during November 2005–July 2007 (stage 5). Source modeling indicates a deflationary source less than 2 km below sea level (BSL) beneath the western caldera and two sources beneath the eastern caldera: an inflationary source 2.5–6.0 km BSL and a deflationary source less than 2 km BSL. We suggest that uplift of the eastern caldera is driven by episodic intrusions of basaltic magma into a poroelastic reservoir 2.5–6.0 km BSL beneath the caldera. Cooling and degassing of the reservoir between intrusions result in steady subsidence of the overlying surface. Although we found no evidence of magma intrusion beneath the western caldera during the study period, it is the site (Pyre Peak) of all historical eruptions on the island, so cooling and degassing of intrusions presumably contribute to subsidence there as well. Another likely subsidence mechanism in the western caldera is thermoelastic contraction of lava flows emplaced near Pyre Peak during several historical eruptions, most recently in 1977 and 1992–93.
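The constant rates quoted above (e.g., ~1.6 cm/yr for the western caldera) are linear fits to the SBAS displacement time series. This is not the SBAS inversion itself, just a least-squares rate fit of the kind used to summarize such a series; the numbers below are illustrative:

```python
def linear_rate(times, displacements):
    """Least-squares slope of displacement vs. time, i.e. the average
    deformation rate (cm/yr when times are in years and displacements
    in cm)."""
    n = len(times)
    tm = sum(times) / n
    dm = sum(displacements) / n
    num = sum((t - tm) * (d - dm) for t, d in zip(times, displacements))
    den = sum((t - tm) ** 2 for t in times)
    return num / den

# Hypothetical series: steady subsidence of 1.6 cm/yr over four epochs
rate = linear_rate([0, 1, 2, 3], [0.0, -1.6, -3.2, -4.8])
```

Negative rates correspond to subsidence, positive to uplift, matching the sign convention of the stages listed in the abstract.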

  10. Systems, Apparatuses and Methods for Beamforming RFID Tags

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W. (Inventor); Lin, Gregory Y. (Inventor); Ngo, Phong H. (Inventor); Kennedy, Timothy F. (Inventor)

    2017-01-01

    A radio frequency identification (RFID) system includes an RFID interrogator and an RFID tag having a plurality of information sources and a beamforming network. The tag receives electromagnetic radiation from the interrogator. The beamforming network directs the received electromagnetic radiation to a subset of the plurality of information sources. The RFID tag transmits a response to the received electromagnetic radiation, based on the subset of the plurality of information sources to which the received electromagnetic radiation was directed. Method and other embodiments are also disclosed.

  11. Effects of a Simulated Tennis Match on Lymphocyte Subset Measurements

    ERIC Educational Resources Information Center

    Schafer, Mark; Kell, Holly; Navalta, James; Tibana, Ramires; Lyons, Scott; Arnett, Scott

    2014-01-01

    Tennis is an activity requiring both endurance and anaerobic components, which could have immunosuppressive effects postexercise. Purpose: The purpose of this investigation was to determine the effect of a simulated tennis match on apoptotic and migratory markers on lymphocyte subsets. Method: Male high school (n = 5) and college (n = 3) tennis…

  12. Fizzy: feature subset selection for metagenomics.

    PubMed

    Ditzler, Gregory; Morrison, J Calvin; Lan, Yemin; Rosen, Gail L

    2015-11-04

    Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α- & β-diversity. Feature subset selection--a sub-field of machine learning--can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
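The information-theoretic score underlying such selectors is typically the mutual information between a (discretized) feature, such as an OTU abundance level, and the phenotype label. A self-contained sketch, not Fizzy's own code, of that score:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information in bits between two discrete sequences, e.g.
    a binned OTU abundance (xs) and a phenotype label (ys). Selectors
    rank features by this score and keep the top ones."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * math.log2(p * n * n / (px[x] * py[y]))
    return mi
```

A perfectly predictive binary feature yields 1 bit; a feature independent of the label yields 0, so ranking by this score orders features by how much phenotype information they carry.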

  13. Fizzy. Feature subset selection for metagenomics

    DOE PAGES

    Ditzler, Gregory; Morrison, J. Calvin; Lan, Yemin; ...

    2015-11-04

    Background: Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α– & β–diversity. Feature subset selection – a sub-field of machine learning – can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. Results: We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. Conclusions: We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.

  14. Visual analytics in cheminformatics: user-supervised descriptor selection for QSAR methods.

    PubMed

    Martínez, María Jimena; Ponzoni, Ignacio; Díaz, Mónica F; Vazquez, Gustavo E; Soto, Axel J

    2015-01-01

    The design of QSAR/QSPR models is a challenging problem, in which the selection of the most relevant descriptors constitutes a key step. Several feature selection methods that address this step concentrate on statistical associations among descriptors and target properties, leaving chemical knowledge out of the analysis. As a result, the interpretability and generality of the QSAR/QSPR models obtained by these feature selection methods are drastically affected. An approach that integrates domain expert knowledge into the selection process is therefore needed to increase confidence in the final set of descriptors. In this paper we propose a software tool, named Visual and Interactive DEscriptor ANalysis (VIDEAN), that combines statistical methods with interactive visualizations for choosing a set of descriptors for predicting a target property. Domain expertise can be added to the feature selection process by means of an interactive visual exploration of data, aided by statistical tools and metrics based on information theory. Coordinated visual representations capture different relationships and interactions among descriptors, target properties, and candidate subsets of descriptors. The capabilities of the proposed software were assessed through different scenarios, which reveal how an expert can use the tool to choose one subset of descriptors from a group of candidate subsets, or to modify existing descriptor subsets and even incorporate new descriptors according to his or her own knowledge of the target property. The reported experiences showed the suitability of our software for selecting sets of descriptors with low cardinality, high interpretability, low redundancy, and high statistical performance in a visual exploratory way. Therefore, it is possible to conclude that the resulting tool allows the integration of a chemist's expertise in the descriptor selection process with low cognitive effort, in contrast with the alternative of an ad hoc manual analysis of the selected descriptors. Graphical abstract: VIDEAN allows the visual analysis of candidate subsets of descriptors for QSAR/QSPR. In the two panels on the top, users can interactively explore numerical correlations as well as co-occurrences in the candidate subsets through two interactive graphs.

  15. Plate-based diversity subset screening generation 2: an improved paradigm for high-throughput screening of large compound files.

    PubMed

    Bell, Andrew S; Bradley, Joseph; Everett, Jeremy R; Loesel, Jens; McLoughlin, David; Mills, James; Peakman, Marie-Claire; Sharp, Robert E; Williams, Christine; Zhu, Hongyao

    2016-11-01

    High-throughput screening (HTS) is an effective method for lead and probe discovery that is widely used in industry and academia to identify novel chemical matter and to initiate the drug discovery process. However, HTS can be time consuming and costly and the use of subsets as an efficient alternative to screening entire compound collections has been investigated. Subsets may be selected on the basis of chemical diversity, molecular properties, biological activity diversity or biological target focus. Previously, we described a novel form of subset screening: plate-based diversity subset (PBDS) screening, in which the screening subset is constructed by plate selection (rather than individual compound cherry-picking), using algorithms that select for compound quality and chemical diversity on a plate basis. In this paper, we describe a second-generation approach to the construction of an updated subset: PBDS2, using both plate and individual compound selection, that has an improved coverage of the chemical space of the screening file, whilst only selecting the same number of plates for screening. We describe the validation of PBDS2 and its successful use in hit and lead discovery. PBDS2 screening became the default mode of singleton (one compound per well) HTS for lead discovery in Pfizer.

  16. The depth estimation of 3D face from single 2D picture based on manifold learning constraints

    NASA Astrophysics Data System (ADS)

    Li, Xia; Yang, Yang; Xiong, Hailiang; Liu, Yunxia

    2018-04-01

The estimation of depth is vitally important in 3D face reconstruction. In this paper, we propose a t-SNE method based on manifold learning constraints and introduce the K-means method to divide the original database into several subsets; selecting the optimal subset to reconstruct the 3D face depth information greatly reduces the computational complexity. Firstly, we carry out the t-SNE operation to reduce the key feature points in each 3D face model from 1×249 to 1×2. Secondly, the K-means method is applied to divide the training 3D database into several subsets. Thirdly, the Euclidean distance is calculated between the 83 feature points of the image to be estimated and the feature point information, before dimension reduction, of each cluster center; the category of the image to be estimated is judged according to the minimum Euclidean distance. Finally, the method of Kong D is applied only in the optimal subset to estimate the depth value information of the 83 feature points of the 2D face image, achieving the final depth estimation result with greatly reduced computational complexity. Compared with the traditional traversal search estimation method, the proposed method reduces the error rate by 0.49, and the number of searches decreases with the change of the category. In order to validate our approach, we use a public database to mimic the task of estimating the depth of face images from 2D images. The average number of searches decreased by 83.19%.
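The pipeline this abstract describes (t-SNE embedding, K-means partitioning, nearest-centroid subset selection) can be sketched roughly as follows, assuming scikit-learn and random data standing in for the face feature vectors; the array sizes and cluster count are illustrative choices, not the paper's values:

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for a 3D face database: one 1x249 feature vector per model.
features = rng.normal(size=(100, 249))

# Step 1: t-SNE reduces each 1x249 vector to 1x2.
embedded = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(features)

# Step 2: K-means partitions the database into subsets (4 clusters, illustrative).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(embedded)

# Step 3: pick the subset whose centroid (in the original feature space,
# i.e. before dimension reduction) is nearest to the query in Euclidean distance.
query = rng.normal(size=249)
centroids = np.array([features[kmeans.labels_ == k].mean(axis=0) for k in range(4)])
best = int(np.argmin(np.linalg.norm(centroids - query, axis=1)))
subset = features[kmeans.labels_ == best]  # depth estimation searches only this subset
```

Only the selected subset is then searched, which is where the reported reduction in the number of searches comes from.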

  17. Evaluation of subset matching methods and forms of covariate balance.

    PubMed

    de Los Angeles Resa, María; Zubizarreta, José R

    2016-11-30

This paper conducts a Monte Carlo simulation study to evaluate the performance of multivariate matching methods that select a subset of treatment and control observations. The matching methods studied are the widely used nearest neighbor matching with propensity score calipers and the more recently proposed methods, optimal matching of an optimally chosen subset and optimal cardinality matching. The main findings are: (i) covariate balance, as measured by differences in means, variance ratios, Kolmogorov-Smirnov distances, and cross-match test statistics, is better with cardinality matching because by construction it satisfies balance requirements; (ii) for given levels of covariate balance, the matched samples are larger with cardinality matching than with the other methods; (iii) in terms of covariate distances, optimal subset matching performs best; (iv) treatment effect estimates from cardinality matching have lower root-mean-square errors, provided strong requirements for balance, specifically, fine balance, or strength-k balance, plus close mean balance. In standard practice, a matched sample is considered to be balanced if the absolute differences in means of the covariates across treatment groups are smaller than 0.1 standard deviations. However, the simulation results suggest that stronger forms of balance should be pursued in order to remove systematic biases due to observed covariates when a difference in means treatment effect estimator is used. In particular, if the true outcome model is additive, then marginal distributions should be balanced, and if the true outcome model is additive with interactions, then low-dimensional joints should be balanced. Copyright © 2016 John Wiley & Sons, Ltd.
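The 0.1-standard-deviation balance rule this abstract critiques can be checked directly. A minimal sketch, assuming NumPy and simulated treatment/control covariates (the data and the pooled-standard-deviation convention are illustrative assumptions, not the paper's):

```python
import numpy as np

def standardized_mean_differences(x_t, x_c):
    """Absolute difference in covariate means between treatment and control,
    expressed in units of the pooled standard deviation."""
    mean_diff = np.abs(x_t.mean(axis=0) - x_c.mean(axis=0))
    pooled_sd = np.sqrt((x_t.var(axis=0, ddof=1) + x_c.var(axis=0, ddof=1)) / 2)
    return mean_diff / pooled_sd

rng = np.random.default_rng(1)
treated = rng.normal(loc=0.05, size=(200, 5))  # 200 treated units, 5 covariates
control = rng.normal(loc=0.0, size=(300, 5))   # 300 control units

smd = standardized_mean_differences(treated, control)
balanced = bool(np.all(smd < 0.1))  # the conventional 0.1-SD rule
```

The paper's point is that passing this check alone may leave systematic bias; stronger forms of balance (fine balance, strength-k balance) constrain more than the marginal means.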

  18. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    NASA Technical Reports Server (NTRS)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on the use of subset selection techniques developed in the linear algebra and statistics communities to study linear dependencies in high-dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications for band selection in sensor design and for the number and location of levels in the analysis process.
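A standard linear-algebra subset selection tool of the kind this summary draws on is rank-revealing QR with column pivoting. The sketch below uses SciPy and synthetic weighting functions (the matrix sizes, the rank-8 construction, and the tolerance are assumptions for illustration, not the study's values) to show how near-dependent channels can be identified:

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(2)
# Synthetic stand-in for channel weighting functions: 60 atmospheric levels x
# 20 channels, constructed so channels are near-linear combinations of 8 modes.
basis = rng.normal(size=(60, 8))
mix = rng.normal(size=(8, 20))
weights = basis @ mix + 1e-6 * rng.normal(size=(60, 20))

# QR with column pivoting: the first k pivoted columns form the most linearly
# independent channel subset; |diag(R)| decays sharply where dependence sets in.
_, r, piv = qr(weights, pivoting=True)
diag = np.abs(np.diag(r))

# Keep channels until diag(R) drops below a chosen relative tolerance
# (the 1e-3 threshold here is an illustrative assumption).
k = int(np.sum(diag > 1e-3 * diag[0]))
selected_channels = piv[:k]
```

The same idea applies to the level-selection question: run the pivoted QR on the transpose and read off which atmospheric levels carry independent information.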

  19. Finding minimum gene subsets with heuristic breadth-first search algorithm for robust tumor classification

    PubMed Central

    2012-01-01

    Background Previous studies on tumor classification based on gene expression profiles suggest that gene selection plays a key role in improving the classification performance. Moreover, finding important tumor-related genes with the highest accuracy is a very important task because these genes might serve as tumor biomarkers, which is of great benefit to not only tumor molecular diagnosis but also drug development. Results This paper proposes a novel gene selection method with rich biomedical meaning based on Heuristic Breadth-first Search Algorithm (HBSA) to find as many optimal gene subsets as possible. Due to the curse of dimensionality, this type of method could suffer from over-fitting and selection bias problems. To address these potential problems, a HBSA-based ensemble classifier is constructed using majority voting strategy from individual classifiers constructed by the selected gene subsets, and a novel HBSA-based gene ranking method is designed to find important tumor-related genes by measuring the significance of genes using their occurrence frequencies in the selected gene subsets. The experimental results on nine tumor datasets including three pairs of cross-platform datasets indicate that the proposed method can not only obtain better generalization performance but also find many important tumor-related genes. Conclusions It is found that the frequencies of the selected genes follow a power-law distribution, indicating that only a few top-ranked genes can be used as potential diagnosis biomarkers. Moreover, the top-ranked genes leading to very high prediction accuracy are closely related to specific tumor subtype and even hub genes. Compared with other related methods, the proposed method can achieve higher prediction accuracy with fewer genes. Moreover, they are further justified by analyzing the top-ranked genes in the context of individual gene function, biological pathway, and protein-protein interaction network. PMID:22830977

  20. Dimethyl fumarate–induced lymphopenia in MS due to differential T-cell subset apoptosis

    PubMed Central

    Ghadiri, Mahtab; Rezk, Ayman; Li, Rui; Evans, Ashley; Luessi, Felix; Zipp, Frauke; Giacomini, Paul S.; Antel, Jack

    2017-01-01

    Objective: To examine the mechanism underlying the preferential CD8+ vs CD4+ T-cell lymphopenia induced by dimethyl fumarate (DMF) treatment of MS. Methods: Total lymphocyte counts and comprehensive T-cell subset analyses were performed in high-quality samples obtained from patients with MS prior to and serially following DMF treatment initiation. Random coefficient mixed-effects analysis was used to model the trajectory of T-cell subset losses in vivo. Survival and apoptosis of distinct T-cell subsets were assessed following in vitro exposure to DMF. Results: Best-fit modeling indicated that the DMF-induced preferential reductions in CD8+ vs CD4+ T-cell counts nonetheless followed similar depletion kinetics, suggesting a similar rather than distinct mechanism involved in losses of both the CD8+ and CD4+ T cells. In vitro, DMF exposure resulted in dose-dependent reductions in T-cell survival, which were found to reflect apoptotic cell death. This DMF-induced apoptosis was greater for CD8+ vs CD4+, as well as for memory vs naive, and conventional vs regulatory T-cell subsets, a pattern which mirrored preferential T-cell subset losses that we observed during in vivo treatment of patients. Conclusions: Differential apoptosis mediated by DMF may underlie the preferential lymphopenia of distinct T-cell subsets, including CD8+ and memory T-cell subsets, seen in treated patients with MS. This differential susceptibility of distinct T-cell subsets to DMF-induced apoptosis may contribute to both the safety and efficacy profiles of DMF in patients with MS. PMID:28377940

  1. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo

Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), that uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.

  2. Short-cut Methods versus Rigorous Methods for Performance-evaluation of Distillation Configurations

    DOE PAGES

    Ramapriya, Gautham Madenoor; Selvarajah, Ajiththaa; Jimenez Cucaita, Luis Eduardo; ...

    2018-05-17

Here, this study demonstrates the efficacy of a short-cut method such as the Global Minimization Algorithm (GMA), that uses assumptions of ideal mixtures, constant molar overflow (CMO) and pinched columns, in pruning the search-space of distillation column configurations for zeotropic multicomponent separation, to provide a small subset of attractive configurations with low minimum heat duties. The short-cut method, due to its simplifying assumptions, is computationally efficient, yet reliable in identifying the small subset of useful configurations for further detailed process evaluation. This two-tier approach allows expedient search of the configuration space containing hundreds to thousands of candidate configurations for a given application.

  3. Monitoring large-scale landslides and their induced hazard with COSMO-SkyMed Intermittent SBAS (ISBAS): a case study in north-western Sicily, Italy.

    NASA Astrophysics Data System (ADS)

    Novellino, Alessandro; Cigna, Francesca; Jordan, Colm; Sowter, Andrew; Calcaterra, Domenico

    2014-05-01

Landslide detection and mapping are fundamental requirements for any hazard and risk evaluation. Due to their inevitable shortcomings, geomorphological field surveys and airphoto interpretation do not document all gravitational events: some unstable slopes are inaccessible to field surveyors, while some landslides are too slow to be detected with the naked eye or by interpretation of aerial photographs. In this work, we integrate geomorphological surveys with ground motion data derived by employing COSMO-SkyMed satellite imagery and the Intermittent Small BAseline Subset (ISBAS; Sowter et al., 2013), a new Advanced Differential Interferometry Synthetic Aperture Radar (ADInSAR) technique recently developed at the University of Nottingham in the UK. The main advantage of ISBAS with respect to other InSAR and SBAS techniques is the possibility to detect good radar reflectors even in non-urbanized terrain, where ground targets usually look intermittently coherent, meaning they have high coherence in some interferograms but not in others. ISBAS has proven capable of increasing the coverage of results over natural, woodland and agricultural terrains and, as a result, it makes it possible to improve the detection of landslide boundaries and the assessment of the state of activity where other InSAR approaches fail. We used COSMO-SkyMed StripMap data covering the period between November 2008 and October 2011, with 3 m ground range resolution, 40° look angle and a minimum revisit time of 8 days. The data consist of 38 ascending images (track 133, frame 380) with a ground track angle at scene centre of 169.5° from the north-south direction. These were obtained thanks to an agreement between the Italian Ministry for the Environment, Land and Sea and the University of Naples 'Federico II'. 
We tested ISBAS in north-western Sicily (southern Italy), over a 1,530 km2 area where 1,473 landslides have been identified based on optical imagery and field surveys by the local Hydro-geomorphological Setting Plan published in 2006. The geological and tectonic setting of the area, related to the Apenninic-Maghrebian Chain orogeny, makes most of the instability phenomena of complex or flow type with extremely slow to very slow velocity, namely very suitable for an interferometric analysis. We show the results for Piana degli Albanesi, a thrust-fault-bounded basin located in the northern Mt. Kumeta massif, filled in with Lower Miocene marls and pelitic deposits. Here landslide risk affects housing and public infrastructure (e.g. the SP34, SP38 and SP102 highways), and the unstable slopes extend up to a gravity masonry dam, requiring continuous rebuilding of infrastructure in the portions with the highest displacement rates. The ISBAS results for ascending geometry entail an unprecedented 685,518 points in a 90 km2 processing area, with Line Of Sight velocities ranging between -6.4 mm/yr (away from the sensor) and +6.6 mm/yr (towards the sensor). The ISBAS results detect a total of 89 mapped landslides, of which 23 were not previously recorded. The analysis of ISBAS COSMO-SkyMed time-series allows us to compare the deformation trends to rainfall events, and to evaluate the correlation between heavy or prolonged rainfall and accelerations in the ground motion histories of the identified landslides. Sowter A., Bateson L., Strange P., Ambrose K., Syafiudin M.F., 2013. DInSAR estimation of land motion using intermittent coherence with application to the South Derbyshire and Leicestershire coalfields. Remote Sensing Letters, v. 4: 979-987.

  4. Clustering, Seriation, and Subset Extraction of Confusion Data

    ERIC Educational Resources Information Center

    Brusco, Michael J.; Steinley, Douglas

    2006-01-01

    The study of confusion data is a well established practice in psychology. Although many types of analytical approaches for confusion data are available, among the most common methods are the extraction of 1 or more subsets of stimuli, the partitioning of the complete stimulus set into distinct groups, and the ordering of the stimulus set. Although…

  5. Generalized Lotka—Volterra systems connected with simple Lie algebras

    NASA Astrophysics Data System (ADS)

    Charalambides, Stelios A.; Damianou, Pantelis A.; Evripidou, Charalambos A.

    2015-06-01

    We devise a new method for producing Hamiltonian systems by constructing the corresponding Lax pairs. This is achieved by considering a larger subset of the positive roots than the simple roots of the root system of a simple Lie algebra. We classify all subsets of the positive roots of the root system of type An for which the corresponding Hamiltonian systems are transformed, via a simple change of variables, to Lotka-Volterra systems. For some special cases of subsets of the positive roots of the root system of type An, we produce new integrable Hamiltonian systems.

  6. NASA Johnson Space Center: Mini AERCam Testing with GSS6560

    NASA Technical Reports Server (NTRS)

    Cryant, Scott P.

    2004-01-01

This slide presentation reviews the testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) with the GPS/SBAS simulation system, GSS6560. It lists several GPS-based programs at NASA Johnson, including Shuttle testing of the GPS system and Space Integrated GPS/INS (SIGI) testing. There is also information about the standalone ISS SIGI test, and testing of the SIGI for the Crew Return Vehicle. The Mini AERCam is a small, free-flying camera for remote inspections of the ISS; it uses precise relative navigation with differential carrier-phase GPS to provide situational awareness to operators. The closed-loop orbital testing of the Mini AERCam system, with and without the use of the GSS6560 system, is reviewed.

  7. Choosing non-redundant representative subsets of protein sequence data sets using submodular optimization.

    PubMed

    Libbrecht, Maxwell W; Bilmes, Jeffrey A; Noble, William Stafford

    2018-04-01

    Selecting a non-redundant representative subset of sequences is a common step in many bioinformatics workflows, such as the creation of non-redundant training sets for sequence and structural models or selection of "operational taxonomic units" from metagenomics data. Previous methods for this task, such as CD-HIT, PISCES, and UCLUST, apply a heuristic threshold-based algorithm that has no theoretical guarantees. We propose a new approach based on submodular optimization. Submodular optimization, a discrete analogue to continuous convex optimization, has been used with great success for other representative set selection problems. We demonstrate that the submodular optimization approach results in representative protein sequence subsets with greater structural diversity than sets chosen by existing methods, using as a gold standard the SCOPe library of protein domain structures. In this setting, submodular optimization consistently yields protein sequence subsets that include more SCOPe domain families than sets of the same size selected by competing approaches. We also show how the optimization framework allows us to design a mixture objective function that performs well for both large and small representative sets. The framework we describe is the best possible in polynomial time (under some assumptions), and it is flexible and intuitive because it applies a suite of generic methods to optimize one of a variety of objective functions. © 2018 Wiley Periodicals, Inc.
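The standard greedy algorithm for monotone submodular maximization, which underlies approaches like the one this abstract describes, can be sketched for a facility-location objective as follows; the similarity kernel and sizes are illustrative assumptions, and this is not the authors' implementation:

```python
import numpy as np

def greedy_facility_location(sim, k):
    """Greedy maximization of the monotone submodular facility-location
    objective f(S) = sum_i max_{j in S} sim[i, j]; the greedy solution is
    guaranteed to be within (1 - 1/e) of the optimum."""
    n = sim.shape[0]
    coverage = np.zeros(n)   # best similarity to any chosen representative so far
    selected = []
    for _ in range(k):
        # Marginal gain of each candidate j: sum_i max(sim[i, j] - coverage[i], 0).
        gains = np.maximum(sim - coverage[:, None], 0.0).sum(axis=0)
        gains[selected] = -np.inf          # never re-pick a representative
        best = int(np.argmax(gains))
        selected.append(best)
        coverage = np.maximum(coverage, sim[:, best])
    return selected

rng = np.random.default_rng(3)
# Synthetic stand-in for pairwise sequence similarities (symmetric, self-sim = 1).
x = rng.normal(size=(50, 10))
sim = np.exp(-np.linalg.norm(x[:, None] - x[None, :], axis=-1) ** 2 / 20.0)

reps = greedy_facility_location(sim, k=5)  # indices of 5 representatives
```

The mixture objective mentioned in the abstract would replace the single facility-location function with a weighted sum of submodular terms; the greedy loop itself is unchanged.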

  8. Wab-InSAR: a new wavelet based InSAR time series technique applied to volcanic and tectonic areas

    NASA Astrophysics Data System (ADS)

    Walter, T. R.; Shirzaei, M.; Nankali, H.; Roustaei, M.

    2009-12-01

Modern geodetic techniques such as InSAR and GPS provide valuable observations of the deformation field. Because of the variety of environmental interferences (e.g., atmosphere, topography distortion) and the incompleteness of the models (e.g., the assumption of a linear deformation model), those observations are usually tainted by various systematic and random errors. Therefore we develop and test new methods to identify and filter unwanted periodic or episodic artifacts to obtain accurate and precise deformation measurements. Here we present and implement a new wavelet-based InSAR (Wab-InSAR) time series approach. Because wavelets are excellent tools for identifying hidden patterns and capturing transient signals, we utilize wavelet functions to reduce the effects of atmospheric delay and digital elevation model inaccuracies. Wab-InSAR is a model-free technique, reducing digital elevation model errors in individual interferograms using a 2D spatial Legendre polynomial wavelet filter. Atmospheric delays are reduced using a 3D spatio-temporal wavelet transform algorithm and a novel technique for pixel selection. We apply Wab-InSAR to several targets, including volcano deformation processes on Hawaii Island and mountain building processes in Iran. Both targets are chosen to investigate large and small amplitude signals, variable and complex topography, and atmospheric effects. In this presentation we explain the different steps of the technique, validate the results by comparison to other high resolution processing methods (GPS, PS-InSAR, SBAS), and discuss the geophysical results.

  9. Analysis of Critical Earth Observation Priorities for Societal Benefit

    NASA Astrophysics Data System (ADS)

    Zell, E. R.; Huff, A. K.; Carpenter, A. T.; Friedl, L.

    2011-12-01

    To ensure that appropriate near real-time (NRT) and historical Earth observation data are available to benefit society and meet end-user needs, the Group on Earth Observations (GEO) sponsored a multi-disciplinary study to identify a set of critical and common Earth observations associated with 9 Societal Benefit Areas (SBAs): Agriculture, Biodiversity, Climate, Disasters, Ecosystems, Energy, Health, Water, and Weather. GEO is an intergovernmental organization working to improve the availability, access, and use of Earth observations to benefit society through a Global Earth Observation System of Systems (GEOSS). The study, overseen by the GEO User Interface Committee, focused on the "demand" side of Earth observation needs: which users need what types of data, and when? The methodology for the study was a meta-analysis of over 1,700 publicly available documents addressing Earth observation user priorities, under the guidance of expert advisors from around the world. The result was a ranking of 146 Earth observation parameters that are critical and common to multiple SBAs, based on an ensemble of 4 statistically robust methods. Within the results, key details emerged on NRT observations needed to serve a broad community of users. The NRT observation priorities include meteorological parameters, vegetation indices, land cover and soil property observations, water body and snow cover properties, and atmospheric composition. The results of the study and examples of NRT applications will be presented. The applications are as diverse as the list of priority parameters. For example, NRT meteorological and soil moisture information can support monitoring and forecasting for more than 25 infectious diseases, including epidemic diseases, such as malaria, and diseases of major concern in the U.S., such as Lyme disease. 
Quickly evolving events that impact forests, such as fires and insect outbreaks, can be monitored and forecasted with a combination of vegetation indices, fuel moisture content, burn scars, and meteorological parameters. Impacts to public health and livelihoods due to food insecurity, algal blooms, and air pollution can be addressed through NRT monitoring of specific events utilizing land cover, atmospheric composition, water quality, and meteorological observations. More broadly, the assessment of water availability for drinking and agriculture and the development of floods and storms rely on continuous feeds of NRT meteorological and atmospheric composition observations. Overall, this multi-disciplinary study of user needs for NRT data and products can inform the design and operation of NRT data systems. Follow-on work for this study will also be presented, focusing on the availability of current and future satellite measurements (including NRT) of the 30 most critical Earth observation priorities, as well as a detailed analysis of users' needs for precipitation data. The results of this study summarize the priorities for critical Earth observations utilized globally for societal benefit.

  10. A New Approach for Identifying Ionospheric Gradients in the Context of the Gagan System

    NASA Astrophysics Data System (ADS)

    Kudala, Ravi Chandra

    2012-10-01

The Indian Space Research Organization and the Airports Authority of India are jointly implementing the Global Positioning System (GPS) aided GEO Augmented Navigation (GAGAN) system in order to meet the following required navigation performance (RNP) parameters for aircraft operations: integrity, continuity, accuracy, and availability. Such a system provides the user with orbit, clock, and ionospheric corrections in addition to ranging signals via the geostationary earth orbit satellite (GEOSAT). The equatorial ionization anomaly (EIA), due to rapid non-uniform electron-ion recombination that persists over the Indian subcontinent, causes ionospheric gradients. Ionospheric gradients represent the most severe threat to high-integrity differential GNSS systems such as GAGAN. In order to ensure integrity under conditions of an ionospheric storm, the following three objectives must be met: careful monitoring, error bounding, and sophisticated storm-front modeling. The first objective is met by continuously tracking data during storms and, on quiet days, determining precise estimates of the threat parameters from reference monitoring stations. The second objective is met by bounding the above estimates of storm-related threat parameters with maximum and minimum typical thresholds. In the context of GAGAN, this work proposes a new method for identifying ionospheric gradients, in addition to determining an appropriate upper bound, in order to sufficiently characterize error during storm days. Initially, carrier phase data of the GAGAN network from Indian TEC stations for both storm and quiet days were used to estimate ionospheric spatial and temporal gradients (the vertical ionospheric gradient (σVIG) and the rate of the TEC index (ROTI), respectively) in multiple viewing directions. Along similar lines, using the carrier to noise ratio (C/N0) for the same data, the carrier to noise ratio index (σCNRI) was derived. 
Subsequently, the one-to-one relationship between σVIG and σCNRI was examined. High values of σVIG were obtained for strong noise signals and corresponded to minimal σCNRI, indicating poor phase estimations and, in turn, an erroneous location. On the other hand, low values of σVIG were produced for weak noise signals and corresponded to maximum σCNRI, indicating strong phase estimations and, in turn, accurate locations. In other words, if a gradient persists in the line-of-sight direction of the GEOSAT for aviation users, the downlink L-band signal itself becomes erroneous. As a result, the en-route aviation user fails to receive an SBAS correction message, defeating the main objective of GAGAN. On the other hand, since the proposed approach enhances the performance of the receivers of both the aviation user and the reference monitoring station based on σCNRI, the integrity of the SBAS messages themselves can be analyzed and considered for forward corrections.

  11. Subsidence monitoring and prediction of high-speed railway in Beijing with multitemporal TerraSAR-X data

    NASA Astrophysics Data System (ADS)

    Fan, Zelin; Zhang, Yonghong; Wu, Hong'an; Kang, Yonghui; Jiang, Decai

    2018-02-01

The uneven settlement of high-speed railway (HSR) poses a great threat to the safe operation of trains; subsidence monitoring and prediction of HSR therefore has important significance. In this paper, an improved multitemporal InSAR method combining PS-InSAR and SBAS-InSAR, Multiple-master Coherent Target Small-Baseline InSAR (MCTSB-InSAR), is used to monitor the subsidence of partial sections of the Beijing-Tianjin HSR (BTHSR) and the Beijing-Shanghai HSR (BSHSR) in the Beijing area. Thirty-one TerraSAR-X images from June 2011 to December 2016 are processed with MCTSB-InSAR, and the subsidence information of a 56 km × 32 km region in Beijing is extracted. Moreover, the monitoring results are validated by the leveling measurements in this area, with an accuracy of 4.4 mm/year. On the basis of the above work, we extract the subsidence information of the partial sections of BTHSR and BSHSR in the research area. Finally, we adopt the idea of timing analysis and employ a back-propagation (BP) neural network to simulate the relationship between former settlement and current settlement. Training data sets and test data sets are constructed respectively based on the monitoring results. The experimental results show that the prediction model has good prediction accuracy and applicability.
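The timing-analysis idea of mapping former settlement to current settlement can be sketched with a small back-propagation network. The following uses scikit-learn's MLPRegressor on synthetic settlement curves; the window length, layer size, and data are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
# Synthetic stand-in for InSAR time series of cumulative settlement (mm),
# one noisy linear-subsidence curve per monitored point along the railway.
t = np.arange(32, dtype=float)
series = np.array([-(3.0 + rng.uniform(0, 2)) * t + rng.normal(0, 1.5, t.size)
                   for _ in range(40)])

# Build (former settlement window -> current settlement) training pairs.
window = 4
X = np.array([s[i:i + window] for s in series for i in range(len(s) - window)])
y = np.array([s[i + window] for s in series for i in range(len(s) - window)])

# A small back-propagation network fitted to the timing relationship.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(X[:-100], y[:-100])
pred = model.predict(X[-100:])  # held-out predictions of the next epoch
```

In practice the training and test sets would come from the MCTSB-InSAR monitoring results rather than simulated curves.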

  12. A Microstrip Patch-Fed Short Backfire Antenna for the Tracking and Data Relay Satellite System-Continuation (TDRSS-C) Multiple Access (MA) Array

    NASA Technical Reports Server (NTRS)

    Nessel, James A.; Kory, Carol L.; Lambert, Kevin M.; Acosta, Roberto J.

    2006-01-01

Short Backfire Antennas (SBAs) are widely utilized for mobile satellite communications, tracking, telemetry, and wireless local area network (WLAN) applications due to their compact structure and excellent radiation characteristics [1-3]. Typically, these SBAs consist of an excitation element (i.e., a half-wavelength dipole), a reflective bottom plane, a planar sub-reflector located above the "exciter", and an outer circular rim. This configuration is capable of achieving gains on the order of 13-15 dBi, but with relatively narrow bandwidths (approx. 3%-5%), making it incompatible with the requirements of the next generation enhanced Tracking and Data Relay Satellite System-Continuation (TDRSS-C) Multiple Access (MA) array [1]. Several attempts have been made to enhance the bandwidth performance of the common dipole-fed SBA by employing various other feeding mechanisms (e.g., waveguide, slot) with moderate success [4-5]. In this paper, a novel method of using a microstrip patch is employed for the first time to excite an SBA. The patch element is fed via two H-shaped slots electromagnetically coupled to a broadband hybrid coupler to maintain a wide bandwidth, as well as provide for dual circular polarization capabilities.

  13. Evaluating the effect of spatial subsetting on subpixel unmixing methodology applied to ASTER over a hydrothermally altered terrain

    NASA Astrophysics Data System (ADS)

    Ayoobi, Iman; Tangestani, Majid H.

    2017-10-01

This study investigates the effect of spatial subsets of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) L1B visible-near infrared and short wave-infrared (VNIR-SWIR) data on matched filtering results at the central part of the Kerman magmatic arc, where abundant porphyry copper deposits exist. The matched filtering (MF) procedure was run separately at sites containing hydrothermal minerals such as sericite, kaolinite, chlorite, and jarosite to map the abundances of these minerals on spatial subsets containing 100, 75, 50, and 25 percent of the original scene. Results were evaluated by comparing the matched filtering scores with the mineral abundances obtained by semi-quantitative XRD analysis of corresponding field samples. It was concluded that the MF method should be applied to the whole scene prior to any data subsetting.

  14. A parallel decision tree-based method for user authentication based on keystroke patterns.

    PubMed

    Sheng, Yong; Phoha, Vir V; Rovnyak, Steven M

    2005-08-01

    We propose a Monte Carlo approach to attain sufficient training data, a splitting method to improve effectiveness, and a system composed of parallel decision trees (DTs) to authenticate users based on keystroke patterns. For each user, approximately 19 times as much simulated data was generated to complement the 387 vectors of raw data. The training set, including raw and simulated data, is split into four subsets. For each subset, wavelet transforms are performed to obtain a total of eight training subsets for each user. Eight DTs are thus trained using the eight subsets. A parallel DT is constructed for each user, which contains all eight DTs with a criterion for its output that it authenticates the user if at least three DTs do so; otherwise it rejects the user. Training and testing data were collected from 43 users who typed the exact same string of length 37 nine consecutive times to provide data for training purposes. The users typed the same string at various times over a period from November through December 2002 to provide test data. The average false reject rate was 9.62% and the average false accept rate was 0.88%.
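The parallel decision-tree vote this abstract describes (accept iff at least three of eight trees accept) can be sketched as follows; the keystroke data are synthetic, and the eight subsets are random slices rather than the paper's four splits with wavelet-transformed variants:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
# Synthetic stand-in for keystroke timing vectors (length 37, as in the paper):
# class 1 = genuine user, class 0 = impostors.
genuine = rng.normal(0.0, 1.0, size=(400, 37))
impostor = rng.normal(0.8, 1.2, size=(400, 37))
X = np.vstack([genuine, impostor])
y = np.array([1] * 400 + [0] * 400)

# Eight decision trees, each trained on a different subset of the training data.
order = rng.permutation(len(X))
trees = [DecisionTreeClassifier(random_state=i).fit(X[order[i::8]], y[order[i::8]])
         for i in range(8)]

def authenticate(sample, threshold=3):
    """Accept the user iff at least `threshold` of the eight trees accept."""
    votes = sum(int(t.predict(sample[None, :])[0]) for t in trees)
    return votes >= threshold

accepted = authenticate(genuine[0])  # vote of the parallel decision tree
```

The at-least-three-of-eight criterion trades a slightly higher false reject rate for a low false accept rate, which matches the asymmetry reported in the abstract.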

  15. In Silico Syndrome Prediction for Coronary Artery Disease in Traditional Chinese Medicine

    PubMed Central

    Lu, Peng; Chen, Jianxin; Zhao, Huihui; Gao, Yibo; Luo, Liangtao; Zuo, Xiaohan; Shi, Qi; Yang, Yiping; Yi, Jianqiang; Wang, Wei

    2012-01-01

    Coronary artery disease (CAD) is a leading cause of death worldwide. The differentiation of syndrome (ZHENG) is the criterion for diagnosis and therapy in traditional Chinese medicine (TCM); in silico syndrome prediction can therefore improve treatment performance. In this paper, we present a Bayesian network framework to construct a high-confidence syndrome predictor based on the optimum symptom subset selected by Support Vector Machine (SVM) feature selection. Syndromes of CAD can be divided into asthenia and sthenia syndromes. Following this hierarchical characteristic of syndromes, we first label every case with one of three syndrome types (asthenia, sthenia, or both), which accommodates patients presenting several syndromes. On the basis of these three syndrome classes, we apply SVM feature selection to obtain the optimum symptom subset and compare it against Markov blanket feature selection using ROC analysis. Using this subset, six predictors of CAD syndromes are constructed with the Bayesian network technique; Naïve Bayes, C4.5, logistic regression, and radial basis function (RBF) network classifiers are evaluated for comparison. In conclusion, the Bayesian network method based on the optimum symptom subset offers a practical way to predict the six syndromes of CAD in TCM. PMID:22567030

  16. An exploration of the properties of the CORE problem list subset and how it facilitates the implementation of SNOMED CT

    PubMed Central

    Xu, Julia

    2015-01-01

    Objective Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) is the emergent international health terminology standard for encoding clinical information in electronic health records. The CORE Problem List Subset was created to facilitate the terminology’s implementation. This study evaluates the CORE Subset’s coverage and examines its growth pattern as source datasets are being incorporated. Methods Coverage of frequently used terms and the corresponding usage of the covered terms were assessed by “leave-one-out” analysis of the eight datasets constituting the current CORE Subset. The growth pattern was studied using a retrospective experiment, growing the Subset one dataset at a time and examining the relationship between the size of the starting subset and the coverage of frequently used terms in the incoming dataset. Linear regression was used to model that relationship. Results On average, the CORE Subset covered 80.3% of the frequently used terms of the left-out dataset, and the covered terms accounted for 83.7% of term usage. There was a significant positive correlation between the CORE Subset’s size and the coverage of the frequently used terms in an incoming dataset. This implies that the CORE Subset will grow at a progressively slower pace as it gets bigger. Conclusion The CORE Problem List Subset is a useful resource for the implementation of Systematized Nomenclature of Medicine Clinical Terms in electronic health records. It offers good coverage of frequently used terms, which account for a high proportion of term usage. If future datasets are incorporated into the CORE Subset, it is likely that its size will remain small and manageable. PMID:25725003
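    The two coverage figures reported above (fraction of frequently used terms covered, and the share of term usage those covered terms account for) can be sketched as follows; the concept codes and counts are hypothetical:

    ```python
    def coverage_metrics(core_subset, dataset_usage):
        """Leave-one-out style coverage of a left-out dataset.

        core_subset:   set of concept codes in the CORE-style subset
        dataset_usage: dict mapping each frequently used code in the
                       left-out dataset to its usage count
        Returns (fraction of codes covered,
                 fraction of total usage accounted for by covered codes).
        """
        covered = set(dataset_usage) & core_subset
        term_coverage = len(covered) / len(dataset_usage)
        usage_coverage = (sum(dataset_usage[c] for c in covered)
                          / sum(dataset_usage.values()))
        return term_coverage, usage_coverage

    core = {"C1", "C2", "C3"}
    usage = {"C1": 80, "C2": 15, "C4": 5}   # "C4" is not in the subset
    print(coverage_metrics(core, usage))
    ```

    In this toy example two of the three frequently used codes are covered, but they account for 95% of usage, mirroring the pattern the study reports.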

  17. Sulcal set optimization for cortical surface registration.

    PubMed

    Joshi, Anand A; Pantazis, Dimitrios; Li, Quanzheng; Damasio, Hanna; Shattuck, David W; Toga, Arthur W; Leahy, Richard M

    2010-04-15

    Flat-mapping-based cortical surface registration constrained by manually traced sulcal curves has been widely used for inter-subject comparisons of neuroanatomical data. Even for an experienced neuroanatomist, manual sulcal tracing can be quite time-consuming, with the cost increasing with the number of sulcal curves used for registration. We present a method for estimating an optimal subset of size N(C) from N possible candidate sulcal curves that minimizes a mean squared error metric over all combinations of N(C) curves. The resulting procedure allows us to estimate a subset with a reduced number of curves to be traced as part of the registration procedure, leading to optimal use of manual labeling effort for registration. To minimize the error metric, we analyze the correlation structure of the errors in the sulcal curves by modeling them as a multivariate Gaussian distribution. For a given subset of sulci used as constraints in surface registration, the proposed model estimates registration error based on the correlation structure of the sulcal errors. The optimal subset of constraint curves consists of the N(C) sulci that jointly minimize the estimated error variance for the subset of unconstrained curves conditioned on the N(C) constraint curves. The optimal subsets of sulci are presented, and the estimated and actual registration errors for these subsets are computed. Copyright 2009 Elsevier Inc. All rights reserved.
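    A simplified sketch of the selection principle, restricted to a single constraint curve so that the Gaussian conditioning stays scalar (the paper conditions on a full subset of N(C) curves via matrix algebra; the covariance values here are made up):

    ```python
    def best_single_constraint(cov):
        """Pick the one curve whose use as a registration constraint minimizes
        the summed conditional error variance of the remaining curves.

        cov: symmetric covariance matrix (list of lists) of sulcal errors.
        For jointly Gaussian errors, var(u | c) = cov[u][u] - cov[u][c]**2 / cov[c][c].
        """
        n = len(cov)

        def residual(c):
            return sum(cov[u][u] - cov[u][c] ** 2 / cov[c][c]
                       for u in range(n) if u != c)

        return min(range(n), key=residual)

    # Curve 0 correlates strongly with both other curves, so constraining it
    # explains the most error variance in the unconstrained curves.
    cov = [[1.0, 0.8, 0.7],
           [0.8, 1.0, 0.1],
           [0.7, 0.1, 1.0]]
    print(best_single_constraint(cov))  # 0
    ```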

  18. Selecting predictors for discriminant analysis of species performance: an example from an amphibious softwater plant.

    PubMed

    Vanderhaeghe, F; Smolders, A J P; Roelofs, J G M; Hoffmann, M

    2012-03-01

    Selecting an appropriate variable subset in linear multivariate methods is an important methodological issue for ecologists. Interest often exists in obtaining general predictive capacity or in finding causal inferences from predictor variables. Because of a lack of solid knowledge on a studied phenomenon, scientists explore predictor variables in order to find the most meaningful (i.e. discriminating) ones. As an example, we modelled the response of the amphibious softwater plant Eleocharis multicaulis using canonical discriminant function analysis. We asked how variables can be selected through comparison of several methods: univariate Pearson chi-square screening, principal components analysis (PCA) and step-wise analysis, as well as combinations of some methods. We expected PCA to perform best. The selected methods were evaluated through the fit and stability of the resulting discriminant functions and through correlations between these functions and the predictor variables. The chi-square subset, at P < 0.05, followed by a step-wise sub-selection, gave the best results. Contrary to expectations, PCA performed poorly, as did step-wise analysis. The different chi-square subset methods all yielded ecologically meaningful variables, while probable noise variables were also selected by PCA and step-wise analysis. We advise against the simple use of PCA or step-wise discriminant analysis to obtain an ecologically meaningful variable subset; the former because it does not take the response variable into account, the latter because noise variables are likely to be selected. We suggest that univariate screening techniques are a worthwhile alternative for variable selection in ecology. © 2011 German Botanical Society and The Royal Botanical Society of the Netherlands.
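    The univariate chi-square screening step can be sketched with the standard 2x2 Pearson statistic, compared against the df = 1 critical value for P < 0.05 (3.841); the predictor names and contingency tables are made up for illustration:

    ```python
    def chi_square_2x2(a, b, c, d):
        """Pearson chi-square statistic for a 2x2 contingency table
        [[a, b], [c, d]] (species presence/absence vs. predictor high/low)."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    def screen(predictors, critical=3.841):
        """Keep predictors whose statistic exceeds the df = 1
        critical value for P < 0.05."""
        return [name for name, table in predictors.items()
                if chi_square_2x2(*table) > critical]

    predictors = {"pH": (30, 10, 10, 30),      # strongly associated
                  "noise": (20, 20, 21, 19)}   # essentially random
    print(screen(predictors))  # ['pH']
    ```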

  19. Impacts and societal benefits of research activities at Summit Station, Greenland

    NASA Astrophysics Data System (ADS)

    Hawley, R. L.; Burkhart, J. F.; Courville, Z.; Dibb, J. E.; Koenig, L.; Vaughn, B. H.

    2017-12-01

    Summit Station began as the site of the Greenland Ice Sheet Project 2 ice core in 1989. Since then, it has hosted both summer campaign science and, since 1997, year-round observations of atmospheric and cryospheric processes. The station has been continuously occupied since 2003. While most of the science activities at the station are supported by the US NSF Office of Polar Programs, the station also hosts many interagency and international investigations in physical glaciology, atmospheric chemistry, satellite validation, astrophysics and other disciplines. Summit is the only high-elevation observatory north of the Arctic Circle that can provide clean air or snow sites. The station is part of the INTERACT consortium of Arctic research stations, whose main objective is to identify, understand, predict and respond to diverse environmental changes, and part of the International Arctic Systems for Observing the Atmosphere (IASOA), which coordinates Arctic research activities and provides a networked, observations-based view of the Arctic. The Summit Station Science Summit, sponsored by NSF, assembled a multidisciplinary group of scientists to review Summit Station science, define the leading research questions for Summit, and make community-based recommendations for future science goals and governance for Summit. The impact of several ongoing observation records was summarized in the report "Sustaining the Science Impact of Summit Station, Greenland," including the use of station data in weather forecasts and climate models. Observations made at the station as part of long-term, year-round research or during shorter summer-only campaign seasons contribute to several of the Societal Benefit Areas (SBAs) outlined in the International Arctic Observations Assessment Framework published by the IDA Science and Technology Policy Institute and Sustaining Arctic Observing Networks as an outcome of the 2016 Arctic Science Ministerial. The SBAs supported by research conducted at Summit include Fundamental Understanding of Arctic Systems; Infrastructure and Operations; Terrestrial and Freshwater Ecosystems and Processes; and Weather and Climate. Future efforts at maintaining the station's long-term climate record will focus on these areas, as identified in the Summit Station Science Summit report.

  20. A cluster randomized implementation trial to measure the effectiveness of an intervention package aiming to increase the utilization of skilled birth attendants by women for childbirth: study protocol.

    PubMed

    Bhandari, Gajananda P; Subedi, Narayan; Thapa, Janak; Choulagai, Bishnu; Maskey, Mahesh K; Onta, Sharad R

    2014-03-19

    Nepal is on track to achieve MDG 5, but there is a large sub-national disparity, with high maternal mortality persisting in the western and hilly regions. The national priority is to reduce this disparity so the goal is achieved at the sub-national level. Evidence from developing countries shows that increasing the utilization of skilled attendance at birth is an important means of reducing maternal deaths. Furthermore, utilization of skilled birth attendants (SBAs) during childbirth is very low in the western and hilly regions of Nepal, which clearly indicates barriers to their utilization. There is therefore a need to overcome the identified barriers to increase utilization and thereby decrease maternal mortality. The hypothesis of this study is that a package of interventions will increase the utilization of skilled birth attendants and hence improve maternal health in Nepal. This study involves a cluster randomized controlled trial involving approximately 5000 pregnant women in 36 clusters. The 18 intervention clusters will receive the following interventions: i) mobilization of family support for pregnant women to reach the health facility; ii) availability of emergency funds for institutional childbirth; iii) availability of transport options to reach a health facility for childbirth; iv) training of health workers in communication skills; v) security provisions, through community mobilization, enabling SBAs to reach services 24 hours a day. The 18 control clusters will not receive the intervention package. The final evaluation of the intervention is planned to be completed by October 2014. The primary study output is utilization of SBA services. Secondary study outputs measure the uptake of antenatal care, postnatal checkups for mother and baby, availability of transportation for childbirth, operation of the emergency fund, improved reception of women at health services, and improved physical security of SBAs. The intervention package is designed to increase the utilization of skilled birth attendants by overcoming barriers related to awareness, finance, transport, and security. If the package proves effective, the Ministry of Health has committed to scaling it up throughout the country. ISRCTN78892490.

  1. Uterine Tonus Assessment by Midwives versus Patient self-assessment in the active management of the third stage of labor (UTAMP): study protocol for a randomized controlled trial.

    PubMed

    Browne, Joyce L; Damale, Nelson K R; Raams, Tessa M; Van der Linden, Eva L; Maya, Ernest T; Doe, Roseline; Rijken, Marcus J; Adanu, Richard; Grobbee, Diederick E; Franx, Arie; Klipstein-Grobusch, Kerstin

    2015-12-18

    Postpartum hemorrhage (PPH) is the leading cause of maternal mortality worldwide and accounts for one third of maternal deaths in low-income and middle-income countries. PPH can be prevented by active management of the third stage of labor (AMTSL), a series of steps recommended by the World Health Organization to be performed by skilled birth attendants (SBAs). Task shifting of the AMTSL step of uterotonic drug administration to community health workers, traditional birth attendants and self-administration has been investigated as a strategy to increase access to quality obstetric care, given persistent shortages of SBAs and facility-based delivery. The aim of this study is to assess task shifting in the final step of AMTSL and compare uterine tonus assessment by an SBA to self-assessment. The study is an individual-level two-arm non-inferiority randomized controlled trial (RCT). A total of 800 women will be recruited at Korle Bu Teaching Hospital in Accra, Ghana. Adult women in labor at term with an expected vaginal delivery who received antenatal instructions for self-assessment of uterine tonus will be eligible for inclusion. Women with an increased risk of PPH will be excluded. Women will be randomized to uterine tone assessment by a skilled birth attendant (midwife) or uterine tone self-assessment (with the safety back-up of a midwife present in case of PPH or uterine atony). Postpartum blood loss will be measured by weighing disposable mats. The main study endpoints are PPH (≥500 ml blood loss), severe PPH (≥1000 ml blood loss), mean blood loss, and routine maternal and neonatal outcomes. Participants and caregivers will not be blinded given the nature of the intervention. A reduction of PPH-related maternal mortality requires full implementation of AMTSL. Task shifting of uterine tone assessment may contribute to increased AMTSL implementation in (clinical) settings where SBA capacity is constrained.
Clinicaltrials.gov: NCT02223806 , registration August 2014. PACTR201402000736158 , registration July 2014. University of Ghana, Medical School Ethical and Protocol Review Committee: MS-Et/M.8-P4.1/2014-2015.

  2. Elucidation of Seventeen Human Peripheral Blood B cell Subsets and Quantification of the Tetanus Response Using a Density-Based Method for the Automated Identification of Cell Populations in Multidimensional Flow Cytometry Data

    PubMed Central

    Qian, Yu; Wei, Chungwen; Lee, F. Eun-Hyung; Campbell, John; Halliley, Jessica; Lee, Jamie A.; Cai, Jennifer; Kong, Megan; Sadat, Eva; Thomson, Elizabeth; Dunn, Patrick; Seegmiller, Adam C.; Karandikar, Nitin J.; Tipton, Chris; Mosmann, Tim; Sanz, Iñaki; Scheuermann, Richard H.

    2011-01-01

    Background Advances in multi-parameter flow cytometry (FCM) now allow for the independent detection of larger numbers of fluorochromes on individual cells, generating data with increasingly higher dimensionality. The increased complexity of these data has made it difficult to identify cell populations from high-dimensional FCM data using traditional manual gating strategies based on single-color or two-color displays. Methods To address this challenge, we developed a novel program, FLOCK (FLOw Clustering without K), that uses a density-based clustering approach to algorithmically identify biologically relevant cell populations from multiple samples in an unbiased fashion, thereby eliminating operator-dependent variability. Results FLOCK was used to objectively identify seventeen distinct B cell subsets in a human peripheral blood sample and to identify and quantify novel plasmablast subsets responding transiently to tetanus and other vaccinations in peripheral blood. FLOCK has been implemented in the publicly available Immunology Database and Analysis Portal – ImmPort (http://www.immport.org) for open use by the immunology research community. Conclusions FLOCK is able to identify cell subsets in experiments that use multi-parameter flow cytometry through an objective, automated computational approach. The use of algorithms like FLOCK for FCM data analysis obviates the need for subjective and labor intensive manual gating to identify and quantify cell subsets. Novel populations identified by these computational approaches can serve as hypotheses for further experimental study. PMID:20839340

  3. Validation of the Abdominal Pain Index Using a Revised Scoring Method

    PubMed Central

    Sherman, Amanda L.; Smith, Craig A.; Walker, Lynn S.

    2015-01-01

    Objective Evaluate the psychometric properties of child- and parent-report versions of the four-item Abdominal Pain Index (API) in children with functional abdominal pain (FAP) and healthy controls, using a revised scoring method that facilitates comparisons of scores across samples and time. Methods Pediatric patients aged 8–18 years with FAP and controls completed the API at baseline (N = 1,967); a subset of their parents (N = 290) completed the API regarding the child’s pain. Subsets of patients completed follow-up assessments at 2 weeks (N = 231), 3 months (N = 330), and 6 months (N = 107). Subsets of both patients (N = 389) and healthy controls (N = 172) completed a long-term follow-up assessment (mean age at follow-up = 20.21 years, SD = 3.75). Results The API demonstrated good concurrent, discriminant, and construct validity, as well as good internal consistency. Conclusion We conclude that the API, using the revised scoring method, is a useful, reliable, and valid measure of abdominal pain severity. PMID:25617048

  4. Design of focused and restrained subsets from extremely large virtual libraries.

    PubMed

    Jamois, Eric A; Lin, Chien T; Waldman, Marvin

    2003-11-01

    With the current and ever-growing offering of reagents, along with the vast palette of organic reactions, virtual libraries accessible to combinatorial chemists can reach sizes of billions of compounds or more. Extracting practical-size subsets for experimentation has remained an essential step in the design of combinatorial libraries. A typical approach to computational library design involves enumeration of structures and properties for the entire virtual library, which may be impractical for such large libraries. This study describes a new approach, termed on-the-fly optimization (OTFO), in which descriptors are computed as needed within the subset optimization cycle, without intermediate enumeration of structures. Results reported herein highlight the advantages of coupling an ultra-fast descriptor calculation engine to subset optimization capabilities. We also show that enumeration of properties for the entire virtual library may be not only impractical but also wasteful. Successful design of focused and restrained subsets can be achieved while sampling only a small fraction of the virtual library. We also investigate the stability of the method and compare results obtained from simulated annealing (SA) and genetic algorithms (GA).
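    The core idea (descriptors computed lazily inside a simulated-annealing subset search, never enumerated for the whole library) can be sketched as follows; the library, descriptor, and cost function are toy stand-ins, not the study's actual chemistry:

    ```python
    import math
    import random

    def otfo_subset(library, target, k, steps=500, seed=0):
        """Simulated-annealing subset selection with on-the-fly descriptors:
        a compound's descriptor is computed (and cached) only when a candidate
        subset first needs it.

        library: dict name -> zero-argument callable returning the descriptor
        target:  desired mean descriptor value for the k-member subset
        """
        rng = random.Random(seed)
        cache = {}

        def desc(name):
            if name not in cache:
                cache[name] = library[name]()  # evaluated on demand
            return cache[name]

        names = list(library)

        def cost(subset):
            return abs(sum(desc(n) for n in subset) / k - target)

        current = rng.sample(names, k)
        best, best_cost = current[:], cost(current)
        temp = 1.0
        for _ in range(steps):
            candidate = current[:]
            candidate[rng.randrange(k)] = rng.choice(names)
            if len(set(candidate)) < k:   # reject duplicate members
                continue
            delta = cost(candidate) - cost(current)
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                current = candidate
            if cost(current) < best_cost:
                best, best_cost = current[:], cost(current)
            temp *= 0.99
        return sorted(best), len(cache)

    # Toy library of 50 "compounds" whose descriptor is just their index;
    # only the compounds the search actually touches get evaluated.
    library = {"cpd%d" % i: (lambda v=i: float(v)) for i in range(50)}
    subset, n_evaluated = otfo_subset(library, target=25.0, k=4)
    ```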

  5. Classification of urine sediment based on convolution neural network

    NASA Astrophysics Data System (ADS)

    Pan, Jingjing; Jiang, Cunbo; Zhu, Tiantian

    2018-04-01

    By designing a new convolutional neural network framework, this paper removes the constraints of the original convolutional neural network framework, which requires large training samples of uniform size. The input images are shifted and cropped to generate sub-graphs of the same size. Dropout is then applied to the generated sub-graphs, increasing the diversity of the samples and preventing overfitting. Proper subsets are randomly selected from the sub-graph set such that every subset contains the same number of elements but no two subsets are identical. The proper subsets are used as input layers for the convolutional neural network. Through the convolution layers, pooling, the fully connected layer and the output layer, the classification loss rates of the test and training sets are obtained. In an experiment classifying red blood cells, white blood cells, and calcium oxalate crystals, a classification accuracy of 97% or more was achieved.

  6. Stochastic subset selection for learning with kernel machines.

    PubMed

    Rhinelander, Jason; Liu, Xiaoping P

    2012-06-01

    Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales super-linearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm works by using a stochastic indexing technique when selecting a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
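    A minimal sketch of evaluating a kernel expansion over a stochastically indexed subset of the stored support vectors (a toy 1-D RBF kernel; the paper's actual selection strategy is more elaborate, and all data here are illustrative):

    ```python
    import math
    import random

    def rbf(a, b, gamma=0.5):
        """Toy 1-D radial basis function kernel."""
        return math.exp(-gamma * (a - b) ** 2)

    def stochastic_expansion(x, support_vectors, alphas, subset_size, rng):
        """Evaluate f(x) = sum_i alpha_i * k(s_i, x) over a randomly
        sampled subset of the stored SVs, trading a little accuracy
        for per-sample computation time."""
        idx = rng.sample(range(len(support_vectors)), subset_size)
        return sum(alphas[i] * rbf(support_vectors[i], x) for i in idx)

    svs = [0.0, 1.0, 2.0, 3.0]
    alphas = [0.5, -0.2, 0.3, 0.1]
    rng = random.Random(42)

    # Using all SVs reproduces the full expansion; a smaller subset
    # gives a cheaper stochastic approximation.
    full = stochastic_expansion(1.5, svs, alphas, subset_size=len(svs), rng=rng)
    approx = stochastic_expansion(1.5, svs, alphas, subset_size=2, rng=rng)
    ```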

  7. Method and system of filtering and recommending documents

    DOEpatents

    Patton, Robert M.; Potok, Thomas E.

    2016-02-09

    Disclosed is a method and system for discovering documents using a computer and providing a small set of the most relevant documents to the attention of a human observer. Using the method, the computer obtains a seed document from the user and generates a seed document vector using term frequency-inverse corpus frequency weighting. A keyword index for a plurality of source documents can be compared with the weighted terms of the seed document vector. The comparison is then filtered to reduce the number of documents, which define an initial subset of the source documents. Initial subset vectors are generated and compared to the seed document vector to obtain a similarity value for each comparison. Based on the similarity value, the method then recommends one or more of the source documents.
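    The pipeline described above (seed-vector weighting followed by similarity-based recommendation) can be sketched as follows, with term frequency-inverse corpus frequency approximated as tf x log(N/df); the corpus, seed terms, and function names are illustrative, not from the patent:

    ```python
    import math
    from collections import Counter

    def seed_vector(seed_terms, corpus):
        """Weight each seed term by tf(term) * log(N / df(term)),
        where df counts the corpus documents containing the term."""
        n_docs = len(corpus)
        tf = Counter(seed_terms)
        df = Counter(t for doc in corpus for t in set(doc))
        return {t: tf[t] * math.log(n_docs / df[t]) for t in tf if df[t]}

    def cosine(vec, doc):
        """Cosine similarity between the weighted seed vector and a document."""
        dtf = Counter(doc)
        dot = sum(w * dtf[t] for t, w in vec.items())
        if not dot:
            return 0.0
        norm_v = math.sqrt(sum(w * w for w in vec.values()))
        norm_d = math.sqrt(sum(c * c for c in dtf.values()))
        return dot / (norm_v * norm_d)

    def recommend(seed, corpus, top_n=1):
        """Rank source documents by similarity to the seed document."""
        vec = seed_vector(seed, corpus)
        ranked = sorted(range(len(corpus)),
                        key=lambda i: cosine(vec, corpus[i]), reverse=True)
        return ranked[:top_n]

    corpus = [["insar", "deformation", "sbas"],
              ["football", "match", "score"],
              ["insar", "subsidence"]]
    seed = ["insar", "deformation", "deformation"]
    print(recommend(seed, corpus))  # [0]
    ```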

  8. Linear methods for reducing EMG contamination in peripheral nerve motor decodes.

    PubMed

    Kagan, Zachary B; Wendelken, Suzanne; Page, David M; Davis, Tyler; Hutchinson, Douglas T; Clark, Gregory A; Warren, David J

    2016-08-01

    Signals recorded from the peripheral nervous system (PNS) with high channel count penetrating microelectrode arrays, such as the Utah Slanted Electrode Array (USEA), often have electromyographic (EMG) signals contaminating the neural signal. This common-mode signal source may prevent single neural units from successfully being detected, thus hindering motor decode algorithms. Reducing this EMG contamination may lead to more accurate motor decode performance. A virtual reference (VR), created by a weighted linear combination of signals from a subset of all available channels, can be used to reduce this EMG contamination. Four methods of determining individual channel weights and six different methods of selecting subsets of channels were investigated (24 different VR types in total). The methods of determining individual channel weights were equal weighting, regression-based weighting, and two different proximity-based weightings. The subsets of channels were selected by a radius-based criteria, such that a channel was included if it was within a particular radius of inclusion from the target channel. These six radii of inclusion were 1.5, 2.9, 3.2, 5, 8.4, and 12.8 electrode-distances; the 12.8 electrode radius includes all USEA electrodes. We found that application of a VR improves the detectability of neural events via increasing the SNR, but we found no statistically meaningful difference amongst the VR types we examined. The computational complexity of implementation varies with respect to the method of determining channel weights and the number of channels in a subset, but does not correlate with VR performance. Hence, we examined the computational costs of calculating and applying the VR and based on these criteria, we recommend an equal weighting method of assigning weights with a 3.2 electrode-distance radius of inclusion. Further, we found empirically that application of the recommended VR will require less than 1 ms for 33.3 ms of data from one USEA.
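    The recommended configuration (equal weighting with a 3.2 electrode-distance radius of inclusion) can be sketched on a 10 x 10 grid; the grid indexing and signals below are assumptions for illustration only:

    ```python
    import math

    GRID = 10  # the USEA is a 10 x 10 grid of electrodes

    def channels_within(target, radius):
        """Indices of channels whose grid distance from `target` is at most
        `radius` (in electrode distances), excluding the target itself."""
        tr, tc = divmod(target, GRID)
        return [ch for ch in range(GRID * GRID)
                if ch != target
                and math.hypot(divmod(ch, GRID)[0] - tr,
                               divmod(ch, GRID)[1] - tc) <= radius]

    def apply_virtual_reference(signals, target, radius=3.2):
        """Subtract an equal-weighted virtual reference (the mean of the
        channels inside the radius of inclusion) from the target channel.

        signals: list of per-channel sample lists, all the same length.
        """
        refs = channels_within(target, radius)
        out = []
        for t in range(len(signals[target])):
            vr = sum(signals[ch][t] for ch in refs) / len(refs)
            out.append(signals[target][t] - vr)
        return out

    # A purely common-mode (EMG-like) signal is cancelled exactly:
    common = [5.0, -3.0, 2.0]
    signals = [common[:] for _ in range(GRID * GRID)]
    print(apply_virtual_reference(signals, target=45))  # [0.0, 0.0, 0.0]
    ```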

  9. Column Subset Selection, Matrix Factorization, and Eigenvalue Optimization

    DTIC Science & Technology

    2008-07-01

    Pietsch and Grothendieck, which are regarded as basic instruments in modern functional analysis [Pis86]. • The methods for computing these... Pietsch factorization and the maxcut semi- definite program [GW95]. 1.2. Overview. We focus on the algorithmic version of the Kashin–Tzafriri theorem...will see that the desired subset is exposed by factoring the random submatrix. This factorization, which was invented by Pietsch , is regarded as a basic

  10. A Survey of Object Oriented Languages in Programming Environments.

    DTIC Science & Technology

    1987-06-01

    subset of natural languages might be more effective, and make the human-computer interface more friendly. ...and complexity of Ada. He meant that the language contained too many features that made it complicated to use effectively. Much of the complexity comes...by sending messages to a class instance. A small subset of the methods in Smalltalk-80 are not expressed in the Smalltalk-80 programming language

  11. Application of machine learning on brain cancer multiclass classification

    NASA Astrophysics Data System (ADS)

    Panca, V.; Rustam, Z.

    2017-07-01

    Classification of brain cancer is a multiclass classification problem. One approach is to first transform it into several binary problems. Microarray gene expression datasets have the two main characteristics of medical data: extremely many features (genes) and only a small number of samples. The application of machine learning to microarray gene expression data mainly consists of two steps: feature selection and classification. In this paper, features are selected using a method based on the support vector machine recursive feature elimination (SVM-RFE) principle, extended to multiclass classification and called multiple multiclass SVM-RFE. Instead of using only the selected features on a single classifier, this method combines the results of multiple classifiers. The features are divided into subsets and SVM-RFE is applied to each subset; the features selected from each subset are then fed to separate classifiers. This enhances the feature selection ability of each single SVM-RFE. Twin support vector machine (TWSVM) is used as the classifier to reduce computational complexity: while an ordinary SVM finds a single optimum hyperplane, the main objective of Twin SVM is to find two non-parallel optimum hyperplanes. The experiment on the brain cancer microarray gene expression dataset shows that this method correctly classifies 71.4% of the overall test data, using 100 and 1000 genes selected by the multiple multiclass SVM-RFE feature selection method. Furthermore, the per-class results show that the method classifies data of the normal and MD classes with 100% accuracy.
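    The recursive-elimination loop at the heart of SVM-RFE can be sketched as follows; `weight_fn` stands in for the SVM training step (here replaced by fixed toy relevance scores), and the division of features into subsets is omitted:

    ```python
    def rfe_select(features, weight_fn, n_keep):
        """Recursive feature elimination: repeatedly 'retrain' and drop the
        feature with the smallest absolute weight until n_keep remain.

        weight_fn: callable taking the currently active feature list and
                   returning a dict feature -> weight (stands in for
                   training a linear SVM on those features).
        """
        active = list(features)
        while len(active) > n_keep:
            weights = weight_fn(active)
            weakest = min(active, key=lambda f: abs(weights[f]))
            active.remove(weakest)
        return active

    # Toy "training": fixed relevance scores in place of learned SVM weights.
    relevance = {"g1": 0.9, "g2": -0.7, "g3": 0.05, "g4": 0.2}
    selected = rfe_select(relevance,
                          lambda active: {f: relevance[f] for f in active},
                          n_keep=2)
    print(selected)  # ['g1', 'g2']
    ```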

  12. Dendritic Cells in the Context of Human Tumors: Biology and Experimental Tools.

    PubMed

    Volovitz, Ilan; Melzer, Susanne; Amar, Sarah; Bocsi, József; Bloch, Merav; Efroni, Sol; Ram, Zvi; Tárnok, Attila

    2016-01-01

    Dendritic cells (DC) are the most potent and versatile antigen-presenting cells (APC) in the immune system. DC have an exceptional ability to comprehend the immune context of a captured antigen based on molecular signals identified from its vicinity. The analyzed information is then conveyed to other immune effector cells. Such capability enables DC to play a pivotal role in mediating either an immunogenic response or immune tolerance towards an acquired antigen. This review summarizes current knowledge on DC in the context of human tumors. It covers the basics of human DC biology, elaborating on the different markers, morphology and function of the different subsets of human DC. Human blood-borne DC comprise at least three subsets: one plasmacytoid DC (pDC) subset and two to three myeloid DC (mDC) subsets. Some tissues have unique DC. Each subset has a different phenotype and function and may induce pro-tumoral or anti-tumoral effects. The review also discusses two methods fundamental to the research of DC on the single-cell level: multicolor flow cytometry (FCM) and image-based cytometry (IC). These methods, along with new genomics and proteomics tools, can provide high-resolution information on specific DC subsets and on immune and tumor cells with which they interact. The different layers of collected biological data may then be integrated using Immune-Cytomics modeling approaches. Such novel integrated approaches may help unravel the complex network of cellular interactions that DC carry out within tumors, and may help harness this complex immunological information into the development of more effective treatments for cancer.

  13. Inline Measurement of Particle Concentrations in Multicomponent Suspensions using Ultrasonic Sensor and Least Squares Support Vector Machines.

    PubMed

    Zhan, Xiaobin; Jiang, Shulan; Yang, Yili; Liang, Jian; Shi, Tielin; Li, Xiwen

    2015-09-18

    This paper proposes an ultrasonic measurement system based on least squares support vector machines (LS-SVM) for inline measurement of particle concentrations in multicomponent suspensions. Firstly, the ultrasonic signals are analyzed and processed, and the optimal feature subset that contributes to the best model performance is selected based on the importance of features. Secondly, the LS-SVM model is tuned, trained and tested with different feature subsets to obtain the optimal model. In addition, a comparison is made between the partial least squares (PLS) model and the LS-SVM model. Finally, the optimal LS-SVM model with the optimal feature subset is applied to inline measurement of particle concentrations in the mixing process. The results show that the proposed method is reliable and accurate for inline measurement of particle concentrations in multicomponent suspensions, and the measurement accuracy is sufficiently high for industrial application. Furthermore, the proposed method is applicable to dynamic modeling of nonlinear systems and provides a feasible way to monitor industrial processes.

  14. The Fisher-Markov selector: fast selecting maximally separable feature subset for multiclass classification with applications to high-dimensional data.

    PubMed

    Cheng, Qiang; Zhou, Hongbo; Cheng, Jie

    2011-06-01

    Selecting features for multiclass classification is a critically important task for pattern recognition and machine learning applications. Especially challenging is selecting an optimal subset of features from high-dimensional data, which typically have many more variables than observations and contain significant noise, missing components, or outliers. Existing methods either cannot handle high-dimensional data efficiently or scalably, or can only obtain a local optimum instead of the global optimum. Toward selecting the globally optimal subset of features efficiently, we introduce a new selector, which we call the Fisher-Markov selector, to identify those features that are the most useful in describing essential differences among the possible groups. In particular, in this paper we present a way to represent essential discriminating characteristics together with the sparsity as an optimization objective. With properly identified measures for the sparseness and discriminativeness in possibly high-dimensional settings, we take a systematic approach for optimizing the measures to choose the best feature subset. We use Markov random field optimization techniques to solve the formulated objective functions for simultaneous feature selection. Our results are noncombinatorial, and they can achieve the exact global optimum of the objective function for some special kernels. The method is fast; in particular, it can be linear in the number of features and quadratic in the number of observations. We apply our procedure to a variety of real-world data, including a mid-dimensional optical handwritten digit data set and high-dimensional microarray gene expression data sets. The effectiveness of our method is confirmed by experimental results. In pattern recognition and from a model selection viewpoint, our procedure shows that it is possible to select the most discriminating subset of variables by solving a very simple unconstrained objective function which in fact can be obtained with an explicit expression.
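
    The classical two-class Fisher score underlying this family of selectors can be illustrated briefly. The sketch below is a hedged simplification: the actual Fisher-Markov selector adds a sparsity term and Markov random field optimization, neither of which is reproduced here, and the toy data are invented.

```python
def fisher_score(values, labels):
    # classical two-class Fisher score: between-class separation over within-class scatter
    a = [v for v, l in zip(values, labels) if l == 0]
    b = [v for v, l in zip(values, labels) if l == 1]
    mean = lambda xs: sum(xs) / len(xs)
    var = lambda xs: sum((x - mean(xs)) ** 2 for x in xs) / len(xs)
    return (mean(a) - mean(b)) ** 2 / (var(a) + var(b) + 1e-12)

labels = [0, 0, 0, 0, 1, 1, 1, 1]
f1 = [0.1, 0.2, 0.15, 0.05, 0.9, 1.0, 0.95, 1.1]   # discriminative feature
f2 = [0.5, 0.4, 0.6, 0.55, 0.45, 0.5, 0.6, 0.4]    # uninformative feature
scores = {"f1": fisher_score(f1, labels), "f2": fisher_score(f2, labels)}
best = max(scores, key=scores.get)                  # "f1"
```

    Ranking features by such a per-feature score is what makes the approach non-combinatorial: no subset enumeration is needed, which is why it scales to microarray-sized feature counts.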

  15. A New Reliability Analysis Model of the Chegongzhuang Heat-Supplying Tunnel Structure Considering the Coupling of Pipeline Thrust and Thermal Effect

    PubMed Central

    Zhang, Jiawen; He, Shaohui; Wang, Dahai; Liu, Yangpeng; Yao, Wenbo; Liu, Xiabing

    2018-01-01

    Based on the operating Chegongzhuang heat-supplying tunnel in Beijing, the reliability of its lining structure under the action of large thrust and thermal effect is studied. According to the characteristics of heat-supplying tunnel service, a three-dimensional numerical analysis model was established based on mechanical tests on in-situ specimens. The stress and strain of the tunnel structure were obtained before and after the operation, and comparison with the field monitoring data verified the rationality of the model. After extracting the internal force of the lining structure, an improved subset simulation method was proposed, with the internal force serving as the performance function, to calculate the reliability of the main control section of the tunnel. In contrast to the traditional calculation method, the analytic relationship between the sample numbers in the subset simulation method and the Monte Carlo method was given. The results indicate that the lining structure is greatly influenced by the coupling within six meters of the fixed brackets, especially at the tunnel floor. The improved subset simulation method can greatly reduce computation time and improve computational efficiency while ensuring the accuracy of calculation. It is suitable for the reliability calculation of tunnel engineering, because “the lower the probability, the more efficient the calculation.” PMID:29401691
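
    The "lower probability, more efficient" property comes from how subset simulation factors a rare event into a product of conditional probabilities of roughly p0 each. A toy stdlib-Python sketch on a single standard-normal variable with a linear limit-state function; the paper's improved method and tunnel model are not reproduced, and the parameters are illustrative.

```python
import math
import random

random.seed(42)

def g(x):
    # toy limit-state function: "failure" when a standard-normal variable exceeds t
    return x

def subset_simulation(t, n=1000, p0=0.1):
    """Estimate P(g(X) > t) as a product of conditional probabilities of ~p0 each."""
    samples = [random.gauss(0, 1) for _ in range(n)]
    prob = 1.0
    for _ in range(20):                      # cap on the number of levels
        samples.sort(key=g, reverse=True)
        n_seed = round(n * p0)
        thresh = g(samples[n_seed - 1])
        if thresh >= t:                      # final level: count direct exceedances
            return prob * sum(1 for x in samples if g(x) > t) / n
        prob *= p0
        seeds, samples = samples[:n_seed], []
        for s in seeds:                      # modified Metropolis chains in {g > thresh}
            x = s
            for _ in range(round(1 / p0)):
                cand = x + random.uniform(-1, 1)
                # accept by standard-normal density ratio, stay inside the level
                if random.random() < math.exp((x * x - cand * cand) / 2) and g(cand) > thresh:
                    x = cand
                samples.append(x)
    return prob

p_hat = subset_simulation(3.0)   # exact value: P(Z > 3) is about 1.35e-3
```

    A plain Monte Carlo estimate of a probability near 1e-3 would need on the order of 1e5 samples for the same accuracy; here each intermediate level only has to resolve an event of probability ~0.1.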

  16. Diagnostic support for glaucoma using retinal images: a hybrid image analysis and data mining approach.

    PubMed

    Yu, Jin; Abidi, Syed Sibte Raza; Artes, Paul; McIntyre, Andy; Heywood, Malcolm

    2005-01-01

    The availability of modern imaging techniques such as Confocal Scanning Laser Tomography (CSLT) for capturing high-quality optic nerve images offers the potential for developing automatic and objective methods for diagnosing glaucoma. We present a hybrid approach that features the analysis of CSLT images using moment methods to derive abstract image-defining features. The features are then used to train classifiers for automatically distinguishing CSLT images of normal and glaucoma patients. As a first step, in this paper we present investigations into feature subset selection methods for reducing the relatively large input space produced by the moment methods. We use neural networks and support vector machines to determine a subset of moments that offers high classification accuracy. We demonstrate the efficacy of our methods to discriminate between healthy and glaucomatous optic disks based on shape information automatically derived from optic disk topography and reflectance images.

  17. Superiorization with level control

    NASA Astrophysics Data System (ADS)

    Cegielski, Andrzej; Al-Musallam, Fadhel

    2017-04-01

    The convex feasibility problem is to find a common point of a finite family of closed convex subsets. In many applications one requires something more, namely finding a common point of closed convex subsets which minimizes a continuous convex function. The latter requirement leads to an application of the superiorization methodology, which lies between methods for the convex feasibility problem and convex constrained minimization. Inspired by the superiorization idea, we introduce a method which sequentially applies a long-step algorithm to a sequence of convex feasibility problems; the method employs quasi-nonexpansive operators as well as subgradient projections with level control and does not require evaluation of the metric projection. We replace a perturbation of the iterates (applied in the superiorization methodology) by a perturbation of the current level in minimizing the objective function. We consider the method in Euclidean space in order to guarantee strong convergence, although the method is well defined in a Hilbert space.
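
    The interplay between feasibility-seeking projections and objective-reducing perturbations can be shown on a toy problem. This is a minimal illustration of the superiorization idea, not the authors' algorithm: two halfspaces as the convex sets, ||x||^2 as the objective, and summable perturbation sizes so that the projections dominate asymptotically.

```python
def proj_halfspace(x, a, b):
    # metric projection of x onto the halfspace {y : a.y >= b}
    dot = sum(ai * xi for ai, xi in zip(a, x))
    if dot >= b:
        return list(x)
    scale = (b - dot) / sum(ai * ai for ai in a)
    return [xi + scale * ai for ai, xi in zip(a, x)]

def superiorized_feasibility(x, steps=200):
    # objective f(x) = ||x||^2; perturbation sizes 0.95**k are summable,
    # so the feasibility-seeking projections dominate asymptotically
    for k in range(steps):
        norm = sum(v * v for v in x) ** 0.5
        if norm > 0:
            beta = 0.95 ** k
            x = [v - beta * v / norm for v in x]     # step along -grad f / |grad f|
        x = proj_halfspace(x, [1.0, 1.0], 1.0)       # C1: x1 + x2 >= 1
        x = proj_halfspace(x, [1.0, 0.0], 0.25)      # C2: x1 >= 0.25
    return x

x = superiorized_feasibility([3.0, -2.0])
# lands at a feasible point of low norm, here near (0.5, 0.5)
```

    Plain alternating projections would stop at whatever feasible point they first reach; the perturbations steer the iterates toward the minimum-norm feasible point without ever solving a constrained minimization explicitly.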

  18. Deferred slanted-edge analysis: a unified approach to spatial frequency response measurement on distorted images and color filter array subsets.

    PubMed

    van den Bergh, F

    2018-03-01

    The slanted-edge method of spatial frequency response (SFR) measurement is usually applied to grayscale images under the assumption that any distortion of the expected straight edge is negligible. By decoupling the edge orientation and position estimation step from the edge spread function construction step, it is shown in this paper that the slanted-edge method can be extended to allow it to be applied to images suffering from significant geometric distortion, such as produced by equiangular fisheye lenses. This same decoupling also allows the slanted-edge method to be applied directly to Bayer-mosaicked images so that the SFR of the color filter array subsets can be measured directly without the unwanted influence of demosaicking artifacts. Numerical simulation results are presented to demonstrate the efficacy of the proposed deferred slanted-edge method in relation to existing methods.
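
    The decoupling idea can be illustrated on a synthetic edge: once the edge geometry is known (estimated separately, or taken from a distortion model), every pixel is binned by its distance to the edge, yielding a finely sampled edge-spread function regardless of where the pixels sit on the grid. A sketch under assumed geometry; the logistic edge profile stands in for lens blur, and no distortion or color-filter-array handling is included.

```python
import math

# synthetic 16x16 image of a slanted step edge, about 5 degrees from vertical
theta = math.radians(5)
nx, ny = math.cos(theta), math.sin(theta)        # unit normal to the edge

def ideal(x, y, x0=7.5):
    d = (x - x0) * nx + y * ny                   # signed distance to the edge line
    return 1.0 / (1.0 + math.exp(-4 * d))        # logistic profile stands in for blur

img = [[ideal(x, y) for x in range(16)] for y in range(16)]

# core of the slanted-edge idea: pool all pixels by distance to the (known) edge,
# giving an edge-spread function sampled far more finely than the pixel pitch
samples = sorted(
    ((x - 7.5) * nx + y * ny, img[y][x]) for y in range(16) for x in range(16)
)

# the line-spread function is the numerical derivative of the ESF
lsf = [
    (samples[i + 1][1] - samples[i][1]) / (samples[i + 1][0] - samples[i][0] + 1e-12)
    for i in range(len(samples) - 1)
]
peak_d = samples[max(range(len(lsf)), key=lambda i: lsf[i])][0]   # LSF peak location
```

    Because the binning step only needs a per-pixel distance to the edge, the same machinery works when that distance comes from a curved (distorted) edge model, or when only one Bayer color plane's pixels are pooled, which is the extension the paper develops.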

  19. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

  20. HIV-1 protease cleavage site prediction based on two-stage feature selection method.

    PubMed

    Niu, Bing; Yuan, Xiao-Cheng; Roeper, Preston; Su, Qiang; Peng, Chun-Rong; Yin, Jing-Yuan; Ding, Juan; Li, HaiPeng; Lu, Wen-Cong

    2013-03-01

    Knowledge of the mechanism of HIV protease cleavage specificity is critical to the design of specific and effective HIV inhibitors. Searching for an accurate, robust, and rapid method to correctly predict the cleavage sites in proteins is crucial when searching for possible HIV inhibitors. In this article, HIV-1 protease specificity was studied using the correlation-based feature subset (CfsSubset) selection method combined with a genetic algorithm. Thirty important biochemical features were found based on a jackknife test from the original data set containing 4,248 features. Using the AdaBoost method with the thirty selected features, the prediction model yields an accuracy of 96.7% for the jackknife test and 92.1% for an independent set test, an increase in accuracy over the original feature set of 6.7% and 77.4%, respectively. Our feature selection scheme could be a useful technique for finding effective competitive inhibitors of HIV protease.

  1. Validation of the Abdominal Pain Index using a revised scoring method.

    PubMed

    Laird, Kelsey T; Sherman, Amanda L; Smith, Craig A; Walker, Lynn S

    2015-06-01

    Evaluate the psychometric properties of child- and parent-report versions of the four-item Abdominal Pain Index (API) in children with functional abdominal pain (FAP) and healthy controls, using a revised scoring method that facilitates comparisons of scores across samples and time. Pediatric patients aged 8-18 years with FAP and controls completed the API at baseline (N = 1,967); a subset of their parents (N = 290) completed the API regarding the child's pain. Subsets of patients completed follow-up assessments at 2 weeks (N = 231), 3 months (N = 330), and 6 months (N = 107). Subsets of both patients (N = 389) and healthy controls (N = 172) completed a long-term follow-up assessment (mean age at follow-up = 20.21 years, SD = 3.75). The API demonstrated good concurrent, discriminant, and construct validity, as well as good internal consistency. We conclude that the API, using the revised scoring method, is a useful, reliable, and valid measure of abdominal pain severity. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Factors that influence the provision of intrapartum and postnatal care by skilled birth attendants in low- and middle-income countries: a qualitative evidence synthesis

    PubMed Central

    Munabi-Babigumira, Susan; Glenton, Claire; Lewin, Simon; Fretheim, Atle; Nabudere, Harriet

    2017-01-01

    Background In many low- and middle-income countries women are encouraged to give birth in clinics and hospitals so that they can receive care from skilled birth attendants. A skilled birth attendant (SBA) is a health worker such as a midwife, doctor, or nurse who is trained to manage normal pregnancy and childbirth. (S)he is also trained to identify, manage, and refer any health problems that arise for mother and baby. The skills, attitudes and behaviour of SBAs, and the extent to which they work in an enabling working environment, impact on the quality of care provided. If any of these factors are missing, mothers and babies are likely to receive suboptimal care. Objectives To explore the views, experiences, and behaviours of skilled birth attendants and those who support them; to identify factors that influence the delivery of intrapartum and postnatal care in low- and middle-income countries; and to explore the extent to which these factors were reflected in intervention studies. Search methods Our search strategies specified key and free text terms related to the perinatal period, and the health provider, and included methodological filters for qualitative evidence syntheses and for low- and middle-income countries. We searched MEDLINE, OvidSP (searched 21 November 2016), Embase, OvidSP (searched 28 November 2016), PsycINFO, OvidSP (searched 30 November 2016), POPLINE, K4Health (searched 30 November 2016), CINAHL, EBSCOhost (searched 30 November 2016), ProQuest Dissertations and Theses (searched 15 August 2013), Web of Science (searched 1 December 2016), World Health Organization Reproductive Health Library (searched 16 August 2013), and World Health Organization Global Health Library for WHO databases (searched 1 December 2016). Selection criteria We included qualitative studies that focused on the views, experiences, and behaviours of SBAs and those who work with them as part of the team. 
We included studies from all levels of health care in low- and middle-income countries. Data collection and analysis One review author extracted data and assessed study quality, and another review author checked the data. We synthesised data using the best fit framework synthesis approach and assessed confidence in the evidence using the GRADE-CERQual (Confidence in the Evidence from Reviews of Qualitative research) approach. We used a matrix approach to explore whether the factors identified by health workers in our synthesis as important for providing maternity care were reflected in the interventions evaluated in the studies in a related intervention review. Main results We included 31 studies that explored the views and experiences of different types of SBAs, including doctors, midwives, nurses, auxiliary nurses and their managers. The included studies took place in Africa, Asia, and Latin America. Our synthesis pointed to a number of factors affecting SBAs’ provision of quality care. The following factors were based on evidence assessed as of moderate to high confidence. Skilled birth attendants reported that they were not always given sufficient training during their education or after they had begun clinical work. Also, inadequate staffing of facilities could increase the workloads of skilled birth attendants, make it difficult to provide supervision and result in mothers being offered poorer care. In addition, SBAs did not always believe that their salaries and benefits reflected their tasks and responsibilities and the personal risks they undertook. Together with poor living and working conditions, these issues were seen to increase stress and to negatively affect family life. Some SBAs also felt that managers lacked capacity and skills, and felt unsupported when their workplace concerns were not addressed. 
Possible causes of staff shortages in facilities included problems with hiring and assigning health workers to facilities where they were needed; lack of funding; poor management and bureaucratic systems; and low salaries. Skilled birth attendants and their managers suggested factors that could help recruit, keep, and motivate health workers, and improve the quality of care; these included good-quality housing, allowances for extra work, paid vacations, continuing education, appropriate assessments of their work, and rewards. Skilled birth attendants’ ability to provide quality care was also limited by a lack of equipment, supplies, and drugs; blood and the infrastructure to manage blood transfusions; electricity and water supplies; and adequate space and amenities on maternity wards. These factors were seen to reduce SBAs’ morale, increase their workload and infection risk, and make them less efficient in their work. A lack of transport sometimes made it difficult for SBAs to refer women on to higher levels of care. In addition, women’s negative perceptions of the health system could make them reluctant to accept referral. We identified some other factors that also may have affected the quality of care, which were based on findings assessed as of low or very low confidence. Poor teamwork and lack of trust and collaboration between health workers appeared to negatively influence care. In contrast, good collaboration and teamwork appeared to increase skilled birth attendants’ motivation, their decision-making abilities, and the quality of care. Skilled birth attendants’ workloads and staff shortages influenced their interactions with mothers. In addition, poor communication undermined trust between skilled birth attendants and mothers. Authors' conclusions Many factors influence the care that SBAs are able to provide to mothers during childbirth. 
These include access to training and supervision; staff numbers and workloads; salaries and living conditions; and access to well-equipped, well-organised healthcare facilities with water, electricity, and transport. Other factors that may play a role include the existence of teamwork and of trust, collaboration, and communication between health workers and with mothers. Skilled birth attendants reported many problems tied to all of these factors. What factors influence the delivery of care by skilled birth attendants in low- and middle-income countries? Review aim The aim of this Cochrane synthesis of qualitative evidence was to identify factors that influence the provision of care by skilled birth attendants. To answer this question, we searched for and analysed qualitative studies of skilled birth attendants’ views, experiences, and behaviour. This synthesis complements another Cochrane Review assessing the effect of strategies to promote women’s use of healthcare facilities when giving birth. Key messages Many factors influence the care that skilled birth attendants provide to mothers during childbirth. These include access to training and supervision; staff numbers and workloads; salaries and living conditions; and access to well-equipped, well-organised healthcare facilities with water, electricity, and transport. Other factors that may play a role include the existence of teamwork, trust, collaboration, and communication between health workers and with mothers. Skilled birth attendants reported many problems tied to these factors. What did we study in the synthesis? In low- and middle-income countries, many mothers still die during childbirth. Women are encouraged to give birth in health facilities rather than at home so they can receive care from skilled birth attendants. 
A skilled birth attendant is a health worker such as a midwife, doctor, or nurse who is trained to manage a normal pregnancy and childbirth, and refer the mother and newborn when complications arise. By exploring skilled birth attendants’ views, experiences, and behaviour, this synthesis aimed to identify factors that can influence their ability to provide quality care. Main findings We included 31 studies conducted in Africa, Asia, and Latin America. Participants were skilled birth attendants including doctors, midwives, nurses, auxiliary nurses and their managers. Our synthesis pointed to several factors that affected skilled birth attendants’ provision of quality care. The following factors are based on evidence assessed as of moderate to high confidence. Skilled birth attendants reported that they sometimes had insufficient training during their education or after they had begun work. Where facilities lacked staff, skilled birth attendants’ workloads could increase, it could become difficult to provide supervision, and mothers could receive poorer care. In addition, skilled birth attendants did not always believe that their salaries and benefits reflected their tasks and responsibilities and the personal risks they undertook. Together with poor living and working conditions, these issues could lead to stress and affect skilled birth attendants' family life. Some skilled birth attendants felt that managers lacked capacity and skills, and they felt unsupported when their workplace concerns were not addressed. Possible causes of staff shortages included problems with hiring and assigning health workers to health facilities; lack of funding; poor management and bureaucratic systems; and low salaries. 
Skilled birth attendants and their managers suggested factors that could help recruit, keep, and motivate health workers, and improve the quality of their work; these included good-quality housing, allowances for extra work, paid vacations, continued education, proper assessments of their work, and rewards. Skilled birth attendants’ ability to provide quality care was also limited by a lack of equipment, drugs, and supplies; blood and the infrastructure to manage blood transfusions; electricity and water supplies; and adequate space and amenities on maternity wards. These factors were seen to reduce skilled birth attendants’ morale, increase their workload and infection risk, and make them less efficient in their work. A lack of transport sometimes made it difficult for skilled birth attendants to refer women to higher levels of care. In addition, women’s negative perceptions of the health system could make them reluctant to accept referral. We identified some other factors that also may have affected the quality of care, which were based on findings assessed as of low or very low confidence. Poor teamwork and lack of trust and collaboration between health workers appeared to negatively influence care. In contrast, good collaboration and teamwork appeared to increase skilled birth attendants’ motivation, their decision-making abilities, and the quality of care. Skilled birth attendants’ workloads and staff shortages influenced their interactions with mothers. In addition, poor communication undermined trust between skilled birth attendants and mothers. How up-to-date is this review? We searched for studies published before November 2016. PMID:29148566

  3. Selecting climate change scenarios for regional hydrologic impact studies based on climate extremes indices

    NASA Astrophysics Data System (ADS)

    Seo, Seung Beom; Kim, Young-Oh; Kim, Youngil; Eum, Hyung-Il

    2018-04-01

    When selecting a subset of climate change scenarios (GCM models), the priority is to ensure that the subset reflects the comprehensive range of possible model results for all variables concerned. Though many studies have attempted to improve scenario selection, there is a lack of studies that discuss methods to ensure that the results from a subset of climate models contain the same range of uncertainty in hydrologic variables as when all models are considered. We applied the Katsavounidis-Kuo-Zhang (KKZ) algorithm to select a subset of climate change scenarios and demonstrated its ability to reduce the number of GCM models in an ensemble while preserving the ranges of multiple climate extremes indices. First, we analyzed the role of 27 ETCCDI climate extremes indices in scenario selection and selected the representative climate extremes indices. Before selecting a subset, we excluded a few deficient GCM models that could not represent the observed climate regime. Subsequently, we discovered that a subset of GCM models selected by the KKZ algorithm with the representative climate extremes indices could not capture the full potential range of changes in hydrologic extremes (e.g., 3-day peak flow and 7-day low flow) in some regional case studies. However, applying the KKZ algorithm with a different set of climate indices, ones correlated with the hydrologic extremes, overcame this limitation. Key climate indices, which depend on the hydrologic extremes to be projected, must therefore be determined prior to the selection of a subset of GCM models.
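
    A KKZ-style selection can be sketched as a max-min distance rule over models described by index vectors. Hedged: published KKZ variants differ in how the first member is chosen; here it is the model nearest the ensemble centroid, and the toy "ensemble" data are invented.

```python
def kkz_select(points, k):
    """KKZ-style subset selection: start with the point nearest the ensemble
    centroid, then repeatedly add the point farthest from the current subset."""
    dim = len(points[0])
    centroid = [sum(p[d] for p in points) / len(points) for d in range(dim)]
    dist2 = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    chosen = [min(range(len(points)), key=lambda i: dist2(points[i], centroid))]
    while len(chosen) < k:
        nxt = max(
            (i for i in range(len(points)) if i not in chosen),
            key=lambda i: min(dist2(points[i], points[j]) for j in chosen),
        )
        chosen.append(nxt)
    return chosen

# toy "ensemble": each row is one GCM described by two standardized extreme indices
models = [(0.0, 0.0), (0.1, -0.1), (2.0, 2.2), (-2.0, 1.5), (0.2, 0.1), (1.8, -2.2)]
subset = kkz_select(models, 3)   # a central model first, then mutually distant outliers
```

    The max-min rule is what preserves the spread of the ensemble: each new member is the model least represented by those already chosen, so the selected subset brackets the extremes of the index space rather than clustering near its middle.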

  4. Dynamic graphs, community detection, and Riemannian geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakker, Craig; Halappanavar, Mahantesh; Visweswara Sathanur, Arun

    A community is a subset of a wider network where the members of that subset are more strongly connected to each other than they are to the rest of the network. In this paper, we consider the problem of identifying and tracking communities in graphs that change over time (dynamic community detection) and present a framework based on Riemannian geometry to aid in this task. Our framework currently supports several important operations such as interpolating between and averaging over graph snapshots. We compare these Riemannian methods with entry-wise linear interpolation and find that the Riemannian methods are generally better suited to dynamic community detection. Next steps with the Riemannian framework include developing higher-order interpolation methods (e.g. the analogues of polynomial and spline interpolation) and a Riemannian least-squares regression method for working with noisy data.

  5. Variable selection with stepwise and best subset approaches

    PubMed Central

    2016-01-01

    While purposeful selection is performed partly by software and partly by hand, the stepwise and best subset approaches are performed automatically by software. Two R functions, stepAIC() and bestglm(), are well designed for stepwise and best subset regression, respectively. The stepAIC() function begins with a full or null model, and the method for stepwise regression can be specified in the direction argument with character values “forward”, “backward” and “both”. The bestglm() function begins with a data frame containing explanatory variables and response variables; the response variable should be in the last column. A variety of goodness-of-fit criteria can be specified in the IC argument. The Bayesian information criterion (BIC) usually results in a more parsimonious model than the Akaike information criterion. PMID:27162786
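
    stepAIC() and bestglm() are R functions; the exhaustive best-subset search they automate can be sketched in stdlib Python for illustration. The data are synthetic, and the AIC below drops constant terms, which is harmless because only differences between candidate models matter.

```python
import math
import random
from itertools import combinations

def ols_rss(X, y):
    """Residual sum of squares of an intercept-included least-squares fit,
    via normal equations and Gaussian elimination (fine for tiny problems)."""
    n = len(X)
    A = [[1.0] + row for row in X]          # prepend intercept column
    p = len(A[0])
    G = [[sum(A[i][a] * A[i][b] for i in range(n)) for b in range(p)] for a in range(p)]
    v = [sum(A[i][a] * y[i] for i in range(n)) for a in range(p)]
    M = [G[a][:] + [v[a]] for a in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, p):
            f = M[r][c] / M[c][c]
            for j in range(c, p + 1):
                M[r][j] -= f * M[c][j]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (M[r][p] - sum(M[r][j] * beta[j] for j in range(r + 1, p))) / M[r][r]
    return sum((y[i] - sum(A[i][j] * beta[j] for j in range(p))) ** 2 for i in range(n))

def best_subset_aic(X, y, names):
    """Exhaustive best-subset search scored by AIC (constant terms dropped)."""
    n, best = len(y), None
    for k in range(1, len(names) + 1):
        for combo in combinations(range(len(names)), k):
            rss = ols_rss([[row[j] for j in combo] for row in X], y)
            aic = n * math.log(rss / n + 1e-12) + 2 * (k + 1)
            if best is None or aic < best[0]:
                best = (aic, [names[j] for j in combo])
    return best[1]

# synthetic data: y depends on x1 and x2 only; x3 is irrelevant noise
random.seed(0)
X = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(40)]
y = [2 * r[0] - r[1] + 0.05 * random.gauss(0, 1) for r in X]
chosen = best_subset_aic(X, y, ["x1", "x2", "x3"])
```

    The exhaustive loop is why bestglm() is practical only for modest numbers of predictors: with p candidate variables there are 2^p - 1 subsets to score, which is exactly what stepwise search avoids at the cost of possibly missing the best subset.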

  6. Transforming Science Data for GIS: How to Find and Use NASA Earth Observation Data Without Being a Rocket Scientist

    NASA Technical Reports Server (NTRS)

    Bagwell, Ross; Peters, Byron; Berrick, Stephen

    2017-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) manages Earth Observation satellites and the Distributed Active Archive Centers (DAACs), where the data is stored and processed. The challenge is that Earth Observation data is complicated. There is plenty of data available; however, the science teams have had a top-down approach: define what it is you are trying to study, select a set of satellites and sensors, and drill down for the data. Our alternative is to take a bottom-up approach using eight environmental fields of interest as defined by the Group on Earth Observations (GEO), called Societal Benefit Areas (SBAs): Disaster Resilience (DR), Public Health Surveillance (PHS), Energy and Mineral Resource Management (EMRM), Water Resources Management (WRM), Infrastructure and Transport Management (ITM), Sustainable Urban Development (SUD), Food Security and Sustainable Agriculture (FSSA), and Biodiversity and Ecosystems Sustainability (BES).

  7. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    NASA Astrophysics Data System (ADS)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activities, and it supports better understanding and modelling of the geophysical processes involved. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activities and measuring surface changes with millimetre accuracy. All the mentioned techniques with deformation time series extraction address these challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming and has to be done manually for every single data stack. In many cases it is even an iterative process which has to be done regularly and continuously. Data processing therefore becomes slow, which causes significant delays in data delivery. The SAR-satellite-based High Resolution Data Acquisition System, to be developed at DLR, will automate these time-consuming tasks and allow an operational volcano-monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of the data orders, logs their status and downloads the provided data via FTP transfer, with e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative manual processes will be avoided entirely.
    In this presentation, a prototype of the SAR-satellite-based High Resolution Data Acquisition System, which is developed and operated by DLR, will be described in detail. The workflow of the developed system, which allows a meaningful contribution of SAR to the monitoring of volcanic eruptive activities, is described. A more robust and efficient InSAR data processing chain in the IWAP processor will be introduced in the framework of a remote sensing task of the MED-SUV project. An application of the developed prototype system to historic eruptions of Mount Etna and Piton de la Fournaise will be presented in the last part of the presentation.

  8. Comparison of Different EHG Feature Selection Methods for the Detection of Preterm Labor

    PubMed Central

    Alamedine, D.; Khalil, M.; Marque, C.

    2013-01-01

    Numerous types of linear and nonlinear features have been extracted from the electrohysterogram (EHG) in order to classify labor and pregnancy contractions. As a result, the number of available features is now very large. The goal of this study is to reduce the number of features by selecting only the relevant ones which are useful for solving the classification problem. This paper presents three methods for feature subset selection that can be applied to choose the best subsets for classifying labor and pregnancy contractions: an algorithm using the Jeffrey divergence (JD) distance, a sequential forward selection (SFS) algorithm, and a binary particle swarm optimization (BPSO) algorithm. The two last methods are based on a classifier and were tested with three types of classifiers. These methods have allowed us to identify common features which are relevant for contraction classification. PMID:24454536
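
    Two of the three methods (SFS and BPSO) are wrapper approaches: a classifier is retrained inside the selection loop and features are kept according to how much they improve its score. A sketch of sequential forward selection with a nearest-centroid classifier scored by leave-one-out accuracy; the classifier choice and toy data are assumptions, as the paper evaluates SFS with three different classifier types on real EHG features.

```python
def loo_accuracy(X, y, feats):
    # leave-one-out accuracy of a nearest-centroid classifier on the chosen features
    correct = 0
    for i in range(len(X)):
        cents = {}
        for c in set(y):
            rows = [X[j] for j in range(len(X)) if j != i and y[j] == c]
            cents[c] = [sum(r[f] for r in rows) / len(rows) for f in feats]
        dist = lambda c: sum((X[i][f] - m) ** 2 for f, m in zip(feats, cents[c]))
        correct += min(cents, key=dist) == y[i]
    return correct / len(X)

def sfs(X, y, n_feats):
    # sequential forward selection: greedily add the feature that helps most
    chosen = []
    while len(chosen) < n_feats:
        cand = max(
            (f for f in range(len(X[0])) if f not in chosen),
            key=lambda f: loo_accuracy(X, y, chosen + [f]),
        )
        chosen.append(cand)
    return chosen

# toy "contractions": feature 0 separates the two classes, feature 1 is noise
X = [[0.1, 0.7], [0.2, 0.1], [0.15, 0.9], [0.9, 0.5], [1.0, 0.2], [0.95, 0.8]]
y = [0, 0, 0, 1, 1, 1]
order = sfs(X, y, 2)   # feature 0 is selected first
```

    A filter method such as the Jeffrey-divergence ranking used in the same paper would instead score each feature from class distributions alone, without retraining a classifier, trading some accuracy for much lower cost.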

  9. Quantitative assessment model for gastric cancer screening

    PubMed Central

    Chen, Kun; Yu, Wei-Ping; Song, Liang; Zhu, Yi-Min

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out in 66 patients and 198 normal people, and the risk and protective factors of gastric cancer were determined, including heavy manual work; foods such as small yellow-fin tuna, dried small shrimps, squills and crabs; mothers suffering from gastric diseases; spouse alive; and use of refrigerators and hot food. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, we selected some factors significant in statistics and calculated a weight coefficient for each one by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model for each subset was established, and we obtained a mathematical expression for attribute degree (AD). RESULTS: Based on the data of 63 patients and 693 normal people, the AD of each subject was calculated. Considering the sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. According to these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, a statistical test showed that the identification outcomes of the two different calculation methods were identical (P>0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible and economic, and can be used to determine individual and population risks of gastric cancer. PMID:15655813

  10. Using Parental Profiles to Predict Membership in a Subset of College Students Experiencing Excessive Alcohol Consequences: Findings From a Longitudinal Study

    PubMed Central

    Varvil-Weld, Lindsey; Mallett, Kimberly A.; Turrisi, Rob; Abar, Caitlin C.

    2012-01-01

    Objective: Previous research identified a high-risk subset of college students experiencing a disproportionate number of alcohol-related consequences at the end of their first year. With the goal of identifying pre-college predictors of membership in this high-risk subset, the present study used a prospective design to identify latent profiles of student-reported maternal and paternal parenting styles and alcohol-specific behaviors and to determine whether these profiles were associated with membership in the high-risk consequences subset. Method: A randomly selected sample of 370 incoming first-year students at a large public university reported on their mothers’ and fathers’ communication quality, monitoring, approval of alcohol use, and modeling of drinking behaviors and on consequences experienced across the first year of college. Results: Students in the high-risk subset comprised 15.5% of the sample but accounted for almost half (46.6%) of the total consequences reported by the entire sample. Latent profile analyses identified four parental profiles: positive pro-alcohol, positive anti-alcohol, negative mother, and negative father. Logistic regression analyses revealed that students in the negative-father profile were at greatest odds of being in the high-risk consequences subset at a follow-up assessment 1 year later, even after drinking at baseline was controlled for. Students in the positive pro-alcohol profile also were at increased odds of being in the high-risk subset, although this association was attenuated after baseline drinking was controlled for. Conclusions: These findings have important implications for the improvement of existing parent- and individual-based college student drinking interventions designed to reduce alcohol-related consequences. PMID:22456248

  11. Investigating Evolutionary Conservation of Dendritic Cell Subset Identity and Functions

    PubMed Central

    Vu Manh, Thien-Phong; Bertho, Nicolas; Hosmalin, Anne; Schwartz-Cornil, Isabelle; Dalod, Marc

    2015-01-01

    Dendritic cells (DCs) were initially defined as mononuclear phagocytes with a dendritic morphology and an exquisite efficiency for naïve T-cell activation. DCs encompass several subsets initially identified by their expression of specific cell surface molecules and later shown to excel in distinct functions and to develop under the instruction of different transcription factors or cytokines. Very few cell surface molecules are expressed in a specific manner on any immune cell type. Hence, relying solely on a small number of cell surface markers in classical flow cytometry to identify cell types can be deceiving. Moreover, the markers currently used to define mononuclear phagocyte subsets vary depending on the tissue and animal species studied and even between laboratories. This has led to confusion in the definition of DC subset identity and in their attribution of specific functions. There is a strong need to identify a rigorous and consensus way to define mononuclear phagocyte subsets, with precise guidelines potentially applicable throughout tissues and species. We will discuss the advantages, drawbacks, and complementarities of different methodologies: cell surface phenotyping, ontogeny, functional characterization, and molecular profiling. We will advocate that gene expression profiling is a very rigorous, largely unbiased and accessible method to define the identity of mononuclear phagocyte subsets, which strengthens and refines surface phenotyping. It is uniquely powerful to yield new, experimentally testable, hypotheses on the ontogeny or functions of mononuclear phagocyte subsets, their molecular regulation, and their evolutionary conservation. 
We propose defining cell populations based on a combination of cell surface phenotyping, expression analysis of hallmark genes, and robust functional assays, in order to reach a consensus and integrate faster the huge but scattered knowledge accumulated by different laboratories on different cell types, organs, and species. PMID:26082777

  12. A data driven partial ambiguity resolution: Two step success rate criterion, and its simulation demonstration

    NASA Astrophysics Data System (ADS)

    Hou, Yanqing; Verhagen, Sandra; Wu, Jie

    2016-12-01

    Ambiguity Resolution (AR) is a key technique in GNSS precise positioning. In case of weak models (i.e., low precision of data), however, the success rate of AR may be low, which may consequently introduce large errors to the baseline solution in cases of wrong fixing. Partial Ambiguity Resolution (PAR) has therefore been proposed so that the baseline precision can be improved by fixing only a subset of ambiguities with a high success rate. This contribution proposes a new PAR strategy, in which the subset is selected such that the expected precision gain is maximized among a set of pre-selected subsets, while at the same time the failure rate is controlled. These pre-selected subsets are expected to have the highest success rate among those with the same subset size. The strategy is called the Two-step Success Rate Criterion (TSRC), as it first tries to fix a relatively large subset, using the fixed failure rate ratio test (FFRT) to decide on acceptance or rejection. In case of rejection, a smaller subset is fixed and validated by the ratio test so as to fulfill the overall failure rate criterion. It is shown how the method can be used in practice without introducing a large additional computation effort and, more importantly, how it improves (or at least does not degrade) the availability in terms of baseline precision compared to the classical Success Rate Criterion (SRC) PAR strategy, based on a simulation validation. In the simulation validation, significant improvements are obtained for single-GNSS on short baselines with dual-frequency observations. For dual-constellation GNSS, the improvement for single-frequency observations on short baselines is very significant, on average 68%. For medium to long baselines with dual-constellation GNSS, the average improvement is around 20-30%.
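
    The subset pre-selection step can be illustrated with the standard integer-bootstrapping success-rate bound (a product over the conditional standard deviations of the decorrelated ambiguities); the σ values below are illustrative, and the FFRT/ratio-test stage of TSRC is omitted:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_success_rate(sigmas):
    """P_s = prod_i (2*Phi(1/(2*sigma_i)) - 1), with sigma_i in cycles."""
    p = 1.0
    for s in sigmas:
        p *= 2.0 * norm_cdf(1.0 / (2.0 * s)) - 1.0
    return p

def largest_fixable_subset(sigmas, min_success=0.999):
    """Sort ambiguities from most to least precise and keep the longest
    prefix whose joint success rate still meets the floor."""
    order = sorted(range(len(sigmas)), key=lambda i: sigmas[i])
    subset = []
    for i in order:
        if bootstrap_success_rate([sigmas[j] for j in subset + [i]]) < min_success:
            break
        subset.append(i)
    return subset

sigmas = [0.04, 0.05, 0.08, 0.30, 0.45]   # conditional std devs, cycles
print(bootstrap_success_rate(sigmas))     # ≈ 0.66: fixing all five is unsafe
print(largest_fixable_subset(sigmas))     # [0, 1, 2]
```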

  13. Validation of Manual Muscle Testing and a Subset of Eight Muscles (MMT8) for Adult and Juvenile Idiopathic Inflammatory Myopathies

    PubMed Central

    Rider, Lisa G.; Koziol, Deloris; Giannini, Edward H.; Jain, Minal S.; Smith, Michaele R.; Whitney-Mahoney, Kristi; Feldman, Brian M.; Wright, Susan J.; Lindsley, Carol B.; Pachman, Lauren M.; Villalba, Maria L.; Lovell, Daniel J.; Bowyer, Suzanne L.; Plotz, Paul H.; Miller, Frederick W.; Hicks, Jeanne E.

    2010-01-01

    Objective To validate manual muscle testing (MMT) for strength assessment in juvenile and adult dermatomyositis (DM) and polymyositis (PM). Methods Seventy-three children and 45 adult DM/PM patients were assessed at baseline and reevaluated 6–9 months later. We compared Total MMT (a group of 24 proximal, distal, and axial muscles) and Proximal MMT (7 proximal muscle groups) tested bilaterally on a 0–10 scale with 144 subsets of six and 96 subsets of eight muscle groups tested unilaterally. Expert consensus was used to rank the best abbreviated MMT subsets for face validity and ease of assessment. Results The Total, Proximal and best MMT subsets had excellent internal reliability (rs:Total MMT 0.91–0.98), and consistency (Cronbach’s α 0.78–0.97). Inter- and intra-rater reliability were acceptable (Kendall’s W 0.68–0.76; rs 0.84–0.95). MMT subset scores correlated highly with Total and Proximal MMT scores and with the Childhood Myositis Assessment Scale, and correlated moderately with physician global activity, functional disability, magnetic resonance imaging, axial and distal MMT scores and, in adults, with creatine kinase. The standardized response mean for Total MMT was 0.56 in juveniles and 0.75 in adults. Consensus was reached to use a subset of eight muscles (neck flexors, deltoids, biceps, wrist extensors, gluteus maximus and medius, quadriceps and ankle dorsiflexors) that performed as well as the Total and Proximal MMT, and had good face validity and ease of assessment. Conclusions These findings aid in standardizing the use of MMT for assessing strength as an outcome measure for myositis. PMID:20391500

  14. Dissecting the genetic heterogeneity of myopia susceptibility in an Ashkenazi Jewish population using ordered subset analysis

    PubMed Central

    Simpson, Claire L.; Wojciechowski, Robert; Ibay, Grace; Stambolian, Dwight

    2011-01-01

    Purpose Despite many years of research, most of the genetic factors contributing to myopia development remain unknown. Genetic studies have pointed to a strong inherited component, but although many candidate regions have been implicated, few genes have been positively identified. Methods We have previously reported 2 genomewide linkage scans in a population of 63 highly aggregated Ashkenazi Jewish families that identified a locus on chromosome 22. Here we used ordered subset analysis (OSA), conditioned on non-parametric linkage to chromosome 22 to detect other chromosomal regions which had evidence of linkage to myopia in subsets of the families, but not the overall sample. Results Strong evidence of linkage to a 19-cM linkage interval with a peak OSA nonparametric allele-sharing logarithm-of-odds (LOD) score of 3.14 on 20p12-q11.1 (ΔLOD=2.39, empirical p=0.029) was identified in a subset of 20 families that also exhibited strong evidence of linkage to chromosome 22. One other locus also presented with suggestive LOD scores >2.0 on chromosome 11p14-q14 and one locus on chromosome 6q22-q24 had an OSA LOD score=1.76 (ΔLOD=1.65, empirical p=0.02). Conclusions The chromosome 6 and 20 loci are entirely novel and appear linked in a subset of families whose myopia is known to be linked to chromosome 22. The chromosome 11 locus overlaps with the known Myopia-7 (MYP7, OMIM 609256) locus. Using ordered subset analysis allows us to find additional loci linked to myopia in subsets of families, and underlines the complex genetic heterogeneity of myopia even in highly aggregated families and genetically isolated populations such as the Ashkenazi Jews. PMID:21738393

  15. Automatic Nuclei Segmentation in H&E Stained Breast Cancer Histopathology Images

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Kornegoor, Robert; Huisman, André; Viergever, Max A.; Pluim, Josien P. W.

    2013-01-01

    The introduction of fast digital slide scanners that provide whole slide images has led to a revival of interest in image analysis applications in pathology. Segmentation of cells and nuclei is an important first step towards automatic analysis of digitized microscopy images. We therefore developed an automated nuclei segmentation method that works with hematoxylin and eosin (H&E) stained breast cancer histopathology images, which represent regions of whole digital slides. The procedure can be divided into four main steps: 1) pre-processing with color unmixing and morphological operators, 2) marker-controlled watershed segmentation at multiple scales and with different markers, 3) post-processing for rejection of false regions and 4) merging of the results from multiple scales. The procedure was developed on a set of 21 breast cancer cases (subset A) and tested on a separate validation set of 18 cases (subset B). The evaluation was done in terms of both detection accuracy (sensitivity and positive predictive value) and segmentation accuracy (Dice coefficient). The mean estimated sensitivity for subset A was 0.875 (±0.092) and for subset B 0.853 (±0.077). The mean estimated positive predictive value was 0.904 (±0.075) and 0.886 (±0.069) for subsets A and B, respectively. For both subsets, the distribution of the Dice coefficients had a high peak around 0.9, with the vast majority of segmentations having values larger than 0.8. PMID:23922958
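
    Step 1 (color unmixing) can be sketched in plain NumPy via Beer-Lambert optical densities; the stain vectors below are the widely quoted Ruifrok-Johnston defaults, not necessarily the ones used by the authors:

```python
import numpy as np

# Unit-normalized optical-density vectors for hematoxylin and eosin.
H = np.array([0.650, 0.704, 0.286])
E = np.array([0.072, 0.990, 0.105])
stains = np.stack([H, E])          # 2 x 3 stain matrix (rows = stains)
unmix = np.linalg.pinv(stains)     # 3 x 2 pseudo-inverse

def stain_concentrations(rgb):
    """Beer-Lambert: convert RGB in (0, 1] to optical density, then solve
    od = c @ stains for the concentrations c in the least-squares sense."""
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))
    return od @ unmix              # (..., 2): hematoxylin and eosin amounts

# A purely hematoxylin-stained pixel: synthesize OD = 0.8 * H, then unmix.
pure_h = 10.0 ** (-0.8 * H)
c = stain_concentrations(pure_h)
print(np.round(c, 3))              # ≈ [0.8, 0.0]
```

    The hematoxylin channel produced this way is what the marker-controlled watershed of steps 2-4 would operate on.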



  18. Defining an Essence of Structure Determining Residue Contacts in Proteins

    PubMed Central

    Sathyapriya, R.; Duarte, Jose M.; Stehr, Henning; Filippis, Ioannis; Lappe, Michael

    2009-01-01

    The network of native non-covalent residue contacts determines the three-dimensional structure of a protein. However, not all contacts are of equal structural significance, and little knowledge exists about a minimal, yet sufficient, subset required to define the global features of a protein. Characterisation of this “structural essence” has remained elusive so far: no algorithmic strategy has been devised to-date that could outperform a random selection in terms of 3D reconstruction accuracy (measured as the Ca RMSD). It is not only of theoretical interest (i.e., for design of advanced statistical potentials) to identify the number and nature of essential native contacts—such a subset of spatial constraints is very useful in a number of novel experimental methods (like EPR) which rely heavily on constraint-based protein modelling. To derive accurate three-dimensional models from distance constraints, we implemented a reconstruction pipeline using distance geometry. We selected a test-set of 12 protein structures from the four major SCOP fold classes and performed our reconstruction analysis. As a reference set, series of random subsets (ranging from 10% to 90% of native contacts) are generated for each protein, and the reconstruction accuracy is computed for each subset. We have developed a rational strategy, termed “cone-peeling” that combines sequence features and network descriptors to select minimal subsets that outperform the reference sets. We present, for the first time, a rational strategy to derive a structural essence of residue contacts and provide an estimate of the size of this minimal subset. Our algorithm computes sparse subsets capable of determining the tertiary structure at approximately 4.8 Å Ca RMSD with as little as 8% of the native contacts (Ca-Ca and Cb-Cb). At the same time, a randomly chosen subset of native contacts needs about twice as many contacts to reach the same level of accuracy. 
This “structural essence” opens new avenues in the fields of structure prediction, empirical potentials and docking. PMID:19997489

  19. Comparative study of feature selection with ensemble learning using SOM variants

    NASA Astrophysics Data System (ADS)

    Filali, Ameni; Jlassi, Chiraz; Arous, Najet

    2017-03-01

    Ensemble learning has improved clustering stability and accuracy, but its runtime prohibits scaling up to real-world applications. This study addresses the problem of selecting a subset of the most pertinent features for every cluster from a dataset. The proposed method is an extension of the Random Forests approach using self-organizing map (SOM) variants on unlabeled data that estimates the out-of-bag feature importance from a set of partitions. Every partition is created using a different bootstrap sample and a random subset of the features. We then show that the internal estimates used to measure variable pertinence in Random Forests are also applicable to feature selection in unsupervised learning. This approach aims at dimensionality reduction, visualization and cluster characterization at the same time. We provide empirical results on nineteen benchmark data sets indicating that RFS can lead to significant improvement, in terms of clustering accuracy, over several state-of-the-art unsupervised methods, with a very limited subset of features. The approach shows promise for very broad domains.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ditzler, Gregory; Morrison, J. Calvin; Lan, Yemin

    Background: Some of the current software tools for comparative metagenomics provide ecologists with the ability to investigate and explore bacterial communities using α– & β–diversity. Feature subset selection – a sub-field of machine learning – can also provide a unique insight into the differences between metagenomic or 16S phenotypes. In particular, feature subset selection methods can obtain the operational taxonomic units (OTUs), or functional features, that have a high level of influence on the condition being studied. For example, in a previous study we have used information-theoretic feature selection to understand the differences between protein family abundances that best discriminate between age groups in the human gut microbiome. Results: We have developed a new Python command line tool, which is compatible with the widely adopted BIOM format, for microbial ecologists that implements information-theoretic subset selection methods for biological data formats. We demonstrate the software tool's capabilities on publicly available datasets. Conclusions: We have made the software implementation of Fizzy available to the public under the GNU GPL license. The standalone implementation can be found at http://github.com/EESI/Fizzy.
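
    The simplest of the information-theoretic filters such a tool exposes, a maximum-relevance ranking by mutual information, can be approximated in a few lines (the toy OTU table below is illustrative and not Fizzy's actual API):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def mim_select(features, labels, k):
    """Maximum-relevance (MIM) filter: rank features by MI with the label."""
    scores = {name: mutual_information(col, labels) for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

labels = [0, 0, 0, 0, 1, 1, 1, 1]          # phenotype per sample
features = {
    "otu_a": [0, 0, 0, 0, 1, 1, 1, 1],     # tracks the phenotype
    "otu_b": [0, 1, 0, 1, 0, 1, 0, 1],     # independent noise
}
print(mim_select(features, labels, 1))     # ['otu_a']
```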

  1. Restoration of STORM images from sparse subset of localizations (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Moiseev, Alexander A.; Gelikonov, Grigory V.; Gelikonov, Valentine M.

    2016-02-01

    To construct a Stochastic Optical Reconstruction Microscopy (STORM) image, one should collect a sufficient number of localized fluorophores to satisfy the Nyquist criterion. This requirement limits the time resolution of the method. In this work we propose a probabilistic approach to construct STORM images from a subset of localized fluorophores 3-4 times sparser than required by the Nyquist criterion. Using a set of STORM images constructed from a number of localizations sufficient for the Nyquist criterion, we derive a model which allows us to predict the probability of every location being occupied by a fluorophore at the end of a hypothetical acquisition, taking as input the distribution of already localized fluorophores in the proximity of that location. We show that the probability map obtained from a number of fluorophores 3-4 times smaller than required by the Nyquist criterion may be used as a super-resolution image itself. Thus we are able to construct a STORM image from a subset of localized fluorophores 3-4 times sparser than required by the Nyquist criterion, proportionally decreasing STORM data acquisition time. This method may be used in combination with other approaches designed to increase STORM time resolution.
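
    The Nyquist bookkeeping behind the claimed speed-up is simple arithmetic (the field size and target resolution below are illustrative):

```python
def nyquist_localizations(fov_um, resolution_nm):
    """Minimum 2-D localization count: the Nyquist criterion asks for a mean
    fluorophore spacing of half the target resolution, i.e. a density of
    (2 / resolution)^2 localizations per unit area."""
    res_um = resolution_nm / 1000.0
    density = (2.0 / res_um) ** 2      # localizations per square micron
    return density * fov_um ** 2

n_full = nyquist_localizations(10.0, 40.0)   # 10 um field, 40 nm resolution
print(round(n_full))        # 250000 localizations required by Nyquist
print(round(n_full / 3.5))  # roughly 3.5x fewer with the proposed approach
```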

  2. Research on allocation efficiency of the daisy chain allocation algorithm

    NASA Astrophysics Data System (ADS)

    Shi, Jingping; Zhang, Weiguo

    2013-03-01

    With the improvement of aircraft performance in reliability, maneuverability and survivability, the number of control effectors has increased considerably. How to distribute the three-axis moments over the control surfaces reasonably becomes an important problem. The daisy chain method is simple and easy to implement in the design of the allocation system, but it cannot solve the allocation problem for the entire attainable moment subset. For the lateral-directional allocation problem, the allocation efficiency of the daisy chain can be directly measured by the area of its subset of attainable moments. Because of the nonlinear allocation characteristic, the subset of attainable moments of the daisy-chain method is a complex non-convex polygon, which is difficult to solve for directly. By analyzing the two-dimensional allocation problem with a "micro-element" idea, a numerical calculation algorithm is proposed to compute the area of the non-convex polygon. In order to improve the allocation efficiency, a genetic algorithm with the allocation efficiency chosen as the fitness function is proposed to find the best pseudo-inverse matrix.
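
    The "micro-element" idea (cover the region with a fine grid and sum the elements that fall inside) can be sketched for an arbitrary simple, possibly non-convex polygon; the notched square below is a stand-in for an actual attainable moment set:

```python
def point_in_polygon(x, y, poly):
    """Even-odd ray casting: count edge crossings of a rightward ray."""
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def grid_area(poly, lo, hi, n=400):
    """Micro-element area estimate: n x n grid over [lo, hi]^2, summing
    the cells whose centers lie inside the polygon."""
    dx = (hi - lo) / n
    count = sum(point_in_polygon(lo + (i + 0.5) * dx, lo + (j + 0.5) * dx, poly)
                for i in range(n) for j in range(n))
    return count * dx * dx

# Unit square with a 0.5 x 0.5 notch removed: true area 0.75.
notched = [(0, 0), (1, 0), (1, 1), (0.5, 1), (0.5, 0.5), (0, 0.5)]
print(grid_area(notched, -0.1, 1.1))   # ≈ 0.75
```

    Refining the grid trades runtime for accuracy, which is why the paper pairs the area computation with a genetic search rather than evaluating it exhaustively.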

  3. Day-Ahead PM2.5 Concentration Forecasting Using WT-VMD Based Decomposition Method and Back Propagation Neural Network Improved by Differential Evolution

    PubMed Central

    Wang, Deyun; Liu, Yanling; Luo, Hongyuan; Yue, Chenqiang; Cheng, Sheng

    2017-01-01

    Accurate PM2.5 concentration forecasting is crucial for protecting public health and the atmospheric environment. However, the intermittent and unstable nature of PM2.5 concentration series makes forecasting it a very difficult task. In order to improve the forecast accuracy of PM2.5 concentration, this paper proposes a hybrid model based on wavelet transform (WT), variational mode decomposition (VMD) and a back propagation (BP) neural network optimized by the differential evolution (DE) algorithm. Firstly, WT is employed to disassemble the PM2.5 concentration series into a number of subsets with different frequencies. Secondly, VMD is applied to decompose each subset into a set of variational modes (VMs). Thirdly, the DE-BP model is utilized to forecast all the VMs. Fourthly, the forecast value of each subset is obtained by aggregating the forecast results of all the VMs obtained from the VMD decomposition of that subset. Finally, the final forecast series of PM2.5 concentration is obtained by adding up the forecast values of all subsets. Two PM2.5 concentration series, collected from Wuhan and Tianjin, respectively, both located in China, are used to test the effectiveness of the proposed model. The results demonstrate that the proposed model outperforms all the other models considered in this paper. PMID:28704955
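
    The aggregation steps rest on the decompositions being additive: the forecast of the original series is simply the sum of the component forecasts. A minimal NumPy sketch, with a moving-average split and a naive persistence forecaster standing in for WT/VMD and DE-BP:

```python
import numpy as np

def moving_average_split(x, window=12):
    """Stand-in for WT/VMD: split x into a smooth part and a residual
    that sum back to x exactly (an additive decomposition)."""
    smooth = np.convolve(x, np.ones(window) / window, mode="same")
    return smooth, x - smooth

def naive_forecast(component, horizon):
    """Stand-in for the DE-BP forecaster: repeat the last value."""
    return np.repeat(component[-1], horizon)

rng = np.random.default_rng(1)
pm25 = 50 + 10 * np.sin(np.arange(200) / 10) + rng.normal(0, 2, 200)

smooth, resid = moving_average_split(pm25)
forecast = naive_forecast(smooth, 5) + naive_forecast(resid, 5)  # add components
print(np.allclose(smooth + resid, pm25))   # True: nothing is lost in the split
```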

  4. Hubble Parameter and Baryon Acoustic Oscillation Measurement Constraints on the Hubble Constant, the Deviation from the Spatially Flat ΛCDM Model, the Deceleration–Acceleration Transition Redshift, and Spatial Curvature

    NASA Astrophysics Data System (ADS)

    Yu, Hai; Ratra, Bharat; Wang, Fa-Yin

    2018-03-01

    We compile a complete collection of reliable Hubble parameter H(z) data to redshift z ≤ 2.36 and use them with the Gaussian Process method to determine continuous H(z) functions for various data subsets. From these continuous H(z)'s, summarizing across the data subsets considered, we find H_0 ∼ 67 ± 4 km s⁻¹ Mpc⁻¹, more consistent with the recent lower values determined using a variety of techniques. In most data subsets, we see a cosmological deceleration–acceleration transition at 2σ significance, with the subsets' transition redshifts varying over 0.33 < z_da < 1.0 at 1σ significance. We find that the flat-ΛCDM model is consistent with the H(z) data to a z of 1.5 to 2.0, depending on the data subset considered, with 2σ deviations from flat-ΛCDM above this redshift range. Using the continuous H(z) with baryon acoustic oscillation distance-redshift observations, we constrain the current spatial curvature density parameter to be Ω_K0 = −0.03 ± 0.21, consistent with a flat universe, but the large error bar does not rule out small values of spatial curvature that are now under debate.

  5. Algorithms for Learning Preferences for Sets of Objects

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri L.; desJardins, Marie; Eaton, Eric

    2010-01-01

    A method is being developed that provides for an artificial-intelligence system to learn a user's preferences for sets of objects and to thereafter automatically select subsets of objects according to those preferences. The method was originally intended to enable automated selection, from among large sets of images acquired by instruments aboard spacecraft, of image subsets considered to be scientifically valuable enough to justify use of limited communication resources for transmission to Earth. The method is also applicable to other sets of objects: examples of sets of objects considered in the development of the method include food menus, radio-station music playlists, and assortments of colored blocks for creating mosaics. The method does not require the user to perform the often-difficult task of quantitatively specifying preferences; instead, the user provides examples of preferred sets of objects. This method goes beyond related prior artificial-intelligence methods for learning which individual items are preferred by the user: this method supports a concept of setbased preferences, which include not only preferences for individual items but also preferences regarding types and degrees of diversity of items in a set. Consideration of diversity in this method involves recognition that members of a set may interact with each other in the sense that when considered together, they may be regarded as being complementary, redundant, or incompatible to various degrees. The effects of such interactions are loosely summarized in the term portfolio effect. The learning method relies on a preference representation language, denoted DD-PREF, to express set-based preferences. In DD-PREF, a preference is represented by a tuple that includes quality (depth) functions to estimate how desired a specific value is, weights for each feature preference, the desired diversity of feature values, and the relative importance of diversity versus depth. 
The system applies statistical concepts to estimate quantitative measures of the user's preferences from training examples (preferred subsets) specified by the user. Once preferences have been learned, the system uses those preferences to select preferred subsets from new sets. The method was found to be viable when tested in computational experiments on menus, music playlists, and rover images. Contemplated future development efforts include further tests on more diverse sets and development of a sub-method for (a) estimating the parameter that represents the relative importance of diversity versus depth, and (b) incorporating background knowledge about the nature of quality functions, which are special functions that specify depth preferences for features.
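
    A much-simplified scoring function in the spirit of DD-PREF (the real language uses per-feature depth functions and weights; the `quality` function, the diversity measure and α below are illustrative choices) shows how a diverse set can outscore a set of individually better items:

```python
def set_score(items, quality, target_diversity, alpha=0.6):
    """Blend average item quality ("depth") with how closely the set's
    spread matches the desired diversity (here: fraction of distinct values)."""
    depth = sum(quality(v) for v in items) / len(items)
    diversity = (len(set(items)) - 1) / max(len(items) - 1, 1)
    return alpha * depth + (1 - alpha) * (1 - abs(diversity - target_diversity))

# Playlist example: quality prefers tempos near 120 BPM; the user wants
# a maximally diverse set (target diversity 1.0).
quality = lambda bpm: max(0.0, 1.0 - abs(bpm - 120) / 60.0)
varied = [100, 115, 120, 125]
monotone = [120, 120, 120, 120]
print(set_score(varied, quality, 1.0) > set_score(monotone, quality, 1.0))  # True
```

    With the diversity target at 1.0, the varied playlist wins despite its lower average quality; with the target at 0.0, the preference flips, which is the portfolio effect the method captures.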

  6. Numerical recovery of certain discontinuous electrical conductivities

    NASA Technical Reports Server (NTRS)

    Bryan, Kurt

    1991-01-01

    The inverse problem of recovering an electrical conductivity of the form γ(x) = 1 + (k − 1)χ_D(x), where χ_D is the characteristic function of D, on a region Ω ⊂ R² from boundary data is considered, where D ⊂ Ω and k is some positive constant. A linearization of the forward problem is formed and used in a least-squares output method for approximately solving the inverse problem. Convergence results are proved and some numerical results are presented.
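
    The linearize-then-least-squares strategy can be shown on a toy analogue (the actual forward map involves solving a PDE; here a random sensitivity matrix J stands in for the linearized boundary-measurement operator):

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(20, 3))              # linearized forward map (Jacobian)
true_delta = np.array([0.0, 2.0, 0.0])    # e.g. (k - 1) on one candidate inclusion
data = J @ true_delta + rng.normal(0.0, 1e-3, 20)   # noisy "boundary" data

# Least-squares output method on the linearized problem: min ||J d - data||.
delta, *_ = np.linalg.lstsq(J, data, rcond=None)
print(np.round(delta, 2))                 # ≈ [0, 2, 0]
```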

  7. Applications and development of new algorithms for displacement analysis using InSAR time series

    NASA Astrophysics Data System (ADS)

    Osmanoglu, Batuhan

    Time series analysis of Synthetic Aperture Radar Interferometry (InSAR) data has become an important scientific tool for monitoring and measuring the displacement of Earth's surface due to a wide range of phenomena, including earthquakes, volcanoes, landslides, changes in ground water levels, and wetlands. Time series analysis is a product of interferometric phase measurements, which become ambiguous when the observed motion is larger than half of the radar wavelength. Thus, phase observations must first be unwrapped in order to obtain physically meaningful results. Persistent Scatterer Interferometry (PSI), Stanford Method for Persistent Scatterers (StaMPS), Small Baseline Subset (SBAS) and Small Temporal Baseline Subset (STBAS) algorithms solve for this ambiguity using a series of spatio-temporal unwrapping algorithms and filters. In this dissertation, I improve upon current phase unwrapping algorithms, and apply the PSI method to study subsidence in Mexico City. PSI was used to obtain unwrapped deformation rates in Mexico City (Chapter 3), where ground water withdrawal in excess of natural recharge causes subsurface, clay-rich sediments to compact. This study is based on 23 satellite SAR scenes acquired between January 2004 and July 2006. Time series analysis of the data reveals a maximum line-of-sight subsidence rate of 300 mm/yr at a high enough resolution that individual subsidence rates for large buildings can be determined. Differential motion and related structural damage along an elevated metro rail were evident from the results. Comparison of PSI subsidence rates with data from permanent GPS stations indicates root mean square (RMS) agreement of 6.9 mm/yr, about the level expected based on joint data uncertainty. The Mexico City results suggest negligible recharge, implying continuing degradation and loss of the aquifer in the third largest metropolitan area in the world. 
Chapters 4 and 5 illustrate the link between time series analysis and three-dimensional (3-D) phase unwrapping. Chapter 4 focuses on the unwrapping path. Unwrapping algorithms can be divided into two groups, path-dependent and path-independent algorithms. Path-dependent algorithms use local unwrapping functions applied pixel-by-pixel to the dataset. In contrast, path-independent algorithms use global optimization methods such as least squares, and return a unique solution. However, when aliasing and noise are present, path-independent algorithms can underestimate the signal in some areas due to global fitting criteria. Path-dependent algorithms do not underestimate the signal, but, as the name implies, the unwrapping path can affect the result. Comparison between existing path algorithms and a newly developed algorithm based on Fisher information theory was conducted. Results indicate that Fisher information theory does indeed produce lower misfit results for most tested cases. Chapter 5 presents a new time series analysis method based on 3-D unwrapping of SAR data using extended Kalman filters. Existing methods for time series generation using InSAR data employ special filters to combine two-dimensional (2-D) spatial unwrapping with one-dimensional (1-D) temporal unwrapping results. The new method, however, combines observations in azimuth, range and time for repeat pass interferometry. Due to the pixel-by-pixel characteristic of the filter, the unwrapping path is selected based on a quality map. This unwrapping algorithm is the first application of extended Kalman filters to the 3-D unwrapping problem. Time series analyses of InSAR data are used in a variety of applications with different characteristics. Consequently, it is difficult to develop a single algorithm that can provide optimal results in all cases, given that different algorithms possess a unique set of strengths and weaknesses. 
Nonetheless, filter-based unwrapping algorithms such as the one presented in this dissertation have the capability of joining multiple observations into a uniform solution, which is becoming an important feature with continuously growing datasets.
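    The half-wavelength ambiguity described above can be illustrated with a minimal 1-D temporal unwrapping sketch. The wavelength and the noise-free displacement series below are invented for illustration; real InSAR unwrapping is a far harder 2-D or 3-D problem, which is the point of the filter-based algorithms in this dissertation.

```python
import numpy as np

wavelength = 0.031  # approximate X-band wavelength in metres (illustrative)

# Hypothetical line-of-sight displacement history (metres), exceeding
# half a wavelength, so the raw phase wraps several times
true_disp = np.linspace(0.0, 0.06, 25)

# Interferometric phase is only observed modulo 2*pi
true_phase = 4 * np.pi * true_disp / wavelength
wrapped = np.angle(np.exp(1j * true_phase))  # wrapped into (-pi, pi]

# 1-D unwrapping: integrate wrapped phase differences, assuming
# successive samples differ by less than pi
unwrapped = np.unwrap(wrapped)
recovered_disp = unwrapped * wavelength / (4 * np.pi)

print(np.allclose(recovered_disp, true_disp))  # -> True
```

    The assumption that consecutive samples differ by less than pi is exactly what breaks down with noisy or aliased data, motivating the quality-map-guided and Kalman-filter approaches described above.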

  8. Environmental Asbestos Assessment Manual: Superfund Method for the Determination of Asbestos in Ambient Air, Part 1: Method

    EPA Science Inventory

    This is a sampling and analysis method for the determination of asbestos in air. Samples are analyzed by transmission electron microscopy (TEM). Although a small subset of samples are to be prepared using a direct procedure, the majority of samples analyzed using this method wil...

  9. Computing convex quadrangulations☆

    PubMed Central

    Schiffer, T.; Aurenhammer, F.; Demuth, M.

    2012-01-01

    We use projected Delaunay tetrahedra and a maximum independent set approach to compute large subsets of convex quadrangulations on a given set of points in the plane. The new method improves over the popular pairing method based on triangulating the point set. PMID:22389540

  10. Missing value imputation in DNA microarrays based on conjugate gradient method.

    PubMed

    Dorri, Fatemeh; Azmi, Paeiz; Dorri, Faezeh

    2012-02-01

    Analysis of gene expression profiles needs a complete matrix of gene array values; consequently, imputation methods have been suggested. In this paper, an algorithm based on the conjugate gradient (CG) method is proposed to estimate missing values. The k-nearest neighbors of the missing entry are first selected based on the absolute values of their Pearson correlation coefficients. Then a subset of genes among the k-nearest neighbors is labeled as the best similar ones. The CG algorithm, with this subset as its input, is then used to estimate the missing values. Our proposed CG-based algorithm (CGimpute) is evaluated on different data sets. The results are compared with the sequential local least squares (SLLSimpute), Bayesian principal component analysis (BPCAimpute), local least squares imputation (LLSimpute), iterated local least squares imputation (ILLSimpute) and adaptive k-nearest neighbors imputation (KNNKimpute) methods. The average of the normalized root mean square error (NRMSE) and relative NRMSE in different data sets with various missing rates shows that CGimpute outperforms the other methods. Copyright © 2011 Elsevier Ltd. All rights reserved.
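    The neighbor-selection-plus-regression idea can be sketched as follows. All names are illustrative, not from the paper, and the CG solver itself is replaced here by numpy's dense least-squares routine; CGimpute's actual neighbor-labeling step is not reproduced.

```python
import numpy as np

def impute_missing(expr, row, col, k=5):
    """Estimate expr[row, col] from the k genes most correlated with gene `row`.

    The least-squares fit over the observed columns is solved with
    numpy's lstsq here; the paper solves it with conjugate gradients.
    """
    mask = np.ones(expr.shape[1], dtype=bool)
    mask[col] = False  # regress on the observed entries only

    target = expr[row, mask]
    others = np.delete(np.arange(expr.shape[0]), row)

    # Rank candidate genes by |Pearson correlation| with the target gene
    corr = np.array([abs(np.corrcoef(expr[g, mask], target)[0, 1])
                     for g in others])
    neighbors = others[np.argsort(corr)[::-1][:k]]

    # Fit target ~ A @ w on observed columns, then predict the missing one
    A = expr[neighbors][:, mask].T
    w, *_ = np.linalg.lstsq(A, target, rcond=None)
    return expr[neighbors, col] @ w
```

    On data where the target gene is an exact linear combination of a neighbor, the regression recovers the missing entry exactly; on real microarray data the quality of the estimate depends on how informative the selected neighbors are.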

  11. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    PubMed

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor. Furthermore, we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods, greatly increasing the convergence speed in early iterations. Moreover, applying the TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate.
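    The ordered-subsets principle that this work accelerates can be sketched on a toy least-squares problem. The sketch below uses plain gradient steps with an interleaved subset split and a conservative step size (all assumptions of this sketch); the paper's power factor and TV term are not reproduced.

```python
import numpy as np

def os_least_squares(A, b, n_subsets=4, n_iters=100):
    """Ordered-subsets gradient descent for min_x ||Ax - b||^2.

    One outer iteration makes n_subsets updates, each using only a
    subset of the rows -- the source of the fast early convergence
    that OS methods exploit.
    """
    m, n = A.shape
    x = np.zeros(n)
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    # Conservative step size so every subset update is stable
    step = 1.0 / (n_subsets * np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iters):
        for idx in subsets:
            As, bs = A[idx], b[idx]
            # The n_subsets scaling makes the subset gradient
            # approximate the full gradient (the OS principle)
            x -= step * n_subsets * As.T @ (As @ x - bs)
    return x
```

    Each pass over the data now yields n_subsets updates instead of one, which is why OS schemes converge quickly in early iterations even before any additional acceleration is applied.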

  12. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization

    PubMed Central

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor. Furthermore, we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods, greatly increasing the convergence speed in early iterations. Moreover, applying the TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate. PMID:27073853

  13. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  14. An ℓ2,1-norm regularized multi-kernel learning for false positive reduction in lung nodule CAD.

    PubMed

    Cao, Peng; Liu, Xiaoli; Zhang, Jian; Li, Wei; Zhao, Dazhe; Huang, Min; Zaiane, Osmar

    2017-03-01

    The aim of this paper is to describe a novel algorithm for false positive reduction in lung nodule Computer Aided Detection (CAD). In this paper, we describe a new CT lung CAD method which aims to detect solid nodules. Specifically, we propose a multi-kernel classifier with an ℓ2,1-norm regularizer for heterogeneous feature fusion and selection at the feature subset level, and design two efficient strategies to optimize the kernel weights in the non-smooth ℓ2,1-regularized multiple kernel learning algorithm. The first optimization algorithm adapts a proximal gradient method for solving the ℓ2,1 norm of the kernel weights, accelerated in the style of FISTA; the second employs an iterative scheme based on an approximate gradient descent method. The results demonstrate that the FISTA-style accelerated proximal descent method is efficient for the ℓ2,1-norm formulation of multiple kernel learning, with a theoretical guarantee on the convergence rate. Moreover, the experimental results demonstrate the effectiveness of the proposed methods in terms of Geometric mean (G-mean) and Area under the ROC curve (AUC), with significant improvement over the competing methods. The proposed approach exhibits some remarkable advantages in both the heterogeneous feature subset fusion and classification phases. Compared with fusion strategies at the feature level and decision level, the proposed ℓ2,1-norm multi-kernel learning algorithm is able to accurately fuse the complementary and heterogeneous feature sets, and automatically prune irrelevant and redundant feature subsets to form a more discriminative feature set, leading to promising classification performance. Moreover, the proposed algorithm consistently outperforms comparable classification approaches in the literature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
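    The block-wise pruning effect of the ℓ2,1 regularizer comes from its proximal operator, which is group soft-thresholding. The sketch below is the generic operator used inside FISTA-style proximal methods, not the paper's exact kernel-weight formulation.

```python
import numpy as np

def prox_l21(W, lam):
    """Proximal operator of lam * sum_g ||W[g]||_2 (row-wise l2,1 norm).

    Each row (a kernel/feature group) is shrunk toward zero as a block;
    rows whose norm falls below lam are zeroed out entirely, which is
    how the l2,1 regularizer prunes whole feature subsets.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return scale * W
```

    For example, with lam = 1 a row of norm 5 is scaled by 0.8, while a row of norm 0.14 is set to zero, removing that feature group from the model in one step.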

  15. Sensitivity of control-augmented structure obtained by a system decomposition method

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Bloebaum, Christina L.; Hajela, Prabhat

    1988-01-01

    The verification of a method for computing sensitivity derivatives of a coupled system is presented. The method deals with a system whose analysis can be partitioned into subsets that correspond to disciplines and/or physical subsystems that exchange input-output data with each other. The method uses the partial sensitivity derivatives of the output with respect to input, obtained for each subset separately, to assemble a set of linear, simultaneous, algebraic equations that are solved for the derivatives of the coupled system response. This sensitivity analysis is verified using an example of a cantilever beam augmented with an active control system to limit the beam's dynamic displacements under an excitation force. The verification shows good agreement of the method with reference data obtained by a finite difference technique involving entire-system analysis. The usefulness of the system sensitivity method in optimization applications is demonstrated by employing a piecewise-linear approach on the same numerical example. The method's principal merits are its intrinsically superior accuracy in comparison with the finite difference technique, and its compatibility with the traditional division of work in complex engineering tasks among specialty groups.
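    The assembly step can be sketched on a hypothetical two-subsystem example. The coefficients below are invented for illustration; the paper's beam/control example is not reproduced.

```python
import numpy as np

# Two hypothetical coupled subsystems:
#   y1 = x + 0.5*y2        (subsystem 1)
#   y2 = 2*x + 0.25*y1     (subsystem 2)
# Partial sensitivities, computed for each subset separately:
df1_dy2, df1_dx = 0.5, 1.0
df2_dy1, df2_dx = 0.25, 2.0

# Assemble the linear simultaneous equations and solve for the
# total derivatives dy1/dx, dy2/dx of the coupled system
lhs = np.array([[1.0, -df1_dy2],
                [-df2_dy1, 1.0]])
rhs = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(lhs, rhs)

# Analytic solution of the coupled pair: y1 = 16x/7, y2 = 18x/7
print(dy_dx)  # -> approximately [2.2857, 2.5714]
```

    Only subsystem-local partials enter the assembled system, which is why each discipline group can contribute its own derivatives independently.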

  16. Mediating Effects of Patients' Stigma and Self-Efficacy on Relationships Between Doctors' Empathy Abilities and Patients' Cellular Immunity in Male Breast Cancer Patients.

    PubMed

    Yang, Ningxi; Cao, Yingnan; Li, Xiaoyan; Li, Shiyue; Yan, Hong; Geng, Qingshan

    2018-06-12

    BACKGROUND Doctors' empathy is closely related to patients' health. This study aimed to examine whether patients' stigma and self-efficacy play a mediating role in the relationship between doctors' empathy abilities and patients' cellular immunity in male patients with breast cancer. MATERIAL AND METHODS Doctors' empathy scores and patients' demographic data, disease condition, stigma, and self-efficacy were measured. Patients' T cell subsets were tested at admission and 3 months after the operation and were compared by paired t test. The multivariate linear regression model was applied to analyze the factors influencing the immune index. Pearson correlation analysis and structural equation modeling were applied to explore the relationships among patients' stigma, self-efficacy, and cellular immunity and doctors' empathy abilities. RESULTS At the 2 time points, only the change in the NK subset was statistically significant, while the changes in percentage of CD3+, CD4+, CD8+, and B cells were not statistically significant. The doctors' empathy abilities were negatively correlated with patients' stigma and were positively related to patients' self-efficacy. Patients' stigma was negatively related to the NK subset, while self-efficacy was positively associated with the NK subset. Patients' stigma and self-efficacy played a mediating role in the relationship between doctors' empathy abilities and patients' NK subset, and stigma had a stronger effect than self-efficacy. CONCLUSIONS Doctors' empathy abilities affected breast cancer patients' NK subset through their stigma and self-efficacy. The mental health of male breast cancer patients needs more attention, and empathy education needs to be improved.

  17. Ordered mapping of 3 alphoid DNA subsets on human chromosome 22

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonacci, R.; Baldini, A.; Archidiacono, N.

    1994-09-01

    Alpha satellite DNA consists of tandemly repeated monomers of 171 bp clustered in the centromeric region of primate chromosomes. Sequence divergence between subsets located in different human chromosomes is usually high enough to ensure chromosome-specific hybridization. Alphoid probes specific for almost every human chromosome have been reported. A single chromosome can carry different subsets of alphoid DNA and some alphoid subsets can be shared by different chromosomes. We report the physical order of three alphoid DNA subsets on human chromosome 22 determined by a combination of low and high resolution cytological mapping methods. Results visually demonstrate the presence of three distinct alphoid DNA domains at the centromeric region of chromosome 22. We have measured the interphase distances between the three probes in three-color FISH experiments. Statistical analysis of the results indicated the order of the subsets. Two-color experiments on prometaphase chromosomes established the order of the three domains relative to the arms of chromosome 22 and confirmed the results obtained using interphase mapping. This demonstrates the applicability of interphase mapping for alpha satellite DNA ordering. However, in our experiments, interphase mapping did not provide any information about the relationship between extremities of the repeat arrays. This information was gained from extended chromatin hybridization. The extremities of two of the repeat arrays were seen to be almost overlapping whereas the third repeat array was clearly separated from the other two. Our data show the value of extended chromatin hybridization as a complement to other cytological techniques for high resolution mapping of repetitive DNA sequences.

  18. Image segmentation via foreground and background semantic descriptors

    NASA Astrophysics Data System (ADS)

    Yuan, Ding; Qiang, Jingjing; Yin, Jihao

    2017-09-01

    In the field of image processing, it has been a challenging task to obtain a complete foreground that is not uniform in color or texture. Unlike other methods, which segment the image by only using low-level features, we present a segmentation framework, in which high-level visual features, such as semantic information, are used. First, the initial semantic labels were obtained by using the nonparametric method. Then, a subset of the training images, with a similar foreground to the input image, was selected. Consequently, the semantic labels could be further refined according to the subset. Finally, the input image was segmented by integrating the object affinity and refined semantic labels. State-of-the-art performance was achieved in experiments with the challenging MSRC 21 dataset.

  19. Unbiased Analysis of TCRα/β Chains at the Single-Cell Level in Human CD8+ T-Cell Subsets

    PubMed Central

    Sun, Xiaoming; Saito, Masumichi; Sato, Yoshinori; Chikata, Takayuki; Naruto, Takuya; Ozawa, Tatsuhiko; Kobayashi, Eiji; Kishi, Hiroyuki; Muraguchi, Atsushi; Takiguchi, Masafumi

    2012-01-01

    T-cell receptor (TCR) α/β chains are expressed on the surface of CD8+ T-cells and have been implicated in antigen recognition, activation, and proliferation. However, the methods for characterization of human TCRα/β chains have not been well established, largely because of the complexity of their structures owing to the extensive genetic rearrangements that they undergo. Here we report the development of an integrated 5′-RACE and multiplex PCR method to amplify the full-length transcripts of TCRα/β at the single-cell level in human CD8+ subsets, including naive, central memory, early effector memory, late effector memory, and effector phenotypic cells. Using this method, with approximate PCR success rates of 47% and 62% for the TCRα and TCRβ chains, respectively, we were able to analyze more than 1,000 reads of transcripts of each TCR chain. Our comprehensive analysis revealed the following: (1) chimeric rearrangements of TCRδ-α, (2) control of TCRα/β transcription with multiple transcriptional initiation sites, (3) altered utilization of TCRα/β chains in CD8+ subsets, and (4) strong association between the clonal size of TCRα/β chains and the effector phenotype of CD8+ T-cells. Based on these findings, we conclude that our method is a useful tool to identify the dynamics of the TCRα/β repertoire, and provides new insights into the study of human TCRα/β chains. PMID:22792299

  20. Unbiased analysis of TCRα/β chains at the single-cell level in human CD8+ T-cell subsets.

    PubMed

    Sun, Xiaoming; Saito, Masumichi; Sato, Yoshinori; Chikata, Takayuki; Naruto, Takuya; Ozawa, Tatsuhiko; Kobayashi, Eiji; Kishi, Hiroyuki; Muraguchi, Atsushi; Takiguchi, Masafumi

    2012-01-01

    T-cell receptor (TCR) α/β chains are expressed on the surface of CD8(+) T-cells and have been implicated in antigen recognition, activation, and proliferation. However, the methods for characterization of human TCRα/β chains have not been well established, largely because of the complexity of their structures owing to the extensive genetic rearrangements that they undergo. Here we report the development of an integrated 5'-RACE and multiplex PCR method to amplify the full-length transcripts of TCRα/β at the single-cell level in human CD8(+) subsets, including naive, central memory, early effector memory, late effector memory, and effector phenotypic cells. Using this method, with approximate PCR success rates of 47% and 62% for the TCRα and TCRβ chains, respectively, we were able to analyze more than 1,000 reads of transcripts of each TCR chain. Our comprehensive analysis revealed the following: (1) chimeric rearrangements of TCRδ-α, (2) control of TCRα/β transcription with multiple transcriptional initiation sites, (3) altered utilization of TCRα/β chains in CD8(+) subsets, and (4) strong association between the clonal size of TCRα/β chains and the effector phenotype of CD8(+) T-cells. Based on these findings, we conclude that our method is a useful tool to identify the dynamics of the TCRα/β repertoire, and provides new insights into the study of human TCRα/β chains.

  1. Sensitivity Analysis for Atmospheric Infrared Sounder (AIRS) CO2 Retrieval

    NASA Technical Reports Server (NTRS)

    Gat, Ilana

    2012-01-01

    The Atmospheric Infrared Sounder (AIRS) is a thermal infrared sensor able to retrieve the daily atmospheric state globally for clear as well as partially cloudy fields-of-view. The AIRS spectrometer has 2378 channels sensing from 15.4 micrometers to 3.7 micrometers, of which a small subset in the 15 micrometers region has been selected, to date, for CO2 retrieval. To improve upon the current retrieval method, we extended the retrieval calculations to include a prior estimate component and developed a channel ranking system to optimize the channels and number of channels used. The channel ranking system uses a mathematical formalism to rapidly process and assess the retrieval potential of large numbers of channels. Implementing this system, we identified a larger optimized subset of AIRS channels that can decrease retrieval errors and minimize the overall sensitivity to other interfering contributors, such as water vapor, ozone, and atmospheric temperature. This methodology selects channels globally by accounting for the latitudinal, longitudinal, and seasonal dependencies of the subset. The new methodology increases accuracy in AIRS CO2 as well as other retrievals and enables the extension of retrieved CO2 vertical profiles to altitudes ranging from the lower troposphere to the upper stratosphere. The extended retrieval method estimates CO2 vertical profiles using a maximum-likelihood estimation method. We use model data to demonstrate the beneficial impact of the extended retrieval method, using the new channel ranking system, on CO2 retrieval.

  2. On sample size and different interpretations of snow stability datasets

    NASA Astrophysics Data System (ADS)

    Schirmer, M.; Mitterer, C.; Schweizer, J.

    2009-04-01

    Interpretations of snow stability variations need an assessment of the stability itself, independent of the scale investigated in the study. Studies on stability variations at a regional scale have often chosen stability tests such as the Rutschblock test, or combinations of various tests, in order to detect differences in aspect and elevation. The question arose: how capable are such stability interpretations of supporting these conclusions? There are at least three possible error sources: (i) the variance of the stability test itself; (ii) the stability variance at the underlying slope scale; and (iii) the possibility that the stability interpretation is not directly related to the probability of skier triggering. Various stability interpretations have been proposed in the past that provide partly different results. We compared a subjective one based on expert knowledge with a more objective one based on a measure derived from comparing skier-triggered slopes vs. slopes that had been skied but not triggered. In this study, the uncertainties are discussed and their effects on regional-scale stability variations are quantified in a pragmatic way. An existing dataset with very large sample sizes was revisited. This dataset contained the variance of stability at a regional scale for several situations. The stability in this dataset was determined using the subjective interpretation scheme based on expert knowledge. The question to be answered was how many measurements were needed to obtain similar results (mainly stability differences in aspect or elevation) as with the complete dataset. The optimal sample size was obtained in several ways: (i) assuming a nominal data scale, the sample size was determined with a given test, significance level and power, and by calculating the mean and standard deviation of the complete dataset. With this method it can also be determined whether the complete dataset itself has an appropriate sample size.
(ii) Smaller subsets were created with similar aspect distributions to the large dataset. We used 100 different subsets for each sample size. Statistical variations obtained in the complete dataset were also tested on the smaller subsets using the Mann-Whitney or the Kruskal-Wallis test. For each subset size, the number of subsets in which the significance level was reached was counted. For these tests no nominal data scale was assumed. (iii) For the same subsets described above, the distribution of the aspect median was determined, and we counted how often this distribution differed substantially from the distribution obtained with the complete dataset. Since two valid stability interpretations were available (an objective and a subjective interpretation as described above), the effect of the arbitrary choice of interpretation on spatial variability results was tested. In over one third of the cases the two interpretations came to different results. The effect of these differences was studied with a method similar to that described in (iii): the distribution of the aspect median was determined for subsets of the complete dataset using both interpretations, and compared against each other as well as to the results of the complete dataset. For the complete dataset the two interpretations showed mainly identical results. Therefore the subset size was determined from the point at which the results of the two interpretations converged. A universal result for the optimal subset size cannot be presented since results differed between different situations contained in the dataset. The optimal subset size is thus dependent on the stability variation in a given situation, which is unknown initially. There are indications that for some situations even the complete dataset might not be large enough. At a subset size of approximately 25, the significant differences between aspect groups (as determined using the whole dataset) were only obtained in one out of five situations.
In some situations, up to 20% of the subsets showed a substantially different distribution of the aspect median. Thus, in most cases, 25 measurements (which can be achieved by six two-person teams in one day) did not allow reliable conclusions to be drawn.
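    The subset-counting procedure in (ii) can be sketched generically. The stability scores below are hypothetical continuous values (the expert interpretation scheme and the actual data are not reproduced), and the Mann-Whitney test uses a plain normal approximation without tie handling.

```python
import numpy as np

def mannwhitney_z(a, b):
    """Normal-approximation z-score of the Mann-Whitney U statistic
    (no tie correction; adequate for continuous scores)."""
    n1, n2 = len(a), len(b)
    ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1
    u = ranks[:n1].sum() - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mu) / sigma

def detection_rate(pop_a, pop_b, subset_size, n_trials=500,
                   z_crit=1.96, seed=0):
    """Fraction of random subsets in which the a-vs-b difference
    reaches significance -- a Monte Carlo power estimate."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        a = rng.choice(pop_a, subset_size, replace=False)
        b = rng.choice(pop_b, subset_size, replace=False)
        if abs(mannwhitney_z(a, b)) > z_crit:
            hits += 1
    return hits / n_trials
```

    Running this for increasing subset sizes shows how quickly (or slowly) a real aspect difference becomes detectable, which is exactly the question the study addresses with its 100-subset resampling.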

  3. A Method for Establishing a Depreciated Monetary Value for Print Collections.

    ERIC Educational Resources Information Center

    Marman, Edward

    1995-01-01

    Outlines a method for establishing a depreciated value of a library collection and includes an example of applying the formula for calculating depreciation. The method is based on the useful life of books, other print, and audio visual materials; their original cost; and on sampling subsets or sections of the collection. (JKP)
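    A minimal sketch of such a calculation, assuming straight-line depreciation and entirely invented sample figures (the article's actual useful-life values and formula details are not reproduced):

```python
# Hypothetical figures: per-section means estimated from a sampled subset.
# Straight-line depreciation: value = cost * max(0, 1 - age / useful_life).
sections = {
    # section: (mean original cost, mean age, useful life in years, holdings)
    "reference":   (55.0, 4.0, 10.0, 2000),
    "circulating": (25.0, 8.0, 12.0, 15000),
    "audiovisual": (30.0, 6.0,  6.0,  1200),
}

total = 0.0
for name, (cost, age, life, count) in sections.items():
    per_item = cost * max(0.0, 1.0 - age / life)
    total += per_item * count
    print(f"{name}: ${per_item:.2f} per item")

print(f"estimated depreciated collection value: ${total:,.2f}")
# -> estimated depreciated collection value: $191,000.00
```

    Sampling subsets of each section keeps the estimate tractable for large collections; items past their useful life (the audiovisual section here) contribute nothing to the depreciated value.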

  4. A new method of passive modifications for partial frequency assignment of general structures

    NASA Astrophysics Data System (ADS)

    Belotti, Roberto; Ouyang, Huajiang; Richiedei, Dario

    2018-01-01

    The assignment of a subset of natural frequencies to vibrating systems can be conveniently achieved by means of suitable structural modifications. It has been observed that such an approach usually leads to the undesired change of the unassigned natural frequencies, which is a phenomenon known as frequency spill-over. Such an issue has been dealt with in the literature only in simple specific cases. In this paper, a new and general method is proposed that aims to assign a subset of natural frequencies with low spill-over. The optimal structural modifications are determined through a three-step procedure that considers both the prescribed eigenvalues and the feasibility constraints, assuring that the obtained solution is physically realizable. The proposed method is therefore applicable to very general vibrating systems, such as those obtained through the finite element method. The numerical difficulties that may occur as a result of employing the method are also carefully addressed. Finally, the capabilities of the method are validated in three test-cases in which both lumped and distributed parameters are modified to obtain the desired eigenvalues.

  5. Evaluation of intranuclear BrdU detection procedures for use in multicolor flow cytometry*

    PubMed Central

    Rothaeusler, Kristina; Baumgarth, Nicole

    2010-01-01

    Background Measurement of cell proliferation via BrdU incorporation in combination with multicolor cell surface staining would facilitate studies on cell subsets that require multiple markers for their identification. However, the extent to which the often harsh cell preparation procedures required affect the staining quality of more recently developed fluorescent dyes has not been assessed. Methods Three cell preparation protocols for BrdU measurement were compared for their ability to maintain fluorescent surface staining and scatter parameters of in vivo BrdU-labeled cells by flow cytometry. A 10-color fluorescent panel was developed to test the quality of surface staining following cell treatment and the ability to perform BrdU measurements on even small B lymphocyte subsets. Results All cell preparation procedures affected the quality of fluorescent and/or scatter parameters to varying degrees. Paraformaldehyde / saponin-based procedures preserved sufficient fluorescent surface staining to determine BrdU incorporation rates among all splenic B cell subsets, including B-1a cells, which constitute roughly 0.5% of cells. Turnover rates of B-1a cells were similar to immature B cells and higher than those of the other mature B cell subsets. Conclusion Paraformaldehyde / saponin-based cell preparation procedures facilitate detailed cell turnover studies on small cell subsets in vivo, revealing new functional information on rare cell populations. PMID:16538653

  6. Theoretical analysis on the measurement errors of local 2D DIC: Part I temporal and spatial uncertainty quantification of displacement measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yueqi; Lava, Pascal; Reu, Phillip

    This study presents a theoretical uncertainty quantification of displacement measurements by subset-based 2D-digital image correlation. A generalized solution to estimate the random error of displacement measurement is presented. The obtained solution suggests that the random error of displacement measurements is determined by the image noise, the summation of the intensity gradient in a subset, the subpixel part of displacement, and the interpolation scheme. The proposed method is validated with virtual digital image correlation tests.

  7. Theoretical analysis on the measurement errors of local 2D DIC: Part I temporal and spatial uncertainty quantification of displacement measurements

    DOE PAGES

    Wang, Yueqi; Lava, Pascal; Reu, Phillip; ...

    2015-12-23

    This study presents a theoretical uncertainty quantification of displacement measurements by subset-based 2D-digital image correlation. A generalized solution to estimate the random error of displacement measurement is presented. The obtained solution suggests that the random error of displacement measurements is determined by the image noise, the summation of the intensity gradient in a subset, the subpixel part of displacement, and the interpolation scheme. The proposed method is validated with virtual digital image correlation tests.

  8. System and method for progressive band selection for hyperspectral images

    NASA Technical Reports Server (NTRS)

    Fisher, Kevin (Inventor)

    2013-01-01

    Disclosed herein are systems, methods, and non-transitory computer-readable storage media for progressive band selection for hyperspectral images. A system having a module configured to control a processor to practice the method calculates the virtual dimensionality of a hyperspectral image having multiple bands to determine a quantity Q of how many bands are needed for a threshold level of information, ranks each band based on a statistical measure, selects Q bands from the multiple bands to generate a subset of bands based on the virtual dimensionality, and generates a reduced image based on the subset of bands. This approach can create reduced datasets of full hyperspectral images tailored for individual applications. The system uses a metric specific to a target application to rank the image bands, and then selects the most useful bands. The number of bands selected can be specified manually or calculated from the hyperspectral image's virtual dimensionality.
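    A toy sketch of the rank-and-select step. Variance is used here as a stand-in statistical ranking measure, and Q is passed in directly; the virtual-dimensionality calculation described above is not reproduced.

```python
import numpy as np

def select_bands(cube, q):
    """Rank the bands of an (H, W, B) hyperspectral cube and keep the top q.

    Variance is an illustrative ranking metric; the disclosed system lets
    the metric be application-specific and derives q from the image's
    virtual dimensionality.
    """
    scores = cube.reshape(-1, cube.shape[-1]).var(axis=0)
    keep = np.sort(np.argsort(scores)[::-1][:q])  # preserve band order
    return cube[..., keep], keep
```

    The reduced cube retains only the q highest-scoring bands, which is what makes the output dataset small enough to tailor to an individual application.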

  9. Iterative CT reconstruction using coordinate descent with ordered subsets of data

    NASA Astrophysics Data System (ADS)

    Noo, F.; Hahn, K.; Schöndube, H.; Stierstorfer, K.

    2016-04-01

    Image reconstruction based on iterative minimization of a penalized weighted least-squares criterion has become an important topic of research in X-ray computed tomography. This topic is motivated by increasing evidence that such a formalism may enable a significant reduction in dose imparted to the patient while maintaining or improving image quality. One important issue associated with this iterative image reconstruction concept is slow convergence and the associated computational effort. For this reason, there is interest in finding methods that produce approximate versions of the targeted image with a small number of iterations and an acceptable level of discrepancy. We introduce here a novel method to produce such approximations: ordered subsets in combination with iterative coordinate descent. Preliminary results demonstrate that this method can produce, within 10 iterations and using only a constant image as initial condition, satisfactory reconstructions that retain the noise properties of the targeted image.
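    The ordered-subsets idea can be illustrated on a toy least-squares problem. The sketch below uses subset gradient steps rather than the paper's coordinate descent, and all names and parameters are illustrative; it shows how cycling through data subsets, each scaled up as a surrogate for the full gradient, drives the estimate from a constant initial image.

```python
import numpy as np

def os_least_squares(A, b, n_subsets=4, n_iters=200, step=0.005):
    """Toy ordered-subsets iteration for min ||Ax - b||^2: the rows are split
    into ordered subsets, and each subset's gradient (scaled by the number of
    subsets to approximate the full gradient) triggers an immediate update."""
    m, n = A.shape
    x = np.zeros(n)                               # constant (zero) initial image
    subsets = np.array_split(np.arange(m), n_subsets)
    for _ in range(n_iters):
        for idx in subsets:
            grad = n_subsets * A[idx].T @ (A[idx] @ x - b[idx])
            x -= step * grad
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(40, 5))
x_true = rng.normal(size=5)
b = A @ x_true                                    # consistent synthetic data
x_hat = os_least_squares(A, b)
```

    Because every subset update uses only a fraction of the data, one pass makes several updates per epoch, which is the source of the early-iteration speedup the paper exploits.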

  10. Study of high speed complex number algorithms. [for determining antenna far field radiation patterns]

    NASA Technical Reports Server (NTRS)

    Heisler, R.

    1981-01-01

    A method of evaluating the radiation integral on the curved surface of a reflecting antenna is presented. A three dimensional Fourier transform approach is used to generate a two dimensional radiation cross-section along a planar cut at any angle phi through the far field pattern. Salient to the method is an algorithm for evaluating a subset of the total three dimensional discrete Fourier transform results. The subset elements are selectively evaluated to yield data along a geometric plane of constant phi. The algorithm is extremely efficient, so that computation of the induced surface currents via the physical optics approximation dominates the computer time required to compute a radiation pattern. Application to paraboloid reflectors with off-focus feeds is presented, but the method is easily extended to offset antenna systems and reflectors of arbitrary shapes. Numerical results were computed for both gain and phase and are compared with other published work.

  11. Optimizing an Actuator Array for the Control of Multi-Frequency Noise in Aircraft Interiors

    NASA Technical Reports Server (NTRS)

    Palumbo, D. L.; Padula, S. L.

    1997-01-01

    Techniques developed for selecting an optimized actuator array for interior noise reduction at a single frequency are extended to the multi-frequency case. Transfer functions for 64 actuators were obtained at 5 frequencies from ground testing the rear section of a fully trimmed DC-9 fuselage. A single loudspeaker facing the left side of the aircraft was the primary source. A combinatorial search procedure (tabu search) was employed to find optimum actuator subsets of 2 to 16 actuators. Noise reduction predictions derived from the transfer functions were used as a basis for evaluating actuator subsets during optimization. Results indicate that it is necessary to constrain actuator forces during optimization. Unconstrained optimizations selected actuators which require unrealistically large forces. Two methods of constraint are evaluated. It is shown that a fast, but approximate, method yields results equivalent to an accurate, but computationally expensive, method.
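    A minimal tabu search over actuator subsets can be sketched as follows. The per-actuator scores and the additive objective are hypothetical simplifications (a real objective would be the predicted noise reduction from the transfer functions, with force constraints); the sketch only shows the swap-neighborhood and tabu-list mechanics.

```python
import random

def tabu_search_subset(scores, k, n_iters=50, tabu_len=5, seed=0):
    """Toy tabu search: choose k actuators out of len(scores) to maximize
    the summed score. Neighborhood = swap one selected element for one
    unselected; recently moved elements are tabu for a few iterations."""
    rng = random.Random(seed)
    n = len(scores)
    current = set(rng.sample(range(n), k))
    best, best_val = set(current), sum(scores[i] for i in current)
    tabu = []                                    # most recently moved elements
    for _ in range(n_iters):
        moves = [(i, j) for i in current for j in range(n)
                 if j not in current and i not in tabu and j not in tabu]
        if not moves:                            # whole neighborhood is tabu
            break
        i, j = max(moves, key=lambda m: scores[m[1]] - scores[m[0]])
        current.remove(i)
        current.add(j)
        tabu = (tabu + [i, j])[-tabu_len:]
        val = sum(scores[s] for s in current)
        if val > best_val:                       # keep the best subset seen
            best, best_val = set(current), val
    return best, best_val

scores = [3, 1, 4, 1, 5, 9, 2, 6]                # hypothetical per-actuator scores
best, best_val = tabu_search_subset(scores, 3)
```

    The tabu list lets the search accept temporarily worsening swaps without immediately undoing them, which is what distinguishes it from plain greedy swapping.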

  12. Beneficial Effects of cART Initiated during Primary and Chronic HIV-1 Infection on Immunoglobulin-Expression of Memory B-Cell Subsets

    PubMed Central

    Pensieroso, Simone; Tolazzi, Monica; Chiappetta, Stefania; Nozza, Silvia; Lazzarin, Adriano; Tambussi, Giuseppe; Scarlatti, Gabriella

    2015-01-01

    Introduction During HIV-1 infection the B-cell compartment undergoes profound changes towards terminal differentiation, which are only partially restored by combination antiretroviral therapy (cART). Materials and Methods To investigate the impact of infection as early as during primary HIV-1 infection (PHI), we assessed the distribution of B-cell subsets in 19 PHI and 25 chronic HIV-1-infected (CHI) individuals before and during 48 weeks of cART, as compared to healthy controls (n = 23). We also analysed the Immunoglobulin-expression of memory B-cell subsets to identify alterations in Immunoglobulin-maturation. Results Determination of B-cell subsets at baseline showed that total and Naive B-cells were decreased, whereas Activated Memory (AM), Tissue-like Memory (TLM) B-cells and Plasma cells were increased in both PHI and CHI patients. After 4 weeks of cART total B-cells increased, while AM, TLM B-cells and Plasma cells decreased, although without reaching normal levels in either group. This trend was maintained until week 48, though only total B-cells normalized in both PHI and CHI. Resting Memory (RM) B-cells were preserved from baseline onwards; this subset remained stable in CHI, while it was expanded by an early initiation of cART during PHI. Untreated CHI patients showed IgM-overexpression at the expense of switched (IgM-IgD-) phenotypes of the memory subsets. Interestingly, in PHI patients a significant alteration of Immunoglobulin-expression was evident at baseline in TLM cells, and after 4 weeks, despite treatment, in AM and RM subsets. Conclusions Aberrant activated and exhausted B-cell phenotypes arose already during PHI, while most of the alterations in Ig-expression seen in CHI appeared later, despite 4 weeks of effective cART. After 48 weeks of cART the distribution of B-cell subsets improved, although without full normalization, while Immunoglobulin-expression normalized in AM and RM cells, remaining perturbed in TLM B-cells of both PHI and CHI patients. PMID:26474181

  13. Optimization of OSEM parameters in myocardial perfusion imaging reconstruction as a function of body mass index: a clinical approach*

    PubMed Central

    de Barros, Pietro Paolo; Metello, Luis F.; Camozzato, Tatiane Sabriela Cagol; Vieira, Domingos Manuel da Silva

    2015-01-01

    Objective The present study is aimed at helping to identify the most appropriate OSEM parameters to generate myocardial perfusion imaging reconstructions with the best diagnostic quality, correlating them with patients’ body mass index. Materials and Methods The present study included 28 adult patients submitted to myocardial perfusion imaging in a public hospital. The OSEM method was utilized in the image reconstruction with six different combinations of iteration and subset numbers. The images were analyzed by nuclear cardiology specialists, taking their diagnostic value into consideration and indicating the most appropriate images in terms of diagnostic quality. Results An overall scoring analysis demonstrated that the combination of four iterations and four subsets generated the most appropriate images in terms of diagnostic quality for all classes of body mass index; however, the combination of six iterations and four subsets stood out for the higher body mass index classes. Conclusion The use of optimized parameters seems to play a relevant role in the generation of images with better diagnostic quality, ensuring the diagnosis and a consequent appropriate and effective treatment for the patient. PMID:26543282

  14. Decoys Selection in Benchmarking Datasets: Overview and Perspectives

    PubMed Central

    Réau, Manon; Langenfeld, Florent; Zagury, Jean-François; Lagarde, Nathalie; Montes, Matthieu

    2018-01-01

    Virtual Screening (VS) is designed to prospectively help identify potential hits, i.e., compounds capable of interacting with a given target and potentially modulating its activity, out of large compound collections. Among the variety of methodologies, it is crucial to select the protocol that is best adapted to the query/target system under study and that yields the most reliable output. To this aim, the performance of VS methods is commonly evaluated and compared by computing their ability to retrieve active compounds from benchmarking datasets. Benchmarking datasets contain a subset of known active compounds together with a subset of decoys, i.e., assumed non-active molecules. The composition of both the active and the decoy subsets is critical to limit biases in the evaluation of VS methods. In this review, we focus on the selection of decoy compounds, which has changed considerably over the years, from randomly selected compounds to highly customized or experimentally validated negative compounds. We first outline the evolution of decoy selection in benchmarking databases, as well as current benchmarking databases that tend to minimize the introduction of biases, and secondly we propose recommendations for the selection and design of benchmarking datasets. PMID:29416509

  15. An improved wrapper-based feature selection method for machinery fault diagnosis

    PubMed Central

    2017-01-01

    A major issue of machinery fault diagnosis using vibration signals is that it is over-reliant on personnel knowledge and experience in interpreting the signal. Thus, machine learning has been adapted for machinery fault diagnosis. The quantity and quality of the input features, however, influence the fault classification performance. Feature selection plays a vital role in selecting the most representative feature subset for the machine learning algorithm. However, a trade-off between the ability to select the best feature subset and the computational effort required is inevitable in the wrapper-based feature selection (WFS) method. This paper proposes an improved WFS technique, which is then integrated with a support vector machine (SVM) classifier as a complete fault diagnosis system for a rolling element bearing case study. The bearing vibration dataset made available by the Case Western Reserve University Bearing Data Centre was processed using the proposed WFS, and its performance has been analysed and discussed. The results reveal that the proposed WFS secures the best feature subset with a lower computational effort by eliminating the redundancy of re-evaluation. The proposed WFS has therefore been found to be capable and efficient in carrying out feature selection tasks. PMID:29261689
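    Wrapper-based selection can be sketched with a simple sequential forward search. A nearest-centroid scorer stands in for the paper's SVM, and training accuracy stands in for a proper cross-validated criterion; both are simplifying assumptions.

```python
import numpy as np

def nearest_centroid_acc(X, y, cols):
    """Training accuracy of a nearest-centroid classifier restricted to the
    feature columns in `cols` -- a toy wrapper evaluation criterion."""
    Xs = X[:, cols]
    labels = np.unique(y)
    cents = np.stack([Xs[y == c].mean(axis=0) for c in labels])
    d = ((Xs[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return (labels[d.argmin(axis=1)] == y).mean()

def forward_wfs(X, y, k):
    """Wrapper-based sequential forward selection: greedily add the feature
    whose inclusion gives the classifier the highest score."""
    chosen = []
    for _ in range(k):
        rest = [j for j in range(X.shape[1]) if j not in chosen]
        nxt = max(rest, key=lambda j: nearest_centroid_acc(X, y, chosen + [j]))
        chosen.append(nxt)
    return chosen

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)                         # two fault classes
X = rng.normal(size=(100, 5))
X[:, 2] += 3.0 * y                                # feature 2 carries the signal
chosen = forward_wfs(X, y, 2)
```

    Each candidate feature requires retraining the classifier, which is exactly the computational cost the paper's improved WFS tries to reduce by avoiding redundant re-evaluation.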

  16. Classification Preictal and Interictal Stages via Integrating Interchannel and Time-Domain Analysis of EEG Features.

    PubMed

    Lin, Lung-Chang; Chen, Sharon Chia-Ju; Chiang, Ching-Tai; Wu, Hui-Chuan; Yang, Rei-Cheng; Ouyang, Chen-Sen

    2017-03-01

    The life quality of patients with refractory epilepsy is extremely affected by abrupt and unpredictable seizures. A reliable method for predicting seizures is important in the management of refractory epilepsy. A critical factor in seizure prediction involves the classification of the preictal and interictal stages. This study aimed to develop an efficient, automatic, quantitative, and individualized approach for preictal/interictal stage identification. Five epileptic children, who had experienced at least 2 episodes of seizures during a 24-hour video EEG recording, were included. Artifact-free preictal and interictal EEG epochs were acquired, respectively, and characterized with 216 global feature descriptors. The best subset of 5 discriminative descriptors was identified. The best subsets showed differences among the patients. Statistical analysis revealed most of the 5 descriptors in each subset were significantly different between the preictal and interictal stages for each patient. The proposed approach yielded weighted averages of 97.50% correctness, 96.92% sensitivity, 97.78% specificity, and 95.45% precision on classifying test epochs. Although the case number was limited, this study successfully integrated a new EEG analytical method to classify preictal and interictal EEG segments and might be used further in predicting the occurrence of seizures.

  17. Robustly Aligning a Shape Model and Its Application to Car Alignment of Unknown Pose.

    PubMed

    Li, Yan; Gu, Leon; Kanade, Takeo

    2011-09-01

    Precisely localizing in an image a set of feature points that form the shape of an object, such as a car or a face, is called alignment. Previous shape alignment methods attempted to fit a whole shape model to the observed data, based on the assumption of Gaussian observation noise and the associated regularization process. However, such an approach, though able to deal with Gaussian noise in feature detection, turns out not to be robust or precise because it is vulnerable to gross feature detection errors or outliers resulting from partial occlusions or spurious features from the background or neighboring objects. We address this problem by adopting a randomized hypothesis-and-test approach. First, a Bayesian inference algorithm is developed to generate a shape-and-pose hypothesis of the object from a partial shape, or a subset of feature points. For alignment, a large number of hypotheses are generated by randomly sampling subsets of feature points and then evaluated to find the one that minimizes the shape prediction error. This method of randomized subset-based matching can effectively handle outliers and recover the correct object shape. We apply this approach to a challenging data set of over 5,000 differently posed car images spanning a wide variety of car types, lighting, background scenes, and partial occlusions. Experimental results demonstrate favorable improvements over previous methods in both accuracy and robustness.
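    The randomized hypothesis-and-test idea is the same principle behind RANSAC. A minimal line-fitting sketch (not the authors' shape-model inference) shows how sampling minimal subsets and scoring hypotheses by consensus rejects gross outliers.

```python
import numpy as np

def fit_line_ransac(pts, n_hyp=200, tol=0.1, seed=0):
    """Hypothesis-and-test: fit a line to many random 2-point subsets, keep
    the hypothesis with the largest consensus set, then refine on it."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_hyp):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:                              # degenerate subset, skip
            continue
        a = (y2 - y1) / (x2 - x1)                 # slope of the hypothesis
        b = y1 - a * x1                           # intercept
        resid = np.abs(pts[:, 1] - (a * pts[:, 0] + b))
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    x, y = pts[best_inliers, 0], pts[best_inliers, 1]
    a, b = np.polyfit(x, y, 1)                    # least-squares refinement
    return a, b, best_inliers

x = np.linspace(0, 1, 30)
pts = np.column_stack([x, 2.0 * x + 1.0])         # points on y = 2x + 1
pts[::10, 1] += 5.0                               # plant three gross outliers
a, b, inliers = fit_line_ransac(pts)
```

    A least-squares fit over all 30 points would be dragged toward the outliers; the consensus step excludes them before the refinement.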

  18. Writing a Mixed Methods Report in Social Work Research

    ERIC Educational Resources Information Center

    Bronstein, Laura R.; Kovacs, Pamela J.

    2013-01-01

    This article briefly chronicles the development of mixed methods research and its use in social work. We then move onto a discussion of terms and designs, reasons for (and for not) using mixed methods. Drawing upon exemplars, we address how to write up a mixed methods study when (1) the sample is single; single for one part, with a subset for the…

  19. Orthotopic AY-27 rat bladder urothelial cell carcinoma model presented an elevated methemoglobin proportion in the increased total hemoglobin content when evaluated in vivo by single-fiber reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Sun, Tengfei; Davis, Carole A.; Hurst, Robert E.; Slaton, Joel W.; Piao, Daqing

    2017-02-01

    In vivo single-fiber reflectance spectroscopy (SfRS) was performed on an orthotopic AY-27 rat bladder urothelial cell carcinoma model to explore potential spectroscopic features revealing neoplastic changes. AY-27 bladder tumor cells were intravesically instilled in four rats and allowed to implant and grow for one week, with two additional rats serving as controls. A total of 107 SfRS measurements were taken: 27 from sites on two control bladders and 80 from four AY-27-treated bladders. The spectral profiles obtained from AY-27-treated bladders revealed various levels of a methemoglobin (MetHb) characteristic spectral feature around 635 nm. A multisegment spectral analysis method estimated the concentrations of five chromophores: oxyhemoglobin, deoxyhemoglobin, MetHb, lipid and water. The total hemoglobin concentration ([HbT]), the MetHb proportion in the total hemoglobin and the lipid volume content showed possible correlations. The 80 measurements from the AY-27-treated bladders could be separated into three subsets according to the MetHb proportion: 72 in subset 1 with a low proportion (5.3%<[MetHb]<7%), 6 in subset 2 with a moderate proportion (7%<[MetHb]<30%), and 2 in subset 3 with a significant proportion (>30%). When grouped according to [MetHb], [HbT] increased from 368 μM in subset 1 to 488 μM in subset 2 and 541 μM in subset 3, compared with 285 μM in the control. The increased total hemoglobin and the elevated MetHb proportion may signify angiogenesis and degradation of hemoglobin oxygen transport. Additionally, the lipid volume content decreased from 2.58% in the control to <0.2% in the tumor groups, indicating disruption of the subepithelial tissue architecture.

  20. Human Umbilical Cord Mesenchymal Stem Cells: Subpopulations and Their Difference in Cell Biology and Effects on Retinal Degeneration in RCS Rats.

    PubMed

    Wang, L; Li, P; Tian, Y; Li, Z; Lian, C; Ou, Q; Jin, C; Gao, F; Xu, J-Y; Wang, J; Wang, F; Zhang, J; Zhang, J; Li, W; Tian, H; Lu, L; Xu, G-T

    2017-01-01

    Human umbilical cord mesenchymal stem cells (hUC-MSCs) are potential candidates for treating retinal degeneration (RD). To further study their biology and therapeutic effects on retinal degeneration, two hUC-MSC subpopulations, termed hUC-MSC1 and hUC-MSC2, were isolated by a single-cell cloning method and their therapeutic functions were compared in the RCS rat, an RD model. Although both subsets satisfied the basic requirements for hUC-MSCs, they were significantly different in morphology, proliferation rate, differentiation capacity, phenotype and gene expression. Furthermore, only the smaller, fibroblast-like, faster-growing subset hUC-MSC1 displayed stronger colony forming potential as well as adipogenic and osteogenic differentiation capacities. When the two subsets were respectively transplanted into the subretinal spaces of RCS rats, both survived, but only hUC-MSC1 expressed the RPE cell markers Bestrophin and RPE65. More importantly, hUC-MSC1 showed a stronger rescue effect on retinal function, as indicated by a higher b-wave amplitude on ERG examination, a thicker retinal nuclear layer, and fewer apoptotic photoreceptors. When both subsets were treated with interleukin-6, mimicking the inflammatory environment encountered when cells are transplanted into eyes with a degenerated retina, hUC-MSC1 expressed much higher levels of trophic factors than hUC-MSC2. These data, in addition to proving the heterogeneity of hUC-MSCs, confirm that the stronger therapeutic effects of hUC-MSC1 are attributable to its stronger anti-apoptotic effect, paracrine secretion of trophic factors and potential RPE cell differentiation capacity. Thus, the subset hUC-MSC1, not the other subset or ungrouped hUC-MSCs, should be used for effective treatment of RD.

  1. Sample size determination for bibliographic retrieval studies

    PubMed Central

    Yao, Xiaomei; Wilczynski, Nancy L; Walter, Stephen D; Haynes, R Brian

    2008-01-01

    Background Research for developing search strategies to retrieve high-quality clinical journal articles from MEDLINE is expensive and time-consuming. The objective of this study was to determine the minimal number of high-quality articles in a journal subset that would need to be hand-searched to update or create new MEDLINE search strategies for treatment, diagnosis, and prognosis studies. Methods The desired width of the 95% confidence interval (W) for the lowest sensitivity among existing search strategies was used to calculate the number of high-quality articles needed to reliably update search strategies. New search strategies were derived in journal subsets formed by 2 approaches: random sampling of journals and top journals (having the most high-quality articles). The new strategies were tested in both the original large journal database and in a low-yielding journal (having few high-quality articles) subset. Results For treatment studies, if W was 10% or less for the lowest sensitivity among our existing search strategies, a subset of 15 randomly selected journals or 2 top journals was adequate for updating search strategies, each approach being based on at least 99 high-quality articles. The new strategies derived in 15 randomly selected journals or 2 top journals performed well in the original large journal database. Nevertheless, the new search strategies developed using the random sampling approach performed better than those developed using the top journal approach in a low-yielding journal subset. For studies of diagnosis and prognosis, no journal subset had enough high-quality articles to achieve the expected W (10%). Conclusion The approach of randomly sampling a small subset of journals that includes sufficient high-quality articles is an efficient way to update or create search strategies for high-quality articles on therapy in MEDLINE. The concentrations of diagnosis and prognosis articles are too low for this approach. PMID:18823538
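    The sample-size logic follows from the normal approximation to a binomial proportion: if the 95% CI for a sensitivity estimate must have total width W, the required number of high-quality articles n satisfies W = 2z·sqrt(p(1-p)/n). A small sketch with illustrative values (not the paper's exact computation):

```python
from math import ceil

def articles_needed(sens, width, z=1.96):
    """Articles n needed so the 95% CI for a sensitivity `sens` has total
    width `width` under the normal approximation:
        width = 2 * z * sqrt(sens * (1 - sens) / n)."""
    return ceil((2 * z / width) ** 2 * sens * (1 - sens))

n = articles_needed(0.95, 0.10)   # e.g. lowest sensitivity 0.95, W = 10%
```

    Note how the requirement grows as the sensitivity approaches 0.5, where the binomial variance p(1-p) is largest.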

  2. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  3. Data Point Averaging for Computational Fluid Dynamics Data

    NASA Technical Reports Server (NTRS)

    Norman, Jr., David (Inventor)

    2016-01-01

    A system and method for generating fluid flow parameter data for use in aerodynamic heating analysis. Computational fluid dynamics data is generated for a number of points in an area on a surface to be analyzed. Sub-areas corresponding to areas of the surface for which an aerodynamic heating analysis is to be performed are identified. A computer system automatically determines a sub-set of the number of points corresponding to each of the number of sub-areas and determines a value for each of the number of sub-areas using the data for the sub-set of points corresponding to each of the number of sub-areas. The value is determined as an average of the data for the sub-set of points corresponding to each of the number of sub-areas. The resulting parameter values then may be used to perform an aerodynamic heating analysis.
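    The per-sub-area reduction described in the patent amounts to a masked average; a minimal sketch with hypothetical rectangular sub-areas:

```python
import numpy as np

def subarea_averages(points, values, bounds):
    """Average the CFD values of the points falling inside each rectangular
    sub-area (xmin, xmax, ymin, ymax); NaN marks an empty sub-area."""
    out = []
    for xmin, xmax, ymin, ymax in bounds:
        inside = ((points[:, 0] >= xmin) & (points[:, 0] < xmax) &
                  (points[:, 1] >= ymin) & (points[:, 1] < ymax))
        out.append(values[inside].mean() if inside.any() else np.nan)
    return np.array(out)

pts = np.array([[0.1, 0.1], [0.2, 0.3], [0.8, 0.9]])   # sample surface points
vals = np.array([10.0, 20.0, 30.0])                    # e.g. heat-flux samples
avg = subarea_averages(pts, vals, [(0, 0.5, 0, 0.5), (0.5, 1, 0.5, 1)])
```

    The resulting one value per sub-area is what a downstream aerodynamic heating analysis would consume.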

  4. Data Point Averaging for Computational Fluid Dynamics Data

    NASA Technical Reports Server (NTRS)

    Norman, David, Jr. (Inventor)

    2014-01-01

    A system and method for generating fluid flow parameter data for use in aerodynamic heating analysis. Computational fluid dynamics data is generated for a number of points in an area on a surface to be analyzed. Sub-areas corresponding to areas of the surface for which an aerodynamic heating analysis is to be performed are identified. A computer system automatically determines a sub-set of the number of points corresponding to each of the number of sub-areas and determines a value for each of the number of sub-areas using the data for the sub-set of points corresponding to each of the number of sub-areas. The value is determined as an average of the data for the sub-set of points corresponding to each of the number of sub-areas. The resulting parameter values then may be used to perform an aerodynamic heating analysis.

  5. Bayesian Ensemble Trees (BET) for Clustering and Prediction in Heterogeneous Data

    PubMed Central

    Duan, Leo L.; Clancy, John P.; Szczesniak, Rhonda D.

    2016-01-01

    We propose a novel “tree-averaging” model that utilizes the ensemble of classification and regression trees (CART). Each constituent tree is estimated with a subset of similar data. We treat this grouping of subsets as Bayesian Ensemble Trees (BET) and model them as a Dirichlet process. We show that BET determines the optimal number of trees by adapting to the data heterogeneity. Compared with other ensemble methods, BET requires far fewer trees and shows equivalent prediction accuracy using weighted averaging. Moreover, each tree in BET provides a variable selection criterion and interpretation for its subset. We developed an efficient estimating procedure with improved estimation strategies in both CART and mixture models. We demonstrate these advantages of BET with simulations and illustrate the approach with a real-world data example involving regression of lung function measurements obtained from patients with cystic fibrosis. Supplemental materials are available online. PMID:27524872

  6. Stable Phenotype Of B-Cell Subsets Following Cryopreservation and Thawing of Normal Human Lymphocytes Stored in a Tissue Biobank.

    PubMed

    Rasmussen, Simon Mylius; Bilgrau, Anders Ellern; Schmitz, Alexander; Falgreen, Steffen; Bergkvist, Kim Steve; Tramm, Anette Mai; Baech, John; Jacobsen, Chris Ladefoged; Gaihede, Michael; Kjeldsen, Malene Krag; Bødker, Julie Støve; Dybkaer, Karen; Bøgsted, Martin; Johnsen, Hans Erik

    2014-09-20

    Background Cryopreservation is an acknowledged procedure for storing vital cells for future biomarker analyses. Few studies, however, have analyzed the impact of cryopreservation on phenotyping. Methods We performed a controlled comparison of cryopreserved and fresh cellular aliquots prepared from individual healthy donors. We studied circulating B-cell subset membrane markers and global gene expression by multiparametric flow cytometry and microarray analysis, respectively. Extensive statistical analysis of the generated data tested the concept that "overall, there are phenotypic differences between cryopreserved and fresh B-cell subsets". Subsequently, we performed a consecutive uncontrolled comparison of tonsil tissue samples. Results By multiparametric flow analysis, we documented no significant changes following cryopreservation in subset frequencies or membrane intensity for the differentiation markers CD19, CD20, CD22, CD27, CD38, CD45, and CD200. By gene expression profiling following cryopreservation, across all samples, only 16 out of 18708 genes were significantly up- or down-regulated, including FOSB, KLF4, RBP7 and ANXA1 (up) or CLC and DEFA3 (down), respectively. Implementation of cryopreserved tissue in our research program allowed us to present a performance analysis comparing cryopreserved and fresh tonsil tissue. As expected, phenotypic differences were identified, but not to an extent that affected the ability of the cryopreserved tissue to generate specific B-cell subset associated gene signatures and assign subset phenotypes to independent tissue samples. Conclusions We have confirmed our working concept and illustrated the usefulness of vital cryopreserved cell suspensions for phenotypic studies of the normal B-cell hierarchy; however, storage procedures need to be delineated by tissue-specific comparative analysis. © 2014 Clinical Cytometry Society.

  7. Targeted depletion of a MDSC subset unmasks pancreatic ductal adenocarcinoma to adaptive immunity

    PubMed Central

    Stromnes, Ingunn M.; Brockenbrough, Scott; Izeradjene, Kamel; Carlson, Markus A.; Cuevas, Carlos; Simmons, Randi M.; Greenberg, Philip D.; Hingorani, Sunil R.

    2015-01-01

    Objective Pancreatic ductal adenocarcinoma (PDA) is characterized by a robust desmoplasia, including the notable accumulation of immunosuppressive cells that shield neoplastic cells from immune detection. Immune evasion may be further enhanced if the malignant cells fail to express high levels of antigens that are sufficiently immunogenic to engender an effector T cell response. In this report, we investigate the predominant subsets of immunosuppressive cancer-conditioned myeloid cells that chronicle and shape pancreas cancer progression. We show that selective depletion of one subset of myeloid-derived suppressor cells (MDSC) in an autochthonous, genetically engineered mouse model (GEMM) of PDA unmasks the ability of the adaptive immune response to engage and target tumor epithelial cells. Methods A combination of in vivo and in vitro studies was performed employing a GEMM that faithfully recapitulates the cardinal features of human PDA. The predominant cancer-conditioned myeloid cell subpopulation was specifically targeted in vivo and the biological outcomes determined. Results PDA orchestrates the induction of distinct subsets of cancer-associated myeloid cells through the production of factors known to influence myelopoiesis. These immature myeloid cells inhibit the proliferation and induce apoptosis of activated T cells. Targeted depletion of granulocytic MDSC (Gr-MDSC) in autochthonous PDA increases the intratumoral accumulation of activated CD8 T cells and apoptosis of tumor epithelial cells, and also remodels the tumor stroma. Conclusions Neoplastic ductal cells of the pancreas induce distinct myeloid cell subsets that promote tumor cell survival and accumulation. Targeted depletion of a single myeloid subset, the Gr-MDSC, can unmask an endogenous T cell response, revealing an unexpected latent immunity and invoking targeting of Gr-MDSC as a potential strategy for treating this highly lethal disease. PMID:24555999

  8. A Model of Teaching-Learning Transactions in Generalist-Specialist Consultations

    ERIC Educational Resources Information Center

    Shershneva, Marianna B.; Carnes, Molly; Bakken, Lori L.

    2006-01-01

    Introduction: A new paradigm in continuing medical education is characterized by emphasis on physicians' learning in practice. Consistent with this paradigm, our study examined a subset of clinical practice--generalist-specialist consultations--from an educational perspective. Methods: We applied the grounded-theory method with semistructured…

  9. Minimization search method for data inversion

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1975-01-01

    A technique has been developed for determining the values of selected subsets of independent variables in mathematical formulations. The required computation time increases with the first power of the number of variables. This is in contrast with classical minimization methods, for which computation time increases with the third power of the number of variables.

  10. Approximation of reliability of direct genomic breeding values

    USDA-ARS?s Scientific Manuscript database

    Two methods to efficiently approximate theoretical genomic reliabilities are presented. The first method is based on the direct inverse of the left hand side (LHS) of mixed model equations. It uses the genomic relationship matrix for a small subset of individuals with the highest genomic relationshi...

  11. Data approximation using a blending type spline construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dalmo, Rune; Bratlie, Jostein

    2014-11-18

    Generalized expo-rational B-splines (GERBS) is a blending type spline construction where local functions at each knot are blended together by C^k-smooth basis functions. One way of approximating discrete regular data using GERBS is to partition the data set into subsets and fit a local function to each subset. Partitioning and fitting strategies can be devised such that important or interesting data points are interpolated in order to preserve certain features. We present a method for fitting discrete data using a tensor product GERBS construction. The method is based on detection of feature points using differential geometry. Derivatives, which are necessary for feature point detection and used to construct local surface patches, are approximated from the discrete data using finite differences.

  12. Optimized diffusion gradient orientation schemes for corrupted clinical DTI data sets.

    PubMed

    Dubois, J; Poupon, C; Lethimonnier, F; Le Bihan, D

    2006-08-01

    A method is proposed for generating schemes of diffusion gradient orientations which allow the diffusion tensor to be reconstructed from partial data sets in clinical DT-MRI, should the acquisition be corrupted or terminated before completion because of patient motion. A general energy-minimization electrostatic model was developed in which the interactions between orientations are weighted according to their temporal order during acquisition. In this report, two corruption scenarios were specifically considered for generating relatively uniform schemes of 18 and 60 orientations, with useful subsets of 6 and 15 orientations. The sets and subsets were compared to conventional sets through their energy, condition number and rotational invariance. Schemes of 18 orientations were tested on a volunteer. The optimized sets were similar to uniform sets in terms of energy, condition number and rotational invariance, whether the complete set or only a subset was considered. Diffusion maps obtained in vivo were close to those for uniform sets whatever the acquisition time was. This was not the case with conventional schemes, whose subset uniformity was insufficient. With the proposed approach, sets of orientations responding to several corruption scenarios can be generated, which is potentially useful for imaging uncooperative patients or infants.
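
The electrostatic analogy can be illustrated by scoring a set of gradient directions with a Coulomb-like repulsion energy; treating antipodal directions as equivalent is the usual convention for diffusion encoding. This is a minimal sketch, and it does not reproduce the paper's temporally weighted interactions:

```python
import numpy as np

def electrostatic_energy(dirs):
    """Sum of 1/|u_i - u_j| + 1/|u_i + u_j| over pairs of unit vectors.
    Lower energy means more uniformly spread, antipodally symmetric directions."""
    dirs = np.asarray(dirs, dtype=float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    e = 0.0
    n = len(dirs)
    for i in range(n):
        for j in range(i + 1, n):
            e += 1.0 / np.linalg.norm(dirs[i] - dirs[j])
            e += 1.0 / np.linalg.norm(dirs[i] + dirs[j])
    return e

# Three orthogonal axes are better spread than three clustered directions.
ortho = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
clustered = [(1, 0, 0), (0.99, 0.14, 0), (0.99, 0, 0.14)]
print(electrostatic_energy(ortho) < electrostatic_energy(clustered))  # True
```

Minimizing this energy over a full set, while weighting pairs by temporal order, is what lets useful subsets survive an interrupted acquisition.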

  13. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  14. Methodological considerations for implementation of lymphocyte subset analysis in a clinical reference laboratory.

    PubMed

    Muirhead, K A; Wallace, P K; Schmitt, T C; Frescatore, R L; Franco, J A; Horan, P K

    1986-01-01

    As the diagnostic utility of lymphocyte subset analysis has been recognized in the clinical research laboratory, a wide variety of reagents and cell preparation, staining and analysis methods have also been described. Methods that are perfectly suitable for analysis of smaller sample numbers in the biological or clinical research setting are not always appropriate and/or applicable in the setting of a high volume clinical reference laboratory. We describe here some of the specific considerations involved in choosing a method for flow cytometric analysis which minimizes sample preparation and data analysis time while maximizing sample stability, viability, and reproducibility. Monoclonal T- and B-cell reagents from three manufacturers were found to give equivalent results for a reference population of healthy individuals. This was true whether direct or indirect immunofluorescence staining was used and whether cells were prepared by Ficoll-Hypaque fractionation (FH) or by lysis of whole blood. When B cells were enumerated using a polyclonal anti-immunoglobulin reagent, less cytophilic immunoglobulin staining was present after lysis than after FH preparation. However, both preparation methods required additional incubation at 37 degrees C to obtain results concordant with monoclonal B-cell reagents. Standard reagents were chosen on the basis of maximum positive/negative separation and the availability of appropriate negative controls. The effects of collection medium and storage conditions on sample stability and reproducibility of subset analysis were also assessed. Specimens collected in heparin and stored at room temperature in buffered medium gave reproducible results for 3 days after specimen collection, using either FH or lysis as the preparation method. General strategies for instrument optimization, quality control, and biohazard containment are also discussed.

  15. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling… input-output contract satisfaction and absence of null-pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based… Domain-specific languages (DSLs) drive both implementation and formal verification

  16. Gold and palladium minerals (including empirical PdCuBiSe3) from the former Roter Bär mine, St. Andreasberg, Harz Mountains, Germany: a result of low-temperature, oxidising fluid overprint

    NASA Astrophysics Data System (ADS)

    Cabral, Alexandre Raphael; Ließmann, Wilfried; Lehmann, Bernd

    2015-10-01

    At Roter Bär, a former underground mine in the polymetallic deposits of St. Andreasberg in the middle-Harz vein district, Germany, native gold and palladium minerals occur very locally in clausthalite-hematite pockets a few millimetres across in carbonate veinlets. The native gold is a Au-Ag intermetallic compound and the palladium minerals are characterised as mertieite-II [Pd8(Sb,As)3] and empirical PdCuBiSe3 with some S. The latter coexists with bohdanowiczite (AgBiSe2), a mineral that is stable below 120 °C. The geological setting of Roter Bär, underneath a post-Variscan unconformity, and its hematite-selenide-gold association suggest that oxidising hydrothermal brines of low temperature were instrumental to the Au-Pd mineralisation. The Roter Bär Au-Pd mineralisation can be explained by Permo-Triassic, red-bed-derived brines in the context of post-Variscan, unconformity-related fluid overprint.

  17. [Emotional, physical and psychiatric disorders in relatives of schizophrenia patients in Mexico].

    PubMed

    Rascón, María Luisa; Caraveo, Jorge; Valencia, Marcelo

    2010-01-01

    The aim of this study was to investigate the frequency of emotional, physical and psychiatric disorders among family caregivers of patients with schizophrenia, and to examine their relationship with the caregivers' demographic and clinical variables. It was a cross-sectional, descriptive, correlational, ex-post-facto study of a non-probabilistic sample of 131 family caregivers. Instruments: 1) the Composite International Diagnostic Interview version 1.0 (CIDI); 2) the Social Behaviour Assessment Schedule, 2nd Ed. (SBAS). 58% of the relatives presented 1 to 4 psychiatric diagnoses, the most frequent being depression (20.6%), alcohol dependence (9.9%) and dissociative disorders (7.6%); the caregivers reported some physical (48%) or emotional (74%) illness related to the presence of schizophrenia in their relatives. The variables predicting psychopathology in the family caregivers were the presence of active symptomatology in the patient, the years of evolution of the illness and the number of hospitalizations (r = 0.38; p > 0.000).

  18. Can a Simple Dietary Index Derived from a Sub-Set of Questionnaire Items Assess Diet Quality in a Sample of Australian Adults?

    PubMed Central

    Trapp, Georgina S. A.; Knuiman, Matthew; Hooper, Paula; Ambrosini, Gina L.

    2018-01-01

    Large, longitudinal surveys often lack consistent dietary data, limiting the use of existing tools and methods that are available to measure diet quality. This study describes a method that was used to develop a simple index for ranking individuals according to their diet quality in a longitudinal study. The RESIDential Environments (RESIDE) project (2004–2011) collected dietary data in varying detail, across four time points. The most detailed dietary data were collected using a 24-item questionnaire at the final time point (n = 555; age ≥ 25 years). At preceding time points, sub-sets of the 24 items were collected. A RESIDE dietary guideline index (RDGI) that was based on the 24-items was developed to assess diet quality in relation to the Australian Dietary Guidelines. The RDGI scores were regressed on the longitudinal sub-sets of six and nine questionnaire items at T4, from which two simple index scores (S-RDGI1 and S-RDGI2) were predicted. The S-RDGI1 and S-RDGI2 showed reasonable agreement with the RDGI (Spearman’s rho = 0.78 and 0.84; gross misclassification = 1.8%; correct classification = 64.9% and 69.7%; and, Cohen’s weighted kappa = 0.58 and 0.64, respectively). For all of the indices, higher diet quality was associated with being female, undertaking moderate to high amounts of physical activity, not smoking, and self-reported health. The S-RDGI1 and S-RDGI2 explained 62% and 73% of the variation in RDGI scores, demonstrating that a large proportion of the variability in diet quality scores can be captured using a relatively small sub-set of questionnaire items. The methods described in this study can be applied elsewhere, in situations where limited dietary data are available, to generate a sample-specific score for ranking individuals according to diet quality. PMID:29652828

  19. Can a Simple Dietary Index Derived from a Sub-Set of Questionnaire Items Assess Diet Quality in a Sample of Australian Adults?

    PubMed

    Bivoltsis, Alexia; Trapp, Georgina S A; Knuiman, Matthew; Hooper, Paula; Ambrosini, Gina L

    2018-04-13

    Large, longitudinal surveys often lack consistent dietary data, limiting the use of existing tools and methods that are available to measure diet quality. This study describes a method that was used to develop a simple index for ranking individuals according to their diet quality in a longitudinal study. The RESIDential Environments (RESIDE) project (2004-2011) collected dietary data in varying detail, across four time points. The most detailed dietary data were collected using a 24-item questionnaire at the final time point ( n = 555; age ≥ 25 years). At preceding time points, sub-sets of the 24 items were collected. A RESIDE dietary guideline index (RDGI) that was based on the 24-items was developed to assess diet quality in relation to the Australian Dietary Guidelines. The RDGI scores were regressed on the longitudinal sub-sets of six and nine questionnaire items at T4, from which two simple index scores (S-RDGI1 and S-RDGI2) were predicted. The S-RDGI1 and S-RDGI2 showed reasonable agreement with the RDGI (Spearman's rho = 0.78 and 0.84; gross misclassification = 1.8%; correct classification = 64.9% and 69.7%; and, Cohen's weighted kappa = 0.58 and 0.64, respectively). For all of the indices, higher diet quality was associated with being female, undertaking moderate to high amounts of physical activity, not smoking, and self-reported health. The S-RDGI1 and S-RDGI2 explained 62% and 73% of the variation in RDGI scores, demonstrating that a large proportion of the variability in diet quality scores can be captured using a relatively small sub-set of questionnaire items. The methods described in this study can be applied elsewhere, in situations where limited dietary data are available, to generate a sample-specific score for ranking individuals according to diet quality.
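
The score-prediction step described above, regressing the full-index score on a sub-set of questionnaire items, amounts to ordinary least squares. A minimal sketch on synthetic data; the items and scores are invented and are not the RESIDE data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "full index" built from 24 item scores; only 6 items are observed longitudinally.
items = rng.integers(0, 5, size=(200, 24)).astype(float)
full_index = items.sum(axis=1)   # stand-in for the RDGI score
subset = items[:, :6]            # the 6 longitudinally available items

# OLS: predict the full index from the sub-set (with an intercept).
X = np.column_stack([np.ones(len(subset)), subset])
coef, *_ = np.linalg.lstsq(X, full_index, rcond=None)
pred = X @ coef                  # the "simple index" score

# Proportion of variance in the full index explained by the sub-set.
ss_res = np.sum((full_index - pred) ** 2)
ss_tot = np.sum((full_index - full_index.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 2))
```

With independent synthetic items the sub-set explains roughly its share of the total variance; with real, correlated dietary items the explained fraction can be much higher, as the 62% and 73% reported above illustrate.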

  20. SU-E-J-128: Two-Stage Atlas Selection in Multi-Atlas-Based Image Segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, T; Ruan, D

    2015-06-15

    Purpose: In the new era of big data, multi-atlas-based image segmentation is challenged by heterogeneous atlas quality and the high computation burden of extensive atlas collections, demanding efficient identification of the most relevant atlases. This study aims to develop a two-stage atlas selection scheme to achieve computational economy with a performance guarantee. Methods: We develop a low-cost fusion set selection scheme by introducing a preliminary selection to trim the full atlas collection into an augmented subset, alleviating the need for extensive full-fledged registrations. More specifically, fusion set selection is performed in two successive steps: preliminary selection and refinement. An augmented subset is first roughly selected from the whole atlas collection with a simple registration scheme and the corresponding preliminary relevance metric; the augmented subset is then refined into the desired fusion set size, using full-fledged registration and the associated relevance metric. The main novelty of this work is the introduction of an inference model relating the preliminary and refined relevance metrics, from which the augmented subset size is rigorously derived to ensure the desired atlases survive the preliminary selection with high probability. Results: The performance and complexity of the proposed two-stage atlas selection method were assessed using a collection of 30 prostate MR images. It achieved segmentation accuracy comparable with the conventional one-stage method with full-fledged registration, but reduced computation time to one third (from 30.82 to 11.04 min per segmentation). Compared with an alternative one-stage cost-saving approach, the proposed scheme yielded superior performance, with mean and median DSC of (0.83, 0.85) versus (0.74, 0.78). Conclusion: This work has developed a model-guided two-stage atlas selection scheme that achieves significant cost reduction while guaranteeing high segmentation accuracy. The benefit in both complexity and performance is expected to be most pronounced with large-scale heterogeneous data.
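
The two-stage idea, a cheap preliminary metric to trim the collection to an augmented subset and an expensive metric to pick the final fusion set, can be sketched generically. The scores below are stand-ins, not the registration-based relevance metrics of the abstract:

```python
def two_stage_select(atlases, cheap_score, costly_score, augmented_size, fusion_size):
    """Stage 1: rank all atlases by a cheap relevance score and keep the top
    `augmented_size`. Stage 2: re-rank only that subset with the costly score
    and keep the top `fusion_size`. Costly evaluations: augmented_size of them,
    not len(atlases)."""
    stage1 = sorted(atlases, key=cheap_score, reverse=True)[:augmented_size]
    stage2 = sorted(stage1, key=costly_score, reverse=True)[:fusion_size]
    return stage2

# Toy example: atlases are numbers; the cheap score is an imperfect proxy
# of the costly "true" relevance.
atlases = list(range(30))
cheap = lambda a: a + (3 if a % 7 == 0 else 0)   # noisy proxy
costly = lambda a: a                              # "true" relevance
fusion = two_stage_select(atlases, cheap, costly, augmented_size=10, fusion_size=5)
print(sorted(fusion))  # → [25, 26, 27, 28, 29]
```

The augmented subset must be large enough that the truly relevant atlases survive the noisy first stage; sizing it rigorously is exactly what the paper's inference model provides.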

  1. The evidences of progressive pressurization of volcanic conduit as driving forces of unrest phenomena analyzed via modelling of multiplatform geodetic measurements: Fernandina (GALAPAGOS) and Maunaloa (HAWAII) case studies

    NASA Astrophysics Data System (ADS)

    Pepe, Susi; Castaldo, Raffaele; Casu, Francesco; D'Auria, Luca; De Luca, Claudio; De Novellis, Vincenzo; Solaro, Giuseppe; Tizzani, Pietro

    2017-04-01

    We investigated the source of the ground deformation pattern affecting the Mauna Loa (Hawaii) and Fernandina (Galapagos) volcanoes by jointly exploiting different datasets collected by GPS and by multiplatform, multiorbit SAR sensors. We exploited advanced Differential SAR Interferometry (DInSAR) techniques to analyze unrest episodes in two different geodynamic contexts. Our main goal is to understand the relationship between the spatio-temporal evolution of the ground deformation field and the temporal volumetric variation of the detected geodetic source during the uplift phenomena. We highlight the great opportunity that the joint use of remote sensing data and inversion procedures offers for understanding volcano unrest: this prospect is particularly relevant for the analysis of uplift events when other geophysical measurements are not available. For the Mauna Loa and Fernandina volcanoes, the statistical analysis performed supports a pipe-like source as the most suitable geometry to explain the unrest phenomena, in which magmatic masses intrude into volcanic conduits. In particular, the deformation time series at Mauna Loa volcano were obtained from 23 permanent GPS stations of the Hawaii surveillance network, processed by the Nevada Geodetic Laboratory, and from 7 SAR datasets acquired on ascending and descending orbits, with different look angles and along different tracks, by the C-band Envisat satellite over the 2003-2010 period, for a total of 189 SAR images. Moreover, we exploited 2 datasets collected on ascending and descending passes by the X-band COSMO-SkyMed constellation during the 2012-2015 time span. These SAR datasets have been processed with the advanced DInSAR technique referred to as P-SBAS (De Luca et al., 2015), which allows us to retrieve the Line of Sight (LOS) projection of the surface deformation and analyze its temporal evolution by generating displacement time series.
Starting from this data collection, we determined the source responsible for the observed deformation; in particular, the results of our inversions show that the pipe source contributes substantially to both the ground deformation pattern and the cost function. In the case of Fernandina volcano (Galápagos), we exploited advanced DInSAR techniques to analyze the 2012-2013 uplift episode using X-band data from the COSMO-SkyMed (CSK) satellite constellation. This volcano is among those that are not well monitored; therefore, the availability of CSK data, acquired with a repeat time ranging from 4 to 12 days and with a ground resolution of 3 metres, represents a unique opportunity to perform a detailed study of the spatial and temporal changes of the ground deformation field (Sansosti et al., 2014). In this case study we computed the ground deformation time series by applying the Small BAseline Subset (SBAS)-DInSAR approach (Berardino et al., 2002) to CSK data acquired from both ascending and descending orbits. The results of their combination (vertical and horizontal E-W components) are used to evaluate, through a cross-correlation analysis (Tizzani et al., 2009; 2015), the volcanic areas characterized by similar uplift temporal behaviour. Subsequently, we determined the geometry, location and temporal evolution of the geodetic source responsible for the 2012-2013 uplift event by applying an inverse method to the DInSAR measurements. We searched for the geometrical parameters and volume variation that minimize the difference between the observed data and the modelled ground deformation field. We tested various analytical models and finally, using the Akaike Information Criterion (Akaike, 1965), selected the tilted pipe among the tested analytical sources. The pipe model is similar to the prolate ellipsoid, but the size of the smaller axis is kept fixed at a very small value (i.e., 10 m).
Despite having a similar fit with the prolate ellipsoid, the tilted pipe-like source has been selected because it has a lower number of degrees of freedom. Both vertical and E-W cross-correlated maps support the hypothesis of the existence of a single active source, characterized by a spatial stability over the entire considered time interval. Indeed, with the proposed source inversion procedure, we have shown that the inflation of a SE dipping tilted closed pipe-like pressurized source explains the observed ground deformation pattern very well. This result suggests that the observed uplift phenomenon could be produced by the progressive pressurization of a shallow elongated magma chamber, before the eruption onset phase. References Akaike, H. On the statistical estimation of the frequency response function of a system having multiple input. Ann. Inst. Stat. Math. 17, 185-210 (1965). Berardino, P., Fornaro, G., Lanari, R., Sansosti, E. (2002). A new algorithm for surface deformation monitoring based on small baseline differential SAR interferograms, IEEE Trans. Geosci. Remote Sens., 40, 2375-2383, doi:10.1109/TGRS.2002.803792. De Luca, C.; Cuccu, R.; Elefante, S.; Zinno, I.; Manunta, M.; Casola, V.; Rivolta, G.; Lanari, R.; Casu, F. An On-Demand Web Tool for the Unsupervised Retrieval of Earth's Surface Deformation from SAR Data: The P-SBAS Service within the ESA G-POD Environment. Remote Sens. 2015, 7, 15630-15650. Fialko, Y., Khazan, Y. and Simons, M. (2001), Deformation due to a pressurized horizontal circular crack in an elastic half-space, with applications to volcano geodesy. Geophysical Journal International, 146(1), 181-190 Mogi, K. (1958), Relations between the eruptions of various volcanoes and the deformations of the ground surfaces around them. Bulletin of the Earthquake Research Institute 36, 99-134. McTigue, D. F. (1987), Elastic stress and deformation near a finite spherical magma body: Resolution of the point source paradox. 
Journal of Geophysical Research: Solid Earth (1978-2012), 92(B12), 12931-12940. Okada, Y. Surface deformation due to shear and tensile faults in a half-space. Bull. Seism. Soc. Am. 75, 1135-1154 (1985). Sansosti, E., Berardino, P., Bonano, M., Calò, F., Castaldo, R., Casu, F., Manunta, M., Manzo, M., Pepe, A., Pepe, S., Solaro, G., Tizzani, P., Zeni, G., Lanari, R. (2014). How second generation SAR systems are impacting the analysis of ground deformation. International Journal of Applied Earth Observation and Geoinformation, 28, doi:10.1016/j.jag.2013.10.007. Tizzani, P., Battaglia, M., Zeni, G., Atzori, S., Berardino, P., Lanari, R. (2009). Uplift and magma intrusion at Long Valley caldera from InSAR and gravity measurements, Geology, January 2009 37; no.1; p. 63-66; doi:10.1130/G25318A.1 Tizzani, P., Battaglia, M., Castaldo, R., Pepe, A., Zeni, G., Lanari, R. (2015). Magma and fluid migration at Yellowstone Caldera in the last three decades inferred from InSAR, leveling, and gravity measurements. J. Geophys. Res. Solid Earth, 120, 2627-2647. doi: 10.1002/2014JB011502. Yang, X. M., Davis, P. M., and Dieterich, J. H. (1988), Deformation from inflation of a dipping finite prolate spheroid in an elastic half-space as a model for volcanic stressing. Journal of Geophysical Research: Solid Earth (1978-2012), 93(B5), 4249-4257.
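
The SBAS inversion cited above (Berardino et al., 2002) links small-baseline interferogram observations to the displacement at each acquisition date through a linear system solved by SVD. A minimal, noise-free, single-pixel sketch with synthetic displacements; real SBAS operates on unwrapped interferometric phase and the SVD additionally bridges disconnected subsets with a minimum-norm solution (which `np.linalg.lstsq` also provides):

```python
import numpy as np

# Acquisition dates and the (unknown) displacement at each date; first date is the reference.
n_acq = 5
true_disp = np.array([0.0, 1.0, 2.5, 3.0, 4.5])   # synthetic values, e.g. in cm

# Small-baseline interferogram pairs (master index m, slave index s), m < s.
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 2), (2, 4)]

# Each interferogram measures disp[s] - disp[m].
obs = np.array([true_disp[s] - true_disp[m] for m, s in pairs])

# Design matrix over the unknowns disp[1..n-1] (disp[0] fixed to 0 as reference).
A = np.zeros((len(pairs), n_acq - 1))
for k, (m, s) in enumerate(pairs):
    if m > 0:
        A[k, m - 1] = -1.0
    A[k, s - 1] = 1.0

# Minimum-norm least-squares solution via SVD (np.linalg.lstsq uses SVD internally).
est, *_ = np.linalg.lstsq(A, obs, rcond=None)
print(np.allclose(est, true_disp[1:]))  # True for this noise-free, connected network
```

The L1-norm variant mentioned in the head of this record replaces this least-squares step with a minimization that is more robust to phase-unwrapping outliers.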

  2. An ensemble method for extracting adverse drug events from social media.

    PubMed

    Liu, Jing; Zhao, Songzheng; Zhang, Xiaodi

    2016-06-01

    Because adverse drug events (ADEs) are a serious health problem and a leading cause of death, it is of vital importance to identify them correctly and in a timely manner. With the development of Web 2.0, social media has become a large data source for information on ADEs. The objective of this study is to develop a relation extraction system that uses natural language processing techniques to effectively distinguish between ADEs and non-ADEs in informal text on social media. We develop a feature-based approach that utilizes various lexical, syntactic, and semantic features. Information-gain-based feature selection is performed to address high-dimensional features. Then, we evaluate the effectiveness of four well-known kernel-based approaches (i.e., subset tree kernel, tree kernel, shortest dependency path kernel, and all-paths graph kernel) and several ensembles that are generated by adopting different combination methods (i.e., majority voting, weighted averaging, and stacked generalization). All of the approaches are tested using three data sets: two health-related discussion forums and one general social networking site (i.e., Twitter). When investigating the contribution of each feature subset, the feature-based approach attains the best area under the receiver operating characteristics curve (AUC) values, which are 78.6%, 72.2%, and 79.2% on the three data sets. When individual methods are used, we attain the best AUC values of 82.1%, 73.2%, and 77.0% using the subset tree kernel, shortest dependency path kernel, and feature-based approach on the three data sets, respectively. When using classifier ensembles, we achieve the best AUC values of 84.5%, 77.3%, and 84.5% on the three data sets, outperforming the baselines. Our experimental results indicate that ADE extraction from social media can benefit from feature selection. 
With respect to the effectiveness of different feature subsets, lexical features and semantic features can enhance the ADE extraction capability. Kernel-based approaches, which avoid the feature-sparsity issue, are well suited to the ADE extraction problem. Combining individual classifiers with suitable combination methods can further enhance ADE extraction effectiveness.
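
Of the combination methods mentioned, majority voting is the simplest; a generic sketch, with invented classifier outputs rather than the paper's models:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label lists by majority vote per instance.
    `predictions` is a list of equally long label sequences, one per classifier."""
    combined = []
    for labels in zip(*predictions):
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Three hypothetical classifiers labelling four posts as ADE (1) or non-ADE (0).
clf_a = [1, 0, 1, 0]
clf_b = [1, 1, 0, 0]
clf_c = [0, 1, 1, 0]
print(majority_vote([clf_a, clf_b, clf_c]))  # → [1, 1, 1, 0]
```

Weighted averaging and stacked generalization replace the vote with, respectively, a weighted mean of classifier scores and a meta-classifier trained on the base predictions.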

  3. Core Hunter 3: flexible core subset selection.

    PubMed

    De Beukelaer, Herman; Davenport, Guy F; Fack, Veerle

    2018-05-31

    Core collections provide genebank curators and plant breeders a way to reduce the size of their collections and populations while minimizing the impact on genetic diversity and allele frequency. Many methods have been proposed to generate core collections, often using distance metrics to quantify the similarity of two accessions based on genetic marker data or phenotypic traits. Core Hunter is a multi-purpose core subset selection tool that uses local search algorithms to generate subsets relying on one or more metrics, including several distance metrics and allelic richness. In version 3 of Core Hunter (CH3) we have incorporated two new, improved methods for summarizing distances to quantify the diversity or representativeness of the core collection. A comparison of CH3 and Core Hunter 2 (CH2) showed that these new metrics can be effectively optimized with less complex algorithms than those used in CH2. CH3 is more effective at maximizing the improved diversity metric than CH2, still ensures a high average and minimum distance, and is faster for large datasets. Using CH3, a simple stochastic hill-climber is able to find highly diverse core collections, and the more advanced parallel tempering algorithm further increases the quality of the core and further reduces variability across independent samples. We also evaluate the ability of CH3 to simultaneously maximize diversity and either representativeness or allelic richness, and compare the results with those of the GDOpt and SimEli methods. CH3 can sample cores as representative as those of GDOpt, which was specifically designed for this purpose, and is able to construct cores that are simultaneously more diverse and either more representative or of higher allelic richness than those obtained by SimEli. In version 3, Core Hunter has been updated to include two new core subset selection metrics that construct cores for representativeness or diversity, with improved performance.
It combines the strengths of other methods and outperforms them, as it (simultaneously) optimizes a variety of metrics. In addition, CH3 improves on CH2, with the option to use genetic marker data, phenotypic traits, or both, and with improved speed. Core Hunter 3 is freely available at http://www.corehunter.org .
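
A stochastic hill-climber of the kind mentioned above can be sketched for core selection with a stand-in objective, here the average pairwise distance of the selected accessions; this is an illustration, not Core Hunter's actual metrics or search:

```python
import random
import numpy as np

def avg_pairwise_dist(D, core):
    """Average pairwise distance among the selected accessions."""
    idx = list(core)
    sub = D[np.ix_(idx, idx)]
    n = len(idx)
    return sub.sum() / (n * (n - 1))

def hill_climb_core(D, core_size, steps=2000, seed=0):
    """Randomly swap one selected accession for an unselected one;
    keep the swap only if the objective improves."""
    rng = random.Random(seed)
    n = D.shape[0]
    core = set(rng.sample(range(n), core_size))
    best = avg_pairwise_dist(D, core)
    for _ in range(steps):
        out = rng.choice(sorted(core))
        inn = rng.choice(sorted(set(range(n)) - core))
        cand = (core - {out}) | {inn}
        val = avg_pairwise_dist(D, cand)
        if val > best:
            core, best = cand, val
    return core, best

# Toy distance matrix from 1-D "marker" coordinates; extremes should be favoured.
coords = np.array([0.0, 0.1, 0.2, 5.0, 9.9, 10.0])
D = np.abs(coords[:, None] - coords[None, :])
core, best = hill_climb_core(D, core_size=3)
print(sorted(core), best)
```

Parallel tempering, as used in CH3, runs several such searches at different "temperatures" that occasionally accept worsening swaps, which helps escape local optima.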

  4. A subset polynomial neural networks approach for breast cancer diagnosis.

    PubMed

    O'Neill, T J; Penm, Jack; Penm, Jonathan

    2007-01-01

    Breast cancer is a very common and serious cancer for women that is diagnosed in one of every eight Australian women before the age of 85. The conventional method of breast cancer diagnosis is mammography. However, mammography has been reported to have poor diagnostic capability. In this paper we have used subset polynomial neural network techniques in conjunction with fine needle aspiration cytology to undertake this difficult task of predicting breast cancer. The successful findings indicate that adoption of NNs is likely to lead to increased survival of women with breast cancer, improved electronic healthcare, and enhanced quality of life.

  5. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high-dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all of the reactions of large and huge-scale networks, on any number of threads or nodes.

  6. DistributedFBA.jl: High-level, high-performance flux balance analysis in Julia

    DOE PAGES

    Heirendt, Laurent; Thiele, Ines; Fleming, Ronan M. T.

    2017-01-16

    Flux balance analysis and its variants are widely used methods for predicting steady-state reaction rates in biochemical reaction networks. The exploration of high-dimensional networks with such methods is currently hampered by software performance limitations. DistributedFBA.jl is a high-level, high-performance, open-source implementation of flux balance analysis in Julia. It is tailored to solve multiple flux balance analyses on a subset or all of the reactions of large and huge-scale networks, on any number of threads or nodes.
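
Each flux balance analysis is a linear program: maximize a flux subject to the steady-state mass balance S·v = 0 and flux bounds. A toy three-reaction sketch of that underlying LP using scipy; DistributedFBA.jl itself is Julia, and the network here is invented:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network with one metabolite A and three reactions:
#   R1: uptake -> A,  R2: A -> export,  R3: A -> export
# Stoichiometric matrix S (rows: metabolites, cols: reactions);
# steady state requires dA/dt = v1 - v2 - v3 = 0.
S = np.array([[1.0, -1.0, -1.0]])

bounds = [(0, 10), (0, 4), (0, 8)]   # flux bounds for v1, v2, v3
c = np.array([0.0, -1.0, -1.0])      # maximize v2 + v3 => minimize -(v2 + v3)

res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")
print(res.x, -res.fun)  # optimal flux vector and the maximized objective
```

Tools like DistributedFBA.jl solve many such LPs, one per objective reaction or per flux-variability bound, which is what makes distribution over threads and nodes worthwhile.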

  7. A simple method for constructing the inhomogeneous quantum group IGLq(n) and its universal enveloping algebra Uq(igl(n))

    NASA Astrophysics Data System (ADS)

    Shariati, A.; Aghamohammadi, A.

    1995-12-01

    We propose a simple and concise method to construct the inhomogeneous quantum group IGLq(n) and its universal enveloping algebra Uq(igl(n)). Our technique is based on embedding an n-dimensional quantum space in an n+1-dimensional one as the set xn+1=1. This is possible only if one considers the multiparametric quantum space whose parameters are fixed in a specific way. The quantum group IGLq(n) is then the subset of GLq(n+1), which leaves the xn+1=1 subset invariant. For the deformed universal enveloping algebra Uq(igl(n)), we will show that it can also be embedded in Uq(gl(n+1)), provided one uses the multiparametric deformation of U(gl(n+1)) with a specific choice of its parameters.

  8. Methods Used to Streamline the CAHPS® Hospital Survey

    PubMed Central

    Keller, San; O'Malley, A James; Hays, Ron D; Matthew, Rebecca A; Zaslavsky, Alan M; Hepner, Kimberly A; Cleary, Paul D

    2005-01-01

    Objective To identify a parsimonious subset of reliable, valid, and consumer-salient items from 33 questions asking for patient reports about hospital care quality. Data Source CAHPS® Hospital Survey pilot data were collected during the summer of 2003 using mail and telephone from 19,720 patients who had been treated in 132 hospitals in three states and discharged from November 2002 to January 2003. Methods Standard psychometric methods were used to assess the reliability (internal consistency reliability and hospital-level reliability) and construct validity (exploratory and confirmatory factor analyses, strength of relationship to overall rating of hospital) of the 33 report items. The best subset of items from among the 33 was selected based on their statistical properties in conjunction with the importance assigned to each item by participants in 14 focus groups. Principal Findings Confirmatory factor analysis (CFA) indicated that a subset of 16 questions proposed to measure seven aspects of hospital care (communication with nurses, communication with doctors, responsiveness to patient needs, physical environment, pain control, communication about medication, and discharge information) demonstrated excellent fit to the data. Scales in each of these areas had acceptable levels of reliability to discriminate among hospitals and internal consistency reliability estimates comparable with previously developed CAHPS instruments. Conclusion Although half the length of the original, the shorter CAHPS hospital survey demonstrates promising measurement properties, identifies variations in care among hospitals, and deals with aspects of the hospital stay that are important to patients' evaluations of care quality. PMID:16316438
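
The internal consistency reliability assessed above is commonly measured with Cronbach's alpha; a minimal sketch on a tiny invented score matrix, not the CAHPS analysis code:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent items (each respondent gives identical answers) yield alpha = 1.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

Hospital-level reliability, the other criterion used in the study, instead asks how well scale means discriminate among hospitals rather than among respondents.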

  9. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings are present in the oral cavity, metal-induced streak artefacts are unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, images with weak artefacts were reconstructed using the projection data of an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. The ordered subset-expectation maximization algorithm and the small region of interest reduced the processing duration without apparent detriment, and the general-purpose graphics processing unit delivered high performance. In summary, a statistical reconstruction method was applied for streak artefact reduction; the alternative algorithms were effective, and both software and hardware tools (ordered subset-expectation maximization, a small region of interest and a general-purpose graphics processing unit) achieved fast artefact correction.
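
    The ordered subset-expectation maximization (OS-EM) update cycles the multiplicative ML-EM step over subsets of the projection data, which is where its speedup comes from. A minimal numpy sketch on a toy linear system (the system matrix and sizes are made up for illustration; this is not the authors' CT pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy system: 40 projection measurements of a 10-pixel nonnegative image.
A = rng.uniform(0.0, 1.0, size=(40, 10))
x_true = rng.uniform(1.0, 5.0, size=10)
b = A @ x_true  # noise-free projections for the sketch

def osem(A, b, n_subsets=4, n_iter=20):
    """OS-EM: apply the ML-EM multiplicative update per projection subset."""
    x = np.ones(A.shape[1])  # positive initial estimate
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for idx in subsets:
            As, bs = A[idx], b[idx]
            ratio = bs / np.maximum(As @ x, 1e-12)      # measured / forward-projected
            x *= (As.T @ ratio) / np.maximum(As.T @ np.ones(len(idx)), 1e-12)
    return x

x_rec = osem(A, b)
```

    With n_subsets = 1 this reduces to plain ML-EM; using several subsets applies roughly n_subsets updates per pass over the data.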

  10. UAV Mission Planning under Uncertainty

    DTIC Science & Technology

    2006-06-01

    …algorithm, adapted from [13]. 57 | 4-5 Robust Optimization considers only a subset of the feasible region. 61 | 5-1 Overview of simulation with parameter… incorporates the robust optimization method suggested by Bertsimas and Sim [12], and is solved with a standard Branch-and-Cut algorithm. The chapter… algorithms, and the heuristic methods of Local Search and Simulated Annealing. With each method, we attempt to give a review of research that has…

  11. Multicentre evaluation of stable reference whole blood for enumeration of lymphocyte subsets by flow cytometry.

    PubMed

    Edwards, Cherry; Belgrave, Danielle; Janossy, George; Bradley, Nicholas J; Stebbings, Richard; Gaines-Das, Rose; Thorpe, Robin; Sawle, Alex; Arroz, Maria Jorge; Brando, Bruno; Gratama, Jan Willem; Orfao de Matos, Alberto; Papa, Stephano; Papamichail, Michael; Lenkei, Rodica; Rothe, Gregor; Barnett, David

    2005-06-22

    BACKGROUND: Clinical indications for lymphocyte subset enumeration by flow cytometry include monitoring of disease progression and timing of therapeutic intervention in infection with human immunodeficiency virus. Until recently, international standardisation has not been possible due to a lack of suitable stable reference material. METHODS: This study consisted of two trials of a stabilised whole blood preparation. Eleven participants were sent two standard protocols for staining plus gating strategy and asked to report absolute counts for lymphocyte subsets. RESULTS: No significant difference was detected between the two methods when results from the two assays and all partners were pooled. Significant differences in results from the different partners were observed. However, representative mean counts were obtained for geometric means, geometric coefficients of variation, and 95% confidence intervals for CD3 (910 cells/μl, 9%, and 888 to 933, respectively), CD4 (495 cells/μl, 12%, and 483 to 507), and CD8 (408 cells/μl, 13%, and 393 to 422). CONCLUSION: We have introduced a stabilised blood preparation and a well-characterized biological standard. The availability of this reference material greatly simplifies the validation of new techniques for CD4(+) T-cell enumeration and the expansion of external quality assurance programmes for clinical laboratories, including those that operate in resource-restricted environments. (c) 2005 Wiley-Liss, Inc.
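
    The summary statistics reported, geometric mean and geometric coefficient of variation, come from log-transformed counts. A small sketch using one common GCV definition (the example counts are made up for illustration):

```python
import numpy as np

def geo_stats(counts):
    """Geometric mean and one common geometric CV definition,
    assuming all counts are positive (e.g. cells/ul)."""
    logs = np.log(np.asarray(counts, dtype=float))
    gmean = np.exp(logs.mean())
    gcv = np.sqrt(np.exp(logs.var(ddof=1)) - 1.0)  # fraction; *100 for %
    return gmean, gcv

# Hypothetical CD3 counts from six laboratories.
gmean, gcv = geo_stats([850, 910, 905, 960, 930, 880])
print(round(gmean), round(100 * gcv, 1))
```

    The geometric mean is less sensitive than the arithmetic mean to occasional high outliers, which is why it is preferred for between-laboratory count comparisons.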

  12. Thiol-Reactive Star Polymers Display Enhanced Association with Distinct Human Blood Components.

    PubMed

    Glass, Joshua J; Li, Yang; De Rose, Robert; Johnston, Angus P R; Czuba, Ewa I; Khor, Song Yang; Quinn, John F; Whittaker, Michael R; Davis, Thomas P; Kent, Stephen J

    2017-04-12

    Directing nanoparticles to specific cell types using nonantibody-based methods is of increasing interest. Thiol-reactive nanoparticles can enhance the efficiency of cargo delivery into specific cells through interactions with cell-surface proteins. However, studies to date using this technique have been largely limited to immortalized cell lines or rodents, and the utility of this technology on primary human cells is unknown. Herein, we used RAFT polymerization to prepare pyridyl disulfide (PDS)-functionalized star polymers with a methoxy-poly(ethylene glycol) brush corona and a fluorescently labeled cross-linked core using an arm-first method. PDS star polymers were examined for their interaction with primary human blood components: six separate white blood cell subsets, as well as red blood cells and platelets. Compared with control star polymers, thiol-reactive nanoparticles displayed enhanced association with white blood cells at 37 °C, particularly the phagocytic monocyte, granulocyte, and dendritic cell subsets. Platelets associated with more PDS than control nanoparticles at both 37 °C and on ice, but they were not activated in the duration examined. Association with red blood cells was minor but still enhanced with PDS nanoparticles. Thiol-reactive nanoparticles represent a useful strategy to target primary human immune cell subsets for improved nanoparticle delivery.

  13. A Genetic Algorithm Based Support Vector Machine Model for Blood-Brain Barrier Penetration Prediction

    PubMed Central

    Zhang, Daqing; Xiao, Jianfeng; Zhou, Nannan; Luo, Xiaomin; Jiang, Hualiang; Chen, Kaixian

    2015-01-01

    Blood-brain barrier (BBB) is a highly complex physical barrier determining which substances are allowed to enter the brain. Support vector machine (SVM) is a kernel-based machine learning method that is widely used in QSAR studies. For a successful SVM model, the kernel parameters and the feature subset selection are the most important factors affecting prediction accuracy. In most studies, they are treated as two independent problems, but it has been proven that they can affect each other. We designed and implemented a genetic algorithm (GA) to optimize kernel parameters and feature subset selection for SVM regression and applied it to BBB penetration prediction. The results show that our GA/SVM model is more accurate than other currently available log BB models. Therefore, optimizing both SVM parameters and the feature subset simultaneously with a genetic algorithm is a better approach than methods that treat the two problems separately. Analysis of our log BB model suggests that the carboxylic acid group, polar surface area (PSA)/hydrogen-bonding ability, lipophilicity, and molecular charge play important roles in BBB penetration. Among those properties, lipophilicity enhances BBB penetration while all the others are negatively correlated with it. PMID:26504797
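
    The joint optimisation described, one chromosome encoding both the feature mask and the kernel parameters, can be sketched roughly as follows. This is a toy GA on synthetic regression data with an assumed 16-bit chromosome layout and small parameter grids, not the authors' actual encoding:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=120, n_features=12, n_informative=4,
                       noise=5.0, random_state=0)

C_GRID = [0.1, 1.0, 10.0, 100.0]      # 2 bits select C
GAMMA_GRID = [0.001, 0.01, 0.1, 1.0]  # 2 bits select gamma

def fitness(chrom):
    """Chromosome: 12 feature bits, then 2 bits for C, 2 bits for gamma."""
    mask = chrom[:12].astype(bool)
    if not mask.any():
        return -np.inf  # at least one feature is required
    C = C_GRID[2 * chrom[12] + chrom[13]]
    gamma = GAMMA_GRID[2 * chrom[14] + chrom[15]]
    model = SVR(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(model, X[:, mask], y, cv=3, scoring="r2").mean()

def evolve(pop_size=20, gens=10, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, 16))
    for _ in range(gens):
        scores = np.array([fitness(c) for c in pop])
        # binary tournament selection
        idx = [max(rng.choice(pop_size, 2, replace=False),
                   key=lambda i: scores[i]) for _ in range(pop_size)]
        parents = pop[idx]
        children = parents.copy()
        # one-point crossover on consecutive pairs
        for i in range(0, pop_size - 1, 2):
            cut = int(rng.integers(1, 16))
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # bit-flip mutation
        flip = rng.random(children.shape) < p_mut
        children[flip] ^= 1
        pop = children
    scores = np.array([fitness(c) for c in pop])
    return pop[scores.argmax()], float(scores.max())

best, score = evolve()
```

    Because one fitness call evaluates features and kernel parameters together, the GA can trade them off jointly, which is the point the abstract makes against optimising the two separately.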

  14. Enhancing the Discrimination Ability of a Gas Sensor Array Based on a Novel Feature Selection and Fusion Framework.

    PubMed

    Deng, Changjian; Lv, Kun; Shi, Debo; Yang, Bo; Yu, Song; He, Zhiyi; Yan, Jia

    2018-06-12

    In this paper, a novel feature selection and fusion framework is proposed to enhance the discrimination ability of gas sensor arrays for odor identification. Firstly, we put forward an efficient feature selection method based on separability and dissimilarity to determine the feature selection order for each type of feature when increasing the dimension of the selected feature subsets. Secondly, the K-nearest neighbor (KNN) classifier is applied to determine the dimensions of the optimal feature subsets for the different types of features. Finally, in establishing the feature fusion, we propose a classification-dominance feature-fusion strategy built on an effective basic feature. Experimental results on two datasets show that the recognition rates on Database I and Database II reach 97.5% and 80.11%, respectively, when k = 1 for the KNN classifier and the distance metric is correlation distance (COR), which demonstrates the superiority of the proposed feature selection and fusion framework in representing signal features. The novel feature selection method proposed in this paper can effectively select feature subsets that are conducive to classification, while the feature fusion framework can fuse various features describing the different characteristics of the sensor signals, enhancing the discrimination ability of gas sensors and, to a certain extent, suppressing the drift effect.
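
    The subset-evaluation step, scoring candidate feature subsets with a 1-NN classifier under correlation distance, can be sketched as a greedy forward selection. This is illustrative only: the wine dataset stands in for gas-sensor features, and the search is seeded with the best feature triple because correlation distance is degenerate below three dimensions:

```python
from itertools import combinations

import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)  # stand-in for gas-sensor features
X = StandardScaler().fit_transform(X)

# k = 1 with correlation distance, as in the abstract; brute force is
# required for scipy distance metrics.
knn = KNeighborsClassifier(n_neighbors=1, metric="correlation",
                           algorithm="brute")

def score(features):
    return cross_val_score(knn, X[:, list(features)], y, cv=5).mean()

# Seed the greedy search with the best triple of features.
seed = max(combinations(range(X.shape[1]), 3), key=score)
selected = list(seed)
remaining = [f for f in range(X.shape[1]) if f not in selected]

history = [score(selected)]
while remaining and len(selected) < 6:
    best = max(remaining, key=lambda f: score(selected + [f]))
    selected.append(best)
    remaining.remove(best)
    history.append(score(selected))
```

    Plotting `history` against subset size is one way to pick the "optimal dimension" the abstract refers to.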

  15. Species undersampling in tropical bat surveys: effects on emerging biodiversity patterns.

    PubMed

    Meyer, Christoph F J; Aguiar, Ludmilla M S; Aguirre, Luis F; Baumgarten, Julio; Clarke, Frank M; Cosson, Jean-François; Estrada Villegas, Sergio; Fahr, Jakob; Faria, Deborah; Furey, Neil; Henry, Mickaël; Jenkins, Richard K B; Kunz, Thomas H; Cristina MacSwiney González, M; Moya, Isabel; Pons, Jean-Marc; Racey, Paul A; Rex, Katja; Sampaio, Erica M; Stoner, Kathryn E; Voigt, Christian C; von Staden, Dietrich; Weise, Christa D; Kalko, Elisabeth K V

    2015-01-01

    Undersampling is commonplace in biodiversity surveys of species-rich tropical assemblages in which rare taxa abound, with possible repercussions for our ability to implement surveys and monitoring programmes in a cost-effective way. We investigated the consequences of information loss due to species undersampling (missing subsets of species from the full species pool) in tropical bat surveys for the emerging patterns of species richness (SR) and compositional variation across sites. For 27 bat assemblage data sets from across the tropics, we used correlations between original data sets and subsets with different numbers of species deleted either at random, or according to their rarity in the assemblage, to assess to what extent patterns in SR and composition in data subsets are congruent with those in the initial data set. We then examined to what degree high sample representativeness (r ≥ 0·8) was influenced by biogeographic region, sampling method, sampling effort or structural assemblage characteristics. For SR, correlations between random subsets and original data sets were strong (r ≥ 0·8) with moderate (ca. 20%) species loss. Bias associated with information loss was greater for species composition; on average ca. 90% of species in random subsets had to be retained to adequately capture among-site variation. For nonrandom subsets, removing only the rarest species (on average c. 10% of the full data set) yielded strong correlations (r > 0·95) for both SR and composition. Eliminating greater proportions of rare species resulted in weaker correlations and large variation in the magnitude of observed correlations among data sets. Species subsets that comprised ca. 85% of the original set can be considered reliable surrogates, capable of adequately revealing patterns of SR and temporal or spatial turnover in many tropical bat assemblages. 
Our analyses thus demonstrate the potential as well as limitations for reducing survey effort and streamlining sampling protocols, and consequently for increasing the cost-effectiveness in tropical bat surveys or monitoring programmes. The dependence of the performance of species subsets on structural assemblage characteristics (total assemblage abundance, proportion of rare species), however, underscores the importance of adaptive monitoring schemes and of establishing surrogate performance on a site by site basis based on pilot surveys. © 2014 The Authors. Journal of Animal Ecology © 2014 British Ecological Society.
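
    The random-deletion procedure, correlating per-site richness in species subsets against the full data set, can be sketched as follows (a synthetic abundance matrix with lognormal rates stands in for a real assemblage):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_sites, n_species = 27, 60

# Poisson counts with lognormal rates -> plenty of rare species.
abundance = rng.poisson(rng.lognormal(0.0, 1.5, size=(n_sites, n_species)))
full_richness = (abundance > 0).sum(axis=1)

def subset_correlation(frac_removed):
    """Correlation of per-site richness after randomly deleting species."""
    drop = rng.choice(n_species, int(frac_removed * n_species), replace=False)
    sub = np.delete(abundance, drop, axis=1)
    return pearsonr(full_richness, (sub > 0).sum(axis=1))[0]

# Average over repeated random deletions of 20% of the species pool.
r20 = float(np.mean([subset_correlation(0.2) for _ in range(100)]))
```

    The same loop with nonrandom deletion orders (rarest species first) reproduces the abstract's second analysis.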

  16. The CEOS Atmospheric Composition Constellation (ACC), an Integrated Observing System

    NASA Astrophysics Data System (ADS)

    Hilsenrath, E.; Langen, J.; Zehner, C.

    2008-05-01

    The Atmospheric Composition (AC) Constellation is one of four pilot projects initiated by the Committee on Earth Observation Satellites (CEOS) to bring about technical/scientific cooperation among space agencies that meets the goals of GEO and complies with the CEOS member agencies' national programs. The Constellation concept has been endorsed in the GEO Work Plan, 2007-2009. The AC Constellation goal is to collect and deliver data to develop and improve monitoring, assessment and predictive capabilities for changes in the ozone layer, air quality and climate forcing associated with changes in the environment. These data will support five of the nine GEO SBAs: Health, Energy, Climate, Hazards, and Ecosystems. At present, ESA, EC, CSA, CNES, JAXA, DLR, NIVR, NASA, NOAA and Eumetsat are participating in the Constellation study, and have major assets in orbit including 17 instruments on seven platforms. One goal of the Constellation study is to identify the capabilities that will be lost when the present orbiting research satellite missions end without being included in the next-generation operational missions. Missing observations include the very accurate, high spatial resolution measurements needed to track trends in atmospheric composition and understand their relationship to climate change. The following are the top-level objectives of the AC Constellation Concept Study: • Develop a virtual constellation of existing and upcoming missions using synergies among the instruments, and identify missing capabilities. • Study advanced architectures with new space assets and varying orbits, with the expectation that new technology could also be brought forward to best meet user requirements. • Ensure data system interoperability so that data are useful, properly targeted, and easily accessible.
To demonstrate that the Constellation concept can provide value-added data products, the ACC has initiated three projects supported by the participating space agencies: 1) time-of-day changes in NO2 using Aura/OMI and Metop/GOME-2; 2) near-real-time fire detection and smoke forecasts using multiple satellites (A-Train, GOES, GOME-2, MSG, etc.) and a trajectory model; and 3) improved volcanic ash alerts for aviation hazard avoidance from satellite SO2 and ash data from SCIAMACHY, OMI, GOME-2, AIRS and SEVIRI. Each of the three projects will address the GEO SBAs with consideration to the discovery and interoperability of their data products. The status of the ACC studies will be reviewed with a progress report on the above three projects.

  17. Inferences about nested subsets structure when not all species are detected

    USGS Publications Warehouse

    Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.

    2000-01-01

    Comparisons of species composition among ecological communities of different size have often provided evidence that the species in communities with lower species richness form nested subsets of the species in larger communities. In the vast majority of studies, the question of nested subsets has been addressed using information on presence-absence, where a '0' is interpreted as the absence of a given species from a given location. Most of the methodological discussion in earlier studies investigating nestedness concerns the approach to generation of model-based matrices. However, it is most likely that in many situations investigators cannot detect all the species present in the location sampled. The possibility that zeros in incidence matrices reflect nondetection rather than absence of species has not been considered in studies addressing nested subsets, even though the position of zeros in these matrices forms the basis of earlier inference methods. These sampling artifacts are likely to lead to erroneous conclusions about both variation over space in species richness and the degree of similarity of the various locations. Here we propose an approach to the investigation of nestedness, based on statistical inference methods that explicitly incorporate species detection probability and take into account the probabilistic nature of the sampling process. We use presence-absence data collected under Pollock's robust capture-recapture design, and resort to an estimator of species richness originally developed for closed populations to assess the proportion of species shared by different locations. We develop testable predictions corresponding to the null hypothesis of a nonnested pattern, and an alternative hypothesis of perfect nestedness. We also present an index for assessing the degree of nestedness of a system of ecological communities. We illustrate our approach using avian data from the North American Breeding Bird Survey collected in the Florida Keys.
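
    One widely used way to quantify the degree of nestedness of an incidence matrix is the NODF family of metrics. A simplified sketch follows (paired overlap between rows and columns of strictly decreasing fill); it illustrates nestedness indices generally, not the detection-probability-corrected index the authors propose:

```python
from itertools import combinations

import numpy as np

def nodf_axis(M):
    """Simplified NODF along rows: for each row pair with strictly
    decreasing fill, the % of the poorer row's species shared with
    the richer row; equal fills contribute zero."""
    fills = M.sum(axis=1)
    scores = []
    for i, j in combinations(range(M.shape[0]), 2):
        hi, lo = (i, j) if fills[i] > fills[j] else (j, i)
        if fills[hi] == fills[lo] or fills[lo] == 0:
            scores.append(0.0)
            continue
        shared = np.logical_and(M[hi], M[lo]).sum()
        scores.append(100.0 * shared / fills[lo])
    return float(np.mean(scores))

def nodf(M):
    M = np.asarray(M, dtype=bool)
    return 0.5 * (nodf_axis(M) + nodf_axis(M.T))

perfectly_nested = np.array([[1, 1, 1, 1],
                             [1, 1, 1, 0],
                             [1, 1, 0, 0],
                             [1, 0, 0, 0]])
print(nodf(perfectly_nested))  # -> 100.0
```

    A perfectly nested matrix scores 100, while a checkerboard-like matrix with no shared species between sites scores 0; observed values are usually compared against a null-model distribution.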

  18. Leukocyte Populations in Human Preterm and Term Breast Milk Identified by Multicolour Flow Cytometry

    PubMed Central

    Trend, Stephanie; de Jong, Emma; Lloyd, Megan L.; Kok, Chooi Heen; Richmond, Peter; Doherty, Dorota A.; Simmer, Karen; Kakulas, Foteini; Strunk, Tobias; Currie, Andrew

    2015-01-01

    Background Extremely preterm infants are highly susceptible to bacterial infections but breast milk provides some protection. It is unknown if leukocyte numbers and subsets in milk differ between term and preterm breast milk. This study serially characterised leukocyte populations in breast milk of mothers of preterm and term infants using multicolour flow cytometry methods for extended differential leukocyte counts in blood. Methods Sixty mothers of extremely preterm (<28 weeks gestational age), very preterm (28–31 wk), and moderately preterm (32–36 wk), as well as term (37–41 wk) infants were recruited. Colostrum (d2–5), transitional (d8–12) and mature milk (d26–30) samples were collected, cells isolated, and leukocyte subsets analysed using flow cytometry. Results The major CD45+ leukocyte populations circulating in blood were also detectable in breast milk but at different frequencies. Progression of lactation was associated with decreasing CD45+ leukocyte concentration, as well as increases in the relative frequencies of neutrophils and immature granulocytes, and decreases in the relative frequencies of eosinophils, myeloid and B cell precursors, and CD16- monocytes. No differences were observed between preterm and term breast milk in leukocyte concentration, though minor differences between preterm groups in some leukocyte frequencies were observed. Conclusions Flow cytometry is a useful tool to identify and quantify leukocyte subsets in breast milk. The stage of lactation is associated with major changes in milk leukocyte composition in this population. Fresh preterm breast milk is not deficient in leukocytes, but shorter gestation may be associated with minor differences in leukocyte subset frequencies in preterm compared to term breast milk. PMID:26288195

  19. Spectral Band Selection for Urban Material Classification Using Hyperspectral Libraries

    NASA Astrophysics Data System (ADS)

    Le Bris, A.; Chehata, N.; Briottet, X.; Paparoditis, N.

    2016-06-01

    In urban areas, information concerning very high resolution land cover, and especially material maps, is necessary for several city modelling or monitoring applications. That is to say, knowledge concerning the roofing materials or the different kinds of ground areas is required. Airborne remote sensing techniques appear to be convenient for providing such information at a large scale. However, results obtained using most traditional processing methods based on usual red-green-blue-near infrared multispectral images remain limited for such applications. A possible way to improve classification results is to enhance the imagery's spectral resolution using superspectral or hyperspectral sensors. In this study, we intended to design a superspectral sensor dedicated to urban material classification, and this work particularly focused on the selection of optimal spectral band subsets for such a sensor. First, reflectance spectral signatures of urban materials were collected from 7 spectral libraries. Then, spectral optimization was performed using this data set. The band selection workflow included two steps, first optimising the number of spectral bands using an incremental method and then examining several possible optimised band subsets using a stochastic algorithm. The same wrapper relevance criterion, relying on a confidence measure of a Random Forests classifier, was used at both steps. To cope with the limited number of available spectra for several classes, additional synthetic spectra were generated from the collection of reference spectra: intra-class variability was simulated by multiplying reference spectra by a random coefficient. Finally, the selected band subsets were evaluated considering the classification quality reached using an RBF SVM classifier. It was confirmed that a limited band subset was sufficient to classify common urban materials.
The important contribution of bands from the Short Wave Infra-Red (SWIR) spectral domain (1000-2400 nm) to material classification was also shown.
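
    The synthetic-spectra augmentation step, simulating intra-class variability by multiplying whole reference spectra by a random coefficient, is easy to sketch (the coefficient range and array sizes below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

def augment(reference_spectra, n_new, coeff_range=(0.9, 1.1)):
    """Simulate intra-class variability by scaling randomly chosen
    reference spectra with a random multiplicative coefficient
    (the range is an assumption, not the paper's value)."""
    refs = np.asarray(reference_spectra, dtype=float)
    picks = refs[rng.integers(0, len(refs), size=n_new)]
    coeffs = rng.uniform(*coeff_range, size=(n_new, 1))
    return picks * coeffs

# 5 reference reflectance spectra over 100 bands for one material class.
tile_refs = rng.uniform(0.1, 0.6, size=(5, 100))
synthetic = augment(tile_refs, n_new=50)
```

    A purely multiplicative perturbation preserves each spectrum's shape, so band-selection criteria driven by spectral shape remain meaningful on the augmented set.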

  20. Methodological Options and Their Implications: An Example Using Secondary Data to Analyze Latino Educational Expectations

    ERIC Educational Resources Information Center

    Wells, Ryan S.; Lynch, Cassie M.; Seifert, Tricia A.

    2011-01-01

    A number of studies over decades have examined determinants of educational expectations. However, even among the subset of quantitative studies, there is considerable variation in the methods used to operationally define and analyze expectations. Using a systematic literature review and several regression methods to analyze Latino students'…

  1. Motivational Engagement and Video Gaming: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Hoffman, Bobby; Nadelson, Louis

    2010-01-01

    A mixed methods design was used to identify factors associated with motivational engagement in video gaming. Self-report instruments were administered to 189 video game players to assess goal orientations, affect, need for cognition, and perceptions of engagement and flow. Simultaneously, a subset of 25 participants was interviewed and results…

  2. Historical series and Near Real Time data analysis produced within ASI-SRV project infrastructures

    NASA Astrophysics Data System (ADS)

    Silvestri, M.; Musacchio, M.; Buongiorno, M.; Corradini, S.; Lombardo, V.; Merucci, L.; Spinetti, C.; Sansosti, E.; Pugnaghi, S.; Teggi, S.; Vignoli, S.; Amodio, A.; Dini, L.

    2009-12-01

    ASI-Sistema Rischio Vulcanico (SRV) project is devoted to the development of a pre-operative integrated system managing different Earth Observation (EO) and non-EO data to respond to specific needs of the Italian Civil Protection Department (DPC) and improve the monitoring of Italian active volcanoes. The project provides the capability to maintain a repository where the acquired data are stored and generates products offering support to risk managers during the different volcanic activity phases. All the products are obtained considering the technical choices and developments of ASI-SRV, based on flexible and scalable modules which also take into account newly arriving space sensors and new processing algorithms. An important step of the project development concerns the technical and scientific feasibility of the provided products, which depends on data availability, the accuracy of the algorithms and models used in the processing and, of course, the possibility of validating the results by comparison with independent non-EO measurements. The ASI-SRV infrastructure is based on a distributed client/server architecture, which implies that different processors need to ingest data sets characterized by a constant and common structure. ASI-SRV will develop, in its final version, a centralized HW-SW system located at INGV which will control two complete processing chains, one located at INGV for optical data, and the other located at IREA for SAR data. The produced results will be disseminated through a WEB-GIS interface which will allow the DPC to overview and assimilate the products in a format compatible with their local monitoring system, in order to make immediate use of the provided information. In this paper the first results producing ground deformation measurements via Differential Interferometric SAR (DInSAR) techniques using SAR data and the application of the Small BAseline Subset (SBAS) technique developed at IREA are reported.
Moreover, different products obtained using optical data (ASTER, MODIS, HYPERION, and AVHRR) are also reported. The processing modules for EO optical sensor data are based on procedures that allow the estimation of a number of parameters, which include surface thermal properties, the concentration and flux of sulphur dioxide (SO2), water vapor and volcanic aerosol optical thickness, and ash emissions, and allow the volcanic products to be characterized in terms of composition and geometry. For the analysis of the surface thermal characteristics, the available algorithms allow the extraction of information useful for detecting small changes in the retrieved parameters. ASI-SRV also foresees the generation of significant information for the definition of the new lava and ash cover distribution after the end of an eruption.

  3. Hydrothermal fluid flow models of Campi Flegrei caldera, Italy constrained by InSAR surface deformation time series observations

    NASA Astrophysics Data System (ADS)

    Lundgren, P.; Lanari, R.; Manzo, M.; Sansosti, E.; Tizzani, P.; Hutnak, M.; Hurwitz, S.

    2008-12-01

    Campi Flegrei caldera, Italy, located along the Bay of Naples, has a long history of significant vertical deformation, with the most recent large uplift (>1.5 m) occurring in 1983-1984. Each episode of uplift has been followed by a period of subsidence that decreases in rate with time and may be punctuated by brief episodes of lesser uplift. The large amplitude of the major uplifts that occur without volcanic activity, and the subsequent subsidence, has been argued as evidence for hydrothermal amplification of any magmatic source. The later subsidence and its temporal decay have been argued to be due to diffusion of the pressurized caldera fill material into the less porous surrounding country rock. We present satellite synthetic aperture radar (SAR) interferometry (InSAR) time series analysis of ERS and Envisat data from the European Space Agency, based on exploiting the Small Baseline Subset (SBAS) approach [Berardino et al., 2002]; this allows us to generate maps of relative surface deformation through time, from 1992 through 2007, for both ascending and descending satellite orbits. The general temporal behavior is one of subsidence punctuated by several lesser uplift episodes. The spatial pattern of deformation can be modeled through simple inflation/deflation sources in an elastic halfspace. Given the evidence suggesting that fluids may play a significant role in the temporal deformation of Campi Flegrei, rather than a purely magmatic or magma chamber-based interpretation, we model the temporal and spatial evolution of surface deformation as a hydrothermal fluid flow process. We use the TOUGH2-BIOT2 set of numerical codes [Pruess et al., 1999; Hsieh, 1996], which couple multi-phase (liquid-gas) and multi-component (H2O-CO2) fluid flow in a porous or fractured medium with plane strain deformation and fluid flow in a linearly elastic porous medium.
We explore parameters related to the depth and temporal history of fluid injection, fluid composition, circulation geometries, and the physical properties of the media, to explain the InSAR time series. References: Berardino, P., R. Lanari, E. Sansosti (2002), A new algorithm for surface deformation monitoring based on small baseline differential SAR interferograms, IEEE Transactions on Geoscience and Remote Sensing, 40, 11, 2375-2383. Pruess, K., C. Oldenburg, and G. Moridis (1999), TOUGH2 user's guide, version 2.0, Paper LBNL-43134, Lawrence Berkeley Natl. Lab., Berkeley, Calif. Hsieh, P. A. (1996), Deformation-induced changes in hydraulic head during ground-water withdrawal, Ground Water, 34, 1082-1089.

  4. Characterisation of an epigenetically altered CD4+ CD28+ Kir+ T cell subset in autoimmune rheumatic diseases by multiparameter flow cytometry

    PubMed Central

    Strickland, Faith M; Patel, Dipak; Somers, Emily; Robida, Aaron M; Pihalja, Michael; Swartz, Richard; Marder, Wendy; Richardson, Bruce

    2016-01-01

    Objectives Antigen-specific CD4+ T cells epigenetically modified with DNA methylation inhibitors overexpress genes normally suppressed by this mechanism, including CD11a, CD70, CD40L and the KIR gene family. The altered cells become autoreactive, losing restriction for nominal antigen and responding to self-class II major histocompatibility complex (MHC) molecules without added antigen, and are sufficient to cause a lupus-like disease in syngeneic mice. T cells overexpressing the same genes are found in patients with active lupus. Whether these genes are co-overexpressed on the same or different cells is unknown. The goal of this study was to determine whether these genes are overexpressed on the same or different T cells and whether this subset of CD4+ T cells is also present in patients with lupus and other rheumatic diseases. Methods Multicolour flow cytometry was used to compare CD11a, CD70, CD40L and KIR expression on CD3+CD4+CD28+ T cells to their expression on experimentally demethylated CD3+CD4+CD28+ T cells and CD3+CD4+CD28+ T cells from patients with active lupus and other autoimmune diseases. Results Experimentally demethylated CD4+ T cells and T cells from patients with active lupus have a CD3+CD4+CD28+CD11ahiCD70+CD40LhiKIR+ subset, and the subset size is proportional to lupus flare severity. A similar subset is found in patients with other rheumatic diseases including rheumatoid arthritis, systemic sclerosis and Sjögren's syndrome but not retroperitoneal fibrosis. Conclusions Patients with active autoimmune rheumatic diseases have a previously undescribed CD3+CD4+CD28+CD11ahiCD70+CD40LhiKIR+ T cell subset. This subset may play an important role in flares of lupus and related autoimmune rheumatic diseases, provide a biomarker for disease activity and serve as a novel therapeutic target for the treatment of lupus flares. PMID:27099767

  5. Unbiased feature selection in learning random forests for high-dimensional data.

    PubMed

    Nguyen, Thanh-Tung; Huang, Joshua Zhexue; Nguyen, Thuy Thi

    2015-01-01

    Random forests (RFs) have been widely used as a powerful classification method. However, with the randomization in both bagging samples and feature selection, the trees in the forest tend to select uninformative features for node splitting, which gives RFs poor accuracy when working with high-dimensional data. RFs also exhibit bias in the feature selection process, favoring multivalued features. Aiming at debiasing feature selection in RFs, we propose a new RF algorithm, called xRF, to select good features in learning RFs for high-dimensional data. We first remove the uninformative features using p-value assessment, and the subset of unbiased features is then selected based on some statistical measures. This feature subset is then partitioned into two subsets. A feature weighting sampling technique is used to sample features from these two subsets for building trees. This approach enables one to generate more accurate trees, while also reducing dimensionality and the amount of data needed for learning RFs. An extensive set of experiments has been conducted on 47 high-dimensional real-world datasets, including image datasets. The experimental results show that RFs with the proposed approach outperformed existing random forests in both accuracy and AUC.
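
    The first xRF step, removing uninformative features by a univariate p-value assessment before growing the forest, can be approximated with standard tools. Here an ANOVA F-test via scikit-learn on synthetic data stands in; the paper's own statistical measures and weighted feature sampling are not reproduced:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=200, n_informative=10,
                           n_redundant=5, random_state=0)

# Step 1 (simplified): drop features whose univariate association
# with the class label is not significant.
_, pvals = f_classif(X, y)
keep = pvals < 0.05

rf = RandomForestClassifier(n_estimators=200, random_state=0)
acc_all = cross_val_score(rf, X, y, cv=5).mean()
acc_kept = cross_val_score(rf, X[:, keep], y, cv=5).mean()
```

    With most of the 200 features pure noise, the filtered forest trains on a much smaller feature set; comparing `acc_all` with `acc_kept` shows how much the noise features were hurting node splitting.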

  6. Influence of Cognitive Functioning on Age-Related Performance Declines in Visuospatial Sequence Learning.

    PubMed

    Krüger, Melanie; Hinder, Mark R; Puri, Rohan; Summers, Jeffery J

    2017-01-01

    Objectives: The aim of this study was to investigate how age-related performance differences in a visuospatial sequence learning task relate to age-related declines in cognitive functioning. Method: Cognitive functioning of 18 younger and 18 older participants was assessed using a standardized test battery. Participants then undertook a perceptual visuospatial sequence learning task. Various relationships between sequence learning and participants' cognitive functioning were examined through correlation and factor analysis. Results: Older participants exhibited significantly lower performance than their younger counterparts in the sequence learning task as well as in multiple cognitive functions. Factor analysis revealed two independent subsets of cognitive functions associated with performance in the sequence learning task, related to either the processing and storage of sequence information (first subset) or problem solving (second subset). Age-related declines were only found for the first subset of cognitive functions, which also explained a significant degree of the performance differences in the sequence learning task between age-groups. Discussion: The results suggest that age-related performance differences in perceptual visuospatial sequence learning can be explained by declines in the ability to process and store sequence information in older adults, while a set of cognitive functions related to problem solving mediates performance differences independent of age.

  7. Formal functional test designs with a test representation language

    NASA Technical Reports Server (NTRS)

    Hops, J. M.

    1993-01-01

    The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
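
    The category-partition idea (enumerate one choice per category, then prune combinations that violate constraints) can be sketched as below; the "file copy" categories and the constraint are hypothetical, and the paper's tool uses a dedicated test representation language rather than Python:

```python
from itertools import product

# Hypothetical categories and partitioned choices for a "file copy" command.
categories = {
    "source":      ["exists", "missing"],
    "destination": ["empty", "non_empty"],
    "size":        ["zero", "small", "huge"],
}

def generate_frames(categories, constraints=()):
    """Enumerate the cross-product of choices, then prune frames that
    violate any constraint; the survivors are the minimal logical set
    of test frames."""
    names = list(categories)
    for combo in product(*categories.values()):
        frame = dict(zip(names, combo))
        if not any(bad(frame) for bad in constraints):
            yield frame

# Constraint: a missing source makes size choices meaningless except "zero".
constraints = [lambda f: f["source"] == "missing" and f["size"] != "zero"]
frames = list(generate_frames(categories, constraints))
```

    Here the unconstrained product has 12 frames; the constraint prunes it to 8, illustrating how constraints reduce the total to a logical subset.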

  8. Predictive equations for the estimation of body size in seals and sea lions (Carnivora: Pinnipedia)

    PubMed Central

    Churchill, Morgan; Clementz, Mark T; Kohno, Naoki

    2014-01-01

    Body size plays an important role in pinniped ecology and life history. However, body size data is often absent for historical, archaeological, and fossil specimens. To estimate the body size of pinnipeds (seals, sea lions, and walruses) for today and the past, we used 14 commonly preserved cranial measurements to develop sets of single variable and multivariate predictive equations for pinniped body mass and total length. Principal components analysis (PCA) was used to test whether separate family specific regressions were more appropriate than single predictive equations for Pinnipedia. The influence of phylogeny was tested with phylogenetic independent contrasts (PIC). The accuracy of these regressions was then assessed using a combination of coefficient of determination, percent prediction error, and standard error of estimation. Three different methods of multivariate analysis were examined: bidirectional stepwise model selection using Akaike information criteria; all-subsets model selection using Bayesian information criteria (BIC); and partial least squares regression. The PCA showed clear discrimination between Otariidae (fur seals and sea lions) and Phocidae (earless seals) for the 14 measurements, indicating the need for family-specific regression equations. The PIC analysis found that phylogeny had a minor influence on relationship between morphological variables and body size. The regressions for total length were more accurate than those for body mass, and equations specific to Otariidae were more accurate than those for Phocidae. Of the three multivariate methods, the all-subsets approach required the fewest number of variables to estimate body size accurately. We then used the single variable predictive equations and the all-subsets approach to estimate the body size of two recently extinct pinniped taxa, the Caribbean monk seal (Monachus tropicalis) and the Japanese sea lion (Zalophus japonicus). 
Body size estimates using single-variable regressions generally under- or over-estimated body size; however, the all-subsets regression produced body size estimates that were close to historically recorded body lengths for these two species. This indicates that the all-subsets regression equations developed in this study can estimate body size accurately. PMID:24916814
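
    A minimal sketch of all-subsets model selection scored by BIC, in the spirit of (though far simpler than) the analysis described above; the toy predictors `x1`/`x2`, the exact-fit clamp, and the stdlib-only OLS solver are illustrative assumptions:

```python
import itertools, math

def ols_rss(X, y):
    """Residual sum of squares of a least-squares fit with intercept,
    solved via normal equations and Gaussian elimination (stdlib only)."""
    n = len(y)
    cols = [[1.0] * n] + X                      # design matrix columns
    k = len(cols)
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    for i in range(k):                          # elimination with partial pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [arc - f * aic for arc, aic in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):                # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((y[t] - sum(beta[i] * cols[i][t] for i in range(k))) ** 2
               for t in range(n))

def best_subset_bic(features, y):
    """Score every non-empty predictor subset by BIC and keep the best."""
    n, best = len(y), (math.inf, None)
    for r in range(1, len(features) + 1):
        for names in itertools.combinations(features, r):
            rss = ols_rss([features[m] for m in names], y)
            bic = n * math.log(max(rss, 1e-12) / n) + (r + 1) * math.log(n)
            best = min(best, (bic, names))
    return best

# Toy data: the response depends on x1 only; x2 is a correlated distractor.
feats = {"x1": [1, 2, 3, 4, 5, 6, 7, 8],
         "x2": [2, 1, 4, 3, 6, 5, 8, 7]}
y = [2 * v + 1 for v in feats["x1"]]
bic, chosen = best_subset_bic(feats, y)
```

    BIC's complexity penalty is what lets the all-subsets approach favor the smallest subset that fits well, mirroring the paper's finding that it needed the fewest variables.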

  9. Identification of selection signatures in cattle breeds selected for dairy production.

    PubMed

    Stella, Alessandra; Ajmone-Marsan, Paolo; Lazzari, Barbara; Boettcher, Paul

    2010-08-01

    The genomics revolution has spurred the undertaking of HapMap studies of numerous species, allowing for population genomics to increase the understanding of how selection has created genetic differences between subspecies populations. The objectives of this study were to (1) develop an approach to detect signatures of selection in subsets of phenotypically similar breeds of livestock by comparing single nucleotide polymorphism (SNP) diversity between the subset and a larger population, (2) verify this method in breeds selected for simply inherited traits, and (3) apply this method to the dairy breeds in the International Bovine HapMap (IBHM) study. The data consisted of genotypes for 32,689 SNPs of 497 animals from 19 breeds. For a given subset of breeds, the test statistic was the parametric composite log likelihood (CLL) of the differences in allelic frequencies between the subset and the IBHM for a sliding window of SNPs. The null distribution was obtained by calculating CLL for 50,000 random subsets (per chromosome) of individuals. The validity of this approach was confirmed by obtaining extremely large CLLs at the sites of causative variation for polled (BTA1) and black-coat-color (BTA18) phenotypes. Across the 30 bovine chromosomes, 699 putative selection signatures were detected. The largest CLL was on BTA6 and corresponded to KIT, which is responsible for the piebald phenotype present in four of the five dairy breeds. Potassium channel-related genes were at the site of the largest CLL on three chromosomes (BTA14, -16, and -25) whereas integrins (BTA18 and -19) and serine/arginine rich splicing factors (BTA20 and -23) each had the largest CLL on two chromosomes. On the basis of the results of this study, the application of population genomics to farm animals seems quite promising. 
Comparisons between breed groups have the potential to identify genomic regions influencing complex traits with no need for complex equipment and the collection of extensive phenotypic records and can contribute to the identification of candidate genes and to the understanding of the biological mechanisms controlling complex traits.
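
    The sliding-window composite log likelihood scan can be sketched as follows; the normal approximation, window length, and toy allele frequencies are assumptions for illustration, and the study's exact parametric form and its 50,000-random-subset null distribution differ in detail:

```python
import math

def window_cll(sub_freq, pop_freq, window=5):
    """Composite log likelihood of subset allele frequencies under the
    population frequencies, summed over a sliding window of SNPs.
    A normal approximation is used purely for illustration."""
    def snp_ll(ps, pp):
        var = max(pp * (1 - pp), 1e-6)
        return -0.5 * ((ps - pp) ** 2 / var + math.log(2 * math.pi * var))
    lls = [snp_ll(s, p) for s, p in zip(sub_freq, pop_freq)]
    return [sum(lls[i:i + window]) for i in range(len(lls) - window + 1)]

# 30 SNPs; the breed subset diverges sharply at SNPs 10-14 (a putative sweep).
pop = [0.5] * 30
sub = [0.95 if 10 <= i < 15 else 0.5 for i in range(30)]
stats = window_cll(sub, pop)
peak = stats.index(min(stats))   # window with the most extreme statistic
```

    In the study, each window's statistic would be compared against a null distribution built from composite log likelihoods of random subsets of individuals, rather than against a fixed threshold.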

  10. Immune cell response to strenuous resistive breathing: comparison with whole body exercise and the effects of antioxidants

    PubMed Central

    Karatza, Maria-Helena; Vasileiou, Spyridoula; Katsaounou, Paraskevi; Mastora, Zafeiria

    2018-01-01

    Background/hypothesis Whole body exercise (WBE) changes lymphocyte subset percentages in peripheral blood. Resistive breathing, a hallmark of diseases of airway obstruction, is a form of exercise for the inspiratory muscles. Strenuous muscle contractions induce oxidative stress that may mediate immune alterations following exercise. We hypothesized that inspiratory resistive breathing (IRB) alters peripheral blood lymphocyte subsets and that oxidative stress mediates lymphocyte subpopulation alterations following both WBE and IRB. Patients and methods Six healthy nonathletes performed two WBE and two IRB sessions for 45 minutes at 70% of VO2 maximum and 70% of maximum inspiratory pressure (Pimax), respectively, before and after the administration of antioxidants (vitamins E, A, and C for 75 days, allopurinol for 30 days, and N-acetylcysteine for 3 days). Blood was drawn at baseline, at the end of each session, and 2 hours into recovery. Lymphocyte subsets were determined by flow cytometry. Results Before antioxidant supplementation at both WBE end and IRB end, the natural killer cell percentage increased, the T helper cell (CD3+ CD4+) percentage was reduced, and the CD4/CD8 ratio was depressed, a response which was abolished by antioxidants only after IRB. Furthermore, at IRB end, antioxidants promoted CD8+ CD38+ and blunted cytotoxic T-cell percentage increase. CD8+ CD45RA+ cell percentage changes were blunted after antioxidant supplementation in both WBE and IRB. Conclusion We conclude that IRB produces (as WBE) changes in peripheral blood lymphocyte subsets and that oxidative stress is a major stimulus predominantly for IRB-induced lymphocyte subset alterations. PMID:29445271

  11. Occupational exposure to formaldehyde and alterations in lymphocyte subsets

    PubMed Central

    Hosgood, H. Dean; Zhang, Luoping; Tang, Xiaojiang; Vermeulen, Roel; Hao, Zhenyue; Shen, Min; Qiu, Chuangyi; Ge, Yichen; Hua, Ming; Ji, Zhiying; Li, Senhua; Xiong, Jun; Reiss, Boris; Liu, Songwang; Xin, Kerry X.; Azuma, Mariko; Xie, Yuxuan; Freeman, Laura Beane; Ruan, Xiaolin; Guo, Weihong; Galvan, Noe; Blair, Aaron; Li, Laiyu; Huang, Hanlin; Smith, Martyn T.; Rothman, Nathaniel; Lan, Qing

    2012-01-01

    Background Formaldehyde is used in many occupational settings, most notably in manufacturing, health care, and embalming. Formaldehyde has been classified as a human carcinogen, but its mechanism of action remains uncertain. Methods We carried out a cross-sectional study of 43 formaldehyde exposed-workers and 51 unexposed age and sex-matched controls in Guangdong, China to study formaldehyde’s early biologic effects. To follow-up our previous report that the total lymphocyte count was decreased in formaldehyde-exposed workers compared to controls, we evaluated each major lymphocyte subset (i.e., CD4+ T cells, CD8+ T cells, natural killer (NK) cells, and B cells) and T cell lymphocyte subset (CD4+ naïve and memory T cells, CD8+ naïve and memory T cells, and regulatory T cells). Linear regression of each subset was used to test for differences between exposed workers and controls, adjusting for potential confounders. Results Total NK cell and T cell counts were about 24% (p=0.037) and 16% (p=0.0042) lower, respectively, among exposed workers. Among certain T cell subsets, decreased counts among exposed workers were observed for CD8+ T cells (p=0.026), CD8+ effector memory T cells (p=0.018), and regulatory T cells (CD4+FoxP3+: p=0.04; CD25+FoxP3+: p=0.008). Conclusions Formaldehyde exposed-workers experienced decreased counts of NK cells, regulatory T cells, and CD8+ effector memory T cells; however, due to the small sample size these findings need to be confirmed in larger studies. PMID:22767408

  12. Classification of Medical Datasets Using SVMs with Hybrid Evolutionary Algorithms Based on Endocrine-Based Particle Swarm Optimization and Artificial Bee Colony Algorithms.

    PubMed

    Lin, Kuan-Cheng; Hsieh, Yi-Hsiu

    2015-10-01

    The classification and analysis of data is an important issue in today's research. Selecting a suitable set of features makes it possible to classify an enormous quantity of data quickly and efficiently. Feature selection is generally framed as a feature subset selection problem, a form of combinatorial optimization. Evolutionary algorithms using random search methods have proven highly effective in obtaining solutions to optimization problems in a diversity of applications. In this study, we developed a hybrid evolutionary algorithm based on endocrine-based particle swarm optimization (EPSO) and artificial bee colony (ABC) algorithms in conjunction with a support vector machine (SVM) for the selection of optimal feature subsets for the classification of datasets. The results of experiments using specific UCI medical datasets demonstrate that the accuracy of the proposed hybrid evolutionary algorithm is superior to that of basic PSO, EPSO and ABC algorithms with regard to classification accuracy using subsets with a reduced number of features.

  13. Random-subset fitting of digital holograms for fast three-dimensional particle tracking [invited].

    PubMed

    Dimiduk, Thomas G; Perry, Rebecca W; Fung, Jerome; Manoharan, Vinothan N

    2014-09-20

    Fitting scattering solutions to time series of digital holograms is a precise way to measure three-dimensional dynamics of microscale objects such as colloidal particles. However, this inverse-problem approach is computationally expensive. We show that the computational time can be reduced by an order of magnitude or more by fitting to a random subset of the pixels in a hologram. We demonstrate our algorithm on experimentally measured holograms of micrometer-scale colloidal particles, and we show that 20-fold increases in speed, relative to fitting full frames, can be attained while introducing errors in the particle positions of 10 nm or less. The method is straightforward to implement and works for any scattering model. It also enables a parallelization strategy wherein random-subset fitting is used to quickly determine initial guesses that are subsequently used to fit full frames in parallel. This approach may prove particularly useful for studying rare events, such as nucleation, that can only be captured with high frame rates over long times.
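
    The random-subset trick can be illustrated with a toy grid-search fit; the Gaussian "hologram" model, the single width parameter, and the 10% sampling fraction are assumptions standing in for a real scattering model and optimizer:

```python
import math, random

N = 40  # synthetic image is N x N pixels

def model(p, width):
    """Radially symmetric intensity pattern standing in for a scattering model."""
    x, y = p
    return math.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / (2 * width ** 2))

true_width = 5.0
image = {(x, y): model((x, y), true_width) for x in range(N) for y in range(N)}

def fit_width(pixels, widths):
    """Grid-search the width minimizing squared error over `pixels` only."""
    return min(widths, key=lambda w: sum((image[p] - model(p, w)) ** 2
                                         for p in pixels))

rng = random.Random(1)
all_pixels = list(image)
subset = rng.sample(all_pixels, len(all_pixels) // 10)  # fit on ~10% of pixels
widths = [w / 2 for w in range(4, 17)]                  # candidates 2.0 .. 8.0
best_subset = fit_width(subset, widths)  # ~10x fewer residual evaluations
best_full = fit_width(all_pixels, widths)
```

    Because each objective evaluation touches only the sampled pixels, the cost per fit drops roughly in proportion to the subset size, which is the source of the reported speedups.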

  14. CD27 natural killer cell subsets play different roles during the pre-onset stage of experimental autoimmune encephalomyelitis.

    PubMed

    Gao, Ming; Yang, Yan; Li, Daling; Ming, Bingxia; Chen, Huoying; Sun, Yan; Xiao, Yifan; Lai, Lin; Zou, Huijuan; Xu, Yong; Xiong, Ping; Tan, Zheng; Gong, Feili; Zheng, Fang

    2016-08-01

    NK cells participate in the development of human multiple sclerosis (MS) and mouse experimental autoimmune encephalomyelitis (EAE), but the roles of different NK cell subsets in disease onset remain poorly understood. In this study, murine NK cells were divided into CD27(high) and CD27(low/-) subsets. The CD27(high) subset was decreased and the CD27(low/-) subset was increased in lymphoid organs during the pre-onset stage of EAE. Compared with the counterpart in naïve mice, the CD27(high) subset showed lower expression of Ly49D, Ly49H and NKG2D, and less production of IFN-γ, whereas the CD27(low/-) subset showed similar expression of the above mentioned surface receptors but higher cytotoxic activity in EAE mice. Compared with the CD27(high) subset, the CD27(low/-) subset exhibited increased promotion of DC maturation and no significant inhibition of T cell proliferation or Th17 cell differentiation in vitro. Additionally, adoptive transfer of the CD27(low/-) subset, but not the CD27(high) subset, exacerbated the severity of EAE. Collectively, our data suggest the CD27 NK cell subsets play different roles in controlling EAE onset, providing a new understanding of the regulation of NK cell subsets in early autoimmune disease. © The Author(s) 2016.

  15. A systems biology approach to the analysis of subset-specific responses to lipopolysaccharide in dendritic cells.

    PubMed

    Hancock, David G; Shklovskaya, Elena; Guy, Thomas V; Falsafi, Reza; Fjell, Chris D; Ritchie, William; Hancock, Robert E W; Fazekas de St Groth, Barbara

    2014-01-01

    Dendritic cells (DCs) are critical for regulating CD4 and CD8 T cell immunity, controlling Th1, Th2, and Th17 commitment, generating inducible Tregs, and mediating tolerance. It is believed that distinct DC subsets have evolved to control these different immune outcomes. However, how DC subsets mount different responses to inflammatory and/or tolerogenic signals in order to accomplish their divergent functions remains unclear. Lipopolysaccharide (LPS) provides an excellent model for investigating responses in closely related splenic DC subsets, as all subsets express the LPS receptor TLR4 and respond to LPS in vitro. However, previous studies of the LPS-induced DC transcriptome have been performed only on mixed DC populations. Moreover, comparisons of the in vivo response of two closely related DC subsets to LPS stimulation have not been reported in the literature to date. We compared the transcriptomes of murine splenic CD8 and CD11b DC subsets after in vivo LPS stimulation, using RNA-Seq and systems biology approaches. We identified subset-specific gene signatures, which included multiple functional immune mediators unique to each subset. To explain the observed subset-specific differences, we used a network analysis approach. While both DC subsets used a conserved set of transcription factors and major signalling pathways, the subsets showed differential regulation of sets of genes that 'fine-tune' the network hubs expressed in common. We propose a model in which signalling through common pathway components is 'fine-tuned' by transcriptional control of subset-specific modulators, thus allowing for distinct functional outcomes in closely related DC subsets. We extend this analysis to comparable datasets from the literature and confirm that our model can account for cell subset-specific responses to LPS stimulation in multiple subpopulations in mouse and man.

  16. Input Decimated Ensembles

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Oza, Nikunj C.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Using an ensemble of classifiers instead of a single classifier has been shown to improve generalization performance in many pattern recognition problems. However, the extent of such improvement depends greatly on the amount of correlation among the errors of the base classifiers. Therefore, reducing those correlations while keeping the classifiers' performance levels high is an important area of research. In this article, we explore input decimation (ID), a method which selects feature subsets for their ability to discriminate among the classes and uses them to decouple the base classifiers. We provide a summary of the theoretical benefits of correlation reduction, along with results of our method on two underwater sonar data sets, three benchmarks from the Proben1/UCI repositories, and two synthetic data sets. The results indicate that input decimated ensembles (IDEs) outperform ensembles whose base classifiers use all the input features; randomly selected subsets of features; and features created using principal components analysis, on a wide range of domains.

  17. Multiple descriptions based on multirate coding for JPEG 2000 and H.264/AVC.

    PubMed

    Tillo, Tammam; Baccaglini, Enrico; Olmo, Gabriella

    2010-07-01

    Multiple description coding (MDC) makes use of redundant representations of multimedia data to achieve resiliency. Descriptions should be generated so that the quality obtained when decoding a subset of them only depends on their number and not on the particular received subset. In this paper, we propose a method based on the principle of encoding the source at several rates, and properly blending the data encoded at different rates to generate the descriptions. The aim is to achieve efficient redundancy exploitation, and easy adaptation to different network scenarios by means of fine tuning of the encoder parameters. We apply this principle to both JPEG 2000 images and H.264/AVC video data. We consider as the reference scenario the distribution of contents on application-layer overlays with multiple-tree topology. The experimental results reveal that our method favorably compares with state-of-the-art MDC techniques.
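
    A minimal sketch of the multirate blending idea: each sample is encoded at a fine and a coarse rate, and the two copies are distributed across descriptions so that side quality depends mainly on how many descriptions arrive, not which. The scalar quantizer and the step sizes are arbitrary illustrative choices, far simpler than JPEG 2000 or H.264/AVC coding:

```python
def quantize(x, step):
    return round(x / step) * step

def encode_mdc(samples, fine=1, coarse=8):
    """Two balanced descriptions: each sample is carried at the fine rate
    in one description and at the coarse rate in the other, with the
    roles alternating so the streams are equally useful on their own."""
    d0 = [quantize(s, fine if i % 2 == 0 else coarse) for i, s in enumerate(samples)]
    d1 = [quantize(s, coarse if i % 2 == 0 else fine) for i, s in enumerate(samples)]
    return d0, d1

def decode(received, n):
    """Use the finest copy available for every sample."""
    if len(received) == 2:
        d0, d1 = received
        return [d0[i] if i % 2 == 0 else d1[i] for i in range(n)]
    return list(received[0])  # single description: half fine, half coarse

samples = [3, 14, 159, 26, 53, 58]
d0, d1 = encode_mdc(samples)
central = decode([d0, d1], len(samples))  # both received: full quality
side = decode([d0], len(samples))         # one lost: degraded but usable
```

    The coarse copies are the controlled redundancy: shrinking the coarse step raises side quality at the price of a higher total rate, which is the tuning knob the paper exploits.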

  18. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
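
    The random-set bounding step can be sketched with plain Monte Carlo (the paper replaces this sampler with subset simulation for efficiency); the limit state `g`, the Gaussian interval midpoint, and the fixed half-width of the focal elements are illustrative assumptions:

```python
import random

def g(x):
    """Limit state: failure when g(x) < 0, i.e. when the demand x exceeds 3."""
    return 3.0 - x

def failure_bounds(n, rng, half_width=0.3):
    """Draw interval-valued samples (focal elements of a random set) and
    count sure failures (whole interval inside the failure domain -> lower
    bound) and possible failures (interval intersects it -> upper bound)."""
    sure = possible = 0
    for _ in range(n):
        c = rng.gauss(2.0, 0.5)              # uncertain midpoint
        a, b = c - half_width, c + half_width
        if g(a) < 0:                         # g decreasing: g(a) is the max on [a, b]
            sure += 1
        if g(b) < 0:                         # g(b) is the min on [a, b]
            possible += 1
    return sure / n, possible / n

p_lo, p_hi = failure_bounds(20000, random.Random(42))
```

    The gap between the two bounds reflects the epistemic (interval) uncertainty; for very small failure probabilities this plain sampler becomes wasteful, which is exactly where subset simulation pays off.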

  19. Surveillance system and method having parameter estimation and operating mode partitioning

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor)

    2003-01-01

    A system and method for monitoring an apparatus or process asset including partitioning an unpartitioned training data set into a plurality of training data subsets each having an operating mode associated thereto; creating a process model comprised of a plurality of process submodels each trained as a function of at least one of the training data subsets; acquiring a current set of observed signal data values from the asset; determining an operating mode of the asset for the current set of observed signal data values; selecting a process submodel from the process model as a function of the determined operating mode of the asset; calculating a current set of estimated signal data values from the selected process submodel for the determined operating mode; and outputting the calculated current set of estimated signal data values for providing asset surveillance and/or control.
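
    The partition-train-select loop of the claim can be sketched as follows; the per-mode mean "submodel" and the nearest-mean mode determination are deliberately trivial stand-ins for the patented estimator:

```python
from collections import defaultdict
from statistics import mean

def train(observations):
    """Partition the unpartitioned training data by operating mode and fit
    one trivial submodel (the per-mode mean of the signal) for each mode."""
    by_mode = defaultdict(list)
    for mode, signal in observations:
        by_mode[mode].append(signal)
    return {mode: mean(vals) for mode, vals in by_mode.items()}

def estimate(model, signal):
    """Determine the operating mode of the current observation (nearest
    submodel) and return that submodel's estimated signal value."""
    mode = min(model, key=lambda m: abs(model[m] - signal))
    return mode, model[mode]

training = [("idle", 1.0), ("idle", 1.2), ("load", 9.8), ("load", 10.2)]
model = train(training)
mode, est = estimate(model, 9.5)
```

    In the patented system the submodels are full process models and the estimates feed surveillance or control logic; the structure of partition, mode determination, and submodel selection is the same.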

  20. Mining nutrigenetics patterns related to obesity: use of parallel multifactor dimensionality reduction.

    PubMed

    Karayianni, Katerina N; Grimaldi, Keith A; Nikita, Konstantina S; Valavanis, Ioannis K

    2015-01-01

    This paper aims to elucidate the complex etiology underlying obesity by analysing data from a large nutrigenetics study, in which nutritional and genetic factors associated with obesity were recorded for around two thousand individuals. In our previous work, these data were analysed using artificial neural network methods, which identified optimised subsets of factors to predict one's obesity status. However, these methods did not reveal how the selected factors interact with each other in the obtained predictive models. For that reason, parallel Multifactor Dimensionality Reduction (pMDR) was used here to further analyse the pre-selected subsets of nutrigenetic factors. Within pMDR, predictive models using up to eight factors were constructed, further reducing the input dimensionality, while rules describing the interactive effects of the selected factors were derived. In this way, it was possible to identify specific genetic variations and their interactive effects with particular nutritional factors, which are now under further study.

  1. Scalable amplification of strand subsets from chip-synthesized oligonucleotide libraries

    PubMed Central

    Schmidt, Thorsten L.; Beliveau, Brian J.; Uca, Yavuz O.; Theilmann, Mark; Da Cruz, Felipe; Wu, Chao-Ting; Shih, William M.

    2015-01-01

    Synthetic oligonucleotides are the main cost factor for studies in DNA nanotechnology, genetics and synthetic biology, which all require thousands of these at high quality. Inexpensive chip-synthesized oligonucleotide libraries can contain hundreds of thousands of distinct sequences, however only at sub-femtomole quantities per strand. Here we present a selective oligonucleotide amplification method, based on three rounds of rolling-circle amplification, that produces nanomole amounts of single-stranded oligonucleotides per millilitre reaction. In a multistep one-pot procedure, subsets of hundreds or thousands of single-stranded DNAs with different lengths can selectively be amplified and purified together. These oligonucleotides are used to fold several DNA nanostructures and as primary fluorescence in situ hybridization probes. The amplification cost is lower than other reported methods (typically around US$ 20 per nanomole total oligonucleotides produced) and is dominated by the use of commercial enzymes. PMID:26567534

  2. Domain decomposition by the advancing-partition method for parallel unstructured grid generation

    NASA Technical Reports Server (NTRS)

    Banihashemi, legal representative, Soheila (Inventor); Pirzadeh, Shahyar Z. (Inventor)

    2012-01-01

    In a method for domain decomposition for generating unstructured grids, a surface mesh is generated for a spatial domain. A location of a partition plane dividing the domain into two sections is determined. Triangular faces on the surface mesh that intersect the partition plane are identified. A partition grid of tetrahedral cells, dividing the domain into two sub-domains, is generated using a marching process in which a front comprises only faces of new cells which intersect the partition plane. The partition grid is generated until no active faces remain on the front. Triangular faces on each side of the partition plane are collected into two separate subsets. Each subset of triangular faces is renumbered locally and a local/global mapping is created for each sub-domain. A volume grid is generated for each sub-domain. The partition grid and volume grids are then merged using the local-global mapping.

  3. Measurement of "total" microcystins using the MMPB/LC/MS ...

    EPA Pesticide Factsheets

    The detection and quantification of microcystins, a family of toxins associated with harmful algal blooms, is complicated by their structural diversity and a lack of commercially available analytical standards for method development. As a result, most detection methods have focused on either a subset of microcystin congeners, as in US EPA Method 544, or on techniques which are sensitive to structural features common to most microcystins, as in the anti-ADDA ELISA method. A recent development has been the use of 2-methyl-3-methoxy-4-phenylbutyric acid (MMPB), which is produced by chemical oxidation of the ADDA moiety in most microcystin congeners, as a proxy for the sum of congeners present. Conditions for the MMPB derivatization were evaluated and applied to water samples obtained from various HAB-impacted surface waters, and results were compared with congener-based LC/MS/MS and ELISA methods.

  4. A two-step database search method improves sensitivity in peptide sequence matches for metaproteomics and proteogenomics studies.

    PubMed

    Jagtap, Pratik; Goslinga, Jill; Kooren, Joel A; McGowan, Thomas; Wroblewski, Matthew S; Seymour, Sean L; Griffin, Timothy J

    2013-04-01

    Large databases (>10^6 sequences) used in metaproteomic and proteogenomic studies present challenges in matching peptide sequences to MS/MS data using database-search programs. Most notably, strict filtering to avoid false-positive matches leads to more false negatives, thus constraining the number of peptide matches. To address this challenge, we developed a two-step method wherein matches derived from a primary search against a large database were used to create a smaller subset database. The second search was performed against a target-decoy version of this subset database merged with a host database. High confidence peptide sequence matches were then used to infer protein identities. Applying our two-step method for both metaproteomic and proteogenomic analysis resulted in twice the number of high confidence peptide sequence matches in each case, as compared to the conventional one-step method. The two-step method captured almost all of the same peptides matched by the one-step method, with a majority of the additional matches being false negatives from the one-step method. Furthermore, the two-step method improved results regardless of the database search program used. Our results show that our two-step method maximizes the peptide matching sensitivity for applications requiring large databases, especially valuable for proteogenomics and metaproteomics studies. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
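
    The two-step strategy can be sketched with a toy substring "search engine"; the peptide and protein strings, the decoy construction by sequence reversal, and the hit criterion are all illustrative assumptions, far removed from real spectrum scoring:

```python
def matches(peptides, db):
    """Toy stand-in for a search engine: a peptide 'matches' a protein
    when it occurs as a substring of the protein sequence."""
    return {prot: [p for p in peptides if p in seq] for prot, seq in db.items()}

def two_step_search(peptides, big_db):
    # Step 1: primary search against the large database; any protein with
    # at least one peptide hit enters the smaller subset database.
    subset_db = {prot: seq for prot, seq in big_db.items()
                 if matches(peptides, {prot: seq})[prot]}
    # Step 2: search the subset merged with decoys (reversed sequences);
    # keep peptides that hit a target and no decoy.
    decoys = {"decoy_" + prot: seq[::-1] for prot, seq in subset_db.items()}
    hits = matches(peptides, {**subset_db, **decoys})
    confident = {p for prot, ps in hits.items()
                 if not prot.startswith("decoy_") for p in ps}
    confident -= {p for prot, ps in hits.items()
                  if prot.startswith("decoy_") for p in ps}
    return subset_db, sorted(confident)

big_db = {"protA": "MKLVPEPTIDE", "protB": "GGGGHHHH", "protC": "AAPEPKKK"}
subset_db, confident = two_step_search(["PEPTIDE", "PEPK", "XYZ"], big_db)
```

    The point of the second pass is that target-decoy filtering against the much smaller subset database admits more true matches at the same false-discovery threshold than filtering against the full database would.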

  5. The Versatility of SpAM: A Fast, Efficient, Spatial Method of Data Collection for Multidimensional Scaling

    ERIC Educational Resources Information Center

    Hout, Michael C.; Goldinger, Stephen D.; Ferguson, Ryan W.

    2013-01-01

    Although traditional methods to collect similarity data (for multidimensional scaling [MDS]) are robust, they share a key shortcoming. Specifically, the possible pairwise comparisons in any set of objects grow rapidly as a function of set size. This leads to lengthy experimental protocols, or procedures that involve scaling stimulus subsets. We…

  6. Functional heterogeneity of human effector CD8+ T cells.

    PubMed

    Takata, Hiroshi; Naruto, Takuya; Takiguchi, Masafumi

    2012-02-09

    Effector CD8(+) T cells are believed to be terminally differentiated cells having cytotoxic activity and the ability to produce effector cytokines such as IFN-γ and TNF-α. We investigated the difference between CXCR1(+) and CXCR1(-) subsets of human effector CD27(-)CD28(-)CD8(+) T cells. The subsets expressed cytolytic molecules similarly and exerted substantial cytolytic activity, whereas only the CXCR1(-) subset had IL-2 productivity and self-proliferative activity and was more resistant to cell death than the CXCR1(+) subset. These differences were explained by the specific up-regulation of CAMK4, SPRY2, and IL-7R in the CXCR1(-) subset and that of pro-apoptotic death-associated protein kinase 1 (DAPK1) in the CXCR1(+) subset. The IL-2 producers were more frequently found in the IL-7R(+) subset of the CXCR1(-) effector CD8(+) T cells than in the IL-7R(-) subset. IL-7/IL-7R signaling promoted cell survival only in the CXCR1(-) subset. The present study has highlighted a novel subset of effector CD8(+) T cells producing IL-2 and suggests the importance of this subset in the homeostasis of effector CD8(+) T cells.

  7. Ensemble variant interpretation methods to predict enzyme activity and assign pathogenicity in the CAGI4 NAGLU (Human N-acetyl-glucosaminidase) and UBE2I (Human SUMO-ligase) challenges.

    PubMed

    Yin, Yizhou; Kundu, Kunal; Pal, Lipika R; Moult, John

    2017-09-01

    CAGI (Critical Assessment of Genome Interpretation) conducts community experiments to determine the state of the art in relating genotype to phenotype. Here, we report results obtained using newly developed ensemble methods to address two CAGI4 challenges: enzyme activity for population missense variants found in NAGLU (Human N-acetyl-glucosaminidase) and random missense mutations in Human UBE2I (Human SUMO E2 ligase), assayed in a high-throughput competitive yeast complementation procedure. The ensemble methods are effective, ranked second for SUMO-ligase and third for NAGLU, according to the CAGI independent assessors. However, in common with other methods used in CAGI, there are large discrepancies between predicted and experimental activities for a subset of variants. Analysis of the structural context provides some insight into these. Post-challenge analysis shows that the ensemble methods are also effective at assigning pathogenicity for the NAGLU variants. In the clinic, providing an estimate of the reliability of pathogenic assignments is the key. We have also used the NAGLU dataset to show that ensemble methods have considerable potential for this task, and are already reliable enough for use with a subset of mutations. © 2017 Wiley Periodicals, Inc.

  8. Query3d: a new method for high-throughput analysis of functional residues in protein structures.

    PubMed

    Ausiello, Gabriele; Via, Allegra; Helmer-Citterich, Manuela

    2005-12-01

    The identification of local similarities between two protein structures can provide clues to a common function. Many different methods exist for searching for similar subsets of residues in proteins of known structure. However, the lack of functional and structural information on single residues, together with the low level of integration of this information in comparison methods, is a limitation that prevents these methods from being fully exploited in high-throughput analyses. Here we describe Query3d, a program that is both a structural DBMS (Database Management System) and a local comparison method. The method maintains a copy of all the residues of the Protein Data Bank annotated with a variety of functional and structural information. New annotations can be easily added from a variety of methods and known databases. The algorithm makes it possible to create complex queries based on the residues' function and then to compare only subsets of the selected residues. Functional information is also essential to speed up the comparison and the analysis of the results. With Query3d, users can easily obtain statistics on how many and which residues share certain properties in all proteins of known structure. At the same time, the method also finds their structural neighbours in the whole PDB. Programs and data can be accessed through the PdbFun web interface.

  9. Query3d: a new method for high-throughput analysis of functional residues in protein structures

    PubMed Central

    Ausiello, Gabriele; Via, Allegra; Helmer-Citterich, Manuela

    2005-01-01

    Background The identification of local similarities between two protein structures can provide clues to a common function. Many different methods exist for searching for similar subsets of residues in proteins of known structure. However, the lack of functional and structural information on single residues, together with the low level of integration of this information in comparison methods, is a limitation that prevents these methods from being fully exploited in high-throughput analyses. Results Here we describe Query3d, a program that is both a structural DBMS (Database Management System) and a local comparison method. The method maintains a copy of all the residues of the Protein Data Bank annotated with a variety of functional and structural information. New annotations can be easily added from a variety of methods and known databases. The algorithm makes it possible to create complex queries based on the residues' function and then to compare only subsets of the selected residues. Functional information is also essential to speed up the comparison and the analysis of the results. Conclusion With Query3d, users can easily obtain statistics on how many and which residues share certain properties in all proteins of known structure. At the same time, the method also finds their structural neighbours in the whole PDB. Programs and data can be accessed through the PdbFun web interface. PMID:16351754

  10. Target Detection Routine (TADER). User’s Guide.

    DTIC Science & Technology

    1987-09-01

    o System range capability subset (one record - omitted for standoff SLAR and penetrating system)
    o System inherent detection probability subset (IELT records, i.e., one per element type)
    o System capability modifier subset/A=1, E=1 (IELT records)
    o System capability modifier subset/A=1, E=2 (IELT records)
    o System capability modifier subset/A=2, E=1 (IELT records)
    o System capability modifier subset/A=2, E=2 (IELT records)
    Unit Data Set (one set

  11. Multiple regression based imputation for individualizing template human model from a small number of measured dimensions.

    PubMed

    Nohara, Ryuki; Endo, Yui; Murai, Akihiko; Takemura, Hiroshi; Kouchi, Makiko; Tada, Mitsunori

    2016-08-01

    Individual human models are usually created by direct 3D scanning or by deforming a template model according to measured dimensions. In this paper, we propose a method to estimate all the dimensions necessary for human model individualization (the full set) from a small number of measured dimensions (a subset) and a human dimension database. For this purpose, we solved a multiple regression equation from the dimension database, with the full-set dimensions as the objective variables and the subset dimensions as the explanatory variables. The full-set dimensions are thus obtained by simply multiplying the subset dimensions by the coefficient matrix of the regression equation. We verified the accuracy of our method by imputing hand, foot, and whole-body dimensions from their dimension databases, using leave-one-out cross validation. The mean absolute errors (MAE) between the measured and estimated dimensions were computed from 4 dimensions (hand length, breadth, middle finger breadth at proximal, and middle finger depth at proximal) in the hand, 3 dimensions (foot length, breadth, and lateral malleolus height) in the foot, and height and weight in the whole body. The average MAE of the non-measured dimensions was 4.58% in the hand, 4.42% in the foot, and 3.54% in the whole body, while that of the measured dimensions was 0.00%.
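
The imputation step described above reduces to ordinary least squares: fit regression coefficients on a dimension database, then multiply a new subject's measured subset by the coefficients. A minimal sketch under that assumption, with entirely invented toy data:

```python
# Minimal sketch of regression-based dimension imputation: predict a
# non-measured dimension from a few measured ones via ordinary least
# squares (OLS) on a dimension database. All numbers below are made up.

def fit_ols(X, y):
    """Solve (X^T X) b = X^T y by Gaussian elimination with pivoting.

    X is a list of rows, each including a leading 1.0 for the intercept.
    """
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):                      # forward elimination
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in reversed(range(p)):            # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return coef

# Toy "database": hand length, hand breadth -> middle finger length (mm)
X = [[1.0, 180.0, 80.0], [1.0, 190.0, 85.0],
     [1.0, 170.0, 75.0], [1.0, 185.0, 82.0]]
y = [75.0, 80.0, 70.0, 77.0]
coef = fit_ols(X, y)
estimate = coef[0] + coef[1] * 175.0 + coef[2] * 78.0  # impute for a new subject
```

In the paper's setting the same fit is done for every full-set dimension at once, giving one coefficient matrix that maps the measured subset to all imputed dimensions.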

  12. Nonuniform update for sparse target recovery in fluorescence molecular tomography accelerated by ordered subsets.

    PubMed

    Zhu, Dianwen; Li, Changqing

    2014-12-01

    Fluorescence molecular tomography (FMT) is a promising imaging modality and has been actively studied in the past two decades, since it can locate a specific tumor position three-dimensionally in small animals. However, it remains a challenging task to obtain fast, robust and accurate reconstructions of the fluorescent probe distribution in small animals due to the large computational burden, the noisy measurements and the ill-posed nature of the inverse problem. In this paper we propose a nonuniform preconditioning method in combination with L1 regularization and an ordered subsets technique (NUMOS) to accommodate the different updating needs at different pixels, to enhance sparsity and suppress noise, and to further boost the convergence of approximate solutions for fluorescence molecular tomography. Using both simulated data and a phantom experiment, we found that the proposed nonuniform updating method outperforms its popular uniform counterpart by obtaining a more localized, less noisy, more accurate image. The computational cost was greatly reduced as well. The ordered subsets (OS) technique provided an additional 5-fold and 3-fold speed-up for the simulation and phantom experiments, respectively, without degrading image quality. When compared with popular L1 algorithms such as the iterative soft-thresholding algorithm (ISTA) and its fast variant FISTA, NUMOS again obtains a better image in a much shorter time.
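
The two ingredients named in the abstract, an ordered-subsets sweep and L1 soft-thresholding, can be combined in a simple linear toy problem. This is an illustrative sketch of the generic OS + ISTA pattern, not the paper's NUMOS code, and the model y = A x below is invented:

```python
# Illustrative ordered-subsets (OS) iteration with L1 soft-thresholding
# for a linear model y = A x. Each sub-iteration uses only a subset of
# the measurement rows, which is where the OS speed-up comes from.

def soft_threshold(v, t):
    return [max(abs(vi) - t, 0.0) * (1 if vi > 0 else -1) for vi in v]

def os_ista(A, y, n_subsets=2, step=0.1, lam=0.01, n_iter=50):
    n_rows, n_cols = len(A), len(A[0])
    x = [0.0] * n_cols
    # interleaved row subsets, e.g. rows 0,2,... and rows 1,3,...
    subsets = [list(range(s, n_rows, n_subsets)) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:                  # one OS sub-iteration per subset
            grad = [0.0] * n_cols
            for r in rows:
                resid = sum(A[r][j] * x[j] for j in range(n_cols)) - y[r]
                for j in range(n_cols):
                    grad[j] += A[r][j] * resid
            # scale the subset gradient by n_subsets to approximate the
            # full gradient, then shrink toward zero (L1 penalty)
            x = soft_threshold(
                [x[j] - step * n_subsets * grad[j] for j in range(n_cols)],
                step * lam)
    return x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]]
y = [1.0, 0.0, 1.0, 1.0]          # consistent with the sparse solution x = (1, 0)
x_hat = os_ista(A, y)
```

NUMOS additionally applies a nonuniform (per-pixel) preconditioner to the update; the sketch above keeps a single uniform step size for clarity.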

  13. Generation of synthetic flood hydrographs by hydrological donors (SHYDONHY method)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel

    2017-04-01

    For the design of hydraulic infrastructures like dams, a design hydrograph is required in most cases. Some of its features (e.g. peak value, duration, volume) corresponding to a given return period are computed by a wide range of methods: historical records, mono- or multivariate statistical analysis, stochastic simulation, etc. Various methods have then been proposed to construct design hydrographs having such characteristics, ranging from the traditional unit hydrograph to statistical methods (Yue et al., 2002). A new method to build design hydrographs (or more generally synthetic hydrographs) is introduced here, named SHYDONHY, a French acronym for "Synthèse d'HYdrogrammes par DONneurs HYdrologiques". It is based on an extensive database of 100 000 flood hydrographs recorded at an hourly time step at 1300 gauging stations in France and Switzerland, covering a wide range of catchment sizes and climatologies. For each station, an average of two hydrographs per year of record has been selected by a peak-over-threshold (POT) method with independence criteria (Lang et al., 1999). This sampling ensures that only hydrographs of intense floods are gathered in the dataset. For a given catchment, where few or no hydrographs are available at the outlet, a sub-set of 10 "donor stations" is selected within the complete dataset, considering several criteria: proximity, size, mean annual values and regimes for both total runoff and POT-selected floods. This sub-set of stations (and their corresponding flood hydrographs) makes it possible to: • Estimate a characteristic duration of flood hydrographs (e.g. the duration for which the discharge is above 50% of the peak value). • For a given duration (e.g. one day), estimate the average peak-to-volume ratio of floods. • For a given duration and peak-to-volume ratio, generate a synthetic reference hydrograph by combining appropriate hydrographs of the sub-set.
• For a given daily discharge sequence, whether observed or generated for extreme flood estimation, generate a suitable synthetic hydrograph, also by combining selected hydrographs of the sub-set. The reliability of the method is assessed by performing a jackknife validation on the whole dataset of stations, in particular by reconstructing the hydrograph of the biggest flood of each station and comparing it to the actual one. Some applications are presented, e.g. the coupling of SHYDONHY with the SCHADEX method (Paquet et al., 2013) for the stochastic simulation of extreme reservoir levels in dams. References: Lang, M., Ouarda, T. B. M. J., & Bobée, B. (1999). Towards operational guidelines for over-threshold modeling. Journal of Hydrology, 225(3), 103-117. Paquet, E., Garavaglia, F., Garçon, R., & Gailhard, J. (2013). The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37. Yue, S., Ouarda, T. B., Bobée, B., Legendre, P., & Bruneau, P. (2002). Approach for describing statistical properties of flood hydrograph. Journal of Hydrologic Engineering, 7(2), 147-153.
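
The first building block listed above, the characteristic flood duration, is simple to compute once a duration definition is fixed. A sketch assuming the 50%-of-peak definition mentioned in the abstract, with an invented hourly hydrograph:

```python
# Characteristic flood duration: the time during which discharge stays
# above a fraction (here 50%) of the peak value. Hydrograph values are
# invented for illustration (hourly time step, m3/s).

def characteristic_duration(discharge, fraction=0.5):
    """Number of time steps with discharge above fraction * peak."""
    threshold = fraction * max(discharge)
    return sum(1 for q in discharge if q > threshold)

hydrograph = [5, 8, 20, 55, 90, 100, 80, 60, 35, 15, 8, 5]
d = characteristic_duration(hydrograph)   # hours above half the peak flow
```

In the SHYDONHY setting this statistic would be estimated from the donor stations' POT-selected hydrographs rather than from a single record.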

  14. Single-step scanner-based digital image correlation (SB-DIC) method for large deformation mapping in rubber

    NASA Astrophysics Data System (ADS)

    Goh, C. P.; Ismail, H.; Yen, K. S.; Ratnam, M. M.

    2017-01-01

    The incremental digital image correlation (DIC) method has been applied in the past to determine strain in large-deformation materials like rubber. This method is, however, prone to cumulative errors, since the total displacement is determined by combining the displacements from numerous stages of the deformation. In this work, a method of mapping large strains in rubber using DIC in a single step, without the need for a series of deformation images, is proposed. The reference subsets were deformed using deformation factors obtained from the experimentally fitted mean stress-axial stretch ratio curve and the theoretical Poisson function. The deformed reference subsets were then correlated with the deformed image after loading. The recently developed scanner-based digital image correlation (SB-DIC) method was applied to dumbbell rubber specimens to obtain the in-plane displacement fields up to 350% axial strain. Comparison of the mean axial strains determined from the single-step SB-DIC method with those from the incremental SB-DIC method showed an average difference of 4.7%. Two rectangular rubber specimens containing circular and square holes were deformed and analysed using the proposed method. The resultant strain maps from the single-step SB-DIC method were compared with the results of finite element modeling (FEM). The comparison shows that the proposed single-step SB-DIC method can be used to map the strain distribution accurately in large-deformation materials like rubber in a much shorter time than the incremental DIC method.

  15. Feature selection for wearable smartphone-based human activity recognition with able bodied, elderly, and stroke patients.

    PubMed

    Capela, Nicole A; Lemaire, Edward D; Baddour, Natalie

    2015-01-01

    Human activity recognition (HAR), using wearable sensors, is a growing area with the potential to provide valuable information on patient mobility to rehabilitation specialists. Smartphones with accelerometer and gyroscope sensors are a convenient, minimally invasive, and low cost approach for mobility monitoring. HAR systems typically pre-process raw signals, segment the signals, and then extract features to be used in a classifier. Feature selection is a crucial step in the process to reduce potentially large data dimensionality and provide viable parameters to enable activity classification. Most HAR systems are customized to an individual research group, including a unique data set, classes, algorithms, and signal features. These data sets are obtained predominantly from able-bodied participants. In this paper, smartphone accelerometer and gyroscope sensor data were collected from populations that can benefit from human activity recognition: able-bodied, elderly, and stroke patients. Data from a consecutive sequence of 41 mobility tasks (18 different tasks) were collected for a total of 44 participants. Seventy-six signal features were calculated and subsets of these features were selected using three filter-based, classifier-independent, feature selection methods (Relief-F, Correlation-based Feature Selection, Fast Correlation Based Filter). The feature subsets were then evaluated using three generic classifiers (Naïve Bayes, Support Vector Machine, j48 Decision Tree). Common features were identified for all three populations, although the stroke population subset had some differences from both able-bodied and elderly sets. Evaluation with the three classifiers showed that the feature subsets produced similar or better accuracies than classification with the entire feature set. Therefore, since these feature subsets are classifier-independent, they should be useful for developing and improving HAR systems across and within populations.
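
A toy sketch of the filter-based, classifier-independent idea described above: rank features by absolute Pearson correlation with the class label and keep the top k. This mimics the spirit of the correlation-based methods compared in the paper, not their exact implementations:

```python
# Filter-based feature selection sketch: score each feature by its
# absolute Pearson correlation with the label, independent of any
# classifier, and keep the k best. Data below are invented.

from math import sqrt

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def select_top_k(features, labels, k):
    """features: list of columns (one list per feature). Returns indices."""
    scored = [(abs(pearson(col, labels)), i) for i, col in enumerate(features)]
    return [i for _, i in sorted(scored, reverse=True)[:k]]

# Feature 0 tracks the label; feature 1 is uncorrelated noise
f0 = [0.0, 0.1, 0.9, 1.0, 0.2, 0.8]
f1 = [0.5, 0.4, 0.6, 0.5, 0.6, 0.4]
labels = [0, 0, 1, 1, 0, 1]
print(select_top_k([f0, f1], labels, 1))  # → [0]
```

Relief-F and FCBF use more sophisticated relevance and redundancy criteria, but share this structure: score features without training a classifier, then hand the selected subset to any classifier.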

  16. Feature Selection for Wearable Smartphone-Based Human Activity Recognition with Able bodied, Elderly, and Stroke Patients

    PubMed Central

    2015-01-01

    Human activity recognition (HAR), using wearable sensors, is a growing area with the potential to provide valuable information on patient mobility to rehabilitation specialists. Smartphones with accelerometer and gyroscope sensors are a convenient, minimally invasive, and low cost approach for mobility monitoring. HAR systems typically pre-process raw signals, segment the signals, and then extract features to be used in a classifier. Feature selection is a crucial step in the process to reduce potentially large data dimensionality and provide viable parameters to enable activity classification. Most HAR systems are customized to an individual research group, including a unique data set, classes, algorithms, and signal features. These data sets are obtained predominantly from able-bodied participants. In this paper, smartphone accelerometer and gyroscope sensor data were collected from populations that can benefit from human activity recognition: able-bodied, elderly, and stroke patients. Data from a consecutive sequence of 41 mobility tasks (18 different tasks) were collected for a total of 44 participants. Seventy-six signal features were calculated and subsets of these features were selected using three filter-based, classifier-independent, feature selection methods (Relief-F, Correlation-based Feature Selection, Fast Correlation Based Filter). The feature subsets were then evaluated using three generic classifiers (Naïve Bayes, Support Vector Machine, j48 Decision Tree). Common features were identified for all three populations, although the stroke population subset had some differences from both able-bodied and elderly sets. Evaluation with the three classifiers showed that the feature subsets produced similar or better accuracies than classification with the entire feature set. Therefore, since these feature subsets are classifier-independent, they should be useful for developing and improving HAR systems across and within populations. PMID:25885272

  17. Combined endeavor of Neutrosophic Set and Chan-Vese model to extract accurate liver image from CT scan.

    PubMed

    Siri, Sangeeta K; Latte, Mrityunjaya V

    2017-11-01

    Many different diseases can occur in the liver, including infections such as hepatitis, cirrhosis, cancer, and the adverse effects of medication or toxins. The foremost stage in computer-aided diagnosis of the liver is the identification of the liver region. Liver segmentation algorithms extract the liver image from scan images, which helps in virtual surgery simulation, speeding up diagnosis, accurate investigation, and surgery planning. Existing liver segmentation algorithms try to extract the exact liver image from abdominal Computed Tomography (CT) scan images. It is an open problem because of ambiguous boundaries, large variation in intensity distribution, variability of liver geometry from patient to patient, and the presence of noise. A novel approach is proposed to meet the challenges in extracting the exact liver image from abdominal CT scan images. The proposed approach consists of three phases: (1) pre-processing, (2) CT scan image transformation to a Neutrosophic Set (NS), and (3) post-processing. In pre-processing, noise is removed by a median filter. A "new structure" is designed to transform a CT scan image into the neutrosophic domain, which is expressed using three membership subsets: a True subset (T), a False subset (F), and an Indeterminacy subset (I). This transform approximately extracts the liver image structure. In the post-processing phase, a morphological operation is performed on the indeterminacy subset (I), and the Chan-Vese (C-V) model is applied with detection of an initial contour within the liver, without user intervention. This results in highly accurate liver boundary identification. Experiments show that the proposed method is effective, robust, and comparable with existing algorithms for liver segmentation of CT scan images. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Spontaneous and natural cytotoxicity receptor-mediated cytotoxicity are effector functions of distinct natural killer subsets in hepatitis C virus-infected chimpanzees.

    PubMed

    Verstrepen, B E; Nieuwenhuis, I G; Mooij, P; Bogers, W M; Boonstra, A; Koopman, G

    2016-07-01

    In humans, CD16 and CD56 are used to identify functionally distinct natural killer (NK) subsets. Due to ubiquitous CD56 expression, this marker cannot be used to distinguish between NK cell subsets in chimpanzees. Therefore, functional analysis of distinct NK subsets during hepatitis C virus (HCV) infection has never been performed in these animals. In the present study an alternative strategy was used to identify four distinct NK subsets on the basis of the expression of CD16 and CD94. The expression of activating and inhibiting surface receptors showed that these subsets resemble human NK subsets. CD107 expression was used to determine degranulation of the different subsets in naive and HCV-infected chimpanzees. In HCV-infected chimpanzees increased spontaneous cytotoxicity was observed in CD94(high/dim) CD16(pos) and CD94(low) CD16(pos) subsets. By contrast, increased natural cytotoxicity receptor (NCR)- mediated degranulation after NKp30 and NKp44 triggering was demonstrated in the CD94(dim) CD16(neg) subset. Our findings suggest that spontaneous and NCR-mediated cytotoxicity are effector functions of distinct NK subsets in HCV-infected chimpanzees. © 2016 British Society for Immunology.

  19. Partition dataset according to amino acid type improves the prediction of deleterious non-synonymous SNPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Jing; Li, Yuan-Yuan; Shanghai Center for Bioinformation Technology, Shanghai 200235

    2012-03-02

    Highlights: ► Proper dataset partition can improve the prediction of deleterious nsSNPs. ► Partition according to the original residue type at the nsSNP site is a good criterion. ► A similar strategy is expected to be promising in other machine learning problems. -- Abstract: Many non-synonymous SNPs (nsSNPs) are associated with diseases, and numerous machine learning methods have been applied to train classifiers for sorting disease-associated nsSNPs from neutral ones. The continuously accumulating nsSNP data allow us to explore better prediction approaches. In this work, we partitioned the training data into 20 subsets according to either the original or the substituted amino acid type at the nsSNP site. Using a support vector machine (SVM), training classification models on each subset resulted in an overall accuracy of 76.3% or 74.9%, depending on the partition criterion, while training on the whole dataset obtained an accuracy of only 72.6%. Moreover, when the dataset was instead divided into 20 random subsets, the corresponding accuracy was only 73.2%. Our results demonstrate that properly partitioning the whole training dataset into subsets, i.e., according to the residue type at the nsSNP site, significantly improves the performance of the trained classifiers, which should be valuable in developing better tools for predicting the disease association of nsSNPs.
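
The partitioning step itself is straightforward to express in code. A sketch under the assumption that variants are encoded as wild-type residue, position, mutant residue (e.g. "A123T"); the variant strings below are invented:

```python
# Sketch of the paper's partitioning criterion: group nsSNP training
# examples into per-residue subsets keyed by the original (wild-type)
# amino acid, so a separate classifier can then be trained per subset.

from collections import defaultdict

def partition_by_wildtype(variants):
    """variants: iterable of strings like 'A123T' (wild type, position, mutant)."""
    subsets = defaultdict(list)
    for v in variants:
        subsets[v[0]].append(v)       # key on the original residue type
    return dict(subsets)

variants = ["A123T", "A200G", "R45Q", "G77D", "R99W"]
parts = partition_by_wildtype(variants)
# one SVM would then be trained per key, e.g. on parts["A"], parts["R"], ...
```

Keying on `v[-1]` instead would give the alternative criterion in the paper, partitioning by the substituted residue.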

  20. Modeling Local Item Dependence Due to Common Test Format with a Multidimensional Rasch Model

    ERIC Educational Resources Information Center

    Baghaei, Purya; Aryadoust, Vahid

    2015-01-01

    Research shows that test method can exert a significant impact on test takers' performance and thereby contaminate test scores. We argue that common test method can exert the same effect as common stimuli and violate the conditional independence assumption of item response theory models because, in general, subsets of items which have a shared…

  1. Regeneration Development Across a Range of Reproduction Cutting Methods in Shortleaf Pine and Pine-Hardwood Stands in the Interior Highlands

    Treesearch

    James M. Guldin; James B. Baker; Michael G. Shelton

    2004-01-01

    Abstract - Density, milacre stocking, and height of shortleaf pine ( Pinus echinata Mill.) regeneration under 13 reproduction cutting methods were measured after 5 growing seasons across a range of reproduction cutting treatments in shortleaf pine and pine-hardwood stands in the Interior Highlands of Arkansas and Oklahoma. A subset...

  2. Band excitation method applicable to scanning probe microscopy

    DOEpatents

    Jesse, Stephen; Kalinin, Sergei V.

    2015-08-04

    Scanning probe microscopy may include a method for generating a band excitation (BE) signal and simultaneously exciting a probe at a plurality of frequencies within a predetermined frequency band based on the excitation signal. A response of the probe is measured across a subset of frequencies of the predetermined frequency band and the excitation signal is adjusted based on the measured response.

  3. Band excitation method applicable to scanning probe microscopy

    DOEpatents

    Jesse, Stephen; Kalinin, Sergei V.

    2017-01-03

    Scanning probe microscopy may include a method for generating a band excitation (BE) signal and simultaneously exciting a probe at a plurality of frequencies within a predetermined frequency band based on the excitation signal. A response of the probe is measured across a subset of frequencies of the predetermined frequency band and the excitation signal is adjusted based on the measured response.

  4. Partition resampling and extrapolation averaging: approximation methods for quantifying gene expression in large numbers of short oligonucleotide arrays.

    PubMed

    Goldstein, Darlene R

    2006-10-01

    Studies of gene expression using high-density short oligonucleotide arrays have become a standard in a variety of biological contexts. Of the expression measures that have been proposed to quantify expression in these arrays, multi-chip-based measures have been shown to perform well. As gene expression studies increase in size, however, utilizing multi-chip expression measures is more challenging in terms of computing memory requirements and time. A strategic alternative to exact multi-chip quantification on a full large chip set is to approximate expression values based on subsets of chips. This paper introduces an extrapolation method, Extrapolation Averaging (EA), and a resampling method, Partition Resampling (PR), to approximate expression in large studies. An examination of properties indicates that subset-based methods can perform well compared with exact expression quantification. The focus is on short oligonucleotide chips, but the same ideas apply equally well to any array type for which expression is quantified using an entire set of arrays, rather than for only a single array at a time. Software implementing Partition Resampling and Extrapolation Averaging is under development as an R package for the BioConductor project.

  5. Age of the Local Galactic Disk from the WDLF for CPMBs

    NASA Astrophysics Data System (ADS)

    Smith, J. Allyn; Oswalt, Terry D.; Wood, Matt A.; Silvestri, Nicole M.

    We present the white dwarf luminosity function (WDLF) for common proper motion systems. This WDLF was derived using the 1/Vmax method pioneered by Schmidt (1975) and detailed by Liebert, Dahn and Monet (1988). New cooling models were used to determine the luminosities of the white dwarfs and the age of the local Galactic disk. Comparisons to WDLFs developed using older cooling models (Wood 1995) will be examined for changes in the derived disk age. Kinematic data are available for a subset of the WDs in the sample. Separate luminosity functions will be examined for each of the statistically significant subsets. JAS acknowledges support from NASA GSRP Fellowship NGT-51086.
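
The 1/Vmax estimator referenced above has a compact form: each star contributes 1/Vmax to its luminosity bin, where Vmax is the survey volume within which that star could still have been detected. A schematic sketch with illustrative numbers (not the paper's CPMB sample):

```python
# Schematic 1/Vmax luminosity function in the spirit of Schmidt (1975):
# sum 1/Vmax per absolute-magnitude bin. Magnitudes and volumes below
# are invented for illustration.

def vmax_luminosity_function(stars, bin_width=1.0):
    """stars: list of (absolute_magnitude, vmax_pc3) pairs.

    Returns {bin_index: space density per pc^3 per bin}.
    """
    lf = {}
    for mag, vmax in stars:
        b = int(mag // bin_width)
        lf[b] = lf.get(b, 0.0) + 1.0 / vmax
    return lf

stars = [(13.2, 500.0), (13.7, 400.0), (15.1, 50.0)]
lf = vmax_luminosity_function(stars)
# fainter stars have smaller Vmax, so they dominate the space density
```

The faint-end turnover of the resulting WDLF, combined with cooling models, is what yields the disk-age estimate.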

  6. Elite and Status Attainment Models of Inequality of Opportunity

    ERIC Educational Resources Information Center

    Myles, John F.; Srensen, Aage B.

    1975-01-01

    With changes in method, analysis of the process of attainment of various occupations and sub-sets of occupations such as elites can bring about the desired comparability between elite and status attainment studies of equality of opportunity. (Author/AM)

  7. Stability of Poisson Equilibria and Hamiltonian Relative Equilibria by Energy Methods

    NASA Astrophysics Data System (ADS)

    Patrick, George W.; Roberts, Mark; Wulff, Claudia

    2004-12-01

    We develop a general stability theory for equilibrium points of Poisson dynamical systems and relative equilibria of Hamiltonian systems with symmetries, including several generalisations of the Energy-Casimir and Energy-Momentum Methods. Using a topological generalisation of Lyapunov’s result that an extremal critical point of a conserved quantity is stable, we show that a Poisson equilibrium is stable if it is an isolated point in the intersection of a level set of a conserved function with a subset of the phase space that is related to the topology of the symplectic leaf space at that point. This criterion is applied to generalise the energy-momentum method to Hamiltonian systems which are invariant under non-compact symmetry groups for which the coadjoint orbit space is not Hausdorff. We also show that a G-stable relative equilibrium satisfies the stronger condition of being A-stable, where A is a specific group-theoretically defined subset of G which contains the momentum isotropy subgroup of the relative equilibrium. The results are illustrated by an application to the stability of a rigid body in an ideal irrotational fluid.

  8. A method of 3D object recognition and localization in a cloud of points

    NASA Astrophysics Data System (ADS)

    Bielicki, Jerzy; Sitnik, Robert

    2013-12-01

    The method proposed in this article is designed for the analysis of data in the form of point clouds obtained directly from 3D measurements. It is intended for end-user applications that can be integrated directly with 3D scanning software. The method utilizes locally calculated feature vectors (FVs) in point cloud data. Recognition is based on comparison of the analyzed scene with a reference object library. A global descriptor in the form of a set of spatially distributed FVs is created for each reference model. During the detection process, the correlation of subsets of reference FVs with FVs calculated in the scene is computed. The features utilized in the algorithm are based on parameters that qualitatively estimate mean and Gaussian curvatures. Replacing differentiation with averaging in the curvature estimation makes the algorithm more resistant to discontinuities and to poor-quality input data. Utilization of the FV subsets allows the detection of partially occluded and cluttered objects in the scene, while additional spatial information keeps the false positive rate at a reasonably low level.

  9. Bootstrapping under constraint for the assessment of group behavior in human contact networks

    NASA Astrophysics Data System (ADS)

    Tremblay, Nicolas; Barrat, Alain; Forest, Cary; Nornberg, Mark; Pinton, Jean-François; Borgnat, Pierre

    2013-11-01

    The increasing availability of time- and space-resolved data describing human activities and interactions gives insights into both static and dynamic properties of human behavior. In practice, nevertheless, real-world data sets can often be considered as only one realization of a particular event. This highlights a key issue in social network analysis: the statistical significance of estimated properties. In this context, we focus here on the assessment of quantitative features of a specific subset of nodes in empirical networks. We present a method of statistical resampling based on bootstrapping groups of nodes under constraints within the empirical network. The method enables us to define acceptance intervals for various null hypotheses concerning relevant properties of the subset of nodes under consideration in order to characterize by a statistical test its behavior as “normal” or not. We apply this method to a high-resolution data set describing the face-to-face proximity of individuals during two colocated scientific conferences. As a case study, we show how to probe whether colocating the two conferences succeeded in bringing together the two corresponding groups of scientists.
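
A rough sketch of the bootstrap-under-constraint idea: to judge whether a group of k nodes has an unusual mean degree, resample many random k-node groups from the same network and build an acceptance interval from the resampled statistic. Here the only constraint is group size; the paper uses richer constraints, and the degree values below are invented:

```python
# Bootstrap acceptance interval for a group-level statistic (mean degree
# of a k-node group), resampling groups under a size constraint from the
# same empirical network. Degrees below are a toy example with two hubs.

import random

def group_mean(values, group):
    return sum(values[i] for i in group) / len(group)

def acceptance_interval(values, k, n_boot=2000, alpha=0.05, seed=42):
    """Central (1 - alpha) interval of the mean over random k-node groups."""
    rng = random.Random(seed)
    stats = sorted(
        group_mean(values, rng.sample(range(len(values)), k))
        for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

degrees = [2, 3, 3, 4, 4, 5, 5, 6, 20, 22]    # nodes 8 and 9 are hubs
lo, hi = acceptance_interval(degrees, k=2)
observed = group_mean(degrees, [8, 9])         # the hub group's mean degree
# if observed falls outside [lo, hi], the group's behavior is flagged
# as not "normal" under the null hypothesis
```

Constraints beyond group size (e.g. fixing each resampled node's activity or community) make the null model fairer, at the cost of a smaller resampling space.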

  10. Fourier spatial frequency analysis for image classification: training the training set

    NASA Astrophysics Data System (ADS)

    Johnson, Timothy H.; Lhamo, Yigah; Shi, Lingyan; Alfano, Robert R.; Russell, Stewart

    2016-04-01

    The Directional Fourier Spatial Frequencies (DFSF) of a 2D image can identify similarity in spatial patterns within groups of related images. A Support Vector Machine (SVM) can then be used to classify images if the inter-image variance of the FSF in the training set is bounded. However, if variation in FSF increases with training set size, accuracy may decrease as the size of the training set increases. This calls for a method to identify a set of training images from among the originals that can form a vector basis for the entire class. Applying the Cauchy product method we extract the DFSF spectrum from radiographs of osteoporotic bone, and use it as a matched filter set to eliminate noise and image specific frequencies, and demonstrate that selection of a subset of superclassifiers from within a set of training images improves SVM accuracy. Central to this challenge is that the size of the search space can become computationally prohibitive for all but the smallest training sets. We are investigating methods to reduce the search space to identify an optimal subset of basis training images.

  11. Image enhancement in positron emission mammography

    NASA Astrophysics Data System (ADS)

    Slavine, Nikolai V.; Seiler, Stephen; McColl, Roderick W.; Lenkinski, Robert E.

    2017-02-01

    Purpose: To evaluate an efficient iterative deconvolution method (RSEMD) for improving the quantitative accuracy of breast images previously reconstructed by a commercial positron emission mammography (PEM) scanner. Materials and Methods: The RSEMD method was tested on breast phantom data and clinical PEM imaging data. Data acquisition was performed on a commercial Naviscan Flex Solo II PEM camera. This method was applied to patient breast images previously reconstructed with Naviscan software (MLEM) to determine improvements in resolution, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR). Results: In all of the patients' breast studies the post-processed images proved to have higher resolution and lower noise as compared with images reconstructed by conventional methods. In general, the values of SNR reached a plateau at around 6 iterations, with an improvement factor of about 2 for post-processed Flex Solo II PEM images. Improvements in image resolution after the application of RSEMD have also been demonstrated. Conclusions: A rapidly converging, iterative deconvolution algorithm with a novel resolution subsets-based approach (RSEMD) that operates on patient DICOM images has been used for quantitative improvement in breast imaging. The RSEMD method can be applied to clinical PEM images to improve image quality to diagnostically acceptable levels and will be crucial in order to facilitate diagnosis of tumor progression at the earliest stages. The RSEMD method can be considered as an extended Richardson-Lucy algorithm with multiple resolution levels (resolution subsets).
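The abstract describes RSEMD as an extended Richardson-Lucy algorithm. The base Richardson-Lucy update it extends can be sketched in 1D as follows; the signal, PSF, and iteration count are illustrative, and the resolution-subset extension is not shown:

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=20):
    """Basic 1D Richardson-Lucy deconvolution iteration."""
    psf_flipped = psf[::-1]
    estimate = np.full_like(observed, observed.mean())  # flat initial guess
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Blur a point source with a known PSF, then recover it.
truth = np.zeros(32)
truth[16] = 1.0
psf = np.array([0.25, 0.5, 0.25])
observed = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
print(int(np.argmax(restored)))  # peak recovered at index 16
```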

  12. Enhanced Product Generation at NASA Data Centers Through Grid Technology

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.; Hinke, Thomas H.; Gavali, Shradha; Seufzer, William J.

    2003-01-01

    This paper describes how grid technology can support the ability of NASA data centers to provide customized data products. A combination of grid technology and commodity processors is proposed to provide the bandwidth necessary to perform customized processing of data, with customized data subsetting providing the initial example. This customized subsetting engine can be used to support a new type of subsetting, called phenomena-based subsetting, where data is subsetted based on its association with some phenomena, such as mesoscale convective systems or hurricanes. This concept is expanded to allow the phenomena to be detected in one type of data, with the subsetting requirements transmitted to the subsetting engine to subset a different type of data. The subsetting requirements are generated by a data mining system and transmitted to the subsetter in the form of an XML feature index that describes the spatial and temporal extent of the phenomena. For this work, a grid-based mining system called the Grid Miner is used to identify the phenomena and generate the feature index. This paper discusses the value of grid technology in facilitating the development of high-performance customized product processing and the coupling of a grid mining system to support phenomena-based subsetting.
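A minimal sketch of what such an XML feature index might look like, using Python's standard library. All element and attribute names here are hypothetical; the actual schema used by the Grid Miner is not specified in this record:

```python
import xml.etree.ElementTree as ET

# Hypothetical mined phenomenon with its spatial/temporal extent.
features = [{"name": "hurricane",
             "t0": "2003-09-01T00:00Z", "t1": "2003-09-03T00:00Z",
             "lat": (20.0, 35.0), "lon": (-80.0, -60.0)}]

root = ET.Element("featureIndex")
for f in features:
    e = ET.SubElement(root, "feature", name=f["name"])
    ET.SubElement(e, "time", start=f["t0"], end=f["t1"])
    ET.SubElement(e, "bbox",
                  latMin=str(f["lat"][0]), latMax=str(f["lat"][1]),
                  lonMin=str(f["lon"][0]), lonMax=str(f["lon"][1]))

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text.startswith("<featureIndex>"))  # → True
```

A subsetting engine receiving this document would cut any dataset to the listed time window and bounding box.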

  13. Accuracy of direct genomic values in Holstein bulls and cows using subsets of SNP markers

    PubMed Central

    2010-01-01

    Background At the current price, the use of high-density single nucleotide polymorphisms (SNP) genotyping assays in genomic selection of dairy cattle is limited to applications involving elite sires and dams. The objective of this study was to evaluate the use of low-density assays to predict direct genomic value (DGV) for five milk production traits, an overall conformation trait, a survival index, and two profit index traits (APR, ASI). Methods Dense SNP genotypes were available for 42,576 SNP for 2,114 Holstein bulls and 510 cows. A subset of 1,847 bulls born between 1955 and 2004 was used as a training set to fit models with various sets of pre-selected SNP. A group of 297 bulls born between 2001 and 2004 and all cows born between 1992 and 2004 were used to evaluate the accuracy of DGV prediction. Ridge regression (RR) and partial least squares regression (PLSR) were used to derive prediction equations and to rank SNP based on the absolute value of the regression coefficients. Four alternative strategies were applied to select subsets of SNP, namely: subsets of the highest ranked SNP for each individual trait, or a single subset of evenly spaced SNP, where SNP were selected based on their rank for ASI, APR or minor allele frequency within intervals of approximately equal length. Results RR and PLSR performed very similarly to predict DGV, with PLSR performing better for low-density assays and RR for higher-density SNP sets. When using all SNP, DGV predictions for production traits, which have a higher heritability, were more accurate (0.52-0.64) than for survival (0.19-0.20), which has a low heritability. The gain in accuracy using subsets that included the highest ranked SNP for each trait was marginal (5-6%) over a common set of evenly spaced SNP when at least 3,000 SNP were used. Subsets containing 3,000 SNP provided more than 90% of the accuracy that could be achieved with a high-density assay for cows, and 80% of the high-density assay for young bulls. 
Conclusions Accurate genomic evaluation of the broader bull and cow population can be achieved with a single genotyping assay containing ~3,000 to 5,000 evenly spaced SNP. PMID:20950478
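The evaluation design above (fit on a training set, rank SNP by absolute regression coefficient, refit on the top subset, test on hold-out animals) can be sketched with ridge regression on simulated genotypes. All sizes, the penalty value, and the data are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_test, n_snp = 200, 80, 500
X = rng.integers(0, 3, size=(n_train + n_test, n_snp)).astype(float)  # 0/1/2 genotypes
true_effects = np.zeros(n_snp)
true_effects[:20] = rng.normal(0, 1, 20)                # 20 causal SNP
y = X @ true_effects + rng.normal(0, 1, n_train + n_test)
Xtr, Xte, ytr, yte = X[:n_train], X[n_train:], y[:n_train], y[n_train:]

def ridge_fit(X, y, lam=10.0):
    """Closed-form ridge regression coefficients."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

beta_full = ridge_fit(Xtr, ytr)
top = np.argsort(np.abs(beta_full))[::-1][:50]          # highest-ranked SNP subset
beta_sub = ridge_fit(Xtr[:, top], ytr)                  # re-estimate on the subset

acc_full = np.corrcoef(Xte @ beta_full, yte)[0, 1]      # hold-out accuracy, all SNP
acc_sub = np.corrcoef(Xte[:, top] @ beta_sub, yte)[0, 1]  # hold-out accuracy, subset
print(round(float(acc_full), 2), round(float(acc_sub), 2))
```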

  14. Predictive ability of direct genomic values for lifetime net merit of Holstein sires using selected subsets of single nucleotide polymorphism markers.

    PubMed

    Weigel, K A; de los Campos, G; González-Recio, O; Naya, H; Wu, X L; Long, N; Rosa, G J M; Gianola, D

    2009-10-01

    The objective of the present study was to assess the predictive ability of subsets of single nucleotide polymorphism (SNP) markers for development of low-cost, low-density genotyping assays in dairy cattle. Dense SNP genotypes of 4,703 Holstein bulls were provided by the USDA Agricultural Research Service. A subset of 3,305 bulls born from 1952 to 1998 was used to fit various models (training set), and a subset of 1,398 bulls born from 1999 to 2002 was used to evaluate their predictive ability (testing set). After editing, data included genotypes for 32,518 SNP and August 2003 and April 2008 predicted transmitting abilities (PTA) for lifetime net merit (LNM$), the latter resulting from progeny testing. The Bayesian least absolute shrinkage and selection operator method was used to regress August 2003 PTA on marker covariates in the training set to arrive at estimates of marker effects and direct genomic PTA. The coefficient of determination (R(2)) from regressing the April 2008 progeny test PTA of bulls in the testing set on their August 2003 direct genomic PTA was 0.375. Subsets of 300, 500, 750, 1,000, 1,250, 1,500, and 2,000 SNP were created by choosing equally spaced and highly ranked SNP, with the latter based on the absolute value of their estimated effects obtained from the training set. The SNP effects were re-estimated from the training set for each subset of SNP, and the 2008 progeny test PTA of bulls in the testing set were regressed on corresponding direct genomic PTA. The R(2) values for subsets of 300, 500, 750, 1,000, 1,250, 1,500, and 2,000 SNP with largest effects (evenly spaced SNP) were 0.184 (0.064), 0.236 (0.111), 0.269 (0.190), 0.289 (0.179), 0.307 (0.228), 0.313 (0.268), and 0.322 (0.291), respectively. 
These results indicate that a low-density assay comprising selected SNP could be a cost-effective alternative for selection decisions and that significant gains in predictive ability may be achieved by increasing the number of SNP allocated to such an assay from 300 or fewer to 1,000 or more.
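The predictive-ability metric used in this study, R² from regressing progeny-test PTA on direct genomic PTA, reduces to a squared correlation in the simple-regression case. A sketch on simulated numbers (the effect size and noise level are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
genomic_pta = rng.normal(0, 1, 300)                         # direct genomic PTA
progeny_pta = 0.6 * genomic_pta + rng.normal(0, 0.8, 300)   # later progeny-test PTA

# For simple linear regression, R² equals the squared Pearson correlation.
r = np.corrcoef(genomic_pta, progeny_pta)[0, 1]
r2 = r ** 2
print(round(float(r2), 2))
```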

  15. Egnos Limitations over Central and Eastern Poland - Results of Preliminary Tests of Egnos-Eupos Integration Project

    NASA Astrophysics Data System (ADS)

    Jaworski, Leszek; Swiatek, Anna; Zdunek, Ryszard

    2013-09-01

    The problem of insufficient accuracy of the EGNOS correction over the territory of Poland, located at the edge of the EGNOS range, is well known. The EEI PECS project (EGNOS EUPOS Integration) aims to improve the EGNOS correction by using GPS observations from Polish ASG-EUPOS stations. One of the EEI project tasks was the identification of EGNOS performance limitations over Poland and of services for the EGNOS-EUPOS combination. Two sets of data were used for these goals: statistical and theoretical data obtained using SBAS simulator software, and real data obtained during measurements. The real measurements were of two types: static and dynamic. Static measurements are carried out continuously using a Septentrio PolaRx2 receiver; the SRC permanent station operates within the IMAGE/PERFECT project. Dynamic measurements were carried out using the Mobile GPS Laboratory (MGL). Receivers (geodetic and navigation) worked in two modes: determining the navigation position from standalone GPS, and determining the navigation position from GPS plus the EGNOS correction. The paper presents the results of the measurement analyses and the conclusions on which the next tasks of the EEI project are based.

  16. High levels of antimony in dust from e-waste recycling in southeastern China.

    PubMed

    Bi, Xiangyang; Li, Zhonggen; Zhuang, Xiaochun; Han, Zhixuan; Yang, Wenlin

    2011-11-01

    Environmental contamination due to uncontrolled e-waste recycling is an emerging global issue. Antimony (Sb) is a toxic element used in semiconductor components and flame retardants for circuit board within electronic equipment. When e-waste is recycled, Sb is released and contaminates the surrounding environment; however, few studies have characterized the extent of this problem. In this study, we investigated Sb and arsenic (As) distributions in indoor dust from 13 e-waste recycling villages in Guiyu, Guangdong Province, southeastern China. Results revealed significantly elevated concentrations of Sb (6.1-232 mg/kg) in dust within all villages, which were 3.9-147 times higher than those from the non e-waste sites, indicating e-waste recycling was an important source of Sb pollution. On the contrary, As concentrations (5.4-17.7 mg/kg) in e-waste dusts were similar to reference values from the control sites. Therefore, dusts emitted from e-waste recycling may be characterized by high Sb/As ratios, which may help identify the contamination due to the e-waste recycling activities. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, data centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Sciences Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files, as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed: - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets - Data-provider-specific format requirements vs. generalized standards - Organization of the file structure of aggregated NetCDF subset output - Global attributes of individual subsetted files vs. aggregated results - Specific applications and framework used for subsetting and delivering derivative data files

  18. A Simple Application of Compressed Sensing to Further Accelerate Partially Parallel Imaging

    PubMed Central

    Miao, Jun; Guo, Weihong; Narayan, Sreenath; Wilson, David L.

    2012-01-01

    Compressed Sensing (CS) and partially parallel imaging (PPI) enable fast MR imaging by reducing the amount of k-space data required for reconstruction. Past attempts to combine these two have been limited by the incoherent sampling requirement of CS, since PPI routines typically sample on a regular (coherent) grid. Here, we developed a new method, “CS+GRAPPA,” to overcome this limitation. We decomposed sets of equidistant samples into multiple random subsets. Then, we reconstructed each subset using CS and averaged the results to get a final CS k-space reconstruction. We used both a standard CS reconstruction and an edge- and joint-sparsity-guided CS reconstruction. We tested these intermediate results on both synthetic and real MR phantom data, and performed a human observer experiment to determine the effectiveness of decomposition and to optimize the number of subsets. We then used these CS reconstructions to calibrate the GRAPPA complex coil weights. In vivo parallel MR brain and heart data sets were used. An objective image quality evaluation metric, Case-PDM, was used to quantify image quality. Coherent aliasing and noise artifacts were significantly reduced using two decompositions. More decompositions further reduced coherent aliasing and noise artifacts but introduced blurring. However, the blurring was effectively minimized using our new edge- and joint-sparsity-guided CS with two decompositions. Numerical results on parallel data demonstrated that the combined method greatly improved image quality as compared to standard GRAPPA, on average halving Case-PDM scores across a range of sampling rates. The proposed technique allowed the same Case-PDM scores as standard GRAPPA, using about half the number of samples. We conclude that the new method augments GRAPPA by combining it with CS, allowing CS to work even when the k-space sampling pattern is equidistant. PMID:22902065
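The decomposition step described above, turning one coherent (equidistant) sampling pattern into multiple incoherent random subsets, is simple to sketch; the CS reconstruction and averaging that follow are only indicated in comments:

```python
import random

# An equidistant PPI-style sampling pattern: every other k-space line.
samples = list(range(0, 256, 2))

# Decompose into two random (incoherent) subsets of equal size.
rng = random.Random(0)
shuffled = samples[:]
rng.shuffle(shuffled)
subset_a, subset_b = shuffled[::2], shuffled[1::2]

# In the full method, each subset would be reconstructed with CS and the
# results averaged into a final k-space estimate used to calibrate GRAPPA.
print(len(subset_a), len(subset_b))  # → 64 64
```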

  19. Discovering semantic features in the literature: a foundation for building functional associations

    PubMed Central

    Chagoyen, Monica; Carmona-Saez, Pedro; Shatkay, Hagit; Carazo, Jose M; Pascual-Montano, Alberto

    2006-01-01

    Background Experimental techniques such as DNA microarray, serial analysis of gene expression (SAGE) and mass spectrometry proteomics, among others, are generating large amounts of data related to genes and proteins at different levels. As in any other experimental approach, it is necessary to analyze these data in the context of previously known information about the biological entities under study. The literature is a particularly valuable source of information for experiment validation and interpretation. Therefore, the development of automated text mining tools to assist in such interpretation is one of the main challenges in current bioinformatics research. Results We present a method to create literature profiles for large sets of genes or proteins based on common semantic features extracted from a corpus of relevant documents. These profiles can be used to establish pair-wise similarities among genes, utilized in gene/protein classification or can be even combined with experimental measurements. Semantic features can be used by researchers to facilitate the understanding of the commonalities indicated by experimental results. Our approach is based on non-negative matrix factorization (NMF), a machine-learning algorithm for data analysis, capable of identifying local patterns that characterize a subset of the data. The literature is thus used to establish putative relationships among subsets of genes or proteins and to provide coherent justification for this clustering into subsets. We demonstrate the utility of the method by applying it to two independent and vastly different sets of genes. Conclusion The presented method can create literature profiles from documents relevant to sets of genes. 
The representation of genes as additive linear combinations of semantic features allows for the exploration of functional associations as well as for clustering, suggesting a valuable methodology for the validation and interpretation of high-throughput experimental data. PMID:16438716
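The underlying decomposition is standard NMF (V ≈ WH with nonnegative factors). A bare-bones multiplicative-update implementation on a toy "gene × term" matrix; the matrix and rank are invented, and the paper's corpus processing is not reproduced:

```python
import numpy as np

def nmf(V, k, n_iter=200, seed=0):
    """Lee-Seung multiplicative updates minimizing the Frobenius error."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W, H = rng.random((n, k)), rng.random((k, m))
    for _ in range(n_iter):
        H *= (W.T @ V) / np.maximum(W.T @ W @ H, 1e-12)
        W *= (V @ H.T) / np.maximum(W @ H @ H.T, 1e-12)
    return W, H

# Toy count matrix with two semantic blocks (each block is rank 1).
V = np.array([[2, 4, 0, 0],
              [1, 2, 0, 0],
              [0, 0, 3, 3],
              [0, 0, 1, 1]], dtype=float)
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H)   # near zero: rank-2 factorization fits exactly
print(round(float(err), 3))
```

Rows of W then represent genes as additive combinations of the k semantic features, which is the representation the abstract uses for clustering.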

  20. Characteristics of CD8+ T cell subsets in Chinese patients with chronic HIV infection during initial ART.

    PubMed

    Jiao, Yanmei; Hua, Wei; Zhang, Tong; Zhang, Yonghong; Ji, Yunxia; Zhang, Hongwei; Wu, Hao

    2011-03-25

    CD8+ T cells may play an important role in protecting against HIV. However, the changes in CD8+ T cell subsets during the early period of ART have not been fully studied. Twenty-one asymptomatic treatment-naive HIV-infected patients with CD4+ T cells below 350 cells/μl were enrolled in the study. Naïve, central memory (CM), effector memory (EM) and terminally differentiated effector (EMRA) CD8+ T cell subsets, and their activation and proliferation subsets, were evaluated in blood samples collected at baseline and at weeks 2, 4, 8 and 12 of ART. The total CD8+ T cells declined, and the naïve and CM subsets tended to increase. Activation levels of all CD8+ T cell subsets except the EMRA subset decreased after ART. However, proliferation levels of total CD8+ T cells and the EMRA, EM and CM subsets increased in the first 4 weeks of ART, then decreased. The proliferation level of naïve cells decreased after ART. The changes in CD8+ T cell subsets during initial ART are complex. Our results display a complete phenotypic picture of CD8+ T cell subsets during initial ART and provide insights for understanding of immune status during ART.

  1. Change-Based Satellite Monitoring Using Broad Coverage and Targetable Sensing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Tran, Daniel Q.; Doubleday, Joshua R.; Doggett, Thomas

    2013-01-01

    A generic software framework analyzes data from broad coverage sweeps or general larger areas of interest. Change detection methods are used to extract subsets of directed swath areas that intersect areas of change. These areas are prioritized and allocated to targetable assets. This method is deployed in an automatic fashion, and has operated without human monitoring or intervention for sustained periods of time (months).

  2. Space Suit Joint Torque Measurement Method Validation

    NASA Technical Reports Server (NTRS)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  3. Testing deformation hypotheses by constraints on a time series of geodetic observations

    NASA Astrophysics Data System (ADS)

    Velsink, Hiddo

    2018-01-01

    In geodetic deformation analysis observations are used to identify form and size changes of a geodetic network, representing objects on the earth's surface. The network points are monitored, often continuously, because of suspected deformations. A deformation may affect many points during many epochs. The problem is that the best description of the deformation is, in general, unknown. To find it, different hypothesised deformation models have to be tested systematically for agreement with the observations. The tests have to be capable of stating with a certain probability the size of detectable deformations, and to be datum invariant. A statistical criterion is needed to find the best deformation model. Existing methods do not fulfil these requirements. Here we propose a method that formulates the different hypotheses as sets of constraints on the parameters of a least-squares adjustment model. The constraints can relate to subsets of epochs and to subsets of points, thus combining time series analysis and congruence model analysis. The constraints are formulated as nonstochastic observations in an adjustment model of observation equations. This gives an easy way to test the constraints and to get a quality description. The proposed method aims at providing a good discriminating method to find the best description of a deformation. The method is expected to improve the quality of geodetic deformation analysis. We demonstrate the method with an elaborate example.
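The key device above, formulating constraints as nonstochastic observations in the adjustment model, can be imitated numerically by appending heavily weighted pseudo-observation rows to a least-squares problem. A deliberately tiny example (two observed point heights, hypothesis "no deformation", i.e. h1 = h2; all numbers invented):

```python
import numpy as np

# Ordinary observations: direct measurements of the two heights.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([10.02, 10.05])

# Constraint h1 - h2 = 0 expressed as a (nonstochastic) pseudo-observation.
C = np.array([[1.0, -1.0]])
c = np.array([0.0])

w = 1e8                                  # very large weight ≈ hard constraint
A_aug = np.vstack([A, np.sqrt(w) * C])
y_aug = np.concatenate([y, np.sqrt(w) * c])
x, *_ = np.linalg.lstsq(A_aug, y_aug, rcond=None)
print(np.round(x, 3))                    # both heights ≈ 10.035
```

The constrained residuals could then feed the statistical tests the paper proposes for deciding between competing deformation hypotheses.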

  4. Toward optimal feature and time segment selection by divergence method for EEG signals classification.

    PubMed

    Wang, Jie; Feng, Zuren; Lu, Na; Luo, Jing

    2018-06-01

    Feature selection plays an important role in the field of EEG signals based motor imagery pattern classification. It is a process that aims to select an optimal feature subset from the original set. Two significant advantages involved are: lowering the computational burden so as to speed up the learning procedure and removing redundant and irrelevant features so as to improve the classification performance. Therefore, feature selection is widely employed in the classification of EEG signals in practical brain-computer interface systems. In this paper, we present a novel statistical model to select the optimal feature subset based on the Kullback-Leibler divergence measure, and automatically select the optimal subject-specific time segment. The proposed method comprises four successive stages: a broad frequency band filtering and common spatial pattern enhancement as preprocessing, features extraction by autoregressive model and log-variance, the Kullback-Leibler divergence based optimal feature and time segment selection and linear discriminant analysis classification. More importantly, this paper provides a potential framework for combining other feature extraction models and classification algorithms with the proposed method for EEG signals classification. Experiments on single-trial EEG signals from two public competition datasets not only demonstrate that the proposed method is effective in selecting discriminative features and time segment, but also show that the proposed method yields relatively better classification results in comparison with other competitive methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
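The scoring idea, ranking features by a Kullback-Leibler divergence between per-class distributions, can be sketched with Gaussian fits per feature. The Gaussian assumption and the simulated data are simplifications of the paper's model:

```python
import numpy as np

def gauss_kl(m1, v1, m2, v2):
    """KL divergence between two univariate Gaussians (means m, variances v)."""
    return 0.5 * (v1 / v2 + (m2 - m1) ** 2 / v2 - 1 + np.log(v2 / v1))

def kl_score(x_a, x_b):
    """Symmetric KL between per-class Gaussian fits of one feature."""
    m1, v1 = x_a.mean(), x_a.var() + 1e-9
    m2, v2 = x_b.mean(), x_b.var() + 1e-9
    return gauss_kl(m1, v1, m2, v2) + gauss_kl(m2, v2, m1, v1)

rng = np.random.default_rng(3)
class_a = rng.normal(0, 1, size=(100, 5))
class_a[:, 2] += 3.0                      # feature 2 separates the classes
class_b = rng.normal(0, 1, size=(100, 5))

scores = [kl_score(class_a[:, j], class_b[:, j]) for j in range(5)]
best = int(np.argmax(scores))
print(best)  # → 2
```

Higher scores flag the discriminative features that would be kept for the classifier.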

  5. CD4 T cell subsets in the Mucosa are CD28+Ki-67−HLA-DR−CD69+ but show differential infection based on α4β7 receptor expression during acute SIV infection

    PubMed Central

    Kader, Muhamuda; Bixler, Sandra; Roederer, Mario; Veazey, Ronald; Mattapallil, Joseph J.

    2009-01-01

    Background CD4 T cell depletion in the mucosa has been well documented during acute HIV and SIV infections. The demonstration that HIV/SIV can use the α4β7 receptor for viral entry suggests that these viruses selectively target CD4 T cells in the mucosa that express high levels of the α4β7 receptor. Methods Mucosal samples obtained from SIV infected rhesus macaques during the early phase of infection were used for immunophenotypic analysis. CD4 T cell subsets were sorted based on the expression of β7 and CD95 to quantify the level of SIV infection in different subsets of CD4 T cells. Changes in IL-17, IL-21, IL-23 and TGFβ mRNA expression were determined using Taqman PCR. Results CD4 T cells in the mucosa were found to harbor two major populations of cells; ~25% of CD4 T cells expressed the α4+β7hi phenotype, whereas the remaining 75% expressed an α4+β7int phenotype. Both subsets were predominantly CD28+Ki-67− HLA-DR− but CD69+, and expressed detectable levels of CCR5 on their surface. Interestingly, however, α4+β7hi CD4 T cells were found to harbor more SIV than the α4+β7int subsets at day 10 pi. Early infection was associated with a dramatic increase in the expression of IL-17 and the IL-17-promoting cytokines IL-21, IL-23, and TGFβ, which stayed high even after the loss of mucosal CD4 T cells. Conclusions Our results suggest that the differential expression of the α4β7 receptor contributes to the differences in the extent of infection in CD4 T cell subsets in the mucosa. Early-infection-associated dysregulation of the IL-17 network in mucosal tissues involves other non-Th-17 cells that likely contribute to the pro-inflammatory environment in the mucosa during acute stages of SIV infection. PMID:19863675

  6. Selecting sequence variants to improve genomic predictions for dairy cattle

    USDA-ARS?s Scientific Manuscript database

    Millions of genetic variants have been identified by population-scale sequencing projects, but subsets are needed for routine genomic predictions or to include on genotyping arrays. Methods of selecting sequence variants were compared using both simulated sequence genotypes and actual data from run ...

  7. Redefining Myeloid Cell Subsets in Murine Spleen

    PubMed Central

    Hey, Ying-Ying; Tan, Jonathan K. H.; O’Neill, Helen C.

    2016-01-01

    Spleen is known to contain multiple dendritic and myeloid cell subsets, distinguishable on the basis of phenotype, function and anatomical location. As a result of recent intensive flow cytometric analyses, splenic dendritic cell (DC) subsets are now better characterized than other myeloid subsets. In order to identify and fully characterize a novel splenic subset termed “L-DC” in relation to other myeloid cells, it was necessary to investigate myeloid subsets in more detail. In terms of cell surface phenotype, L-DC were initially characterized as a CD11bhiCD11cloMHCII−Ly6C−Ly6G− subset in murine spleen. Their expression of CD43, lack of MHCII, and a low level of CD11c was shown to best differentiate L-DC by phenotype from conventional DC subsets. A complete analysis of all subsets in spleen led to the classification of CD11bhiCD11cloMHCII−Ly6CloLy6G− cells as monocytes expressing CX3CR1, CD43 and CD115. Siglec-F expression was used to identify a specific eosinophil population, distinguishable from both Ly6Clo and Ly6Chi monocytes, and other DC subsets. L-DC were characterized as a clear subset of CD11bhiCD11cloMHCII−Ly6C−Ly6G− cells, which are CD43+, Siglec-F− and CD115−. Changes in the prevalence of L-DC compared to other subsets in spleens of mutant mice confirmed the phenotypic distinction between L-DC, cDC and monocyte subsets. L-DC development in vivo was shown to occur independently of the BATF3 transcription factor that regulates cDC development, and also independently of the FLT3L and GM-CSF growth factors which drive cDC and monocyte development, so distinguishing L-DC from these commonly defined cell types. PMID:26793192

  8. Halftone visual cryptography.

    PubMed

    Zhou, Zhi; Arce, Gonzalo R; Di Crescenzo, Giovanni

    2006-08-01

    Visual cryptography encodes a secret binary image (SI) into n shares of random binary patterns. If the shares are xeroxed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the n shares, however, have no visual meaning and hinder the objectives of visual cryptography. Extended visual cryptography [1] was proposed recently to construct meaningful binary images as shares using hypergraph colourings, but the visual quality is poor. In this paper, a novel technique named halftone visual cryptography is proposed to achieve visual cryptography via halftoning. Based on the blue-noise dithering principles, the proposed method utilizes the void and cluster algorithm [2] to encode a secret binary image into n halftone shares (images) carrying significant visual information. The simulation shows that the visual quality of the obtained halftone shares is observably better than that attained by any available visual cryptography method known to date.
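For contrast with the halftone extension, the classic (2,2) visual cryptography construction it builds on fits in a few lines: each secret pixel expands to two subpixels per share, and stacking transparencies (a pixelwise OR) recovers the secret. This shows the base scheme only, not the paper's halftone method:

```python
import random

def share_pixel(bit, rng):
    """Encode one secret pixel as a pair of 2-subpixel patterns."""
    pattern = rng.choice([(0, 1), (1, 0)])
    if bit == 0:                              # white: identical patterns
        return pattern, pattern
    flipped = (1 - pattern[0], 1 - pattern[1])
    return pattern, flipped                   # black: complementary patterns

rng = random.Random(7)
secret = [1, 0, 1, 1, 0]
share1, share2 = [], []
for bit in secret:
    p1, p2 = share_pixel(bit, rng)
    share1.append(p1)
    share2.append(p2)

# Superimposing transparencies = OR of subpixels; fully dark pairs read as black.
stacked = [(a[0] | b[0], a[1] | b[1]) for a, b in zip(share1, share2)]
decoded = [1 if s == (1, 1) else 0 for s in stacked]
print(decoded == secret)  # → True
```

Each share on its own is a uniformly random pattern, so it leaks nothing about the secret.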

  9. On the properties of energy stable flux reconstruction schemes for implicit large eddy simulation

    NASA Astrophysics Data System (ADS)

    Vermeire, B. C.; Vincent, P. E.

    2016-12-01

    We begin by investigating the stability, order of accuracy, and dispersion and dissipation characteristics of the extended range of energy stable flux reconstruction (E-ESFR) schemes in the context of implicit large eddy simulation (ILES). We proceed to demonstrate that subsets of the E-ESFR schemes are more stable than collocation nodal discontinuous Galerkin methods recovered with the flux reconstruction approach (FRDG) for marginally-resolved ILES simulations of the Taylor-Green vortex. These schemes are shown to have reduced dissipation and dispersion errors relative to FRDG schemes of the same polynomial degree and, simultaneously, have increased Courant-Friedrichs-Lewy (CFL) limits. Finally, we simulate turbulent flow over an SD7003 aerofoil using two of the most stable E-ESFR schemes identified by the aforementioned Taylor-Green vortex experiments. Results demonstrate that subsets of E-ESFR schemes appear more stable than the commonly used FRDG method, have increased CFL limits, and are suitable for ILES of complex turbulent flows on unstructured grids.

  10. Can data-driven benchmarks be used to set the goals of healthy people 2010?

    PubMed Central

    Allison, J; Kiefe, C I; Weissman, N W

    1999-01-01

    OBJECTIVES: Expert panels determined the public health goals of Healthy People 2000 subjectively. The present study examined whether data-driven benchmarks provide a better alternative. METHODS: We developed the "pared-mean" method to define from data the best achievable health care practices. We calculated the pared-mean benchmark for screening mammography from the 1994 National Health Interview Survey, using the metropolitan statistical area as the "provider" unit. Beginning with the best-performing provider and adding providers in descending sequence, we established the minimum provider subset that included at least 10% of all women surveyed on this question. The pared-mean benchmark is then the proportion of women in this subset who received mammography. RESULTS: The pared-mean benchmark for screening mammography was 71%, compared with the Healthy People 2000 goal of 60%. CONCLUSIONS: For Healthy People 2010, benchmarks derived from data reflecting the best available care provide viable alternatives to consensus-derived targets. We are currently pursuing additional refinements to the data-driven pared-mean benchmark approach. PMID:9987466
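The pared-mean computation described in the Methods section is straightforward to state as code. The provider data here are invented; the 10% coverage threshold follows the abstract:

```python
def pared_mean(providers, fraction=0.10):
    """providers: list of (numerator, denominator) per provider unit.

    Sort providers by performance, accumulate the best ones until they
    cover at least `fraction` of the surveyed population, and return the
    pooled rate of that subset.
    """
    total = sum(d for _, d in providers)
    ranked = sorted(providers, key=lambda p: p[0] / p[1], reverse=True)
    num = den = 0
    for n, d in ranked:
        num += n
        den += d
        if den >= fraction * total:
            break
    return num / den

# Illustrative data: (women screened, women surveyed) per metropolitan area.
data = [(80, 100), (65, 100), (55, 100), (45, 100), (30, 100)]
print(round(pared_mean(data), 2))  # → 0.8
```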

  11. Method and apparatus for smart battery charging including a plurality of controllers each monitoring input variables

    DOEpatents

    Hammerstrom, Donald J.

    2013-10-15

    A method for managing the charging and discharging of batteries, wherein at least one battery is connected to a battery charger and the battery charger is connected to a power supply. A plurality of controllers in communication with one another are provided, each of the controllers monitoring a subset of input variables. A set of charging constraints may then be generated for each controller as a function of its subset of input variables. A set of objectives for each controller may also be generated. A preferred charge rate for each controller is generated as a function of the set of objectives, the charging constraints, or both. An actual charge rate is then determined using an algorithm that accounts for each of the preferred charge rates of the controllers and/or that does not violate any of the charging constraints. A current flow between the battery and the battery charger is then provided at the actual charge rate.
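
    The claim leaves the rate-combining algorithm open; one minimal, hypothetical reading is to intersect all controllers' (min, max) constraints and clip an aggregate of their preferred rates into the feasible window. A sketch (the names and the averaging rule are illustrative, not the patented design):

```python
def combined_charge_rate(controllers):
    """controllers: list of dicts with 'preferred', 'min', 'max' rates (amps)."""
    lo = max(c["min"] for c in controllers)  # tightest lower constraint
    hi = min(c["max"] for c in controllers)  # tightest upper constraint
    if lo > hi:
        raise ValueError("charging constraints are infeasible")
    # aggregate the preferences, then clip so no constraint is violated
    preferred = sum(c["preferred"] for c in controllers) / len(controllers)
    return min(max(preferred, lo), hi)

controllers = [
    {"preferred": 10.0, "min": 0.0, "max": 8.0},   # e.g. a thermal controller
    {"preferred": 6.0, "min": 2.0, "max": 12.0},   # e.g. a grid-price controller
]
print(combined_charge_rate(controllers))  # -> 8.0
```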

  12. Performance and Accuracy of LAPACK's Symmetric Tridiagonal Eigensolvers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demmel, Jim W.; Marques, Osni A.; Parlett, Beresford N.

    2007-04-19

    We compare four algorithms from the latest LAPACK 3.1 release for computing eigenpairs of a symmetric tridiagonal matrix. These include QR iteration, bisection and inverse iteration (BI), the Divide-and-Conquer method (DC), and the method of Multiple Relatively Robust Representations (MR). Our evaluation considers speed and accuracy when computing all eigenpairs, and additionally subset computations. Using a variety of carefully selected test problems, our study includes a variety of today's computer architectures. Our conclusions can be summarized as follows. (1) DC and MR are generally much faster than QR and BI on large matrices. (2) MR almost always does the fewest floating point operations, but at a lower MFlop rate than all the other algorithms. (3) The exact performance of MR and DC strongly depends on the matrix at hand. (4) DC and QR are the most accurate algorithms, with observed accuracy O(√n·ε). The accuracy of BI and MR is generally O(n·ε). (5) MR is preferable to BI for subset computations.
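
    The subset computations addressed in conclusion (5) are exposed through LAPACK wrappers; for instance, SciPy's eigh_tridiagonal routes subset requests to the bisection driver (stebz) or to MRRR (stemr). A small example requesting only the five smallest eigenvalues of a symmetric tridiagonal matrix:

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

n = 500
d = 2.0 * np.ones(n)       # main diagonal (1-D Laplacian stencil)
e = -1.0 * np.ones(n - 1)  # off-diagonal

# Subset computation: only eigenvalue indices 0..4 (the five smallest),
# the case for which the study finds MR preferable to BI.
w = eigh_tridiagonal(d, e, eigvals_only=True, select="i", select_range=(0, 4))
print(w)
```

For this matrix the eigenvalues are known analytically as 2 − 2·cos(kπ/(n+1)), which makes the subset easy to verify.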

  13. A statistical approach to selecting and confirming validation targets in -omics experiments

    PubMed Central

    2012-01-01

    Background Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
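
    The core calculation behind confirming a random subset can be sketched with an exact binomial (Clopper-Pearson) lower confidence bound on the list-wide confirmation rate; the paper's statistical model is richer than this, so treat it only as the basic idea:

```python
from scipy.stats import beta

def confirmation_lower_bound(confirmed, sampled, alpha=0.05):
    """One-sided lower confidence bound on the proportion of results in the
    full significant list that would confirm, from a random sample."""
    if confirmed == 0:
        return 0.0
    # exact Clopper-Pearson lower limit
    return float(beta.ppf(alpha, confirmed, sampled - confirmed + 1))

# e.g. 18 of 20 randomly chosen targets confirmed by follow-up experiments
lb = confirmation_lower_bound(18, 20)
print(f"with 95% confidence, at least {lb:.1%} of the list would confirm")
```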

  14. A prospective gating method to acquire a diverse set of free-breathing CT images for model-based 4DCT

    NASA Astrophysics Data System (ADS)

    O'Connell, D.; Ruan, D.; Thomas, D. H.; Dou, T. H.; Lewis, J. H.; Santhanam, A.; Lee, P.; Low, D. A.

    2018-02-01

    Breathing motion modeling requires observation of tissues at sufficiently distinct respiratory states for proper 4D characterization. This work proposes a method to improve sampling of the breathing cycle with limited imaging dose. We designed and tested a prospective free-breathing acquisition protocol with a simulation using datasets from five patients imaged with a model-based 4DCT technique. Each dataset contained 25 free-breathing fast helical CT scans with simultaneous breathing surrogate measurements. Tissue displacements were measured using deformable image registration. A correspondence model related tissue displacement to the surrogate. Model residual was computed by comparing predicted displacements to image registration results. To determine a stopping criterion for the prospective protocol, i.e. when the breathing cycle had been sufficiently sampled, subsets of N scans where 5 ⩽ N ⩽ 9 were used to fit reduced models for each patient. A previously published metric was employed to describe the phase coverage, or ‘spread’, of the respiratory trajectories of each subset. The minimum phase coverage necessary to achieve mean model residual within 0.5 mm of the full 25-scan model was determined and used as the stopping criterion. Using the patient breathing traces, a prospective acquisition protocol was simulated. In all patients, phase coverage greater than the threshold necessary for model accuracy within 0.5 mm of the 25-scan model was achieved in six or fewer scans. The prospectively selected respiratory trajectories ranked in the (97.5 ± 4.2)th percentile among subsets of the originally sampled scans on average. Simulation results suggest that the proposed prospective method provides an effective means to sample the breathing cycle with limited free-breathing scans. 
One application of the method is to reduce the imaging dose of a previously published model-based 4DCT protocol to 25% of its original value while achieving mean model residual within 0.5 mm.

  15. Monocyte Subset Dynamics in Human Atherosclerosis Can Be Profiled with Magnetic Nano-Sensors

    PubMed Central

    Wildgruber, Moritz; Lee, Hakho; Chudnovskiy, Aleksey; Yoon, Tae-Jong; Etzrodt, Martin; Pittet, Mikael J.; Nahrendorf, Matthias; Croce, Kevin; Libby, Peter; Weissleder, Ralph; Swirski, Filip K.

    2009-01-01

    Monocytes are circulating macrophage and dendritic cell precursors that populate healthy and diseased tissue. In humans, monocytes consist of at least two subsets whose proportions in the blood fluctuate in response to coronary artery disease, sepsis, and viral infection. Animal studies have shown that specific shifts in the monocyte subset repertoire either exacerbate or attenuate disease, suggesting a role for monocyte subsets as biomarkers and therapeutic targets. Assays are therefore needed that can selectively and rapidly enumerate monocytes and their subsets. This study shows that two major human monocyte subsets express similar levels of the receptor for macrophage colony stimulating factor (MCSFR) but differ in their phagocytic capacity. We exploit these properties and custom-engineer magnetic nanoparticles for ex vivo sensing of monocytes and their subsets. We present a two-dimensional enumerative mathematical model that simultaneously reports number and proportion of monocyte subsets in a small volume of human blood. Using a recently described diagnostic magnetic resonance (DMR) chip with 1 µl sample size and high throughput capabilities, we then show that application of the model accurately quantifies subset fluctuations that occur in patients with atherosclerosis. PMID:19461894

  16. Dynamic equilibrium of heterogeneous and interconvertible multipotent hematopoietic cell subsets

    PubMed Central

    Weston, Wendy; Zayas, Jennifer; Perez, Ruben; George, John; Jurecic, Roland

    2014-01-01

    Populations of hematopoietic stem cells and progenitors are quite heterogeneous and consist of multiple cell subsets with distinct phenotypic and functional characteristics. Some of these subsets also appear to be interconvertible and oscillate between functionally distinct states. The multipotent hematopoietic cell line EML has emerged as a unique model to study the heterogeneity and interconvertibility of multipotent hematopoietic cells. Here we describe extensive phenotypic and functional heterogeneity of EML cells which stems from the coexistence of multiple cell subsets. Each of these subsets is phenotypically and functionally heterogeneous, and displays distinct multilineage differentiation potential, cell cycle profile, proliferation kinetics, and expression pattern of HSC markers and some of the key lineage-associated transcription factors. Analysis of their maintenance revealed that on a population level all EML cell subsets exhibit cell-autonomous interconvertible properties, with the capacity to generate all other subsets and re-establish complete parental EML cell population. Moreover, all EML cell subsets generated during multiple cell generations maintain their distinct phenotypic and functional signatures and interconvertible properties. The model of EML cell line suggests that interconvertible multipotent hematopoietic cell subsets coexist in a homeostatically maintained dynamic equilibrium which is regulated by currently unknown cell-intrinsic mechanisms. PMID:24903657

  17. Dynamic equilibrium of heterogeneous and interconvertible multipotent hematopoietic cell subsets.

    PubMed

    Weston, Wendy; Zayas, Jennifer; Perez, Ruben; George, John; Jurecic, Roland

    2014-06-06

    Populations of hematopoietic stem cells and progenitors are quite heterogeneous and consist of multiple cell subsets with distinct phenotypic and functional characteristics. Some of these subsets also appear to be interconvertible and oscillate between functionally distinct states. The multipotent hematopoietic cell line EML has emerged as a unique model to study the heterogeneity and interconvertibility of multipotent hematopoietic cells. Here we describe extensive phenotypic and functional heterogeneity of EML cells which stems from the coexistence of multiple cell subsets. Each of these subsets is phenotypically and functionally heterogeneous, and displays distinct multilineage differentiation potential, cell cycle profile, proliferation kinetics, and expression pattern of HSC markers and some of the key lineage-associated transcription factors. Analysis of their maintenance revealed that on a population level all EML cell subsets exhibit cell-autonomous interconvertible properties, with the capacity to generate all other subsets and re-establish complete parental EML cell population. Moreover, all EML cell subsets generated during multiple cell generations maintain their distinct phenotypic and functional signatures and interconvertible properties. The model of EML cell line suggests that interconvertible multipotent hematopoietic cell subsets coexist in a homeostatically maintained dynamic equilibrium which is regulated by currently unknown cell-intrinsic mechanisms.

  18. New error calibration tests for gravity models using subset solutions and independent data - Applied to GEM-T3

    NASA Technical Reports Server (NTRS)

    Lerch, F. J.; Nerem, R. S.; Chinn, D. S.; Chan, J. C.; Patel, G. B.; Klosko, S. M.

    1993-01-01

    A new method has been developed to provide a direct test of the error calibrations of gravity models based on actual satellite observations. The basic approach projects the error estimates of the gravity model parameters onto satellite observations, and the results of these projections are then compared with data residuals computed from the orbital fits. To allow specific testing of the gravity error calibrations, subset solutions are computed based on the data set and data weighting of the gravity model. The approach is demonstrated using GEM-T3 to show that the gravity error estimates are well calibrated and that reliable predictions of orbit accuracies can be achieved for independent orbits.
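
    The projection step amounts to mapping the parameter error covariance through the observation partial derivatives to obtain a predicted standard deviation per observation, which can then be compared against the orbital-fit residuals. A hedged numerical sketch (names, shapes, and values are illustrative, not the GEM-T3 quantities):

```python
import numpy as np

def predicted_residual_sigma(H, C):
    """H: (m, p) partials of m observations w.r.t. p gravity parameters;
    C: (p, p) parameter error covariance.
    Returns the m predicted observation sigmas sqrt(diag(H C H^T))."""
    return np.sqrt(np.einsum("ip,pq,iq->i", H, C, H))

H = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # two observations, two parameters
C = np.diag([4.0, 9.0])      # parameter error variances
print(predicted_residual_sigma(H, C))  # -> [2. 6.]
```

A calibration is then judged by whether these predicted sigmas match the scatter of the actual fit residuals.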

  19. Replica amplification of nucleic acid arrays

    DOEpatents

    Church, George M.

    2002-01-01

    A method of producing a plurality of a nucleic acid array, comprising, in order, the steps of amplifying in situ nucleic acid molecules of a first randomly-patterned, immobilized nucleic acid array comprising a heterogeneous pool of nucleic acid molecules affixed to a support, transferring at least a subset of the nucleic acid molecules produced by such amplifying to a second support, and affixing the subset so transferred to the second support to form a second randomly-patterned, immobilized nucleic acid array, wherein the nucleic acid molecules of the second array occupy positions that correspond to those of the nucleic acid molecules from which they were amplified on the first array, so that the first array serves as a template to produce a plurality, is disclosed.

  20. Image Correlation Pattern Optimization for Micro-Scale In-Situ Strain Measurements

    NASA Technical Reports Server (NTRS)

    Bomarito, G. F.; Hochhalter, J. D.; Cannon, A. H.

    2016-01-01

    The accuracy and precision of digital image correlation (DIC) are a function of three primary ingredients: image acquisition, image analysis, and the subject of the image. Development of the first two (i.e. image acquisition techniques and image correlation algorithms) has led to widespread use of DIC; however, fewer developments have been focused on the third ingredient. Typically, subjects of DIC images are mechanical specimens with either a natural surface pattern or a pattern applied to the surface. Research in the area of DIC patterns has primarily been aimed at identifying which surface patterns are best suited for DIC by comparing patterns to each other. Because the easiest and most widespread methods of applying patterns have a high degree of randomness associated with them (e.g., airbrush, spray paint, particle decoration, etc.), less effort has been spent on exact construction of ideal patterns. With the development of patterning techniques such as microstamping and lithography, patterns can be applied to a specimen pixel by pixel from a patterned image. In these cases, especially because the patterns are reused many times, an optimal pattern is sought such that error introduced into DIC from the pattern is minimized. DIC consists of tracking the motion of an array of nodes from a reference image to a deformed image. Every pixel in the images has an associated intensity (grayscale) value, with discretization depending on the bit depth of the image. Because individual pixel matching by intensity value yields a non-unique, scale-dependent problem, subsets around each node are used for identification. A correlation criterion is used to find the best match of a particular subset of a reference image within a deformed image. The reader is referred to the references for enumerations of typical correlation criteria. 
As illustrated by Schreier and Sutton and by Lu and Cary, systematic errors can be introduced by representing the underlying deformation with under-matched shape functions. An important implication, as discussed by Sutton et al., is that in the presence of highly localized deformations (e.g., crack fronts), error can be reduced by minimizing the subset size. In other words, smaller subsets allow more accurate resolution of localized deformations. Conversely, the choice of optimal subset size has been widely studied, and the general consensus is that larger subsets with more information content are less prone to random error. Thus, an optimal subset size balances the systematic error from under-matched deformations against the random error from measurement noise. The alternative approach pursued in the current work is to choose a small subset size and optimize the information content within it (i.e., to optimize an applied DIC pattern), rather than to find an optimal subset size. In the literature, many pattern quality metrics have been proposed, e.g., the sum of square of subset intensity gradients (SSSIG), mean subset fluctuation, gray level co-occurrence, autocorrelation-based metrics, and speckle-based metrics. The majority of these metrics were developed to quantify the quality of common pseudo-random patterns after they have been applied, and were not created with the intent of pattern generation. As such, it is found that none of the metrics examined in this study are fit to be the objective function of a pattern generation optimization. In some cases, such as with speckle-based metrics, application to pixel-by-pixel patterns is ill-conditioned and requires somewhat arbitrary extensions. In other cases, such as with the SSSIG, it is shown that trivial optima of the metric exist which are ill-suited for DIC (such as a checkerboard pattern). 
In the current work, a multi-metric optimization method is proposed whereby quality is viewed as a combination of individual quality metrics. Specifically, SSSIG and two auto-correlation metrics are used which have generally competitive objectives. Thus, each metric could be viewed as a constraint imposed upon the others, thereby precluding the achievement of their trivial solutions. In this way, optimization produces a pattern which balances the benefits of multiple quality metrics. The resulting pattern, along with randomly generated patterns, is subjected to numerical deformations and analyzed with DIC software. The optimal pattern is shown to outperform randomly generated patterns.
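
    Of the metrics listed, SSSIG is the simplest to state: the sum of squared intensity gradients over a subset, with larger values indicating more matchable information. A minimal sketch (pixel-unit gradients via numpy; the cited formulations differ in normalization), which also shows why a featureless subset scores zero:

```python
import numpy as np

def sssig(subset):
    """Sum of squared intensity gradients over a 2-D grayscale subset."""
    gy, gx = np.gradient(subset.astype(float))
    return float(np.sum(gx ** 2) + np.sum(gy ** 2))

rng = np.random.default_rng(0)
flat = np.full((21, 21), 128.0)                # featureless subset: no gradients
speckle = rng.integers(0, 256, size=(21, 21))  # random speckle subset
print(sssig(flat), sssig(flat) < sssig(speckle))  # -> 0.0 True
```

As the text notes, maximizing SSSIG alone admits trivial optima such as a checkerboard, which is what motivates combining it with competing autocorrelation metrics.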

  1. [Changes of monocyte and monocyte-platelet aggregates in different subgroups of thrombotic events in patients with acute myocardial infarction during PCI].

    PubMed

    Wang, Sheng; Sun, Cuifang; Liao, Wang; Wu, Zhongwei; Wang, Yudai; Huang, Xiuxian; Lu, Sijia; Dong, Xiaoli; Shuai, Fujie; Li, Bin

    2017-07-01

    Objective To investigate the impact of thrombotic events on the alterations of monocytes and monocyte-platelet aggregates (MPAs) in patients with acute myocardial infarction (AMI) during percutaneous coronary intervention (PCI). Methods Blood was collected before PCI for flow cytometry. Monocyte subsets and MPAs were detected with a four-color panel (CD14-APC, CD16-PE-Cy7, CD86-PE and CD41-Alexa Fluor 488). According to the expression of the platelet surface marker CD41, the numbers of monocyte subsets and MPAs were analyzed using the fluorescent microspheres of an absolute counting tube. The Wilcoxon rank sum test and receiver operating characteristic (ROC) curve analysis were performed. Results CD14+CD16++ monocytes in the intraprocedural thrombotic events (IPTE) group were significantly fewer than those in the non-IPTE group, and their percentage among total mononuclear cells decreased. Compared with the non-IPTE group, the MPA binding ratio and the monocyte-subset MPA binding ratios were significantly higher in the IPTE group. ROC analysis showed that the MPA binding ratio and the subset MPA binding ratios had good predictive value for IPTE in patients with AMI. Conclusion The CD14+CD16++ monocytes in the IPTE group were significantly fewer than those in the non-IPTE group. The MPA binding ratio and the MPA binding ratios of monocyte subsets were significantly higher in the IPTE group than in the non-IPTE group, so they have good predictive value for IPTE in patients with AMI.

  2. Peripheral Blood Lymphocyte Subset Counts in Pre-menopausal Women with Iron-Deficiency Anaemia

    PubMed Central

    Reza Keramati, Mohammad; Sadeghian, Mohammad Hadi; Ayatollahi, Hossein; Mahmoudi, Mahmoud; Khajedaluea, Mohammad; Tavasolian, Houman; Borzouei, Anahita

    2011-01-01

    Background: Iron-deficiency anaemia (IDA) is a major worldwide public health problem. Children and women of reproductive age are especially vulnerable to IDA, and it has been reported that these patients are more prone to infection. This study was done to evaluate alteration of lymphocyte subgroups in IDA. Methods: In this prospective study, we investigated lymphocyte subsets in pre-menopausal women with iron-deficiency anaemia; 50 normal subjects and 50 IDA (hypochromic microcytic) cases were enrolled. Experimental and control anticoagulated blood samples were evaluated using flow cytometry to determine the absolute and relative numbers of various lymphocyte subgroups. Finally, the results of the patient and control groups were compared. Results: Mean (SD) absolute counts of lymphocytes, CD3+ cells, CD3+/CD4+ subsets (T helper) and CD3+/CD8+ subsets (T cytotoxic) in the patient group were 2.08 (0.65) x 109/L, 1.53 (0.53) x 109/L, 0.87 (0.28) x 109/L, and 0.51 (0.24) x 109/L, respectively. The results showed significant differences between case and control groups in mean absolute counts of lymphocytes (P = 0.014), T lymphocytes (P = 0.009), helper T cells (P = 0.004), and cytotoxic T cells (P = 0.043). Conclusion: This study showed that absolute counts of peripheral blood T lymphocytes as a marker of cell-mediated immunity may be decreased in pre-menopausal women with iron-deficiency anaemia, and that these patients may be more prone to infection. PMID:22135572

  3. Human NK Cell Subset Functions Are Differentially Affected by Adipokines

    PubMed Central

    Huebner, Lena; Engeli, Stefan; Wrann, Christiane D.; Goudeva, Lilia; Laue, Tobias; Kielstein, Heike

    2013-01-01

    Background Obesity is a risk factor for various types of infectious diseases and cancer. The increase in adipose tissue causes alterations in both adipogenesis and the production of adipocyte-secreted proteins (adipokines). Since natural killer (NK) cells are the host’s primary defense against virus-infected and tumor cells, we investigated how adipocyte-conditioned medium (ACM) affects functions of two distinct human NK cell subsets. Methods Isolated human peripheral blood mononuclear cells (PBMCs) were cultured with various concentrations of human and murine ACM harvested on two different days during adipogenesis and analyzed by fluorescent-activated cell sorting (FACS). Results FACS analyses showed that the expression of tumor necrosis factor-related apoptosis-inducing ligand (TRAIL), granzyme A (GzmA) and interferon (IFN)-γ in NK cells was regulated in a subset-specific manner. ACM treatment altered IFN-γ expression in CD56dim NK cells. The production of GzmA in CD56bright NK cells was differentially affected by the distinct adipokine compositions harvested at different states of adipogenesis. Comparison of the treatment with either human or murine ACM revealed that adipokine-induced effects on NK cell expression of the leptin receptor (Ob-R), TRAIL and IFN-γ were species-specific. Conclusion Considering the growing prevalence of obesity and the various disorders related to it, the present study provides further insights into the roles human NK cell subsets play in the obesity-associated state of chronic low-grade inflammation. PMID:24098717

  4. GenoCore: A simple and fast algorithm for core subset selection from large genotype datasets.

    PubMed

    Jeong, Seongmun; Kim, Jae-Yoon; Jeong, Soon-Chun; Kang, Sung-Taeg; Moon, Jung-Kyung; Kim, Namshin

    2017-01-01

    Selecting core subsets from plant genotype datasets is important for enhancing cost-effectiveness and shortening the time required for analyses in genome-wide association studies (GWAS), genomics-assisted breeding of crop species, etc. Recently, large numbers of genetic markers (>100,000 single nucleotide polymorphisms) have been identified from high-density single nucleotide polymorphism (SNP) arrays and next-generation sequencing (NGS) data. However, no software has been available for picking out an efficient and consistent core subset from such a huge dataset. Software is therefore needed that can coherently extract the genetically important samples in a population. We here present a new program, GenoCore, which can quickly and efficiently find a core subset representing the entire population. We introduce simple measures of coverage and diversity scores, which reflect genotype errors and genetic variations, and can help to select samples rapidly and accurately for a crop genotype dataset. Comparisons of our method to other core collection software using example datasets were performed to validate its performance according to genetic distance, diversity, coverage, required system resources, and the number of selected samples. GenoCore selects the smallest, most consistent, and most representative core collection from all samples using less memory with more efficient scores, and shows greater genetic coverage compared with the other software tested. GenoCore was written in the R language, and can be accessed online with an example dataset and test results at https://github.com/lovemun/Genocore.
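
    The coverage idea can be illustrated with a greedy sketch: repeatedly add the sample contributing the most not-yet-covered marker alleles until a coverage target is met. This is only the generic greedy set-cover flavor of the problem; GenoCore's actual coverage and diversity scores differ:

```python
def greedy_core(genotypes, target=1.0):
    """genotypes: {sample: set of (marker, allele) pairs}. Returns a core
    subset of samples whose alleles reach the coverage target."""
    universe = set().union(*genotypes.values())
    covered, core = set(), []
    while len(covered) / len(universe) < target:
        # pick the sample adding the most not-yet-covered alleles
        best = max(genotypes, key=lambda s: len(genotypes[s] - covered))
        gained = genotypes[best] - covered
        if not gained:
            break  # no sample adds anything new
        core.append(best)
        covered |= gained
    return core

geno = {
    "s1": {(1, "A"), (2, "C"), (3, "G")},
    "s2": {(1, "A"), (2, "T")},
    "s3": {(2, "T"), (3, "T")},
}
print(greedy_core(geno))  # -> ['s1', 's3']
```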

  5. Identifying Depressed Older Adults in Primary Care: A Secondary Analysis of a Multisite Randomized Controlled Trial

    PubMed Central

    Voils, Corrine I.; Olsen, Maren K.; Williams, John W.; for the IMPACT Study Investigators

    2008-01-01

    Objective: To determine whether a subset of depressive symptoms could be identified to facilitate diagnosis of depression in older adults in primary care. Method: Secondary analysis was conducted on 898 participants aged 60 years or older with major depressive disorder and/or dysthymic disorder (according to DSM-IV criteria) who participated in the Improving Mood–Promoting Access to Collaborative Treatment (IMPACT) study, a multisite, randomized trial of collaborative care for depression (recruitment from July 1999 to August 2001). Linear regression was used to identify a core subset of depressive symptoms associated with decreased social, physical, and mental functioning. The sensitivity and specificity, adjusting for selection bias, were evaluated for these symptoms. The sensitivity and specificity of a second subset of 4 depressive symptoms previously validated in a midlife sample was also evaluated. Results: Psychomotor changes, fatigue, and suicidal ideation were associated with decreased functioning and served as the core set of symptoms. Adjusting for selection bias, the sensitivity of these 3 symptoms was 0.012 and specificity 0.994. The sensitivity of the 4 symptoms previously validated in a midlife sample was 0.019 and specificity was 0.997. Conclusion: We identified 3 depression symptoms that were highly specific for major depressive disorder in older adults. However, these symptoms and a previously identified subset were too insensitive for accurate diagnosis. Therefore, we recommend a full assessment of DSM-IV depression criteria for accurate diagnosis. PMID:18311416
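
    The reported trade-off follows directly from the standard definitions of sensitivity and specificity; the confusion-matrix counts below are illustrative values chosen to reproduce the reported rates of the 3-symptom core set, not the study's actual cell counts:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Standard definitions from a 2x2 confusion matrix."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative: 3 of 250 depressed cases flagged, 994 of 1000 controls cleared
sens, spec = sensitivity_specificity(tp=3, fn=247, tn=994, fp=6)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")  # 0.012 and 0.994
```

A screen this specific rarely mislabels healthy patients but, as the authors conclude, misses nearly all true cases, hence the recommendation for a full DSM-IV assessment.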

  6. A new approach to identify the sensitivity and importance of physical parameters combination within numerical models using the Lund-Potsdam-Jena (LPJ) model as an example

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2017-05-01

    An important source of uncertainty, which propagates into numerical simulations, resides in the parameters describing physical processes in numerical models. Therefore, identifying the subset of relatively more sensitive and important parameters among the numerous physical parameters of numerical models in the atmospheric and oceanic sciences, and reducing the errors in that subset, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach in China. The results imply that nonlinear interactions among parameters play a key role in the identification of sensitive parameters in arid and semi-arid regions of China compared to those in northern, northeastern, and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also makes it viable to then apply "target observations" to reduce the uncertainties in model parameters.

  7. Assessment of Different Strategies to Determine MAP-specific Cellular Immune Responses in Cattle

    USDA-ARS?s Scientific Manuscript database

    Assessment of cellular immunity in cattle against Mycobacterium avium ssp. paratuberculosis (MAP) by established methods remains unsatisfactory for diagnostic purposes. Recent studies conclude that analysis of T-cell subset responsiveness may improve diagnostic outcome. Aim of this study was to iden...

  8. Developmental Trajectories of Early Communication Skills

    ERIC Educational Resources Information Center

    Maatta, Sira; Laakso, Marja-Leena; Tolvanen, Asko; Ahonen, Timo; Aro, Tuija

    2012-01-01

    Purpose: This study focused on developmental trajectories of prelinguistic communication skills and their connections to later parent-reported language difficulties. Method: The participants represent a subset of a community-based sample of 508 children. Data include parent reports of prelinguistic communication skills at 12, 15, 18, and 21 months…

  9. Approximate Model Checking of PCTL Involving Unbounded Path Properties

    NASA Astrophysics Data System (ADS)

    Basu, Samik; Ghosh, Arka P.; He, Ru

    We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as PCTL formulas. Such approximate methods have been proposed primarily to deal with the state-space explosion that makes exact model checking by numerical methods practically infeasible for large systems. However, the existing statistical methods either consider a restricted subset of PCTL, specifically the subset that can only express bounded until properties, or rely on a user-specified finite bound on the sample path length. We propose a new method that does not have such restrictions and can be effectively used to reason about unbounded until properties. We approximate the probabilistic characteristics of an unbounded until property by those of a bounded until property for a suitably chosen value of the bound. In essence, our method is a two-phase process: (a) the first phase is concerned with identifying the bound k0; (b) the second phase computes the probability of satisfying the k0-bounded until property as an estimate for the probability of satisfying the corresponding unbounded until property. In both phases, it is sufficient to verify bounded until properties, which can be done effectively using existing statistical techniques. We prove the correctness of our technique and present its prototype implementations. We empirically show the practical applicability of our method by considering different case studies, including a simple infinite-state model and large finite-state models such as the IPv4 zeroconf protocol and the dining philosophers protocol modeled as discrete-time Markov chains.
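
    The two-phase idea can be sketched on a toy discrete-time Markov chain: phase (a) grows the bound until the Monte Carlo estimate of reaching the goal within k steps stops increasing (yielding k0), and phase (b) reuses that k0-bounded estimate for the unbounded until property. The chain, sample sizes, and tolerance below are invented for illustration and are far cruder than the paper's statistically grounded procedure:

```python
import random

def sample_reach(P, start, goal, k, runs=20000, seed=1):
    """Monte Carlo estimate of P(reach goal within k steps)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(runs):
        s = start
        for _ in range(k):
            if s == goal:
                break
            states, probs = zip(*P[s].items())
            s = rng.choices(states, probs)[0]
        hits += (s == goal)
    return hits / runs

def find_k0(P, start, goal, step=5, tol=0.005):
    """Phase (a): increase k until the bounded estimate stabilizes."""
    k, prev = 0, 0.0
    while True:
        k += step
        cur = sample_reach(P, start, goal, k)
        if cur - prev < tol:  # stabilized: reuse cur as the unbounded estimate
            return k, cur
        prev = cur

# Toy 3-state chain; state 2 is an absorbing goal reached with probability 1.
P = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.3, 2: 0.7}, 2: {2: 1.0}}
k0, est = find_k0(P, 0, 2)
print(f"k0 = {k0}, estimated P(true U goal) = {est:.3f}")
```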

  10. A Composite Algorithm for Mixed Integer Constrained Nonlinear Optimization.

    DTIC Science & Technology

    1980-01-01

    de Silva [14], and Weisman and Wood [76]. A particular direct search algorithm, the simplex method, has been cited for having the potential for...spaced discrete points on a line which makes the direction suitable for an efficient integer search technique based on Fibonacci numbers. Two...defined by a subset of variables. The complex algorithm is particularly well suited for this subspace search for two reasons. First, the complex method

  11. Investigating the Performance of Alternate Regression Weights by Studying All Possible Criteria in Regression Models with a Fixed Set of Predictors

    ERIC Educational Resources Information Center

    Waller, Niels; Jones, Jeff

    2011-01-01

    We describe methods for assessing all possible criteria (i.e., dependent variables) and subsets of criteria for regression models with a fixed set of predictors, x (where x is an n x 1 vector of independent variables). Our methods build upon the geometry of regression coefficients (hereafter called regression weights) in n-dimensional space. For a…

  12. Generalized Preconditioned Locally Harmonic Residual Eigensolver (GPLHR) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VECHARYNSKI, EUGENE; YANG, CHAO

    The software contains a MATLAB implementation of the Generalized Preconditioned Locally Harmonic Residual (GPLHR) method for solving standard and generalized non-Hermitian eigenproblems. The method is particularly useful for computing a subset of eigenvalues, and their eigen- or Schur vectors, closest to a given shift. The proposed method is based on block iterations and can take advantage of a preconditioner if it is available. It does not need to perform exact shift-and-invert transformation. Standard and generalized eigenproblems are handled in a unified framework.
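    The defining task above, extracting the subset of eigenpairs closest to a shift, can be illustrated with a dense reference computation. This sketch is not GPLHR itself (which iterates on large sparse problems without factorizing A - sigma*I); the matrix is a random placeholder:

```python
import numpy as np

def eigs_near_shift(A, sigma, k):
    """Return the k eigenpairs of A whose eigenvalues lie closest to sigma.
    Dense reference computation for small non-Hermitian A."""
    w, V = np.linalg.eig(A)
    idx = np.argsort(np.abs(w - sigma))[:k]   # distance to the shift
    return w[idx], V[:, idx]

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))               # small non-Hermitian test matrix
w, V = eigs_near_shift(A, sigma=0.0, k=2)

# Residual check: ||A v - lambda v|| should be tiny for each returned pair.
res = [np.linalg.norm(A @ V[:, i] - w[i] * V[:, i]) for i in range(2)]
```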

  13. Influence of Intracranial Electrode Density and Spatial Configuration on Interictal Spike Localization: A Case Study.

    PubMed

    Lie, Octavian V; Papanastassiou, Alexander M; Cavazos, José E; Szabó, Ákos C

    2015-10-01

    Poor seizure outcomes after epilepsy surgery often reflect an incorrect localization of the epileptic sources by standard intracranial EEG interpretation because of limited electrode coverage of the epileptogenic zone. This study investigates whether, in such conditions, source modeling is able to provide more accurate source localization than the standard clinical method that can be used prospectively to improve surgical resection planning. Suboptimal epileptogenic zone sampling is simulated by subsets of the electrode configuration used to record intracranial EEG in a patient rendered seizure-free after surgery. sLORETA and the clinical method solutions are applied to interictal spikes sampled with these electrode subsets and are compared for colocalization with the resection volume and displacement due to electrode downsampling. sLORETA often provides congruent, and at times more accurate, source localization compared with the standard clinical method. However, with electrode downsampling, individual sLORETA solution locations can vary considerably and shift consistently toward the remaining electrodes. sLORETA application can improve source localization based on the clinical method but does not reliably compensate for suboptimal electrode placement. Incorporating sLORETA solutions based on intracranial EEG in surgical planning should proceed cautiously in cases where electrode repositioning is planned on clinical grounds.

  14. Effectiveness and Adoption of a Drawing-to-Learn Study Tool for Recall and Problem Solving: Minute Sketches with Folded Lists

    PubMed Central

    Heideman, Paul D.; Flores, K. Adryan; Sevier, Lu M.; Trouton, Kelsey E.

    2017-01-01

    Drawing by learners can be an effective way to develop memory and generate visual models for higher-order skills in biology, but students are often reluctant to adopt drawing as a study method. We designed a nonclassroom intervention that instructed introductory biology college students in a drawing method, minute sketches in folded lists (MSFL), and allowed them to self-assess their recall and problem solving, first in a simple recall task involving non-European alphabets and later using unfamiliar biology content. In two preliminary ex situ experiments, students had greater recall on the simple learning task, non-European alphabets with associated phonetic sounds, using MSFL in comparison with a preferred method, visual review (VR). In the intervention, students studying using MSFL and VR had ∼50–80% greater recall of content studied with MSFL and, in a subset of trials, better performance on problem-solving tasks on biology content. Eight months after beginning the intervention, participants had shifted self-reported use of drawing from 2% to 20% of study time. For a small subset of participants, MSFL had become a preferred study method, and 70% of participants reported continued use of MSFL. This brief, low-cost intervention resulted in enduring changes in study behavior. PMID:28495932

  15. Prediction of lysine ubiquitylation with ensemble classifier and feature selection.

    PubMed

    Zhao, Xiaowei; Li, Xiangtao; Ma, Zhiqiang; Yin, Minghao

    2011-01-01

    Ubiquitylation is an important process of post-translational modification. Correct identification of protein lysine ubiquitylation sites is of fundamental importance to understand the molecular mechanism of lysine ubiquitylation in biological systems. This paper develops a novel computational method to effectively identify the lysine ubiquitylation sites based on the ensemble approach. In the proposed method, 468 ubiquitylation sites from 323 proteins retrieved from the Swiss-Prot database were encoded into feature vectors by using four kinds of protein sequences information. An effective feature selection method was then applied to extract informative feature subsets. After different feature subsets were obtained by setting different starting points in the search procedure, they were used to train multiple random forests classifiers and then aggregated into a consensus classifier by majority voting. Evaluated by jackknife tests and independent tests respectively, the accuracy of the proposed predictor reached 76.82% for the training dataset and 79.16% for the test dataset, indicating that this predictor is a useful tool to predict lysine ubiquitylation sites. Furthermore, site-specific feature analysis was performed and it was shown that ubiquitylation is intimately correlated with the features of its surrounding sites in addition to features derived from the lysine site itself. The feature selection method is available upon request.
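    The aggregation step of the method above, combining classifiers trained on different feature subsets into a consensus by majority voting, can be sketched compactly. The random forests are stubbed here as simple thresholding functions on hypothetical feature columns:

```python
import numpy as np

def consensus_predict(classifiers, X):
    """Majority vote over binary classifiers: predict 1 when more than half
    of the ensemble members vote 1."""
    votes = np.array([clf(X) for clf in classifiers])        # (n_clf, n_samples)
    return (votes.sum(axis=0) > len(classifiers) / 2).astype(int)

# Three stub classifiers, each thresholding one hypothetical feature column
# (stand-ins for random forests trained on different feature subsets).
clfs = [lambda X, j=j: (X[:, j] > 0).astype(int) for j in range(3)]

X = np.array([[ 1.0,  1.0, -1.0],
              [-1.0, -1.0,  1.0],
              [ 1.0, -1.0,  1.0]])
pred = consensus_predict(clfs, X)     # votes 2-1, 1-2, 2-1 across the rows
```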

  16. Usability-driven pruning of large ontologies: the case of SNOMED CT.

    PubMed

    López-García, Pablo; Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan

    2012-06-01

    To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Graph-traversal heuristics provided high coverage (71-96% of terms in the test sets of discharge summaries) at the expense of subset size (17-51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24-55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available.
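    A graph-traversal heuristic of the kind evaluated above can be sketched as an upward closure: starting from the signature concepts, follow is-a edges and keep every ancestor. The tiny hierarchy below is invented for illustration, not drawn from SNOMED CT:

```python
# Toy is-a hierarchy (child -> list of parents); purely illustrative.
parents = {
    "myocardial infarction": ["heart disease"],
    "heart disease": ["disorder"],
    "aspirin": ["drug"],
    "disorder": [],
    "drug": [],
}

def extract_subset(signature):
    """Upper-closure module: the signature plus all of its ancestors."""
    subset, stack = set(), list(signature)
    while stack:
        concept = stack.pop()
        if concept not in subset:
            subset.add(concept)
            stack.extend(parents.get(concept, []))   # walk up the hierarchy
    return subset

module = extract_subset({"myocardial infarction", "aspirin"})
```

On a real ontology this closure grows quickly as signatures spread across subhierarchies, which is exactly why the paper adds frequency-based filtering afterwards.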

  17. A brief measure of attitudes toward mixed methods research in psychology.

    PubMed

    Roberts, Lynne D; Povee, Kate

    2014-01-01

    The adoption of mixed methods research in psychology has trailed behind other social science disciplines. Teaching psychology students, academics, and practitioners about mixed methodologies may increase the use of mixed methods within the discipline. However, tailoring and evaluating education and training in mixed methodologies requires an understanding of, and way of measuring, attitudes toward mixed methods research in psychology. To date, no such measure exists. In this article we present the development and initial validation of a new measure: Attitudes toward Mixed Methods Research in Psychology. A pool of 42 items developed from previous qualitative research on attitudes toward mixed methods research along with validation measures was administered via an online survey to a convenience sample of 274 psychology students, academics and psychologists. Principal axis factoring with varimax rotation on a subset of the sample produced a four-factor, 12-item solution. Confirmatory factor analysis on a separate subset of the sample indicated that a higher-order four-factor model provided the best fit to the data. The four factors ('Limited Exposure,' '(in)Compatibility,' 'Validity,' and 'Tokenistic Qualitative Component') each have acceptable internal reliability. Known groups validity analyses based on preferred research orientation and self-rated mixed methods research skills, and convergent and divergent validity analyses based on measures of attitudes toward psychology as a science and scientist and practitioner orientation, provide initial validation of the measure. This brief, internally reliable measure can be used in assessing attitudes toward mixed methods research in psychology, measuring change in attitudes as part of the evaluation of mixed methods education, and in larger research programs.

  18. Hybrid Binary Imperialist Competition Algorithm and Tabu Search Approach for Feature Selection Using Gene Expression Data.

    PubMed

    Wang, Shuaiqun; Aorigele; Kong, Wei; Zeng, Weiming; Hong, Xiaomin

    2016-01-01

    Gene expression data composed of thousands of genes play an important role in classification platforms and disease diagnosis. Hence, it is vital to select a small subset of salient features over a large number of gene expression data. Lately, many researchers devote themselves to feature selection using diverse computational intelligence methods. However, in the progress of selecting informative genes, many computational methods face difficulties in selecting small subsets for cancer classification due to the huge number of genes (high dimension) compared to the small number of samples, noisy genes, and irrelevant genes. In this paper, we propose a new hybrid algorithm HICATS incorporating imperialist competition algorithm (ICA) which performs global search and tabu search (TS) that conducts fine-tuned search. In order to verify the performance of the proposed algorithm HICATS, we have tested it on 10 well-known benchmark gene expression classification datasets with dimensions varying from 2308 to 12600. The performance of our proposed method proved to be superior to other related works including the conventional version of binary optimization algorithm in terms of classification accuracy and the number of selected genes.
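    The local-search half of the hybrid above can be sketched as a tabu loop over binary feature masks. Only the tabu refinement is shown (the global ICA phase is omitted), on a synthetic objective in which features {1, 4, 7} are assumed informative and extra selections are penalized:

```python
import random

INFORMATIVE = {1, 4, 7}   # assumed "salient genes" for this toy objective

def score(mask):
    """Reward covering informative features, penalize subset size."""
    chosen = {i for i, b in enumerate(mask) if b}
    return 10 * len(chosen & INFORMATIVE) - len(chosen)

def tabu_search(n_feats=10, iters=100, tabu_len=5, seed=0):
    rng = random.Random(seed)
    mask = [rng.randint(0, 1) for _ in range(n_feats)]
    best, best_score, tabu = mask[:], score(mask), []
    for _ in range(iters):
        # best single-bit flip among moves not on the tabu list
        candidates = [i for i in range(n_feats) if i not in tabu]
        flip = max(candidates,
                   key=lambda i: score(mask[:i] + [1 - mask[i]] + mask[i + 1:]))
        mask[flip] = 1 - mask[flip]
        tabu = (tabu + [flip])[-tabu_len:]    # forbid immediate reversal
        if score(mask) > best_score:
            best, best_score = mask[:], score(mask)
    return best, best_score

best_mask, best_val = tabu_search()   # recovers a small informative subset
```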

  19. Hybrid Binary Imperialist Competition Algorithm and Tabu Search Approach for Feature Selection Using Gene Expression Data

    PubMed Central

    Aorigele; Zeng, Weiming; Hong, Xiaomin

    2016-01-01

    Gene expression data composed of thousands of genes play an important role in classification platforms and disease diagnosis. Hence, it is vital to select a small subset of salient features over a large number of gene expression data. Lately, many researchers devote themselves to feature selection using diverse computational intelligence methods. However, in the progress of selecting informative genes, many computational methods face difficulties in selecting small subsets for cancer classification due to the huge number of genes (high dimension) compared to the small number of samples, noisy genes, and irrelevant genes. In this paper, we propose a new hybrid algorithm HICATS incorporating imperialist competition algorithm (ICA) which performs global search and tabu search (TS) that conducts fine-tuned search. In order to verify the performance of the proposed algorithm HICATS, we have tested it on 10 well-known benchmark gene expression classification datasets with dimensions varying from 2308 to 12600. The performance of our proposed method proved to be superior to other related works including the conventional version of binary optimization algorithm in terms of classification accuracy and the number of selected genes. PMID:27579323

  20. CyClus: a fast, comprehensive cylindrical interface approximation clustering/reranking method for rigid-body protein-protein docking decoys.

    PubMed

    Omori, Satoshi; Kitao, Akio

    2013-06-01

    We propose a fast clustering and reranking method, CyClus, for protein-protein docking decoys. This method enables comprehensive clustering of whole decoys generated by rigid-body docking using cylindrical approximation of the protein-protein interface and hierarchical clustering procedures. We demonstrate the clustering and reranking of 54,000 decoy structures generated by ZDOCK for each complex within a few minutes. After parameter tuning for the test set in ZDOCK benchmark 2.0 with the ZDOCK and ZRANK scoring functions, blind tests for the incremental data in ZDOCK benchmark 3.0 and 4.0 were conducted. CyClus successfully generated smaller subsets of decoys containing near-native decoys. For example, the number of decoys required to create subsets containing near-native decoys with 80% probability was reduced to 22-50% of the number required in the original ZDOCK. Although specific ZDOCK and ZRANK results were demonstrated, the CyClus algorithm was designed to be more general and can be applied to a wide range of decoys and scoring functions by adjusting just two parameters, p and T. CyClus results were also compared to those from ClusPro. Copyright © 2013 Wiley Periodicals, Inc.

  1. Application of Sequential Quadratic Programming to Minimize Smart Active Flap Rotor Hub Loads

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Leyland, Jane

    2014-01-01

    In an analytical study, SMART active flap rotor hub loads have been minimized using nonlinear programming constrained optimization methodology. The recently developed NLPQLP system (Schittkowski, 2010) that employs Sequential Quadratic Programming (SQP) as its core algorithm was embedded into a driver code (NLP10x10) specifically designed to minimize active flap rotor hub loads (Leyland, 2014). Three types of practical constraints on the flap deflections have been considered. To validate the current application, two other optimization methods have been used: i) the standard, linear unconstrained method, and ii) the nonlinear Generalized Reduced Gradient (GRG) method with constraints. The new software code NLP10x10 has been systematically checked out. It has been verified that NLP10x10 is functioning as desired. The following are briefly covered in this paper: relevant optimization theory; implementation of the capability of minimizing a metric of all, or a subset, of the hub loads as well as the capability of using all, or a subset, of the flap harmonics; and finally, solutions for the SMART rotor. The eventual goal is to implement NLP10x10 in a real-time wind tunnel environment.
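    The core task above is minimizing a hub-load metric subject to limits on the flap inputs. As a dependency-free stand-in for the SQP solver NLPQLP, this sketch minimizes a quadratic load metric under bound constraints with projected gradient descent; the sensitivity matrix, baseline loads, and deflection limits are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))   # hypothetical hub-load sensitivities to 4 flap inputs
b = rng.standard_normal(6)        # hypothetical baseline hub loads to cancel
lo, hi = -0.5, 0.5                # flap-deflection bounds (assumed)

# Minimize 0.5*||A x - b||^2 subject to lo <= x <= hi.
x = np.zeros(4)
step = 1.0 / np.linalg.norm(A.T @ A, 2)       # 1/L step keeps descent monotone
for _ in range(500):
    grad = A.T @ (A @ x - b)                  # gradient of the load metric
    x = np.clip(x - step * grad, lo, hi)      # gradient step, then projection

residual = np.linalg.norm(A @ x - b)          # remaining hub load
```

An SQP method reaches the same kind of constrained optimum far faster on nonlinear problems; the projection step here plays the role of the deflection constraints in the paper.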

  2. Analysis of Xq27-28 linkage in the international consortium for prostate cancer genetics (ICPCG) families

    PubMed Central

    2012-01-01

    Background Genetic variants are likely to contribute to a portion of prostate cancer risk. Full elucidation of the genetic etiology of prostate cancer is difficult because of incomplete penetrance and genetic and phenotypic heterogeneity. Current evidence suggests that genetic linkage to prostate cancer has been found on several chromosomes including the X; however, identification of causative genes has been elusive. Methods Parametric and non-parametric linkage analyses were performed using 26 microsatellite markers in each of 11 groups of multiple-case prostate cancer families from the International Consortium for Prostate Cancer Genetics (ICPCG). Meta-analyses of the resultant family-specific linkage statistics across the entire 1,323 families and in several predefined subsets were then performed. Results Meta-analyses of linkage statistics resulted in a maximum parametric heterogeneity lod score (HLOD) of 1.28, and an allele-sharing lod score (LOD) of 2.0 in favor of linkage to Xq27-q28 at 138 cM. In subset analyses, families with average age at onset less than 65 years exhibited a maximum HLOD of 1.8 (at 138 cM) versus a maximum regional HLOD of only 0.32 in families with average age at onset of 65 years or older. Surprisingly, the subset of families with only 2–3 affected men and some evidence of male-to-male transmission of prostate cancer gave the strongest evidence of linkage to the region (HLOD = 3.24, 134 cM). For this subset, the HLOD was slightly increased (HLOD = 3.47 at 134 cM) when families used in the original published report of linkage to Xq27-28 were excluded. Conclusions Although there was not strong support for linkage to the Xq27-28 region in the complete set of families, the subset of families with earlier age at onset exhibited more evidence of linkage than families with later onset of disease. A subset of families with 2–3 affected individuals and with some evidence of male to male disease transmission showed stronger linkage signals. 
Our results suggest that the genetic basis for prostate cancer in our families is much more complex than a single susceptibility locus on the X chromosome, and that future explorations of the Xq27-28 region should focus on the subset of families identified here with the strongest evidence of linkage to this region. PMID:22712434

  3. SVM-based feature extraction and classification of aflatoxin contaminated corn using fluorescence hyperspectral data

    USDA-ARS?s Scientific Manuscript database

    Support Vector Machine (SVM) was used in the Genetic Algorithms (GA) process to select and classify a subset of hyperspectral image bands. The method was applied to fluorescence hyperspectral data for the detection of aflatoxin contamination in Aspergillus flavus infected single corn kernels. In the...

  4. An Assessment of the Dimensionality of SAT-Mathematical.

    ERIC Educational Resources Information Center

    Lawrence, Ida M.; Dorans, Neil J.

    Six editions of Scholastic Aptitude Test-Mathematical (SAT-M) were factor analyzed using confirmatory and exploratory methods. Confirmatory factor analyses (using the LISREL VI program) were conducted on correlation matrices among item parcels--sums of scores on a small subset of items. Item parcels were constructed to yield correlation matrices…

  5. Teacher Acquisition of Functional Analysis Methods Using Pyramidal Training

    ERIC Educational Resources Information Center

    Pence, Sacha T.; St. Peter, Claire C.; Giles, Aimee F.

    2014-01-01

    Pyramidal training involves an experienced professional training a subset of individuals who, in turn, train additional individuals. Pyramidal training is effective for training a variety of behavior-analytic skills with direct-care staff, parents, and teachers. As teachers' roles in behavioral assessment increase, pyramidal training may be…

  6. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio [Richland, WA; Calapristi, Augustin J [West Richland, WA; Crow, Vernon L [Richland, WA; Hetzler, Elizabeth G [Kennewick, WA; Turner, Alan E [Kennewick, WA

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.

  7. Parasite-Antigen Driven Expansion of IL-5− and IL-5+ Th2 Human Subpopulations in Lymphatic Filariasis and Their Differential Dependence on IL-10 and TGFβ

    PubMed Central

    Anuradha, Rajamanickam; George, Parakkal Jovvian; Hanna, Luke E.; Chandrasekaran, Vedachalam; Kumaran, P. Paul; Nutman, Thomas B.; Babu, Subash

    2014-01-01

    Background Two different Th2 subsets have been defined recently on the basis of IL-5 expression – an IL-5+Th2 subset and an IL-5−Th2 subset in the setting of allergy. However, the role of these newly described CD4+ T cell subpopulations has not been explored in other contexts. Methods To study the role of the Th2 subpopulation in a chronic, tissue invasive parasitic infection (lymphatic filariasis), we examined the frequency of IL-5+IL-4+IL-13+ CD4+ T cells and IL-5−IL-4+IL-13+ CD4+ T cells in asymptomatic, infected individuals (INF) and compared them to frequencies (Fo) in filarial-uninfected (UN) individuals and to those with filarial lymphedema (CP). Results INF individuals exhibited a significant increase in the spontaneously expressed and antigen-induced Fo of both Th2 subpopulations compared to the UN and CP. Interestingly, there was a positive correlation between the Fo of IL-5+Th2 cells and the absolute eosinophil and neutrophil counts; in addition there was a positive correlation between the frequency of the CD4+IL-5−Th2 subpopulation and the levels of parasite antigen-specific IgE and IgG4 in INF individuals. Moreover, blockade of IL-10 and/or TGFβ demonstrated that each of these 2 regulatory cytokines exert opposite effects on the different Th2 subsets. Finally, in those INF individuals cured of infection by anti-filarial therapy, there was a significantly decreased Fo of both Th2 subsets. Conclusions Our findings suggest that both IL-5+ and IL-5−Th2 cells play an important role in the regulation of immune responses in filarial infection and that these two Th2 subpopulations may be regulated by different cytokine-receptor mediated processes. PMID:24498448

  8. A new model for approximating RNA folding trajectories and population kinetics

    NASA Astrophysics Data System (ADS)

    Kirkpatrick, Bonnie; Hajiaghayi, Monir; Condon, Anne

    2013-01-01

    RNA participates both in functional aspects of the cell and in gene regulation. The interactions of these molecules are mediated by their secondary structure which can be viewed as a planar circle graph with arcs for all the chemical bonds between pairs of bases in the RNA sequence. The problem of predicting RNA secondary structure, specifically the chemically most probable structure, has many useful and efficient algorithms. This leaves RNA folding, the problem of predicting the dynamic behavior of RNA structure over time, as the main open problem. RNA folding is important for functional understanding because some RNA molecules change secondary structure in response to interactions with the environment. The full RNA folding model on at most O(3^n) secondary structures is the gold standard. We present a new subset approximation model for the full model, give methods to analyze its accuracy and discuss the relative merits of our model as compared with a pre-existing subset approximation. The main advantage of our model is that it generates Monte Carlo folding pathways with the same probabilities with which they are generated under the full model. The pre-existing subset approximation does not have this property.
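    Monte Carlo folding pathways of the kind discussed above are typically sampled with the Gillespie algorithm over a continuous-time Markov chain of structures. A sketch on an invented three-state landscape (states, energies, and the Metropolis rate rule are all assumptions, not the paper's model):

```python
import math
import random

# Toy folding landscape: states stand in for secondary structures.
energy = {"unfolded": 0.0, "intermediate": -1.0, "native": -3.0}
neighbors = {"unfolded": ["intermediate"],
             "intermediate": ["unfolded", "native"],
             "native": ["intermediate"]}

def rate(s, t, beta=1.0):
    """Metropolis transition rate from state s to neighbor t."""
    return min(1.0, math.exp(-beta * (energy[t] - energy[s])))

def sample_pathway(start="unfolded", stop="native", seed=0):
    """Gillespie sampling: draw an exponential waiting time from the total
    exit rate, then pick the next state with probability proportional to
    its rate. Returns the pathway and its folding time."""
    rng = random.Random(seed)
    state, t, path = start, 0.0, [start]
    while state != stop:
        rates = [rate(state, n) for n in neighbors[state]]
        t += rng.expovariate(sum(rates))
        state = rng.choices(neighbors[state], weights=rates)[0]
        path.append(state)
    return path, t

path, t_fold = sample_pathway()
```

Generating pathways with the same probabilities as the full model, as the proposed subset approximation does, means such sampled trajectories remain statistically faithful even though most structures are never enumerated.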

  9. Predicting ovarian malignancy: application of artificial neural networks to transvaginal and color Doppler flow US.

    PubMed

    Biagiotti, R; Desii, C; Vanzi, E; Gacci, G

    1999-02-01

    To compare the performance of artificial neural networks (ANNs) with that of multiple logistic regression (MLR) models for predicting ovarian malignancy in patients with adnexal masses by using transvaginal B-mode and color Doppler flow ultrasonography (US). A total of 226 adnexal masses were examined before surgery: Fifty-one were malignant and 175 were benign. The data were divided into training and testing subsets by using a "leave-n-out" method. The training subsets were used to compute the optimum MLR equations and to train the ANNs. The cross-validation subsets were used to estimate the performance of each of the two models in predicting ovarian malignancy. In testing, three-layer back-propagation networks, based on the same input variables selected by using MLR (i.e., women's ages, papillary projections, random echogenicity, peak systolic velocity, and resistance index), had a significantly higher sensitivity than did MLR (96% vs 84%; McNemar test, P = .04). The Brier scores for the ANNs were significantly lower than those calculated for MLR (Student t test for paired samples, P = .004). ANNs might have potential for categorizing adnexal masses as either malignant or benign on the basis of multiple variables related to demographic and US features.
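    The Brier score used to compare the two models above is simply the mean squared gap between predicted probabilities and the 0/1 outcomes, with lower values indicating better-calibrated predictions. A sketch with illustrative numbers (not the study's data):

```python
def brier(probs, outcomes):
    """Mean squared difference between predicted probabilities and
    binary outcomes; lower is better."""
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

ann_probs = [0.9, 0.2, 0.8, 0.1]   # hypothetical ANN malignancy probabilities
mlr_probs = [0.7, 0.4, 0.6, 0.3]   # hypothetical MLR malignancy probabilities
truth     = [1, 0, 1, 0]           # 1 = malignant at surgery

ann_score = brier(ann_probs, truth)   # sharper predictions, lower score
mlr_score = brier(mlr_probs, truth)
```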

  10. Using existing case-mix methods to fund trauma cases.

    PubMed

    Monakova, Julia; Blais, Irene; Botz, Charles; Chechulin, Yuriy; Picciano, Gino; Basinski, Antoni

    2010-01-01

    Policymakers frequently face the need to increase funding in isolated and frequently heterogeneous (clinically and in terms of resource consumption) patient subpopulations. This article presents a methodologic solution for testing the appropriateness of using existing grouping and weighting methodologies for funding subsets of patients in the scenario where a case-mix approach is preferable to a flat-rate based payment system. Using as an example the subpopulation of trauma cases of Ontario lead trauma hospitals, the statistical techniques of linear and nonlinear regression models, regression trees, and spline models were applied to examine the fit of the existing case-mix groups and reference weights for the trauma cases. The analyses demonstrated that for funding Ontario trauma cases, the existing case-mix systems can form the basis for rational and equitable hospital funding, decreasing the need to develop a different grouper for this subset of patients. This study confirmed that Injury Severity Score is a poor predictor of costs for trauma patients. Although our analysis used the Canadian case-mix classification system and cost weights, the demonstrated concept of using existing case-mix systems to develop funding rates for specific subsets of patient populations may be applicable internationally.

  11. Deep Venous Thrombosis: An Interventionalist's Approach

    PubMed Central

    Jenkins, J. Stephen; Michael, Paul

    2014-01-01

    Background Deep venous thrombosis (DVT) of the lower extremity has traditionally been anatomically categorized into proximal DVT (thrombosis involving the popliteal vein and above) and distal DVT (isolated calf vein thrombosis). Proximal DVT involving the common femoral and/or iliac veins, referred to as iliofemoral DVT (IFDVT), represents a disease process with a worse prognosis and higher risk for poor clinical outcomes compared to proximal DVT not involving the common femoral or iliac draining veins. Methods This review discusses therapeutic options for treatment of lower extremity IFDVT, including adjuvant anticoagulation and catheter-based invasive therapies; literature supporting current acute interventional techniques; and the recommendations from the recently published American Heart Association guidelines. Results Patients with IFDVT represent an opportune subset of patients for acute interventional management with currently available techniques. This subset of patients with proximal DVT has a worse prognosis, is less well studied, and benefits more from acute intervention compared to patients with proximal DVT or distal DVT. Conclusion Invasive catheter-based therapies that remove thrombus and correct venous outflow obstructions improve outcomes and morbidity in patients with IFDVT. Future trials that address IFDVT specifically will improve our understanding and the proper management of this higher-risk subset of patients with DVT. PMID:25598728

  12. The ASTRAL Compendium in 2004

    DOE R&D Accomplishments Database

    Chandonia, John-Marc; Hon, Gary; Walker, Nigel S.; Lo Conte, Loredana; Koehl, Patrice; Levitt, Michael; Brenner, Steven E.

    2003-09-15

    The ASTRAL compendium provides several databases and tools to aid in the analysis of protein structures, particularly through the use of their sequences. Partially derived from the SCOP database of protein structure domains, it includes sequences for each domain and other resources useful for studying these sequences and domain structures. The current release of ASTRAL contains 54,745 domains, more than three times as many as the initial release four years ago. ASTRAL has undergone major transformations in the past two years. In addition to several complete updates each year, ASTRAL is now updated on a weekly basis with preliminary classifications of domains from newly released PDB structures. These classifications are available as a stand-alone database, as well as available integrated into other ASTRAL databases such as representative subsets. To enhance the utility of ASTRAL to structural biologists, all SCOP domains are now made available as PDB-style coordinate files as well as sequences. In addition to sequences and representative subsets based on SCOP domains, sequences and subsets based on PDB chains are newly included in ASTRAL. Several search tools have been added to ASTRAL to facilitate retrieval of data by individual users and automated methods.

  13. A Parameter Subset Selection Algorithm for Mixed-Effects Models

    DOE PAGES

    Schmidt, Kathleen L.; Smith, Ralph C.

    2016-01-01

    Mixed-effects models are commonly used to statistically model phenomena that include attributes associated with a population or general underlying mechanism as well as effects specific to individuals or components of the general mechanism. This can include individual effects associated with data from multiple experiments. However, the parameterizations used to incorporate the population and individual effects are often unidentifiable in the sense that parameters are not uniquely specified by the data. As a result, the current literature focuses on model selection, by which insensitive parameters are fixed or removed from the model. Model selection methods that employ information criteria are applicable to both linear and nonlinear mixed-effects models, but such techniques are limited in that they are computationally prohibitive for large problems due to the number of possible models that must be tested. To limit the scope of possible models for model selection via information criteria, we introduce a parameter subset selection (PSS) algorithm for mixed-effects models, which orders the parameters by their significance. Finally, we provide examples to verify the effectiveness of the PSS algorithm and to test the performance of mixed-effects model selection that makes use of parameter subset selection.
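    Ordering parameters by significance, as the PSS algorithm above does, is commonly realized by greedy column pivoting on a sensitivity matrix. The following sketch (an assumed implementation, not the paper's algorithm) selects the most informative remaining parameter at each step and deflates its direction from the rest; the synthetic matrix makes parameter 2 a near-copy of parameter 0, so one of the pair lands last in the ordering:

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((50, 3))                     # rows: observations, cols: parameters
S[:, 2] = S[:, 0] + 1e-6 * rng.standard_normal(50)   # nearly unidentifiable pair

def pss_order(S):
    """Greedy column-pivoting order: most identifiable parameter first."""
    S = S.astype(float).copy()
    order, remaining = [], list(range(S.shape[1]))
    while remaining:
        j = max(remaining, key=lambda c: np.linalg.norm(S[:, c]))
        order.append(j)
        remaining.remove(j)
        q = S[:, j] / np.linalg.norm(S[:, j])
        for c in remaining:                          # deflate the chosen direction
            S[:, c] -= (q @ S[:, c]) * q
    return order

order = pss_order(S)   # the last entry is the least identifiable parameter
```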

  14. Surface deformation time-series analysis at Ischia Island (South Italy) carried out via multi-platform monitoring systems

    NASA Astrophysics Data System (ADS)

    Manzo, Mariarosaria; Del Gaudio, Carlo; De Martino, Prospero; Ricco, Ciro; Tammaro, Umberto; Castaldo, Raffaele; Tizzani, Pietro; Lanari, Riccardo

    2014-05-01

    Ischia Island, located at the North-Western corner of the Gulf of Napoli (South Italy), is a volcanic area whose state of activity is attested by eruptions (the last one occurred in 1302), earthquakes (the most disastrous in 1881 and 1883), hydrothermal manifestations and ground deformation. In this work we present the state of the art of the Ischia Island ground deformation phenomena through the joint analysis of data collected via different monitoring methodologies (leveling, GPS, and Differential SAR Interferometry) during the last twenty years. In particular, our analysis benefits from the large amount of periodic and continuous geodetic measurements collected by the 257 leveling benchmarks and the 20 (17 campaign and 3 permanent) GPS stations deployed on the island. Moreover, it takes advantage of the large archives of C-band SAR data (about 300 ascending and descending ERS-1/2 and ENVISAT images) acquired over the island since 1992 and of the advanced Differential SAR Interferometry (DInSAR) technique referred to as the Small BAseline Subset (SBAS) approach. The latter provides space-time information on the ground displacements measured along the radar line of sight (LOS) and, thanks to the availability of multi-orbit SAR data, permits discrimination of the vertical and east-west components of the detected displacements. Our integrated analysis reveals a complex deformation scenario; in particular, it identifies a spatially extended subsidence pattern, which increases with elevation, with no evidence of any uplift phenomena. This broad effect involves the Northern, Eastern, Southern and South-Western sectors of the island, where we measure velocity values not exceeding -6 mm/year; moreover, we identify a more localized phenomenon affecting the North-Western area in correspondence with the Fango zone, where velocity values up to -10 mm/year are retrieved.
    In addition, our study shows a migration of the Eastern sector of the island towards the West, with velocity values of -1/-2 mm/year. Conversely, the behaviour of the central and South-Western areas is less clear; indeed, while the GPS velocity vectors are primarily Northward directed, the DInSAR measurements reveal a migration of these sectors towards the East; in both cases we measure deformation velocity values of a few mm/year. This discrepancy is very likely related to the fact that the northward deformation component does not contribute to the measured LOS displacement due to the nearly polar characteristics of the radar sensor orbits. The performed integrated time-series analysis can significantly contribute to the comprehension of the volcanic island dynamics, especially in the case of long-term observations that promote the investigation, modelling and interpretation of the physical processes behind the deformation phenomena at different temporal and spatial scales.

  15. Analysis of recent surface deformation at Ischia Island Volcano (South Italy) via multi-platform monitoring systems

    NASA Astrophysics Data System (ADS)

    Manzo, Mariarosaria; De Martino, Prospero; Castaldo, Raffaele; De Luca, Claudio; Dolce, Mario; Scarpato, Giovanni; Tizzani, Pietro; Zinno, Ivana; Lanari, Riccardo

    2017-04-01

    Ischia Island is a densely populated volcanic area located in the North-Western sector of the Gulf of Napoli (South Italy), whose activity is characterized by eruptions (the last one occurred in 1302 A.D.), earthquakes (the most disastrous ones occurred in 1881 and in 1883), fumarolic-hydrothermal manifestations and ground deformation. In this work we carry out a time-series analysis of the surface deformation occurring at the Island by jointly exploiting data collected via two different monitoring systems. In particular, we take advantage of the large amount of periodic and continuous geodetic measurements collected by the GPS (campaign and permanent) stations deployed on the Island and belonging to the INGV-OV monitoring network. Moreover, we benefit from the large, free and open archive of C-band SAR data acquired over the Island by the Sentinel-1 constellation of the Copernicus Program, and processed via the advanced Differential SAR Interferometry (DInSAR) technique referred to as the Small BAseline Subset (SBAS) algorithm [Berardino et al., 2002]. We focus on the 2014-2017 time period to analyze the recent surface deformation phenomena occurring on the Island, thus extending a previous study, aimed at investigating the temporal evolution of the ground displacements affecting the Island and limited to the 1992-2003 time interval [Manzo et al., 2006]. The performed integrated analysis provides relevant spatial and temporal information on the Island surface deformation pattern. In particular, it reveals a rather complex deformation scenario, where localized phenomena overlap/interact with a spatially extended deformation pattern that involves many Island sectors, with no evidence of significant uplift phenomena.
Moreover, it shows a good agreement and consistency between the different kinds of data, thus providing a clear picture of the recent dynamics at Ischia Island that can be profitably exploited to deeply investigate the physical processes behind the observed deformation phenomena. Acknowledgments This work is partially supported by the IREA-CNR/Italian Department of Civil Protection agreement and the I-AMICA project (Infrastructure of High Technology for Environmental and Climate Monitoring-PONa3_00363). References Berardino, P., G. Fornaro, R. Lanari, and E. Sansosti (2002), A new algorithm for surface deformation monitoring based on small baseline differential SAR interferograms, IEEE Trans. Geosci. Remote Sens., 40, 2375-2383, doi:10.1109/TGRS.2002.803792. Manzo, M., G. P. Ricciardi, F. Casu, G. Ventura, G. Zeni, S. Borgström, P. Berardino, C. Del Gaudio, and R. Lanari (2006), Surface deformation analysis in the Ischia Island (Italy) based on spaceborne radar interferometry, Journal of Volcanology and Geothermal Research, 151, 399-416, doi:10.1016/j.jvolgeores.2005.09.010.

  16. Clinical relevance and suppressive capacity of human MDSC subsets.

    PubMed

    Lang, Stephan; Bruderek, Kirsten; Kaspar, Cordelia; Höing, Benedikt; Kanaan, Oliver; Dominas, Nina; Hussain, Timon; Droege, Freya; Eyth, Christian Peter; Hadaschik, Boris; Brandau, Sven

    2018-06-18

    Myeloid-derived suppressor cells (MDSC) are a heterogeneous group of pathologically expanded myeloid cells with immunosuppressive activity. In human disease three major MDSC subpopulations can be defined as monocytic M-MDSC, granulocytic PMN-MDSC and early stage e-MDSC, which lack myeloid lineage markers of the former two subsets. It was the purpose of this study to determine and compare the immunosuppressive capacity and clinical relevance of each of these subsets in patients with solid cancer. The frequency of MDSC subsets in the peripheral blood was determined by flow cytometry in a cohort of 49 patients with advanced head and neck cancer (HNC) and 22 patients with urological cancers. Sorted and purified MDSC subsets were tested in vitro for their T cell suppressive capacity. Frequency of circulating MDSC was correlated with overall survival of HNC patients. A high frequency of PMN-MDSC most strongly correlated with poor overall survival in HNC. T cell suppressive activity was higher in PMN-MDSC compared with M-MDSC and e-MDSC. A subset of CD66b+/CD11b+/CD16+ mature PMN-MDSC displayed high expression and activity of arginase I, and was superior to the other subsets in suppressing proliferation and cytokine production of T cells in both cancer types. High levels of this CD11b+/CD16+ PMN-MDSC, but not other PMN-MDSC subsets, strongly correlated with adverse outcome in HNC. A subset of mature CD11b+/CD16+ PMN-MDSC was identified as the MDSC subset with the strongest immunosuppressive activity and the highest clinical relevance. Copyright ©2018, American Association for Cancer Research.

  17. A new mosaic method for three-dimensional surface

    NASA Astrophysics Data System (ADS)

    Yuan, Yun; Zhu, Zhaokun; Ding, Yongjun

    2011-08-01

    Three-dimensional (3-D) data mosaicking is an indispensable step in surface measurement and digital terrain map generation. To address the problem of mosaicking local unorganized point clouds with rough registration and many mismatched points, a new mosaic method for 3-D surfaces based on RANSAC is proposed. Each iteration of the method proceeds sequentially through random sampling with an additional shape constraint, data normalization of the point cloud, absolute orientation, data denormalization of the point cloud, inlier counting, etc. After N random sample trials the largest consensus set is selected, and finally the model is re-estimated using all the points in the selected subset. The minimal subset is composed of three non-collinear points which form a triangle. The shape of the triangle is considered during random sample selection in order to make the selection reasonable. A new coordinate system transformation algorithm presented in this paper is used to avoid the singularity. The full rotation between the two coordinate systems is solved as two successive rotations expressed by Euler angle vectors, each with an explicit physical meaning. Both simulation and real data are used to prove the correctness and validity of this mosaic method. The method has better noise immunity due to its robust estimation property, and high accuracy, as the shape constraint is added to the random sampling and data normalization to the absolute orientation. This method is applicable to high-precision measurement of three-dimensional surfaces and to 3-D terrain mosaicking.
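
    The sample/score/re-estimate loop described above can be sketched in miniature. The paper's minimal subset is three non-collinear points solving a 3-D absolute orientation; for brevity this toy version fits only a 2-D translation between two point sets, but the RANSAC skeleton (random minimal sample, consensus count, final refit on the largest inlier set) is the same. All data are made up.

```python
import random

# Simplified RANSAC sketch: fit a 2-D translation between two point
# clouds and keep the largest consensus set, illustrating the
# sample/score/refit loop used by the mosaic method.
def ransac_translation(src, dst, trials=200, tol=0.5, seed=0):
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(trials):
        i = rng.randrange(len(src))              # minimal sample: one pair
        tx = dst[i][0] - src[i][0]
        ty = dst[i][1] - src[i][1]
        inliers = [
            j for j, (p, q) in enumerate(zip(src, dst))
            if abs(q[0] - p[0] - tx) < tol and abs(q[1] - p[1] - ty) < tol
        ]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # re-estimate using all points of the largest consensus set
    tx = sum(dst[j][0] - src[j][0] for j in best_inliers) / len(best_inliers)
    ty = sum(dst[j][1] - src[j][1] for j in best_inliers) / len(best_inliers)
    return (tx, ty), best_inliers

src = [(0, 0), (1, 0), (0, 1), (5, 5)]
dst = [(2, 3), (3, 3), (2, 4), (9, 9)]           # last pair is a mismatch
(tx, ty), inliers = ransac_translation(src, dst)
```

    The mismatched pair is excluded from the consensus set, so the final refit is unaffected by it; the shape constraint and normalization steps of the full method serve to make this refit numerically stable in 3-D.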

  18. Differential expression of CD44 and CD24 markers discriminates the epithelioid from the fibroblastoid subset in a sarcomatoid renal carcinoma cell line: evidence suggesting the existence of cancer stem cells in both subsets as studied with sorted cells.

    PubMed

    Hsieh, Chin-Hsuan; Hsiung, Shih-Chieh; Yeh, Chi-Tai; Yen, Chih-Feng; Chou, Yah-Huei Wu; Lei, Wei-Yi; Pang, See-Tong; Chuang, Cheng-Keng; Liao, Shuen-Kuei

    2017-02-28

    Epithelioid and fibroblastoid subsets coexist in the human sarcomatoid renal cell carcinoma (sRCC) cell line, RCC52, according to previous clonal studies. Herein, using monoclonal antibodies to CD44 and CD24 markers, we identified and isolated these two populations, and showed that CD44bright/CD24dim and CD44bright/CD24bright phenotypes correspond to epithelioid and fibroblastoid subsets, respectively. Both sorted subsets displayed different levels of tumorigenicity in xenotransplantation, indicating that each harbored its own cancer stem cells (CSCs). The CD44bright/CD24bright subset, associated with higher expression of MMP-7, -8 and TIMP-1 transcripts, showed greater migratory/invasive potential than the CD44bright/CD24dim subset, which was associated with higher expression of MMP-2, -9 and TIMP-2 transcripts. Both subsets differentially expressed stemness gene products c-Myc, Oct4A, Notch1, Notch2 and Notch3, and the RCC stem cell marker, CD105 in 4-5% of RCC52 cells. These results suggest the presence of CSCs in both sRCC subsets for the first time and should therefore be considered potential therapeutic targets for this aggressive malignancy.

  19. Decision Variants for the Automatic Determination of Optimal Feature Subset in RF-RFE.

    PubMed

    Chen, Qi; Meng, Zhaopeng; Liu, Xinyi; Jin, Qianguo; Su, Ran

    2018-06-15

    Feature selection, which identifies a set of most informative features from the original feature space, has been widely used to simplify the predictor. Recursive feature elimination (RFE), as one of the most popular feature selection approaches, is effective in data dimension reduction and efficiency increase. A ranking of features, as well as candidate subsets with the corresponding accuracy, is produced through RFE. The subset with the highest accuracy (HA) or a preset number of features (PreNum) is often used as the final subset. However, this may lead to a large number of features being selected, or, if there is no prior knowledge about this preset number, the final subset selection is often ambiguous and subjective. A proper decision variant is in high demand to automatically determine the optimal subset. In this study, we conduct pioneering work to explore the decision variant after obtaining a list of candidate subsets from RFE. We provide a detailed analysis and comparison of several decision variants to automatically select the optimal feature subset. The random forest recursive feature elimination (RF-RFE) algorithm and a voting strategy are introduced. We validated the variants on two very different molecular biology datasets, one for a toxicogenomic study and the other for protein sequence analysis. The study provides an automated way to determine the optimal feature subset when using RF-RFE.
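
    One simple decision variant of the kind the study compares can be sketched as follows (the tolerance rule and the numbers are illustrative, not the paper's exact criterion): instead of taking the candidate subset with the highest accuracy, take the smallest candidate subset whose accuracy lies within a tolerance of the best.

```python
# Hedged sketch of a decision variant: given RFE candidate subsets and
# their cross-validated accuracies, pick the smallest subset whose
# accuracy is within `tol` of the best, instead of the raw maximum (HA).
def select_subset(candidates, tol=0.01):
    """candidates: list of (feature_tuple, accuracy) from an RFE run."""
    best = max(acc for _, acc in candidates)
    eligible = [(feats, acc) for feats, acc in candidates if acc >= best - tol]
    return min(eligible, key=lambda fa: len(fa[0]))  # prefer compact subsets

candidates = [
    (("g1", "g2", "g3", "g4"), 0.91),
    (("g1", "g2", "g3"), 0.92),
    (("g1", "g2"), 0.915),
    (("g1",), 0.80),
]
feats, acc = select_subset(candidates)
```

    Here the two-feature subset is chosen even though a three-feature subset scores marginally higher, trading a negligible accuracy loss for a more compact predictor.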

  20. Identification of forgeries in handwritten petitions for ballot propositions

    NASA Astrophysics Data System (ADS)

    Srihari, Sargur; Ramakrishnan, Veshnu; Malgireddy, Manavender; Ball, Gregory R.

    2009-01-01

    Many governments have some form of "direct democracy" legislation procedure whereby individual citizens can propose various measures creating or altering laws. Generally, such a process is started with the gathering of a large number of signatures. There is interest in whether or not there are fraudulent signatures present in such a petition, and if so what percentage of the signatures are indeed fraudulent. However, due to the large number of signatures (tens of thousands), it is not feasible to have a document examiner verify the signatures directly. Instead, there is interest in creating a subset of signatures where there is a high probability of fraud that can be verified. We present a method by which a pairwise comparison of signatures can be performed and subsequent sorting can generate such subsets.
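
    The core of the approach, reduced to a sketch: compare every pair of signature feature vectors and sort the pairs by similarity, so that suspiciously similar pairs (which may indicate one writer signing multiple names) surface as a small subset for manual examination. The feature vectors and threshold here are hypothetical, not the authors' actual signature features.

```python
from itertools import combinations

# Illustrative pairwise-comparison sketch: Euclidean distance between
# (hypothetical) signature feature vectors; the closest pairs form the
# high-fraud-probability subset passed to a document examiner.
def suspicious_pairs(features, top_k):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    scored = sorted(
        (dist(features[i], features[j]), i, j)
        for i, j in combinations(range(len(features)), 2)
    )
    return [(i, j) for _, i, j in scored[:top_k]]

sigs = [[0.2, 0.9], [0.21, 0.88], [0.7, 0.1], [0.5, 0.5]]
flagged = suspicious_pairs(sigs, top_k=1)
```

    Sorting all pairs once reduces tens of thousands of signatures to a short, prioritised worklist, which is the practical point of the method.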

  1. Compact cancer biomarkers discovery using a swarm intelligence feature selection algorithm.

    PubMed

    Martinez, Emmanuel; Alvarez, Mario Moises; Trevino, Victor

    2010-08-01

    Biomarker discovery is a typical application of functional genomics. Due to the large number of genes studied simultaneously in microarray data, feature selection is a key step. Swarm intelligence has emerged as a solution for the feature selection problem. However, swarm intelligence settings for feature selection fail to select small feature subsets. We have proposed a swarm intelligence feature selection algorithm based on the initialization and update of only a subset of particles in the swarm. In this study, we tested our algorithm on 11 microarray datasets for brain, leukemia, lung, prostate, and other cancers. We show that the proposed swarm intelligence algorithm successfully increases the classification accuracy and decreases the number of selected features compared to other swarm intelligence methods. Copyright © 2010 Elsevier Ltd. All rights reserved.
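
    A heavily simplified sketch of the "update only a subset of particles" idea (this is not the authors' algorithm; the fitness function, update rule, and all constants are invented for illustration): particles are binary feature masks, and each iteration only half of them drift toward the best mask found so far, with a size penalty biasing the search toward small subsets.

```python
import random

# Toy sketch: a swarm of binary feature masks where, per iteration, only
# a random subset of particles is updated toward the best mask found so
# far; the size penalty in the fitness favours small feature subsets.
def swarm_select(fitness, n_features, n_particles=10, iters=50, seed=1):
    rng = random.Random(seed)
    swarm = [[rng.random() < 0.5 for _ in range(n_features)]
             for _ in range(n_particles)]
    best = max(swarm, key=fitness)[:]
    for _ in range(iters):
        for p in rng.sample(range(n_particles), n_particles // 2):  # subset only
            mask = [b if rng.random() < 0.7 else m      # drift toward best
                    for b, m in zip(best, swarm[p])]
            k = rng.randrange(n_features)
            mask[k] = not mask[k]                       # keep exploring
            swarm[p] = mask
            if fitness(mask) > fitness(best):
                best = mask[:]
    return best

# synthetic objective: features 0 and 1 are informative, extras cost 0.05
fit = lambda m: (0.4 * m[0] + 0.4 * m[1]) - 0.05 * sum(m)
best = swarm_select(fit, n_features=8)
```

    In a real run the fitness would be a cross-validated classification accuracy rather than this synthetic surrogate.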

  2. [Nested species subsets of amphibians and reptiles in Thousand Island Lake].

    PubMed

    Wang, Xi; Wang, Yan-Ping; Ding, Ping

    2012-10-01

    Habitat fragmentation is a main cause of the loss of biological diversity. Combining line-transect surveys of the amphibians and reptiles on 23 islands in Thousand Island Lake in Zhejiang province with data on nearby plant species and habitat variables collected by GIS, we used the "BINMATNEST (binary matrix nestedness temperature calculator)" software and the Spearman rank correlation to examine whether amphibians and reptiles formed nested subsets and to identify the influencing factors. The results showed that amphibians and reptiles were significantly nested, and that island area and habitat type were significantly associated with their nested ranks. Therefore, to effectively protect amphibians and reptiles in the Thousand Island Lake area we should pay prior attention to islands with larger areas and more habitat types.
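
    The nested-subset property itself is easy to state in code. BINMATNEST quantifies departure from nestedness via a matrix "temperature"; the sketch below checks only the idealised case: after ordering islands by species richness, each poorer island's assemblage must be contained in every richer one. The species lists are invented.

```python
# Illustrative check (far simpler than BINMATNEST's matrix temperature):
# a presence matrix is perfectly nested when, after ordering islands by
# richness, each poorer island's species set is a subset of each richer one.
def is_nested(presence):
    """presence: {island: set of species}."""
    ordered = sorted(presence.values(), key=len, reverse=True)
    return all(smaller <= bigger                      # set containment
               for bigger, smaller in zip(ordered, ordered[1:]))

islands = {
    "A": {"frog", "toad", "skink", "gecko"},
    "B": {"frog", "toad", "skink"},
    "C": {"frog", "toad"},
    "D": {"frog"},
}
```

    Checking adjacent pairs in the richness ordering suffices, since subset containment is transitive along the chain.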

  3. Automated mapping of impervious surfaces in urban and suburban areas: Linear spectral unmixing of high spatial resolution imagery

    NASA Astrophysics Data System (ADS)

    Yang, Jian; He, Yuhong

    2017-02-01

    Quantifying impervious surfaces in urban and suburban areas is a key step toward a sustainable urban planning and management strategy. With the availability of fine-scale remote sensing imagery, automated mapping of impervious surfaces has attracted growing attention. However, the vast majority of existing studies have selected pixel-based and object-based methods for impervious surface mapping, with few adopting sub-pixel analysis of high spatial resolution imagery. This research makes use of a vegetation-bright impervious-dark impervious linear spectral mixture model to characterize urban and suburban surface components. A WorldView-3 image acquired on May 9th, 2015 is analyzed for its potential in automated unmixing of meaningful surface materials for two urban subsets and one suburban subset in Toronto, ON, Canada. Given the wide distribution of shadows in urban areas, the linear spectral unmixing is implemented in non-shadowed and shadowed areas separately for the two urban subsets. The results indicate that the accuracy of impervious surface mapping in suburban areas reaches up to 86.99%, much higher than the accuracies in urban areas (80.03% and 79.67%). Despite its merits in mapping accuracy and automation, the application of our proposed vegetation-bright impervious-dark impervious model to map impervious surfaces is limited due to the absence of a soil component. To further extend the operational transferability of our proposed method, especially for areas where plenty of bare soils exist during urbanization or reclamation, it is still necessary to mask out bare soils by automated classification prior to the implementation of linear spectral unmixing.
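
    The three-endmember mixture model can be written out directly. Each pixel spectrum is modelled as f_v·V + f_b·B + f_d·D with the fractions summing to one; substituting f_d = 1 - f_v - f_b leaves a two-unknown least squares problem, solved below with normal equations. The 3-band endmember spectra are hypothetical stand-ins, not values from the paper.

```python
# Minimal sketch of the vegetation / bright-impervious / dark-impervious
# linear mixture model with the sum-to-one constraint folded in.
def unmix(pixel, V, B, D):
    a = [v - d for v, d in zip(V, D)]                 # column for f_v
    b = [x - d for x, d in zip(B, D)]                 # column for f_b
    y = [p - d for p, d in zip(pixel, D)]
    aa = sum(x * x for x in a)
    bb = sum(x * x for x in b)
    ab = sum(x1 * x2 for x1, x2 in zip(a, b))
    ay = sum(x1 * x2 for x1, x2 in zip(a, y))
    by = sum(x1 * x2 for x1, x2 in zip(b, y))
    det = aa * bb - ab * ab                           # 2x2 normal equations
    f_v = (bb * ay - ab * by) / det
    f_b = (aa * by - ab * ay) / det
    return f_v, f_b, 1.0 - f_v - f_b

# hypothetical 3-band endmember spectra (reflectance)
V, B, D = [0.05, 0.5, 0.3], [0.4, 0.45, 0.5], [0.08, 0.1, 0.12]
pixel = [0.5 * v + 0.3 * b + 0.2 * d for v, b, d in zip(V, B, D)]
fractions = unmix(pixel, V, B, D)
```

    The impervious fraction of the pixel is then f_b + f_d; the absence of a soil endmember, noted above, is exactly what this formulation cannot represent.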

  4. An overview of the NASA Langley Atmospheric Data Center: Online tools to effectively disseminate Earth science data products

    NASA Astrophysics Data System (ADS)

    Parker, L.; Dye, R. A.; Perez, J.; Rinsland, P.

    2012-12-01

    Over the past decade the Atmospheric Science Data Center (ASDC) at NASA Langley Research Center has archived and distributed a variety of satellite mission and aircraft campaign data sets. These datasets posed unique challenges to the user community at large due to the sheer volume and variety of the data and the lack of intuitive features in the ordering tools available to the investigator. Some of these data sets also lack sufficient metadata to provide rudimentary data discovery. To meet the needs of emerging users, the ASDC addressed issues in data discovery and delivery through the use of standards in data and access methods, and distribution through appropriate portals. The ASDC is currently undergoing a refresh of its webpages and Ordering Tools that will leverage updated collection-level metadata in an effort to enhance the user experience. The ASDC is now providing search and subset capability for key mission satellite data sets. The ASDC has collaborated with Science Teams to accommodate prospective science users in the climate and modeling communities. The ASDC is using a common framework that enables more rapid development and deployment of search and subset tools that provide enhanced access features for the user community. Features of the Search and Subset web application enable a more sophisticated approach to selecting and ordering data subsets by parameter, date, time, and geographic area. The ASDC has also applied key practices from satellite missions to the multi-campaign aircraft missions executed for Earth Venture-1 and MEaSUReS.

  5. CryoEM and image sorting for flexible protein/DNA complexes.

    PubMed

    Villarreal, Seth A; Stewart, Phoebe L

    2014-07-01

    Intrinsically disordered regions of proteins and conformational flexibility within complexes can be critical for biological function. However, disorder, flexibility, and heterogeneity often hinder structural analyses. CryoEM and single particle image processing techniques offer the possibility of imaging samples with significant flexibility. Division of particle images into more homogeneous subsets after data acquisition can help compensate for heterogeneity within the sample. We present the utility of an eigenimage sorting analysis for examining two protein/DNA complexes with significant conformational flexibility and heterogeneity. These complexes are integral to the non-homologous end joining pathway, and are involved in the repair of double strand breaks of DNA. Both complexes include the DNA-dependent protein kinase catalytic subunit (DNA-PKcs) and biotinylated DNA with bound streptavidin, with one complex containing the Ku heterodimer. Initial 3D reconstructions of the two DNA-PKcs complexes resembled a cryoEM structure of uncomplexed DNA-PKcs without additional density clearly attributable to the remaining components. Application of eigenimage sorting allowed division of the DNA-PKcs complex datasets into more homogeneous subsets. This led to visualization of density near the base of the DNA-PKcs that can be attributed to DNA, streptavidin, and Ku. However, comparison of projections of the subset structures with 2D class averages indicated that a significant level of heterogeneity remained within each subset. In summary, image sorting methods allowed visualization of extra density near the base of DNA-PKcs, suggesting that DNA binds in the vicinity of the base of the molecule and potentially to a flexible region of DNA-PKcs. Copyright © 2013 Elsevier Inc. All rights reserved.
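
    A bare-bones sketch of eigenimage-based sorting (the real analysis operates on thousands of aligned particle images; the tiny 4-pixel "images" here are synthetic): project mean-subtracted image vectors onto the leading eigenvector of the data covariance, found by power iteration, and split the dataset by the sign of the projection.

```python
# Hedged sketch: split flattened particle "images" into two subsets by
# the sign of their projection onto the leading eigenimage, obtained by
# power iteration on the mean-subtracted data.
def sort_by_eigenimage(images, iters=100):
    n, d = len(images), len(images[0])
    mean = [sum(img[j] for img in images) / n for j in range(d)]
    centred = [[img[j] - mean[j] for j in range(d)] for img in images]
    v = [1.0] * d
    for _ in range(iters):                  # power iteration on X^T X
        proj = [sum(c[j] * v[j] for j in range(d)) for c in centred]
        v = [sum(p * c[j] for p, c in zip(proj, centred)) for j in range(d)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    scores = [sum(c[j] * v[j] for j in range(d)) for c in centred]
    subset_a = [i for i, s in enumerate(scores) if s >= 0]
    subset_b = [i for i, s in enumerate(scores) if s < 0]
    return subset_a, subset_b

# two synthetic "conformations": images 0-2 vs 3-5 differ in one region
imgs = [[1, 1, 0, 0], [1, 1, 0.1, 0], [0.9, 1, 0, 0.1],
        [0, 0, 1, 1], [0.1, 0, 1, 1], [0, 0.1, 0.9, 1]]
a, b = sort_by_eigenimage(imgs)
```

    Each subset can then be reconstructed separately; iterating the split on further eigenimages refines the classification, which is the essence of the sorting applied to the DNA-PKcs datasets.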

  6. Selecting a climate model subset to optimise key ensemble properties

    NASA Astrophysics Data System (ADS)

    Herger, Nadja; Abramowitz, Gab; Knutti, Reto; Angélil, Oliver; Lehmann, Karsten; Sanderson, Benjamin M.

    2018-02-01

    End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.
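
    A toy version of the subset-selection idea (the actual tool optimises multivariate cost functions with solver support; the scalar "projections" and thresholds below are invented): from a pool of model anomalies, exhaustively pick the k-member subset whose mean is closest to an observed value while keeping at least a minimum spread, so the subset is neither biased nor over-confident.

```python
from itertools import combinations

# Hedged sketch: brute-force subset selection that minimises mean bias
# subject to a minimum-spread constraint, mirroring the paper's idea of
# optimising ensemble mean and spread jointly.
def best_subset(models, obs, k, min_spread):
    best, best_err = None, float("inf")
    for combo in combinations(models, k):
        spread = max(combo) - min(combo)
        if spread < min_spread:
            continue                        # reject over-confident subsets
        err = abs(sum(combo) / k - obs)
        if err < best_err:
            best, best_err = combo, err
    return best

models = [0.8, 1.1, 1.4, 2.0, 2.6, 3.1]     # e.g. warming projections, degC
subset = best_subset(models, obs=1.5, k=3, min_spread=1.0)
```

    Brute force is fine for small ensembles; the flexibility highlighted in the abstract comes from swapping in other cost terms (model interdependence, future spread, individual-model skill) without changing the search skeleton.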

  7. Application of identified sensitive physical parameters in reducing the uncertainty of numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2016-04-01

    An important source of uncertainty, which then causes further uncertainty in numerical simulations, is that residing in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce uncertainties in all physical parameters. Therefore, finding the subset of relatively more sensitive and important parameters and reducing their errors would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also that it is viable to then apply "target observations" to reduce the uncertainties in model parameters.
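
    A schematic version of the CNOP-P idea (the real method solves a constrained optimisation over a full model; the toy model, bounds, and grid search below are all invented for illustration): for each candidate parameter subset, find the bounded perturbation that maximises the change in a model output, then rank subsets by that worst-case change. Ranking joint perturbations, rather than one parameter at a time, is what captures nonlinear parameter interactions.

```python
from itertools import combinations

def model(p):                               # toy nonlinear response
    a, b, c = p
    return a * b + 0.1 * c + 0.5 * a * a

# worst-case output change when only the parameters in `idx` are
# perturbed within +/- bound, found by coarse grid search
def worst_case_change(base, idx, bound=0.2, steps=5):
    grid = [-bound + 2 * bound * k / (steps - 1) for k in range(steps)]
    best = 0.0
    def rec(i, p):
        nonlocal best
        if i == len(idx):
            best = max(best, abs(model(p) - model(base)))
            return
        for g in grid:
            q = list(p)
            q[idx[i]] = base[idx[i]] + g
            rec(i + 1, q)
    rec(0, list(base))
    return best

base = [1.0, 2.0, 0.5]
ranked = sorted(
    ((worst_case_change(base, list(pair)), pair)
     for pair in combinations(range(3), 2)),
    reverse=True,
)
most_sensitive = ranked[0][1]
```

    The top-ranked subset is where reducing parameter error (or targeting observations) buys the largest reduction in simulation uncertainty.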

  8. Adenovirus-specific T-cell Subsets in Human Peripheral Blood and After IFN-γ Immunomagnetic Selection.

    PubMed

    Qian, Chongsheng; Wang, Yingying; Cai, Huili; Laroye, Caroline; De Carvalho Bittencourt, Marcelo; Clement, Laurence; Stoltz, Jean-François; Decot, Véronique; Reppel, Loïc; Bensoussan, Danièle

    2016-01-01

    Adoptive antiviral cellular immunotherapy by infusion of virus-specific T cells (VSTs) is becoming an alternative treatment for viral infection after hematopoietic stem cell transplantation. The T memory stem cell (TSCM) subset was recently described as exhibiting self-renewal and multipotency properties which are required for sustained efficacy in vivo. We wondered if such a crucial subset for immunotherapy was present in VSTs. We identified, by flow cytometry, TSCM in adenovirus (ADV)-specific interferon (IFN)-γ+ T cells before and after IFN-γ-based immunomagnetic selection, and analyzed the distribution of the main T-cell subsets in VSTs: naive T cells (TN), TSCM, T central memory cells (TCM), T effector memory cell (TEM), and effector T cells (TEFF). In this study all of the different T-cell subsets were observed in the blood sample from healthy donor ADV-VSTs, both before and after IFN-γ-based immunomagnetic selection. As the IFN-γ-based immunomagnetic selection system sorts mainly the most differentiated T-cell subsets, we observed that TEM was always the major T-cell subset of ADV-specific T cells after immunomagnetic isolation and especially after expansion in vitro. Comparing T-cell subpopulation profiles before and after in vitro expansion, we observed that in vitro cell culture with interleukin-2 resulted in a significant expansion of TN-like, TCM, TEM, and TEFF subsets in CD4IFN-γ T cells and of TCM and TEM subsets only in CD8IFN-γ T cells. We demonstrated the presence of all T-cell subsets in IFN-γ VSTs including the TSCM subpopulation, although this was weakly selected by the IFN-γ-based immunomagnetic selection system.

  9. Aggregating job exit statuses of a plurality of compute nodes executing a parallel application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.

    Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
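
    The leader-side reduction can be sketched as a single function; the aggregation policy below (worst non-zero code wins, lowest-ranked node leads) is a hypothetical choice for illustration, not the patented scheme's exact rule.

```python
# Sketch of the leader-side aggregation step: collapse per-node exit
# codes into one aggregated status for the whole subset of nodes.
def aggregate_exit_statuses(statuses):
    """statuses: dict mapping node id -> per-node exit code."""
    return {
        "leader": min(statuses),            # assume lowest-ranked node leads
        "nodes": len(statuses),
        "exit": max(statuses.values()),     # any non-zero code wins, as in shells
        "failed": sorted(n for n, s in statuses.items() if s != 0),
    }

report = aggregate_exit_statuses({0: 0, 1: 0, 2: 1, 3: 0})
```

    The job scheduler then needs only this one aggregated record rather than one message per node, which is the point of electing a leader.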

  10. Reducing seed dependent variability of non-uniformly sampled multidimensional NMR data

    NASA Astrophysics Data System (ADS)

    Mobli, Mehdi

    2015-07-01

    The application of NMR spectroscopy to study the structure, dynamics and function of macromolecules requires the acquisition of several multidimensional spectra. The one-dimensional NMR time-response from the spectrometer is extended to additional dimensions by introducing incremented delays in the experiment that cause oscillation of the signal along "indirect" dimensions. For a given dimension the delay is incremented at twice the rate of the maximum frequency (Nyquist rate). To achieve high-resolution requires acquisition of long data records sampled at the Nyquist rate. This is typically a prohibitive step due to time constraints, resulting in sub-optimal data records to the detriment of subsequent analyses. The multidimensional NMR spectrum itself is typically sparse, and it has been shown that in such cases it is possible to use non-Fourier methods to reconstruct a high-resolution multidimensional spectrum from a random subset of non-uniformly sampled (NUS) data. For a given acquisition time, NUS has the potential to improve the sensitivity and resolution of a multidimensional spectrum, compared to traditional uniform sampling. The improvements in sensitivity and/or resolution achieved by NUS are heavily dependent on the distribution of points in the random subset acquired. Typically, random points are selected from a probability density function (PDF) weighted according to the NMR signal envelope. In extreme cases as little as 1% of the data is subsampled. The heavy under-sampling can result in poor reproducibility, i.e. when two experiments are carried out where the same number of random samples is selected from the same PDF but using different random seeds. Here, a jittered sampling approach is introduced that is shown to improve random seed dependent reproducibility of multidimensional spectra generated from NUS data, compared to commonly applied NUS methods. 
It is shown that this is achieved due to the low variability of the inherent sensitivity of the random subset chosen from a given PDF. Finally, it is demonstrated that metrics used to find optimal NUS distributions are heavily dependent on the inherent sensitivity of the random subset, and such optimisation is therefore less critical when using the proposed sampling scheme.
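
    The jittered-sampling idea sketched in code (a hedged approximation of the approach, with an invented exponential PDF): rather than drawing every sample independently from the decaying probability density, divide the cumulative density into equal-mass strata and draw one point per stratum, which stabilises the inherent sensitivity of the schedule across random seeds.

```python
import random

# Hedged sketch of jittered NUS: one draw per equal-probability-mass
# stratum of an exponentially decaying PDF over the indirect-dimension grid.
def jittered_schedule(n_grid, n_samples, decay=3.0, seed=0):
    rng = random.Random(seed)
    weights = [pow(2.718281828, -decay * i / n_grid) for i in range(n_grid)]
    total = sum(weights)
    cum, acc = [], 0.0
    for w in weights:
        acc += w / total
        cum.append(acc)                     # cumulative PDF over grid points
    picks = set()
    for k in range(n_samples):
        u = (k + rng.random()) / n_samples  # jitter within stratum k
        picks.add(next((i for i, c in enumerate(cum) if c >= u), n_grid - 1))
    return sorted(picks)

schedule = jittered_schedule(n_grid=64, n_samples=16)
```

    Because each stratum contributes exactly one draw, two seeds produce schedules with nearly identical mass coverage of the signal envelope, unlike fully independent draws from the same PDF.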

  11. A three-step approach for the derivation and validation of high-performing predictive models using an operational dataset: congestive heart failure readmission case study.

    PubMed

    AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku

    2014-05-27

    The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators.
The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multinomial logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
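
    Two of the building blocks named in this abstract, mean-based imputation of missing values and a voting classifier that averages the outputs of several models, can be sketched in a few lines. This is a hedged illustration with made-up data and stand-in probability vectors, not the study's actual pipeline:

    ```python
    # Minimal sketch of two steps from the abstract: mean-based imputation
    # and soft voting (averaging class probabilities). The rows and the
    # probability vectors below are hypothetical stand-ins.

    def mean_impute(rows):
        """Replace None entries in each column with that column's mean."""
        cols = list(zip(*rows))
        means = []
        for col in cols:
            present = [v for v in col if v is not None]
            means.append(sum(present) / len(present))
        return [
            [means[j] if v is None else v for j, v in enumerate(row)]
            for row in rows
        ]

    def soft_vote(probas):
        """Average class-probability vectors from several classifiers."""
        n = len(probas)
        return [sum(p[k] for p in probas) / n for k in range(len(probas[0]))]

    data = [[1.0, None], [3.0, 4.0], [None, 8.0]]
    print(mean_impute(data))                     # gaps filled with column means
    print(soft_vote([[0.6, 0.4], [0.8, 0.2]]))   # averaged readmission probabilities
    ```

    In the study itself the vote combined multinomial logistic regression with voting feature intervals classifiers; the averaging step shown here is the same idea applied to any set of probability outputs.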

  12. The Use of Cluster Analysis in Typological Research on Community College Students

    ERIC Educational Resources Information Center

    Bahr, Peter Riley; Bielby, Rob; House, Emily

    2011-01-01

    One useful and increasingly popular method of classifying students is known commonly as cluster analysis. The variety of techniques that comprise the cluster analytic family are intended to sort observations (for example, students) within a data set into subsets (clusters) that share similar characteristics and differ in meaningful ways from other…
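
    The sorting of observations into internally similar subsets that this abstract describes can be illustrated with a minimal k-means pass on one-dimensional toy data. The "credit hours" values and starting centers are invented for the example; real typological studies cluster on many variables and validate the resulting solution:

    ```python
    # Hedged sketch of the cluster-analytic idea: a tiny 1-D k-means that
    # alternates between assigning points to the nearest center and moving
    # each center to the mean of its cluster.

    def kmeans_1d(xs, centers, iters=10):
        """Cluster values in xs around the given initial centers."""
        for _ in range(iters):
            groups = {c: [] for c in centers}
            for x in xs:
                nearest = min(centers, key=lambda c: abs(x - c))
                groups[nearest].append(x)
            centers = [sum(g) / len(g) for g in groups.values() if g]
        return sorted(centers)

    hours = [3, 4, 5, 12, 13, 14]          # hypothetical enrollment intensities
    print(kmeans_1d(hours, [0.0, 20.0]))   # two clusters, e.g. part-time vs full-time
    ```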

  13. Building Capacity through Action Research Curricula Reviews

    ERIC Educational Resources Information Center

    Lee, Vanessa; Coombe, Leanne; Robinson, Priscilla

    2015-01-01

    In Australia, graduates of Master of Public Health (MPH) programmes are expected to achieve a set of core competencies, including a subset that is specifically related to Indigenous health. This paper reports on the methods utilised in a project which was designed using action research to strengthen Indigenous public health curricula within MPH…

  14. BOX-COUNTING DIMENSION COMPUTED BY α-DENSE CURVES

    NASA Astrophysics Data System (ADS)

    García, G.; Mora, G.; Redtwitz, D. A.

We introduce a method to reduce to the real case the computation of the box-counting dimension of subsets of the unit cube Iⁿ, n > 1. The procedure is based on the existence of special types of α-dense curves (a generalization of the space-filling curves) in Iⁿ, called δ-uniform curves.
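
    For context, the standard box-counting dimension that the paper's reduction targets counts grid boxes of side 1/n hit by the set and examines log(count)/log(n) as the grid is refined. The sketch below applies this directly (not the α-dense-curve reduction from the paper) to a dense sample of the diagonal of the unit square, a toy set of known dimension 1:

    ```python
    # Hedged sketch of plain box counting: tally occupied cells of an
    # n-by-n grid over the unit square and estimate the dimension as
    # log(count)/log(n). The diagonal segment is a toy test set.

    import math

    def box_count(points, n):
        """Number of n-by-n grid cells containing at least one point."""
        cells = {tuple(min(int(c * n), n - 1) for c in p) for p in points}
        return len(cells)

    diag = [(i / 1000, i / 1000) for i in range(1001)]  # dense sample of the diagonal
    n = 64
    N = box_count(diag, n)
    print(N, math.log(N) / math.log(n))  # count equals n, so the estimate is 1.0
    ```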

  15. Efficacy and Safety of Atomoxetine in Childhood Attention-Deficit/Hyperactivity Disorder with Comorbid Oppositional Defiant Disorder

    ERIC Educational Resources Information Center

    Kaplan, S.; Heiligenstein, J.; West, S.; Busner, J.; Harder, D.; Dittmann, R.; Casat, C.; Wernicke, J. F.

    2004-01-01

    Objective: To compare the safety and efficacy of atomoxetine, a selective inhibitor of the norepinephrine transporter, versus placebo in Attention-Deficit/Hyperactivity Disorder (ADHD) patients with comorbid Oppositional Defiant Disorder (ODD). Methods: A subset analysis of 98 children from two identical, multi-site, double-blind, randomized,…

  16. Limited Reach: The Role of Mission and Institutional Aid in Supporting Minority Students

    ERIC Educational Resources Information Center

    Finger, Mary Catherine

    2013-01-01

    This study uses a mixed methods research design to explore the relationship between institutional financial aid practice and graduation rates at a subset of private, non-profit four-year colleges and universities and explores how institutions prioritize allocations to financial aid within the framework of institutional mission, culture, and…

  17. Predictors of Daily Blood Glucose Monitoring in Appalachian Ohio

    ERIC Educational Resources Information Center

    Raffle, Holly; Ware, Lezlee J.; Ruhil, Anirudh V. S.; Hamel-Lambert, Jane; Denham, Sharon A.

    2012-01-01

    Objective: To determine factors contributing to successful diabetes self-management in Appalachia, as evidenced by daily blood glucose monitoring. Methods: A telephone survey (N = 3841) was conducted to assess health status and health care access. The current investigation is limited to the subset of this sample who report having diabetes (N =…

  18. Clinical Psychology Ph.D. Program Rankings: Evaluating Eminence on Faculty Publications and Citations

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Malone, Carrie J.; Gonzalez, Melissa L.; McClure, David R.; Laud, Rinita B.; Minshawi, Noha F.

    2005-01-01

    Program rankings and their visibility have taken on greater and greater significance. Rarely is the accuracy of these rankings, which are typically based on a small subset of university faculty impressions, questioned. This paper presents a more comprehensive survey method based on quantifiable measures of faculty publications and citations. The…

  19. The importance of assessing climate change vulnerability to address species conservation

    Treesearch

    Karen E. Bagne; Megan M. Friggens; Sharon J. Coe; Deborah M. Finch

    2014-01-01

    Species conservation often prioritizes attention on a small subset of "special status" species at high risk of extinction, but actions based on current lists of special status species may not effectively moderate biodiversity loss if climate change alters threats. Assessments of climate change vulnerability may provide a method to enhance identification of...

  20. The effects of operating conditions on semivolatile organic compounds emitted from light-duty, gasoline-powered motor vehicles

    EPA Science Inventory

    A thermal extraction-gas chromatography-mass spectrometry (TE-GC-MS) method was used to quantitatively examine organic compounds in fine particulate matter (PM2.5) collected from light-duty, gasoline-powered vehicle (LDGV) exhaust. Emissions were analyzed from a subset of 18 vehi...
