Sample records for analysis tool ascat

  1. Clarifications on the "Comparison Between SMOS, VUA, ASCAT, and ECMWF Soil Moisture Products Over Four Watersheds in U.S."

    NASA Technical Reports Server (NTRS)

    Wagner, Wolfgang; Brocca, Luca; Naeimi, Vahid; Reichle, Rolf; Draper, Clara; de Jeu, Richard; Ryu, Dongryeol; Su, Chun-Hsu; Western, Andrew; Calvet, Jean-Christophe

    2013-01-01

    In a recent paper, Leroux et al. compared three satellite soil moisture data sets (SMOS, AMSR-E, and ASCAT) and ECMWF forecast soil moisture data to in situ measurements over four watersheds located in the United States. Their conclusions stated that SMOS soil moisture retrievals represent "an improvement [in RMSE] by a factor of 2-3 compared with the other products" and that the ASCAT soil moisture data are "very noisy and unstable." In this clarification, the analysis of Leroux et al. is repeated using a newer version of the ASCAT data and additional metrics are provided. It is shown that the ASCAT retrievals are skillful, although they show some unexpected behavior during summer for two of the watersheds. It is also noted that the improvement of SMOS by a factor of 2-3 mentioned by Leroux et al. is driven by differences in bias and only applies relative to AMSR-E and the ECMWF data in the now obsolete version investigated by Leroux et al.

  2. Assimilation of ASCAT near-surface soil moisture into the French SIM hydrological model

    NASA Astrophysics Data System (ADS)

    Draper, C.; Mahfouf, J.-F.; Calvet, J.-C.; Martin, E.; Wagner, W.

    2011-06-01

    The impact of assimilating near-surface soil moisture into the SAFRAN-ISBA-MODCOU (SIM) hydrological model over France is examined. Specifically, the root-zone soil moisture in the ISBA land surface model is constrained over three and a half years, by assimilating the ASCAT-derived surface degree of saturation product, using a Simplified Extended Kalman Filter. In this experiment ISBA is forced with the near-real time SAFRAN analysis, which analyses the variables required to force ISBA from relevant observations available before the real time data cut-off. The assimilation results are tested against ISBA forecasts generated with a higher quality delayed cut-off SAFRAN analysis. Ideally, assimilating the ASCAT data will constrain the ISBA surface state to correct for errors in the near-real time SAFRAN forcing, the most significant of which was a substantial dry bias caused by a dry precipitation bias. The assimilation successfully reduced the mean root-zone soil moisture bias, relative to the delayed cut-off forecasts, by close to 50 % of the open-loop value. The improved soil moisture in the model then led to significant improvements in the forecast hydrological cycle, reducing the drainage, runoff, and evapotranspiration biases (by 17 %, 11 %, and 70 %, respectively). When coupled to the MODCOU hydrogeological model, the ASCAT assimilation also led to improved streamflow forecasts, increasing the mean discharge ratio, relative to the delayed cut off forecasts, from 0.68 to 0.76. These results demonstrate that assimilating near-surface soil moisture observations can effectively constrain the SIM model hydrology, while also confirming the accuracy of the ASCAT surface degree of saturation product. This latter point highlights how assimilation experiments can contribute towards the difficult issue of validating remotely sensed land surface observations over large spatial scales.
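
    A minimal sketch of the Simplified Extended Kalman Filter step described above, in Python. The state layout, error covariances and observation operator below are illustrative assumptions, not the actual SIM/ISBA configuration; the Jacobian is obtained by finite differences, as is common in "simplified" EKF land data assimilation.

    ```python
    import numpy as np

    def sekf_update(x_b, y_obs, obs_operator, B, R, eps=1e-4):
        """One SEKF analysis step (illustrative sketch).

        x_b: background state, e.g. [surface SM, root-zone SM]
        y_obs: observed surface degree of saturation, mapped to model units
        obs_operator: maps the full state to the observed quantity
        B, R: assumed background- and observation-error covariances
        """
        x_b = np.asarray(x_b, dtype=float)
        y_b = obs_operator(x_b)
        # Linearize the observation operator around x_b by finite differences.
        H = np.zeros((1, x_b.size))
        for i in range(x_b.size):
            x_pert = x_b.copy()
            x_pert[i] += eps
            H[0, i] = (obs_operator(x_pert) - y_b) / eps
        # Kalman gain; surface information reaches the root zone through the
        # off-diagonal terms of B (in a full SEKF, also through the model
        # dynamics entering the Jacobian).
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        return x_b + (K @ np.atleast_1d(y_obs - y_b)).ravel()

    # Toy usage: two-layer state, the observation sees only the surface layer.
    x_a = sekf_update(x_b=[0.15, 0.25], y_obs=0.20,
                      obs_operator=lambda x: x[0],
                      B=np.array([[4.0e-4, 1.0e-4],
                                  [1.0e-4, 1.0e-4]]),  # correlated layers
                      R=np.array([[9.0e-4]]))
    ```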

  3. Global-scale assessment and combination of SMAP with ASCAT (active) and AMSR2 (passive) soil moisture products

    NASA Astrophysics Data System (ADS)

    Kim, Hyunglok; Parinussa, Robert; Konings, Alexandra G.; Wagner, Wolfgang; Cosh, Michael H.; Lakshmi, Venkat; Zohaib, Muhammad; Choi, Minha

    2018-01-01

    Global-scale surface soil moisture (SSM) products retrieved from active and passive microwave remote sensing provide an effective method for monitoring near-real-time SSM content with nearly daily temporal resolution. In the present study, we first inter-compared global-scale error patterns and combined the Soil Moisture Active Passive (SMAP), Advanced Scatterometer (ASCAT), and Advanced Microwave Scanning Radiometer 2 (AMSR2) SSM products using a triple collocation (TC) analysis and the maximized Pearson correlation coefficient (R) method from April 2015 to December 2016. The Global Land Data Assimilation System (GLDAS) and global in situ observations were utilized to investigate and compare the quality of the satellite-based SSM products. The average R-values of SMAP, ASCAT, and AMSR2 were 0.74, 0.64, and 0.65 when compared with the in situ networks, respectively. The ubRMSD values were 0.0411, 0.0625, and 0.0708 m3 m-3, and the bias values were -0.0460, 0.0010, and 0.0418 m3 m-3 for SMAP, ASCAT, and AMSR2, respectively. The highest average R-values, from SMAP against the in situ results, are very encouraging; only SMAP showed higher R-values than GLDAS in several in situ networks, with a low ubRMSD (0.0438 m3 m-3). Overall, SMAP showed a dry bias (-0.0460 m3 m-3) and AMSR2 a wet bias (0.0418 m3 m-3), while ASCAT showed the least bias (0.0010 m3 m-3) among all the products. Each product was evaluated using TC metrics with respect to different ranges of vegetation optical depth (VOD). Under vegetation-scarce conditions (VOD < 0.10), such as desert and semi-desert regions, all products have difficulty retrieving SSM information. In moderately vegetated areas (0.10 < VOD < 0.40), SMAP showed the highest signal-to-noise ratio. Over highly vegetated regions (VOD > 0.40), ASCAT showed comparatively better performance than the other products. Using the maximized R method, the SMAP, ASCAT, and AMSR2 products were combined one by one, using the GLDAS dataset for reference SSM values. When the satellite products were combined, the R-values of the combined products were improved or degraded, depending on the VOD range, when compared with the results from the original products alone. The results of this study provide an overview of the reliability of SMAP, ASCAT, and AMSR2 and of the performance of their combined products on a global scale. This study is the first to show the advantages of the recently available SMAP dataset for effective merging of different satellite products and for their application to various hydro-meteorological problems.
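
    For readers unfamiliar with triple collocation, the classical covariance-based estimator fits in a few lines. This is a textbook formulation assuming three collocated, rescaled products with mutually independent errors; it is a sketch, not necessarily the exact TC implementation used in the study.

    ```python
    import numpy as np

    def triple_collocation_errors(x, y, z):
        """Estimate the error variance of each of three collocated soil
        moisture series whose errors are mutually independent."""
        C = np.cov(np.vstack([x, y, z]))
        var_x = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
        var_y = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
        var_z = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
        return var_x, var_y, var_z
    ```

    The same covariances also yield the signal-to-noise ratio used to stratify product performance by VOD class.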

  4. A simple nudging scheme to assimilate ASCAT soil moisture data in the WRF model

    NASA Astrophysics Data System (ADS)

    Capecchi, V.; Gozzini, B.

    2012-04-01

    The present work shows results obtained in a numerical experiment using the WRF (Weather Research and Forecasting, www.wrf-model.org) model. A control run, where soil moisture is constrained by the GFS global analysis, is compared with a test run, where the soil moisture analysis is obtained via a simple nudging scheme using ASCAT data. The basic idea of the assimilation scheme is to "nudge" the first level (0-10 cm below ground in the NOAH model) of the first-guess volumetric soil moisture (say θ(b,1), derived from the global model) towards the ASCAT-derived value (say θ̂A). The soil moisture analysis θ(a,l) is given by: θ(a,l) = θ(b,l) + K (θ̂A - θ(b,l)) for l = 1, and θ(a,l) = θ(b,l) for l > 1, (1) where l is the model soil level and K is a user-specified constant scalar, set in this study to 0.2 (the same value as in similar studies). Soil moisture is critical for estimating latent and sensible heat fluxes as well as boundary layer structure. This parameter is, however, poorly constrained in current global and regional numerical models, since no extensive soil moisture observation network exists. Remote sensing technologies offer a synoptic view of the dynamics and spatial distribution of soil moisture, with frequent temporal coverage and a horizontal resolution similar to that of mesoscale NWP models. Several studies have shown that measurements of normalized backscatter (surface soil wetness) from the Advanced Scatterometer (ASCAT), operating at microwave frequencies on board the meteorological operational (Metop) satellite, offer quality information about surface soil moisture. Recently, several studies have dealt with the implementation of simple assimilation procedures (nudging, Extended Kalman Filter, etc.) to integrate ASCAT data into NWP models. They found improvements in screen-level temperature predictions, particularly in areas such as North America and the Tropics, where land-atmosphere coupling is strong. The ECMWF (Newsletter No. 127) is currently implementing and testing an EKF for combining conventional observations and remotely sensed soil moisture data in order to produce a more accurate analysis. In the present work, verification skills (RMSE, bias, correlation) of both the control and test runs are presented, using observed data collected by the International Soil Moisture Network. Moreover, improvements in temperature predictions are evaluated.
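
    The update in Eq. (1) is simple enough to sketch directly. Below is a minimal Python illustration of the nudging step under the stated assumptions (K = 0.2, only the top NOAH layer updated); variable names are ours, not the paper's.

    ```python
    import numpy as np

    K = 0.2  # nudging coefficient, as in the study

    def nudge_soil_moisture(theta_b, theta_ascat):
        """Apply the nudging update of Eq. (1): only the top soil layer
        (index 0, i.e. 0-10 cm in the NOAH model) is pulled towards the
        ASCAT-derived volumetric soil moisture; deeper layers are unchanged."""
        theta_a = np.array(theta_b, dtype=float)
        theta_a[0] += K * (theta_ascat - theta_a[0])
        return theta_a

    # Example: four NOAH layers, first-guess from the global model.
    print(nudge_soil_moisture([0.18, 0.22, 0.25, 0.27], theta_ascat=0.28))
    # -> [0.20, 0.22, 0.25, 0.27]
    ```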

  5. A new Downscaling Approach for SMAP, SMOS and ASCAT by predicting sub-grid Soil Moisture Variability based on Soil Texture

    NASA Astrophysics Data System (ADS)

    Montzka, C.; Rötzer, K.; Bogena, H. R.; Vereecken, H.

    2017-12-01

    Improving the coarse spatial resolution of global soil moisture products from SMOS, SMAP and ASCAT is currently an active research topic. Soil texture heterogeneity is known to be one of the main sources of soil moisture spatial variability. A method has been developed that predicts the soil moisture standard deviation as a function of the mean soil moisture based on soil texture information. It is a closed-form expression derived from stochastic analysis of 1D unsaturated gravitational flow in an infinitely long vertical profile, based on the Mualem-van Genuchten model and first-order Taylor expansions. With the recent development of high resolution maps of basic soil properties such as soil texture and bulk density, relevant information to estimate soil moisture variability within a satellite product grid cell is available. Here, we predict for each SMOS, SMAP and ASCAT grid cell the sub-grid soil moisture variability based on the SoilGrids1km data set. We provide a look-up table that indicates the soil moisture standard deviation for any given soil moisture mean. The resulting data set provides important information for downscaling coarse soil moisture observations of the SMOS, SMAP and ASCAT missions. Downscaling SMAP data by a field capacity proxy indicates adequate accuracy of the sub-grid soil moisture patterns.
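
    A look-up table of this kind is straightforward to apply. The sketch below interpolates a sub-grid standard deviation from hypothetical tabulated (mean, standard deviation) pairs; the values are illustrative, not taken from the SoilGrids1km-based product described above.

    ```python
    import numpy as np

    # Hypothetical LUT for one satellite grid cell: sub-grid soil moisture
    # standard deviation tabulated against the cell-mean soil moisture.
    lut_mean = np.array([0.05, 0.10, 0.20, 0.30, 0.40])      # m3/m3
    lut_std  = np.array([0.010, 0.025, 0.045, 0.050, 0.035])  # m3/m3

    def subgrid_std(sm_mean):
        """Linear interpolation in the look-up table: given a coarse-scale
        retrieval (the mean), return the expected sub-grid variability."""
        return np.interp(sm_mean, lut_mean, lut_std)

    print(subgrid_std(0.25))  # sub-grid std for a 0.25 m3/m3 retrieval
    ```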

  6. NCA-LDAS land analysis: Development and performance of a multisensor, multivariate land data assimilation for the National Climate Assessment

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Jasinski, M. F.; Mocko, D. M.; Rodell, M.; Borak, J.; Li, B.; Beaudoing, H. K.; Peters-Lidard, C. D.

    2017-12-01

    This presentation will describe one of the first successful examples of multisensor, multivariate land data assimilation, encompassing a large suite of soil moisture, snow depth, snow cover and irrigation intensity environmental data records (EDRs) from the Scanning Multi-channel Microwave Radiometer (SMMR), the Special Sensor Microwave Imager (SSM/I), the Advanced Scatterometer (ASCAT), the Moderate-Resolution Imaging Spectroradiometer (MODIS), the Advanced Microwave Scanning Radiometer (AMSR-E and AMSR2), the Soil Moisture Ocean Salinity (SMOS) mission and the Soil Moisture Active Passive (SMAP) mission. The analysis is performed using the NASA Land Information System (LIS) as an enabling tool for the U.S. National Climate Assessment (NCA). The performance of the NCA Land Data Assimilation System (NCA-LDAS) is evaluated by comparison to a number of hydrological reference data products. Results indicate that multivariate assimilation provides systematic improvements in simulated soil moisture and snow depth, with marginal effects on the accuracy of simulated streamflow and ET. An important conclusion is that, across all evaluated variables, assimilation of data from increasingly modern sensors (e.g. SMOS, SMAP, AMSR2, ASCAT) produces more skillful results than assimilation of data from older sensors (e.g. SMMR, SSM/I, AMSR-E). The evaluation also indicates high skill of NCA-LDAS when compared with other land analysis products. Further, drought indicators based on NCA-LDAS output suggest a trend of longer and more severe droughts over parts of the Western U.S. during 1979-2015, particularly in the Southwestern U.S.

  7. Assimilation of neural network soil moisture in land surface models

    NASA Astrophysics Data System (ADS)

    Rodriguez-Fernandez, Nemesio; de Rosnay, Patricia; Albergel, Clement; Aires, Filipe; Prigent, Catherine; Kerr, Yann; Richaume, Philippe; Muñoz-Sabater, Joaquin; Drusch, Matthias

    2017-04-01

    In this study a set of land surface data assimilation (DA) experiments making use of satellite-derived soil moisture (SM) is presented. These experiments have two objectives: (1) to test the information content of satellite remote sensing of soil moisture for numerical weather prediction (NWP) models, and (2) to test a simplified assimilation of these data through the use of a Neural Network (NN) retrieval. Advanced Scatterometer (ASCAT) and Soil Moisture and Ocean Salinity (SMOS) data were used. The SMOS soil moisture dataset was obtained specifically for this project by training a NN using SMOS brightness temperatures as input and European Centre for Medium-Range Weather Forecasts (ECMWF) H-TESSEL SM fields as the training reference. In this way, the SMOS NN SM dataset has a climatology similar to that of the model and does not present a global bias with respect to the model. The DA experiments are computed using a surface-only Land Data Assimilation System (so-LDAS) based on the H-TESSEL land surface model. This system is computationally very efficient and makes it possible to perform long surface assimilation experiments (one whole year, 2012). SMOS NN SM DA experiments are compared to ASCAT SM DA experiments. In both cases, experiments with and without 2 m air temperature and relative humidity DA are discussed, using different observation errors for the ASCAT and SMOS datasets. Seasonal, geographical and soil-depth-related differences between the results of those experiments are presented and discussed. The different analysed SM fields are evaluated against a large number of in situ measurements of SM. On average, the SM analysis gives results generally similar to the open-loop model run with no assimilation, even if significant differences can be seen for specific sites with in situ measurements. The sensitivity of the results to the observation errors assigned to the SM datasets differs slightly depending on the network of in situ measurements, but it is relatively low for the tests conducted here. Finally, the effect of the soil moisture analysis on NWP is evaluated by comparing experiments for different configurations of the system, with and without (open loop) soil moisture data assimilation. Assimilation of ASCAT soil moisture improves the forecast in the Tropics and adds information with respect to the near-surface conventional observations. In contrast, SMOS degrades the forecast in the Tropics in July-September. In the Southern Hemisphere, ASCAT degrades the forecast in July-September, both alone and in combination with 2 m air temperature and relative humidity. On the other hand, experiments using SMOS (even without screen-level variables) improve the forecast for all seasons, in particular in July-December. In the Northern Hemisphere, both with ASCAT and SMOS, the experiments using 2 m air temperature and relative humidity improve the forecast in April-September. SMOS alone has a significant positive effect in July-September for experiments with low observation error. Maps of the forecast skill with respect to the open-loop experiment show that SMOS improves the forecast in North America and, to a lesser extent, in northern Asia for up to 72 hours.
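
    The retrieval strategy can be illustrated with a small sketch: train a neural network to map brightness temperatures to model soil moisture, so that the retrieval inherits the model climatology by construction. The data below are synthetic and scikit-learn's MLPRegressor stands in for the study's NN; everything except the overall idea is an assumption.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Stand-in data: multi-angle/polarization brightness temperatures (K) as
    # inputs, collocated land-surface-model soil moisture as training target.
    tb = rng.uniform(180.0, 290.0, size=(5000, 6))
    sm_model = 0.45 - 0.001 * tb.mean(axis=1) + 0.01 * rng.standard_normal(5000)

    scaler = StandardScaler().fit(tb)
    nn = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    nn.fit(scaler.transform(tb), sm_model)

    # Retrieval: output is in the model's climatology, so no additional
    # global bias correction is needed before assimilation.
    sm_retrieved = nn.predict(scaler.transform(tb[:5]))
    ```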

  8. Application of the Auto-Tuned Land Assimilation System (ATLAS) to ASCAT and SMOS soil moisture retrieval products

    USDA-ARS?s Scientific Manuscript database

    Land data assimilations are typically based on highly uncertain assumptions regarding the statistical structure of observation and modeling errors. Left uncorrected, poor assumptions can degrade the quality of analysis products generated by land data assimilation systems. Recently, Crow and van de...

  9. Soil Moisture Extremes Observed by METOP ASCAT: Was 2012 an Exceptional Year?

    NASA Astrophysics Data System (ADS)

    Wagner, Wolfgang; Paulik, Christoph; Hahn, Sebastian; Melzer, Thomas; Parinussa, Robert; de Jeu, Richard; Dorigo, Wouter; Chung, Daniel; Enenkel, Markus

    2013-04-01

    In summer 2012 the international press reported widely on the severe drought that had befallen large parts of the United States. Yet, the US drought was only one of several major droughts that occurred in 2012: Southeastern Europe, Central Asia, Brazil, India, Southern Australia and several other regions suffered from similarly dry soil conditions. This raises the question of whether 2012 was an exceptionally dry year. In this presentation we will address this question by analyzing global soil moisture patterns as observed by the Advanced Scatterometer (ASCAT) flown on board the METOP-A satellite. We first compare the 2012 ASCAT soil moisture data to all available ASCAT measurements acquired by the instrument since the launch of METOP-A in November 2006. Secondly, we compare the 2012 data to a long-term soil moisture data set derived by merging the ASCAT soil moisture data with other active and passive microwave soil moisture retrievals, as described by Liu et al. (2012) and Wagner et al. (2012) (see also http://www.esa-soilmoisture-cci.org/). A first trend analysis of the latter long-term soil moisture data set, carried out by Dorigo et al. (2012), revealed that over the period 1988-2010 significant trends were observed over 27 % of the area covered by the data set, of which 73 % were negative (soil drying) and only 27 % were positive (soil wetting). In this presentation we will show how the inclusion of the years 2011 and 2012 affects the areal extent and strength of these significant trends. REFERENCES Dorigo, W., R. de Jeu, D. Chung, R. Parinussa, Y. Liu, W. Wagner, D. Fernández-Prieto (2012) Evaluating global trends (1988-2010) in harmonized multi-satellite surface soil moisture, Geophysical Research Letters, 39, L18405, 1-7. Liu, Y.Y., W.A. Dorigo, R.M. Parinussa, R.A.M. de Jeu, W. Wagner, M.F. McCabe, J.P. Evans, A.I.J.M. van Dijk (2012) Trend-preserving blending of passive and active microwave soil moisture retrievals, Remote Sensing of Environment, 123, 280-297. Wagner, W., W. Dorigo, R. de Jeu, D. Fernandez, J. Benveniste, E. Haas, M. Ertl (2012) Fusion of active and passive microwave observations to create an Essential Climate Variable data record on soil moisture, ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences (ISPRS Annals), Volume I-7, XXII ISPRS Congress, Melbourne, Australia, 25 August-1 September 2012, 315-321.

  10. From ASCAT to Sentinel-1: Soil Moisture Monitoring using European C-Band Radars

    NASA Astrophysics Data System (ADS)

    Wagner, Wolfgang; Bauer-Marschallinger, Bernhard; Hochstöger, Simon

    2016-04-01

    The Advanced Scatterometer (ASCAT) is a C-Band radar instrument flown on board the series of three METOP satellites. Albeit not operating in one of the more favorable longer wavelength ranges (S-, L- or P-band) like the dedicated Soil Moisture and Ocean Salinity (SMOS) and Soil Moisture Active Passive (SMAP) missions, it is one of the main microwave sensors used for monitoring soil moisture on a global scale. Its attractiveness for soil moisture monitoring applications stems from its operational status, high radiometric accuracy and stability, short revisit time, multiple viewing directions and long heritage (Wagner et al. 2013). From an application perspective, its main limitation is its spatial resolution of about 25 km, which does not allow resolving soil moisture patterns driven by smaller-scale hydrometeorological processes (e.g. convective precipitation, runoff patterns, etc.) that are themselves related to highly variable land surface characteristics (soil characteristics, topography, vegetation cover, etc.). Fortunately, the technique of aperture synthesis makes it possible to improve the spatial resolution of spaceborne radar instruments significantly, down to the meter scale. Yet, past Synthetic Aperture Radar (SAR) missions were not designed to achieve the short revisit time required for soil moisture monitoring. This has only changed recently with the development and launch of SMAP (Entekhabi et al. 2010) and Sentinel-1 (Hornacek et al. 2012). Unfortunately, the SMAP radar failed after only a few months of operations, which leaves Sentinel-1 as the only currently operational SAR mission capable of delivering high-resolution radar observations with a revisit time of about three days over Europe, about weekly over most crop growing regions worldwide, and about bi-weekly to monthly over the rest of the land surface area. Like ASCAT, Sentinel-1 acquires C-band backscatter data in VV polarization over land. Therefore, for the interpretation of both ASCAT and Sentinel-1 backscatter observations, the same physical processes and geophysical variables (e.g. vegetation optical depth, surface roughness, soil volume scattering, etc.) need to be considered. The difference lies mainly in the scaling, i.e. how prominently the different variables influence the C-band data at the different spatial (25 km versus 20 m) and temporal (daily versus 3-30 days repeat coverage) scales. Therefore, while the general properties of the soil moisture retrieval schemes used for ASCAT and Sentinel-1 can be the same, the details of the algorithms and parameterizations will differ. This presentation will review similarities and differences of the soil moisture retrieval approaches used for ASCAT and Sentinel-1, with a focus on the change detection method developed by TU Wien. Some first comparisons of ASCAT and Sentinel-1 soil moisture data over Europe will also be shown. REFERENCES Entekhabi, D., Njoku, E.G., O'Neill, P.E., Kellogg, K.H., Crow, W.T., Edelstein, W.N., Entin, J.K., Goodman, S.D., Jackson, T.J., Johnson, J., Kimball, J., Piepmeier, J.R., Koster, R., Martin, N., McDonald, K.C., Moghaddam, M., Moran, S., Reichle, R., Shi, J.C., Spencer, M.W., Thurman, S.W., Tsang, L., & Van Zyl, J. (2010). The Soil Moisture Active Passive (SMAP) mission. Proceedings of the IEEE, 98, 704-716. Hornacek, M., Wagner, W., Sabel, D., Truong, H.L., Snoeij, P., Hahmann, T., Diedrich, E., & Doubkova, M. (2012). Potential for High Resolution Systematic Global Surface Soil Moisture Retrieval via Change Detection Using Sentinel-1. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 5, 1303-1311. Wagner, W., Hahn, S., Kidd, R., Melzer, T., Bartalis, Z., Hasenauer, S., Figa-Saldana, J., De Rosnay, P., Jann, A., Schneider, S., Komma, J., Kubu, G., Brugger, K., Aubrecht, C., Züger, C., Gangkofer, U., Kienberger, S., Brocca, L., Wang, Y., Blöschl, G., Eitzinger, J., Steinnocher, K., Zeil, P., & Rubel, F. (2013). The ASCAT soil moisture product: A review of its specifications, validation results, and emerging applications. Meteorologische Zeitschrift, 22, 5-33.
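
    The change detection idea referred to above can be sketched compactly: scale each backscatter measurement between historically driest and wettest reference levels. The operational TU Wien algorithm additionally normalizes for incidence angle and vegetation effects; the fragment below shows only the core scaling.

    ```python
    import numpy as np

    def change_detection_ssm(sigma0_db, dry_db, wet_db):
        """Relative surface soil moisture (degree of saturation, 0 = dry,
        1 = saturated) from C-band backscatter via change detection.

        sigma0_db: backscatter time series (dB); dry_db/wet_db: historically
        lowest (dry) and highest (wet) backscatter reference levels (dB).
        """
        ssm = (np.asarray(sigma0_db) - dry_db) / (wet_db - dry_db)
        return np.clip(ssm, 0.0, 1.0)

    print(change_detection_ssm([-14.0, -10.5, -8.0], dry_db=-15.0, wet_db=-8.0))
    ```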

  11. Impact of scatterometer wind (ASCAT-A/B) data assimilation on semi real-time forecast system at KIAPS

    NASA Astrophysics Data System (ADS)

    Han, H. J.; Kang, J. H.

    2016-12-01

    Since July 2015, KIAPS (Korea Institute of Atmospheric Prediction Systems) has been running a semi real-time forecast system to assess the performance of its forecast system as an NWP model. KPOP (KIAPS Protocol for Observation Processing) is part of the KIAPS data assimilation system and has been performing well in the KIAPS semi real-time forecast system. In this study, since KPOP is now able to handle scatterometer wind data, we analyze the effect of scatterometer winds (ASCAT-A/B) on the KIAPS semi real-time forecast system. The O-B (observation minus background) global distribution and statistics of the scatterometer winds provide two pieces of information: the difference between the background field and the observations is not too large, and KPOP processes the scatterometer wind data well. The changes in the analysis increment due to the O-B global distribution appear most clearly at the bottom of the atmospheric field. They also show that scatterometer wind data cover wide ocean areas where observations would otherwise be scarce. The performance of the scatterometer wind data can be checked through the vertical error reduction against IFS between the background and analysis fields and through the vertical statistics of O-A (observation minus analysis). From these analysis results, we conclude that scatterometer wind data have a positive effect on the lower-level performance of the semi real-time forecast system at KIAPS. In future work, the long-term effect of the scatterometer wind data will be analyzed.
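
    The O-B and O-A diagnostics discussed above are simple to compute. A generic sketch (not the KPOP implementation):

    ```python
    import numpy as np

    def innovation_stats(obs, background, analysis):
        """Observation-space diagnostics: a healthy assimilation typically
        shows smaller O-A bias and spread than O-B."""
        omb = np.asarray(obs) - np.asarray(background)   # innovations
        oma = np.asarray(obs) - np.asarray(analysis)     # analysis residuals
        return {"O-B bias": omb.mean(), "O-B std": omb.std(ddof=1),
                "O-A bias": oma.mean(), "O-A std": oma.std(ddof=1)}
    ```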

  12. An Initial Assessment of the Impact of CYGNSS Ocean Surface Wind Assimilation on Navy Global and Mesoscale Numerical Weather Prediction

    NASA Astrophysics Data System (ADS)

    Baker, N. L.; Tsu, J.; Swadley, S. D.

    2017-12-01

    We assess the impact of assimilation of CYclone Global Navigation Satellite System (CYGNSS) ocean surface winds observations into the NAVGEM[i] global and COAMPS®[ii] mesoscale numerical weather prediction (NWP) systems. Both NAVGEM and COAMPS® used the NRL 4DVar assimilation system NAVDAS-AR[iii]. Long term monitoring of the NAVGEM Forecast Sensitivity Observation Impact (FSOI) indicates that the forecast error reduction for ocean surface wind vectors (ASCAT and WindSat) are significantly larger than for SSMIS wind speed observations. These differences are larger than can be explained by simply two pieces of information (for wind vectors) versus one (wind speed). To help understand these results, we conducted a series of Observing System Experiments (OSEs) to compare the assimilation of ASCAT wind vectors with the equivalent (computed) ASCAT wind speed observations. We found that wind vector assimilation was typically 3 times more effective at reducing the NAVGEM forecast error, with a higher percentage of beneficial observations. These results suggested that 4DVar, in the absence of an additional nonlinear outer loop, has limited ability to modify the analysis wind direction. We examined several strategies for assimilating CYGNSS ocean surface wind speed observations. In the first approach, we assimilated CYGNSS as wind speed observations, following the same methodology used for SSMIS winds. The next two approaches converted CYGNSS wind speed to wind vectors, using NAVGEM sea level pressure fields (following Holton, 1979), and using NAVGEM 10-m wind fields with the AER Variational Analysis Method. Finally, we compared these methods to CYGNSS wind speed assimilation using multiple outer loops with NAVGEM Hybrid 4DVar. Results support the earlier studies suggesting that NAVDAS-AR wind speed assimilation is sub-optimal. We present detailed results from multi-month NAVGEM assimilation runs along with case studies using COAMPS®. Comparisons include the fit of analyses and forecasts with in-situ observations and analyses from other NWP centers (e.g. ECMWF and GFS). [i] NAVy Global Environmental Model [ii] COAMPS® is a registered trademark of the Naval Research Laboratory for the Navy's Coupled Ocean Atmosphere Mesoscale Prediction System. [iii] NRL Atmospheric Variational Data Assimilation System

  13. Assimilation of ASCAT near-surface soil moisture into the SIM hydrological model over France

    NASA Astrophysics Data System (ADS)

    Draper, C.; Mahfouf, J.-F.; Calvet, J.-C.; Martin, E.; Wagner, W.

    2011-12-01

    This study examines whether the assimilation of remotely sensed near-surface soil moisture observations might benefit an operational hydrological model, specifically Météo-France's SAFRAN-ISBA-MODCOU (SIM) model. Soil moisture data derived from ASCAT backscatter observations are assimilated into SIM using a Simplified Extended Kalman Filter (SEKF) over 3.5 years. The benefit of the assimilation is tested by comparison to a delayed cut-off version of SIM, in which the land surface is forced with more accurate atmospheric analyses, due to the availability of additional atmospheric observations after the near-real time data cut-off. However, comparing the near-real time and delayed cut-off SIM models revealed that the main difference between them is a dry bias in the near-real time precipitation forcing, which resulted in a dry bias in the root-zone soil moisture and associated surface moisture flux forecasts. While assimilating the ASCAT data did reduce the root-zone soil moisture dry bias (by nearly 50%), this was more likely due to a bias within the SEKF, than due to the assimilation having accurately responded to the precipitation errors. Several improvements to the assimilation are identified to address this, and a bias-aware strategy is suggested for explicitly correcting the model bias. However, in this experiment the moisture added by the SEKF was quickly lost from the model surface due to the enhanced surface fluxes (particularly drainage) induced by the wetter soil moisture states. Consequently, by the end of each winter, during which frozen conditions prevent the ASCAT data from being assimilated, the model land surface had returned to its original (dry-biased) climate. This highlights that it would be more effective to address the precipitation bias directly, than to correct it by constraining the model soil moisture through data assimilation.

  14. The CMEMS L3 scatterometer wind product

    NASA Astrophysics Data System (ADS)

    de Kloe, Jos; Stoffelen, Ad; Verhoef, Anton

    2017-04-01

    Within the Copernicus Marine Environment Monitoring Service (CMEMS), KNMI produces several ocean surface Level 3 wind products. These are daily updated global maps, on a regular grid, of the available scatterometer wind observations and derived properties, produced by linear interpolation from our EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI SAF) operational near-real time (NRT) Level 2 swath-based wind products. Currently available products are the ASCAT on Metop-A/B stress equivalent wind vectors, accompanied by reference stress equivalent winds from the operational ECMWF NWP model. For each ASCAT scatterometer we provide products at two resolutions, 0.25 and 0.125 degrees. In addition we provide wind stress vectors and derivative fields (curl and divergence) for stress equivalent wind and wind stress, both for the observations and for the NWP reference winds. New NRT scatterometer products will be made available when additional scatterometer instruments become available and NRT access to the data can be arranged. We hope OSCAT on the Indian ScatSat-1 satellite will be the next NRT product to be added. In addition, multi-year reprocessed datasets have been made available for ASCAT on Metop-A (1-Jan-2007 up to 31-Mar-2014) and Seawinds on QuikScat (19-Jul-1999 up to 21-Nov-2009). For ASCAT, 0.25 and 0.125 degree resolution products are provided, and for QuikScat, 0.50 and 0.25 degree resolution products are provided. These products are based on reprocessing the L2 scatterometer products with the latest processing software version, and include reference winds from the ECMWF ERA-Interim model. Additional reprocessed datasets will be added when reprocessed L2 datasets become available. This will hopefully include the ERS-1 and ERS-2 scatterometer datasets (1992-2001), which would extend the available date range back to 1992. These products are available for download through the CMEMS portal website: http://marine.copernicus.eu/
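
    Gridding L2 swath winds onto a regular L3 grid by linear interpolation, as described above, can be sketched as follows. This is a minimal stand-in for the CMEMS gridding step: the operational chain also handles quality control, ambiguity removal and multiple passes per day, and the grid choices here are assumptions.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def l2_swath_to_l3_grid(lon_swath, lat_swath, wind_u, grid_res=0.25):
        """Resample irregular L2 swath wind samples onto a regular
        lat/lon grid by linear interpolation."""
        lons = np.arange(-180.0, 180.0, grid_res)
        lats = np.arange(-90.0, 90.0, grid_res)
        lon2d, lat2d = np.meshgrid(lons, lats)
        pts = np.column_stack([lon_swath, lat_swath])
        return griddata(pts, wind_u, (lon2d, lat2d), method="linear")
    ```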

  15. Coastal and rain-induced wind variability depicted by scatterometers

    NASA Astrophysics Data System (ADS)

    Portabella, M.; Lin, W.; Stoffelen, A.; Turiel, A.; Verhoef, A.; Verspeek, J.; Ballabrera, J.; Vogelzang, J.

    2012-04-01

    A detailed knowledge of local wind variability near the shore is very important, since it strongly affects the weather and microclimate in coastal regions. Since coastal areas are densely populated and most activity at sea occurs near the shore, sea-surface wind field information is important for a number of applications. In the vicinity of land, sea-breeze, wave-fetch, katabatic and current effects are more likely than in the open ocean, thus enhancing air-sea interaction. Also very relevant for air-sea interaction are rain-induced phenomena, such as downbursts and convergence: relatively cold and dry air is effectively transported to the ocean surface and surface winds are enhanced. In general, both coastal and rain-induced wind variability are poorly resolved by Numerical Weather Prediction (NWP) models. Satellite real aperture radars (i.e., scatterometers) are known to provide accurate mesoscale (25-50 km resolution) sea surface wind field information used in a wide variety of applications. Nowadays, there are two operating scatterometers in orbit, i.e., the C-band Advanced Scatterometer (ASCAT) onboard Metop-A and the Ku-band scatterometer (OSCAT) onboard Oceansat-2. The EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI SAF) delivers several ASCAT level 2 wind products with 25 km and 12.5 km Wind Vector Cell (WVC) spacing, including a pre-operational coastal wind product, as well as an OSCAT level 2 wind product with 50 km spacing in development status. Rain is known to both attenuate and scatter the microwave signal. In addition, there is a "splashing" effect: the roughness of the sea surface is increased by the impact of rain drops. This so-called "rain contamination" is larger for Ku-band scatterometer systems than for C-band systems. Moreover, the associated downdrafts lead to variable wind speeds and directions, further complicating the wind retrieval. The C-band ASCAT high resolution wind processing is validated under rainy conditions, using collocations with the Tropical Rainfall Measuring Mission's (TRMM) Microwave Imager (TMI) rain data and the tropical moored buoy wind and precipitation data. It turns out that the effect of low and moderate rain appears mainly as increased wind variability near the surface and that, unlike for Ku-band scatterometers, the rain rate itself does not appear to be a clear limiting factor in ASCAT wind quality. Moreover, the downburst patterns observed by ASCAT are unique and have large implications for air-sea exchange. At the conference, the main progress in scatterometer wind data processing will be shown.

  16. Assessing seasonal backscatter variations with respect to uncertainties in soil moisture retrieval in Siberian tundra regions

    NASA Astrophysics Data System (ADS)

    Högström, Elin; Trofaier, Anna Maria; Gouttevin, Isabella; Bartsch, Annett

    2015-04-01

    Data from the Advanced Scatterometer (ASCAT) instrument provide the basis of a near real-time, coarse scale, global soil moisture product. Numerous studies have shown the applicability of this product, including recent operational use in numerical weather forecasting. Soil moisture is a key element in the global cycles of water, energy and carbon. Among many application areas, it is essential for understanding permafrost development under future climate change. Dramatic climate changes are expected in the Arctic, where ca 25% of the land is underlain by permafrost and which is to a large extent remote and inaccessible. The availability and applicability of satellite-derived land-surface data relevant for permafrost studies, such as surface soil moisture, is thus crucial to landscape-scale analyses of climate-induced change. However, there are challenges in the soil moisture retrieval that are specific to the Arctic. This study investigates backscatter variability unrelated to soil moisture variations in order to understand its possible impact on the soil moisture retrieval. The focus is on tundra lakes, which are a common feature in the Arctic and are expected to affect the retrieval. ENVISAT Advanced Synthetic Aperture Radar (ASAR) Wide Swath (120 m) data are used to resolve lakes and, in turn, to understand and quantify their impact on the Metop ASCAT (25 km) soil moisture retrieval during the snow-free period. Sites of interest are chosen according to high or low agreement between output from the land surface model ORCHIDEE and ASCAT-derived SSM. The results show that in most cases low model agreement is related to a high water fraction. The water fraction correlates with backscatter deviations (relative to a smooth water surface reference image) within the ASCAT footprint areas (R = 0.91-0.97). Backscatter deviations of up to 5 dB can occur in areas with less than 50% water fraction and an assumed soil moisture related range (sensitivity) of 7 dB in the ASCAT data. The study demonstrates that the use of higher spatial resolution data than currently available for SSM is required in lowland permafrost environments. Furthermore, the results show that in the flat and open Arctic tundra areas, wind likely affects the soil moisture retrieval procedure more than rain or remaining ice cover on the water surface. Therefore, the potential of a wind correction method is explored for sites where meteorological field data are available.

  17. Variability in Arctic sea ice topography and atmospheric form drag: Combining IceBridge laser altimetry with ASCAT radar backscatter.

    NASA Astrophysics Data System (ADS)

    Petty, A.; Tsamados, M.; Kurtz, N. T.

    2016-12-01

    Here we present atmospheric form drag estimates over Arctic sea ice using high resolution, three-dimensional surface elevation data from NASA's Operation IceBridge Airborne Topographic Mapper (ATM), and surface roughness estimates from the Advanced Scatterometer (ASCAT). Surface features of the ice pack (e.g. pressure ridges) are detected using IceBridge ATM elevation data and a novel surface feature-picking algorithm. We use simple form drag parameterizations to convert the observed height and spacing of surface features into an effective atmospheric form drag coefficient, as sketched below. The results demonstrate strong regional variability in the atmospheric form drag coefficient, linked to variability in both the height and spacing of surface features. This includes form drag estimates around 2-3 times higher over the multiyear ice north of Greenland compared to the first-year ice of the Beaufort/Chukchi seas. We compare results from both scanning and linear profiling to ensure our results are consistent with previous studies investigating form drag over Arctic sea ice. A strong correlation between ASCAT surface roughness estimates (using radar backscatter) and the IceBridge form drag results enables us to extrapolate the IceBridge data collected over the western Arctic across the entire Arctic Ocean. While our focus is on spring, due to the timing of the primary IceBridge campaigns since 2009, we also take advantage of the autumn data collected by IceBridge in 2015 to investigate seasonality in Arctic ice topography and the resulting form drag coefficient. Our results offer the first large-scale assessment of atmospheric form drag over Arctic sea ice due to variable ice topography (i.e. within the Arctic pack ice). The analysis is being extended to the Antarctic IceBridge sea ice data, and the results are being used to calibrate a sophisticated form drag parameterization scheme included in the sea ice model CICE, to improve the representation of form drag over Arctic and Antarctic sea ice in global climate models.
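
    A toy sketch of the height-and-spacing conversion, in the spirit of the "simple form drag parameterizations" mentioned above (drag scaling with mean obstacle height over mean obstacle spacing). The feature picking, threshold and constant c_w are illustrative assumptions, not the paper's algorithm.

    ```python
    import numpy as np

    def form_drag_coefficient(elevation, dx, ridge_threshold=0.6, c_w=0.3):
        """Effective form drag coefficient from a 1-D elevation profile.

        elevation: surface height above local level ice (m), sampled every dx m.
        """
        z = np.asarray(elevation, dtype=float)
        # Crude feature picking: local maxima exceeding the ridge threshold.
        idx = [i for i in range(1, z.size - 1)
               if z[i] > ridge_threshold and z[i] >= z[i - 1] and z[i] >= z[i + 1]]
        if len(idx) < 2:
            return 0.0
        mean_height = z[idx].mean()
        mean_spacing = np.diff(idx).mean() * dx
        # Drag proportional to sail height over spacing.
        return 0.5 * c_w * mean_height / mean_spacing
    ```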

  18. The Offshore New European Wind Atlas

    NASA Astrophysics Data System (ADS)

    Karagali, I.; Hahmann, A. N.; Badger, M.; Hasager, C.; Mann, J.

    2017-12-01

    The New European Wind Atlas (NEWA) is a joint effort of research agencies from eight European countries, co-funded under the ERANET Plus Program. The project is structured around two areas of work: development of dynamical downscaling methodologies and measurement campaigns to validate these methodologies, leading to the creation and publication of a European wind atlas in electronic form. This atlas will contain an offshore component extending 100 km from the European coasts. To achieve this, mesoscale models along with various observational datasets are utilised. Scanning lidars located at the coastline were used to evaluate the coastal wind gradient reproduced by the mesoscale model. Currently, an experimental campaign is under way in the Baltic Sea, with a lidar operated on a commercial ship sailing from Germany to Lithuania, thus covering the entire span of the south Baltic basin. In addition, satellite wind retrievals from scatterometers and Synthetic Aperture Radar (SAR) instruments were used to generate mean wind field maps, validate offshore modelled wind fields and identify the optimal model set-up parameters. The aim of this study is to compare the initial outputs of the offshore wind atlas produced by the Weather Research and Forecasting (WRF) model, still in its pre-operational phase, with the METOP-A/B Advanced Scatterometer (ASCAT) wind fields, reprocessed to stress equivalent winds at 10 m. Different experiments were set up to evaluate the model sensitivity for the various domains covered by the NEWA offshore atlas. ASCAT winds were utilised to assess the performance of the WRF offshore atlases. In addition, ASCAT winds were used to create an offshore atlas covering the years 2007 to 2016, capturing the signature of various spatial wind features, such as channelling and lee effects from complex coastal topographical elements.

  19. Four Decades of Microwave Satellite Soil Moisture Observations: Product validation and inter-satellite comparisons

    NASA Astrophysics Data System (ADS)

    Lanka, K.; Pan, M.; Wanders, N.; Kumar, D. N.; Wood, E. F.

    2017-12-01

    Satellite-based passive and active microwave sensors have enhanced our ability to retrieve soil moisture at global scales. It has been almost four decades since the first passive microwave satellite sensor was launched in 1978. Since then, soil moisture has gained considerable attention in hydro-meteorological, climate, and agricultural research, resulting in the deployment of two dedicated missions in the last decade, SMOS and SMAP. Marking these four decades of microwave remote sensing of soil moisture, this work presents an overview of how our knowledge in this field has improved in terms of sensor design and soil moisture retrieval accuracy. We considered daily coverage, temporal performance, and spatial performance to assess the accuracy of products corresponding to eight passive sensors (SMMR, SSM/I, TMI, AMSR-E, WindSAT, AMSR2, SMOS and SMAP), two active sensors (ERS-Scatterometer, MetOp-ASCAT), and one active/passive merged soil moisture product (the ESA-CCI combined product), using 1058 ISMN in-situ stations and the VIC LSM soil moisture simulations (VICSM) over the CONUS. Our analysis indicates that daily coverage has increased from 30 % during the 1980s to 85 % (during non-winter months) with the launch of the dedicated soil moisture missions SMOS and SMAP. The temporal validation of passive and active soil moisture products with the ISMN data places the median RMSE in the range 0.06-0.10 m3/m3 and the median correlation in the range 0.20-0.68. When TMI, AMSR-E and WindSAT are evaluated, the AMSR-E sensor is found to have produced brightness temperatures of better quality, given that these sensors are paired with the same retrieval algorithm (LPRM). The ASCAT product shows a significant improvement in the temporal validation of retrievals compared to its predecessor ERS, thanks to its enhanced sensor configuration. The SMAP mission, through its improved sensor design and RFI handling, shows high retrieval accuracy under all topographic conditions. Although the retrievals from the SMOS mission are affected by issues such as RFI, their accuracy is still comparable to or better than that of the AMSR-E and ASCAT sensors. All soil moisture products indicated better agreement with the ISMN data than the VICSM, which suggests that they capture soil moisture with better accuracy than the VICSM over the CONUS.

  20. Assimilation of Passive and Active Microwave Soil Moisture Retrievals

    NASA Technical Reports Server (NTRS)

    Draper, C. S.; Reichle, R. H.; DeLannoy, G. J. M.; Liu, Q.

    2012-01-01

    Root-zone soil moisture is an important control on the partitioning of land surface energy and moisture, and the assimilation of remotely sensed near-surface soil moisture has been shown to improve modeled profile soil moisture [1]. To date, efforts to assimilate remotely sensed near-surface soil moisture at large scales have focused on soil moisture derived from the passive microwave Advanced Microwave Scanning Radiometer (AMSR-E) and the active Advanced Scatterometer (ASCAT, together with its predecessor on the European Remote Sensing (ERS) satellites). The assimilation of passive and active microwave soil moisture observations has not yet been directly compared, and so this study compares the impact of assimilating ASCAT and AMSR-E soil moisture data, both separately and together. Since the soil moisture retrieval skill from active and passive microwave data is thought to differ according to surface characteristics [2], the impact of each assimilation on the model soil moisture skill is assessed according to land cover type, by comparison to in situ soil moisture observations.

  2. Using high-resolution soil moisture modelling to assess the uncertainty of microwave remotely sensed soil moisture products at the correct spatial and temporal support

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.

    2012-04-01

    Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement of forecasting skill depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with a high temporal and spatial resolution and with global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is however hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remotely sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP, van Dam (2000)) that is upscaled to the support of the different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow up to a depth of 1.5 m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km2 for the period Jan 2010 - Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain). Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of each satellite product by averaging model results from the 1 km2 grid within the remote sensing footprint (see the sketch below). Overall, 440 (AMSR-E, SMOS) to 680 (ASCAT) time series were compared to the aggregated SWAP model results, providing valuable information on the uncertainty of satellite soil moisture at the proper support. Our results show that temporal dynamics are best captured by ASCAT, resulting in an average correlation of 0.72 with the model, while AMSR-E (0.41) and SMOS (0.42) are less capable of representing these dynamics. Standard deviations found for ASCAT and SMOS are low, 0.049 and 0.051 m3/m3 respectively, while AMSR-E has a higher value of 0.062 m3/m3. All standard deviations are higher than the average model uncertainty of 0.017 m3/m3. All satellite products show a negative bias compared to the model results, with the largest value for SMOS. Satellite uncertainty is not found to be significantly related to topography, but is found to increase in densely vegetated areas. In general, AMSR-E has the most difficulty capturing soil moisture dynamics in Spain, while SMOS and especially ASCAT show a fair to good performance. However, all products contain valuable information about the near-surface soil moisture over Spain. Van Dam, J.C., 2000, Field scale water flow and solute transport. SWAP model concepts, parameter estimation and case studies. Ph.D. thesis, Wageningen University.
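
    The support-matching step amounts to averaging all 1 km model cells inside a satellite footprint. A minimal sketch, assuming a circular footprint and flat-Earth distances (both simplifications for illustration):

    ```python
    import numpy as np

    def aggregate_to_footprint(sm_1km, lon, lat, fp_lon, fp_lat, fp_radius_km=25.0):
        """Average 1 km model soil moisture over all cells inside a satellite
        footprint, so model and retrieval share the same spatial support."""
        km_per_deg = 111.0
        d = np.hypot((lon - fp_lon) * km_per_deg * np.cos(np.deg2rad(fp_lat)),
                     (lat - fp_lat) * km_per_deg)
        inside = d <= fp_radius_km
        return np.nanmean(np.where(inside, sm_1km, np.nan))
    ```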

  3. A review of the applications of ASCAT soil moisture products

    USDA-ARS?s Scientific Manuscript database

    Remote sensing of soil moisture has reached a level of good maturity and accuracy for which the retrieved products are ready to use in real-world applications. Due to the importance of soil moisture in the partitioning of the water and energy fluxes between the land surface and the atmosphere, a wid...

  4. Global-scale assessment and combination of SMAP with ASCAT (Active) and AMSR2 (Passive) soil moisture products

    USDA-ARS?s Scientific Manuscript database

    Global-scale surface soil moisture (SSM) products retrieved from active and passive microwave remote sensing provide an effective method for monitoring near-real-time SSM content with nearly daily temporal resolution. In the present study, we first inter-compared global-scale error patterns and comb...

  5. Multiscale radar mapping of surface melt over mountain glaciers in High Mountain Asia

    NASA Astrophysics Data System (ADS)

    Steiner, N.; McDonald, K. C.

    2017-12-01

    Glacier melt dominates the input to many hydrologic systems in the Himalayan Hindukush region, feeding rivers that are critical for downstream ecosystems and hydropower generation in this highly populated area. Deviation in seasonal surface melt timing and duration with a changing climate has the potential to affect up to a billion people on the Indian Subcontinent. Satellite-borne microwave remote sensing has unique capabilities that allow monitoring of numerous landscape processes associated with snowmelt and freeze/thaw state, without many of the limitations of optical-infrared sensors, such as dependence on solar illumination or atmospheric conditions. The onset of regional freeze/thaw and surface melting transitions determines important surface hydrologic variables like river discharge. These regional events are abrupt and therefore difficult to observe with sensors of low observation frequency. The recently launched synthetic aperture radars (SAR) onboard the Sentinel-1 A and B satellites from the European Space Agency (ESA) provide wide-swath, high spatial resolution (50-100 m) C-Band SAR observations with observation frequencies not previously available, on the order of 8 to 16 days. The Sentinel SARs provide a unique opportunity to study freeze/thaw and mountain glacier melt dynamics at process-level scales, both spatial and temporal. The melt process of individual glaciers, fully resolved by imaging radar, will inform on the radiometric scattering physics associated with surface hydrology during the transition from frozen to thawed state and during refreeze. Backscatter observations, along with structural information about the surface, will be compared with complementary coarse spatial resolution C-Band radar scatterometer data from the Advanced Scatterometer (ASCAT, MetOp A+B) to understand the sub-pixel contribution of surface melting and freeze/thaw signals. This will inform on the longer-scale record of backscatter from ASCAT, 2006-2017. We present a comparison of polarimetric C-Band melt signals contained in the multi-scale backscatter and present coincident freeze/thaw and snowmelt records from ASCAT and Sentinel-1 for the Gandaki basin, Nepal.
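
    Melt detection from C-band backscatter typically exploits the strong drop in backscatter when snow becomes wet. A hedged sketch of such a threshold test (the 3 dB value is a common rule of thumb, not the study's calibrated choice):

    ```python
    import numpy as np

    def melt_flags(sigma0_db, winter_ref_db, threshold_db=3.0):
        """Flag surface melt in a C-band backscatter time series: wet snow
        strongly absorbs microwaves, so melt onset appears as a pronounced
        backscatter drop below a dry/frozen winter reference level."""
        return np.asarray(sigma0_db) < (winter_ref_db - threshold_db)

    series = [-8.0, -8.5, -13.2, -14.0, -9.0]   # dB, e.g. over a glacier
    print(melt_flags(series, winter_ref_db=-8.2))
    # -> [False False  True  True False]
    ```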

  6. Simulation of the mineral dust emission over Northern Africa and Middle East using an aerodynamic roughness length map derived from the ASCAT/PARASOL

    NASA Astrophysics Data System (ADS)

    Basart, Sara; Jorba, Oriol; Pérez García-Pando, Carlos; Prigent, Catherine; Baldasano, Jose M.

    2014-05-01

    Aeolian aerodynamic roughness length in arid regions is a key parameter to predict the vulnerability of the surface to wind erosion and, as a consequence, the related production of mineral aerosol (e.g. Laurent et al., 2008). Recently, satellite-derived roughness lengths at the global scale have emerged and provide the opportunity to use them in advanced emission schemes in global and regional models (e.g. Menut et al., 2013). A global map of the aeolian aerodynamic roughness length at high resolution (6 km) is derived for arid and semi-arid regions by merging PARASOL and ASCAT data. It shows very good consistency with the existing information on the properties of these surfaces. The dataset is available to the community for use in atmospheric dust transport models. The present contribution analyses the behaviour of the NMMB/BSC-Dust model (Pérez et al., 2011) when the ASCAT/PARASOL satellite-derived global roughness length (Prigent et al., 2012) and the State Soil Geographic database of the Food and Agriculture Organization of the United Nations (STATSGO-FAO) soil texture data set (based on wet techniques) are used. We explore the sensitivity of the drag partition scheme (a critical component of the dust emission scheme) and of the dust vertical fluxes (intensity and spatial patterns) to the roughness length. An annual evaluation of NMMB/BSC-Dust (for the year 2011) over Northern Africa and the Middle East using observed aerosol optical depths (AODs) from Aerosol Robotic Network sites and aerosol satellite products (MODIS and MISR) will be discussed. Laurent, B., Marticorena, B., Bergametti, G., Leon, J. F., and Mahowald, N. M.: Modeling mineral dust emissions from the Sahara desert using new surface properties and soil database, J. Geophys. Res., 113, D14218, doi:10.1029/2007JD009484, 2008. Menut, L., C. Pérez, K. Haustein, B. Bessagnet, C. Prigent, and S. Alfaro, Impact of surface roughness and soil texture on mineral dust emission fluxes modeling, J. Geophys. Res. Atmos., 118, 6505-6520, doi:10.1002/jgrd.50313, 2013. Pérez, C., Haustein, K., Janjic, Z., Jorba, O., Huneeus, N., Baldasano, J. M. and Thomson, M. Atmospheric dust modeling from meso to global scales with the online NMMB/BSC-Dust model-Part 1: Model description, annual simulations and evaluation. Atmospheric Chemistry and Physics, 11(24), 13001-13027, 2011. Prigent, C., Jiménez, C., and Catherinot, J.: Comparison of satellite microwave backscattering (ASCAT) and visible/near-infrared reflectances (PARASOL) for the estimation of aeolian aerodynamic roughness length in arid and semi-arid regions, Atmos. Meas. Tech., 5, 2703-2712, doi:10.5194/amt-5-2703-2012, 2012.
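
    For context, drag partition schemes of the kind probed here are often computed with the widely cited Marticorena & Bergametti (1995) formulation, which gives the efficient fraction of wind stress acting on the erodible surface as a function of the roughness length (as applied, e.g., in Menut et al., 2013). Whether NMMB/BSC-Dust uses exactly this expression is not stated in the abstract, so treat the sketch as illustrative.

    ```python
    import numpy as np

    def drag_partition_feff(z0_cm, z0s_cm=0.003):
        """Efficient fraction of the wind friction velocity acting on the
        erodible surface, after Marticorena & Bergametti (1995).

        z0_cm: aeolian aerodynamic roughness length (cm), e.g. from the
        ASCAT/PARASOL map; z0s_cm: smooth roughness length (cm), assumed.
        """
        z0 = np.asarray(z0_cm, dtype=float)
        return 1.0 - np.log(z0 / z0s_cm) / np.log(0.35 * (10.0 / z0s_cm) ** 0.8)

    # Rougher surfaces shelter the soil and reduce the erosive stress fraction:
    print(drag_partition_feff([0.003, 0.01, 0.1]))
    ```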

  7. Sequenza: allele-specific copy number and mutation profiles from tumor sequencing data.

    PubMed

    Favero, F; Joshi, T; Marquard, A M; Birkbak, N J; Krzystanek, M; Li, Q; Szallasi, Z; Eklund, A C

    2015-01-01

    Exome or whole-genome deep sequencing of tumor DNA along with paired normal DNA can potentially provide a detailed picture of the somatic mutations that characterize the tumor. However, analysis of such sequence data can be complicated by the presence of normal cells in the tumor specimen, by intratumor heterogeneity, and by the sheer size of the raw data. In particular, determination of copy number variations from exome sequencing data alone has proven difficult; thus, single nucleotide polymorphism (SNP) arrays have often been used for this task. Recently, algorithms to estimate absolute, but not allele-specific, copy number profiles from tumor sequencing data have been described. We developed Sequenza, a software package that uses paired tumor-normal DNA sequencing data to estimate tumor cellularity and ploidy, and to calculate allele-specific copy number profiles and mutation profiles. We applied Sequenza, as well as two previously published algorithms, to exome sequence data from 30 tumors from The Cancer Genome Atlas. We assessed the performance of these algorithms by comparing their results with those generated using matched SNP arrays and processed by the allele-specific copy number analysis of tumors (ASCAT) algorithm. Comparison between Sequenza/exome and SNP/ASCAT revealed strong correlation in cellularity (Pearson's r = 0.90) and ploidy estimates (r = 0.42, or r = 0.94 after manually inspecting alternative solutions). This performance was noticeably superior to the previously published algorithms. In addition, in artificial data simulating normal-tumor admixtures, Sequenza detected the correct ploidy in samples with tumor content as low as 30%. The agreement between Sequenza and SNP array-based copy number profiles suggests that exome sequencing alone is sufficient not only for identifying small-scale mutations but also for estimating cellularity and inferring DNA copy number aberrations. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology.

  8. Impact of SMOS soil moisture data assimilation on NCEP-GFS forecasts

    NASA Astrophysics Data System (ADS)

    Zhan, X.; Zheng, W.; Meng, J.; Dong, J.; Ek, M.

    2012-04-01

    Soil moisture is one of the few critical land surface state variables that have a long memory and impact the exchanges of water, energy and carbon between the land surface and the atmosphere. Accurate information about soil moisture status is thus required for numerical weather, seasonal climate and hydrological forecasts, as well as for agricultural production forecasts, water management and many other water-related economic or social activities. Since the successful launch of ESA's Soil Moisture Ocean Salinity (SMOS) mission in November 2009, about 2 years of soil moisture retrievals have been collected. SMOS is believed to be currently the best satellite sensor for soil moisture remote sensing. It is therefore interesting to examine how the collected SMOS soil moisture data compare with other satellite-sensed soil moisture retrievals (such as NASA's Advanced Microwave Scanning Radiometer (AMSR-E) and EUMETSAT's Advanced Scatterometer (ASCAT)) and with in situ soil moisture measurements, and how these data sets impact numerical weather prediction models such as the Global Forecast System (GFS) of NOAA-NCEP. This study implements the Ensemble Kalman Filter in GFS to assimilate the AMSR-E, ASCAT and SMOS soil moisture observations, after a quantitative assessment of their error rates based on in situ measurements from ground networks around the contiguous United States. In situ soil moisture measurements from ground networks (such as the USDA Soil Climate Analysis Network (SCAN) and NOAA's U.S. Climate Reference Network (USCRN)) are used to evaluate the GFS soil moisture simulations (analyses). The benefits and uncertainties of assimilating the satellite data products in GFS are examined by comparing the GFS forecasts of surface temperature and rainfall with and without the assimilation. From these examinations, the advantages of SMOS soil moisture data products over other satellite soil moisture data sets will be evaluated. The next step toward operationally assimilating soil moisture and other land observations into GFS will also be discussed.
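
    For reference, the analysis step of a generic ensemble Kalman filter with perturbed observations is sketched below for a scalar soil moisture observation; this is the textbook update, not NCEP's operational implementation.

      # Generic EnKF analysis step; a sketch of the approach, not the GFS code.
      import numpy as np

      def enkf_update(X, y, H, r):
          # X: (n_state, n_ens) state ensemble; y: scalar observation;
          # H: (1, n_state) observation operator; r: obs error variance.
          n = X.shape[1]
          Xp = X - X.mean(axis=1, keepdims=True)           # state perturbations
          HX = H @ X                                       # ensemble in obs space
          HXp = HX - HX.mean(axis=1, keepdims=True)
          P_HT = Xp @ HXp.T / (n - 1)                      # cov(state, obs)
          HPH = (HXp @ HXp.T) / (n - 1)                    # obs-space variance
          K = P_HT / (HPH + r)                             # Kalman gain
          y_pert = y + np.sqrt(r) * np.random.randn(1, n)  # perturbed obs
          return X + K @ (y_pert - HX)                     # analysis ensemble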

  9. Downscaling soil moisture over East Asia through multi-sensor data fusion and optimization of regression trees

    NASA Astrophysics Data System (ADS)

    Park, Seonyoung; Im, Jungho; Park, Sumin; Rhee, Jinyoung

    2017-04-01

    Soil moisture is one of the most important keys for understanding regional and global climate systems. Soil moisture is directly related to agricultural as well as hydrological processes, because it strongly influences vegetation growth and determines water supply in the agroecosystem. Accurate monitoring of the spatiotemporal pattern of soil moisture is therefore important. Soil moisture has generally been provided through in situ measurements at stations. Although field survey from in situ measurements provides accurate soil moisture with high temporal resolution, it is costly and does not provide the spatial distribution of soil moisture over large areas. Microwave satellites (e.g., the Advanced Microwave Scanning Radiometer 2 (AMSR2), the Advanced Scatterometer (ASCAT), and Soil Moisture Active Passive (SMAP)) and numerical models such as the Global Land Data Assimilation System (GLDAS) and the Modern-Era Retrospective Analysis for Research and Applications (MERRA) provide spatiotemporally continuous soil moisture products at the global scale. However, since those global soil moisture products have coarse spatial resolution (25-40 km), their applications for agriculture and water resources at local and regional scales are very limited. Thus, soil moisture downscaling is needed to overcome the limitation of the spatial resolution of soil moisture products. In this study, GLDAS soil moisture data were downscaled to 1 km spatial resolution through the integration of AMSR2 and ASCAT soil moisture data, the Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM), and Moderate Resolution Imaging Spectroradiometer (MODIS) data (Land Surface Temperature, Normalized Difference Vegetation Index, and Land Cover), using modified regression trees over East Asia from 2013 to 2015. The modified regression trees were implemented using Cubist, a commercial software tool based on machine learning. An optimization based on pruning of the rules derived from the modified regression trees was conducted. Root Mean Square Error (RMSE) and correlation coefficients (r) were used to optimize the rules, and 59 rules from the modified regression trees were finally produced. The results show a high validation r (0.79) and a low validation RMSE (0.0556 m3/m3). The 1 km downscaled soil moisture was evaluated using ground soil moisture data at 14 stations, and both soil moisture data sets showed similar temporal patterns (average r = 0.51 and average RMSE = 0.041). The spatial distribution of the 1 km downscaled soil moisture corresponded well with GLDAS soil moisture, capturing both extremely dry and wet regions. Correlation between GLDAS and the 1 km downscaled soil moisture during the growing season was positive (mean r = 0.35) in most regions.
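
    Since Cubist is commercial, a minimal open sketch of the downscaling logic can be written with an ordinary regression tree: train on predictors aggregated to the coarse GLDAS grid, then apply the trained model to the same predictors at 1 km. All arrays below are synthetic placeholders for the LST/NDVI/DEM/land-cover/soil-moisture predictor stack.

      # Regression-tree downscaling sketch; a plain sklearn tree stands in for
      # Cubist's rule-based piecewise-linear models, so this is illustrative only.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      X_coarse = rng.random((5000, 5))        # predictors on the GLDAS grid
      y_coarse = X_coarse @ np.array([-0.3, 0.4, -0.1, 0.25, 0.25]) \
                 + 0.05 * rng.standard_normal(5000)   # GLDAS SM target

      tree = DecisionTreeRegressor(max_depth=8, min_samples_leaf=50)
      tree.fit(X_coarse, y_coarse)

      X_fine = rng.random((100000, 5))        # same predictors at 1 km
      sm_1km = tree.predict(X_fine)           # downscaled soil moisture field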

  10. Evaluating the Utility of Satellite Soil Moisture Retrievals over Irrigated Areas and the Ability of Land Data Assimilation Methods to Correct for Unmodeled Processes

    NASA Technical Reports Server (NTRS)

    Kumar, S. V.; Peters-Lidard, C. D.; Santanello, J. A.; Reichle, R. H.; Draper, C. S.; Koster, R. D.; Nearing, G.; Jasinski, M. F.

    2015-01-01

    Earth's land surface is characterized by tremendous natural heterogeneity and human-engineered modifications, both of which are challenging to represent in land surface models. Satellite remote sensing is often the most practical and effective method to observe the land surface over large geographical areas. Agricultural irrigation is an important human-induced modification of natural land surface processes, because it is pervasive across the world and has a significant influence on the regional and global water budgets. In this article, irrigation is used as an example of a human-engineered, often unmodeled land surface process, and the utility of satellite soil moisture retrievals over irrigated areas in the continental US is examined. Such retrievals are based on passive or active microwave observations from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E), the Advanced Microwave Scanning Radiometer 2 (AMSR2), the Soil Moisture Ocean Salinity (SMOS) mission, WindSat and the Advanced Scatterometer (ASCAT). The analysis suggests that the skill of these retrievals for representing irrigation effects is mixed, with ASCAT-based products somewhat more skillful than SMOS and AMSR2 products. The article then examines the suitability of typical bias correction strategies in current land data assimilation systems when unmodeled processes dominate the bias between the model and the observations. Using a suite of synthetic experiments that includes bias correction strategies such as quantile mapping and trained forward modeling, it is demonstrated that these bias correction practices lead to the exclusion of the signals from unmodeled processes if those processes are the major source of the biases. It is further shown that new methods are needed to preserve the observational information about unmodeled processes during data assimilation.
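
    Quantile mapping (CDF matching) is the bias correction strategy most relevant to the argument here; a minimal empirical version is sketched below, not the exact operator used in the study. The closing comment notes the article's central caveat.

      # Minimal empirical quantile mapping: map each observation to the model
      # value at the same empirical quantile.
      import numpy as np

      def cdf_match(obs, model):
          obs, model = np.asarray(obs), np.asarray(model)
          ranks = np.searchsorted(np.sort(obs), obs, side="right") / obs.size
          return np.quantile(model, np.clip(ranks, 0.0, 1.0))

      # Side effect discussed in the article: if irrigation raises obs in
      # summer but the model never irrigates, matching quantiles removes
      # precisely that (real) signal along with the bias.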

  11. A near real time scenario at regional scale for the hydrogeological risk

    NASA Astrophysics Data System (ADS)

    Ponziani, F.; Stelluti, M.; Zauri, R.; Berni, N.; Brocca, L.; Moramarco, T.; Salciarini, D.; Tamagnini, C.

    2012-04-01

    The early warning systems dedicated to landslides and floods represent the Umbria Region Civil Protection Service's new generation of tools for hydraulic and hydrogeological risk reduction. Following past analyses performed by the Functional Centre (the part of the civil protection service dedicated to the monitoring and evaluation of natural hazards) on the relationship between saturated soil conditions and rainfall thresholds, we have developed an automated early warning system for landslide risk, called LANDWARN, which generates daily and 72 h forecast risk matrices on a dense 100 x 100 m mesh throughout the region. The system is based on: (a) the rainfall observed over the previous 20 days and predicted over the next 72 h, provided by the local meteorological network and the local-scale meteorological model COSMO ME; (b) the assessment of soil saturation through the daily extraction of ASCAT satellite data, data from a network of 16 TDR sensors, and a water balance model (developed by the Research Institute for Geo-Hydrological Protection, CNR, Perugia, Italy) that predicts a saturation index for each point of the analysis grid up to 72 h ahead; (c) a Web-GIS platform that combines the grids of calculated hazard indicators with layers of landslide susceptibility and vulnerability of the territory, in order to produce dynamic risk scenarios. The system is still under development and is implemented at different scales: the entire region, and a set of known high-risk landslides in Umbria. The system is monitored and regularly reviewed through back analysis of landslide reports for which the activation date is available. Up to now, the development of the system has involved: (a) improving the reliability of the soil saturation assessment, a key parameter used to dynamically adjust the rainfall thresholds that trigger the declaration of landslide hazard levels; for this purpose, a procedure was created for the daily download of ASCAT satellite data, used to derive a soil water index (SWI), and these data are compared with instrumental data from the TDR stations and with the results of the water balance model, which evaluates the contributions of water infiltration, percolation, evapotranspiration, etc., using physically based parameters obtained through a long process of characterization of soil and rock types for each grid point; (b) assessing the contribution of snow melt; (c) coupling a physically based, GIS-based slope stability analysis model, developed by the Department of Civil and Environmental Engineering, University of Perugia, in order to introduce the actual mechanical and physical characteristics of slopes into the analysis. The system delivers daily near real-time and 24, 48 and 72 h forecast risk scenarios that the Department of Civil Protection Service intends to be used by the Functional Centre for its institutional tasks of hydrogeological risk evaluation and management, and also by local administrations involved in the monitoring and assessment of landslide risk, which can provide feedback on the effectiveness of the scenarios produced.
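
    The SWI mentioned in point (a) is commonly obtained from ASCAT surface soil moisture with a recursive exponential filter; a minimal sketch follows, with the characteristic time T a placeholder rather than the LANDWARN calibration.

      # Recursive exponential-filter Soil Water Index (Wagner-style SWI);
      # T is an assumed characteristic time, not the operational value.
      import math

      def swi_series(ssm, times, T=5.0):
          # ssm: surface soil moisture (degree of saturation); times in days.
          swi, K = [ssm[0]], 1.0
          for i in range(1, len(ssm)):
              K = K / (K + math.exp(-(times[i] - times[i - 1]) / T))
              swi.append(swi[-1] + K * (ssm[i] - swi[-1]))
          return swi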

  12. Assessing Climate-Induced Change in River Flow Using Satellite Remote Sensing and Process Modeling in High Mountain Asia

    NASA Astrophysics Data System (ADS)

    McDonald, K. C.

    2017-12-01

    Snow- and glacier-fed river systems originating from High Mountain Asia (HMA) support diverse ecosystems and provide the basis for food and energy production for more than a billion people living downstream. Climate-driven changes in the melting of snow and glaciers and in precipitation patterns are expected to significantly alter the flow of the rivers in the HMA region at various temporal scales, which in turn could heavily affect the socioeconomics of the region. Hence, climate change effects on seasonal and long-term hydrological conditions may have far-reaching economic impacts annually and over the century. We are developing a decision support tool utilizing integrated microwave remote sensing datasets, process modeling and economic models to inform water resource management decisions and ecosystem sustainability as related to the HMA region's response to climate change. The availability of consistent time-series microwave remote sensing datasets from Earth-orbiting scatterometers, radiometers and synthetic aperture radar (SAR) imagery provides the basis for the observational framework of this monitoring system. We discuss the assembly, processing and application of scatterometer and SAR data sets from the Advanced Scatterometer (ASCAT) and the Sentinel-1 SARs, and the enlistment of these data to monitor the seasonal melt and thaw status of glacier-dominated and surrounding regions. We present the current status and future plans for this effort. Our team's study emphasizes process and economic modeling within the Trishuli basin; our remote sensing analysis supports analyses across the HiMAT domain.

  13. On the assimilation set-up of ASCAT soil moisture data for improving streamflow catchment simulation

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Massari, Christian; Álvarez-Mozos, Jesús; Tarpanelli, Angelica; Brocca, Luca; Casalí, Javier

    2018-01-01

    Assimilation of remotely sensed surface soil moisture (SSM) data into hydrological catchment models has been identified as a means to improve streamflow simulations, but reported results vary markedly depending on the particular model, catchment and assimilation procedure used. In this study, the influence of key aspects, such as the type of model, the re-scaling technique and the SSM observation error considered, was evaluated. For this aim, Advanced SCATterometer (ASCAT) SSM observations were assimilated through the ensemble Kalman filter into two hydrological models of different complexity (namely MISDc and TOPLATS) run on two Mediterranean catchments of similar size (750 km2). Three different re-scaling techniques were evaluated (linear re-scaling, variance matching and cumulative distribution function matching), and SSM observation error values ranging from 0.01% to 20% were considered. Four different efficiency measures were used for evaluating the results. Increases in Nash-Sutcliffe efficiency (0.03-0.15) and in efficiency indices (10-45%) were obtained, especially when linear re-scaling and observation errors within 4-6% were considered. This study found that there is potential to improve streamflow prediction through data assimilation of remotely sensed SSM in catchments of different characteristics and with hydrological models of different conceptualization schemes, but a careful evaluation of the observation error and of the re-scaling technique set-up is required.
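
    Of the efficiency measures, the Nash-Sutcliffe efficiency is the headline score; a plain reference implementation is given below, together with one common definition of a normalised efficiency index for the assimilation gain (the paper's exact index may differ).

      # Nash-Sutcliffe efficiency and one common normalised assimilation gain.
      import numpy as np

      def nse(sim, obs):
          sim, obs = np.asarray(sim), np.asarray(obs)
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def efficiency_index(nse_da, nse_openloop):
          # Fraction of the open-loop error removed by assimilation; one common
          # definition, possibly differing from the study's.
          return (nse_da - nse_openloop) / (1.0 - nse_openloop)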

  14. Estimating Root Mean Square Errors in Remotely Sensed Soil Moisture over Continental Scale Domains

    NASA Technical Reports Server (NTRS)

    Draper, Clara S.; Reichle, Rolf; de Jeu, Richard; Naeimi, Vahid; Parinussa, Robert; Wagner, Wolfgang

    2013-01-01

    Root Mean Square Errors (RMSE) in the soil moisture anomaly time series obtained from the Advanced Scatterometer (ASCAT) and the Advanced Microwave Scanning Radiometer (AMSR-E; using the Land Parameter Retrieval Model) are estimated over a continental-scale domain centered on North America, using two methods: triple collocation (RMSE_TC) and error propagation through the soil moisture retrieval models (RMSE_EP). In the absence of an established consensus for the climatology of soil moisture over large domains, presenting an RMSE in soil moisture units requires that it be specified relative to a selected reference data set. To avoid the complications that arise from the use of a reference, the RMSE is presented as a fraction of the time series standard deviation (fRMSE). For both sensors, the fRMSE_TC and fRMSE_EP show similar spatial patterns of relatively high/low errors, and the mean fRMSE for each land cover class is consistent with expectations. Triple collocation is also shown to be surprisingly robust to representativity differences between the soil moisture data sets used, and it is believed to accurately estimate the fRMSE in the remotely sensed soil moisture anomaly time series. Comparing the ASCAT and AMSR-E fRMSE_TC shows that both data sets have very similar accuracy across a range of land cover classes, although the AMSR-E accuracy is more directly related to vegetation cover. In general, both data sets have good skill up to moderate vegetation conditions.
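
    The classical covariance-based triple collocation estimator, and its normalisation by the series standard deviation used for the fRMSE, can be written compactly as below; this is the generic estimator, assuming three collocated series with mutually independent errors, already rescaled to a common reference.

      # Covariance-based triple collocation error variances for three series.
      import numpy as np

      def tc_error_variances(x, y, z):
          C = np.cov(np.vstack([x, y, z]))
          ex = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
          ey = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
          ez = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
          return ex, ey, ez

      def frmse(err_var, series):
          # Error standard deviation as a fraction of the signal's variability.
          return np.sqrt(err_var) / np.std(series)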

  15. How reliable are satellite precipitation estimates for driving hydrological models: a verification study over the Mediterranean area

    NASA Astrophysics Data System (ADS)

    Camici, Stefania; Ciabatta, Luca; Massari, Christian; Brocca, Luca

    2017-04-01

    Floods are among the most common and dangerous natural hazards, causing thousands of casualties and large damages worldwide every year. The main tool for assessing flood risk and reducing damages is hydrologic early warning systems, which allow flood events to be forecast using real-time data obtained through ground monitoring networks (e.g., raingauges and radars). However, the use of such data, mainly rainfall, presents some issues, firstly related to network density and to the limited spatial representativeness of local measurements. A way to overcome these issues may be the use of satellite-based rainfall products (SRPs), which nowadays are available on a global scale at ever-increasing spatial/temporal resolution and accuracy. However, despite the large availability and increased accuracy of SRPs (e.g., the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA); the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF); and the recent Global Precipitation Measurement (GPM) mission), remotely sensed rainfall data are scarcely used in hydrological modeling, and only a small number of studies have been carried out to outline guidelines for using satellite data as input for hydrological modelling. Reasons may be related to: 1) the large bias characterizing satellite precipitation estimates, which depends on rainfall intensity and season; 2) the spatial/temporal resolution; 3) the timeliness, which is often insufficient for operational purposes; and 4) a general (often not justified) skepticism of the hydrological community towards the use of satellite products for land applications. The objective of this study is to explore the feasibility of using SRPs in a lumped hydrologic model (MISDc, "Modello Idrologico Semi-Distribuito in continuo", Masseroni et al., 2017) over 10 basins in the Mediterranean area with different sizes and physiographic characteristics. Specifically, TMPA 3B42-RT, CMORPH, PERSIANN and a new soil moisture-derived rainfall dataset, obtained by applying the SM2RAIN algorithm (Brocca et al., 2014) to the ASCAT (Advanced SCATterometer) soil moisture product, are used in the analysis (a minimal sketch of the SM2RAIN inversion is given after this record). The performances obtained with SRPs are compared with those obtained using ground data during the 6-year period from 2010 to 2015. In addition, the performance obtained by an integration of the above-mentioned SRPs is also investigated, to see whether merged rainfall observations are able to improve flood simulation. Preliminary analyses were also carried out using the IMERG early run product of the GPM mission. The results highlight that SRPs should be used with caution for rainfall-runoff modelling in the Mediterranean region. Bias correction and model recalibration are necessary steps, even though not always sufficient to achieve satisfactory performances. Indeed, some of the products provide unreliable outcomes, mainly in smaller basins (<500 km2) which, however, represent the main target for flood modelling in the Mediterranean area. The best performances are obtained by integrating different SRPs, and particularly by merging the TMPA 3B42-RT and SM2RAIN-ASCAT products. The promising results of the integrated product are expected to increase confidence in the use of SRPs in hydrological modeling, even in challenging areas such as the Mediterranean. References: Brocca, L., Ciabatta, L., Massari, C., Moramarco, T., Hahn, S., Hasenauer, S., Kidd, R., Dorigo, W., Wagner, W., Levizzani, V. (2014). Soil as a natural rain gauge: estimating global rainfall from satellite soil moisture data. Journal of Geophysical Research, 119(9), 5128-5141, doi:10.1002/2014JD021489. Masseroni, D., Cislaghi, A., Camici, S., Massari, C., Brocca, L. (2017). A reliable rainfall-runoff model for flood forecasting: review and application to a semi-urbanized watershed at high flood risk in Italy. Hydrology Research, in press, doi:10.2166/nh.2016.037.
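
    The SM2RAIN inversion referenced above estimates rainfall from the soil water balance: precipitation over a time step is the soil moisture increase plus a loss term. A minimal sketch, with placeholder parameter values that would normally be calibrated against a rain gauge:

      # SM2RAIN-style inversion of the soil water balance (after Brocca et al.,
      # 2014); Z, a, b are placeholder parameters, normally calibrated.
      import numpy as np

      def sm2rain(sm, dt=1.0, Z=80.0, a=10.0, b=1.5):
          # sm: relative saturation series [0-1]; Z: water capacity (mm);
          # a, b: drainage-loss parameters; dt: time step (days).
          dsm = np.diff(sm) / dt
          losses = a * sm[1:] ** b
          p = Z * dsm + losses * dt
          return np.clip(p, 0.0, None)   # rainfall cannot be negative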

  16. Measurement of social capital in relation to health in low and middle income countries (LMIC): a systematic review.

    PubMed

    Agampodi, Thilini Chanchala; Agampodi, Suneth Buddhika; Glozier, Nicholas; Siribaddana, Sisira

    2015-03-01

    Social capital is a neglected determinant of health in low and middle income countries. To date, the majority of evidence syntheses on social capital and health are based upon high income countries. We conducted this systematic review to identify the methods used to measure social capital in low and middle-income countries and to evaluate their relative strengths and weaknesses. An electronic search was conducted using Pubmed, Science Citation Index Expanded, Social Science Citation Index Expanded, Web of Knowledge, Cochrane, Trip, Google Scholar and selected grey literature sources. We aimed to include all studies conducted in low and middle-income countries, published in English, that measured any aspect of social capital in relation to health, from 1980 to January 2013. We extracted data using a data extraction form and performed narrative synthesis, as the measures were heterogeneous. Of the 472 articles retrieved, 46 articles were selected for the review. The review included 32 studies from middle income countries and seven studies from low income countries; seven were cross-national studies. Most studies were descriptive cross-sectional in design (n = 39); only two randomized controlled trials were included. Among the studies conducted using primary data (n = 32), we identified 18 purpose-built tools that measured various dimensions of social capital. Validity (n = 11) and reliability (n = 8) of the tools were assessed in only a few studies. Cognitive constructs of social capital, namely trust, social cohesion and sense of belonging, had a positive association with the measured health outcome in the majority of the studies. While most studies measured social capital at the individual/micro level (n = 32), group-level measurements were obtained by aggregation of individual measures. As many tools originate in high income contexts, cultural adaptation, validation and reliability assessment are mandatory when adapting a tool to the study setting. Evidence on causality and assessment of predictive validity are problematic due to the scarcity of prospective study designs. We recommend Harpham et al.'s Adapted Social Capital Assessment Tool (A-SCAT), Hurtado et al.'s six-item tool and Elgar et al.'s World Value Survey Social Capital Scale for the assessment of social capital in low and middle income countries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. A Combined Soil Moisture Product of the Tibetan Plateau using Different Sensors Simultaneously

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Dente, L.; Su, B.; Wang, L.

    2012-12-01

    It is always challenging to find a single satellite-derived soil moisture product that has complete coverage of the Tibetan Plateau for a long time period and is suitable for climate change studies at the sub-continental scale. Meanwhile, having a number of independent satellite-derived soil moisture data sets does not mean that it is straightforward to create long-term consistent time series, due to the differences among the data sets related to the different retrieval approaches. Therefore, this study focuses on the development and validation of a simple Bayesian method to merge different satellite-derived soil moisture data. The merging method was first tested over the Maqu region (north-eastern fringe of the Tibetan Plateau), where in situ soil moisture data were collected, for the period from May 2008 to December 2010. The in situ data provided by the 20 monitoring stations in the Maqu region were compared to the AMSR-E soil moisture products by VUA-NASA and the ASCAT soil moisture products by TU Wien, in order to determine bias and standard deviation. It was found that the bias between the satellite and the in situ data varies with season; this is generally caused by notable differences in the represented depth, spatial extent and so on, and the systematic bias is affected by the spatial variability and the temporal stability of soil moisture (Dente et al. 2012). The dependence of the bias on season was investigated separately for the monsoon season (May-September), for winter (December-February), and for the period between the monsoon season and winter (March-April and October-November, called the transition season) (Dente et al. 2012, Su et al. 2011). The satellite-derived products were first corrected for the bias and then merged. After the merging procedure, the standard deviations between the satellite and the in situ data were reduced from 0.0839 to 0.0622 for the ASCAT data, and from 0.0682 to 0.0593 for the AMSR-E data. The developed merging method is therefore suitable to provide a more accurate soil moisture product than the AMSR-E and ASCAT products alone. As the merging method was shown to be promising over the Maqu region, it will be extended to the entire Tibetan Plateau, and the combined soil moisture product will then be validated over the monitored sites located in the Ngari and Naqu regions. References: Dente, L., Vekerdy, Z., Wen, J., Su, Z. (2012): Maqu network for validation of satellite-derived soil moisture products, International Journal of Applied Earth Observation and Geoinformation, 17, 55-65. Su, Z., Wen, J., Dente, L., van der Velde, R., et al. (2011): The Tibetan Plateau observatory of plateau scale soil moisture and soil temperature, Tibet-Obs, for quantifying uncertainties in coarse resolution satellite and model products, Hydrology and Earth System Sciences (HESS), 15(7), 2303-2316.
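
    With Gaussian errors, a simple Bayesian merge of two bias-corrected retrievals reduces to inverse-error-variance weighting, with the variances taken from the in situ comparison; a minimal sketch of that idea, not necessarily the exact scheme used in the study:

      # Posterior-mean merge of two bias-corrected retrievals under Gaussian
      # errors; variances would come from the station comparison.
      def merge(sm_ascat, var_ascat, sm_amsre, var_amsre):
          w = (1.0 / var_ascat) / (1.0 / var_ascat + 1.0 / var_amsre)
          return w * sm_ascat + (1.0 - w) * sm_amsre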

  18. Waveform-based spaceborne GNSS-R wind speed observation: Demonstration and analysis using UK TechDemoSat-1 data

    NASA Astrophysics Data System (ADS)

    Wang, Feng; Yang, Dongkai; Zhang, Bo; Li, Weiqiang

    2018-03-01

    This paper explores two types of mathematical functions to fit the single- and full-frequency waveforms of spaceborne Global Navigation Satellite System-Reflectometry (GNSS-R), respectively. Waveform metrics, such as the noise floor, peak magnitude, mid-point position of the leading edge, leading edge slope and trailing edge slope, can be derived from the parameters of the proposed models. Because the quality of the UK TDS-1 data is not at the level required by a remote sensing mission, waveforms buried in noise or originating from ice/land are removed, by thresholding the peak-to-mean ratio and the cosine similarity of the waveform, before wind speed is retrieved. Single-parameter retrieval models are developed by comparing the peak magnitude, leading edge slope and trailing edge slope derived from the parameters of the proposed models with reference wind speeds from the ASCAT scatterometer. To improve the retrieval accuracy, three types of multi-parameter observations, based on principal component analysis (PCA), the minimum variance (MV) estimator and a Back Propagation (BP) network, are implemented. The results indicate that, compared to the best results of the single-parameter observation, the approaches based on principal component analysis and minimum variance could not significantly improve retrieval accuracy; however, the BP networks obtain an improvement, with RMSEs of 2.55 m/s and 2.53 m/s for the single- and full-frequency waveforms, respectively.
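
    As an illustration of how waveform metrics fall out of fitted parameters, the sketch below fits a logistic leading edge with scipy; the paper's actual model functions are not reproduced here, so this functional form is an assumption.

      # Illustrative leading-edge fit; the logistic form and all values are
      # assumptions for the sketch, not the paper's models.
      import numpy as np
      from scipy.optimize import curve_fit

      def leading_edge(tau, floor, peak, tau0, k):
          return floor + (peak - floor) / (1.0 + np.exp(-k * (tau - tau0)))

      tau = np.linspace(-10, 10, 64)                       # delay bins
      wf = leading_edge(tau, 1.0, 8.0, 0.0, 0.9) + 0.2 * np.random.randn(64)
      p, _ = curve_fit(leading_edge, tau, wf, p0=[wf.min(), wf.max(), 0.0, 1.0])
      floor, peak, midpoint, k = p
      les = k * (peak - floor) / 4.0   # leading-edge slope at the mid-point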

  19. Using Satellite Data and Land Surface Models to Monitor and Forecast Drought Conditions in Africa and Middle East

    NASA Astrophysics Data System (ADS)

    Arsenault, K. R.; Shukla, S.; Getirana, A.; Peters-Lidard, C. D.; Kumar, S.; McNally, A.; Zaitchik, B. F.; Badr, H. S.; Funk, C. C.; Koster, R. D.; Narapusetty, B.; Jung, H. C.; Roningen, J. M.

    2017-12-01

    Drought and water scarcity are among the most important issues facing several regions within Africa and the Middle East. In addition, these regions typically have sparse ground-based data networks, and sometimes remotely sensed observations may be the only data available. Long-term satellite records can help with determining historic and current drought conditions. In recent years, several new satellites have come online that monitor different hydrological variables, including soil moisture and terrestrial water storage. Though these recent data records may be considered too short for use in identifying major droughts, they do provide additional information that can better characterize where water deficits may occur. We utilize recent satellite data records of Gravity Recovery and Climate Experiment (GRACE) terrestrial water storage (TWS) and the European Space Agency's Advanced Scatterometer (ASCAT) soil moisture retrievals. Combining these records with land surface models (LSMs), namely NASA's Catchment model and Noah Multi-Physics (Noah-MP), aims to improve the land model states and initialization for seasonal drought forecasts. The LSMs' total runoff is routed through the Hydrological Modeling and Analysis Platform (HyMAP) to simulate surface water dynamics, which provides an additional means of validation against in situ streamflow data. The NASA Land Information System (LIS) software framework drives the LSMs and HyMAP and also supports the capability to assimilate these satellite retrievals, such as soil moisture and TWS. The LSMs are driven for 30+ years with NASA's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and the USGS/UCSB Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) rainfall dataset. The seasonal water deficit forecasts are generated using downscaled and bias-corrected versions of NASA's Goddard Earth Observing System Model (GEOS-5) and NOAA's Climate Forecast System (CFSv2) forecasts. These combined satellite and model records and forecasts are intended for use in different decision support tools, such as the Famine Early Warning Systems Network (FEWS NET) and the Middle East-North Africa (MENA) Regional Drought Management System, to aid monitoring and forecasting in water- and food-insecure regions.

  20. A Wiener-Wavelet-Based filter for de-noising satellite soil moisture retrievals

    NASA Astrophysics Data System (ADS)

    Massari, Christian; Brocca, Luca; Ciabatta, Luca; Moramarco, Tommaso; Su, Chun-Hsu; Ryu, Dongryeol; Wagner, Wolfgang

    2014-05-01

    The reduction of noise in microwave satellite soil moisture (SM) retrievals is of paramount importance for practical applications, especially those associated with the study of climate change, droughts, floods and other related hydrological processes. So far, Fourier-based methods have been used for de-noising satellite SM retrievals, by filtering either the observed emissivity time series (Du, 2012) or the retrieved SM observations (Su et al., 2013). This contribution introduces an alternative approach based on a Wiener-Wavelet-Based filtering (WWB) technique, which uses the entropy-based wavelet de-noising method developed by Sang et al. (2009) to design both a causal and a non-causal version of the filter. WWB is used as a post-retrieval processing tool to enhance the quality of observations derived from i) the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E), ii) the Advanced SCATterometer (ASCAT), and iii) the Soil Moisture and Ocean Salinity (SMOS) satellite. The method is tested on three pilot sites located in Spain (Remedhus network), Greece (Hydrological Observatory of Athens) and Australia (OzNet network). Different quantitative criteria are used to judge the goodness of the de-noising technique. Results show that WWB i) is able to improve both the correlation and the root mean squared differences between satellite retrievals and in situ soil moisture observations, and ii) effectively separates random noise from the deterministic components of the retrieved signals. Moreover, the use of WWB de-noised data in place of raw observations within a hydrological application confirms the usefulness of the proposed filtering technique. References: Du, J. (2012), A method to improve satellite soil moisture retrievals based on Fourier analysis, Geophys. Res. Lett., 39, L15404, doi:10.1029/2012GL052435. Su, C.-H., D. Ryu, A. W. Western, and W. Wagner (2013), De-noising of passive and active microwave satellite soil moisture time series, Geophys. Res. Lett., 40, 3624-3630, doi:10.1002/grl.50695. Sang, Y.-F., D. Wang, J.-C. Wu, Q.-P. Zhu, and L. Wang (2009), Entropy-based wavelet de-noising method for time series analysis, Entropy, 11, 1123-1148, doi:10.3390/e11041123.
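
    A minimal wavelet-thresholding de-noiser with PyWavelets is sketched below for orientation; the WWB filter itself adds the entropy-based threshold selection of Sang et al. (2009) and a Wiener step, which are not reproduced here.

      # Plain wavelet thresholding of a soil moisture series; illustrative
      # wavelet, level and threshold rule, not the WWB design.
      import numpy as np
      import pywt

      def wavelet_denoise(x, wavelet="db4", level=3):
          coeffs = pywt.wavedec(x, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
          thr = sigma * np.sqrt(2.0 * np.log(len(x)))         # universal threshold
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: len(x)]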

  1. ASCAT soil moisture data assimilation through the Ensemble Kalman Filter for improving streamflow simulation in Mediterranean catchments

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Massari, Christian; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel

    2016-04-01

    Assimilation of Surface Soil Moisture (SSM) observations obtained from remote sensing techniques has been shown to improve streamflow prediction at different time scales of hydrological modeling. Different sensors and methods have been tested for their application to SSM estimation, especially in the microwave region of the electromagnetic spectrum. The available observation devices include passive microwave sensors, such as the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) onboard the Aqua satellite and the Soil Moisture and Ocean Salinity (SMOS) mission. On the other hand, active microwave systems include the scatterometers (SCAT) onboard the European Remote Sensing satellites (ERS-1/2) and the Advanced Scatterometer (ASCAT) onboard the MetOp-A satellite. Data assimilation (DA) encompasses different techniques that have been applied in hydrology and other fields for decades, including, among others, Kalman Filtering (KF), variational assimilation and particle filtering. From the initial KF method, different techniques were developed to suit different systems. The Ensemble Kalman Filter (EnKF), extensively applied in hydrological modeling, has as its main advantage the capability to deal with nonlinear model dynamics without linearizing the model equations. The objective of this study was to investigate whether data assimilation of ASCAT SSM observations through the EnKF could improve streamflow simulation in Mediterranean catchments with the TOPLATS hydrological model. The DA technique was programmed in FORTRAN and applied to hourly simulations of the TOPLATS catchment model. TOPLATS (TOPMODEL-based Land-Atmosphere Transfer Scheme) was applied in its lumped version to two Mediterranean catchments of similar size, located in northern Spain (Arga, 741 km2) and central Italy (Nestore, 720 km2). The model performs separate computations of the energy and water balances. In those balances, the soil is divided into two layers, the upper Surface Zone (SZ) and the deeper Transmission Zone (TZ). In this study, the SZ depth was fixed to 5 cm for adequate assimilation of the observed data. The available data were distributed as follows: first, the model was calibrated for the 2001-2007 period; then the 2007-2010 period was used for satellite data rescaling purposes; finally, data assimilation was applied during the validation (2010-2013) period. Application of the EnKF required the following steps: 1) rescaling of the satellite data; 2) transformation of the rescaled data into a Soil Water Index (SWI) through a moving average filter, where a calibrated value of T = 9 was applied; 3) generation of a 50-member ensemble through perturbation of the inputs (rainfall and temperature) and of three selected parameters; 4) validation of the ensemble through the compliance of two criteria based on the ensemble's spread, mean square error and skill; and 5) Kalman gain calculation. In this work, a comparison of three satellite data rescaling techniques was also performed: 1) cumulative distribution function (CDF) matching, 2) variance matching and 3) linear least squares regression (the two moment-based options are sketched after this record). Results obtained in this study showed slight improvements of hourly Nash-Sutcliffe Efficiency (NSE) in both catchments with the different rescaling methods evaluated. Larger improvements were found in terms of reduction of the seasonal simulated volume error.
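
    The two moment-based rescaling options referenced above can be sketched for 1-D series as follows; these are the generic forms, not the study's FORTRAN code (CDF matching, the third option, is rank-based).

      # Moment-based rescaling of a satellite series to a model climatology.
      import numpy as np

      def variance_match(sat, model):
          # Match the satellite series' mean and variance to the model's.
          sat, model = np.asarray(sat), np.asarray(model)
          return model.mean() + (sat - sat.mean()) * model.std() / sat.std()

      def linear_regression_match(sat, model):
          # Least-squares fit of model on satellite values, applied to the series.
          slope, intercept = np.polyfit(sat, model, 1)
          return intercept + slope * np.asarray(sat)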

  2. Multi-site assimilation of a terrestrial biosphere model (BETHY) using satellite derived soil moisture data

    NASA Astrophysics Data System (ADS)

    Wu, Mousong; Scholze, Marko

    2017-04-01

    We investigated the importance of soil moisture data for the assimilation of a terrestrial biosphere model (BETHY) over the period from 2010 to 2015. In total, 101 parameters related to carbon turnover, soil respiration and soil texture were selected for optimization within a carbon cycle data assimilation system (CCDAS). Soil moisture data from the Soil Moisture and Ocean Salinity (SMOS) product were derived for 10 sites representing different plant functional types (PFTs) as well as different climate zones. The uncertainty of the SMOS soil moisture data was also estimated using the triple collocation analysis (TCA) method, by comparison with the ASCAT dataset and BETHY forward simulation results. Assimilation of soil moisture into the system improved soil moisture as well as net primary productivity (NPP) and net ecosystem productivity (NEP) when compared with soil moisture derived from in situ measurements and FLUXNET datasets. Parameter uncertainties were largely reduced relative to prior values. Using SMOS soil moisture data for the assimilation of a terrestrial biosphere model proved to be an efficient approach to reducing uncertainty in simulated ecosystem fluxes. It could be further used in regional and global assimilation work to constrain carbon dioxide concentration simulations by combining with other sources of measurements.

  3. Effective Use Of Scatterometer Winds In Current and Future GMAO Reanalysis

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Mohar; McCarty, Will

    2017-01-01

    Scatterometer-derived near-surface ocean vector wind retrievals provide global measurements complementary to the sparse conventional observing system, which primarily consists of ships and buoys over water surfaces. The RapidScat instrument was flown on the International Space Station as a quick, low-cost replacement for QuikScat and as a continuation of the NASA scatterometry data record. A unique characteristic of RapidScat was that it flew in a non-sun-synchronous orbit at an inclination of 51.6 degrees. This orbit allowed for the collocation of measurements with other scatterometers as well as an ability to sample diurnal signals. In the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis, the scatterometry record began with the ESA European Remote Sensing (ERS) scatterometer on 5 Aug 1991 and continues through today with the EUMETSAT MetOp Advanced Scatterometer (ASCAT). RapidScat, however, was not used in the MERRA-2 system, as development had been completed prior to the beginning of its data record. In this presentation, the RapidScat ocean vector winds will be compared to MERRA-2, both in terms of the analysis fields and in the context of its global observing system, to assess the viability of using the data in future reanalysis systems developed by the Global Modeling and Assimilation Office (GMAO) at NASA Goddard Space Flight Center.

  4. Setting Up a Sentinel 1 Based Soil Moisture - Data Assimilation System for Flash Flood Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Cenci, Luca; Pulvirenti, Luca; Boni, Giorgio; Chini, Marco; Matgen, Patrick; Gabellani, Simone; Squicciarino, Giuseppe; Pierdicca, Nazzareno

    2017-04-01

    Several studies have shown that the assimilation of satellite-derived soil moisture products (SM-DA) within hydrological modelling is able to reduce the uncertainty of discharge predictions. This can be exploited for improving early warning systems (EWS) and is thus particularly useful for flash flood risk mitigation (Cenci et al., 2016a). The objective of this research was to evaluate the potential of an advanced SM-DA system based on the assimilation of synthetic aperture radar (SAR) observations derived from Sentinel 1 (S1) acquisitions. A time-continuous, spatially distributed, physically based hydrological model was used: Continuum (Silvestro et al., 2013). The latter is currently exploited for civil protection activities in Italy, both at national and at regional scale; its adoption therefore allows for a better understanding of the real potential of the aforementioned SM-DA system for improving EWS. The novelty of this research consists in the use of S1-derived SM products obtained with a multitemporal retrieval algorithm (Cenci et al., 2016b), in which the correction of the vegetation effect was obtained by means of both SAR (COSMO-SkyMed) and optical (Landsat) images. The maps were characterised by a comparatively higher spatial and lower temporal resolution (100 m and 12 days, respectively) with respect to maps obtained from the microwave sensors commonly used for such applications (e.g., the Advanced SCATterometer, ASCAT). The experiment was carried out in the period October 2014 - February 2015 in an exemplifying Mediterranean catchment prone to flash floods: the Orba Basin (Italy). The nudging assimilation scheme was chosen for its computational efficiency, which is particularly useful for operational applications (a minimal sketch of a nudging update follows this record). The impact of the assimilation was evaluated by comparing simulated and observed discharge values; in particular, the impact of the assimilation on higher flows was analysed. Results were compared with those obtained by assimilating an ASCAT-derived SM product (H08) that can be considered high spatial resolution (1 km) for hydrological applications and high temporal resolution (36 h) (Wagner et al., 2013). Findings revealed the potential of an S1-based SM-DA system for improving discharge predictions, especially of higher flows, and suggested the most appropriate pre-processing techniques to apply to S1 data before the assimilation. The comparison with H08 highlighted the importance of the temporal resolution of the observations. Results are promising, but further research is needed before the actual implementation of the aforementioned S1-based SM-DA system for operational applications. References: Cenci L., et al.: Assimilation of H-SAF Soil Moisture Products for Flash Flood Early Warning Systems. Case Study: Mediterranean Catchments, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens., 9(12), 5634-5646, doi:10.1109/JSTARS.2016.2598475, 2016a. Cenci L., et al.: Satellite Soil Moisture Assimilation: Preliminary Assessment of the Sentinel 1 Potentialities, 2016 IEEE Int. Geosci. Remote Sens. Symp. (IGARSS), Beijing, 3098-3101, doi:10.1109/IGARSS.2016.7729801, 2016b. Silvestro F., et al.: Exploiting Remote Sensing Land Surface Temperature in Distributed Hydrological Modelling: the Example of the Continuum Model, Hydrol. Earth Syst. Sci., 17(1), 39-62, doi:10.5194/hess-17-39-2013, 2013. Wagner W., et al.: The ASCAT Soil Moisture Product: A Review of its Specifications, Validation Results, and Emerging Applications, Meteorol. Zeitschrift, 22(1), 5-33, doi:10.1127/0941-2948/2013/0399, 2013.
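
    The nudging update referenced above is the simplest assimilation operator: relax the modelled surface soil moisture toward the retrieval with a fixed gain. A one-line sketch, with the gain G a placeholder rather than the study's value:

      # Nudging analysis step: G = 1 inserts the observation, G = 0 leaves
      # the model untouched; the study's gain and masking rules differ.
      def nudge(sm_model, sm_obs, G=0.5):
          return sm_model + G * (sm_obs - sm_model)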

  5. Round Robin evaluation of soil moisture retrieval models for the MetOp-A ASCAT Instrument

    NASA Astrophysics Data System (ADS)

    Gruber, Alexander; Paloscia, Simonetta; Santi, Emanuele; Notarnicola, Claudia; Pasolli, Luca; Smolander, Tuomo; Pulliainen, Jouni; Mittelbach, Heidi; Dorigo, Wouter; Wagner, Wolfgang

    2014-05-01

    Global soil moisture observations are crucial to understanding hydrologic processes, earth-atmosphere interactions and climate variability. ESA's Climate Change Initiative (CCI) project aims to create a globally consistent long-term soil moisture data set based on the merging of the best available active and passive satellite-based microwave sensors and retrieval algorithms. Within the CCI, a Round Robin evaluation of existing retrieval algorithms for both active and passive instruments was carried out. In this study we present the comparison of five different retrieval algorithms, covering three different modelling principles, applied to active MetOp-A ASCAT L1 backscatter data. These models include statistical models (Bayesian Regression and Support Vector Regression, provided by the Institute for Applied Remote Sensing, Eurac Research, Italy, and an Artificial Neural Network, provided by the Institute of Applied Physics, CNR-IFAC, Italy), a semi-empirical model (provided by the Finnish Meteorological Institute), and a change detection model (provided by the Vienna University of Technology; its principle is sketched after this record). The algorithms were applied to L1 backscatter data within the period 2007-2011, resampled to a 12.5 km grid. The evaluation was performed over 75 globally distributed, quality-controlled in situ stations drawn from the International Soil Moisture Network (ISMN), using surface soil moisture data from the Global Land Data Assimilation System (GLDAS) Noah land surface model as a second independent reference. The temporal correlation between the data sets was analyzed, and the random errors of the different algorithms were estimated using the triple collocation method. Absolute soil moisture values as well as soil moisture anomalies were considered, including both long-term anomalies from the mean seasonal cycle and short-term anomalies from a five-week moving average window. Results show a very high agreement between all five algorithms for most stations. A slight vegetation dependency of the errors and a spatial decorrelation of the performance patterns of the different algorithms were found. We conclude that future research should focus on understanding, combining and exploiting the advantages of all available modelling approaches rather than trying to optimize one approach to fit every possible condition.
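
    The change detection principle referenced above scales the observed backscatter between historically driest and wettest references for the same location and incidence geometry, yielding a relative surface soil moisture; sketched below in generic form, not as the operational TU Wien code.

      # Change-detection soil moisture: relative saturation from backscatter
      # scaled between dry and wet references (all in dB, same geometry).
      import numpy as np

      def change_detection_ssm(sigma0_db, sigma_dry_db, sigma_wet_db):
          ssm = (sigma0_db - sigma_dry_db) / (sigma_wet_db - sigma_dry_db)
          return np.clip(ssm, 0.0, 1.0)   # degree of saturation, 0-1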

  6. Identifying Stratospheric Air Intrusions and Associated Hurricane-Force Wind Events over the North Pacific Ocean

    NASA Technical Reports Server (NTRS)

    Malloy, Kelsey; Folmer, Michael J.; Phillips, Joseph; Sienkiewicz, Joseph M.; Berndt, Emily

    2017-01-01

    Motivation: ocean data are sparse, so marine forecasting relies heavily on satellite imagery. The Ocean Prediction Center (OPC), the "mariner's weather lifeline", is responsible for Pacific, Atlantic and Pacific-Alaska surface analyses (24, 48, 96 hr), wind and wave analyses (24, 48, 96 hr), and for issuing warnings and making decisions. The Geostationary Operational Environmental Satellite R Series (now GOES-16) offers, compared to the old GOES, 3 times the spectral resolution, 4 times the spatial resolution and 5 times faster coverage; it is comparable to the Japanese Meteorological Agency's Himawari-8, which was used extensively throughout this research. Research question: how can integrating satellite imagery and derived products help forecasters improve the prognosis of rapid cyclogenesis and hurricane-force wind events? Phase I, identifying stratospheric air intrusions, uses: water vapor channels at 6.2, 6.9 and 7.3 microns; the Airmass RGB product; AIRS, IASI and NUCAPS total column ozone and ozone anomaly; and ASCAT (A/B) and AMSR-2 wind data.

  7. Relating C-band Microwave and Optical Satellite Observations as A Function of Snow Thickness on First-Year Sea Ice during the Winter to Summer Transition

    NASA Astrophysics Data System (ADS)

    Zheng, J.; Yackel, J.

    2015-12-01

    The Arctic sea ice and its snow cover have a direct impact on both the Arctic and global climate systems through their ability to moderate heat exchange across the ocean-sea ice-atmosphere (OSA) interface. Snow cover plays a key role in radiation and energy exchange at the OSA interface, as it controls the growth and decay of first-year sea ice (FYI). However, meteoric accumulation and redistribution of snow on FYI are highly stochastic over space and time, which makes them poorly understood. Previous studies have estimated local-scale snow thickness distributions using in situ techniques and modelling, but such work is spatially limited and challenging due to logistic difficulties. Moreover, snow albedo is critical for determining the surface energy balance of the OSA during the critical summer ablation season. Even so, due to persistent and widespread cloud cover in the Arctic at various spatio-temporal scales, it is difficult and unreliable to remotely measure the albedo of snow cover on FYI in the optical spectrum; previous studies demonstrate that only large-scale sea ice albedo has been successfully estimated using optical satellite sensors. Space-borne microwave sensors, with their capability of all-weather and 24-hour imaging, can however provide enhanced information about snow cover on FYI. Daily spaceborne C-band scatterometer (ASCAT) and MODIS data are used to investigate the seasonal co-evolution of the microwave backscatter coefficient and optical albedo as a function of snow thickness on smooth FYI. The research focuses on snow-covered FYI near Cambridge Bay, Nunavut, during the winter to advanced-melt period (April-June 2014). The ASCAT time series shows a distinct increase in scattering at melt onset, indicating the first occurrence of melt water in the snow cover; the corresponding albedo exhibits no decrease at this stage. We show how the standard deviation of ASCAT backscatter on FYI during winter can be used as a proxy for surface roughness and hence snow thickness (i.e., rougher surfaces acquire thicker snow covers), and then how this surface manifests into statistically distinguishable surface melt pond fractions, which largely govern the optically derived albedo. Such relationships are useful for modelling the subsequent summer melt pond fraction and albedo from winter snow cover.

  8. Observed Structure and Characteristics of Cold Pools over Tropical Oceans using Vector Wind Retrievals and WRF simulations

    NASA Astrophysics Data System (ADS)

    Garg, P.; Nesbitt, S. W.; Lang, T. J.; Chronis, T.; Thayer, J. D.; Hence, D. A.

    2017-12-01

    Cold pools generated in the wake of convective activity can enhance the surface sensible and latent heat fluxes and also change evaporation out of, and fresh water flux into, the ocean. Recent studies have shown that over the open ocean, cold pool outflow boundaries and their intersections can organize and initiate a spectrum of deep convective clouds, a key driver of shallow and deep convection over conditionally unstable tropical oceans. The primary goal of this study is to understand the structure and characteristics of cold pools over the tropical oceans using observations. Based on the idea that cold pools have strong wind gradients at their boundaries, we use ASCAT vector wind retrievals and identify regions of steep gradients in the wind vectors as gradient features (GFs), akin to cold pools. Corresponding to these GFs, sensible and latent heat fluxes were calculated using the observed winds and background temperatures from the MERRA-2 reanalysis. To evaluate the proposed technique, cold pools observed by the S-PolKa radar during the DYNAMO/AMIE field campaign in the Indian Ocean, for the period 1 October 2011 to 31 March 2012, were compared with ASCAT GFs. To relate the thermodynamic and kinematic characteristics of observed and simulated cold pools, WRF simulations were carried out on a 3-km domain with explicitly resolved convection. The areas of cold pools were identified in the model using virtual temperature (Tv), which is a direct measure of air density, while GFs were identified using the model-simulated winds. Quantitative measures indicate that GFs correspond closely to model-simulated cold pools. In global measurements of cold pools from 2007-2015, it is possible to examine the characteristics of GFs across all tropical ocean basins and relate them to meteorological conditions as well as to the characteristics of the parent precipitation systems. Our results indicate that while there is a general relationship between the amount of precipitation and the number of cold pools, the largest cold pools exist over the Eastern Pacific basin, where the most stratiform rain is produced by oceanic MCSs. It is anticipated that improved understanding of cold pools, which are a primary triggering mechanism of oceanic shallow and deep convection, will improve prediction of this important component of the climate system.
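
    The GF detection reduces to thresholding the horizontal gradient magnitude of the retrieved wind components; a minimal sketch, with the grid spacing set to the nominal 12.5 km ASCAT sampling and a placeholder threshold rather than the study's value.

      # Gradient-feature mask from gridded wind components; threshold is a
      # placeholder assumption.
      import numpy as np

      def gradient_features(u, v, dx=12.5e3, thresh=4e-4):
          # u, v: 2-D wind component fields (m/s); dx: grid spacing (m).
          dudy, dudx = np.gradient(u, dx)
          dvdy, dvdx = np.gradient(v, dx)
          grad_mag = np.sqrt(dudx**2 + dudy**2 + dvdx**2 + dvdy**2)
          return grad_mag > thresh   # boolean GF mask, akin to cold pool edges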

  9. Assessment of GPM and SM2RAIN-ASCAT rainfall products over complex terrain in southern Italy

    NASA Astrophysics Data System (ADS)

    Chiaravalloti, Francesco; Brocca, Luca; Procopio, Antonio; Massari, Christian; Gabriele, Salvatore

    2018-07-01

    The assessment of precipitation over land is extremely important for a number of scientific purposes related to the mitigation of natural hazards, climate modelling and prediction, and famine and disease monitoring, to cite a few. Due to the difficulty and cost of maintaining ground monitoring networks, i.e., raingauges and meteorological radars, remote sensing has received increasing attention in recent decades. However, the accuracy of satellite observations of rainfall should be assessed against ground information, as it is affected by a number of factors (topography, vegetation density, land-sea interface). Calabria is a peninsular region in southern Italy characterized by complex topography, dense vegetation and a narrow North-South elongated shape, and is thus a very challenging place for rainfall retrieval from remote sensing. In this study, we built a high-quality rainfall dataset from raingauges and meteorological radars for testing three remotely sensed rainfall products: 1) the Integrated Multi-satellitE Retrievals for GPM (IMERG) product of the Global Precipitation Measurement mission, 2) the SM2RASC product, obtained from the application of the SM2RAIN (Soil Moisture TO RAIN) algorithm to the Advanced SCATterometer (ASCAT) satellite soil moisture data, and 3) a product derived from a simple combination of IMERG and SM2RASC. The assessment of the products is carried out at different rainfall accumulation times (from 0.5 to 24 h) for a 2-year period from 10 March 2015 to 31 December 2016. Results show that IMERG performs well at time resolutions longer than 6 h. At the daily time scale, IMERG and SM2RASC show similar results, with median correlations R of 0.60 and root mean square errors RMSE of 7.6 mm/day (BIAS of -0.85 and +0.51 mm/day, respectively). The combined product outperforms the parent products (median R > 0.70, RMSE < 6.5 mm/day, BIAS -0.07 mm/day). Among the different factors affecting product quality, topographic complexity seems to play the most relevant role, particularly for SM2RASC but also for IMERG. Overall, this study shows that the investigated satellite-based products agree reasonably well with observations notwithstanding the challenging features of the region, and that the combination of IMERG and SM2RASC provides a way to overcome their individual limitations and to produce a higher-quality satellite rainfall product.

  10. A 17-Month Review of the Care Model, Service Structure, and Design of THRIVE, a Community Mental Health Initiative in Northern Singapore.

    PubMed

    Cheang, K M; Cheok, C C S

    2015-12-01

    Effective delivery of psychiatric care requires the development of a range of services. The existing Singapore health care system provides a comprehensive range of psychiatric services based in restructured hospitals. The Ministry of Health Community Mental Health Masterplan (2012-2017) aims to build novel services for the community. This Masterplan envisions the development of ASCATs (Assessment Shared Care Teams) and COMITs (Community Intervention Teams) to build the capacity and capability for psychiatric care to be delivered outside the hospital, in the community. A community mental health plan comprising a fast-access clinic, internet-delivered self-help and the building of a community network of providers was devised for the North of Singapore through the THRIVE (Total Health Rich In Vitality and Energy) programme. This article provides an introduction to the care model, service structure and design of THRIVE, and reviews its milestones and achievements from its inception in August 2012 until December 2013.

  11. Predicting Vegetation Condition from ASCAT Soil Water Index over Southwest India

    NASA Astrophysics Data System (ADS)

    Pfeil, Isabella Maria; Hochstöger, Simon; Amarnath, Giriraj; Pani, Peejush; Enenkel, Markus; Wagner, Wolfgang

    2017-04-01

    In India, extreme water scarcity events are expected to occur on average every five years. Record-breaking droughts affecting millions of human beings and livestock are common. If the south-west monsoon (summer monsoon) is delayed or brings less rainfall than expected, a season's harvest can be destroyed despite optimal farm management, leading, in the worst case, to life-threatening circumstances for a large number of farmers. Therefore, the monitoring of key drought indicators, such as the health of the vegetation, and subsequent early warning is crucial. The aim of this work is to predict vegetation state from earth observation data instead of relying on models, which need a lot of input data and increase the complexity of error propagation, or on seasonal forecasts, which are often too uncertain to be used as a regression component for a vegetation parameter. While precipitation is the main water supply for large parts of India's agricultural areas, vegetation datasets such as the Normalized Difference Vegetation Index (NDVI) provide reliable estimates of vegetation greenness that can be related to vegetation health. Satellite-derived soil moisture represents the missing link between a deficit in rainfall and the response of vegetation. In particular, the water available in the root zone plays an important role for near-future vegetation health, so exploiting root-zone soil moisture is valuable for drought analyses and decision support. The soil water index (SWI) dataset derived from the Advanced Scatterometer (ASCAT) on board the Metop satellites represents the water content available in the root zone. This dataset shows a strong correlation with NDVI data obtained from measurements of the Moderate Resolution Imaging Spectroradiometer (MODIS), which is exploited in this study. A linear regression function is fitted to the multi-year SWI and NDVI datasets at a temporal resolution of eight days, returning a set of parameters for every eight-day period of the year. Those parameters are then used to predict vegetation health from the SWI up to 32 days after the latest available SWI and NDVI observations. In this work, the prediction was carried out for multiple eight-day periods in the year 2015 for three representative districts in India and then compared to the actually observed NDVI during these periods, showing very similar spatial patterns in most analyzed regions and periods. This approach enables the prediction of vegetation health based on root-zone soil moisture instead of relying on agro-meteorological models, which often lack crucial input data in remote regions.
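
    The per-period regression described above can be sketched in a few lines of Python; synthetic SWI and NDVI stacks stand in for the real ASCAT and MODIS data, and 46 eight-day periods per year are assumed:

      import numpy as np

      # Hypothetical multi-year stacks for a single pixel:
      # one row per year, one column per 8-day period (46 periods/year).
      rng = np.random.default_rng(1)
      n_years, n_periods = 10, 46
      swi = rng.uniform(5, 95, (n_years, n_periods))             # root-zone SWI [%]
      ndvi = 0.004 * swi + 0.2 + rng.normal(0, 0.02, swi.shape)  # synthetic NDVI

      # Fit one linear model NDVI = a*SWI + b per 8-day period of the year.
      coeffs = np.array([np.polyfit(swi[:, p], ndvi[:, p], 1) for p in range(n_periods)])

      def predict_ndvi(swi_value, period):
          # Predict NDVI for a given 8-day period from the latest SWI value.
          a, b = coeffs[period]
          return a * swi_value + b

      print(predict_ndvi(60.0, 20))  # e.g. a mid-season prediction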

  12. Can next-generation soil data products improve soil moisture modelling at the continental scale? An assessment using a new microclimate package for the R programming environment

    NASA Astrophysics Data System (ADS)

    Kearney, Michael R.; Maino, James L.

    2018-06-01

    Accurate models of soil moisture are vital for solving core problems in meteorology, hydrology, agriculture and ecology. The capacity for soil moisture modelling is growing rapidly with the development of high-resolution, continent-scale gridded weather and soil data together with advances in modelling methods. In particular, the GlobalSoilMap.net initiative represents next-generation, depth-specific gridded soil products that may substantially increase soil moisture modelling capacity. Here we present an implementation of Campbell's infiltration and redistribution model within the NicheMapR microclimate modelling package for the R environment, and use it to assess the predictive power provided by the GlobalSoilMap.net product Soil and Landscape Grid of Australia (SLGA, ∼100 m) as well as the coarser-resolution global product SoilGrids (SG, ∼250 m). Predictions were tested in detail against 3 years of root-zone (3-75 cm) soil moisture observation data from 35 monitoring sites within the OzNet project in Australia, with additional tests of the finalised modelling approach against cosmic-ray neutron (CosmOz, 0-50 cm, 9 sites from 2011 to 2017) and satellite (ASCAT, 0-2 cm, continent-wide from 2007 to 2009) observations. The model was forced by daily 0.05° (∼5 km) gridded meteorological data. The NicheMapR system predicted soil moisture to within experimental error for all data sets. Using the SLGA or the SG soil database, the OzNet soil moisture could be predicted with a root mean square error (rmse) of ∼0.075 m3 m-3 and a correlation coefficient (r) of 0.65 consistently through the soil profile without any parameter tuning. Soil moisture predictions based on the SLGA and SG datasets were ≈ 17% closer to the observations than when using a choropleth-derived soil data set (the Digital Atlas of Australian Soils, DAAS), with the greatest improvements occurring for deeper layers. The CosmOz observations were predicted with similar accuracy (r = 0.76 and rmse of ∼0.085 m3 m-3). Comparisons at the continental scale to 0-2 cm satellite data (ASCAT) showed that the SLGA/SG datasets increased model fit over simulations using the DAAS soil properties (r ∼ 0.63 and rmse 15% vs. r ∼ 0.48 and rmse 18%, respectively). Overall, our results demonstrate the advantages of using GlobalSoilMap.net products in combination with gridded weather data for modelling soil moisture at fine spatial and temporal resolution at the continental scale.

  13. Precipitation Estimation Using L-Band and C-Band Soil Moisture Retrievals

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Brocca, Luca; Crow, Wade T.; Burgin, Mariko S.; De Lannoy, Gabrielle J. M.

    2016-01-01

    An established methodology for estimating precipitation amounts from satellite-based soil moisture retrievals is applied to L-band products from the Soil Moisture Active Passive (SMAP) and Soil Moisture and Ocean Salinity (SMOS) satellite missions and to a C-band product from the Advanced Scatterometer (ASCAT) mission. The precipitation estimates so obtained are evaluated against in situ (gauge-based) precipitation observations from across the globe. The precipitation estimation skill achieved using the L-band SMAP and SMOS data sets is higher than that obtained with the C-band product, as might be expected given that L-band is sensitive to a thicker layer of soil and thereby provides more information on the response of soil moisture to precipitation. The square of the correlation coefficient between the SMAP-based precipitation estimates and the observations (for aggregations to approximately 100 km and 5 days) is on average about 0.6 in areas of high rain gauge density. Satellite missions specifically designed to monitor soil moisture thus do provide significant information on precipitation variability, information that could contribute to efforts in global precipitation estimation.
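
    The idea that soil moisture retrievals carry rainfall information can be illustrated with a crude bottom-up water balance inversion in the spirit of SM2RAIN (a simplification for illustration, not the exact methodology applied in the paper): positive increments of a saturation time series are scaled by the water capacity of the sensed layer, and losses are neglected. A Python sketch with hypothetical values:

      import numpy as np

      def rainfall_from_soil_moisture(sm, z=75.0, dt=1.0):
          # Water balance Z*ds/dt = P - losses, with losses neglected during rain
          # and negative increments (dry-down) set to zero. sm is a saturation
          # series (0-1), z the layer water capacity in mm, dt the time step.
          dsm = np.diff(sm)
          return np.clip(z * dsm / dt, 0.0, None)  # mm per time step

      sm = np.array([0.30, 0.32, 0.45, 0.43, 0.41, 0.55])  # hypothetical retrievals
      print(rainfall_from_soil_moisture(sm))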

  14. Multi-decadal Arctic sea ice roughness.

    NASA Astrophysics Data System (ADS)

    Tsamados, M.; Stroeve, J.; Kharbouche, S.; Muller, J. P.; Nolin, A. W.; Petty, A.; Haas, C.; Girard-Ardhuin, F.; Landy, J.

    2017-12-01

    The transformation of Arctic sea ice from mainly perennial, multi-year ice to seasonal, first-year ice is believed to have been accompanied by a reduction in the roughness of the ice cover surface. This smoothing effect has been shown to (i) modify the momentum and heat transfer between the atmosphere and ocean, (ii) alter the ice thickness distribution, which in turn controls the snow and melt pond distribution over the ice cover, and (iii) bias airborne and satellite remote sensing measurements that depend on the scattering and reflective characteristics of the sea ice surface topography. We will review existing and novel remote sensing methodologies proposed to estimate sea ice roughness, ranging from airborne LIDAR measurements (i.e., Operation IceBridge), to backscatter coefficients from scatterometers (ASCAT, QuikSCAT), to the Multi-angle Imaging SpectroRadiometer (MISR), and to laser (ICESat) and radar altimeters (Envisat, CryoSat, AltiKa, Sentinel-3). We will show that by comparing and cross-calibrating these different products we can offer a consistent multi-mission, multi-decadal view of the declining sea ice roughness. Implications for sea ice physics, climate and remote sensing will also be discussed.

  15. Spaceborne GNSS reflectometry for ocean winds: First results from the UK TechDemoSat-1 mission

    NASA Astrophysics Data System (ADS)

    Foti, Giuseppe; Gommenginger, Christine; Jales, Philip; Unwin, Martin; Shaw, Andrew; Robertson, Colette; Roselló, Josep

    2015-07-01

    First results are presented for ocean surface wind speed retrieval from reflected GPS signals measured by the low Earth orbiting UK TechDemoSat-1 satellite (TDS-1). Launched in July 2014, TDS-1 provides the first new spaceborne Global Navigation Satellite System-Reflectometry (GNSS-R) data since the pioneering UK-Disaster Monitoring Mission (UK-DMC) experiment in 2003. Examples of onboard-processed delay-Doppler maps reveal excellent data quality for winds up to 27.9 m/s. Collocated Advanced Scatterometer (ASCAT) winds are used to develop and evaluate a wind speed algorithm based on the signal-to-noise ratio (SNR) and the bistatic radar equation. For SNRs greater than 3 dB, wind speed is retrieved without bias and with a precision of around 2.2 m/s between 3 and 18 m/s, even without calibration. Exploiting lower-SNR signals, however, requires good knowledge of the antenna beam, platform attitude, and instrument gain setting. This study demonstrates the capabilities of low-cost, low-mass, and low-power GNSS-R receivers ahead of their launch on the NASA Cyclone GNSS (CYGNSS) constellation in 2016.
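
    For reference, the bistatic radar equation mentioned above links the received reflected power to an effective scattering cross-section of the ocean surface, which is then mapped empirically to wind speed. A Python sketch of the inversion for the cross-section (all numerical values are hypothetical, and the real TDS-1 processing additionally requires the antenna-beam, attitude and gain knowledge discussed above):

      import numpy as np

      def bistatic_sigma(p_r, p_t, g_t, g_r, wavelength, r_t, r_r):
          # Invert the bistatic radar equation
          #   P_r = P_t*G_t*G_r*lambda^2*sigma / ((4*pi)^3 * R_t^2 * R_r^2)
          # for sigma. Powers and gains are linear (not dB), distances in metres.
          return p_r * (4 * np.pi) ** 3 * r_t**2 * r_r**2 / (p_t * g_t * g_r * wavelength**2)

      # Hypothetical GPS L1 geometry: transmitter ~20,000 km, receiver ~700 km.
      sigma = bistatic_sigma(p_r=1e-17, p_t=25.0, g_t=200.0, g_r=100.0,
                             wavelength=0.1903, r_t=2.2e7, r_r=7.0e5)
      print(f"{sigma:.3e} m^2")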

  16. First Spaceborne GNSS-Reflectometry Observations of Hurricanes From the UK TechDemoSat-1 Mission

    NASA Astrophysics Data System (ADS)

    Foti, Giuseppe; Gommenginger, Christine; Srokosz, Meric

    2017-12-01

    We present the first examples of Global Navigation Satellite Systems-Reflectometry (GNSS-R) observations of hurricanes using spaceborne data from the UK TechDemoSat-1 (TDS-1) mission. We confirm that GNSS-R signals can detect ocean condition changes in the very high near-surface winds associated with hurricanes. TDS-1 GNSS-R reflections were collocated with International Best Track Archive for Climate Stewardship (IBTrACS) hurricane data, MetOp ASCAT A/B scatterometer winds, and two reanalysis products. Clear variations of GNSS-R reflected power (σ0) are observed as reflections travel through hurricanes, in some cases up to and through the eye wall. The GNSS-R reflected power is tentatively inverted to estimate wind speed using the TDS-1 baseline wind retrieval algorithm, which was developed for low to moderate winds. Despite this, TDS-1 GNSS-R winds through the hurricanes show closer agreement with IBTrACS estimates than winds provided by scatterometers and reanalyses. GNSS-R wind profiles show realistic spatial patterns and sharp gradients that are consistent with expected structures around the eye of tropical cyclones.

  17. Assimilating satellite soil moisture into rainfall-runoff modelling: towards a systematic study

    NASA Astrophysics Data System (ADS)

    Massari, Christian; Tarpanelli, Angelica; Brocca, Luca; Moramarco, Tommaso

    2015-04-01

    Soil moisture is the main factor in the partitioning of mass and energy fluxes between the land surface and the atmosphere, thus playing a fundamental role in the hydrological cycle. Indeed, soil moisture represents the initial condition of rainfall-runoff modelling that determines the flood response of a catchment. Different initial soil moisture conditions can discriminate between catastrophic and minor effects of a given rainfall event. Therefore, improving the estimation of initial soil moisture conditions will reduce uncertainties in early warning flood forecasting models aimed at the mitigation of flood hazard. In recent years, satellite soil moisture products have become available with fine spatial-temporal resolution and good accuracy. Consequently, a number of studies have been published in which the impact of the assimilation of satellite soil moisture data into rainfall-runoff modelling is investigated. Unfortunately, data assimilation involves a series of assumptions and choices that significantly affect the final result. Given a satellite soil moisture observation, a rainfall-runoff model and a data assimilation technique, an improvement or a deterioration of discharge predictions can be obtained depending on the choices made in the data assimilation procedure. Consequently, large discrepancies have been obtained in the studies published so far, likely due to differences in the implementation of the data assimilation technique. On this basis, a comprehensive and robust procedure for the assimilation of satellite soil moisture data into rainfall-runoff modelling is developed here and applied to six subcatchments of the Upper Tiber River Basin, for which high-quality hydrometeorological hourly observations are available for the period 1989-2013. The satellite soil moisture product used in this study is obtained from the Advanced SCATterometer (ASCAT) onboard the Metop-A satellite and has been available since 2007. The MISDc ("Modello Idrologico SemiDistribuito in continuo") continuous hydrological model is used for flood simulation. The Ensemble Kalman Filter (EnKF) is employed as the data assimilation technique for its flexibility and good performance in a number of previous applications. Different components are involved in the developed data assimilation procedure. For the correction of the bias between satellite and modelled soil moisture data, three different techniques are considered: mean-variance matching, Cumulative Density Function (CDF) matching and least squares linear regression. For properly generating the ensembles of model states required in the application of the EnKF technique, an exhaustive search of the model error parameterization and structure is carried out, differentiated for each study catchment. A number of scores and statistics are employed to evaluate the reliability of the ensemble. Similarly, different configurations of the observation error are investigated. Results show that for four out of six catchments the assimilation of the ASCAT soil moisture product improves discharge simulation in the validation period 2010-2013, mainly during flood events. The two catchments in which the assimilation does not improve the results are located in the mountainous part of the region, where both MISDc and the satellite data perform worse. The analysis of the data assimilation choices highlights that the selection of the observation error has the largest influence on discharge simulation. Finally, the bias correction approaches have a smaller effect, and linear techniques are preferable. The assessment of all the components involved in the data assimilation procedure provides a clear understanding of the results, and following a similar procedure is advised for studies of this kind.
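
    Of the three bias correction techniques compared above, CDF matching is the most general, while mean-variance matching is its two-moment (linear) special case. A minimal Python sketch of both, applied to synthetic satellite and modelled soil moisture series (all data hypothetical):

      import numpy as np

      def cdf_match(sat, model):
          # Map each satellite value through matching empirical quantiles so
          # that the rescaled satellite CDF matches the model CDF.
          q = np.linspace(0, 100, 101)
          return np.interp(sat, np.percentile(sat, q), np.percentile(model, q))

      def mean_variance_match(sat, model):
          # Linear alternative: match the first two moments only.
          return (sat - sat.mean()) / sat.std() * model.std() + model.mean()

      rng = np.random.default_rng(2)
      model_sm = rng.beta(2, 3, 1000)             # modelled soil moisture
      sat_sm = 0.6 * rng.beta(2, 5, 1000) + 0.1   # biased satellite series
      print(cdf_match(sat_sm, model_sm).mean(), model_sm.mean())
      print(mean_variance_match(sat_sm, model_sm).std(), model_sm.std())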

  18. Improvements and Advances to the Cross-Calibrated Multi-Platform (CCMP) Ocean Vector Wind Analysis (V2.0 release)

    NASA Astrophysics Data System (ADS)

    Scott, J. P.; Wentz, F. J.; Hoffman, R. N.; Atlas, R. M.

    2016-02-01

    Ocean vector wind is a valuable climate data record (CDR), useful for observing and monitoring changes in climate and air-sea interactions. Ocean surface wind stress influences such processes as heat, moisture, and momentum fluxes between the atmosphere and ocean, driving ocean currents and forcing ocean circulation. The Cross-Calibrated Multi-Platform (CCMP) ocean vector wind analysis is a quarter-degree, six-hourly global ocean wind analysis product created using the variational analysis method (VAM) [Atlas et al., 1996; Hoffman et al., 2003]. The CCMP V1.1 wind product is a highly esteemed, widely used data set containing the longest gap-free record of satellite-based ocean vector wind data (July 1987 to June 2012). CCMP V1.1 was considered a "first-look" data set that used the most timely, albeit preliminary, releases of satellite and in situ data and modeled ECMWF-Operational wind background fields. The authors have been working with the original producers of CCMP V1.1 to create an updated, improved, and consistently reprocessed CCMP V2.0 ocean vector wind analysis data set. With Remote Sensing Systems (RSS) having recently updated all passive microwave satellite instrument calibrations and retrievals to the RSS Version-7 RTM standard, the reprocessing of the CCMP data set into a higher-quality CDR using inter-calibrated satellite inputs became feasible. In addition to the SSM/I, SSMIS, TRMM TMI, QuikSCAT, AMSR-E, and WindSat instruments, AMSR2, GMI, and ASCAT have also been included in the CCMP V2.0 data set release, which has now been extended to the beginning of 2015. Additionally, the background field has been updated to use six-hourly, quarter-degree ERA-Interim wind vector inputs, and the quality checks on the in situ data have been carefully reviewed and improved. The goal of the CCMP V2.0 release is to serve as a merged ocean wind vector data set for climate studies. Diligent effort has been made by the authors to minimize systematic and spurious sources of error. The authors will present a complete discussion of the upgrades made in the CCMP V2.0 data set, as well as validation work that has been completed on the CCMP V2.0 wind analysis product.

  19. Estimating surface soil moisture from SMAP observations using a Neural Network technique.

    PubMed

    Kolassa, J; Reichle, R H; Liu, Q; Alemohammad, S H; Gentine, P; Aida, K; Asanuma, J; Bircher, S; Caldwell, T; Colliander, A; Cosh, M; Collins, C Holifield; Jackson, T J; Martínez-Fernández, J; McNairn, H; Pacheco, A; Thibeault, M; Walker, J P

    2018-01-01

    A Neural Network (NN) algorithm was developed to estimate global surface soil moisture for April 2015 to March 2017 with a 2-3 day repeat frequency using passive microwave observations from the Soil Moisture Active Passive (SMAP) satellite, surface soil temperatures from the NASA Goddard Earth Observing System Model version 5 (GEOS-5) land modeling system, and Moderate Resolution Imaging Spectroradiometer-based vegetation water content. The NN was trained on GEOS-5 soil moisture target data, making the NN estimates consistent with the GEOS-5 climatology, such that they may ultimately be assimilated into this model without further bias correction. Evaluated against in situ soil moisture measurements, the average unbiased root mean square error (ubRMSE), correlation and anomaly correlation of the NN retrievals were 0.037 m3/m3, 0.70 and 0.66, respectively, against SMAP core validation site measurements, and 0.026 m3/m3, 0.58 and 0.48, respectively, against International Soil Moisture Network (ISMN) measurements. At the core validation sites, the NN retrievals have a significantly higher skill than the GEOS-5 model estimates and a slightly lower correlation skill than the SMAP Level-2 Passive (L2P) product. The feasibility of the NN method was reflected in a lower ubRMSE compared to the L2P retrievals, as well as a higher skill when ancillary parameters in physically-based retrievals were uncertain. Against ISMN measurements, the skill of the two retrieval products was more comparable. A triple collocation analysis against Advanced Microwave Scanning Radiometer 2 (AMSR2) and Advanced Scatterometer (ASCAT) soil moisture retrievals showed that the NN and L2P retrieval errors have a similar spatial distribution, but the NN retrieval errors are generally lower in densely vegetated regions and transition zones.
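
    Triple collocation, as used in the final analysis above, estimates the error variance of each of three collocated products without ground truth, under the assumptions of mutually independent errors and products rescaled to a common climatology. A minimal Python sketch with synthetic series (all data hypothetical):

      import numpy as np

      def triple_collocation_errvar(x, y, z):
          # Covariance-based triple collocation: with independent errors, the
          # error variance of x is Var(x) - Cov(x,y)*Cov(x,z)/Cov(y,z),
          # and likewise for y and z by permutation.
          c = np.cov(np.vstack([x, y, z]))
          ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
          ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
          ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
          return ex, ey, ez

      rng = np.random.default_rng(3)
      truth = rng.normal(0.25, 0.08, 5000)       # hypothetical "true" soil moisture
      nn    = truth + rng.normal(0, 0.03, 5000)  # NN-style retrieval
      amsr2 = truth + rng.normal(0, 0.05, 5000)
      ascat = truth + rng.normal(0, 0.04, 5000)
      print(triple_collocation_errvar(nn, amsr2, ascat))  # ~ (0.0009, 0.0025, 0.0016)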

  20. Soil moisture data intercomparison

    NASA Astrophysics Data System (ADS)

    Kerr, Yann; Rodriguez-Fernandez, Nemesio; Al-Yaari, Amen; Parens, Marie; Molero, Beatriz; Mahmoodi, Ali; Mialon, Arnaud; Richaume, Philippe; Bindlish, Rajat; Mecklenburg, Susanne; Wigneron, Jean-Pierre

    2016-04-01

    The Soil Moisture and Ocean Salinity satellite (SMOS) was launched in November 2009 and started delivering data in January 2010. The satellite has now been in operation for over 6 years, while the retrieval algorithms from Level 1 to Level 2 underwent significant evolutions as knowledge improved. Other approaches for retrieval at Level 2 over land were also investigated, while Level 3 and 4 products were initiated. In this presentation these improvements are assessed by inter-comparing the current Level 2 version (V620) against the previous version (V551) and against new products based either on neural networks or on Level 3. In addition, a global evaluation of different SMOS soil moisture (SM) products is performed, comparing them with model simulations and with other satellites (AMSR-E/AMSR2 and ASCAT). Finally, all products were evaluated against in situ measurements of SM. The study demonstrates that V620 shows a significant improvement over the earlier version V551 (including improvements at Level 1 that carry through to Level 2). Results also show that neural network based approaches can yield excellent results over areas where other products are poor. Finally, the global comparison indicates that SMOS behaves very well when compared to other sensors/approaches and gives consistent results over all surfaces, from very dry (African Sahel, Arizona) to wet (tropical rain forests). RFI (Radio Frequency Interference) is still an issue even though detection has been greatly improved, and RFI sources in several areas of the world have been significantly reduced. When compared to other satellite products, the analysis shows that SMOS achieves its expected goals and is globally consistent over different eco-climate regions from low to high latitudes and throughout the seasons.

  1. Development and Implementation of Flood Risk Mapping, Water Bodies Monitoring and Climate Information for Human Health

    NASA Astrophysics Data System (ADS)

    Ceccato, P.; McDonald, K. C.; Jensen, K.; Podest, E.; De La Torre Juarez, M.

    2013-12-01

    Public health professionals are increasingly concerned about the potential impact that climate variability and change can have on infectious disease. The International Research Institute for Climate and Society (IRI), City College of New York (CCNY) and the NASA Jet Propulsion Laboratory (JPL) are developing new products to increase the public health community's capacity to understand, use, and demand the appropriate climate data and climate information to mitigate the public health impacts of climate on vector-borne diseases such as malaria, leishmaniasis and Rift Valley fever. In this poster we present the new and improved water body monitoring products that have been developed for monitoring and forecasting the risk of vector-borne disease epidemics. The products include seasonal inundation patterns in the East African region based on global mappings of inundated water fraction derived at the 25-km scale from the active and passive microwave instruments QuikSCAT, AMSR-E, SSM/I, ERS and ASCAT, together with MODIS and Landsat data. We also present how the products are integrated into knowledge systems (the IRI Data Library Map Room, SERVIR) to support the use of climate and environmental information in climate-sensitive health decision-making.

  2. Assimilation of remote sensing observations into a continuous distributed hydrological model: impacts on the hydrologic cycle

    NASA Astrophysics Data System (ADS)

    Laiolo, Paola; Gabellani, Simone; Campo, Lorenzo; Cenci, Luca; Silvestro, Francesco; Delogu, Fabio; Boni, Giorgio; Rudari, Roberto

    2015-04-01

    The reliable estimation of hydrological variables (e.g. soil moisture, evapotranspiration, surface temperature) in space and time is of fundamental importance in operational hydrology to improve the forecast of the rainfall-runoff response of catchments and, consequently, flood predictions. Nowadays remote sensing offers a chance to provide good space-time estimates of several hydrological variables and thus to improve hydrological model performance, especially in environments with scarce in-situ data. This work investigates the impact of the assimilation of different remote sensing products on the hydrological cycle by using a continuous, physically based, distributed hydrological model. Three soil moisture products derived from ASCAT (Advanced SCATterometer) are used to update the model state variables. The satellite-derived products are assimilated into the hydrological model using different assimilation techniques: a simple nudging scheme and the Ensemble Kalman Filter. Moreover, two assimilation strategies are evaluated to assess the impact of assimilating the satellite products at the model spatial resolution or at the satellite scale. The experiments are carried out for three Italian catchments over a multi-year period. The benefits for the model predictions of discharge, LST, evapotranspiration and soil moisture dynamics are tested and discussed.
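
    Of the two assimilation techniques mentioned above, nudging is the simplest to illustrate. The following toy Python sketch (a scalar state with hypothetical numbers, not the paper's distributed model) relaxes a modelled soil moisture trajectory toward a satellite observation whenever one is available:

      import numpy as np

      def run_with_nudging(sm0, tendency, obs, gain=0.3):
          # Propagate a scalar soil moisture state with a model tendency, then
          # relax it toward the satellite observation when one is available
          # (NaN = no overpass). gain in (0, 1] sets the relaxation strength.
          state, out = sm0, []
          for dt_sm, ob in zip(tendency, obs):
              state = state + dt_sm               # model forecast step
              if not np.isnan(ob):
                  state += gain * (ob - state)    # nudging update
              out.append(state)
          return np.array(out)

      tend = np.array([0.01, -0.02, 0.00, 0.03, -0.01])
      obs = np.array([np.nan, 0.28, np.nan, np.nan, 0.33])
      print(run_with_nudging(0.30, tend, obs))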

  3. Assimilation of SMOS brightness temperatures in the ECMWF EKF for the analysis of soil moisture

    NASA Astrophysics Data System (ADS)

    Munoz-Sabater, Joaquin

    2012-07-01

    Since 2 November 2009, the European Centre for Medium-Range Weather Forecasts (ECMWF) has been monitoring, in Near Real Time (NRT), L-band brightness temperatures measured by the Soil Moisture and Ocean Salinity (SMOS) satellite mission of the European Space Agency (ESA). The main objective of the monitoring suite for SMOS data is to systematically monitor the differences between SMOS observed brightness temperatures and the corresponding model equivalents simulated by the Community Microwave Emission Model (CMEM), the so-called first guess departures. This is a crucial step, as the first guess departures are the quantity used in the analysis. The ultimate goal is to investigate how the assimilation of SMOS brightness temperatures over land improves weather forecast skill through a more accurate initialization of the global soil moisture state. In this presentation, some significant results from the activities preparing for the assimilation of SMOS data are shown. Among these activities, an effective data thinning strategy, a practical approach to reducing noise in the observed brightness temperatures, and a bias correction scheme are of special interest. Firstly, SMOS data need to be significantly thinned, as the data volume delivered for a single orbit is too large for the current operational capabilities of any Numerical Weather Prediction system. Different thinning strategies have been analysed and tested; the most suitable one is the assimilation of SMOS brightness temperatures which match the ECMWF T511 (~40 km) reduced Gaussian grid. Secondly, SMOS observational noise is reduced significantly by averaging the data in angular bins. In addition, this methodology contributes to further thinning of the SMOS data before the analysis. Finally, a bias correction scheme based on CDF matching is applied to the observations to ensure an unbiased dataset ready for assimilation in the ECMWF surface analysis system. The current ECMWF operational soil moisture analysis is based on a point-wise Extended Kalman Filter (EKF). This system assimilates proxy surface observations, i.e., 2 m air temperature and relative humidity, to analyse the soil moisture state. Recent developments have also made it possible to assimilate remote sensing data from active and passive instruments. In particular, the ECMWF EKF can also assimilate data from the Advanced Scatterometer (ASCAT) onboard MetOp-A and, more recently, SMOS brightness temperature observations. The first preliminary assimilation results will be shown. The analysis fields will be evaluated through comparison with in-situ data from different regions.
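
    The angular binning step described above can be sketched in a few lines of Python: noisy multi-angular brightness temperatures are averaged within fixed incidence-angle bins, which both reduces radiometric noise and further thins the data volume (the 5-degree bin width and all values here are illustrative):

      import numpy as np

      def bin_by_incidence(theta, tb, edges=np.arange(0, 65, 5)):
          # Average brightness temperatures within incidence-angle bins.
          idx = np.digitize(theta, edges) - 1
          centres = 0.5 * (edges[:-1] + edges[1:])
          means = np.array([tb[idx == i].mean() if np.any(idx == i) else np.nan
                            for i in range(len(centres))])
          return centres, means

      rng = np.random.default_rng(4)
      theta = rng.uniform(0, 60, 500)                   # incidence angles (deg)
      tb = 220 + 0.5 * theta + rng.normal(0, 4, 500)    # noisy synthetic TBs (K)
      print(bin_by_incidence(theta, tb))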

  4. Data and Tools | Energy Analysis | NREL

    Science.gov Websites

    NREL develops energy analysis data and tools. The collections include Data Products, Technology and Performance Analysis Tools, Energy Systems Analysis Tools, and Economic and Financial Analysis Tools.

  5. Antecedent Wetness Conditions based on ERS scatterometer data in support to rainfall-runoff modeling

    NASA Astrophysics Data System (ADS)

    Brocca, L.; Melone, F.; Moramarco, T.

    2009-04-01

    Despite its small volume compared to other components of the hydrologic cycle, soil moisture is of fundamental importance to many hydrological, meteorological, biological and biogeochemical processes. For storm rainfall-runoff modeling, the estimation of the Antecedent Wetness Conditions (AWC) is one of the most important issues in determining the hydrological response. In this context, this study investigates the potential of the scatterometer on board the ERS satellites for the assessment of soil wetness conditions at two different scales. The satellite soil moisture data set, available from 1992, is downloaded from the ERS/METOP Soil Moisture archive located at http://www.ipf.tuwien.ac.at/radar/index.php?go=ascat. At the local scale, the scatterometer-derived soil wetness index (SWI) data (Wagner, W., Lemoine, G., and Rott, H., 1999. A Method for Estimating Soil Moisture from ERS Scatterometer and Soil Data. Remote Sensing of Environment, 70, 191-207) have been compared with two in-situ soil moisture data sets. At the catchment scale, the reliability of the SWI to estimate the AWC has been tested by considering its relationship with the potential maximum soil retention parameter, S, of the Soil Conservation Service-Curve Number (SCS-CN) method for abstraction. The parameter S has been derived by considering several flood events that occurred from 1992 to 2005 in different catchments of central Italy. The performance of two Antecedent Precipitation Indices (API) and one Base Flow Index (BFI), usually employed in hydrological practice for AWC assessment, has been compared with that of the SWI. The obtained results show a high accuracy of the SWI for the estimation of wetness conditions at both the local and catchment scales, despite the complex orography of the investigated areas (Brocca, L., Melone, F., Moramarco, T., Morbidelli, R., 2009. Antecedent wetness conditions based on ERS scatterometer data. Journal of Hydrology, 364 (1-2), 73-87). At the local scale, the SWI has been found quite reliable in representing soil moisture at a layer depth of 15 cm, with an average correlation coefficient of 0.81 and a root mean square error of ~0.04 m3/m3. In terms of AWC assessment at the catchment scale, the SWI has been found highly correlated with the observed S parameter, with a correlation coefficient of -0.90. Moreover, the SWI outperformed both API indices, which were poorly representative of the AWC, and the BFI. The methodology delineated in this study can be considered a simple and entirely new approach to validating remotely sensed soil moisture estimates at the catchment scale, mainly for coarse-resolution sensors such as scatterometers and radiometers. The obtained results indirectly reveal the usefulness of the SWI both for flood forecasting applications and for prediction in ungauged basins. Moreover, the correlation of in-situ soil moisture measurements with the SWI reveals the potential of scatterometer data, particularly considering the higher spatial resolution provided by the successor of the ERS scatterometer, the Advanced Scatterometer (ASCAT), on board the MetOp platforms.
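
    To make the role of S concrete: in the SCS-CN method, the direct runoff of a storm is computed from the rainfall depth P and the potential maximum retention S, so a wetter catchment (lower S, as diagnosed here from the SWI) yields more runoff for the same storm. A minimal Python sketch of the standard formula, with hypothetical numbers:

      def scs_cn_runoff(p_mm, s_mm, ia_ratio=0.2):
          # SCS Curve Number direct runoff: Q = (P - Ia)^2 / (P - Ia + S) for
          # P > Ia, else 0, with initial abstraction Ia = 0.2*S by convention.
          # Wetter antecedent conditions (higher SWI) imply a smaller S.
          ia = ia_ratio * s_mm
          if p_mm <= ia:
              return 0.0
          return (p_mm - ia) ** 2 / (p_mm - ia + s_mm)

      # The same 50 mm storm under dry vs wet antecedent conditions:
      print(scs_cn_runoff(50.0, s_mm=120.0))  # dry catchment, little runoff
      print(scs_cn_runoff(50.0, s_mm=40.0))   # wet catchment, more runoff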

  6. Neural network retrieval of soil moisture: application to SMOS

    NASA Astrophysics Data System (ADS)

    Rodriguez-Fernandez, Nemesio; Richaume, Philippe; Aires, Filipe; Prigent, Catherine; Kerr, Yann; Kolassa, Jana; Jimenez, Carlos; Cabot, Francois; Mahmoodi, Ali

    2014-05-01

    We present an efficient statistical soil moisture (SM) retrieval method using SMOS brightness temperatures (BTs) complemented with MODIS NDVI and ASCAT backscattering data. The method is based on a feed-forward neural network (hereafter NN) trained with SM from ECMWF model predictions or from the SMOS operational algorithm. The best compromise to retrieve SM with NNs from SMOS brightness temperatures in a large fraction of the swath (~670 km) is to use incidence angles from 25 to 60 degrees (in 7 bins of 5 deg width) for both H and V polarizations. The correlation coefficient (R) of the SM retrieved by the NN and the reference SM dataset (ECMWF or SMOS L3) is 0.8. The correlation coefficient increases to 0.91 when adding as input MODIS NDVI, ECOCLIMAP sand and clay fractions and one of the following: (i) active microwave observations (the ASCAT backscattering coefficient at 40 deg incidence angle) or (ii) ECMWF soil temperature. Finally, the correlation coefficient increases to R = 0.94 when using a normalization index computed locally for each latitude-longitude point from the maximum and minimum BTs and the associated SM values of the local time series. Global maps of SM obtained with NNs reproduce well the spatial structures present in the reference SM datasets, implying that the NN works well for a wide range of ecosystems and physical conditions. In addition, the results of the NNs have been evaluated at selected locations for which in situ measurements are available, such as the USDA-ARS watersheds (USA), the OzNet network (AUS) and the USDA-NRCS SCAN network (USA). The time series of SM obtained with NNs reproduce the temporal behavior measured with in situ sensors. For well-known sites where the in situ measurement is representative of a 40 km scale, like the Little Washita watershed, the NN models show a very high correlation (R = 0.8-0.9) and a low standard deviation of 0.02-0.04 m3/m3 with respect to the in situ measurements. When comparing with all the in situ stations, the average correlation coefficients and bias of NN SM with respect to in situ measurements are comparable to those of ECMWF and SMOS L3 SM (R = 0.6). The standard deviation of the difference (STDD) of those products with respect to in situ measurements is lower for NN SM, in particular for the NN models that use local information on the extreme BTs and associated SM values, for which the average STDD is 0.03 m3/m3, half of the average STDD values obtained with ECMWF and L3 SM (0.05-0.07 m3/m3). In conclusion, SM obtained using NNs gives results of comparable or better quality than other SM products. In addition, NNs are an efficient method to obtain SM from SMOS data (one year of SMOS observations can be inverted in less than 60 seconds). These results have been obtained in the framework of the SMOS+NN project funded by ESA and they open interesting perspectives such as a near real time processor and data assimilation in weather prediction models.
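
    As a toy end-to-end illustration of the supervised setup described above, the following Python sketch trains a small feed-forward network on synthetic data standing in for the 14 multi-angular, dual-polarization BT inputs plus NDVI, with a synthetic reference SM playing the role of ECMWF or SMOS L3; scikit-learn is assumed as the NN library, which the abstract does not specify:

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(5)
      n = 5000
      tb = rng.normal(250, 15, (n, 14))        # 7 angle bins x H/V polarizations
      ndvi = rng.uniform(0.1, 0.8, (n, 1))
      x = np.hstack([tb, ndvi])
      # Toy reference soil moisture with noise (hypothetical relation).
      sm_ref = 0.5 - 0.001 * tb.mean(axis=1) + 0.1 * ndvi.ravel() + rng.normal(0, 0.02, n)

      nn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
      nn.fit(x[:4000], sm_ref[:4000])
      # Correlation of retrievals with the reference on held-out samples:
      print(np.corrcoef(nn.predict(x[4000:]), sm_ref[4000:])[0, 1])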

  7. Autologous Stem Cells in Achilles Tendinopathy (ASCAT): protocol for a phase IIA, single-centre, proof-of-concept study

    PubMed Central

    Goldberg, Andrew J; Zaidi, Razi; Brooking, Deirdre; Kim, Louise; Korda, Michelle; Masci, Lorenzo; Green, Ruth; O’Donnell, Paul; Smith, Roger

    2018-01-01

    Introduction Achilles tendinopathy (AT) is a cause of pain and disability affecting both athletes and sedentary individuals. More than 150 000 people in the UK every year suffer from AT. While there is much preclinical work on the use of stem cells in tendon pathology, there is a scarcity of clinical data looking at the use of mesenchymal stem cells to treat tendon disease and there does not appear to be any studies of the use of autologous cultured mesenchymal stem cells (MSCs) for AT. Our hypothesis is that autologous culture expanded MSCs implanted into an area of mid-portion AT will lead to improved pain-free mechanical function. The current paper presents the protocol for a phase IIa clinical study. Methods and analysis The presented protocol is for a non-commercial, single-arm, open-label, phase IIa proof-of-concept study. The study will recruit 10 participants and will follow them up for 6 months. Included will be patients aged 18–70 years with chronic mid-portion AT who have failed at least 6 months of non-operative management. Participants will have a bone marrow aspirate collected from the posterior iliac crest under either local or general anaesthetic. MSCs will be isolated and expanded from the bone marrow. Four to 6 weeks after the harvest, participants will undergo implantation of the culture expanded MSCs under local anaesthetic and ultrasound guidance. The primary outcome will be safety as defined by the incidence rate of serious adverse reaction. The secondary outcomes will be efficacy as measured by patient-reported outcome measures and radiological outcome using ultrasound techniques. Ethics and dissemination The protocol has been approved by the National Research Ethics Service Committee (London, Harrow; reference 13/LO/1670). Trial findings will be disseminated through peer-reviewed publications and conference presentations. Trial registration number NCT02064062. PMID:29764889

  8. Global relation between microwave satellite vegetation products and vegetation productivity

    NASA Astrophysics Data System (ADS)

    Teubner, Irene E.; Forkel, Matthias; Jung, Martin; Miralles, Diego G.; Dorigo, Wouter A.

    2017-04-01

    The occurrence of unfavourable environmental conditions like droughts commonly reduces the photosynthetic activity of ecosystems and, hence, their potential to take up carbon from the atmosphere. Ecosystem photosynthetic activity is commonly determined using remote sensing observations in the optical domain, which however have limitations, particularly in regions of frequent cloud cover such as the tropics. In this study, we explore the potential of vegetation optical depth (VOD) from microwave satellite observations as an alternative source for assessing vegetation productivity. VOD serves as an estimate of vegetation density and water content, which affect plant physiological processes and hence should potentially provide a link to gross primary production (GPP). However, to date, it is unclear how microwave-retrieved VOD data and GPP data are related. We compare seasonal dynamics and anomalies of VOD retrievals from different satellite sensors and microwave frequencies with site-level and global GPP estimates. We use VOD observations from active (ASCAT) and passive microwave sensors (AMSR-E, SMOS). We include eddy covariance measurements from the FLUXNET2015 dataset to assess the VOD products at site level. For a global-scale analysis, we use the solar-induced chlorophyll fluorescence (SIF) observations from GOME-2 as a proxy for GPP and the FLUXCOM GPP product, which represents an upscaling of site measurements based on remote sensing data. Our results demonstrate that, in general, good agreement exists between VOD and GPP or SIF. However, the strength of these relations depends on the microwave frequency, land cover type, and the time within the growing season. Correlations between anomalies of VOD and GPP or SIF support the assumption that microwave-derived VOD can be used to monitor vegetation productivity dynamics. The study is performed as part of the EOWAVE project funded by the Vienna University of Technology (http://eowave.geo.tuwien.ac.at/) and the STR3S project funded by the Belgian Science Policy Office (BELSPO) as part of the STEREO III programme.
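
    Anomaly correlations of the kind examined above are a stricter test than raw correlations, because two variables sharing a seasonal cycle correlate even when their variations are otherwise unrelated. A small Python sketch (synthetic series; 46 eight-day composites per year assumed) removes the mean seasonal cycle before correlating:

      import numpy as np

      def seasonal_anomalies(series, period=46):
          # Remove the mean seasonal cycle from a multi-year series sampled
          # 'period' times per year, returning anomalies.
          x = series.reshape(-1, period)        # years x periods
          return (x - x.mean(axis=0)).ravel()

      rng = np.random.default_rng(6)
      t = np.arange(46 * 10)
      seasonal = np.sin(2 * np.pi * t / 46)
      vod = seasonal + rng.normal(0, 0.2, t.size)       # synthetic VOD
      gpp = 2 * seasonal + rng.normal(0, 0.4, t.size)   # synthetic GPP
      a_vod, a_gpp = seasonal_anomalies(vod), seasonal_anomalies(gpp)
      # Raw correlation is inflated by the shared cycle; anomaly correlation is not.
      print(np.corrcoef(vod, gpp)[0, 1], np.corrcoef(a_vod, a_gpp)[0, 1])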

  9. The PRESSCA operational early warning system for landslide forecasting: the 11-12 November 2013 rainfall event in Central Italy.

    NASA Astrophysics Data System (ADS)

    Ciabatta, Luca; Brocca, Luca; Ponziani, Francesco; Berni, Nicola; Stelluti, Marco; Moramarco, Tommaso

    2014-05-01

    The Umbria Region, located in Central Italy, is one of the areas most prone to landslide risk in Italy, affected almost yearly by landslide events at different spatial scales. For early warning procedures aimed at the assessment of hydrogeological risk, rainfall thresholds represent the main tool of the Italian Civil Protection System. As shown in previous studies, soil moisture plays a key role in landslide triggering. In fact, by acting on the pore water pressure, soil moisture influences the rainfall amount needed to activate a landslide. In this work, an operational, physically based early warning system named PRESSCA, which takes soil moisture into account in the definition of rainfall thresholds, is presented. Specifically, the soil moisture conditions are evaluated in PRESSCA by using a distributed soil water balance model, recently coupled with the near real-time satellite soil moisture product obtained from ASCAT (Advanced SCATterometer) and with in-situ monitoring data. The integration of three different sources of soil moisture information allows the most accurate possible estimate of soil moisture conditions. Then, both observed and forecasted rainfall data are compared with the soil moisture-based thresholds in order to obtain risk indicators over a grid of ~5 km. These indicators are used for daily hydrogeological risk evaluation and management by the regional Civil Protection service, through the sharing/delivery of near real-time landslide risk scenarios (also through an open source web platform: www.cfumbria.it). On 11-12 November 2013, the Umbria Region was hit by an exceptional rainfall event, with up to 430 mm/72 hours, that resulted in significant economic damage but fortunately no casualties among the population. In this study, the results of the PRESSCA system during the rainfall event are described, underlining the model's capability to reproduce, two days in advance, landslide risk scenarios in good spatial and temporal agreement with the actual conditions that occurred. High-resolution risk scenarios (100 m x 100 m), obtained by coupling PRESSCA forecasts with susceptibility and vulnerability layers, are also produced. The results show a good relationship between the PRESSCA forecasts and the landslides reported to the Civil Protection service during the rainfall event, confirming the system's robustness. The good forecasts of the PRESSCA system contributed to starting Civil Protection operations (alerting local authorities and the population) well in advance.

  10. Global retrieval of soil moisture and vegetation properties using data-driven methods

    NASA Astrophysics Data System (ADS)

    Rodriguez-Fernandez, Nemesio; Richaume, Philippe; Kerr, Yann

    2017-04-01

    Data-driven methods such as neural networks (NNs) are a powerful tool to retrieve soil moisture from multi-wavelength remote sensing observations at global scale. In this presentation we will review a number of recent results regarding the retrieval of soil moisture with the Soil Moisture and Ocean Salinity (SMOS) satellite, either using SMOS brightness temperatures as input data for the retrieval or using SMOS soil moisture retrievals as the reference dataset for the training. The presentation will discuss several possibilities for both the input datasets and the datasets to be used as reference for the supervised learning phase. Regarding the input datasets, it will be shown that NNs take advantage of the synergy of SMOS data and data from other sensors such as the Advanced Scatterometer (ASCAT, active microwaves) and MODIS (visible and infrared). NNs have also been successfully used to construct long time series of soil moisture from the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) and SMOS. A NN with input data from AMSR-E observations and SMOS soil moisture as reference for the training was used to construct a dataset sharing a similar climatology and without a significant bias with respect to SMOS soil moisture. Regarding the reference data to train the data-driven retrievals, we will show different possibilities depending on the application. Using actual in situ measurements is challenging at global scale due to the scarce distribution of sensors. In contrast, in situ measurements have been successfully used to retrieve SM at continental scale in North America, where the density of in situ measurement stations is high. Using global land surface models to train the NN constitutes an interesting alternative for implementing new remote sensing surface datasets. In addition, these datasets can be used to perform data assimilation into the model used as reference for the training. This approach has recently been tested at the European Centre for Medium-Range Weather Forecasts (ECMWF). Finally, retrievals using radiative transfer models can also be used as a reference SM dataset for the training phase. This approach was used to retrieve soil moisture from AMSR-E, as mentioned above, and also to implement the official European Space Agency (ESA) SMOS soil moisture product in Near Real Time. We will finish with a discussion of the retrieval of vegetation parameters from SMOS observations using data-driven methods.

  11. Temporal Stability of Soil Moisture and Radar Backscatter Observed by the Advanced Synthetic Aperture Radar (ASAR)

    PubMed Central

    Wagner, Wolfgang; Pathe, Carsten; Doubkova, Marcela; Sabel, Daniel; Bartsch, Annett; Hasenauer, Stefan; Blöschl, Günter; Scipal, Klaus; Martínez-Fernández, José; Löw, Alexander

    2008-01-01

    The high spatio-temporal variability of soil moisture is the result of atmospheric forcing and redistribution processes related to terrain, soil, and vegetation characteristics. Despite this high variability, many field studies have shown that in the temporal domain soil moisture measured at specific locations is correlated to the mean soil moisture content over an area. Since the measurements taken by Synthetic Aperture Radar (SAR) instruments are very sensitive to soil moisture it is hypothesized that the temporally stable soil moisture patterns are reflected in the radar backscatter measurements. To verify this hypothesis 73 Wide Swath (WS) images have been acquired by the ENVISAT Advanced Synthetic Aperture Radar (ASAR) over the REMEDHUS soil moisture network located in the Duero basin, Spain. It is found that a time-invariant linear relationship is well suited for relating local scale (pixel) and regional scale (50 km) backscatter. The observed linear model coefficients can be estimated by considering the scattering properties of the terrain and vegetation and the soil moisture scaling properties. For both linear model coefficients, the relative error between observed and modelled values is less than 5 % and the coefficient of determination (R2) is 86 %. The results are of relevance for interpreting and downscaling coarse resolution soil moisture data retrieved from active (METOP ASCAT) and passive (SMOS, AMSR-E) instruments. PMID:27879759

  12. Comparisons of Satellite Soil Moisture, an Energy Balance Model Driven by LST Data and Point Measurements

    NASA Astrophysics Data System (ADS)

    Laiolo, Paola; Gabellani, Simone; Rudari, Roberto; Boni, Giorgio; Puca, Silvia

    2013-04-01

    Soil moisture plays a fundamental role in the partitioning of mass and energy fluxes between the land surface and the atmosphere, thereby influencing climate and weather, and it is important in determining the rainfall-runoff response of catchments; moreover, in hydrological modelling and flood forecasting, a correct definition of moisture conditions is a key factor for accurate predictions. Different sources of information for the estimation of the soil moisture state are currently available: satellite data, point measurements and model predictions. All are affected by intrinsic uncertainty. Among the different satellite sensors that can be used for soil moisture estimation, three major groups can be distinguished: passive microwave sensors (e.g., SSM/I), active sensors (e.g., SAR, scatterometers), and optical sensors (e.g., spectroradiometers). The last two families, mainly because of their temporal and spatial resolution, seem the most suitable for hydrological applications. In this work, soil moisture point measurements from 10 sensors on the Italian territory are compared both with the satellite product SM-OBS-2 of the HSAF project, derived from the ASCAT scatterometer, and with ACHAB, an operational energy balance model that assimilates LST data derived from MSG and provides daily an evaporative fraction index related to soil moisture content for the whole Italian territory. Distributed comparisons of ACHAB and SM-OBS-2 over the whole Italian territory are performed as well.

  13. A soil moisture index derived from thermal infrared sensor on-board geostationary satellites over Europe, Africa and Australia

    NASA Astrophysics Data System (ADS)

    Ghilain, Nicolas; Trigo, Isabel; Arboleda, Alirio; Barrios, Jose-Miguel; Batelaan, Okke; Gellens-Meulenberghs, Françoise

    2017-04-01

    Soil moisture plays a central role in the water cycle. In particular, it is a major component whose variability controls the evapotranspiration process. Over the past years, there has been a large commitment by the remote sensing research community to develop satellites and retrieval algorithms for soil moisture monitoring over the continents. Most of these rely on observations at microwave wavelengths, making use of passive methods, active methods, or both combined. However, the available derived products are given at a relatively low spatial resolution for applications at the kilometre scale over entire continents, and with a revisit time that may not be adequate for all applications, for example agriculture. Thermal infrared observations from a combination of geostationary satellites offer a global view of the continents every hour (or even at higher frequency) at a resolution of a few kilometres, which makes them attractive as another, and potentially complementary, source of information on surface soil moisture. In this study, the Copernicus LST and the LSA-SAF LST are used to derive soil moisture over entire continents (Europe, Africa, Australia). The derived soil moisture is validated against in-situ observations and compared to other available products from remote sensing (SMOS, ASCAT) and from numerical weather prediction (ECMWF). We will present the results of this validation and show how it could be used in continental-scale evapotranspiration monitoring.

  14. Satellite observations and modeling of oil spill trajectories in the Bohai Sea.

    PubMed

    Xu, Qing; Li, Xiaofeng; Wei, Yongliang; Tang, Zeyan; Cheng, Yongcun; Pichel, William G

    2013-06-15

    On June 4 and 17, 2011, separate oil spill accidents occurred at two oil platforms in the Bohai Sea, China. The oil spills were subsequently observed on different types of satellite images, including SAR (Synthetic Aperture Radar), Chinese HJ-1-B CCD and NASA MODIS. To illustrate the fate of the oil spills, we performed two numerical simulations of the oil spill trajectories with the GNOME (General NOAA Operational Modeling Environment) model. For the first time, we drive GNOME with currents obtained from an operational ocean model (NCOM, Navy Coastal Ocean Model) and surface winds from operational scatterometer measurements (ASCAT, the Advanced Scatterometer). Both data sets are freely and openly available. The initial oil spill locations input to the model are based on the oil spill locations detected in the SAR images acquired on June 11 and 14. Three oil slicks are tracked simultaneously, and our results show good agreement between model simulations and subsequent satellite observations in this semi-enclosed shallow sea. Moreover, the GNOME simulations show that the number of 'splots', which controls the modelled extent of spilled oil, is a vital factor for GNOME running stability when the number is less than 500. Therefore, oil spill area information obtained from satellite sensors, especially SAR, is an important input for setting up the initial model conditions.

  15. Exploring the single-cell RNA-seq analysis landscape with the scRNA-tools database.

    PubMed

    Zappia, Luke; Phipson, Belinda; Oshlack, Alicia

    2018-06-25

    As single-cell RNA-sequencing (scRNA-seq) datasets have become more widespread, the number of tools designed to analyse these data has dramatically increased. Navigating the vast sea of tools now available is becoming increasingly challenging for researchers. In order to better facilitate the selection of appropriate analysis tools, we have created the scRNA-tools database (www.scRNA-tools.org) to catalogue and curate analysis tools as they become available. Our database collects a range of information on each scRNA-seq analysis tool and categorises them according to the analysis tasks they perform. Exploration of this database gives insights into the areas of rapid development of analysis methods for scRNA-seq data. We see that many tools perform tasks specific to scRNA-seq analysis, particularly clustering and ordering of cells. We also find that the scRNA-seq community embraces an open-source and open-science approach, with most tools available under open-source licenses and preprints being extensively used as a means to describe methods. The scRNA-tools database provides a valuable resource for researchers embarking on scRNA-seq analysis and records the growth of the field over time.

  16. Channel CAT: A Tactical Link Analysis Tool

    DTIC Science & Technology

    1997-09-01

    Naval Postgraduate School, Monterey, California. Master's thesis by Michael Glenn Coleman, September 1997. The thesis presents the Channel Capacity Analysis Tool (Channel CAT), designed to provide an automated tool for the analysis of design decisions in developing client...

  17. Modification of Alumina and Spinel Inclusions by Calcium in Liquid Steel

    NASA Astrophysics Data System (ADS)

    Verma, Neerav

    2011-12-01

    Steel cleanliness plays a crucial role in determining steel properties such as toughness, ductility, formability, corrosion resistance and surface quality. The production of clean steel often involves the elimination, or the chemical and morphological modification, of oxide and sulfide inclusions. Along with deteriorating steel properties, solid inclusions can affect steel castability through nozzle clogging, which occurs when solid inclusions accumulate in the caster pouring system, such as the ladle shroud or submerged entry nozzle (SEN). Thus, it is important to understand how to achieve the desired inclusion characteristics (shape, size and chemistry) through the steelmaking process. Among the various practices adopted in industry to counteract the effects of solid inclusions, modification of solid inclusions to a liquid or partially liquid state through calcium treatment is one method. Calcium can be used because it has a strong ability to form oxides and sulfides. In Al-killed steels, the most common inclusions are alumina (Al2O3) inclusions, which are solid at steelmaking temperatures. On calcium treatment, solid alumina inclusions are converted to calcium aluminates, which have liquidus temperatures lower than the steelmaking temperature (1600°C) [14]. It has been found that alumina inclusions may contain some MgO; such inclusions are termed alumina-magnesia spinels (Al2O3.xMgO) [18]. These spinels are more stable than alumina, and it has been suggested that they might be more difficult to modify [18]. However, some authors have proposed that MgO can actually help in the liquefaction of inclusions, and have demonstrated successful modification of spinels by Ca treatment [20, 21]. In the present research, the mechanism of transformation of alumina and spinel inclusions upon calcium treatment was studied by characterizing the transient evolution of inclusions. A vacuum induction furnace was used for melting, making additions (Al, Al-Mg and CaSi2) and sampling. The samples were characterized for inclusion shape, size and chemistry through scanning electron microscopy (SEM). Automated inclusion analysis tools (such as ASCAT [59, 91, 92] and INCA-GSR [126]; please refer to Section 6.4, page 68) were employed to generate statistical information on the inclusions. The thermodynamic database software FACTSAGE [62] was used to determine the thermochemistry of reactions and ternary phase diagrams (Ca-Al-S and Ca-Al-Mg systems). The compositions of the inclusions were tracked before and after calcium treatment to determine the effectiveness of calcium treatment. Extraction of inclusions through dissolution of iron in a bromine-methanol solution was employed to reveal the 3-D geometry of inclusions and to analyze them through EDS (energy-dispersive X-ray spectroscopy) without any matrix effects. Various industrial samples were also analyzed to confirm the feasibility of the reaction mechanisms deduced through experiments. Successful modification of alumina and spinel inclusions by calcium was demonstrated [85, 86]. It was observed that these modification mechanisms proceed through transient phase (CaO, CaS) formation. In the case of spinels, preferential reduction of the MgO component was also observed during calcium modification. The magnesium released by MgO reduction can re-enter the melt or leave it in vapor form.
The inclusion area fraction decreased after calcium treatment, but the inclusion concentration (number of inclusions per cm2) increased, because inclusions shifted to a smaller size distribution after calcium treatment. Severe matrix effects were observed during EDS analysis of inclusions, which can significantly affect inclusion composition analyses. *Please refer to dissertation for footnotes.
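
    As a worked illustration of the statistics such automated inclusion analysis tools report, the following minimal Python sketch computes number density, area fraction and mean size from per-inclusion measurements. The diameters and scanned area are hypothetical values, not data from the dissertation.

        # Minimal sketch: summarizing automated SEM inclusion analysis output.
        # Assumes a list of per-inclusion equivalent diameters (micrometres)
        # measured over a known scanned area; all values are illustrative.
        import numpy as np

        diameters_um = np.array([1.2, 0.8, 2.5, 1.7, 0.9, 3.1, 1.1])  # hypothetical
        scanned_area_cm2 = 0.05                                        # hypothetical

        areas_um2 = np.pi * (diameters_um / 2.0) ** 2
        number_density = diameters_um.size / scanned_area_cm2      # inclusions per cm^2
        area_fraction = areas_um2.sum() / (scanned_area_cm2 * 1e8) # 1 cm^2 = 1e8 um^2

        # A shift to a smaller size distribution after Ca treatment would show
        # up as a lower mean diameter together with a higher number density.
        print(f"number density: {number_density:.0f} /cm^2")
        print(f"area fraction:  {area_fraction:.2e}")
        print(f"mean diameter:  {diameters_um.mean():.2f} um")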

  18. Debugging and Performance Analysis Software Tools for Peregrine System |

    Science.gov Websites

    High-Performance Computing | NREL. Learn about debugging and performance analysis software tools available to use with the Peregrine system. Allinea...

  19. Two decades [1992-2012] of surface wind analyses based on satellite scatterometer observations

    NASA Astrophysics Data System (ADS)

    Desbiolles, Fabien; Bentamy, Abderrahim; Blanke, Bruno; Roy, Claude; Mestas-Nuñez, Alberto M.; Grodsky, Semyon A.; Herbette, Steven; Cambon, Gildas; Maes, Christophe

    2017-04-01

    Surface winds (equivalent neutral wind velocities at 10 m) from scatterometer missions since 1992 have been used to build a 20-year climate series. Optimal interpolation and kriging methods have been applied to continuously provide surface wind speed and direction estimates over the global ocean on a regular grid in space and time. The use of other data sources, such as radiometer data (SSM/I) and atmospheric wind reanalyses (ERA-Interim), has allowed the construction of a blended product available at 1/4° spatial resolution every 6 h from 1992 to 2012. Sampling issues throughout the different missions (ERS-1, ERS-2, QuikSCAT, and ASCAT) and their possible impact on the homogeneity of the gridded product are discussed. In addition, we carefully assess the quality of the blended product in the absence of scatterometer data (1992 to 1999). Data selection experiments show that the description of the surface wind is significantly improved by including the scatterometer winds. The blended winds compare well with buoy winds (1992-2012) and resolve finer spatial scales than atmospheric reanalyses, which makes them suitable for studying air-sea interactions at the mesoscale. The seasonal cycle and interannual variability of the product compare well with other long-term wind analyses. The product is used to calculate 20-year trends in wind speed, as well as in the zonal and meridional wind components. These trends show an important asymmetry between the southern and northern hemispheres, which may be an important issue for climate studies.
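
    The 20-year trends mentioned above are, in essence, per-grid-cell linear fits. A minimal sketch of such a trend estimate, on synthetic annual-mean wind speeds (all values illustrative):

        # Minimal sketch: least-squares linear trend of a wind-speed series,
        # of the kind computed per grid cell for trend maps. Data are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        t_years = np.arange(1992, 2013) + 0.5            # annual means, 1992-2012
        wind = 7.0 + 0.01 * (t_years - 1992) + rng.normal(0, 0.2, t_years.size)

        slope, intercept = np.polyfit(t_years, wind, 1)  # slope in m/s per year
        print(f"trend: {slope * 10:.3f} m/s per decade")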

  20. Assimilation of Global Radar Backscatter and Radiometer Brightness Temperature Observations to Improve Soil Moisture and Land Evaporation Estimates

    NASA Technical Reports Server (NTRS)

    Lievens, H.; Martens, B.; Verhoest, N. E. C.; Hahn, S.; Reichle, R. H.; Miralles, D. G.

    2017-01-01

    Active radar backscatter (σ°) observations from the Advanced Scatterometer (ASCAT) and passive radiometer brightness temperature (TB) observations from the Soil Moisture Ocean Salinity (SMOS) mission are assimilated either individually or jointly into the Global Land Evaporation Amsterdam Model (GLEAM) to improve its simulations of soil moisture and land evaporation. To enable σ° and TB assimilation, GLEAM is coupled to the Water Cloud Model and the L-band Microwave Emission from the Biosphere (L-MEB) model. The innovations, i.e. differences between observations and simulations, are mapped onto the model soil moisture states through an Ensemble Kalman Filter. The validation of surface (0-10 cm) soil moisture simulations over the period 2010-2014 against in situ measurements from the International Soil Moisture Network (ISMN) shows that assimilating σ° or TB alone improves the average correlation of seasonal anomalies (Ran) from 0.514 to 0.547 and 0.548, respectively. The joint assimilation further improves Ran to 0.559. Associated enhancements in daily evaporative flux simulations by GLEAM are validated based on measurements from 22 FLUXNET stations. Again, the singular assimilation improves Ran from 0.502 to 0.536 and 0.533, respectively for σ° and TB, whereas the best performance is observed for the joint assimilation (Ran = 0.546). These results demonstrate the complementary value of assimilating radar backscatter observations together with brightness temperatures for improving estimates of hydrological variables, as their joint assimilation outperforms the assimilation of each observation type separately.
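
    For readers unfamiliar with how innovations are mapped onto the state, the following is a minimal single-state Ensemble Kalman Filter analysis step in Python. The linear observation operator, ensemble size and error values are hypothetical stand-ins, not the GLEAM/L-MEB configuration.

        # Minimal sketch of one Ensemble Kalman Filter analysis step for a
        # scalar soil-moisture state and a single observation.
        import numpy as np

        rng = np.random.default_rng(1)
        ens = rng.normal(0.25, 0.03, size=50)  # prior soil-moisture ensemble (m3/m3)

        def h(x):
            # Hypothetical linearized observation operator (a Water Cloud
            # Model would map soil moisture to backscatter here).
            return 2.0 * x

        obs, obs_err = 0.55, 0.02              # observation and its error std
        y_ens = h(ens)

        # Kalman gain from ensemble statistics: K = cov(x, Hx) / (var(Hx) + R)
        k = np.cov(ens, y_ens)[0, 1] / (np.var(y_ens, ddof=1) + obs_err**2)

        # Perturbed-observation update: each member assimilates a perturbed obs.
        obs_pert = obs + rng.normal(0, obs_err, ens.size)
        analysis = ens + k * (obs_pert - y_ens)
        print(f"prior mean {ens.mean():.3f} -> analysis mean {analysis.mean():.3f}")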

  1. The effect of monsoon variability on fish landing in the Sadeng Fishing Port of Yogyakarta, Indonesia

    NASA Astrophysics Data System (ADS)

    Subarna, D.

    2018-03-01

    The volume of fish landed at the Sadeng Fishing Port showed an increase from year to year in certain months, especially June, July and August (JJA), while in other months fish production was low. The purpose of this research was to understand the influence of monsoon variability on fish landing in the Sadeng Fishing Port. Data were analyzed descriptively as spatial and temporal catch. The data comprised catch production records collected from the fishing port, together with satellite observations and HYCOM model output selected for the 2011-2012 period. Wind data, sea surface temperature (SST) and chlorophyll-a were analyzed from the ASCAT and MODIS sensors during the Southeast Monsoon. The results showed that southeasterly winds exert wind stress on the sea surface and cause Ekman transport to move water mass away from the shore. The water mass lost from the ocean surface is replaced by cold water from deeper layers, which is rich in nutrients. The distribution of chlorophyll-a during the Southeast Monsoon was relatively higher along the southern coast of Java than during the Northwest Monsoon. The SST was approximately 25.3 °C. The abundance of nutrients, indicated by the distribution of chlorophyll-a around the coast during the Southeast Monsoon, enhances the arrival of larger fish. Thus, it can be understood that catch production during June, July, and August is higher than in the other months.
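
    The Ekman transport mechanism invoked above can be made concrete with a small worked example: the mass transport per unit length of coastline is M = τ/f, with Coriolis parameter f = 2Ω sin(latitude). The wind stress value below is hypothetical; the latitude is roughly that of the southern Java coast.

        # Worked sketch: Ekman mass transport M = tau / f, f = 2*Omega*sin(lat).
        # The wind stress is a hypothetical illustrative value.
        import math

        omega = 7.2921e-5      # Earth's rotation rate (rad/s)
        lat_deg = -8.2         # roughly the latitude of Sadeng, southern Java
        tau = 0.1              # hypothetical along-shore wind stress (N/m^2)
        rho = 1025.0           # seawater density (kg/m^3)

        f = 2 * omega * math.sin(math.radians(lat_deg))  # negative in the SH
        mass_transport = tau / f                 # kg m^-1 s^-1 per metre of coast
        volume_transport = mass_transport / rho  # m^2/s

        # In the Southern Hemisphere the transport is directed 90 degrees to
        # the left of the wind, hence offshore for along-shore southeasterlies.
        print(f"f = {f:.2e} 1/s, |Ekman transport| = {abs(volume_transport):.1f} m^2/s")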

  2. Oscillation Baselining and Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
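
    The mode characteristics OBAT reports (frequency, damping) can be illustrated with a minimal sketch on a synthetic ringdown signal. This is a generic FFT-peak plus log-decrement estimate, not the OBAT implementation.

        # Minimal sketch: estimating the frequency and damping ratio of a
        # dominant oscillation from a synthetic ringdown at a PMU-like rate.
        import numpy as np

        fs = 30.0                             # sampling rate (Hz)
        t = np.arange(0, 20, 1 / fs)
        f0, zeta = 0.7, 0.05                  # true mode: 0.7 Hz, 5% damping
        wn = 2 * np.pi * f0
        x = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)

        # Frequency from the FFT peak (skipping the DC bin).
        spec = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(x.size, 1 / fs)
        f_est = freqs[np.argmax(spec[1:]) + 1]

        # Damping from the log-decrement of successive positive peaks.
        peaks = [i for i in range(1, x.size - 1)
                 if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
        delta = np.mean(np.log(x[peaks[:-1]] / x[peaks[1:]]))
        zeta_est = delta / np.sqrt(4 * np.pi**2 + delta**2)

        print(f"frequency ~ {f_est:.2f} Hz, damping ratio ~ {zeta_est:.3f}")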

  3. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling and Finite Element Analysis (FEA) capabilities. This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  4. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov Websites

    Economic and Financial Analysis Tools | Energy Analysis | NREL. Use these economic and financial analysis tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the...

  5. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  6. OEXP Analysis Tools Workshop

    NASA Technical Reports Server (NTRS)

    Garrett, L. Bernard; Wright, Robert L.; Badi, Deborah; Findlay, John T.

    1988-01-01

    This publication summarizes the software needs and available analysis tools presented at the OEXP Analysis Tools Workshop held at the NASA Langley Research Center, Hampton, Virginia on June 21 to 22, 1988. The objective of the workshop was to identify available spacecraft system (and subsystem) analysis and engineering design tools, and mission planning and analysis software that could be used for various NASA Office of Exploration (code Z) studies, specifically lunar and Mars missions.

  7. Upper-soil moisture inter-comparison from SMOS's products and land surface models over the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Polcher, Jan; Barella-Ortiz, Anaïs; Aires, Filipe; Balsamo, Gianpaolo; Gelati, Emiliano; Rodríguez-Fernández, Nemesio

    2015-04-01

    Soil moisture is a key state variable of the hydrological cycle. It conditions runoff, infiltration and evaporation over continental surfaces, and is key for forecasting droughts and floods. It thus plays an important role in surface-atmosphere interactions. Surface Soil Moisture (SSM) can be measured in situ, observed by satellite, or modelled using land surface models. As a complementary tool, data assimilation can be used to combine modelling and satellite observations. The work presented here is an inter-comparison of retrieved and modelled SSM data, for the 2010-2012 period, over the Iberian Peninsula. The region has been chosen because its vegetation cover is not very dense and it includes strong contrasts in rainfall regimes, and thus a diversity of behaviours for SSM. Furthermore, this semi-arid region depends strongly on good management of its water resources. The satellite observations correspond to Soil Moisture and Ocean Salinity (SMOS) retrievals: the L2 product from an optimal interpolation retrieval, and three other products using neural network retrievals with different input information: SMOS time indexes, purely SMOS data, or the addition of European Advanced Scatterometer (ASCAT) backscattering and Moderate Resolution Imaging Spectroradiometer (MODIS) surface temperature information. The modelled soil moistures have been taken from the ORCHIDEE (ORganising Carbon and Hydrology In Dynamic EcosystEms) and HTESSEL (Hydrology-Tiled ECMWF Scheme for Surface Exchanges over Land) land surface models. Both models are forced with the same atmospheric conditions (as part of the Earth2Observe FP7 project) over the period, but they represent surface soil moisture with very different degrees of complexity: ORCHIDEE has 5 levels in the top 5 centimetres of soil, while in HTESSEL this variable is part of the top soil moisture level. The two types of SMOS retrievals are compared to the model outputs in their spatial and temporal characteristics. The comparison with the models helps to identify which retrieval configuration is most consistent with our understanding of surface soil moisture in this region. In particular, we have determined how each of the soil moisture products is related to the spatio-temporal variations of rainfall. In large parts of the Iberian Peninsula the co-variance of remotely sensed SSM and rainfall is consistent with that of the models, but for some regions questions are raised. The variability of SSM observed by SMOS in the north-west of the Iberian Peninsula is similar to that of rainfall; at least, this relation between SSM and rainfall is closer than that suggested by the two models.

  8. Analysis and design of friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Jagadeesha, C. B.

    2016-12-01

    Since its inception, no one has done analysis and design of the FSW tool; initial dimensions of the FSW tool are decided by educated guess. Optimum stresses on the tool pin have been determined at optimized parameters for bead-on-plate welding on AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; the life of the FSW tool was determined to be 2.66×10^5 cycles, or revolutions. One can thus conclude that an arbitrarily decided FSW tool generally has a finite life and cannot be used for infinite life. In general, one can determine in advance, by this analysis, the suitability of a tool and its material for FSW of the given workpiece materials, in terms of the fatigue life of the tool.
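
    A stress-life (Basquin) relation is the standard way such a finite fatigue life is estimated. The sketch below inverts S = σ'_f (2N)^b for the cycles to failure; the material coefficients and stress amplitude are hypothetical, not the paper's values.

        # Minimal sketch: stress-life (Basquin) fatigue estimate,
        # S = sigma_f * (2N)**b, of the kind behind a finite-life conclusion.
        sigma_f = 1200.0    # fatigue strength coefficient (MPa), hypothetical
        b = -0.09           # fatigue strength exponent, hypothetical
        stress_amp = 390.0  # stress amplitude on the tool pin (MPa), hypothetical

        # Invert S = sigma_f * (2N)**b for cycles to failure N.
        n_cycles = 0.5 * (stress_amp / sigma_f) ** (1.0 / b)
        print(f"estimated life: {n_cycles:.2e} cycles")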

  9. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  10. Logistics Process Analysis ToolProcess Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  11. Spatial variability and trends of seasonal snowmelt processes over Antarctic sea ice observed by satellite scatterometers

    NASA Astrophysics Data System (ADS)

    Arndt, S.; Haas, C.

    2017-12-01

    Snow is one of the key drivers determining the seasonal energy and mass budgets of sea ice in the Southern Ocean. Here, we analyze radar backscatter time series from the European Remote Sensing satellites (ERS)-1 and -2 scatterometers, from the Quick Scatterometer (QSCAT), and from the Advanced Scatterometer (ASCAT) in order to observe the regional and inter-annual variability of Antarctic snowmelt processes from 1992 to 2014. On perennial ice, seasonal backscatter changes show two different snowmelt stages: a weak backscatter rise indicating the initial warming and metamorphosis of the snowpack (pre-melt), followed by a rapid rise indicating the onset of internal snowmelt and thaw-freeze cycles (snowmelt). In contrast, similar seasonal backscatter cycles are absent on seasonal ice, preventing the periodic retrieval of spring/summer transitions. This may be due to the dominance of ice bottom melt over snowmelt, leading to flooding and ice disintegration before strong snowmelt sets in. The resulting snowmelt onset dates on perennial sea ice show the expected latitudinal gradient, from early melt onsets (mid-November) in the northern Weddell Sea towards late (end-December) or even absent snowmelt conditions further south. This result is likely related to seasonal variations in solar shortwave radiation (absorption). In addition, observations at different microwave frequencies allow the detection of changing snow properties at different depths. We show that the short wavelengths of passive microwave observations indicate earlier pre-melt and snowmelt onset dates than longer-wavelength scatterometer observations, in response to earlier warming of upper snow layers compared to lower snow layers. Similarly, pre-melt and snowmelt onset dates retrieved from Ku-band radars were earlier by an average of 11 and 23 days, respectively, than those retrieved from C-band. This time difference was used to correct melt onset dates retrieved from Ku-band and to compile a consistent time series from 1992 to 2014. The subsequent regression analysis showed no significant temporal trend in the retrieved snowmelt onset dates, but strong inter-annual variability. This absence of any notable change in snowmelt behavior is in line with the small observed temporal changes of the Antarctic sea ice cover and atmospheric warming.

  12. An evaluation of the potential of Sentinel 1 for improving flash flood predictions via soil moisture-data assimilation

    NASA Astrophysics Data System (ADS)

    Cenci, Luca; Pulvirenti, Luca; Boni, Giorgio; Chini, Marco; Matgen, Patrick; Gabellani, Simone; Squicciarino, Giuseppe; Pierdicca, Nazzareno

    2017-11-01

    The assimilation of satellite-derived soil moisture estimates (soil moisture-data assimilation, SM-DA) into hydrological models has the potential to reduce the uncertainty of streamflow simulations. The improved capacity to monitor the closeness to saturation of small catchments, such as those characterizing the Mediterranean region, can be exploited to enhance flash flood predictions. Compared to other microwave sensors exploited for SM-DA in recent years (e.g. the Advanced SCATterometer, ASCAT), characterized by low spatial and high temporal resolution, the Sentinel 1 (S1) mission provides an excellent opportunity to systematically monitor soil moisture (SM) at high spatial resolution and moderate temporal resolution. The aim of this research was thus to evaluate the impact of S1-based SM-DA on the flash flood predictions of a hydrological model (Continuum) that is currently exploited for civil protection applications in Italy. The analysis was carried out in a representative Mediterranean catchment prone to flash floods, located in north-western Italy, during the period October 2014-February 2015. It provided some important findings: (i) revealing the potential of S1-based SM-DA for improving discharge predictions, especially for higher flows; (ii) suggesting a more appropriate pre-processing technique to be applied to S1 data before assimilation; and (iii) highlighting that even though high spatial resolution provides an important contribution in a SM-DA system, temporal resolution has the most crucial role. S1-derived SM maps are still a relatively new product and, to our knowledge, this is the first work published in an international journal dealing with their assimilation into a hydrological model to improve continuous streamflow simulations and flash flood predictions. Even though the reported results were obtained by analysing a relatively short time period, and thus should be supported by further research activities, we believe this research is timely in order to enhance our understanding of the potential contribution of S1 data within the SM-DA framework for flash flood risk mitigation.

  13. Stochastic error model corrections to improve the performance of bottom-up precipitation products for hydrologic applications

    NASA Astrophysics Data System (ADS)

    Maggioni, V.; Massari, C.; Ciabatta, L.; Brocca, L.

    2016-12-01

    Accurate quantitative precipitation estimation is of great importance for water resources management, agricultural planning, and the forecasting and monitoring of natural hazards such as flash floods and landslides. In situ observations are limited around the Earth, especially in remote areas (e.g., complex terrain, dense vegetation), and currently available satellite precipitation products provide global precipitation estimates with an accuracy that depends upon many factors (e.g., type of storms, temporal sampling, season, etc.). The recent SM2RAIN approach proposes to estimate rainfall by using satellite soil moisture observations. As opposed to traditional satellite precipitation methods, which sense cloud properties to retrieve instantaneous estimates, this bottom-up approach uses two consecutive soil moisture measurements to obtain an estimate of the precipitation that fell within the interval between two satellite overpasses. As a result, the nature of the measurement is different from and complementary to that of classical precipitation products and could provide a valid alternative perspective with which to substitute for or improve current rainfall estimates. However, uncertainties in the SM2RAIN product are still not well known and could represent a limitation on utilizing this dataset for hydrological applications. Therefore, quantifying the uncertainty associated with SM2RAIN is necessary for enabling its use. The study is conducted over the Italian territory for a 5-yr period (2010-2014). A number of satellite precipitation error properties, typically used in error modeling, are investigated, including probability of detection, false alarm rates, missed events, spatial correlation of the error, and hit biases. After this preliminary uncertainty analysis, the potential of applying the stochastic rainfall error model SREM2D to correct SM2RAIN and to improve its performance in hydrologic applications is investigated. The use of SREM2D to characterize the error in precipitation by SM2RAIN would be highly useful for the merging and integration steps in its algorithm, i.e., the merging of multiple soil-moisture-derived products (e.g., SMAP, SMOS, ASCAT) and the integration of soil-moisture-derived and state-of-the-art satellite precipitation products (e.g., GPM IMERG).
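
    The bottom-up idea behind SM2RAIN, inverting the soil-water balance so that rainfall is inferred from rises in soil moisture between overpasses, can be sketched as follows. The layer capacity and drainage parameters are hypothetical and would normally be calibrated against gauge data.

        # Minimal sketch of the bottom-up SM2RAIN idea: invert the soil-water
        # balance so rainfall between two satellite overpasses follows from
        # the rise in relative soil moisture plus a loss term.
        import numpy as np

        z = 80.0          # water capacity of the soil layer (mm), hypothetical
        a, b = 15.0, 2.0  # drainage/loss parameters, hypothetical

        def sm2rain(sm, dt_days):
            """sm: relative soil moisture [0-1] per overpass; returns mm/interval."""
            ds = np.diff(sm) / dt_days
            loss = a * ((sm[:-1] + sm[1:]) / 2.0) ** b  # drainage/ET proxy
            p = z * ds + loss
            return np.clip(p * dt_days, 0.0, None)      # no negative rainfall

        sm_series = np.array([0.30, 0.32, 0.55, 0.50, 0.48])  # synthetic overpasses
        print(sm2rain(sm_series, dt_days=1.0))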

  14. Supporting Scientific Analysis within Collaborative Problem Solving Environments

    NASA Technical Reports Server (NTRS)

    Watson, Velvin R.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then, requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high-resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that embodies the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses. The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high-resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high-resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to its responsiveness in stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high-resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.

  15. Analysis Tools for CFD Multigrid Solvers

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Thomas, James L.; Diskin, Boris

    2004-01-01

    Analysis tools are needed to guide the development and evaluate the performance of multigrid solvers for the fluid flow equations. Classical analysis tools, such as local mode analysis, often fail to accurately predict performance. Two-grid analysis tools, herein referred to as Idealized Coarse Grid and Idealized Relaxation iterations, have been developed and evaluated within a pilot multigrid solver. These new tools are applicable to general systems of equations and/or discretizations and point to problem areas within an existing multigrid solver. Idealized Relaxation and Idealized Coarse Grid are applied in developing textbook-efficient multigrid solvers for incompressible stagnation flow problems.

  16. Bioinformatic tools for inferring functional information from plant microarray data: tools for the first steps.

    PubMed

    Page, Grier P; Coulibaly, Issa

    2008-01-01

    Microarrays are a very powerful tool for quantifying the amount of RNA in samples; however, their ability to query essentially every gene in a genome, which can number in the tens of thousands, presents analytical and interpretative problems. As a result, a variety of software and web-based tools have been developed to help with these issues. This article highlights and reviews some of the tools for the first steps in the analysis of a microarray study. We have tried for a balance between free and commercial systems. We have organized the tools by topics including image processing tools (Section 2), power analysis tools (Section 3), image analysis tools (Section 4), database tools (Section 5), databases of functional information (Section 6), annotation tools (Section 7), statistical and data mining tools (Section 8), and dissemination tools (Section 9).

  17. The dynamic analysis of drum roll lathe for machining of rollers

    NASA Astrophysics Data System (ADS)

    Qiao, Zheng; Wu, Dongxu; Wang, Bo; Li, Guo; Wang, Huiming; Ding, Fei

    2014-08-01

    An ultra-precision machine tool for machining of rollers has been designed and assembled. Because the dynamic characteristics of the machine tool have an obvious impact on the quality of microstructures machined on the roller surface, this paper analyzes the dynamic characteristics of the existing machine tool, as well as the influence on those characteristics of fixing a large-scale, slender roller in the machine. First, a finite element model of the machine tool is built and simplified; based on this model, finite element modal analysis is carried out to obtain the natural frequencies and mode shapes of the first four modes of the machine tool. According to the modal analysis results, the low-stiffness subsystems of the machine tool can be further improved and a reasonable bandwidth for the machine tool's control system can be designed. Finally, considering the shock imparted to the feeding system and cutting tool by frequent fast positioning of the Z axis, a transient analysis is conducted in ANSYS. Based on the results of the transient analysis, the vibration behavior of key components of the machine tool and its impact on the cutting process are explored.

  18. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  19. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report. Version 1.0

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; Kramer White, Julie; Labbe, Steve G.; Rotter, Hank A.

    2005-01-01

    In Spring of 2005, the NASA Engineering Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and other tools are empirically-derived. Each tool was created for a specific use and timeframe, including certification, real-time pre-launch assessments, and real-time on-orbit assessments. The tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC teams conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  20. Navigating freely-available software tools for metabolomics analysis.

    PubMed

    Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph

    2017-01-01

    The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. Our objective was to compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of the tools' abilities to perform specific data analysis tasks, e.g. peak picking.

  1. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments negatively affect both patients and nurses. The aim was to validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes, such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r = 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emerged factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as the original tools do. © 2016 John Wiley & Sons, Ltd.

  2. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  3. Multi-mission space vehicle subsystem analysis tools

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Wood, E.

    2003-01-01

    Spacecraft engineers often rely on specialized simulation tools to facilitate the analysis, design and operation of space systems. Unfortunately, these tools are often designed for one phase of a single mission and cannot be easily adapted to other phases or other missions. The Multi-Mission Space Vehicle Subsystem Analysis Tools are designed to provide a solution to this problem.

  4. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
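
    Among the quantification strategies such tools implement, the 2^-ΔΔCt method is the most common. A minimal sketch with illustrative Ct values:

        # Minimal sketch of the 2^-ddCt relative quantification method:
        # target gene normalized to a reference gene and a control sample.
        def ddct(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
            dct_treated = ct_target_treated - ct_ref_treated
            dct_control = ct_target_control - ct_ref_control
            return 2.0 ** -(dct_treated - dct_control)

        # Example: the target amplifies ~2 cycles earlier after treatment.
        fold_change = ddct(22.1, 18.0, 24.0, 18.1)
        print(f"fold change: {fold_change:.2f}")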

  5. A Comparison of Satellite Conjunction Analysis Screening Tools

    DTIC Science & Technology

    2011-09-01

    visualization tool. Version 13.1.4 for Linux was tested. The SOAP conjunction analysis function does not have the capacity to perform the large...was examined by SOAP to confirm the conjunction. STK Advanced CAT (Conjunction Analysis Tools) is an add-on module for the STK ...run with each tool. When attempting to perform the seven-day all-vs-all analysis with STK Advanced CAT, the program consistently crashed during report

  6. Evaluating Learning Technology Content with Discourse Analysis

    ERIC Educational Resources Information Center

    Duvall, Matthew

    2016-01-01

    The researcher combined qualitative media analysis with tools for discourse analysis to review Blackboard Collaborate™, a tool often used in online education. Technology design references Discourses which dictate how and why these tools should be used. The analysis showed Collaborate™ uses sign systems and knowledge, along with politics, to…

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  9. Survey of visualization and analysis tools

    NASA Technical Reports Server (NTRS)

    Meyer, P. J.

    1994-01-01

    A large number of commercially available visualization and analysis tools are available to the researcher. Some of the strengths and limitations of these tools, from the viewpoint of the earth sciences discipline, are discussed. Visualization and analysis tools fall into one of two categories: those that are designed for a specific purpose and are not extensible, and those that are generic visual programming tools that are extensible. Most of the extensible packages examined incorporate a data flow paradigm.

  10. Supporting cognition in systems biology analysis: findings on users' processes and design implications.

    PubMed

    Mirel, Barbara

    2009-02-13

    Current usability studies of bioinformatics tools suggest that tools for exploratory analysis support some tasks related to finding relationships of interest but not the deep causal insights necessary for formulating plausible and credible hypotheses. To better understand design requirements for gaining these causal insights in systems biology analyses, a longitudinal field study of 15 biomedical researchers was conducted. Researchers interacted with the same protein-protein interaction tools to discover possible disease mechanisms for further experimentation. Findings reveal patterns in scientists' exploratory and explanatory analysis and show that the tools positively supported a number of well-structured query and analysis tasks. But for several of the scientists' more complex, higher-order ways of knowing and reasoning, the tools did not offer adequate support. Results show that, for a better fit with scientists' cognition in exploratory analysis, systems biology tools need to better match scientists' processes for validating, for making the transition from classification to model-based reasoning, and for engaging in causal mental modelling. As the next great frontier in bioinformatics usability, tool designs for exploratory systems biology analysis need to move beyond the successes already achieved in supporting formulaic query and analysis tasks and reduce the current mismatches with several of scientists' higher-order analytical practices. The implications of the results for tool designs are discussed.

  11. Designing an Exploratory Text Analysis Tool for Humanities and Social Sciences Research

    ERIC Educational Resources Information Center

    Shrikumar, Aditi

    2013-01-01

    This dissertation presents a new tool for exploratory text analysis that attempts to improve the experience of navigating and exploring text and its metadata. The design of the tool was motivated by the unmet need for text analysis tools in the humanities and social sciences. In these fields, it is common for scholars to have hundreds or thousands…

  12. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
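
    The dummy regression analysis used in the study can be illustrated with a minimal sketch: a categorical stability class is encoded as 0/1 dummy variables and regressed, together with continuous weather variables, against impact distance. The data below are synthetic and the fit uses plain least squares rather than SPSS.

        # Minimal sketch of dummy-variable regression: encode a categorical
        # weather variable (atmospheric stability class) as 0/1 dummies and
        # regress impact distance on it with continuous covariates.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 200
        temp = rng.uniform(5, 35, n)        # air temperature (deg C)
        wind = rng.uniform(1, 10, n)        # wind speed (m/s)
        stability = rng.integers(0, 3, n)   # three stability classes, say

        # Hypothetical response: strongly driven by the stability class.
        dist = (500 + 8 * temp - 20 * wind
                + 400 * (stability == 2) + rng.normal(0, 30, n))

        # Design matrix: intercept, temp, wind, two dummies (class 0 baseline).
        X = np.column_stack([np.ones(n), temp, wind,
                             (stability == 1).astype(float),
                             (stability == 2).astype(float)])
        coef, *_ = np.linalg.lstsq(X, dist, rcond=None)
        print(dict(zip(["const", "temp", "wind", "stab1", "stab2"], coef.round(1))))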

  13. Use of satellite and modeled soil moisture data for predicting event soil loss at plot scale

    NASA Astrophysics Data System (ADS)

    Todisco, F.; Brocca, L.; Termite, L. F.; Wagner, W.

    2015-09-01

    The potential of coupling soil moisture and a Universal Soil Loss Equation-based (USLE-based) model for event soil loss estimation at plot scale is carefully investigated at the Masse area, in central Italy. The derived model, named Soil Moisture for Erosion (SM4E), is applied, given the unavailability of in situ soil moisture measurements, by using data predicted by a soil water balance model (SWBM) and derived from satellite sensors, i.e., the Advanced SCATterometer (ASCAT). The soil loss estimation accuracy is validated using in situ measurements in which event observations at plot scale are available for the period 2008-2013. The results showed that including soil moisture observations in the event rainfall-runoff erosivity factor of the USLE enhances the capability of the model to account for variations in event soil losses, soil moisture being an effective alternative to estimated runoff in the prediction of event soil loss at Masse. The agreement between observed and estimated soil losses (through SM4E) is fairly satisfactory, with a determination coefficient (log-scale) equal to ~ 0.35 and a root mean square error (RMSE) of ~ 2.8 Mg ha-1. These results are particularly significant for the operational estimation of soil losses. Indeed, soil moisture is currently a relatively simple measurement at the field scale, and remote sensing data are also widely available on a global scale. Through satellite data, there is the potential of applying the SM4E model for large-scale monitoring and quantification of the soil erosion process.
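
    The two validation metrics quoted (log-scale determination coefficient and RMSE) are straightforward to compute; a minimal sketch on synthetic observed/estimated event soil losses:

        # Minimal sketch of the validation metrics above: RMSE in original
        # units and a determination coefficient on log-transformed losses.
        import numpy as np

        observed = np.array([0.4, 1.1, 3.0, 7.5, 0.9, 2.2])   # Mg/ha, synthetic
        estimated = np.array([0.6, 0.9, 2.4, 9.0, 1.3, 1.8])  # Mg/ha, synthetic

        rmse = np.sqrt(np.mean((estimated - observed) ** 2))

        lo, le = np.log10(observed), np.log10(estimated)
        r2_log = np.corrcoef(lo, le)[0, 1] ** 2

        print(f"RMSE = {rmse:.2f} Mg/ha, R^2 (log-scale) = {r2_log:.2f}")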

  14. Use of satellite and modelled soil moisture data for predicting event soil loss at plot scale

    NASA Astrophysics Data System (ADS)

    Todisco, F.; Brocca, L.; Termite, L. F.; Wagner, W.

    2015-03-01

    The potential of coupling soil moisture and a USLE-based model for event soil loss estimation at plot scale is carefully investigated at the Masse area, in Central Italy. The derived model, named Soil Moisture for Erosion (SM4E), is applied by considering the unavailability of in situ soil moisture measurements, by using the data predicted by a soil water balance model (SWBM) and derived from satellite sensors, i.e. the Advanced SCATterometer (ASCAT). The soil loss estimation accuracy is validated using in situ measurements in which event observations at plot scale are available for the period 2008-2013. The results showed that including soil moisture observations in the event rainfall-runoff erosivity factor of the RUSLE/USLE enhances the capability of the model to account for variations in event soil losses, the soil moisture being an effective alternative to the estimated runoff in the prediction of the event soil loss at Masse. The agreement between observed and estimated soil losses (through SM4E) is fairly satisfactory, with a determination coefficient (log-scale) equal to ~ 0.35 and a root-mean-square error (RMSE) of ~ 2.8 Mg ha-1. These results are particularly significant for the operational estimation of soil losses. Indeed, currently, soil moisture is a relatively simple measurement at the field scale and remote sensing data are also widely available on a global scale. Through satellite data, there is the potential of applying the SM4E model for large-scale monitoring and quantification of the soil erosion process.

  15. Understanding the Dynamics of the South Indian Ocean Sea Surface Salinity Maximum Pool From Argo, Rama, Aquarius, SMOS & Other Satellites

    NASA Astrophysics Data System (ADS)

    Menezes, V. V.; Phillips, H. E.

    2016-02-01

    Subtropical salinity maximum regions are particularly important because the salty subtropical underwater (STW) is formed by subduction of surface waters in these areas. In all oceans, the STW is transported equatorward from the formation region and is tightly related to the Subtropical-Tropical Cell. In the South Indian Ocean (SIO), the salinity maximum pool lies further poleward (25S-38S) and eastward (60E-120E). It significantly impacts the circulation of the eastern basin because the STW forms a strong haline front with the fresh Indonesian Throughflow waters. This haline front overwhelms the temperature contribution, establishing the eastward Eastern Gyral Current, an important upstream source for the Leeuwin Current. In the present work, we analyze the variability of the SSS maximum pool using the Aquarius and SMOS satellites, an Argo gridded product and the RAMA mooring located at 25S, 100E. OAFLUX, 3B42 TRMM, ASCAT/QuikSCAT winds and OSCAR products complement this study. The salinity maximum pool has a strong seasonal cycle of contraction (minimum in October) and expansion (maximum in April), and most of this variation occurs on the poleward side of the pool. Advection and entrainment control the contraction, while expansion is due to atmospheric forcing (E-P). From 2004 to 2014, a clear reduction in the pool area is identified, which might be related to decadal variability; in this case, the variation is on the equatorward side of the pool. The processes controlling this long-term variability are being investigated.

  16. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    PubMed

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 ( www.comparativego.com ). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
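
    The core statistic behind most GO over-representation analyses is a one-sided Fisher's exact test on a 2x2 contingency table. A minimal sketch with illustrative gene counts (this is the generic test, not necessarily the Comparative GO implementation):

        # Minimal sketch: GO term over-representation via a one-sided
        # Fisher's exact test. Counts are illustrative: 40 of 200 study
        # genes vs 300 of 15000 background genes carry the annotation.
        from scipy.stats import fisher_exact

        study_with, study_without = 40, 160
        background_with, background_without = 300, 14700

        odds, p = fisher_exact([[study_with, study_without],
                                [background_with, background_without]],
                               alternative="greater")
        print(f"odds ratio {odds:.1f}, p = {p:.2e}")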

  17. The environment power system analysis tool development program

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and the results of analyses.

  18. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
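
    The DerSimonian-Laird estimator and Knapp-Hartung adjustment mentioned above are compact enough to sketch directly; the effect sizes and variances below are illustrative.

        # Minimal sketch of a random-effects meta-analysis with the
        # DerSimonian-Laird tau^2 estimator and Knapp-Hartung-adjusted CI.
        import numpy as np
        from scipy import stats

        y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # study effect sizes
        v = np.array([0.02, 0.03, 0.04, 0.02, 0.05])   # within-study variances
        k = y.size

        w = 1.0 / v
        mu_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - mu_fe) ** 2)
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)             # DerSimonian-Laird

        ws = 1.0 / (v + tau2)
        mu = np.sum(ws * y) / np.sum(ws)

        # Knapp-Hartung: t-based CI with an adjusted variance estimate.
        var_kh = np.sum(ws * (y - mu) ** 2) / ((k - 1) * np.sum(ws))
        half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
        print(f"overall effect {mu:.3f}, 95% CI [{mu - half:.3f}, {mu + half:.3f}]")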

  19. DIY Solar Market Analysis Webinar Series: Top Solar Tools | State, Local,

    Science.gov Websites

    and Tribal Governments | NREL. Wednesday, May 14, 2014. As part of a Do-It-Yourself Solar Market Analysis summer series, NREL's Solar Technical Assistance Team (STAT) presented a

  20. Transportation systems safety hazard analysis tool (SafetyHAT) user guide (version 1.0)

    DOT National Transportation Integrated Search

    2014-03-24

    This is a user guide for the transportation system Safety Hazard Analysis Tool (SafetyHAT) Version 1.0. SafetyHAT is a software tool that facilitates System Theoretic Process Analysis (STPA). This user guide provides instructions on how to download, ...

  1. Diamond tool wear detection method using cutting force and its power spectrum analysis in ultra-precision fly cutting

    NASA Astrophysics Data System (ADS)

    Zhang, G. Q.; To, S.

    2014-08-01

    Cutting force and its power spectrum analysis has long been regarded as an effective method for monitoring tool wear in many cutting processes, and a significant body of research has been conducted in this area. However, relatively little comparable research exists for ultra-precision fly cutting. In this paper, a group of experiments was carried out to investigate the cutting forces and their power spectrum characteristics under different tool wear stages. The results reveal that the cutting force increases as tool wear progresses. The cutting force signals under different tool wear stages were analyzed using power spectrum analysis, which indicates that a characteristic frequency does exist in the power spectrum of the cutting force. Its power spectral density increases with increasing tool wear, so this characteristic frequency could be adopted to monitor diamond tool wear in ultra-precision fly cutting.
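    A minimal sketch of the kind of power spectrum analysis described, using a synthetic force signal rather than real fly-cutting data; the 620 Hz wear-related component and the sampling rate are invented for illustration:

```python
import numpy as np
from scipy.signal import welch

fs = 20_000                      # hypothetical sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# synthetic cutting-force signal: a wear-related component at 620 Hz plus noise
force = 0.8 * np.sin(2 * np.pi * 620 * t) + 0.3 * np.random.randn(t.size)

f, psd = welch(force, fs=fs, nperseg=2048)   # Welch averaged periodogram
peak = f[np.argmax(psd)]                     # candidate characteristic frequency
print(f"dominant frequency: {peak:.0f} Hz, peak PSD {psd.max():.3e}")
```

    Tracking the PSD at the characteristic frequency across successive cuts would then give a scalar wear indicator, in the spirit of the paper's method.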

  2. Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys

    DTIC Science & Technology

    2011-01-01

    The FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material, while the workpiece material (AA5059) is modeled with a strain-hardening term augmented to take into account the effect of dynamic recrystallization. Within the analysis, the effects of some of the FSW key process parameters are examined: (a) ... threads/m; (b) tool material = AISI H13 tool steel; (c) workpiece material = AA5059; (d) tool rotation speed = 500 rpm; (e) tool travel speed ...

  3. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  4. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  5. Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho

    Treesearch

    Andrew J. McMahan; Eric L. Smith

    2006-01-01

    Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing “tools”--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...

  6. Evaluation and Validation of Operational RapidScat Ocean Surface Vector Winds

    NASA Astrophysics Data System (ADS)

    Chang, Paul; Jelenak, Zorana; Soisuvarn, Seubson; Said, Faozi; Sienkiewicz, Joseph; Brennan, Michael

    2015-04-01

    NASA launched RapidScat to the International Space Station (ISS) on September 21, 2014 on a two-year mission to support global monitoring of ocean winds for improved weather forecasting and climate studies. The JPL-developed space-based scatterometer is conically scanning and operates at Ku-band (13.4 GHz), similar to QuikSCAT. The ISS-RapidScat measurement swath is approximately 900 kilometers and covers the majority of the ocean between 51.6 degrees north and south latitude (approximately from north of Vancouver, Canada, to the southern tip of Patagonia) in 48 hours. RapidScat data are currently being posted at a spacing of 25 kilometers, but a version to be released in the near future will improve the posting to 12.5 kilometers. RapidScat ocean surface wind vector data are being provided in near real-time to NOAA and other operational users such as the U.S. Navy, the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), the Indian Space Research Organisation (ISRO), and the Royal Netherlands Meteorological Institute (KNMI). The quality of the RapidScat OSVW data is assessed by collocating the data in space and time with "truth" data. Typically the "truth" data include, but are not limited to, the NWS global forecast model analysis (GDAS) fields, buoys, ASCAT, WindSat, AMSR-2, and aircraft measurements during hurricane and winter storm experiment flights. The standard statistical analysis used for satellite microwave wind sensors is utilized to characterize the RapidScat wind vector retrievals. The global numerical weather prediction (NWP) models are a convenient source of "truth" data because they are available 4 times per day globally, which results in the accumulation of a large number of collocations over a relatively short amount of time. The NWP model fields are not "truth" in the same way an actual observation would be; however, as long as there are no systematic errors in the NWP model output, the collocations will converge in the mean for winds between approximately 3 and 20 m/s. The NWP models typically do not properly resolve the very low and high wind speeds, in part due to limitations of the spatial scales they can account for. Buoy measurements, aircraft-based measurements, and other satellite retrievals can be more directly compared on a point-by-point basis. The RapidScat OSVW validation results will be presented and discussed. Utilization examples of these data in support of NOAA's marine weather forecasting and warning mission will also be presented and discussed.
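    The collocation statistics described (bias, RMSE, and correlation over roughly the 3-20 m/s range where NWP comparisons converge) reduce to a few lines once the satellite and reference winds have been matched in space and time. A minimal sketch with hypothetical matched wind speeds:

```python
import numpy as np

def collocation_stats(sat_speed, ref_speed, lo=3.0, hi=20.0):
    """Bias, RMSE, and correlation of satellite vs reference wind speeds,
    restricted to a reference-speed range (default 3-20 m/s)."""
    sat = np.asarray(sat_speed, float)
    ref = np.asarray(ref_speed, float)
    keep = (ref >= lo) & (ref <= hi)          # restrict to the trusted range
    d = sat[keep] - ref[keep]
    return {
        "n": int(keep.sum()),
        "bias": d.mean(),
        "rmse": np.sqrt((d ** 2).mean()),
        "corr": np.corrcoef(sat[keep], ref[keep])[0, 1],
    }

# hypothetical collocated wind speeds (m/s), already matched in space and time
sat = np.array([5.2, 7.9, 12.1, 15.3, 9.8])
buoy = np.array([5.0, 8.3, 11.7, 15.0, 10.2])
print(collocation_stats(sat, buoy))
```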

  7. Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools

    NASA Technical Reports Server (NTRS)

    Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.

    2006-01-01

    A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools to make a significant improvement in three major facets of low-thrust trajectory and mission analysis. These are: 1) ease of use, 2) ability to more robustly converge to solutions, and 3) higher fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferred over having one integrated tool. This tool suite, the tools' characteristics, and their applicability are described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.

  8. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  9. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason a graphical tool has been conceived for editing and inspecting configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules, and check the data flow. They can see the values at which parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.

  10. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    | NREL. Vehicle technology simulation and analysis tools evaluate vehicle technologies with the potential to achieve significant fuel savings and emission reductions. Automotive Deployment Options Projection Tool: The ADOPT modeling tool estimates vehicle technology

  11. BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.

    PubMed

    Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph

    2015-02-21

    Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for enrichment analysis of small molecules. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted, and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology and is additionally available as a standalone library.
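    Over-representation tests of this kind are commonly formulated as a hypergeometric test per ontology class (the abstract does not state which test BiNChE uses internally, so the choice here is illustrative, as are all of the counts):

```python
from scipy.stats import hypergeom

def enrichment_pvalue(N, K, n, k):
    """P(X >= k): probability of drawing at least k class members in a
    study set of n, given K class members among N annotated entities."""
    return hypergeom.sf(k - 1, N, K, n)

# hypothetical numbers: 5000 annotated molecules, 120 carrying a given ChEBI
# role, and a study set of 60 molecules of which 9 carry that role
print(f"p = {enrichment_pvalue(5000, 120, 60, 9):.3g}")
```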

  12. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  13. Cross-Cutting Interoperability in an Earth Science Collaboratory

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Ramachandran, Rahul; Kuo, Kuo-Sen

    2011-01-01

    An Earth Science Collaboratory is a rich data analysis environment with: (1) access to a wide spectrum of Earth Science data; (2) a diverse set of science analysis services and tools; (3) a means to collaborate on data, tools, and analysis; and (4) support for sharing of data, tools, results, and knowledge.

  14. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE PAGES

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    2016-08-03

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  15. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  16. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  17. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    NASA Astrophysics Data System (ADS)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  18. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments, which makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  19. Analysis Tools in Geant4 10.2 and 10.3

    NASA Astrophysics Data System (ADS)

    Hřivnáčová, I.; Barrand, G.

    2017-10-01

    A new analysis category based on g4tools was added in Geant4 release 9.5 (2011). The aim was to provide users with a lightweight analysis tool available as part of the Geant4 installation without the need to link to an external analysis package. It has progressively been included in all Geant4 examples. Frequent questions in the Geant4 users forum show its increasing popularity in the Geant4 users community. In this presentation, we will give a brief overview of g4tools and the analysis category. We report on new developments since our CHEP 2013 contribution as well as mention upcoming new features.

  20. An exploration of inter-organisational partnership assessment tools in the context of Australian Aboriginal-mainstream partnerships: a scoping review of the literature.

    PubMed

    Tsou, Christina; Haynes, Emma; Warner, Wayne D; Gray, Gordon; Thompson, Sandra C

    2015-04-23

    The need for better partnerships between Aboriginal organisations and mainstream agencies demands attention on the process and relational elements of these partnerships, and on improving partnership functioning through transformative or iterative evaluation procedures. This paper presents the findings of a literature review which examines the usefulness of existing partnership tools in the Australian Aboriginal-mainstream partnership (AMP) context. Three sets of best practice principles for successful AMP were selected based on the authors' knowledge and experience. Items in each set of principles were separated into process and relational elements and used to guide the analysis of partnership assessment tools. The review and analysis of partnership assessment tools were conducted in three distinct but related parts: part 1 - identify and select reviews of partnership tools; part 2 - identify and select partnership self-assessment tools; part 3 - analysis of selected tools using AMP principles. The focus on relational and process elements in the partnership tools reviewed is consistent with the focus of Australian AMP principles by reconciliation advocates; however, the historical context, lived experience, cultural context, and approaches of Australian Aboriginal people represent key deficiencies in the tools reviewed. The overall assessment indicated that the New York Partnership Self-Assessment Tool and the VicHealth Partnership Analysis Tools reflect the greatest number of AMP principles, followed by the Nuffield Partnership Assessment Tool. The New York PSAT has the strongest alignment with the relational elements, while the VicHealth and Nuffield tools showed the greatest alignment with the process elements in the chosen AMP principles. Partnership tools offer opportunities for providing evidence-based support to partnership development. The multiplicity of tools in existence and the reported uniqueness of each partnership mean that the development of a generic partnership analysis for AMP may not be a viable option for future effort.

  1. An integrated modeling and design tool for advanced optical spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1992-01-01

    Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.

  2. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  3. DIY Solar Market Analysis Webinar Series: Community Solar Scenario Tool |

    Science.gov Websites

    State, Local, and Tribal Governments | NREL. Wednesday, August 13, 2014. As part of a Do-It-Yourself Solar Market Analysis summer series, NREL's Solar Technical Assistance Team (STAT) presented a live webinar titled, "Community Solar Scenario Tool: Planning for a fruitful solar garden

  4. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
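    Word-based pattern discovery of the oligo-analysis kind rests on comparing an observed word count to its expectation under a background model. A toy sketch under the simplest background mentioned, a Bernoulli model with equiprobable bases (the sequence and word below are invented; RSAT itself supports richer Markov backgrounds):

```python
import math

def oligo_enrichment(seq, word):
    """Binomial P-value for over-representation of `word` in `seq`
    under a Bernoulli (equiprobable-base) background model."""
    n = len(seq) - len(word) + 1                  # possible word positions
    obs = sum(seq[i:i + len(word)] == word for i in range(n))
    p = 0.25 ** len(word)                         # per-position expectation
    # P(X >= obs) for X ~ Binomial(n, p), ignoring overlap correlations
    return sum(math.comb(n, x) * p**x * (1 - p)**(n - x)
               for x in range(obs, n + 1))

promoter = "TATAAAGGCTATAAACCGTATAAA"              # toy upstream sequence
print(f"TATAAA over-representation p-value: "
      f"{oligo_enrichment(promoter, 'TATAAA'):.3g}")
```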

  5. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
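    For FMEA in particular, failure modes are conventionally ranked by a risk priority number (RPN), the product of severity, occurrence, and detection ratings. A minimal sketch with hypothetical radiotherapy failure modes and ratings:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1-10
    occurrence: int  # 1-10
    detection: int   # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        # conventional risk priority number used to rank FMEA entries
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("wrong dose calculated", 9, 3, 6),
    FailureMode("patient mis-identified", 10, 2, 4),
    FailureMode("plan not reviewed", 7, 4, 5),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:26s} RPN={m.rpn}")
```

    Ranking by RPN then directs quality-improvement effort to the highest-risk process steps first, which is the proactive use the article describes.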

  6. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering may be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using their Gene ID, Ensembl, or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers, regardless of programming experience, to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  7. Interchange Safety Analysis Tool (ISAT) : user manual

    DOT National Transportation Integrated Search

    2007-06-01

    This User Manual describes the usage and operation of the spreadsheet-based Interchange Safety Analysis Tool (ISAT). ISAT provides design and safety engineers with an automated tool for assessing the safety effects of geometric design and traffic con...

  8. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  9. Dcode.org anthology of comparative genomic tools.

    PubMed

    Loots, Gabriela G; Ovcharenko, Ivan

    2005-07-01

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.

  10. An Integrated Approach to Risk Assessment for Concurrent Design

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Voss, Luke; Feather, Martin; Cornford, Steve

    2005-01-01

    This paper describes an approach to risk assessment and analysis suited to the early phase, concurrent design of a space mission. The approach integrates an agile, multi-user risk collection tool, a more in-depth risk analysis tool, and repositories of risk information. A JPL developed tool, named RAP, is used for collecting expert opinions about risk from designers involved in the concurrent design of a space mission. Another in-house developed risk assessment tool, named DDP, is used for the analysis.

  11. Multi-mission telecom analysis tool

    NASA Technical Reports Server (NTRS)

    Hanks, D.; Kordon, M.; Baker, J.

    2002-01-01

    In the early formulation phase of a mission it is critically important to have fast, easy to use, easy to integrate space vehicle subsystem analysis tools so that engineers can rapidly perform trade studies not only by themselves but in coordination with other subsystem engineers as well. The Multi-Mission Telecom Analysis Tool (MMTAT) is designed for just this purpose.

  12. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    ERIC Educational Resources Information Center

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  13. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Bush, Brian; Penev, Michael

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  14. Rotorcraft Conceptual Design Environment

    DTIC Science & Technology

    2009-10-01

    systems engineering design tool sets. The DaVinci Project vision is to develop software architecture and tools specifically for acquisition system...enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  15. Photomat: A Mobile Tool for Aiding in Student Construction of Research Questions and Data Analysis

    ERIC Educational Resources Information Center

    Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom

    2015-01-01

    This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…

  16. Scaffolding Assignments: Analysis of Assignmentor as a Tool to Support First Year Students' Academic Writing Skills

    ERIC Educational Resources Information Center

    Silva, Pedro

    2017-01-01

    There are several technological tools which aim to support first year students' challenges, especially when it comes to academic writing. This paper analyses one of these tools, Wiley's AssignMentor. The Technological Pedagogical Content Knowledge framework was used to systematise this analysis. The paper showed an alignment between the tools'…

  17. HydroClimATe: hydrologic and climatic analysis toolkit

    USGS Publications Warehouse

    Dickinson, Jesse; Hanson, Randall T.; Predmore, Steven K.

    2014-01-01

    The potential consequences of climate variability and climate change have been identified as major issues for the sustainability and availability of the worldwide water resources. Unlike global climate change, climate variability represents deviations from the long-term state of the climate over periods of a few years to several decades. Currently, rich hydrologic time-series data are available, but the combination of data preparation and statistical methods developed by the U.S. Geological Survey as part of the Groundwater Resources Program is relatively unavailable to hydrologists and engineers who could benefit from estimates of climate variability and its effects on periodic recharge and water-resource availability. This report documents HydroClimATe, a computer program for assessing the relations between variable climatic and hydrologic time-series data. HydroClimATe was developed for a Windows operating system. The software includes statistical tools for (1) time-series preprocessing, (2) spectral analysis, (3) spatial and temporal analysis, (4) correlation analysis, and (5) projections. The time-series preprocessing tools include spline fitting, standardization using a normal or gamma distribution, and transformation by a cumulative departure. The spectral analysis tools include discrete Fourier transform, maximum entropy method, and singular spectrum analysis. The spatial and temporal analysis tool is empirical orthogonal function analysis. The correlation analysis tools are linear regression and lag correlation. The projection tools include autoregressive time-series modeling and generation of many realizations. These tools are demonstrated in four examples that use stream-flow discharge data, groundwater-level records, gridded time series of precipitation data, and the Multivariate ENSO Index.
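    As an illustration of the preprocessing-then-spectral-analysis pipeline described (standardization followed by a discrete Fourier transform), a minimal sketch on a synthetic monthly series; the trend, cycle period, and noise level are invented and stand in for a real groundwater-level record:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(60 * 12)                          # 60 years of monthly data
# synthetic series: linear trend + ~5-year cycle + noise
series = 0.01 * t + np.sin(2 * np.pi * t / 60) + 0.5 * rng.standard_normal(t.size)

z = (series - series.mean()) / series.std()     # standardization step
z = z - np.polyval(np.polyfit(t, z, 1), t)      # remove the linear trend
spec = np.abs(np.fft.rfft(z)) ** 2              # discrete Fourier transform
freq = np.fft.rfftfreq(t.size, d=1 / 12)        # cycles per year

dominant = freq[np.argmax(spec[1:]) + 1]        # skip the zero frequency
print(f"dominant period ~ {1 / dominant:.1f} years")
```

    The same standardized, detrended series could equally feed the maximum entropy or singular spectrum methods that the toolkit also offers.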

  18. On-line Monitoring for Cutting Tool Wear Condition Based on the Parameters

    NASA Astrophysics Data System (ADS)

    Han, Fenghua; Xie, Feng

    2017-07-01

    In machining processes, it is very important to monitor the working state of the cutting tool. On the basis of acceleration signals acquired at constant speed, time-domain and frequency-domain analysis of relevant indicators monitors the tool wear condition online. The analysis results show that the method can effectively judge the tool wear condition in the process of machining. It has certain application value.
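    A minimal sketch of the kind of time-domain indicators such monitoring typically relies on (RMS, peak, crest factor, kurtosis); the abstract does not list the exact indicators used in the paper, and the vibration snippets here are synthetic:

```python
import numpy as np
from scipy.stats import kurtosis

def wear_indicators(accel):
    """Common time-domain condition indicators from an acceleration signal."""
    a = np.asarray(accel, float)
    rms = np.sqrt(np.mean(a ** 2))
    return {
        "rms": rms,
        "peak": np.max(np.abs(a)),
        "crest_factor": np.max(np.abs(a)) / rms,
        "kurtosis": kurtosis(a, fisher=False),   # impulsiveness indicator
    }

# hypothetical vibration snippets: fresh tool vs worn tool (more impulsive)
rng = np.random.default_rng(2)
fresh = rng.normal(0, 1.0, 4096)
worn = fresh + (rng.random(4096) < 0.01) * rng.normal(0, 8.0, 4096)
print(wear_indicators(fresh))
print(wear_indicators(worn))
```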

  19. Guidelines for the analysis of free energy calculations

    PubMed Central

    Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.

    2015-01-01

    Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
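    The two estimator families named, thermodynamic integration and free energy perturbation, can each be written in a few lines. This is a toy illustration with invented samples and lambda averages, not the alchemical-analysis.py implementation:

```python
import numpy as np

def fep_zwanzig(delta_u, kT=0.593):
    """Zwanzig free energy perturbation estimate (kT in kcal/mol at ~298 K):
    dF = -kT ln < exp(-dU/kT) >, with dU = U_target - U_reference."""
    du = np.asarray(delta_u, float)
    return -kT * np.log(np.mean(np.exp(-du / kT)))

def ti_trapezoid(lambdas, dudl_means):
    """Thermodynamic integration: trapezoid rule over <dU/dlambda>(lambda)."""
    lam = np.asarray(lambdas, float)
    g = np.asarray(dudl_means, float)
    return float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(lam)))

# hypothetical per-snapshot energy differences and per-window TI averages
du = np.random.default_rng(1).normal(1.0, 0.3, 5000)
print(f"FEP estimate: {fep_zwanzig(du):.3f} kcal/mol")
print(f"TI estimate:  {ti_trapezoid([0.0, 0.25, 0.5, 0.75, 1.0],"
      f" [4.1, 3.2, 2.6, 2.1, 1.8]):.3f} kcal/mol")
```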

  20. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  1. A Multidimensional Analysis Tool for Visualizing Online Interactions

    ERIC Educational Resources Information Center

    Kim, Minjeong; Lee, Eunchul

    2012-01-01

    This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…

  2. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  3. Nutrition screening tools: an analysis of the evidence.

    PubMed

    Skipper, Annalynn; Ferguson, Maree; Thompson, Kyle; Castellanos, Victoria H; Porcari, Judy

    2012-05-01

    In response to questions about tools for nutrition screening, an evidence analysis project was developed to identify the most valid and reliable nutrition screening tools for use in acute care and hospital-based ambulatory care settings. An oversight group defined nutrition screening and literature search criteria. A trained analyst conducted structured searches of the literature for studies of nutrition screening tools according to predetermined criteria. Eleven nutrition screening tools designed to detect undernutrition in patients in acute care and hospital-based ambulatory care were identified. Trained analysts evaluated articles for quality using criteria specified by the American Dietetic Association's Evidence Analysis Library. Members of the oversight group assigned quality grades to the tools based on the quality of the supporting evidence, including reliability and validity data. One tool, the NRS-2002, received a grade I, and four tools (the Simple Two-Part Tool, the Mini-Nutritional Assessment-Short Form (MNA-SF), the Malnutrition Screening Tool (MST), and the Malnutrition Universal Screening Tool (MUST)) received a grade II. The MST was the only tool shown to be both valid and reliable for identifying undernutrition in the settings studied. Thus, validated nutrition screening tools that are simple and easy to use are available for application in acute care and hospital-based ambulatory care settings.

  4. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
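    The core screening step, sorting large Monte Carlo output for runs that trip specific failure criteria, vectorizes naturally. A minimal sketch with invented metrics and thresholds (not the NASA tool's actual failure definitions or the Orion data):

```python
import numpy as np

# hypothetical Monte Carlo output: one row per run, one column per metric
rng = np.random.default_rng(3)
runs = rng.normal(size=(100_000, 3))
names = ["touchdown_speed", "fuel_margin", "attitude_error"]
limits = [2.5, -2.0, 3.0]                       # illustrative thresholds

# vectorized screening: flag runs violating each limit, then summarize
fail = np.column_stack([
    runs[:, 0] > limits[0],                     # too fast at touchdown
    runs[:, 1] < limits[1],                     # fuel margin exhausted
    np.abs(runs[:, 2]) > limits[2],             # attitude out of bounds
])
for name, col in zip(names, fail.T):
    print(f"{name:16s} failures: {col.sum():5d} ({100 * col.mean():.2f}%)")
print("runs with any failure:", int(fail.any(axis=1).sum()))
```

    The boolean failure matrix also points an analyst directly at the offending runs (e.g. `runs[fail.any(axis=1)]`), which is the "point the analyst to the problem areas" behavior the abstract describes; the parallel version distributes the same masking across devices.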

  5. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    PubMed

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

    Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis, and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of these species can cause occasional infections using distinct mechanisms from the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly-growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution, and pathogenicity. To facilitate the ongoing and future research of Yersinia, especially the species generally considered non-pathogenic, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed the YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, the majority of which are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase, and the preliminary results showed differences in the virulence genes found in Yersinia pestis and Yersinia pseudotuberculosis compared to other Yersinia species, and differences between Yersinia enterocolitica subsp. enterocolitica and Yersinia enterocolitica subsp. palearctica. YersiniaBase offers free access to a wide range of genomic data and analysis tools for the analysis of Yersinia. YersiniaBase can be accessed at http://yersinia.um.edu.my.

  6. Macro Analysis Tool - MAT

    EPA Science Inventory

    This product is an easy-to-use Excel-based macro analysis tool (MAT) for performing comparisons of air sensor data with reference data and interpreting the results. This tool tackles one of the biggest hurdles in citizen-led community air monitoring projects – working with ...

  7. MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines

    Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.

  8. FSSC Science Tools: Pulsar Analysis

    NASA Technical Reports Server (NTRS)

    Thompson, Dave

    2010-01-01

    This slide presentation reviews the typical pulsar analysis, giving tips for screening of the data, the use of time series analysis, and utility tools. Specific information about analyzing Vela data is reviewed.

  9. 78 FR 48912 - Agency Information Collection Activities: Submission to OMB for Reinstatement, With Change, of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-12

    ... for analysis in the NCUA Low-Income Designation (LID) Tool. The LID Tool is a geocoding software... the member address data are obtained through the examination process and the results of the LID Tool... may send an electronic member address data file for analysis in the LID Tool. If a credit union does...

  10. 78 FR 59377 - Agency Information Collection Activities: Submission to OMB for Reinstatement, With Change, of a...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-26

    ... analysis in the NCUA Low-Income Designation (LID) Tool. The LID Tool is a geocoding software program which... data are obtained through the examination process and the results of the LID Tool indicate the credit... electronic member address data file for analysis in the LID Tool. If a credit union does not qualify for a...

  11. Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet

    NASA Technical Reports Server (NTRS)

    Muss, J. A.; Johnson, C. W.; Gotchy, M. B.

    2000-01-01

    The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.

  12. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
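    The validation step mentioned, checking analytic derivatives against finite-difference approximations, is easy to illustrate on a stand-in function; the peaked "temperature" curve below is invented for illustration and is not the CEA thermodynamics:

```python
import numpy as np

def f(phi):
    # stand-in smooth model: a peaked "temperature" curve in the
    # equivalence ratio phi (not the CEA equations themselves)
    return 2400.0 * np.exp(-((phi - 1.05) ** 2) / 0.08)

def dfdphi_analytic(phi):
    # hand-derived derivative of f, the "analytic" path
    return f(phi) * (-2.0 * (phi - 1.05) / 0.08)

phi, h = 0.9, 1e-6
fd = (f(phi + h) - f(phi - h)) / (2 * h)        # central finite difference
print(f"analytic {dfdphi_analytic(phi):.6f} vs finite-difference {fd:.6f}")
```

    The analytic path needs one extra function evaluation per derivative instead of two per input per step, which is the source of the speed advantage the abstract reports for gradient-based optimization.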

  13. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Penev, M.; Melaina, M.

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.

  14. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle: Data System manages the datasets used by CMDA analysis tools; Analysis System manages the CMDA analysis tools, which are all web services; Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history; and Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to the Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets relevant to their research topic using the Recommendation System. The user can then customize the discovered datasets for their scientific use (e.g., anomaly calculation, regridding) and run their analysis with tools (e.g., conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on previously stored analysis provenance in the Provenance System, and publish their analysis process and results to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.

  15. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  16. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  17. State Analysis Database Tool

    NASA Technical Reports Server (NTRS)

    Rasmussen, Robert; Bennett, Matthew

    2006-01-01

    The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.

  18. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It offers a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  19. Satellite-based Tropical Cyclone Monitoring Capabilities

    NASA Astrophysics Data System (ADS)

    Hawkins, J.; Richardson, K.; Surratt, M.; Yang, S.; Lee, T. F.; Sampson, C. R.; Solbrig, J.; Kuciauskas, A. P.; Miller, S. D.; Kent, J.

    2012-12-01

    Satellite remote sensing capabilities to monitor tropical cyclone (TC) location, structure, and intensity have evolved by utilizing a combination of operational and research and development (R&D) sensors. The microwave imagers from the operational Defense Meteorological Satellite Program [Special Sensor Microwave/Imager (SSM/I) and the Special Sensor Microwave Imager Sounder (SSMIS)] form the "base" for structure observations due to their ability to view through upper-level clouds, their modest-sized swaths, and their ability to capture most storm structure features. The NASA TRMM microwave imager and precipitation radar continue their 15+ year-long missions in serving the TC warning and research communities. NASA's QuikSCAT satellite, which ceased operations after more than a decade of service, is sorely missed, but India's OceanSat-2 scatterometer is now providing crucial ocean surface wind vectors in addition to the Navy's WindSat ocean surface wind vector retrievals. Another Advanced Scatterometer (ASCAT) onboard EUMETSAT's MetOp-2 satellite is slated for launch soon. Passive microwave imagery received a much needed boost with the launch of the French/Indian Megha-Tropiques imager in September 2011, greatly supplementing the very successful NASA TRMM pathfinder with a larger swath and more frequent temporal sampling. While initial data issues have delayed utilization, current news indicates these data will be available in 2013. Future NASA Global Precipitation Mission (GPM) sensors starting in 2014 will provide enhanced capabilities. Also, the inclusion of the new microwave sounder data from the NPP ATMS (October 2011) will assist in mapping TC convective structures. The National Polar-orbiting Partnership (NPP) program's VIIRS sensor includes a day/night band (DNB) with the capability to view TC cloud structure at night when sufficient lunar illumination exists. Examples highlighting this new capability will be discussed in concert with additional data fusion efforts.

  20. Challenges in validating model results for first year ice

    NASA Astrophysics Data System (ADS)

    Melsom, Arne; Eastwood, Steinar; Xie, Jiping; Aaboe, Signe; Bertino, Laurent

    2017-04-01

    In order to assess the quality of model results for the distribution of first-year ice, a comparison with a product based on observations from satellite-borne instruments has been performed. Such a comparison is not straightforward due to the contrasting algorithms that are used in the model product and the remote sensing product. The implementation of the validation is discussed in light of the differences between this set of products, and validation results are presented. The model product is the daily updated 10-day forecast from the Arctic Monitoring and Forecasting Centre in CMEMS. The forecasts are produced with the assimilative ocean prediction system TOPAZ. Presently, observations of sea ice concentration and sea ice drift are introduced in the assimilation step, but data for sea ice thickness and ice age (or roughness) are not included. The model computes the age of the ice by recording and updating the time passed after ice formation as sea ice grows and deteriorates while it is advected inside the model domain. Ice that is younger than 365 days is classified as first-year ice. The fraction of first-year ice is recorded as a tracer in each grid cell. The Ocean and Sea Ice Thematic Assembly Centre in CMEMS redistributes a daily product from the EUMETSAT OSI SAF of gridded sea ice conditions which includes "ice type", a representation of the separation between regions covered by first-year ice and those covered by multi-year ice. The ice type is parameterized based on data for the gradient ratio GR(19,37) from SSMIS observations, and on the ASCAT backscatter parameter. This product also includes information on ambiguity in the processing of the remote sensing data and on the product's confidence level, which have a strong seasonal dependency.
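
    The gradient ratio named in the abstract is a standard passive-microwave quantity. A minimal sketch, with an illustrative decision threshold (not the OSI SAF operational value):

    ```python
    def gradient_ratio(tb19v, tb37v):
        """Spectral gradient ratio GR(19,37) from vertically polarized
        brightness temperatures at 19 and 37 GHz."""
        return (tb37v - tb19v) / (tb37v + tb19v)

    # Multi-year ice scatters the 37 GHz channel more strongly, pulling GR
    # negative; first-year ice sits near zero. The -0.02 cut is illustrative.
    ice_type = "multi-year" if gradient_ratio(230.0, 215.0) < -0.02 else "first-year"
    ```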

  1. Impact of the assimilation of satellite soil moisture and LST on the hydrological cycle

    NASA Astrophysics Data System (ADS)

    Laiolo, Paola; Gabellani, Simone; Delogu, Fabio; Silvestro, Francesco; Rudari, Roberto; Campo, Lorenzo; Boni, Giorgio

    2014-05-01

    The reliable estimation of hydrological variables (e.g. soil moisture, evapotranspiration, surface temperature) in space and time is of fundamental importance in operational hydrology to improve the forecast of the rainfall-runoff response of catchments and, consequently, flood predictions. Remote sensing now offers the chance to provide good space-time estimates of several hydrological variables and thus to improve hydrological model performance, especially in environments with scarce ground-based data. The aim of this work is to investigate the impact of assimilating satellite-derived soil moisture and Land Surface Temperature (LST) products on the performance of a distributed hydrological model (Continuum). Three different soil moisture (SM) products derived from the ASCAT sensor are used. These data are provided by EUMETSAT's H-SAF (Satellite Application Facility on Support to Operational Hydrology and Water Management) program. The considered soil moisture products are: large-scale surface soil moisture (SM OBS 1 - H07), small-scale surface soil moisture (SM OBS 2 - H08), and the profile index in the root zone (SM DAS 2 - H14). These data are compared with soil moisture estimated by the Continuum model on the Orba catchment (800 km2), in the northern part of Italy, for the period July 2012-June 2013. Different assimilation experiments have been performed. The first consists of assimilating the SM products using a simple nudging technique; the second assimilates only LST data, derived from the MSG satellite; and the third assimilates both the SM products and LST. The benefits for the model predictions of discharge, LST, and soil moisture dynamics were then assessed.
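
    The abstract does not spell the nudging scheme out; the simplest form is a Newtonian relaxation of the model state toward the observation, sketched here with an assumed gain:

    ```python
    def nudge(model_sm, obs_sm, gain=0.2):
        """Relax modelled surface soil moisture toward a satellite retrieval.

        gain in (0, 1]: 0 ignores the observation, 1 replaces the model value.
        The 0.2 default is an assumption, not a value from the paper.
        """
        return model_sm + gain * (obs_sm - model_sm)

    print(nudge(0.30, 0.42))  # 0.324: the state moves 20% of the way to the obs
    ```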

  2. Arctic lead detection using a waveform unmixing algorithm from CryoSat-2 data

    NASA Astrophysics Data System (ADS)

    Lee, S.; Im, J.

    2016-12-01

    Arctic areas consist of ice floes, leads, and polynyas. While leads and polynyas account for small parts of the Arctic Ocean, they play a key role in exchanging heat flux, moisture, and momentum between the atmosphere and ocean in wintertime because of their huge temperature difference. In this study, a linear waveform unmixing approach was proposed to detect lead fraction. CryoSat-2 waveforms for pure leads, sea ice, and ocean were used as end-members based on visual interpretation of MODIS images coincident with CryoSat-2 data. The unmixing model produced lead, sea ice, and ocean abundances, and a threshold (> 0.7) was applied to make a binary classification between lead and sea ice. The unmixing model produced better results than the existing models in the literature, which are based on simple thresholding approaches. The results were also comparable with our previous research using machine learning based models (i.e., decision trees and random forest). A monthly lead fraction was calculated by dividing the number of detected leads by the total number of measurements. The lead fraction around the Beaufort Sea and Fram Strait was high due to the anti-cyclonic rotation of the Beaufort Gyre and the outflow of sea ice to the Atlantic. The lead fraction maps produced in this study agreed well with monthly lead fraction maps in the literature. The areas with thin sea ice identified in our previous research correspond to the high lead fraction areas in the present study. Furthermore, sea ice roughness from the ASCAT scatterometer was compared to a lead fraction map to examine the relationship between surface roughness and lead distribution.
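
    The abstract specifies the end-members (lead, sea ice, ocean), the abundance output, and the > 0.7 lead threshold, but not the solver. One common way to pose linear unmixing is non-negative least squares, sketched here on synthetic waveforms:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def unmix_waveform(y, E, lead_threshold=0.7):
        """Linear waveform unmixing (sketch). E has one column per end-member
        waveform (lead, sea ice, ocean); y is the observed CryoSat-2 waveform.
        Solves non-negative least squares, then normalizes to abundances."""
        x, _ = nnls(E, y)
        abund = x / x.sum()
        return abund, abund[0] > lead_threshold   # abundances, lead flag

    # Toy example with synthetic 128-bin waveforms (placeholders, not real data)
    rng = np.random.default_rng(1)
    E = rng.random((128, 3))
    y = 0.8 * E[:, 0] + 0.2 * E[:, 1]             # mostly "lead"
    abund, is_lead = unmix_waveform(y, E)          # abund[0] ~ 0.8 -> lead
    ```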

  3. Impact of Scatterometer Ocean Wind Vector Data on NOAA Operations

    NASA Astrophysics Data System (ADS)

    Jelenak, Z.; Chang, P.; Brennan, M. J.; Sienkiewicz, J. M.

    2015-12-01

    Near real-time measurements of ocean surface vector winds (OSVW), including both wind speed and direction from non-NOAA satellites, are being widely used in critical operational NOAA forecasting and warning activities. The scatterometer wind data have had major operational impact in: a) determining wind warning areas for mid-latitude systems (gale, storm, hurricane force); b) determining tropical cyclone 34-knot and 50-knot wind radii; c) tracking the center location of tropical cyclones, including the initial identification of their formation; d) identifying and warning of extreme gap and jet wind events at all latitudes; e) identifying the current location of frontal systems and high and low pressure centers; and f) improving coastal surf and swell forecasts. Much has been learned about the importance and utility of satellite OSVW data in operational weather forecasting and warning by exploiting OSVW research satellites in near real-time. Since December 1999, when the first data from the QuikSCAT scatterometer became available in near real time, NOAA operations have benefited from ASCAT scatterometer observations on MetOp-A and B, the Indian OSCAT scatterometer on OceanSat-2, and lately NASA's RapidScat mission on the International Space Station. With oceans comprising over 70 percent of the earth's surface, the impacts of these data have been tremendous in serving society's needs for weather and water information and in supporting the nation's commerce with information for safe, efficient, and environmentally sound transportation and coastal preparedness. The satellite OSVW experience gained over the past decade by users in the operational weather community allows realistic operational OSVW requirements to be properly stated for future missions. The successful model of transitioning research data into operations implemented by the Ocean Winds Team in NOAA's NESDIS/STAR office, and the subsequent data impacts, will be presented and discussed.

  4. The benefits of using remotely sensed soil moisture in parameter identification of large-scale hydrological models

    NASA Astrophysics Data System (ADS)

    Wanders, N.; Bierkens, M. F. P.; de Jong, S. M.; de Roo, A.; Karssenberg, D.

    2014-08-01

    Large-scale hydrological models are nowadays mostly calibrated using observed discharge. As a result, a large part of the hydrological system, in particular the unsaturated zone, remains uncalibrated. Soil moisture observations from satellites have the potential to fill this gap. Here we evaluate the added value of remotely sensed soil moisture in calibration of large-scale hydrological models by addressing two research questions: (1) Which parameters of hydrological models can be identified by calibration with remotely sensed soil moisture? (2) Does calibration with remotely sensed soil moisture lead to an improved calibration of hydrological models compared to calibration based only on discharge observations, such that this leads to improved simulations of soil moisture content and discharge? A dual state and parameter Ensemble Kalman Filter is used to calibrate the hydrological model LISFLOOD for the Upper Danube. Calibration is done using discharge and remotely sensed soil moisture acquired by AMSR-E, SMOS, and ASCAT. Calibration with discharge data improves the estimation of groundwater and routing parameters. Calibration with only remotely sensed soil moisture results in an accurate identification of parameters related to land-surface processes. For the Upper Danube upstream area up to 40,000 km2, calibration on both discharge and soil moisture results in a 10-30% reduction in the RMSE of discharge simulations, compared to calibration on discharge alone. The conclusion is that remotely sensed soil moisture holds potential for calibration of hydrological models, leading to a better simulation of soil moisture content throughout the catchment and a better simulation of discharge in upstream areas.
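
    The dual state-and-parameter EnKF works by augmenting the state vector with the parameters to be calibrated, so a single analysis step updates both at once. A generic stochastic-EnKF update (no localization, LISFLOOD specifics omitted) might look like the following sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(ens, obs, obs_err_std, H):
        """One stochastic EnKF analysis step on an augmented ensemble.

        ens : (n, N) array; each column stacks model states and parameters
        obs : (m,) observations (e.g. ASCAT/AMSR-E/SMOS soil moisture)
        H   : (m, n) linear observation operator
        """
        n_dim, n_ens = ens.shape
        A = ens - ens.mean(axis=1, keepdims=True)        # ensemble anomalies
        P = A @ A.T / (n_ens - 1)                        # sample covariance
        R = obs_err_std**2 * np.eye(obs.size)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
        obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, (obs.size, n_ens))
        return ens + K @ (obs_pert - H @ ens)            # updated states + params
    ```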

  5. Designed tools for analysis of lithography patterns and nanostructures

    NASA Astrophysics Data System (ADS)

    Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann

    2017-03-01

    We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user-friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nanoscale processes at the research and development level by accelerating access to knowledge and hence speed up implementation in product lines.

  6. Data and Tools | NREL

    Science.gov Websites

    NREL develops data sets, maps, models, and tools for the analysis of renewable energy and energy efficiency technologies. Data, models, and tools are available in an alphabetical listing on the site. Popular resources include the PVWatts Calculator and geospatial data resources.

  7. Minimally invasive surgical video analysis: a powerful tool for surgical training and navigation.

    PubMed

    Sánchez-González, P; Oropesa, I; Gómez, E J

    2013-01-01

    Analysis of minimally invasive surgical videos is a powerful tool to drive new solutions for achieving reproducible training programs, objective and transparent assessment systems and navigation tools to assist surgeons and improve patient safety. This paper presents how video analysis contributes to the development of new cognitive and motor training and assessment programs as well as new paradigms for image-guided surgery.

  8. Guidelines for the analysis of free energy calculations.

    PubMed

    Klimovich, Pavel V; Shirts, Michael R; Mobley, David L

    2015-05-01

    Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages and can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
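
    The free energy estimates themselves come from pymbar, which the tool wraps. A minimal MBAR call on toy data might look like the following (pymbar >= 4 method names; earlier releases used getFreeEnergyDifferences instead):

    ```python
    import numpy as np
    from pymbar import MBAR

    # Toy reduced potentials: K states whose free energies differ by roughly
    # one unit per state (illustrative data, not a real simulation).
    K, N = 4, 200
    rng = np.random.default_rng(0)
    u_kn = np.arange(K, dtype=float)[:, None] + rng.normal(0.0, 0.1, (K, K * N))
    N_k = np.full(K, N)          # samples drawn from each state; sums to K*N

    mbar = MBAR(u_kn, N_k)
    results = mbar.compute_free_energy_differences()
    print(results["Delta_f"][0, -1], "+/-", results["dDelta_f"][0, -1])
    ```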

  9. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
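
    One way to reproduce the "where are variables used" style of analysis the summary mentions is through Clang's Python bindings (libclang). This sketch is an independent illustration, not the packaged example code:

    ```python
    import sys
    import clang.cindex as ci

    # Requires libclang; point the bindings at your installation if needed, e.g.
    # ci.Config.set_library_file('/usr/lib/llvm-14/lib/libclang.so')

    def declaration_uses(path):
        """Print every reference to a declaration (variables, functions)
        in a C source file."""
        tu = ci.Index.create().parse(path)
        for cur in tu.cursor.walk_preorder():
            if cur.kind == ci.CursorKind.DECL_REF_EXPR:
                loc = cur.location
                print(f"{cur.spelling} used at {loc.file}:{loc.line}:{loc.column}")

    if __name__ == "__main__":
        declaration_uses(sys.argv[1])
    ```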

  10. Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).

    PubMed

    Kane, Kay

    2014-03-01

    The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.

  11. Design and application of a tool for structuring, capitalizing and making more accessible information and lessons learned from accidents involving machinery.

    PubMed

    Sadeghi, Samira; Sadeghi, Leyla; Tricot, Nicolas; Mathieu, Luc

    2017-12-01

    Accident reports are published in order to communicate the information and lessons learned from accidents. An efficient accident recording and analysis system is a necessary step towards improvement of safety. However, currently there is a shortage of efficient tools to support such recording and analysis. In this study we introduce a flexible and customizable tool that allows structuring and analysis of this information. This tool has been implemented under TEEXMA®. We named our prototype TEEXMA®SAFETY. This tool provides an information management system to facilitate data collection, organization, query, analysis and reporting of accidents. A predefined information retrieval module provides ready access to data which allows the user to quickly identify the possible hazards for specific machines and provides information on the source of hazards. The main target audience for this tool includes safety personnel, accident reporters and designers. The proposed data model has been developed by analyzing different accident reports.

  12. GREAT: a web portal for Genome Regulatory Architecture Tools

    PubMed Central

    Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François

    2016-01-01

    GREAT (Genome REgulatory Architecture Tools) is a novel web portal for tools designed to generate user-friendly and biologically useful analysis of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system which runs a modern browser. GREAT is based on the analysis of genome layout, defined as the respective positioning of co-functional genes, and its relation with chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automatic way consisting of three individual steps and respective interactive visualizations. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information for improving the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in web interactive graphs and are available for download either as individual plots, self-contained interactive pages, or as machine readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for downloading. PMID:27151196

  13. SMALT - Soil Moisture from Altimetry project

    NASA Astrophysics Data System (ADS)

    Smith, Richard; Benveniste, Jérôme; Dinardo, Salvatore; Lucas, Bruno Manuel; Berry, Philippa; Wagner, Wolfgang; Hahn, Sebastian; Egido, Alejandro

    Soil surface moisture is a key scientific parameter; however, it is extremely difficult to measure remotely, particularly in arid and semi-arid terrain. This paper outlines the development of a novel methodology to generate soil moisture estimates in these regions from multi-mission satellite radar altimetry. Key to this approach is the development of detailed DRy Earth ModelS (DREAMS), which encapsulate the detailed and intricate surface brightness variations over the Earth's land surface, resulting from changes in surface roughness and composition. DREAMS have been created over a number of arid and semi-arid deserts worldwide to produce historical SMALT time series of soil moisture variation. These products are available in two formats: a high-resolution track product, which utilises the altimeter's high-frequency content along-track, and a multi-looked 6" gridded product to facilitate easy comparison and integration with other remote sensing techniques. An overview of the SMALT processing scheme, covering the progression of the data from altimeter sigma0 through to the final soil moisture estimate, is included along with example SMALT products. Validation has been performed over a number of deserts by comparing SMALT products with other remote sensing techniques; results of the comparison between SMALT and Metop WARP 5.5 are presented here. Comparisons with other remote sensing techniques have been limited in scope due to differences in the operational aspects of the instruments, the restricted geographical coverage of the DREAMS, and the low repeat temporal sampling rate of the altimeter. The potential to expand the SMALT technique into less arid areas has been investigated through a small-scale comparison with in-situ and GNSS-R data obtained by the LEiMON experimental campaign over Tuscany, where historical trends exist within both the SMALT and SMC probe datasets. A qualitative analysis of unexpected backscatter characteristics in dedicated dry environments is performed with a comparison between Metop ASCAT and altimeter sigma0 over Saharan Africa; geographically correlated areas of agreement and disagreement corresponding to the underlying terrain are identified. SMALT products provide a first-order estimation of soil moisture in areas of very dry terrain, where other datasets are limited. Potential to improve and expand the technique has been found, although further work is required to produce products with the same accuracy and confidence as more established techniques. The data are made freely available to the scientific community through the website http://tethys.eaprs.cse.dmu.ac.uk/SMALT
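
    The abstract does not give the retrieval equation, but a first-order reading of the approach (soil moisture as the excess of observed backscatter over the DREAMS dry-earth reference) can be sketched as follows; the sensitivity scaling is an assumption for illustration only:

    ```python
    import numpy as np

    def smalt_first_order(sigma0_obs, sigma0_dream, sensitivity=2.0):
        """First-order soil moisture proxy from altimeter backscatter (sketch).

        sigma0_dream is the DREAMS dry-earth backscatter for the same location;
        the excess backscatter over the dry reference (in dB) is scaled by an
        assumed sensitivity into a relative wetness index in [0, 1].
        """
        excess = np.asarray(sigma0_obs) - np.asarray(sigma0_dream)
        return np.clip(excess / sensitivity, 0.0, 1.0)
    ```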

  14. SMALT - Soil Moisture from Altimetry

    NASA Astrophysics Data System (ADS)

    Smith, Richard; Salloway, Mark; Berry, Philippa; Hahn, Sebastian; Wagner, Wolfgang; Egido, Alejandro; Dinardo, Salvatore; Lucas, Bruno Manuel; Benveniste, Jerome

    2014-05-01

    Soil surface moisture is a key scientific parameter; however, it is extremely difficult to measure remotely, particularly in arid and semi-arid terrain. This paper outlines the development of a novel methodology to generate soil moisture estimates in these regions from multi-mission satellite radar altimetry. Key to this approach is the development of detailed DRy Earth ModelS (DREAMS), which encapsulate the detailed and intricate surface brightness variations over the Earth's land surface, resulting from changes in surface roughness and composition. DREAMS have been created over a number of arid and semi-arid deserts worldwide to produce historical SMALT time series of soil moisture variation. These products are available in two formats: a high-resolution track product, which utilises the altimeter's high-frequency content along-track, and a multi-looked 6" gridded product to facilitate easy comparison and integration with other remote sensing techniques. An overview of the SMALT processing scheme, covering the progression of the data from altimeter sigma0 through to the final soil moisture estimate, is included along with example SMALT products. Validation has been performed over a number of deserts by comparing SMALT products with other remote sensing techniques; results of the comparison between SMALT and Metop WARP 5.5 are presented here. Comparisons with other remote sensing techniques have been limited in scope due to differences in the operational aspects of the instruments, the restricted geographical coverage of the DREAMS, and the low repeat temporal sampling rate of the altimeter. The potential to expand the SMALT technique into less arid areas has been investigated through a small-scale comparison with in-situ and GNSS-R data obtained by the LEiMON experimental campaign over Tuscany, where historical trends exist within both the SMALT and SMC probe datasets. A qualitative analysis of unexpected backscatter characteristics in dedicated dry environments is performed with a comparison between Metop ASCAT and altimeter sigma0 over Saharan Africa; geographically correlated areas of agreement and disagreement corresponding to the underlying terrain are identified. SMALT products provide a first-order estimation of soil moisture in areas of very dry terrain, where other datasets are limited. Potential to improve and expand the technique has been found, although further work is required to produce products with the same accuracy and confidence as more established techniques. The data are made freely available to the scientific community through the website http://tethys.eaprs.cse.dmu.ac.uk/SMALT

  15. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite's analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  16. Design of a Syntax Validation Tool for Requirements Analysis Using Structured Analysis and Design Technique (SADT)

    DTIC Science & Technology

    1988-09-01

    analysis phase of the software life cycle (16:1-1). While editing a SADT diagram, the tool should be able to check whether or not structured analysis...diagrams are valid for the SADT's syntax, produce error messages, do error recovery, and perform editing suggestions. Thus, this tool must have the...directed editors are editors which use the syntax of the programming language while editing a program. While text editors treat programs as text, syntax

  17. RADC SCAT automated sneak circuit analysis tool

    NASA Astrophysics Data System (ADS)

    Depalma, Edward L.

    The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

  18. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  19. Synthesis of research on work zone delays and simplified application of QuickZone analysis tool.

    DOT National Transportation Integrated Search

    2010-03-01

    The objectives of this project were to synthesize the latest information on work zone safety and management and identify case studies in which FHWA's decision support tool QuickZone or other appropriate analysis tools could be applied. The results ...

  20. Interactive Graphics Tools for Analysis of MOLA and Other Data

    NASA Technical Reports Server (NTRS)

    Frey, H.; Roark, J.; Sakimoto, S.

    2000-01-01

    We have developed several interactive analysis tools based on the IDL programming language for the analysis of Mars Orbiting Laser Altimeter (MOLA) profile and gridded data which are available to the general community.

  1. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features, providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757

  2. Rainfall estimation from soil moisture data: crash test for SM2RAIN algorithm

    NASA Astrophysics Data System (ADS)

    Brocca, Luca; Albergel, Clement; Massari, Christian; Ciabatta, Luca; Moramarco, Tommaso; de Rosnay, Patricia

    2015-04-01

    Soil moisture governs the partitioning of mass and energy fluxes between the land surface and the atmosphere and, hence, represents a key variable for many applications in hydrology and earth science. In recent years, it was demonstrated that soil moisture observations from ground and satellite sensors contain important information useful for improving rainfall estimation. Indeed, soil moisture data have been used for correcting rainfall estimates from state-of-the-art satellite sensors (e.g. Crow et al., 2011), and also for improving flood prediction through a dual data assimilation approach (e.g. Massari et al., 2014; Chen et al., 2014). Brocca et al. (2013; 2014) developed a simple algorithm, called SM2RAIN, which allows estimating rainfall directly from soil moisture data. SM2RAIN has been applied successfully to in situ and satellite observations. Specifically, by using three satellite soil moisture products from ASCAT (Advanced SCATterometer), AMSR-E (Advanced Microwave Scanning Radiometer for Earth Observation) and SMOS (Soil Moisture and Ocean Salinity), it was found that the SM2RAIN-derived rainfall products are as accurate as state-of-the-art products, e.g., the real-time version of the TRMM (Tropical Rainfall Measuring Mission) product. Notwithstanding these promising results, a detailed study investigating the physical basis of the SM2RAIN algorithm, its range of applicability, and its limitations on a global scale has still to be carried out. In this study, we carried out a crash test for the SM2RAIN algorithm on a global scale by performing a synthetic experiment. Specifically, modelled soil moisture data are obtained from the HTESSEL model (Hydrology Tiled ECMWF Scheme for Surface Exchanges over Land) forced by ERA-Interim near-surface meteorology (Dee et al., 2011). Afterwards, the modelled soil moisture data are used as input into the SM2RAIN algorithm to test whether or not the resulting rainfall estimates are able to reproduce ERA-Interim rainfall data. Correlation, root mean square differences, and categorical scores were used to evaluate the goodness of the results. This analysis aims to draw a global picture of the performance of the SM2RAIN algorithm in the absence of errors in soil moisture and rainfall data. First preliminary results over Europe have shown that SM2RAIN performs particularly well over southern Europe (e.g., Spain, Italy and Greece), while its performance diminishes towards northern latitudes (Scandinavia) and over the Alps. Results on a global scale will be shown and discussed at the conference session. REFERENCES: Brocca, L., Melone, F., Moramarco, T., Wagner, W. (2013). A new method for rainfall estimation through soil moisture observations. Geophysical Research Letters, 40(5), 853-858. Brocca, L., Ciabatta, L., Massari, C., Moramarco, T., Hahn, S., Hasenauer, S., Kidd, R., Dorigo, W., Wagner, W., Levizzani, V. (2014). Soil as a natural rain gauge: estimating global rainfall from satellite soil moisture data. Journal of Geophysical Research, 119(9), 5128-5141. Chen, F., Crow, W.T., Ryu, D. (2014). Dual forcing and state correction via soil moisture assimilation for improved rainfall-runoff modeling. Journal of Hydrometeorology, 15, 1832-1848. Crow, W.T., van den Berg, M.J., Huffman, G.J., Pellarin, T. (2011). Correcting rainfall using satellite-based surface soil moisture retrievals: the Soil Moisture Analysis Rainfall Tool (SMART). Water Resources Research, 47, W08521. Dee, D.P., et al. (2011). The ERA-Interim reanalysis: configuration and performance of the data assimilation system. Quarterly Journal of the Royal Meteorological Society, 137, 553-597. Massari, C., Brocca, L., Moramarco, T., Tramblay, Y., Didon-Lescot, J.-F. (2014). Potential of soil moisture observations in flood modelling: estimating initial conditions and correcting rainfall. Advances in Water Resources, 74, 44-53.
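
    The SM2RAIN inversion itself is compact: the published formulation (Brocca et al., 2014) estimates rainfall as p(t) ≈ Z·ds/dt + a·s(t)^b, with evaporation and runoff neglected during rain. A sketch with illustrative, uncalibrated parameters:

    ```python
    import numpy as np

    def sm2rain(sm, dt=1.0, Z=80.0, a=15.0, b=1.8):
        """Invert the soil-water balance for rainfall, SM2RAIN-style.

        sm : relative saturation time series in [0, 1]
        dt : time step, in the units of Z's time base
        Z, a, b : soil capacity and drainage parameters; the values here are
        illustrative defaults, not calibrated ones from the papers.
        """
        dsdt = np.gradient(sm, dt)
        p = Z * dsdt + a * np.power(sm, b)
        return np.clip(p, 0.0, None)   # no negative rainfall
    ```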

  3. Advanced Stoichiometric Analysis of Metabolic Networks of Mammalian Systems

    PubMed Central

    Orman, Mehmet A.; Berthiaume, Francois; Androulakis, Ioannis P.; Ierapetritou, Marianthi G.

    2013-01-01

    Metabolic engineering tools have been widely applied to living organisms to gain a comprehensive understanding of cellular networks and to improve cellular properties. Metabolic flux analysis (MFA), flux balance analysis (FBA), and metabolic pathway analysis (MPA) are among the most popular tools in stoichiometric network analysis. Although application of these tools to well-known microbial systems is extensive in the literature, various barriers prevent them from being utilized in mammalian cells. Limited experimental data, complex regulatory mechanisms, and the requirement of more complex nutrient media are some major obstacles in mammalian cell systems. However, mammalian cells have been used to produce therapeutic proteins, to characterize disease states or related abnormal metabolic conditions, and to analyze the toxicological effects of some medicinally important drugs. Therefore, there is a growing need for extending metabolic engineering principles to mammalian cells in order to understand their underlying metabolic functions. In this review article, advanced metabolic engineering tools developed for stoichiometric analysis, including MFA, FBA, and MPA, are described. Applications of these tools in mammalian cells are discussed in detail, and the challenges and opportunities are highlighted. PMID:22196224
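
    Of the three methods, FBA reduces to a linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. A toy three-reaction example with scipy (the network and bounds are invented for illustration):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy pathway: uptake (v1) -> A, conversion (v2): A -> B, biomass (v3): B ->
    # Rows of S are internal metabolites A and B; columns are reactions v1-v3.
    S = np.array([[ 1, -1,  0],
                  [ 0,  1, -1]], dtype=float)
    bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds per reaction
    c = np.array([0.0, 0.0, -1.0])         # maximize v3 => minimize -v3

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal biomass flux:", res.x[2])   # 10: limited by the uptake bound
    ```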

  4. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    PubMed Central

    2013-01-01

    Background: Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods: We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results: In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions: There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807

  5. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    PubMed

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.

  6. A comparative analysis of Patient-Reported Expanded Disability Status Scale tools.

    PubMed

    Collins, Christian DE; Ivry, Ben; Bowen, James D; Cheng, Eric M; Dobson, Ruth; Goodin, Douglas S; Lechner-Scott, Jeannette; Kappos, Ludwig; Galea, Ian

    2016-09-01

    Patient-Reported Expanded Disability Status Scale (PREDSS) tools are an attractive alternative to the Expanded Disability Status Scale (EDSS) during long-term or geographically challenging studies, or in pressured clinical service environments. Because the studies reporting these tools have used different metrics to compare the PREDSS and EDSS, we undertook an individual patient data level analysis of all available tools. Spearman's rho and the Bland-Altman method were used to assess correlation and agreement respectively. A systematic search for validated PREDSS tools covering the full EDSS range identified eight such tools. Individual patient data were available for five PREDSS tools. Excellent correlation was observed between EDSS and PREDSS with all tools. A higher level of agreement was observed with increasing levels of disability. In all tools, the 95% limits of agreement were greater than the minimum EDSS difference considered to be clinically significant. However, the intra-class correlation coefficient was greater than that reported for EDSS raters of mixed seniority. The visual functional system was identified as the most significant predictor of the PREDSS-EDSS difference. This analysis will (1) enable researchers and service providers to make an informed choice of PREDSS tool, depending on their individual requirements, and (2) facilitate improvement of current PREDSS tools.
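
    The agreement statistics used here are simple to state: the Bland-Altman bias is the mean difference, and the 95% limits of agreement are the bias ± 1.96 standard deviations of the differences. A sketch on illustrative arrays (not study data):

    ```python
    import numpy as np

    def bland_altman(edss, predss):
        """Bias and 95% limits of agreement between rater EDSS and PREDSS."""
        diff = np.asarray(predss, float) - np.asarray(edss, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    # Illustrative scores only
    bias, loa = bland_altman([2.0, 3.5, 6.0, 4.5], [2.5, 3.0, 6.5, 4.5])
    ```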

  7. Physical Education Curriculum Analysis Tool (PECAT)

    ERIC Educational Resources Information Center

    Lee, Sarah M.; Wechsler, Howell

    2006-01-01

    The Physical Education Curriculum Analysis Tool (PECAT) will help school districts conduct a clear, complete, and consistent analysis of written physical education curricula, based upon national physical education standards. The PECAT is customizable to include local standards. The results from the analysis can help school districts enhance…

  8. Understanding and Using the Fermi Science Tools

    NASA Astrophysics Data System (ADS)

    Asercion, Joseph

    2018-01-01

    The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large-Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website, and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and the interpretation of results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point-source analysis (generating maps, spectra, and light curves), pulsar timing analysis, source identification, and the use of Python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.

  9. VStar: Variable star data visualization and analysis tool

    NASA Astrophysics Data System (ADS)

    VStar Team

    2014-07-01

    VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and performs time-frequency analysis with the Weighted Wavelet Z-Transform (WWZ). It offers tools for period analysis, filtering, and other functions.
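
    VStar itself is written in Java, but the phase plot it draws rests on a one-line folding operation, sketched here in Python purely for illustration:

    ```python
    import numpy as np

    def phase_fold(t, period, epoch=0.0):
        """Fold observation times on a trial period to build a phase plot:
        each time maps to a phase in [0, 1). Not VStar's own code."""
        return ((np.asarray(t, float) - epoch) / period) % 1.0

    # Observations 0.3 periods apart land 0.3 apart in phase
    print(phase_fold([0.0, 1.5, 3.0], period=5.0))   # [0.0, 0.3, 0.6]
    ```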

  10. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.

  11. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  12. Modes of Learning in Religious Education

    ERIC Educational Resources Information Center

    Afdal, Geir

    2015-01-01

    This article is a contribution to the discussion of learning processes in religious education (RE) classrooms. Sociocultural theories of learning, understood here as tool-mediated processes, are used in an analysis of three RE classroom conversations. The analysis focuses on the language tools that are used in conversations; how the tools mediate;…

  13. Paediatric Automatic Phonological Analysis Tools (APAT).

    PubMed

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the pediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). The content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools contributes to filling existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allowed the analysis of different corpora.

  14. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source, unified style, or common methodology. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, and remain an obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a public-facing web interface with tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.
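
    As an example of the limb-darkening piece mentioned above, here is a minimal Python sketch of the standard quadratic limb-darkening law, I(mu)/I(1) = 1 - u1(1 - mu) - u2(1 - mu)^2; the coefficients below are illustrative, not output from ExoCTK.

        # Minimal sketch of the quadratic limb-darkening law.
        import numpy as np

        def quadratic_limb_darkening(mu, u1, u2):
            """Relative intensity across the stellar disk; mu = cos(viewing angle)."""
            return 1.0 - u1 * (1.0 - mu) - u2 * (1.0 - mu) ** 2

        mu = np.linspace(0.0, 1.0, 11)
        print(quadratic_limb_darkening(mu, u1=0.4, u2=0.2))  # illustrative coefficients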

  15. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  16. 78 FR 45992 - National Science and Technology Council; Notice of Meeting: Open Meeting of the National Science...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-30

    ..., and the general public; analysis of the role of comparative risk assessment in these evaluations, including decision analysis tools and gap analysis tools; identification, through case study presentations...

  17. Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    2002-01-01

    Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools in the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the project formulation phase, but there is not enough guidance on how to apply these tools in the CRM process. Yet the way the safety and risk assessment tools are used makes a significant difference to the effectiveness of the risk management function. Decisions regarding which events are to be included in the analysis, and to what level of detail the analysis should be continued, make a significant difference to the effectiveness of the risk management program. The choice of risk analysis tool also depends on the phase of a project: for example, at the initial phase, when not much data are available on hardware, standard FMEA cannot be applied and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty in applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.

  18. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
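
    A minimal Python sketch of the kind of screening such a tool automates: compare each input variable's distribution in failed versus passed Monte Carlo runs and rank variables by the Kolmogorov-Smirnov statistic. The data layout and synthetic data are assumptions, not TRAM's actual algorithm.

        # Minimal sketch: rank candidate driving variables in Monte Carlo data.
        import numpy as np
        from scipy.stats import ks_2samp

        def rank_driving_variables(inputs, passed, names):
            """Return variables sorted by KS distance between pass/fail runs."""
            scores = []
            for j, name in enumerate(names):
                stat, _ = ks_2samp(inputs[passed, j], inputs[~passed, j])
                scores.append((stat, name))
            return sorted(scores, reverse=True)

        rng = np.random.default_rng(0)
        x = rng.normal(size=(5000, 3))                  # three input variables
        ok = x[:, 0] + 0.1 * rng.normal(size=5000) < 1  # failures driven by variable 0
        for stat, name in rank_driving_variables(x, ok, ["mass", "cg_offset", "drag"]):
            print(f"{name}: KS = {stat:.3f}")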

  19. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  20. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  1. SWATH2stats: An R/Bioconductor Package to Process and Convert Quantitative SWATH-MS Proteomics Data for Downstream Analysis Tools.

    PubMed

    Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi

    2016-01-01

    SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.

  2. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  3. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, the analysis was done manually with Excel and the UNIX command line. In total, the process would take approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. This tool resides on the flight machine and uses a PHP interface that does the entire analysis of the input files and takes into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via a Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions can also use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and of the keep-out window times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
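
    A minimal Python sketch (not the PHP original) of the tool's core step as described above: shift one event list by the one-way light time, pad MRO events into keep-out windows, and merge everything into a single time-ordered listing. The field layout and the fixed light-time value are assumptions.

        # Minimal sketch: merge two spacecraft event lists into one timeline.
        from datetime import datetime, timedelta

        def merge_events(mro_events, mer_events, owlt_s, pad_s):
            """Return a single time-ordered list of (time, spacecraft, label)."""
            merged = []
            for t, label in mro_events:
                # pad each MRO event into a keep-out window
                merged.append((t - timedelta(seconds=pad_s), "MRO", label + " (window open)"))
                merged.append((t + timedelta(seconds=pad_s), "MRO", label + " (window close)"))
            for t, label in mer_events:
                # convert to Earth Transmit Time by removing the one-way light time
                merged.append((t - timedelta(seconds=owlt_s), "MER", label))
            return sorted(merged, key=lambda e: e[0])

        mro = [(datetime(2008, 3, 1, 12, 0, 0), "overflight start")]
        mer = [(datetime(2008, 3, 1, 12, 10, 0), "X-band uplink")]
        for t, craft, label in merge_events(mro, mer, owlt_s=780.0, pad_s=300.0):
            print(t.isoformat(), craft, label)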

  4. EZ and GOSSIP, two new VO compliant tools for spectral analysis

    NASA Astrophysics Data System (ADS)

    Franzetti, P.; Garilli, B.; Fumana, M.; Paioro, L.; Scodeggio, M.; Paltani, S.; Scaramella, R.

    2008-10-01

    We present EZ and GOSSIP, two new VO compliant tools dedicated to spectral analysis. EZ is a tool to perform automatic redshift measurement; GOSSIP is a tool created to perform the SED fitting procedure in a simple, user-friendly and efficient way. These two tools have been developed by the PANDORA Group at INAF-IASF (Milano); EZ has been developed in collaboration with Osservatorio Monte Porzio (Roma) and the Integral Science Data Centre (Geneva). EZ is released to the astronomical community; GOSSIP is currently in beta-testing.

  5. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. It provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  6. General Mission Analysis Tool (GMAT) Mathematical Specifications

    NASA Technical Reports Server (NTRS)

    Hughes, Steve

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system developed by NASA and private industry in the spirit of the NASA Mission. GMAT contains new technology and is a testbed for future technology development.

  7. Computerized power supply analysis: State equation generation and terminal models

    NASA Technical Reports Server (NTRS)

    Garrett, S. J.

    1978-01-01

    To aid engineers who design power supply systems, two analysis tools that can be used with the state equation analysis package were developed. These tools include integration routines that start with the description of a power supply in state equation form and yield analytical results. The first tool uses a computer program that works with the SUPER SCEPTRE circuit analysis program and prints the state equations for an electrical network. The state equations developed automatically by the computer program are used to develop an algorithm for reducing the number of state variables required to describe an electrical network. In this way a second tool is obtained in which the order of the network is reduced and a simpler terminal model is obtained.

  8. Network Analysis Tools: from biological networks to clusters and pathways.

    PubMed

    Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Vanderstocken, Gilles; van Helden, Jacques

    2008-01-01

    Network Analysis Tools (NeAT) is a suite of computer tools that integrate various algorithms for the analysis of biological networks: comparison between graphs, between clusters, or between graphs and clusters; network randomization; analysis of degree distribution; network-based clustering and path finding. The tools are interconnected to enable a stepwise analysis of the network through a complete analytical workflow. In this protocol, we present a typical case of utilization, where the tasks above are combined to decipher a protein-protein interaction network retrieved from the STRING database. The results returned by NeAT are typically subnetworks, networks enriched with additional information (i.e., clusters or paths) or tables displaying statistics. Typical networks comprising several thousands of nodes and arcs can be analyzed within a few minutes. The complete protocol can be read and executed in approximately 1 h.
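
    For readers who want a local, scriptable approximation of this stepwise workflow, here is a minimal Python sketch using the networkx library; the edge-list file and node identifiers are placeholders, and connected components stand in for NeAT's more sophisticated clustering algorithms.

        # Minimal sketch of a NeAT-style workflow: load a network, inspect its
        # degree distribution, cluster it crudely, and find a path.
        import networkx as nx
        from collections import Counter

        g = nx.read_edgelist("string_interactions.tsv", delimiter="\t")

        # degree distribution
        degree_counts = Counter(dict(g.degree()).values())
        for k in sorted(degree_counts):
            print(f"degree {k}: {degree_counts[k]} nodes")

        # network-based clustering: connected components as a crude starting point
        clusters = sorted(nx.connected_components(g), key=len, reverse=True)
        print("largest cluster size:", len(clusters[0]))

        # path finding between two proteins (node identifiers are illustrative)
        print(nx.shortest_path(g, source="P53_HUMAN", target="MDM2_HUMAN"))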

  9. OpenEIS. Developer Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutes, Robert G.; Neubauer, Casey C.; Haack, Jereme N.

    2015-03-31

    The Department of Energy’s (DOE’s) Building Technologies Office (BTO) is supporting the development of an open-source software tool for analyzing building energy and operational data: OpenEIS (open energy information system). This tool addresses the problems of both owners of building data and developers of tools to analyze this data. Building owners and managers have data but lack the tools to analyze it, while tool developers lack data in a common format to ease development of reusable data analysis tools. This document is intended for developers of applications and explains the mechanisms for building analysis applications, accessing data, and displaying data using a visualization from the included library. A brief introduction to the visualizations can be used as a jumping-off point for developers familiar with JavaScript to produce their own. Several example applications are included which can be used along with this document to implement algorithms for performing energy data analysis.

  10. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  11. GREAT: a web portal for Genome Regulatory Architecture Tools.

    PubMed

    Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François

    2016-07-08

    GREAT (Genome REgulatory Architecture Tools) is a novel web portal for tools designed to generate user-friendly and biologically useful analysis of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system which runs a modern browser. GREAT is based on the analysis of genome layout, defined as the respective positioning of co-functional genes, and its relation with chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automatic way consisting of three individual steps and respective interactive visualizations. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information for improving the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in web interactive graphs and are available for download either as individual plots, self-contained interactive pages or as machine readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for downloading. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  13. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  14. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
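
    A minimal Python sketch of one of the global methods listed in both records above, the partial rank correlation coefficient (PRCC): rank-transform the sampled parameters and the model output, then correlate each parameter's residuals against the output's residuals after regressing out the other parameters. The sampled data here are synthetic.

        # Minimal sketch of PRCC for global sensitivity analysis.
        import numpy as np
        from scipy.stats import rankdata

        def prcc(params, output):
            """PRCC of each parameter column against the model output."""
            r_params = np.column_stack([rankdata(c) for c in params.T])
            r_out = rankdata(output)
            n, k = r_params.shape
            coeffs = []
            for j in range(k):
                others = np.column_stack([np.ones(n), np.delete(r_params, j, axis=1)])
                res_x = r_params[:, j] - others @ np.linalg.lstsq(others, r_params[:, j], rcond=None)[0]
                res_y = r_out - others @ np.linalg.lstsq(others, r_out, rcond=None)[0]
                coeffs.append(np.corrcoef(res_x, res_y)[0, 1])
            return coeffs

        rng = np.random.default_rng(1)
        p = rng.uniform(size=(200, 3))   # Latin hypercube samples would be typical
        y = 2.0 * p[:, 0] - p[:, 2] ** 2 + 0.05 * rng.normal(size=200)
        print(prcc(p, y))                # parameter 0 dominates, parameter 1 is inert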

  15. Effects of machining parameters on tool life and its optimization in turning mild steel with brazed carbide cutting tool

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.; Mukherjee, S.

    2016-09-01

    One of the most significant factors in metal cutting is tool life. In this research work, the effects of machining parameters on tool life under a wet machining environment were studied. Tool life characteristics of a brazed carbide cutting tool machining mild steel were examined, and the machining parameters were optimized based on a Taguchi design of experiments. The experiments were conducted using three factors, spindle speed, feed rate and depth of cut, each having three levels. Nine experiments were performed on a high speed semi-automatic precision central lathe. ANOVA was used to determine the level of importance of the machining parameters on tool life. The optimum machining parameter combination was obtained by the analysis of the S/N ratio. A mathematical model based on multiple regression analysis was developed to predict the tool life. Taguchi's orthogonal array analysis revealed the optimal combination of parameters at the lower levels of spindle speed, feed rate and depth of cut, which are 550 rpm, 0.2 mm/rev and 0.5 mm respectively. The Main Effects plot reiterated the same. The variation of tool life with different process parameters has been plotted. Feed rate has the most significant effect on tool life, followed by spindle speed and depth of cut.
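
    A minimal Python sketch of the S/N-ratio step in such an analysis: tool life is a larger-the-better response, so S/N = -10 log10(mean(1/y^2)) per run. The replicate values below are illustrative, not the paper's measurements.

        # Minimal sketch of Taguchi S/N ratios for a larger-the-better response.
        import numpy as np

        def sn_larger_is_better(y):
            """Taguchi signal-to-noise ratio for a larger-the-better response."""
            y = np.asarray(y, dtype=float)
            return -10.0 * np.log10(np.mean(1.0 / y ** 2))

        runs = {
            "550 rpm, 0.2 mm/rev, 0.5 mm": [41.0, 43.5, 40.2],   # tool life, minutes (illustrative)
            "950 rpm, 0.4 mm/rev, 1.0 mm": [18.3, 17.1, 19.0],
        }
        for setting, life in runs.items():
            print(f"{setting}: S/N = {sn_larger_is_better(life):.2f} dB")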

  16. HUMAN HEALTH METRICS FOR ENVIRONMENTAL DECISION SUPPORT TOOLS: LESSONS FROM HEALTH ECONOMICS AND DECISION ANALYSIS: PUBLISHED REPORT

    EPA Science Inventory

    NRMRL-CIN-1351A Hofstetter, P., and Hammitt, J. K. Human Health Metrics for Environmental Decision Support Tools: Lessons from Health Economics and Decision Analysis. EPA/600/R-01/104 (NTIS PB2002-102119). Decision makers using environmental decision support tools are often ...

  17. Rotorcraft performance data for AEDT : Methods of using the NASA Design and Analysis of Rotorcraft tool for developing data for AEDT's Rotorcraft Performance Model

    DOT National Transportation Integrated Search

    2016-09-01

    This report documents use of the NASA Design and Analysis of Rotorcraft (NDARC) helicopter performance software tool in developing data for the FAA's Aviation Environmental Design Tool (AEDT). These data support the Rotorcraft Performance Model (RP...

  18. 77 FR 4638 - Defense Federal Acquisition Regulation Supplement; Performance-Based Payments (DFARS Case 2011-D045)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-30

    ... tool. The PBP analysis tool is a cash-flow model for evaluating alternative financing arrangements, and ... that reflects adequate consideration to the Government for the improved contractor cash flow ...

  19. 3D TRUMP - A GBI launch window tool

    NASA Astrophysics Data System (ADS)

    Karels, Steven N.; Hancock, John; Matchett, Gary

    3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. 3D TRUMP uses a computationally efficient analysis method which provides key GPS-based performance measures for an entire GBI mission's reentry vehicle and interceptor trajectories. Algorithms and sample outputs are presented.

  20. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    ERIC Educational Resources Information Center

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  1. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers. Tool support is especially essential when maintaining large-scale legacy systems. Reverse engineering tools provide various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  2. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.

  3. Two New Tools for Glycopeptide Analysis Researchers: A Glycopeptide Decoy Generator and a Large Data Set of Assigned CID Spectra of Glycopeptides.

    PubMed

    Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather

    2017-08-04

    The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.

  4. The EMBL-EBI bioinformatics web and programmatic tools framework.

    PubMed

    Li, Weizhong; Cowley, Andrew; Uludag, Mahmut; Gur, Tamer; McWilliam, Hamish; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Lopez, Rodrigo

    2015-07-01

    Since 2009 the EMBL-EBI Job Dispatcher framework has provided free access to a range of mainstream sequence analysis applications. These include sequence similarity search services (https://www.ebi.ac.uk/Tools/sss/) such as BLAST, FASTA and PSI-Search, multiple sequence alignment tools (https://www.ebi.ac.uk/Tools/msa/) such as Clustal Omega, MAFFT and T-Coffee, and other sequence analysis tools (https://www.ebi.ac.uk/Tools/pfa/) such as InterProScan. Through these services users can search mainstream sequence databases such as ENA, UniProt and Ensembl Genomes, utilising a uniform web interface or systematically through Web Services interfaces (https://www.ebi.ac.uk/Tools/webservices/) using common programming languages, and obtain enriched results with novel visualisations. Integration with EBI Search (https://www.ebi.ac.uk/ebisearch/) and the dbfetch retrieval service (https://www.ebi.ac.uk/Tools/dbfetch/) further expands the usefulness of the framework. New tools and updates such as NCBI BLAST+, InterProScan 5 and PfamScan, new categories such as RNA analysis tools (https://www.ebi.ac.uk/Tools/rna/), new databases such as ENA non-coding, WormBase ParaSite, Pfam and Rfam, and new workflow methods, together with the retirement of deprecated services, ensure that the framework remains relevant to today's biological community. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
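
    A minimal Python sketch of programmatic access through such a REST interface follows. The exact endpoint path and parameter names below are assumptions modelled on the framework's run/status/result pattern; consult https://www.ebi.ac.uk/Tools/webservices/ for the authoritative interface.

        # Minimal sketch: submit a sequence search job and poll for the result.
        # Endpoint paths and parameter names are assumptions, not verified API.
        import time
        import requests

        BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

        job = requests.post(f"{BASE}/run", data={
            "email": "user@example.org",          # contact address (placeholder)
            "program": "blastp",
            "stype": "protein",
            "database": "uniprotkb_swissprot",
            "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        })
        job.raise_for_status()
        job_id = job.text.strip()

        while requests.get(f"{BASE}/status/{job_id}").text.strip() == "RUNNING":
            time.sleep(5)                          # poll politely

        print(requests.get(f"{BASE}/result/{job_id}/out").text[:500])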

  5. Scientific Platform as a Service - Tools and solutions for efficient access to and analysis of oceanographic data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hansen, Morten W.; Korosov, Anton

    2017-04-01

    Existing infrastructure international and Norwegian projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the conventions for CF (Climate and Forecast) metadata, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This then allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combinnation of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog to store granular metadata describing the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, of which provisioning and deployment are automatized using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).

  6. U.S. Geological Survey ArcMap Sediment Classification tool

    USGS Publications Warehouse

    O'Malley, John

    2007-01-01

    The U.S. Geological Survey (USGS) ArcMap Sediment Classification tool is a custom toolbar that extends the Environmental Systems Research Institute, Inc. (ESRI) ArcGIS 9.2 Desktop application to aid in the analysis of seabed sediment classification. The tool uses as input either a point data layer with field attributes containing percentages of gravel, sand, silt, and clay, or four raster data layers each representing a percentage of sediment (0-100%) for the various grain-size fractions: sand, gravel, silt and clay. This tool is designed to analyze the percentage of each sediment fraction at a given location and classify the sediments according to either the Folk (1954, 1974) or the Shepard (1954), as modified by Schlee (1973), classification schemes. The sediment analysis tool is based upon the USGS SEDCLASS program (Poppe, et al. 2004).
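
    A minimal, deliberately simplified Python sketch of Shepard-style classification from sand/silt/clay percentages; the real SEDCLASS logic distinguishes many more mixed classes, and the thresholds here are illustrative only.

        # Very coarse sketch of Shepard-style ternary classification.
        def shepard_class(sand, silt, clay):
            """Simplified Shepard-style label from percentages summing to ~100."""
            total = sand + silt + clay
            sand, silt, clay = (100.0 * v / total for v in (sand, silt, clay))
            if sand >= 75:
                return "sand"
            if silt >= 75:
                return "silt"
            if clay >= 75:
                return "clay"
            parts = sorted([(sand, "sand"), (silt, "silt"), (clay, "clay")], reverse=True)
            return f"{parts[1][1]}-{parts[0][1]} mixture"  # e.g. "sand-silt mixture"

        print(shepard_class(82, 12, 6))   # -> sand
        print(shepard_class(40, 45, 15))  # -> sand-silt mixture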

  7. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  8. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of identified features. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares - discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator LASSO and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
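
    A minimal Python sketch of one technique named above, principal component analysis on a samples-by-features intensity table; the synthetic matrix and preprocessing choices are assumptions, not SECIMTools code.

        # Minimal sketch of PCA on a metabolomics-style intensity table.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)
        intensities = rng.lognormal(mean=2.0, sigma=0.5, size=(24, 150))  # 24 samples x 150 features

        scaled = StandardScaler().fit_transform(np.log2(intensities))     # log-transform, then scale
        pca = PCA(n_components=2)
        scores = pca.fit_transform(scaled)

        print("explained variance ratio:", pca.explained_variance_ratio_)
        print("first sample scores:", scores[0])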

  9. Cost/Schedule Control Systems Criteria: A Reference Guide to C/SCSC information

    DTIC Science & Technology

    1992-09-01

    Smith, Larry A. "Mainframe ARTEMIS: More than a Project Management Tool -- Earned Value Analysis (PEVA)," Project Management Journal, 19:23-28 (April 1988). Trufant, Thomas M. and Robert ...

  10. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations.

  11. Interactive Planning under Uncertainty with Causal Modeling and Analysis

    DTIC Science & Technology

    2006-01-01

    Tool (CAT), a system for creating and analyzing causal models similar to Bayes networks. In order to use CAT as a tool for planning, users go through ... an iterative process in which they use CAT to create and analyze alternative plans. One of the biggest difficulties is that the number of possible ... Causal Analysis Tool (CAT), which is a tool for representing and analyzing causal networks similar to Bayesian networks. In order to represent plans

  12. Design and analysis of lifting tool assemblies to lift different engine block

    NASA Astrophysics Data System (ADS)

    Sawant, Arpana; Deshmukh, Nilaj N.; Chauhan, Santosh; Dabhadkar, Mandar; Deore, Rupali

    2017-07-01

    Engine blocks need to be lifted from one place to another while they are being processed. The human effort required for this purpose is considerable, and the engine block may get damaged if it is not handled properly. There is a need to design a proper lifting tool which will be able to conveniently lift the engine block and place it at the desired position without any accident or damage to the engine block. In the present study, lifting tool assemblies are designed and analyzed in such a way that they may lift different categories of engine blocks. The lifting tool assembly consists of a lifting plate, lifting ring, cap screws and washers. A parametric model and assembly of the lifting tool are created in the 3D modelling software CREO 2.0 and analysis is carried out in ANSYS Workbench 16.0. A test block of weight equivalent to that of an engine block is considered for the purpose of analysis. In the preliminary study, without washers, the stresses obtained on the lifting tool were beyond the safety margin. In the present design, washers of appropriate dimensions are used, which helps to bring the stresses on the lifting tool within the safety margin. Analysis is carried out to verify that the tool design meets the safety margin required by ASME BTH-1.

  13. Integrated Sensitivity Analysis Workflow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest J.; Hoffman, Edward L.; Gibson, Marcus J.

    2014-08-01

    Sensitivity analysis is a crucial element of rigorous engineering analysis, but performing such an analysis on a complex model is difficult and time consuming. The mission of the DART Workbench team at Sandia National Laboratories is to lower the barriers to adoption of advanced analysis tools through software integration. The integrated environment guides the engineer in the use of these integrated tools and greatly reduces the cycle time for engineering analysis.

  14. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components: a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  15. Radiation Mitigation and Power Optimization Design Tools for Reconfigurable Hardware in Orbit

    NASA Technical Reports Server (NTRS)

    French, Matthew; Graham, Paul; Wirthlin, Michael; Wang, Li; Larchev, Gregory

    2005-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. In the second year of the project, design tools that leverage an established FPGA design environment have been created to visualize and analyze an FPGA circuit for radiation weaknesses and power inefficiencies. For radiation, a single-event upset (SEU) emulator, a persistence analysis tool, and a half-latch removal tool for Xilinx/Virtex-II devices have been created. Research is underway on a persistence mitigation tool and multiple-bit upset (MBU) studies. For power, synthesis-level dynamic power visualization and analysis tools have been completed. Power optimization tools are under development and preliminary test results are positive.

  16. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of tools targeted the Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  17. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The MathWorks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.
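
    MEA-Tools itself is MATLAB-based; purely for illustration, here is a minimal Python sketch of the basic operation such toolboxes provide, threshold-based spike detection on one extracellular channel. The k-times-noise-SD threshold rule is a common convention, not necessarily the toolbox's exact method.

        # Minimal sketch of threshold-based extracellular spike detection.
        import numpy as np

        def detect_spikes(trace, fs, k=5.0, refractory_ms=1.0):
            """Return spike times (s) where the signal crosses -k*SD downward."""
            threshold = -k * np.std(trace)
            crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold))
            spikes, last = [], -np.inf
            for idx in crossings:
                t = idx / fs
                if t - last >= refractory_ms / 1000.0:  # enforce a refractory period
                    spikes.append(t)
                    last = t
            return np.array(spikes)

        fs = 25000.0
        trace = np.random.default_rng(7).normal(0, 10e-6, int(fs))  # 1 s of noise (V)
        trace[12500] = -120e-6                                      # one injected spike
        print(detect_spikes(trace, fs))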

  18. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.

    PubMed

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - graphical user interface; XAML - Extensible Application Markup Language.
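
    A minimal Python sketch of the Nussinov base-pair maximization that RDNAnalyzer extends: O(n^3) dynamic programming that counts the maximum number of complementary pairs (no pseudoknots, minimum loop length of one).

        # Minimal sketch of Nussinov base-pair maximization for a DNA sequence.
        def nussinov_max_pairs(seq, min_loop=1):
            pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i][j - 1]                      # j unpaired
                    for k in range(i, j - min_loop):
                        if (seq[k], seq[j]) in pairs:        # pair k with j
                            left = dp[i][k - 1] if k > i else 0
                            best = max(best, left + 1 + dp[k + 1][j - 1])
                    dp[i][j] = best
            return dp[0][n - 1] if n else 0

        print(nussinov_max_pairs("GGGAAATCCC"))  # GGG...CCC stem plus one A-T pair -> 4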

  19. Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning

    NASA Astrophysics Data System (ADS)

    Cui, J.; Dong, B.; Li, J.; Li, L.

    2017-09-01

    As a fundamental task in urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors and loss of data precision, and current urban planning practice lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for this work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and the related maps and graphs, including the intensity values, the zoning map, and the skyline analysis map, are produced automatically. Finally, the tool is installation-free and can be dispatched quickly between planning teams.

  20. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    PubMed

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meaning in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
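
    Two of the steps listed above, normalization and the volcano plot, reduce to a few lines with standard scientific Python libraries. The sketch below is a generic illustration that assumes a DataFrame of log2 protein intensities with three replicates per condition; it is not PANDA-view's implementation.

      # Median normalization and a volcano plot on toy log2-intensity data;
      # a generic sketch, not PANDA-view's code.
      import numpy as np
      import pandas as pd
      from scipy import stats
      import matplotlib.pyplot as plt

      def median_normalize(df):
          """Shift each sample (column) so that all sample medians agree."""
          return df - df.median(axis=0) + df.median(axis=0).mean()

      def volcano(df, group_a, group_b):
          """Per-protein log2 fold change and -log10 p-value (rows = proteins)."""
          a, b = df[group_a], df[group_b]
          log2_fc = a.mean(axis=1) - b.mean(axis=1)   # data already in log2
          t, p = stats.ttest_ind(a, b, axis=1)
          return log2_fc, -np.log10(p)

      rng = np.random.default_rng(0)  # toy data: 200 proteins, 6 samples
      df = pd.DataFrame(rng.normal(20, 2, (200, 6)),
                        columns=["A1", "A2", "A3", "B1", "B2", "B3"])
      fc, neglogp = volcano(median_normalize(df),
                            ["A1", "A2", "A3"], ["B1", "B2", "B3"])
      plt.scatter(fc, neglogp, s=8)
      plt.xlabel("log2 fold change"); plt.ylabel("-log10 p")
      plt.show()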

  1. C++ software quality in the ATLAS experiment: tools and experience

    NASA Astrophysics Data System (ADS)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how C++ code quality is managed in ATLAS using a range of tools, from compile-time checks through to run-time testing, and reflect on the substantial progress made in the last two years, largely through the use of static analysis tools such as Coverity®, an industry-standard tool that enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing, with an example of how the GoogleTest framework can be applied to our codebase.

  2. Analysis of the electromagnetic wave resistivity tool in deviated well drilling

    NASA Astrophysics Data System (ADS)

    Zhang, Yumei; Xu, Lijun; Cao, Zhang

    2014-04-01

    Electromagnetic wave resistivity (EWR) tools are used to provide real-time measurements of resistivity in the formation around the tool in Logging While Drilling (LWD). In this paper, the acquired resistivity information is analyzed to extract additional information, including the dipping angle and azimuth direction of the drill. A finite element (FE) model of the EWR tool working in layered earth formations is established. Numerical analysis and FE simulations are employed to analyze the amplitude ratio and phase difference between the voltages measured at the two receivers of the EWR tool in deviated well drilling.
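
    The two measured quantities named in the abstract are simple functions of the complex receiver voltages. The snippet below shows the standard reduction; the variable names and example values are ours, not the authors'.

      # Reduce two receiver phasor voltages to amplitude ratio and phase
      # difference; illustrative values, not from the paper.
      import numpy as np

      def amplitude_ratio_and_phase(v_near, v_far):
          """v_near, v_far: complex phasor voltages at the two receivers."""
          ratio = v_far / v_near
          amp_ratio_db = 20 * np.log10(np.abs(ratio))    # attenuation, dB
          phase_diff_deg = np.degrees(np.angle(ratio))   # phase shift, degrees
          return amp_ratio_db, phase_diff_deg

      print(amplitude_ratio_and_phase(1.0 + 0.0j, 0.5 + 0.1j))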

  3. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  4. Water Power Data and Tools | Water Power | NREL

    Science.gov Websites

    NREL combines computer modeling tools and data with state-of-the-art design and analysis for water power technologies. WEC-Sim information, including a WEC-Sim fact sheet, is available through the National Wind Technology Center's Information Portal. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools specifically for wave energy converter design.

  5. RdTools: An Open Source Python Library for PV Degradation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deceglie, Michael G; Jordan, Dirk; Nag, Ambarish

    RdTools is a set of Python tools for the analysis of photovoltaic data. In particular, PV production data are evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher-frequency data.
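
    A typical year-on-year degradation calculation with RdTools looks like the sketch below. The package-level function name and return values reflect the RdTools 2.x API as we understand it and should be checked against the project documentation before use.

      # Year-on-year degradation on synthetic normalized energy; function name
      # and signature are assumptions based on the RdTools 2.x documentation.
      import numpy as np
      import pandas as pd
      import rdtools

      # five years of daily normalized energy with a -0.5 %/yr trend plus noise
      idx = pd.date_range("2015-01-01", "2019-12-31", freq="D")
      rng = np.random.default_rng(0)
      trend = 1 - 0.005 * (np.arange(idx.size) / 365.25)
      energy_normalized = pd.Series(trend + rng.normal(0, 0.01, idx.size),
                                    index=idx)

      rd, rd_ci, info = rdtools.degradation_year_on_year(energy_normalized)
      print(f"Rd = {rd:.2f} %/yr, confidence interval = {rd_ci}")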

  6. Stressed Sebastes: A Trait-Based Evaluation of Climate Risks to Rockfishes of the Northeastern Pacific Using the Coastal Biogeographic Risk Analysis Tool (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework is implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the ...

  7. Stressed Sebastes: A Trait-Based Evaluation of Climate Risks to Rockfishes of the Northeastern Pacific Using the Coastal Biogeographic Risk Analysis Tool (CBRAT)

    EPA Science Inventory

    The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework was implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the...

  8. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  9. A new spatial multi-criteria decision support tool for site selection for implementation of managed aquifer recharge.

    PubMed

    Rahman, M Azizur; Rusteberg, Bernd; Gogu, R C; Lobo Ferreira, J P; Sauter, Martin

    2012-05-30

    This study reports the development of a new spatial multi-criteria decision analysis (SMCDA) software tool for selecting suitable sites for Managed Aquifer Recharge (MAR) systems. The new SMCDA software tool functions based on the combination of existing multi-criteria evaluation methods with modern decision analysis techniques. More specifically, non-compensatory screening, criteria standardization and weighting, and the Analytical Hierarchy Process (AHP) have been combined with Weighted Linear Combination (WLC) and Ordered Weighted Averaging (OWA). This SMCDA tool may be implemented with a wide range of decision-maker preferences. The tool's user-friendly interface helps guide the decision maker through the sequential steps for site selection, namely constraint mapping, criteria hierarchy, criteria standardization and weighting, and criteria overlay. The tool offers some predetermined default criteria and standard methods to improve the balance between ease of use and efficiency. Integrated into ArcGIS, the tool has the advantage of using GIS tools for spatial analysis, and herein data may be processed and displayed. The tool is non-site-specific, adaptive, and comprehensive, and may be applied to any type of site-selection problem. To demonstrate the robustness of the new tool, a case study was planned and executed in the Algarve Region, Portugal. The efficiency of the SMCDA tool in the decision-making process for selecting suitable sites for MAR was also demonstrated. Specific aspects of the tool such as built-in default criteria, explicit decision steps, and flexibility in choosing different options were key features which benefited the study. The new SMCDA tool can be augmented by groundwater flow and transport modeling so as to achieve a more comprehensive approach to the selection of the best locations for the MAR infiltration basins, as well as the locations of recovery wells and areas of groundwater protection. The new spatial multicriteria analysis tool has already been implemented within the GIS-based Gabardine decision support system as an innovative MAR planning tool. Copyright © 2012 Elsevier Ltd. All rights reserved.
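
    The Weighted Linear Combination step at the core of such a tool is easy to state in code: standardize each criterion layer, weight it (for example with AHP-derived weights), and sum. The NumPy sketch below uses toy arrays in place of GIS rasters and is not the SMCDA tool's implementation; the criterion names are hypothetical.

      # Weighted Linear Combination over toy criterion "layers"; a sketch of
      # the WLC idea, not the SMCDA tool's code.
      import numpy as np

      def standardize(layer):
          """Rescale a criterion layer to 0..1 (higher = more suitable)."""
          return (layer - layer.min()) / (layer.max() - layer.min())

      def wlc(layers, weights):
          """Weighted linear combination of standardized criterion layers."""
          weights = np.asarray(weights, dtype=float)
          weights = weights / weights.sum()              # weights sum to 1
          stack = np.stack([standardize(l) for l in layers])
          return np.tensordot(weights, stack, axes=1)    # suitability surface

      rng = np.random.default_rng(1)
      slope, permeability, depth = (rng.random((4, 4)) for _ in range(3))
      suitability = wlc([slope, permeability, depth], weights=[0.2, 0.5, 0.3])
      print(suitability.round(2))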

  10. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond.

    PubMed

    Bible, Paul W; Kanno, Yuka; Wei, Lai; Brooks, Stephen R; O'Shea, John J; Morasso, Maria I; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST's functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST's general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work.

  11. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    PubMed Central

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards co-localization analysis based exploratory research. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general purpose features then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601
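
    The basic operation underlying peak-centric co-localization is an interval-overlap test between two sets of genomic regions. A minimal Python sketch of that comparison follows; PAPST itself is a Java application, so this only illustrates the idea, with made-up peak coordinates.

      # Count peaks in set A that overlap at least one peak in set B;
      # a sketch of the co-localization primitive, not PAPST's code.
      from collections import defaultdict

      def count_colocalized(peaks_a, peaks_b):
          """peaks_*: iterables of (chrom, start, end) half-open intervals."""
          by_chrom = defaultdict(list)
          for c, s, e in peaks_b:
              by_chrom[c].append((s, e))
          return sum(
              any(bs < e and s < be for bs, be in by_chrom[c])
              for c, s, e in peaks_a
          )

      tf_peaks = [("chr1", 100, 200), ("chr1", 500, 600), ("chr2", 50, 80)]
      mark_peaks = [("chr1", 150, 180), ("chr2", 90, 120)]
      print(count_colocalized(tf_peaks, mark_peaks))  # 1: first chr1 peak only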

  12. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.

  13. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338
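
    A typical interaction with these services follows a submit/poll/retrieve pattern over REST. The Python sketch below targets the NCBI BLAST service; the endpoint paths, parameter names, and database identifier follow the EMBL-EBI job dispatcher conventions as we recall them and should be verified against the Web Services documentation linked above.

      # Submit/poll/retrieve against the EMBL-EBI REST services; endpoint and
      # parameter names are assumptions to check against the documentation.
      import time
      import requests

      BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

      job = requests.post(f"{BASE}/run", data={
          "email": "you@example.org",          # required by the usage policy
          "program": "blastp",
          "stype": "protein",
          "database": "uniprotkb_swissprot",   # assumed database identifier
          "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
      })
      job_id = job.text.strip()

      while requests.get(f"{BASE}/status/{job_id}").text.strip() == "RUNNING":
          time.sleep(5)                        # poll politely

      print(requests.get(f"{BASE}/result/{job_id}/out").text[:500])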

  14. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package RotorCraft Optimization Tools (RCOTOOLS) is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including graphical user interface (GUI) tools for browsing input and output files in order to locate the text strings that identify specific variables as optimization inputs and response variables. This paper provides an overview of RCOTOOLS and its use.

  15. Lifecycle Industry GreenHouse gas, Technology and Energy through the Use Phase (LIGHTEnUP) – Analysis Tool User’s Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, William R.; Shehabi, Arman; Smith, Sarah

    The LIGHTEnUP Analysis Tool (Lifecycle Industry GreenHouse gas, Technology and Energy through the Use Phase) has been developed for the United States Department of Energy's (U.S. DOE) Advanced Manufacturing Office (AMO) to forecast both the manufacturing-sector and product life-cycle energy consumption implications of manufactured products across the U.S. economy. The tool architecture incorporates publicly available historic and projection datasets of U.S. economy-wide energy use, including manufacturing, buildings operations, electricity generation and transportation. The tool requires minimal inputs to define alternate scenarios to business-as-usual projection data. The tool is not an optimization or equilibrium model and therefore does not select technologies or deployment scenarios endogenously. Instead, inputs are developed exogenously to the tool by the user to reflect detailed engineering calculations, future targets and goals, or creative insights. The tool projects the scenario's energy, CO2 emissions, and energy expenditure (i.e., economic spending to purchase energy) implications and provides documentation to communicate results. The tool provides a transparent and uniform system for comparing manufacturing and use-phase impacts of technologies. The tool allows the user to create multiple scenarios that can reflect a range of possible future outcomes. However, reasonable scenarios require careful attention to assumptions and details about the future. This tool is part of an emerging set of AMO life cycle analysis (LCA) tools, such as the Material Flows through Industry (MFI) tool and the Additive Manufacturing LCA tool.

  16. Statistical methods for the forensic analysis of striated tool marks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoeksema, Amy Beth

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts, which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research into more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.

  17. Addressing multi-label imbalance problem of surgical tool detection using CNN.

    PubMed

    Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan

    2017-06-01

    A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task where tool co-occurrences are treated as separate classes. In addition, imbalance on tool co-occurrences is analyzed and stratification techniques are employed to address the imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis on tool imbalance, backed by the empirical results, indicates the need and superiority of the proposed framework over state-of-the-art techniques.
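
    Temporal smoothing of the per-frame, multi-label predictions can be as simple as an exponential moving average applied online. The sketch below illustrates that post-processing idea on a toy probability stream; the smoothing factor and threshold are our assumptions, not the authors' exact scheme.

      # Online exponential-moving-average smoothing of multi-label tool
      # probabilities; an illustration of the idea, not the paper's method.
      import numpy as np

      def smooth_stream(frame_probs, alpha=0.3, threshold=0.5):
          """frame_probs: iterable of (n_tools,) probability vectors in time order."""
          state = None
          for p in frame_probs:
              p = np.asarray(p, dtype=float)
              state = p if state is None else alpha * p + (1 - alpha) * state
              yield (state >= threshold).astype(int)  # per-tool presence calls

      stream = [[0.9, 0.1], [0.2, 0.2], [0.8, 0.3], [0.7, 0.9]]
      print([s.tolist() for s in smooth_stream(stream)])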

  18. Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It produces a comprehensive analysis that compares various financing options.

  19. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and it enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
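
    The kind of time-series pre-processing such a tool wraps, regularizing the time step and filling short gaps before deriving calibration observations, can be expressed in plain pandas. The file and column names below are hypothetical, and this is not the OAT or istSOS API.

      # Generic sensor time-series pre-processing with pandas; hypothetical
      # file/column names, not the OAT or istSOS API.
      import pandas as pd

      ts = (pd.read_csv("well_levels.csv", parse_dates=["timestamp"],
                        index_col="timestamp")["level_m"]
              .sort_index()
              .resample("1h").mean()        # regular hourly step
              .interpolate(limit=6))        # fill gaps up to 6 hours

      obs = ts.resample("1D").mean()        # daily observations for calibration
      print(obs.head())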

  20. Dataflow Design Tool: User's Manual

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1996-01-01

    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.

  1. Development of a task analysis tool to facilitate user interface design

    NASA Technical Reports Server (NTRS)

    Scholtz, Jean C.

    1992-01-01

    A good user interface is one that facilitates the user in carrying out his task. Such interfaces are difficult and costly to produce. The most important aspect in producing a good interface is the ability to communicate to the software designers what the user's task is. The Task Analysis Tool is a system for cooperative task analysis and specification of the user interface requirements. This tool is intended to serve as a guide to development of initial prototypes for user feedback.

  2. Tools4miRs – one place to gather all the tools for miRNA analysis

    PubMed Central

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-01-01

    Summary: MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which differ greatly in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform of its kind, which currently gathers more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Availability and Implementation: Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. Contact: piotr@ibb.waw.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153626

  3. Tools4miRs - one place to gather all the tools for miRNA analysis.

    PubMed

    Lukasik, Anna; Wójcikowski, Maciej; Zielenkiewicz, Piotr

    2016-09-01

    MiRNAs are short, non-coding molecules that negatively regulate gene expression and thereby play several important roles in living organisms. Dozens of computational methods for miRNA-related research have been developed, which differ greatly in various aspects. The substantial availability of difficult-to-compare approaches makes it challenging for the user to select a proper tool and prompts the need for a solution that will collect and categorize all the methods. Here, we present tools4miRs, the first platform of its kind, which currently gathers more than 160 methods for broadly defined miRNA analysis. The collected tools are classified into several general and more detailed categories in which the users can additionally filter the available methods according to their specific research needs, capabilities and preferences. Tools4miRs is also a web-based target prediction meta-server that incorporates user-designated target prediction methods into the analysis of user-provided data. Tools4miRs is implemented in Python using Django and is freely available at tools4mirs.org. piotr@ibb.waw.pl Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  4. In response to 'Can sugars be produced from fatty acids? A test case for pathway analysis tools'.

    PubMed

    Faust, Karoline; Croes, Didier; van Helden, Jacques

    2009-12-01

    In their article entitled 'Can sugars be produced from fatty acids? A test case for pathway analysis tools' de Figueiredo and co-authors assess the performance of three pathway prediction tools (METATOOL, PathFinding and Pathway Hunter Tool) using the synthesis of glucose-6-phosphate (G6P) from acetyl-CoA in humans as a test case. We think that this article is biased for three reasons: (i) the metabolic networks used as input for the respective tools were of very different sizes; (ii) the 'assessment' is restricted to two study cases; (iii) developers are inherently more skilled to use their own tools than those developed by other people. We extended the analyses led by de Figueiredo and clearly show that the apparent superior performance of their tool (METATOOL) is partly due to the differences in input network sizes. We also see a conceptual problem in the comparison of tools that serve different purposes. In our opinion, metabolic path finding and elementary mode analysis are answering different biological questions, and should be considered as complementary rather than competitive approaches. Supplementary data are available at Bioinformatics online.

  5. Nucleic acid tool enzymes-aided signal amplification strategy for biochemical analysis: status and challenges.

    PubMed

    Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli

    2016-04-01

    Owing to their highly efficient catalytic effects and substrate specificity, the nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test-tube and in living organisms. In addition to the function as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, the nucleic acid tool enzymes are mainly applied in nucleic acids amplification sensing, as well as the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, the nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.

  6. Tool Efficiency Analysis model research in SEMI industry

    NASA Astrophysics Data System (ADS)

    Lei, Ma; Nana, Zhang; Zhongqiu, Zhang

    2018-06-01

    One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper is based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a Tool Efficiency Analysis (TEA) system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully, yielding the parameter values used to measure equipment performance as well as suggestions for improvement.
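
    A finite state machine over equipment states can be sketched in a few lines. The state names below follow the SEMI E10 convention, and the transition rules are illustrative assumptions, not the paper's actual rule set.

      # Toy finite state machine over SEMI E10-style equipment states;
      # transition rules are assumptions, not the paper's model.
      ALLOWED = {
          ("STANDBY", "PRODUCTIVE"), ("PRODUCTIVE", "STANDBY"),
          ("STANDBY", "ENGINEERING"), ("ENGINEERING", "STANDBY"),
          ("STANDBY", "SCHEDULED_DOWN"), ("SCHEDULED_DOWN", "STANDBY"),
          ("PRODUCTIVE", "UNSCHEDULED_DOWN"), ("UNSCHEDULED_DOWN", "STANDBY"),
      }

      class ToolFSM:
          def __init__(self, state="STANDBY"):
              self.state = state
              self.log = []          # (from, to) history for utilization stats

          def transition(self, new_state):
              if (self.state, new_state) not in ALLOWED:
                  raise ValueError(f"illegal transition {self.state} -> {new_state}")
              self.log.append((self.state, new_state))
              self.state = new_state

      fsm = ToolFSM()
      for s in ["PRODUCTIVE", "UNSCHEDULED_DOWN", "STANDBY", "PRODUCTIVE"]:
          fsm.transition(s)
      print(fsm.state, len(fsm.log))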

  7. Different type 2 diabetes risk assessments predict dissimilar numbers at ‘high risk’: a retrospective analysis of diabetes risk-assessment tools

    PubMed Central

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2015-01-01

    Background Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. Aim This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Design and setting Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Method Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes®, Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Results Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at ‘high risk’ followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). Conclusion The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. PMID:26541180

  8. Different type 2 diabetes risk assessments predict dissimilar numbers at 'high risk': a retrospective analysis of diabetes risk-assessment tools.

    PubMed

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2015-12-01

    Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes(®), Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at 'high risk' followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. © British Journal of General Practice 2015.

  9. Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools has been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  10. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  11. Analysis and control on changeable wheel tool system of hybrid grinding and polishing machine tool for blade finishing

    NASA Astrophysics Data System (ADS)

    He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji

    2017-01-01

    Blades are key components in energy and power equipment such as turbines and aircraft engines, and research on the processes and equipment for blade finishing has become an important and difficult topic. To precisely control the tool system of a hybrid grinding and polishing machine tool developed for blade finishing, the tool system with a changeable wheel for belt polishing is analyzed in this paper. Firstly, the belt length and the wrap angle of each wheel are analyzed for different tension-wheel swing angles during the wheel-changing process. The reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion amount of the contact wheel are obtained. Then, the control system for the changeable wheel tool structure is developed. Lastly, the surface roughness achieved by blade finishing is verified by experiments. Theoretical analysis and experimental results show that a reasonable belt length and wheel wrap angles can be obtained by the proposed analysis method, the changeable wheel tool system can be controlled precisely, and the surface roughness of the blade after grinding meets the design requirements.

  12. The Legacy Archive for Microwave Background Data Analysis (LAMBDA)

    NASA Astrophysics Data System (ADS)

    Miller, Nathan; LAMBDA

    2018-01-01

    The Legacy Archive for Microwave Background Data Analysis (LAMBDA) provides CMB researchers with archival data for cosmology missions, software tools, and links to other sites of interest. LAMBDA is one-stop shopping for CMB researchers. It hosts data from WMAP along with many suborbital experiments. Over the past year, LAMBDA has acquired new data from SPTpol, SPIDER and ACTPol. In addition to the primary CMB, LAMBDA also provides foreground data. LAMBDA has several ongoing efforts to provide tools for CMB researchers. These tools include a web interface for CAMB and a web interface for a CMB survey footprint database and plotting tool. Additionally, we have recently developed a Docker container with standard CMB analysis tools and demonstrations in the form of Jupyter notebooks. These containers will be publicly available through Docker's container repository and the source will be available on GitHub.

  13. Computational Tools for Metabolic Engineering

    PubMed Central

    Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.

    2012-01-01

    A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572

  14. Open source Modeling and optimization tools for Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, S.

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  15. Tools for Data Analysis in the Middle School Classroom: A Teacher Professional Development Program

    NASA Astrophysics Data System (ADS)

    Ledley, T. S.; Haddad, N.; McAuliffe, C.; Dahlman, L.

    2006-12-01

    In order for students to learn how to engage with scientific data to answer questions about the real world, it is imperative that their teachers are 1) comfortable with the data and the tools used to analyze it, and 2) feel prepared to support their students in this complex endeavor. TERC's Tools for Data Analysis in the Middle School Classroom (DataTools) professional development program, funded by NSF's ITEST program, prepares middle school teachers to integrate Web-based scientific data and analysis tools into their existing curricula. This 13-month program supports teachers in using a set of freely or commonly available tools with a wide range of data. It also gives them an opportunity to practice teaching these skills to students before teaching in their own classrooms. The ultimate goal of the program is to increase the number of middle school students who work directly with scientific data, who use the tools of technology to import, manipulate, visualize and analyze the data, who come to understand the power of data-based arguments, and who will consider pursuing a career in technical and scientific fields. In this session, we will describe the elements of the DataTools program and the Earth Exploration Toolbook (EET, http://serc.carleton.edu/eet), a Web-based resource that supports Earth system education for teachers and students in grades 6 through 16. The EET provides essential support to DataTools teachers as they use it to learn to locate and download Web-based data and use data analysis tools. We will also share what we have learned during the first year of this three-year program.

  16. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    NASA Technical Reports Server (NTRS)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.

  17. Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT

    NASA Technical Reports Server (NTRS)

    Maxwell, Thomas

    2012-01-01

    Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.

  18. Elementary Mode Analysis: A Useful Metabolic Pathway Analysis Tool for Characterizing Cellular Metabolism

    PubMed Central

    Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich

    2010-01-01

    Elementary Mode Analysis is a useful metabolic pathway analysis tool for identifying the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose an intricate metabolic network comprised of highly interconnected reactions into uniquely organized pathways. These pathways, each consisting of a minimal set of enzymes that can support steady-state operation of cellular metabolism, represent independent cellular physiological states. Such a pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility, facilitating the understanding of cell physiology and the implementation of metabolic engineering strategies. This mini-review aims to give an overview of the development and application of elementary mode analysis as a metabolic pathway analysis tool for studying cell physiology and as a basis of metabolic engineering.
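
    The setting can be summarized in one equation: at steady state S·v = 0, so admissible flux distributions lie in the null space of the stoichiometric matrix S, and elementary modes are the support-minimal vectors of that space (enumerating them in general requires, e.g., the double description method). The toy network below, which is ours and not from the review, only illustrates the null-space computation.

      # Null space of a toy stoichiometric matrix; elementary-mode enumeration
      # proper needs more machinery, this only shows the steady-state subspace.
      import numpy as np
      from scipy.linalg import null_space

      # rows = metabolites A, B; columns = reactions R1: ->A, R2: A->B, R3: B->
      S = np.array([[ 1, -1,  0],
                    [ 0,  1, -1]])
      N = null_space(S)
      print(N / N[0])   # the single mode: R1, R2, R3 carry equal flux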

  19. Smart roadside initiative macro benefit analysis : user’s guide for the benefit-cost analysis tool.

    DOT National Transportation Integrated Search

    2015-03-01

    Through the Smart Roadside Initiative (SRI), a Benefit-Cost Analysis (BCA) tool was developed for the evaluation of various new transportation technologies at a State level and to provide results that could support technology adoption by a State Depa...

  20. Method for automation of tool preproduction

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability, and quality of the machined surfaces with minimal expenditure of means and resources. It takes staff engaged in tool preparation considerable time to correctly select the appropriate tool from the set of variants. Program software has been developed to solve this problem; it helps to create, systematize and comparatively analyze tool designs in order to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Application of the software makes it possible to reduce the design period by 80-85% and obtain a significant annual saving.

  1. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  2. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes but some species are associated with human diseases. With increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide better understanding of their biology, phylogeny, virulence and taxonomy that may lead to the discoveries of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for the analysis of genomes aimed to provide: (1) annotated genome sequences of Corynebacterium where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers the access of a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  3. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  4. Audio signal analysis for tool wear monitoring in sheet metal stamping

    NASA Astrophysics Data System (ADS)

    Ubhayaratne, Indivarie; Pereira, Michael P.; Xiang, Yong; Rolfe, Bernard F.

    2017-02-01

    Stamping tool wear can significantly degrade product quality, and hence, online tool condition monitoring is a timely need in many manufacturing industries. Even though a large amount of research has been conducted employing different sensor signals, there is still an unmet demand for a low-cost, easy-to-set-up condition monitoring system. Audio signal analysis is a simple method that has the potential to meet this demand, but has not previously been used for stamping process monitoring. Hence, this paper studies the existence and the significance of the correlation between emitted sound signals and the wear state of sheet metal stamping tools. The corrupting sources generated by the tooling of the stamping press and surrounding machinery have higher amplitudes than the sound emitted by the stamping operation itself. Therefore, a newly developed semi-blind signal extraction technique was employed as a pre-processing step to mitigate the contribution of these corrupting sources. The spectral analysis results of the raw and extracted signals demonstrate a significant qualitative relationship between wear progression and the emitted sound signature. This study lays the basis for employing low-cost audio signal analysis in the development of a real-time industrial tool condition monitoring system.
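
    The record above attributes wear detection to changes in the emitted sound spectrum. A minimal sketch of that idea, using Welch power-spectral-density estimates on synthetic stand-in signals (the sampling rate, tone frequencies, and band edges below are illustrative assumptions, not values from the paper):

    ```python
    # Sketch: compare spectral signatures of stamping sound for a "new" vs. a
    # "worn" tool. Both signals are synthetic stand-ins for real recordings.
    import numpy as np
    from scipy.signal import welch

    fs = 44_100                               # assumed audio sampling rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(0)

    sound_new = np.sin(2 * np.pi * 500 * t) + 0.1 * rng.standard_normal(t.size)
    sound_worn = sound_new + 0.5 * np.sin(2 * np.pi * 6_000 * t)  # extra HF energy

    def band_energy(x, f_lo, f_hi):
        """Mean Welch PSD of x over the band [f_lo, f_hi] Hz."""
        f, pxx = welch(x, fs=fs, nperseg=4096)
        band = (f >= f_lo) & (f <= f_hi)
        return pxx[band].mean()

    # A rising high-band/low-band energy ratio would flag wear progression.
    for label, x in [("new", sound_new), ("worn", sound_worn)]:
        ratio = band_energy(x, 4_000, 10_000) / band_energy(x, 100, 1_000)
        print(f"{label} tool: high/low band energy ratio = {ratio:.3f}")
    ```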

  5. PyHLA: tests for the association between HLA alleles and diseases.

    PubMed

    Fan, Yanhui; Song, You-Qiang

    2017-02-06

    Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types. Therefore, tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provided a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill the gap. PyHLA is a tailor-made, easy to use, and flexible tool designed specifically for the association analysis of the HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple testing corrections have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis. Existing methods have been integrated and desired methods have been added in PyHLA. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer with different platforms. PyHLA is implemented in Python. PyHLA is a free, open source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
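
    PyHLA's own input formats and API are documented in its repository; the statistical core of an allele-disease association test of this kind can be sketched as a 2x2 Fisher's exact test. The allele name and carrier counts below are hypothetical:

    ```python
    # Sketch of a single allele-by-disease association test, the building
    # block of PyHLA-style analyses. Counts are invented for illustration.
    from scipy.stats import fisher_exact

    # Hypothetical carrier counts of allele HLA-B*27 in cases vs. controls.
    cases_with, cases_without = 40, 60
    controls_with, controls_without = 10, 90

    table = [[cases_with, cases_without],
             [controls_with, controls_without]]
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"OR = {odds_ratio:.2f}, p = {p_value:.2e}")
    ```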

  6. FunRich proteomics software analysis, let the fun begin!

    PubMed

    Benito-Martin, Alberto; Peinado, Héctor

    2015-08-01

    Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich software, an open-access software that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. User Guide for the Financing Alternatives Comparison Tool

    EPA Pesticide Factsheets

    FACT is a financial analysis tool that helps identify the most cost-effective method to fund a wastewater or drinking water management project. It creates a comprehensive analysis that compares various financing options.

  8. Post-Flight Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  9. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  10. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  11. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  12. Tools for observational gait analysis in patients with stroke: a systematic review.

    PubMed

    Ferrarello, Francesco; Bianchi, Valeria Anna Maria; Baccini, Marco; Rubbieri, Gaia; Mossello, Enrico; Cavallini, Maria Chiara; Marchionni, Niccolò; Di Bari, Mauro

    2013-12-01

    Stroke severely affects walking ability, and assessment of gait kinematics is important in defining diagnosis, planning treatment, and evaluating interventions in stroke rehabilitation. Although observational gait analysis is the most common approach to evaluate gait kinematics, tools useful for this purpose have received little attention in the scientific literature and have not been thoroughly reviewed. The aims of this systematic review were to identify tools proposed to conduct observational gait analysis in adults with a stroke, to summarize evidence concerning their quality, and to assess their implementation in rehabilitation research and clinical practice. An extensive search was performed of original articles reporting on visual/observational tools developed to investigate gait kinematics in adults with a stroke. Two reviewers independently selected studies, extracted data, assessed quality of the included studies, and scored the metric properties and clinical utility of each tool. Rigor in reporting metric properties and dissemination of the tools also was evaluated. Five tools were identified, not all of which had been tested adequately for their metric properties. Evaluation of content validity was partially satisfactory. Reliability was poorly investigated in all but one tool. Concurrent validity and sensitivity to change were shown for 3 and 2 tools, respectively. Overall, adequate levels of quality were rarely reached. The dissemination of the tools was poor. Based on critical appraisal, the Gait Assessment and Intervention Tool shows a good level of quality, and its use in stroke rehabilitation is recommended. Rigorous studies are needed for the other tools in order to establish their usefulness.

  13. The methodological quality assessment tools for preclinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline: a systematic review.

    PubMed

    Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang

    2015-02-01

    To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (nested case-control studies and case reports, for example) are in need of updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools remain objective and that performance bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.

  14. Upgrade of DRAMA-ESA's Space Debris Mitigation Analysis Tool Suite

    NASA Astrophysics Data System (ADS)

    Gelhaus, Johannes; Sanchez-Ortiz, Noelia; Braun, Vitali; Kebschull, Christopher; de Oliveira, Joaquim Correia; Dominguez-Gonzalez, Raul; Wiedemann, Carsten; Krag, Holger; Vorsmann, Peter

    2013-08-01

    One decade ago ESA started the development of the first version of the software tool called DRAMA (Debris Risk Assessment and Mitigation Analysis) to enable ESA space programs to assess their compliance with the recommendations in the European Code of Conduct for Space Debris Mitigation. This tool was maintained, upgraded and extended during the last year and is now a combination of five individual tools, each addressing a different aspect of debris mitigation. This paper gives an overview of the new DRAMA software in general. The main tools ARES, OSCAR, MIDAS, CROC and SARA will be discussed, and the environment used by DRAMA will be explained briefly.

  15. Python Spectral Analysis Tool (PySAT) for Preprocessing, Multivariate Analysis, and Machine Learning with Point Spectra

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.

    2017-06-01

    We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.

  16. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  17. Image edge detection based tool condition monitoring with morphological component analysis.

    PubMed

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in monitored images. Image edge detection has been a fundamental tool to obtain features of images. This approach extracts the tool edge with morphological component analysis. Through the decomposition of the original tool wear image, the approach reduces the influence of texture and noise on edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared to the celebrated algorithms developed in the literature, this approach improves the integrity and connectivity of edges, and the results have shown that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Comparative analysis and visualization of multiple collinear genomes

    PubMed Central

    2012-01-01

    Background Genome browsers are a common tool used by biologists to visualize genomic features including genes, polymorphisms, and many others. However, existing genome browsers and visualization tools are not well-suited to perform meaningful comparative analysis among a large number of genomes. With the increasing quantity and availability of genomic data, there is an increased burden to provide useful visualization and analysis tools for comparison of multiple collinear genomes such as the large panels of model organisms which are the basis for much of the current genetic research. Results We have developed a novel web-based tool for visualizing and analyzing multiple collinear genomes. Our tool illustrates genome-sequence similarity through a mosaic of intervals representing local phylogeny, subspecific origin, and haplotype identity. Comparative analysis is facilitated through reordering and clustering of tracks, which can vary throughout the genome. In addition, we provide local phylogenetic trees as an alternate visualization to assess local variations. Conclusions Unlike previous genome browsers and viewers, ours allows for simultaneous and comparative analysis. Our browser provides intuitive selection and interactive navigation about features of interest. Dynamic visualizations adjust to scale and data content making analysis at variable resolutions and of multiple data sets more informative. We demonstrate our genome browser for an extensive set of genomic data sets composed of almost 200 distinct mouse laboratory strains. PMID:22536897

  19. Holmes: a graphical tool for development, simulation and analysis of Petri net based models of complex biological systems.

    PubMed

    Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr

    2017-12-01

    Model development and its analysis is a fundamental step in systems biology. The theory of Petri nets offers a tool for such a task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. From this follows the problem of having to use different programs to analyse a single model. Many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form provides a huge help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms, and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. piotr@cs.put.poznan.pl. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  20. RSAT 2018: regulatory sequence analysis tools 20th anniversary.

    PubMed

    Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane

    2018-05-02

    RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.

  1. Oqtans: the RNA-seq workbench in the cloud for complete and reproducible quantitative transcriptome analysis.

    PubMed

    Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar

    2014-05-01

    We present Oqtans, an open-source workbench for quantitative transcriptome analysis that is integrated into Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine learning-powered tools into Galaxy, which show superior or equal performance to state-of-the-art tools. Implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy to understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git); most of which is also available from (iv) the Galaxy Toolshed and (v) a share string to use along with Galaxy CloudMan.

  2. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Excerpt from "Analytical Tools for Affordability Analysis" by David Tate, Cost Analysis and Research Division, Institute for Defense Analyses (4850 Mark Center Drive, Alexandria, VA, 22311-1882). The recovered fragments list cost-model features: unit cost as a function of learning and rate (Womer); learning with forgetting, in which learning depreciates over time (Benkard); and a further item beginning "Discretionary..." that is truncated in the record.

  3. Multi-body Dynamic Contact Analysis Tool for Transmission Design

    DTIC Science & Technology

    2003-04-01

    Excerpt from this Army Research Laboratory SBIR Phase II final report, "Multi-body Dynamic Contact Analysis Tool for Transmission Design". The recovered fragments indicate that frequencies were computed in COSMIC NASTRAN and validated against the published experimental modal analysis [17]; that an assumed time domain... modal superposition approach was used; and that results from the structural analysis (mode shapes or forced response) were converted into IDEAS universal format (dataset 55...).

  4. HEPDOOP: High-Energy Physics Analysis using Hadoop

    NASA Astrophysics Data System (ADS)

    Bhimji, W.; Bristow, T.; Washbrook, A.

    2014-06-01

    We perform an LHC data analysis workflow using tools and data formats that are commonly used in the "Big Data" community outside High Energy Physics (HEP). These include Apache Avro for serialisation to binary files, Pig and Hadoop for mass data processing and Python Scikit-Learn for multi-variate analysis. Comparison is made with the same analysis performed with current HEP tools in ROOT.
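
    As a hedged illustration of the Scikit-Learn multivariate-analysis step named above (the Avro/Pig/Hadoop stages are omitted, and the four features are synthetic stand-ins for physics variables, not the paper's data):

    ```python
    # Sketch: train a classifier to separate "signal" from "background"
    # events, the typical multivariate analysis step in an HEP workflow.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    signal = rng.normal(1.0, 1.0, size=(1000, 4))      # shifted feature means
    background = rng.normal(0.0, 1.0, size=(1000, 4))
    X = np.vstack([signal, background])
    y = np.array([1] * 1000 + [0] * 1000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = GradientBoostingClassifier().fit(X_tr, y_tr)
    print(f"test accuracy = {clf.score(X_te, y_te):.2f}")
    ```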

  5. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
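
    The geometric core of PAEA, principal angles between subspaces, can be sketched with SciPy. The matrices below are random stand-ins for real expression signatures, and this is not the authors' implementation:

    ```python
    # Sketch: score a gene set by the principal angles between the subspace
    # spanned by differential-expression signatures and the direction
    # defined by the gene set's indicator vector.
    import numpy as np
    from scipy.linalg import subspace_angles

    rng = np.random.default_rng(1)
    n_genes = 500
    signatures = rng.standard_normal((n_genes, 5))   # stand-in signatures

    gene_set = rng.choice(n_genes, size=30, replace=False)
    indicator = np.zeros((n_genes, 1))
    indicator[gene_set, 0] = 1.0                     # gene-set indicator

    # Small principal angles mean the gene set is well aligned with the
    # differential-expression subspace (i.e., enriched).
    angles = subspace_angles(signatures, indicator)
    print("smallest principal angle (rad):", angles.min())
    ```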

  6. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.

  7. Hilbert-Huang Transform: A Spectral Analysis Tool Applied to Sunspot Number and Total Solar Irradiance Variations, as well as Near-Surface Atmospheric Variables

    NASA Astrophysics Data System (ADS)

    Barnhart, B. L.; Eichinger, W. E.; Prueger, J. H.

    2010-12-01

    Hilbert-Huang transform (HHT) is a relatively new data analysis tool which is used to analyze nonstationary and nonlinear time series data. It consists of an algorithm, called empirical mode decomposition (EMD), which extracts the cyclic components embedded within time series data, as well as Hilbert spectral analysis (HSA) which displays the time and frequency dependent energy contributions from each component in the form of a spectrogram. The method can be considered a generalized form of Fourier analysis which can describe the intrinsic cycles of data with basis functions whose amplitudes and phases may vary with time. The HHT will be introduced and compared to current spectral analysis tools such as Fourier analysis, short-time Fourier analysis, wavelet analysis and Wigner-Ville distributions. A number of applications are also presented which demonstrate the strengths and limitations of the tool, including analyzing sunspot number variability and total solar irradiance proxies as well as global averaged temperature and carbon dioxide concentration. Also, near-surface atmospheric quantities such as temperature and wind velocity are analyzed to demonstrate the nonstationarity of the atmosphere.
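
    A minimal sketch of the Hilbert spectral step described above: given one intrinsic mode function (a synthetic chirp stands in here for an EMD output; EMD itself is omitted and is provided by libraries such as PyEMD), compute instantaneous amplitude and frequency from the analytic signal:

    ```python
    # Sketch: instantaneous amplitude/frequency of a single IMF via the
    # analytic signal, the second stage of the HHT.
    import numpy as np
    from scipy.signal import hilbert, chirp

    fs = 1_000.0
    t = np.arange(0, 2.0, 1 / fs)
    imf = chirp(t, f0=5, f1=25, t1=2.0, method="linear")  # stand-in IMF

    analytic = hilbert(imf)
    amplitude = np.abs(analytic)                    # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency (Hz)

    print("frequency sweep recovered:",
          f"{inst_freq[100]:.1f} Hz -> {inst_freq[-100]:.1f} Hz")
    ```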

  8. Data and Tools | Integrated Energy Solutions | NREL

    Science.gov Websites

    Web-page excerpt listing NREL building-energy software: eQUEST (detailed analysis of today's state-of-the-art building design, e.g. for a research campus), software tools to support whole building energy modeling and advanced daylight analysis, and BESTEST-EX.

  9. A dynamical framework for integrated corridor management.

    DOT National Transportation Integrated Search

    2016-01-11

    We develop analysis and control synthesis tools for dynamic traffic flow over networks. Our analysis relies on exploiting monotonicity properties of the dynamics, and on adapting relevant tools from stochastic queuing networks. We develop proport...

  10. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  11. Designing Real-time Decision Support for Trauma Resuscitations

    PubMed Central

    Yadav, Kabir; Chamberlain, James M.; Lewis, Vicki R.; Abts, Natalie; Chawla, Shawn; Hernandez, Angie; Johnson, Justin; Tuveson, Genevieve; Burd, Randall S.

    2016-01-01

    Background Use of electronic clinical decision support (eCDS) has been recommended to improve implementation of clinical decision rules. Many eCDS tools, however, are designed and implemented without taking into account the context in which clinical work is performed. Implementation of the pediatric traumatic brain injury (TBI) clinical decision rule at one Level I pediatric emergency department includes an electronic questionnaire triggered when ordering a head computed tomography using computerized physician order entry (CPOE). Providers use this CPOE tool in less than 20% of trauma resuscitation cases. A human factors engineering approach could identify the implementation barriers that are limiting the use of this tool. Objectives The objective was to design a pediatric TBI eCDS tool for trauma resuscitation using a human factors approach. The hypothesis was that clinical experts will rate a usability-enhanced eCDS tool better than the existing CPOE tool for user interface design and suitability for clinical use. Methods This mixed-methods study followed usability evaluation principles. Pediatric emergency physicians were surveyed to identify barriers to using the existing eCDS tool. Using standard trauma resuscitation protocols, a hierarchical task analysis of pediatric TBI evaluation was developed. Five clinical experts, all board-certified pediatric emergency medicine faculty members, then iteratively modified the hierarchical task analysis until reaching consensus. The software team developed a prototype eCDS display using the hierarchical task analysis. Three human factors engineers provided feedback on the prototype through a heuristic evaluation, and the software team refined the eCDS tool using a rapid prototyping process. The eCDS tool then underwent iterative usability evaluations by the five clinical experts using video review of 50 trauma resuscitation cases. A final eCDS tool was created based on their feedback, with content analysis of the evaluations performed to ensure all concerns were identified and addressed. Results Among 26 EPs (76% response rate), the main barriers to using the existing tool were that the information displayed is redundant and does not fit clinical workflow. After the prototype eCDS tool was developed based on the trauma resuscitation hierarchical task analysis, the human factors engineers rated it to be better than the CPOE tool for nine of 10 standard user interface design heuristics on a three-point scale. The eCDS tool was also rated better for clinical use on the same scale, in 84% of 50 expert–video pairs, and was rated equivalent in the remainder. Clinical experts also rated barriers to use of the eCDS tool as being low. Conclusions An eCDS tool for diagnostic imaging designed using human factors engineering methods has improved perceived usability among pediatric emergency physicians. PMID:26300010

  12. PANTHER version 11: expanded annotation data from Gene Ontology and Reactome pathways, and data analysis tool enhancements.

    PubMed

    Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D

    2017-01-04

    The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
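
    The statistical core of gene-list enrichment tools of this kind is an over-representation test against a hypergeometric null; a minimal sketch with hypothetical counts follows (PANTHER's own statistics, corrections, and annotations are richer):

    ```python
    # Sketch: given a gene list, how unlikely is the observed overlap with
    # a pathway under a hypergeometric null?
    from scipy.stats import hypergeom

    M = 20_000   # genes in the background (hypothetical)
    n = 150      # genes annotated to the pathway
    N = 400      # genes in the user's list
    k = 12       # list genes found in the pathway

    # P(X >= k) is the survival function evaluated at k - 1.
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"enrichment p = {p_value:.2e}")
    ```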

  13. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copy of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces. PMID:21487489

  14. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only facilitated with an abundance of information, but unfortunately continuously confronted with risks associated with the erroneous copy of another's material. In parallel, Information Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools regarding statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewers and the editor's perspective, it is now possible to ensure the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial to specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software pieces.

  15. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines, or with other sectors such as the military, nuclear, and aeronautics industries, were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of these methods and tools.

  16. CloudMan as a platform for tool, data, and analysis distribution.

    PubMed

    Afgan, Enis; Chapman, Brad; Taylor, James

    2012-11-27

    Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  17. The Complexity Analysis Tool

    DTIC Science & Technology

    1988-10-01

    Overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based... CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), Ada (subset...UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths and...

  18. Automated Slicing for a Multi-Axis Metal Deposition System (Preprint)

    DTIC Science & Technology

    2006-09-01

    experimented with different materials like H13 tool steel to build the part. Following the same slicing and scanning toolpath result, there is a geometry reasoning and analysis tool - the centroidal axis. Similar to the medial axis, it contains geometry and topological information but is significantly computationally...

  19. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
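
    A minimal sketch of the per-protein ANOVA such a tool applies to log-transformed abundance data (the group values below are synthetic; DAnTE additionally handles unbalanced designs and random effects, which this sketch does not):

    ```python
    # Sketch: one-way ANOVA on log2 abundances of a single protein across
    # two experimental conditions.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(2)
    control = rng.normal(20.0, 0.5, size=6)    # log2 abundances, condition A
    treated = rng.normal(21.2, 0.5, size=6)    # log2 abundances, condition B

    f_stat, p_value = f_oneway(control, treated)
    print(f"F = {f_stat:.2f}, p = {p_value:.2e}")
    ```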

  20. Regulatory sequence analysis tools.

    PubMed

    van Helden, Jacques

    2003-07-01

    The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
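
    Matrix-based pattern matching of the kind RSAT offers can be sketched as a sliding position-specific scoring matrix (PSSM) scan; the matrix, sequence, and threshold below are toy values for illustration, not RSAT's own:

    ```python
    # Sketch: score every window of a DNA sequence against a log-odds PSSM
    # and report windows above a threshold.
    import numpy as np

    BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
    # Toy 4-position log-odds matrix favoring the motif ACGT
    # (rows: A, C, G, T; columns: motif positions).
    pssm = np.array([
        [ 1.2, -1.0, -1.0,  0.1],   # A
        [-1.0,  1.0, -1.0, -1.0],   # C
        [-1.0, -1.0,  1.3, -1.0],   # G
        [-1.0,  0.2, -1.0,  1.1],   # T
    ])

    def scan(sequence, matrix, threshold=2.0):
        """Yield (position, score) for windows scoring above threshold."""
        w = matrix.shape[1]
        for i in range(len(sequence) - w + 1):
            window = sequence[i:i + w]
            score = sum(matrix[BASES[b], j] for j, b in enumerate(window))
            if score >= threshold:
                yield i, score

    for pos, score in scan("TTACGTACGTAA", pssm):
        print(f"hit at {pos}: score {score:.2f}")   # two ACGT hits
    ```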

  1. Expert systems tools for Hubble Space Telescope observation scheduling

    NASA Technical Reports Server (NTRS)

    Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark

    1987-01-01

    The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.

  2. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  3. Integrated Data Visualization and Virtual Reality Tool

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  4. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  5. Learning from Adverse Events in Obstetrics: Is a Standardized Computer Tool an Effective Strategy for Root Cause Analysis?

    PubMed

    Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah

    2015-08-01

    Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.

  6. Community Solar Scenario Tool | Integrated Energy Solutions | NREL

    Science.gov Websites

    The Community Solar Scenario Tool (CSST) provides a "first cut" analysis of different community or shared solar program options for a sponsoring utility. The beta version of the Community Solar Scenario Tool is available as a Microsoft Excel file, which...

  7. Review and Comparison of Electronic Patient-Facing Family Health History Tools.

    PubMed

    Welch, Brandon M; Wiley, Kevin; Pflieger, Lance; Achiangia, Rosaline; Baker, Karen; Hughes-Halbert, Chanita; Morrison, Heath; Schiffman, Joshua; Doerr, Megan

    2018-04-01

    Family health history (FHx) is one of the most important pieces of information available to help genetic counselors and other clinicians identify risk and prevent disease. Unfortunately, the collection of FHx from patients is often too time consuming to be done during a clinical visit. Fortunately, there are many electronic FHx tools designed to help patients gather and organize their own FHx information prior to a clinic visit. We conducted a review and analysis of electronic FHx tools to better understand what tools are available, to compare and contrast to each other, to highlight features of various tools, and to provide a foundation for future evaluation and comparisons across FHx tools. Through our analysis, we included and abstracted 17 patient-facing electronic FHx tools and explored these tools around four axes: organization information, family history collection and display, clinical data collected, and clinical workflow integration. We found a large number of differences among FHx tools, with no two the same. This paper provides a useful review for health care providers, researchers, and patient advocates interested in understanding the differences among the available patient-facing electronic FHx tools.

  8. RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis

    PubMed Central

    Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab

    2012-01-01

    RDNAnalyzer is an innovative computer based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user friendly environment for sequence analysis. It is freely available. Availability: http://www.cemb.edu.pk/sw.html. Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language. PMID:23055611
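
    The Nussinov dynamic program the record mentions can be sketched in a few lines. This version only counts the maximum number of complementary base pairs; traceback to recover the structure and the tool's extensions are omitted:

    ```python
    # Sketch: Nussinov maximum base-pairing via dynamic programming, with a
    # minimum hairpin-loop length constraint.
    def nussinov_max_pairs(seq, min_loop=3):
        """Return the maximum number of base pairs for sequence seq."""
        pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):           # subsequence lengths
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]                   # case: j unpaired
                for k in range(i, j - min_loop):      # case: j pairs with k
                    if (seq[k], seq[j]) in pairs:
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + 1 + dp[k + 1][j - 1])
                dp[i][j] = best
        return dp[0][n - 1]

    # Prints 3: three G-C pairs; the A's and T stay unpaired under the
    # loop-length constraint.
    print(nussinov_max_pairs("GGGAAATCCC"))
    ```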

  9. Hybrid Wing Body Planform Design with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Olson, Erik D.

    2011-01-01

    The objective of this paper was to provide an update on NASA's current tools for design and analysis of hybrid wing body (HWB) aircraft with an emphasis on Vehicle Sketch Pad (VSP). NASA started HWB analysis using the Flight Optimization System (FLOPS). That capability is enhanced using Phoenix Integration's ModelCenter®. ModelCenter enables multifidelity analysis tools to be linked as an integrated structure. Two major components are linked to FLOPS as an example: a planform discretization tool and VSP. The planform discretization tool ensures the planform is smooth and continuous. VSP is used to display the output geometry. This example shows that a smooth and continuous HWB planform can be displayed as a three-dimensional model and rapidly sized and analyzed.

  10. Implementation Analysis of Cutting Tool Carbide with Cast Iron Material S45 C on Universal Lathe

    NASA Astrophysics Data System (ADS)

    Junaidi; hestukoro, Soni; yanie, Ahmad; Jumadi; Eddy

    2017-12-01

    A cutting tool is the working tool of a lathe. This study analyzes the cutting process of a carbide cutting tool machining cast iron on a universal lathe with respect to several commonly examined aspects, namely cutting force, cutting speed, cutting power, indicated cutting power, and the temperatures in zone 1 and zone 2. The purpose of this study was to determine the magnitude of the cutting speed, cutting power, electromotor power, and zone 1 and zone 2 temperatures involved in driving the carbide tool when turning cast iron. The cutting force was obtained from image analysis of the relationship between the recommended cutting force component and the plane of the cut, and the cutting speed was obtained from image analysis of the relationship between the recommended cutting speed and the feed rate.
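
    The standard turning relations behind such an analysis (cutting speed from spindle speed and diameter, cutting force from a specific cutting force, power from force times speed) can be illustrated numerically; all values below are hypothetical, not the paper's measurements:

    ```python
    # Sketch: textbook turning calculations for a carbide tool on cast iron.
    import math

    D = 100.0      # workpiece diameter (mm), assumed
    n = 400.0      # spindle speed (rev/min), assumed
    ap = 2.0       # depth of cut (mm), assumed
    f = 0.25       # feed (mm/rev), assumed
    kc = 1500.0    # specific cutting force for cast iron (N/mm^2), assumed

    vc = math.pi * D * n / 1000.0    # cutting speed (m/min)
    Fc = kc * ap * f                 # main cutting force (N)
    Pc = Fc * vc / 60_000.0          # cutting power (kW)
    P_motor = Pc / 0.8               # electromotor power at assumed 80% efficiency

    print(f"vc = {vc:.1f} m/min, Fc = {Fc:.0f} N, "
          f"Pc = {Pc:.2f} kW, motor = {P_motor:.2f} kW")
    ```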

  11. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  13. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
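
    As a concrete illustration of how a client consumes such a RESTful analysis service, here is a minimal Python sketch; the endpoint paths, parameter names, and response fields are hypothetical placeholders for illustration, not the documented GEAS interface:

        import requests

        # Hypothetical RESTful gene expression analysis service call.
        # The base URL, routes, and JSON fields are illustrative only.
        BASE = "http://example.org/geas-like-service"

        resp = requests.post(
            f"{BASE}/normalize",
            json={"dataset": "GSE0000_counts.tsv", "method": "quantile"},
            timeout=60,
        )
        resp.raise_for_status()
        job = resp.json()

        # Typical REST pattern: poll a job resource until the analysis finishes.
        status = requests.get(f"{BASE}/jobs/{job['id']}", timeout=60).json()
        print(status.get("state"), status.get("result_url"))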

  14. Predicting performance with traffic analysis tools : final report.

    DOT National Transportation Integrated Search

    2008-03-01

    This document provides insights into the common pitfalls and challenges associated with use of traffic analysis tools for predicting future performance of a transportation facility. It provides five in-depth case studies that demonstrate common ways ...

  15. Surface Analysis Cluster Tool | Materials Science | NREL

    Science.gov Websites

    Spectroscopic ellipsometry is performed during film deposition. The cluster tool can be used to study the effect of various treatments prior to analysis; here we illustrate the surface-cleaning effect of an aqueous ammonia treatment on a ...

  16. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    DOT National Transportation Integrated Search

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  17. Improved Data Analysis Tools for the Thermal Emission Spectrometer

    NASA Astrophysics Data System (ADS)

    Rodriguez, K.; Laura, J.; Fergason, R.; Bogle, R.

    2017-06-01

    We plan to stand up three different database systems to test a new datastore for MGS TES data, enabling more accessible tools that support high-throughput analysis of this high-dimensionality hyperspectral data set.

  18. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural, and optical characteristics of the hardware must be accurately understood in order to design a system capable of meeting the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  19. Two implementations of the Expert System for the Flight Analysis System (ESFAS) project

    NASA Technical Reports Server (NTRS)

    Wang, Lui

    1988-01-01

    A comparison is made between two of the most sophisticated expert system building tools, the Automated Reasoning Tool (ART) and the Knowledge Engineering Environment (KEE). The same problem domain (ESFAS) was used in making the comparison. The Expert System for the Flight Analysis System (ESFAS) acts as an intelligent front end for the Flight Analysis System (FAS). FAS is a complex, configuration-controlled set of interrelated processors (FORTRAN routines) which will be used by the Mission Planning and Analysis Div. (MPAD) to design and analyze Shuttle and potential Space Station missions. Implementations of ESFAS are described. The two versions represent very different programming paradigms; ART uses rules and KEE uses objects. Due to the tools' philosophical differences, KEE is implemented using a depth-first traversal algorithm, whereas ART uses a user-directed traversal method. Either tool could be used to solve this particular problem.

  20. A Data Warehouse Architecture for DoD Healthcare Performance Measurements.

    DTIC Science & Technology

    1999-09-01

    With the DoD healthcare ... framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics.

  1. Patterns of Propaganda and Persuasion.

    ERIC Educational Resources Information Center

    Rank, Hugh

    Because children are exposed to highly professional sales pitches on television and because the old material produced by the Institute of Propaganda Analysis is outdated and in error, a new tool for the analysis of propaganda and persuasion is called for. Such a tool is the intensify/downplay pattern analysis chart, which includes the basic…

  2. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    ERIC Educational Resources Information Center

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  3. Systems Analysis and Integration | Transportation Research | NREL

    Science.gov Websites

    Pictured here, engineers discuss the 3D results of a vehicle simulation through data visualization displayed on a wall. Using a suite of simulation and analysis tools, NREL evaluates electric, hybrid, and other alternative fuel vehicles to save energy and reduce emissions.

  4. Retrospective Video Analysis: A Reflective Tool for Teachers and Teacher Educators

    ERIC Educational Resources Information Center

    Mosley Wetzel, Melissa; Maloch, Beth; Hoffman, James V.

    2017-01-01

    Teachers may need tools to use video for reflection toward ongoing teacher education and teacher leadership. Based on Goodman's (1996) notion of retrospective miscue analysis, a method of reading instruction that revalues the reader and his or her strategies, retrospective video analysis guides teachers in appreciating and understanding their own…

  5. On-line analysis capabilities developed to support the AFW wind-tunnel tests

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.

    1992-01-01

    A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs, are also presented. The on-line analysis tools worked well before, during, and after the wind-tunnel test and proved to be a vital part of the entire test effort.

  6. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary, and geophysics sciences have been widely developed in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geoscientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students, and educators. As suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of theoretical spectroscopy and imaging spectroscopy data in a 'hands-on' activity. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping, and linear unmixing. The tool enables visualization of spectral signatures from the USGS spectral library and additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool also allows loading locally collected samples for further analysis.
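
    Of the algorithms named above, spectral angle mapping is the simplest to make concrete; here is a minimal numpy sketch (not the Spec tool's code; array names and the toy data are ours):

        import numpy as np

        def spectral_angle(spectrum, reference):
            """Spectral Angle Mapper: angle (radians) between two spectra.

            Smaller angles mean more similar spectral shapes; the measure
            is insensitive to overall illumination scaling.
            """
            s = np.asarray(spectrum, dtype=float)
            r = np.asarray(reference, dtype=float)
            cos_theta = s @ r / (np.linalg.norm(s) * np.linalg.norm(r))
            return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        # Toy usage: a scaled copy of a spectrum has angle ~0 to the original.
        ref = np.array([0.1, 0.4, 0.8, 0.5])
        print(spectral_angle(2.5 * ref, ref))  # ~0.0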

  7. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  8. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2014-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.

  9. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.

    2015-01-01

    This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  10. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    NASA Technical Reports Server (NTRS)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
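
    The rational function approximation step mentioned in these records can be illustrated compactly. Below is a hedged numpy sketch of a Roger-style least-squares fit of tabulated generalized forces with fixed lag roots; this is the general technique, not the tool's own code, and the matrix names and test data are ours:

        import numpy as np

        def roger_fit(Q_tab, k_tab, lags):
            """Fit Q(ik) ~ A0 + A1*(ik) + A2*(ik)**2 + sum_j Aj*(ik)/(ik + b_j).

            Q_tab: (m, r, c) generalized-force matrices tabulated at reduced
            frequencies k_tab (length m); lags: fixed positive lag roots b_j.
            Returns coefficient matrices A of shape (3 + len(lags), r, c).
            """
            ik = 1j * np.asarray(k_tab, dtype=float)
            cols = [np.ones_like(ik), ik, ik ** 2]
            cols += [ik / (ik + b) for b in lags]
            D = np.stack(cols, axis=1)                 # (m, n_terms), complex
            Dri = np.vstack([D.real, D.imag])          # real-valued least squares
            m, r, c = Q_tab.shape
            rhs = np.vstack([Q_tab.real, Q_tab.imag]).reshape(2 * m, r * c)
            A, *_ = np.linalg.lstsq(Dri, rhs, rcond=None)
            return A.reshape(-1, r, c)

        # Toy check: data generated from the model itself is recovered closely.
        k = np.linspace(0.01, 1.0, 12)
        b = [0.2, 0.6]
        true = [a * np.eye(2) for a in (1.0, 0.5, 0.1, 0.3, 0.2)]

        def terms(kk):
            ik = 1j * kk
            return [1.0, ik, ik ** 2, ik / (ik + b[0]), ik / (ik + b[1])]

        Q = np.array([sum(A * t for A, t in zip(true, terms(kk))) for kk in k])
        A_fit = roger_fit(Q, k, b)
        print(np.allclose(A_fit[0], true[0]))  # True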

  11. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O3 tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O3 tool. Instructions for preparing input data for the O3 tool are detailed in this user's manual.

  12. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives

    PubMed Central

    Nichio, Bruno T. L.; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu

    2017-01-01

    Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation, and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. Literature in this field of research describes the problems that the majority of available tools show, such as those encountered in accuracy, time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques), and the automation of the process without requiring manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST “all-against-all” methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses a Venn diagram to show the relationships of ortholog groups generated by its algorithm); proteinOrtho (which improves the accuracy of ortholog groups and is designed to deal with large amounts of biological data); ReMark (tackling the integration of the pipeline to make the entry process automatic); and OrthAgogue (using algorithms developed to minimize processing time). We compared the main features of four tools and tested them on four prokaryotic genomes. We hope that our review can be useful for researchers and will help them in selecting the most appropriate tool for their work in the field of orthology. PMID:29163633

  13. New Tools in Orthology Analysis: A Brief Review of Promising Perspectives.

    PubMed

    Nichio, Bruno T L; Marchaukoski, Jeroniza Nunes; Raittz, Roberto Tadeu

    2017-01-01

    Nowadays defining homology relationships among sequences is essential for biological research. Within homology, the analysis of orthologous sequences is of great importance for computational biology, genome annotation, and phylogenetic inference. Since 2007, with the increase in the number of new sequences being deposited in large biological databases, researchers have begun to analyse computerized methodologies and tools aimed at selecting the most promising ones for the prediction of orthologous groups. Literature in this field of research describes the problems that the majority of available tools show, such as those encountered in accuracy, time required for analysis (especially in light of the increasing volume of data being submitted, which requires faster techniques), and the automation of the process without requiring manual intervention. Conducting our search through BMC, Google Scholar, NCBI PubMed, and Expasy, we examined more than 600 articles pursuing the most recent techniques and tools developed to solve most of the problems still existing in orthology detection. We listed the main computational tools created and developed between 2011 and 2017, taking into consideration the differences in the type of orthology analysis, outlining the main features of each tool and pointing to the problems that each one tries to address. We also observed that several tools still use as their main algorithm the BLAST "all-against-all" methodology, which entails some limitations, such as a limited number of queries, computational cost, and high processing time to complete the analysis. However, new promising tools are being developed, like OrthoVenn (which uses a Venn diagram to show the relationships of ortholog groups generated by its algorithm); proteinOrtho (which improves the accuracy of ortholog groups and is designed to deal with large amounts of biological data); ReMark (tackling the integration of the pipeline to make the entry process automatic); and OrthAgogue (using algorithms developed to minimize processing time). We compared the main features of four tools and tested them on four prokaryotic genomes. We hope that our review can be useful for researchers and will help them in selecting the most appropriate tool for their work in the field of orthology.
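
    To make the BLAST "all-against-all" methodology discussed above concrete, here is a minimal reciprocal-best-hit orthology sketch in Python over BLAST tabular (-outfmt 6) output; the file names and the choice of bit score as the ranking column are illustrative assumptions:

        import csv

        def best_hits(blast_tab):
            """Best hit per query from BLAST -outfmt 6 (ranked by bit score, col 12)."""
            best = {}
            with open(blast_tab) as fh:
                for row in csv.reader(fh, delimiter="\t"):
                    query, subject, bitscore = row[0], row[1], float(row[11])
                    if query not in best or bitscore > best[query][1]:
                        best[query] = (subject, bitscore)
            return {q: s for q, (s, _) in best.items()}

        def reciprocal_best_hits(ab_tab, ba_tab):
            """Ortholog pairs: a's best hit is b AND b's best hit is a."""
            a_best = best_hits(ab_tab)   # proteome A searched against B
            b_best = best_hits(ba_tab)   # proteome B searched against A
            return [(a, b) for a, b in a_best.items() if b_best.get(b) == a]

        # Illustrative file names -- the outputs of two BLASTP runs.
        pairs = reciprocal_best_hits("A_vs_B.tsv", "B_vs_A.tsv")
        print(len(pairs), "putative ortholog pairs")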

  14. Lightweight Object Oriented Structure analysis: Tools for building Tools to Analyze Molecular Dynamics Simulations

    PubMed Central

    Romo, Tod D.; Leioatts, Nicholas; Grossfield, Alan

    2014-01-01

    LOOS (Lightweight Object-Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 120 pre-built tools, including suites of tools for analyzing simulation convergence, 3D histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only 4 core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. PMID:25327784

  15. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    PubMed

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  16. Multi-Body Dynamic Contact Analysis. Tool for Transmission Design SBIR Phase II Final Report

    DTIC Science & Technology

    2003-04-01

    Mode shapes and natural frequencies were computed in COSMIC NASTRAN and were validated against the published experimental modal analysis [17]. • Using ... COSMIC NASTRAN via modal superposition. • Results from the structural analysis (mode shapes or forced response) were converted into IDEAS universal ...

  17. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of using the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.

  18. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  19. SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.

    PubMed

    Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B

    2016-02-04

    Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
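
    The counting and differential expression steps in such a workflow reduce, at their simplest, to normalization and a per-gene fold-change comparison. The sketch below is a toy illustration in Python of that idea only; it is not SPARTA's actual statistical method, which includes dispersion modeling and significance testing:

        import numpy as np

        def cpm(counts):
            """Counts-per-million normalization of a genes x samples matrix."""
            counts = np.asarray(counts, dtype=float)
            return counts / counts.sum(axis=0) * 1e6

        def log2_fold_change(counts, group_a, group_b, pseudo=1.0):
            """Per-gene log2 fold change between sample index lists (toy:
            no dispersion modeling or testing as a real DE tool would do)."""
            norm = cpm(counts) + pseudo
            return np.log2(norm[:, group_b].mean(axis=1) /
                           norm[:, group_a].mean(axis=1))

        # Toy matrix: 3 genes x 4 samples (two conditions, duplicates each).
        counts = np.array([[100, 120, 400, 380],
                           [ 50,  55,  52,  49],
                           [ 10,  12,   1,   2]])
        print(log2_fold_change(counts, group_a=[0, 1], group_b=[2, 3]))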

  20. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.

  1. Tool and Task Analysis Guide for Vocational Welding (150 Tasks). Performance Based Vocational Education.

    ERIC Educational Resources Information Center

    John H. Hinds Area Vocational School, Elwood, IN.

    This book contains a task inventory, a task analysis of 150 tasks from that inventory, and a tool list for performance-based welding courses in the state of Indiana. The task inventory and tool list reflect 28 job titles found in Indiana. In the first part of the guide, tasks are listed by these domains: carbon-arc, electron beam, G.M.A.W., gas…

  2. Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization

    DTIC Science & Technology

    2015-12-01

    A periodically scheduled controller can be modeled formally as a hybrid automaton. However, reachability tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large ... an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by ...

  3. Cost Benefit Analysis: Cost Benefit Analysis for Human Effectiveness Research: Bioacoustic Protection

    DTIC Science & Technology

    2001-07-21

    ... documented in the ACEIT (Automated Cost Estimating Integrated Tools) cost estimating tool developed by Tecolote, Inc. The factor used was 14 percent of PMP. ... The data source is the ASC Aeronautical Engineering Products Cost Factor Handbook, which is documented in the ACEIT cost estimating tool.

  4. Sentiment Analysis of Health Care Tweets: Review of the Methods Used.

    PubMed

    Gohil, Sunir; Vuik, Sabine; Darzi, Ara

    2018-04-23

    Twitter is a microblogging service where users can send and read short 140-character messages called "tweets." Many unstructured, free-text tweets relating to health care are shared on Twitter, which is becoming a popular area for health care research. Sentiment is a metric commonly used to investigate the positive or negative opinion within these messages. Exploring the methods used for sentiment analysis in Twitter health care research may allow us to better understand the options available for future research in this growing field. The first objective of this study was to understand which tools are available for sentiment analysis in Twitter health care research, by reviewing existing studies in this area and the methods they used. The second objective was to determine which method would work best in health care settings, by analyzing how the methods were used to answer specific health care questions, how the tools were produced, and how their accuracy was analyzed. A review of the literature was conducted pertaining to Twitter and health care research which used a quantitative method of sentiment analysis for the free-text messages (tweets). The study compared the types of tools used in each case and examined methods for tool production, tool training, and analysis of accuracy. A total of 12 papers studying the quantitative measurement of sentiment in the health care setting were found. More than half of these studies produced tools specifically for their research, 4 used open source tools available freely, and 2 used commercially available software. Moreover, 4 out of the 12 tools were trained using a smaller sample of the study's final data. The sentiment method was trained against, on average, 0.45% (2816/627,024) of the total sample data. Only one of the 12 papers commented on the accuracy of the tool used. Multiple methods are used for sentiment analysis of tweets in the health care setting. These range from self-produced basic categorizations to more complex and expensive commercial software. The open source and commercial methods were developed on product reviews and generic social media messages. None of these methods has been extensively tested against a corpus of health care messages to check their accuracy. This study suggests that there is a need for an accurate and tested tool for sentiment analysis of tweets, trained first using a health care setting-specific corpus of manually annotated tweets. ©Sunir Gohil, Sabine Vuik, Ara Darzi. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 23.04.2018.
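
    For reference, the kind of open source, lexicon-based scoring the review describes can be run in a few lines with NLTK's VADER analyzer; note that VADER uses a general social-media lexicon rather than a health-care-specific one, which is exactly the limitation the authors raise:

        import nltk
        from nltk.sentiment.vader import SentimentIntensityAnalyzer

        nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch

        analyzer = SentimentIntensityAnalyzer()
        tweets = [
            "Great experience at the clinic today, staff were wonderful!",
            "Still waiting three hours for my appointment. Unacceptable.",
        ]
        for tweet in tweets:
            scores = analyzer.polarity_scores(tweet)
            # 'compound' is a normalized overall score in [-1, 1].
            print(f"{scores['compound']:+.3f}  {tweet}")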

  5. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  6. Data and Tools | Hydrogen and Fuel Cells | NREL

    Science.gov Websites

    Data and tools for researchers, developers, investors, and others interested in the viability, analysis, and development of hydrogen and fuel cell technologies, including cost, energy use, and emissions. Alternative Fuels Data Center Tools: a collection of tools and calculators that help decision-makers reduce petroleum use. FASTSim: Future Automotive Systems Technology Simulator, a simulation tool that ...

  7. A drill-soil system modelization for future Mars exploration

    NASA Astrophysics Data System (ADS)

    Finzi, A. E.; Lavagna, M.; Rocchitelli, G.

    2004-01-01

    This paper presents a first approach to the problem of modeling a drilling process to be carried out in the space environment by a dedicated payload. Systems designed to work in space face very strict requirements in many different fields, such as thermal response, electric power demand, and reliability. Thus, models devoted to operational behaviour simulation represent a fundamental help in the design phase and greatly improve final product quality. As the required power is the crucial constraint within drilling devices, the tool-soil interaction modelization and simulation are finalized to the computation of the power demand as a function of both the drill and the soil parameters. An accurate study of the tool and the soil separately was first carried out, and their interaction was then analyzed. The Dee-Dri system, designed by Tecnospazio and to be part of the lander components in NASA's Mars Sample Return Mission, has been taken as the reference tool. It is a complex rotary tool devoted to soil perforation and sample collection; it has to operate in a Martian zone made of rocks similar to terrestrial basalt, so the modelization is restricted to the interaction analysis between the tool and materials belonging to this rock set. The tool geometric modelization has been faced by a finite element approach with a Lagrangian formulation: for the static analysis a refined model is assumed, considering both the actual geometry of the head and the rod screws; a simplified model has been used to deal with the dynamic analysis. The soil representation is based on the Mohr-Coulomb crack criterion, and an Eulerian approach was initially selected to model it. However, software limitations in dealing with the tool-soil interface definition required assuming a Lagrangian formulation for the soil too. The interaction between the soil and the tool has been modeled by extending the two-dimensional Nishimatsu theory for rock cutting to rotating perforation tools. A detailed analysis of finite element choice for each part of the tool is presented together with static analysis results. The dynamic analysis results are limited to the first impact phenomenon between the rock and the tool head. The validity of both the theoretical and numerical models is confirmed by the good agreement between simulation results and data coming from experiments done within the Tecnospazio facilities.
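
    For reference, the Mohr-Coulomb crack criterion used for the soil states that shear failure occurs on a plane when the shear stress reaches τ = c + σ_n tan φ, where c is the material cohesion, σ_n the normal stress acting on the plane, and φ the angle of internal friction; for a basalt-like rock, c and φ would be taken from laboratory tests (the parameter values themselves are not given in this record).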

  8. HISTORICAL ANALYSIS OF ECOLOGICAL EFFECTS: A USEFUL EDUCATIONAL TOOL

    EPA Science Inventory

    An historical analysis that presents the ecological consequences of development can be a valuable educational tool for citizens, students, and environmental managers. In highly impacted areas, the cumulative impacts of multiple stressors can result in complex environmental condit...

  9. DCODE.ORG Anthology of Comparative Genomic Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loots, G G; Ovcharenko, I

    2005-01-11

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the noncoding encryption of gene regulation across genomes. To facilitate the use of comparative genomics in practical applications in genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow, for identifying lineage- and species-specific functional elements; two evolutionarily conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, CREME; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here we briefly describe each one of these tools and provide specific examples of their practical applications. All the tools are publicly available at the http://www.dcode.org/ web site.

  10. HiC-bench: comprehensive and reproducible Hi-C data analysis designed for parameter exploration and benchmarking.

    PubMed

    Lazaris, Charalampos; Kelly, Stephen; Ntziachristos, Panagiotis; Aifantis, Iannis; Tsirigos, Aristotelis

    2017-01-05

    Chromatin conformation capture techniques have evolved rapidly over the last few years and have provided new insights into genome organization at an unprecedented resolution. Analysis of Hi-C data is complex and computationally intensive, involving multiple tasks and requiring robust quality assessment. This has led to the development of several tools and methods for processing Hi-C data. However, most of the existing tools do not cover all aspects of the analysis and only offer few quality assessment options. Additionally, the availability of a multitude of tools makes scientists wonder how these tools and associated parameters can be optimally used, and how potential discrepancies can be interpreted and resolved. Most importantly, investigators need to be assured that slight changes in parameters and/or methods do not affect the conclusions of their studies. To address these issues (compare, explore and reproduce), we introduce HiC-bench, a configurable computational platform for comprehensive and reproducible analysis of Hi-C sequencing data. HiC-bench performs all common Hi-C analysis tasks, such as alignment, filtering, contact matrix generation and normalization, identification of topological domains, and scoring and annotation of specific interactions, using both published tools and our own. We have also embedded various tasks that perform quality assessment and visualization. HiC-bench is implemented as a data flow platform with an emphasis on analysis reproducibility. Additionally, the user can readily perform parameter exploration and comparison of different tools in a combinatorial manner that takes into account all desired parameter settings in each pipeline task. This unique feature facilitates the design and execution of complex benchmark studies that may involve combinations of multiple tool/parameter choices in each step of the analysis. To demonstrate the usefulness of our platform, we performed a comprehensive benchmark of existing and new TAD callers, exploring different matrix correction methods, parameter settings, and sequencing depths. Users can extend our pipeline by adding more tools as they become available. HiC-bench is an easy-to-use and extensible platform for comprehensive analysis of Hi-C datasets. We expect that it will facilitate current analyses and help scientists formulate and test new hypotheses in the field of three-dimensional genome organization.
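
    One of the normalization choices such a benchmark explores is iterative matrix balancing (ICE-style correction). A minimal numpy sketch of the idea follows; it is simplified (no filtering of sparse bins, fixed iteration count) and the names and toy data are ours, not HiC-bench's code:

        import numpy as np

        def ice_balance(contact, n_iter=50):
            """Iteratively scale a symmetric contact matrix toward equal
            row/column sums (simplified ICE-style correction)."""
            m = np.asarray(contact, dtype=float).copy()
            bias = np.ones(m.shape[0])
            for _ in range(n_iter):
                rowsum = m.sum(axis=1)
                pos = rowsum > 0
                s = np.ones_like(rowsum)
                s[pos] = rowsum[pos] / rowsum[pos].mean()  # relative coverage
                m /= np.outer(s, s)                        # divide out biases
                bias *= s
            return m, bias

        # Toy symmetric matrix with one over-covered bin.
        raw = np.array([[10.,  8.,  4.],
                        [ 8., 40., 12.],
                        [ 4., 12.,  6.]])
        balanced, bias = ice_balance(raw)
        print(balanced.sum(axis=1))   # row sums nearly equal after balancing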

  11. Wear and breakage monitoring of cutting tools by an optical method: theory

    NASA Astrophysics Data System (ADS)

    Li, Jianfeng; Zhang, Yongqing; Chen, Fangrong; Tian, Zhiren; Wang, Yao

    1996-10-01

    An essential capability of a machining system in an unmanned flexible manufacturing system is the ability to automatically change out tools that are worn or damaged. An optoelectronic method for in situ monitoring of the flank wear and breakage of cutting tools is presented. A flank wear estimation system is implemented in a laboratory environment, and its performance is evaluated through turning experiments. The flank wear model parameters that need to be known a priori are determined through several preliminary experiments, or from data available in the literature. The resulting cutting conditions are typical of those used in finishing cutting operations. Through time- and amplitude-domain analysis of the cutting tool wear and breakage states, it is found that the signal variance σ_x^2 and the autocorrelation coefficient ρ(m) can reflect the change regularity of cutting tool wear and breakage, but these measures alone are not sufficient, owing to the complexity of the wear and breakage processes of cutting tools. Time series analysis and frequency spectrum analysis will therefore be carried out and described in later papers.
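
    The lag-m autocorrelation coefficient ρ(m) referred to above has a direct implementation; a short numpy sketch (the signal here is a synthetic stand-in for a measured cutting-force trace):

        import numpy as np

        def autocorr(x, m):
            """Lag-m autocorrelation coefficient rho(m) of a 1-D signal."""
            x = np.asarray(x, dtype=float)
            x = x - x.mean()
            if m == 0:
                return 1.0
            return float(np.sum(x[:-m] * x[m:]) / np.sum(x * x))

        # Toy force-like signal: a periodic component buried in noise keeps
        # rho(m) high near the period, which is what wear monitoring exploits.
        rng = np.random.default_rng(0)
        t = np.arange(1000)
        signal = np.sin(2 * np.pi * t / 50) + 0.5 * rng.standard_normal(1000)
        print(autocorr(signal, 50))   # near the period -> relatively large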

  12. Integrated verification and testing system (IVTS) for HAL/S programs

    NASA Technical Reports Server (NTRS)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  13. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability and later addresses validity. Factor analysis is a way to determine the number of dimensions, domains, or factors of a measuring tool, generally related to the construct validity of the scale. The analysis can be exploratory or confirmatory, and helps in the selection of the items with the best performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
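
    As an illustration of the exploratory variant described here, scikit-learn's FactorAnalysis estimator can extract a chosen number of latent factors from item-level data. This is a generic sketch under our own assumptions: the scale data are simulated, and scikit-learn is our choice of tool, not the authors':

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Simulate item responses driven by two latent factors.
        rng = np.random.default_rng(42)
        n, k_items = 300, 6
        latent = rng.standard_normal((n, 2))
        loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],   # factor 1 items
                             [0.0, 0.8], [0.1, 0.9], [0.0, 0.7]])  # factor 2 items
        X = latent @ loadings.T + 0.3 * rng.standard_normal((n, k_items))

        fa = FactorAnalysis(n_components=2, random_state=0)
        fa.fit(X)
        # Items should load strongly on the factor that generated them.
        print(np.round(fa.components_.T, 2))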

  14. CloudMan as a platform for tool, data, and analysis distribution

    PubMed Central

    2012-01-01

    Background Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507

  15. Altered Pathway Analyzer: A gene expression dataset analysis tool for identification and prioritization of differentially regulated and network rewired pathways

    PubMed Central

    Kaushik, Abhinav; Ali, Shakir; Gupta, Dinesh

    2017-01-01

    Gene connection rewiring is an essential feature of gene network dynamics. Apart from its normal functional role, it may also lead to dysregulated functional states by disturbing pathway homeostasis. Very few computational tools measure rewiring within gene co-expression and its corresponding regulatory networks in order to identify and prioritize altered pathways, which may or may not be differentially regulated. We have developed Altered Pathway Analyzer (APA), a microarray dataset analysis tool for identification and prioritization of altered pathways, including those which are differentially regulated by TFs, by quantifying rewired sub-network topology. Moreover, APA also helps in re-prioritization of APA-shortlisted altered pathways enriched with context-specific genes. We performed APA analysis of simulated datasets and p53 status NCI-60 cell line microarray data to demonstrate the potential of APA for identification of several case-specific altered pathways. APA analysis reveals several altered pathways not detected by other tools evaluated by us. APA analysis of unrelated prostate cancer datasets identifies sample-specific as well as conserved altered biological processes, mainly associated with lipid metabolism, cellular differentiation and proliferation. APA is designed as a cross-platform tool which may be transparently customized to perform pathway analysis in different gene expression datasets. APA is freely available at http://bioinfo.icgeb.res.in/APA. PMID:28084397
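
    The rewiring idea can be illustrated by comparing co-expression (Pearson correlation) matrices between two conditions and scoring edges by their change; this toy sketch is not APA's actual scoring scheme, and the expression matrices are simulated.

    ```python
    # Sketch of the network-rewiring idea: compare gene co-expression between
    # two conditions and score each edge by the change in correlation.
    import numpy as np

    rng = np.random.default_rng(2)
    genes = ["g1", "g2", "g3", "g4"]
    expr_a = rng.standard_normal((50, 4))   # 50 samples x 4 genes, condition A
    expr_b = expr_a.copy()
    expr_b[:, 1] = expr_b[:, 0] + 0.1 * rng.standard_normal(50)  # rewire g1-g2

    corr_a = np.corrcoef(expr_a, rowvar=False)
    corr_b = np.corrcoef(expr_b, rowvar=False)
    rewiring = np.abs(corr_b - corr_a)      # edge-wise rewiring score

    i, j = np.unravel_index(np.argmax(rewiring), rewiring.shape)
    print(f"most rewired edge: {genes[i]}-{genes[j]} (delta r = {rewiring[i, j]:.2f})")
    ```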

  16. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    NASA Technical Reports Server (NTRS)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
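
    As one illustration of where Brent's method fits in this kind of analysis, a contact-window edge can be found as the zero crossing of an elevation-angle function; the toy elevation model below is an assumption, not SCENIC/ITACA's actual geometry.

    ```python
    # Illustration of Brent root finding for contact-window edges: find the
    # times at which a (toy) elevation-angle function crosses zero.
    import numpy as np
    from scipy.optimize import brentq

    def elevation_deg(t):
        """Toy elevation of an asset above a 5-degree mask vs. time (s)."""
        return 30.0 * np.sin(2 * np.pi * t / 5400.0) - 5.0

    # Brackets where the sign changes mark the start and end of the window.
    rise = brentq(elevation_deg, 0.0, 1350.0)
    set_ = brentq(elevation_deg, 1350.0, 2700.0)
    print(f"contact window: {rise:.1f} s to {set_:.1f} s")
    ```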

  17. GARNET--gene set analysis with exploration of annotation relations.

    PubMed

    Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu

    2011-02-15

    Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties for biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation databases, statistical analysis & visualization of annotation relationships, and management of gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules--gene set manager, gene set analysis and gene set retrieval--which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation networks has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
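
    A kappa statistic between two annotation gene sets can be computed by treating each set as a binary membership vector over a common gene universe, as in this sketch; the tiny universe and gene sets are invented for illustration.

    ```python
    # Sketch of a pairwise annotation relationship measure: Cohen's kappa
    # between two gene sets, viewed as binary membership vectors over a
    # common gene universe (invented here for illustration).
    from sklearn.metrics import cohen_kappa_score

    universe = [f"gene{i}" for i in range(1, 21)]
    go_term = {"gene1", "gene2", "gene3", "gene4", "gene5"}
    pathway = {"gene2", "gene3", "gene4", "gene5", "gene6"}

    membership_a = [g in go_term for g in universe]
    membership_b = [g in pathway for g in universe]

    kappa = cohen_kappa_score(membership_a, membership_b)
    print(f"kappa = {kappa:.2f}")  # near 1: redundant terms; near 0: unrelated
    ```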

  18. MatSeis and the GNEM R&E regional seismic analysis tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul; Hart, Darren M.; Young, Christopher John

    2003-08-01

    To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, 3-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream). Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999, and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.
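
    As a sketch of the idea behind the Dendro Tool, the following clusters a handful of synthetic "waveforms" by agglomerative hierarchical clustering on a correlation-derived distance; a real implementation would operate on recorded seismograms.

    ```python
    # Sketch of waveform-correlation clustering: build an event-pair distance
    # from correlation and cluster agglomeratively. Waveforms are synthetic.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 500)
    template_a = np.sin(2 * np.pi * 8 * t)
    template_b = np.sin(2 * np.pi * 3 * t)
    waveforms = np.array([tpl + 0.2 * rng.standard_normal(t.size)
                          for tpl in (template_a, template_a, template_b, template_b)])

    corr = np.corrcoef(waveforms)      # event-pair similarity
    dist = 1.0 - corr                  # correlation -> distance
    # Condensed distance vector for linkage: upper-triangle entries.
    condensed = dist[np.triu_indices(len(waveforms), k=1)]
    tree = linkage(condensed, method="average")
    print(fcluster(tree, t=0.5, criterion="distance"))  # e.g. [1 1 2 2]
    ```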

  19. Systemic safety project selection tool.

    DOT National Transportation Integrated Search

    2013-07-01

    "The Systemic Safety Project Selection Tool presents a process for incorporating systemic safety planning into traditional safety management processes. The Systemic Tool provides a step-by-step process for conducting systemic safety analysis; conside...

  20. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also promote the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  1. Toward Enhancing Automated Credibility Assessment: A Model for Question Type Classification and Tools for Linguistic Analysis

    ERIC Educational Resources Information Center

    Moffitt, Kevin Christopher

    2011-01-01

    The three objectives of this dissertation were to develop a question type model for predicting linguistic features of responses to interview questions, create a tool for linguistic analysis of documents, and use lexical bundle analysis to identify linguistic differences between fraudulent and non-fraudulent financial reports. First, The Moffitt…

  2. HUMAN HEALTH METRICS FOR ENVIRONMENTAL DECISION SUPPORT TOOLS: LESSONS FROM HEALTH ECONOMICS AND DECISION ANALYSIS: JOURNAL ARTICLE

    EPA Science Inventory

    NRMRL-CIN-1351 Hofstetter**, P., and Hammitt, J. K. Human Health Metrics for Environmental Decision Support Tools: Lessons from Health Economics and Decision Analysis. Risk Analysis 600/R/01/104, Available: on internet, www.epa.gov/ORD/NRMRL/Pubs/600R01104, [NET]. 03/07/2001 D...

  3. Combining the Bourne-Shell, sed and awk in the UNIX Environment for Language Analysis.

    ERIC Educational Resources Information Center

    Schmitt, Lothar M.; Christianson, Kiel T.

    This document describes how to construct tools for language analysis in research and teaching using the Bourne-shell, sed, and awk, three standard text-processing tools in the UNIX operating system. Applications include: searches for words, phrases, grammatical patterns, and phonemic patterns in text; statistical analysis of text in regard to such searches,…

  4. On the blind use of statistical tools in the analysis of globular cluster stars

    NASA Astrophysics Data System (ADS)

    D'Antona, Francesca; Caloi, Vittoria; Tailo, Marco

    2018-04-01

    As with most data analysis methods, the Bayesian method must be handled with care. We show that its application to determine stellar evolution parameters within globular clusters can lead to paradoxical results if used without the necessary precautions. This is a cautionary tale on the use of statistical tools for big data analysis.

  5. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  6. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  7. Cytoscape: the network visualization tool for GenomeSpace workflows.

    PubMed

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September 2013.

  8. Cytoscape: the network visualization tool for GenomeSpace workflows

    PubMed Central

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P.

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September 2013. PMID:25165537

  9. Transonic CFD applications at Boeing

    NASA Technical Reports Server (NTRS)

    Tinoco, E. N.

    1989-01-01

    The use of computational methods for three-dimensional transonic flow design and analysis at the Boeing Company is presented. A range of computational tools, consisting of production tools for everyday use by project engineers, expert-user tools for special applications by computational researchers, and an emerging tool which may see considerable use in the near future, is described. These methods include full potential and Euler solvers, some coupled to three-dimensional boundary layer analysis methods, for transonic flow analysis about nacelle, wing-body, wing-body-strut-nacelle, and complete aircraft configurations. As the examples presented show, such a toolbox of codes is necessary for the variety of applications typical of an industrial environment. Such a toolbox of codes makes possible aerodynamic advances not previously achievable in a timely manner, if at all.

  10. Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users

    NASA Technical Reports Server (NTRS)

    Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven

    2017-01-01

    Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.

  11. Financial Statement Analysis for Colleges and Universities.

    ERIC Educational Resources Information Center

    Woelfel, Charles J.

    1987-01-01

    Presents ratio analysis of financial statements as a tool applicable for use by nonprofit institutions for evaluation of financial and operational performance of an institution. It can be used as a screening, forecasting, diagnostic, and evaluative tool for administration and governance. (MD)

  12. Rapid Benefit Indicators (RBI) Spatial Analysis Tools

    EPA Science Inventory

    The Rapid Benefit Indicators (RBI) approach consists of five steps and is outlined in Assessing the Benefits of Wetland Restoration - A Rapid Benefits Indicators Approach for Decision Makers. This spatial analysis tool is intended to be used to analyze existing spatial informatio...

  13. MODA A Framework for Memory Centric Performance Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Su, Chun-Yi; White, Amanda M.

    2012-06-29

    In the age of massive parallelism, the focus of performance analysis has switched from the processor and related structures to memory and I/O resources. Adapting to this new reality, a performance analysis tool has to provide a way to analyze resource usage to pinpoint existing and potential problems in a given application. This paper provides an overview of the Memory Observant Data Analysis (MODA) tool, a memory-centric tool first implemented on the Cray XMT supercomputer. Throughout the paper, MODA's capabilities are showcased with experiments on matrix multiply and Graph-500 application codes.

  14. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  15. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to add redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
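
    The point about correlated failures can be made concrete with a small Monte Carlo sketch: a redundant pair helps much less once a common-cause failure mode is present. All probabilities below are invented for illustration, not MSR mission values.

    ```python
    # Monte Carlo sketch: redundancy helps less when parallel subsystems share
    # a common-cause failure. Probabilities are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    p_independent = 0.10   # each unit's independent failure probability
    p_common = 0.02        # common-cause event that fails both units at once

    unit_a = rng.random(n) < p_independent
    unit_b = rng.random(n) < p_independent
    common = rng.random(n) < p_common

    pair_fails = (unit_a & unit_b) | common
    print(f"P(pair fails), independent only: {p_independent**2:.4f}")
    print(f"P(pair fails), with common cause: {pair_fails.mean():.4f}")
    ```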

  16. Implementation of GenePattern within the Stanford Microarray Database.

    PubMed

    Hubble, Jeremy; Demeter, Janos; Jin, Heng; Mao, Maria; Nitzberg, Michael; Reddy, T B K; Wymore, Farrell; Zachariah, Zachariah K; Sherlock, Gavin; Ball, Catherine A

    2009-01-01

    Hundreds of researchers across the world use the Stanford Microarray Database (SMD; http://smd.stanford.edu/) to store, annotate, view, analyze and share microarray data. In addition to providing registered users at Stanford access to their own data, SMD also provides access to public data, and tools with which to analyze those data, to any public user anywhere in the world. Previously, the addition of new microarray data analysis tools to SMD has been limited by available engineering resources, and in addition, the existing suite of tools did not provide a simple way to design, execute and share analysis pipelines, or to document such pipelines for the purposes of publication. To address this, we have incorporated the GenePattern software package directly into SMD, providing access to many new analysis tools, as well as a plug-in architecture that allows users to directly integrate and share additional tools through SMD. In this article, we describe our implementation of the GenePattern microarray analysis software package into the SMD code base. This extension is available with the SMD source code that is fully and freely available to others under an Open Source license, enabling other groups to create a local installation of SMD with an enriched data analysis capability.

  17. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    PubMed

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  18. Portfolio: a prototype workstation for development and evaluation of tools for analysis and management of digital portal images.

    PubMed

    Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G

    1998-09-01

    The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented data base infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. The acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.

  19. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    PubMed

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  20. Language-Agnostic Reproducible Data Analysis Using Literate Programming

    PubMed Central

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir. PMID:27711123

  1. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
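
    A schematic of the claimed architecture, reduced to a few Python stubs, is given below; the component names, multipliers and energy-intensity figure are illustrative assumptions, not the patented implementation.

    ```python
    # Schematic of the described architecture: a component library, a modeling
    # step, an analysis engine applying energy conservation measures (ECMs),
    # and a recommendation step. All names and numbers are illustrative.
    from dataclasses import dataclass

    @dataclass
    class BuildingModel:
        annual_kwh: float

    COMPONENT_LIBRARY = {"window_low_e": 0.97, "led_lighting": 0.90}  # kWh multipliers

    def baseline_model(floor_area_m2: float) -> BuildingModel:
        return BuildingModel(annual_kwh=floor_area_m2 * 150.0)  # assumed intensity

    def apply_ecm(model: BuildingModel, ecm: str) -> BuildingModel:
        return BuildingModel(annual_kwh=model.annual_kwh * COMPONENT_LIBRARY[ecm])

    baseline = baseline_model(1000.0)
    for ecm in COMPONENT_LIBRARY:
        optimized = apply_ecm(baseline, ecm)
        saving = baseline.annual_kwh - optimized.annual_kwh
        print(f"recommend {ecm}: saves {saving:.0f} kWh/yr")
    ```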

  2. A Review of Evidence Presented in Support of Three Key Claims in the Validity Argument for the "TextEvaluator"® Text Analysis Tool. Research Report. ETS RR-16-12

    ERIC Educational Resources Information Center

    Sheehan, Kathleen M.

    2016-01-01

    The "TextEvaluator"® text analysis tool is a fully automated text complexity evaluation tool designed to help teachers and other educators select texts that are consistent with the text complexity guidelines specified in the Common Core State Standards (CCSS). This paper provides an overview of the TextEvaluator measurement approach and…

  3. An Overview of Promising Grades of Tool Materials Based on the Analysis of their Physical-Mechanical Characteristics

    NASA Astrophysics Data System (ADS)

    Kudryashov, E. A.; Smirnov, I. M.; Grishin, D. V.; Khizhnyak, N. A.

    2018-06-01

    The work is aimed at selecting a promising grade of tool material whose physical-mechanical characteristics allow it to be used for machining the surfaces of discontinuous parts in the presence of shock loads. An analysis of the physical-mechanical characteristics of the most common tool materials is performed, and data on the possible provision of metal-working processes with promising composite grades are presented.

  4. Tool Use in a Psychomotor Task: The Role of Tool and Learner Variables

    ERIC Educational Resources Information Center

    Juarez-Collazo, Norma A.; Lust, Griet; Elen, Jan; Clarebout, Geraldine

    2011-01-01

    Research on the use of learning tools has brought to light variables that influence the learner on using or not using the tools. A deeper analysis on the current findings is attempted in this study. It adds a psychomotor task; it assesses the actual functionality of the employed tools, and it further explores learner-related variables that…

  5. Introducing W.A.T.E.R.S.: a workflow for the alignment, taxonomy, and ecology of ribosomal sequences.

    PubMed

    Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A

    2010-06-12

    For more than two decades microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene (16S rRNA), encoded by ribosomal DNA (16S rDNA), has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach for 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, phylogenetic tree construction as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented 'workflow' approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system. Furthermore, new "actors" can be added to the workflow as desired and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.
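
    The workflow structure can be sketched as a chain of composable stages, one per 16S rDNA analysis step; the step bodies below are placeholders, since WATERS actually wires published tools together inside the Kepler system.

    ```python
    # Sketch of the workflow idea: chain the standard 16S rDNA analysis steps
    # as composable stages. Step bodies are placeholders only.
    def align(seqs):            return {"aligned": seqs}
    def remove_chimeras(data):  return {**data, "chimera_free": True}
    def pick_otus(data):        return {**data, "otus": ["OTU1", "OTU2"]}
    def assign_taxonomy(data):  return {**data, "taxa": {"OTU1": "Bacteroides"}}
    def build_tree(data):       return {**data, "tree": "(OTU1,OTU2);"}

    PIPELINE = [align, remove_chimeras, pick_otus, assign_taxonomy, build_tree]

    result = ["ACGT...", "TTGA..."]
    for step in PIPELINE:
        result = step(result)
    print(result["tree"])
    ```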

  6. The Watershed Deposition Tool: A Tool for Incorporating Atmospheric Deposition in Watershed Analysis

    EPA Science Inventory

    The tool for providing the linkage between air and water quality modeling needed for determining the Total Maximum Daily Load (TMDL) and for analyzing related nonpoint-source impacts on watersheds has been developed. The Watershed Deposition Tool (WDT) takes gridded output of at...

  7. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
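
    Two of the re-analysis options listed above, the t-test and PCA, can be sketched on a simulated metabolite response matrix as follows; real use would start from processed GC/MS peak data retrieved from the repository.

    ```python
    # Sketch of t-test and PCA re-analysis on a simulated metabolite matrix.
    import numpy as np
    from scipy.stats import ttest_ind
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(5)
    control = rng.normal(1.0, 0.2, size=(8, 30))    # 8 samples x 30 metabolites
    treated = rng.normal(1.0, 0.2, size=(8, 30))
    treated[:, 0] += 0.8                            # metabolite 0 responds

    t_stat, p_val = ttest_ind(treated, control, axis=0)
    print("metabolites with p < 0.01:", np.where(p_val < 0.01)[0])

    scores = PCA(n_components=2).fit_transform(np.vstack([control, treated]))
    print("PC1 range:", scores[:, 0].min(), "to", scores[:, 0].max())
    ```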

  8. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618

  9. Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.

    PubMed

    Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E

    2015-01-01

    Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorers' ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.

  10. CRCDA—Comprehensive resources for cancer NGS data analysis

    PubMed Central

    Thangam, Manonanthini; Gopal, Ramesh Kumar

    2015-01-01

    Next generation sequencing (NGS) innovations put a compelling landmark in life science and changed the direction of research in clinical oncology with its productivity to diagnose and treat cancer. The aim of our portal, Comprehensive Resources for Cancer NGS Data Analysis (CRCDA), is to provide a collection of different NGS tools and pipelines under diverse classes, together with cancer pathways and databases and, furthermore, literature information from PubMed. The literature data was constrained to the 18 most common cancer types, such as breast cancer, colon cancer and other cancers that exhibit in the worldwide population. NGS-cancer tools have been categorized for convenience into cancer genomics, cancer transcriptomics, cancer epigenomics, quality control and visualization. Pipelines for variant detection, quality control and data analysis were listed to provide an out-of-the-box solution for NGS data analysis, which may help researchers to overcome challenges in selecting and configuring individual tools for analysing exome, whole genome and transcriptome data. An extensive search page was developed that can be queried by using (i) type of data [literature, gene data and sequence read archive (SRA) data] and (ii) type of cancer (selected based on global incidence and accessibility of data). For each category of analysis, a variety of tools is available, and the biggest challenge is in searching for and using the right tool for the right application. The objective of this work is to collect the tools available in each category from various sources and to arrange the tools and other data in a simple and user-friendly manner for biologists and oncologists to find information more easily. To the best of our knowledge, we have collected and presented a comprehensive package of most of the resources available in cancer for NGS data analysis. Given these factors, we believe that this website will be a useful resource to the NGS research community working on cancer. Database URL: http://bioinfo.au-kbc.org.in/ngs/ngshome.html. PMID:26450948

  11. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    NASA Astrophysics Data System (ADS)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode Effect Analysis (FMEA) is one of the most effective and widely accepted problem solving (PS) tools in industry. Since FMEA was first introduced in 1949, practitioners have implemented it in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA has been integrated with other PS tools, such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality, to address these drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of customers' perspective. This study concludes by discussing the gaps and opportunities in these integrations for future research.
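
    For context, the classic FMEA prioritization metric is the risk priority number, RPN = severity × occurrence × detection, whose known weaknesses are among the drivers of the integrations surveyed; the failure modes and ratings in this sketch are invented.

    ```python
    # Worked example of the classic FMEA risk priority number, RPN = S * O * D
    # (severity, occurrence, detection, each rated 1-10). Failure modes and
    # ratings are invented for illustration.
    failure_modes = [
        # (description, severity, occurrence, detection)
        ("seal leak",         7, 4, 3),
        ("connector fatigue", 5, 6, 2),
        ("sensor drift",      4, 3, 8),
    ]

    ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
    for desc, s, o, d in ranked:
        print(f"{desc:18s} RPN = {s * o * d}")
    ```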

  12. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel

    PubMed Central

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-01-01

    Background Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. Findings The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis requirements). Conclusion Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software. PMID:19852806

  13. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    PubMed

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have facile access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool for basic genetic and epidemiological analysis and data conversion in MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and basic statistical analysis requirements). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and converts them into other formats. It is free software.

  14. Automated SEM and TEM sample preparation applied to copper/low k materials

    NASA Astrophysics Data System (ADS)

    Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.

    2001-01-01

    We describe the use of automated microcleaving for preparation of both SEM and TEM samples as done by SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool that is capable of producing cleaves with 0.25 μm accuracy, resulting in SEM-ready samples. The TEMstation is capable of taking a sample output from the MC500 (or from SELA's earlier MC200 tool) and producing a FIB-ready slice of 25±5 μm, mounted on a TEM-washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections: experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data is presented to show the quality of preparation provided by these new automated tools.

  15. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    PubMed Central

    Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576

  16. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    PubMed

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  17. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    NASA Astrophysics Data System (ADS)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current WebGIS open-source tools are evaluated in order to give an overview and contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the obtained results compare favorably with previous literature for hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.
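
    As a hedged illustration of the kind of spectral computation behind such mineral characterization (not PlanetServer's actual API), the sketch below computes a continuum-removed band depth, a standard hyperspectral summary-product formula, on a toy spectrum; the wavelengths and shoulder positions are assumptions.

        import numpy as np

        def band_depth(wavelengths, spectrum, left, center, right):
            """Continuum-removed band depth at `center`, with a straight
            continuum drawn between the `left` and `right` shoulders."""
            r_left, r_center, r_right = np.interp(
                [left, center, right], wavelengths, spectrum)
            t = (center - left) / (right - left)
            r_continuum = r_left + t * (r_right - r_left)
            return 1.0 - r_center / r_continuum

        # Toy spectrum with an absorption near 2.30 um, roughly where
        # Fe/Mg phyllosilicates such as chlorite absorb.
        wl = np.linspace(1.0, 2.6, 200)
        refl = 0.3 - 0.05 * np.exp(-((wl - 2.30) / 0.02) ** 2)
        print(band_depth(wl, refl, 2.25, 2.30, 2.35))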

  18. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  19. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  20. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE PAGES

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...

    2017-12-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  1. Scalability of Comparative Analysis, Novel Algorithms and Tools (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Mavrommatis, Kostas

    2017-12-22

    DOE JGI's Kostas Mavrommatis, chair of the Scalability of Comparative Analysis, Novel Algorithms and Tools panel, at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  2. Economics of Agroforestry

    Treesearch

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis…
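
    A capital-budgeting comparison of the kind such tools support reduces to a net present value calculation, sketched below; the cash flows and discount rate are purely illustrative.

        def npv(rate, cash_flows):
            """Net present value of yearly cash flows; cash_flows[0] is year 0."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # Toy comparison: agroforestry has higher upfront cost but rising returns.
        agroforestry = [-1000, 150, 300, 450, 450, 600]
        monoculture = [-400, 250, 250, 250, 250, 250]
        for name, flows in (("agroforestry", agroforestry),
                            ("monoculture", monoculture)):
            print(name, round(npv(0.08, flows), 2))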

  3. Measuring Security Effectiveness and Efficiency at U.S. Commercial Airports

    DTIC Science & Technology

    2013-03-01

    …formative program evaluation and policy analysis to investigate current airport security programs. It identifies innovative public administration and policy-analysis tools that could provide potential benefits to airport security. These tools will complement the System Based Risk Management framework if…

  4. HISTORICAL ANALYSIS, A VALUABLE TOOL IN COMMUNITY-BASED ENVIRONMENTAL PROTECTION

    EPA Science Inventory

    A historical analysis of the ecological consequences of development can be a valuable tool in community-based environmental protection. These studies can engage the public in environmental issues and lead to informed decision making. Historical studies provide an understanding of...

  5. Communications Effects Server (CES) Model for Systems Engineering Research

    DTIC Science & Technology

    2012-01-31

    Describes the Communications Effects Server (CES) model for systems engineering research, including logical interfaces to visualization, HLA, DIS, and STK tools; execution kernels that interoperate with STK when running simulations; and GUI components such as the Architect, which represents the main network design and visualization component, alongside blocks for third-party visualization, analysis, and text-editing tools.

  6. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

    Application performance is hindered by a variety of factors but most notably driven by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key if we are trying to optimize performance. Understanding application performance properties is facilitated with various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools such as a cache simulator, Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool is dependent on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss any theoretical benefits that could be achieved if such modules were present.
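
    As a loose sketch of the kind of post-processing such traces invite (Gleipnir's real output format is not reproduced here, so the line format below is an assumption), a text trace can be aggregated into per-thread, per-function load/store counts:

        from collections import Counter

        # Assumed trace line format: "<tid> <function> <L|S> <address>".
        def summarize(trace_path):
            """Count loads and stores per (thread, function, operation)."""
            counts = Counter()
            with open(trace_path) as f:
                for line in f:
                    tid, func, op, _addr = line.split()
                    counts[(tid, func, op)] += 1
            return counts

        for (tid, func, op), n in sorted(summarize("gleipnir.trace").items()):
            kind = "loads" if op == "L" else "stores"
            print(f"thread {tid} {func} {kind}: {n}")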

  7. SentiHealth-Cancer: A sentiment analysis tool to help detecting mood of patients in online social networks.

    PubMed

    Rodrigues, Ramon Gouveia; das Dores, Rafael Marques; Camilo-Junior, Celso G; Rosa, Thierson Couto

    2016-01-01

    Cancer is a critical disease that affects millions of people and families around the world. In 2012 about 14.1 million new cases of cancer occurred globally. For many reasons, such as the severity of some cases, the side effects of some treatments, and the death of other patients, cancer patients tend to be affected by serious emotional disorders, like depression, for instance. Thus, monitoring the mood of the patients is an important part of their treatment. Many cancer patients are users of online social networks, and many of them take part in cancer virtual communities where they exchange messages commenting on their treatment or giving support to other patients in the community. Most of these communities are publicly accessible and thus are useful sources of information about the mood of patients. Based on that, sentiment analysis methods can be useful to automatically detect positive or negative mood of cancer patients by analyzing their messages in these online communities. The objective of this work is to present a sentiment analysis tool, named SentiHealth-Cancer (SHC-pt), that improves the detection of the emotional state of patients in Brazilian online cancer communities by inspecting their posts written in the Portuguese language. The SHC-pt is a sentiment analysis tool which is tailored specifically to detect positive, negative or neutral messages of patients in online communities of cancer patients. We conducted a comparative study of the proposed method with a set of general-purpose sentiment analysis tools adapted to this context. Different collections of posts were obtained from two cancer communities on Facebook. Additionally, the posts were analyzed by sentiment analysis tools that support the Portuguese language (Semantria and SentiStrength) and by the SHC-pt tool, developed on the basis of the method proposed in this paper, called SentiHealth. Moreover, as a second alternative to analyze the texts in Portuguese, the collected texts were automatically translated into English and submitted to sentiment analysis tools that do not support the Portuguese language (AlchemyAPI and Textalytics), and also to Semantria and SentiStrength using the English option of these tools. Six experiments were conducted with some variations and different origins of the collected posts. The results were measured using the following metrics: precision, recall, F1-measure and accuracy. The proposed tool SHC-pt reached the best averages for accuracy and F1-measure (harmonic mean of recall and precision) in the three sentiment classes addressed (positive, negative and neutral) in all experimental settings. Moreover, the worst accuracy value (58%) achieved by SHC-pt in any experiment is 11.53% better than the greatest accuracy (52%) presented by the other addressed tools. Finally, the worst average F1 (48.46%) reached by SHC-pt in any experiment is 4.14% better than the greatest average F1 (46.53%) achieved by the other addressed tools. Thus, even when the SHC-pt results in a more complex scenario are compared with those of other tools in an easier scenario, SHC-pt is better. This paper presents two contributions. First, it proposes the SentiHealth method to detect the mood of cancer patients who are also users of communities of patients in online social networks. Second, it presents a tool instantiated from the method, called SentiHealth-Cancer (SHC-pt), dedicated to automatically analyzing posts in communities of cancer patients, based on SentiHealth. This context-tailored tool outperformed other general-purpose sentiment analysis tools, at least in the cancer context. This suggests that the SentiHealth method could be instantiated as other disease-based tools in future work, for instance SentiHealth-HIV, SentiHealth-Stroke and SentiHealth-Sclerosis. Copyright © 2015. Published by Elsevier Ireland Ltd.
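
    A minimal sketch of the lexicon-based flavor of sentiment classification (the actual SentiHealth method and its Portuguese lexicon are not reproduced; the word lists below are toy assumptions):

        POSITIVE = {"hope", "better", "grateful", "improving", "support"}
        NEGATIVE = {"pain", "tired", "afraid", "worse", "alone"}

        def classify(post):
            """Label a post positive/negative/neutral by lexicon word counts."""
            words = post.lower().split()
            score = (sum(w in POSITIVE for w in words)
                     - sum(w in NEGATIVE for w in words))
            return "positive" if score > 0 else "negative" if score < 0 else "neutral"

        print(classify("feeling tired and in pain today"))             # negative
        print(classify("so grateful for the support, feeling better"))  # positive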

  8. Terminal Area Conflict Detection and Resolution Tool

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora

    2011-01-01

    This poster will describe analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as low as 2 per hour.

  9. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available in the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at https://gitlab.com/rki_bioinformatics .
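
    As an illustrative sketch of integration by co-occurrence, the rough idea MetaMeta's merging step builds on (not its actual implementation), one can require each taxon to be supported by a minimum number of tools and average its abundances; the profiles below are invented.

        from collections import defaultdict

        # Hypothetical per-tool profiles: taxon -> relative abundance.
        profiles = {
            "tool_a": {"E. coli": 0.40, "B. subtilis": 0.35, "S. aureus": 0.25},
            "tool_b": {"E. coli": 0.50, "B. subtilis": 0.30},
            "tool_c": {"E. coli": 0.45, "S. aureus": 0.20, "P. putida": 0.35},
        }

        def merge(profiles, min_support=2):
            """Keep taxa reported by >= min_support tools, average their
            abundances, and renormalize the result."""
            votes, total = defaultdict(int), defaultdict(float)
            for prof in profiles.values():
                for taxon, abundance in prof.items():
                    votes[taxon] += 1
                    total[taxon] += abundance
            merged = {t: total[t] / votes[t]
                      for t in votes if votes[t] >= min_support}
            norm = sum(merged.values())
            return {t: a / norm for t, a in merged.items()}

        print(merge(profiles))  # P. putida (reported by one tool) is dropped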

  10. Antigen Receptor Galaxy: A User-Friendly, Web-Based Tool for Analysis and Visualization of T and B Cell Receptor Repertoire Data

    PubMed Central

    IJspeert, Hanna; van Schouwenburg, Pauline A.; van Zessen, David; Pico-Knijnenburg, Ingrid

    2017-01-01

    Antigen Receptor Galaxy (ARGalaxy) is a Web-based tool for analyses and visualization of TCR and BCR sequencing data of 13 species. ARGalaxy consists of four parts: the demultiplex tool, the international ImMunoGeneTics information system (IMGT) concatenate tool, the immune repertoire pipeline, and the somatic hypermutation (SHM) and class switch recombination (CSR) pipeline. Together they allow the analysis of all different aspects of the immune repertoire. All pipelines can be run independently or combined, depending on the available data and the question of interest. The demultiplex tool allows data trimming and demultiplexing, whereas with the concatenate tool multiple IMGT/HighV-QUEST output files can be merged into a single file. The immune repertoire pipeline is an extended version of our previously published ImmunoGlobulin Galaxy (IGGalaxy) virtual machine that was developed to visualize V(D)J gene usage. It allows analysis of both BCR and TCR rearrangements, visualizes CDR3 characteristics (length and amino acid usage) and junction characteristics, and calculates the diversity of the immune repertoire. Finally, ARGalaxy includes the newly developed SHM and CSR pipeline to analyze SHM and/or CSR in BCR rearrangements. It analyzes the frequency and patterns of SHM, Ag selection (including BASELINe), clonality (Change-O), and CSR. The functionality of the ARGalaxy tool is illustrated in several clinical examples of patients with primary immunodeficiencies. In conclusion, ARGalaxy is a novel tool for the analysis of the complete immune repertoire, which is applicable to many patient groups with disturbances in the immune repertoire such as autoimmune diseases, allergy, and leukemia, but it can also be used to address basic research questions in repertoire formation and selection. PMID:28416602
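
    As a small worked example of a repertoire readout of the kind such pipelines report (not ARGalaxy's own code), the following computes Shannon diversity and a CDR3 length distribution over a toy clonotype list:

        import math
        from collections import Counter

        # Hypothetical clonotypes, e.g. CDR3 amino-acid sequences.
        clonotypes = ["CARDYW", "CARDYW", "CASSLGF", "CARGTF", "CASSLGF", "CARDYW"]

        def shannon_diversity(clones):
            """Shannon entropy H = -sum(p_i * ln p_i) over clonotype frequencies."""
            counts = Counter(clones)
            n = sum(counts.values())
            return -sum((c / n) * math.log(c / n) for c in counts.values())

        print(f"H = {shannon_diversity(clonotypes):.3f}")
        print(Counter(len(c) for c in clonotypes))  # CDR3 length distribution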

  11. Multispectral analysis tools can increase utility of RGB color images in histology

    NASA Astrophysics Data System (ADS)

    Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard

    2018-04-01

    Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools is demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
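
    A minimal sketch of the spectral phasor idea applied to a three-channel image, assuming the standard first-harmonic Fourier definition rather than the authors' specific implementation:

        import numpy as np

        def spectral_phasor(stack):
            """First-harmonic phasor coordinates (G, S) per pixel for a
            stack of shape (channels, H, W); here channels = 3 (R, G, B)."""
            n = stack.shape[0]
            k = np.arange(n).reshape(-1, 1, 1)
            total = stack.sum(axis=0) + 1e-12   # guard against empty pixels
            g = (stack * np.cos(2 * np.pi * k / n)).sum(axis=0) / total
            s = (stack * np.sin(2 * np.pi * k / n)).sum(axis=0) / total
            return g, s

        rgb = np.random.rand(3, 64, 64)   # stand-in for a real RGB image
        g, s = spectral_phasor(rgb)
        # Each pixel maps to a point in (G, S) space; spectrally similar
        # pixels cluster, which is what makes the display plots useful.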

  12. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
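
    As a toy sketch of the coverage- and allele-fraction-based filtering such pipelines apply (the tool's actual thresholds and record format are not given in this abstract, so everything below is an assumption):

        # Hypothetical variant calls: (position, ref, alt, depth, alt_reads).
        variants = [
            (43071077, "C", "T", 850, 410),
            (32906729, "A", "G", 35, 3),    # low coverage, low fraction
            (43091032, "G", "A", 600, 9),   # likely sequencing noise
        ]

        def filter_variants(records, min_depth=100, min_vaf=0.10):
            """Keep calls with adequate depth and variant allele fraction."""
            kept = []
            for pos, ref, alt, depth, alt_reads in records:
                vaf = alt_reads / depth
                if depth >= min_depth and vaf >= min_vaf:
                    kept.append((pos, ref, alt, depth, round(vaf, 3)))
            return kept

        print(filter_variants(variants))  # only the first call survives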

  13. PAINT: a promoter analysis and interaction network generation tool for gene regulatory network identification.

    PubMed

    Vadigepalli, Rajanikanth; Chakravarthula, Praveen; Zak, Daniel E; Schwaber, James S; Gonye, Gregory E

    2003-01-01

    We have developed a bioinformatics tool named PAINT that automates the promoter analysis of a given set of genes for the presence of transcription factor binding sites. Based on coincidence of regulatory sites, this tool produces an interaction matrix that represents a candidate transcriptional regulatory network. This tool currently consists of (1) a database of promoter sequences of known or predicted genes in the Ensembl annotated mouse genome database, (2) various modules that can retrieve and process the promoter sequences for binding sites of known transcription factors, and (3) modules for visualization and analysis of the resulting set of candidate network connections. This information provides a substantially pruned list of genes and transcription factors that can be examined in detail in further experimental studies on gene regulation. Also, the candidate network can be incorporated into network identification methods in the form of constraints on feasible structures in order to render the algorithms tractable for large-scale systems. The tool can also produce output in various formats suitable for use in external visualization and analysis software. In this manuscript, PAINT is demonstrated in two case studies involving analysis of differentially regulated genes chosen from two microarray data sets. The first set is from a neuroblastoma N1E-115 cell differentiation experiment, and the second set is from neuroblastoma N1E-115 cells at different time intervals following exposure to neuropeptide angiotensin II. PAINT is available for use as an agent in BioSPICE simulation and analysis framework (www.biospice.org), and can also be accessed via a WWW interface at www.dbi.tju.edu/dbi/tools/paint/.
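
    A minimal sketch of the candidate-network matrix this description implies, with hypothetical genes and transcription factors: rows are genes, columns are TFs, and an entry is 1 when a binding site for the TF is predicted in the gene's promoter.

        # Hypothetical scan output: gene -> set of TFs with promoter hits.
        hits = {
            "GeneA": {"CREB", "SP1"},
            "GeneB": {"SP1", "NFKB"},
            "GeneC": {"CREB", "SP1", "NFKB"},
        }

        tfs = sorted({tf for s in hits.values() for tf in s})
        # Interaction matrix representing the candidate regulatory network.
        matrix = {gene: [int(tf in s) for tf in tfs] for gene, s in hits.items()}

        print("gene  " + " ".join(tfs))
        for gene, row in matrix.items():
            print(gene, " ".join(str(v) for v in row))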

  14. Ares I-X Flight Test Validation of Control Design Tools in the Frequency-Domain

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew; Hannan, Mike; Brandon, Jay; Derry, Stephen

    2011-01-01

    A major motivation of the Ares I-X flight test program was to Design for Data, in order to maximize the usefulness of the data recorded in support of Ares I modeling and validation of design and analysis tools. The Design for Data effort was intended to enable good post-flight characterizations of the flight control system, the vehicle structural dynamics, and also the aerodynamic characteristics of the vehicle. To extract the necessary data from the system during flight, a set of small predetermined Programmed Test Inputs (PTIs) was injected directly into the TVC signal. These PTIs were designed to excite the necessary vehicle dynamics while exhibiting a minimal impact on loads. The method is similar to common approaches in aircraft flight test programs, but with unique launch vehicle challenges due to rapidly changing states, short duration of flight, a tight flight envelope, and an inability to repeat any test. This paper documents the validation of the stability analysis tools against the flight data, performed by comparing the post-flight calculated frequency response of the vehicle to the frequency response calculated by the stability analysis tools used to design and analyze the preflight models during the control design effort. The comparison between the flight-day frequency response and the stability tool analysis for the simulated vehicle shows good agreement and provides a high level of confidence in the stability analysis tools for use in any future program. This is true for both a nominal model and for dispersed analysis, which shows that the flight-day frequency response is enveloped by the vehicle's preflight uncertainty models.
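
    As a hedged illustration of the frequency-domain comparison described here (using synthetic data, not the actual Ares I-X telemetry or tools), an empirical H1 frequency-response estimate can be formed from the cross- and auto-spectra of an injected sweep and the measured output:

        import numpy as np
        from scipy import signal

        fs = 100.0                       # assumed telemetry sample rate, Hz
        t = np.arange(0, 60, 1 / fs)
        u = signal.chirp(t, f0=0.2, f1=5.0, t1=60)   # stand-in for a PTI sweep

        # Toy "vehicle": a discretized second-order system plus noise.
        bd, ad = signal.cont2discrete(([4.0], [1.0, 0.8, 4.0]), 1 / fs)[:2]
        y = signal.lfilter(bd.ravel(), ad, u) + 0.01 * np.random.randn(t.size)

        # H1 estimator: H(f) = P_uy(f) / P_uu(f).
        f, p_uu = signal.welch(u, fs=fs, nperseg=1024)
        _, p_uy = signal.csd(u, y, fs=fs, nperseg=1024)
        h = p_uy / p_uu
        gain_db = 20 * np.log10(np.abs(h))
        phase_deg = np.degrees(np.angle(h))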

  15. Extended Testability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin; Maul, William A.; Fulton, Christopher

    2012-01-01

    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  16. Wavelets, non-linearity and turbulence in fusion plasmas

    NASA Astrophysics Data System (ADS)

    van Milligen, B. Ph.

    Contents: Introduction; Linear spectral analysis tools; Wavelet analysis; Wavelet spectra and coherence; Joint wavelet phase-frequency spectra; Non-linear spectral analysis tools; Wavelet bispectra and bicoherence; Interpretation of the bicoherence; Analysis of computer-generated data; Coupled van der Pol oscillators; A large eddy simulation model for two-fluid plasma turbulence; A long wavelength plasma drift wave model; Analysis of plasma edge turbulence from Langmuir probe data; Radial coherence observed on the TJ-IU torsatron; Bicoherence profile at the L/H transition on CCT; Conclusions.
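
    A minimal Fourier-based sketch of squared bicoherence, the central quantity above (the chapter's wavelet version replaces the FFT with wavelet coefficients; this simpler segment-averaged variant is for illustration only):

        import numpy as np

        def bicoherence(x, nfft=256):
            """Squared bicoherence b2(f1, f2) averaged over segments:
            b2 = |<X(f1)X(f2)X*(f1+f2)>|^2 /
                 (<|X(f1)X(f2)|^2> <|X(f1+f2)|^2>)."""
            segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
            spectra = [np.fft.rfft(s * np.hanning(nfft)) for s in segs]
            nf = nfft // 2 + 1
            num = np.zeros((nf, nf), complex)
            d1 = np.zeros((nf, nf))
            d2 = np.zeros((nf, nf))
            for X in spectra:
                for f1 in range(nf):
                    for f2 in range(nf - f1):
                        prod = X[f1] * X[f2]
                        num[f1, f2] += prod * np.conj(X[f1 + f2])
                        d1[f1, f2] += abs(prod) ** 2
                        d2[f1, f2] += abs(X[f1 + f2]) ** 2
            return np.abs(num) ** 2 / (d1 * d2 + 1e-30)

        # Quadratically coupled test signal: the component at 0.17 is the sum
        # frequency of 0.05 and 0.12, so b2 near (0.05, 0.12) should be high.
        t = np.arange(4096)
        x = (np.cos(2 * np.pi * 0.05 * t) + np.cos(2 * np.pi * 0.12 * t)
             + 0.5 * np.cos(2 * np.pi * 0.17 * t) + 0.1 * np.random.randn(t.size))
        b2 = bicoherence(x)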

  17. System Analysis Applied to Autonomy: Application to High-Altitude Long-Endurance Remotely Operated Aircraft

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.

    2006-01-01

    Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.

  18. The JASMIN Analysis Platform - bridging the gap between traditional climate data practices and data-centric analysis paradigms

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Iwi, Alan; Kershaw, Philip; Stephens, Ag; Lawrence, Bryan

    2014-05-01

    The advent of large-scale data and the consequential analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining the familiar analysis environment wherever possible. Modern open-source analysis tools typically have multiple dependent packages, increasing the installation burden on system administrators. When you consider a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with RedHat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them the confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in the climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting across multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling, such as IPython-parallel, and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible problems.
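
    As a small example of the higher-level layer mentioned, an IPython-parallel (ipyparallel) task farm over per-file analyses might look like the sketch below; it assumes a cluster already started with ipcluster start, and the file paths and analysis function are placeholders.

        import ipyparallel as ipp

        rc = ipp.Client()                  # connect to the running cluster
        view = rc.load_balanced_view()

        def annual_mean(path):
            # Placeholder per-file analysis; in practice this might open a
            # NetCDF file and reduce one year of model output.
            import numpy as np
            return path, float(np.random.rand())

        files = [f"/badc/model/output/year_{y}.nc" for y in range(1980, 2000)]
        results = view.map_sync(annual_mean, files)   # one task per file
        print(dict(results))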

  19. Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…

  20. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    Analyzes time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.

  1. The Effectiveness of Virtual Learning Tools for Millennial Generation Students in a Community College Criminal Justice Degree Program

    ERIC Educational Resources Information Center

    Snyder, Lawrence

    2013-01-01

    An analysis of data from the Community College Survey of Student Engagement and multiyear analysis of pretest/posttest scores in introductory criminal justice courses revealed there was a systemic decline in student engagement and achievement. Because of this analysis, a commercial virtual learning tool (CJI) that purported great success in…

  2. Fault Tree Analysis: A Research Tool for Educational Planning. Technical Report No. 1.

    ERIC Educational Resources Information Center

    Alameda County School Dept., Hayward, CA. PACE Center.

    This ESEA Title III report describes fault tree analysis and assesses its applicability to education. Fault tree analysis is an operations research tool which is designed to increase the probability of success in any system by analyzing the most likely modes of failure that could occur. A graphic portrayal, which has the form of a tree, is…
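
    The arithmetic underlying a fault tree is simple probability composition over AND/OR gates of independent basic events, as in the toy sketch below; the event probabilities are invented for illustration.

        # OR gate: P = 1 - prod(1 - p_i); AND gate: P = prod(p_i),
        # assuming independent basic events.
        def p_or(*ps):
            out = 1.0
            for p in ps:
                out *= 1.0 - p
            return 1.0 - out

        def p_and(*ps):
            out = 1.0
            for p in ps:
                out *= p
            return out

        # Toy tree: the top event occurs if planning fails OR both
        # staffing AND funding fail.
        p_top = p_or(0.02, p_and(0.10, 0.15))
        print(f"P(top event) = {p_top:.4f}")   # 0.0347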

  3. Use of a Process Analysis Tool for Diagnostic Study on Fine Particulate Matter Predictions in the U.S.-Part II: Analysis and Sensitivity Simulations

    EPA Science Inventory

    Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...

  4. Effects-based strategy development through center of gravity and target system analysis

    NASA Astrophysics Data System (ADS)

    White, Christopher M.; Prendergast, Michael; Pioch, Nicholas; Jones, Eric K.; Graham, Stephen

    2003-09-01

    This paper describes an approach to effects-based planning in which a strategic-theater-level mission is refined into operational-level and ultimately tactical-level tasks and desired effects, informed by models of the expected enemy response at each level of abstraction. We describe a strategy development system that implements this approach and supports human-in-the-loop development of an effects-based plan. This system consists of plan authoring tools tightly integrated with a suite of center of gravity (COG) and target system analysis tools. A human planner employs the plan authoring tools to develop a hierarchy of tasks and desired effects. Upon invocation, the target system analysis tools use reduced-order models of enemy centers of gravity to select appropriate target set options for the achievement of desired effects, together with associated indicators for each option. The COG analysis tools also provide explicit models of the causal mechanisms linking tasks and desired effects to one another, and suggest appropriate observable indicators to guide ISR planning, execution monitoring, and campaign assessment. We are currently implementing the system described here as part of the AFRL-sponsored Effects Based Operations program.

  5. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  6. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  7. The Development of a Humanitarian Health Ethics Analysis Tool.

    PubMed

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Introduction Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  8. Advanced Vibration Analysis Tool Developed for Robust Engine Rotor Designs

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using natural-frequency curve veerings to identify ranges of operating conditions (rotational speeds and engine orders) in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued.
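
    A minimal numerical illustration of frequency curve veering, the phenomenon the proposed safety procedure exploits, using a toy two-oscillator model rather than any real bladed-disk code:

        import numpy as np

        # Two unit-mass oscillators joined by a weak coupling spring; as one
        # stiffness is swept through the nominal crossing point, the two
        # natural-frequency loci approach and then veer apart instead of
        # crossing (the minimum gap is set by the coupling strength).
        coupling = 0.05
        for k1 in np.linspace(0.8, 1.2, 9):
            K = np.array([[k1 + coupling, -coupling],
                          [-coupling, 1.0 + coupling]])
            w2 = np.sort(np.linalg.eigvalsh(K))   # squared frequencies (M = I)
            print(f"k1={k1:.2f}  w1^2={w2[0]:.4f}  w2^2={w2[1]:.4f}")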

  9. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel was used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  10. WebArray: an online platform for microarray data analysis

    PubMed Central

    Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng

    2005-01-01

    Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to utilize these tools to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL. PMID:16371165
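
    As a worked example of one listed ingredient, false discovery rate estimation, here is the textbook Benjamini-Hochberg adjustment (WebArray's own FDR code lives in limma and is not reproduced):

        import numpy as np

        def bh_fdr(pvals):
            """Benjamini-Hochberg adjusted p-values: for sorted p-values,
            q_(i) = min over j >= i of p_(j) * n / j."""
            p = np.asarray(pvals, dtype=float)
            n = p.size
            order = np.argsort(p)
            ranked = p[order] * n / np.arange(1, n + 1)
            # Enforce monotonicity from the largest p-value downwards.
            q = np.minimum.accumulate(ranked[::-1])[::-1]
            out = np.empty(n)
            out[order] = np.clip(q, 0, 1)
            return out

        print(bh_fdr([0.001, 0.008, 0.039, 0.041, 0.20]))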

  11. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and co-operational science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and new breakthroughs in space science, thus deepening our understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the demand of related research activities for e-Science, NSSC is developing a virtual space science research platform based on a cloud model, namely the Space Science Cloud (SSC). In order to support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or issue the data and archives with the services of SSC. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated, and an efficient retrieval system is under development. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., an FFT analysis tool and a minimum variance analysis tool) and mining (e.g., a proton event correlation analysis tool) are also integrated to help researchers better utilize the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the services mentioned above are based on the e-Science infrastructures of CAS, e.g., cloud storage and cloud computing. SSC provides its users with self-service storage and computing resources at the same time. At present, the prototyping of SSC is underway and the platform is expected to be put into trial operation in August 2014. We hope that as SSC develops, our vision of Digital Space may come true someday.
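
    As a sketch of one of the named analysis tools, minimum variance analysis in its textbook form (not SSC's implementation), applied to synthetic magnetic-field samples:

        import numpy as np

        def minimum_variance_analysis(b):
            """Eigen-decompose the field covariance matrix; the eigenvector
            with the smallest eigenvalue estimates the boundary-normal
            direction. `b` has shape (N, 3) for N samples of (Bx, By, Bz)."""
            cov = np.cov(b.T)                       # 3x3 covariance matrix
            eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
            return eigvals, eigvecs, eigvecs[:, 0]  # last item: normal estimate

        # Synthetic field: large variance along x, small along z.
        rng = np.random.default_rng(0)
        b = rng.normal(0.0, [5.0, 2.0, 0.3], size=(500, 3))
        vals, vecs, n_hat = minimum_variance_analysis(b)
        print("eigenvalues:", vals)
        print("normal estimate:", n_hat)   # should be close to +/- z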

  12. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
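
    A compact illustration of the core pooling step in such a meta-analysis, fixed-effect inverse-variance combination of odds ratios on the log scale; the study values below are invented.

        import math

        # Hypothetical per-study odds ratios with 95% confidence intervals.
        studies = [(1.35, 1.02, 1.79), (1.18, 0.90, 1.55), (1.52, 1.10, 2.10)]

        def fixed_effect_or(studies):
            """Inverse-variance pooling on the log-OR scale; the standard
            error is recovered from the 95% CI as (ln hi - ln lo) / (2*1.96)."""
            num = den = 0.0
            for or_, lo, hi in studies:
                se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
                w = 1.0 / se ** 2
                num += w * math.log(or_)
                den += w
            pooled, se_pooled = num / den, math.sqrt(1.0 / den)
            ci = (math.exp(pooled - 1.96 * se_pooled),
                  math.exp(pooled + 1.96 * se_pooled))
            return math.exp(pooled), ci

        print(fixed_effect_or(studies))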

  13. Using satellite data on meteorological and vegetation characteristics and soil surface humidity in the Land Surface Model for the vast territory of agricultural destination

    NASA Astrophysics Data System (ADS)

    Muzylev, Eugene; Startseva, Zoya; Uspensky, Alexander; Vasilenko, Eugene; Volkova, Elena; Kukharsky, Alexander

    2017-04-01

    The model of water and heat exchange between vegetation-covered territory and the atmosphere (LSM, Land Surface Model) for the vegetation season has been developed to calculate soil water content, evapotranspiration, infiltration of water into the soil, vertical latent and sensible heat fluxes and other water and heat balance components, as well as soil surface and vegetation cover temperatures and depth distributions of moisture and temperature. The LSM is suited to utilizing satellite-derived estimates of precipitation, land surface temperature, vegetation characteristics and soil surface humidity for each pixel. Vegetation and meteorological characteristics, being the model parameters and input variables, respectively, have been estimated from ground observations and from thematic processing of measurement data from the scanning radiometers AVHRR/NOAA, SEVIRI/Meteosat-9, -10 (MSG-2, -3) and MSU-MR/Meteor-M № 2. Values of soil surface humidity have been calculated from remote sensing data of the scatterometers ASCAT/MetOp-A, -B. The case study has been carried out for part of the agricultural Central Black Earth Region of European Russia, with an area of 227300 km2, located in the forest-steppe zone, for the 2012-2015 vegetation seasons. The main objectives of the study have been: (1) to build estimates of precipitation, land surface temperatures (LST) and vegetation characteristics from MSU-MR measurement data using refined technologies (including algorithms and programs) for thematic processing of satellite information, matured on AVHRR and SEVIRI data, with all technologies adapted to the area of interest; (2) to investigate the possibility of utilizing the satellite-derived estimates named above in the LSM, including verification of the obtained estimates and development of a procedure for inputting them into the model. From the AVHRR data, estimates have been built of precipitation and of three types of LST: land skin temperature Tsg, air temperature at the level of the vegetation cover (taken as the vegetation temperature) Ta, and effective radiation temperature Ts.eff, as well as of land surface emissivity E, normalized difference vegetation index NDVI, vegetation cover fraction B, and leaf area index LAI. The SEVIRI-based retrievals have included precipitation, LST Tls and Ta, E at daytime and nighttime, LAI (daily), and B. From the MSU-MR data, the same characteristics have been retrieved as from the AVHRR data. The MSU-MR-based daily and monthly precipitation sums have been calculated using the earlier-developed and since-modified Multi-Threshold Method (MTM), intended for round-the-clock cloud detection and cloud-type identification as well as for delineating precipitation zones and determining instantaneous maximum rainfall intensities for each pixel; the transition from assessing rainfall intensity to estimating daily values is a key element of the MTM. Measurement data from three IR MSU-MR channels (3.8, 11 and 12 μm), as well as their differences, have been used in the MTM as predictors. The correctness of the MSU-MR-derived rainfall estimates has been controlled by comparison with analogous AVHRR- and SEVIRI-based retrievals and with precipitation amounts measured at the agricultural meteorological station of the study region. The probability of determining rainfall zones from the MSU-MR data, matched against the actual ones, has been 75-85%, as it is for the AVHRR and SEVIRI data.
    The time behaviors of satellite-derived and ground-measured daily and monthly precipitation sums, for the vegetation season and the year correspondingly, have been in good agreement with each other, although the former have been smoother than the latter. Discrepancies have existed for a number of local maxima, for which satellite-derived precipitation estimates have been less than ground-measured values. This may be due to the different spatial scales of the areal satellite-derived and point ground-based estimates. Some spatial displacement of the satellite-determined rainfall maxima and minima relative to ground-based data can be explained by the discrepancy between the cloud location on satellite images and in reality at high satellite sighting angles and considerable cloud-top altitudes. The reliability of MSU-MR-derived rainfall estimates at each time step, obtained using the MTM, has been verified by comparing the values determined from the MSU-MR, AVHRR and SEVIRI measurements and distributed over the study area with similar estimates obtained by interpolation of ground observation data. The MSU-MR-derived estimates of the temperatures Tsg, Ts.eff, and Ta have been obtained using a computational algorithm developed on the basis of the MTM and matured on AVHRR and SEVIRI data for the region under investigation. Since the MSU-MR instrument is similar to the AVHRR radiometer, the developed methods for satellite estimation of Tsg, Ts.eff, and Ta from AVHRR data could easily be transferred to the MSU-MR data. Comparison of the ground-measured and MSU-MR-, AVHRR- and SEVIRI-derived LSTs has shown that the differences between all the estimates for the vast majority of observation terms have not exceeded the RMSE of these quantities built from the AVHRR data. A similar conclusion has also been drawn from the time behavior of the MSU-MR-derived LAI for the vegetation season. Satellite-based estimates of precipitation, LST, LAI and B have been utilized in the model with the help of specially developed procedures for replacing the values determined from observations at agricultural meteorological stations by their satellite-derived counterparts, taking into account the spatial heterogeneity of their fields. The adequacy of such replacement has been confirmed by the results of comparing modeled and ground-measured values of soil moisture content W and evapotranspiration Ev. Discrepancies between the modeled and ground-measured values of W and Ev have been in the range of 10-15 and 20-25%, respectively, which may be considered an acceptable result. The resulting products of the model calculations using satellite data have been spatial fields of W, Ev, vertical sensible and latent heat fluxes, and other water and heat regime characteristics for the region of interest over the 2012-2015 vegetation seasons. Thus, the possibility has been shown of utilizing MSU-MR/Meteor-M № 2 data jointly with those of other satellites in the LSM to calculate water and heat regime characteristics for the area under consideration. In addition, the first trial estimates of soil surface moisture from ASCAT scatterometer data for the study region have been obtained for the 2014-2015 vegetation seasons; they have been compared with modeling results for several agricultural meteorological stations of the region, obtained using ground-based and satellite data, and specific requirements for the obtained information have been formulated.
At present, surface moisture estimates built from ASCAT data can be used to select the model soil parameter values and the initial soil moisture conditions for the vegetation season.
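
The record names the MTM's predictors (the 3.8, 11, and 12 μm brightness temperatures and their differences) but not its thresholds. The Python sketch below only illustrates how multi-threshold tests on such predictors can flag cloudy and precipitating pixels; every threshold value is a hypothetical placeholder, and the real MTM's cloud typing and conversion of instantaneous intensities to daily totals are not shown.

```python
import numpy as np

def mtm_like_flags(t38, t11, t12):
    """Toy multi-threshold cloud/precipitation flags from IR channels.

    t38, t11, t12: brightness temperatures (K) in the 3.8, 11, and 12 um
    channels. All thresholds below are hypothetical placeholders, not the
    actual MTM settings.
    """
    t38, t11, t12 = (np.asarray(a, dtype=float) for a in (t38, t11, t12))
    split = t11 - t12
    cold_top = t11 < 250.0                 # deep, cold cloud tops
    thin_cirrus = split > 2.5              # large split-window difference
    opaque = split < 1.5                   # emissive, optically thick cloud
    night_low_cloud = (t38 - t11) < -1.0   # water-cloud signature at 3.8 um (night)
    cloudy = cold_top | thin_cirrus | night_low_cloud
    precipitating = cold_top & opaque
    return cloudy, precipitating

# Three sample pixels: clear sky, thin cirrus, deep convection.
cloudy, precip = mtm_like_flags([285.0, 262.0, 225.0],
                                [283.0, 260.0, 224.0],
                                [282.2, 256.0, 223.6])
print(cloudy)   # [False  True  True]
print(precip)   # [False False  True]
```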

  14. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis, for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and to build custom analysis tools using powerful graphical user interface (GUI) and batching tools.
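
The first of the three tools is a mass-univariate approach: one general linear model is fitted at every channel-time (or voxel) point and the resulting statistic image is then corrected for multiple comparisons. The following Python sketch illustrates that idea on synthetic data; it is not SPM code, and a simple Bonferroni bound stands in for the random field theory correction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical group study: 20 subjects, 32 channels x 50 time bins each.
n_sub, n_chan, n_time = 20, 32, 50
data = rng.normal(size=(n_sub, n_chan, n_time))
group = np.repeat([0.0, 1.0], n_sub // 2)      # two-group design

X = np.column_stack([np.ones(n_sub), group])   # intercept + group effect
Y = data.reshape(n_sub, -1)                    # one column per channel-time point

# Fit the same GLM at every point by ordinary least squares.
beta, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)
resid = Y - X @ beta
dof = n_sub - X.shape[1]
sigma2 = (resid ** 2).sum(axis=0) / dof

# t-statistic image for the group contrast c = [0, 1].
c = np.array([0.0, 1.0])
t = (c @ beta) / np.sqrt(sigma2 * (c @ np.linalg.inv(X.T @ X) @ c))

# Bonferroni threshold as a crude stand-in for random field theory.
t_crit = stats.t.ppf(1 - 0.025 / Y.shape[1], dof)
n_sig = int((np.abs(t) > t_crit).sum())
print(f"{n_sig} significant channel-time points at corrected p < 0.05")
```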

  15. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis, for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and to build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  16. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  17. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  18. Measuring laboratory-based influenza surveillance capacity: development of the 'International Influenza Laboratory Capacity Review' Tool.

    PubMed

    Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C

    2016-01-01

    The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
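
The record reports that a quantitative scoring framework was added so repeat assessments can be compared against a baseline, but it does not give the scoring scheme. A minimal Python sketch of the general idea, with invented sections, item scores, and scale:

```python
# Hypothetical capacity sections and item scores (0-3 scale); these are
# illustrative placeholders, not the Tool's actual indicators or weights.
baseline = {"specimen handling": [2, 3, 1], "PCR testing": [1, 2, 2], "biosafety": [3, 3, 2]}
followup = {"specimen handling": [3, 3, 2], "PCR testing": [2, 2, 3], "biosafety": [3, 3, 2]}

def section_scores(assessment):
    # Express each section as percent of the maximum attainable score.
    return {s: 100.0 * sum(v) / (3 * len(v)) for s, v in assessment.items()}

base, follow = section_scores(baseline), section_scores(followup)
for section in base:
    print(f"{section:18s} {base[section]:5.1f}% -> {follow[section]:5.1f}% "
          f"({follow[section] - base[section]:+.1f})")
```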

  19. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework

    NASA Technical Reports Server (NTRS)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.

    2011-01-01

    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.
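
Error propagation analysis of this kind traces how errors originating in one component can reach others along architectural connections. The Python sketch below illustrates only that generic propagation step on a made-up component graph; it is not the EDICT implementation and does not use AADL syntax.

```python
from collections import defaultdict, deque

# Hypothetical architecture: component -> downstream components it feeds.
connections = {
    "sensor_a": ["bus"],
    "sensor_b": ["bus"],
    "bus": ["voter"],
    "voter": ["actuator"],
    "actuator": [],
}

# Components where errors originate (hypothetical error types).
error_sources = {"sensor_a": "bad_value", "bus": "omission"}

def propagate(connections, error_sources):
    """Breadth-first propagation of error types along connections."""
    reached = defaultdict(set)
    queue = deque(error_sources.items())
    while queue:
        comp, err = queue.popleft()
        if err in reached[comp]:
            continue
        reached[comp].add(err)
        for downstream in connections.get(comp, []):
            queue.append((downstream, err))
    return reached

for comp, errs in propagate(connections, error_sources).items():
    print(comp, sorted(errs))
```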

  20. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde r.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at the NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.
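
Among the capabilities listed, uncertainty propagation is the most straightforward to illustrate. A minimal Monte Carlo sketch in Python, with an invented response function and input distributions standing in for a real aerospace analysis code:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain inputs (placeholders for real design variables).
thrust = rng.normal(1000.0, 50.0, n)        # N
mass = rng.normal(120.0, 5.0, n)            # kg
drag_coeff = rng.uniform(0.28, 0.34, n)     # dimensionless

# Toy response function standing in for an expensive analysis code:
# drag force at 30 m/s with air density 1.2 kg/m^3 and unit reference area.
accel = (thrust - 0.5 * drag_coeff * 1.2 * 30.0**2) / mass   # m/s^2

# Propagated statistics and a simple risk measure.
print(f"mean = {accel.mean():.2f} m/s^2, std = {accel.std():.2f}")
print(f"P(accel < 7.5) = {(accel < 7.5).mean():.3f}")
```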

  1. SHRP2 EconWorks : wider economic benefits analysis tools : final report.

    DOT National Transportation Integrated Search

    2016-01-01

    CDM Smith has completed an evaluation of the EconWorks Wider Economic Benefits (W.E.B.) : Analysis Tools for Connecticut Department of Transportation (CTDOT). The intent of this : evaluation was to compare the results of the outputs of this toolkit t...

  2. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  3. Applications of a broad-spectrum tool for conservation and fisheries analysis: aquatic gap analysis

    USGS Publications Warehouse

    McKenna, James E.; Steen, Paul J.; Lyons, John; Stewart, Jana S.

    2009-01-01

    Natural resources support all of our social and economic activities, as well as our biological existence. Humans have little control over most of the physical, biological, and sociological conditions dictating the status and capacity of natural resources in any particular area. However, the most rapid and threatening influences on natural resources typically are anthropogenic overuse and degradation. In addition, living natural resources (i.e., organisms) do not respect political boundaries, but are aware of their optimal habitat and environmental conditions. Most organisms have wider spatial ranges than the jurisdictional boundaries of environmental agencies that deal with them; even within those jurisdictions, information is patchy and disconnected. Planning and projecting effects of ecological management are difficult, because many organisms, habitat conditions, and interactions are involved. Conservation and responsible resource use involves wise management and manipulation of the aspects of the environment and biological communities that can be effectively changed. Tools and data sets that provide new insights and analysis capabilities can enhance the ability of resource managers to make wise decisions and plan effective, long-term management strategies. Aquatic gap analysis has been developed to provide those benefits. Gap analysis is more than just the assessment of the match or mis-match (i.e., gaps) between habitats of ecological value and areas with an appropriate level of environmental protection (e.g., refuges, parks, preserves), as the name suggests. Rather, a Gap Analysis project is a process which leads to an organized database of georeferenced information and previously available tools to examine conservation and other ecological issues; it provides a geographic analysis platform that serves as a foundation for aquatic ecological studies. This analytical toolbox allows one to conduct assessments of all habitat elements within an area of interest. Aquatic gap analysis naturally focuses on aquatic habitats. The analytical tools are largely based on specification of the species-habitat relations for the system and organism group of interest (Morrison et al. 2003; McKenna et al. 2006; Steen et al. 2006; Sowa et al. 2007). The Great Lakes Regional Aquatic Gap Analysis (GLGap) project focuses primarily on lotic habitat of the U.S. Great Lakes drainage basin and associated states and has been developed to address fish and fisheries issues. These tools are unique because they allow us to address problems at a range of scales from the region to the stream segment and include the ability to predict species-specific occurrence or abundance for most of the fish species in the study area. The results and types of questions that can be addressed provide better global understanding of the ecological context within which specific natural resources fit (e.g., neighboring environments and resources, and large and small scale processes). The geographic analysis platform consists of broad and flexible geospatial tools (and associated data) with many potential applications. The objectives of this article are to provide a brief overview of GLGap methods and analysis tools, and demonstrate conservation and planning applications of those data and tools.
Although there are many potential applications, we will highlight just three: (1) support for the Eastern Brook Trout Joint Venture (EBTJV), (2) Aquatic Life classification in Wisconsin, and (3) an educational tool that makes use of Google Earth (use of trade or product names does not imply endorsement by the U.S. Government) and Internet accessibility.
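
At its simplest, the gap computation described above is an overlay: units of high predicted ecological value that lack adequate protection are the gaps. A toy Python version over stream segments, with invented fields and cutoffs rather than the GLGap schema:

```python
# Hypothetical stream segments with a modeled species-richness score (0-1)
# and a protection status code (1 = strict ... 4 = none), loosely following
# gap-analysis conventions; all field names and values are invented.
segments = [
    {"id": "seg-001", "richness": 0.91, "protection": 4},
    {"id": "seg-002", "richness": 0.88, "protection": 1},
    {"id": "seg-003", "richness": 0.35, "protection": 4},
    {"id": "seg-004", "richness": 0.77, "protection": 3},
]

def conservation_gaps(segments, value_cutoff=0.75, protected_statuses=(1, 2)):
    """Flag high-value units lacking an adequate level of protection."""
    return [s["id"] for s in segments
            if s["richness"] >= value_cutoff
            and s["protection"] not in protected_statuses]

print(conservation_gaps(segments))   # -> ['seg-001', 'seg-004']
```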

  4. Prony Ringdown GUI (CERTS Prony Ringdown, part of the DSI Tool Box)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis; Marinovici, PNNL Laurentiu; Hauer, PNNL John

    2014-02-21

    The PNNL Prony Ringdown graphical user interface is one analysis tool included in the Dynamic System Identification toolbox (DSI Toolbox). The Dynamic System Identification toolbox is a MATLAB-based collection of tools for parsing and analyzing phasor measurement unit data, especially in regard to small signal stability. It includes tools to read the data, preprocess it, and perform small signal analysis. The DSI Toolbox is designed to provide a research environment for examining phasor measurement unit data and performing small signal stability analysis. The software uses a series of text-driven menus to help guide users and organize the toolbox features. Methods for reading in phasor measurement unit data are provided, with appropriate preprocessing options for small-signal-stability analysis. The toolbox includes the Prony Ringdown GUI and basic algorithms to estimate information on oscillatory modes of the system, such as modal frequency and damping ratio.
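
Prony ringdown analysis fits a sum of damped sinusoids to a post-disturbance signal and reports each mode's frequency and damping. The sketch below implements the classical linear-prediction form of Prony's method in Python on a synthetic single-mode ringdown; it illustrates the algorithm only and is not code from the DSI Toolbox.

```python
import numpy as np

def prony_modes(y, dt, order):
    """Classical Prony's method: linear prediction, then pole extraction."""
    n = len(y)
    # Linear prediction: y[k] = a1*y[k-1] + ... + ap*y[k-p] for k >= p.
    A = np.column_stack([y[order - 1 - j : n - 1 - j] for j in range(order)])
    b = y[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Discrete-time poles are roots of z^p - a1*z^(p-1) - ... - ap.
    z = np.roots(np.concatenate(([1.0], -a)))
    s = np.log(z.astype(complex)) / dt      # continuous-time poles
    freq_hz = np.abs(s.imag) / (2 * np.pi)
    damping_ratio = -s.real / np.abs(s)
    return freq_hz, damping_ratio

# Synthetic ringdown: one 0.7 Hz electromechanical-like mode, 5% damping,
# sampled at 30 samples/s (a typical PMU reporting rate).
dt, zeta, f = 1 / 30, 0.05, 0.7
wn = 2 * np.pi * f
t = np.arange(0.0, 10.0, dt)
y = np.exp(-zeta * wn * t) * np.cos(wn * np.sqrt(1 - zeta**2) * t)

freq, damp = prony_modes(y, dt, order=2)
print(freq, damp)   # conjugate pole pair: ~0.70 Hz, ~0.05 damping ratio
```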

  5. TACIT: An open-source text analysis, crawling, and interpretation tool.

    PubMed

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.
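
TACIT's extensibility rests on its plugin-driven architecture. The record does not detail the mechanism, so the Python sketch below shows only the general pattern of a plugin registry to which new analysis components can be added without modifying the core:

```python
# Minimal plugin-registry pattern (illustrative only; TACIT's actual
# plugin mechanism is not described in the record).
PLUGINS = {}

def register(name):
    """Decorator that adds an analysis class to the registry."""
    def wrap(cls):
        PLUGINS[name] = cls
        return cls
    return wrap

@register("word_count")
class WordCount:
    def run(self, texts):
        return {i: len(t.split()) for i, t in enumerate(texts)}

@register("vocab_size")
class VocabSize:
    def run(self, texts):
        return {i: len(set(t.lower().split())) for i, t in enumerate(texts)}

docs = ["the cat sat on the mat", "to be or not to be"]
for name, cls in PLUGINS.items():
    print(name, cls().run(docs))
```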

  6. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM & diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  7. Using Galaxy to Perform Large-Scale Interactive Data Analyses

    PubMed Central

    Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton

    2014-01-01

    Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) third-party analysis tools. PMID:22700312

  8. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software has been under development at the Goddard Space Flight Center (GSFC) for the last three years. The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN; an upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 1990s and how it relates to ASTEC.

  9. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas

    2014-01-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analysis tools are demonstrated. Two specific human health related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges. PMID:24465054
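
The quantitative parts of the activity (regression and analysis of variance across locations) are standard computations; a compact Python equivalent on synthetic monthly readings, shown purely to make the analyses concrete:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for the activity's data: monthly ozone (ppm) at
# three hypothetical locations over 60 months.
months = np.tile(np.arange(60), 3)
location = np.repeat([0, 1, 2], 60)
ozone = 0.04 + 0.01 * location - 0.0001 * months + rng.normal(0, 0.004, 180)

# Simple linear regression of ozone on time.
res = stats.linregress(months, ozone)
print(f"trend = {res.slope:.6f} ppm/month (p = {res.pvalue:.3g})")

# One-way ANOVA for a location effect.
groups = [ozone[location == g] for g in range(3)]
f, p = stats.f_oneway(*groups)
print(f"ANOVA: F = {f:.1f}, p = {p:.3g}")
```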

  10. Web-based tools for modelling and analysis of multivariate data: California ozone pollution activity.

    PubMed

    Dinov, Ivo D; Christou, Nicolas

    2011-09-01

    This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting and statistical inference on these data are presented. All components of this case study (data, tools, activity) are freely available online at: http://wiki.stat.ucla.edu/socr/index.php/SOCR_MotionCharts_CAOzoneData. Several types of exploratory (motion charts, box-and-whisker plots, spider charts) and quantitative (inference, regression, analysis of variance (ANOVA)) data analysis tools are demonstrated. Two specific human health related questions (temporal and geographic effects of ozone pollution) are discussed as motivational challenges.

  11. The effective integration of analysis, modeling, and simulation tools.

    DOT National Transportation Integrated Search

    2013-08-01

    The need for model integration arises from the recognition that both transportation decisionmaking and the tools supporting it continue to increase in complexity. Many strategies that agencies evaluate require using tools that are sensitive to supply...

  12. GET SMARTE: DECISION TOOLS TO REVITALIZE BROWNFIELDS

    EPA Science Inventory

    SMARTe (Sustainable Management Approaches and Revitalization Tools-electronic) is an open-source, web-based, decision-support system for developing and evaluating future use scenarios for potentially contaminated sites (i.e., brownfields). It contains resources and analysis tools...

  13. An evaluation of the accuracy and speed of metagenome analysis tools

    PubMed Central

    Lindgreen, Stinus; Adair, Karen L.; Gardner, Paul P.

    2016-01-01

    Metagenome studies are becoming increasingly widespread, yielding important insights into microbial communities covering diverse environments from terrestrial and aquatic ecosystems to human skin and gut. With the advent of high-throughput sequencing platforms, the use of large scale shotgun sequencing approaches is now commonplace. However, a thorough independent benchmark comparing state-of-the-art metagenome analysis tools is lacking. Here, we present a benchmark where the most widely used tools are tested on complex, realistic data sets. Our results clearly show that the most widely used tools are not necessarily the most accurate, that the most accurate tool is not necessarily the most time consuming, and that there is a high degree of variability between available tools. These findings are important as the conclusions of any metagenomics study are affected by errors in the predicted community composition and functional capacity. Data sets and results are freely available from http://www.ucbioinformatics.org/metabenchmark.html PMID:26778510

  14. Higher Education Faculty Utilization of Online Technological Tools: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Jackson, Brianne L.

    2017-01-01

    As online learning and the use of online technological tools in higher education continues to grow exponentially, higher education faculty are expected to incorporate these tools into their instruction. However, many faculty members are reluctant to embrace such tools, for a variety of professional and personal reasons. This study employs survey…

  15. FFI: A software tool for ecological monitoring

    Treesearch

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  16. Communicative Tools and Modes in Thematic Preschool Work

    ERIC Educational Resources Information Center

    Ahlskog-Björkman, Eva; Björklund, Camilla

    2016-01-01

    This study focuses on teachers' ways of mediating meaning through communicative tools and modes in preschool thematic work. A socio-cultural perspective is used for analysis on how tools and modes are provided for children to make use of for communicative purposes. The research questions are: (1) what communicative tools do teachers use in their…

  17. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  18. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs in running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on cost of tools, cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
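
The cost model is described only as compiling tool, evaluator-time, and supply/maintenance costs. A minimal sketch of such an annualized roll-up in Python, with invented figures rather than the study's data:

```python
# Hypothetical annualized cost roll-up for one training tool; all
# figures are invented placeholders, not the study's estimates.
def annual_cost(purchase, lifetime_years, eval_hours_per_year,
                faculty_rate, supplies_per_year, maintenance_per_year):
    amortized = purchase / lifetime_years          # spread purchase cost
    evaluation = eval_hours_per_year * faculty_rate  # evaluator time cost
    return amortized + evaluation + supplies_per_year + maintenance_per_year

print(annual_cost(purchase=150_000, lifetime_years=10,
                  eval_hours_per_year=40, faculty_rate=120,
                  supplies_per_year=2_000, maintenance_per_year=9_000))
# -> 30800.0 per year
```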

  19. simuwatt - A Tablet Based Electronic Auditing Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel; Parker, Andrew; Lisell, Lars

    2014-05-08

    'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.
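
Calibration to ASHRAE Standard 14 guidelines is commonly judged with two statistics, normalized mean bias error (NMBE) and the coefficient of variation of the root-mean-square error (CV(RMSE)), with commonly cited monthly limits of about ±5% and 15%. A short Python check of a modeled-versus-metered series, using synthetic numbers:

```python
import numpy as np

def calibration_metrics(measured, modeled):
    """NMBE and CV(RMSE) in percent, as used in ASHRAE Guideline 14."""
    measured, modeled = np.asarray(measured, float), np.asarray(modeled, float)
    n = len(measured)
    mean = measured.mean()
    nmbe = 100.0 * (measured - modeled).sum() / ((n - 1) * mean)
    cvrmse = 100.0 * np.sqrt(((measured - modeled) ** 2).sum() / (n - 1)) / mean
    return nmbe, cvrmse

# Synthetic 12 months of metered vs modeled energy use (kWh).
measured = [980, 900, 860, 700, 620, 760, 890, 910, 720, 650, 780, 950]
modeled = [1010, 880, 840, 720, 600, 770, 870, 950, 700, 640, 800, 930]

nmbe, cvrmse = calibration_metrics(measured, modeled)
print(f"NMBE = {nmbe:+.1f}%  CV(RMSE) = {cvrmse:.1f}%")
print("calibrated (monthly):", abs(nmbe) <= 5 and cvrmse <= 15)
```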

  20. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the space station program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provision for efficient research and access to information.
