NASA Astrophysics Data System (ADS)
Sasaki, Hiroaki; Siswanto, Eko; Nishiuchi, Kou; Tanaka, Katsuhisa; Hasegawa, Toru; Ishizaka, Joji
2008-02-01
Absorption coefficients of colored dissolved organic matter (CDOM) [a_g(λ)] were measured in the East China Sea (ECS), and their relationship with salinity was derived, during summer when the Changjiang River discharge is large. Low-salinity Changjiang Diluted Water (CDW) was observed widely over the shelf and was considered the main source of CDOM, resulting in a strong relationship between salinity and a_g(λ). The error in satellite a_g(λ) estimated by the present ocean color algorithm could be corrected using satellite-retrieved chlorophyll data. Salinity could then be predicted to within about +/-1.0 from satellite a_g(λ) and the salinity-a_g(λ) relation. Our study suggests that satellite-derived a_g(λ) can serve as an indicator of the low-salinity CDW during summer.
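The prediction step described above can be sketched numerically: fit a linear salinity-a_g(λ) relation from matched in-situ pairs by least squares, then invert satellite-retrieved a_g(λ) into salinity. All numbers below are illustrative, not the paper's coefficients.

```python
# Sketch of the salinity-CDOM relationship: fit S = m * a_g + b from matched
# in-situ pairs, then invert satellite a_g(lambda) into salinity.
# Data and coefficients are illustrative only.

def fit_line(x, y):
    """Ordinary least squares for y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    m = sxy / sxx
    b = my - m * mx
    return m, b

# Matched ship observations: CDOM absorption a_g(412) [1/m] vs salinity (psu)
a_g = [0.10, 0.25, 0.40, 0.55, 0.70]
sal = [33.5, 31.0, 28.4, 26.1, 23.5]

m, b = fit_line(a_g, sal)

def salinity_from_ag(ag_sat):
    """Predict salinity from a satellite-retrieved a_g value."""
    return m * ag_sat + b

print(round(salinity_from_ag(0.40), 1))  # -> 28.5 (the sample mean, by OLS)
```

The negative slope reflects the dilution signal: higher CDOM absorption marks fresher Changjiang-influenced water.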
The Aquarius Salinity Retrieval Algorithm
NASA Technical Reports Server (NTRS)
Meissner, Thomas; Wentz, Frank; Hilburn, Kyle; Lagerloef, Gary; Le Vine, David
2012-01-01
The first part of this presentation gives an overview of the Aquarius salinity retrieval algorithm. The instrument calibration [2] converts Aquarius radiometer counts into antenna temperatures (TA). The salinity retrieval algorithm converts those TA into brightness temperatures (TB) at a flat ocean surface. As a first step, contributions arising from the intrusion of solar, lunar, and galactic radiation are subtracted. The antenna pattern correction (APC) removes the effects of cross-polarization contamination and spillover. The Aquarius radiometer measures the 3rd Stokes parameter in addition to the vertical (v) and horizontal (h) polarizations, which allows for an easy removal of ionospheric Faraday rotation. The atmospheric absorption at L-band is almost entirely due to molecular oxygen, which can be calculated from auxiliary input fields of numerical weather prediction models and then removed from the TB. The final step in the TA to TB conversion is the correction for the roughness of the sea surface due to wind, which is addressed in more detail in section 3. The TB of the flat ocean surface can then be matched to a salinity value using a surface emission model that is based on a model for the dielectric constant of sea water [3], [4] and an auxiliary field for the sea surface temperature. In the current processing only the v-pol TB are used for this last step.
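The Faraday-rotation step lends itself to a short sketch. Under the common assumption that the surface 3rd Stokes parameter is zero, the total power Tv + Th is invariant under the rotation, and |Tv - Th| can be recovered from the measured 2nd and 3rd Stokes parameters. The numbers and sign conventions below are illustrative, not Aquarius processing code.

```python
import math

# Faraday rotation mixes the 2nd Stokes parameter Q = Tv - Th with the 3rd
# Stokes parameter U, while I = Tv + Th is unchanged. If the surface U is
# assumed zero, the surface Q can be recovered as sqrt(Q'^2 + U'^2).

def faraday_rotate(tv, th, t3, omega_deg):
    """Apply a Faraday rotation of angle omega to a (Tv, Th, U) state."""
    q, u, i = tv - th, t3, tv + th
    c = math.cos(2 * math.radians(omega_deg))
    s = math.sin(2 * math.radians(omega_deg))
    q2 = q * c + u * s
    u2 = -q * s + u * c
    return (i + q2) / 2, (i - q2) / 2, u2

def remove_faraday(tv_m, th_m, t3_m):
    """Undo the rotation assuming the surface 3rd Stokes parameter is zero."""
    i = tv_m + th_m                      # invariant total power
    q = math.hypot(tv_m - th_m, t3_m)    # sqrt(Q'^2 + U'^2)
    return (i + q) / 2, (i - q) / 2

# Simulate a measurement: true surface state rotated by 15 degrees
tv_m, th_m, t3_m = faraday_rotate(120.0, 80.0, 0.0, 15.0)
tv, th = remove_faraday(tv_m, th_m, t3_m)
print(round(tv, 3), round(th, 3))  # -> 120.0 80.0
```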
Leveraging Machine Learning to Estimate Soil Salinity through Satellite-Based Remote Sensing
NASA Astrophysics Data System (ADS)
Welle, P.; Ravanbakhsh, S.; Póczos, B.; Mauter, M.
2016-12-01
Human-induced salinization of agricultural soils is a growing problem that now affects an estimated 76 million hectares and causes billions of dollars in lost agricultural revenue annually. While there are indications that soil salinization is increasing in extent, current assessments of global salinity levels are outdated and rely heavily on expert opinion due to the prohibitive cost of a worldwide sampling campaign. A more practical alternative to field sampling may be Earth observation through remote sensing, which takes advantage of the distinct spectral signature of salts to estimate soil conductivity. Recent efforts to map salinity using remote sensing have met with limited success due to the difficulty of managing the computational load associated with large amounts of satellite data. In this study, we use Google Earth Engine to create composite satellite soil datasets, which combine data from multiple sources and sensors. These composite datasets contain pixel-level surface reflectance values for the dates on which the algorithm is most confident that the surface contains bare soil. We leverage the detailed soil maps created and updated by the United States Geological Survey as label data and apply machine learning regression techniques such as Gaussian processes to learn a smooth mapping from surface reflectance to noisy estimates of salinity. We also explore a semi-supervised approach using deep generative convolutional networks to leverage the abundance of unlabeled satellite images in producing better salinity estimates where we have relatively few measurements across the globe. The general method makes two significant contributions: (1) an algorithm that can be used to predict levels of soil salinity in regions without detailed soil maps and (2) a general framework that serves as an example of how remote sensing can be paired with extensive label data to generate methods for prediction of physical phenomena.
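A minimal sketch of the Gaussian-process regression step, assuming a single reflectance feature, an RBF kernel with arbitrary hyperparameters, and made-up training pairs (the study uses multi-band composites and USGS labels):

```python
import math

# Minimal Gaussian-process regression: map one reflectance value to a
# salinity estimate. Kernel hyperparameters and data are illustrative only.

def rbf(a, b, length=0.1):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, y):
    """Naive Gaussian elimination with partial pivoting (solves Ax = y)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Training pairs: surface reflectance -> measured soil salinity (dS/m), made up
X = [0.10, 0.20, 0.30, 0.40, 0.50]
y = [1.0, 2.2, 4.1, 6.5, 9.0]
noise = 1e-6

# GP posterior mean: alpha = (K + noise*I)^-1 y, prediction = k(x)^T alpha
K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]
alpha = solve(K, y)

def predict(x_new):
    return sum(rbf(x_new, xi) * ai for xi, ai in zip(X, alpha))

print(round(predict(0.30), 2))  # -> 4.1 (interpolates the training point)
```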
Aquarius Salinity Retrieval Algorithm: Final Pre-Launch Version
NASA Technical Reports Server (NTRS)
Wentz, Frank J.; Le Vine, David M.
2011-01-01
This document provides the theoretical basis for the Aquarius salinity retrieval algorithm. The inputs to the algorithm are the Aquarius antenna temperature (T(sub A)) measurements along with a number of NCEP operational products and pre-computed tables of space radiation coming from the galaxy and sun. The output is sea-surface salinity and many intermediate variables required for the salinity calculation. This revision of the Algorithm Theoretical Basis Document (ATBD) is intended to be the final pre-launch version.
NASA Astrophysics Data System (ADS)
Irianto, D. S.; Supriatna; Pin, TjiongGiok
2016-11-01
Eel (Anguilla spp.) is a food fish of significant economic value in both local and international markets. Pelabuhanratu Bay is an area with large potential for supplying eel seed. Salinity is an important factor affecting eel presence, because eels naturally migrate among fresh, brackish, and marine waters, so methods are needed to describe the distribution of glass eels in relation to salinity. Salinity percentages were obtained from Landsat 8 imagery for 2015 using the Cimandiri salinity prediction algorithm. The research was conducted at the Cimandiri, Citepus, and Cimaja estuaries during wet and dry months. Glass eels were caught only in the dry months, with the largest catches occurring at the edge of the estuary. Catches decreased with distance from the estuary edge, both seaward along the beach and upstream inside the river mouth, while the salinity percentage increased toward the sea and decreased toward the river.
Picos-Benítez, Alain R; López-Hincapié, Juan D; Chávez-Ramírez, Abraham U; Rodríguez-García, Adrián
2017-03-01
The complex non-linear behavior presented in the biological treatment of wastewater requires an accurate model to predict the system performance. This study evaluates the effectiveness of an artificial intelligence (AI) model, based on the combination of artificial neural networks (ANNs) and genetic algorithms (GAs), to find the optimum performance of an up-flow anaerobic sludge blanket reactor (UASB) for saline wastewater treatment. Chemical oxygen demand (COD) removal was predicted using conductivity, organic loading rate (OLR) and temperature as input variables. The ANN model was built from experimental data and performance was assessed through the maximum mean absolute percentage error (= 9.226%) computed from the measured and model predicted values of the COD. Accordingly, the ANN model was used as a fitness function in a GA to find the best operational condition. In the worst case scenario (low energy requirements, high OLR usage and high salinity) this model guaranteed COD removal efficiency values above 70%. This result is consistent and was validated experimentally, confirming that this ANN-GA model can be used as a tool to achieve the best performance of a UASB reactor with the minimum requirement of energy for saline wastewater treatment.
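The ANN-as-fitness-function idea above can be sketched with a toy genetic algorithm. The trained ANN is replaced here by a made-up surrogate that scores COD removal from (conductivity, OLR, temperature); the bounds, coefficients, and GA settings are all illustrative, not the study's values.

```python
import random

# GA sketch of the ANN-GA coupling: a surrogate fitness function stands in
# for the trained ANN, scoring COD removal (%) from three operating variables.

random.seed(42)

BOUNDS = [(1.0, 50.0), (0.5, 10.0), (15.0, 40.0)]  # cond (mS/cm), OLR, temp (C)

def fitness(ind):
    cond, olr, temp = ind
    # Surrogate with a peak near cond=10, olr=4, temp=32 (illustrative only)
    return (95 - 0.05 * (cond - 10) ** 2 - 2.0 * (olr - 4) ** 2
            - 0.1 * (temp - 32) ** 2)

def random_ind():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(ind):
    # Gaussian perturbation, clipped to the variable bounds
    return [min(hi, max(lo, g + random.gauss(0, 1.0)))
            for g, (lo, hi) in zip(ind, BOUNDS)]

def crossover(a, b):
    # Uniform crossover: each gene from either parent
    return [random.choice(pair) for pair in zip(a, b)]

pop = [random_ind() for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                       # elitism keeps the best found
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]

best = max(pop, key=fitness)
print(round(fitness(best), 1))
```

With elitism and enough generations, the best individual converges near the surrogate's optimum, mirroring how the study searched the ANN's input space for the best operating condition.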
The Aquarius Salinity Retrieval Algorithm: Early Results
NASA Technical Reports Server (NTRS)
Meissner, Thomas; Wentz, Frank J.; Lagerloef, Gary; LeVine, David
2012-01-01
The Aquarius L-band radiometer/scatterometer system is designed to provide monthly salinity maps at 150 km spatial scale to a 0.2 psu accuracy. The sensor was launched on June 10, 2011, aboard the Argentine CONAE SAC-D spacecraft. The L-band radiometers and the scatterometer have been taking science data observations since August 25, 2011. The first part of this presentation gives an overview of the Aquarius salinity retrieval algorithm. The instrument calibration converts Aquarius radiometer counts into antenna temperatures (TA). The salinity retrieval algorithm converts those TA into brightness temperatures (TB) at a flat ocean surface. As a first step, contributions arising from the intrusion of solar, lunar, and galactic radiation are subtracted. The antenna pattern correction (APC) removes the effects of cross-polarization contamination and spillover. The Aquarius radiometer measures the 3rd Stokes parameter in addition to the vertical (v) and horizontal (h) polarizations, which allows for an easy removal of ionospheric Faraday rotation. The atmospheric absorption at L-band is almost entirely due to O2, which can be calculated from auxiliary input fields of numerical weather prediction models and then removed from the TB. The final step in the TA to TB conversion is the correction for the roughness of the sea surface due to wind. This is based on the radar backscatter measurements by the scatterometer. The TB of the flat ocean surface can then be matched to a salinity value using a surface emission model that is based on a model for the dielectric constant of sea water and an auxiliary field for the sea surface temperature. In the current processing (as of writing this abstract) only the v-pol TB are used for this last step, and NCEP winds are used for the roughness correction.
Before the salinity algorithm can be operationally implemented and its accuracy assessed by comparing versus in situ measurements, an extensive calibration and validation (cal/val) activity needs to be completed. This is necessary in order to tune the inputs to the algorithm and remove biases that arise due to the instrument calibration, foremost the values of the noise diode injection temperatures and the losses that occur in the feedhorns. This is the subject of the second part of our presentation. The basic tool is to analyze the observed difference between the Aquarius measured TA and an expected TA that is computed from a reference salinity field. It is also necessary to derive a relation between the scatterometer backscatter measurements and the radiometer emissivity that is induced by surface winds. In order to do this we collocate Aquarius radiometer and scatterometer measurements with wind speed retrievals from the WindSat and SSMIS F17 microwave radiometers. Both of these satellites fly in orbits that have the same equatorial ascending crossing time (6 pm) as the Aquarius/SAC-D observatory. Rain retrievals from WindSat and SSMIS F 17 can be used to remove Aquarius observations that are rain contaminated. A byproduct of this analysis is a prediction for the wind-induced sea surface emissivity at L-band.
Comparisons of Aquarius Measurements over Oceans with Radiative Transfer Models at L-Band
NASA Technical Reports Server (NTRS)
Dinnat, E.; LeVine, D.; Abraham, S.; DeMattheis, P.; Utku, C.
2012-01-01
The Aquarius/SAC-D spacecraft includes three L-band (1.4 GHz) radiometers dedicated to measuring sea surface salinity. It was launched in June 2011 by NASA and CONAE (Argentine space agency). We report detailed comparisons of Aquarius measurements with radiative transfer model predictions. These comparisons are used as part of the initial assessment of Aquarius data and to estimate the radiometer calibration bias and stability. Comparisons are also being performed to assess the performance of models used in the retrieval algorithm for correcting the effect of various sources of geophysical "noise" (e.g. Faraday rotation, surface roughness). Such corrections are critical in bringing the error in retrieved salinity down to the required 0.2 practical salinity unit on monthly global maps at 150 km by 150 km resolution.
Prediction of Greenhouse Gas (GHG) Fluxes from Coastal Salt Marshes using Artificial Neural Network
NASA Astrophysics Data System (ADS)
Ishtiaq, K. S.; Abdul-Aziz, O. I.
2017-12-01
Coastal salt marshes are among the most productive ecosystems on Earth. Given the complex interactions between the ambient environment and ecosystem biological exchanges, it is difficult to predict salt marsh greenhouse gas (GHG) fluxes (CO2 and CH4) from their environmental drivers. In this study, we developed an artificial neural network (ANN) model to robustly predict salt marsh GHG fluxes from a limited number of input variables (photosynthetically active radiation, soil temperature, and porewater salinity). The ANN parameterization involved an optimized 3-layer feed-forward Levenberg-Marquardt training algorithm. Four tidal salt marshes of Waquoit Bay, MA, incorporating a gradient in land use, salinity, and hydrology, were considered as the case study sites. The wetlands were dominated by native Spartina alterniflora and characterized by high salinity and frequent flooding. The developed ANN model showed good performance (training R2 = 0.87 - 0.96; testing R2 = 0.84 - 0.88) in predicting the fluxes across the case study sites. The model can be used to estimate wetland GHG fluxes and potential carbon balance under different IPCC climate change and sea level rise scenarios. The model can also aid the development of GHG offset protocols by setting monitoring guidelines for the restoration of coastal salt marshes.
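A minimal sketch of a feed-forward network for this kind of three-input flux prediction, with synthetic data and plain gradient descent standing in for the Levenberg-Marquardt training used in the study:

```python
import math
import random

# Tiny feed-forward network: (PAR, soil temperature, salinity) -> GHG flux.
# Synthetic data; plain stochastic gradient descent replaces the
# Levenberg-Marquardt optimizer of the actual study.

random.seed(0)

H = 4  # hidden units
W1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """One hidden tanh layer, linear output."""
    h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
         for j in range(H)]
    return sum(w * hj for w, hj in zip(W2, h)) + b2, h

# Synthetic flux: rises with PAR and temperature, falls with salinity
data = [([par, t, s], 0.5 * par + 0.3 * t - 0.2 * s)
        for par in (0.1, 0.5, 0.9) for t in (0.2, 0.6) for s in (0.1, 0.8)]

def mse():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

loss_before = mse()
lr = 0.05
for _ in range(500):
    for x, y in data:
        out, h = forward(x)
        err = out - y
        for j in range(H):
            grad_h = err * W2[j] * (1 - h[j] ** 2)  # backprop through tanh
            for i in range(3):
                W1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
            W2[j] -= lr * err * h[j]
        b2 -= lr * err

print(mse() < loss_before)  # -> True: training reduced the error
```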
A global algorithm for estimating Absolute Salinity
NASA Astrophysics Data System (ADS)
McDougall, T. J.; Jackett, D. R.; Millero, F. J.; Pawlowicz, R.; Barker, P. M.
2012-12-01
The International Thermodynamic Equation of Seawater - 2010 has defined the thermodynamic properties of seawater in terms of a new salinity variable, Absolute Salinity, which takes into account the spatial variation of the composition of seawater. Absolute Salinity more accurately reflects the effects of the dissolved material in seawater on the thermodynamic properties (particularly density) than does Practical Salinity. When a seawater sample has standard composition (i.e. the ratios of the constituents of sea salt are the same as those of surface water of the North Atlantic), Practical Salinity can be used to accurately evaluate the thermodynamic properties of seawater. When seawater is not of standard composition, Practical Salinity alone is not sufficient and the Absolute Salinity Anomaly needs to be estimated; this anomaly is as large as 0.025 g kg-1 in the northernmost North Pacific. Here we provide an algorithm for estimating Absolute Salinity Anomaly for any location (x, y, p) in the world ocean. To develop this algorithm, we used the Absolute Salinity Anomaly that is found by comparing the density calculated from Practical Salinity to the density measured in the laboratory. These estimates of Absolute Salinity Anomaly, however, are limited by the number of available observations (namely 811). In order to provide a practical method that can be used at any location in the world ocean, we take advantage of approximate relationships between Absolute Salinity Anomaly and silicate concentrations (which are available globally).
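The structure of such an estimate can be sketched as follows. The factor 35.16504/35 g/kg per Practical Salinity unit is the TEOS-10 reference-composition scaling; the silicate coefficient below is an illustrative stand-in, not the regionally fitted relationship of the actual algorithm.

```python
# Sketch of a silicate-based Absolute Salinity estimate:
#   SA = (35.16504/35) * SP + delta_SA,  delta_SA ~ k * [silicate]

U_PS = 35.16504 / 35.0  # g/kg per unit Practical Salinity (TEOS-10 scaling)
K_SI = 9.8e-5           # g/kg per (umol/kg) silicate: assumed, not fitted

def absolute_salinity(sp, silicate):
    """Absolute Salinity (g/kg) from Practical Salinity and silicate (umol/kg)."""
    delta_sa = K_SI * silicate
    return U_PS * sp + delta_sa

# High-silicate North Pacific example: the anomaly approaches ~0.02 g/kg
print(round(absolute_salinity(34.0, 150.0) - U_PS * 34.0, 4))  # -> 0.0147
```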
NASA Astrophysics Data System (ADS)
Millard, R. C.; Seaver, G.
1990-12-01
A 27-term index of refraction algorithm for pure and sea waters has been developed using four experimental data sets of differing accuracies. They cover the range 500-700 nm in wavelength, 0-30°C in temperature, 0-40 psu in salinity, and 0-11,000 db in pressure. The index of refraction algorithm has an accuracy that varies from 0.4 ppm for pure water at atmospheric pressure to 80 ppm at high pressures, but preserves the accuracy of each original data set. This algorithm is a significant improvement over existing descriptions as it is in analytical form with a better and more carefully defined accuracy. A salinometer algorithm with the same uncertainty has been created by numerically inverting the index algorithm using the Newton-Raphson method. The 27-term index algorithm was used to generate a pseudo-data set at the sodium D wavelength (589.26 nm) from which a 6-term densitometer algorithm was constructed. The densitometer algorithm also produces salinity as an intermediate step in the salinity inversion. The densitometer residuals have a standard deviation of 0.049 kg m-3, which is not accurate enough for most oceanographic applications. However, the densitometer algorithm was used to explore the sensitivity of density from this technique to temperature and pressure uncertainties. To achieve a deep ocean densitometer of 0.001 kg m-3 accuracy would require the index of refraction to have an accuracy of 0.3 ppm, the temperature an accuracy of 0.01°C, and the pressure 1 db. Our assessment of the currently available index of refraction measurements finds that only the data for fresh water at atmospheric pressure produce an algorithm satisfactory for oceanographic use (density to 0.4 ppm). The database for the algorithm at higher pressures and various salinities requires an order of magnitude or better improvement in index measurement accuracy before the resultant density accuracy will be comparable to the currently available oceanographic algorithm.
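The Newton-Raphson inversion used for the salinometer can be sketched with a made-up stand-in for the 27-term index model; only the inversion mechanics are the point here, not the coefficients.

```python
# Newton-Raphson inversion: recover salinity from a measured index of
# refraction at fixed temperature, pressure, and wavelength. The n(S) model
# below is an illustrative stand-in for the 27-term algorithm.

def n_model(S):
    """Assumed index of refraction vs salinity (psu) at fixed T, p, lambda."""
    return 1.3330 + 1.85e-4 * S + 1.0e-7 * S * S

def invert_salinity(n_obs, s0=20.0, tol=1e-10, max_iter=50):
    """Solve n_model(S) = n_obs for S by Newton-Raphson iteration."""
    S = s0
    for _ in range(max_iter):
        f = n_model(S) - n_obs
        dfdS = 1.85e-4 + 2.0e-7 * S   # analytic derivative of n_model
        step = f / dfdS
        S -= step
        if abs(step) < tol:
            break
    return S

true_S = 35.0
S_rec = invert_salinity(n_model(true_S))
print(round(S_rec, 6))  # -> 35.0
```

Because n(S) is nearly linear in salinity, the iteration converges in a few steps from almost any starting guess.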
NASA Astrophysics Data System (ADS)
Shamkhali Chenar, S.; Deng, Z.
2017-12-01
Pathogenic viruses pose a significant public health threat and cause economic losses to the shellfish industry in the coastal environment. Norovirus is a contagious virus and the leading cause of epidemic gastroenteritis following consumption of oysters harvested from sewage-contaminated waters. While it is challenging to detect noroviruses in coastal waters due to the lack of sensitive and routine diagnostic methods, machine learning techniques allow us to prevent, or at least reduce, the risks by developing effective predictive models. This study develops an algorithm relating historical norovirus outbreak reports to environmental parameters including water temperature, solar radiation, water level, salinity, precipitation, and wind. For this purpose, the Random Forests statistical technique was utilized to select the relevant environmental parameters, and their various combinations with different time lags, controlling the virus distribution in oyster harvesting areas along the Louisiana Coast. An Artificial Neural Network (ANN) approach was then used to predict the outbreaks from the final set of input variables. Finally, a sensitivity analysis was conducted to evaluate the relative importance and contribution of the input variables to the model output. Findings demonstrated that the developed model was capable of reproducing historical oyster norovirus outbreaks along the Louisiana Coast with an overall accuracy of 99.83%, demonstrating the efficacy of the model. Moreover, according to the sensitivity analysis results, increases in water temperature, solar radiation, water level, and salinity, and decreases in wind and rainfall, are associated with a reduction in the model-predicted risk of a norovirus outbreak.
In conclusion, the presented machine learning approach provides reliable tools for predicting potential norovirus outbreaks and could be used for early detection of possible outbreaks, reducing the risk of norovirus to public health and the seafood industry.
NASA Astrophysics Data System (ADS)
Meissner, Thomas; Hilburn, Kyle; Wentz, Frank; Gentemann, Chelle
2013-04-01
The Aquarius L-band radiometer/scatterometer system is designed to provide monthly salinity maps at 150 km spatial scale to an accuracy of 0.2 psu. The sensor was launched on June 10, 2011, aboard the Argentine CONAE SAC-D spacecraft. The L-band radiometers and the scatterometer have been taking science data observations since August 25, 2011. The first part of the presentation gives an overview of the major features of the Version 2.1 Aquarius Level 2 salinity retrieval algorithm: 1. Antenna pattern correction: spillover and cross-polarization contamination. 2. Correction for the drift of the Aquarius internal calibration system. 3. Correction for intruding celestial radiation, foremost from the galaxy. 4. Correction for effects of the wind-roughened ocean surface. We then present a thorough validation study for the salinity product, which consists of a 3-way intercomparison between Aquarius, SMOS, and in-situ buoy salinity measurements. The Aquarius-buoy comparison shows that the Aquarius Version 2.1 salinity product is very close to meeting the aforementioned mission requirement of 0.2 psu. We demonstrate that in order to meet this accuracy it is crucial to use the L-band scatterometer for correcting effects of the wind-roughened ocean surface, which turns out to be the major driver in the salinity retrieval uncertainty budget. A surface roughness correction algorithm based solely on auxiliary wind fields from numerical weather prediction models (e.g. NCEP, ECMWF) is not sufficient to meet the stringent Aquarius mission requirement, especially at wind speeds above 10 m/s. We show that the presence of the Aquarius L-band scatterometer together with the L-band radiometer allows the retrieval of an Aquarius wind speed product whose accuracy matches or exceeds that of other common ocean wind speed products (WindSat, SSMIS).
By comparing SMOS and Aquarius salinity fields with the in-situ observations we assess the importance of the roughness correction and the presence of the L-band scatterometer, which is a major difference between the two missions.
NASA Astrophysics Data System (ADS)
Budakoğlu, Murat; Karaman, Muhittin; Damla Uça Avcı, Z.; Kumral, Mustafa; Geredeli (Yılmaz), Serpil
2014-05-01
The salinity of a lake is an important characteristic: such lakes are potentially industrial resources, and the degree of salinity can be used in the assessment of mineral resources and in production management. In the literature, many studies have used satellite data for salinity-related lake investigations, such as mapping salinity distribution and detecting potential freshwater sources in regions of lower salt concentration. Lake Acigol, located in Denizli (Turkey), was selected as the study area. With its saline environment, it is Turkey's major sodium sulphate production resource. In this study, remote sensing data and field data were used and correlated; remote sensing is an efficient tool for monitoring and analyzing lake properties when used as a complement to field data. WorldView-2 satellite data, consisting of 8 bands, were used. Simultaneously with the satellite data acquisition, a field study was conducted to collect salinity values at 17 points on the lake using a YSI 556 multiparameter instrument. Salinity was measured in grams per kilogram of solution and reported in ppt. The values ranged from 34 to 40.1 ppt with an average of 38.056 ppt; on the Thalassic series, the lake was in a mixoeuhaline state at the time of measurement. As a first step, ATCOR atmospheric correction was applied to the satellite image. Because some clouds covered parts of the lake, the study proceeded with the 12 sampling points that were clear on the image. Then, for each sampling point, a spectral value was obtained by averaging over an 11x11 pixel neighborhood, and the relation between the spectral reflectance values and salinity was investigated. The 4-band algorithm that Wei (2012) used to determine chlorophyll-a distribution in a highly turbid coastal environment was applied.
Salinity ∝ (Λi^-1 / Λj^-1) * (Λk^-1 / Λm^-1), with i, j, k, m = 1..8 and i ≠ j ≠ k ≠ m. Using each band of WV-2 in every possible 4-band arrangement, 1680 band combinations were correlated with the in-situ measured salinity values. The highest correlation was obtained for the index (ΛCoastal^-1 / ΛNIR2^-1) * (ΛRed^-1 / ΛGreen^-1), with R = 0.926 (R2 = 0.86).
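The exhaustive band-combination search can be sketched as follows, with synthetic 8-band data in which only one band responds to salinity. Ordered selections of 4 bands out of 8 give exactly 8x7x6x5 = 1680 combinations, matching the count reported above; everything else here is made up.

```python
import math
import random
from itertools import permutations

# Score every 4-band index (Bi^-1/Bj^-1)*(Bk^-1/Bm^-1) against in-situ
# salinity by Pearson correlation, and keep the best combination.

random.seed(1)

n_pts = 12
salinity = [34.0 + 0.5 * i for i in range(n_pts)]  # ppt, 34.0 .. 39.5

# Synthetic reflectances: only band 0 responds to salinity, the rest are
# nearly flat with small noise.
bands = []
for b in range(8):
    if b == 0:
        bands.append([1.0 / (0.020 + 0.001 * s) for s in salinity])
    else:
        bands.append([7.0 + random.uniform(-0.05, 0.05) for _ in salinity])

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

best_r, best_combo, n_combos = 0.0, None, 0
for i, j, k, m in permutations(range(8), 4):
    n_combos += 1
    # (1/Bi)/(1/Bj) * (1/Bk)/(1/Bm) simplifies to (Bj/Bi)*(Bm/Bk)
    idx = [(bands[j][p] / bands[i][p]) * (bands[m][p] / bands[k][p])
           for p in range(n_pts)]
    r = abs(pearson(idx, salinity))
    if r > best_r:
        best_r, best_combo = r, (i, j, k, m)

print(n_combos)  # -> 1680
```

The winning combination necessarily involves the salinity-sensitive band, illustrating how the search singles out the Coastal/NIR2/Red/Green index in the real data.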
NASA Astrophysics Data System (ADS)
Zare, Ehsan; Huang, Jingyi; Koganti, Triven; Triantafilis, John
2017-04-01
In order to understand the drivers of topsoil salinization, the distribution and movement of salt in relation to groundwater need to be mapped. In this study, we describe a method to map the distribution of soil salinity, as measured by the electrical conductivity of a saturated soil-paste extract (ECe), in 3 dimensions around a water storage reservoir in an irrigated field near Bourke, New South Wales, Australia. A quasi-3d electromagnetic conductivity image (EMCI), or model of the true electrical conductivity (σ), was developed using 133 apparent electrical conductivity (ECa) measurements collected on a 50 m grid using various coil arrays of DUALEM-421S and EM34 instruments. For the DUALEM-421S we considered ECa in horizontal coplanar (i.e., 1 mPcon, 2 mPcon and 4 mPcon) and vertical coplanar (i.e., 1 mHcon, 2 mHcon and 4 mHcon) arrays. For the EM34, three measurements in the horizontal mode (i.e., EM34-10H, EM34-20H and EM34-40H) were considered. We estimated σ using a quasi-3d joint-inversion algorithm (EM4Soil). The best correlation (R2 = 0.92) between σ and measured soil ECe was obtained with forward modelling (FS), inversion algorithm S2, and a damping factor (lambda) of 0.2, using both DUALEM-421S and EM34 data but excluding the 4 m coil arrays of the DUALEM-421S. A linear regression calibration model was used to predict ECe in 3 dimensions beneath the study field. The predicted ECe was consistent with previous studies; it revealed the distribution of ECe and helped to infer a freshwater intrusion from the water storage reservoir at depth and as a function of proximity to near-surface prior stream channels and buried paleochannels. It was concluded that this method can be applied elsewhere to map soil salinity and water movement and to provide guidance for improved land management.
Huang, J; Koganti, T; Santos, F A Monteiro; Triantafilis, J
2017-01-15
In order to understand the drivers of topsoil salinization, the distribution and movement of salt in relation to groundwater need to be mapped. In this study, we describe a method to map the distribution of soil salinity, as measured by the electrical conductivity of a saturated soil-paste extract (ECe), in 3 dimensions around a water storage reservoir in an irrigated field near Bourke, New South Wales, Australia. A quasi-3d electromagnetic conductivity image (EMCI), or model of the true electrical conductivity (σ), was developed using 133 apparent electrical conductivity (ECa) measurements collected on a 50 m grid using various coil arrays of DUALEM-421S and EM34 instruments. For the DUALEM-421S we considered ECa in horizontal coplanar (i.e., 1 mPcon, 2 mPcon and 4 mPcon) and vertical coplanar (i.e., 1 mHcon, 2 mHcon and 4 mHcon) arrays. For the EM34, three measurements in the horizontal mode (i.e., EM34-10H, EM34-20H and EM34-40H) were considered. We estimated σ using a quasi-3d joint-inversion algorithm (EM4Soil). The best correlation (R2 = 0.92) between σ and measured soil ECe was obtained with forward modelling (FS), inversion algorithm S2, and a damping factor (λ) of 0.2, using both DUALEM-421S and EM34 data but excluding the 4 m coil arrays of the DUALEM-421S. A linear regression calibration model was used to predict ECe in 3 dimensions beneath the study field. The predicted ECe was consistent with previous studies; it revealed the distribution of ECe and helped to infer a freshwater intrusion from the water storage reservoir at depth and as a function of proximity to near-surface prior stream channels and buried paleochannels. It was concluded that this method can be applied elsewhere to map soil salinity and water movement and to provide guidance for improved land management.
Assessing the potential of Landsat 8 OLI for retrieving salinity in the hypersaline Arabian Gulf
NASA Astrophysics Data System (ADS)
Zhao, Jun; Temimi, Marouane
2016-04-01
The Arabian Gulf, located in an arid region of the Middle East, has high salinity that can exceed 43 practical salinity units (psu) due to its special conditions, such as high evaporation, low precipitation, and desalination discharge. In this study, a regional algorithm was developed to retrieve salinity using in situ measurements conducted between June 2013 and November 2014 along the western coast of Abu Dhabi, United Arab Emirates (UAE). A multivariate linear regression model using the visible bands of the Operational Land Imager (OLI) was proposed and showed good performance, with a determination coefficient (R2) of 0.7. The algorithm was then applied to an OLI scene, which revealed the spatial distribution of salinity over the study area. The findings support better interpretation of the complex water mass exchange between the Arabian Gulf and the Sea of Oman through the Strait of Hormuz, validation of salinity from numerical models, and studies of the effects of anthropogenic activities and climate change on the ecosystem of the hypersaline Arabian Gulf.
NASA Technical Reports Server (NTRS)
Thompson, David E.; Rajkumar, T.; Clancy, Daniel (Technical Monitor)
2002-01-01
The San Francisco Bay Delta is a large hydrodynamic complex that incorporates the Sacramento and San Joaquin estuaries, the Burman Marsh, and San Francisco Bay proper. Competition exists for the use of this extensive water system among the fisheries industry, the agricultural industry, and the marine and estuarine animal species within the Delta. As tidal fluctuations occur, more saline water pushes upstream, allowing fish to migrate beyond the Burman Marsh for breeding and habitat occupation. However, the agriculture industry does not want extensive salinity intrusion to degrade water quality for human and plant consumption. The balance is regulated by pumping stations located along the estuaries and reservoirs, whereby flushing with fresh water keeps the saline intrusion at bay. The pumping schedule is driven by data collected at various locations within the Bay Delta and by numerical models that predict the salinity intrusion as part of a larger model of the system. The Interagency Ecological Program (IEP) for the San Francisco Bay/Sacramento-San Joaquin Estuary collects, monitors, and archives the data, and the Department of Water Resources provides a numerical model simulation (DSM2) from which predictions are made that drive the pumping schedule. A problem with this procedure is that the numerical simulation takes roughly 16 hours to complete a prediction. We have created a neural net, optimized with a genetic algorithm, that takes as input the archived data from multiple stations and predicts stage, salinity, and flow at the Carquinez Straits (at the downstream end of the Burman Marsh). This model seems to be robust in its predictions and operates much faster than the current numerical DSM2 model. Because the system is strongly tidally driven, we used both Principal Component Analysis and Fast Fourier Transforms to discover dominant features within the IEP data.
We then filtered out the dominant tidal forcing to uncover non-primary tidal effects, and used this to enhance the neural network by mapping input-output relationships more efficiently. Furthermore, the neural network implicitly incorporates both the hydrodynamic and water quality models into a single predictive system. Although our model has not yet been extended to demonstrate improved pumping schedules, it has the potential to support better decision-making procedures that may then be implemented by State agencies if desired. Our intention is now to use this model in the smaller Elkhorn Slough complex near Monterey Bay, where no such hydrodynamic model currently exists. At the Elkhorn Slough, we are fusing the neural net model of tidally driven flow with in situ flow data and airborne and satellite remote sensing data. These data further constrain the behavior of the model in predicting the longer-term health and future of this vital estuary.
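The tidal-filtering step described above (identify the dominant tidal line with a Fourier transform, remove it, keep the residual) can be sketched with a plain DFT on a synthetic stage record:

```python
import cmath
import math

# Identify the dominant (tidal) frequency in a stage record with a discrete
# Fourier transform, zero that bin, and invert to get the non-tidal residual.
# The signal is synthetic; a real record would use measured stage data.

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [(sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                 for k in range(n)) / n).real for t in range(n)]

n = 64
# Stage record: strong "tide" at 4 cycles per record plus a weak slow component
signal = [2.0 * math.sin(2 * math.pi * 4 * t / n) +
          0.3 * math.sin(2 * math.pi * 1 * t / n) for t in range(n)]

X = dft(signal)
# Dominant positive-frequency bin; zero it and its conjugate mirror bin
k_dom = max(range(1, n // 2), key=lambda k: abs(X[k]))
X[k_dom] = 0.0
X[n - k_dom] = 0.0
residual = idft(X)

peak = max(abs(v) for v in residual)
print(k_dom, round(peak, 2))  # -> 4 0.3: only the weak slow component remains
```

The O(n^2) DFT here is only for transparency; a production version would use an FFT routine.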
Singh, Urminder; Rajkumar, Mohan Singh; Garg, Rohini
2017-01-01
Abstract Long non-coding RNAs (lncRNAs) make up a significant portion of non-coding RNAs and are involved in a variety of biological processes. Accurate identification/annotation of lncRNAs is the primary step for gaining deeper insights into their functions. In this study, we report a novel tool, PLncPRO, for prediction of lncRNAs in plants using transcriptome data. PLncPRO is based on machine learning and uses the random forest algorithm to classify coding and long non-coding transcripts. PLncPRO has better prediction accuracy than other existing tools and is particularly well-suited for plants. We developed consensus models for dicots and monocots to facilitate prediction of lncRNAs in non-model/orphan plants. PLncPRO also performed well with vertebrate transcriptome data. Using PLncPRO, we discovered 3714 and 3457 high-confidence lncRNAs in rice and chickpea, respectively, under drought or salinity stress conditions. We investigated different characteristics of these lncRNAs and their differential expression under drought/salinity stress conditions, and validated a subset via RT-qPCR. Overall, we developed a new tool for the prediction of lncRNAs in plants and showed its utility via identification of lncRNAs in rice and chickpea. PMID:29036354
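PLncPRO's actual feature set is not enumerated in the abstract and is not reproduced here. As a dependency-free illustration of coding/non-coding discrimination, the sketch below computes two classic sequence features, longest-ORF length and GC content, and applies a simple ORF-length rule in place of the trained random forest; the sequences and the 300-nt threshold are hypothetical.

```python
STOPS = {"TAA", "TAG", "TGA"}

def longest_orf_length(seq):
    """Length (nt) of the longest ATG-to-stop ORF in the three forward frames.
    ORFs without an in-frame stop codon are not counted."""
    seq = seq.upper()
    best = 0
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOPS and start is not None:
                best = max(best, i + 3 - start)
                start = None
    return best

def gc_content(seq):
    """Fraction of G/C bases, another common coding-potential feature."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def is_coding(seq, min_orf=300):
    # Heuristic stand-in for the trained classifier: long ORFs suggest
    # coding potential (the 300-nt cutoff is illustrative only).
    return longest_orf_length(seq) >= min_orf
```

In the real tool, features like these would feed a random forest trained on known coding and non-coding transcripts rather than a single hand-set threshold.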
Caccetta, Peter; Dunne, Robert; George, Richard; McFarlane, Don
2010-01-01
In the southwestern agricultural region of Western Australia, the clearing of the original perennial vegetation for annual vegetation-based dryland agriculture has led to rising saline groundwater levels. This has had effects such as reduced productivity of agricultural land, death of native vegetation, reduced stream water quality and infrastructure damage. These effects have been observed at many locations within the 18 million ha of cleared land. This has led to efforts to quantify, in a spatially explicit way, the historical and likely future extent of the area affected, with a view to informing management decisions. This study was conducted to determine whether the likely future extent of the area affected by dryland salinity could be estimated by developing spatially explicit maps for use in management and planning. We derived catchment-related variables from digital elevation models and perennial vegetation presence/absence maps. We then used these variables to predict the salinity hazard extent by applying a combination of decision tree classification and morphological image processing algorithms. Sufficient objective data such as groundwater depth, its rate of rise, and its concentration of dissolved salts were generally not available, so we used regional expert opinion (derived from the limited existing studies on salinity hazard extent) as training and validation data. We obtained an 87% agreement in the salinity hazard extent estimated by this method compared with the validation data, and conclude that the maps are sufficient for planning. We estimate that the salinity hazard extent is 29.7% of the agricultural land.
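The morphological image processing step named above can be illustrated with a minimal binary closing, written directly in NumPy rather than with the (unspecified) algorithms of the study; the toy hazard map and 4-connected structuring element are hypothetical.

```python
import numpy as np

def dilate4(mask):
    """One pass of binary dilation with a 4-connected (cross) structuring element."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def erode4(mask):
    """One pass of binary erosion, implemented as the dual of dilation."""
    return ~dilate4(~mask)

def close4(mask):
    """Binary closing (dilation then erosion): fills small holes inside a
    mapped hazard zone without growing its outer boundary."""
    return erode4(dilate4(mask))

# A toy hazard map with a one-pixel hole inside a mapped saline area.
hazard = np.zeros((7, 7), dtype=bool)
hazard[2:5, 2:5] = True
hazard[3, 3] = False
cleaned = close4(hazard)
```

Closing is a typical cleanup step after a per-pixel classifier, since isolated "non-hazard" pixels inside a predicted hazard zone are usually classification noise rather than real features.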
NASA Astrophysics Data System (ADS)
Thompson, D. E.; Rajkumar, T.
2002-12-01
The San Francisco Bay Delta is a large hydrodynamic complex that incorporates the Sacramento and San Joaquin Estuaries, the Suisun Marsh, and the San Francisco Bay proper. Competition exists for the use of this extensive water system from the fisheries industry, the agricultural industry, and the marine and estuarine animal species within the Delta. As tidal fluctuations occur, more saline water pushes upstream, allowing fish to migrate beyond the Suisun Marsh for breeding and habitat occupation. However, the agriculture industry does not want extensive salinity intrusion to impact water quality for human and plant consumption. The balance is regulated by pumping stations located along the estuaries and reservoirs, whereby flushing of fresh water keeps the saline intrusion at bay. The pumping schedule is driven by data collected at various locations within the Bay Delta and by numerical models that predict the salinity intrusion as part of a larger model of the system. The Interagency Ecological Program (IEP) for the San Francisco Bay/Sacramento-San Joaquin Estuary collects, monitors, and archives the data, and the Department of Water Resources provides a numerical model simulation (DSM2) from which predictions are made that drive the pumping schedule. A problem with DSM2 is that the numerical simulation takes roughly 16 hours to complete a prediction. We have created a neural net, optimized with a genetic algorithm, that takes as input the archived data from multiple gauging stations and predicts stage, salinity, and flow at the Carquinez Straits (at the downstream end of the Suisun Marsh). This model is robust in its predictions and operates much faster than the current numerical DSM2 model. Because the Bay-Delta is strongly tidally driven, we used both Principal Component Analysis and Fast Fourier Transforms to discover dominant features within the IEP data.
We then filtered out the dominant tidal forcing to discover non-primary tidal effects, and used this to enhance the neural network by mapping input-output relationships in a more efficient manner. Furthermore, the neural network implicitly incorporates both the hydrodynamic and water quality models into a single predictive system. Although our model has not yet been enhanced to demonstrate improved pumping schedules, it has the potential to support better decision-making procedures that may then be implemented by State agencies if desired. Our intention is now to use our calibrated Bay-Delta neural model in the smaller Elkhorn Slough complex near Monterey Bay, where no such hydrodynamic model currently exists. At the Elkhorn Slough, we are fusing the neural net model of tidally-driven flow with in situ flow data and airborne and satellite remote sensing data. These further constrain the behavior of the model in predicting the longer-term health and future of this vital estuary. In particular, we are using visible data to explore the effects of the sediment plume that wastes into Monterey Bay, and infrared data and thermal emissivities to characterize the plant habitat along the margins of the Slough as salinity intrusion and sediment removal change the boundary of the estuary. The details of the Bay-Delta neural net model and its application to the Elkhorn Slough are presented in this paper.
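The FFT-based separation of the dominant tidal forcing from sub-tidal behavior can be sketched as follows. The hourly stage series, the M2-like 12.42-h constituent and the stop-band edges are illustrative assumptions, not the IEP data.

```python
import numpy as np

# Synthetic hourly stage record: an M2-like tide (12.42 h period) plus a slow
# freshwater-forced signal and noise stand in for a gauging-station record.
rng = np.random.default_rng(0)
t = np.arange(0, 30 * 24.0)                      # 30 days, hourly samples
tide = 1.2 * np.sin(2 * np.pi * t / 12.42)       # dominant semidiurnal forcing
slow = 0.3 * np.sin(2 * np.pi * t / (15 * 24))   # sub-tidal signal of interest
stage = tide + slow + 0.05 * rng.standard_normal(t.size)

# Zero out the semidiurnal band in the frequency domain.
spec = np.fft.rfft(stage)
freqs = np.fft.rfftfreq(t.size, d=1.0)           # cycles per hour
band = (freqs > 1 / 14.0) & (freqs < 1 / 11.0)   # bracket the M2 peak
spec[band] = 0.0
residual = np.fft.irfft(spec, n=t.size)

# The residual should track the slow signal far better than the raw record.
raw_err = np.std(stage - slow)
res_err = np.std(residual - slow)
```

In practice the de-tided residual, rather than the raw stage, would be presented to the neural network, so the network's capacity is spent on the non-primary tidal effects rather than on re-learning the dominant harmonic.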
NASA Astrophysics Data System (ADS)
Asmala, Eero; Stedmon, Colin A.; Thomas, David N.
2012-10-01
The quantity of chromophoric dissolved organic matter (CDOM) and dissolved organic carbon (DOC) in three Finnish estuaries (Karjaanjoki, Kyrönjoki and Kiiminkijoki) was investigated, with respect to predicting DOC concentrations and loadings from spectral CDOM absorption measurements. Altogether 87 samples were collected from three estuarine transects studied in three seasons, covering a salinity range between 0 and 6.8, and DOC concentrations from 1572 μmol l-1 in freshwater to 222 μmol l-1 in coastal waters. The CDOM absorption coefficient, aCDOM(375), followed the trend in DOC concentrations across the salinity gradient and ranged from 1.67 to 33.4 m-1. The link between DOC and CDOM was studied using a range of wavelengths and algorithms. Wavelengths between 250 and 270 nm gave the best predictions with single linear regression. Total dissolved iron was found to influence the prediction at wavelengths above 520 nm. Despite significant seasonal and spatial differences in DOC-CDOM models, a universal relationship was tested with an independent data set and found to be robust. DOC and CDOM yields (loading/catchment area) from the catchments ranged from 1.98 to 5.44 g C m-2 yr-1, and 1.67 to 11.5 aCDOM(375) yr-1, respectively.
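The single-linear-regression link between a 250-270 nm absorption coefficient and DOC can be sketched as below; the paired values are synthetic and only illustrate the fitting procedure, not the study's data.

```python
import numpy as np

# Hypothetical paired observations: CDOM absorption at 254 nm (m^-1) and DOC
# (umol l^-1); the near-linear link in the 250-270 nm window motivates a
# single linear regression, as in the study.
a254 = np.array([5.0, 12.0, 20.0, 35.0, 50.0, 70.0, 90.0])
doc = np.array([250.0, 390.0, 560.0, 840.0, 1150.0, 1540.0, 1960.0])

slope, intercept = np.polyfit(a254, doc, 1)
doc_hat = slope * a254 + intercept

# Goodness of fit (R^2) for the single-wavelength predictor.
ss_res = np.sum((doc - doc_hat) ** 2)
ss_tot = np.sum((doc - np.mean(doc)) ** 2)
r2 = 1 - ss_res / ss_tot
```

Once such a regression is established, routine spectral absorption measurements can be converted to DOC estimates without direct carbon analysis, which is the practical appeal described in the abstract.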
Potential role of salinity in ENSO and MJO predictions
NASA Astrophysics Data System (ADS)
Zhu, J.; Kumar, A.; Murtugudde, R. G.; Xie, P.
2017-12-01
Studies have suggested that ocean salinity can vary in response to ENSO and MJO. For example, during an El Niño event, sea surface salinity decreases in the western and central equatorial Pacific as a result of zonal advection of low salinity water by anomalous eastward surface currents, and to a lesser extent as a result of a rainfall excess associated with atmospheric convection and warm water displacements. However, the effect of salinity on ENSO and MJO evolution and their forecasts has been less explored. In this analysis, we explored the potential role of salinity in ENSO and MJO predictions by conducting sensitivity experiments with NCEP CFSv2. First, two forecast experiments were conducted to explore the effect of salinity on ENSO predictions, in which the interannual variability of salinity in the ocean initial states is either included or excluded. Comparisons suggested that the salinity variability is essential to correctly forecast the 2007/08 La Niña starting from April 2007. With realistic salinity initial states, the tendency of the subsurface cold conditions to decay during the spring and early summer of 2007 was interrupted by positive salinity anomalies in the upper central Pacific, which, working together with the positive Bjerknes feedback, contributed to the development of the La Niña event. Our study suggests that ENSO forecasts will benefit from more accurate, sustained salinity observations with large-scale spatial coverage. We also assessed the potential role of salinity in the MJO by evaluating a long coupled free run that has a relatively realistic MJO simulation and a set of predictability experiments, both based on CFSv2. Diagnostics of the free run suggest that, while the intraseasonal SST variations lead convection by a quarter cycle, they are almost in phase only with changes in barrier layer thickness, suggesting an active role of salinity in SST variability.
Its effect on MJO predictions is further explored by controlling the surface salinity feedback during the predictability experiments.
Evaluating Carbonate System Algorithms in a Nearshore System: Does Total Alkalinity Matter?
Jones, Jonathan M.; Sweet, Julia; Brzezinski, Mark A.; McNair, Heather M.; Passow, Uta
2016-01-01
Ocean acidification is a threat to many marine organisms, especially those that use calcium carbonate to form their shells and skeletons. The ability to accurately measure the carbonate system is the first step in characterizing the drivers behind this threat. Due to logistical realities, regular carbonate system sampling is not possible in many nearshore ocean habitats, particularly in remote, difficult-to-access locations. The ability to autonomously measure the carbonate system in situ relieves many of the logistical challenges; however, it is not always possible to measure the two required carbonate parameters autonomously. Observed relationships between sea surface salinity and total alkalinity can frequently provide a second carbonate parameter, thus allowing for the calculation of the entire carbonate system. Here, we assessed the rigor of estimating total alkalinity from salinity at depths <15 m by routinely sampling water from a pier in southern California for several carbonate system parameters. Carbonate system parameters based on measured values were compared with those based on estimated TA values. Total alkalinity was not predictable from salinity, or from a combination of salinity and temperature, at this site. However, dissolved inorganic carbon and the calcium carbonate saturation state of these nearshore surface waters could both be estimated to within 5% of measured values, on average, using measured pH and salinity-derived or regionally averaged total alkalinity. We thus find that the autonomous measurement of pH and salinity can be used to monitor trends in coastal DIC and saturation state, and can be a useful method for high-frequency, long-term monitoring of ocean acidification. PMID:27893739
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated, in comparison with spatial interpolation methods, for predicting the acceptability of well-water quality in situations where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, the aquifer is locally affected by saline water, and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict whether the chloride concentration in a water well will exceed the allowable concentration, making the water unfit for the intended use. A statistical classification algorithm achieved the best predictive performance, and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems in hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
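As an illustration of statistical classification of well-water acceptability, the sketch below uses a k-nearest-neighbour classifier (a stand-in, since the abstract does not name the winning algorithm) on hypothetical well coordinates, depths and chloride exceedance labels.

```python
import numpy as np

def knn_predict(X_train, y_train, X_new, k=3):
    """k-nearest-neighbour classification: label each new well by majority
    vote of the k closest training wells in feature space."""
    preds = []
    for x in np.atleast_2d(X_new):
        d = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(d)[:k]]
        preds.append(int(round(nearest.mean())))  # majority for 0/1 labels
    return np.array(preds)

# Hypothetical wells: (easting km, northing km, depth m) and a 0/1 label for
# chloride above the allowable concentration.
X = np.array([[0.0, 0.0, 10], [0.5, 0.2, 12], [0.2, 0.6, 15],
              [5.0, 5.0, 30], [5.5, 4.8, 28], [4.8, 5.3, 35]], float)
y = np.array([1, 1, 1, 0, 0, 0])

new_wells = np.array([[0.3, 0.3, 11], [5.2, 5.1, 31]], float)
labels = knn_predict(X, y, new_wells, k=3)
```

Unlike spatial interpolation of the chloride values themselves, a classifier like this targets the exceedance decision directly, which is closer to the framing of the study.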
Global Soil Moisture from the Aquarius/SAC-D Satellite: Description and Initial Assessment
NASA Technical Reports Server (NTRS)
Bindlish, Rajat; Jackson, Thomas; Cosh, Michael; Zhao, Tianjie; O'Neill, Peggy
2015-01-01
Aquarius satellite observations over land offer a new resource for measuring soil moisture from space. Although Aquarius was designed for ocean salinity mapping, our objective in this investigation is to exploit the large number of land observations that Aquarius acquires and extend the mission scope to include the retrieval of surface soil moisture. The soil moisture retrieval algorithm development focused on using only the radiometer data because of the extensive heritage of passive microwave retrieval of soil moisture. The single channel algorithm (SCA) was implemented using the Aquarius observations to estimate surface soil moisture. Aquarius radiometer observations from three beams (after bias/gain modification), along with National Centers for Environmental Prediction model forecast surface temperatures, were then used to retrieve soil moisture. Ancillary data inputs required for using the SCA are vegetation water content, land surface temperature, and several soil and vegetation parameters based on land cover classes. The resulting global spatial patterns of soil moisture were consistent with the precipitation climatology and with soil moisture from other satellite missions (the Advanced Microwave Scanning Radiometer for the Earth Observing System and Soil Moisture and Ocean Salinity). Initial assessments were performed using in situ observations from the U.S. Department of Agriculture Little Washita and Little River watershed soil moisture networks. Results showed good performance by the algorithm for these land surface conditions for the period of August 2011-June 2013 (rmse = 0.031 m³/m³, bias = -0.007 m³/m³, and R = 0.855). This radiometer-only soil moisture product will serve as a baseline for continuing research on both active and combined passive-active soil moisture algorithms. The products are routinely available through the National Aeronautics and Space Administration data archive at the National Snow and Ice Data Center.
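A heavily simplified single channel algorithm (SCA) retrieval can be sketched as follows: brightness temperature to emissivity, a tau-omega vegetation correction, Fresnel inversion for the dielectric constant, and a dielectric-to-moisture mapping. The coefficients b and omega, the 38° incidence angle, and the Topp-style polynomial are illustrative stand-ins for the calibrated vegetation parameters and L-band dielectric model used operationally.

```python
import numpy as np

def sca_soil_moisture(tb_h, t_surf, vwc, theta_deg=38.0, b=0.12, omega=0.05):
    """Toy single-channel retrieval from H-pol brightness temperature (K),
    surface temperature (K) and vegetation water content (kg/m^2)."""
    theta = np.radians(theta_deg)
    gamma = np.exp(-b * vwc / np.cos(theta))   # vegetation transmissivity
    e_obs = tb_h / t_surf                      # scene emissivity
    # Invert the tau-omega model for the soil emissivity e_s.
    k = (1 - omega) * (1 - gamma)
    e_s = (e_obs - k * (1 + gamma)) / (gamma * (1 - k))
    r_h = 1.0 - e_s                            # smooth-surface reflectivity
    # Fresnel inversion (H-pol, lossless approximation) for the dielectric constant.
    rho = np.sqrt(r_h)
    s = np.cos(theta) * (1 + rho) / (1 - rho)
    eps = s ** 2 + np.sin(theta) ** 2
    # Topp-style polynomial stands in for a proper L-band soil dielectric model.
    return -5.3e-2 + 2.92e-2 * eps - 5.5e-4 * eps ** 2 + 4.3e-6 * eps ** 3

# A colder (wetter) scene should retrieve more moisture than a warmer (drier) one.
sm_wet = sca_soil_moisture(tb_h=220.0, t_surf=295.0, vwc=1.0)
sm_dry = sca_soil_moisture(tb_h=270.0, t_surf=295.0, vwc=1.0)
```

The essential physics is that wetter soil has a higher dielectric constant, hence lower emissivity and lower brightness temperature at fixed surface temperature; everything else is correction terms.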
NASA Astrophysics Data System (ADS)
Shanmugam, Palanisamy; Varunan, Theenathayalan; Nagendra Jaiganesh, S. N.; Sahay, Arvind; Chauhan, Prakash
2016-06-01
Prediction of the curve of the absorption coefficient of colored dissolved organic matter (CDOM) and differentiation between marine and terrestrially derived CDOM pools in coastal environments are hampered by a high degree of variability in the composition and concentration of CDOM, uncertainties in retrieved remote sensing reflectance, and the weak signal-to-noise ratio of space-borne instruments. In the present study, a hybrid model is presented along with empirical methods to remotely determine the amount and type of CDOM in coastal and inland water environments. A large set of in-situ data collected on several oceanographic cruises and field campaigns in different regional waters was used to develop empirical methods for studying the distribution and dynamics of CDOM, dissolved organic carbon (DOC) and salinity. Our validation analyses demonstrated that the hybrid model is a better descriptor of CDOM absorption spectra than the existing models. Additional spectral slope parameters included in the present model to differentiate between terrestrially derived and marine CDOM pools make a substantial improvement over those existing models. Empirical algorithms to derive CDOM, DOC and salinity from remote sensing reflectance data demonstrated success in retrieval of these products, with low mean relative percent differences relative to a large set of in-situ measurements. The performance of these algorithms was further assessed using three hyperspectral HICO images acquired simultaneously with our field measurements in productive coastal and lagoon waters off the southeast coast of India. The validation match-ups of CDOM and salinity showed good agreement between HICO retrievals and field observations. Further analyses of these data showed significant temporal changes in CDOM and phytoplankton absorption coefficients, with a distinct phase shift between these two products.
Healthy phytoplankton cells and macrophytes were recognized to directly contribute to the autochthonous production of colored humic-like substances in variable amounts within the lagoon system, despite CDOM content being partly derived through river run-off and wetland discharges as well as from conservative mixing of different water masses. Spatial and temporal maps of CDOM, DOC and salinity products provided an interesting insight into these CDOM dynamics and conservative behavior within the lagoon and its extension in coastal and offshore waters of the Bay of Bengal. The hybrid model and empirical algorithms presented here can be useful to assess CDOM, DOC and salinity fields and their changes in response to increasing runoff of nutrient pollution, anthropogenic activities, hydrographic variations and climate oscillations.
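The spectral-slope description of CDOM commonly used to separate terrestrial and marine pools is the exponential model a_g(lam) = a_g(lam0) * exp(-S * (lam - lam0)). The sketch below fits the slope S by log-linear regression on synthetic spectra; the slope values are illustrative and this is not the hybrid model of the study.

```python
import numpy as np

def cdom_slope(wavelengths, a_g, ref=375.0):
    """Fit a_g(lam) = a_g(ref) * exp(-S * (lam - ref)) by log-linear least
    squares; returns (absorption at ref wavelength, spectral slope S in nm^-1)."""
    coeffs = np.polyfit(wavelengths - ref, np.log(a_g), 1)
    return np.exp(coeffs[1]), -coeffs[0]

lam = np.arange(350.0, 500.0, 10.0)
# Synthetic spectra: terrestrially derived CDOM typically shows a flatter
# (smaller) slope than marine CDOM; the values here are illustrative.
terrestrial = 2.0 * np.exp(-0.014 * (lam - 375.0))
marine = 0.3 * np.exp(-0.022 * (lam - 375.0))

a0_t, s_t = cdom_slope(lam, terrestrial)
a0_m, s_m = cdom_slope(lam, marine)
```

Because S differs systematically between source pools, a retrieved slope (or a ratio of slopes over two windows) can serve as the kind of source discriminator the abstract describes.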
USDA-ARS?s Scientific Manuscript database
Aquarius is a combined passive/active L-band microwave instrument developed to map the ocean surface salinity field from space. The primary science objective of this mission is to monitor the seasonal and interannual variation of the large scale features of the surface salinity field in the open oc...
Three Years of Aquarius Salinity Measurements: Algorithm, Validation and Applications
NASA Astrophysics Data System (ADS)
Meissner, T.; Wentz, F. J.; Le Vine, D. M.; Lagerloef, G. S. E.
2014-12-01
Aquarius is an L-band radiometer/scatterometer (i.e. active/passive) system designed to provide monthly salinity maps at 150 km spatial scale to an accuracy of 0.2 psu. The sensor was launched on June 10, 2011 as part of the Aquarius/SAC-D mission and has been collecting data since August 25, 2011. Version 3 of the data product was released in June 2014 and provides a major milestone towards reaching the mission requirement of 0.2 psu. This presentation reports the status of the Aquarius salinity retrieval algorithm, highlighting the advances that have been made for and since the Version 3 release. The most important ones are: 1) an improved surface roughness correction that is based on Aquarius scatterometer observations; 2) a reduction in ascending/descending differences due to galactic background radiation reflected from the ocean surface; 3) a refinement of the quality control flags and masks that indicate degradation under certain environmental conditions. The Aquarius salinity algorithm also retrieves wind speed as part of the roughness correction, with an accuracy comparable to the products from other satellites such as WindSat, SSMIS, ASCAT, and QuikSCAT. Validation of the salinity retrievals is accomplished using measurements from Argo floats measuring at 5 m depth and, in the tropics, also from moored buoys measuring at 1 m depth, which are co-located with the nearest Aquarius footprint. In the most recent work an effort has also been made to identify areas with frequent rain to isolate potential issues with rain freshening in the upper ocean layer. Results in rain-free regions indicate that on a monthly basis and a 150 km grid, the V3 Aquarius salinity maps have an accuracy of about 0.13 psu in the tropics and 0.22 psu globally. Comparing Aquarius with Argo and moored buoy salinity measurements during and after rain events permits a quantitative assessment of the effect of salinity stratification within the first 5 m of the upper ocean layer.
NASA Technical Reports Server (NTRS)
Kalcic, Maria; Turowski, Mark; Hall, Callie
2010-01-01
Presentation topics include: importance of salinity of coastal waters, habitat switching algorithm, habitat switching module, salinity estimates from Landsat for the Sabine-Calcasieu Basin, percent of time inundated in 2006, salinity data, prototyping the system, system as packaged for field tests, salinity probe and casing, opening for water flow, cellular antenna used to transmit data, preparing to launch, system is launched in the Pearl River at Stennis Space Center, data are transmitted to Twitter by cell phone modem every 15 minutes, a Google spreadsheet is used to import the data from the Twitter feed and to compute salinity (from conductivity) and display charts of salinity and temperature, and results are uploaded to NASA's Applied Science and Technology Project Office webpage.
NASA Technical Reports Server (NTRS)
McClain, Charles R.; Signorini, Sergio
2002-01-01
Sensitivity analyses of sea-air CO2 flux to gas transfer algorithms, climatological wind speeds, sea surface temperature (SST) and salinity (SSS) were conducted for the global oceans and selected regional domains. Large uncertainties in the global sea-air flux estimates are identified due to different gas transfer algorithms, global climatological wind speeds, and seasonal SST and SSS data. The global sea-air flux ranges from -0.57 to -2.27 Gt/yr, depending on the combination of gas transfer algorithms and global climatological wind speeds used. Different combinations of SST and SSS global fields resulted in changes as large as 35% in the global sea-air flux. An error as small as plus or minus 0.2 in SSS translates into a plus or minus 43% deviation in the mean global CO2 flux. This result emphasizes the need for highly accurate satellite SSS observations for the development of remote sensing sea-air flux algorithms.
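The sensitivity to the choice of gas transfer algorithm can be illustrated by comparing two widely quoted transfer-velocity parameterizations: a Wanninkhof-type quadratic and the piecewise linear Liss-Merlivat form. Note the two are conventionally referenced to slightly different Schmidt numbers (660 vs 600), so the ratio computed here is only indicative of the spread, not a rigorous comparison.

```python
import numpy as np

def k_quadratic(u10, a=0.31):
    """Wanninkhof-type quadratic transfer velocity (cm/hr, referenced to Sc = 660)."""
    return a * u10 ** 2

def k_liss_merlivat(u10):
    """Liss-Merlivat piecewise linear transfer velocity (cm/hr, referenced to Sc = 600)."""
    u10 = np.asarray(u10, float)
    return np.where(u10 <= 3.6, 0.17 * u10,
           np.where(u10 <= 13.0, 2.85 * u10 - 9.65, 5.9 * u10 - 49.3))

# Sample climatological wind speeds (m/s): the two algorithms diverge most
# at low-to-moderate winds, which dominate the global distribution.
winds = np.array([4.0, 7.5, 12.0])
ratio = k_quadratic(winds) / k_liss_merlivat(winds)
```

Since the flux scales linearly with the transfer velocity at fixed pCO2 difference, a factor-of-two spread in k at common wind speeds translates directly into the large spread in global flux estimates reported above.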
High-Performance Integrated Control of water quality and quantity in urban water reservoirs
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.; Goedbloed, A.
2015-11-01
This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
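The receding-horizon idea behind Model Predictive Control can be sketched with a toy one-state salinity model and brute-force enumeration of release sequences; the dynamics, costs and constants below are invented for illustration and bear no relation to the Delft3D-FLOW emulator of the study.

```python
import itertools

# Toy reservoir: salinity rises by a fixed tidal intrusion each step and is
# reduced by flushing releases u, which consume stored drinking water.
INTRUSION = 0.8             # salinity increase per step (psu), invented
FLUSH_GAIN = 0.5            # salinity reduction per unit release, invented
S_MAX = 2.0                 # salinity ceiling the operator wants to enforce
RELEASES = (0.0, 1.0, 2.0)  # admissible release decisions per step

def step(s, u):
    """One step of the toy salinity dynamics."""
    return max(0.0, s + INTRUSION - FLUSH_GAIN * u)

def cost(s, u):
    """Penalize salinity above the ceiling heavily and water use lightly."""
    return 10.0 * max(0.0, s - S_MAX) + 1.0 * u

def mpc_action(s0, horizon=3):
    """Enumerate all release sequences over the horizon, keep the cheapest,
    and return only its first move (the receding-horizon principle)."""
    best_u, best_cost = RELEASES[0], float("inf")
    for seq in itertools.product(RELEASES, repeat=horizon):
        s, total = s0, 0.0
        for u in seq:
            s = step(s, u)
            total += cost(s, u)
        if total < best_cost:
            best_u, best_cost = seq[0], total
    return best_u

# Closed loop: the controller keeps salinity bounded near the ceiling.
s = 1.5
trace = []
for _ in range(10):
    s = step(s, mpc_action(s))
    trace.append(s)
```

A real MPC replaces the one-line dynamics with the reduced-order emulator and the enumeration with a proper optimizer, but the loop structure (predict over a horizon, apply the first decision, re-plan) is the same.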
NASA Technical Reports Server (NTRS)
Yueh, Simon H.; Chaubell, Mario J.
2012-01-01
Several L-band microwave radiometer and radar missions have been, or will be, operating in space for land and ocean observations. These include the NASA Aquarius mission and the Soil Moisture Active Passive (SMAP) mission, both of which use combined passive/active L-band instruments. Aquarius's passive/active L-band microwave sensor has been designed to map the salinity field at the surface of the ocean from space. SMAP's primary objectives are soil moisture and freeze/thaw detection, but it will operate continuously over the ocean, and hence will have significant potential for ocean surface research. In this innovation, an algorithm has been developed to retrieve simultaneously ocean surface salinity and wind from combined passive/active L-band microwave observations of sea surfaces. The algorithm takes advantage of the differing responses of brightness temperatures and radar backscatter to salinity, wind speed, and direction, minimizing a least squares error (LSE) measure that signifies the difference between measurements and model functions of brightness temperatures and radar backscatter. The algorithm uses the conjugate gradient method to search for the local minima of the LSE. Three LSE measures with different measurement combinations have been tested. The first LSE measure uses passive microwave data only, with retrieval errors reaching 1 to 2 psu (practical salinity units) for salinity and 1 to 2 m/s for wind speed. The second LSE measure uses both passive and active microwave data for vertical and horizontal polarizations. The addition of active microwave data significantly improves the retrieval accuracy, by about a factor of five. To mitigate the impact of Faraday rotation on satellite observations, the third LSE measure uses measurement combinations invariant under the Faraday rotation.
For Aquarius, the expected RMS SSS (sea surface salinity) error will be less than about 0.2 psu for low winds, increasing to 0.3 psu at 25 m/s wind speed for warm waters (25°C). To achieve the required 0.2 psu accuracy, the impact of sea surface roughness (e.g. wind-generated ripples) on the observed brightness temperature has to be corrected to better than one tenth of a degree kelvin. With this algorithm, the accuracy of the retrieved wind speed will be high, varying from a few tenths of a m/s to 0.6 m/s. The expected direction accuracy is also excellent (less than 10°) for mid to high winds, but degrades at lower speeds (less than 7 m/s).
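The least-squares retrieval can be sketched as follows. The linear sensitivities of brightness temperature and backscatter to salinity and wind are invented for illustration (real geophysical model functions are nonlinear and empirically derived), and a coarse grid search stands in for the conjugate-gradient minimizer described above.

```python
import numpy as np

# Invented linear sensitivities of v-pol/h-pol brightness temperature (K) and
# radar backscatter (dB) to salinity S (psu) and wind speed W (m/s).
def forward(s, w):
    tbv = 95.0 - 0.35 * s + 0.6 * w
    tbh = 65.0 - 0.20 * s + 1.1 * w
    sig = -30.0 + 0.9 * w
    return np.array([tbv, tbh, sig])

def retrieve(meas, s_range=(30.0, 40.0), w_range=(0.0, 25.0), n=101):
    """Minimize the least-squares error (LSE) between the measurement vector
    and the model function over a (salinity, wind) grid; the grid search
    stands in for the conjugate-gradient minimizer of the actual algorithm."""
    best_s, best_w, best_err = None, None, float("inf")
    for s in np.linspace(*s_range, n):
        for w in np.linspace(*w_range, n):
            err = float(np.sum((forward(s, w) - meas) ** 2))
            if err < best_err:
                best_s, best_w, best_err = s, w, err
    return best_s, best_w

truth = forward(35.0, 8.0)   # noiseless synthetic measurement vector
s_hat, w_hat = retrieve(truth)
```

The value of combining channels is visible even in this toy: TB responds to both salinity and wind while backscatter responds mainly to wind, so the joint minimization can separate the two effects.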
Khang, Hyun Soo; Lee, Byung Il; Oh, Suk Hoon; Woo, Eung Je; Lee, Soo Yeol; Cho, Min Hyoung; Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun
2002-06-01
Recently, a new static resistivity image reconstruction algorithm was proposed that utilizes internal current density data obtained by the magnetic resonance current density imaging technique. This imaging method is called magnetic resonance electrical impedance tomography (MREIT). The derivation and performance of the J-substitution algorithm in MREIT have been reported via computer simulations, showing it to be an accurate, high-resolution static impedance imaging technique. In this paper, we present experimental procedures, denoising techniques, and image reconstructions using a 0.3-tesla (T) experimental MREIT system and saline phantoms. MREIT using the J-substitution algorithm effectively utilizes the internal current density information, resolving the problem inherent in conventional EIT, namely the low sensitivity of boundary measurements to changes in internal tissue resistivity values. Resistivity images of saline phantoms show an accuracy of 6.8%-47.2% and a spatial resolution of 64 x 64. Both can be significantly improved by using an MRI system with a better signal-to-noise ratio.
Empirical algorithms to predict aragonite saturation state
NASA Astrophysics Data System (ADS)
Turk, Daniela; Dowd, Michael
2017-04-01
Novel sensor packages deployed on autonomous platforms (profiling floats, gliders, moorings, SeaCycler) and biogeochemical models have the potential to increase the coverage of a key water chemistry variable, the aragonite saturation state (ΩAr), in time and space, particularly in undersampled regions of the global ocean. However, these do not provide the set of inorganic carbon measurements commonly used to derive ΩAr. There is therefore a need to develop regional predictive models that determine ΩAr from measurements of commonly observed and/or non-carbonate oceanic variables. Here, we investigate the predictive skill of several commonly observed oceanographic variables (temperature, salinity, oxygen, nitrate, phosphate and silicate) in determining ΩAr using climatology and shipboard data. This allows us to assess the potential for autonomous sensors and biogeochemical models to monitor ΩAr regionally and globally. We apply the regression models to several time series data sets and discuss regional differences and their implications for global estimates of ΩAr.
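A regional regression model of the kind described can be sketched with ordinary least squares; the predictor variables, coefficients and noise level below are synthetic, chosen only to illustrate recovering ΩAr from commonly observed variables.

```python
import numpy as np

# Synthetic training set: Omega_Ar generated from a known linear combination
# of commonly observed variables plus noise; real coefficients are
# region-specific and would be fit to shipboard bottle data.
rng = np.random.default_rng(1)
n = 200
temp = rng.uniform(2, 18, n)       # deg C
sal = rng.uniform(31, 35, n)       # psu
oxy = rng.uniform(150, 350, n)     # umol/kg
nitrate = rng.uniform(0, 25, n)    # umol/kg
omega = (0.5 + 0.08 * temp + 0.02 * (sal - 33)
         + 0.004 * oxy - 0.03 * nitrate)
omega += rng.normal(0, 0.05, n)

# Multiple linear regression with an intercept column.
X = np.column_stack([np.ones(n), temp, sal, oxy, nitrate])
coef, *_ = np.linalg.lstsq(X, omega, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((omega - pred) ** 2))
```

Once fit, such a model lets any platform carrying temperature, salinity, oxygen and nutrient sensors report an ΩAr estimate without the two-parameter carbonate chemistry measurements.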
Paganoni, Christopher A.; Chang, K. C.; Robblee, Michael B.
2006-01-01
A significant data quality challenge for highly variant systems is the limited ability to quantify operationally reasonable limits on the data elements being collected and to provide reasonable threshold predictions. In many instances, the number of influences that drive a resulting value or operational range is too large to enable physical sampling of each influencer, or is too complicated to model accurately in an explicit simulation. An alternative method for determining reasonable observation thresholds is to employ an automation algorithm that emulates a human analyst visually inspecting data for limits. Using the visualization technique of self-organizing maps (SOM) on data having poorly understood relationships, a methodology for determining threshold limits was developed. To illustrate this approach, analysis of the environmental influences that drive the abundance of a target indicator species (the pink shrimp, Farfantepenaeus duorarum) provided a real example of applicability. The relationship between salinity, temperature and the abundance of F. duorarum is well documented, but the effect of upstream changes in water quality on pink shrimp abundance is not well understood. The highly variant nature of catch counts of organisms in the wild, and the data available from upstream hydrology measures of salinity and temperature, made this an ideal candidate for the approach, which provides a determination about the influence of changes in hydrology on populations of organisms.
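A minimal 1-D self-organizing map, written from scratch in NumPy, illustrates the SOM technique named above; the two-regime (salinity, temperature) observations are synthetic, and reading thresholds off the trained units' spread is a simplification of the methodology.

```python
import numpy as np

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D self-organizing map: each unit's weight vector is pulled
    toward presented samples, with a neighborhood that shrinks over training."""
    rng = np.random.default_rng(seed)
    w = data[rng.choice(len(data), n_units)].astype(float)  # init from samples
    idx = np.arange(n_units)
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)
        sigma = max(0.5, sigma0 * (1.0 - e / epochs))
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(np.sum((w - x) ** 2, axis=1)))  # best-matching unit
            h = np.exp(-((idx - bmu) ** 2) / (2.0 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

# Hypothetical (salinity, temperature) observations from two hydrologic
# regimes; the trained map's units settle over the data, and their spread
# along each feature suggests operating thresholds.
rng = np.random.default_rng(1)
low = rng.normal([20.0, 22.0], [1.5, 1.0], size=(100, 2))
high = rng.normal([32.0, 28.0], [1.5, 1.0], size=(100, 2))
data = np.vstack([low, high])

units = train_som(data)
sal_lo, sal_hi = units[:, 0].min(), units[:, 0].max()
```

The appeal for poorly understood relationships is that the map imposes no parametric model: the units simply organize to cover the observed operating regimes, much as an analyst eyeballing a scatter plot would.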
Person, M.; Konikow, Leonard F.
1986-01-01
A solute-transport model of an irrigated stream-aquifer system was recalibrated because of discrepancies between prior predictions of ground-water salinity trends during 1971-1982 and the observed outcome in February 1982. The original model was calibrated with a 1-year record of data collected during 1971-1972 in an 18-km reach of the Arkansas River Valley in southeastern Colorado. The model is improved by incorporating additional hydrologic processes (salt transport through the unsaturated zone) and through reexamination of the reliability of some input data (regression relationship used to estimate salinity from specific conductance data). Extended simulations using the recalibrated model are made to investigate the usefulness of the model for predicting long-term trends of salinity and water levels within the study area. Predicted ground-water levels during 1971-1982 are in good agreement with the observed, indicating that the original 1971-1972 study period was sufficient to calibrate the flow model. However, long-term simulations using the recalibrated model based on recycling the 1971-1972 data alone yield an average ground-water salinity for 1982 that is too low by about 10%. Simulations that incorporate observed surface-water salinity variations yield better results, in that the calculated average ground-water salinity for 1982 is within 3% of the observed value. Statistical analysis of temporal salinity variations of the applied surface water indicates that at least a 4-year sampling period is needed to accurately calibrate the transport model. ?? 1986.
Impact of Satellite Remote Sensing Data on Simulations of ...
We estimated surface salinity flux and solar penetration from satellite data, and performed model simulations to examine the impact of including the satellite estimates on temperature, salinity, and dissolved oxygen distributions on the Louisiana continental shelf (LCS) near the annual hypoxic zone. Rainfall data from the Tropical Rainfall Measurement Mission (TRMM) were used for the salinity flux, and the diffuse attenuation coefficient (Kd) from the Moderate Resolution Imaging Spectroradiometer (MODIS) was used for solar penetration. Improvements in the model results in comparison with in situ observations occurred when the two types of satellite data were included. Without inclusion of the satellite-derived surface salinity flux, realistic monthly variability in the model salinity fields was observed, but important inter-annual variability was missed. Without inclusion of the satellite-derived light attenuation, model bottom water temperatures were too high nearshore due to excessive penetration of solar irradiance. In general, these salinity and temperature errors led to model stratification that was too weak, and the model failed to capture observed spatial and temporal variability in water-column vertical stratification. Inclusion of the satellite data improved temperature and salinity predictions and strengthened the vertical stratification, which improved prediction of bottom-water dissolved oxygen. The model-predicted area of bottom-water hypoxia on the
Empirical tools for simulating salinity in the estuaries in Everglades National Park, Florida
NASA Astrophysics Data System (ADS)
Marshall, F. E.; Smith, D. T.; Nickerson, D. M.
2011-12-01
Salinity in a shallow estuary is affected by upland freshwater inputs (surface runoff, stream/canal flows, groundwater), atmospheric processes (precipitation, evaporation), marine connectivity, and wind patterns. In Everglades National Park (ENP) in South Florida, the unique Everglades ecosystem exists as an interconnected system of fresh, brackish, and salt water marshes, mangroves, and open water. For this effort a coastal aquifer conceptual model of the Everglades hydrologic system was used with traditional correlation and regression hydrologic techniques to create a series of multiple linear regression (MLR) salinity models from observed hydrologic, marine, and weather data. The 37 ENP MLR salinity models cover most of the estuarine areas of ENP and produce daily salinity simulations that are capable of estimating 65-80% of the daily variability in salinity depending upon the model. The Root Mean Squared Error is typically about 2-4 salinity units, and there is little bias in the predictions. However, the absolute error of a model prediction in the nearshore embayments and the mangrove zone of Florida Bay may be relatively large for a particular daily simulation during the seasonal transitions. Comparisons show that the models group regionally by similar independent variables and salinity regimes. The MLR salinity models have approximately the same expected range of simulation accuracy and error as higher spatial resolution salinity models.
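Error statistics like those reported above (RMSE of about 2-4 salinity units, little bias) can be computed for any simulated series with two short helpers. The daily observed and simulated values below are invented for illustration, not taken from the ENP models.

```python
import math

def rmse(obs, sim):
    """Root mean squared error of simulated vs. observed values."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def bias(obs, sim):
    """Mean signed error (simulated minus observed)."""
    return sum(s - o for o, s in zip(obs, sim)) / len(obs)

# Invented daily salinity values for one hypothetical station.
obs = [22.1, 24.5, 26.0, 27.2, 25.8, 23.9, 21.5]
sim = [20.9, 23.8, 27.1, 29.0, 27.5, 24.6, 20.2]

daily_rmse = rmse(obs, sim)   # in salinity units
daily_bias = bias(obs, sim)   # near zero means little systematic offset
```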
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Peter K.; Lee, Jonghyun; Fu, Xiaojing
2017-05-31
Managing recharge of freshwater into saline aquifers requires accurate estimation of the heterogeneous permeability field for maximizing injection and recovery efficiency. Here we present a methodology for subsurface characterization in saline aquifers that takes advantage of the density difference between the injected freshwater and the ambient saline groundwater. We combine high-resolution forward modeling of density-driven flow with an efficient Bayesian geostatistical inversion algorithm. In the presence of a density difference between the injected and ambient fluids due to differences in salinity, the pressure field is coupled to the spatial distribution of salinity. This coupling renders the pressure field transient: the time evolution of the salinity distribution controls the density distribution, which then leads to a time-evolving pressure distribution. We exploit this coupling between pressure and salinity to obtain an improved characterization of the permeability field without multiple pumping tests or additional salinity measurements. We show that the inversion performance improves with an increase in the mixed convection ratio, the relative importance between viscous forces from injection and buoyancy forces from density difference. Thus, our work shows that measuring transient pressure data at multiple sampling points during freshwater injection into saline aquifers can be an effective strategy for aquifer characterization, key to the successful management of aquifer recharge.
NASA Astrophysics Data System (ADS)
Millie, David F.; Weckman, Gary R.; Young, William A.; Ivey, James E.; Fries, David P.; Ardjmand, Ehsan; Fahnenstiel, Gary L.
2013-07-01
Coastal monitoring has become reliant upon automated sensors for data acquisition. Such a technical commitment comes with a cost; particularly, the generation of large, high-dimensional data streams ('Big Data') that personnel must search through to identify data structures. Nature-inspired computation, inclusive of artificial neural networks (ANNs), affords the unearthing of complex, recurring patterns within sizable data volumes. In 2009, select meteorological and hydrological data were acquired via autonomous instruments in Sarasota Bay, Florida (USA). ANNs estimated continuous chlorophyll (CHL) a concentrations from abiotic predictors, with correlations between measured:modeled concentrations >0.90 and model efficiencies ranging from 0.80 to 0.90. Salinity and water temperature were the principal influences for modeled CHL within the Bay; concentrations steadily increased at temperatures >28 °C and were greatest at salinities <36 (maximizing at ca. 35.3). Categorical ANNs modeled CHL classes of 6.1 and 11 μg CHL L-1 (representative of local and state-imposed constraint thresholds, respectively), with an accuracy of ca. 83% and class precision ranging from 0.79 to 0.91. The occurrence likelihood of concentrations >6.1 μg CHL L-1 maximized at a salinity of ca. 36.3 and a temperature of ca. 29.5 °C. A 10th-order Chebyshev bivariate polynomial equation was fit (adj. r2 = 0.99, p < 0.001) to a three-dimensional response surface portraying modeled CHL concentrations, conditional to the temperature-salinity interaction. The TREPAN algorithm queried a continuous ANN to extract a decision tree for delineation of CHL classes; turbidity, temperature, and salinity (and to lesser degrees, wind speed, wind/current direction, irradiance, and urea-nitrogen) were key variables for quantitative rules in tree formalisms.
Taken together, computations enabled knowledge provision for and quantifiable representations of the non-linear relationships between environmental variables and CHL a.
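The "model efficiency" figures quoted above (0.80 to 0.90) are commonly the Nash-Sutcliffe efficiency. A sketch assuming that definition, with toy observed and modeled CHL values invented for illustration:

```python
def nash_sutcliffe(obs, sim):
    """Model efficiency: 1 - SSE / variance of observations about their mean.
    1.0 is a perfect fit; 0.0 means no better than predicting the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_tot

# Invented daily CHL values (ug/L), observed vs. modeled.
obs = [4.0, 6.5, 9.0, 12.0, 8.0]
sim = [4.5, 6.0, 9.5, 11.0, 8.5]
nse = nash_sutcliffe(obs, sim)
```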
NASA Astrophysics Data System (ADS)
Delsman, J. R.; Hu-a-ng, K. R. M.; Vos, P. C.; de Louw, P. G. B.; Oude Essink, G. H. P.; Stuyfzand, P. J.; Bierkens, M. F. P.
2013-11-01
Management of coastal fresh groundwater reserves requires a thorough understanding of the present-day groundwater salinity distribution and its possible future development. However, coastal groundwater often still reflects a complex history of marine transgressions and regressions, and is only rarely in equilibrium with current boundary conditions. In addition, the distribution of groundwater salinity is virtually impossible to characterize satisfactorily, complicating efforts to model and predict coastal groundwater flow. A way forward may be to account for the historical development of groundwater salinity when modeling present-day coastal groundwater flow. In this paper, we construct a palaeo-hydrogeological model to simulate the evolution of groundwater salinity in the coastal area of the Netherlands throughout the Holocene. While intended as a perceptual tool, confidence in our model results is warranted by a good correspondence with a hydrochemical characterization of groundwater origin. Model results attest to the impact of groundwater density differences on coastal groundwater flow on millennial timescales and highlight their importance in shaping today's groundwater salinity distribution. The simulated system never reached steady state throughout the Holocene, demonstrating the long-term dynamics of salinity in coastal aquifers. This stresses the importance of accounting for the historical evolution of coastal groundwater salinity when modeling present-day coastal groundwater flow, or when predicting impacts of e.g. sea level rise on coastal aquifers. Of more local importance, our findings suggest a more significant role of pre-Holocene groundwater in the present-day groundwater salinity distribution in the Netherlands than previously recognized. 
The implications of our results extend beyond understanding the present-day distribution of salinity, as the proven complex history of coastal groundwater also holds important clues for understanding and predicting the distribution of other societally relevant groundwater constituents.
Schmöckel, Sandra M.; Lightfoot, Damien J.; Razali, Rozaimi; Tester, Mark; Jarvis, David E.
2017-01-01
Chenopodium quinoa (quinoa) is an emerging crop that produces nutritious grains with the potential to contribute to global food security. Quinoa can also grow on marginal lands, such as soils affected by high salinity. To identify candidate salt tolerance genes in the recently sequenced quinoa genome, we used a multifaceted approach integrating RNAseq analyses with comparative genomics and topology prediction. We identified 219 candidate genes by selecting those that were differentially expressed in response to salinity, were specific to or overrepresented in quinoa relative to other Amaranthaceae species, and had more than one predicted transmembrane domain. To determine whether these genes might underlie variation in salinity tolerance in quinoa and its close relatives, we compared the response to salinity stress in a panel of 21 Chenopodium accessions (14 C. quinoa, 5 C. berlandieri, and 2 C. hircinum). We found large variation in salinity tolerance, with one C. hircinum displaying the highest salinity tolerance. Using genome re-sequencing data from these accessions, we investigated single nucleotide polymorphisms and copy number variation (CNV) in the 219 candidate genes in accessions of contrasting salinity tolerance, and identified 15 genes that could contribute to the differences in salinity tolerance of these Chenopodium accessions. PMID:28680429
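The three-way screen described above (differentially expressed under salinity, quinoa-specific or overrepresented, and more than one predicted transmembrane domain) amounts to a set intersection plus a filter. The gene IDs and transmembrane-domain counts below are hypothetical placeholders, not quinoa annotations.

```python
# Hypothetical gene IDs and transmembrane-domain counts; the three filters
# mirror the screening criteria described in the abstract.
diff_expressed = {"g1", "g2", "g3", "g5", "g8"}            # salt-responsive (RNAseq)
quinoa_enriched = {"g2", "g3", "g4", "g8", "g9"}           # specific/overrepresented
tm_domains = {"g1": 4, "g2": 2, "g3": 1, "g8": 7, "g9": 2} # predicted TM domains

# Keep genes passing all three criteria (>1 TM domain, per the abstract).
candidates = sorted(g for g in diff_expressed & quinoa_enriched
                    if tm_domains.get(g, 0) > 1)
```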
Abraham, Michael H; Gola, Joelle M R; Ibrahim, Adam; Acree, William E; Liu, Xiangli
2014-07-01
There is considerable interest in the blood-tissue distribution of agrochemicals, and a number of researchers have developed experimental methods for in vitro distribution. These methods involve the determination of saline-blood and saline-tissue partitions; not only are they indirect, but they do not yield the required in vivo distribution. The authors set out equations for gas-tissue and blood-tissue distribution, for partition from water into skin and for permeation from water through human skin. Together with Abraham descriptors for the agrochemicals, these equations can be used to predict values for all of these processes. The present predictions compare favourably with experimental in vivo blood-tissue distribution where available. The predictions require no more than simple arithmetic. The present method represents a much easier and much more economic way of estimating blood-tissue partitions than the method that uses saline-blood and saline-tissue partitions. It has the added advantages of yielding the required in vivo partitions and being easily extended to the prediction of partition of agrochemicals from water into skin and permeation from water through skin. © 2013 Society of Chemical Industry.
Modeling Salinity Exchanges Between the Equatorial Indian Ocean and the Bay of Bengal
2016-06-01
Technology, has produced a model salinity climatology using daily atmosphere and surface flux climatology as forcing. Here, we present the results...surface, the model was forced by the daily climatology of atmospheric variables obtained from various sources. We used daily QuikSCAT and...2012). Precipitation data were obtained from the Global Precipitation Climatology Project (GPCP). Using the bulk flux algorithm by Fairall et al
Simulation of salinity effects on past, present, and future soil organic carbon stocks.
Setia, Raj; Smith, Pete; Marschner, Petra; Gottschalk, Pia; Baldock, Jeff; Verma, Vipan; Setia, Deepika; Smith, Jo
2012-02-07
Soil organic carbon (SOC) models are used to predict changes in SOC stocks and carbon dioxide (CO(2)) emissions from soils, and have been successfully validated for non-saline soils. However, SOC models have not been developed to simulate SOC turnover in saline soils. Due to the large extent of salt-affected areas in the world, it is important to correctly predict SOC dynamics in salt-affected soils. To close this knowledge gap, we modified the Rothamsted Carbon Model (RothC) to simulate SOC turnover in salt-affected soils, using data from non-salt-affected and salt-affected soils in two agricultural regions in India (120 soils) and in Australia (160 soils). Recently we developed a decomposition rate modifier based on an incubation study of a subset of these soils. In the present study, we introduce a new method to estimate the past losses of SOC due to salinity and show how salinity affects future SOC stocks on a regional scale. Because salinity decreases decomposition rates, simulations using the decomposition rate modifier for salinity suggest an accumulation of SOC. However, if the plant inputs are also adjusted to reflect reduced plant growth under saline conditions, the simulations show a significant loss of soil carbon in the past due to salinization, with a higher average loss of SOC in Australian soils (55 t C ha(-1)) than in Indian soils (31 t C ha(-1)). There was a significant negative correlation (p < 0.05) between SOC loss and osmotic potential. Simulations of future SOC stocks with the decomposition rate modifier and the plant input modifier indicate a greater decrease in SOC in saline than in non-saline soils under future climate. The simulations of past losses of SOC due to salinity were repeated using either measured charcoal-C or the inert organic matter predicted by the Falloon et al. equation to determine how much deviation from the Falloon et al. 
equation affects the amount of plant inputs generated by the model for the soils used in this study. Both sets of results suggest that saline soils have lost carbon and will continue to lose carbon under future climate. This demonstrates the importance of both reduced decomposition and reduced plant input in simulations of future changes in SOC stocks in saline soils.
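The interplay of a decomposition rate modifier (below 1, slowing turnover) and a plant input modifier (below 1, reducing inputs) can be sketched with a one-pool, first-order SOC model. This is a deliberate simplification: RothC itself uses several pools, and every parameter value here is illustrative rather than calibrated.

```python
import math

def soc_after(years, soc0, k, rate_modifier=1.0, plant_input=2.0, input_modifier=1.0):
    """Forward-step a one-pool first-order SOC model with annual time steps.
    rate_modifier < 1 slows decomposition under salinity;
    input_modifier < 1 reduces plant inputs under salinity."""
    soc = soc0
    for _ in range(years):
        soc = soc * math.exp(-k * rate_modifier) + plant_input * input_modifier
    return soc

# Illustrative parameters (t C/ha for stocks and inputs, 1/yr for k).
non_saline = soc_after(50, soc0=40.0, k=0.05)
saline = soc_after(50, soc0=40.0, k=0.05, rate_modifier=0.7, input_modifier=0.5)
```

Even though the saline run decomposes more slowly, its reduced plant inputs dominate over 50 years and the pool ends lower, which is the qualitative result the abstract reports.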
Liu, Mingyue; Du, Baojia; Zhang, Bai
2018-01-01
Soil salinity and sodicity can significantly reduce the value and the productivity of affected lands, posing degradation threats to the sustainable development of natural resources on earth. This research attempted to map soil salinity/sodicity via disentangling the relationships between Landsat 8 Operational Land Imager (OLI) imagery and in-situ measurements (EC, pH) over the west Jilin of China. We established the retrieval models for soil salinity and sodicity using Partial Least Square Regression (PLSR). Spatial distribution of the soils that were subjected to hybridized salinity and sodicity (HSS) was obtained by overlay analysis using maps of soil salinity and sodicity in a geographical information system (GIS) environment. We analyzed the severity and occurring sizes of soil salinity, sodicity, and HSS with regard to specified soil types and land cover. Results indicated that the models' accuracy was improved by combining the reflectance bands and spectral indices that were mathematically transformed. Therefore, our results stipulated that the OLI imagery and PLSR method were applicable to mapping soil salinity and sodicity in the region. The mapping results revealed that the areas of soil salinity, sodicity, and HSS were 1.61 × 10⁶ hm², 1.46 × 10⁶ hm², and 1.36 × 10⁶ hm², respectively. Also, the occurring area of moderate and intensive sodicity was larger than that of salinity. This research may underpin efficient mapping of regional salinity/sodicity occurrences, improve understanding of the linkages between spectral reflectance and ground measurements of soil salinity and sodicity, and provide tools for soil salinity monitoring and the sustainable utilization of land resources. PMID:29614727
Validation of Aquarius Measurements Using Radiative Transfer Models at L-Band
NASA Technical Reports Server (NTRS)
Dinnat, E.; LeVine, David M.; Abraham, S.; DeMattheis, P.; Utku, C.
2012-01-01
Aquarius/SAC-D was launched in June 2011 by NASA and CONAE (Argentine space agency). Aquarius includes three L-band (1.4 GHz) radiometers dedicated to measuring sea surface salinity. We report detailed comparisons of Aquarius measurements with radiative transfer model predictions. These comparisons were used as part of the initial assessment of Aquarius data. In particular, they were used successfully to estimate the radiometer calibration bias and stability. Further comparisons are being performed to assess the performance of models in the retrieval algorithm for correcting the effect of sources of geophysical "noise" (e.g. the galactic background, atmospheric attenuation and reflected signal from the Sun). Such corrections are critical in bringing the error in retrieved salinity down to the required 0.2 practical salinity unit (psu) on monthly global maps at 150 km by 150 km resolution. The forward models making up the Aquarius simulator have been very useful for preparatory studies in the years leading to Aquarius' launch. The simulator includes various components to compute effects of the following processes on the measured signal: 1) emission from Earth surfaces (ocean, land, ice), 2) atmospheric emission and absorption, 3) emission from the Sun, Moon and celestial Sky (directly through the antenna sidelobes or after reflection/scattering at the Earth surface), 4) Faraday rotation, and 5) convolution of the scene by the antenna gain patterns. Since the Aquarius radiometers' turn-on in late July 2011, the simulator has been used to perform a first order validation of the data. This included checking the order of magnitude of the signal over ocean, land and ice surfaces, checking the relative amplitude of signal at different polarizations, and checking the variation with incidence angle. The comparisons were also used to assess calibration bias and monitor instrument calibration drift. The simulator is also being used in the salinity retrieval. 
For example, initial assessments of the salinity retrieved from Aquarius data showed degradation in accuracy at locations where glint from the galactic sky background was important. This was traced to an inaccurate correction for the Sky glint. We present comparisons of the simulator prediction to the Aquarius data in order to assess the performance of the models of various physical processes impacting the measurements, such as the effect of sea surface roughness, the impact of the celestial Sky and the Sun emission scattered at the rough ocean surface. We discuss what components of the simulator appear reliable and which ones need improvements. Improved knowledge of the radiative transfer models at L-band will not only lead to better salinity retrieved from Aquarius data, it will also be beneficial for SMOS and the upcoming SMAP mission.
Bolduc, F.; Afton, A.D.
2003-01-01
Aquatic invertebrates are important food resources for wintering waterbirds, and prey selection generally is limited by prey size. Aquatic invertebrate communities are influenced by sediments and hydrologic characteristics of wetlands, which were affected by structural marsh management (levees, water-control structures and impoundments; SMM) and salinity on the Gulf Coast Chenier Plain of North America. Based on previous research, we tested general predictions that SMM reduces biomass of infaunal invertebrates and increases that of epifaunal invertebrates and those that tolerate low levels of dissolved oxygen (O2) and salinity. We also tested the general prediction that invertebrate biomass in freshwater, oligohaline, and mesohaline marshes is similar, except for taxa adapted to specific ranges of salinity. Finally, we investigated relationships among invertebrate biomass and sizes, sediment and hydrologic variables, and marsh types. Accordingly, we measured biomass of common invertebrates in three size classes (63 to 199 μm, 200 to 999 μm, and ≥1000 μm), sediment variables (carbon content, C:N ratio, hardness, particle size, and O2 penetration), and hydrologic variables (salinity, water depth, temperature, O2, and turbidity) in ponds of impounded freshwater (IF), oligohaline (IO), mesohaline (IM), and unimpounded mesohaline (UM) marshes during winters 1997-1998 to 1999-2000 on Rockefeller State Wildlife Refuge, near Grand Chenier, Louisiana, USA. As predicted, an a priori multivariate analysis of variance (MANOVA) contrast indicated that biomass of an infaunal class of invertebrates (Nematoda, 63 to 199 μm) was greater in UM marsh ponds than in those of IM marshes, and biomass of an epifaunal class of invertebrates (Ostracoda, 200 to 999 μm) was greater in IM marsh ponds than in those of UM marshes. The observed reduction in Nematoda due to SMM also was consistent with the prediction that SMM reduces invertebrates that do not tolerate low salinity. 
Furthermore, as predicted, an a priori MANOVA contrast indicated that biomass of a single invertebrate class adapted to low salinity (Oligochaeta, 200 to 999 μm) was greater in ponds of IF marshes than in those of IO and IM marshes. A canonical correspondence analysis indicated that variation in salinity and O2 penetration best explained differences among sites that maximized biomass of the common invertebrate classes. Salinity was positively correlated with the silt-clay fraction, O2, and O2 penetration, and negatively correlated with water depth, sediment hardness, carbon, and C:N. Nematoda, Foraminifera, and Copepoda generally were associated with UM marsh ponds and high salinity, whereas other invertebrate classes were distributed among impounded marsh ponds and associated with lower salinity. Our results suggest that SMM and salinity have relatively small effects on invertebrate prey of wintering waterbirds in marsh ponds because they affect biomass of Nematoda and Oligochaeta, and few waterbirds consume these invertebrates. © 2003, The Society of Wetland Scientists.
Schofield, Pamela J.; Peterson, Mark S.; Lowe, Michael R.; Brown-Peterson, Nancy J.; Slack, William T.
2011-01-01
The physiological tolerances of non-native fishes are an integral component of assessing potential invasive risk. Salinity and temperature are environmental variables that limit the spread of many non-native fishes. We hypothesised that combinations of temperature and salinity would interact to affect survival, growth, and reproduction of Nile tilapia, Oreochromis niloticus, introduced into Mississippi, USA. Tilapia withstood acute transfer from fresh water up to a salinity of 20 and survived gradual transfer up to 60 at typical summertime (30°C) temperatures. However, cold temperature (14°C) reduced survival of fish in saline waters ≥10 and increased the incidence of disease in freshwater controls. Although fish were able to equilibrate to saline waters in warm temperatures, reproductive parameters were reduced at salinities ≥30. These integrated responses suggest that Nile tilapia can invade coastal areas beyond their point of introduction. However, successful invasion is subject to two caveats: (1) wintertime survival depends on finding thermal refugia, and (2) reproduction is hampered in regions where salinities are ≥30. These data are vital to predicting the invasion of non-native fishes into coastal watersheds. This is particularly important given the predicted changes in coastal landscapes due to global climate change and sea-level rise.
NASA Technical Reports Server (NTRS)
Yueh, Simon H.; Chaubell, Mario J.
2011-01-01
Aquarius is a combined passive/active L-band microwave instrument developed to map the salinity field at the surface of the ocean from space. The data will support studies of the coupling between ocean circulation, the global water cycle, and climate. The primary science objective of this mission is to monitor the seasonal and interannual variation of the large scale features of the surface salinity field in the open ocean with a spatial resolution of 150 kilometers and a retrieval accuracy of 0.2 practical salinity units globally on a monthly basis. The measurement principle is based on the response of the L-band (1.413 gigahertz) sea surface brightness temperatures (T (sub B)) to sea surface salinity. To achieve the required 0.2 practical salinity units accuracy, the impact of sea surface roughness (e.g. wind-generated ripples and waves), along with several other factors, on the observed brightness temperature has to be corrected to better than a few tenths of a kelvin. To this end, Aquarius includes a scatterometer to help correct for this surface roughness effect.
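The link between the 0.2 psu requirement and the "few tenths of a kelvin" brightness-temperature budget follows from the approximate L-band sensitivity of TB to salinity. The sensitivity values below are rough literature ballparks, not mission specifications.

```python
# Approximate |dTB/dSSS| at L-band, v-pol (K per practical salinity unit).
# Ballpark figures only; sensitivity weakens in cold water, which is why
# high-latitude retrievals are the hardest.
sensitivity = {5: 0.3, 15: 0.5, 25: 0.7}   # SST in deg C -> K/psu

target_sss_accuracy = 0.2  # psu, the stated mission requirement
for sst, dtb_dsss in sensitivity.items():
    tb_budget = target_sss_accuracy * dtb_dsss
    print(f"SST {sst:2d} C: TB error budget ~{tb_budget:.2f} K")
```

Multiplying the requirement by the sensitivity gives TB budgets of roughly 0.06 to 0.14 K, i.e. tenths of a kelvin or less.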
Setia, Raj; Smith, Pete; Marschner, Petra; Baldock, Jeff; Chittleborough, David; Smith, Jo
2011-08-01
Soil organic carbon (SOC) models such as the Rothamsted Carbon Model (RothC) have been used to estimate SOC dynamics in soils over different time scales but, until recently, their ability to accurately predict SOC stocks/carbon dioxide (CO(2)) emissions from salt-affected soils has not been assessed. Given the large extent of salt-affected soils (19% of the 20.8 billion ha of arable land on Earth), this may lead to misestimation of CO(2) release. Using soils from two salt-affected regions (one in Punjab, India and one in South Australia), an incubation study was carried out measuring CO(2) release over 120 days. The soils varied both in salinity (measured as electrical conductivity (EC) and calculated as osmotic potential using EC and water content) and sodicity (measured as sodium adsorption ratio, SAR). For soils from both regions, the osmotic potential had a significant positive relationship with CO(2)-C release, but no significant relationship was found between SAR and CO(2)-C release. The monthly cumulative CO(2)-C was simulated using RothC. RothC was modified to take into account reductions in plant inputs due to salinity. A subset of non-salt-affected soils was used to derive an equation for a "lab-effect" modifier to account for changes in decomposition under lab conditions and this modifier was significantly related to pH. Using a subset of salt-affected soils, a decomposition rate modifier (as a function of osmotic potential) was developed to match measured and modelled CO(2)-C release after correcting for the lab effect. Using this decomposition rate modifier, we found good agreement (R(2) = 0.92) between modelled and independently measured data for a set of soils from the incubation experiment. RothC, modified by including reduced plant inputs due to salinity and the salinity decomposition rate modifier, was used to predict SOC stocks of soils in a field in South Australia. The predictions clearly showed that SOC stocks are reduced in saline soils. 
Therefore, both the decomposition rate modifier and the plant input modifier should be taken into account when modeling SOC turnover in saline soils. Since modeling has previously not accounted for the impact of salinity, our results suggest that previous predictions may have overestimated SOC stocks.
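The salinity decomposition rate modifier described above can be sketched as a multiplier on first-order pool decay of the kind RothC uses. This is a minimal illustration assuming a simple exponential dependence on osmotic potential; the constants and functional form are placeholders, not the values fitted in the study:

```python
import math

def osmotic_rate_modifier(psi_kpa, a=0.6, b=0.002):
    """Hypothetical decomposition-rate modifier as a function of osmotic
    potential (psi_kpa <= 0, in kPa). Decays toward the floor `a` as the
    soil solution becomes more saline; `a` and `b` are illustrative only."""
    return a + (1.0 - a) * math.exp(b * psi_kpa)  # psi <= 0, so modifier <= 1

def monthly_co2(pool_c, k_per_month, psi_kpa):
    """CO2-C respired in one month from a single first-order carbon pool,
    with the decay constant scaled by the salinity modifier."""
    m = osmotic_rate_modifier(psi_kpa)
    return pool_c * (1.0 - math.exp(-k_per_month * m))
```

Under this sketch, a non-saline soil (osmotic potential near zero) decomposes at the unmodified rate, while increasingly negative osmotic potentials slow CO2-C release, qualitatively matching the reported positive relationship between osmotic potential and respiration.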
NASA Astrophysics Data System (ADS)
Zhang, Zhilin; Savenije, Hubert H. G.
2017-07-01
The practical value of the surprisingly simple Van der Burgh equation in predicting saline water intrusion in alluvial estuaries is well documented, but the physical foundation of the equation is still weak. In this paper we provide a connection between the empirical equation and the theoretical literature, leading to a theoretical range of Van der Burgh's coefficient of 1/2 < K < 2/3 for density-driven mixing which falls within the feasible range of 0 < K < 1. In addition, we developed a one-dimensional predictive equation for the dispersion of salinity as a function of local hydraulic parameters that can vary along the estuary axis, including mixing due to tide-driven residual circulation. This type of mixing is relevant in the wider part of alluvial estuaries where preferential ebb and flood channels appear. Subsequently, this dispersion equation is combined with the salt balance equation to obtain a new predictive analytical equation for the longitudinal salinity distribution. Finally, the new equation was tested and applied to a large database of observations in alluvial estuaries, whereby the calibrated K values appeared to correspond well to the theoretical range.
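The coupling of Van der Burgh's relation with the steady-state salt balance can be sketched numerically. The sketch below assumes a constant cross-section and constant freshwater discharge, whereas the paper's formulation lets the hydraulic parameters vary along the estuary axis, so this illustrates only the structure of the argument:

```python
def salinity_profile(s0, d0, q, a, k, dx, n):
    """Integrate landward from the estuary mouth the 1-D steady-state
    salt balance dS/dx = -Q*S/(A*D) together with Van der Burgh's
    relation dD/dx = -K*Q/A. Q is freshwater discharge, A a (here
    constant) cross-section, D the dispersion coefficient, K Van der
    Burgh's coefficient. Simplified: real alluvial estuaries have
    exponentially varying A."""
    s, d, profile = s0, d0, [s0]
    for _ in range(n):
        s += dx * (-q * s / (a * d))  # salinity decreases landward
        d += dx * (-k * q / a)        # dispersion decreases landward
        if d <= 0.0 or s <= 0.0:
            break                     # intrusion limit reached
        profile.append(s)
    return profile
```

The point where D (or S) vanishes marks the salt intrusion limit; with 1/2 < K < 2/3, the resulting profile takes the characteristic concave-then-convex shape seen in alluvial estuaries.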
NASA Astrophysics Data System (ADS)
La Peyre, Megan K.; Eberline, Benjamin S.; Soniat, Thomas M.; La Peyre, Jerome F.
2013-12-01
Understanding how different life history stages are impacted by extreme or stochastic environmental variation is critical for predicting and modeling organism population dynamics. This project examined recruitment, growth, and mortality of seed (25-75 mm) and market (>75 mm) sized oysters along a salinity gradient over two years in Breton Sound, LA. In April 2010, management responses to the Deepwater Horizon oil spill resulted in extreme low salinity (<5) at all sites through August 2010; in 2011, a 100-year Mississippi River flood event resulted in low salinity in late spring. Extended low salinity (<5) during hot summer months (>25 °C) significantly and negatively impacted oyster recruitment, survival and growth in 2010, while low salinity (<5) for a shorter period that did not extend into July (<25 °C) in 2011 had minimal impacts on oyster growth and mortality. In 2011, recruitment was limited, which may be due to a combination of low springtime salinities, high 2010 oyster mortality, minimal 2010 recruitment, cumulative effects from 10 years of declining oyster stock in the area, and poor cultch quality. In both 2010 and 2011, Perkinsus marinus infection prevalence remained low throughout the year at all sites and almost all infection intensities were light. Oyster plasma osmolality failed to match surrounding low salinity waters in 2010, while oysters appeared to osmoconform throughout 2011, indicating that the high mortality in 2010 may be due to extended valve closing and resulting starvation or asphyxiation in response to the combination of low salinity during high temperatures (>25 °C). With increasing management of our freshwater inputs to estuaries combined with predicted climate changes, how extreme events affect different life history stages is key to understanding variation in population demographics of commercially important species and predicting future populations.
NASA Astrophysics Data System (ADS)
Lowe, A. T.; Roberts, E. A.; Galloway, A. W. E.
2016-02-01
Coastal regions around the world are changing rapidly, generating many physiological stressors for marine organisms. Food availability, a major factor determining physiological condition of marine organisms, in these systems reflects the influence of biological and environmental factors, and will likely respond dramatically to long-term changes. Using observations of phytoplankton, detritus, and their corresponding fatty acids and stable isotopes of carbon, nitrogen and sulfur, we identified environmental drivers of pelagic food availability and quality along a salinity gradient in a large tidally influenced estuary (San Juan Archipelago, Salish Sea, USA). Variation in chlorophyll a (Chl a), biomarkers and environmental conditions exhibited a similar range at both tidal and seasonal scales, highlighting a tide-related mechanism controlling productivity that is important to consider for long-term monitoring. Multiple parameters of food availability were inversely and non-linearly correlated to salinity, such that availability of high-quality (based on abundance, essential fatty acid concentration and C:N) seston increased below a salinity threshold of 30. The increased marine productivity was associated with increased pH and dissolved oxygen (DO) at lower salinity. Based on this observation we predicted that a decrease of salinity to below the threshold would result in higher Chl a, temperature, DO and pH across a range of temporal and spatial scales, and tested the prediction with a meta-analysis of available data. At all scales, these variables showed significant and consistent increases related to the salinity threshold. This finding provides important context to the increased frequency of below-threshold salinity over the last 71 years in this region, suggesting greater food availability with positive feedbacks on DO and pH. 
Together, these findings indicate that many of the environmental factors predicted to increase physiological stress to benthic suspension feeders (e.g. decreased salinity) may simultaneously and paradoxically improve conditions for benthic organisms.
Stagg, Camille L.; Schoolmaster, Donald; Krauss, Ken W.; Cormier, Nicole; Conner, William H.
2017-01-01
Coastal wetlands significantly contribute to global carbon storage potential. Sea-level rise and other climate change-induced disturbances threaten coastal wetland sustainability and carbon storage capacity. It is critical that we understand the mechanisms controlling wetland carbon loss so that we can predict and manage these resources in anticipation of climate change. However, our current understanding of the mechanisms that control soil organic matter decomposition, in particular the impacts of elevated salinity, are limited, and literature reports are contradictory. In an attempt to improve our understanding of these complex processes, we measured root and rhizome decomposition and developed a causal model to identify and quantify the mechanisms that influence soil organic matter decomposition in coastal wetlands that are impacted by sea-level rise. We identified three causal pathways: 1) a direct pathway representing the effects of flooding on soil moisture, 2) a direct pathway representing the effects of salinity on decomposer microbial communities and soil biogeochemistry, and 3) an indirect pathway representing the effects of salinity on litter quality through changes in plant community composition over time. We used this model to test the effects of alternate scenarios on the response of tidal freshwater forested wetlands and oligohaline marshes to short- and long-term climate-induced disturbances of flooding and salinity. In tidal freshwater forested wetlands, the model predicted less decomposition in response to drought, hurricane salinity pulsing, and long-term sea-level rise. In contrast, in the oligohaline marsh, the model predicted no change in response to sea-level rise, and increased decomposition following a drought or a hurricane salinity pulse. 
Our results show that it is critical to consider the temporal scale of disturbance and the magnitude of exposure when assessing the effects of salinity intrusion on carbon mineralization in coastal wetlands. Here we identify three causal mechanisms that can reconcile disparities between long-term and short-term salinity impacts on organic matter decomposition.
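The three causal pathways identified above can be illustrated with a toy linear structural model, in which salinity acts on decomposition both directly and indirectly via litter quality. All coefficients below are hypothetical placeholders, not values estimated by the study:

```python
def predict_decomposition(flood, salinity, beta):
    """Toy linear structural model of the three pathways described:
    (1) flooding -> soil moisture -> decomposition (direct),
    (2) salinity -> microbes/biogeochemistry -> decomposition (direct),
    (3) salinity -> plant community -> litter quality -> decomposition
        (indirect).
    `beta` holds hypothetical path coefficients."""
    moisture = beta["flood_moisture"] * flood
    litter_quality = beta["sal_litter"] * salinity
    return (beta["moisture_dec"] * moisture
            + beta["sal_dec"] * salinity
            + beta["litter_dec"] * litter_quality)
```

Because the direct and indirect salinity paths can carry opposite signs, such a model can predict decreased decomposition under a long-term press (where the slow litter-quality path has time to act) but increased decomposition under a short pulse, which is the kind of reconciliation the abstract describes.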
LaPeyre, Megan K.; Eberline, Benjamin S.; Soniat, Thomas M.; La Peyre, Jerome F.
2013-01-01
Understanding how different life history stages are impacted by extreme or stochastic environmental variation is critical for predicting and modeling organism population dynamics. This project examined recruitment, growth, and mortality of seed (25–75 mm) and market (>75 mm) sized oysters along a salinity gradient over two years in Breton Sound, LA. In April 2010, management responses to the Deepwater Horizon oil spill resulted in extreme low salinity (<5) at all sites through August 2010; in 2011, a 100-year Mississippi River flood event resulted in low salinity in late spring. Extended low salinity (<5) during hot summer months (>25 °C) significantly and negatively impacted oyster recruitment, survival and growth in 2010, while low salinity (<5) for a shorter period that did not extend into July (<25 °C) in 2011 had minimal impacts on oyster growth and mortality. In 2011, recruitment was limited, which may be due to a combination of low springtime salinities, high 2010 oyster mortality, minimal 2010 recruitment, cumulative effects from 10 years of declining oyster stock in the area, and poor cultch quality. In both 2010 and 2011, Perkinsus marinus infection prevalence remained low throughout the year at all sites and almost all infection intensities were light. Oyster plasma osmolality failed to match surrounding low salinity waters in 2010, while oysters appeared to osmoconform throughout 2011, indicating that the high mortality in 2010 may be due to extended valve closing and resulting starvation or asphyxiation in response to the combination of low salinity during high temperatures (>25 °C). With increasing management of our freshwater inputs to estuaries combined with predicted climate changes, how extreme events affect different life history stages is key to understanding variation in population demographics of commercially important species and predicting future populations.
Stagg, Camille L; Schoolmaster, Donald R; Krauss, Ken W; Cormier, Nicole; Conner, William H
2017-08-01
Coastal wetlands significantly contribute to global carbon storage potential. Sea-level rise and other climate-change-induced disturbances threaten coastal wetland sustainability and carbon storage capacity. It is critical that we understand the mechanisms controlling wetland carbon loss so that we can predict and manage these resources in anticipation of climate change. However, our current understanding of the mechanisms that control soil organic matter decomposition, in particular the impacts of elevated salinity, are limited, and literature reports are contradictory. In an attempt to improve our understanding of these complex processes, we measured root and rhizome decomposition and developed a causal model to identify and quantify the mechanisms that influence soil organic matter decomposition in coastal wetlands that are impacted by sea-level rise. We identified three causal pathways: (1) a direct pathway representing the effects of flooding on soil moisture, (2) a direct pathway representing the effects of salinity on decomposer microbial communities and soil biogeochemistry, and (3) an indirect pathway representing the effects of salinity on litter quality through changes in plant community composition over time. We used this model to test the effects of alternate scenarios on the response of tidal freshwater forested wetlands and oligohaline marshes to short- and long-term climate-induced disturbances of flooding and salinity. In tidal freshwater forested wetlands, the model predicted less decomposition in response to drought, hurricane salinity pulsing, and long-term sea-level rise. In contrast, in the oligohaline marsh, the model predicted no change in response to drought and sea-level rise, and increased decomposition following a hurricane salinity pulse. 
Our results show that it is critical to consider the temporal scale of disturbance and the magnitude of exposure when assessing the effects of salinity intrusion on carbon mineralization in coastal wetlands. Here, we identify three causal mechanisms that can reconcile disparities between long-term and short-term salinity impacts on organic matter decomposition. © 2017 by the Ecological Society of America.
Salinized rivers: degraded systems or new habitats for salt-tolerant faunas?
Buchwalter, David; Davis, Jenny
2016-01-01
Anthropogenic salinization of rivers is an emerging issue of global concern, with significant adverse effects on biodiversity and ecosystem functioning. Impacts of freshwater salinization on biota are strongly mediated by evolutionary history, as this is a major factor determining species physiological salinity tolerance. Freshwater insects dominate most flowing waters, and the common lotic insect orders Ephemeroptera (mayflies), Plecoptera (stoneflies) and Trichoptera (caddisflies) are particularly salt-sensitive. Tolerances of existing taxa, rapid adaptation, colonization by novel taxa (from naturally saline environments) and interactions between species will be key drivers of assemblages in saline lotic systems. Here we outline a conceptual framework predicting how communities may change in salinizing rivers. We envision that a relatively small number of taxa will be saline-tolerant and able to colonize salinized rivers (e.g. most naturally saline habitats are lentic; thus potential colonizers would need to adapt to lotic environments), leading to depauperate communities in these environments. PMID:26932680
Impacts of Low Salinity on Growth and Calcification in Baltic Sea Mytilus edulis x trossulus
NASA Astrophysics Data System (ADS)
Sanders, T.; Melzner, F.
2016-02-01
The Baltic Sea is characterized by a steep salinity gradient (25 psu - <5 psu) which is predicted to increase in the future due to increased precipitation. This provides an excellent biological system to study the effects of salinity and inorganic carbon supply on animal physiology. Mytilus edulis x trossulus is adapted to the low-salinity Baltic Sea, at the cost of slow body growth and reduced shell thickness. The small size of Baltic mytilids has been attributed to tradeoffs in energy partitioning due to high energetic costs associated with osmoregulation. However, salinity may affect calcification mechanisms and reduce calcification and thus, body size and growth. To understand the mechanistic effects salinity has on calcification, energy budgets were quantified in larvae, juveniles and adults from 3 populations of Baltic Sea Mytilus spp. at different salinities (6, 11 and 16 psu). Net CaCO3 production at varying salinities and bicarbonate concentrations was also measured. Larvae from low salinity adapted populations (6 psu) had a 3-fold higher respiration rate compared to higher salinity populations. This was also accompanied by a delay of 48 hours in early shell formation. Reductions in growth and increases in metabolism were largest between 11 psu and 6 psu, indicating that the predicted desalination of the Baltic will impose large energetic costs on mussel populations, potentially leading to loss of reefs in the Eastern Baltic. To investigate the mechanisms behind increased metabolic cost and decreased allocation to growth, energy budgets are presently being constrained in our three populations using modulations in food supply and temperature.
Konikow, Leonard F.
1981-01-01
Undesirable salinity increases occur in both groundwater and surface water and are commonly related to agricultural practices. Groundwater recharge from precipitation or irrigation will transport and disperse residual salts concentrated by evapotranspiration, salts leached from soil and aquifer materials, as well as some dissolved fertilizers and pesticides. Where stream salinity is affected by agricultural practices, the increases in salt load usually are attributable mostly to a groundwater component of flow. Thus, efforts to predict, manage, or control stream salinity increases should consider the role of groundwater in salt transport. Two examples of groundwater salinity problems in Colorado, U.S.A., illustrate that a model which simulates accurately the transport and dispersion of solutes in flowing groundwater can be (1) a valuable investigative tool to help understand the processes and parameters controlling the movement and fate of the salt, and (2) a valuable management tool for predicting responses and optimizing the development and use of the total water resource. © 1981.
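The kind of solute-transport simulation described can be sketched as a 1-D advection-dispersion solver using explicit upwind finite differences. Operational groundwater transport codes are far more elaborate (2-D/3-D, variable velocity fields, sorption); this sketch only illustrates the governing balance:

```python
def advect_disperse(c, v, d, dx, dt, steps):
    """Explicit upwind finite-difference solution of the 1-D
    advection-dispersion equation dc/dt = D d2c/dx2 - v dc/dx
    (v > 0: flow in +x). Boundary cells are held at their initial
    values. Stable for v*dt/dx <= 1 and D*dt/dx^2 <= 1/2."""
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            adv = -v * (c[i] - c[i - 1]) / dx                 # upwind advection
            disp = d * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2  # dispersion
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c
```

Starting from a salt pulse, the profile both translates downstream (advection) and spreads and flattens (dispersion), the two processes the abstract's transport model simulates.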
NASA Technical Reports Server (NTRS)
Dinnat, E. P.; Boutin, J.; Yin, X.; LeVine, D. M.
2014-01-01
Two ongoing space missions share the scientific objective of mapping the global Sea Surface Salinity (SSS), yet their observations show significant discrepancies. ESA's Soil Moisture and Ocean Salinity (SMOS) and NASA's Aquarius use L-band (1.4 GHz) radiometers to measure emission from the sea surface and retrieve SSS. Significant differences in SSS retrieved by both sensors are observed, with SMOS SSS being generally lower than Aquarius SSS, except for very cold waters where SMOS SSS is the highest overall. Figure 1 is an example of the difference between the SSS retrieved by SMOS and Aquarius averaged over one month and 1 degree in longitude and latitude. Differences are mostly between -1 psu and +1 psu (psu, practical salinity unit), with a significant regional and latitudinal dependence. We investigate the impact of the vicarious calibration and retrieval algorithm used by both missions on these differences.
NASA Technical Reports Server (NTRS)
Dinnat, E. P.; Boutin, J.; Yin, X.; Le Vine, D. M.; Waldteufel, P.; Vergely, J. -L.
2014-01-01
Two ongoing space missions share the scientific objective of mapping the global Sea Surface Salinity (SSS), yet their observations show significant discrepancies. ESA's Soil Moisture and Ocean Salinity (SMOS) and NASA's Aquarius use L-band (1.4 GHz) radiometers to measure emission from the sea surface and retrieve SSS. Significant differences in SSS retrieved by both sensors are observed, with SMOS SSS being generally lower than Aquarius SSS, except for very cold waters where SMOS SSS is the highest overall. Figure 1 is an example of the difference between the SSS retrieved by SMOS and Aquarius averaged over one month and 1 degree in longitude and latitude. Differences are mostly between -1 psu and +1 psu (psu, practical salinity unit), with a significant regional and latitudinal dependence. We investigate the impact of the vicarious calibration and some components of the retrieval algorithm used by both missions on these differences.
Consistent Transition of Salinity Retrievals From Aquarius to SMAP
NASA Astrophysics Data System (ADS)
Mears, C. A.; Meissner, T.; Wentz, F. J.; Manaster, A.
2017-12-01
The Aquarius Version 5.0 release in late 2017 has achieved an excellent level of accuracy and significantly mitigated most of the regional and seasonal biases that had been observed in prior releases. The SMAP NASA/RSS Version 2.0 release does not quite yet reach that level of accuracy. Our presentation discusses the necessary steps that need to be undertaken in the upcoming V 3.0 of the SMAP salinity retrieval algorithm to achieve a seamless transition between the salinity products from the two instruments. We also discuss where fundamental differences in the sensors make it difficult to reach complete consistency. In the Aquarius V 4.0 and earlier releases, comparisons with ARGO floats revealed small fresh biases at low latitudes and larger seasonally varying salty biases at high latitudes. These biases have been traced back to inaccuracies in the models that are used for correcting the absorption by atmospheric oxygen and for correcting the wind-induced roughness. The geophysical models have been changed in Aquarius V5.0, which resulted in a significant reduction of these biases. The upcoming SMAP V3 release will implement the same geophysical model. In deriving the changes of the geophysical model, monthly ARGO analyzed fields from Scripps are now being used consistently as reference salinity for both the Aquarius V5.0 and the upcoming SMAP V3.0 releases. Earlier versions had used HYCOM as the reference salinity field. The development of the Aquarius V 5.0 algorithm has already strongly benefited from the full 360° look capability of SMAP. This aided in deriving the correction of the reflected galaxy, which is a strong spurious signal for both sensors. Consistent corrections for the galactic signal are now used for both Aquarius and SMAP. It is also important to filter out rain when developing the GMF and when validating the satellite salinities against in-situ measurements in order to avoid mismatches due to salinity stratification in the upper ocean layer.
One major difference between Aquarius and SMAP is the emissive SMAP mesh antenna. To correct for it, an accurate thermal model of the physical temperature of the SMAP antenna needs to be developed.
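The correction for an emissive (lossy) antenna follows the standard radiometric loss form: the measured brightness temperature is a mixture of the scene and the antenna's own thermal emission. A minimal sketch, assuming a single effective emissivity and physical temperature (the actual SMAP correction relies on a detailed orbital thermal model rather than one scalar temperature):

```python
def correct_emissive_antenna(tb_measured, emissivity, t_phys):
    """Invert the mixing relation
        TB_measured = (1 - e) * TB_scene + e * T_phys
    to recover the scene brightness temperature TB_scene.
    `emissivity` (e) and `t_phys` (K) are assumed known from a
    thermal model; both are effective, not per-element, values."""
    return (tb_measured - emissivity * t_phys) / (1.0 - emissivity)
```

Even a small emissivity matters here: with e = 0.01 and a 300 K antenna, a 100 K ocean scene is biased warm by about 2 K, which is enormous against the roughly 0.5 K per psu sensitivity of L-band TB to salinity.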
Hillmann, Eva R.; DeMarco, Kristin; LaPeyre, Megan K.
2016-01-01
Coastal ecosystems are dynamic and productive areas that are vulnerable to effects of global climate change. Despite their potentially limited spatial extent, submerged aquatic vegetation (SAV) beds function in coastal ecosystems as foundation species, and perform important ecological services. However, limited understanding of the factors controlling SAV distribution and abundance across multiple salinity zones (fresh, intermediate, brackish, and saline) in the northern Gulf of Mexico restricts the ability of models to accurately predict resource availability. We sampled 384 potential coastal SAV sites across the northern Gulf of Mexico in 2013 and 2014, and examined community and species-specific SAV distribution and biomass in relation to year, salinity, turbidity, and water depth. After two years of sampling, 14 species of SAV were documented, with three species (coontail [Ceratophyllum demersum], Eurasian watermilfoil [Myriophyllum spicatum], and widgeon grass [Ruppia maritima]) accounting for 54% of above-ground biomass collected. Salinity and water depth were dominant drivers of species assemblages but had little effect on SAV biomass. Predicted changes in salinity and water depths along the northern Gulf of Mexico coast will likely alter SAV production and species assemblages, shifting to more saline and depth-tolerant assemblages, which in turn may affect habitat and food resources for associated faunal species.
NASA Technical Reports Server (NTRS)
Vernieres, Guillaume Rene Jean; Kovach, Robin M.; Keppenne, Christian L.; Akella, Santharam; Brucker, Ludovic; Dinnat, Emmanuel Phillippe
2014-01-01
Ocean salinity and temperature differences drive thermohaline circulations. These properties also play a key role in the ocean-atmosphere coupling. With the availability of L-band space-borne observations, it becomes possible to provide global scale sea surface salinity (SSS) distribution. This study analyzes globally the along-track (Level 2) Aquarius SSS retrievals obtained using both passive and active L-band observations. Aquarius along-track retrieved SSS are assimilated into the ocean data assimilation component of Version 5 of the Goddard Earth Observing System (GEOS-5) assimilation and forecast model. We present a methodology to correct the large biases and errors apparent in Version 2.0 of the Aquarius SSS retrieval algorithm and map the observed Aquarius SSS retrieval into the ocean model's bulk salinity in the topmost layer. The impact of the assimilation of the corrected SSS on the salinity analysis is evaluated by comparisons with in-situ salinity observations from Argo. The results show a significant reduction of the global biases and RMS of observations-minus-forecast differences at in-situ locations. The most striking results are found in the tropics and southern latitudes. Our results highlight the complementary role and problems that arise during the assimilation of salinity information from in-situ (Argo) and space-borne surface (SSS) observations.
NASA Astrophysics Data System (ADS)
Snedden, Gregg A.; Steyer, Gregory D.
2013-02-01
Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007-Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.
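The MLR step described above maps two hydrologic predictors to a probability for each community type via a multinomial logit. A self-contained sketch with a hand-rolled softmax; the coefficient values in the usage below are placeholders that would come from fitting, not the study's estimates:

```python
import math

def community_probabilities(salinity, tidal_amp, coefs):
    """Multinomial logistic regression of community type on salinity and
    tidal amplitude. `coefs` maps community name -> (intercept,
    b_salinity, b_tidal). Returns a dict of probabilities summing to 1.
    A max-shift is applied before exponentiating for numerical safety."""
    scores = {c: b0 + b1 * salinity + b2 * tidal_amp
              for c, (b0, b1, b2) in coefs.items()}
    m = max(scores.values())
    expd = {c: math.exp(s - m) for c, s in scores.items()}
    z = sum(expd.values())
    return {c: e / z for c, e in expd.items()}
```

Evaluating this on a grid of salinity and tidal amplitude yields exactly the kind of probability surfaces the abstract compares against the CCA ordination.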
NASA Astrophysics Data System (ADS)
Zaib Jadoon, Khan; Umer Altaf, Muhammad; McCabe, Matthew Francis; Hoteit, Ibrahim; Muhammad, Nisar; Moghadas, Davood; Weihermüller, Lutz
2017-10-01
A sound interpretation of electromagnetic induction (EMI) measurements requires quantifying the optimal model parameters and uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In MCMC the posterior distribution is computed using Bayes' rule. An electromagnetic forward model based on the full solution of Maxwell's equations was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD Mini-Explorer. Uncertainty in the parameters of the three-layered earth model is investigated by using synthetic data. Our results show that in the scenario of non-saline soil, the layer-thickness parameters, as compared to the layer electrical conductivities, are not very informative and are therefore difficult to resolve. Application of the proposed MCMC-based inversion to field measurements in a drip irrigation system demonstrates that the parameters of the model can be better estimated for the saline soil than for the non-saline soil, and provides useful insight into parameter uncertainty for the assessment of the model outputs.
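At the core of such an MCMC inversion is a random-walk Metropolis sampler over the layered-earth parameters. The sketch below omits the adaptive proposal tuning mentioned in the abstract and takes an arbitrary log-posterior function in place of the Maxwell-equation forward model:

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=0):
    """Random-walk Metropolis sampler. `log_post` is the log posterior
    of the parameter vector (here it would wrap the EMI forward model
    plus priors); `step` gives per-parameter proposal std deviations.
    Returns the chain of sampled parameter vectors."""
    rng = random.Random(seed)
    x, lp = list(x0), log_post(x0)
    chain = []
    for _ in range(n):
        cand = [xi + rng.gauss(0.0, s) for xi, s in zip(x, step)]
        lp_cand = log_post(cand)
        # accept with probability min(1, exp(lp_cand - lp))
        if math.log(rng.random() + 1e-300) < lp_cand - lp:
            x, lp = cand, lp_cand
        chain.append(x)
    return chain
```

Histograms of the chain (after discarding burn-in) give the marginal posteriors from which the parameter uncertainties discussed above are read off, e.g. broad marginals for layer thicknesses in the non-saline case.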
Zhai, Lu; Jiang, Jiang; DeAngelis, Donald L.; Sternberg, Leonel d.S.L
2016-01-01
Sea level rise and the subsequent intrusion of saline seawater can result in an increase in soil salinity, and potentially cause coastal salinity-intolerant vegetation (for example, hardwood hammocks or pines) to be replaced by salinity-tolerant vegetation (for example, mangroves or salt marshes). Although the vegetation shifts can be easily monitored by satellite imagery, it is hard to predict a particular area or even a particular tree that is vulnerable to such a shift. To find an appropriate indicator for the potential vegetation shift, we incorporated stable isotope 18O abundance as a tracer in various hydrologic components (for example, vadose zone, water table) in a previously published model describing ecosystem shifts between hammock and mangrove communities in southern Florida. Our simulations showed that (1) there was a linear relationship between salinity and the δ18O value in the water table, whereas this relationship was curvilinear in the vadose zone; (2) hammock trees with higher probability of being replaced by mangroves had higher δ18O values of plant stem water, and this difference could be detected 2 years before the trees reached a tipping point, beyond which future replacement became certain; and (3) individuals that were eventually replaced by mangroves from the hammock tree population with a 50% replacement probability had higher stem water δ18O values 3 years before their replacement became certain compared to those from the same population which were not replaced. Overall, these simulation results suggest that it is promising to track the yearly δ18O values of plant stem water in hammock forests to predict impending salinity stress and mortality.
Schoolmaster, Donald; Stagg, Camille L.
2018-01-01
A trade-off between competitive ability and stress tolerance has been hypothesized and empirically supported to explain the zonation of species across stress gradients for a number of systems. Since stress often reduces plant productivity, one might expect a pattern of decreasing productivity across the zones of the stress gradient. However, this pattern is often not observed in coastal wetlands that show patterns of zonation along a salinity gradient. To address the potentially complex relationship between stress, zonation, and productivity in coastal wetlands, we developed a model of plant biomass as a function of resource competition and salinity stress. Analysis of the model confirms the conventional wisdom that a trade-off between competitive ability and stress tolerance is a necessary condition for zonation. It also suggests that a negative relationship between salinity and production can be overcome if (1) the supply of the limiting resource increases with greater salinity stress or (2) nutrient use efficiency increases with increasing salinity. We fit the equilibrium solution of the dynamic model to data from Louisiana coastal wetlands to test its ability to explain patterns of production across the landscape gradient and derive predictions that could be tested with independent data. We found support for a number of the model predictions, including patterns of decreasing competitive ability and increasing nutrient use efficiency across a gradient from freshwater to saline wetlands. In addition to providing a quantitative framework to support the mechanistic hypotheses of zonation, these results suggest that this simple model is a useful platform to further build upon, simulate and test mechanistic hypotheses of more complex patterns and phenomena in coastal wetlands.
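The competitive-ability/stress-tolerance trade-off that the model requires for zonation can be illustrated with classic R* resource-competition theory, assuming salinity stress discounts resource uptake. This is a stand-in for the paper's model, not its actual equations, and all parameter values are hypothetical:

```python
import math

def r_star(species, salinity):
    """Break-even resource level R* = m / (u * exp(-s * salinity)):
    the species with the lowest R* excludes the others at equilibrium.
    u = maximum uptake rate, m = mortality, s = salinity sensitivity."""
    u, m, s = species["uptake"], species["mortality"], species["stress"]
    return m / (u * math.exp(-s * salinity))

def winner(community, salinity):
    """Predicted dominant species at a point on the salinity gradient."""
    return min(community, key=lambda sp: r_star(sp, salinity))
```

A strong competitor (high uptake) that is salinity-sensitive has the lowest R* in fresh conditions, while a weaker but tolerant species takes over beyond some salinity, producing zonation; without the trade-off (one species lowest everywhere) no zonation occurs, matching the model analysis above.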
Specialization of Bacillus in the Geochemically Challenged Environment of Death Valley
NASA Astrophysics Data System (ADS)
Kopac, S.
2014-04-01
Death Valley is the hottest, driest place in North America, a desert with soils containing toxic elements such as boron and lead. While most organisms are unable to survive under these conditions, a diverse community of bacteria survives here. What has enabled bacteria to adapt and thrive in a plethora of extreme and stressful environments where other organisms are unable to grow? The unique environmental adaptations that distinguish ecologically distinct bacterial groups (ecotypes) remain a mystery, in contrast to many animal species (perhaps most notably Darwin's ecologically distinct finch species). We resolve the ecological factors associated with recently diverged ecotypes of the soil bacteria Bacillus subtilis and Bacillus licheniformis, isolated from the dry, geochemically challenging soils of Death Valley, CA. To investigate speciation associated with challenging environmental parameters, we sampled soil transects along a 400 m stretch that parallels a decrease in salinity adjacent to a salt flat; transects also encompass gradients in soil B, Cu, Fe, NO3, and P, all of which were quantified in our soil samples. We demarcated strains using Ecotype Simulation, a sequence-based algorithm. Each ecotype's habitat associations were determined with respect to salinity, B, Cu, Fe, NO3, and P. In addition, our sample strains were tested for tolerance of copper, boron and salinity (all known to inhibit growth at high concentrations) by comparing their growth over a 20-hour period. Ecotypes differed in their habitat associations with salinity, boron, copper, iron, and other ecological factors; these environmental dimensions are likely causing speciation of B. subtilis-licheniformis ecotypes at our sample site. Strains also differed in tolerance of boron and copper, providing evidence that our sequence-based demarcations reflect real differences in metabolism.
By better understanding the relationship between bacterial speciation and the environment, we can begin to predict the habitability of unexplored extreme and extra-Earth environments.
ERIC Educational Resources Information Center
Fofonoff, N. P.; Millard, R. C., Jr.
Algorithms for computation of fundamental properties of seawater, based on the practical salinity scale (PSS-78) and the international equation of state for seawater (EOS-80), are compiled in the present report for implementing and standardizing computer programs for oceanographic data processing. Sample FORTRAN subprograms and tables are given…
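The core PSS-78 conversion from conductivity ratio to practical salinity can be sketched compactly. Below is a minimal Python version using the standard PSS-78 coefficients at zero pressure (the pressure correction to the conductivity ratio is omitted here):

```python
# PSS-78 coefficients at zero pressure: A gives salinity at 15 deg C,
# B the temperature correction, K its denominator constant.
A = [0.0080, -0.1692, 25.3851, 14.0941, -7.0261, 2.7081]
B = [0.0005, -0.0056, -0.0066, -0.0375, 0.0636, -0.0144]
K = 0.0162

def practical_salinity(rt, t):
    """Practical salinity from the conductivity ratio rt at temperature
    t (deg C); valid for 2 <= S <= 42, pressure correction omitted."""
    s = sum(a * rt ** (i / 2.0) for i, a in enumerate(A))
    ds = (t - 15.0) / (1.0 + K * (t - 15.0))
    ds *= sum(b * rt ** (i / 2.0) for i, b in enumerate(B))
    return s + ds

# By construction, rt = 1 at 15 deg C corresponds to S = 35
print(round(practical_salinity(1.0, 15.0), 4))
```

A conductivity ratio of 1.0 at 15 °C yielding S = 35 is a convenient sanity check, since the A coefficients sum to exactly 35 and the B coefficients sum to zero.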
Guo, Yan; Huang, Jingyi; Shi, Zhou; Li, Hongyi
2015-01-01
In coastal China, there is an urgent need to increase land area for agricultural production and urban development, where the population is growing rapidly. One solution is land reclamation from coastal tidelands, but soil salinization is problematic. As such, it is very important to characterize and map the within-field variability of soil salinity in space and time. Conventional methods are often time-consuming, expensive, labor-intensive, and impractical. Fortunately, proximal sensing has become an important technology for characterizing within-field spatial variability. In this study, we employed the EM38 to study the spatial variability of soil salinity in a coastal paddy field. A significant correlation between ECa and EC1:5 (r > 0.9) allowed us to use EM38 data to characterize the spatial variability of soil salinity. Geostatistical methods were used to determine the horizontal spatio-temporal variability of soil salinity over three consecutive years. The study found that the distribution of salinity was heterogeneous and that the leaching of salts was more pronounced at the edges of the study field. By inverting the EM38 data with a quasi-3D inversion algorithm, the vertical spatio-temporal variability of soil salinity was determined and the leaching of salts over time was easily identified. The methodology of this study can serve as guidance for researchers interested in understanding soil salinity development, as well as for land managers aiming for effective soil salinity monitoring and management practices. To better characterize variations in soil salinity deeper in the soil profile, the deeper mode of the EM38 (i.e., EM38v) as well as other EMI instruments (e.g., DUALEM-421) can be incorporated to conduct quasi-3D inversions for deeper soil profiles. PMID:26020969
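The ECa-EC1:5 calibration underpinning this approach is a simple linear regression with a correlation check; a minimal sketch with made-up paired readings (the units and values below are illustrative, not the study's data):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and Pearson r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

# Hypothetical paired readings: EM38 ECa (mS/m) vs laboratory EC1:5 (dS/m)
eca = [20.0, 35.0, 50.0, 65.0, 80.0]
ec15 = [0.21, 0.36, 0.48, 0.66, 0.79]
slope, intercept, r = linear_fit(eca, ec15)
```

With r above the 0.9 threshold cited in the abstract, the fitted line can then convert ECa maps into estimated EC1:5 maps.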
A Particle Swarm Optimization-Based Approach with Local Search for Predicting Protein Folding.
Yang, Cheng-Hong; Lin, Yu-Shiun; Chuang, Li-Yeh; Chang, Hsueh-Wei
2017-10-01
The hydrophobic-polar (HP) model is commonly used for predicting protein folding structures and hydrophobic interactions. This study developed a particle swarm optimization (PSO)-based algorithm combined with local search algorithms; specifically, the high exploration PSO (HEPSO) algorithm (which can execute global search processes) was combined with three local search algorithms (hill-climbing algorithm, greedy algorithm, and Tabu table), yielding the proposed HE-L-PSO algorithm. By using 20 known protein structures, we evaluated the performance of the HE-L-PSO algorithm in predicting protein folding in the HP model. The proposed HE-L-PSO algorithm exhibited favorable performance in predicting both short and long amino acid sequences with high reproducibility and stability, compared with seven reported algorithms. The HE-L-PSO algorithm yielded optimal solutions for all predicted protein folding structures. All HE-L-PSO-predicted protein folding structures possessed a hydrophobic core that is similar to normal protein folding.
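In the HP model, the fitness of a candidate fold is scored by counting hydrophobic-hydrophobic contacts between residues that are lattice neighbours but not chain neighbours, each contributing -1 to the energy. A minimal 2D-lattice energy function (an illustration of the model itself, not the HE-L-PSO implementation):

```python
def hp_energy(sequence, coords):
    """Energy of a 2D-lattice fold: -1 per H-H contact between residues
    that are grid neighbours but not adjacent along the chain."""
    pos = {c: i for i, c in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for nb in ((x + 1, y), (x, y + 1)):  # each contact counted once
            j = pos.get(nb)
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                energy -= 1
    return energy

# A 2x2 square fold of "HHHH": one non-bonded H-H contact -> energy -1
print(hp_energy("HHHH", [(0, 0), (1, 0), (1, 1), (0, 1)]))
```

A search algorithm such as PSO then explores the space of self-avoiding folds, seeking the conformation that minimizes this energy.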
Wang, Ya-Nan; Wang, Hui; Zhu, Xiao-Wen; Luo, Ming-Ming; Liu, Zhi-Gang; Du, Xiao-Dong
2012-12-01
Using a central composite experimental design and response surface methodology, the joint effects of water temperature (16-40 degrees C) and salinity (10-50) on the expression of the gill Hsp70 gene in Pinctada martensii (Dunker) were studied under laboratory conditions. The results showed that the linear and quadratic effects of temperature on the expression of the gill Hsp70 gene were significant; the linear effect of salinity was not significant, while its quadratic effect was. The interactive effect of temperature and salinity was not significant, and the effect of temperature was greater than that of salinity. A model equation for gill Hsp70 gene expression was established, with R2, Adj. R2, and Pred. R2 as high as 98.7%, 97.4%, and 89.2%, respectively, suggesting that the overall predictive capability of the model was very satisfactory and that it could be applied in practice. Through optimization of the model, the expression of the gill Hsp70 gene reached its minimum (0.5276) at a temperature of 26.78 degrees C and a salinity of 29.33, with a desirability value of 98%. These experimental results offer a theoretical reference for the high expression of the gill Hsp70 gene in P. martensii, the maintenance of a stable cellular internal environment, and the enhancement of stress resistance in P. martensii.
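The fitted model has the standard second-order response-surface form y = b0 + b1*T + b2*S + b11*T^2 + b22*S^2 + b12*T*S, and the reported optimum comes from minimizing it over the experimental region. A sketch with invented coefficients (the paper's fitted coefficients are not given in the abstract):

```python
def rsm(temp, sal, b=(9.0, -0.5, -0.1, 0.0094, 0.0017, 0.0)):
    """Second-order response-surface model
    y = b0 + b1*T + b2*S + b11*T^2 + b22*S^2 + b12*T*S.
    The coefficients here are made up for illustration only."""
    b0, b1, b2, b11, b22, b12 = b
    return (b0 + b1 * temp + b2 * sal
            + b11 * temp ** 2 + b22 * sal ** 2 + b12 * temp * sal)

# Grid search for the minimum over the experimental region (16-40 C, 10-50)
best = min((rsm(t, s), t, s) for t in range(16, 41) for s in range(10, 51))
```

With these illustrative coefficients the grid minimum lands near the middle of both ranges, mirroring the interior optimum (26.78 °C, salinity 29.33) reported in the study.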
NASA Astrophysics Data System (ADS)
Zawawi, M. H.; Zahar, M. F.; Hashim, M. M. M.; Hazreek, Z. A. M.; Zahari, N. M.; Kamaruddin, M. A.
2018-04-01
Saline water intrusion is a serious threat to groundwater, as many parts of the world rely on groundwater as their main source of fresh water. Drinking water with a high salinity level can pose a serious health hazard to humans. Saline water intrusion is the process by which seawater is induced to flow into freshwater aquifers along the coast. It can result from human action and/or natural events, and climate change and sea-level rise may accelerate it. The conventional method for detecting and monitoring saltwater intrusion into groundwater along coastal aquifers is to collect and test groundwater from a series of observation wells (boreholes), with the goal of providing the hydrochemical data needed to decide whether the water in a well is safe to consume. An integrated approach of field and laboratory electrical resistivity investigation is proposed for delineating the contact region between saline and fresh groundwater. It was found that the calibrations for both soil boxes produced almost identical curvilinear trends for 2% increments of seawater tested using sand samples. This project contributes towards predicting saline water intrusion into groundwater with a non-destructive test that can replace the conventional method of groundwater monitoring using a series of boreholes in the coastal area.
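Resistivity surveys can discriminate saline from fresh groundwater because bulk resistivity falls sharply with pore-water salinity; Archie's law is the classic relation. A sketch with assumed empirical constants (a = 1, m = 2 for a clean sand; these are illustrative, not the study's soil-box calibration):

```python
def bulk_resistivity(rho_w, porosity, a=1.0, m=2.0):
    """Archie's law for a clean, fully saturated sand:
    rho_bulk = a * rho_w / porosity**m  (a, m are empirical constants)."""
    return a * rho_w / porosity ** m

# Seawater (~0.2 ohm-m) vs fresh water (~20 ohm-m) in a 30%-porosity sand
saline = bulk_resistivity(0.2, 0.3)
fresh = bulk_resistivity(20.0, 0.3)
```

Because bulk resistivity scales directly with pore-water resistivity, the two-order-of-magnitude contrast between seawater and fresh water carries straight through to the measured section, which is what makes the saline/fresh contact visible in an ERT profile.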
Development of predictive mapping techniques for soil survey and salinity mapping
NASA Astrophysics Data System (ADS)
Elnaggar, Abdelhamid A.
Conventional soil maps are a valuable source of information about soil characteristics; however, they are subjective, very expensive, and time-consuming to prepare. They also include no explicit information about the conceptual mental model used in developing them, nor about their accuracy and associated error. Decision tree analysis (DTA) was successfully used to retrieve the expert knowledge embedded in old soil survey data. This knowledge was efficiently used in developing predictive soil maps for the study areas in Benton and Malheur Counties, Oregon, and in assessing their consistency. A soil-landscape model retrieved from a reference area in Harney County was extrapolated to develop a preliminary soil map for the neighboring unmapped part of Malheur County. The resulting map had low prediction accuracy, and only a few soil map units (SMUs) were predicted with significant accuracy, mostly shallow SMUs that either have a lithic contact with the bedrock or developed on a duripan. On the other hand, the soil map developed from field data was predicted with very high accuracy (overall about 97%). Salt-affected areas of the Malheur County study area are indicated by their high spectral reflectance and are easily discriminated in the remote sensing data. However, remote sensing data fail to distinguish between the different classes of soil salinity. Using the DTA method, five classes of soil salinity were successfully predicted with an overall accuracy of about 99%. Moreover, the area of salt-affected soil was overestimated when mapped using remote sensing data compared with that predicted using DTA. Hence, DTA can be a very helpful approach for developing soil survey and soil salinity maps in a more objective, effective, less expensive, and quicker way based on field data.
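Decision tree analysis works by recursively splitting the data on the predictor threshold that best separates the classes. A minimal one-split "stump" illustrates the core step (a toy sketch, not the DTA software used in the study):

```python
def majority(labels):
    """Most common label in a list (None for an empty list)."""
    return max(set(labels), key=labels.count) if labels else None

def best_stump(values, labels):
    """One-split decision stump: pick the threshold on a single predictor
    that minimizes misclassification when each side predicts its
    majority class."""
    best_err, best_t = None, None
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        err = sum(l != majority(left) for l in left)
        err += sum(l != majority(right) for l in right)
        if best_err is None or err < best_err:
            best_err, best_t = err, t
    return best_t

# Toy data: low "reflectance" -> class 0, high -> class 1; split found at 3
print(best_stump([1, 2, 3, 10, 11, 12], [0, 0, 0, 1, 1, 1]))
```

A full decision tree repeats this search on each resulting subset, over all available predictors, until the leaves are sufficiently pure.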
Liu, Ya; Pan, Xianzhang; Wang, Changkun; Li, Yanli; Shi, Rongjie
2015-01-01
Robust models for predicting soil salinity that use visible and near-infrared (vis–NIR) reflectance spectroscopy are needed to better quantify soil salinity in agricultural fields. Currently available models are not sufficiently robust for variable soil moisture contents. Thus, we used external parameter orthogonalization (EPO), which effectively projects spectra onto the subspace orthogonal to unwanted variation, to remove the variations caused by an external factor, e.g., the influences of soil moisture on spectral reflectance. In this study, 570 spectra between 380 and 2400 nm were obtained from soils with various soil moisture contents and salt concentrations in the laboratory; 3 soil types × 10 salt concentrations × 19 soil moisture levels were used. To examine the effectiveness of EPO, we compared the partial least squares regression (PLSR) results established from spectra with and without EPO correction. The EPO method effectively removed the effects of moisture, and the accuracy and robustness of the soil salt contents (SSCs) prediction model, which was built using the EPO-corrected spectra under various soil moisture conditions, were significantly improved relative to the spectra without EPO correction. This study contributes to the removal of soil moisture effects from soil salinity estimations when using vis–NIR reflectance spectroscopy and can assist others in quantifying soil salinity in the future. PMID:26468645
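EPO itself is a linear projection: from difference spectra D that capture only moisture-induced variation, it builds P = I - Vs Vs^T (with Vs the leading right singular vectors of D) and multiplies every spectrum by P. A minimal NumPy sketch on synthetic 3-band "spectra" (real use would apply it to the 380-2400 nm spectra):

```python
import numpy as np

def epo_projection(D, n_components=1):
    """Projection matrix P = I - Vs Vs^T, where Vs spans the leading
    right singular vectors of the moisture-difference spectra D."""
    _, _, vt = np.linalg.svd(D, full_matrices=False)
    vs = vt[:n_components].T
    return np.eye(D.shape[1]) - vs @ vs.T

# Synthetic example: moisture adds a fixed direction d to every spectrum
d = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
D = np.outer([0.2, 0.5, 0.9], d)        # difference spectra, all along d
P = epo_projection(D)
print(np.allclose(P @ d, 0.0))           # moisture direction removed
```

Spectral components orthogonal to the moisture subspace pass through P unchanged, which is why a PLSR model built on the corrected spectra remains sensitive to salt content.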
Snedden, Gregg A.; Steyer, Gregory D.
2013-01-01
Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007–Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.
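The weighted kappa reported above is computed from the confusion matrix of predicted versus observed community types; a sketch with linear disagreement weights (the abstract does not state which weighting scheme was used):

```python
def weighted_kappa(matrix):
    """Linear-weighted Cohen's kappa for a k x k confusion matrix
    (rows = observed, columns = predicted)."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)          # linear disagreement weight
            num += w * matrix[i][j]
            den += w * row_tot[i] * col_tot[j] / n
    return 1.0 - num / den

# Perfect agreement on 3 classes -> kappa = 1
print(weighted_kappa([[5, 0, 0], [0, 3, 0], [0, 0, 2]]))
```

Values near 1 indicate agreement well above chance; a kappa of 0.7, as in the study, is conventionally read as good agreement.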
Liska, Adam J; Shevchenko, Andrej; Pick, Uri; Katz, Adriana
2004-09-01
Salinity is a major limiting factor for the proliferation of plants and inhibits central metabolic activities such as photosynthesis. The halotolerant green alga Dunaliella can adapt to hypersaline environments and is considered a model photosynthetic organism for salinity tolerance. To clarify the molecular basis of salinity tolerance, a proteomic approach was applied to identify salt-induced proteins in Dunaliella. Seventy-six salt-induced proteins were selected from two-dimensional gel separations of different subcellular fractions and analyzed by mass spectrometry (MS). Application of nanoelectrospray mass spectrometry, combined with the sequence-similarity database-searching algorithms MS BLAST and MultiTag, enabled identification of 80% of the salt-induced proteins. Salinity stress up-regulated key enzymes in the Calvin cycle, starch mobilization, and redox energy production; regulatory factors in protein biosynthesis and degradation; and a homolog of a bacterial Na+-redox transporter. The results indicate that Dunaliella responds to high salinity by enhancing photosynthetic CO2 assimilation and by diverting carbon and energy resources to the synthesis of glycerol, the osmotic element in Dunaliella. The ability of Dunaliella to enhance photosynthetic activity at high salinity is remarkable because, in most plants and cyanobacteria, salt stress inhibits photosynthesis. The results demonstrate the power of MS BLAST searches for identifying proteins in organisms whose genomes are not known and pave the way for dissecting the molecular mechanisms of salinity tolerance in algae and higher plants.
NASA Astrophysics Data System (ADS)
Xu, Z.; Hu, B.
2017-12-01
Interest in predicting seawater intrusion and salinity distribution in the Woodville Karst Plain (WKP) has increased because of serious challenges to drinking-water quality and related environmental problems. Seawater intrudes into the conduit system through submarine karst caves at Spring Creek Spring because of density differences and sea-level rise; low salinity has now been detected at Wakulla Spring, 18 km from the coastline. The groundwater discharge at the two major springs and the salinity distribution in this area are controlled by seawater/freshwater interaction under different rainfall conditions: during low-rainfall periods, seawater flows into the submarine spring through karst windows, and the rising salinity at the submarine spring drives seawater further into the conduit system; during high-rainfall periods, seawater is pushed out by freshwater discharge at the submarine spring. Previous numerical studies of the WKP focused mainly on density-independent transport modeling and seawater/freshwater discharge at the major karst springs. In this study, a SEAWAT model was developed to fully investigate the salinity distribution in the WKP under repeating phases of low- and high-rainfall periods; the conduit system was simulated as a porous medium with high conductivity and porosity. Precipitation, salinity, and discharge at the springs were used to calibrate the model. The results showed that the salinity distribution in the porous media and the conduit system is controlled by rainfall variation; in general, salinity extends higher and farther inland under low-rainfall conditions than under high-rainfall conditions. The results offer a prediction of the environmental problems caused by seawater intrusion in a karst coastal aquifer and provide a visual and scientific basis for future groundwater remediation.
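Variable-density codes such as SEAWAT couple flow and transport through a linear density-concentration relation; SEAWAT's documented default slope is about 0.7143 (kg/m3 of density per kg/m3 of TDS). A one-line sketch of that relation:

```python
def fluid_density(conc, rho_fresh=1000.0, drho_dc=0.7143):
    """Linear density model used in variable-density codes such as SEAWAT:
    rho = rho_fresh + (d rho / d C) * C, with C as TDS in kg/m^3."""
    return rho_fresh + drho_dc * conc

# Seawater at ~35 kg/m^3 TDS -> ~1025 kg/m^3
print(round(fluid_density(35.0), 1))
```

It is this density contrast between ~1025 kg/m3 seawater and ~1000 kg/m3 fresh water that drives the intrusion through the submarine caves during low-rainfall periods.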
The Predictability of Near-Coastal Currents Using a Baroclinic Unstructured Grid Model
2011-12-28
…baroclinic simulations. ADCIRC solves the time-dependent scalar transport equation for salinity and temperature. Through the equation of state…described by McDougall et al. (2003), ADCIRC uses the temperature, salinity, and pressure in determining the density field. In order to avoid spurious…model. 2.3 Initialization and boundary forcing: Temperature, salinity, elevation, and velocity fields from a regional ocean model are needed both to
Minimalist ensemble algorithms for genome-wide protein localization prediction.
Lin, Jhih-Rong; Mondal, Ananda Mohan; Liu, Rong; Hu, Jianjun
2012-07-03
Computational prediction of protein subcellular localization can greatly help to elucidate its functions. Despite the existence of dozens of protein localization prediction algorithms, the prediction accuracy and coverage are still low. Several ensemble algorithms have been proposed to improve the prediction performance, which usually include as many as 10 or more individual localization algorithms. However, their performance is still limited by the running complexity and redundancy among individual prediction algorithms. This paper proposed a novel method for rational design of minimalist ensemble algorithms for practical genome-wide protein subcellular localization prediction. The algorithm is based on combining a feature selection based filter and a logistic regression classifier. Using a novel concept of contribution scores, we analyzed issues of algorithm redundancy, consensus mistakes, and algorithm complementarity in designing ensemble algorithms. We applied the proposed minimalist logistic regression (LR) ensemble algorithm to two genome-wide datasets of Yeast and Human and compared its performance with current ensemble algorithms. Experimental results showed that the minimalist ensemble algorithm can achieve high prediction accuracy with only 1/3 to 1/2 of individual predictors of current ensemble algorithms, which greatly reduces computational complexity and running time. It was found that the high performance ensemble algorithms are usually composed of the predictors that together cover most of available features. Compared to the best individual predictor, our ensemble algorithm improved the prediction accuracy from AUC score of 0.558 to 0.707 for the Yeast dataset and from 0.628 to 0.646 for the Human dataset. Compared with popular weighted voting based ensemble algorithms, our classifier-based ensemble algorithms achieved much better performance without suffering from inclusion of too many individual predictors. 
We proposed a method for rational design of minimalist ensemble algorithms using feature selection and classifiers. The proposed minimalist ensemble algorithm based on logistic regression can achieve equal or better prediction performance while using only half or one-third of individual predictors compared to other ensemble algorithms. The results also suggested that meta-predictors that take advantage of a variety of features by combining individual predictors tend to achieve the best performance. The LR ensemble server and related benchmark datasets are available at http://mleg.cse.sc.edu/LRensemble/cgi-bin/predict.cgi.
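The classifier-based ensemble idea can be sketched as logistic regression over the individual predictors' binary outputs; a toy gradient-descent version (an illustration of the stacking concept, not the paper's LR ensemble implementation):

```python
import math

def train_lr_ensemble(preds, labels, lr=0.5, epochs=200):
    """Learn weights over individual predictors' binary outputs with
    plain stochastic-gradient-descent logistic regression."""
    w = [0.0] * len(preds[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(preds, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))
            err = y - p                     # gradient of the log-loss
            b += lr * err
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w, b

def predict(w, b, x):
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Toy data: predictor 0 is informative, predictor 1 is noise
preds = [[1, 0], [1, 1], [0, 0], [0, 1], [1, 0], [0, 1]]
labels = [1, 1, 0, 0, 1, 0]
w, b = train_lr_ensemble(preds, labels)
```

The learned weights play the role of the paper's contribution scores: an uninformative predictor receives a weight near zero and can be dropped, which is how a minimalist ensemble is obtained.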
Combined effects of seawater acidification and salinity changes in Ruditapes philippinarum.
Velez, Catia; Figueira, Etelvina; Soares, Amadeu M V M; Freitas, Rosa
2016-07-01
Due to human activities, predictions for the coming years indicate increasing frequency and intensity of extreme weather events (rainy and drought periods) and pollution levels, leading to salinity shifts and ocean acidification. Therefore, several authors have assessed the effects of seawater salinity shifts and pH decrease on marine bivalves, but most of these studies evaluated the impacts of both factors independently. Since pH and salinity may act together in the environment, and their impacts may differ from their effects when acting alone, there is an urgent need to increase our knowledge of these environmental changes acting in combination. Thus, the present study assessed the effects of seawater acidification and salinity changes, both acting alone and in combination, on the physiological (condition index, Na and K concentrations) and biochemical (oxidative stress related biomarkers) performance of Ruditapes philippinarum. For that, specimens of R. philippinarum were exposed for 28 days to combinations of different pH levels (7.8 and 7.3) and salinities (14, 28 and 35). The results obtained showed that under control pH (7.8) and low salinity (14) the physiological status and biochemical performance of clams were negatively affected, revealing oxidative stress. However, under the same pH and at salinities 28 and 35, clams were able to maintain/regulate their physiological status and biochemical performance. Moreover, our findings showed that clams under low pH (7.3) and different salinities were able to maintain their physiological status and biochemical performance, suggesting that the low pH tested may mask the negative effects of salinity. Our results further demonstrated that, in general, at each salinity, similar physiological and biochemical responses were found in clams under both tested pH levels. Also, individuals under low pH (salinities 14, 28 and 35) and exposed to pH 7.8 and salinity 28 (control) tend to present a similar response pattern.
These results indicate that pH may have a lower impact on clams than salinity. Thus, our findings point out that the predicted increase of CO2 in seawater and consequent seawater acidification will have fewer impacts on physiological and biochemical performance of R. philippinarum clams than salinity shifts.
Damasceno, Évila Pinheiro; de Figuerêdo, Lívia Pitombeira; Pimentel, Marcionília Fernandes; Loureiro, Susana; Costa-Lotufo, Letícia Veras
2017-08-01
Few studies have examined the toxicity of metal mixtures to marine organisms exposed to different salinities. The aim of the present study was to investigate the acute toxicity of zinc and nickel exposures, singly and in combination, to Artemia sp. at salinities of 10, 17, and 35 psu. The mixture concentrations were determined according to individual toxic units (TUs) to follow a fixed-ratio design. Zinc was more toxic than nickel, and both individual toxicities were higher at lower salinities. These changes in toxicity are better explained by the Biotic Ligand Model (BLM) than by metal speciation. To analyze the mixture effect, the observed data were compared with the expected mixture effects predicted by the concentration addition (CA) model and by deviations for synergistic/antagonistic interactions and dose-level and dose-ratio dependencies. At a salinity of 35 psu, the mixture showed no deviations; the effects were therefore additive. After decreasing the salinity to 17 psu, the toxicity pattern changed to antagonism at low concentrations and synergism at higher equivalent LC50 levels. At the lowest salinity tested (10 psu), antagonism was observed. The speciation of both metals was similar in the mixture and in isolation, and the changes in toxicity patterns are related more to the organism's physiology than to metal speciation. Therefore, besides considering chemical interactions in real-world scenarios, where several chemicals can be present, the influence of abiotic factors, such as salinity, should also be considered.
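Under the concentration-addition (CA) model used here, each metal contributes its toxic units TU_i = C_i / LC50_i, and the mixture is expected to reach the LC50-level effect when the TUs sum to 1. A minimal sketch:

```python
def toxic_units(concs, lc50s):
    """Toxic units per chemical: TU_i = C_i / LC50_i."""
    return [c / l for c, l in zip(concs, lc50s)]

def ca_additive(concs, lc50s):
    """Concentration addition: the mixture is expected to produce the
    LC50-equivalent effect when the toxic units sum to at least 1."""
    return sum(toxic_units(concs, lc50s)) >= 1.0

# Half an LC50 of zinc plus half an LC50 of nickel -> additive LC50 effect
print(ca_additive([0.5, 1.0], [1.0, 2.0]))
```

Synergism and antagonism are then diagnosed as observed effects that are, respectively, stronger or weaker than this additive expectation at the same summed TUs.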
NASA Astrophysics Data System (ADS)
Lu, C.; Zhang, C.; Huang, H.; Johnson, T.
2012-12-01
Geological sequestration of carbon dioxide (CO2) in the subsurface has been considered as one solution for reducing greenhouse gas emissions to the atmosphere. A successful sequestration process requires efficient and adequate monitoring of the injected fluids as they migrate into the aquifer, to evaluate flow paths, leakage, and geochemical interactions between CO2 and the geologic media. In this synthetic field-scale study, we integrated the 3D multiphase flow modeling code PFLOTRAN with 3D time-lapse electrical resistivity tomography (ERT) to gain insight into the movement of supercritical (SC) CO2 plumes in a deep saline aquifer and the associated brine intrusion into a shallower fresh water aquifer. A parallel ERT forward and inverse modeling package was introduced, and the related algorithms are briefly described. The capabilities and limitations of ERT in monitoring CO2 migration are assessed by comparing the results of the PFLOTRAN simulations with the ERT inversion results. In general, our study shows that the ERT inversion results compare well with PFLOTRAN, with reasonable discrepancies, indicating that ERT can capture the actual CO2 plume dynamics and brine intrusion. Detailed comparisons of the location, size, and volume of the CO2 plume show that the ERT method underestimated the plume area and overestimated the total plume volume in predictions of SC CO2 movement. These comparisons also show that the ERT method consistently overestimated the salt intrusion area and underestimated the total solute amount in predictions of brine intrusion. Our study shows that, together with other geochemical and geophysical methods, ERT is a potentially useful monitoring tool for detecting SC CO2 and formation fluid migration.
Predicting the survival of diabetes using neural network
NASA Astrophysics Data System (ADS)
Mamuda, Mamman; Sathasivam, Saratha
2017-08-01
Data mining techniques are nowadays used to predict diseases in the health care industry. The neural network is one of the prevailing data mining methods in the intelligent-systems field for predicting diseases in health care. This paper presents a study on predicting the survival of diabetes patients using different supervised learning algorithms for neural networks. Three learning algorithms are considered in this study: (i) the Levenberg-Marquardt learning algorithm, (ii) the Bayesian regularization learning algorithm, and (iii) the scaled conjugate gradient learning algorithm. The network is trained on the Pima Indian Diabetes Dataset with the help of MATLAB R2014a software. The performance of each algorithm is further discussed through regression analysis. The prediction accuracy of the best algorithm is then computed to validate the accuracy of the prediction.
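Levenberg-Marquardt, the first algorithm listed, blends gradient descent with Gauss-Newton via a damping term lambda: step = (J^T J + lambda)^(-1) J^T r. A minimal single-parameter sketch fitting y = a*x (purely illustrative; MATLAB's trainlm operates on full network Jacobians):

```python
def levenberg_marquardt(xs, ys, a=0.0, lam=1e-3, iters=50):
    """Fit y = a*x by minimizing squared residuals with LM damping:
    step = (J^T J + lam)^(-1) * J^T r, where J_i = x_i for this model."""
    for _ in range(iters):
        r = [y - a * x for x, y in zip(xs, ys)]       # residuals
        jtj = sum(x * x for x in xs)                   # J^T J (scalar here)
        jtr = sum(x * ri for x, ri in zip(xs, r))      # J^T r
        new_a = a + jtr / (jtj + lam)
        new_err = sum((y - new_a * x) ** 2 for x, y in zip(xs, ys))
        if new_err < sum(ri ** 2 for ri in r):
            a, lam = new_a, lam / 10.0    # accept step, relax damping
        else:
            lam *= 10.0                   # reject step, raise damping
    return a

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # roughly y = 2x
print(round(levenberg_marquardt(xs, ys), 2))
```

Small lambda makes the update approach the fast Gauss-Newton step; large lambda makes it a cautious gradient-descent step, which is what gives LM its robustness on least-squares training problems.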
Klein, R; Adler, A; Beanlands, R S; deKemp, R A
2004-01-01
A rubidium-82 (82Rb) elution system is described for use with clinical positron emission tomography. The system is self-calibrating with 1.4% repeatability, independent of generator activity and elution flow rate. Saline flow is switched between an 82Sr/82Rb generator and a bypass line to achieve a constant-activity elution of 82Rb. In the present study, pulse width modulation (PWM) of a solenoid valve is compared to simple threshold control as a means to simulate a proportional valve. A predictive-corrective control algorithm is developed which produces a constant-activity elution within the constraints of a long feedback delay and a short elution time. Accurate constant-activity elutions of 10-70% of the total generator activity were demonstrated using the threshold comparison control. The adaptive-corrective control of the PWM valve provided a substantial improvement in the precision of the steady-state output.
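The PWM approach approximates a proportional valve by modulating a binary solenoid's duty cycle so that the time-averaged flow matches the target; a minimal sketch (the timing logic here is illustrative, not the paper's controller):

```python
def pwm_schedule(target_flow, full_flow, period=1.0):
    """On/off times per PWM period so the mean flow equals target_flow."""
    duty = max(0.0, min(1.0, target_flow / full_flow))
    return duty * period, (1.0 - duty) * period

def mean_flow(target_flow, full_flow, period=1.0):
    """Time-averaged flow delivered by the on/off schedule."""
    on, off = pwm_schedule(target_flow, full_flow, period)
    return full_flow * on / (on + off)

# A 30%-of-full-scale target is met by a 30% duty cycle
print(round(mean_flow(30.0, 100.0), 6))
```

A real controller must additionally correct the duty cycle from delayed activity feedback, which is the role of the predictive-corrective algorithm in the paper.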
Lowe, Michael R; Wu, Wei; Peterson, Mark S; Brown-Peterson, Nancy J; Slack, William T; Schofield, Pamela J
2012-01-01
Understanding the fundamental niche of invasive species facilitates our ability to predict both dispersal patterns and invasion success and therefore provides the basis for better-informed conservation and management policies. Here we focus on Nile tilapia (Oreochromis niloticus Linnaeus, 1758), one of the most widely cultured fish worldwide and a species that has escaped local aquaculture facilities to become established in a coastal-draining river in Mississippi (northern Gulf of Mexico). Using empirical physiological data, logistic regression models were developed to predict the probabilities of Nile tilapia survival, growth, and reproduction at different combinations of temperature (14 and 30°C) and salinity (0-60, by increments of 10). These predictive models were combined with kriged seasonal salinity data derived from multiple long-term data sets to project the species' fundamental niche in Mississippi coastal waters during normal salinity years (averaged across all years) and salinity patterns in extremely wet and dry years (which might emerge more frequently under scenarios of climate change). The derived fundamental niche projections showed that during the summer, Nile tilapia is capable of surviving throughout Mississippi's coastal waters but growth and reproduction were limited to river mouths (or upriver). Overwinter survival was also limited to river mouths. The areas where Nile tilapia could survive, grow, and reproduce increased during extremely wet years (2-368%) and decreased during extremely dry years (86-92%) in the summer with a similar pattern holding for overwinter survival. These results indicate that Nile tilapia is capable of 1) using saline waters to gain access to other watersheds throughout the region and 2) establishing populations in nearshore, low-salinity waters, particularly in the western portion of coastal Mississippi.
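Each survival/growth/reproduction model takes the logistic form p = 1 / (1 + e^-(b0 + b1*T + b2*S)); the coefficients below are invented for illustration (the fitted values are not given in the abstract):

```python
import math

def survival_probability(temp, salinity, b0=-6.0, b1=0.5, b2=-0.08):
    """Logistic model of survival probability vs. temperature (deg C)
    and salinity; the coefficients here are purely illustrative."""
    z = b0 + b1 * temp + b2 * salinity
    return 1.0 / (1.0 + math.exp(-z))

# Warm, low-salinity water should score higher than cold, hypersaline water
print(survival_probability(30, 10) > survival_probability(14, 60))
```

Evaluating such a model over a kriged salinity surface, cell by cell, is what produces the fundamental-niche probability maps described in the study.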
Inorganic fouling mitigation by salinity cycling in batch reverse osmosis.
Warsinger, David M; Tow, Emily W; Maswadeh, Laith A; Connors, Grace B; Swaminathan, Jaichander; Lienhard V, John H
2018-06-15
Enhanced fouling resistance has been observed in recent variants of reverse osmosis (RO) desalination which use time-varying batch or semi-batch processes, such as closed-circuit RO (CCRO) and pulse flow RO (PFRO). However, the mechanisms of batch processes' fouling resistance are not well-understood, and models have not been developed for prediction of their fouling performance. Here, a framework for predicting reverse osmosis fouling is developed by comparing the fluid residence time in batch and continuous (conventional) reverse osmosis systems to the nucleation induction times for crystallization of sparingly soluble salts. This study considers the inorganic foulants calcium sulfate (gypsum), calcium carbonate (calcite), and silica, and the work predicts maximum recovery ratios for the treatment of typical water sources using batch reverse osmosis (BRO) and continuous reverse osmosis. The prediction method is validated through comparisons to the measured time delay for CaSO4 membrane scaling in a bench-scale, recirculating reverse osmosis unit. The maximum recovery ratio for each salt solution (CaCO3, CaSO4) is individually predicted as a function of inlet salinity, as shown in contour plots. Next, the maximum recovery ratios of batch and conventional RO are compared across several water sources, including seawater, brackish groundwater, and RO brine. Batch RO's shorter residence times, associated with cycling from low to high salinity during each batch, enable significantly higher recovery ratios and higher salinity than in continuous RO for all cases examined. Finally, representative brackish RO brine samples were analyzed to determine the maximum possible recovery with batch RO. Overall, the induction time modeling methodology provided here can be used to allow batch RO to operate at high salinity and high recovery, while controlling scaling.
The results show that, in addition to its known energy efficiency improvement, batch RO has superior inorganic fouling resistance relative to conventional RO. Copyright © 2018 Elsevier Ltd. All rights reserved.
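The core of the prediction framework described above is a comparison between fluid residence time and nucleation induction time. A hedged sketch follows; the classical-nucleation-style fit and its constants are placeholders, not the values fitted in the paper.

```python
# Induction time grows steeply as the saturation index SI = log10(IAP/Ksp)
# approaches zero. A and B below are illustrative placeholders.
A, B = -3.0, 40.0

def induction_time_s(si):
    """Nucleation induction time (s) for saturation index si."""
    if si <= 0:
        return float("inf")  # undersaturated: no nucleation
    return 10.0 ** (A + B / si ** 2)

def scaling_expected(residence_time_s, si):
    """Scaling is predicted when residence time exceeds induction time."""
    return residence_time_s > induction_time_s(si)
```

Batch RO's advantage follows directly from this comparison: each cycle spends only a short time at high salinity (high SI), so the residence time at scaling-prone conditions can stay below the induction time, permitting higher recovery ratios.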
NASA Astrophysics Data System (ADS)
Perelman, Adi; Jorda, Helena; Vanderborght, Jan; Pohlmeier, Andreas; Lazarovitch, Naftali
2017-04-01
When salinity increases beyond a certain threshold, crop yield declines at a fixed rate, according to the Maas and Hoffman model (1976). Predicting salinization and its impact on crops is therefore of great importance. Current models do not consider the impact of environmental conditions on plant salt tolerance, even though these conditions affect plant water uptake and therefore salt accumulation around the roots. Different factors, such as transpiration rate, can influence plant sensitivity to salinity by affecting salt concentrations around the roots. Better parametrization of a model can improve predictions of the real effects of salinity on crop growth and yield. The aim of this research is to study Na+ distribution around roots at different scales using different non-invasive methods, and to study how this distribution is affected by transpiration rate and plant water uptake. Results from tomato plants growing on Rhizoslides (a capillary paper growth system) show that Na+ concentration is higher at the root-substrate interface than in the bulk, and that Na+ accumulation around the roots decreased under low transpiration rates, supporting our hypothesis. Additionally, Rhizoslides make it possible to study root growth rate and architecture under different salinity levels. Root system architecture was retrieved from photos taken during the experiment and enabled us to incorporate real root systems into a simulation. To observe the correlation of root system architectures and Na+ distribution in three dimensions, we used magnetic resonance imaging (MRI). MRI provides fine resolution of Na+ accumulation around a single root without disturbing the root system. With time, Na+ accumulated only where roots were present in the soil, and later around specific roots. These data are being used for model calibration, which is expected to predict root water uptake in saline soils for different climatic conditions and soil water availabilities.
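The Maas-Hoffman model referenced above is a piecewise-linear relation between soil salinity and relative yield. A minimal implementation, with tomato-like parameters shown purely for illustration:

```python
def maas_hoffman_relative_yield(ec_e, threshold, slope):
    """Maas-Hoffman relative yield (%): 100 below the crop's salinity
    threshold, then declining at a fixed rate.
    ec_e: electrical conductivity of the saturation extract (dS/m).
    threshold: crop-specific salinity threshold (dS/m).
    slope: yield decline per unit salinity above threshold (%/(dS/m))."""
    if ec_e <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (ec_e - threshold))

# Illustrative tomato-like parameters (not from this study):
yield_fresh = maas_hoffman_relative_yield(1.0, 2.5, 9.9)    # full yield
yield_saline = maas_hoffman_relative_yield(7.5, 2.5, 9.9)   # reduced yield
```

The study's point is precisely that the threshold and slope are treated as fixed constants, whereas transpiration-driven salt accumulation at the root surface makes the effective exposure, and hence these parameters, condition-dependent.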
Long, Jeremy D.; Porturas, Laura D.
2014-01-01
Plant communities are disturbed by several stressors and they are expected to be further impacted by increasing anthropogenic stress. The consequences of these stressors will depend, in part, upon the ability of plants to compensate for herbivory. Previous studies found that herbivore impacts on plants can vary from negative to positive because of environmental control of plant compensatory responses, a.k.a. the Compensatory Continuum Hypothesis. While these influential studies enhanced our appreciation of the dynamic nature of plant-herbivore interactions, they largely focused on the impact of resource limitation. This bias limits our ability to predict how other environmental factors will shape the impact of herbivory. We examined the role of salinity stress on herbivory of salt marsh cordgrass, Spartina foliosa, by an herbivore previously hypothesized to influence the success of restoration projects (the scale insect, Haliaspis spartinae). Using a combination of field and mesocosm manipulations of scales and salinity, we measured how these factors affected Spartina growth and timing of senescence. In mesocosm studies, Spartina overcompensated for herbivory by growing taller shoots at low salinities but the impact of scales on plants switched from positive to neutral with increasing salinity stress. In field studies of intermediate salinities, scales reduced Spartina growth and increased the rate of senescence. Experimental salinity additions at this field site returned the impact of scales to neutral. Because salinity decreased scale densities, the switch in impact of scales on Spartina with increasing salinity was not simply a linear function of scale abundance. Thus, the impact of scales on primary production depended strongly upon environmental context because intermediate salinity stress prevented plant compensatory responses to herbivory. 
Understanding this context-dependency will be required if we are going to successfully predict the success of restoration efforts and the ecological consequences of anthropogenic disturbances. PMID:25310475
Testing an earthquake prediction algorithm
Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.
1997-01-01
A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991 the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, which success ratio would have been achieved in 53% of random trials with the null hypothesis.
Maden, Orhan; Balci, Kevser Gülcihan; Selcuk, Mehmet Timur; Balci, Mustafa Mücahit; Açar, Burak; Unal, Sefa; Kara, Meryem; Selcuk, Hatice
2015-12-01
The aim of this study was to investigate the accuracy of three algorithms in predicting accessory pathway locations in adult patients with Wolff-Parkinson-White syndrome in a Turkish population. A total of 207 adult patients with Wolff-Parkinson-White syndrome were retrospectively analyzed. The most preexcited 12-lead electrocardiogram in sinus rhythm was used for analysis. Two investigators blinded to the patient data used three algorithms for prediction of accessory pathway location. Among all locations, 48.5% were left-sided, 44% were right-sided, and 7.5% were located in the midseptum or anteroseptum. When only exact locations were accepted as a match, predictive accuracy was 71.5% for Chiang, 72.4% for d'Avila, and 71.5% for Arruda. The predictive accuracy did not differ between the algorithms (p = 1.000; p = 0.875; p = 0.885, respectively). The best algorithm for prediction of right-sided, left-sided, and anteroseptal and midseptal accessory pathways was Arruda (p < 0.001). Arruda was significantly better than d'Avila in predicting adjacent sites (p = 0.035), and the percentage of contralateral site predictions was higher with d'Avila than Arruda (p = 0.013). All algorithms were similar in predicting accessory pathway location, and the predictive accuracy was lower than previously reported by their authors. However, according to the accessory pathway site, the algorithm designed by Arruda et al. showed better predictions than the other algorithms, and using this algorithm may provide advantages before a planned ablation.
Park, Jin Hee; Li, Xiaofang; Edraki, Mansour; Baumgartl, Thomas; Kirsch, Bernie
2013-06-01
Coal mining wastes in the form of spoils, rejects and tailings deposited on a mine lease can cause various environmental issues including contamination by toxic metals, acid mine drainage and salinity. Dissolution of salt from saline mine spoil, in particular, during rainfall events may result in local or regional dispersion of salts through leaching or in the accumulation of dissolved salts in soil pore water and inhibition of plant growth. The salinity in coal mine environments is from the geogenic salt accumulations and weathering of spoils upon surface exposure. The salts are mainly sulfates and chlorides of calcium, magnesium and sodium. The objective of the research is to investigate and assess the source and mobility of salts and trace elements in various spoil types, thereby predicting the leaching behavior of the salts and trace elements from spoils which have similar geochemical properties. X-ray diffraction analysis, total digestion, sequential extraction and column experiments were conducted to achieve the objectives. Sodium and chloride concentrations best represented salinity of the spoils, which might originate from halite. Electrical conductivity, sodium and chloride concentrations in the leachate decreased sharply with increasing leaching cycles. Leaching of trace elements was not significant in the studied area. Geochemical classification of spoil/waste defined for rehabilitation purposes was useful to predict potential salinity, which corresponded with the classification from cluster analysis based on leaching data of major elements. Certain spoil groups showed high potential salinity by releasing high sodium and chloride concentrations. Therefore, the leaching characteristics of sites having saline susceptible spoils require monitoring, and suitable remediation technologies have to be applied.
Lowe, Michael R.; Sehlinger, Troy; Soniat, Thomas M.; LaPeyre, Megan K.
2017-01-01
Despite nearly a century of exploitation and scientific study, predicting growth and mortality rates of the eastern oyster (Crassostrea virginica) as a means to inform local harvest and management activities remains difficult. Ensuring that models reflect local population responses to varying salinity and temperature combinations requires locally appropriate models. Using long-term (1988 to 2015) monitoring data from Louisiana's public oyster reefs, we develop regionally specific models of temperature- and salinity-driven mortality (sack oysters only) and growth for spat (<25 mm), seed (25–75 mm), and sack (>75 mm) oyster size classes. The results demonstrate that the optimal combination of temperature and salinity where Louisiana oysters experience reduced mortality and fast growth rates is skewed toward lower salinities and higher water temperatures than previous models have suggested. Outside of that optimal range, oysters are commonly exposed to combinations of temperature and salinity that are correlated with high mortality and reduced growth. How these combinations affect growth, and to a lesser degree mortality, appears to be size class dependent. Given current climate predictions for the region and ongoing large-scale restoration activities in coastal Louisiana, the growth and mortality models are a critical step toward ensuring sustainable oyster reefs for long-term harvest and continued delivery of the ecological services in a changing environment.
French, Susannah S.; Brodie, Edmund D.
2017-01-01
To accurately predict the impact of environmental change, it is necessary to assay effects of key interacting stressors on vulnerable organisms, and the potential resiliency of their populations. Yet, for the most part, these critical data are missing. We examined the effects of two common abiotic stressors predicted to interact with climate change, salinity and temperature, on the embryonic survival and development of a model freshwater vertebrate, the rough-skinned newt (Taricha granulosa) from different populations. We found that salinity and temperature significantly interacted to affect newt embryonic survival and development, with the negative effects of salinity most pronounced at temperature extremes. We also found significant variation among, and especially within, populations, with different females varying in the performance of their eggs at different salinity–temperature combinations, possibly providing the raw material for future natural selection. Our results highlight the complex nature of predicting responses to climate change in space and time, and provide critical data towards that aim. PMID:28680662
Soil salinity decreases global soil organic carbon stocks.
Setia, Raj; Gottschalk, Pia; Smith, Pete; Marschner, Petra; Baldock, Jeff; Setia, Deepika; Smith, Jo
2013-11-01
Saline soils cover 3.1% (397 million hectares) of the total land area of the world. The stock of soil organic carbon (SOC) reflects the balance between carbon (C) inputs from plants, and losses through decomposition, leaching and erosion. Soil salinity decreases plant productivity and hence C inputs to the soil, but also microbial activity and therefore SOC decomposition rates. Using a modified Rothamsted Carbon model (RothC) with a newly introduced salinity decomposition rate modifier and a plant input modifier we estimate that, historically, world soils that are currently saline have lost an average of 3.47 t SOC ha(-1) since they became saline. With the extent of saline soils predicted to increase in the future, our modelling suggests that world soils may lose 6.8 Pg SOC due to salinity by the year 2100. Our findings suggest that current models overestimate future global SOC stocks and underestimate net CO2 emissions from the soil-plant system by not taking salinity effects into account. From the perspective of enhancing soil C stocks, however, given the lower SOC decomposition rate in saline soils, salt tolerant plants could be used to sequester C in salt-affected areas. Copyright © 2012 Elsevier B.V. All rights reserved.
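A RothC-style first-order decomposition step with a salinity rate modifier can be sketched as follows. The modifier's linear-decline-with-floor form and its constants are illustrative assumptions, not the function fitted in the paper.

```python
import math

def salinity_rate_modifier(ec_dsm, slope=0.05, floor=0.2):
    """Multiplier in (0, 1] applied to pool decomposition rate constants;
    decomposition slows as electrical conductivity (dS/m) rises.
    The linear form and constants are placeholders."""
    return max(floor, 1.0 - slope * ec_dsm)

def decompose(carbon_pool, k_per_month, ec_dsm):
    """One monthly first-order decomposition step, salinity-adjusted:
    the pool decays as C * exp(-k), with k scaled down under salinity."""
    k = k_per_month * salinity_rate_modifier(ec_dsm)
    return carbon_pool * math.exp(-k)
```

The paper's net effect arises because salinity also reduces plant C inputs (the plant input modifier); slower decomposition alone would raise SOC stocks, but the input reduction dominates.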
Data assimilation in the low noise regime
NASA Astrophysics Data System (ADS)
Weare, J.; Vanden-Eijnden, E.
2012-12-01
On-line data assimilation techniques such as ensemble Kalman filters and particle filters tend to lose accuracy dramatically when presented with an unlikely observation. Such an observation may be caused by an unusually large measurement error or reflect a rare fluctuation in the dynamics of the system. Over a long enough span of time it becomes likely that one or several of these events will occur. In some cases they are signatures of the most interesting features of the underlying system, and their prediction becomes the primary focus of the data assimilation procedure. The Kuroshio, or Black Current, that runs along the eastern coast of Japan is an example of just such a system. It undergoes infrequent but dramatic changes of state between a small meander, during which the current remains close to the coast of Japan, and a large meander, during which the current bulges away from the coast. Because of the important role that the Kuroshio plays in distributing heat and salinity in the surrounding region, prediction of these transitions is of acute interest. Here we focus on a regime in which both the stochastic forcing on the system and the observational noise are small. In this setting large deviation theory can be used to understand why standard filtering methods fail and to guide the design of more effective data assimilation techniques. Motivated by our large deviations analysis we propose several data assimilation strategies capable of efficiently handling rare events such as the transitions of the Kuroshio. These techniques are tested on a model of the Kuroshio and shown to perform much better than standard filtering methods.
(Figure caption: The sequence of observations (circles) is taken directly from one of our Kuroshio model's transition events from the small meander to the large meander. We tested two new algorithms (Algorithms 3 and 4 in the legend) motivated by our large deviations analysis, as well as a standard particle filter and an ensemble Kalman filter. The parameters of each algorithm are chosen so that their costs are comparable. The particle filter and the ensemble Kalman filter fail to accurately track the transition; Algorithms 3 and 4 maintain accuracy, and smaller-scale resolution, throughout the transition.)
Measurement of the Dielectric Constant of Seawater at L-Band: Techniques and Measurements
NASA Technical Reports Server (NTRS)
Lang, R.; Utku, C.; Tarkocin, Y.; LeVine, D.
2009-01-01
Satellite instruments that will monitor salinity from space in the near future require an accurate relationship between salinity/temperature and seawater dielectric constant. This paper will review measurements that were made of the dielectric constant of seawater during the past several years. The objective of the measurements is to determine the dependence of the dielectric constant of seawater on salinity and on temperature more accurately than in the past, by taking advantage of modern instrumentation. The measurements of seawater permittivity have been performed as a function of salinity and temperature using a transmission resonant cavity technique. The measurements have been made in the salinity range of 10 to 38 psu and in the temperature range of 10 °C to 35 °C. These results will be useful in algorithm development for sensor systems such as SMOS and Aquarius. The measurement system consists of a brass microwave cavity that is resonant at 1.413 GHz. The seawater is introduced into the cavity through a capillary glass tube having an inner diameter of 0.1 mm. The diameter of the tube has been made very small so that the amount of seawater introduced in the cavity is small, thus maintaining the sensitivity of the measurements and allowing the use of perturbation theory to predict the seawater permittivity. The change in resonant frequency and the change in cavity Q can be used to determine the real and imaginary parts of the dielectric constant of seawater introduced into the slender tube. The microwave measurements are made by an HP 8722D network analyzer. The cavity has been immersed in a water/ethylene-glycol bath which is connected to a Lauda circulator. The circulator keeps the brass cavity at a temperature constant to within 0.01 degrees. The system is automated using a Visual Basic program to control the analyzer and to collect the data. The results of the dielectric constant measurements of seawater will be presented. The measurement results will be compared with permittivity values generated from the Klein and Swift relationship. Two methods of calibration will be discussed. The errors that each technique introduces into the measurement results will be reviewed. Temperature stability, frequency drift and the effect of increasing cavity transmission loss on the unloaded cavity Q will also be discussed.
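In the cavity-perturbation technique described above, the complex permittivity follows from the shifts in resonant frequency and quality factor between the empty and sample-loaded cavity. A hedged first-order sketch; the geometry factor g depends on the cavity mode and sample placement, and the default used here is a placeholder, not the calibrated value from the measurements.

```python
def permittivity_from_perturbation(f0, fs, q0, qs, vc_over_vs, g=2.0):
    """First-order cavity-perturbation estimate of complex permittivity.
    f0, q0: empty-cavity resonant frequency (Hz) and quality factor.
    fs, qs: values with the seawater-filled capillary inserted.
    vc_over_vs: ratio of cavity volume to sample volume (large here,
    since the 0.1 mm capillary keeps the sample very small).
    g: mode/placement-dependent geometry factor (placeholder value)."""
    eps_real = 1.0 + g * (f0 - fs) / fs * vc_over_vs
    eps_imag = (g / 2.0) * (1.0 / qs - 1.0 / q0) * vc_over_vs
    return eps_real, eps_imag

# Loading the cavity lowers both the resonant frequency and the Q:
eps_r, eps_i = permittivity_from_perturbation(
    1.4130e9, 1.4125e9, 5000.0, 4000.0, vc_over_vs=1.0e4)
```

The perturbation approximation is what makes the tiny sample volume essential: it is valid only when the sample barely disturbs the cavity fields.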
Inflow of shelf waters into the Mississippi Sound and Mobile Bay estuaries in October 2015
NASA Astrophysics Data System (ADS)
Cambazoglu, Mustafa Kemal; Soto, Inia M.; Howden, Stephan D.; Dzwonkowski, Brian; Fitzpatrick, Patrick J.; Arnone, Robert A.; Jacobs, Gregg A.; Lau, Yee H.
2017-07-01
The exchange of coastal waters between the Mississippi Sound (MSS), Mobile Bay, and Mississippi Bight is an important pathway for oil and pollutants into coastal ecosystems. This study investigated an event of strong and persistent inflow of shelf waters into the MSS and Mobile Bay during October 2015 by combining in situ measurements, satellite ocean color data, and ocean model predictions. The Navy Coastal Ocean Model predicted high-salinity shelf waters continuously flowing into the estuarine system and forecasted low-salinity waters trapped inside the estuaries, which did not flush out until the passage of tropical cyclone Patricia's remnants in late October. The October 2015 chlorophyll-a anomaly was significantly low both inside and outside the MSS relative to the 2003 to 2015 time series; similarly low chlorophyll-a anomalies were seen only in 2003. The October 2015 mean in situ salinities were up to 8 psu higher than the 2007 to 2015 mean, and some estuarine stations showed persistent salinities above 30 psu for almost a month, in agreement with model predictions. October 2015 was associated with low river discharge, typical of the fall season, and winds persistently out of the east to southeast (45-180°). These persistent wind conditions were linked to the observed anomalous conditions.
Aquarius/SAC-D soil moisture product using V3.0 observations
USDA-ARS?s Scientific Manuscript database
Although Aquarius was designed for ocean salinity mapping, our objective in this investigation is to exploit the large amount of land observations that Aquarius acquires and extend the mission scope to include the retrieval of surface soil moisture. The soil moisture retrieval algorithm development ...
Synoptic and frequent monitoring of water quality parameters from satellite is useful for determining the health of aquatic ecosystems and development of effective management strategies. Northwest Florida estuaries are classified as optically-complex, or waters influenced by chlo...
Gridding Global δ18Owater and Interpreting Core Top δ18Oforam
NASA Astrophysics Data System (ADS)
Legrande, A. N.; Schmidt, G.
2004-05-01
Estimations of the oxygen isotope ratio in seawater (δ18Owater) traditionally have relied on regional δ18Owater to salinity relationships to convert seawater salinity into δ18Owater. This indirect method of determining δ18Owater is necessary since δ18Owater measurements are relatively sparse. We improve upon this process by constructing local δ18Owater to salinity curves using the Schmidt et al. (1999) global database of δ18Owater and salinity. We calculate the local δ18Owater to salinity relationship on a 1x1 grid based on the closest database points to each grid box. Each ocean basin is analyzed separately, and each curve is processed to exclude outliers. These local relationships in combination with seawater salinity (Levitus, 1994) allow us to construct a global map of δ18Owater on a 1x1 grid. We combine seawater temperature (Levitus, 1994) with this dataset to predict δ18Ocalcite on a 1x1 grid. These predicted values are then compared to previous compilations of core top δ18Oforam data for individual species of foraminifera. This comparison provides insight into the calcification habitats (as inferred by seawater temperature and salinity) of these species. Additionally, we compare the 1x1 grid of δ18Owater to preliminary output from the latest GISS coupled Atmosphere/Ocean GCM that tracks water isotopes through the hydrologic cycle. This comparison provides insight into possible model applications as a tool to aid in interpreting paleo-isotope data.
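The gridding step amounts to fitting a local least-squares δ18O-salinity line from nearby database points and applying it to the grid box's salinity. A self-contained sketch with invented data points (the study uses the Schmidt et al. (1999) global database):

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Invented nearby database points for one grid box:
salinity = [33.0, 34.0, 35.0, 36.0]   # psu
d18o = [-0.5, 0.0, 0.5, 1.0]          # per mil (exactly linear here)

a, b = linear_fit(salinity, d18o)
d18o_predicted = a + b * 34.5         # apply to the grid box's salinity
```

In the actual procedure this fit is repeated per grid box from the closest database points, with basins handled separately and outliers excluded before fitting.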
Interaction of flooding and salinity stress on baldcypress (Taxodium distichum)
Allen, J.A.; Pezeshki, S.R.; Chambers, J.L.
1996-01-01
Coastal wetlands of the southeastern United States are threatened by increases in flooding and salinity as a result of both natural processes and man-induced hydrologic alterations. Furthermore, global climate change scenarios suggest that, as a consequence of rising sea levels, much larger areas of coastal wetlands may be affected by flooding and salinity in the next 50 to 100 years. In this paper, we review studies designed to improve our ability to predict and ameliorate the impacts of increased flooding and salinity stress on baldcypress (Taxodium distichum (L.) Rich.), which is a dominant species of many coastal forested wetlands. Specifically, we review studies on species-level responses to flooding and salinity stress, alone and in combination, we summarize two studies on intraspecific variation in response to flooding and salinity stress, we analyze the physiological mechanisms thought to be responsible for the interaction between flooding and salinity stress, and we discuss the implications for coastal wetland loss and the prospects for developing salt-tolerant lines of baldcypress.
Salinity transfer in double diffusive convection bounded by two parallel plates
NASA Astrophysics Data System (ADS)
Yang, Yantao; van der Poel, Erwin P.; Ostilla-Monico, Rodolfo; Sun, Chao; Verzicco, Roberto; Grossmann, Siegfried; Lohse, Detlef
2014-11-01
Double diffusive convection (DDC) is a convection flow in which the fluid density is affected by two different components. In this study we numerically investigate DDC between two parallel plates with no-slip boundary conditions. The top plate has higher salinity and temperature than the lower one. Thus the flow is driven by the salinity difference and stabilised by the temperature difference. Our simulations are compared with the experiments by Hage and Tilgner (Phys. Fluids 22, 076603 (2010)) for several sets of parameters. Reasonable agreement is achieved for the salinity flux and its dependence on the salinity Rayleigh number. For all parameters considered, salt fingers emerge and extend through the entire domain height. The thermal Rayleigh number shows minor influence on the salinity flux although it does affect the Reynolds number. We apply the Grossmann-Lohse theory for Rayleigh-Bénard flow to the current problem without introducing any new coefficients. The theory successfully predicts the salinity flux with respect to the scaling for both the numerical and experimental results.
Evaluation of SMAP Level 2 Soil Moisture Algorithms Using SMOS Data
NASA Technical Reports Server (NTRS)
Bindlish, Rajat; Jackson, Thomas J.; Zhao, Tianjie; Cosh, Michael; Chan, Steven; O'Neill, Peggy; Njoku, Eni; Colliander, Andreas; Kerr, Yann; Shi, J. C.
2011-01-01
The objectives of the SMAP (Soil Moisture Active Passive) mission are global measurements of soil moisture and land freeze/thaw state at 10 km and 3 km resolution, respectively. SMAP will provide soil moisture with a spatial resolution of 10 km with a 3-day revisit time at an accuracy of 0.04 m3/m3 [1]. In this paper we contribute to the development of the Level 2 soil moisture algorithm that is based on passive microwave observations by exploiting Soil Moisture Ocean Salinity (SMOS) satellite observations and products. SMOS brightness temperatures provide a global real-world, rather than simulated, test input for the SMAP radiometer-only soil moisture algorithm. Output of the potential SMAP algorithms will be compared to both in situ measurements and SMOS soil moisture products. The investigation will result in enhanced SMAP pre-launch algorithms for soil moisture.
Ocean Surface Emissivity at L-band (1.4 GHz): The Dependence on Salinity and Roughness
NASA Technical Reports Server (NTRS)
LeVine, D. M.; Lang, R.; Wentz, F.; Meissner, T.
2012-01-01
A characterization of the emissivity of sea water at L-band is important for the remote sensing of sea surface salinity. Measurements of salinity are currently being made in the radio astronomy band at 1.413 GHz by ESA's Soil Moisture and Ocean Salinity (SMOS) mission and NASA's Aquarius instrument aboard the Aquarius/SAC-D observatory. The goal of both missions is accuracy on the order of 0.1 psu. This requires accurate knowledge of the dielectric constant of sea water as a function of salinity and temperature and also the effect of waves (roughness). The former determines the emissivity of an ideal (i.e. flat) surface and the latter is the major source of error from predictions based on a flat surface. These two aspects of the problem of characterizing the emissivity are being addressed in the context of the Aquarius mission. First, laboratory measurements are being made of the dielectric constant of sea water. This is being done at George Washington University using a resonant cavity. In this technique, sea water of known salinity and temperature is fed into the cavity along its axis through a narrow tube. The sea water changes the resonant frequency and Q of the cavity which, if the sample is small enough, can be related to the dielectric constant of the sample. An extensive set of measurements has been conducted at 1.413 GHz to develop a model for the real and imaginary parts of the dielectric constant as a function of salinity and temperature. The results are compared to the predictions of models based on parameterization of the Debye resonance of the water molecule. The models and measurements are close; however, the differences are significant for remote sensing of salinity. This is especially true at low temperatures where the sensitivity to salinity is lowest.
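The cavity technique described above relates the shift in resonant frequency and quality factor to the complex permittivity of the sample. A sketch of one common small-perturbation form (the numerical factors depend on the cavity mode and sample geometry, and the input values are illustrative, not the GWU measurements):

```python
def cavity_perturbation(f0, fs, q0, qs, v_cavity, v_sample):
    """Complex permittivity of a small axial sample from the shift in
    resonant frequency (f0 -> fs) and quality factor (q0 -> qs), using
    one common small-perturbation approximation; the factors 2 and 4
    depend on the cavity mode and sample geometry."""
    eps_real = 1.0 + (v_cavity / (2.0 * v_sample)) * (f0 - fs) / fs
    eps_imag = (v_cavity / (4.0 * v_sample)) * (1.0 / qs - 1.0 / q0)
    return eps_real, eps_imag

# Illustrative numbers only: empty-cavity resonance f0 and Q0,
# sample-loaded fs and Qs, volumes in m^3.
er, ei = cavity_perturbation(f0=1.4135e9, fs=1.4100e9, q0=9000.0,
                             qs=3000.0, v_cavity=1.0e-3, v_sample=2.0e-8)
print(er, ei)
```

The frequency shift carries the real part (energy storage) and the drop in Q carries the imaginary part (loss), which is why both quantities must be measured for each salinity-temperature sample.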
Global Identification of MicroRNAs and Their Targets in Barley under Salinity Stress
Cui, Licao; Feng, Kewei; Liu, Fuyan; Du, Xianghong; Tong, Wei; Nie, Xiaojun; Ji, Wanquan; Weining, Song
2015-01-01
Salinity is a major limiting factor for agricultural production worldwide. A better understanding of the mechanisms of salinity stress response will aid efforts to improve plant salt tolerance. In this study, a combination of small RNA and mRNA degradome sequencing was used to identify salinity-responsive miRNAs and their targets in barley. A total of 152 miRNAs belonging to 126 families were identified, of which 44 were found to be salinity-responsive, with 30 up-regulated and 25 down-regulated. The majority of the salinity-responsive miRNAs were up-regulated at the 8 h time point, while down-regulated at the 3 h and 27 h time points. The targets of these miRNAs were further detected by degradome sequencing coupled with bioinformatics prediction. Finally, qRT-PCR was used to validate the identified miRNAs and their targets. Our study systematically investigated the expression profiles of miRNAs and their targets in barley during the salinity stress phase, which contributes to understanding how miRNAs respond to salinity stress in barley and other cereal crops. PMID:26372557
Wicklein, S.M.; Gain, W.S.
1999-01-01
The St. Sebastian River lies in the southern part of the Indian River basin on the east coast of Florida. Increases in freshwater discharge due to urbanization and changes in land use have reduced salinity in the St. Sebastian River and, consequently, salinity in the Indian River, affecting the commercial fishing industry. Wind, water temperature, tidal flux, freshwater discharge, and downstream salinity all affect salinity in the St. Sebastian River estuary, but freshwater discharge is the only one of these hydrologic factors which might be affected by water-management practices. A probability analysis of salinity conditions in the St. Sebastian River estuary, taking into account the effects of freshwater discharge over a period from May 1992 to March 1996, was used to determine the likelihood (probability) that salinities, as represented by daily mean specific-conductance values, will fall below a given threshold. The effects of freshwater discharge on salinities were evaluated with a simple volumetric model fitted to time series of measured specific conductance, by using nonlinear optimization techniques. Specific-conductance values for two depths at monitored sites represent stratified flow which results from differences in salt concentration between freshwater and saltwater. Layering of freshwater and saltwater is assumed, and the model is applied independently to each layer with the assumption that the water within the layer is well mixed. The model of specific conductance as a function of discharge (a salinity response model) was combined with a model of residual variation to produce a total probability model. Flow distributions and model residuals were integrated to produce a salinity distribution and determine differences in salinity probabilities as a result of changes in water-management practices.
Two possible management alternatives were analyzed: stormwater detention (reducing the peak rate of discharge but not reducing the overall flow volume) and stormwater retention (reducing peak discharges without later release). Detention of freshwater discharges increased the probability of specific-conductance values falling below a given limit (20,000 microsiemens per centimeter) for all sites but one. The retention of freshwater input to the system decreased the likelihood of falling below a selected limit of specific conductance at all sites. For limits of specific conductance (1,000 microsiemens per centimeter or 20,000 microsiemens per centimeter, depending on the site), the predicted days of occurrence below a limit decreased by 17 to 68 percent relative to the predicted days of occurrence for unregulated flow. The primary finding to be drawn from the discharge-salinity analysis is that an empirical-response model alone does not provide adequate information to assess the response of the system to changes in flow regime. Whether a given level of discharge can produce a given response on a given day is not as important as the probability of that response on a given day and over a period of many days. A deterministic model of the St. Sebastian River estuary based only on discharge would predict that retention of discharge peaks should increase the average salinity conditions in the St. Sebastian River estuary. The probabilistic model produces a very different response, indicating that salinity can decrease by a power of three as discharges increase, and that random factors can predominate and control salinity until discharges increase sufficiently to flush the entire system of saltwater.
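The salinity-response model described above can be illustrated by fitting a simple well-mixed-layer decay of specific conductance with discharge via nonlinear optimization. A sketch on synthetic data (the functional form and all coefficients are assumptions for illustration, not the fitted St. Sebastian model):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic daily data: specific conductance (uS/cm) falling with
# freshwater discharge; the coefficients are invented for illustration.
rng = np.random.default_rng(0)
discharge = rng.uniform(5.0, 500.0, 200)
true_sc = 35000.0 * np.exp(-0.004 * discharge) + 800.0
sc = true_sc + rng.normal(0.0, 300.0, discharge.size)

def salinity_response(q, sc_marine, k, sc_fresh):
    """Well-mixed-layer response: conductance decays from a marine
    value toward a freshwater floor as discharge flushes the layer."""
    return sc_marine * np.exp(-k * q) + sc_fresh

params, _ = curve_fit(salinity_response, discharge, sc,
                      p0=[30000.0, 0.001, 1000.0])
print(params)
```

In the study's probabilistic framework, the residuals of such a fit would then be modeled separately and combined with the discharge distribution to yield salinity exceedance probabilities rather than single deterministic predictions.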
NASA Astrophysics Data System (ADS)
Chen, Ge; Yu, Fangjie
2015-01-01
In this study, we propose a new algorithm for estimating the annual maximum mixed layer depth (M2LD), analogous to a full range of local "ventilation" depth and corresponding to the deepest surface to which atmospheric influence can be "felt." Two "seasonality indices" are defined, respectively, for temperature and salinity through Fourier analysis of their time series using Argo data, on the basis of which a significant local minimum of the index corresponding to a maximum penetration depth can be identified. A final M2LD is then determined by maximizing the thermal and haline effects. Unlike most previous schemes, which use arbitrary thresholds or subjective criteria, the new algorithm is objective, robust, and property-adaptive, provided a significant periodic geophysical forcing such as the annual cycle is available. The validity of our methodology is confirmed by the spatial correlation of the tropical dominance of the saline effect (mainly related to the rainfall cycle) and the extratropical dominance of the thermal effect (mainly related to the solar cycle). It is also recognized that the M2LD distribution is characterized by the coexistence of basin-scale zonal structures and eddy-scale local patches. In addition to the fundamental buoyancy forcing caused mainly by latitude-dependent solar radiation, the impressive two-scale pattern is found to be primarily attributable to (1) large-wave climate due to extreme winds (large scale) and (2) systematic eddy shedding as a result of persistent winds (mesoscale). Moreover, a general geographical consistency and a good quantitative agreement are found between the new algorithm and those published in the literature. However, a major discrepancy in our result is the existence of a constantly deeper M2LD band compared with other results in the midlatitude oceans of both hemispheres.
Given the better correspondence of our M2LDs with the depth of the oxygen saturation limit, it is argued that there might be a systematic underestimation with existing criteria in these regions. Our results demonstrate that the M2LD may serve as an integrated proxy for studying the coherent multidisciplinary variabilities of the coupled ocean-atmosphere system.
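The "seasonality index" above is built from Fourier analysis of time series at each depth; one plausible reading is the fraction of spectral power carried by the annual harmonic. A hedged sketch (this specific definition is an assumption for illustration, not necessarily the authors' exact formula):

```python
import numpy as np

def seasonality_index(series):
    """Fraction of non-mean spectral power carried by the annual
    harmonic of a monthly time series."""
    series = np.asarray(series, dtype=float)
    power = np.abs(np.fft.rfft(series - series.mean())) ** 2
    n_years = series.size // 12      # annual harmonic sits at this bin
    return power[n_years] / power[1:].sum()

months = np.arange(120)              # ten years of monthly samples
pure_annual = np.cos(2.0 * np.pi * months / 12.0)
si = seasonality_index(pure_annual)
print(si)                            # ~1.0 for a pure annual cycle
```

Evaluating such an index for temperature and salinity at each depth, and locating where it drops to a significant local minimum, mirrors the paper's strategy of finding the deepest level still carrying the annual forcing signal.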
NASA Astrophysics Data System (ADS)
Rath, Kristin; Fierer, Noah; Rousk, Johannes
2017-04-01
Our knowledge of the dynamics structuring microbial communities and the consequences this has for soil functions is rudimentary. In particular, predictions of the response of microbial communities to environmental change and the implications for associated ecosystem processes remain elusive. Understanding how environmental factors structure microbial communities and regulate the functions they perform is key to a mechanistic understanding of how biogeochemical cycles respond to environmental change. Soil salinization is an agricultural problem in many parts of the world. The activity of soil microorganisms is reduced in saline soils compared to non-saline soil. However, soil salinity often co-varies with other factors, making it difficult to assign responses of microbial communities to direct effects of salinity. A trait-based approach allows us to connect the environmental factor salinity with the responses of microbial community composition and functioning. Salinity along a salinity gradient serves as a filter for the community trait distribution of salt tolerance, selecting for higher salt tolerance at more saline sites. This trait-environment relationship can be used to predict responses of microbial communities to environmental change. Our aims were to (i) use salinity along natural salinity gradients as an environmental filter, and (ii) link the resulting filtered trait-distributions of the communities (the trait being salt tolerance) to the community composition. Soil samples were obtained from two replicated salinity gradients along an Australian salt lake, spanning a wide range of soil salinities (0.1 dS m-1 to >50 dS m-1). In one of the two gradients salinity was correlated with pH. Community trait distributions for salt tolerance were assessed by establishing dose-dependences for extracted bacterial communities using growth rate assays. In addition, functional parameters were measured along the salt gradients. 
Community composition of sites was compared through 16S rRNA gene amplicon sequencing. Microbial community composition changed greatly along the salinity gradients. Using the salt-tolerance assessments to estimate bacterial trait distributions, we could determine substantial differences in tolerance to salt, revealing a strong causal connection between environment and trait distributions. By constraining the community composition with salinity tolerance in ordinations, we could identify which community differences were directly due to a shift in community trait distributions. These analyses revealed that a substantial part (up to 30%) of the community composition differences was directly driven by environmental salt concentrations. Even though communities in saline soils had trait distributions aligned to their environment, their performance (respiration, growth rates) was lower than that of communities in non-saline soils and remained low even after input of organic material. Using a trait-based approach, we could connect filtered trait distributions along environmental gradients to the composition of the microbial community. We show that soil salinity played an important role in shaping microbial community composition by selecting for communities with higher salt tolerance. The shift toward bacterial communities with trait distributions matched to salt environments probably compensated for much of the potential loss of function induced by salinity, resulting in a degree of apparent functional redundancy for decomposition. However, more tolerant communities still showed reduced functioning, suggesting a trade-off between salt tolerance and performance.
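The dose-dependences used above to quantify community salt tolerance are typically summarized by fitting a log-logistic dose-response curve and reporting the concentration that halves growth (IC50). A sketch on synthetic assay data (the curve form, concentrations, and the "true" IC50 are illustrative only):

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ic50, slope):
    """Two-parameter log-logistic dose-response: growth rate relative
    to the unstressed control as a function of salt concentration."""
    return 1.0 / (1.0 + (conc / ic50) ** slope)

# Synthetic growth-rate assay (fractions of unstressed growth); units
# would be e.g. dS/m in a real gradient study.
salt = np.array([0.01, 0.1, 0.5, 1.0, 5.0, 20.0, 50.0])
growth = log_logistic(salt, 3.0, 1.5)    # assumed IC50 = 3.0, slope = 1.5

params, _ = curve_fit(log_logistic, salt, growth, p0=[1.0, 1.0],
                      bounds=([0.01, 0.1], [100.0, 10.0]))
ic50_fit, slope_fit = params
print(ic50_fit, slope_fit)
```

Comparing fitted IC50 values across sites gives the community trait distribution for salt tolerance: higher IC50 at more saline sites is the environmental-filtering signal the study reports.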
Influence of predicted climate change elements on Z. ...
Global climate change (GCC) is expected to have pronounced impacts on estuarine and marine habitats including sea level rise, increased storm intensity, increased air and water temperatures, changes in upwelling dynamics and ocean acidification. All of these elements are likely to impact the growth and potential distribution of the non-indigenous seagrass Zostera japonica both within the State of Washington and within the region. Understanding how Z. japonica will respond to GCC requires a thorough understanding of plant physiology and predictions of GCC effects. Furthermore, Washington State is proposing to list Z. japonica as a “noxious weed,” which will allow the state to use herbicide controls for management. We present data from manipulative experiments designed to better understand how Z. japonica photosynthetic physiology responds to temperature, salinity and light. We found that Z. japonica is well adapted to moderate temperatures and salinity, with maximum photosynthesis at a salinity of 20. The Coos Bay population had greater Pmax and saturation irradiance (Ik) than the Padilla Bay population (p < 0.001) and tolerates daily exposure to both freshwater and marine water, suggesting that this population tolerates fairly extreme environmental fluctuations. Extreme temperatures (35 °C) were generally lethal to Z. japonica populations from Padilla, Coos and Yaquina Bays. High salinity (35) had lower mortality than either salinity of 5 or 20 (p = 0.0
A Feature and Algorithm Selection Method for Improving the Prediction of Protein Structural Class.
Ni, Qianwu; Chen, Lei
2017-01-01
Correct prediction of protein structural class is beneficial to the investigation of protein functions, regulations and interactions. In recent years, several computational methods have been proposed in this regard. However, it remains a great challenge to select a proper classification algorithm and to extract the essential features for classification. In this study, a feature and algorithm selection method was presented for improving the accuracy of protein structural class prediction. Amino acid compositions and physiochemical features were adopted to represent features, and thirty-eight machine learning algorithms collected in Weka were employed. All features were first analyzed by a feature selection method, minimum redundancy maximum relevance (mRMR), producing a feature list. Then, several feature sets were constructed by adding features from the list one by one. For each feature set, the thirty-eight algorithms were executed on a dataset in which proteins were represented by the features in the set. The classes predicted by these algorithms and the true class of each protein were collected to construct a dataset, which was analyzed by the mRMR method, yielding an algorithm list. Algorithms were then taken from the list one by one to build an ensemble prediction model. Finally, we selected the ensemble prediction model with the best performance as the optimal ensemble prediction model. Experimental results indicate that the constructed model is much superior to models using a single algorithm and to models that adopt only the feature selection procedure or only the algorithm selection procedure. Both the feature selection and the algorithm selection procedures are helpful for building an ensemble prediction model that yields better performance. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
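The incremental feature selection loop described above can be sketched with a mutual-information ranking standing in for mRMR (which additionally penalizes redundancy) and a single classifier standing in for the thirty-eight Weka algorithms; all data here are synthetic:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the amino-acid-composition features.
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

# Rank features by relevance (the mutual-information half of mRMR only).
order = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]

# Incremental feature selection: grow the feature set down the ranked
# list and keep the size with the best cross-validated accuracy.
scores = [cross_val_score(LogisticRegression(max_iter=1000),
                          X[:, order[:k]], y, cv=5).mean()
          for k in range(1, 21)]
best_k = int(np.argmax(scores)) + 1
print(best_k, max(scores))
```

The paper's second stage applies the same incremental idea to a ranked list of algorithms, growing an ensemble one learner at a time and keeping the best-performing ensemble.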
Salinity effects on viability, metabolic activity and proliferation of three Perkinsus species
La Peyre, M.; Casas, S.; La Peyre, J.
2006-01-01
Little is known regarding the range of conditions in which many Perkinsus species may proliferate, making it difficult to predict conditions favorable for their expansion, to identify conditions inducing mortality, or to identify instances of potential cross-infectivity among sympatric host species. In this study, the effects of salinity on viability, metabolic activity and proliferation of P. marinus, P. olseni and P. chesapeaki were determined. Specifically, this research examined the effects of 5 salinities (7, 11, 15, 25, 35‰), (1) without acclimation, on the viability and metabolic activity of 2 isolates of each Perkinsus species, and (2) with acclimation, on the viability, metabolic activity, size and number of 1 isolate of each species. P. chesapeaki showed the widest range of salinity tolerance of the 3 species, with high viability and cell proliferation at all salinities tested. Although P. chesapeaki originated from low salinity areas (i.e. <15‰), several measures (i.e. cell number and metabolic activity) indicated that higher salinities (15, 25‰) were more favorable for its growth. P. olseni, originating from high salinity areas, had better viability and proliferation at the higher salinities (15, 25, 35‰). Distinct differences in acute salinity response of the 2 P. olseni isolates at lower salinities (7, 11‰), however, suggest the need for a more expansive comparison of isolates to better define the lower salinity tolerance. Lastly, P. marinus was more tolerant of the lower salinities (7 and 11‰) than P. olseni, but exhibited reduced viability at 7‰, even after acclimation. © Inter-Research 2006.
NASA Technical Reports Server (NTRS)
Dinnat, Emmanuel P.; Boutin, Jacqueline; Yin, Xiaobin; Le Vine, David M.
2014-01-01
Two spaceborne instruments share the scientific objective of mapping the global Sea Surface Salinity (SSS). ESA's Soil Moisture and Ocean Salinity (SMOS) and NASA's Aquarius use L-band (1.4 GHz) radiometry to retrieve SSS. We find that SSS retrieved by SMOS is generally lower than SSS retrieved by Aquarius, except for very cold waters where SMOS SSS is higher overall. The spatial distribution of the differences in SSS is similar to the distribution of sea surface temperature. There are several differences in the retrieval algorithm that could explain the observed SSS differences. We assess the impact of the dielectric constant model and the ancillary sea surface salinity used by both missions for calibrating the radiometers and retrieving SSS. The differences in dielectric constant model produce differences in SSS of the order of 0.3 psu and exhibit a dependence on latitude and temperature. We use comparisons with the Argo in situ data to assess the performances of the model in various regions of the globe. Finally, the differences in the ancillary sea surface salinity products used to perform the vicarious calibration of both instruments are relatively small (0.1 psu), but not negligible considering the requirements for spaceborne remote sensing of SSS.
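Both retrievals rest on a dielectric-constant model for sea water, parameterized around the Debye relaxation of the water molecule plus an ionic conductivity term. A generic single-Debye sketch (the parameter values below are rough, illustrative numbers, not the fitted models used by either mission):

```python
import numpy as np

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def debye_seawater(freq_hz, eps_static, eps_inf, tau_s, sigma):
    """Single-Debye relaxation plus ionic conductivity; real mission
    models fit these parameters as functions of salinity and
    temperature."""
    omega = 2.0 * np.pi * freq_hz
    debye = eps_inf + (eps_static - eps_inf) / (1.0 + 1j * omega * tau_s)
    return debye - 1j * sigma / (omega * EPS0)

# Roughly seawater-like numbers near 20 C, salinity ~35 (illustrative).
eps = debye_seawater(1.413e9, eps_static=74.0, eps_inf=4.9,
                     tau_s=9.0e-12, sigma=4.8)
print(eps.real, -eps.imag)
```

Because the retrieved salinity is sensitive to small changes in this complex permittivity, differences of a few percent between the two missions' parameterizations are enough to produce the latitude- and temperature-dependent SSS offsets of order 0.3 psu reported above.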
NASA Astrophysics Data System (ADS)
Yoon, S.; Williams, J. R.; Juanes, R.; Kang, P. K.
2017-12-01
Managed aquifer recharge (MAR) is becoming an important solution for ensuring sustainable water resources and mitigating saline water intrusion in coastal aquifers. Accurate estimates of hydrogeological parameters in subsurface flow and solute transport models are critical for making predictions and managing aquifer systems. In the presence of a density difference between the injected freshwater and ambient saline groundwater, the pressure field is coupled to the spatial distribution of salinity and therefore experiences transient changes. The variable-density effects can be quantified by a mixed convection ratio between two characteristic types of convection: free convection due to density contrast, and forced convection due to a hydraulic gradient. We analyze the variable-density effects on the value-of-information of pressure and concentration data for saline aquifer characterization. An ensemble Kalman filter is used to estimate permeability fields by assimilating the data, and the performance of the estimation is analyzed in terms of the accuracy and the uncertainty of estimated permeability fields and the predictability of arrival times of breakthrough curves in a realistic push-pull setting. This study demonstrates that: (1) injecting fluids at the velocity that balances the two characteristic convections maximizes the value of data for saline aquifer characterization; (2) the variable-density effects on the value of data for the inverse estimation decrease as the permeability heterogeneity increases; (3) the advantage of joint inversion of pressure and concentration data decreases as the coupling effects between flow and transport increase.
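The mixed convection ratio mentioned above compares a buoyancy-driven characteristic Darcy velocity with the injection-driven Darcy flux. A sketch using one common definition (both the definition's numerical form and all parameter values are illustrative assumptions):

```python
def mixed_convection_ratio(k, delta_rho, mu, q_forced, g=9.81):
    """Ratio of the buoyancy-driven characteristic Darcy velocity to
    the injection-driven Darcy flux; one common definition of the
    mixed convection ratio (M >> 1: free convection dominates)."""
    u_free = k * delta_rho * g / mu   # density-driven Darcy velocity
    return u_free / q_forced

# Illustrative values: sandy aquifer (k = 1e-11 m^2), freshwater
# injected into ~25 kg/m^3 denser saline groundwater, Darcy flux
# of 1e-5 m/s.
m = mixed_convection_ratio(k=1.0e-11, delta_rho=25.0, mu=1.0e-3,
                           q_forced=1.0e-5)
print(m)  # ~0.245: forced convection dominates at this injection rate
```

Tuning the injection rate so that this ratio is near unity is the condition the study finds maximizes the information content of the collected pressure and concentration data.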
Ando, Tomotaka; Okuhara, Yoshitaka; Orihara, Yoshiyuki; Nishimura, Koichi; Yamamoto, Kyoko; Masuyama, Tohru; Hirotani, Shinichi
2018-03-19
Recently, we and other groups have reported that furosemide administration along with hypertonic saline solution enhances the diuretic efficiency of furosemide. However, little is known about the factors associated with high diuretic efficiency in hypertonic saline solution with furosemide therapy. To identify predictors of diuretic efficiency in this therapy, we recruited 30 consecutive hospitalized heart failure (HF) patients with volume overload (77 ± 10 years, systolic blood pressure > 90 mmHg, and estimated glomerular filtration rate > 15 ml/min/1.73 m2). Hypertonic saline with furosemide solution, consisting of 500 ml of 1.7% hypertonic saline solution with 40 mg of furosemide, was administered continuously over 24 h. The patients were divided into two groups on the basis of 24-h urine volume (UV) after initiation of diuretic treatment: ≥ 2000 ml (high urine volume: HUV) and < 2000 ml (low urine volume: LUV). The baseline clinical characteristics of both groups were analyzed and the predictors of HUV after receiving the treatment were identified. There were no significant differences between the two groups in baseline clinical characteristics or medication. Univariate logistic analysis revealed that blood urea nitrogen/creatinine ratio, urine urea nitrogen/creatinine ratio (UUN/UCre), fractional excretion of sodium, and tricuspid annular plane systolic excursion were positively associated with HUV. Multivariate logistic regression analysis revealed that UUN/UCre at baseline was independently associated with HUV, and UUN/UCre best predicted HUV with a cut-off value of 6.16 g/dl/g Cre (AUC 0.910, 95% CI 0.696-0.999, sensitivity 80%, specificity 87%). The Kaplan-Meier curves revealed a significant difference in HF rehospitalization and death rate at 180 days between patients with UUN/UCre ≥ 6.16 g/dl/g Cre and those with UUN/UCre < 6.16 g/dl/g Cre (log-rank P = 0.0489).
UUN/UCre at baseline strongly predicted diuretic efficiency in hypertonic saline solution with furosemide therapy, and was associated with HF prognosis.
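The cut-off analysis reported above (AUC with sensitivity and specificity at an optimal threshold) is a standard ROC procedure, with the threshold commonly chosen by maximizing Youden's J. A sketch on synthetic data (the marker and outcome values are illustrative, not the study data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Synthetic stand-in: a baseline biomarker (e.g. a urine ratio) that is
# higher on average in patients with the binary outcome of interest.
rng = np.random.default_rng(3)
outcome = rng.integers(0, 2, 200)
marker = 5.0 + 2.0 * outcome + rng.normal(0.0, 1.0, 200)

auc = roc_auc_score(outcome, marker)
fpr, tpr, thresholds = roc_curve(outcome, marker)
best = np.argmax(tpr - fpr)          # Youden's J = sensitivity + specificity - 1
print(f"AUC = {auc:.2f}, cut-off = {thresholds[best]:.2f}")
```

Reporting the AUC together with the sensitivity and specificity at the chosen cut-off, as the study does, summarizes both overall discrimination and the performance of the clinically usable threshold.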
Predictive spatial modelling for mapping soil salinity at continental scale
NASA Astrophysics Data System (ADS)
Bui, Elisabeth; Wilford, John; de Caritat, Patrice
2017-04-01
Soil salinity is a serious limitation to agriculture and one of the main causes of land degradation. Soil is considered saline if its electrical conductivity (EC) is > 4 dS/m. Maps of saline soil distribution are essential for appropriate land development. Previous attempts to map soil salinity over extensive areas have relied on satellite imagery, aerial electromagnetic (EM) and/or proximally sensed EM data; other environmental (climate, topographic, geologic or soil) datasets are generally not used. Having successfully modelled and mapped calcium carbonate distribution over the 0-80 cm depth in Australian soils using machine learning with point samples from the National Geochemical Survey of Australia (NGSA), we took a similar approach to map soil salinity at 90-m resolution over the continent. The input data were the EC1:5 measurements on the < 2 mm fraction at 1315 georeferenced points across the continent at two depth intervals (TOS, 0-10 cm, and BOS, 60-80 cm) (see http://www.ga.gov.au/energy/projects/national-geochemical-survey/atlas.html). These measurements were log-transformed and combined with values for climate, elevation and terrain attributes, soil and lithology classes, geophysics, and MODIS vegetation indices extracted at the same locations, which were used as predictors in decision tree models. The machine learning software 'Cubist' (www.rulequest.com) was used as the inference engine for the modelling, a 90:10 training:test set data split was used to validate results, and 100 randomly sampled trees were built using the training data. The results were good, with an average internal correlation (r) of 0.88 between predicted and measured logEC1:5 (training data), an average external correlation of 0.48 (test subset), and a Lin's concordance correlation coefficient (which evaluates the 1:1 fit) of 0.61. Therefore, the rules derived were mapped and the mean prediction for each 90-m pixel was used for the final logEC1:5 map.
This is the most detailed picture of soil salinity over Australia since the 2001 National Land and Water Resources Audit and is generally consistent with it. Our map will be useful as a baseline salinity map circa 2008, when the NGSA samples were collected, for future State of the Environment reports.
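Lin's concordance correlation coefficient, used above for validation, evaluates agreement against the 1:1 line and so penalizes bias and scale shifts that Pearson's r ignores. A minimal implementation:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between observed and
    predicted values: 1.0 only for perfect agreement on the 1:1 line."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

obs = np.array([0.5, 1.2, 2.0, 2.8, 3.5])   # e.g. measured logEC1:5
pred_perfect = obs.copy()
pred_biased = obs + 1.0                      # same shape, constant offset

print(lins_ccc(obs, pred_perfect))  # 1.0
print(lins_ccc(obs, pred_biased))   # < 1: bias penalized, unlike Pearson r
```

The biased predictions above still have Pearson r = 1, which is why map-validation studies such as this one report the concordance coefficient alongside r.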
Application of XGBoost algorithm in hourly PM2.5 concentration prediction
NASA Astrophysics Data System (ADS)
Pan, Bingyue
2018-02-01
In view of prediction techniques for hourly PM2.5 concentration in China, this paper applied the XGBoost (Extreme Gradient Boosting) algorithm to predict hourly PM2.5 concentration. Air quality monitoring data from the city of Tianjin were analyzed using the XGBoost algorithm. The prediction performance of the XGBoost method was evaluated by comparing observed and predicted PM2.5 concentrations using three measures of forecast accuracy. The XGBoost method was also compared with random forest, multiple linear regression, decision tree regression and support vector regression models. The results demonstrate that the XGBoost algorithm outperforms the other data mining methods.
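The workflow in this abstract is a standard boosted-tree regression evaluated with forecast-accuracy measures. A sketch using scikit-learn's gradient boosting as a stand-in for the XGBoost package, on synthetic data (the features, coefficients, and noise level are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic hourly predictors (think lagged PM2.5 and meteorology)
# and a PM2.5 target with a nonlinear interaction term.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
pm25 = (60.0 + 15.0 * X[:, 0] - 8.0 * X[:, 1]
        + 5.0 * X[:, 2] * X[:, 3] + rng.normal(0.0, 3.0, 2000))

X_tr, X_te, y_tr, y_te = train_test_split(X, pm25, random_state=0)
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5
mae = mean_absolute_error(y_te, pred)
print(f"RMSE = {rmse:.2f}, MAE = {mae:.2f}")
```

RMSE and MAE on a held-out split (plus, typically, a correlation measure) are the kind of forecast-accuracy statistics used to compare XGBoost against the baseline regressors in the paper.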
von Dewitz, Burkhard; Tamm, Susanne; Höflich, Katharina; Voss, Rüdiger; Hinrichsen, Hans-Harald
2018-01-01
The semi-enclosed nature and estuarine characteristics of the Baltic Sea, together with its strongly alternating bathymetry, make it prone to much stronger interannual variations in the abiotic environment than other spawning habitats of Atlantic cod (Gadus morhua). Processes determining salinity and oxygen conditions in the basins are influenced both by long-term gradual climate change, e.g. global warming, and by short-term meteorological variations and events. Specifically, one main factor influencing cod spawning conditions, the advection of highly saline and well-oxygenated water masses from the North Sea, occurs at irregular frequencies and causes strong interannual variations in stock productivity. This study investigates the possibility of using the available hydrographic process knowledge to predict the annual spawning conditions for Eastern Baltic cod in its most important spawning ground, the Bornholm Basin, from salinity measurements at a single location in the western Baltic. Such a prediction could serve as an environmental early-warning indicator to inform stock assessment and management. Here we used a hydrodynamic model to hindcast hydrographic property fields for the last 40+ years. High and significant correlations were found for months early in the year between the 33 m salinity level in the Arkona Basin and the oxygen-dependent cod spawning environment in the Bornholm Basin. Direct prediction of Eastern Baltic cod egg survival in the Bornholm Basin based on salinity values at the 33 m depth level in the Arkona Basin is shown to be possible for eggs spawned by mid-age and young females, which currently dominate the stock structure. We recommend routinely performing short-term predictions of the Eastern Baltic cod spawning environment, in order to generate environmental information highly relevant for stock dynamics.
Our statistical approach offers the opportunity to make the best use of permanently existing infrastructure in the western Baltic to provide timely scientific knowledge on the spawning conditions of Eastern Baltic cod. Furthermore, it could be a cost-effective tool to assist ecosystem-based fisheries management by including the short-term predictions as a simple indicator in the annual assessments.
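The early-warning indicator described above amounts to a regression of the spawning-environment index on early-year Arkona Basin salinity at 33 m. A sketch on synthetic hindcast years (the linear relation, noise level, and salinity range are assumptions for illustration, not the model hindcast):

```python
import numpy as np

# Synthetic hindcast: early-year Arkona Basin salinity at 33 m vs an
# index of the Bornholm Basin spawning environment, over 40 model years.
rng = np.random.default_rng(1)
arkona_sal = rng.uniform(8.0, 18.0, 40)
spawn_env = 0.9 * arkona_sal - 5.0 + rng.normal(0.0, 1.0, 40)

r = np.corrcoef(arkona_sal, spawn_env)[0, 1]
slope, intercept = np.polyfit(arkona_sal, spawn_env, 1)

# Early-warning use: predict the spawning environment from a new
# salinity observation at the western-Baltic monitoring location.
predicted = slope * 12.0 + intercept
print(f"r = {r:.2f}, prediction at S = 12: {predicted:.2f}")
```

Once such a correlation is established from the hindcast, only the routinely measured western-Baltic salinity is needed each year, which is the cost-effectiveness argument the authors make.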
Application of SMAP Data for Ocean Surface Remote Sensing
NASA Astrophysics Data System (ADS)
Fore, A.; Yueh, S. H.; Tang, W.; Stiles, B. W.; Hayashi, A.
2017-12-01
The Soil Moisture Active Passive (SMAP) mission was launched January 31st, 2015. It is designed to measure soil moisture over land using a combined active/passive L-band system. Thanks to the Aquarius mission, L-band model functions for ocean winds and salinity are mature and directly applicable to SMAP. In contrast to Aquarius, the higher resolution and scanning geometry of SMAP allow wide-swath ocean winds and salinities to be retrieved. In this talk we present the SMAP Sea Surface Salinity (SSS) and extreme winds dataset and its performance. First we discuss the heritage of the SMAP SSS algorithms, showing that SMAP and Aquarius are in excellent agreement in the ocean surface roughness correction. Then we give an overview of some newly developed algorithms that are relevant only to the SMAP system: a new galaxy correction and a land correction enabling SSS retrievals to within 40 km of the coast. We discuss recent improvements to the SMAP data processing for version 4.0. Next we compare the performance of SMAP SSS to in situ salinity measurements obtained from Argo floats, tropical moored buoys, and ship-based data. SMAP SSS has an accuracy of 0.2 PSU on a monthly basis compared to Argo gridded data in the tropics and mid-latitudes. In tropical oceans, time-series comparison with salinity measured at 1 m depth by moored buoys indicates that SMAP can track large salinity changes within a month. Synergistic analysis of SMAP, SMOS, and Argo data allows us to identify and exclude erroneous buoy data from the assessment of SMAP SSS. The resulting SMAP-buoy matchup analysis gives a mean standard deviation (STD) of 0.22 PSU and a correlation of 0.73 on a weekly scale; at the monthly scale the mean STD decreases to 0.17 PSU and the correlation increases to 0.8. In addition to SSS, SMAP provides a view into tropical cyclones with much higher sensitivity than traditional scatterometers.
We validate the high-winds using collocations with SFMR during tropical cyclones as well as triple-collocations with RapidScat and WindSat. We consider two validation regimes, storm force winds and hurricane force winds. For storm force winds we validate using other space-borne scatterometers and microwave radiometers as well as with SFMR, however, for hurricane force winds we must use SFMR. Finally we discuss the various data products and where they may be obtained.
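The triple collocation mentioned above (SMAP with RapidScat and WindSat) has a standard closed form: given three collocated measurements of the same truth with mutually independent errors, each system's error variance is the covariance of its two pairwise differences. A minimal sketch on synthetic wind speeds — the instrument names and noise levels below are illustrative, not results from the study:

```python
import math
import random

def tc_error_std(x, y, z):
    """Triple-collocation estimate of the random-error std of system x:
    sigma_x^2 = Cov(x - y, x - z), valid when all three systems observe
    the same truth with mutually independent additive errors."""
    n = len(x)
    d1 = [a - b for a, b in zip(x, y)]
    d2 = [a - b for a, b in zip(x, z)]
    m1, m2 = sum(d1) / n, sum(d2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(d1, d2)) / n
    return math.sqrt(max(cov, 0.0))

# synthetic collocated wind speeds; the noise levels are invented
random.seed(0)
truth = [random.uniform(5.0, 25.0) for _ in range(20000)]
smap  = [t + random.gauss(0.0, 1.5) for t in truth]
rscat = [t + random.gauss(0.0, 1.0) for t in truth]
wsat  = [t + random.gauss(0.0, 0.8) for t in truth]
print(round(tc_error_std(smap, rscat, wsat), 2))  # close to the true 1.5
```

With only two systems the individual error variances are not identifiable, which is why a third independent data set is needed.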
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyer, R.A.; McGovern, D.G.
1991-01-01
Two 28-day, life-cycle tests were conducted to evaluate the effects of constant and fluctuating salinities on the chronic toxicity of cadmium to Mysidopsis bahia at 27°C. Salinities of 10 to 32‰ and cadmium concentrations of 1 to 9 micrograms/l were examined. Estimated median tolerance concentrations at day 28 ranged from 4.8 to 6.3 micrograms Cd/l over the salinity range of 13 to 29‰. Size and fecundity of exposed and unexposed females were predicted to be comparable when cadmium was equal to or greater than 5.0 micrograms Cd/l and salinities equal to or less than 20‰, and at concentrations of less than 5 micrograms/l at lower salinities. At higher cadmium levels both responses were impaired regardless of salinity. Reproduction in control treatments was an order of magnitude lower in low (10 and 13‰) as compared to high (21, 29, 32‰) salinity treatments. This effect of salinity on reproduction was not moderated by periodic exposure to higher, more suitable salinities. Survival, growth, and reproduction were not impacted by the addition of 5 micrograms Cd/l under fluctuating salinity conditions. The no-effect concentration is 4-5 micrograms Cd/l regardless of salinity. The changes in survival, growth, and reproduction observed are consistent with the principal distribution of M. bahia in estuaries relative to salinity. Comparison of these data with previously reported acute responses suggests that the acute water quality criterion for cadmium should be salinity-dependent, whereas the chronic criterion need not be.
Submarine groundwater discharge and solute transport under a transgressive barrier island
NASA Astrophysics Data System (ADS)
Evans, Tyler B.; Wilson, Alicia M.
2017-04-01
Many recent investigations of groundwater dynamics in beaches employed groundwater models that assumed isotropic, numerically-convenient hydrogeological conditions. Real beaches exhibit local variability with respect to stratigraphy, sediment grain size and associated topographic profile, so that groundwater flow may diverge significantly from idealized models. We used a combination of hydrogeologic field methods and a variable-density, saturated-unsaturated, transient groundwater flow model to investigate SGD and solute transport under Cabretta Beach, a small transgressive barrier island seaward of Sapelo Island, Georgia. We found that the inclusion of real beach heterogeneity drove important deviations from predictions based on theoretical beaches. Cabretta Beach sustained a stronger upper saline plume than predicted due to the presence of a buried silty mud layer beneath the surface. Infiltration of seawater was greater for neap tides than for spring tides due to variations in beach slope. The strength of the upper saline plume was greatest during spring tides, contrary to recent model predictions. The position and width of the upper saline plume was highly dynamic through the lunar cycle. Our results suggest that field measurements of salinity gradients may be useful for estimating rates of tidally and density driven recirculation through the beach. Finally, our results indicate that several important biogeochemical cycles recently studied at Cabretta Beach were heavily influenced by groundwater flow and associated solute transport.
Juvenile bottlenecks and salinity shape grey mullet assemblages in Mediterranean estuaries
NASA Astrophysics Data System (ADS)
Cardona, Luis; Hereu, Bernat; Torras, Xavier
2008-05-01
Previous research has suggested that competitive bottlenecks may exist for the Mediterranean grey mullets (Osteichthyes, Mugilidae) at the fry stage with the exotic Cyprinus carpio (Osteichthyes, Cyprinidae) playing a central role. As a consequence, the structure of grey mullet assemblages at later stages is thought to reflect previous competition as well as differences in osmoregulatory skills. This paper tests that hypothesis by examining four predictions about the relative abundance of five grey mullet species in 42 Western Mediterranean estuary sites from three areas (Aiguamolls de l'Empordà, Ebro Delta and Minorca) differing in the salinity level and occurrence of C. carpio. Field data confirmed the predictions as: (1) Liza aurata and Mugil cephalus were scarce everywhere and never dominated the assemblage; (2) Liza saliens dominated the assemblage where the salinity level was higher than 13; (3) Liza ramado always dominated the assemblage where the salinity level was lower than 13 and C. carpio was present; and (4) Chelon labrosus dominated the assemblage only where the salinity level was lower than 13 and C. carpio was absent. The catch per unit effort of C. labrosus of any size was smaller in the presence of C. carpio than where it had not been introduced, which is in agreement with the juvenile competitive bottleneck hypothesis. Discriminant analysis confirmed that the assemblage structure was linked to the salinity level and the occurrence of C. carpio for both early juveniles and late juveniles as well as adults. The data reported here reveal that the structure of grey mullet assemblages inhabiting Mediterranean estuaries is determined by salinity and competitive interactions at the fry stage.
Learning Instance-Specific Predictive Models
Visweswaran, Shyam; Cooper, Gregory F.
2013-01-01
This paper introduces a Bayesian algorithm for constructing predictive models from data that are optimized to predict a target variable well for a particular instance. The algorithm learns Markov blanket models, carries out Bayesian model averaging over a set of models to predict the target variable of the instance at hand, and employs an instance-specific heuristic to locate a set of suitable models to average over. We call this method the instance-specific Markov blanket (ISMB) algorithm. The ISMB algorithm was evaluated on 21 UCI data sets using five different performance measures, and its performance was compared to that of several commonly used predictive algorithms, including naïve Bayes, C4.5 decision tree, logistic regression, neural networks, k-Nearest Neighbor, Lazy Bayesian Rules, and AdaBoost. Over all the data sets, the ISMB algorithm on average outperformed every comparison algorithm on every performance measure. PMID:25045325
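The model-averaging step described above can be illustrated generically: predictive distributions from several candidate models are combined with weights proportional to each model's marginal likelihood. This is a sketch of Bayesian model averaging in general, not of the ISMB algorithm's actual Markov blanket search; all model scores and probabilities below are made up:

```python
import math

def bayesian_model_average(predictions, log_scores):
    """Combine per-model predictive distributions P(y | x, m) with weights
    proportional to exp(log_score), i.e. each model's marginal likelihood.
    `predictions` is a list of dicts mapping class labels to probabilities;
    `log_scores` holds the corresponding log P(D | m) values."""
    mx = max(log_scores)
    weights = [math.exp(s - mx) for s in log_scores]   # stabilized softmax
    z = sum(weights)
    weights = [w / z for w in weights]
    labels = predictions[0].keys()
    return {y: sum(w * p[y] for w, p in zip(weights, predictions))
            for y in labels}

# three hypothetical models scoring the same test instance
preds = [{"disease": 0.9, "healthy": 0.1},
         {"disease": 0.6, "healthy": 0.4},
         {"disease": 0.7, "healthy": 0.3}]
scores = [-10.0, -12.0, -11.0]   # illustrative log marginal likelihoods
avg = bayesian_model_average(preds, scores)
print(round(avg["disease"], 3))  # ≈ 0.824: dominated by the best-scoring model
```

The averaged prediction leans toward the best-scoring model but still pools information from the others, which is the point of averaging rather than selecting a single model.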
A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm
Chen, Jui-Le; Yang, Chu-Sing
2013-01-01
The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although computer science technologies can be used to reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction remains unsatisfactory. Hence, in this paper, a novel docking prediction algorithm, named the fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate docking prediction. The proposed algorithm leverages two high-performance operators: (1) a novel migration (information exchange) operator designed specifically for cloud-based environments to reduce the computation time; and (2) an efficient operator aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both computation time and the quality of the end result. PMID:23762864
Stable near-surface ocean salinity stratifications due to evaporation observed during STRASSE
NASA Astrophysics Data System (ADS)
Asher, William E.; Jessup, Andrew T.; Clark, Dan
2014-05-01
Under conditions with a large solar flux and low wind speed, a stably stratified warm layer forms at the ocean surface. Evaporation can then lead to an increase in salinity in the warm layer. A large temperature gradient will decrease density enough to counter the density increase caused by the salinity increase, forming a stable positive salinity anomaly at the surface. If these positive salinity anomalies are large in terms of the change in salinity from surface to the base of the gradient, if their areal coverage is a significant fraction of the satellite footprint, and if they persist long enough to be in the satellite field of view, they could be relevant for calibration and validation of L-band microwave salinity measurements. A towed, surface-following profiler was deployed from the N/O Thalassa during the Subtropical Atlantic Surface Salinity Experiment (STRASSE). The profiler measured temperature and conductivity in the surface ocean at depths of 10, 50, and 100 cm. The measurements show that positive salinity anomalies are common at the ocean surface for wind speeds less than 4 m s-1 when the average daily insolation is >300 W m-2 and the sea-to-air latent heat flux is greater than zero. A semiempirical model predicts the observed dependence of measured anomalies on environmental conditions. However, the model results and the field data suggest that these ocean surface salinity anomalies are not large enough in terms of the salinity difference to significantly affect microwave radiometric measurements of salinity.
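The competition between warming and evaporation described above can be checked with a linearized equation of state: the salinity anomaly stays at the surface as long as the thermal density decrease exceeds the haline density increase. A minimal sketch, assuming typical warm-water values for the expansion and contraction coefficients (the numbers are illustrative, not taken from STRASSE):

```python
def density_anomaly(dT, dS, alpha=3.0e-4, beta=7.6e-4, rho0=1025.0):
    """Linearized surface density change (kg m^-3):
    d_rho = rho0 * (-alpha*dT + beta*dS), where alpha is the thermal
    expansion coefficient (1/K) and beta the haline contraction
    coefficient (1/psu); both defaults are assumed warm-water values."""
    return rho0 * (-alpha * dT + beta * dS)

# a 0.5 K warm layer can hold a +0.1 psu salinity anomaly at the surface:
print(density_anomaly(0.5, 0.1) < 0)   # lighter than the water below -> stable
# with no warming, the same salinity anomaly would be denser and sink:
print(density_anomaly(0.0, 0.1) > 0)
```

The sign of this single expression is the stability criterion the abstract describes: a sufficiently large temperature anomaly keeps the saltier surface layer buoyant.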
A test to evaluate the earthquake prediction algorithm, M8
Healy, John H.; Kossobokov, Vladimir G.; Dewey, James W.
1992-01-01
A test of the algorithm M8 is described. The test is constructed to meet four rules, which we propose to be applicable to the test of any method for earthquake prediction: 1. An earthquake prediction technique should be presented as a well documented, logical algorithm that can be used by investigators without restrictions. 2. The algorithm should be coded in a common programming language and implementable on widely available computer systems. 3. A test of the earthquake prediction technique should involve future predictions with a black box version of the algorithm in which potentially adjustable parameters are fixed in advance. The source of the input data must be defined and ambiguities in these data must be resolved automatically by the algorithm. 4. At least one reasonable null hypothesis should be stated in advance of testing the earthquake prediction method, and it should be stated how this null hypothesis will be used to estimate the statistical significance of the earthquake predictions. The M8 algorithm has successfully predicted several destructive earthquakes, in the sense that the earthquakes occurred inside regions with linear dimensions from 384 to 854 km that the algorithm had identified as being in times of increased probability for strong earthquakes. In addition, M8 has successfully "post predicted" high percentages of strong earthquakes in regions to which it has been applied in retroactive studies. The statistical significance of previous predictions has not been established, however, and post-prediction studies in general are notoriously subject to success-enhancement through hindsight. Nor has it been determined how much more precise an M8 prediction might be than forecasts and probability-of-occurrence estimates made by other techniques. 
We view our test of M8 both as a means to better determine the effectiveness of M8 and as an experimental structure within which to make observations that might lead to improvements in the algorithm or conceivably lead to a radically different approach to earthquake prediction.
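Rule 4 above requires a null hypothesis against which prediction success can be scored. A common choice for alarm-based schemes (offered here as a generic illustration, not necessarily the null used in the M8 test) is that target earthquakes fall independently of the alarms, so the hit count is binomial in the fraction of space-time the alarms occupy:

```python
from math import comb

def alarm_p_value(n_quakes, n_hits, tau):
    """One-sided binomial p-value for an alarm-based prediction test:
    under the null that each of n_quakes target earthquakes independently
    has probability tau (the space-time fraction covered by alarms) of
    falling inside an alarm, return P(hits >= n_hits)."""
    return sum(comb(n_quakes, k) * tau**k * (1 - tau)**(n_quakes - k)
               for k in range(n_hits, n_quakes + 1))

# e.g. 7 of 10 strong earthquakes inside alarms covering 30% of space-time
p = alarm_p_value(10, 7, 0.30)
print(round(p, 4))  # small p-value: better than random guessing
```

Large alarm regions (tau close to 1) make hits cheap and the p-value large, which is exactly the success-enhancement concern the abstract raises about post-prediction studies.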
A traveling salesman approach for predicting protein functions.
Johnson, Olin; Liu, Jing
2006-10-12
Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems.
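The direct neighbor baseline against which the TSP-based method is compared can be sketched simply: a query protein is assigned the functions that occur most often among its immediate interaction partners. The protein IDs and annotations below are hypothetical:

```python
from collections import Counter

def direct_neighbor_predict(query, interactions, annotations, top=3):
    """'Direct neighbor' baseline: predict the functions most frequent
    among the query protein's immediate interaction partners. (The paper's
    TSP-based method instead clusters proteins globally before voting;
    this sketch shows only the baseline it is compared against.)"""
    votes = Counter()
    for partner in interactions.get(query, []):
        votes.update(annotations.get(partner, []))
    return [fn for fn, _ in votes.most_common(top)]

# toy yeast-like interaction map with hypothetical function labels
interactions = {"YAL001C": ["YBR123W", "YCR042C", "YDR145W"]}
annotations = {"YBR123W": ["transcription"],
               "YCR042C": ["transcription", "DNA binding"],
               "YDR145W": ["transcription"]}
print(direct_neighbor_predict("YAL001C", interactions, annotations, top=1))
```

Because this baseline sees only one hop of the network, proteins whose partners are poorly annotated get poor predictions; the global clustering approach is meant to address exactly that weakness.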
A traveling salesman approach for predicting protein functions
Johnson, Olin; Liu, Jing
2006-01-01
Background Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Results Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Conclusion Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems. PMID:17147783
NASA Astrophysics Data System (ADS)
Little, S.; Wood, P. J.; Elliott, M.
2017-11-01
Coastal and estuarine systems worldwide are under threat from global climate change, with potential consequences including increased salinities and incursion of saltwater into areas currently subject to tidal and non-tidal freshwater regimes. It is commonly assumed that climate-driven increases in estuarine salinities and saline incursion will be directly reflected in an upstream shift in species distributions and patterns of community composition based on salinity tolerance. This study examined the responses of benthos to medium-term salinity changes in two macrotidal river-estuary systems in SE England to test whether these responses may be representative of climate-induced salinity changes over the long term. The study reinforced the role of salinity, related to tidal incursion, as the primary environmental driver of benthic species distribution and community composition. Salinity, however, acted within a hierarchy of factors, followed by substratum type, with biotic competition and predator-prey relationships superimposed on these. The assumption that increasing salinities will be directly reflected in an upstream shift in species distributions and patterns of community composition over the long term was shown to be oversimplified and not representative of a complex and highly variable system. Relative Sea Level Rise (RSLR) projections were predicted to increase estuarine salinities and saline incursion in the study estuaries, which together with projected reductions in river flow will have important consequences for estuarine structure and function, particularly in tidal limnetic zones, despite estuarine communities being pre-adapted to cope with fluctuating salinities. The study identified, however, that limnic-derived fauna inhabiting these zones may be more tolerant of salinity change than is currently recognised and may persist where salinity increases are gradual and zones unbounded.
Nigatu, Yeshambel T; Liu, Yan; Wang, JianLi
2016-07-22
Multivariable risk prediction algorithms are useful for making clinical decisions and for health planning. While prediction algorithms for new onset of major depression in primary care attendees have been developed in Europe and elsewhere, the performance of these algorithms in different populations is not known. The objective of this study was to validate the PredictD algorithm for new onset of major depressive episode (MDE) in the US general population. A longitudinal study design was used, with approximately 3-year follow-up data from a nationally representative sample of the US general population. A total of 29,621 individuals who participated in Waves 1 and 2 of the US National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) and who did not have an MDE in the past year at Wave 1 were included. The PredictD algorithm was directly applied to the selected participants. MDE was assessed by the Alcohol Use Disorder and Associated Disabilities Interview Schedule, based on the DSM-IV criteria. Among the participants, 8% developed an MDE over three years. The PredictD algorithm had acceptable discriminative power (C-statistic = 0.708, 95% CI: 0.696, 0.720) but poor calibration (p < 0.001) with the NESARC data. In the European primary care attendees, the algorithm had a C-statistic of 0.790 (95% CI: 0.767, 0.813) with perfect calibration. Thus the PredictD algorithm has acceptable discrimination, but its calibration capacity in the US general population was poor despite re-calibration. Based on these results, at the current stage, the use of PredictD in the US general population for predicting individual risk of MDE is not encouraged. More independent validation research is needed.
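The discrimination measure reported above, the C-statistic, is the probability that a randomly chosen case received a higher predicted risk than a randomly chosen non-case (ties count one half). A minimal sketch with made-up risk scores:

```python
def c_statistic(scores, outcomes):
    """Concordance (C) statistic: P(score_case > score_non_case) over all
    case/non-case pairs, counting exact ties as one half."""
    cases = [s for s, y in zip(scores, outcomes) if y == 1]
    controls = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum((c > n) + 0.5 * (c == n)
                     for c in cases for n in controls)
    return concordant / (len(cases) * len(controls))

# toy predicted risks: most who developed MDE (1) scored above non-cases (0)
risks    = [0.9, 0.8, 0.4, 0.7, 0.2, 0.1]
observed = [1,   1,   1,   0,   0,   0]
print(round(c_statistic(risks, observed), 3))  # 8 of 9 pairs concordant
```

A value of 0.5 is chance-level ranking; 0.708, as reported for NESARC, means the algorithm orders risk reasonably well even though its absolute probabilities (calibration) are off.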
Rice, Karen C.; Bennett, Mark; Shen, Jian
2011-01-01
As a result of climate change and variability, sea level is rising throughout the world, but the rate along the east coast of the United States is higher than the global mean rate. The U.S. Geological Survey, in cooperation with the City of Newport News, Virginia, conducted a study to evaluate the effects of possible future sea-level rise on the salinity front in two tributaries to Chesapeake Bay: the York River and the Chickahominy/James River estuaries. Numerical modeling was used to represent sea-level rise and the resulting hydrologic effects. Estuarine models for the two tributaries were developed and model simulations were made using the Three-Dimensional Hydrodynamic-Eutrophication Model (HEM-3D), developed by the Virginia Institute of Marine Science. HEM-3D was used to simulate tides, tidal currents, and salinity for Chesapeake Bay, the York River, and the Chickahominy/James River. The three sea-level rise scenarios evaluated were increases of 30, 50, and 100 centimeters (cm). Model results for both estuaries indicated that high freshwater river flow was effective in pushing the salinity back toward Chesapeake Bay. Model results also indicated that increases in mean salinity will greatly alter the existing water-quality gradients between brackish water and freshwater. This will be particularly important for the freshwater part of the Chickahominy River, where a drinking-water-supply intake for the City of Newport News is located. Significant changes in the salinity gradients of the York River and Chickahominy/James River estuaries were predicted for all three sea-level rise scenarios. For a 50-cm sea-level rise scenario on the York River during a typical year (2005), the model simulation showed a salinity of 15 parts per thousand (ppt) at river kilometer (km) 39.
During a dry year (2002), the same salinity (15 ppt) was simulated at river km 45, which means that saltwater was shown to migrate 6 km farther upstream during a dry year than a typical year. The same was true of the Chickahominy River for a 50-cm sea-level rise scenario but to a greater extent; a salinity of 4 ppt was simulated at river km 13 during a typical year and at river km 28 during a dry year, indicating that saltwater migrated 15 km farther upstream during a dry year. Near a drinking-water intake on the Chickahominy River, for a dry year, salinity is predicted to more than double for all three sea-level rise scenarios, relative to a typical year. During a typical year at this location, salinity is predicted to increase to 0.006, 0.07, and more than 2 ppt for the 30-, 50-, and 100-cm rise scenarios, respectively.
Research on wind field algorithm of wind lidar based on BP neural network and grey prediction
NASA Astrophysics Data System (ADS)
Chen, Yong; Chen, Chun-Li; Luo, Xiong; Zhang, Yan; Yang, Ze-hou; Zhou, Jie; Shi, Xiao-ding; Wang, Lei
2018-01-01
This paper uses a BP neural network and a grey algorithm to forecast and study the radar wind field. To reduce the residual error of the grey-algorithm wind field prediction, the residual sequence of the grey algorithm is used to train a BP neural network by minimizing the residual error function; the trained network model then forecasts the residual sequence, and the predicted residuals are used to correct the forecast sequence of the grey algorithm. The test data show that the grey algorithm modified by the BP neural network can effectively reduce the residual error and improve the prediction precision.
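The grey component of such hybrids is typically the GM(1,1) model: fit dx1/dt + a*x1 = b to the accumulated series, forecast, and difference back. The sketch below implements GM(1,1) and, where the paper trains a BP neural network on the residual sequence, substitutes a simple mean-residual correction as a stand-in; the wind-speed series is invented:

```python
import math

def gm11_forecast(x, steps=1):
    """Grey GM(1,1) forecast: fit dx1/dt + a*x1 = b to the cumulative
    series x1, then return fitted values plus `steps` forecasts."""
    n = len(x)
    x1 = [sum(x[:i + 1]) for i in range(n)]                 # cumulative sum
    z = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]   # background values
    y = x[1:]
    # least squares for [a, b] in x0(k) = -a*z(k) + b (2x2 normal equations)
    m = n - 1
    szz = sum(v * v for v in z); sz = sum(z)
    sy = sum(y); szy = sum(v * w for v, w in zip(z, y))
    det = m * szz - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    def x1_hat(k):                                          # k = 0, 1, 2, ...
        return (x[0] - b / a) * math.exp(-a * k) + b / a
    return [x[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + steps)]

series = [26.7, 31.5, 32.8, 34.1, 35.8, 37.5]   # illustrative wind speeds
fit = gm11_forecast(series, steps=1)            # fitted values + 1 forecast
residuals = [obs - est for obs, est in zip(series, fit)]
# The paper trains a BP neural network on `residuals` and adds its forecast
# of the next residual to the grey forecast; a mean-residual stand-in:
corrected = fit[-1] + sum(residuals) / len(residuals)
print(round(fit[-1], 2), round(corrected, 2))
```

The hybrid's value comes from the residual model capturing the structure GM(1,1) misses; the flat mean correction here only marks where the network would plug in.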
NASA Astrophysics Data System (ADS)
Nazeer, Majid; Bilal, Muhammad
2018-04-01
Landsat-5 Thematic Mapper (TM) data were used to estimate salinity in the coastal waters of Hong Kong. Four adjacent Landsat TM images were used in this study, which were atmospherically corrected using the Second Simulation of the Satellite Signal in the Solar Spectrum (6S) radiative transfer code. The atmospherically corrected images were then used to develop models for salinity using Ordinary Least Squares (OLS) regression and Geographically Weighted Regression (GWR) based on in situ data from October 2009. Results show a coefficient of determination (R2) of 0.42 between the OLS-estimated and in situ measured salinity, roughly half that of the GWR model (R2 = 0.86). This indicates that the GWR model is better able than OLS regression to predict salinity and to capture its spatial heterogeneity. Salinity was observed to be high in Deep Bay (the north-western part of Hong Kong), possibly due to industrial waste disposal, whereas salinity was estimated to be nearly constant (32 practical salinity units) towards the open sea.
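The key difference between OLS and GWR is that GWR refits the regression at every location, down-weighting distant observations with a distance kernel, so the reflectance-salinity relationship may vary across the map. A one-predictor sketch with a Gaussian kernel; the station coordinates, reflectances, salinities, and bandwidth are all invented, since the abstract does not give the study's bands or kernel:

```python
import math

def gwr_fit(x0, y0, stations, bandwidth=5.0):
    """Local weighted least squares at (x0, y0): each in-situ station is
    weighted by a Gaussian kernel of its distance, which is the essence
    of geographically weighted regression. `stations` holds
    (x, y, reflectance, salinity) tuples."""
    sw = swr = sws = swrr = swrs = 0.0
    for x, y, refl, sal in stations:
        w = math.exp(-(math.hypot(x - x0, y - y0) / bandwidth) ** 2)
        sw += w; swr += w * refl; sws += w * sal
        swrr += w * refl * refl; swrs += w * refl * sal
    slope = (sw * swrs - swr * sws) / (sw * swrr - swr * swr)
    intercept = (sws - slope * swr) / sw
    return slope, intercept

# (x_km, y_km, band reflectance, salinity in PSU) -- synthetic stations
# built so that salinity = 30 - 2 * reflectance everywhere
stations = [(0, 0, 0.10, 29.8), (1, 0, 0.20, 29.6),
            (0, 1, 0.30, 29.4), (5, 5, 0.15, 29.7)]
slope, intercept = gwr_fit(0, 0, stations)
print(round(slope, 2), round(intercept, 2))  # local fit recovers -2, 30
```

With spatially varying in situ data, repeating the fit at each pixel yields a map of local coefficients, which is how GWR exposes the spatial heterogeneity OLS averages away.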
Sublethal salinity stress contributes to habitat limitation in an endangered estuarine fish.
Komoroske, Lisa M; Jeffries, Ken M; Connon, Richard E; Dexter, Jason; Hasenbein, Matthias; Verhille, Christine; Fangue, Nann A
2016-09-01
As global change alters multiple environmental conditions, predicting species' responses can be challenging without understanding how each environmental factor influences organismal performance. Approaches quantifying mechanistic relationships can greatly complement correlative field data, strengthening our abilities to forecast global change impacts. Substantial salinity increases are projected in the San Francisco Estuary, California, due to anthropogenic water diversion and climatic changes, where the critically endangered delta smelt (Hypomesus transpacificus) largely occurs in a low-salinity zone (LSZ), despite their ability to tolerate a much broader salinity range. In this study, we combined molecular and organismal measures to quantify the physiological mechanisms and sublethal responses involved in coping with salinity changes. Delta smelt utilize a suite of conserved molecular mechanisms to rapidly adjust their osmoregulatory physiology in response to salinity changes in estuarine environments. However, these responses can be energetically expensive, and delta smelt body condition was reduced at high salinities. Thus, acclimating to salinities outside the LSZ could impose energetic costs that constrain delta smelt's ability to exploit these habitats. By integrating data across biological levels, we provide key insight into the mechanistic relationships contributing to phenotypic plasticity and distribution limitations and advance the understanding of the molecular osmoregulatory responses in nonmodel estuarine fishes.
Hart, Kristen M.; Schofield, Pamela J.; Gregoire, Denise R.
2012-01-01
In a laboratory setting, we tested the ability of 24 non-native, wild-caught hatchling Burmese pythons (Python molurus bivittatus) collected in the Florida Everglades to survive when given water containing salt to drink. After a one-month acclimation period in the laboratory, we grouped snakes into three treatments, giving them access to water that was fresh (salinity of 0, control), brackish (salinity of 10), or full-strength sea water (salinity of 35). Hatchlings survived about one month at the highest marine salinity and about five months at the brackish-water salinity; no control animals perished during the experiment. These results are indicative of a "worst-case scenario", as in the laboratory we denied access to alternate fresh-water sources that may be accessible in the wild (e.g., through rainfall). Therefore, our results may underestimate the potential of hatchling pythons to persist in saline habitats in the wild. Because of the effect of different salinity regimes on survival, predictions of ultimate geographic expansion by non-native Burmese pythons that consider salt water as barriers to dispersal for pythons may warrant re-evaluation, especially under global climate change and associated sea-level-rise scenarios.
Espinar, J.L.; Garcia, L.V.; Clemente, L.
2005-01-01
The effect of salinity level and extended exposure to different salinity and flooding conditions on germination patterns of three saltmarsh clonal growth plants (Juncus subulatus, Scirpus litoralis, and S. maritimus) was studied. Seed exposure to extended flooding and saline conditions significantly affected the outcome of the germination process in a different, though predictable, way for each species, after favorable conditions for germination were restored. Tolerance of the germination process was related to the average salinity level measured during the growth/germination season at sites where established individuals of each species dominated the species cover. No relationship was found between salinity tolerance of the germination process and seed response to extended exposure to flooding and salinity conditions. The salinity response was significantly related to the conditions prevailing in the habitats of the respective species during the unfavorable (nongrowth/nongermination) season. Our results indicate that changes in salinity and hydrology while seeds are dormant affect the outcome of the seed-bank response, even when conditions at germination are identical. Because these environmental-history-dependent responses differentially affect seed germination, seedling density, and probably sexual recruitment in the studied and related species, these influences should be considered for wetland restoration and management.
Zhu, Liqin; Jiang, Cuiling; Wang, Youheng; Peng, Yanmei; Zhang, Peng
2013-09-01
Water salinization of coastal reservoirs seriously threatens the safety of their water supply. To elucidate the mechanism of salinization and to quantitatively analyze the risk in the initial period of the impoundment of a proposed reservoir in Tianjin Binhai New Area, laboratory and field simulation experiments were implemented and integrated with the actual operation of Beitang Reservoir, which is located in the same region and has been operational for many years. The results suggested that water salinization of the proposed reservoir was mainly governed by soil saline release, evaporation and leakage. Saline release was the prevailing factor in the earlier stage of the impoundment, then the evaporation and leakage effects gradually became notable over time. By referring to the actual case of Beitang Reservoir, it was predicted that the chloride ion (Cl(-)) concentration of the water during the initial impounding period of the proposed reservoir would exceed the standard for quality of drinking water from surface water sources (250 mg L(-1)), and that the proposed reservoir had a high risk of water salinization.
Lü, Si-Dan; Chen, Wei-Ping; Wang, Mei-E
2012-12-01
In order to promote safe irrigation with reclaimed water and prevent soil salinisation, the dynamic transport of salts in urban soils of Beijing under irrigation of reclaimed water was simulated by ENVIRO-GRO model in this study. The accumulation trends and profile distribution of soil salinity were predicted. Simultaneously, the effects of different soil properties and plants on soil water-salt movement and salt accumulation were investigated. Results indicated that soil salinity in the profiles reached uniform equilibrium conditions by repeated simulation, with different initial soil salinity. Under the conditions of loam and clay loam soil, salinity in the profiles increased over time until reaching equilibrium conditions, while under the condition of sandy loam soil, salinity in the profiles decreased over time until reaching equilibrium conditions. The saturated soil salinity (EC(e)) under equilibrium conditions followed an order of sandy loam < loam < clay loam. Salt accumulations in Japan euonymus and Chinese pine were less than that in Blue grass. The temporal and spatial distributions of soil salinity were also different in these three types of plants. In addition, the growth of the plants was not influenced by soil salinity (except clay loam), but mild soil salinization occurred under all conditions (except sandy loam).
Fast Demand Forecast of Electric Vehicle Charging Stations for Cell Phone Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majidpour, Mostafa; Qiu, Charlie; Chung, Ching-Yen
This paper describes the core cellphone application algorithm which has been implemented for the prediction of energy consumption at Electric Vehicle (EV) charging stations at UCLA. For this interactive user application, the total time for accessing the database, processing the data, and making the prediction needs to be within a few seconds. We analyze four relatively fast machine-learning-based time series prediction algorithms for our prediction engine: Historical Average, k-Nearest Neighbor, Weighted k-Nearest Neighbor, and Lazy Learning. The Nearest Neighbor algorithm (k-Nearest Neighbor with k = 1) shows the best performance and is selected as the prediction algorithm implemented for the cellphone application. Two applications have been designed on top of the prediction algorithm: one predicts the expected available energy at the station and the other predicts the expected charging finishing time. The total time, including accessing the database, data processing, and prediction, is about one second for both applications.
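A k = 1 nearest-neighbor time series predictor amounts to: find the historical window most similar to the most recent observations and return the value that followed it. A sketch on an invented hourly load series (the distance metric and window length are assumptions, as the paper's exact feature encoding is not given in the abstract):

```python
def knn_forecast(history, pattern, k=1):
    """Nearest-neighbor time-series forecast: find the k historical windows
    closest to `pattern` (squared-distance metric) and average the value
    that followed each. With k = 1 this is the scheme the paper selected
    for the cellphone application."""
    m = len(pattern)
    candidates = []
    for i in range(len(history) - m):
        window = history[i:i + m]
        dist = sum((a - b) ** 2 for a, b in zip(window, pattern))
        candidates.append((dist, history[i + m]))
    candidates.sort(key=lambda t: t[0])
    nearest = candidates[:k]
    return sum(v for _, v in nearest) / len(nearest)

# hourly kWh drawn at a station over two days (illustrative numbers)
load = [1, 2, 5, 8, 6, 3, 1, 1, 2, 5, 9, 6, 3, 1]
print(knn_forecast(load, pattern=[1, 2, 5], k=1))  # value after first exact match
```

Its appeal for an interactive app is that there is no training step: one pass over the stored history answers the query, which keeps the end-to-end latency near the one-second budget.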
NASA Astrophysics Data System (ADS)
Bourke, Sarah A.; Hermann, Kristian J.; Hendry, M. Jim
2017-11-01
Elevated groundwater salinity associated with produced water, leaching from landfills or secondary salinity can degrade arable soils and potable water resources. Direct-push electrical conductivity (EC) profiling enables rapid, relatively inexpensive, high-resolution in-situ measurements of subsurface salinity, without requiring core collection or installation of groundwater wells. However, because the direct-push tool measures the bulk EC of both solid and liquid phases (ECa), incorporation of ECa data into regional or historical groundwater data sets requires the prediction of pore water EC (ECw) or chloride (Cl-) concentrations from measured ECa. Statistical linear regression and physically based models for predicting ECw and Cl- from ECa profiles were tested on a brine plume in central Saskatchewan, Canada. A linear relationship between ECa/ECw and porosity was more accurate for predicting ECw and Cl- concentrations than a power-law relationship (Archie's Law). Despite clay contents of up to 96%, the addition of terms to account for electrical conductance in the solid phase did not improve model predictions. In the absence of porosity data, statistical linear regression models adequately predicted ECw and Cl- concentrations from direct-push ECa profiles (ECw = 5.48 ECa + 0.78, R² = 0.87; Cl- = 1,978 ECa - 1,398, R² = 0.73). These statistical models can be used to predict ECw in the absence of lithologic data and will be particularly useful for initial site assessments. The more accurate linear physically based model can be used to predict ECw and Cl- as porosity data become available and the site-specific ECw-Cl- relationship is determined.
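As a sketch, the reported regression equations can be applied directly to a measured bulk EC value; the coefficients are those quoted in the abstract, and the units and range of validity are those of the study's site:

```python
# Statistical linear regressions from the abstract for predicting
# pore-water EC (ECw) and chloride (Cl-) from direct-push bulk EC (ECa).
def predict_ecw(eca):
    return 5.48 * eca + 0.78      # ECw, R^2 = 0.87 (units as in the study)

def predict_cl(eca):
    return 1978.0 * eca - 1398.0  # Cl- in mg/L (assumed), R^2 = 0.73

print(predict_ecw(2.0))  # 11.74
print(predict_cl(2.0))   # 2558.0
```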
Satellite surface salinity maps to determine fresh water fluxes in the Arctic Ocean
NASA Astrophysics Data System (ADS)
Gabarro, Carolina; Estrella, Olmedo; Emelianov, Mikhail; Ballabrera, Joaquim; Turiel, Antonio
2017-04-01
Salinity and temperature gradients drive the thermohaline circulation of the oceans, and play a key role in the ocean-atmosphere coupling. The strong and direct interactions between the ocean and the cryosphere (primarily through sea ice and ice shelves) are also a key ingredient of the thermohaline circulation. Recent observational studies have documented changes in upper Arctic Ocean hydrography [1, 2]. The ESA Soil Moisture and Ocean Salinity (SMOS) mission, launched in 2009, has the objective of measuring soil moisture over the continents and sea surface salinity over the oceans [3]. However, SMOS is also making inroads into cryospheric science, with measurements of thin-ice thickness and sea-ice concentration. SMOS carries an innovative L-band (1.4 GHz, or 21-cm wavelength) passive interferometric radiometer (the so-called MIRAS) that measures the electromagnetic radiation emitted by the Earth's surface at about 50-km spatial resolution over a wide (1200-km) swath, with a 3-day revisit time at the equator and more frequent coverage at the poles. Although the SMOS operating frequency offers almost the maximum sensitivity of brightness temperature (TB) to sea surface salinity (SSS) variations, that sensitivity is rather low, and even lower in cold waters [4]: 90% of ocean SSS values span a range of brightness temperatures of just 5 K. This implies that SMOS SSS retrieval requires high performance from the MIRAS interferometric radiometer [5]. New algorithms recently developed at the Barcelona Expert Center (BEC) to improve the quality of SMOS measurements [6] allow, for the first time, cold-water SSS maps to be derived from SMOS data and the variability of SSS in the high-latitude North Atlantic and the Arctic Ocean to be observed. In this work, we will provide an assessment of the quality of these new Arctic SSS maps, and we will illustrate their potential to monitor the impact on the ocean state of discharges from the main rivers into the Arctic Ocean.
Moreover, the results suggest that assimilating SMOS Arctic SSS data could be beneficial for the TOPAZ Arctic Ocean Prediction system. SMOS therefore shows great potential to routinely monitor the extent of surface freshwater fluxes in the Arctic Ocean as well. The new SMOS Arctic products can thus substantially increase our knowledge of the critical processes taking place in the Arctic. [1] Haine, T. et al. (2015), 'Arctic freshwater export: Status, mechanisms, and prospects', Global and Planetary Change, 125. [2] Peterson, B., et al. (2002), 'Increasing river discharge to the Arctic Ocean', Science, 298, 2171-2173. [3] Font, J. et al. (2010), 'The Challenging Sea Surface Salinity Measurement From Space', Proceed. IEEE, 98, 649-665. [4] Swift, C. (1980), Boundary-Layer Meteorology, 18, 25-54. [5] McMullan, K. et al. (2008), 'SMOS: The payload', IEEE T. Geosci. Remote, 46. [6] Olmedo, E., et al. (2017), 'Debiased non-Bayesian retrieval: A novel approach to SMOS Sea Surface Salinity', Remote Sensing of Environment, under review.
A review of predictive coding algorithms.
Spratling, M W
2017-03-01
Predictive coding is a leading theory of how the brain performs probabilistic inference. However, there are a number of distinct algorithms which are described by the term "predictive coding". This article provides a concise review of these different predictive coding algorithms, highlighting their similarities and differences. Five algorithms are covered: linear predictive coding which has a long and influential history in the signal processing literature; the first neuroscience-related application of predictive coding to explaining the function of the retina; and three versions of predictive coding that have been proposed to model cortical function. While all these algorithms aim to fit a generative model to sensory data, they differ in the type of generative model they employ, in the process used to optimise the fit between the model and sensory data, and in the way that they are related to neurobiology.
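A minimal illustration of the oldest variant named above, linear predictive coding, under a first-order autoregressive assumption: each sample is predicted from the previous one and only the prediction errors need be transmitted. The toy signal is invented, not from the review:

```python
# Order-1 linear predictive coding sketch: fit the one-tap predictor
# x[n] ~= a * x[n-1] by least squares, then encode the residuals.
def lpc1_coeff(x):
    num = sum(x[i - 1] * x[i] for i in range(1, len(x)))
    den = sum(x[i - 1] ** 2 for i in range(1, len(x)))
    return num / den

def residuals(x, a):
    # First sample is sent verbatim; the rest as prediction errors.
    return [x[0]] + [x[i] - a * x[i - 1] for i in range(1, len(x))]

signal = [1.0, 2.0, 4.0, 8.0, 16.0]   # perfectly predictable with a = 2
a = lpc1_coeff(signal)
print(a)                     # 2.0
print(residuals(signal, a))  # [1.0, 0.0, 0.0, 0.0, 0.0]
```

The residual stream is all zeros after the first sample, which is why predictive coding compresses well: only the surprising part of the signal carries information.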
Is there a signal of sea-level rise in Chesapeake Bay salinity?
NASA Astrophysics Data System (ADS)
Hilton, T. W.; Najjar, R. G.; Zhong, L.; Li, M.
2008-09-01
We evaluate the hypothesis that sea-level rise over the second half of the 20th century has led to detectable increases in Chesapeake Bay salinity. We exploit a simple, statistical model that predicts monthly mean salinity as a function of Susquehanna River flow in 23 segments of the main stem Chesapeake Bay. The residual (observed minus modeled) salinity exhibits statistically significant linear (p < 0.05) trends between 1949 and 2006 in 13 of the 23 segments of the bay. The salinity change estimated from the trend line over this period varies from -2.0 to 2.2, with 10 of the 13 cells showing positive changes. The mean and median salinity changes over all 23 cells are 0.47 and 0.72; over the 13 cells with significant trends they are 0.71 and 1.1. We ran a hydrodynamic model of the bay under present-day and reduced sea level conditions and found a bay-average salinity increase of about 0.5, which supports the hypothesis that the salinity residual trends have a significant component due to sea-level rise. Uncertainties remain, however, due to the spatial and temporal extent of historical salinity data and the infilling of the bay due to sedimentation. The salinity residuals also exhibit interannual variability, with peaks occurring at intervals of roughly 7 to 9 years, which are partially explained by Atlantic Shelf salinity, Potomac River flow and the meridional component of wind stress.
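The regress-then-test-the-residuals approach can be sketched with invented numbers; the real model uses monthly Susquehanna River flow in 23 bay segments, whereas here both series are toy data chosen only to show the mechanics:

```python
# Sketch of the residual-trend analysis: regress salinity on river flow,
# then test whether the flow-corrected residuals drift upward over time.
def ols(x, y):
    """Return (slope, intercept) of an ordinary least-squares fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return b, my - b * mx

years = [0, 1, 2, 3, 4, 5]
flow  = [40, 35, 50, 30, 45, 33]            # river flow (arbitrary units)
salin = [10.0, 10.6, 9.2, 11.4, 9.9, 11.2]  # observed salinity

b, a = ols(flow, salin)                      # salinity as a function of flow
resid = [s - (a + b * f) for f, s in zip(flow, salin)]
trend, _ = ols(years, resid)                 # residual trend per year
print(round(trend, 3))                       # positive: salinity rising
```

A positive residual trend is the signature the paper looks for: salinity increasing beyond what river flow alone explains, consistent with sea-level rise.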
Miller, Seth H; Zarate, Sonia; Smith, Edmund H; Gaylord, Brian; Hosfelt, Jessica D; Hill, Tessa M
2014-01-01
Future climate change is predicted to alter the physical characteristics of oceans and estuaries, including pH, temperature, oxygen, and salinity. Investigating how species react to the influence of such multiple stressors is crucial for assessing how future environmental change will alter marine ecosystems. The timing of multiple stressors can also be important, since in some cases stressors arise simultaneously, while in others they occur in rapid succession. In this study, we investigated the effects of elevated pCO2 on oxygen consumption by larvae of the intertidal porcelain crab Petrolisthes cinctipes when exposed to subsequent salinity stress. Such an exposure mimics how larvae under future acidified conditions will likely experience sudden runoff events such as those that occur seasonally along portions of the west coast of the U.S. and in other temperate systems, or how larvae encounter hypersaline waters when crossing density gradients via directed swimming. We raised larvae in the laboratory under ambient and predicted future pCO2 levels (385 and 1000 µatm) for 10 days, and then moved them to seawater at ambient pCO2 but with decreased, ambient, or elevated salinity, to monitor their respiration. While larvae raised under elevated pCO2 or exposed to stressful salinity conditions alone did not exhibit higher respiration rates than larvae held in ambient conditions, larvae exposed to elevated pCO2 followed by stressful salinity conditions consumed more oxygen. These results show that even when multiple stressors act sequentially rather than simultaneously, they can retain their capacity to detrimentally affect organisms.
Management scenarios for the Jordan River salinity crisis
Farber, E.; Vengosh, A.; Gavrieli, I.; Marie, Amarisa; Bullen, T.D.; Mayer, B.; Holtzman, R.; Segal, M.; Shavit, U.
2005-01-01
Recent geochemical and hydrological findings show that the water quality of the base flow of the Lower Jordan River, between the Sea of Galilee and the Dead Sea, depends on the ratio between surface water flow and groundwater discharge. Using water quality data, mass-balance calculations, and actual flow-rate measurements, possible management scenarios for the Lower Jordan River and their potential effects on its salinity are investigated. The predicted scenarios reveal that implementation of some elements of the Israel-Jordan peace treaty will have negative effects on the Jordan River's salinity. It is predicted that removal of the sewage effluents dumped into the river (~13 MCM/a) will significantly reduce the river's flow and increase the relative proportion of the saline groundwater flux into the river. Under this scenario, the Cl content of the river at its southern point (Abdalla Bridge) will rise to almost 7000 mg/L during the summer. In contrast, removal of all the saline water (16.5 MCM/a) that is artificially discharged into the Lower Jordan River will significantly reduce its Cl concentration, to levels of 650-2600 and 3000-3500 mg/L in the northern and southern areas of the Lower Jordan River, respectively. However, because the removal of either the sewage effluents or the saline water will decrease the river's discharge to a level that could potentially cause river desiccation during the summer months, other water sources must be allocated to preserve in-stream flow needs and hence the river's ecosystem.
A Robustly Stabilizing Model Predictive Control Algorithm
NASA Technical Reports Server (NTRS)
Ackmece, A. Behcet; Carson, John M., III
2007-01-01
A model predictive control (MPC) algorithm that differs from prior MPC algorithms has been developed for controlling an uncertain nonlinear system. This algorithm guarantees the resolvability of an associated finite-horizon optimal-control problem in a receding-horizon implementation.
Cifuentes, L.A.; Schemel, L.E.; Sharp, J.H.
1990-01-01
The effects of river inflow variations on alkalinity/salinity distributions in San Francisco Bay and nitrate/salinity distributions in Delaware Bay are described. One-dimensional advective-dispersion equations for salinity and the dissolved constituents are solved numerically and are used to simulate mixing in the estuaries. These simulations account for time-varying river inflow, variations in estuarine cross-sectional area, and longitudinally varying dispersion coefficients. The model simulates field observations better than models that use constant hydrodynamic coefficients and uniform estuarine geometry. Furthermore, field observations and model simulations are consistent with theoretical 'predictions' that the curvature of property-salinity distributions depends on the relation between the estuarine residence time and the period of river concentration variation.
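A minimal explicit finite-difference sketch of such a 1-D advection-dispersion model, with constant coefficients and an invented grid rather than the paper's time-varying inflow and cross-sections:

```python
# Explicit upwind scheme for dc/dt = -u*dc/dx + D*d2c/dx2, mixing a
# dissolved constituent along an idealized estuary. Grid, u and D are
# invented; stability requires dt*(u/dx + 2*D/dx**2) < 1.
def step(c, u, D, dx, dt):
    new = c[:]
    for i in range(1, len(c) - 1):
        adv = -u * (c[i] - c[i - 1]) / dx                  # upwind advection
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + disp)
    return new

c = [1.0] + [0.0] * 9        # river end held at c = 1, ocean end at c = 0
for _ in range(200):
    c = step(c, u=0.1, D=0.05, dx=1.0, dt=0.5)
print([round(v, 2) for v in c])  # monotone decay from river to ocean
```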
Swain, Eric D.; Decker, Jeremy D.
2009-01-01
A numerical model application was developed for the coastal area inland of the Ten Thousand Islands (TTI) in southwestern Florida using the Flow and Transport in a Linked Overland/Aquifer Density-Dependent System (FTLOADDS) model. This model couples a two-dimensional dynamic surface-water model with a three-dimensional groundwater model, and has been applied to several locations in southern Florida. The model application solves equations for salt transport in groundwater and surface water, and also simulates surface-water temperature using a newly enhanced heat transport algorithm. One of the purposes of the TTI application is to simulate hydrologic factors that relate to habitat suitability for the West Indian Manatee. Both salinity and temperature have been shown to be important factors for manatee survival. The inland area of the TTI domain is the location of the Picayune Strand Restoration Project, which is designed to restore predevelopment hydrology through the filling and plugging of canals, construction of spreader channels, and the construction of levees and pump stations. The effects of these changes are simulated to determine their effects on manatee habitat. The TTI application utilizes a large amount of input data for both surface-water and groundwater flow simulations. These data include topography, frictional resistance, atmospheric data including rainfall and air temperature, aquifer properties, and boundary conditions for tidal levels, inflows, groundwater heads, and salinities. Calibration was achieved by adjusting the parameters having the largest uncertainty: surface-water inflows, the surface-water transport dispersion coefficient, and evapotranspiration. A sensitivity analysis did not indicate that further parameter changes would yield an overall improvement in simulation results. 
The agreement between field data from GPS-tracked manatees and TTI application results demonstrates that the model can predict the salinity and temperature fluctuations which affect manatee behavior. Comparison of the existing conditions simulation with the simulation incorporating restoration changes indicated that the restoration would increase the period of inundation for most of the coastal wetlands. Generally, surface-water salinity was lowered by restoration changes in most of the wetlands areas, especially during the early dry season. However, the opposite pattern was observed in the primary canal habitat for manatees, namely, the Port of the Islands. Salinities at this location tended to be moderately elevated during the dry season, and unchanged during the wet season. Water temperatures were in close agreement between the existing conditions and restoration simulations, although minimum temperatures at the Port of the Islands were slightly higher in the restoration simulation as a result of the additional surface-water ponding and warming that occurs in adjacent wetlands. The TTI application output was used to generate salinity and temperature time series for comparison to manatee field tracking data and an individually-based manatee-behavior model. Overlaying field data with salinity and temperature results from the TTI application reflects the effect of warm water availability and the periodic need for low-salinity drinking water on manatee movements. The manatee-behavior model uses the TTI application data at specific model nodes along the main manatee travel corridors to determine manatee migration patterns. The differences between the existing conditions and restoration scenarios can then be compared for manatee refugia. The TTI application can be used to test a variety of hydrologic conditions and their effect on important criteria.
Monthly prediction of air temperature in Australia and New Zealand with machine learning algorithms
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.; Deo, R. C.; Carro-Calvo, L.; Saavedra-Moreno, B.
2016-07-01
Long-term air temperature prediction is of major importance in a large number of applications, including climate-related studies, energy, agriculture, and medicine. This paper examines the performance of two machine learning algorithms, Support Vector Regression (SVR) and the Multi-layer Perceptron (MLP), in a problem of monthly mean air temperature prediction from previously measured values at observational stations in Australia and New Zealand and from climate indices of importance in the region. The performance of the two algorithms is discussed and compared to alternative approaches. The results indicate that the SVR algorithm obtains the best prediction performance among all the algorithms compared in the paper. Moreover, the mean absolute error made by the two algorithms is significantly larger for the last 20 years than in the previous decades, which can be interpreted as a change in the relationship among the prediction variables involved in the training of the algorithms.
NASA Astrophysics Data System (ADS)
De Lannoy, G. J.; Reichle, R. H.; Vrugt, J. A.
2012-12-01
Simulated L-band (1.4 GHz) brightness temperatures are very sensitive to the values of the parameters in the radiative transfer model (RTM). We assess the optimum RTM parameter values and their (posterior) uncertainty in the Goddard Earth Observing System (GEOS-5) land surface model using observations of multi-angular brightness temperature over North America from the Soil Moisture Ocean Salinity (SMOS) mission. Two different parameter estimation methods are being compared: (i) a particle swarm optimization (PSO) approach, and (ii) an MCMC simulation procedure using the differential evolution adaptive Metropolis (DREAM) algorithm. Our results demonstrate that both methods provide similar "optimal" parameter values. Yet, DREAM exhibits better convergence properties, resulting in a reduced spread of the posterior ensemble. The posterior parameter distributions derived with both methods are used for predictive uncertainty estimation of brightness temperature. This presentation will highlight our model-data synthesis framework and summarize our initial findings.
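A self-contained sketch of the first of the two parameter-search methods, particle swarm optimization, minimizing a toy 1-D cost in place of the RTM-vs-SMOS brightness-temperature misfit; all tuning constants are generic textbook choices, not the study's:

```python
# Generic particle swarm optimization (PSO) sketch on a toy cost function.
import random

def pso(cost, lo, hi, n=20, iters=100, w=0.5, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n)]   # particle positions
    v = [0.0] * n                                 # particle velocities
    pbest = x[:]
    pval = [cost(xi) for xi in x]
    g = min(range(n), key=lambda i: pval[i])
    gbest = pbest[g]                              # swarm-wide best
    for _ in range(iters):
        for i in range(n):
            v[i] = (w * v[i]
                    + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))
            f = cost(x[i])
            if f < pval[i]:
                pbest[i], pval[i] = x[i], f
                if f < cost(gbest):
                    gbest = x[i]
    return gbest

best = pso(lambda p: (p - 3.0) ** 2, lo=-10, hi=10)
print(best)  # close to 3.0, the minimizer of the toy cost
```

DREAM, the alternative the abstract favors, additionally yields a posterior sample rather than a single optimum, which is what enables the predictive uncertainty estimates mentioned at the end.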
Deadbeat Predictive Controllers
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan; Phan, Minh
1997-01-01
Several new computational algorithms are presented to compute the deadbeat predictive control law. The first algorithm makes use of a multi-step-ahead output prediction to compute the control law without explicitly calculating the controllability matrix. The system identification must be performed first and then the predictive control law is designed. The second algorithm uses the input and output data directly to compute the feedback law. It combines the system identification and the predictive control law into one formulation. The third algorithm uses an observable-canonical form realization to design the predictive controller. The relationship between all three algorithms is established through the use of the state-space representation. All algorithms are applicable to multi-input, multi-output systems with disturbance inputs. In addition to the feedback terms, feedforward terms may also be added for disturbance inputs if they are measurable. Although the feedforward terms do not influence the stability of the closed-loop feedback law, they enhance the performance of the controlled system.
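The deadbeat idea itself, stripped of the identification and multi-step machinery, can be illustrated on a scalar system; this is a toy with invented numbers, not one of the paper's three algorithms:

```python
# Deadbeat control of x[k+1] = a*x[k] + b*u[k]: the feedback u = -(a/b)*x
# cancels the dynamics exactly and drives the state to zero in one step.
a, b = 0.9, 0.5
x = 4.0
for k in range(3):
    u = -(a / b) * x    # deadbeat feedback law
    x = a * x + b * u
    print(k, x)         # state is exactly 0 from the first step onward
```

For an n-state system the same property holds after at most n steps, which is what the multi-step-ahead output prediction in the first algorithm exploits.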
Utilization of Ancillary Data Sets for SMAP Algorithm Development and Product Generation
NASA Technical Reports Server (NTRS)
ONeill, P.; Podest, E.; Njoku, E.
2011-01-01
Algorithms being developed for the Soil Moisture Active Passive (SMAP) mission require a variety of both static and ancillary data. The selection of the most appropriate source for each ancillary data parameter is driven by a number of considerations, including accuracy, latency, availability, and consistency across all SMAP products and with SMOS (Soil Moisture Ocean Salinity). It is anticipated that initial selection of all ancillary datasets, which are needed for ongoing algorithm development activities on the SMAP algorithm testbed at JPL, will be completed within the year. These datasets will be updated as new or improved sources become available, and all selections and changes will be documented for the benefit of the user community. Wise choices in ancillary data will help to enable SMAP to provide new global measurements of soil moisture and freeze/thaw state at the targeted accuracy necessary to tackle hydrologically-relevant societal issues.
Serum Uric Acid and Risk for Acute Kidney Injury Following Contrast.
Kanbay, Mehmet; Solak, Yalcin; Afsar, Baris; Nistor, Ionut; Aslan, Gamze; Çağlayan, Ozlem Hilal; Aykanat, Asli; Donciu, Mihaela-Dora; Lanaspa, Miguel A; Ejaz, Ahsan A; Johnson, Richard J; Covic, Adrian
2017-02-01
Contrast-induced acute kidney injury (CI-AKI) is a common cause of hospital-acquired acute kidney injury (AKI). We evaluated the evidence that uric acid (UA) plays a pathogenic role in CI-AKI. Ten studies were eligible for inclusion for meta-analysis. Hyperuricemia predicted risk for cases with AKI in prospective cohort studies. Higher levels of serum UA (SUA), as defined by the authors, were associated with a 2-fold increased risk to develop AKI (pooled odds ratio 2.03; 95% confidence interval [CI] 1.48-2.78). Significant heterogeneity was found in cohort studies (P = .001, I² = 85.7%). In 2 clinical trials, lowering of SUA with saline hydration was significantly associated with reduced risk for AKI compared with saline hydration alone or saline hydration with N-acetyl cysteine. An analysis of 2 randomized controlled trials found that allopurinol with saline hydration had a significant protective effect on renal function (assessed by serum creatinine values) compared with hydration alone (mean difference: -0.52 mg/dL; 95% CI: -0.81 to -0.22). Hyperuricemia independently predicts CI-AKI. Two clinical trials suggest lowering SUA may prevent CI-AKI. The mechanism by which UA induces CI-AKI is likely related to acute uricosuria.
Can human experts predict solubility better than computers?
Boobier, Samuel; Osbourn, Anne; Mitchell, John B O
2017-12-13
In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R² of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R² of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
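The consensus-by-median construction can be sketched as follows, with invented predictions standing in for the survey data:

```python
# Consensus ("wisdom of crowds") predictor: take the median of individual
# predictions per compound and score RMSE against the true log S values.
import statistics, math

true_logS = [-2.0, -4.5, -1.0]
preds = [                       # one row per predictor (human or model)
    [-2.5, -4.0, -1.2],
    [-1.8, -5.0, -0.5],
    [-2.1, -4.6, -2.0],
]

consensus = [statistics.median(col) for col in zip(*preds)]
rmse = math.sqrt(sum((p - t) ** 2 for p, t in zip(consensus, true_logS))
                 / len(true_logS))
print(consensus, round(rmse, 3))  # [-2.1, -4.6, -1.2] 0.141
```

The median discards each predictor's outliers, which is why the consensus here beats every individual row even though no predictor is individually accurate.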
Discrete sequence prediction and its applications
NASA Technical Reports Server (NTRS)
Laird, Philip
1992-01-01
Learning from experience to predict sequences of discrete symbols is a fundamental problem in machine learning with many applications. We apply sequence prediction using a simple and practical sequence-prediction algorithm, called TDAG. The TDAG algorithm is first tested by comparing its performance with some common data compression algorithms. Then it is adapted to the detailed requirements of dynamic program optimization, with excellent results.
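A much simpler frequency-table predictor in the spirit of (but far weaker than) TDAG, which maintains a tree of contexts rather than this single-symbol table; this is a hedged toy, not the TDAG algorithm itself:

```python
# Context-1 symbol predictor: predict the symbol that most often
# followed the current symbol in the training sequence.
from collections import defaultdict, Counter

def train(seq):
    table = defaultdict(Counter)
    for prev, nxt in zip(seq, seq[1:]):
        table[prev][nxt] += 1
    return table

def predict(table, symbol):
    return table[symbol].most_common(1)[0][0]

seq = list("abcabcabd")
table = train(seq)
print(predict(table, "a"))  # 'b' -- followed 'a' three times
print(predict(table, "c"))  # 'a' -- followed 'c' twice
```

The link to data compression mentioned in the abstract is direct: the better the next symbol is predicted, the fewer bits an entropy coder needs to encode it.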
Cui, Zaixu; Gong, Gaolang
2018-06-02
Individualized behavioral/cognitive prediction using machine learning (ML) regression approaches is becoming increasingly applied. The specific ML regression algorithm and sample size are two key factors that non-trivially influence prediction accuracies. However, the effects of the ML regression algorithm and sample size on individualized behavioral/cognitive prediction performance have not been comprehensively assessed. To address this issue, the present study included six commonly used ML regression algorithms: ordinary least squares (OLS) regression, least absolute shrinkage and selection operator (LASSO) regression, ridge regression, elastic-net regression, linear support vector regression (LSVR), and relevance vector regression (RVR), to perform specific behavioral/cognitive predictions based on different sample sizes. Specifically, the publicly available resting-state functional MRI (rs-fMRI) dataset from the Human Connectome Project (HCP) was used, and whole-brain resting-state functional connectivity (rsFC) or rsFC strength (rsFCS) were extracted as prediction features. Twenty-five sample sizes (ranged from 20 to 700) were studied by sub-sampling from the entire HCP cohort. The analyses showed that rsFC-based LASSO regression performed remarkably worse than the other algorithms, and rsFCS-based OLS regression performed markedly worse than the other algorithms. Regardless of the algorithm and feature type, both the prediction accuracy and its stability exponentially increased with increasing sample size. The specific patterns of the observed algorithm and sample size effects were well replicated in the prediction using re-testing fMRI data, data processed by different imaging preprocessing schemes, and different behavioral/cognitive scores, thus indicating excellent robustness/generalization of the effects. 
The current findings provide critical insight into how the selected ML regression algorithm and sample size influence individualized predictions of behavior/cognition and offer important guidance for choosing the ML regression algorithm or sample size in relevant investigations.
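The sample-size effect can be illustrated on a toy regression problem; the data are invented, not the HCP cohort, and the single-feature model stands in for the high-dimensional rsFC predictors:

```python
# Toy demonstration of the sample-size effect: the error of a simple
# linear model's slope estimate shrinks as the training sample grows.
import random

def slope_error(n, trials=30):
    """Mean |slope_hat - true_slope| over several synthetic datasets."""
    tot = 0.0
    for t in range(trials):
        rng = random.Random(1000 * n + t)
        x = [rng.uniform(-1, 1) for _ in range(n)]
        y = [2.0 * xi + rng.gauss(0, 1) for xi in x]      # true slope = 2
        b = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
        tot += abs(b - 2.0)
    return tot / trials

for n in (20, 100, 500):
    print(n, round(slope_error(n), 3))  # error decreases with n
```

As in the study, the gain is steep at small n and flattens at large n, since estimation error falls roughly as 1/sqrt(n).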
Wang, JianLi; Sareen, Jitender; Patten, Scott; Bolton, James; Schmitz, Norbert; Birney, Arden
2014-05-01
Prediction algorithms are useful for making clinical decisions and for population health planning. However, no such prediction algorithm exists for first onset of major depression. The objective of this study was to develop and validate a prediction algorithm for first onset of major depression in the general population. Longitudinal study design with approximately 3-year follow-up. The study was based on data from a nationally representative sample of the US general population. A total of 28 059 individuals who participated in Waves 1 and 2 of the US National Epidemiologic Survey on Alcohol and Related Conditions and who had not had major depression at Wave 1 were included. The prediction algorithm was developed using logistic regression modelling in 21 813 participants from three census regions. The algorithm was validated in participants from the 4th census region (n=6246). The outcome was major depression occurring since Wave 1 of the National Epidemiologic Survey on Alcohol and Related Conditions, assessed by the Alcohol Use Disorder and Associated Disabilities Interview Schedule (DSM-IV). A prediction algorithm containing 17 unique risk factors was developed. The algorithm had good discriminative power (C statistic=0.7538, 95% CI 0.7378 to 0.7699) and excellent calibration (F-adjusted test=1.00, p=0.448) with the weighted data. In the validation sample, the algorithm had a C statistic of 0.7259 and excellent calibration (Hosmer-Lemeshow χ²=3.41, p=0.906). The developed prediction algorithm has good discrimination and calibration capacity. It can be used by clinicians, mental health policy-makers, service planners and the general public to predict future risk of having major depression. The application of the algorithm may lead to increased personalisation of treatment, better clinical decisions and more optimal mental health service planning.
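The C statistic used above to report discriminative power is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case; a minimal sketch with invented scores:

```python
# C statistic (area under the ROC curve) from predicted risks and labels:
# the fraction of case/non-case pairs ranked correctly (ties count half).
def c_statistic(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.2]   # predicted risks
labels = [1,   1,   0,   1,   0,   0]     # 1 = developed depression
print(c_statistic(scores, labels))        # 8/9, about 0.889
```

A value of 0.5 means no discrimination and 1.0 perfect ranking, so the study's 0.75 sits in the conventionally "good" range.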
Modeling and predicting intertidal variations of the salinity field in the Bay/Delta
Knowles, Noah; Uncles, Reginald J.
1995-01-01
One approach to simulating daily to monthly variability in the bay is the development of an intertidal model using tidally-averaged equations and a time step on the order of a day. An intertidal numerical model of the bay's physics, capable of portraying seasonal and interannual variability, would have several uses. Observations are limited in time and space, so simulation could help fill the gaps. Also, the ability to simulate multi-year episodes (e.g., an extended drought) could provide insight into the response of the ecosystem to such events. Finally, such a model could be used in a forecast mode, wherein predicted delta flow is used as model input and the predicted salinity distribution is output, with estimates days and months in advance. This note briefly introduces such a tidally-averaged model (Uncles and Peterson, in press) and a corresponding predictive scheme for baywide forecasting.
2014-01-01
Background Salinity plays an important role in shaping coastal marine communities. Near-future climate predictions indicate that salinity will decrease in many shallow coastal areas due to increased precipitation; however, few studies have addressed this issue. The ability of ecosystems to cope with future changes will depend on species’ capacities to acclimatise or adapt to new environmental conditions. Here, we investigated the effects of a strong salinity gradient (the Baltic Sea system – Baltic, Kattegat, Skagerrak) on plasticity and adaptations in the euryhaline barnacle Balanus improvisus. We used a common-garden approach, where multiple batches of newly settled barnacles from each of three different geographical areas along the Skagerrak-Baltic salinity gradient were exposed to corresponding native salinities (6, 15 and 30 PSU), and phenotypic traits including mortality, growth, shell strength, condition index and reproductive maturity were recorded. Results We found that B. improvisus was highly euryhaline, but had highest growth and reproductive maturity at intermediate salinities. We also found that low salinity had negative effects on other fitness-related traits including initial growth and shell strength, although mortality was also lowest in low salinity. Overall, differences between populations in most measured traits were weak, indicating little local adaptation to salinity. Nonetheless, we observed some population-specific responses – notably that populations from high salinity grew stronger shells in their native salinity compared to the other populations, possibly indicating adaptation to differences in local predation pressure. Conclusions Our study shows that B. improvisus is an example of a true brackish-water species, and that plastic responses are more likely than evolutionary tracking in coping with future changes in coastal salinity. PMID:25038588
Hardy, Sarah M; Lindgren, Michael; Konakanchi, Hanumantharao; Huettmann, Falk
2011-10-01
Populations of the snow crab (Chionoecetes opilio) are widely distributed on high-latitude continental shelves of the North Pacific and North Atlantic, and represent a valuable resource in both the United States and Canada. In US waters, snow crabs are found throughout the Arctic and sub-Arctic seas surrounding Alaska, north of the Aleutian Islands, yet commercial harvest currently focuses on the more southerly population in the Bering Sea. Population dynamics are well-monitored in exploited areas, but few data exist for populations further north, where climate trends in the Arctic appear to be affecting species' distributions and community structure on multiple trophic levels. Moreover, increased shipping traffic, as well as fisheries and petroleum resource development, may place additional pressure on northern portions of the range as seasonal ice cover continues to decline. In the face of these pressures, we examined the ecological niche and population distribution of snow crabs in Alaskan waters using a GIS-based spatial modeling approach. We present the first quantitative open-access model predictions of snow-crab distribution, abundance, and biomass in the Chukchi and Beaufort Seas. Multivariate analyses of environmental drivers of species' distribution and community structure commonly rely on multiple linear regression methods. The spatial modeling approach employed here improves upon linear regression methods in allowing for exploration of nonlinear relationships and interactions between variables. Three machine-learning algorithms were used to evaluate relationships between snow-crab distribution and environmental parameters, including TreeNet, Random Forests, and MARS. An ensemble model was then generated by combining output from these three models to generate consensus predictions for presence-absence, abundance, and biomass of snow crabs. 
Each algorithm identified a suite of variables most important in predicting snow-crab distribution, including nutrient and chlorophyll-a concentrations in overlying waters, temperature, salinity, and annual sea-ice cover; this information may be used to develop and test hypotheses regarding the ecology of this species. This is the first such quantitative model for snow crabs, and all GIS-data layers compiled for this project are freely available from the authors, upon request, for public use and improvement.
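The consensus step described above (combining TreeNet, Random Forests, and MARS output) can be sketched as unweighted averaging of per-site predictions; the averaging rule and the 0.5 presence threshold here are illustrative assumptions, not the authors' actual ensemble scheme:

```python
def ensemble_consensus(pred_a, pred_b, pred_c, threshold=0.5):
    """Average per-site occurrence probabilities from three models and
    derive a presence/absence call from the averaged probability."""
    consensus = []
    for a, b, c in zip(pred_a, pred_b, pred_c):
        p = (a + b + c) / 3.0  # unweighted mean of the three model outputs
        consensus.append((p, p >= threshold))
    return consensus

# Hypothetical occurrence probabilities at three survey stations.
treenet = [0.80, 0.40, 0.10]
rforest = [0.70, 0.55, 0.20]
mars = [0.90, 0.35, 0.15]
for p, present in ensemble_consensus(treenet, rforest, mars):
    print(round(p, 2), present)
```

Averaging tends to damp the idiosyncratic errors of any single learner, which is the usual motivation for consensus predictions of this kind.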
USDA-ARS?s Scientific Manuscript database
The Soil Moisture and Ocean Salinity satellite (SMOS) was launched in November 2009 and started delivering data in January 2010. The commissioning phase ended in May 2010. Subsequently, the satellite has been in operation for over 5 years while the retrieval algorithms from Level 1 to Level 2 underw...
Prediction of pork quality parameters by applying fractals and data mining on MRI.
Caballero, Daniel; Pérez-Palacios, Trinidad; Caro, Andrés; Amigo, José Manuel; Dahl, Anders B; ErsbØll, Bjarne K; Antequera, Teresa
2017-09-01
This work is the first to investigate the use of MRI, fractal algorithms and data mining techniques to determine pork quality parameters non-destructively. The main objective was to evaluate the capability of fractal algorithms (Classical Fractal Algorithm, CFA; Fractal Texture Algorithm, FTA; and One Point Fractal Texture Algorithm, OPFTA) to analyse MRI in order to predict quality parameters of loin. In addition, the effects of the MRI acquisition sequence (Gradient Echo, GE; Spin Echo, SE; and Turbo 3D, T3D) and of the data mining prediction technique (Isotonic Regression, IR; and Multiple Linear Regression, MLR) were analysed. Both fractal texture algorithms, FTA and OPFTA, are appropriate for analysing MRI of loins. The acquisition sequence, the fractal algorithm and the data mining technique all seem to influence the prediction results. For most physico-chemical parameters, prediction equations with moderate to excellent correlation coefficients were achieved with the following combinations of MRI acquisition sequence, fractal algorithm and data mining technique: SE-FTA-MLR, SE-OPFTA-IR, GE-OPFTA-MLR and SE-OPFTA-MLR, with the last offering the best prediction results. Thus, SE-OPFTA-MLR could be proposed as an alternative technique to determine physico-chemical traits of fresh and dry-cured loins non-destructively and with high accuracy. Copyright © 2017. Published by Elsevier Ltd.
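As a rough illustration of what a fractal feature extracted from an image looks like, here is classical box counting on a binary image in Python; this is the generic textbook estimator, not the specific CFA, FTA or OPFTA algorithms evaluated in the study:

```python
import math

def box_count_dimension(image, sizes):
    """Estimate fractal dimension by box counting: count occupied boxes
    N(s) at each box size s, then fit the slope of log N(s) versus
    log(1/s) by least squares."""
    n = len(image)
    xs, ys = [], []
    for s in sizes:
        count = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                # A box is "occupied" if any pixel inside it is set.
                if any(image[a][b]
                       for a in range(i, min(i + s, n))
                       for b in range(j, min(j + s, n))):
                    count += 1
        xs.append(math.log(1.0 / s))
        ys.append(math.log(count))
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# A fully occupied 16x16 image has box-counting dimension exactly 2.
filled = [[1] * 16 for _ in range(16)]
print(round(box_count_dimension(filled, [1, 2, 4, 8]), 2))
```

In texture analysis, dimensions estimated this way (or from thresholded MRI slices) become scalar features that a regression technique such as MLR can relate to physico-chemical traits.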
Yu, Ying; Wu, Guangwen; Yuan, Hongmei; Cheng, Lili; Zhao, Dongsheng; Huang, Wengong; Zhang, Shuquan; Zhang, Liguo; Chen, Hongyu; Zhang, Jian; Guan, Fengzhi
2016-05-27
MicroRNAs (miRNAs) play a critical role in responses to biotic and abiotic stress and have been characterized in a large number of plant species. Although flax (Linum usitatissimum L.) is one of the most important fiber and oil crops worldwide, no reports have been published describing flax miRNAs (Lus-miRNAs) induced in response to saline, alkaline, and saline-alkaline stresses. In this work, combined small RNA and degradome deep sequencing was used to analyze flax libraries constructed after alkaline-salt stress (AS2), neutral salt stress (NSS), alkaline stress (AS), and the non-stressed control (CK). From the CK, AS, AS2, and NSS libraries, a total of 118, 119, 122, and 120 known Lus-miRNAs and 233, 213, 211, and 212 novel Lus-miRNAs were isolated, respectively. After assessment of differential expression profiles, 17 known Lus-miRNAs and 36 novel Lus-miRNAs were selected and used to predict putative target genes. Gene ontology term enrichment analysis revealed target genes that were involved in responses to stimuli, including signaling and catalytic activity. Eight Lus-miRNAs were selected for analysis using qRT-PCR to confirm the accuracy and reliability of the miRNA-seq results. The qRT-PCR results showed that changes in stress-induced expression profiles of these miRNAs mirrored expression trends observed using miRNA-seq. Degradome sequencing and transcriptome profiling showed that expression of 29 miRNA-target pairs displayed inverse expression patterns under saline, alkaline, and saline-alkaline stresses. From the target prediction analysis, the miR398a-targeted gene encodes a copper/zinc superoxide dismutase, and miR530 was shown to explicitly target WRKY family transcription factors, suggesting that these two miRNAs and their targets may be significantly involved in the saline, alkaline, and saline-alkaline stress responses of flax. 
Identification and characterization of flax miRNAs, their target genes, functional annotations, and gene expression patterns are reported in this work. These findings will enhance our understanding of flax miRNA regulatory mechanisms under saline, alkaline, and saline-alkaline stresses and provide a foundation for future elucidation of the specific functions of these miRNAs.
NASA Astrophysics Data System (ADS)
Kontaxis, L. C.; Pavlou, C.; Portan, D. V.; Papanicolaou, G. C.
2018-02-01
In the present study, a composite material consisting of a polymeric epoxy resin matrix reinforced with forty layers of non-woven cotton fiber fabric was manufactured using the Resin Vacuum Infusion technique, which is widely used for high-performance, defect-free composite materials. Composites and neat polymers are subjected to stresses in service while simultaneously being influenced by environmental conditions such as temperature and humidity. The main goal of this study was to investigate the degradation of the composite's viscoelastic behavior after saline absorption, since this material could be used in biomedical applications. Therefore, the manufactured specimens were immersed in a sealed container full of saline, placed in a bath at 37°C (body temperature), and kept there for five different immersion periods (24, 72, 144, 216 and 336 hours). The viscoelastic behavior of the composite material was determined through stress relaxation under flexural conditions, and the effect of immersion time and amount of saline absorption was studied. After 24 hours of immersion, a 42% decrease in stress was observed, which thereafter remained almost constant. The stress relaxation experimental results were predicted using the Residual Property Model (RPM), a model developed by Papanicolaou et al. that has been successfully applied in the past to many different materials previously subjected to various types of damage, in order to predict their residual behavior. For its application, the RPM predictive model needs only two experimental points. In all cases, the predictions were in good agreement with the experimental findings, and the comparison between experimental values and theoretical predictions formed the basis of useful observations and conclusions.
Darnaude, Audrey M.; Sturrock, Anna; Trueman, Clive N.; Mouillot, David; EIMF; Campana, Steven E.; Hunter, Ewan
2014-01-01
Oxygen isotope ratios from fish otoliths are used to discriminate marine stocks and reconstruct past climate, assuming that variations in otolith δ18O values closely reflect differences in temperature history of fish when accounting for salinity-induced variability in water δ18O. To investigate this, we exploited the environmental and migratory data gathered over a decade of archival tagging used to study the behaviour of adult plaice (Pleuronectes platessa L.) in the North Sea. Based on the tag-derived monthly distributions of the fish and corresponding temperature and salinity estimates modelled across three consecutive years, we first predicted annual otolith δ18O values for three geographically discrete offshore sub-stocks, using three alternative plausible scenarios for otolith growth. Comparison of predicted vs. measured annual δ18O values demonstrated >96% correct prediction of sub-stock membership, irrespective of the otolith growth scenario. Pronounced inter-stock differences in δ18O values, notably in summer, provide a robust marker for reconstructing broad-scale plaice distribution in the North Sea. However, although largely congruent, measured and predicted annual δ18O values did not fully match. Small, but consistent, offsets were also observed between individual high-resolution otolith δ18O values measured during tag recording time and corresponding δ18O predictions using concomitant tag-recorded temperatures and location-specific salinity estimates. The nature of the shifts differed among sub-stocks, suggesting specific vital effects linked to variation in physiological response to temperature. Therefore, although otolith δ18O in free-ranging fish largely reflects environmental temperature and salinity, we counsel prudence when interpreting otolith δ18O data for stock discrimination or temperature reconstruction until the mechanisms underpinning otolith δ18O signature acquisition, and associated variation, are clarified. PMID:25279667
Werdell, P Jeremy; Franz, Bryan A; Lefler, Jason T; Robinson, Wayne D; Boss, Emmanuel
2013-12-30
Time-series of marine inherent optical properties (IOPs) from ocean color satellite instruments provide valuable data records for studying long-term changes in ocean ecosystems. Semi-analytical algorithms (SAAs) provide a common method for estimating IOPs from radiometric measurements of the marine light field. Most SAAs assign constant spectral values for seawater absorption and backscattering, assume spectral shape functions of the remaining constituent absorption and scattering components (e.g., phytoplankton, non-algal particles, and colored dissolved organic matter), and retrieve the magnitudes of each remaining constituent required to match the spectral distribution of measured radiances. Here, we explore the use of temperature- and salinity-dependent values for seawater backscattering in lieu of the constant spectrum currently employed by most SAAs. Our results suggest that use of temperature- and salinity-dependent seawater spectra elevate the SAA-derived particle backscattering, reduce the non-algal particles plus colored dissolved organic matter absorption, and leave the derived absorption by phytoplankton unchanged.
NASA Astrophysics Data System (ADS)
Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.
2018-05-01
Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. 
Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
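The skill scores quoted above follow standard definitions: NSE compares model errors against the variance of the observations, while the Gbench index scores the model against a benchmark model's errors instead of the observed mean. A minimal Python sketch (toy series, not the Murray River data):

```python
def rmse(obs, sim):
    """Root mean squared error between observed and simulated series."""
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 = perfect, 0 = no better than
    predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def gbench(obs, sim, bench):
    """Benchmark efficiency: 1 = perfect, 0 = no better than the
    benchmark model's simulation."""
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sse_bench = sum((o - b) ** 2 for o, b in zip(obs, bench))
    return 1.0 - sse / sse_bench

obs = [100.0, 120.0, 150.0, 130.0, 110.0]    # observed salinity, mg/L
model = [102.0, 118.0, 148.0, 133.0, 111.0]  # hybrid-model simulation
bench = [110.0, 110.0, 140.0, 120.0, 120.0]  # benchmark simulation
print(round(nse(obs, model), 3), round(rmse(obs, model), 3),
      round(gbench(obs, model, bench), 3))
```

A negative NSE, as for the benchmark process-driven model above, means the model performs worse than simply predicting the mean of the observations.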
Predicting the random drift of MEMS gyroscope based on K-means clustering and OLS RBF Neural Network
NASA Astrophysics Data System (ADS)
Wang, Zhen-yu; Zhang, Li-jie
2017-10-01
Measurement error of a sensor can be effectively compensated through prediction. Aiming at the large random drift error of MEMS (Micro-Electro-Mechanical System) gyroscopes, an improved learning algorithm for the Radial Basis Function (RBF) Neural Network (NN), based on K-means clustering and Orthogonal Least Squares (OLS), is proposed in this paper. The algorithm first selects typical samples as the initial cluster centers of the RBF NN, then refines the candidate centers with the K-means algorithm, and finally optimizes the candidate centers with the OLS algorithm, which makes the network structure simpler and the prediction performance better. Experimental results show that the proposed K-means clustering OLS learning algorithm can effectively predict the random drift of a MEMS gyroscope, with a prediction error of 9.8019e-7 °/s and a prediction time of 2.4169e-6 s.
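The centers-then-weights structure of an RBF network can be sketched in a few lines: cluster the inputs to place Gaussian centers, then solve for output weights by ordinary least squares. This toy fits a smooth 1-D curve standing in for drift data; the paper's OLS-based center optimization step is not reproduced, and all parameter values are illustrative:

```python
import math

def kmeans_1d(xs, k, iters=25):
    """Naive 1-D k-means used only for center selection."""
    centers = [xs[i * (len(xs) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for x in xs:
            j = min(range(k), key=lambda j: abs(x - centers[j]))
            buckets[j].append(x)
        centers = [sum(b) / len(b) if b else c
                   for b, c in zip(buckets, centers)]
    return centers

def solve(a, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c]
                              for c in range(r + 1, n))) / m[r][r]
    return x

def fit_rbf(xs, ys, k=3, sigma=0.3):
    """Gaussian RBF network: centers from k-means, output weights
    (plus a bias term) from least squares on the normal equations."""
    centers = kmeans_1d(xs, k)
    def features(x):
        return [1.0] + [math.exp(-(x - c) ** 2 / (2 * sigma ** 2))
                        for c in centers]
    phi = [features(x) for x in xs]
    n = k + 1
    ata = [[sum(r[i] * r[j] for r in phi) for j in range(n)] for i in range(n)]
    atb = [sum(r[i] * y for r, y in zip(phi, ys)) for i in range(n)]
    w = solve(ata, atb)
    def predict(x):
        return sum(wi * fi for wi, fi in zip(w, features(x)))
    return predict, centers

# Toy decreasing curve standing in for real gyroscope drift data.
xs = [i / 19 for i in range(20)]
ys = [math.exp(-x) for x in xs]
predict, centers = fit_rbf(xs, ys)
sse = sum((predict(x) - y) ** 2 for x, y in zip(xs, ys))
print("centers:", [round(c, 3) for c in centers], "training SSE:", round(sse, 4))
```

Because the bias term makes the constant function part of the model class, the least-squares fit is guaranteed to do at least as well as predicting the mean of the targets.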
Wendelberger, Kristie S; Richards, Jennifer H
2017-07-01
Sea level rise (SLR) and land-use change are working together to change coastal communities around the world. Along Florida's coast, SLR and large-scale drying are increasing groundwater salinity, resulting in halophytic (salt-tolerant) species colonizing glycophytic (salt-intolerant) communities. We hypothesized that halophytes can contribute to increased soil salinity as they move into glycophyte communities, making soils more saline than SLR or drying alone. We tested our hypothesis with a replacement-series greenhouse experiment with halophyte/glycophyte ratios of 0:4, 1:3, 2:2, 3:1, 4:0, mimicking halophyte movement into glycophyte communities. We subjected replicates to 0, 26, and 38‰ salinity for one, one, and three months, respectively, taking soil salinity and stomatal conductance measurements at the end of each treatment period. Our results showed that soil salinity increased as halophyte/glycophyte ratio increased. Either osmotic or ionic stress caused decreases in glycophyte biomass, resulting in less per-plant transpiration as compared to halophytes. At 38‰ groundwater, soil salinity increased as halophyte density increased, making conditions more conducive to further halophyte establishment. This study suggests that coastal plant community turnover may occur faster than would be predicted from SLR and anthropogenic disturbance alone.
Salinity Tolerance of Early-Stage Oyster Larvae in the Choptank River, Chesapeake Bay, USA
NASA Astrophysics Data System (ADS)
Scharping, R. J.; North, E. W.; Plough, L. V.
2016-02-01
The eastern oyster (Crassostrea virginica) is ecologically and economically important to the Chesapeake Bay, Maryland, USA. Its population, however, is currently estimated to be less than one percent of what it was historically. To restore oyster populations, techniques such as larval transport modeling are being implemented to aid the selection of sanctuary locations. These models can incorporate biological factors such as salinity-induced mortality, but no data from low-salinity areas such as the oligohaline Choptank River, a major focus of oyster restoration in the Chesapeake, exist. The purpose of our study was to generate salinity-induced mortality data for oyster larvae from the Choptank River and compare their tolerances to those of oysters from different salinity regimes. We performed three experiments looking at the effect of salinities from 3 to 26 on the survival of larvae from 4 to 48 hrs post-fertilization. While overall survival differed across experiments, we found a consistent minimum survival threshold between 5-7 and peak survival window between 9-16. These salinity values were about 7 lower than those of oysters from the polyhaline Long Island Sound (threshold: 12.5-15; peak: 17.5-27). This research has direct application to oyster restoration in the Choptank River and similar low-salinity areas by improving larval transport model predictions.
Wu, Fangli; Xie, Zhe; Lan, Yawen; Dupont, Sam; Sun, Meng; Cui, Shuaikang; Huang, Xizhi; Huang, Wei; Liu, Liping; Hu, Menghong; Lu, Weiqun; Wang, Youji
2018-01-01
With the release of large amounts of CO2, ocean acidification is intensifying and affecting aquatic organisms. Salinity also plays an important role for marine organisms and fluctuates greatly in estuarine and coastal ecosystems, where ocean acidification frequently occurs. In the present study, flow cytometry was used to investigate immune parameters of haemocytes in the thick shell mussel Mytilus coruscus exposed to three salinities (15, 25, and 35‰) and two pH levels (7.3 and 8.1). A 7-day in vivo experiment and a 5-h in vitro experiment were performed. In both experiments, low pH had significant effects on all tested immune parameters: when pH decreased, total haemocyte count (THC), phagocytosis (Pha), esterase (Est), and lysosomal content (Lyso) decreased significantly, whereas haemocyte mortality (HM) and reactive oxygen species (ROS) increased. High salinity had no significant effects on the immune parameters of haemocytes as compared with low salinity. However, an interaction between pH and salinity was observed in both experiments for most tested haemocyte parameters. This study showed that high salinity, low salinity and low pH have negative and interactive effects on the haemocytes of mussels. As a consequence, the combined effect of low pH and altered salinity can be expected to have more severe effects on mussel health than predicted from single exposures.
Performance of Subsurface Tube Drainage System in Saline Soils: A Case Study
NASA Astrophysics Data System (ADS)
Pali, A. K.
2015-06-01
In order to reclaim saline and waterlogged soils caused by groundwater-table rise, installation of a subsurface drainage system is considered one of the best remedies. However, the design of the drainage system has to be accurate so that the field performance conforms to the design. In this investigation, the field performance of the subsurface tube drainage system installed at the study area was evaluated, based on comparison of the designed water-table drop of 30 cm after 2 days of drainage with predicted and field-measured hydraulic heads over a consecutive drainage period of 14 days. The investigation revealed that the actual water-table drop after 2 days of drainage was 25 cm, about 17% less than the designed value of 30 cm. The comparison of hydraulic heads predicted by the Van Schilfgaarde equation of unsteady drainage with the field-measured hydraulic heads showed that the predicted heads deviated within a range of ±8%, indicating high acceptability of the Van Schilfgaarde equation for designing subsurface drainage systems in saline and waterlogged soils resembling those of the study area.
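The head predictions above can be illustrated with the Van Schilfgaarde unsteady-state equation, L^2 = 9*K*de*t / (f * ln[m0*(2*de + mt) / (mt*(2*de + m0))]), rearranged to give the midpoint water-table height mt above drain level at time t. The parameter values below are illustrative stand-ins, not those of the study site:

```python
import math

def head_at_time(t, m0, K, de, f, L):
    """Midpoint water-table height above drain level after t days,
    from the Van Schilfgaarde unsteady drainage equation solved for mt:
        A  = exp(9*K*de*t / (f*L**2))
        mt = 2*de*m0 / (A*(2*de + m0) - m0)
    m0: initial head (m), K: hydraulic conductivity (m/day),
    de: equivalent depth (m), f: drainable porosity (-), L: spacing (m)."""
    a = math.exp(9.0 * K * de * t / (f * L ** 2))
    return 2.0 * de * m0 / (a * (2.0 * de + m0) - m0)

# Illustrative drainage design parameters (not the study-site values).
params = dict(m0=1.0, K=0.5, de=2.0, f=0.05, L=60.0)
for day in (0, 2, 7, 14):
    print(day, round(head_at_time(day, **params), 3))
```

At t = 0 the rearranged formula returns m0 exactly, and the head decays monotonically thereafter, which is the behaviour compared against field measurements over the 14-day drainage period.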
Competition between hardwood hammocks and mangroves
Sternberg, L.D.S.L.; Teh, S.Y.; Ewe, S.M.L.; Miralles-Wilhelm, F.; DeAngelis, D.L.
2007-01-01
The boundaries between mangroves and freshwater hammocks in coastal ecotones of South Florida are sharp. Further, previous studies indicate that there is a discontinuity in plant predawn water potentials, with woody plants either showing predawn water potentials reflecting exposure to saline water or exposure to freshwater. This abrupt concurrent change in community type and plant water status suggests that there might be feedback dynamics between vegetation and salinity. A model examining the salinity of the aerated zone of soil overlying a saline body of water, known as the vadose layer, as a function of precipitation, evaporation and plant water uptake is presented here. The model predicts that mixtures of saline and freshwater vegetative species represent unstable states. Depending on the initial vegetation composition, subsequent vegetative change will lead either to patches of mangrove coverage having a high salinity vadose zone or to freshwater hammock coverage having a low salinity vadose zone. Complete or nearly complete coverage by either freshwater or saltwater vegetation represents two stable steady-state points. This model can explain many of the previous observations of vegetation patterns in coastal South Florida as well as observations on the dynamics of vegetation shifts caused by sea level rise and climate change. © 2007 Springer Science+Business Media, LLC.
A Tidally Averaged Sediment-Transport Model for San Francisco Bay, California
Lionberger, Megan A.; Schoellhamer, David H.
2009-01-01
A tidally averaged sediment-transport model of San Francisco Bay was incorporated into a tidally averaged salinity box model previously developed and calibrated using salinity, a conservative tracer (Uncles and Peterson, 1995; Knowles, 1996). The Bay is represented in the model by 50 segments composed of two layers: one representing the channel (>5-meter depth) and the other the shallows (0- to 5-meter depth). Calculations are made using a daily time step and simulations can be made on the decadal time scale. The sediment-transport model includes an erosion-deposition algorithm, a bed-sediment algorithm, and sediment boundary conditions. Erosion and deposition of bed sediments are calculated explicitly, and suspended sediment is transported by implicitly solving the advection-dispersion equation. The bed-sediment model simulates the increase in bed strength with depth, owing to consolidation of fine sediments that make up San Francisco Bay mud. The model is calibrated to either net sedimentation calculated from bathymetric-change data or measured suspended-sediment concentration. Specified boundary conditions are the tributary fluxes of suspended sediment and suspended-sediment concentration in the Pacific Ocean. Results of model calibration and validation show that the model simulates well the trends in suspended-sediment concentration associated with tidal fluctuations, residual velocity, and wind stress, although the spring-neap variability in suspended-sediment concentration was consistently underestimated. Model validation also showed poor simulation of seasonal sediment pulses from the Sacramento-San Joaquin River Delta at Point San Pablo because the pulses enter the Bay over only a few days and the fate of the pulses is determined by intra-tidal deposition and resuspension that are not included in this tidally averaged model. The model was calibrated to net-basin sedimentation to calculate budgets of sediment and sediment-associated contaminants. 
While simulated net sedimentation in the four basins that comprise San Francisco Bay was correct, the simulations incorrectly eroded shallows while channels deposited because model surface-layer boxes span both shallows and channels, and neglect lateral variability of suspended-sediment concentration. Validation with recent (1983-2005) net sedimentation in South San Francisco Bay was poor, perhaps owing to poorly quantified sediment supply, and to invasive species that altered erosion and deposition processes. This demonstrates that deterministically predicting future sedimentation is difficult in this or any estuary for which boundary conditions are not stationary. The model would best be used as a tool for developing past and present sediment budgets, and for creating scenarios of future sedimentation that are compared to one another rather than considered a deterministic prediction.
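The implicit solution of the advection-dispersion equation mentioned above amounts to solving a tridiagonal linear system each time step. A minimal sketch of one backward-Euler dispersion step on a 1-D chain of segments, using the Thomas algorithm with zero-flux ends so that mass is conserved (a generic numerical illustration, not the Bay model's actual discretization):

```python
def thomas(lower, diag, upper, rhs):
    """Solve a tridiagonal system by forward elimination and back
    substitution (Thomas algorithm)."""
    n = len(diag)
    d, b = diag[:], rhs[:]
    for i in range(1, n):
        f = lower[i] / d[i - 1]
        d[i] -= f * upper[i - 1]
        b[i] -= f * b[i - 1]
    x = [0.0] * n
    x[-1] = b[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (b[i] - upper[i] * x[i + 1]) / d[i]
    return x

def implicit_dispersion_step(c, r):
    """One backward-Euler step of dc/dt = D * d2c/dx2 with zero-flux
    boundaries; r = D*dt/dx**2. Solves (I - r*Laplacian) c_new = c."""
    n = len(c)
    lower = [0.0] + [-r] * (n - 1)
    upper = [-r] * (n - 1) + [0.0]
    diag = [1.0 + r] + [1.0 + 2.0 * r] * (n - 2) + [1.0 + r]
    return thomas(lower, diag, upper, c)

# A suspended-sediment pulse spreading along five segments.
conc = [0.0, 0.0, 100.0, 0.0, 0.0]
for _ in range(3):
    conc = implicit_dispersion_step(conc, r=0.5)
print([round(x, 2) for x in conc])
```

With zero-flux boundaries the discrete operator has zero column sums, so the total mass is preserved exactly by each implicit step; the implicit form is also unconditionally stable, which suits the model's daily time step.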
Zhou, Minghua; Butterbach-Bahl, Klaus; Vereecken, Harry; Brüggemann, Nicolas
2017-03-01
Salinity intrusion caused by land subsidence resulting from increasing groundwater abstraction, decreasing river sediment loads and increasing sea level because of climate change has caused widespread soil salinization in coastal ecosystems. Soil salinization may greatly alter nitrogen (N) cycling in coastal ecosystems. However, a comprehensive understanding of the effects of soil salinization on ecosystem N pools, cycling processes and fluxes is not available for coastal ecosystems. Therefore, we compiled data from 551 observations from 21 peer-reviewed papers and conducted a meta-analysis of experimental soil salinization effects on 19 variables related to N pools, cycling processes and fluxes in coastal ecosystems. Our results showed that the effects of soil salinization varied across different ecosystem types and salinity levels. Soil salinization increased plant N content (18%), soil NH 4 + (12%) and soil total N (210%), although it decreased soil NO 3 - (2%) and soil microbial biomass N (74%). Increasing soil salinity stimulated soil N 2 O fluxes as well as hydrological NH 4 + and NO 2 - fluxes more than threefold, although it decreased the hydrological dissolved organic nitrogen (DON) flux (59%). Soil salinization also increased the net N mineralization by 70%, although salinization effects were not observed on the net nitrification, denitrification and dissimilatory nitrate reduction to ammonium in this meta-analysis. Overall, this meta-analysis improves our understanding of the responses of ecosystem N cycling to soil salinization, identifies knowledge gaps and highlights the urgent need for studies on the effects of soil salinization on coastal agro-ecosystem and microbial N immobilization. 
Additional increases in knowledge are critical for designing sustainable adaptation measures to the predicted increase in salinity intrusion, so that the productivity of coastal agro-ecosystems can be maintained or improved and N losses and pollution of the natural environment can be minimized. © 2016 John Wiley & Sons Ltd.
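Percent changes like the 18% increase in plant N content reported above are typically back-transformed from log response ratios. A minimal sketch of that back-transform; the paper's actual weighting and pooling scheme may differ, and the example numbers are hypothetical:

```python
import math

def log_response_ratio(mean_treatment, mean_control):
    """Effect size commonly used in ecological meta-analyses:
    ln(treatment mean / control mean)."""
    return math.log(mean_treatment / mean_control)

def percent_change(lnrr):
    """Back-transform a (pooled) log response ratio to a percent change."""
    return (math.exp(lnrr) - 1.0) * 100.0

# Hypothetical example: salinization raises plant N from 2.00 to 2.36 mg/g,
# an effect size of ln(1.18) ~= 0.166, i.e. an 18% increase.
lnrr = log_response_ratio(2.36, 2.00)
print(round(percent_change(lnrr)))  # prints 18
```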
Zhong, Zhi-Ping; Liu, Ying; Miao, Li-Li; Wang, Fang; Chu, Li-Min; Wang, Jia-Li
2016-01-01
The prokaryotic community composition and diversity and the distribution patterns at various taxonomic levels across gradients of salinity and physiochemical properties in the surface waters of seven plateau lakes in the Qaidam Basin, Tibetan Plateau, were evaluated using Illumina MiSeq sequencing. These lakes included Lakes Keluke (salinity, <1 g/liter), Qing (salinity, 5.5 to 6.6 g/liter), Tuosu (salinity, 24 to 35 g/liter), Dasugan (salinity, 30 to 33 g/liter), Gahai (salinity, 92 to 96 g/liter), Xiaochaidan (salinity, 94 to 99 g/liter), and Gasikule (salinity, 317 to 344 g/liter). The communities were dominated by Bacteria in lakes with salinities of <100 g/liter and by Archaea in Lake Gasikule. The clades At12OctB3 and Salinibacter, previously reported only in hypersaline environments, were found in a hyposaline lake (salinity, 5.5 to 6.6 g/liter) at an abundance of ∼1.0%, indicating their ecological plasticity. Salinity and the concentrations of the ions that covary with salinity (Mg2+, K+, Cl−, Na+, SO42−, and Ca2+) were found to be the primary environmental factors that directly or indirectly determined the composition and diversity at the level of individual clades as well as entire prokaryotic communities. The distribution patterns of two phyla, five classes, five orders, five families, and three genera were well predicted by salinity. The variation of the prokaryotic community structure also significantly correlated with the dissolved oxygen concentration, pH, the total nitrogen concentration, and the PO43− concentration. Such correlations varied depending on the taxonomic level, demonstrating the importance of comprehensive correlation analyses at various taxonomic levels in evaluating the effects of environmental variables on prokaryotic community structures.
Our findings clarify the distribution patterns of the prokaryotic community composition in plateau lakes at the levels of individual clades as well as whole communities along gradients of salinity and ionic concentrations. PMID:26746713
Response of Stream Biodiversity to Increasing Salinization
NASA Astrophysics Data System (ADS)
Hawkins, C. P.; Vander Laan, J. J.; Olson, J. R.
2014-12-01
We used a large data set of macroinvertebrate samples collected from streams in both reference-quality (n = 68) and degraded (n = 401) watersheds in the state of Nevada, USA to assess relationships between stream biodiversity and salinity. We used specific electrical conductance (EC, in μS/cm) as a measure of salinity, and applied a previously developed EC model to estimate natural, baseflow salinity at each stream. We used the difference between observed and predicted salinity (EC-Diff) as a measure of salinization associated with watershed degradation. Observed levels of EC ranged from 22 to 994 μS/cm across reference sites and from 22 to 3,256 μS/cm across non-reference sites. EC-Diff was as high as 2,743 μS/cm. We used a measure of local biodiversity completeness (the ratio of observed to expected number of taxa) to assess ecological response to salinity. This O/E index decreased nearly linearly up to about 25% biodiversity loss, which occurred at an EC-Diff of about 300 μS/cm. Too few sites had EC-Diff greater than 300 μS/cm to draw reliable inferences regarding biodiversity response to greater levels of salinization. EC-Diff increased with % agricultural land use, mine density, and % urban land use in the watersheds, implying that human activities have been largely responsible for increased salinization in Nevada streams and rivers. Comparison of biological responses to EC and other stressors indicates that increased salinization may be the primary stressor causing biodiversity loss in these streams and that more stringent salinity water quality standards may be needed to protect aquatic life.
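The two quantities driving the analysis above, EC-Diff and the O/E index, are simple to compute once the observed EC, the modeled natural EC, and the reference taxa list are in hand. A minimal sketch; RIVPACS-style models actually weight expected taxa by capture probability, and the taxa named below are hypothetical:

```python
def ec_diff(observed_ec, predicted_natural_ec):
    """Salinization attributable to watershed degradation (both in uS/cm)."""
    return observed_ec - predicted_natural_ec

def oe_index(observed_taxa, expected_taxa):
    """Local biodiversity completeness: the fraction of reference-expected
    taxa actually collected at the site."""
    return len(set(observed_taxa) & set(expected_taxa)) / len(expected_taxa)

# Hypothetical site: 2 of the 4 expected taxa were found -> O/E = 0.5,
# and EC exceeds its modeled natural baseline by 300 uS/cm.
expected = {"Baetis", "Ephemerella", "Drunella", "Hydropsyche"}
observed = {"Baetis", "Hydropsyche", "Chironomus"}
print(oe_index(observed, expected))  # prints 0.5
print(ec_diff(950, 650))             # prints 300
```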
Mukherjee, Kaushik; Gupta, Sanjay
2017-03-01
Several mechanobiology algorithms have been employed to simulate bone ingrowth around porous coated implants. However, there is a scarcity of quantitative comparison between the efficacies of commonly used mechanoregulatory algorithms. The objectives of this study are: (1) to predict peri-acetabular bone ingrowth using a cell-phenotype specific algorithm and to compare these predictions with those obtained using a phenomenological algorithm and (2) to investigate the influences of cellular parameters on bone ingrowth. The variation in host bone material property and interfacial micromotion of the implanted pelvis were mapped onto the microscale model of the implant-bone interface. An overall variation of 17-88% in peri-acetabular bone ingrowth was observed. Despite differences in predicted tissue differentiation patterns during the initial period, both algorithms predicted a similar spatial distribution of the neo-tissue layer after attainment of equilibrium. Results indicated that the phenomenological algorithm, being computationally faster than the cell-phenotype specific algorithm, might be used to predict peri-prosthetic bone ingrowth. The cell-phenotype specific algorithm, however, was found to be useful in numerically investigating the influence of alterations in cellular activities on bone ingrowth, owing to biologically related factors. Amongst the host of cellular activities, the matrix production rate of bone tissue was found to have the predominant influence on peri-acetabular bone ingrowth.
NASA Astrophysics Data System (ADS)
Askri, Brahim; Bouhlila, Rachida; Job, Jean Olivier
2010-01-01
In modern oases situated in the south of Tunisia, secondary salination of irrigated lands is a crucial problem. The visible salt deposits and soil salination processes are the consequence of several factors including the excessive use of saline irrigation water, seepage from earthen canal systems, inefficient irrigation practices and inadequate drainage. Understanding the mechanism of the secondary salination is of interest in order to maintain existing oases, and thus ensure the sustainability of date production in this part of the country. Therefore, a conceptual, daily, semi-distributed hydrologic model (OASIS_MOD) was developed to analyse the impact of irrigation management on the water table fluctuation, soil salinity and drain discharge, and to evaluate measures to control salinity within an oasis ecosystem. The basic processes incorporated in the model are irrigation, infiltration, percolation to the shallow groundwater, soil evaporation, crop transpiration, groundwater flow, capillary rise flux, and drain discharge. OASIS_MOD was tested with data collected in a parcel of farmland situated in the Segdoud oasis, in the south-west of Tunisia. The calibration results showed that groundwater levels were simulated with acceptable accuracy, since the differences between the simulated and measured values are less than 0.22 m. However, the model under-predicted some water table peaks when irrigation occurred, due to inconsistencies in the irrigation water data. The validation results showed that deviations between observed and simulated groundwater levels increased to about 0.5 m due to under-estimation of groundwater inflow from an upstream palm plantation. A long-term simulation scenario revealed that the soil salinity and groundwater level have three types of variability in time: a daily variability due to irrigation practices, a seasonal fluctuation due to climatic conditions and an annual variability explained by the increase in cultivated areas.
The irrigation interval was found to be important: irrigating once every ten days led to increased soil salinity during the dry summer season and a rising water table during the autumn-winter period. The annual increase in the irrigated area caused a decrease of the irrigation water depths, and thus an augmentation of the soil and groundwater salinities. The surface area affected by a soil salinity concentration above 15 g/L increased from 2% of the study parcel area in June 1992 to about 50% four years later due to the abandonment of several cultivated basins.
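OASIS_MOD couples irrigation, infiltration, percolation, evaporation, transpiration, groundwater flow, capillary rise and drain discharge; the core bookkeeping of any such model, though, is a daily water-and-salt balance. The sketch below is a single-bucket simplification under strong assumptions (one root-zone layer, no capillary rise or drainage terms, salt leached in proportion to percolating water); it is not the OASIS_MOD formulation:

```python
def daily_salt_balance(storage, salt, irrigation, irr_salt_per_mm, et, capacity):
    """One day of a single-bucket root-zone balance.
    storage, irrigation, et, capacity in mm of water; salt in g/m2;
    irr_salt_per_mm is the salt load per mm of irrigation water (g/m2/mm).
    Water above field capacity percolates, leaching salt proportionally;
    evapotranspiration removes water but leaves the salt behind."""
    storage += irrigation
    salt += irrigation * irr_salt_per_mm
    storage -= min(et, storage)
    percolation = max(0.0, storage - capacity)
    if storage > 0:
        salt -= salt * percolation / storage
    storage -= percolation
    return storage, salt, percolation

# Hypothetical day: 50 mm of irrigation at 2 g/m2 per mm, 30 mm of ET,
# 90 mm field capacity -> 10 mm percolates and leaches 20 g/m2 of salt.
state = daily_salt_balance(80.0, 100.0, 50.0, 2.0, 30.0, 90.0)
```

Because ET removes water but not salt, repeated dry-season irrigation at long intervals concentrates salt in the root zone, which is the behaviour the abstract reports.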
Do Assimilated Drifter Velocities Improve Lagrangian Predictability in an Operational Ocean Model?
2015-05-01
extended Kalman filter. Molcard et al. (2005) used a statistical method to correlate model and drifter velocities. Taillandier et al. (2006) describe the... temperature and salinity observations. Trajectory angular differences are also reduced. 1. Introduction: The importance of Lagrangian forecasts was seen... Temperature, salinity, and sea surface height (SSH, measured along-track by satellite altimeters) observations are typically assimilated in
Development and Testing of a Coupled Ocean-atmosphere Mesoscale Ensemble Prediction System
2011-06-28
wind, temperature, and moisture variables, while the oceanographic ET is derived from ocean current, temperature, and salinity variables. Estimates of... uncertainty in the model. Rigorously accurate ensemble methods for describing the distribution of future states given past information include particle
NASA Astrophysics Data System (ADS)
Nash, Ciaran; Bourke, Mary
2017-04-01
Coastal sand dune systems are some of the most physically dynamic landscapes; their susceptibility to geomorphic change is rooted in a host of interconnected processes and feedbacks. Soil moisture and salinity are two fundamental environmental variables capable of exerting a geomorphic influence, but they have not been thoroughly investigated in coastal dunes. In northwest Europe, coastal dunes are predominantly sediment-limited systems with reduced capacities to avoid severe morphological changes arising from storms. Climatic changes over the next century are predicted to manifest in more frequent and intense storms with the potential to enact severe geomorphic change in coastal settings. A lack of data pertaining to internal dune hydrosaline dynamics suggests we are missing part of the bigger picture. We conducted a pilot study of moisture and salinity dynamics within the upper 50 cm of the vadose zone in a vegetated dune system at Golden Strand, Achill Island on the west coast of Ireland. Golden Strand is a roughly 800 m long embayed sandy beach, backed by vegetated dunes that protect a low-lying machair grassland. A study transect was established across this dune-machair system, perpendicular to the shore. Capacitance probes and internal dune thermochrons were deployed at 10 cm depth intervals, sampling every 10 minutes, and the records were coupled with on-site rainfall data. Results indicate that dune moisture tracks rainfall inputs up to 30 cm depth. Antecedent moisture at depth was found to influence infiltration of water through the dune profile. Salinity within the study transect decreased with distance from the beach, suggesting that salt spray is the primary salt delivery mechanism in the dune system. We also noted that moisture and salinity below 30 cm depth failed to respond to rainfall events of varying intensities. Relatively constant moisture and salinity were observed at all depths within the machair.
Predictions of climatic change for Ireland suggest more intense short-period precipitation events, which may increase infiltration depth. The baseline data collected here will inform predictions of how Irish coastal dunes respond via changes in vegetation and dune stability.
NASA Astrophysics Data System (ADS)
Perelman, A.; Guerra, H. J.; Pohlmeier, A. J.; Vanderborght, J.; Lazarovitch, N.
2017-12-01
When salinity increases beyond a certain threshold, crop yield will decrease at a fixed rate, according to the Maas and Hoffman model (1976). Thus, it is highly important to predict salinization and its impact on crops. Current models do not consider the impact of the transpiration rate on plant salt tolerance, although it affects plant water uptake and thus salt accumulation around the roots, consequently influencing the plant's sensitivity to salinity. Better model parametrization can improve the prediction of real salinity effects on crop growth and yield. The aim of this research is to study Na+ distribution around roots at different scales using different non-invasive methods, and to examine how this distribution is affected by the transpiration rate and plant water uptake. Results from tomato plants that were grown on rhizoslides (a capillary paper growth system) showed that the Na+ concentration was higher at the root-substrate interface than in the bulk. Also, Na+ accumulation around the roots decreased under a low transpiration rate, supporting our hypothesis. The rhizoslides enabled the root growth rate and architecture to be studied under different salinity levels. The root system architecture was retrieved from photos taken during the experiment, enabling us to incorporate real root systems into a simulation. Magnetic resonance imaging (MRI) was used to observe correlations between root system architectures and Na+ distribution. The MRI provided fine resolution of the Na+ accumulation around a single root without disturbing the root system. With time, Na+ accumulated only where roots were found in the soil and later around specific roots. Rhizoslides allow the root systems of larger plants to be investigated, but this method is limited by the medium (paper) and the dimension (2D). The MRI can create a 3D image of Na+ accumulation in soil on a microscopic scale. 
These data are being used for model calibration, which is expected to enable the prediction of root water uptake in saline soils for different climatic conditions and different soil water availabilities.
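The Maas-Hoffman (1976) model cited above is a piecewise-linear salt-tolerance response: relative yield is 1 below a crop-specific threshold salinity and then declines at a fixed slope. A sketch; the tomato-like parameter values are illustrative only:

```python
def relative_yield(ece, threshold, slope_pct):
    """Maas-Hoffman piecewise-linear response: no yield loss below the
    threshold salinity; above it, yield falls at slope_pct percent per
    unit of ECe (dS/m), clipped to the range [0, 1]."""
    return max(0.0, min(1.0, 1.0 - slope_pct / 100.0 * (ece - threshold)))

# Tomato-like values (illustrative): threshold 2.5 dS/m, slope 9.9 %/(dS/m)
print(relative_yield(2.0, 2.5, 9.9))   # 1.0: below threshold, no loss
print(relative_yield(7.5, 2.5, 9.9))   # ~0.505: roughly half the yield
```

The abstract's point is that these two parameters are treated as constants, whereas transpiration-dependent salt accumulation around the roots suggests they should not be.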
Improving personalized link prediction by hybrid diffusion
NASA Astrophysics Data System (ADS)
Liu, Jin-Hu; Zhu, Yu-Xiao; Zhou, Tao
2016-04-01
Inspired by traditional link prediction and by the problem of recommending friends in social networks, we introduce personalized link prediction, in which each individual receives an equal number of diverse predictions. The performance of many classical algorithms is unsatisfactory under this framework, so new algorithms are urgently needed. Motivated by previous research in other fields, we generalize the heat conduction process to the framework of personalized link prediction and find that this method outperforms many classical similarity-based algorithms, especially in diversity. In addition, we demonstrate that adding one ground node connected to all the nodes in the system greatly benefits the performance of heat conduction. Finally, we propose hybrid algorithms composed of local random walk and heat conduction. Numerical results show that the hybrid algorithms outperform the other algorithms simultaneously in all four adopted metrics: AUC, precision, recall and Hamming distance. In short, this work may shed some light on the in-depth understanding of the effect of physical processes in personalized link prediction.
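A bare two-step version of heat conduction on an undirected graph illustrates the averaging that distinguishes it from random-walk (mass-diffusion) methods; the paper's full method additionally uses a ground node and hybridization with local random walk, neither of which is shown here. The graph is a toy example:

```python
def heat_conduction_scores(adj, target):
    """Two-step heat conduction on an undirected graph (adjacency given
    as a dict of neighbor sets). Start with unit 'temperature' on the
    target's neighbors; each node then takes the *average* temperature
    of its neighbors (a random walk would sum instead of average)."""
    t1 = {v: (1.0 if target in adj[v] else 0.0) for v in adj}
    t2 = {v: sum(t1[u] for u in adj[v]) / len(adj[v]) for v in adj}
    # Score only candidate links: nodes not yet connected to the target.
    return {v: t2[v] for v in adj if v != target and v not in adj[target]}

# Toy graph: "a" and "d" share two common neighbors ("b" and "c").
adj = {"a": {"b", "c"}, "b": {"a", "d"}, "c": {"a", "d"},
       "d": {"b", "c", "e"}, "e": {"d"}}
scores = heat_conduction_scores(adj, "a")
# "d" scores 2/3 (two of its three neighbors are warm); "e" scores 0.
```

The averaging favours candidates whose neighborhoods are not just large but well aligned with the target, which is what drives heat conduction's diversity advantage.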
Evolutionary Dynamic Multiobjective Optimization Via Kalman Filter Prediction.
Muruganantham, Arrchana; Tan, Kay Chen; Vadakkepat, Prahlad
2016-12-01
Evolutionary algorithms are effective in solving static multiobjective optimization problems resulting in the emergence of a number of state-of-the-art multiobjective evolutionary algorithms (MOEAs). Nevertheless, the interest in applying them to solve dynamic multiobjective optimization problems has only been tepid. Benchmark problems, appropriate performance metrics, as well as efficient algorithms are required to further the research in this field. One or more objectives may change with time in dynamic optimization problems. The optimization algorithm must be able to track the moving optima efficiently. A prediction model can learn the patterns from past experience and predict future changes. In this paper, a new dynamic MOEA using Kalman filter (KF) predictions in decision space is proposed to solve the aforementioned problems. The predictions help to guide the search toward the changed optima, thereby accelerating convergence. A scoring scheme is devised to hybridize the KF prediction with a random reinitialization method. Experimental results and performance comparisons with other state-of-the-art algorithms demonstrate that the proposed algorithm is capable of significantly improving the dynamic optimization performance.
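The KF prediction step in decision space can be illustrated with a textbook linear Kalman filter tracking one decision variable whose optimum drifts at a constant rate between environment changes. This is a generic sketch, not the paper's exact model; the noise covariances are arbitrary and the scoring/reinitialization scheme is omitted:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Time update: project the state estimate and covariance forward."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Measurement update with observation z."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# Track one decision variable whose optimum drifts +0.5 per change.
F = np.array([[1.0, 1.0], [0.0, 1.0]])     # constant-velocity model
H = np.array([[1.0, 0.0]])                 # only the position is observed
Q = 1e-4 * np.eye(2)                       # small process noise
R = np.array([[1e-2]])                     # small observation noise
x, P = np.array([0.0, 0.0]), np.eye(2)
for t in range(1, 6):                      # optima observed at 0.5 * t
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, np.array([0.5 * t]), H, R)
x_next, _ = kf_predict(x, P, F, Q)         # predicted next optimum, ~3.0
```

Seeding part of the population at the predicted location (and the rest by random reinitialization, as in the paper's scoring scheme) is what accelerates convergence after a change.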
Virag, Nathalie; Erickson, Mark; Taraborrelli, Patricia; Vetter, Rolf; Lim, Phang Boon; Sutton, Richard
2018-04-28
We developed a vasovagal syncope (VVS) prediction algorithm for use during head-up tilt with simultaneous analysis of heart rate (HR) and systolic blood pressure (SBP). We previously tested this algorithm retrospectively in 1155 subjects, showing sensitivity 95%, specificity 93% and a median prediction time of 59 s. This prospective, single-center study of 140 subjects evaluated the VVS prediction algorithm to assess whether the retrospective results were reproducible and clinically relevant. The primary endpoint was VVS prediction with sensitivity and specificity >80%. In subjects referred for 60° head-up tilt (Italian protocol), non-invasive HR and SBP were supplied to the VVS prediction algorithm: simultaneous analysis of RR intervals, SBP trends and their variability, represented by low-frequency power, generated a cumulative risk that was compared with a predetermined VVS risk threshold. When the cumulative risk exceeded the threshold, an alert was generated. Prediction time was the duration between the first alert and syncope. Of the 140 subjects enrolled, data were usable for 134. Of 83 tilt-positive subjects (61.9%), 81 VVS events were correctly predicted, and of 51 tilt-negative subjects (38.1%), 45 were correctly identified as negative by the algorithm. The resulting algorithm performance was sensitivity 97.6% and specificity 88.2%, meeting the primary endpoint. Mean VVS prediction time was 2 min 26 s ± 3 min 16 s, with a median of 1 min 25 s. Using only HR and HR variability (without SBP), the mean prediction time decreased to 1 min 34 s ± 1 min 45 s, with a median of 1 min 13 s. The VVS prediction algorithm is a clinically relevant tool with applications including providing a patient alarm, shortening tilt-test time, and triggering pacing intervention in implantable devices. Copyright © 2018. Published by Elsevier Inc.
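The thresholding logic described above (a cumulative risk compared against a predetermined threshold, with an alert on the first crossing) can be sketched generically; the risk computation itself, from RR intervals, SBP trends and low-frequency power, is specific to the algorithm and is not reproduced here, and the trace below is hypothetical:

```python
def first_alert_time(cumulative_risk, threshold):
    """Index of the first sample where cumulative risk reaches the
    predetermined threshold, or None if it never does."""
    for t, risk in enumerate(cumulative_risk):
        if risk >= threshold:
            return t
    return None

def prediction_time_s(alert_index, event_index, sample_period_s):
    """Seconds between the first alert and the syncope event."""
    return (event_index - alert_index) * sample_period_s

# Hypothetical per-second risk trace: the alert fires at sample 3,
# 7 s before a syncope event at sample 10.
risk = [0.1, 0.3, 0.9, 1.2, 1.5]
alert = first_alert_time(risk, threshold=1.0)
```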
Arts, E E A; Popa, C D; Den Broeder, A A; Donders, R; Sandoo, A; Toms, T; Rollefstad, S; Ikdahl, E; Semb, A G; Kitas, G D; Van Riel, P L C M; Fransen, J
2016-04-01
Predictive performance of cardiovascular disease (CVD) risk calculators appears suboptimal in rheumatoid arthritis (RA). A disease-specific CVD risk algorithm may improve CVD risk prediction in RA. The objectives of this study are to adapt the Systematic COronary Risk Evaluation (SCORE) algorithm with determinants of CVD risk in RA and to assess the accuracy of CVD risk prediction calculated with the adapted SCORE algorithm. Data from the Nijmegen early RA inception cohort were used. The primary outcome was first CVD events. The SCORE algorithm was recalibrated by reweighing included traditional CVD risk factors and adapted by adding other potential predictors of CVD. Predictive performance of the recalibrated and adapted SCORE algorithms was assessed and the adapted SCORE was externally validated. Of the 1016 included patients with RA, 103 patients experienced a CVD event. Discriminatory ability was comparable across the original, recalibrated and adapted SCORE algorithms. The Hosmer-Lemeshow test results indicated that all three algorithms provided poor model fit (p<0.05) for the Nijmegen and external validation cohort. The adapted SCORE algorithm mainly improves CVD risk estimation in non-event cases and does not show a clear advantage in reclassifying patients with RA who develop CVD (event cases) into more appropriate risk groups. This study demonstrates for the first time that adaptations of the SCORE algorithm do not provide sufficient improvement in risk prediction of future CVD in RA to serve as an appropriate alternative to the original SCORE. Risk assessment using the original SCORE algorithm may underestimate CVD risk in patients with RA. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Bolduc, F.; Afton, A.D.
2004-01-01
The hydrology of marsh ponds influences aquatic invertebrate and waterbird communities. Hydrologic variables in marsh ponds of the Gulf Coast Chenier Plain are potentially affected by structural marsh management (SMM: levees, water control structures and impoundments) that has been implemented since the 1950s. Assuming that SMM restricts tidal flows and drainage of rainwater, we predicted that SMM would increase water depth, and concomitantly decrease salinity and transparency in impounded marsh ponds. We also predicted that SMM would increase seasonal variability in water depth in impounded marsh ponds because of the potential incapacity of water control structures to cope with large flooding events. In addition, we predicted that SMM would decrease spatial variability in water depth. Finally, we predicted that ponds of impounded freshwater (IF), oligohaline (IO), and mesohaline (IM) marshes would be similar in water depth, temperature, dissolved oxygen (O2), and transparency. Using a priori multivariate analysis of variance (MANOVA) contrast, we tested these predictions by comparing hydrologic variables within ponds of impounded and unimpounded marshes during winters 1997-1998 to 1999-2000 on Rockefeller State Wildlife Refuge, near Grand Chenier, Louisiana. Specifically, we compared hydrologic variables (1) between IM and unimpounded mesohaline marsh ponds (UM); and (2) among IF, IO, and IM marshes ponds. As predicted, water depth was higher and salinity and O2 were lower in IM than in UM marsh ponds. However, temperature and transparency did not differ between IM and UM marsh ponds. Water depth varied more among months in IM marsh ponds than within those of UM marshes, and variances among and within ponds were lower in IM than UM marshes. Finally, all hydrologic variables, except salinity, were similar among IF, IO, and IM marsh ponds. 
Hydrologic changes within marsh ponds due to SMM should (1) promote benthic invertebrate taxa that tolerate low levels of O2 and salinity; (2) deter waterbird species that cannot cope with increased water levels; and (3) reduce waterbird species diversity by decreasing spatial variability in water depth among and within marsh ponds.
NASA Astrophysics Data System (ADS)
Haddout, Soufiane; Igouzal, Mohammed; Maslouhi, Abdellatif
2016-09-01
The longitudinal variation of salinity and the maximum salinity intrusion length in an alluvial estuary are important environmental concerns for policy makers and managers, since they influence water quality, water utilization and agricultural development in estuarine environments and the potential use of water resources in general. The supermoon total lunar eclipse is a rare event. According to NASA, such events occurred only five times in the 1900s - in 1910, 1928, 1946, 1964 and 1982. After the 28 September 2015 total lunar eclipse, a Super Blood Moon eclipse will not recur before 8 October 2033. In this paper, for the first time, the impact of the combination of a supermoon and a total lunar eclipse on the salinity intrusion along an estuary is studied. The 28 September 2015 supermoon total lunar eclipse is the focus of this study and the Sebou river estuary (Morocco) is used as an application area. The Sebou estuary has high agricultural potential, is becoming one of the most important industrial zones in Morocco, and is experiencing a salt intrusion problem. Hydrodynamic equations for tidal wave propagation coupled with the Savenije theory and a numerical salinity transport model (HEC-RAS software "Hydrologic Engineering Center River Analysis System") are applied to study the impact of the supermoon total lunar eclipse on the salinity intrusion. Intensive salinity measurements during this extreme event were recorded along the Sebou estuary. Measurements showed a modification of the shape of axial salinity profiles and a notable water elevation rise, compared with normal situations. The two optimization parameters (Van der Burgh's and dispersion coefficients) of the analytical model are estimated using the Levenberg-Marquardt algorithm (for solving nonlinear least-squares problems). The salinity transport model was calibrated and validated using field data.
The results show that the two models described the salt intrusion during the supermoon total lunar eclipse day very well. A good fit between computed salinity and measurements was obtained, as verified by statistical performance tests. These two models can give a rapid assessment of salinity distribution and consequently help ensure the safety of the water supply, even during such an infrequent astronomical phenomenon.
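The calibration step, estimating Van der Burgh's and dispersion coefficients with the Levenberg-Marquardt algorithm, amounts to damped nonlinear least squares. The sketch below implements a minimal Levenberg-Marquardt loop and fits a deliberately simplified exponential salinity curve S(x) = S0·exp(-x/L) to hypothetical measurements; Savenije's actual profile and its two coefficients are more elaborate:

```python
import numpy as np

def lm_fit(resid, jac, p0, lam=1e-2, iters=50):
    """Minimal Levenberg-Marquardt loop: solve the damped normal
    equations (J^T J + lam*I) dp = -J^T r, accepting steps that
    reduce the sum of squared residuals."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r, J = resid(p), jac(p)
        dp = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), -J.T @ r)
        if np.sum(resid(p + dp) ** 2) < np.sum(r ** 2):
            p, lam = p + dp, lam * 0.5   # good step: trust the model more
        else:
            lam *= 2.0                   # bad step: increase the damping
    return p

# Simplified profile S(x) = s0 * exp(-x/L); x in km from the estuary
# mouth, hypothetical salinities in g/L (near s0 = 30, L = 10).
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
s_obs = np.array([30.0, 18.2, 11.0, 6.7, 4.1])

def resid(p):
    s0, L = p
    return s0 * np.exp(-x / L) - s_obs

def jac(p):
    s0, L = p
    e = np.exp(-x / L)
    return np.column_stack([e, s0 * e * x / L**2])

s0_hat, L_hat = lm_fit(resid, jac, p0=[20.0, 5.0])
```

The damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam), which is why the method is robust to a poor starting guess.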
Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T
2017-07-01
Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed a satisfactory performance, while non-adaptive log-linear regression was the best performing prediction method. We conclude that there is evidence that the available algorithms for influenza detection and prediction display satisfactory performance when applied to local diagnostic data during winter influenza seasons. When applied to local syndromic data, the evaluated algorithms did not display consistent performance. Further evaluations and research on combining these types of methods in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
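The Serfling method referenced above fits a cyclic (harmonic) regression baseline to non-epidemic activity and flags weeks that exceed it. A compact sketch using ordinary least squares, with a fixed margin standing in for the usual prediction-interval threshold:

```python
import numpy as np

def serfling_baseline(weeks, counts, period=52.0):
    """Fit y = a + b*t + c*sin(2*pi*t/period) + d*cos(2*pi*t/period)
    by ordinary least squares; returns [a, b, c, d]."""
    t = np.asarray(weeks, float)
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(counts, float), rcond=None)
    return coef

def elevated(week, count, coef, margin, period=52.0):
    """Flag a week whose count exceeds the seasonal baseline by a fixed
    margin (real implementations use a prediction interval instead)."""
    basis = np.array([1.0, week,
                      np.sin(2 * np.pi * week / period),
                      np.cos(2 * np.pi * week / period)])
    return count > coef @ basis + margin
```

In practice the baseline is fitted only to non-epidemic training weeks, so the harmonic terms capture the ordinary seasonal cycle and crossings mark unusual activity.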
NASA Astrophysics Data System (ADS)
Muller, Sybrand Jacobus; van Niekerk, Adriaan
2016-07-01
Soil salinity often leads to reduced crop yield and quality and can render soils barren. Irrigated areas are particularly at risk due to intensive cultivation and secondary salinization caused by waterlogging. Regular monitoring of salt accumulation in irrigation schemes is needed to keep its negative effects under control. The dynamic spatial and temporal characteristics of remote sensing can provide a cost-effective solution for monitoring salt accumulation at irrigation scheme level. This study evaluated a range of pan-fused SPOT-5 derived features (spectral bands, vegetation indices, image textures and image transformations) for classifying salt-affected areas in two distinctly different irrigation schemes in South Africa, namely Vaalharts and Breede River. The relationship between the input features and electrical conductivity measurements was investigated using regression modelling (stepwise linear regression, partial least squares regression, curve fit regression modelling) and supervised classification (maximum likelihood, nearest neighbour, decision tree analysis, support vector machine and random forests). Classification and regression trees and random forests were used to select the most important features for differentiating salt-affected and unaffected areas. The results showed that the regression analyses produced weak models (R squared < 0.4). Better results were achieved using the supervised classifiers, but the algorithms tended to over-estimate salt-affected areas. A key finding was that none of the feature sets or classification algorithms stood out as being superior for monitoring salt accumulation at irrigation scheme level. This was attributed to the large variations in the spectral responses of different crop types at different growing stages, coupled with their individual tolerances to saline conditions.
Predicting the Distribution of Vibrio spp. in the Chesapeake Bay: A Vibrio cholerae Case Study
Magny, Guillaume Constantin de; Long, Wen; Brown, Christopher W.; Hood, Raleigh R.; Huq, Anwar; Murtugudde, Raghu; Colwell, Rita R.
2010-01-01
Vibrio cholerae, the causative agent of cholera, is a naturally occurring inhabitant of the Chesapeake Bay and serves as a predictor for other clinically important vibrios, including Vibrio parahaemolyticus and Vibrio vulnificus. A system was constructed to predict the likelihood of the presence of V. cholerae in surface waters of the Chesapeake Bay, with the goal to provide forecasts of the occurrence of this and related pathogenic Vibrio spp. Prediction was achieved by driving an available multivariate empirical habitat model estimating the probability of V. cholerae within a range of temperatures and salinities in the Bay, with hydrodynamically generated predictions of ambient temperature and salinity. The experimental predictions provided both an improved understanding of the in situ variability of V. cholerae, including identification of potential hotspots of occurrence, and usefulness as an early warning system. With further development of the system, prediction of the probability of the occurrence of related pathogenic vibrios in the Chesapeake Bay, notably V. parahaemolyticus and V. vulnificus, will be possible, as well as its transport to any geographical location where sufficient relevant data are available. PMID:20145974
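The habitat model above is an empirical probability surface over temperature and salinity, driven by hydrodynamic forecasts of those fields. A logistic stand-in with made-up coefficients shows the shape of such a predictor; the published model's functional form and fitted coefficients differ:

```python
import math

def vibrio_probability(temp_c, salinity_psu, b0=-10.0, b_temp=0.45, b_sal=-0.2):
    """Illustrative logistic habitat model for V. cholerae presence.
    The coefficients are invented for demonstration only; the published
    empirical model has its own form and fitted coefficients."""
    z = b0 + b_temp * temp_c + b_sal * salinity_psu
    return 1.0 / (1.0 + math.exp(-z))

# Warm, low-salinity water scores high; cold, salty water scores low.
p_warm_fresh = vibrio_probability(28.0, 5.0)    # high (~0.83)
p_cold_salty = vibrio_probability(10.0, 30.0)   # near zero
```

Evaluating such a function on gridded temperature and salinity forecasts yields the probability maps used for early warning.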
Chen, Yongqiang; Xie, Quan; Sari, Ahmad; ...
2017-11-21
Wettability of the oil/brine/rock system is an essential petro-physical parameter which governs subsurface multiphase flow behaviour and the distribution of fluids, thus directly affecting oil recovery. Recent studies [1–3] show that manipulation of injected brine composition can enhance oil recovery by shifting wettability from oil-wet to water-wet. However, what factor(s) control system wettability has not been completely elucidated due to incomplete understanding of the geochemical system. To isolate and identify the key factors at play, we used sulfate-free (SO42−-free) solutions in this paper to examine the effect of salinity (formation brine/FB, 10 times diluted formation brine/10 dFB, and 100 times diluted formation brine/100 dFB) on the contact angle of oil droplets at the surface of calcite. We then compared contact angle results with predictions of surface complexation by low salinity water using PHREEQC software. We demonstrate that the conventional dilution approach likely triggers an oil-wet system at low pH, which may explain why the low salinity water EOR-effect is not always observed by injecting low salinity water in carbonate reservoirs. pH plays a fundamental role in the surface chemistry of oil/brine interfaces, and wettability. Our contact angle results show that formation brine triggered a strongly water-wet system (contact angle = 35°) at pH 2.55, yet 100 times diluted formation brine led to a strongly oil-wet system (contact angle = 175°) at pH 5.68. Surface complexation modelling correctly predicted the wettability trend with salinity; the bond product sum ([>CaOH2+][−COO−] + [>CO3−][−NH+] + [>CO3−][−COOCa+]) increased with decreasing salinity. Finally, at pH < 6, dilution likely makes the calcite surface oil-wet, particularly for crude oils with a high base number. Yet, dilution probably causes water wetness at pH > 7 for crude oils with a high acid number.
DOE Office of Scientific and Technical Information (OSTI.GOV)
River salinity on a mega-delta, an unstructured grid model approach.
NASA Astrophysics Data System (ADS)
Bricheno, Lucy; Saiful Islam, Akm; Wolf, Judith
2014-05-01
With an average freshwater discharge of around 40,000 m³/s, the BGM (Brahmaputra, Ganges and Meghna) river system has the third largest discharge worldwide. The BGM river delta is a low-lying fertile area covering over 100,000 km², mainly in India and Bangladesh. Approximately two-thirds of Bangladesh's population works in agriculture, and these local livelihoods depend on freshwater sources directly linked to river salinity. The finite volume coastal ocean model (FVCOM) has been applied to the BGM delta in order to simulate river salinity under present and future climate conditions. Forced by a combination of regional climate model predictions and a basin-wide river catchment model, the 3D baroclinic delta model can determine river salinity under the current climate and make predictions for future wet and dry years. The river salinity demonstrates strong seasonal and tidal cycles, making it important for the model to capture a wide range of timescales. The unstructured mesh approach used in FVCOM is required to properly represent the delta's structure: a complex network of interconnected river channels. The model extends 250 km inland in order to capture the full extent of the tidal influence, and grid resolutions of tens of metres are required to represent narrow inland river channels. The use of FVCOM to simulate flows so far inland is a novel challenge, which also requires knowledge of the shape and cross-section of the river channels.
Lallias, Delphine; Hiddink, Jan G; Fonseca, Vera G; Gaspar, John M; Sung, Way; Neill, Simon P; Barnes, Natalie; Ferrero, Tim; Hall, Neil; Lambshead, P John D; Packer, Margaret; Thomas, W Kelley; Creer, Simon
2015-01-01
Assessing how natural environmental drivers affect biodiversity underpins our understanding of the relationships between complex biotic and ecological factors in natural ecosystems. Of all ecosystems, anthropogenically important estuaries represent a 'melting pot' of environmental stressors, typified by extreme salinity variations and associated biological complexity. Although existing models attempt to predict macroorganismal diversity over estuarine salinity gradients, attempts to model microbial biodiversity are limited for eukaryotes. Although diatoms commonly feature as bioindicator species, additional microbial eukaryotes represent a huge resource for assessing ecosystem health. Of these, meiofaunal communities may represent the optimal compromise between functional diversity that can be assessed using morphology and phenotype-environment interactions, as compared with smaller life fractions. Here, using Roche 454 sequencing of the 18S nSSU barcode, we investigate which of the local natural drivers are most strongly associated with microbial metazoan and sampled protist diversity across the full salinity gradient of the estuarine ecosystem. In order to investigate potential variation at the ecosystem scale, we compare two geographically proximate estuaries (Thames and Mersey, UK) with contrasting histories of anthropogenic stress. The data show that although community turnover is likely to be predictable, taxa are likely to respond to different environmental drivers and, in particular, hydrodynamics, salinity range and granulometry, according to varied life-history characteristics. At the ecosystem level, communities exhibited patterns of estuary-specific similarity within different salinity range habitats, highlighting the environmental sequencing biomonitoring potential of meiofauna, dispersal effects or both. PMID:25423027
NWRA AVOSS Wake Vortex Prediction Algorithm. 3.1.1
NASA Technical Reports Server (NTRS)
Robins, R. E.; Delisi, D. P.; Hinton, David (Technical Monitor)
2002-01-01
This report provides a detailed description of the wake vortex prediction algorithm used in the Demonstration Version of NASA's Aircraft Vortex Spacing System (AVOSS). The report includes all equations used in the algorithm, an explanation of how to run the algorithm, and a discussion of how the source code for the algorithm is organized. Several appendices contain important supplementary information, including suggestions for enhancing the algorithm and results from test cases.
Predicting Loss-of-Control Boundaries Toward a Piloting Aid
NASA Technical Reports Server (NTRS)
Barlow, Jonathan; Stepanyan, Vahram; Krishnakumar, Kalmanje
2012-01-01
This work presents an approach to predicting loss-of-control with the goal of providing the pilot a decision aid focused on maintaining the pilot's control action within predicted loss-of-control boundaries. The predictive architecture combines quantitative loss-of-control boundaries, a data-based predictive control boundary estimation algorithm and an adaptive prediction method to estimate Markov model parameters in real-time. The data-based loss-of-control boundary estimation algorithm estimates the boundary of a safe set of control inputs that will keep the aircraft within the loss-of-control boundaries for a specified time horizon. The adaptive prediction model generates estimates of the system Markov Parameters, which are used by the data-based loss-of-control boundary estimation algorithm. The combined algorithm is applied to a nonlinear generic transport aircraft to illustrate the features of the architecture.
Lee, Jae-Hong; Kim, Do-Hyung; Jeong, Seong-Nyum; Choi, Seong-Ho
2018-04-01
The aim of the current study was to develop a computer-assisted detection system based on a deep convolutional neural network (CNN) algorithm and to evaluate the potential usefulness and accuracy of this system for the diagnosis and prediction of periodontally compromised teeth (PCT). Combining pretrained deep CNN architecture and a self-trained network, periapical radiographic images were used to determine the optimal CNN algorithm and weights. The diagnostic and predictive accuracy, sensitivity, specificity, positive predictive value, negative predictive value, receiver operating characteristic (ROC) curve, area under the ROC curve, confusion matrix, and 95% confidence intervals (CIs) were calculated using our deep CNN algorithm, based on a Keras framework in Python. The periapical radiographic dataset was split into training (n=1,044), validation (n=348), and test (n=348) datasets. With the deep learning algorithm, the diagnostic accuracy for PCT was 81.0% for premolars and 76.7% for molars. Using 64 premolars and 64 molars that were clinically diagnosed as severe PCT, the accuracy of predicting extraction was 82.8% (95% CI, 70.1%-91.2%) for premolars and 73.4% (95% CI, 59.9%-84.0%) for molars. We demonstrated that the deep CNN algorithm was useful for assessing the diagnosis and predictability of PCT. Therefore, with further optimization of the PCT dataset and improvements in the algorithm, a computer-aided detection system can be expected to become an effective and efficient method of diagnosing and predicting PCT.
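The evaluation statistics listed above (accuracy, sensitivity, specificity, PPV, NPV) all derive from a 2×2 confusion matrix. A minimal sketch; the counts in the usage test are invented, not the study's data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic metrics from confusion-matrix counts
    (true positives, false positives, false negatives, true negatives)."""
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),   # recall on compromised teeth
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }
```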
A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).
Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong
2014-01-01
Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective computational models to predict ADRs. In the current study, our main aim was to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be cast as linear models; based on this finding, we propose a new algorithm, the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
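The Jaccard coefficient highlighted above measures the overlap between two drug-ADR association profiles. A minimal sketch over sets of ADR labels (the label names in the test are illustrative):

```python
def jaccard(profile_a, profile_b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| between two sets of ADR
    labels: 1.0 for identical profiles, 0.0 for disjoint ones."""
    a, b = set(profile_a), set(profile_b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0
```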
An improved reversible data hiding algorithm based on modification of prediction errors
NASA Astrophysics Data System (ADS)
Jafar, Iyad F.; Hiary, Sawsan A.; Darabkh, Khalid A.
2014-04-01
Reversible data hiding algorithms are concerned with the ability of hiding data and recovering the original digital image upon extraction. This issue is of interest in medical and military imaging applications. One particular class of such algorithms relies on the idea of histogram shifting of prediction errors. In this paper, we propose an improvement over one popular algorithm in this class. The improvement is achieved by employing a different predictor, the use of more bins in the prediction error histogram in addition to multilevel embedding. The proposed extension shows significant improvement over the original algorithm and its variations.
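A minimal sketch of the histogram-shifting idea on prediction errors, with a single carrier bin; real algorithms, including the multilevel extension proposed above, use more bins and handle overflow, but the reversibility mechanism is the same.

```python
def embed(errors, bits):
    """Embed payload bits by histogram shifting on prediction errors:
    positive errors shift up by 1 to vacate bin 1, and each zero error
    carries one bit (0 stays 0, 1 becomes 1)."""
    out, it = [], iter(bits)
    for e in errors:
        if e > 0:
            out.append(e + 1)          # shift to make room for bin 1
        elif e == 0:
            out.append(next(it, 0))    # carrier bin: write one bit
        else:
            out.append(e)              # negative errors untouched here
    return out

def extract(marked):
    """Recover the bits and restore the original prediction errors."""
    bits, errors = [], []
    for e in marked:
        if e in (0, 1):
            bits.append(e)
            errors.append(0)
        elif e > 1:
            errors.append(e - 1)       # undo the shift
        else:
            errors.append(e)
    return bits, errors
```

Because extraction restores the errors exactly, the original image can be rebuilt from the predictor, which is the defining property of reversible data hiding.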
NASA Astrophysics Data System (ADS)
Li, Zhong-xiao; Li, Zhen-chun
2016-09-01
The multichannel predictive deconvolution can be conducted in overlapping temporal and spatial data windows to solve for the 2D predictive filter for multiple removal. Generally, the 2D predictive filter can better remove multiples, at the cost of more computation time, compared with the 1D predictive filter. In this paper we first use a cross-correlation strategy to determine the limited supporting region of the filters, where the coefficients play a major role in multiple removal within the filter coefficient space. To solve for the 2D predictive filter, the traditional multichannel predictive deconvolution uses the least squares (LS) algorithm, which requires that primaries and multiples be orthogonal. To relax the orthogonality assumption, the iterative reweighted least squares (IRLS) algorithm and the fast iterative shrinkage thresholding (FIST) algorithm have been used to solve for the 2D predictive filter in multichannel predictive deconvolution under a non-Gaussian maximization (L1-norm minimization) constraint on the primaries. The FIST algorithm has been demonstrated to be a faster alternative to the IRLS algorithm. In this paper we introduce the FIST algorithm to solve for the filter coefficients in the limited supporting region of the filters. Compared with FIST-based multichannel predictive deconvolution without the limited supporting region of filters, the proposed method can reduce the computational burden effectively while achieving similar accuracy. Additionally, the proposed method can better balance multiple removal and primary preservation than the traditional LS-based multichannel predictive deconvolution and FIST-based single-channel predictive deconvolution. Synthetic and field data sets demonstrate the effectiveness of the proposed method.
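The FIST-style L1 solve can be sketched as iterative soft-thresholding with momentum. This toy version minimises 0.5‖Ax − b‖² + λ‖x‖₁ with plain lists and a deliberately crude Lipschitz bound; it only illustrates the optimisation scheme, not the paper's filter estimation.

```python
def soft_threshold(x, t):
    """Elementwise soft-thresholding: the proximal operator of t*||x||_1."""
    return [max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]

def fista_l1(A, b, lam, steps=200):
    """Toy FISTA solver for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    A is a list of rows. The Lipschitz constant of the smooth part is
    bounded crudely by the squared sum of |A_ij|, which keeps the step
    size safe for this illustration."""
    m, n = len(A), len(A[0])
    L = sum(abs(A[i][j]) for i in range(m) for j in range(n)) ** 2 or 1.0
    x = [0.0] * n
    y, t = list(x), 1.0
    for _ in range(steps):
        # gradient of the smooth part at y: A^T (A y - b)
        r = [sum(A[i][j] * y[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x_new = soft_threshold([y[j] - g[j] / L for j in range(n)], lam / L)
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        y = [x_new[j] + (t - 1.0) / t_new * (x_new[j] - x[j]) for j in range(n)]
        x, t = x_new, t_new
    return x
```

The L1 penalty drives small coefficients exactly to zero, which is why restricting the solve to a limited supporting region of the filter is natural in this framework.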
NASA Astrophysics Data System (ADS)
Burrage, D. M.; Wesson, J. C.; Wang, D. W.; Garrison, J. L.; Zhang, H.
2017-12-01
The launch of the Cyclone Global Navigation Satellite System (CYGNSS) constellation of 8 microsats carrying GPS L-band reflectometers on 15 December 2016, and continued operation of the L-band radiometer on the European Space Agency (ESA) Soil Moisture and Ocean Salinity (SMOS) satellite, allow these complementary technologies to coincidentally retrieve ocean surface roughness (mean square slope, MSS), surface wind speed (WSP), and sea surface salinity (SSS). The Carolina Offshore (Caro) airborne experiment was conducted jointly by NRL SSC and Purdue University from 7-11 May 2017 with the goal of under-flying CYGNSS and SMOS and overflying NOAA buoys, to obtain high-resolution reflectometer and radiometer data for combined retrieval of MSS, SSS and WSP on the continental shelf. Airborne instruments included NRL's Salinity Temperature and Roughness Remote Scanner (STARRS) L-, C- and IR-band radiometer system, and a 4-channel dual-pol L-band (GPS) and S-band (XM radio) reflectometer built by Purdue University. Flights either crossed NOAA buoys on various headings, or intersected with specular point ground tracks at predicted CYGNSS overpass times. Prevailing winds during Caro were light to moderate (1-8 m/s), so specular returns dominated the reflectometer Delay Doppler Maps (DDMs), and MSS was generally low. In contrast, stronger winds (1-12 m/s) and rougher seas (wave heights 1-5 m) were experienced during the preceding Maine Offshore (Maineo) experiment in March 2016. Several DDM observables were used to retrieve MSS and WSP, and radiometer brightness temperatures produced sea surface temperature (SST), SSS and also WSP estimates. The complementary relationship between radiometric emissivity, e, and reflectivity, r, expressed by Kirchhoff's relation e + r = 1, was exploited to seek consistent estimates of MSS and to use them to correct the SSS retrievals for sea surface roughness effects.
The relative performance and utility of the various airborne and satellite retrieval algorithms were assessed, and the coincident buoy, aircraft and satellite retrievals of MSS, WSP and SSS were compared. During Caro, WSP estimates from the different instruments generally agreed. Some anomalously high wind retrievals found here and elsewhere in current CYGNSS Level 2 data may be resolved by the science team's recent L1 calibration revision.
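The complementarity exploited in the experiment above is Kirchhoff's relation e + r = 1 for a surface in thermodynamic equilibrium; a trivial helper makes the bookkeeping explicit:

```python
def emissivity_from_reflectivity(r):
    """Kirchhoff's relation: emissivity e = 1 - reflectivity r.

    Both quantities are dimensionless; in practice each is
    polarisation- and incidence-angle-dependent, which this
    one-liner deliberately glosses over."""
    if not 0.0 <= r <= 1.0:
        raise ValueError("reflectivity must lie in [0, 1]")
    return 1.0 - r
```

This is what lets a reflectometer-derived roughness estimate feed a correction to the radiometric (emissivity-based) salinity retrieval.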
Mortality risk score prediction in an elderly population using machine learning.
Rose, Sherri
2013-03-01
Standard practice for prediction often relies on parametric regression methods. Interesting new methods from the machine learning literature have been introduced in epidemiologic studies, such as random forest and neural networks. However, a priori, an investigator will not know which algorithm to select and may wish to try several. Here I apply the super learner, an ensembling machine learning approach that combines multiple algorithms into a single algorithm and returns a prediction function with the best cross-validated mean squared error. Super learning is a generalization of stacking methods. I used super learning in the Study of Physical Performance and Age-Related Changes in Sonomans (SPPARCS) to predict death among 2,066 residents of Sonoma, California, aged 54 years or more during the period 1993-1999. The super learner for predicting death (risk score) improved upon all single algorithms in the collection of algorithms, although its performance was similar to that of several algorithms. Super learner outperformed the worst algorithm (neural networks) by 44% with respect to estimated cross-validated mean squared error and had an R2 value of 0.201. The improvement of super learner over random forest with respect to R2 was approximately 2-fold. Alternatives for risk score prediction include the super learner, which can provide improved performance.
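The cross-validated selection step above can be sketched as follows. For brevity the candidates are pre-fitted prediction functions and only the "discrete" super learner (pick the single best candidate by CV MSE) is shown; the full method re-fits each candidate on every training fold and learns an optimally weighted combination.

```python
def cv_mse(predict, xs, ys, k=4):
    """K-fold cross-validated mean squared error of a prediction function
    (interleaved folds; real super learning re-fits per fold)."""
    n = len(xs)
    total = 0.0
    for fold in range(k):
        for i in range(fold, n, k):
            total += (predict(xs[i]) - ys[i]) ** 2
    return total / n

def discrete_super_learner(candidates, xs, ys):
    """Return the candidate with the lowest CV MSE."""
    return min(candidates, key=lambda f: cv_mse(f, xs, ys))
```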
Improved hybrid optimization algorithm for 3D protein structure prediction.
Zhou, Changjun; Hou, Caixia; Wei, Xiaopeng; Zhang, Qiang
2014-07-01
A new improved hybrid optimization algorithm, the PGATS algorithm, based on a toy off-lattice model, is presented for dealing with three-dimensional protein structure prediction problems. The algorithm combines particle swarm optimization (PSO), the genetic algorithm (GA), and tabu search (TS). In addition, several improvement strategies are adopted: a stochastic disturbance factor is introduced into particle swarm optimization to improve its search ability; the crossover and mutation operations of the genetic algorithm are replaced with a random linear method; and the tabu search algorithm is improved by appending a mutation operator. Through the combination of a variety of strategies and algorithms, protein structure prediction (PSP) in a 3D off-lattice model is achieved. The PSP problem is NP-hard, but it can be cast as a global optimization problem with multiple extrema and multiple parameters. This is the theoretical principle behind the hybrid optimization algorithm proposed in this paper. The algorithm combines local search and global search, which overcomes the shortcomings of any single algorithm and gives full play to the advantages of each. The method is verified on widely used standard sequences: Fibonacci sequences and real protein sequences. Experiments show that the proposed method outperforms single algorithms on the accuracy of calculating the protein sequence energy value, proving it to be an effective way to predict the structure of proteins.
An acoustic backscatter thermometer for remotely mapping seafloor water temperature
NASA Astrophysics Data System (ADS)
Jackson, Darrell R.; Dworski, J. George
1992-01-01
A bottom-mounted, circularly scanning sonar operating at 40 kHz has been used to map changes in water sound speed over a circular region 150 m in diameter. If it is assumed that the salinity remains constant, the change in sound speed can be converted to a change in temperature. For the present system, the spatial resolution is 7.5 m and the temperature resolution is 0.05°C. The technique is based on comparison of successive sonar scans by means of a correlation algorithm. The algorithm is illustrated using data from the Sediment Transport Events on Slopes and Shelves (STRESS) experiment.
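Under the constant-salinity assumption above, a measured sound-speed change maps linearly to a temperature change. The sensitivity used below (~3 m/s per °C) is a typical seawater value for illustration, not a figure from the paper.

```python
def temperature_change(dc_mps, dcdT=3.0):
    """Convert a sound-speed change dc (m/s) into a temperature change
    (deg C), assuming salinity and pressure are constant.

    dc/dT ~ 3 m/s per deg C is a representative seawater sensitivity;
    the true value varies with temperature, salinity and depth."""
    return dc_mps / dcdT
```

With this sensitivity, resolving the 0.05 °C quoted above corresponds to detecting sound-speed changes of roughly 0.15 m/s between sonar scans.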
Evaluation of formation water chemistry and scale prediction: Bakken Shale
Thyne, Geoffrey; Brady, Patrick
2016-10-24
Determination of in situ formation water chemistry is an essential component of reservoir management. This study details the use of thermodynamic computer models to calculate reservoir pH and restore produced water analyses for prediction of scale formation. Bakken produced water samples were restored to formation conditions and calculations of scale formation performed. In situ pH is controlled by feldspar-clay equilibria. Calcite scale is readily formed due to changes in pH during the pressure drop from in situ to surface conditions. The formation of anhydrite and halite scale, which has been observed, was predicted only for the most saline samples. Finally, the formation of anhydrite and/or halite may be related to localized conditions of increased salinity as water is partitioned into the gas phase during production.
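Scale-prediction models of this kind flag a mineral as likely to precipitate when its saturation index is positive. A minimal sketch (the activity and solubility products in the test are hypothetical numbers, not Bakken values):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp) for a mineral: SI > 0 means supersaturated
    (scale such as calcite or anhydrite may form), SI < 0 undersaturated,
    SI = 0 at equilibrium. IAP is the ion activity product and Ksp the
    solubility product at the conditions of interest."""
    return math.log10(iap / ksp)
```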
Bill, Brian D.; Moore, Stephanie K.; Hay, Levi R.; Anderson, Donald M.; Trainer, Vera L.
2016-01-01
Toxin-producing blooms of dinoflagellates in the genus Alexandrium have plagued the inhabitants of the Salish Sea for centuries. Yet an understanding of the environmental conditions that promote accelerated growth of this organism, a producer of paralytic shellfish toxins, is lacking. This study quantitatively determined the growth response of two Alexandrium isolates to a range of temperatures and salinities, factors that will respond strongly to future climate change scenarios. An empirical equation, derived from observed growth rates and describing the temperature and salinity dependence of growth, was used to hindcast bloom risk. Hindcasting was achieved by comparing predicted growth rates, calculated from in situ temperature and salinity data from Quartermaster Harbor, with corresponding Alexandrium cell counts and shellfish toxin data. The greatest bloom risk, defined as μ > 0.25 d⁻¹, generally occurred from April through November annually; however, growth rates rarely fell below 0.10 d⁻¹. Except for a few occasions, Alexandrium cells were only observed during the periods of highest bloom risk, and paralytic shellfish toxins above the regulatory limit always fell within the periods of predicted bloom occurrence. While acknowledging that Alexandrium growth rates are affected by other abiotic and biotic factors, such as grazing pressure and nutrient availability, the use of this empirical growth function to predict higher-risk time frames for blooms and toxic shellfish within the Salish Sea provides the groundwork for a more comprehensive biological model of Alexandrium bloom dynamics in the region and will enhance our ability to forecast blooms in the Salish Sea under future climate change scenarios. PMID:27037588
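The hindcasting step above reduces to evaluating an empirical growth surface μ(T, S) against the 0.25 d⁻¹ risk threshold. The Gaussian shape and every constant below are invented placeholders; the study fits its own equation to the observed growth rates.

```python
import math

def growth_rate(temp_c, sal_psu, mu_max=0.45, t_opt=18.0, s_opt=28.0,
                t_width=6.0, s_width=8.0):
    """Hypothetical Gaussian-shaped empirical growth surface mu(T, S)
    in d^-1, peaking at an assumed optimum temperature and salinity."""
    return mu_max * math.exp(-((temp_c - t_opt) / t_width) ** 2
                             - ((sal_psu - s_opt) / s_width) ** 2)

def bloom_risk(temp_c, sal_psu, threshold=0.25):
    """Flag bloom risk where predicted growth exceeds 0.25 d^-1,
    the threshold used in the study."""
    return growth_rate(temp_c, sal_psu) > threshold
```

Feeding a time series of in situ temperature and salinity through `bloom_risk` yields the higher-risk time windows that are then compared with cell counts and toxin data.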
Assimilation of temperature and salinity profile data in the Norwegian Climate Prediction Model
NASA Astrophysics Data System (ADS)
Wang, Yiguo; Counillon, Francois; Bertino, Laurent; Bethke, Ingo; Keenlyside, Noel
2016-04-01
Assimilating temperature and salinity profile data is a promising way to constrain the ocean component of Earth system models for the purpose of seasonal-to-decadal climate predictions. However, assimilating temperature and salinity profiles that are measured in a standard depth coordinate (z-coordinate) into isopycnic-coordinate ocean models, which are discretised by water density, is challenging. Prior studies (Thacker and Esenkov, 2002; Xie and Zhu, 2010) suggested that converting observations to the model coordinate (i.e. innovations in isopycnic coordinate) performs better than interpolating the model state to the observation coordinate (i.e. innovations in z-coordinate). This problem is revisited here with the Norwegian Climate Prediction Model, which applies the ensemble Kalman filter (EnKF) to the isopycnic ocean model (MICOM) of the Norwegian Earth System Model. We perform Observing System Simulation Experiments (OSSEs) to compare the two schemes (EnKF-z and EnKF-ρ). In the OSSEs, the truth is set to the EN4 objective analyses and the observations are perturbations of the truth with white noise. Unlike in previous studies, it is found that EnKF-z outperforms EnKF-ρ for different observed vertical resolutions, inhomogeneous sampling (e.g. upper 1000 m observations only), or a lack of salinity measurements. That is mostly because the operator converting observations into isopycnic coordinate is strongly non-linear. We also study the horizontal localisation radius at selected grid points. Finally, we perform EnKF-z with the chosen localisation radius in a realistic framework with NorCPM over a 5-year analysis period. The analysis is validated against several independent datasets.
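The EnKF analysis step at the core of both schemes can be sketched in scalar form. Perturbed-observation noise is omitted for brevity (so the analysis spread is understated), and the coordinate-conversion operators that distinguish EnKF-z from EnKF-ρ are not shown.

```python
def enkf_update(ensemble, obs, obs_err_var):
    """Scalar EnKF analysis: move each member toward the observation by
    the Kalman gain K = P / (P + R), where P is the ensemble variance and
    R the observation-error variance. Schematic only; operational EnKFs
    perturb the observation per member and localise in space."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # ensemble variance
    k = p / (p + obs_err_var)                             # Kalman gain
    return [x + k * (obs - x) for x in ensemble]
```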
NASA Astrophysics Data System (ADS)
Abou Najm, M.; Safi, A.; El-Fadel, M.; Doummar, J.; Alameddine, I.
2016-12-01
The relative importance of climate-change-induced sea level rise for the salinization of a highly urbanized, karstified coastal aquifer was compared with that of unsustainable pumping. A 3D variable-density groundwater flow and solute transport model was used to predict the displacement of the saltwater-freshwater interface in a pilot aquifer located along the Eastern Mediterranean. The results showed that the influence of sea level rise was marginal when compared with the encroachment of salinity associated with anthropogenic abstraction. Model predictions of salinity mass and volumetric displacement of the interface, corresponding to a long-term monthly transient model, showed that the saltwater intrusion dynamic is highly sensitive to changes in abstraction rates, which were estimated from combinations of water consumption rates and population growth rates. Salinity encroachment, however, appeared to be more sensitive to water consumption rates than to population growth rates: a 50% increase in the former led to four times more intrusion than an equivalent increase in population growth rate over 20 years. Coupling increased population growth with increased consumption rates had a synergistic effect that aggravated the intrusion beyond the sum of the individual impacts. Adaptation strategies targeting a decrease in groundwater exploitation proved effective in retarding the intrusion.
NASA Astrophysics Data System (ADS)
Kneafsey, T. J.; Flemings, P. B.; Bryant, S. L.; You, K.; Polito, P. J.
2013-12-01
Global climate change will cause warming of the oceans and land. This will affect the occurrence, behavior, and location of subseafloor and subterranean methane hydrate deposits. We suggest that in many natural systems local salinity, elevated by hydrate formation or freshened by hydrate dissociation, may control gas transport through the hydrate stability zone. We are performing experiments and modeling the experiments to explore this behavior for different warming scenarios. Initially, we are exploring hydrate association/dissociation in saline systems with constant water mass. We compare experiments run with saline (3.5 wt. %) water vs. distilled water in a sand mixture at an initial water saturation of ~0.5. We increase the pore fluid (methane) pressure to 1050 psig. We then stepwise cool the sample into the hydrate stability field (~3 degrees C), allowing methane gas to enter as hydrate forms. We measure resistivity and the mass of methane consumed. We are currently running these experiments and we predict our results from equilibrium thermodynamics. In the fresh water case, the modeled final hydrate saturation is 63% and all water is consumed. In the saline case, the modeled final hydrate saturation is 47%, the salinity is 12.4 wt. %, and final water saturation is 13%. The fresh water system is water-limited: all the water is converted to hydrate. In the saline system, pore water salinity is elevated and salt is excluded from the hydrate structure during hydrate formation until the salinity drives the system to three phase equilibrium (liquid, gas, hydrate) and no further hydrate forms. In our laboratory we can impose temperature gradients within the column, and we will use this to investigate equilibrium conditions in large samples subjected to temperature gradients and changing temperature. 
In these tests, we will quantify the hydrate saturation and salinity over our meter-long sample using spatially distributed temperature sensors, spatially distributed resistivity probes, compressional wave velocities, and X-ray computed tomography scanning. Modeling of hydrate formation and dissociation for these conditions indicates that the transport of bulk fluid phases (gas and water) plays a crucial role in the overall behavior, and we will explore open-system boundary conditions in the experiments to test this prediction.
Functional tradeoffs underpin salinity-driven divergence in microbial community composition.
Dupont, Chris L; Larsson, John; Yooseph, Shibu; Ininbergs, Karolina; Goll, Johannes; Asplund-Samuelsson, Johannes; McCrow, John P; Celepli, Narin; Allen, Lisa Zeigler; Ekman, Martin; Lucas, Andrew J; Hagström, Åke; Thiagarajan, Mathangi; Brindefalk, Björn; Richter, Alexander R; Andersson, Anders F; Tenney, Aaron; Lundin, Daniel; Tovchigrechko, Andrey; Nylander, Johan A A; Brami, Daniel; Badger, Jonathan H; Allen, Andrew E; Rusch, Douglas B; Hoffman, Jeff; Norrby, Erling; Friedman, Robert; Pinhassi, Jarone; Venter, J Craig; Bergman, Birgitta
2014-01-01
Bacterial community composition and functional potential change subtly across gradients in the surface ocean. In contrast, while there are significant phylogenetic divergences between communities from freshwater and marine habitats, the underlying mechanisms of this phylogenetic structuring remain unknown. We hypothesized that the functional potential of natural bacterial communities is linked to this striking divide between microbiomes. To test this hypothesis, metagenomic sequencing of microbial communities along a 1,800 km transect in the Baltic Sea area, encompassing a continuous natural salinity gradient from limnic to fully marine conditions, was explored. Multivariate statistical analyses showed that salinity is the main determinant of dramatic changes in microbial community composition, but also of large-scale changes in the core metabolic functions of bacteria. Strikingly, genetically and metabolically different pathways for key metabolic processes, such as respiration, biosynthesis of quinones and isoprenoids, glycolysis and osmolyte transport, were differentially abundant at high and low salinities. These shifts in functional capacity were observed at multiple taxonomic levels and within dominant bacterial phyla, while some bacteria, such as SAR11, were able to adapt to the entire salinity gradient. We propose that the large differences in central metabolism required at high and low salinities dictate the striking divide between freshwater and marine microbiomes, and that the ability to inhabit different salinity regimes evolved early during bacterial phylogenetic differentiation. These findings significantly advance our understanding of microbial distributions and stress the need to incorporate salinity in future climate change models that predict increased levels of precipitation and a reduction in salinity.
Wu, Fangli; Xie, Zhe; Lan, Yawen; Dupont, Sam; Sun, Meng; Cui, Shuaikang; Huang, Xizhi; Huang, Wei; Liu, Liping; Hu, Menghong; Lu, Weiqun; Wang, Youji
2018-01-01
With the release of large amounts of CO2, ocean acidification is intensifying and affecting aquatic organisms. In addition, salinity plays an important role for marine organisms and fluctuates greatly in estuarine and coastal ecosystems, where ocean acidification frequently occurs. In the present study, flow cytometry was used to investigate immune parameters of haemocytes in the thick shell mussel Mytilus coruscus exposed to different salinities (15, 25, and 35‰) and two pH levels (7.3 and 8.1). A 7-day in vivo experiment and a 5-h in vitro experiment were performed. In both experiments, low pH had significant effects on all tested immune parameters. When exposed to decreased pH, total haemocyte count (THC), phagocytosis (Pha), esterase (Est), and lysosomal content (Lyso) were significantly decreased, whereas haemocyte mortality (HM) and reactive oxygen species (ROS) were increased. High salinity had no significant effects on the immune parameters of haemocytes as compared with low salinity. However, an interaction between pH and salinity was observed in both experiments for most tested haemocyte parameters. This study showed that high salinity, low salinity and low pH have negative and interactive effects on the haemocytes of mussels. As a consequence, it can be expected that the combined effect of low pH and altered salinity will have more severe effects on mussel health than predicted from single exposures. PMID:29559924
Hung, Andrew J; Chen, Jian; Che, Zhengping; Nilanon, Tanachat; Jarc, Anthony; Titus, Micha; Oh, Paul J; Gill, Inderbir S; Liu, Yan
2018-05-01
Surgical performance is critical for clinical outcomes. We present a novel machine learning (ML) method of processing automated performance metrics (APMs) to evaluate surgical performance and predict clinical outcomes after robot-assisted radical prostatectomy (RARP). We trained three ML algorithms utilizing APMs directly from robot system data (training material) and hospital length of stay (LOS; training label) (≤2 days and >2 days) from 78 RARP cases, and selected the algorithm with the best performance. The selected algorithm categorized the cases as "Predicted as expected LOS (pExp-LOS)" and "Predicted as extended LOS (pExt-LOS)." We compared postoperative outcomes of the two groups (Kruskal-Wallis/Fisher's exact tests). The algorithm then predicted individual clinical outcomes, which we compared with actual outcomes (Spearman's correlation/Fisher's exact tests). Finally, we identified the five APMs most relevant to the algorithm's predictions. The "Random Forest-50" (RF-50) algorithm had the best performance, reaching 87.2% accuracy in predicting LOS (73 cases as "pExp-LOS" and 5 cases as "pExt-LOS"). The "pExp-LOS" cases outperformed the "pExt-LOS" cases in surgery time (3.7 hours vs 4.6 hours, p = 0.007), LOS (2 days vs 4 days, p = 0.02), and Foley duration (9 days vs 14 days, p = 0.02). Patient outcomes predicted by the algorithm had significant association with the "ground truth" in surgery time (p < 0.001, r = 0.73), LOS (p = 0.05, r = 0.52), and Foley duration (p < 0.001, r = 0.45). The five most relevant APMs adopted by the RF-50 algorithm were largely related to camera manipulation. To our knowledge, ours is the first study to show that APMs and ML algorithms may help assess surgical RARP performance and predict clinical outcomes. With further accrual of clinical data (oncologic and functional data), this process will become increasingly relevant and valuable in surgical assessment and training.
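As a hedged illustration of the approach described above, the sketch below trains a 50-tree random forest (in the spirit of "RF-50") on synthetic APM-like features to classify extended length of stay. The feature set, data, and threshold are invented for illustration; they are not the study's actual APMs.

```python
# Illustrative only: random forest over synthetic APM-style features
# predicting extended LOS (>2 days), echoing the "RF-50" setup (50 trees).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_cases = 78  # matches the study's cohort size; data here is synthetic
# Hypothetical APM features, e.g. camera-move counts, path length, idle time.
X = rng.normal(size=(n_cases, 5))
# Synthetic label: 1 = extended LOS, loosely driven by the first feature.
y = (X[:, 0] + 0.5 * rng.normal(size=n_cases) > 1.0).astype(int)

rf = RandomForestClassifier(n_estimators=50, random_state=0)  # "RF-50"
rf.fit(X, y)
importances = rf.feature_importances_  # which APMs the forest relies on most
labels = ["pExt-LOS" if p else "pExp-LOS" for p in rf.predict(X)]
print(len(labels), importances.argmax())
```

Ranking `feature_importances_` is the natural analogue of the study's step of identifying the five most relevant APMs.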
Can machine-learning improve cardiovascular risk prediction using routine clinical data?
Weng, Stephen F.; Reps, Jenna; Kai, Joe; Garibaldi, Jonathan M.; Qureshi, Nadeem
2017-01-01
Background: Current approaches to predict cardiovascular risk fail to identify many people who would benefit from preventive treatment, while others receive unnecessary intervention. Machine-learning offers opportunity to improve accuracy by exploiting complex interactions between risk factors. We assessed whether machine-learning can improve cardiovascular risk prediction. Methods: Prospective cohort study using routine clinical data of 378,256 patients from UK family practices, free from cardiovascular disease at outset. Four machine-learning algorithms (random forest, logistic regression, gradient boosting machines, neural networks) were compared to an established algorithm (American College of Cardiology guidelines) to predict first cardiovascular event over 10-years. Predictive accuracy was assessed by area under the ‘receiver operating curve’ (AUC); and sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) to predict 7.5% cardiovascular risk (threshold for initiating statins). Findings: 24,970 incident cardiovascular events (6.6%) occurred. Compared to the established risk prediction algorithm (AUC 0.728, 95% CI 0.723–0.735), machine-learning algorithms improved prediction: random forest +1.7% (AUC 0.745, 95% CI 0.739–0.750), logistic regression +3.2% (AUC 0.760, 95% CI 0.755–0.766), gradient boosting +3.3% (AUC 0.761, 95% CI 0.755–0.766), neural networks +3.6% (AUC 0.764, 95% CI 0.759–0.769). The highest achieving (neural networks) algorithm predicted 4,998/7,404 cases (sensitivity 67.5%, PPV 18.4%) and 53,458/75,585 non-cases (specificity 70.7%, NPV 95.7%), correctly predicting 355 (+7.6%) more patients who developed cardiovascular disease compared to the established algorithm. Conclusions: Machine-learning significantly improves accuracy of cardiovascular risk prediction, increasing the number of patients identified who could benefit from preventive treatment, while avoiding unnecessary treatment of others. PMID:28376093
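The evaluation metrics named above (AUC, plus sensitivity/specificity/PPV/NPV at the 7.5% statin-initiation threshold) can be computed directly from predicted risks and observed outcomes. A minimal sketch with invented toy data:

```python
# Toy illustration (not the study's data): AUC and threshold metrics
# for predicted 10-year cardiovascular risks vs observed events.
def auc(scores, labels):
    """Rank-based AUC: probability a random case outranks a random non-case."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def threshold_metrics(scores, labels, threshold=0.075):
    """Sensitivity, specificity, PPV, NPV at a risk threshold (default 7.5%)."""
    tp = sum(s >= threshold and l == 1 for s, l in zip(scores, labels))
    fn = sum(s < threshold and l == 1 for s, l in zip(scores, labels))
    tn = sum(s < threshold and l == 0 for s, l in zip(scores, labels))
    fp = sum(s >= threshold and l == 0 for s, l in zip(scores, labels))
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

risks  = [0.02, 0.05, 0.08, 0.10, 0.20, 0.01, 0.09, 0.30]
events = [0,    0,    0,    1,    1,    0,    1,    1]
print(auc(risks, events))                 # 1.0 for this separable toy data
print(threshold_metrics(risks, events))
```

The rank-based AUC used here is equivalent to the trapezoidal area under the ROC curve reported in the abstract.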
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peyret, Thomas; Poulin, Patrick; Krishnan, Kannan, E-mail: kannan.krishnan@umontreal.ca
Algorithms in the literature for predicting tissue:blood partition coefficients (P{sub tb}) for environmental chemicals, and tissue:plasma partition coefficients based on total (K{sub p}) or unbound concentration (K{sub pu}) for drugs, differ in their consideration of binding to hemoglobin, plasma proteins and charged phospholipids. The objective of the present study was to develop a unified algorithm such that P{sub tb}, K{sub p} and K{sub pu} for both drugs and environmental chemicals could be predicted. The development of the unified algorithm was accomplished by integrating all mechanistic algorithms previously published to compute the PCs. Furthermore, the algorithm was structured in such a way as to facilitate predictions of the distribution of organic compounds at the macro (i.e. whole tissue) and micro (i.e. cells and fluids) levels. The resulting unified algorithm was applied to compute the rat P{sub tb}, K{sub p} or K{sub pu} of muscle (n = 174), liver (n = 139) and adipose tissue (n = 141) for acidic, neutral, zwitterionic and basic drugs as well as ketones, acetate esters, alcohols, aliphatic hydrocarbons, aromatic hydrocarbons and ethers. The unified algorithm adequately reproduced the values predicted previously by the published algorithms for a total of 142 drugs and chemicals. The sensitivity analysis demonstrated the relative importance of the various compound properties reflective of specific mechanistic determinants relevant to the prediction of PC values of drugs and environmental chemicals. Overall, the present unified algorithm uniquely facilitates the computation of macro- and micro-level PCs for developing organ- and cellular-level PBPK models for both chemicals and drugs.
Development and Validation of an Algorithm to Identify Planned Readmissions From Claims Data.
Horwitz, Leora I; Grady, Jacqueline N; Cohen, Dorothy B; Lin, Zhenqiu; Volpe, Mark; Ngo, Chi K; Masica, Andrew L; Long, Theodore; Wang, Jessica; Keenan, Megan; Montague, Julia; Suter, Lisa G; Ross, Joseph S; Drye, Elizabeth E; Krumholz, Harlan M; Bernheim, Susannah M
2015-10-01
It is desirable not to include planned readmissions in readmission measures because they represent deliberate, scheduled care. Our objectives were to develop an algorithm to identify planned readmissions, describe its performance characteristics, and identify improvements. We conducted a consensus-driven algorithm development and chart review validation study at 7 acute-care hospitals in 2 health systems. For development, we used all discharges qualifying for the publicly reported hospital-wide readmission measure. For validation, we used all qualifying same-hospital readmissions that were characterized by the algorithm as planned, and a random sample of same-hospital readmissions characterized as unplanned. We calculated weighted sensitivity and specificity, and positive and negative predictive values of the algorithm (version 2.1), compared to gold standard chart review. In consultation with 27 experts, we developed an algorithm that characterizes 7.8% of readmissions as planned. For validation we reviewed 634 readmissions. The weighted sensitivity of the algorithm was 45.1% overall, 50.9% in large teaching centers and 40.2% in smaller community hospitals. The weighted specificity was 95.9%, positive predictive value was 51.6%, and negative predictive value was 94.7%. We identified 4 minor changes to improve algorithm performance. The revised algorithm had a weighted sensitivity of 49.8% (57.1% at large hospitals), weighted specificity of 96.5%, positive predictive value of 58.7%, and negative predictive value of 94.5%. Positive predictive value was poor for the 2 most common potentially planned procedures: diagnostic cardiac catheterization (25%) and procedures involving cardiac devices (33%). An administrative claims-based algorithm to identify planned readmissions is feasible and can facilitate public reporting of primarily unplanned readmissions. © 2015 Society of Hospital Medicine.
NASA Astrophysics Data System (ADS)
Sen, Amit
2014-10-01
Launched 10 June 2011, NASA's Aquarius instrument onboard the Argentine-built and -managed Satélite de Aplicaciones Científicas (SAC-D) has been tirelessly observing the open oceans, confirming and adding new knowledge to the not-so-vast measured record of our Earth's global oceans. This paper reviews the data collected over the last 3 years, its findings, its challenges, and the future work at hand for sleepless oceanographers, hydrologists and climate scientists. Although routine data are being collected, a snapshot is presented from almost 3 years of flawless operations, showing new discoveries and the possibility of many more in the future. Repetitive calibration and validation of measurements from Aquarius continue, together with comparison of the data to the existing array of Argo temperature/salinity profiling floats, measurements from the recent Salinity Processes in the Upper Ocean Regional Study (SPURS) in-situ experiment, and data collected from the European Soil Moisture Ocean Salinity (SMOS) mission. This all aids in the optimization of computer model functions to improve the basic understanding of the water cycle over the oceans and its ties to climate. The Aquarius mission operations team has also been tweaking and optimizing algorithms, reprocessing data as needed, and producing salinity movies of a kind never seen before. A brief overview of the accomplishments and technical findings to date is given in this paper.
Predicting the physical effects of relocating Boston's sewage outfall
Signell, R.P.; Jenter, H.L.; Blumberg, A.F.
2000-01-01
Boston is scheduled to cease discharge of sewage effluent in Boston Harbor in Spring 2000 and begin discharge at a site 14 km offshore in Massachusetts Bay in a water depth of about 30 m. The effects of this outfall relocation on effluent dilution, salinity and circulation are predicted with a three-dimensional hydrodynamic model. The simulations predict that the new bay outfall will greatly decrease effluent concentrations in Boston Harbor (relative to the harbour outfall) and will not significantly change mean effluent concentrations over most of Massachusetts Bay. With the harbour outfall, previous observations and these simulations show that effluent concentrations exceed 0.5% throughout the harbour, with a harbour-wide average of 1-2%. With the bay outfall, effluent concentrations exceed 0.5% only within a few km of the new outfall, and harbour concentrations drop to 0.1-0.2%, a 10-fold reduction. During unstratified winter conditions, the local increase in effluent concentration at the bay outfall site is predicted to exist throughout the water column. During stratified summer conditions, however, effluent released at the sea bed rises and is trapped beneath the pycnocline. The local increase in effluent concentration is limited to the lower layer, and as a result, surface layer effluent concentrations in the vicinity of the new outfall site are predicted to decrease (relative to the harbour outfall) during the summer. Slight changes are predicted for the salinity and circulation fields. Removing the fresh water associated with the effluent discharge in Boston Harbor is predicted to increase the mean salinity of the harbour by 0.5 and decrease the mean salinity by 0.10-0.15 within 2-3 km of the outfall.
Relative to the existing mean flow, the buoyant discharge at the new outfall is predicted to generate density-driven mean currents of 2-4 cm s-1 that spiral out in a clockwise motion at the surface during winter and at the pycnocline (15-20 m depth) during summer. Compensating counterclockwise currents are predicted to spiral in toward the source at the bottom. Because the scale of the residual current structure induced by the new discharge is comparable to or smaller than typical subtidal water parcel excursions, Lagrangian trajectories will not follow the Eulerian residual flow. Thus, mean currents measured from moorings within 5 km of the bay outfall site will be more useful for model comparison than to indicate net transport pathways.
NASA Astrophysics Data System (ADS)
Maar, Marie; Saurel, Camille; Landes, Anja; Dolmer, Per; Petersen, Jens Kjerulf
2015-08-01
For blue mussels, Mytilus edulis, one major constraint in the Baltic Sea is the low salinities that reduce the efficiency of mussel production. However, the effects of living in low and variable salinity regimes are rarely considered in models describing mussel growth. The aim of the present study was to incorporate the effects of low salinity into an eco-physiological model of blue mussels and to identify areas suitable for mussel production. A Dynamic Energy Budget (DEB) model was modified with respect to i) the morphological parameters (DW/WW-ratio, shape factor), ii) change in ingestion rate and iii) metabolic costs due to osmoregulation in different salinity environments. The modified DEB model was validated with experimental data from different locations in the Western Baltic Sea (including the Limfjorden) with salinities varying from 8.5 to 29.9 psu. The identified areas suitable for mussel production in the Baltic Sea are located in the Little Belt area, the Great Belt, the southern Kattegat and the Limfjorden, according to the prevailing salinity regimes. The new model can be used for supporting site selection of new mussel nutrient extraction cultures in the Baltic Sea, which suffers from high eutrophication symptoms, or as part of integrated multi-trophic aquaculture production. The model can also be used to predict the effects of salinity changes on mussel populations, e.g. in climate change studies.
NASA Astrophysics Data System (ADS)
Masoud, Alaa A.; El-Horiny, Mohamed M.; Atwia, Mohamed G.; Gemail, Khaled S.; Koike, Katsuaki
2018-06-01
Salinization of groundwater and soil resources has long been a serious environmental hazard in arid regions. This study was conducted to investigate and document the factors controlling such salinization and their inter-relationships in the Dakhla Oasis (Egypt). To accomplish this, 60 groundwater samples and 31 soil samples were collected in February 2014. Factor analysis (FA) and hierarchical cluster analysis (HCA) were integrated with geostatistical analyses to characterize the chemical properties of groundwater and soil and their spatial patterns, identify the factors controlling the pattern variability, and clarify the salinization mechanism. Groundwater quality standards revealed the emergence of salinization (av. 885.8 mg/L) and extreme occurrences of Fe2+ (av. 17.22 mg/L) and Mn2+ (av. 2.38 mg/L). Soils were highly salt-affected (av. 15.2 dS m-1) and slightly alkaline (av. pH = 7.7). Evaporation and ion-exchange processes governed the evolution of two main water types: Na-Cl (52%) and Ca-Mg-Cl (47%), respectively. Salinization drives the chemical variability of both resources. Distinctive patterns of slight salinization marked the northern part, while intense salinization marked the middle and southern parts. Congruence in the resources' clusters confirmed common geology, soil types, and urban and agricultural practices. Minimizing the environmental and socioeconomic impacts of resource salinization requires a better understanding of the hydrochemical characteristics and prediction of quality changes.
Effect of temperature and salinity on phosphate sorption on marine sediments.
Zhang, Jia-Zhong; Huang, Xiao-Lan
2011-08-15
Our previous studies on the phosphate sorption on sediments in Florida Bay at 25 °C in salinity 36 seawater revealed that the sorption capacity varies considerably within the bay but can be attributed to the content of sedimentary P and Fe. It is known that both temperature and salinity influence the sorption process and their natural variations are the greatest in estuaries. To provide useful sorption parameters for modeling phosphate cycle in Florida Bay, a systematic study was carried out to quantify the effects of salinity and temperature on phosphate sorption on sediments. For a given sample, the zero equilibrium phosphate concentration and the distribution coefficient were measured over a range of salinity (2-72) and temperature (15-35 °C) conditions. Such a suite of experiments with combinations of different temperature and salinity were performed for 14 selected stations that cover a range of sediment characteristics and geographic locations of the bay. Phosphate sorption was found to increase with increasing temperature or decreasing salinity and their effects depended upon sediment's exchangeable P content. This study provided the first estimate of the phosphate sorption parameters as a function of salinity and temperature in marine sediments. Incorporation of these parameters in water quality models will enable them to predict the effect of increasing freshwater input, as proposed by the Comprehensive Everglades Restoration Plan, on the seasonal cycle of phosphate in Florida Bay.
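The two measured parameters above (the zero equilibrium phosphate concentration, EPC0, and the distribution coefficient, Kd) are commonly combined in a linear sorption isotherm, S = Kd (C − EPC0). The sketch below uses invented parameter values purely to illustrate how salinity-dependent coefficients would feed such a model; it is not the study's fitted parameterization.

```python
# Linear sorption isotherm sketch with hypothetical coefficients.
def sorbed_phosphate(c, kd, epc0):
    """Net phosphate sorbed (umol/g) at dissolved concentration c (umol/L).
    Positive values mean uptake by the sediment; negative values mean release."""
    return kd * (c - epc0)

# Invented illustrative values: sorption strengthens at lower salinity,
# consistent with the trend reported above.
kd_low_salinity, kd_high_salinity = 1.8, 1.2   # L/g, hypothetical
epc0 = 0.15                                    # umol/L, hypothetical

c = 0.5  # dissolved phosphate, umol/L
low = sorbed_phosphate(c, kd_low_salinity, epc0)
high = sorbed_phosphate(c, kd_high_salinity, epc0)
print(low > high)  # True: more sorption at low salinity for the same c
```

In a water-quality model, Kd and EPC0 would be looked up as functions of the local temperature and salinity rather than held constant.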
Youngs, Noah; Penfold-Brown, Duncan; Drew, Kevin; Shasha, Dennis; Bonneau, Richard
2013-05-01
Computational biologists have demonstrated the utility of using machine learning methods to predict protein function from an integration of multiple genome-wide data types. Yet, even the best performing function prediction algorithms rely on heuristics for important components of the algorithm, such as choosing negative examples (proteins without a given function) or determining key parameters. The improper choice of negative examples, in particular, can hamper the accuracy of protein function prediction. We present a novel approach for choosing negative examples, using a parameterizable Bayesian prior computed from all observed annotation data, which also generates priors used during function prediction. We incorporate this new method into the GeneMANIA function prediction algorithm and demonstrate improved accuracy of our algorithm over current top-performing function prediction methods on the yeast and mouse proteomes across all metrics tested. Code and Data are available at: http://bonneaulab.bio.nyu.edu/funcprop.html
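One way to picture prior-based negative selection is sketched below. The Beta pseudo-count prior and cutoff here are assumptions for illustration; the paper's actual prior is computed from all observed annotation data and is richer than this.

```python
# Hypothetical sketch of prior-driven negative-example selection for
# protein function prediction (not the paper's exact construction).
def annotation_prior(n_annotated, n_total, alpha=1.0, beta=10.0):
    """Posterior mean of a Beta(alpha, beta) prior updated with annotation counts."""
    return (n_annotated + alpha) / (n_total + alpha + beta)

def pick_negatives(proteins, annotated, priors, cutoff=0.05):
    """Unannotated proteins whose prior for the function falls below the cutoff."""
    return [p for p in proteins if p not in annotated and priors[p] < cutoff]

proteins = ["p1", "p2", "p3", "p4"]
annotated = {"p1"}
# Hypothetical per-protein priors, e.g. propagated from related annotations.
priors = {"p1": 0.9, "p2": 0.2, "p3": 0.01, "p4": 0.04}
print(pick_negatives(proteins, annotated, priors))  # ['p3', 'p4']
```

The point, mirroring the abstract, is that negatives are chosen from a principled prior rather than sampled heuristically from all unannotated proteins.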
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Pengfei; Roy, Subrata, E-mail: roy@ufl.edu
2014-05-07
This work investigates the performance of underwater spark discharge, relating to bubble growth and decay under high pressure and saline conditions, by introducing a modified form of the resistance equation. Here, we study the influence of salinity on circuit parameters by fitting the experimental data, for which gap resistance is much larger in conductive water than in dielectric water. Accordingly, the resistance equation is modified by considering the influence of both the plasma and its surrounding liquid. The thermal radiation effect of the bubble is also studied by comparing two different radiation models. Numerical results predict a larger bubble pressure for saline water but a reduced size and a smaller bubble cycle at a greater water depth. Such a study may be useful in many saltwater applications, including deep sea conditions.
NASA Astrophysics Data System (ADS)
Macnae, J.; Ley-Cooper, Y.
2009-05-01
Sub-surface porosity is of importance in estimating fluid content and salt-load parameters for hydrological modelling. While sparse boreholes may adequately sample the depth to a sub-horizontal water-table and usually also adequately sample ground-water salinity, they do not provide adequate sampling of the spatial variations in porosity or hydraulic permeability caused by spatial variations in sedimentary and other geological processes. We show in this presentation that spatially detailed porosity can be estimated by applying Archie's law to conductivity estimates from airborne electromagnetic surveys, combined with interpolated ground-water conductivity values. The prediction was tested on data from the Chowilla flood plain in the Murray-Darling Basin of South Australia. A frequency domain, helicopter-borne electromagnetic system collected data at 6 frequencies and 3 to 4 m spacings on lines spaced 100 m apart. This data was transformed into conductivity-depth sections, from which a 3D bulk-conductivity map could be created with about 30 m spatial resolution and 2 to 5 m vertical depth resolution. For that portion of the volume below the interpolated water-table, we predicted porosity in each cell using Archie's law. Generally, predicted porosities were in the 30 to 50% range, consistent with expectations for the partially consolidated sediments in the floodplain. Porosities were directly measured on core from eight boreholes in the area, and compared quite well with the predictions. The predicted porosity map was spatially consistent, and when combined with measured salinities in the ground water, was able to provide a detailed 3D map of salt-loads in the saturated zone, and as such contribute to a hazard assessment of the saline threat to the river.
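The per-cell porosity step described above can be sketched directly from Archie's law, σ_bulk = σ_water φ^m / a, inverted for φ. The cementation exponent m and tortuosity factor a below are assumed values typical of unconsolidated sediments, not the study's calibrated parameters.

```python
# Invert Archie's law for porosity; m and a are assumed illustrative values.
def archie_porosity(sigma_bulk, sigma_water, m=1.5, a=1.0):
    """Porosity from Archie's law sigma_bulk = sigma_water * phi**m / a."""
    return (a * sigma_bulk / sigma_water) ** (1.0 / m)

# Example: bulk conductivity 0.5 S/m from the AEM inversion in one cell,
# interpolated groundwater conductivity 4 S/m.
phi = archie_porosity(0.5, 4.0)
print(round(phi, 3))  # 0.25 for these assumed values
```

Applying this cell-by-cell below the interpolated water table, then multiplying porosity by groundwater salinity and cell volume, yields the salt-load map the abstract describes.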
Compressed sensing based missing nodes prediction in temporal communication network
NASA Astrophysics Data System (ADS)
Cheng, Guangquan; Ma, Yang; Liu, Zhong; Xie, Fuli
2018-02-01
The reconstruction of complex network topology is of great theoretical and practical significance. Most research so far focuses on the prediction of missing links. There are many mature algorithms for link prediction which have achieved good results, but research on the prediction of missing nodes has just begun. In this paper, we propose an algorithm for missing node prediction in complex networks. We detect the position of missing nodes based on their neighbor nodes under the theory of compressed sensing, and extend the algorithm to the case of multiple missing nodes using spectral clustering. Experiments on real public network datasets and simulated datasets show that our algorithm can detect the locations of hidden nodes effectively with high precision.
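The paper's detection step rests on compressed sensing: recovering a sparse unknown from few linear measurements. As a generic, hedged sketch of that principle (not the paper's specific node-detection algorithm), the block below implements Orthogonal Matching Pursuit and recovers a 3-sparse vector from 30 random measurements.

```python
# Generic compressed-sensing recovery via Orthogonal Matching Pursuit (OMP).
import numpy as np

def omp(A, y, k):
    """Greedily recover a k-sparse x with y ≈ A @ x."""
    residual, support = y.astype(float).copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # atom most correlated with residual
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # refit on support
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 60)) / np.sqrt(30)   # random measurement matrix
x_true = np.zeros(60)
x_true[[3, 17, 41]] = [1.0, -2.0, 0.5]        # sparse "hidden" signal
y = A @ x_true                                # 30 measurements of a 60-dim unknown
x_hat = omp(A, y, k=3)
print(np.linalg.norm(x_hat - x_true))         # near zero when recovery succeeds
```

In the missing-node setting, the sparse unknown plays the role of the hidden nodes' connections to observed neighbors; spectral clustering then separates multiple missing nodes, as the abstract notes.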
Increasing Prediction the Original Final Year Project of Student Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Saragih, Rijois Iboy Erwin; Turnip, Mardi; Sitanggang, Delima; Aritonang, Mendarissan; Harianja, Eva
2018-04-01
The final year project is very important for the graduation of a student. Unfortunately, many students do not take their final projects seriously, and many ask someone else to do the work for them. In this paper, an application of genetic algorithms to predict whether the final year project of a student is original is proposed. In the simulation, data on final projects from the last 5 years were collected. The genetic algorithm has several operators, namely population, selection, crossover, and mutation. The results suggest that the genetic algorithm predicts better than other comparable models. Experimental results showed a prediction accuracy of 70%, more accurate than previous research.
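The operators named above (population, selection, crossover, mutation) can be shown in a minimal, generic genetic algorithm. The bit-string fitness below is a toy (OneMax), standing in for the paper's actual encoding of project features, which the abstract does not specify.

```python
# Minimal generic GA illustrating population, selection, crossover, mutation.
import random

random.seed(0)
GENOME_LEN = 20

def fitness(ind):
    """Toy OneMax fitness: number of 1-bits (placeholder for a real objective)."""
    return sum(ind)

def evolve(pop_size=30, generations=60, p_mut=0.02):
    # Population: random bit-strings.
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # Selection: binary tournament.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = random.randrange(1, GENOME_LEN)          # Crossover: one-point.
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # Mutation.
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # approaches GENOME_LEN as the population converges
```

Swapping `fitness` for a score over encoded project attributes would turn this skeleton into a predictor of the kind the abstract describes.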
DOT National Transportation Integrated Search
2012-05-01
The purpose of this document is to fully define and describe the logic flow and mathematical equations for a predictive braking enforcement algorithm intended for implementation in a Positive Train Control (PTC) system.
CAT-PUMA: CME Arrival Time Prediction Using Machine learning Algorithms
NASA Astrophysics Data System (ADS)
Liu, Jiajia; Ye, Yudong; Shen, Chenglong; Wang, Yuming; Erdélyi, Robert
2018-04-01
CAT-PUMA (CME Arrival Time Prediction Using Machine learning Algorithms) quickly and accurately predicts the arrival time of coronal mass ejections (CMEs). The software was trained via detailed analysis of CME features and solar wind parameters using 182 previously observed geo-effective partial-/full-halo CMEs, and uses Support Vector Machine (SVM) algorithms to make its predictions, which can be produced within minutes of providing the necessary input parameters of a CME.
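As a hedged sketch of the SVM approach described above, the block below fits a support vector regression to synthetic CME-like features to predict transit time. The feature names, data, and target relationship are invented; CAT-PUMA's actual feature set and tuning differ.

```python
# Illustrative SVM regression in the spirit of CAT-PUMA (synthetic data).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical features: CME speed (km/s), angular width (deg), wind speed (km/s).
X = rng.uniform([300, 60, 300], [2000, 360, 700], size=(182, 3))
# Toy target (hours): faster CMEs arrive sooner, plus noise -- invented relation.
y = 1.5e5 / X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 5, 182)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0))
model.fit(X, y)
pred = model.predict(X[:5])  # predicted arrival times for 5 events, in hours
print(pred.shape)
```

Once trained, prediction for a new event is a single `model.predict` call, matching the abstract's point that forecasts are available within minutes of supplying the input parameters.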
A low computation cost method for seizure prediction.
Zhang, Yanli; Zhou, Weidong; Yuan, Qi; Wu, Qi
2014-10-01
The dynamic changes of electroencephalograph (EEG) signals in the period prior to epileptic seizures play a major role in the seizure prediction. This paper proposes a low computation seizure prediction algorithm that combines a fractal dimension with a machine learning algorithm. The presented seizure prediction algorithm extracts the Higuchi fractal dimension (HFD) of EEG signals as features to classify the patient's preictal or interictal state with Bayesian linear discriminant analysis (BLDA) as a classifier. The outputs of BLDA are smoothed by a Kalman filter for reducing possible sporadic and isolated false alarms and then the final prediction results are produced using a thresholding procedure. The algorithm was evaluated on the intracranial EEG recordings of 21 patients in the Freiburg EEG database. For seizure occurrence period of 30 min and 50 min, our algorithm obtained an average sensitivity of 86.95% and 89.33%, an average false prediction rate of 0.20/h, and an average prediction time of 24.47 min and 39.39 min, respectively. The results confirm that the changes of HFD can serve as a precursor of ictal activities and be used for distinguishing between interictal and preictal epochs. Both HFD and BLDA classifier have a low computational complexity. All of these make the proposed algorithm suitable for real-time seizure prediction. Copyright © 2014 Elsevier B.V. All rights reserved.
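The feature extraction step above rests on the Higuchi fractal dimension. A standard implementation sketch follows (kmax is a tuning parameter; this is the textbook formulation, not necessarily the authors' exact code):

```python
# Higuchi fractal dimension (HFD) of a 1-D signal, standard formulation.
import numpy as np

def higuchi_fd(x, kmax=8):
    """Estimate the Higuchi fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                      # subsampled series starting at m
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalized curve length of the subsampled series.
            length = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(length)
        lk.append(np.mean(lengths))
    # HFD is the slope of log L(k) against log(1/k).
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

# Sanity checks: a straight line has dimension ~1; white noise approaches ~2.
rng = np.random.default_rng(0)
line_fd = higuchi_fd(np.linspace(0, 1, 1000))
noise_fd = higuchi_fd(rng.normal(size=1000))
print(round(line_fd, 2), round(noise_fd, 2))
```

In the pipeline described above, HFD values from sliding EEG windows would feed the BLDA classifier, whose outputs are then Kalman-smoothed and thresholded.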
Remote Sensing Soil Salinity Map for the San Joaquin Valley, California
NASA Astrophysics Data System (ADS)
Scudiero, E.; Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.
2015-12-01
Soil salinization is a major natural hazard to worldwide agriculture. We present a remote imagery approach that maps salinity within a range (i.e., salinities less than 20 dS m-1, when measured as the electrical conductivity of the soil saturation extract), accuracy, and resolution most relevant to agriculture. A case study is presented for the western San Joaquin Valley (WSJV), California, USA (~870,000 ha of farmland) using multi-year Landsat 7 ETM+ canopy reflectance and the Canopy Response Salinity Index (CRSI). Highly detailed salinity maps for 22 fields (542 ha) established from apparent soil electrical conductivity directed sampling were used as ground-truth (sampled in 2013), totaling over 5000 pixels (30×30 m) with salinity values in the range of 0 to 35.2 dS m-1. Multi-year maximum values of CRSI were used to model soil salinity. In addition, soil type, elevation, meteorological data, and crop type were evaluated as covariates. The fitted model (R2=0.73) was validated: i) with a spatial k-folds (i.e., leave-one-field-out) cross-validation (R2=0.61), ii) versus salinity data from three independent fields (sampled in 2013 and 2014), and iii) by determining the accuracy of the qualitative classification of white crusted land as extremely-saline soils. The effect of land use change is evaluated over 2396 ha in the Broadview Water District from a comparison of salinity mapped in 1991 with salinity predicted in 2013 from the fitted model. From 1991 to 2013 salinity increased significantly over the selected study site, bringing attention to potential negative effects on soil quality of shifting from irrigated agriculture to fallow-land. This is cause for concern since over the 3 years of California's drought (2010-2013) the fallow land in the WSJV increased from 12.7% to 21.6%, due to drastic reduction in water allocations to farmers.
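The Canopy Response Salinity Index used above combines visible and NIR reflectance; the commonly cited form (our reading of the Scudiero et al. definition, so treat it as an assumption here) is CRSI = sqrt((NIR·R − G·B) / (NIR·R + G·B)). A minimal per-pixel sketch:

```python
# Per-pixel CRSI from band reflectances (0-1); formula per our reading of
# the literature definition -- verify against the original before reuse.
import math

def crsi(nir, red, green, blue):
    """Canopy Response Salinity Index; NaN where the ratio is negative."""
    num = nir * red - green * blue
    den = nir * red + green * blue
    ratio = num / den
    return math.sqrt(ratio) if ratio >= 0 else float("nan")

# Illustrative reflectances: a dense healthy canopy (high NIR) scores higher
# than a salt-stressed one (these band values are invented).
healthy = crsi(0.45, 0.08, 0.06, 0.04)
stressed = crsi(0.25, 0.12, 0.10, 0.08)
print(healthy > stressed)  # True
```

In the study, the multi-year maximum of this index per pixel, plus covariates, is what the regression against ground-truth salinity is fitted to.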
Osmotic and hydraulic adjustment of mangrove saplings to extreme salinity.
Méndez-Alonzo, Rodrigo; López-Portillo, Jorge; Moctezuma, Coral; Bartlett, Megan K; Sack, Lawren
2016-12-01
Salinity tolerance in plant species varies widely due to adaptation and acclimation processes at the cellular and whole-plant scales. In mangroves, extreme substrate salinity induces hydraulic failure and ion excess toxicity and reduces growth and survival, thus suggesting a potentially critical role for physiological acclimation to salinity. We tested the hypothesis that osmotic adjustment, a key type of plasticity that mitigates salinity shock, would take place in coordination with declines in whole-plant hydraulic conductance in a common garden experiment using saplings of three mangrove species with different salinity tolerances (Avicennia germinans L., Rhizophora mangle L. and Laguncularia racemosa (L.) C.F. Gaertn., ordered from higher to lower salinity tolerance). For each mangrove species, four salinity treatments (1, 10, 30 and 50 practical salinity units) were established and the time trajectories were determined for leaf osmotic potential (Ψ s ), stomatal conductance (g s ), whole-plant hydraulic conductance (K plant ) and predawn disequilibrium between xylem and substrate water potentials (Ψ pdd ). We expected that, for all three species, salinity increments would result in coordinated declines in Ψ s , g s and K plant , and that the Ψ pdd would increase with substrate salinity and time of exposure. In concordance with our predictions, reductions in substrate water potential promoted a coordinated decline in Ψ s , g s and K plant , whereas the Ψ pdd increased substantially during the first 4 days but dissipated after 7 days, indicating a time lag for equilibration after a change in substratum salinity. Our results show that mangroves confront and partially ameliorate acute salinity stress via simultaneous reductions in Ψ s , g s and K plant , thus developing synergistic physiological responses at the cell and whole-plant scales. © The Author 2016. Published by Oxford University Press. All rights reserved. 
NASA Astrophysics Data System (ADS)
Liu, Dongdong; She, Dongli
2018-06-01
Current physically based erosion models do not carefully consider the dynamic variations of soil properties during rainfall and are unable to simulate saline-sodic soil slope erosion processes. The aim of this work was to build a complete model framework, SSEM, to simulate runoff and erosion processes for saline-sodic soils by coupling dynamic saturated hydraulic conductivity Ks and soil erodibility Kτ. Sixty simulated rainfall experiments (2 soil textures × 5 sodicity levels × 2 slope gradients × 3 replicates) provided data for model calibration and validation. SSEM worked very well for simulating the runoff and erosion processes of saline-sodic silty clay. The runoff and erosion processes of saline-sodic silt loam were more complex than those of non-saline soils or soils with higher clay contents; thus, SSEM did not perform very well for some validation events. We further examined the model performance under four concepts: dynamic Ks and Kτ (Case 1, SSEM), dynamic Ks and constant Kτ (Case 2), constant Ks and dynamic Kτ (Case 3), and constant Ks and constant Kτ (Case 4). The results demonstrated that the model that considers dynamic variations in soil saturated hydraulic conductivity and soil erodibility can provide more reasonable runoff and erosion predictions for saline-sodic soils.
Numerical Simulation of Borehole Flow in Deep Monitor Wells, Pearl Harbor Aquifer, Oahu, Hawaii
NASA Astrophysics Data System (ADS)
Rotzoll, K.; Oki, D. S.; El-Kadi, A. I.
2010-12-01
Salinity profiles collected from uncased deep monitor wells are commonly used to monitor freshwater-lens thickness in coastal aquifers. However, vertical flow in these wells can cause the measured salinity to differ from salinity in the adjacent aquifer. Substantial borehole flow has been observed in uncased wells in the Pearl Harbor aquifer, Oahu, Hawaii. A numerical modeling approach, incorporating aquifer hydraulic characteristics and recharge rates representative of the Pearl Harbor aquifer, was used to evaluate the effects of borehole flow on measured salinity profiles from deep monitor wells. Borehole flow caused by vertical hydraulic gradients associated with the natural regional groundwater-flow system and local groundwater withdrawals was simulated. Model results were used to estimate differences between vertical salinity profiles in deep monitor wells and the adjacent aquifer in areas of downward, horizontal, and upward flow within the regional flow system—for cases with and without nearby pumped wells. Aquifer heterogeneity, represented in the model as layers of contrasting permeability, was incorporated in model scenarios. Results from this study provide insight into the magnitude of the differences between vertical salinity profiles from deep monitor wells and the salinity distributions in the aquifers. These insights are critically needed for management and predictive modeling purposes.
NASA Astrophysics Data System (ADS)
Lu, Jianbo; Xi, Yugeng; Li, Dewei; Xu, Yuli; Gan, Zhongxue
2018-01-01
A common objective of model predictive control (MPC) design is a large initial feasible region, a low online computational burden, and satisfactory control performance of the resulting algorithm. It is well known that interpolation-based MPC can achieve a favourable trade-off among these different aspects. However, the existing results are usually based on fixed prediction scenarios, which inevitably limits the performance of the obtained algorithms. By replacing the fixed prediction scenarios with time-varying multi-step prediction scenarios, this paper provides a new insight into improving the existing MPC designs. The adopted control law is a combination of predetermined multi-step feedback control laws, based on which two MPC algorithms with guaranteed recursive feasibility and asymptotic stability are presented. The efficacy of the proposed algorithms is illustrated by a numerical example.
Doble, Brett; Lorgelly, Paula
2016-04-01
To determine the external validity of existing mapping algorithms for predicting EQ-5D-3L utility values from EORTC QLQ-C30 responses and to establish their generalizability in different types of cancer. A main analysis (pooled) sample of 3560 observations (1727 patients) and two disease severity patient samples (496 and 93 patients) with repeated observations over time from Cancer 2015 were used to validate the existing algorithms. Errors were calculated between observed and predicted EQ-5D-3L utility values using a single pooled sample and ten pooled tumour type-specific samples. Predictive accuracy was assessed using mean absolute error (MAE) and standardized root-mean-squared error (RMSE). The association between observed and predicted EQ-5D utility values and other covariates across the distribution was tested using quantile regression. Quality-adjusted life years (QALYs) were calculated using observed and predicted values to test responsiveness. Ten 'preferred' mapping algorithms were identified. Two algorithms, estimated via response mapping and ordinary least-squares regression using dummy variables, performed well on a number of validation criteria, including accurate prediction of the best and worst QLQ-C30 health states, predicted values within the EQ-5D tariff range, relatively small MAEs and RMSEs, and minimal differences between estimated QALYs. Comparison of predictive accuracy across ten tumour type-specific samples highlighted that algorithms are relatively insensitive to grouping by tumour type and are affected more by differences in disease severity. Two of the 'preferred' mapping algorithms yield more accurate predictions, but limitations remain. We recommend extensive scenario analyses if mapped utilities are used in cost-utility analyses.
A Systematic Investigation of Computation Models for Predicting Adverse Drug Reactions (ADRs)
Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong
2014-01-01
Background Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective computational models to predict ADRs. Principal Findings In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, by implementing and evaluating additional algorithms that were previously used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that the final formulas of these algorithms could all be converted to a linear form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Conclusion Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms. PMID:25180585
A digital prediction algorithm for a single-phase boost PFC
NASA Astrophysics Data System (ADS)
Qing, Wang; Ning, Chen; Weifeng, Sun; Shengli, Lu; Longxing, Shi
2012-12-01
A novel digital control algorithm for digitally controlled power factor correction is presented, called the prediction algorithm; it achieves a higher power factor (PF) with lower total harmonic distortion and a faster dynamic response to changes in the input voltage or load current. For a given system, based on the current system state parameters, the prediction algorithm can estimate the trajectories of the output voltage and the inductor current at the next switching cycle and obtain a set of optimized control sequences that closely track the input voltage. The proposed prediction algorithm is verified under different conditions, and computer simulations and experimental results under multiple situations confirm its effectiveness. With an input voltage in the range of 90-265 V and a load current in the range of 20%-100%, the PF value is larger than 0.998. The startup and recovery times are about 0.1 s and 0.02 s, respectively, without overshoot. The experimental results also verify the validity of the proposed method.
Protein docking prediction using predicted protein-protein interface.
Li, Bin; Kihara, Daisuke
2012-01-10
Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein docking prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within the top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction to guide protein docking. Since the accuracy of protein binding site prediction varies from case to case, the challenge is to develop a method that does not deteriorate but improves docking results by using a binding site prediction that may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy compared with docking without binding site prediction or with using the binding site prediction only as a post-filter. We have developed PI-LZerD, a pairwise docking algorithm that uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual interface site predictions as well as unbound docking cases.
Pitch-Learning Algorithm For Speech Encoders
NASA Technical Reports Server (NTRS)
Bhaskar, B. R. Udaya
1988-01-01
Adaptive algorithm detects and corrects errors in sequence of estimates of pitch period of speech. Algorithm operates in conjunction with techniques used to estimate pitch period. Used in such parametric and hybrid speech coders as linear predictive coders and adaptive predictive coders.
Adaptive Trajectory Prediction Algorithm for Climbing Flights
NASA Technical Reports Server (NTRS)
Schultz, Charles Alexander; Thipphavong, David P.; Erzberger, Heinz
2012-01-01
Aircraft climb trajectories are difficult to predict, and large errors in these predictions reduce the potential operational benefits of some advanced features for NextGen. The algorithm described in this paper improves climb trajectory prediction accuracy by adjusting trajectory predictions based on observed track data. It utilizes rate-of-climb and airspeed measurements derived from position data to dynamically adjust the aircraft weight modeled for trajectory predictions. In simulations with weight uncertainty, the algorithm is able to adapt to within 3 percent of the actual gross weight within two minutes of the initial adaptation. The root-mean-square of altitude errors for five-minute predictions was reduced by 73 percent. Conflict detection performance also improved, with a 15 percent reduction in missed alerts and a 10 percent reduction in false alerts. In a simulation with climb speed capture intent and weight uncertainty, the algorithm improved climb trajectory prediction accuracy by up to 30 percent and conflict detection performance, reducing missed and false alerts by up to 10 percent.
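The core idea of the adaptive algorithm, adjusting modeled aircraft weight from observed climb performance, can be sketched as follows. At a given airspeed, rate of climb is roughly excess power divided by weight, so a predicted-to-observed climb-rate ratio above 1 suggests the modeled weight is too low. The update law, gain, and numbers below are illustrative assumptions, not the paper's actual adaptation equations.

```python
def adapt_weight(w_model, roc_predicted, roc_observed, gain=0.5):
    """Nudge the modeled gross weight toward the value implied by track data."""
    ratio = roc_predicted / roc_observed  # >1: aircraft climbs slower than modeled
    return w_model * (1.0 + gain * (ratio - 1.0))

w_model = 60000.0     # initial modeled gross weight, kg (hypothetical)
w_actual = 66000.0    # weight implied by the observed track (hypothetical)
excess_power = 1.2e8  # constant excess power, W (hypothetical)

roc_observed = excess_power / w_actual
for _ in range(10):   # each track update refines the weight estimate
    roc_predicted = excess_power / w_model
    w_model = adapt_weight(w_model, roc_predicted, roc_observed)

print(f"adapted weight: {w_model:.0f} kg")
```

With this simple proportional update the weight error halves at every step, so the estimate settles well within the paper's reported 3 percent band after a handful of track updates.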
Empirical algorithms to estimate water column pH in the Southern Ocean
NASA Astrophysics Data System (ADS)
Williams, N. L.; Juranek, L. W.; Johnson, K. S.; Feely, R. A.; Riser, S. C.; Talley, L. D.; Russell, J. L.; Sarmiento, J. L.; Wanninkhof, R.
2016-04-01
Empirical algorithms are developed using high-quality GO-SHIP hydrographic measurements of commonly measured parameters (temperature, salinity, pressure, nitrate, and oxygen) to estimate pH in the Pacific sector of the Southern Ocean. The coefficients of determination, R2, are 0.98 for pH from nitrate (pHN) and 0.97 for pH from oxygen (pHOx), with RMS errors of 0.010 and 0.008, respectively. These algorithms are applied to Southern Ocean Carbon and Climate Observations and Modeling (SOCCOM) biogeochemical profiling floats, which carry novel sensors (pH, nitrate, oxygen, fluorescence, and backscatter). The algorithms are used to estimate pH on floats with no pH sensors and to validate and adjust pH sensor data from floats with pH sensors. The adjusted float data provide, for the first time, seasonal cycles in surface pH at weekly resolution, ranging from 0.05 to 0.08, for the Pacific sector of the Southern Ocean.
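The empirical-algorithm approach above, a regression of pH on commonly measured parameters evaluated by R2 and RMS error, can be sketched as follows. The coefficients and data here are synthetic illustrations, not the SOCCOM fits.

```python
import numpy as np

# Fit pH = b0 + b1*T + b2*S + b3*P + b4*NO3 by least squares, then report
# R^2 and RMS error, mirroring the structure of a pHN-style algorithm.
rng = np.random.default_rng(0)
n = 500
T = rng.uniform(-1, 12, n)      # temperature, deg C
S = rng.uniform(33.5, 34.8, n)  # practical salinity
P = rng.uniform(0, 2000, n)     # pressure, dbar
NO3 = rng.uniform(10, 35, n)    # nitrate, umol/kg

true_b = np.array([7.2, 0.01, 0.02, -1e-5, -0.004])  # made-up coefficients
X = np.column_stack([np.ones(n), T, S, P, NO3])
pH = X @ true_b + rng.normal(0, 0.005, n)            # "measured" pH plus noise

b, *_ = np.linalg.lstsq(X, pH, rcond=None)
pred = X @ b
rmse = np.sqrt(np.mean((pH - pred) ** 2))
r2 = 1 - np.sum((pH - pred) ** 2) / np.sum((pH - pH.mean()) ** 2)
print(f"R^2 = {r2:.3f}, RMSE = {rmse:.4f}")
```

In practice the published algorithms are fit to quality-controlled hydrographic data and may include nonlinear or regional terms; this sketch only shows the fit-and-score skeleton.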
Nidheesh, N; Abdul Nazeer, K A; Ameer, P M
2017-12-01
Clustering algorithms with steps involving randomness usually give different results on different executions for the same dataset. This non-deterministic nature of algorithms such as the K-Means clustering algorithm limits their applicability in areas such as cancer subtype prediction using gene expression data. It is hard to sensibly compare the results of such algorithms with those of other algorithms. The non-deterministic nature of K-Means is due to its random selection of data points as initial centroids. We propose an improved, density based version of K-Means, which involves a novel and systematic method for selecting initial centroids. The key idea of the algorithm is to select data points which belong to dense regions and which are adequately separated in feature space as the initial centroids. We compared the proposed algorithm with a set of eleven widely used single clustering algorithms and a prominent ensemble clustering algorithm used for cancer data classification, based on their performance on ten cancer gene expression datasets. The proposed algorithm showed better overall performance than the others. There is a pressing need in the biomedical domain for simple, easy-to-use and more accurate machine learning tools for cancer subtype prediction. The proposed algorithm is simple, easy-to-use and gives stable results. Moreover, it provides comparatively better predictions of cancer subtypes from gene expression data. Copyright © 2017 Elsevier Ltd. All rights reserved.
Assessing the external validity of algorithms to estimate EQ-5D-3L from the WOMAC.
Kiadaliri, Aliasghar A; Englund, Martin
2016-10-04
The use of mapping algorithms has been suggested as a solution for predicting health utilities when no preference-based measure is included in a study. However, the validity and predictive performance of these algorithms are highly variable, and hence assessing their accuracy and validity before using them in a new setting is important. The aim of the current study was to assess the predictive accuracy of three mapping algorithms for estimating the EQ-5D-3L from the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) among Swedish people with knee disorders. Two of these algorithms were developed using ordinary least squares (OLS) models and one was developed using a mixture model. Data from 1078 subjects (mean (SD) age 69.4 (7.2) years) with frequent knee pain and/or knee osteoarthritis from the Malmö Osteoarthritis study in Sweden were used. The algorithms' performance was assessed using mean error, mean absolute error, and root mean squared error. Two types of prediction were estimated for the mixture model: weighted average (WA) and conditional on estimated component (CEC). The overall mean was overpredicted by one OLS model and underpredicted by the two other algorithms (P < 0.001). All predictions but the CEC predictions of the mixture model had a narrower range than the observed scores (22 to 90 %). All algorithms suffered from overprediction for severe health states and underprediction for mild health states, to a lesser extent for the mixture model. While the mixture model outperformed the OLS models at the extremes of the EQ-5D-3L distribution, it underperformed around the center of the distribution. While the algorithm based on the mixture model reflected the distribution of the EQ-5D-3L data more accurately than the OLS models, all algorithms suffered from systematic bias. This calls for caution in applying these mapping algorithms in a new setting, particularly in samples with milder knee problems than the original sample.
Assessing the impact of the choice of these algorithms on cost-effectiveness studies through sensitivity analysis is recommended.
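The error measures used above to judge the mapping algorithms (mean error, mean absolute error, and root mean squared error between observed and predicted utilities) can be computed as follows; the utility values are toy data, not study results.

```python
import numpy as np

# Observed vs. mapped EQ-5D-3L utilities (illustrative numbers only).
observed = np.array([0.85, 0.62, 0.40, 0.91, 0.73])
predicted = np.array([0.80, 0.70, 0.35, 0.88, 0.78])

err = predicted - observed
me = err.mean()                    # mean error: sign reveals over-/underprediction
mae = np.abs(err).mean()           # mean absolute error
rmse = np.sqrt((err ** 2).mean())  # penalizes large errors more than MAE
print(f"ME={me:.3f}  MAE={mae:.3f}  RMSE={rmse:.3f}")
```

A near-zero ME with a nonzero MAE, as here, is exactly the pattern of offsetting over- and underprediction that the abstract describes for the extremes of the utility distribution.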
NASA Astrophysics Data System (ADS)
Natalia, Slyusar; Pisman, Tamara; Pechurkin, Nikolai S.
Among the most challenging tasks faced by contemporary ecology is the modeling of biological production in different plant communities. The difficulty of the task is determined by the complexity of the study material. Models showing the influence of climate and climate change on plant growth, which also involve soil site parameters, are of both practical and theoretical interest. In this work a mathematical model has been constructed to describe the growth dynamics of different plant communities of halophytic meadows as dependent upon the temperature factor and the soil salinity level, which could be further used to predict yields of these plant communities. The study was performed on plants of halophytic meadows in the coastal area of a lake in the Republic of Khakasia in 2004-2006. Every plant community grew on soil of a different level of salinity, measured as the amount of the solid residue of the saline soil aqueous extract. The mathematical model was analyzed using field data of 2004 and 2006, years of contrasting air temperatures. Results of the model investigations show that there is a correlation between plant growth and air temperature for plant communities growing on soils with the lowest salt content (0.1). Thus, results of our study, in which we used a mathematical model describing the development of plant communities of halophytic meadows together with field measurements, suggest that both climate conditions (temperature) and ecological factors of the plants' habitat (soil salinity level) should be taken into account when constructing models for predicting crop yields.
NASA Astrophysics Data System (ADS)
Ghosh, Soumyadeep
Surfactant-polymer (SP) floods have significant potential to recover waterflood residual oil in shallow oil reservoirs. A thorough understanding of surfactant-oil-brine phase behavior is critical to the design of chemical EOR floods. While considerable progress has been made in developing surfactants and polymers that increase the potential of a chemical enhanced oil recovery (EOR) project, very little progress has been made to predict phase behavior as a function of formulation variables such as pressure, temperature, and oil equivalent alkane carbon number (EACN). The empirical Hand's plot is still used today to model the microemulsion phase behavior with little predictive capability as these and other formulation variables change. Such models could lead to incorrect recovery predictions and improper flood designs. Reservoir crudes also contain acidic components (primarily naphthenic acids), which undergo neutralization to form soaps in the presence of alkali. The generated soaps perform synergistically with injected synthetic surfactants to mobilize waterflood residual oil in what is termed alkali-surfactant-polymer (ASP) flooding. The addition of alkali, however, complicates the measurement and prediction of the microemulsion phase behavior that forms with acidic crudes. In this dissertation, we account for pressure changes in the hydrophilic-lipophilic difference (HLD) equation. This new HLD equation is coupled with the net-average curvature (NAC) model to predict phase volumes, solubilization ratios, and microemulsion phase transitions (Winsor II-, III, and II+). This dissertation presents the first modified HLD-NAC model to predict microemulsion phase behavior for live crudes, including optimal solubilization ratio and the salinity width of the three-phase Winsor III region at different temperatures and pressures. 
This new equation-of-state-like model could significantly aid the design and forecasting of chemical floods, where key variables change dynamically, and the screening of potential candidate reservoirs for chemical EOR. The modified HLD-NAC model is also extended here to ASP flooding. We use an empirical equation to calculate the acid distribution coefficient from the molecular structure of the soap. Key HLD-NAC parameters such as optimum salinities and optimum solubilization ratios are calculated from soap-mole-fraction-weighted equations. The model is tuned to data from phase behavior experiments with real crudes to demonstrate the procedure. We also examine the ability of the new model to predict fish plots and activity charts that show the evolution of the three-phase region. The modified HLD-NAC equations are then made dimensionless to develop important microemulsion phase behavior relationships and for use in tuning the new model to measured data. Key dimensionless groups that govern phase behavior, and their effects, are identified and analyzed. A new correlation was developed to predict optimum solubilization ratios at different temperatures, pressures and oil EACN with an average relative error of 10.55%. The prediction of optimum salinities with the modified HLD approach resulted in average relative errors of 2.35%. We also present a robust method to precisely determine optimum salinities and optimum solubilization ratios from salinity scan data, with average relative errors of 1.17% and 2.44%, respectively, for the published data examined.
ESA's Soil Moisture and Ocean Salinity Mission - Contributing to Water Resource Management
NASA Astrophysics Data System (ADS)
Mecklenburg, S.; Kerr, Y. H.
2015-12-01
The Soil Moisture and Ocean Salinity (SMOS) mission, launched in November 2009, is the European Space Agency's (ESA) second Earth Explorer Opportunity mission. The scientific objectives of the SMOS mission directly respond to the need for global observations of soil moisture and ocean salinity, two key variables used in predictive hydrological, oceanographic and atmospheric models. SMOS observations also provide information on the characterisation of ice- and snow-covered surfaces and the sea ice effect on ocean-atmosphere heat fluxes and dynamics, which affect large-scale processes of the Earth's climate system. The focus of this paper is SMOS's contribution to water resource management: SMOS surface soil moisture provides the input to derive root-zone soil moisture, which in turn provides the input for the drought index, an important monitoring and prediction tool for plant-available water. In addition to surface soil moisture, SMOS also provides observations of vegetation optical depth. Both parameters aid agricultural applications such as crop growth, yield forecasting and drought monitoring, and provide input for carbon and land surface modelling. SMOS data products are used in data assimilation and forecasting systems. Over land, assimilating SMOS-derived information has been shown to have a positive impact on applications such as NWP, streamflow forecasting and the analysis of net ecosystem exchange. Over the ocean, both sea surface salinity and severe wind speed have the potential to increase predictive skill at seasonal and short- to medium-range forecast ranges. Operational users, in particular in numerical weather prediction and operational hydrology, have put forward a requirement for soil moisture data to be available in near-real time (NRT). This has been addressed by developing a fast retrieval for an NRT level 2 soil moisture product based on neural networks, which will be available by autumn 2015.
This paper focuses on presenting the above applications and the SMOS data products used.
A link prediction approach to cancer drug sensitivity prediction.
Turki, Turki; Wei, Zhi
2017-10-03
Predicting the response to a drug for cancer patients based on genomic information is an important problem in modern clinical oncology. The problem is difficult in part because many available drug sensitivity prediction algorithms do not consider higher-quality cancer cell lines or adopt new feature representations, both of which lead to more accurate prediction of drug responses. By predicting drug responses accurately, oncologists gain a more complete understanding of the effective treatments for each patient, which is a core goal in precision medicine. In this paper, we model cancer drug sensitivity as a link prediction problem, which is shown to be an effective technique. We evaluate our proposed link prediction algorithms and compare them with an existing drug sensitivity prediction approach based on clinical trial data. The experimental results based on the clinical trial data show the stability of our link prediction algorithms, which yield the highest area under the ROC curve (AUC), with statistically significant improvements. We propose a link prediction approach to obtain a new feature representation. Compared with an existing approach, the results show that incorporating the new feature representation into the link prediction algorithms significantly improved the performance.
Canovas, Carmen; Alarcon, Aixa; Rosén, Robert; Kasthurirangan, Sanjeev; Ma, Joseph J K; Koch, Douglas D; Piers, Patricia
2018-02-01
To assess the accuracy of toric intraocular lens (IOL) power calculations with a new algorithm that incorporates the effect of posterior corneal astigmatism (PCA). Abbott Medical Optics, Inc., Groningen, the Netherlands. Retrospective case report. In eyes implanted with toric IOLs, the exact vergence formula of the Tecnis toric calculator was used to predict refractive astigmatism from preoperative biometry, surgeon-estimated surgically induced astigmatism (SIA), and implanted IOL power, with and without the new PCA algorithm. For each calculation method, the error in predicted refractive astigmatism was calculated as the vector difference between the prediction and the actual refraction. Calculations were also made using postoperative keratometry (K) values to eliminate the potential effect of incorrect SIA estimates. The study comprised 274 eyes. The PCA algorithm significantly reduced the centroid error in predicted refractive astigmatism (P < .001). With the PCA algorithm, the centroid error was reduced from 0.50 @ 1 to 0.19 @ 3 when using preoperative K values and from 0.30 @ 0 to 0.02 @ 84 when using postoperative K values. Patients who had anterior corneal against-the-rule, with-the-rule, or oblique astigmatism showed improvement with the PCA algorithm. In addition, the PCA algorithm reduced the median absolute error in all groups (P < .001). The use of the new PCA algorithm decreased the error in the prediction of residual refractive astigmatism in eyes implanted with toric IOLs. Therefore, the new PCA algorithm, in combination with an exact vergence IOL power calculation formula, led to increased predictability of toric IOL power. Copyright © 2018 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
Predicting missing links in complex networks based on common neighbors and distance
Yang, Jinxuan; Zhang, Xiao-Dong
2016-01-01
Algorithms based on the common-neighbors metric for predicting missing links in complex networks are very popular, but most of them do not account for missing links between nodes with no common neighbors. Reconstructing networks using these methods is not accurate enough in some cases, especially when pairs of nodes have few common neighbors. In this paper we propose a new algorithm based on common neighbors and distance to improve the accuracy of link prediction. The proposed algorithm is remarkably effective in predicting missing links between nodes with no common neighbors and performs better than most existing methods on a variety of real-world networks without increasing complexity. PMID:27905526
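A similarity index in this spirit, the common-neighbor count plus a distance term so that node pairs with no common neighbors still receive a nonzero score, can be sketched as below. The 1/distance weighting is an illustrative choice, not the paper's exact index.

```python
from collections import deque

def shortest_path_len(adj, u, v):
    """BFS shortest-path length in an undirected graph; None if unreachable."""
    seen, frontier, d = {u}, deque([u]), 0
    while frontier:
        d += 1
        for _ in range(len(frontier)):
            for w in adj[frontier.popleft()]:
                if w == v:
                    return d
                if w not in seen:
                    seen.add(w)
                    frontier.append(w)
    return None

def score(adj, u, v):
    """Common-neighbor count plus an inverse-distance term, so pairs with
    no common neighbors are still ranked rather than all scoring zero."""
    cn = len(adj[u] & adj[v])
    d = shortest_path_len(adj, u, v)
    return cn + (1.0 / d if d else 0.0)

# Small path-like graph: 1-2, 1-3, 2-3, 2-4, 4-5.
adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2, 5}, 5: {4}}
print(score(adj, 1, 3))  # 1 common neighbor + 1/1 -> 2.0
print(score(adj, 1, 5))  # no common neighbors, distance 3 -> 1/3
```

A plain common-neighbors index would score the pair (1, 5) as zero and never rank it; the distance term is what recovers such candidates.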
The environmental fluid dynamics code (EFDC) was used to study the three dimensional (3D) circulation, water quality, and ecology in Narragansett Bay, RI. Predictions of the Bay hydrodynamics included the behavior of the water surface elevation, currents, salinity, and temperatur...
NASA Astrophysics Data System (ADS)
Dash, Rajashree
2017-11-01
Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for simpler and more efficient models with better prediction capability. In this paper, an evolutionary framework is proposed that uses an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets (USD/CAD, USD/CHF, and USD/JPY) accumulated over the same period of time. The model's performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and the particle swarm optimization algorithm. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rates compared to the other models included in the study.
3D Protein structure prediction with genetic tabu search algorithm
2010-01-01
Background Protein structure prediction (PSP) has important applications in different fields, such as drug design and disease prediction. In protein structure prediction, there are two important issues: the design of the structure model and the design of the optimization technique. Because of the complexity of realistic protein structures, the structure model adopted in this paper is a simplified model called the off-lattice AB model. Once the structure model is chosen, an optimization technique is needed to search for the best conformation of a protein sequence under the assumed structure model. However, PSP is an NP-hard problem even if the simplest model is assumed. Thus, many algorithms have been developed to solve the global optimization problem. In this paper, a hybrid algorithm combining a genetic algorithm (GA) and a tabu search (TS) algorithm is developed to complete this task. Results In order to develop an efficient optimization algorithm, several improved strategies are introduced into the proposed genetic tabu search algorithm. The combined use of these strategies improves the efficiency of the algorithm. Among these strategies, tabu search introduced into the crossover and mutation operators improves the local search capability, the adoption of a variable population size maintains the diversity of the population, and the ranking selection strategy improves the probability that an individual with a low energy value enters the next generation. Experiments are performed with Fibonacci sequences and real protein sequences. Experimental results show that the lowest energy obtained by the proposed GATS algorithm is lower than that obtained by previous methods. Conclusions The hybrid algorithm has the advantages of both the genetic algorithm and the tabu search algorithm.
It makes use of the multiple search points of the genetic algorithm, and overcomes the poor hill-climbing capability of the conventional genetic algorithm through the flexible memory functions of TS. Compared with previous algorithms, the GATS algorithm has better global optimization performance and can predict 3D protein structure more effectively. PMID:20522256
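The off-lattice AB model the abstract refers to assigns each conformation an energy combining a bending term and a Lennard-Jones-like term whose attraction depends on residue types (AA: 1, BB: 1/2, mixed: -1/2 in the standard formulation). A minimal sketch of the energy function being minimized, assuming unit bond lengths:

```python
import numpy as np

def ab_energy(coords, seq):
    """Energy of a conformation in the off-lattice AB model: a bending
    term over interior angles plus a Lennard-Jones-like term whose
    well depth depends on residue types (AA: 1.0, BB: 0.5, mixed: -0.5).
    coords: (n, 3) positions with unit bond lengths; seq: 'A'/'B' string."""
    coords = np.asarray(coords, float)
    n = len(seq)
    bonds = coords[1:] - coords[:-1]
    cos_theta = np.sum(bonds[:-1] * bonds[1:], axis=1) / (
        np.linalg.norm(bonds[:-1], axis=1) * np.linalg.norm(bonds[1:], axis=1))
    e_bend = 0.25 * np.sum(1.0 - cos_theta)

    def coeff(i, j):
        if seq[i] == seq[j]:
            return 1.0 if seq[i] == 'A' else 0.5
        return -0.5

    e_lj = 0.0
    for i in range(n - 2):
        for j in range(i + 2, n):               # non-bonded pairs only
            r = np.linalg.norm(coords[i] - coords[j])
            e_lj += 4.0 * (r ** -12 - coeff(i, j) * r ** -6)
    return e_bend + e_lj

# A straight 3-residue chain: zero bending energy, one A-A pair at r = 2.
e = ab_energy([[0, 0, 0], [1, 0, 0], [2, 0, 0]], "ABA")
```

The GATS search then amounts to exploring conformation space for the coordinates minimizing this function.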
The Behavioral and Neural Mechanisms Underlying the Tracking of Expertise
Boorman, Erie D.; O’Doherty, John P.; Adolphs, Ralph; Rangel, Antonio
2013-01-01
Summary Evaluating the abilities of others is fundamental to successful economic and social behavior. We investigated the computational and neurobiological basis of ability tracking by designing an fMRI task that required participants to use and update estimates of the expertise of both people and algorithms through observation of their predictions. Behaviorally, we find that a model-based algorithm characterized subjects' predictions better than several alternative models. Notably, when an agent's prediction was concordant rather than discordant with the subject's own likely prediction, participants credited people more than algorithms for correct predictions and penalized them less for incorrect predictions. Neurally, many components of the mentalizing network (medial prefrontal cortex, anterior cingulate gyrus, temporoparietal junction, and precuneus) represented or updated expertise beliefs about both people and algorithms. Moreover, activity in lateral orbitofrontal and medial prefrontal cortex reflected behavioral differences in learning about people and algorithms. These findings provide basic insights into the neural basis of social learning. PMID:24360551
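The expertise updating described can be caricatured by a simple delta rule: after each observed prediction, move the expertise estimate toward the observed outcome. This is an illustrative sketch, not the authors' fitted model; the learning rate and simulated accuracy are arbitrary.

```python
import numpy as np

def track_expertise(outcomes, alpha=0.1, p0=0.5):
    """Delta-rule tracking of an agent's estimated expertise: after each
    observed prediction, nudge the estimate toward 1 (correct) or 0
    (incorrect) in proportion to the prediction error."""
    p, history = p0, []
    for o in outcomes:
        p += alpha * (o - p)      # prediction-error update
        history.append(p)
    return np.array(history)

# Simulate an agent whose predictions are correct on 80% of trials.
rng = np.random.default_rng(1)
outcomes = (rng.random(500) < 0.8).astype(float)
est = track_expertise(outcomes)
```

The asymmetric crediting reported in the abstract could be modeled by using different learning rates for people versus algorithms, or for correct versus incorrect outcomes.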
Neural Generalized Predictive Control: A Newton-Raphson Implementation
NASA Technical Reports Server (NTRS)
Soloway, Donald; Haley, Pamela J.
1997-01-01
An efficient implementation of Generalized Predictive Control using a multi-layer feedforward neural network as the plant's nonlinear model is presented. By using Newton-Raphson as the optimization algorithm, the number of iterations needed for convergence is significantly reduced compared with other techniques. The main cost of the Newton-Raphson algorithm is the calculation of the Hessian, but even with this overhead the low iteration count makes Newton-Raphson faster than other techniques and a viable algorithm for real-time control. This paper presents a detailed derivation of the Neural Generalized Predictive Control algorithm with Newton-Raphson as the minimization algorithm. Simulation results show convergence to a good solution within two iterations, and timing data show that real-time control is possible. Comments on the algorithm's implementation are also included.
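The scheme can be sketched as minimizing a receding-horizon cost over future control moves by Newton-Raphson. In the paper the gradient and Hessian are derived analytically through the network; here they are obtained by finite differences, and the plant model, horizon and weights are invented for illustration.

```python
import numpy as np

def plant_step(y, u):
    """Toy nonlinear plant (stand-in for the neural network model)."""
    return np.tanh(0.5 * y + u)

def cost(u, y0, ref, lam=0.01):
    """Receding-horizon cost: tracking error plus control-move penalty."""
    y, j, u_prev = y0, 0.0, 0.0
    for uk in u:
        y = plant_step(y, uk)
        j += (y - ref) ** 2 + lam * (uk - u_prev) ** 2
        u_prev = uk
    return j

def newton_gpc(y0, ref, horizon=3, iters=5, eps=1e-5):
    """Minimize the cost over the control horizon by Newton-Raphson,
    with finite-difference gradient and Hessian."""
    u = np.zeros(horizon)
    for _ in range(iters):
        g = np.zeros(horizon)
        H = np.zeros((horizon, horizon))
        for i in range(horizon):
            ei = np.zeros(horizon); ei[i] = eps
            g[i] = (cost(u + ei, y0, ref) - cost(u - ei, y0, ref)) / (2 * eps)
            for j in range(horizon):
                ej = np.zeros(horizon); ej[j] = eps
                H[i, j] = (cost(u + ei + ej, y0, ref) - cost(u + ei - ej, y0, ref)
                           - cost(u - ei + ej, y0, ref) + cost(u - ei - ej, y0, ref)) / (4 * eps ** 2)
        u = u - np.linalg.solve(H + 1e-8 * np.eye(horizon), g)  # Newton step
    return u

u_opt = newton_gpc(0.0, 0.5)
```

Because each Newton step uses second-order information, few iterations are needed, which is the property the paper exploits for real-time control.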
Marufuzzaman, M; Reaz, M B I; Ali, M A M; Rahman, L F
2015-01-01
The goal of smart homes is to create an intelligent environment that adapts to the inhabitants' needs and assists people who need special care and safety in their daily lives. This can be achieved by collecting ADL (activities of daily living) data and analyzing them within existing computing elements. In this research, a recent algorithm named sequence prediction via enhanced episode discovery (SPEED) is modified to include a time component in order to improve accuracy. The modified SPEED, or M-SPEED, is a sequence prediction algorithm that extends the previous SPEED algorithm by using the duration of appliances' ON-OFF states to decide the next state. M-SPEED discovers periodic episodes of inhabitant behavior, trains on the learned episodes, and makes decisions based on the obtained knowledge. The results showed that M-SPEED achieves 96.8% prediction accuracy, which is better than other time prediction algorithms such as PUBS, ALZ with temporal rules, and the previous SPEED. Since human behavior shows natural temporal patterns, duration times can be used to predict future events more accurately. This inhabitant activity prediction system will improve smart homes by ensuring safety and better care for elderly and disabled people.
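The core idea, using ON-OFF durations to disambiguate which state follows, can be illustrated with a toy frequency table. The appliance names, the 30-minute bucket boundary and the data below are invented, and the real M-SPEED discovers multi-event episodes rather than single transitions.

```python
from collections import Counter, defaultdict

def train(events):
    """Build a (state, duration bucket) -> next-state frequency table.
    events: iterable of (state, duration_minutes, next_state) triples."""
    table = defaultdict(Counter)
    for state, dur, nxt in events:
        bucket = "short" if dur < 30 else "long"
        table[(state, bucket)][nxt] += 1
    return table

def predict(table, state, dur):
    """Return the most frequent next state for this state and duration."""
    bucket = "short" if dur < 30 else "long"
    counts = table.get((state, bucket))
    return counts.most_common(1)[0][0] if counts else None

events = [
    ("lamp_on", 10, "lamp_off"),   # brief lamp use precedes switching it off
    ("lamp_on", 15, "lamp_off"),
    ("lamp_on", 120, "tv_on"),     # long lamp use precedes watching TV
    ("lamp_on", 90, "tv_on"),
]
table = train(events)
```

Without the duration component, "lamp_on" alone would always map to its single most frequent successor; the duration bucket is what lets the predictor separate the two behaviors.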
Algorithm for Training a Recurrent Multilayer Perceptron
NASA Technical Reports Server (NTRS)
Parlos, Alexander G.; Rais, Omar T.; Menon, Sunil K.; Atiya, Amir F.
2004-01-01
An improved algorithm has been devised for training a recurrent multilayer perceptron (RMLP) for optimal performance in predicting the behavior of a complex, dynamic, and noisy system multiple time steps into the future. [An RMLP is a computational neural network with self-feedback and cross-talk (both delayed by one time step) among neurons in hidden layers]. Like other neural-network-training algorithms, this algorithm adjusts network biases and synaptic-connection weights according to a gradient-descent rule. The distinguishing feature of this algorithm is a combination of global feedback (the use of predictions as well as the current output value in computing the gradient at each time step) and recursiveness. The recursive aspect of the algorithm lies in the inclusion of the gradient of predictions at each time step with respect to the predictions at the preceding time step; this recursion enables the RMLP to learn the dynamics. It has been conjectured that carrying the recursion to even earlier time steps would enable the RMLP to represent a noisier, more complex system.
A system for learning statistical motion patterns.
Hu, Weiming; Xiao, Xuejuan; Fu, Zhouyu; Xie, Dan; Tan, Tieniu; Maybank, Steve
2006-09-01
Analysis of motion patterns is an effective approach for anomaly detection and behavior prediction. Current approaches for the analysis of motion patterns depend on known scenes, where objects move in predefined ways. It is highly desirable to automatically construct object motion patterns that reflect the knowledge of the scene. In this paper, we present a system for automatically learning motion patterns for anomaly detection and behavior prediction, based on a proposed algorithm for robustly tracking multiple objects. In the tracking algorithm, foreground pixels are clustered using a fast, accurate fuzzy K-means algorithm. Growing and prediction of the cluster centroids of foreground pixels ensure that each cluster centroid is associated with a moving object in the scene. In the algorithm for learning motion patterns, trajectories are clustered hierarchically using spatial and temporal information, and each motion pattern is then represented with a chain of Gaussian distributions. Based on the learned statistical motion patterns, statistical methods are used to detect anomalies and predict behaviors. Our system is tested using image sequences acquired, respectively, from a crowded real traffic scene and a model traffic scene. Experimental results show the robustness of the tracking algorithm, the efficiency of the algorithm for learning motion patterns, and the encouraging performance of the algorithms for anomaly detection and behavior prediction.
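The tracking stage clusters foreground pixels with a fuzzy K-means variant. A sketch of the textbook fuzzy c-means algorithm (the paper uses a faster variant; the two-blob data below stands in for foreground pixel coordinates):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Textbook fuzzy c-means: alternate weighted center updates and
    membership updates; each row of U sums to 1 across the c clusters."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                    # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated blobs standing in for two moving objects' pixels.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(10.0, 0.1, (50, 2))])
centers, U = fuzzy_c_means(X)
```

Soft memberships let pixels near object boundaries contribute to both centroids, which helps when tracked objects touch or occlude one another.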
Functional Tradeoffs Underpin Salinity-Driven Divergence in Microbial Community Composition
Yooseph, Shibu; Ininbergs, Karolina; Goll, Johannes; Asplund-Samuelsson, Johannes; McCrow, John P.; Celepli, Narin; Allen, Lisa Zeigler; Ekman, Martin; Lucas, Andrew J.; Hagström, Åke; Thiagarajan, Mathangi; Brindefalk, Björn; Richter, Alexander R.; Andersson, Anders F.; Tenney, Aaron; Lundin, Daniel; Tovchigrechko, Andrey; Nylander, Johan A. A.; Brami, Daniel; Badger, Jonathan H.; Allen, Andrew E.; Rusch, Douglas B.; Hoffman, Jeff; Norrby, Erling; Friedman, Robert; Pinhassi, Jarone; Venter, J. Craig; Bergman, Birgitta
2014-01-01
Bacterial community composition and functional potential change subtly across gradients in the surface ocean. In contrast, there are significant phylogenetic divergences between communities from freshwater and marine habitats, yet the underlying mechanisms of this phylogenetic structuring remain unknown. We hypothesized that the functional potential of natural bacterial communities is linked to this striking divide between microbiomes. To test this hypothesis, we explored metagenomic sequencing of microbial communities along a 1,800 km transect in the Baltic Sea area, encompassing a continuous natural salinity gradient from limnic to fully marine conditions. Multivariate statistical analyses showed that salinity is the main determinant of dramatic changes in microbial community composition, but also of large-scale changes in core metabolic functions of bacteria. Strikingly, genetically and metabolically different pathways for key metabolic processes, such as respiration, biosynthesis of quinones and isoprenoids, glycolysis, and osmolyte transport, were differentially abundant at high and low salinities. These shifts in functional capacities were observed at multiple taxonomic levels and within dominant bacterial phyla, while some bacteria, such as SAR11, were able to adapt to the entire salinity gradient. We propose that the large differences in central metabolism required at high and low salinities dictate the striking divide between freshwater and marine microbiomes, and that the ability to inhabit different salinity regimes evolved early during bacterial phylogenetic differentiation. These findings significantly advance our understanding of microbial distributions and stress the need to incorporate salinity in future climate change models that predict increased levels of precipitation and a reduction in salinity. PMID:24586863
Environmental tolerances of rare and common mangroves along light and salinity gradients.
Dangremond, Emily M; Feller, Ilka C; Sousa, Wayne P
2015-12-01
Although mangroves possess a variety of morphological and physiological adaptations for life in a stressful habitat, interspecific differences in survival and growth under different environmental conditions can shape their local and geographic distributions. Soil salinity and light are known to affect mangrove performance, often in an interactive fashion. It has also been hypothesized that mangroves are intrinsically shade intolerant due to the high physiological cost of coping with saline flooded soils. To evaluate the relationship between stress tolerance and species distributions, we compared responses of seedlings of three widespread mangrove species and one narrow endemic mangrove species in a factorial array of light levels and soil salinities in an outdoor laboratory experiment. The more narrowly distributed species was expected to exhibit a lower tolerance of potentially stressful conditions. Two of the widespread species, Avicennia germinans and Lumnitzera racemosa, survived and grew well at low-medium salinity, regardless of light level, but performed poorly at high salinity, particularly under high light. The third widespread species, Rhizophora mangle, responded less to variation in light and salinity. However, at high salinity, its relative growth rate was low at every light level and none of these plants flushed leaves. As predicted, the rare species, Pelliciera rhizophorae, was the most sensitive to environmental stressors, suffering especially high mortality and reduced growth and quantum yield under the combined conditions of high light and medium-high salinity. That it only thrives under shaded conditions represents an important exception to the prevailing belief that halophytes are intrinsically constrained to be shade intolerant.
Freshening of the Labrador Sea Surface Waters in the 1990s: Another Great Salinity Anomaly
NASA Technical Reports Server (NTRS)
Hakkinen, Sirpa; Koblinsky, Chester J. (Technical Monitor)
2002-01-01
Both the observed and simulated time series of Labrador Sea surface salinities show a major freshening event since the mid-1990s. It continues the series of decadal events of the 1970s and 1980s, of which the freshening in the early 1970s was named the Great Salinity Anomaly (GSA). These events are especially distinguishable in the late summer (August and September) time series. The observed data suggest that the 1990s freshening may equal the GSA in magnitude. This recent event is associated with a large reduction in the overturning rate between the early and latter parts of the 1990s. Both the observations and model results indicate that surface salinity conditions appear to be returning towards normal during 1999 and 2000 in the coastal area, but offshore the model predicts the freshening to linger on after peaking in 1997.
Elaziz, Mohamed Abd; Hemdan, Ahmed Monem; Hassanien, AboulElla; Oliva, Diego; Xiong, Shengwu
2017-09-07
The current economics of the fish protein industry demand rapid, accurate and expressive prediction algorithms at every step of protein production, especially given the challenge of global climate change. Such algorithms help to predict and analyze functional and nutritional quality and consequently to control food allergies in hyper-allergic patients. However, it is expensive and time-consuming to determine these concentrations through laboratory experiments, especially in large-scale projects. Therefore, this paper introduces a new intelligent algorithm using an adaptive neuro-fuzzy inference system based on the whale optimization algorithm. The algorithm is used to predict the concentration levels of bioactive amino acids in fish protein hydrolysates at different times during the year. The whale optimization algorithm is used to determine the optimal parameters of the adaptive neuro-fuzzy inference system. The results of the proposed algorithm are compared with those of other methods and indicate its higher performance.
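The whale optimization algorithm used here to tune the ANFIS parameters is a population-based metaheuristic that alternates encircling the best solution, spiral moves around it, and random search. A minimal sketch of the optimizer core on a standard test function (the ANFIS coupling is omitted, and all hyperparameters below are arbitrary):

```python
import numpy as np

def woa_minimize(f, dim=2, n_whales=20, iters=200, bound=5.0, seed=0):
    """Core whale optimization algorithm (after Mirjalili & Lewis):
    encircling the best solution, spiral updates, and random search."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-bound, bound, (n_whales, dim))
    best = min(X, key=f).copy()
    for t in range(iters):
        a = 2.0 * (1.0 - t / iters)              # decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random(dim) - a
            C = 2.0 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1.0):      # exploit: encircle the best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                            # explore: follow a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                # spiral move toward the best
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2.0 * np.pi * l) + best
            X[i] = np.clip(X[i], -bound, bound)
            if f(X[i]) < f(best):
                best = X[i].copy()
    return best

sphere = lambda x: float(np.sum(x ** 2))
best = woa_minimize(sphere)
```

In the paper's setting, the objective f would score an ANFIS parameter vector by its prediction error rather than a sphere function.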
Predicting biomedical metadata in CEDAR: A study of Gene Expression Omnibus (GEO).
Panahiazar, Maryam; Dumontier, Michel; Gevaert, Olivier
2017-08-01
A crucial and limiting factor in data reuse is the lack of accurate, structured, and complete descriptions of data, known as metadata. Towards improving the quantity and quality of metadata, we propose a novel metadata prediction framework that learns associations from existing metadata and uses them to predict metadata values. We evaluate our framework in the context of experimental metadata from the Gene Expression Omnibus (GEO). We applied four rule mining algorithms to the most common structured metadata elements (sample type, molecular type, platform, label type and organism) from over 1.3 million GEO records. We examined the quality of well-supported rules from each algorithm and visualized the dependencies among metadata elements. Finally, we evaluated the performance of the algorithms in terms of accuracy, precision, recall, and F-measure. We found that PART is the best algorithm, outperforming Apriori, Predictive Apriori, and Decision Table. All algorithms perform significantly better at predicting class values than a majority-vote classifier. We found that the performance of the algorithms is related to the dimensionality of the GEO elements: the average performance of all algorithms increases as the number of unique values of these elements decreases (2697 platforms, 537 organisms, 454 labels, 9 molecules, and 5 types). Our work suggests that experimental metadata such as that present in GEO can be accurately predicted using rule mining algorithms. Our work has implications for both prospective and retrospective augmentation of metadata quality, which is geared towards making data easier to find and reuse. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
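The evaluation setup, per-class precision, recall and F-measure compared against a majority-vote baseline, can be sketched as follows. The sample labels are invented and merely echo the "molecular type" element mentioned above.

```python
from collections import Counter

def prf(y_true, y_pred, positive):
    """Precision, recall and F-measure for one class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Majority-vote baseline: always predict the most common class.
y_true = ["RNA", "DNA", "RNA", "RNA", "DNA", "RNA"]
majority = Counter(y_true).most_common(1)[0][0]
baseline = [majority] * len(y_true)
p, r, f = prf(y_true, baseline, "RNA")
```

The baseline attains perfect recall on the majority class but pays for it in precision, which is why beating it on F-measure is a meaningful bar for the rule miners.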
Enhanced clinical pharmacy service targeting tools: risk-predictive algorithms.
El Hajji, Feras W D; Scullin, Claire; Scott, Michael G; McElnay, James C
2015-04-01
This study aimed to determine the value of using a mix of clinical pharmacy data and routine hospital admission spell data in the development of predictive algorithms. Exploration of risk factors in hospitalized patients, together with the targeting strategies devised, will enable the prioritization of clinical pharmacy services to optimize patient outcomes. Predictive algorithms were developed through a number of detailed steps using a 75% sample of integrated medicines management (IMM) patients, and validated using the remaining 25%. IMM patients receive targeted clinical pharmacy input throughout their hospital stay. The algorithms were applied to the validation sample, and a predicted risk probability was generated for each patient from the coefficients. Risk thresholds for the algorithms were determined by identifying the cut-off points of risk scores at which the algorithms would have the highest discriminative performance. Clinical pharmacy staffing levels were obtained from the pharmacy department staffing database. Numbers of previous emergency admissions and admission medicines, together with age-adjusted co-morbidity and diuretic receipt, formed a 12-month post-discharge and/or readmission risk algorithm. Age-adjusted co-morbidity proved to be the best index for predicting mortality. Increased numbers of clinical pharmacy staff at ward level were correlated with a reduction in the risk-adjusted mortality index (RAMI). The algorithms created were valid in predicting the risk of in-hospital and post-discharge mortality and the risk of hospital readmission 3, 6 and 12 months post-discharge. The provision of ward-based clinical pharmacy services is a key component in reducing RAMI and enabling the full benefits of pharmacy input to patient care to be realized. © 2014 John Wiley & Sons, Ltd.
Lai, Fu-Jou; Chang, Hong-Tsun; Huang, Yueh-Min; Wu, Wei-Sheng
2014-01-01
Eukaryotic transcriptional regulation is known to be highly connected through networks of cooperative transcription factors (TFs). Measuring the cooperativity of TFs is helpful for understanding the biological relevance of these TFs in regulating genes. Recent advances in computational techniques have led to various predictions of cooperative TF pairs in yeast. As each algorithm integrated different data resources and was developed based on a different rationale, each possessed its own merits and claimed to outperform the others. However, such claims are prone to subjectivity because each algorithm was compared with only a few other algorithms using a small set of performance indices. This motivated us to propose a series of indices to objectively evaluate the prediction performance of existing algorithms and, based on the proposed performance indices, to conduct a comprehensive performance evaluation. We collected 14 sets of predicted cooperative TF pairs (PCTFPs) in yeast from 14 existing algorithms in the literature. Using the eight performance indices we adopted or proposed, the cooperativity of each PCTFP was measured and, for each performance index, a ranking score according to the mean cooperativity of the set was given to each set of PCTFPs under evaluation. The ranking scores of a set of PCTFPs vary with different performance indices, implying that an algorithm used to predict cooperative TF pairs has strengths in some respects but weaknesses in others. We finally made a comprehensive ranking of these 14 sets. The results showed that Wang J's study obtained the best performance evaluation on the prediction of cooperative TF pairs in yeast. In this study, we adopted or proposed eight performance indices to make a comprehensive performance evaluation of the prediction results of 14 existing cooperative TF identification algorithms.
Most importantly, these proposed indices can easily be applied to measure the performance of new algorithms developed in the future, thus expediting progress in this research field.
NASA Astrophysics Data System (ADS)
Kerr, Yann; Waldteufel, Philippe; Cabot, François; Richaume, Philippe; Jacquette, Elsa; Bitar, Ahmad Al; Mamhoodi, Ali; Delwart, Steven; Wigneron, Jean-Pierre
2010-05-01
The Soil Moisture and Ocean Salinity (SMOS) mission is the European Space Agency's (ESA) second Earth Explorer Opportunity mission, launched in November 2009. It is a joint programme between ESA, CNES (Centre National d'Etudes Spatiales) and CDTI (Centro para el Desarrollo Tecnologico Industrial). SMOS carries a single payload, an L-band 2D interferometric radiometer operating in the 1400-1427 MHz protected band. This wavelength penetrates well through the atmosphere, so the instrument probes the Earth's surface emissivity. Surface emissivity can then be related to the moisture content in the first few centimeters of soil and, after some surface roughness and temperature corrections, to the sea surface salinity over the ocean. In order to prepare for data use and dissemination, the ground segment will produce level 1 and level 2 data. Level 1 consists mainly of angular brightness temperatures, while level 2 consists of geophysical products. In this context, a group of institutes prepared the soil moisture and ocean salinity Algorithm Theoretical Basis Documents (ATBD) used to produce the operational algorithm. The principle of the soil moisture retrieval algorithm is an iterative approach that minimizes a cost function given by the sum of the squared weighted differences between measured and modelled brightness temperature (TB) data, for a variety of incidence angles. This is achieved by finding the best-suited set of the parameters that drive the direct TB model, e.g. soil moisture (SM) and vegetation characteristics. Despite the simplicity of this principle, the main reason for the complexity of the algorithm is that SMOS "pixels" can correspond to rather large, inhomogeneous surface areas whose contribution to the radiometric signal is difficult to model. Moreover, the exact description of pixels, given by a weighting function which expresses the directional pattern of the SMOS interferometric radiometer, depends on the incidence angle.
The goal is to retrieve soil moisture over fairly large and thus inhomogeneous areas. The retrieval is carried out at the nodes of a fixed Earth surface grid. After checking input data quality and ingesting auxiliary data, the retrieval process per se can be initiated. This cannot be done blindly, as the direct model depends on surface characteristics. It is thus necessary to first assess the dominant land use of a node. For this, an average weighting function (MEAN_WEF), which takes into account the "antenna" pattern, is run over the high-resolution land use map to assess the dominant cover type. This drives the decision tree which, step by step, selects the type of model to be used for the surface conditions. The retrieval procedure then starts if all the conditions are satisfied, ideally retrieving 3 parameters over the dominant class (the so-called rich retrieval). If the algorithm does not converge satisfactorily, a new trial is made with fewer floating parameters ("poorer retrieval") until either the results are satisfactory or the algorithm is considered to fail. The retrieval algorithm also delivers, whenever possible, a dielectric constant parameter (using the so-called cardioid approach). Finally, once the retrieval has converged, it is possible to compute the brightness temperature at a given fixed angle (42.5°) using the selected forward models applied to the set of parameters obtained at the end of the retrieval process. The output product of the level 2 soil moisture algorithm thus includes node position, soil moisture, dielectric constants, computed brightness temperature at 42.5°, flags and quality indices. During the presentation we will describe the algorithm and accompanying work in more detail, in particular the decision tree principle and characteristics, the auxiliary data used, and the special and "exotic" cases.
We will also be more explicit about algorithm validation and verification using the data collected during the commissioning phase. The main hurdle is operating in spite of spurious signals (radio-frequency interference, RFI) over some areas of the globe.
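The iterative retrieval described above amounts to a weighted least-squares fit of forward-model parameters to multi-angle TB data. The following Gauss-Newton sketch uses a deliberately toy linear forward model: the coefficients and the two-parameter (soil moisture, vegetation opacity) state are hypothetical stand-ins, not the SMOS L2 model.

```python
import numpy as np

def tb_model(theta, sm, tau):
    """Toy TB forward model in Kelvin (hypothetical coefficients):
    TB decreases with soil moisture, varies with opacity and angle."""
    return 250.0 - 80.0 * sm + 15.0 * tau * np.cos(theta)

def retrieve(theta, tb_meas, sigma, x0=(0.2, 0.1), iters=10):
    """Minimize sum_i w_i * (TB_meas_i - TB_model_i)^2 by Gauss-Newton."""
    x = np.array(x0, float)
    w = 1.0 / sigma ** 2                             # per-angle weights
    for _ in range(iters):
        r = tb_meas - tb_model(theta, *x)            # residuals
        J = np.empty((len(theta), 2))                # numeric Jacobian
        eps = 1e-6
        for k in range(2):
            dx = np.zeros(2); dx[k] = eps
            J[:, k] = (tb_model(theta, *(x + dx)) - tb_model(theta, *(x - dx))) / (2 * eps)
        A = J.T @ (w[:, None] * J)                   # weighted normal equations
        g = J.T @ (w * r)
        x = x + np.linalg.solve(A, g)
    return x

theta = np.deg2rad(np.arange(10, 55, 5))             # multi-angle observations
tb_meas = tb_model(theta, 0.3, 0.25)                 # noise-free synthetic data
sigma = np.full(len(theta), 2.0)
sm, tau = retrieve(theta, tb_meas, sigma)
```

The multi-angle sampling is what makes the two parameters separable: with a single incidence angle the normal-equation matrix would be rank deficient.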
Agounad, Said; Aassif, El Houcein; Khandouch, Younes; Maze, Gérard; Décultot, Dominique
2018-02-01
The acoustic scattering of a plane wave by an elastic cylindrical shell is studied. A new approach is developed to predict the form function of an immersed cylindrical shell of radius ratio b/a ('b' is the inner radius and 'a' the outer radius). The prediction of the backscattered form function is investigated by a combined approach between fuzzy clustering algorithms and bio-inspired algorithms. Four well-known fuzzy clustering algorithms, the fuzzy c-means (FCM), the Gustafson-Kessel algorithm (GK), the fuzzy c-regression model (FCRM) and the Gath-Geva algorithm (GG), are combined with particle swarm optimization and a genetic algorithm. The symmetric and antisymmetric circumferential waves A, S0, A1, S1 and S2 are investigated in a reduced frequency (k1a) range extending over 0.1
Increased resistance to a generalist herbivore in a salinity-stressed non-halophytic plant
Renault, Sylvie; Wolfe, Scott; Markham, John; Avila-Sakar, Germán
2016-01-01
Plants often grow under the combined stress of several factors. Salinity and herbivory, separately, can severely hinder plant growth and reproduction, but the combined effects of both factors are still not clearly understood. Salinity is known to reduce plant tissue nitrogen content and growth rates. Since herbivores prefer tissues with high N content, and biochemical pathways leading to resistance are commonly elicited by salt stress, we hypothesized that plants growing in saline conditions would have enhanced resistance against herbivores. The non-halophyte Brassica juncea and the generalist herbivore Trichoplusia ni were used to test the prediction that plants subjected to salinity stress would be both more resistant and more tolerant to herbivory than those growing without salt stress. Plants were grown under different NaCl levels, and either exposed to herbivores followed by removal of half of their leaves, or left intact. Plants were left to grow and reproduce until senescence. Tissue quality was assessed, seeds were counted and the biomass of different organs measured. Plants exposed to salinity grew less and had reduced tissue nitrogen, protein and chlorophyll content, although proline levels increased. Specific leaf area, leaf water content, transpiration and root:shoot ratio remained unaffected. Plants growing under saline conditions had greater constitutive resistance than unstressed plants. However, induced resistance and tolerance were not affected by salinity. These results support the hypothesis that plants growing under salt stress are better defended against herbivores, although in B. juncea this may be mostly through resistance, and less through tolerance. PMID:27169610
Prediction algorithms for urban traffic control
DOT National Transportation Integrated Search
1979-02-01
The objectives of this study are to 1) review and assess the state-of-the-art of prediction algorithms for urban traffic control in terms of their accuracy and application, and 2) determine the prediction accuracy obtainable by examining the performa...
Phase 2 development of Great Lakes algorithms for Nimbus-7 coastal zone color scanner
NASA Technical Reports Server (NTRS)
Tanis, Fred J.
1984-01-01
A series of experiments was conducted in the Great Lakes to evaluate the application of the Nimbus-7 Coastal Zone Color Scanner (CZCS). Atmospheric and water optical models were used to relate surface and subsurface measurements to satellite-measured radiances. Absorption and scattering measurements were reduced to obtain a preliminary optical model for the Great Lakes. Algorithms were developed for geometric correction, correction for Rayleigh and aerosol path radiance, and prediction of chlorophyll-a pigment and suspended mineral concentrations. The atmospheric algorithm developed compared favorably with existing algorithms and was the only algorithm found to adequately predict the radiance variations in the 670 nm band. The atmospheric correction algorithm was designed to extract the needed algorithm parameters from the CZCS radiance values. The Gordon/NOAA ocean algorithms could not be demonstrated to work for Great Lakes waters. Predicted values of chlorophyll-a concentration compared favorably with expected and measured data for several areas of the Great Lakes.
Optimal design of low-density SNP arrays for genomic prediction: algorithm and applications
USDA-ARS?s Scientific Manuscript database
Low-density (LD) single nucleotide polymorphism (SNP) arrays provide a cost-effective solution for genomic prediction and selection, but algorithms and computational tools are needed for their optimal design. A multiple-objective, local optimization (MOLO) algorithm was developed for design of optim...
Water analysis via portable X-ray fluorescence spectrometry
NASA Astrophysics Data System (ADS)
Pearson, Delaina; Chakraborty, Somsubhra; Duda, Bogdan; Li, Bin; Weindorf, David C.; Deb, Shovik; Brevik, Eric; Ray, D. P.
2017-01-01
Rapid, in-situ elemental water analysis would be an invaluable tool for studying polluted and/or salt-impacted waters. Analysis of water salinity has commonly used electrical conductance (EC); however, the identity of the elements responsible for the salinity is not revealed by EC. Several studies have established the viability of using portable X-ray fluorescence (PXRF) spectrometry for elemental analysis of soil, sediment, and other matrices. However, the accuracy of PXRF is known to be affected when scanning moisture-laden soil samples. This study used PXRF elemental data from water samples to predict water EC. A total of 256 water samples from 10 different countries were collected and analyzed via PXRF, inductively coupled plasma atomic emission spectroscopy (ICP-AES), and a digital salinity bridge. The PXRF detected some elements more effectively than others, but overall results indicated that PXRF can successfully predict water EC by quantifying Cl in water samples (validation R2 and RMSE of 0.77 and 0.95 log μS cm-1, respectively). The findings of this study elucidate the potential of PXRF for future analysis of pollutant- and/or metal-contaminated waters.
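The prediction step described is a univariate regression of log-transformed EC on the PXRF Cl signal, scored by R2 and RMSE. A sketch on synthetic data (the slope, intercept and noise level are invented; the study's reported validation R2 of 0.77 is not reproduced here):

```python
import numpy as np

# Synthetic data: PXRF Cl signal vs. log10 EC (all numbers invented).
rng = np.random.default_rng(2)
cl = rng.uniform(1.0, 4.0, 120)
log_ec = 0.9 * cl + 0.5 + rng.normal(0.0, 0.2, 120)

# Ordinary least-squares fit and the validation-style statistics.
slope, intercept = np.polyfit(cl, log_ec, 1)
pred = slope * cl + intercept
ss_res = float(np.sum((log_ec - pred) ** 2))
ss_tot = float(np.sum((log_ec - log_ec.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
rmse = float(np.sqrt(np.mean((log_ec - pred) ** 2)))
```

Working in log10 EC, as the reported RMSE units (log μS cm-1) imply, keeps the residuals comparable across the wide dynamic range of salinities.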
Petersen, Bjørn Molt; Boel, Mikkel; Montag, Markus; Gardner, David K
2016-10-01
Can a generally applicable morphokinetic algorithm, suitable for Day 3 transfers of time-lapse monitored embryos originating from different culture conditions and fertilization methods, be developed to support the embryologist's decision on which embryo to transfer back to the patient in assisted reproduction? The algorithm presented here can be used independently of culture conditions and fertilization method and provides predictive power not surpassed by other published algorithms for ranking embryos according to their blastocyst formation potential. Generally applicable algorithms have so far been developed only for predicting blastocyst formation. A number of clinics have reported validated implantation prediction algorithms, which have been developed based on clinic-specific culture conditions and clinical environments. However, a generally applicable embryo evaluation algorithm based on actual implantation outcome has not yet been reported. This was a retrospective evaluation of data extracted from a database of known implantation data (KID) originating from 3275 embryos transferred on Day 3, conducted in 24 clinics between 2009 and 2014. The data represented different culture conditions (reduced and ambient oxygen with various culture medium strategies) and fertilization methods (IVF, ICSI). The capability to predict blastocyst formation was evaluated on an independent set of morphokinetic data from 11 218 embryos which had been cultured to Day 5. The algorithm was developed by applying automated recursive partitioning to a large number of annotation types and derived equations, progressing to a five-fold cross-validation test of the complete data set and a validation test across different incubation conditions and fertilization methods. The results were expressed as receiver operating characteristic curves, using the area under the curve (AUC) to establish the predictive strength of the algorithm.
By applying the here developed algorithm (KIDScore), which was based on six annotations (the number of pronuclei equals 2 at the 1-cell stage, time from insemination to pronuclei fading at the 1-cell stage, time from insemination to the 2-cell stage, time from insemination to the 3-cell stage, time from insemination to the 5-cell stage and time from insemination to the 8-cell stage) and ranking the embryos in five groups, the implantation potential of the embryos was predicted with an AUC of 0.650. On Day 3 the KIDScore algorithm was capable of predicting blastocyst development with an AUC of 0.745 and blastocyst quality with an AUC of 0.679. In a comparison of blastocyst prediction including six other published algorithms and KIDScore, only KIDScore and one more algorithm surpassed an algorithm constructed on conventional Alpha/ESHRE consensus timings in terms of predictive power. Some morphological assessments were not available and consequently three of the algorithms in the comparison were not used in full and may therefore have been put at a disadvantage. Algorithms based on implantation data from Day 3 embryo transfers require adjustments to be capable of predicting the implantation potential of Day 5 embryo transfers. The current study is restricted by its retrospective nature and absence of live birth information. Prospective Randomized Controlled Trials should be used in future studies to establish the value of time-lapse technology and morphokinetic evaluation. Algorithms applicable to different culture conditions can be developed if based on large data sets of heterogeneous origin. This study was funded by Vitrolife A/S, Denmark and Vitrolife AB, Sweden. B.M.P.'s company BMP Analytics is performing consultancy for Vitrolife A/S. M.B. is employed at Vitrolife A/S. M.M.'s company ilabcomm GmbH received honorarium for consultancy from Vitrolife AB. D.K.G. received research support from Vitrolife AB. © The Author 2016. 
Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology.
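The AUC values reported above can be read as a rank statistic: the probability that a randomly chosen implanted embryo is scored higher than a randomly chosen non-implanted one, counting ties as half. A minimal sketch of that computation, on synthetic scores and outcomes rather than the study's data:

```python
# Illustrative only: AUC of a ranking score against binary outcomes.
# Scores and outcomes below are synthetic, not KIDScore annotations.

def auc(scores, outcomes):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive outscores a randomly chosen negative,
    with ties counted as half a win."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores   = [5, 4, 4, 3, 2, 2, 1, 1]   # ranking group (5 = best)
outcomes = [1, 1, 0, 1, 0, 0, 0, 1]   # 1 = implanted, 0 = not
print(round(auc(scores, outcomes), 3))   # → 0.688
```

An AUC of 0.5 would mean the ranking carries no information; 1.0 would mean every implanted embryo outranks every non-implanted one.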
High Speed Research Noise Prediction Code (HSRNOISE) User's and Theoretical Manual
NASA Technical Reports Server (NTRS)
Golub, Robert (Technical Monitor); Rawls, John W., Jr.; Yeager, Jessie C.
2004-01-01
This report describes a computer program, HSRNOISE, that predicts noise levels for a supersonic aircraft powered by mixed flow turbofan engines with rectangular mixer-ejector nozzles. It fully documents the noise prediction algorithms, provides instructions for executing the HSRNOISE code, and provides predicted noise levels for the High Speed Research (HSR) program Technology Concept (TC) aircraft. The component source noise prediction algorithms were developed jointly by Boeing, General Electric Aircraft Engines (GEAE), NASA and Pratt & Whitney during the course of the NASA HSR program. Modern Technologies Corporation developed an alternative mixer ejector jet noise prediction method under contract to GEAE that has also been incorporated into the HSRNOISE prediction code. Algorithms for determining propagation effects and calculating noise metrics were taken from the NASA Aircraft Noise Prediction Program.
NASA Astrophysics Data System (ADS)
Bukhari, W.; Hong, S.-M.
2015-01-01
Motion-adaptive radiotherapy aims to deliver a conformal dose to the target tumour with minimal normal tissue exposure by compensating for tumour motion in real time. The prediction as well as the gating of respiratory motion have received much attention over the last two decades for reducing the targeting error of the treatment beam due to respiratory motion. In this article, we present a real-time algorithm for predicting and gating respiratory motion that utilizes a model-based and a model-free Bayesian framework by combining them in a cascade structure. The algorithm, named EKF-GPR+, implements a gating function without pre-specifying a particular region of the patient’s breathing cycle. The algorithm first employs an extended Kalman filter (LCM-EKF) to predict the respiratory motion and then uses a model-free Gaussian process regression (GPR) to correct the error of the LCM-EKF prediction. The GPR is a non-parametric Bayesian algorithm that yields predictive variance under Gaussian assumptions. The EKF-GPR+ algorithm utilizes the predictive variance from the GPR component to capture the uncertainty in the LCM-EKF prediction error and systematically identify breathing points with a higher probability of large prediction error in advance. This identification allows us to pause the treatment beam over such instances. EKF-GPR+ implements the gating function by using simple calculations based on the predictive variance with no additional detection mechanism. A sparse approximation of the GPR algorithm is employed to realize EKF-GPR+ in real time. Extensive numerical experiments are performed based on a large database of 304 respiratory motion traces to evaluate EKF-GPR+. The experimental results show that the EKF-GPR+ algorithm effectively reduces the prediction error in a root-mean-square (RMS) sense by employing the gating function, albeit at the cost of a reduced duty cycle. 
As an example, EKF-GPR+ reduces the patient-wise RMS error to 37%, 39% and 42% of the no-prediction error for a duty cycle of 80% at lookahead lengths of 192 ms, 384 ms and 576 ms, respectively. The experiments also confirm that EKF-GPR+ controls the duty cycle with reasonable accuracy.
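The gating idea above, pausing the beam wherever the predictive variance is high, can be sketched with a bare-bones Gaussian process regression. The kernel settings, training signal and gating threshold below are illustrative assumptions, not the published EKF-GPR+ configuration:

```python
import numpy as np

# Toy variance-based gating with an RBF-kernel GP. A sine series stands in
# for the residuals the GPR component would model; all hyperparameters are
# arbitrary choices for illustration.

def rbf(a, b, length=1.0, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    Kss = rbf(x_test, x_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

x = np.linspace(0, 6, 30)
y = np.sin(x)                    # stand-in for prediction residuals
xq = np.array([3.0, 12.0])       # one query inside the data, one far outside
mean, std = gp_predict(x, y, xq)
beam_on = std < 0.5              # gate: pause the beam when uncertainty is high
print(beam_on)
```

The far-from-data query reverts to the prior variance, so its gate closes; no separate anomaly detector is needed, which mirrors the paper's point that the gating function falls out of the predictive variance itself.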
An improved stochastic fractal search algorithm for 3D protein structure prediction.
Zhou, Changjun; Sun, Chuan; Wang, Bin; Wang, Xiaojun
2018-05-03
Protein structure prediction (PSP) is a significant area for biological information research, disease treatment, and drug development. In this paper, three-dimensional structures of proteins are predicted from known amino acid sequences, and the structure prediction problem is transformed into a typical NP-hard problem by an AB off-lattice model. This work applies a novel improved Stochastic Fractal Search algorithm (ISFS) to solve the problem. The Stochastic Fractal Search algorithm (SFS) is an effective evolutionary algorithm that performs well in exploring the search space but sometimes falls into local minima. To avoid this weakness, Lévy flight and internal feedback information are introduced in ISFS. In the experimental process, simulations are conducted with the ISFS algorithm on Fibonacci sequences and real peptide sequences. Experimental results show that ISFS performs more efficiently and robustly in terms of finding the global minimum and avoiding local minima.
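Lévy flights of the kind introduced in ISFS are commonly generated with Mantegna's algorithm, which produces heavy-tailed steps: mostly small moves with occasional long jumps that help escape local minima. The sketch below is a generic illustration (beta = 1.5 is an arbitrary, though common, choice), not the paper's exact operator:

```python
import math, random

# Mantegna's algorithm for Lévy-flight step generation, as widely used in
# metaheuristics; beta and the sample count are illustrative.

def levy_step(beta=1.5):
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)      # scale for the numerator draw
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)          # heavy-tailed step length

random.seed(0)
steps = [levy_step() for _ in range(1000)]
med = sorted(abs(s) for s in steps)[500]
big = max(abs(s) for s in steps)
print(med < big)   # heavy tail: a few steps dwarf the typical one
```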
Prediction based active ramp metering control strategy with mobility and safety assessment
NASA Astrophysics Data System (ADS)
Fang, Jie; Tu, Lili
2018-04-01
Ramp metering is one of the most direct and efficient motorway traffic flow management measures for improving traffic conditions. However, because earlier studies lacked traffic condition prediction, the impact of the applied RM control on traffic flow dynamics was not quantitatively evaluated. In this study, an RM control algorithm adopting the Model Predictive Control (MPC) framework to predict and assess future traffic conditions is presented, taking both the current traffic conditions and the RM-controlled future traffic states into consideration. The designed RM control algorithm targets optimizing network mobility and safety performance, and is evaluated in a field-data-based simulation. Comparing the scenario controlled by the presented algorithm against the uncontrolled scenario shows that the proposed RM control algorithm can effectively relieve congestion in the traffic network with no significant compromise in safety.
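The receding-horizon logic of such an MPC ramp-metering controller can be sketched on a toy one-segment model: enumerate candidate metering-rate sequences over a short horizon, simulate each, and apply the first rate of the cheapest sequence. The Greenshields flow model and every parameter below are illustrative assumptions, not the paper's network:

```python
import itertools

# Toy MPC ramp metering: one mainline segment plus a ramp queue.
# All constants are illustrative.

T = 30 / 3600                     # control step (h)
L = 1.0                           # segment length (km)
VF, RHO_J = 100.0, 180.0          # free-flow speed (km/h), jam density (veh/km)
Q_MAIN, D_RAMP = 3500.0, 1500.0   # mainline inflow / ramp demand (veh/h)
RATES = [0.0, 600.0, 1200.0, 1800.0]   # candidate metering rates (veh/h)

def step(rho, w, r):
    r = min(r, w / T + D_RAMP)               # cannot meter more than waiting
    q_out = rho * VF * (1 - rho / RHO_J)     # Greenshields outflow
    rho2 = max(0.0, rho + T / L * (Q_MAIN + r - q_out))
    w2 = max(0.0, w + T * (D_RAMP - r))
    return rho2, w2

def mpc_rate(rho, w, horizon=3):
    best, best_cost = RATES[0], float("inf")
    for seq in itertools.product(RATES, repeat=horizon):
        rh, wh, cost = rho, w, 0.0
        for r in seq:
            rh, wh = step(rh, wh, r)
            cost += (rh * L + wh) * T        # total time spent (veh·h)
        if cost < best_cost:
            best, best_cost = seq[0], cost   # receding horizon: keep first move
    return best

rho, w = 40.0, 10.0
for _ in range(10):
    rho, w = step(rho, w, mpc_rate(rho, w))
print(0 <= rho <= RHO_J, w >= 0)
```

A real controller would replace the brute-force enumeration with a proper optimizer and a calibrated multi-segment model, but the predict-evaluate-apply loop is the same.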
False-nearest-neighbors algorithm and noise-corrupted time series
NASA Astrophysics Data System (ADS)
Rhodes, Carl; Morari, Manfred
1997-05-01
The false-nearest-neighbors (FNN) algorithm was originally developed to determine the embedding dimension for autonomous time series. For noise-free computer-generated time series, the algorithm does a good job in predicting the embedding dimension. However, the problem of predicting the embedding dimension when the time-series data are corrupted by noise was not fully examined in the original studies of the FNN algorithm. Here it is shown that with large data sets, even small amounts of noise can lead to incorrect prediction of the embedding dimension. Surprisingly, as the length of the time series analyzed by FNN grows larger, the cause of incorrect prediction becomes more pronounced. An analysis of the effect of noise on the FNN algorithm and a solution for dealing with the effects of noise are given here. Some results on the theoretically correct choice of the FNN threshold are also presented.
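The FNN test itself is compact: embed the series in dimension d, find each point's nearest neighbor, and flag the pair as false if adding the (d+1)-th coordinate blows their separation up past a threshold. A sketch using the common Rtol = 15 and a stand-in sine series (a real analysis would also apply the noise corrections the paper discusses):

```python
import numpy as np

# Compact false-nearest-neighbors fraction; Rtol = 15 follows common
# practice, and the deterministic sine series is only a stand-in for data.

def fnn_fraction(x, dim, tau=1, rtol=15.0):
    n = len(x) - dim * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    nxt = x[dim * tau : dim * tau + n]       # coordinate gained at dim + 1
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                        # exclude the point itself
        j = int(np.argmin(d))
        if abs(nxt[i] - nxt[j]) / max(d[j], 1e-12) > rtol:
            false += 1
    return false / n

x = np.sin(0.3 * np.arange(400))
# The false-neighbor fraction should drop once the attractor is unfolded.
print(fnn_fraction(x, 1), fnn_fraction(x, 2))
```

For a clean sine wave the fraction collapses by dimension 2; the paper's point is that with noisy data this criterion misfires, increasingly so for longer series, unless the threshold is chosen with the noise level in mind.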
2012-10-31
intrapulmonary and intracardiac shunt using saline contrast echocardiography to determine bubble/shunt scores. We will also use nuclear medicine imaging to...subjects have completed saline contrast echocardiography while breathing hypoxic gas mixtures. For Task #2 “10 hr hypoxic exposure and AMS... echocardiography while breathing an FIO2=0.14, will be susceptible or resistant to developing AMS after 10 hr hypoxic exposure. For Task #3 “Hypoxia
NASA Astrophysics Data System (ADS)
Klein, R.; Adler, A.; Beanlands, R. S.; de Kemp, R. A.
2007-02-01
A rubidium-82 (82Rb) elution system is described for use with positron emission tomography. Due to the short half-life of 82Rb (76 s), the system physics must be modelled precisely to account for transport delay and the associated activity decay and dispersion. Saline flow is switched between a 82Sr/82Rb generator and a bypass line to achieve a constant-activity elution of 82Rb. Pulse width modulation (PWM) of a solenoid valve is compared to simple threshold control as a means to simulate a proportional valve. A predictive-corrective control (PCC) algorithm is developed which produces a constant-activity elution within the constraints of long feedback delay and short elution time. The system model parameters are adjusted through a self-tuning algorithm to minimize error versus the requested time-activity profile. The system is self-calibrating with 2.5% repeatability, independent of generator activity and elution flow rate. Accurate 30 s constant-activity elutions of 10-70% of the total generator activity are achieved using both control methods. The combined PWM-PCC method provides significant improvement in precision and accuracy of the requested elution profiles. The 82Rb elution system produces accurate and reproducible constant-activity elution profiles of 82Rb activity, independent of parent 82Sr activity in the generator. More reproducible elution profiles may improve the quality of clinical and research PET perfusion studies using 82Rb.
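The predictive-corrective flavor of such a controller can be sketched with a one-compartment generator model: predict the valve duty cycle from the modelled generator activity, then trim it with an integral of the measured error. The controller's calibration constant below is deliberately ~10% off so the corrective term has something to do; all constants are illustrative, not the published system's calibration:

```python
import math

# Toy predictive-corrective duty-cycle loop for a constant-activity elution.

LAM = math.log(2) / 76.0   # 82Rb decay constant (1/s)
G_MAX = 100.0              # equilibrium generator activity (arbitrary units)
F_TRUE = 0.05              # true fraction of activity swept out per s at full flow
F_CAL = 0.045              # controller's (slightly wrong) calibration of F_TRUE
TARGET = 2.0               # requested delivered activity rate (units/s)
GAIN = 0.1                 # integral correction gain

g, dt, corr, err, delivered = G_MAX, 0.1, 0.0, 0.0, []
for _ in range(int(30 / dt)):                 # 30 s elution
    corr += GAIN * err                        # corrective (integral) term
    duty = TARGET / (F_CAL * g) - corr        # predictive term from the model
    duty = min(1.0, max(0.0, duty))
    out = duty * F_TRUE * g                   # rate actually delivered
    err = out - TARGET
    delivered.append(out)
    g += (LAM * (G_MAX - g) - duty * F_TRUE * g) * dt   # depletion + regrowth
mean = sum(delivered) / len(delivered)
print(abs(mean - TARGET) < 0.05, abs(err) < 0.01)
```

The duty cycle rises through the elution as the generator depletes, which is the same qualitative behaviour the self-tuning PWM-PCC scheme must produce; the real system additionally contends with transport delay and dispersion between valve and detector.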
The low salinity effect at high temperatures
Xie, Quan; Brady, Patrick V.; Pooryousefy, Ehsan; ...
2017-04-05
The mechanism(s) of low salinity water flooding (LSWF) must be better understood at high temperatures and pressures if the method is to be applied in high T/P kaolinite-bearing sandstone reservoirs. We measured contact angles between a sandstone and an oil (acid number, AN = 3.98 mg KOH/g, base number, BN = 1.3 mg KOH/g) from a reservoir in the Tarim Field in western China in the presence of various water chemistries. We examined the effect of aqueous ionic solutions (formation brine, 100X diluted formation brine, and softened water), temperature (60, 100 and 140 °C) and pressure (20, 30, 40, and 50 MPa) on the contact angle. We also measured the zeta potential of the oil/water and water/rock interfaces to calculate oil/brine/rock disjoining pressures. A surface complexation model was developed to interpret contact angle measurements and compared with DLVO theory predictions. Contact angles were greatest in formation water, followed by the softened water, and low salinity water at the same pressure and temperature. Contact angles increased slightly with temperature, whereas pressure had little effect. DLVO and surface complexation modelling predicted similar wettability trends and allow reasonably accurate interpretation of core-flood results. Water chemistry has a much larger impact on LSWF than reservoir temperature and pressure. As a result, low salinity water flooding should work in high temperature and high pressure kaolinite-bearing sandstone reservoirs.
North Atlantic salinity as a predictor of Sahel rainfall.
Li, Laifang; Schmitt, Raymond W; Ummenhofer, Caroline C; Karnauskas, Kristopher B
2016-05-01
Water evaporating from the ocean sustains precipitation on land. This ocean-to-land moisture transport leaves an imprint on sea surface salinity (SSS). Thus, the question arises of whether variations in SSS can provide insight into terrestrial precipitation. This study provides evidence that springtime SSS in the subtropical North Atlantic ocean can be used as a predictor of terrestrial precipitation during the subsequent summer monsoon in Africa. Specifically, increased springtime SSS in the central to eastern subtropical North Atlantic tends to be followed by above-normal monsoon-season precipitation in the African Sahel. In the spring, high SSS is associated with enhanced moisture flux divergence from the subtropical oceans, which converges over the African Sahel and helps to elevate local soil moisture content. From spring to the summer monsoon season, the initial water cycling signal is preserved, amplified, and manifested in excessive precipitation. According to our analysis of currently available soil moisture data sets, this 3-month delay is attributable to a positive coupling between soil moisture, moisture flux convergence, and precipitation in the Sahel. Because of the physical connection between salinity, ocean-to-land moisture transport, and local soil moisture feedback, seasonal forecasts of Sahel precipitation can be improved by incorporating SSS into prediction models. Thus, expanded monitoring of ocean salinity should contribute to more skillful predictions of precipitation in vulnerable subtropical regions, such as the Sahel.
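A seasonal predictor of this kind reduces, in its simplest form, to a lagged linear regression of monsoon rainfall on springtime SSS. The sketch below fits one on synthetic standardized anomalies with a built-in positive link; real skill assessment would of course use observed fields and cross-validation:

```python
import numpy as np

# Schematic lagged regression: synthetic "spring SSS" predicting synthetic
# "monsoon rainfall" anomalies. Data and the 0.6 link are fabricated for
# illustration, not drawn from the study.

rng = np.random.default_rng(42)
n = 60                                           # 60 synthetic years
sss = rng.standard_normal(n)                     # spring SSS anomaly
rain = 0.6 * sss + 0.8 * rng.standard_normal(n)  # rainfall with positive link

X = np.column_stack([np.ones(n), sss])           # intercept + predictor
coef, *_ = np.linalg.lstsq(X, rain, rcond=None)
pred = X @ coef
print("slope:", round(float(coef[1]), 2),
      "corr:", round(float(np.corrcoef(pred, rain)[0, 1]), 2))
```

The study's argument is that the slope of such a regression is physically grounded: high spring SSS marks enhanced ocean-to-land moisture export, which preloads Sahel soil moisture and is amplified by local feedback into the monsoon season.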
Competitive ability, stress tolerance and plant interactions along stress gradients.
Qi, Man; Sun, Tao; Xue, SuFeng; Yang, Wei; Shao, DongDong; Martínez-López, Javier
2018-04-01
Exceptions to the generality of the stress-gradient hypothesis (SGH) may be reconciled by considering species-specific traits and stress tolerance strategies. Studies have tested stress tolerance and competitive ability in mediating interaction outcomes, but few have incorporated this to predict how species interactions shift between competition and facilitation along stress gradients. We used field surveys, salt tolerance and competition experiments to develop a predictive model of interspecific interaction shifts across salinity stress gradients. Field survey and greenhouse tolerance tests revealed tradeoffs between stress tolerance and competitive ability. Modeling showed that along salinity gradients, (1) plant interactions shifted from competition to facilitation at high salinities within the physiological limits of salt-intolerant plants, (2) facilitation collapsed when salinity stress exceeded the physiological tolerance of salt-intolerant plants, and (3) neighbor removal experiments overestimate interspecific facilitation by including intraspecific effects. A community-level field experiment suggested that (1) species interactions are competitive in benign conditions and facilitative in harsh ones, but fuzzy under medium environmental stress due to niche differences of species and weak stress amelioration, and (2) the SGH works on strong but not weak stress gradients, so SGH confusion arises when it is applied across questionable stress gradients. Our study clarifies how species interactions vary along stress gradients. Moving forward, focusing on SGH applications rather than exceptions on weak or nonexistent gradients would be most productive. © 2018 by the Ecological Society of America.
Variational data assimilative modeling of the Gulf of Maine in spring and summer 2010
NASA Astrophysics Data System (ADS)
Li, Yizhen; He, Ruoying; Chen, Ke; McGillicuddy, Dennis J.
2015-05-01
A data assimilative ocean circulation model is used to hindcast the Gulf of Maine (GOM) circulation in spring and summer 2010. Using the recently developed incremental strong constraint 4D Variational data assimilation algorithm, the model assimilates satellite sea surface temperature and in situ temperature and salinity profiles measured by expendable bathythermograph, Argo floats, and shipboard CTD casts. Validation against independent observations shows that the model skill is significantly improved after data assimilation. The data-assimilative model hindcast reproduces the temporal and spatial evolution of the ocean state, showing that a sea level depression southwest of the Scotian Shelf played a critical role in shaping the gulf-wide circulation. Heat budget analysis further demonstrates that both advection and surface heat flux contribute to temperature variability. The estimated time scale for coastal water to travel from the Scotian Shelf to the Jordan Basin is around 60 days, which is consistent with previous estimates based on in situ observations. Our study highlights the importance of resolving upstream and offshore forcing conditions in predicting the coastal circulation in the GOM.
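Dropping the time dimension of 4D-Var, a single variational analysis step has the closed form x_a = x_b + BHᵀ(HBHᵀ + R)⁻¹(y − Hx_b), blending a background state with observations according to their error covariances. A two-variable toy with assumed diagonal covariances (not the circulation model's actual state or statistics):

```python
import numpy as np

# Toy 3D-Var analysis step: observe temperature only, leave salinity to the
# background. The state, covariances and observation are all illustrative.

x_b = np.array([12.0, 34.0])     # background: T (°C), S (psu)
B = np.diag([1.0, 0.25])         # background error covariance
H = np.array([[1.0, 0.0]])       # observation operator: measure T only
y = np.array([13.5])             # observed SST
R = np.diag([0.5])               # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
x_a = x_b + K @ (y - H @ x_b)                   # analysis update
print(x_a)   # → [13. 34.]: T pulled toward the observation, S unchanged
```

With cross-covariances in B (as a real system estimates), an SST observation would also correct salinity; 4D-Var additionally carries the increment through the model dynamics over an assimilation window.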
Protein Structure Prediction with Evolutionary Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, W.E.; Krasnogor, N.; Pelta, D.A.
1999-02-08
Evolutionary algorithms have been successfully applied to a variety of molecular structure prediction problems. In this paper we reconsider the design of genetic algorithms that have been applied to a simple protein structure prediction problem. Our analysis considers the impact of several algorithmic factors for this problem: the conformational representation, the energy formulation and the way in which infeasible conformations are penalized. Further, we empirically evaluated the impact of these factors on a small set of polymer sequences. Our analysis leads to specific recommendations for both GAs and other heuristic methods for solving PSP on the HP model.
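The design factors analyzed above, a conformational encoding and a penalty for infeasible conformations, can be made concrete with a small GA on the 2-D HP lattice model: relative turn moves encode the fold, energy rewards non-consecutive H-H lattice contacts, and self-overlaps are penalized rather than forbidden. The sequence, penalty weight and GA settings below are arbitrary examples, not the paper's experimental setup:

```python
import random

# Minimal GA for the 2-D HP lattice model with a relative-move encoding
# and an overlap penalty; all settings are illustrative.

SEQ = "HPHPPHHPHPPHPHHPPHPH"   # H = hydrophobic, P = polar
MOVES = "LSR"                  # turn left / go straight / turn right
DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]

def fold(moves):
    """Lattice coordinates of each residue for a relative-move string."""
    pos, d, coords = (0, 0), 0, [(0, 0)]
    for m in moves:
        d = (d + {"L": 1, "S": 0, "R": -1}[m]) % 4
        pos = (pos[0] + DIRS[d][0], pos[1] + DIRS[d][1])
        coords.append(pos)
    return coords

def fitness(moves, penalty=4.0):
    coords = fold(moves)
    overlaps = len(coords) - len(set(coords))    # infeasibility measure
    contacts = 0
    for i in range(len(SEQ)):
        for j in range(i + 2, len(SEQ)):         # non-consecutive pairs only
            if SEQ[i] == SEQ[j] == "H":
                dx = abs(coords[i][0] - coords[j][0])
                dy = abs(coords[i][1] - coords[j][1])
                contacts += (dx + dy == 1)       # lattice neighbours
    return contacts - penalty * overlaps         # higher is better

def evolve(pop_size=50, gens=60):
    n = len(SEQ) - 1
    pop = ["".join(random.choice(MOVES) for _ in range(n))
           for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = (max(random.sample(pop, 3), key=fitness)  # tournaments
                    for _ in range(2))
            cut = random.randrange(1, n)                     # one-point crossover
            child = list(a[:cut] + b[cut:])
            child[random.randrange(n)] = random.choice(MOVES)  # point mutation
            nxt.append("".join(child))
        pop = nxt
    return max(pop, key=fitness)

random.seed(1)
best = evolve()
print(fitness(best))
```

The penalty weight is exactly the kind of knob the paper's analysis is about: too small and the GA exploits overlapping "folds", too large and the search cannot cross infeasible regions of the conformation space.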
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
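The core of such an iterative tare estimate is a fixed-point solve of the non-linear calibration for the load that explains the natural-zero gage reading. A single-gage toy with a made-up quadratic calibration, purely to show the iteration's flavor, not the report's six-component balance math:

```python
# Fixed-point sketch of non-linear tare-load estimation from an absolute
# gage reading. The calibration C1, C2 is fabricated for illustration.

C1, C2 = 2.0, 0.05            # linear and quadratic calibration coefficients

def response(load):
    """Gage output (e.g. mV/V) for a given applied load."""
    return C1 * load + C2 * load ** 2

def tare_from_reading(r0, tol=1e-10, max_iter=100):
    """Solve C1*F + C2*F**2 = r0 for the tare load F by fixed-point
    iteration: F <- (r0 - C2*F**2) / C1, starting from the linear guess."""
    f = r0 / C1
    for _ in range(max_iter):
        f_new = (r0 - C2 * f ** 2) / C1
        if abs(f_new - f) < tol:
            return f_new
        f = f_new
    return f

true_tare = 3.0
est = tare_from_reading(response(true_tare))
print(round(est, 6))          # → 3.0
```

Convergence here relies on the quadratic term being a small correction to the linear one; the report's point is precisely that when such iterations stall, an exact solve of the intermediate non-linear estimate restores convergence.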
Salinity influences on aboveground and belowground net primary productivity in tidal wetlands
Pierfelice, Kathryn N.; Graeme Lockaby, B.; Krauss, Ken W.; Conner, William H.; Noe, Gregory; Ricker, Matthew C.
2017-01-01
Tidal freshwater wetlands are among the ecosystems most vulnerable to climate change and rising sea levels. However, salinification within these systems is poorly understood; therefore, productivity (litterfall, woody biomass, and fine roots) was investigated in three forested tidal wetlands [(1) freshwater, (2) moderately saline, and (3) heavily salt-impacted] and a marsh along the Waccamaw River and Turkey Creek in South Carolina. Mean aboveground (litterfall and woody biomass) production on the freshwater, moderately saline, heavily salt-impacted, and marsh sites, respectively, was 1,061, 492, 79, and 0 g m−2 year−1, versus belowground (fine roots) production of 860, 490, 620, and 2,128 g m−2 year−1. Litterfall and woody biomass displayed an inverse relationship with salinity. Shifts in productivity across saline sites are of concern because sea level is predicted to continue rising. Results from the research reported in this paper provide baseline data upon which coupled hydrologic/wetland models can be created to quantify future changes in tidal forest functions.
Ro, Hee-Myong; Kim, Pan-Gun; Park, Ji-Suk; Yun, Seok-In; Han, Junho
2018-04-01
Constructed coastal marsh regulates land-borne nitrogen (N) loadings through salinity-dependent microbial N transformation processes. The hypothesis that salinity predominantly controls N removal in the marsh was tested through incubation in a closed system with added ¹⁵NH₄⁺, using sediments collected from five sub-marshes in Shihwa marsh, Korea. Time-course patterns of concentrations and ¹⁵N atom% of soil-N pools were analyzed. Sediments having higher salinity and lower soil organic-C and acid-extractable organic-N exhibited slower rates of N mineralization and immobilization, nitrification, and denitrification. Rates of denitrification were predicted well not by sediment salinity but by its organic-C, indicating heterotrophic denitrification. Denitrification dominated N-loss from this marsh, and the nitrogen removal capacity of this marsh was estimated at 337 kg N day⁻¹ (9.9% of the daily N-loadings) considering the current rooting depth of common reeds (1.0 m). We showed that sediment N removal decreases with increasing salinity and can increase with increasing organic-C for heterotrophic denitrification. Copyright © 2018 Elsevier Ltd. All rights reserved.
Prediction Of The Expected Safety Performance Of Rural Two-Lane Highways
DOT National Transportation Integrated Search
2000-12-01
This report presents an algorithm for predicting the safety performance of a rural two-lane highway. The accident prediction algorithm consists of base models and accident modification factors for both roadway segments and at-grade intersections on r...
Holocene oscillations in temperature and salinity of the surface subpolar North Atlantic.
Thornalley, David J R; Elderfield, Harry; McCave, I Nick
2009-02-05
The Atlantic meridional overturning circulation (AMOC) transports warm salty surface waters to high latitudes, where they cool, sink and return southwards at depth. Through its attendant meridional heat transport, the AMOC helps maintain a warm northwestern European climate, and acts as a control on the global climate. Past climate fluctuations during the Holocene epoch (approximately 11,700 years ago to the present) have been linked with changes in North Atlantic Ocean circulation. The behaviour of the surface-flowing salty water that helped drive overturning during past climatic changes is, however, not well known. Here we investigate the temperature and salinity changes of a substantial surface inflow to a region of deep-water formation throughout the Holocene. We find that the inflow has undergone millennial-scale variations in temperature and salinity (approximately 3.5 degrees C and approximately 1.5 practical salinity units, respectively), most probably controlled by subpolar gyre dynamics. The temperature and salinity variations correlate with previously reported periods of rapid climate change. The inflow becomes more saline during enhanced freshwater flux to the subpolar North Atlantic. Model studies predict a weakening of AMOC in response to enhanced Arctic freshwater fluxes, although the inflow can compensate on decadal timescales by becoming more saline. Our data suggest that such a negative feedback mechanism may have operated during past intervals of climate change.
NASA Astrophysics Data System (ADS)
Mirck, Jaconette; Schroeder, William
2018-01-01
The change from deep-rooted grass and shrub vegetation to annual-cropping dryland farming has contributed to serious soil salinization challenges on the semi-arid North American Great Plains. In some cases, cultivation of the Great Plains has increased the availability of water, causing dominant sulfate salts to travel from the uphill areas to depressions, where they surface as water evaporates at the soil surface. A potential solution could include the replanting of the native deep-rooted vegetation, which requires knowledge of the spatial distribution of soil salinity. This study tested the soil factors influencing electromagnetic-induction meter (EM38) readings of soil salinity distribution around wetlands. The objectives were to: (1) predict growth and survival of Salix dasyclados Wimm. (cv. 'India') along a salinity gradient in a small wetland, and (2) investigate whether newly established willows affected water-table fluctuations, which would indicate their phreatophytic nature or their ability to obtain their water supply from the zone of saturation. Results indicated significantly lower salinity values for sampling points with EM38 readings above 175 and 250 mS m-1 for height and survival, respectively. In addition, diurnal fluxes of the water table in areas of good willow growth and lower salinity indicated that cultivar 'India' was phreatophytic in these areas and therefore has great potential for being used to combat saline seeps.
NASA Astrophysics Data System (ADS)
Askri, Brahim; Ahmed, Abdelkader T.; Abichou, Tarek; Bouhlila, Rachida
2014-05-01
In southern Tunisia oases, waterlogging, salinity, and water shortage represent serious threats to the sustainability of irrigated agriculture. Understanding the interaction between these problems and their effects on root water uptake is fundamental for suggesting possible options of improving land and water productivity. In this study, HYDRUS-1D model was used in a plot of farmland located in the Fatnassa oasis to investigate the effects of waterlogging, salinity, and water shortage on the date palm water use. The model was calibrated and validated using experimental data of sap flow density of a date palm, soil hydraulic properties, water table depth, and amount of irrigation water. The comparison between predicted and observed data for date palm transpiration rates was acceptable indicating that the model could well estimate water consumption of this tree crop. Scenario simulations were performed with different water table depths, and salinities and frequencies of irrigation water. The results show that the impacts of water table depth and irrigation frequency vary according to the season. In summer, high irrigation frequency and shallow groundwater are needed to maintain high water content and low salinity of the root-zone and therefore to increase the date palm transpiration rates. However, these factors have no significant effect in winter. The results also reveal that irrigation water salinity has no significant effect under shallow saline groundwater.
Kumar, A. Biju; Schofield, Pam; Raj, Smrithy; Satheesh, Sima
2018-01-01
Loricariid catfishes of the genus Pterygoplichthys are native to South America and have been introduced in many localities around the world. They are freshwater fishes, but may also use low-salinity habitats such as estuaries for feeding or dispersal. Here we report results of a field survey and salinity-tolerance experiments for a population of Pterygoplichthys sp. collected in Kerala, India. In both chronic and acute salinity-tolerance trials, fish were able to withstand salinities up to 12 ppt with no mortality; however, fish transferred to salinities > 12 ppt did not survive. The experimental results provide evidence that nonnative Pterygoplichthys sp. are able to tolerate mesohaline conditions for extended periods, and can easily invade the brackish water ecosystems of the state. Further, Pterygoplichthys sp. from Kerala have greater salinity tolerance than other congeners. These data are vital to predicting the invasion of non-native fishes such as Pterygoplichthys spp. into coastal systems in Kerala and worldwide. This is particularly important as estuarine ecosystems are under threat of global climate change and sea-level rise. In light of the results of the present study and considering the reports of negative impacts of the species in invaded water bodies, management authorities may consider controlling populations and/or instituting awareness programmes to prevent the spread of this nuisance aquatic invasive species in Kerala.
Lü, Si-Dan; Chen, Wei-Ping; Wang, Mei-E
2012-12-01
As the conflict between water supply and demand intensifies, wastewater reuse has become an important measure for relieving the water shortage in Beijing. To promote safe irrigation with reclaimed water and prevent soil salinisation, the dynamic transport of salts in urban soils of Beijing under irrigation with reclaimed water was simulated with the ENVIRO-GRO model, and the accumulation trends of soil salinity were predicted. The effects of different irrigation practices on soil water-salt movement and salt accumulation were also investigated. Results indicated that annual averages of soil salinity (EC(e)) increased 29.5%, 97.2%, and 197.8% under equilibrium conditions with high, normal, and low irrigation amounts, respectively. Irrigation frequency had little effect on soil water-salt movement, and soil salt accumulation trended downward under low-frequency irrigation. Under equilibrium conditions, annual averages of EC(e) increased 23.7%, 97.2%, and 208.5% with irrigation water salinities (EC(w)) of 0.6, 1.2, and 2.4 dS m(-1), respectively. Soil salinity increased only slightly at EC(w) = 0.6 dS m(-1), and soil salinization did not appear. Overall, the growth of bluegrass was not affected by soil salinity under equilibrium conditions with regular irrigation in Beijing, although mild soil salinization appeared.
Effects of hydrologic connectivity on aquatic macroinvertebrate assemblages in different marsh types
Kang, Sung-Ryong; King, Sammy L.
2013-01-01
Hydrologic connectivity can be an important driver of aquatic macroinvertebrate assemblages. Its effects on aquatic macroinvertebrate assemblages in coastal marshes, however, are relatively poorly studied. We evaluated the effects of lateral hydrologic connectivity (permanently connected ponds: PCPs; temporary connected ponds: TCPs), and other environmental variables on aquatic macroinvertebrate assemblages and functional feeding groups (FFGs) in freshwater, brackish, and saline marshes in Louisiana, USA. We hypothesized that (1) aquatic macroinvertebrate assemblages in PCPs would have higher assemblage metric values (density, biomass, Shannon-Wiener diversity) than TCPs and (2) the density and proportional abundance of certain FFGs (i.e. scrapers, shredders, and collectors) would be greater in freshwater marsh than brackish and saline marshes. The data in our study only partially supported our first hypothesis: while freshwater marsh PCPs had higher density and biomass than TCPs, assemblage metric values in saline TCPs were greater than saline PCPs. In freshwater TCPs, long duration of isolation limited access of macroinvertebrates from adjacent water bodies, which may have reduced assemblage metric values. However, the relatively short duration of isolation in saline TCPs provided more stable or similar habitat conditions, facilitating higher assemblage metric values. As predicted by our second hypothesis, freshwater PCPs and TCPs supported a greater density of scrapers, shredders, and collectors than brackish and saline ponds. Aquatic macroinvertebrate assemblages seem to be structured by individual taxa responses to salinity as well as pond habitat attributes.
Electric Power Engineering Cost Predicting Model Based on the PCA-GA-BP
NASA Astrophysics Data System (ADS)
Wen, Lei; Yu, Jiake; Zhao, Xin
2017-10-01
In this paper, a hybrid prediction algorithm, the PCA-GA-BP model, is proposed. PCA is applied to reduce the correlation between the indicators of the original data and to lower the difficulty of the BP neural network's high-dimensional calculation. The BP neural network is then established to estimate the cost of power transmission projects. The results show that the PCA-GA-BP algorithm can improve the prediction of electric power engineering cost.
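The abstract gives no implementation details; as a rough illustration of the PCA-then-BP pipeline, the sketch below runs on synthetic data (the indicator data, dimensions, and all parameter values are hypothetical), and it replaces the GA weight-optimization stage with plain random initialization:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for project cost data: 6 correlated cost indicators
# derived from 2 latent factors, plus a cost target (all hypothetical).
latent = rng.normal(size=(200, 2))
X = np.hstack([latent, latent @ rng.normal(size=(2, 4))])
y = 0.5 * latent[:, 0] - 0.3 * latent[:, 1]

# PCA: decorrelate the indicators and reduce dimensionality (6 -> 2).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T
Z /= Z.std(axis=0)  # standardize the component scores

# BP network: one hidden layer trained by plain batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = 0.0
lr, n = 0.05, len(y)
for _ in range(3000):
    H = np.tanh(Z @ W1 + b1)                    # forward pass
    err = (H @ W2).ravel() + b2 - y             # prediction error
    dH = (err[:, None] @ W2.T) * (1.0 - H**2)   # backprop through tanh
    W2 -= lr * (H.T @ err[:, None]) / n; b2 -= lr * err.mean()
    W1 -= lr * (Z.T @ dH) / n;          b1 -= lr * dH.mean(axis=0)

pred = (np.tanh(Z @ W1 + b1) @ W2).ravel() + b2
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(Z.shape, rmse < float(np.std(y)))
```

In the paper's full scheme a genetic algorithm would search for good initial network weights before backpropagation fine-tunes them; the sketch omits that stage for brevity.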
Elnaghy, A M; Elsaka, S E
2017-10-01
To compare the cyclic fatigue resistance of WaveOne Gold (Dentsply Tulsa Dental Specialties, Tulsa, OK, USA) and Reciproc (VDW, Munich, Germany) reciprocating instruments during immersion in sodium hypochlorite (NaOCl) and saline solutions at body temperature. A total of 180 new WaveOne Gold primary size 25, .07 taper, and Reciproc size 25, .08 taper instruments were randomly divided into three groups: group 1, no immersion (control, air); group 2, immersion in saline at 37 ± 1 °C; and group 3, immersion in 5% NaOCl at 37 ± 1 °C. The instruments were reciprocated in the test solution until fracture, and the number of cycles to failure was recorded. The data were analysed statistically using t-tests and one-way analysis of variance (ANOVA), with the significance level set at P < 0.05. A Weibull analysis was performed on the number-of-cycles-to-failure data. WaveOne Gold instruments had a significantly greater number of cycles to failure than Reciproc instruments in all groups (P < 0.001). Fatigue resistance for both instruments tested in air was significantly higher than in saline and NaOCl solutions (P < 0.001). For both instruments, there was no significant difference in fatigue resistance between saline and NaOCl solutions (P > 0.05). The Weibull analysis showed that the predicted number of cycles for WaveOne Gold in air was 1027 for 99% survival, whereas Reciproc instruments tested in NaOCl solution had the lowest predicted number of cycles (613) among the groups. Immersion of WaveOne Gold and Reciproc reciprocating instruments in saline and NaOCl solutions considerably decreased their cyclic fatigue resistance. The fatigue resistance of WaveOne Gold instruments was higher than that of Reciproc instruments. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
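The Weibull survival step can be illustrated generically: with shape beta and scale eta fitted to cycles-to-failure data, the predicted number of cycles at a given survival probability follows from inverting S(N) = exp(-(N/eta)^beta). A minimal sketch with made-up parameters (not the study's fitted values):

```python
import math

def cycles_at_survival(beta, eta, survival):
    # Invert the Weibull survival function S(N) = exp(-(N/eta)**beta)
    # to get the cycle count N at which the given survival fraction holds.
    return eta * (-math.log(survival)) ** (1.0 / beta)

# Hypothetical fitted parameters, for illustration only.
beta, eta = 4.0, 3000.0
n99 = cycles_at_survival(beta, eta, 0.99)

# Round-trip check: survival evaluated at n99 should be 99%.
s = math.exp(-(n99 / eta) ** beta)
print(round(n99), round(s, 2))
```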
Estuarine biodiversity as an indicator of groundwater discharge
NASA Astrophysics Data System (ADS)
Silva, A. C. F.; Tavares, P.; Shapouri, M.; Stigter, T. Y.; Monteiro, J. P.; Machado, M.; Cancela da Fonseca, L.; Ribeiro, L.
2012-01-01
Communities located in the interface between marine/brackish and freshwater habitats are likely to be early responders to climatic changes as they are exposed to both saline and freshwater conditions, and thus are expected to be sensitive to any change in their environmental conditions. Climatic effects are predicted to reduce the availability of groundwater, altering the hydrological balance on estuarine-aquifer interfaces. Here, we aimed to characterise the estuarine faunal community along a gradient dependent on groundwater input, under a predicted climatic scenario of reduction in groundwater discharge into the estuary. Sediment macrofauna was sampled along a salinity gradient following both the wet and dry seasons in 2009. Results indicated that species abundance varied significantly with the salinity gradient created by the groundwater discharge into the estuarine habitat and with sampling time. The isopod Cyathura carinata (Krøyer, 1847) and the polychaetes Heteromastus filiformis (Claparède, 1864) and Hediste diversicolor O.F. Muller, 1776 were associated with the more saline locations, while oligochaeta and Spionidae were more abundant in areas of lower salinity. The polychaete Alkmaria romijni Horst, 1919 was the dominant species and ubiquitous throughout sampling stations. This study provides evidence that estuarine fauna can be considered a potentially valuable indicator of variation in the input of groundwater into marine-freshwater interface habitats, expected from climatic pressures on aquifer levels, condition and recharge rates. For instance, a reduction in the abundance of some polychaete species that were more abundant in freshwater conditions, together with an increase in Oligochaeta, which were found here at higher salinities, can potentially serve as early warnings of a reduction in the input of groundwater into estuaries.
Estuarine benthic species are often the main prey of commercially important fish predators, as in our case study, making it important to monitor aquatic habitat interfaces with consideration of both the estuarine macrobenthos and groundwater availability in the system.
NASA Astrophysics Data System (ADS)
Fogg, G. E.
2016-12-01
Hydrologists often compartmentalize subsurface fluid systems into soil, vadose zone, and groundwater even though such entities are all part of a dynamic continuum. Similarly, hydrogeologists mainly study the fresh groundwater that is essential to water resources upon which humans and ecosystems depend. While vast amounts of these fresh groundwater resources are in sedimentary basins, many of those basins contain vast amounts of saline groundwater and petroleum underneath the freshwater. Contrary to popular assumptions in the hydrogeology and petroleum communities, the saline groundwater and petroleum resources are not stagnant, but migrate in response to Tothian, topographically driven flow as well as other driving forces controlled by thermal, density and geomechanical processes. Importantly, the transition between fresh and saline groundwater does not necessarily represent a boundary between deep, stagnant groundwater and shallower, circulating groundwater. The deep groundwater is part of the subsurface fluid continuum, and exploitation of saline aquifer systems for conventional and unconventional (e.g., fracking) petroleum production or for injection of waste fluids should be done with some knowledge of the integrated fresh and saline water hydrogeologic system. Without sufficient knowledge of the deep and shallow hydrogeology, there will be significant uncertainty about the possible impacts of injection and petroleum extraction activities on overlying fresh groundwater quality and quantity. When significant uncertainty like this exists in science, public and scientific perceptions of consequences swing wildly from one extreme to another. Accordingly, professional and lay opinions on fracking range from predictions of doom to predictions of zero impact. This wide range of opinions stems directly from the scientific uncertainty about hydrogeologic interactions between shallow and deep hydrogeologic systems.
To responsibly manage both the fresh and saline, petroliferous groundwater resources, a new era of whole-system characterization is needed that integrates deep and shallow geologic and hydrogeologic models and data, including aquifer-aquitard frameworks, head and pressure in space and time, and hydrogeochemistry.
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Stoughton, John W.; Mielke, Roland R.; Som, Sukhamony
1990-01-01
The performance modeling and enhancement for periodic execution of large-grain, decision-free algorithms in data flow architectures is examined. Applications include real-time implementation of control and signal processing algorithms where performance is required to be highly predictable. The mapping of algorithms onto the specified class of data flow architectures is realized by a marked graph model called ATAMM (Algorithm To Architecture Mapping Model). Performance measures and bounds are established. Algorithm transformation techniques are identified for performance enhancement and reduction of resource (computing element) requirements. A systematic design procedure is described for generating operating conditions for predictable performance both with and without resource constraints. An ATAMM simulator is used to test and validate the performance prediction by the design procedure. Experiments on a three resource testbed provide verification of the ATAMM model and the design procedure.
Strategies for concurrent processing of complex algorithms in data driven architectures
NASA Technical Reports Server (NTRS)
Som, Sukhamoy; Stoughton, John W.; Mielke, Roland R.
1990-01-01
Performance modeling and performance enhancement for periodic execution of large-grain, decision-free algorithms in data flow architectures are discussed. Applications include real-time implementation of control and signal processing algorithms where performance is required to be highly predictable. The mapping of algorithms onto the specified class of data flow architectures is realized by a marked graph model called algorithm to architecture mapping model (ATAMM). Performance measures and bounds are established. Algorithm transformation techniques are identified for performance enhancement and reduction of resource (computing element) requirements. A systematic design procedure is described for generating operating conditions for predictable performance both with and without resource constraints. An ATAMM simulator is used to test and validate the performance prediction by the design procedure. Experiments on a three resource testbed provide verification of the ATAMM model and the design procedure.
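Both abstracts refer to performance bounds for marked-graph models of algorithms. The classical bound for a timed marked graph — the steady-state period is at least the maximum over directed cycles of (total node delay)/(tokens on the cycle) — can be sketched as follows; the graph, delays, and token counts are a toy example, not taken from the ATAMM papers:

```python
def cycle_time_bound(delays, tokens):
    # delays: node -> compute time; tokens: (u, v) edge -> initial token count.
    # Classical timed-marked-graph bound: steady-state period >= max over
    # directed cycles of (sum of node delays) / (sum of tokens on the cycle).
    adj = {}
    for (u, v) in tokens:
        adj.setdefault(u, []).append(v)
    best = 0.0

    def dfs(start, node, path, seen):
        nonlocal best
        for nxt in adj.get(node, []):
            if nxt == start:                       # closed a simple cycle
                cyc = path + [node]
                d = sum(delays[n] for n in cyc)
                t = sum(tokens[(cyc[i], cyc[(i + 1) % len(cyc)])]
                        for i in range(len(cyc)))
                if t:
                    best = max(best, d / t)
            elif nxt > start and nxt not in seen:  # enumerate each cycle once
                dfs(start, nxt, path + [node], seen | {nxt})

    for s in delays:
        dfs(s, s, [], {s})
    return best

# Toy graph: tasks 0, 1, 2 with delays 2, 3, 4 time units, single-token loops.
bound = cycle_time_bound({0: 2, 1: 3, 2: 4},
                         {(0, 1): 1, (1, 0): 1, (1, 2): 1, (2, 1): 1})
print(bound)  # limited by the 1<->2 loop: (3 + 4) / 2
```

Adding tokens (i.e. buffering) to the binding cycle, or reassigning its delays, is the kind of transformation the papers describe for lowering this bound.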
NASA Astrophysics Data System (ADS)
Vance, Steve; Bouffard, Mathieu; Choukroun, Mathieu; Sotin, Christophe
2014-06-01
The large icy moons of Jupiter contain vast quantities of liquid water, a key ingredient for life. Ganymede and Callisto are weaker candidates for habitability than Europa, in part because of the model-based assumption that high-pressure ice layers cover their seafloors and prevent significant water-rock interaction. Water-rock interactions may occur, however, if heating at the rock-ice interface melts the high pressure ice. Highly saline fluids would be gravitationally stable, and might accumulate under the ice due to upward migration, refreezing, and fractionation of salt from less concentrated liquids. To assess the influence of salinity on Ganymede's internal structure, we use available phase-equilibrium data to calculate activity coefficients and predict the freezing of water ice in the presence of aqueous magnesium sulfate. We couple this new equation of state with thermal profiles in Ganymede's interior-employing recently published thermodynamic data for the aqueous phase-to estimate the thicknesses of layers of ice I, III, V, and VI. We compute core and silicate mantle radii consistent with available constraints on Ganymede's mass and gravitational moment of inertia. Mantle radii range from 800 to 900 km for the values of salt and heat flux considered here (4-44 mW m-2 and 0 to 10 wt% MgSO4). Ocean concentrations with salinity higher than 10 wt% have little high pressure ice. Even in a Ganymede ocean that is mostly liquid, achieving such high ocean salinity is permissible for the range of likely S/Si ratios. However, elevated salinity requires a smaller silicate mantle radius to satisfy mass and moment-of-inertia constraints, so ice VI is always present in Ganymede's ocean. For lower values of heat flux, oceans with salinity as low as 3 wt% can co-exist with ice III. Available experimental data indicate that ice phases III and VI become buoyant for salinity higher than 5 wt% and 10 wt%, respectively. 
Similar behavior probably occurs for ice V at salinities higher than 10 wt%. Flotation can occur over tens of kilometers of depth, indicating the possibility for upward ‘snow’ or other exotic modes of heat and material transport. We assess Ganymede's interior structure for oceans with magnesium sulfate. New activity models predict freezing of ice in magnesium sulfate solutions. High ocean salinities are permitted by constraints on Ganymede's sulfur content. Stability under high pressure ice implies water rock contact and layered oceans. Upward ‘snow’ of high-pressure ices occurs in the lower depths of salty oceans.
Agha, Mickey; Ennen, Joshua R; Bower, Deborah S; Nowakowski, A Justin; Sweat, Sarah C; Todd, Brian D
2018-03-25
The projected rise in global mean sea levels places many freshwater turtle species at risk of saltwater intrusion into freshwater habitats. Freshwater turtles are disproportionately more threatened than other taxa; thus, understanding the role of salinity in determining their contemporary distribution and evolution should be a research priority. Freshwater turtles are a slowly evolving lineage; however, they can adapt physiologically or behaviourally to various levels of salinity and, therefore, temporarily occur in marine or brackish environments. Here, we provide the first comprehensive global review on freshwater turtle use and tolerance of brackish water ecosystems. We link together current knowledge of geographic occurrence, salinity tolerance, phylogenetic relationships, and physiological and behavioural mechanisms to generate a baseline understanding of the response of freshwater turtles to changing saline environments. We also review the potential origins of salinity tolerance in freshwater turtles. Finally, we integrate 2100 sea level rise (SLR) projections, species distribution maps, literature gathered on brackish water use, and a phylogeny to predict the exposure of freshwater turtles to projected SLR globally. From our synthesis of published literature and available data, we build a framework for spatial and phylogenetic conservation prioritization of coastal freshwater turtles. Based on our literature review, 70 species (∼30% of coastal freshwater turtle species) from 10 of the 11 freshwater turtle families have been reported in brackish water ecosystems. Most anecdotal records, observations, and descriptions do not imply long-term salinity tolerance among freshwater turtles. Rather, experiments show that some species exhibit potential for adaptation and plasticity in physiological, behavioural, and life-history traits that enable them to endure varying periods (e.g. days or months) and levels of saltwater exposure. 
Species that specialize on brackish water habitats are likely to be vulnerable to SLR because of their exclusive coastal distributions and adaptations to a narrow range of salinities. Most species, however, have not been documented in brackish water habitats but may also be highly vulnerable to projected SLR. Our analysis suggests that approximately 90% of coastal freshwater turtle species assessed in our study will be affected by a 1-m increase in global mean SLR by 2100. Most at risk are freshwater turtles found in New Guinea, Southeast Asia, Australia, and North and South America that may lose more than 10% of their present geographic range. In addition, turtle species in the families Chelidae, Emydidae, and Trionychidae may experience the greatest exposure to projected SLR in their present geographic ranges. Better understanding of survival, growth, reproductive and population-level responses to SLR will improve region-specific population viability predictions of freshwater turtles that are increasingly exposed to SLR. Integrating phylogenetic, physiological, and spatial frameworks to assess the effects of projected SLR may improve identification of vulnerable species, guilds, and geographic regions in need of conservation prioritization. We conclude that the use of brackish and marine environments by freshwater turtles provides clues about the evolutionary processes that have prolonged their existence, shaped their unique coastal distributions, and may prove useful in predicting their response to a changing world. © 2018 Cambridge Philosophical Society.
Climate Change, Precipitation and Impacts on an Estuarine Refuge from Disease
Levinton, Jeffrey; Doall, Michael; Ralston, David; Starke, Adam; Allam, Bassem
2011-01-01
Background: Oysters play important roles in estuarine ecosystems but have suffered recently due to overfishing, pollution, and habitat loss. A tradeoff between growth rate and disease prevalence as a function of salinity makes the estuarine salinity transition of special concern for oyster survival and restoration. Estuarine salinity varies with discharge, so increases or decreases in precipitation with climate change may shift regions of low salinity and disease refuge away from optimal oyster bottom habitat, negatively impacting reproduction and survival. Temperature is an additional factor for oyster survival, and recent temperature increases have increased vulnerability to disease in higher salinity regions. Methodology/Principal Findings: We examined growth, reproduction, and survival of oysters in the New York Harbor-Hudson River region, focusing on a low-salinity refuge in the estuary. Observations were made during two years when rainfall was above average, comparable to projected future increases in precipitation in the region, and during a past period of about 15 years with high precipitation. We found a clear tradeoff between oyster growth and vulnerability to disease. Oysters survived well when exposed to intermediate salinities during two summers (2008, 2010) with moderate discharge conditions. However, increased precipitation and discharge in 2009 reduced salinities in the region with suitable benthic habitat, greatly increasing oyster mortality. To evaluate the estuarine conditions over longer periods, we applied a numerical model of the Hudson to simulate salinities over the past century. Model results suggest that much of the region with suitable benthic habitat that historically had been a low salinity refuge region may be vulnerable to higher mortality under projected increases in precipitation and discharge.
Conclusions/Significance: Predicted increases in precipitation in the northeastern United States due to climate change may lower salinities past important thresholds for oyster survival in estuarine regions with appropriate substrate, potentially disrupting metapopulation dynamics and impeding oyster restoration efforts, especially in the Hudson estuary where a large basin constitutes an excellent refuge from disease. PMID:21552552
NASA Astrophysics Data System (ADS)
Stagg, C. L.; Wang, H.; Krauss, K.; Conrads, P. A.; Swarzenski, C.; Duberstein, J. A.; DeAngelis, D.
2017-12-01
There is a growing concern about the adverse effects of salt water intrusion via tidal rivers and creeks into tidal freshwater forested wetlands (TFFWs) due to rising sea levels and reduction of freshwater flow. The distribution and composition of plant species, vegetation productivity, and biogeochemical functions including carbon sequestration capacity and flux rates in TFFWs have been found to be affected by increasing river and soil porewater salinities, with significant shifts occurring at a porewater salinity threshold of 3 PSU. However, the drivers of soil porewater salinity, which impact the health and ecological functions of TFFWs, remain unclear, limiting our ability to predict the future impacts of saltwater intrusion on ecosystem services provided by TFFWs. In this study, we developed a soil porewater salinity model for TFFWs based on an existing salt and water balance model with modifications to several key features, such as the feedback mechanisms of soil salinity on evapotranspiration reduction and hydraulic conductivity. We selected sites along the floodplains of two rivers, the Waccamaw River (SC, USA) and the Savannah River (GA and SC, USA), that represent landscape salinity gradients of both surface water and soil porewater from tidal influence of the Atlantic Ocean. These sites represent healthy, moderately and highly salt-impacted forests, and oligohaline marshes. The soil porewater salinity model was calibrated and validated using field data collected at these sites throughout 2008-2016. The model results agreed well with field measurements. Analyses of the preliminary simulation results indicate that the magnitude, seasonal and annual variability, and duration of threshold salinities (e.g., 3 PSU) tend to vary significantly with vegetation status and type (i.e., healthy, degraded forests, and oligohaline marshes), especially during drought conditions.
The soil porewater salinity model could be coupled with a wetland soil biogeochemistry model to examine the effects of salinity intrusion on carbon cycling processes in dynamic coastal wetlands.
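As a rough illustration of the kind of salt-and-water balance such a model builds on, the toy sketch below treats the root zone as a completely mixed store in which inflow carries salt, evapotranspiration removes only water, and drainage removes salty water. Every parameter value is hypothetical, and the feedbacks described in the abstract (salinity-dependent ET and hydraulic conductivity) are omitted:

```python
# Completely mixed soil-water store; all values are illustrative,
# not the calibrated model of the study.
V = 1.0                  # water storage (m3 per m2 of wetland)
C = 1.0                  # initial porewater salinity (PSU)
q_in, c_in = 0.02, 5.0   # inflow rate (m/day) and inflow salinity (PSU)
et = 0.004               # evapotranspiration (m/day): removes water, not salt
q_out = q_in - et        # drainage chosen so storage stays constant

for day in range(365):
    # Daily salt mass balance (forward Euler): inflow adds salt,
    # drainage removes it at the current concentration.
    C += (q_in * c_in - q_out * C) / V

print(round(C, 2))  # approaches q_in*c_in/q_out = 6.25 PSU, above c_in
```

Because ET removes pure water, the steady-state porewater salinity exceeds the inflow salinity — the concentrating effect the threshold analyses are concerned with.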
Wren, Christopher; Vogel, Melanie; Lord, Stephen; Abrams, Dominic; Bourke, John; Rees, Philip; Rosenthal, Eric
2012-02-01
The aim of this study was to examine the accuracy in predicting pathway location in children with Wolff-Parkinson-White syndrome for each of seven published algorithms. ECGs from 100 consecutive children with Wolff-Parkinson-White syndrome undergoing electrophysiological study were analysed by six investigators using seven published algorithms, six of which had been developed in adult patients. Accuracy and concordance of predictions were adjusted for the number of pathway locations. Accessory pathways were left-sided in 49, septal in 20 and right-sided in 31 children. Overall accuracy of prediction was 30-49% for the exact location and 61-68% including adjacent locations. Concordance between investigators varied between 41% and 86%. No algorithm was better at predicting septal pathways (accuracy 5-35%, improving to 40-78% including adjacent locations), but one was significantly worse. Predictive accuracy was 24-53% for the exact location of right-sided pathways (50-71% including adjacent locations) and 32-55% for the exact location of left-sided pathways (58-73% including adjacent locations). All algorithms were less accurate in our hands than in other authors' own assessment. None performed well in identifying midseptal or right anteroseptal accessory pathway locations.
Watling, James I.; Brandt, Laura A.; Bucklin, David N.; Fujisaki, Ikuko; Mazzotti, Frank J.; Romañach, Stephanie; Speroterra, Carolina
2015-01-01
Species distribution models (SDMs) are widely used in basic and applied ecology, making it important to understand sources and magnitudes of uncertainty in SDM performance and predictions. We analyzed SDM performance and partitioned variance among prediction maps for 15 rare vertebrate species in the southeastern USA using all possible combinations of seven potential sources of uncertainty in SDMs: algorithms, climate datasets, model domain, species presences, variable collinearity, CO2 emissions scenarios, and general circulation models. The choice of modeling algorithm was the greatest source of uncertainty in SDM performance and prediction maps, with some additional variation in performance associated with the comprehensiveness of the species presences used for modeling. Other sources of uncertainty that have received attention in the SDM literature such as variable collinearity and model domain contributed little to differences in SDM performance or predictions in this study. Predictions from different algorithms tended to be more variable at northern range margins for species with more northern distributions, which may complicate conservation planning at the leading edge of species' geographic ranges. The clear message emerging from this work is that researchers should use multiple algorithms for modeling rather than relying on predictions from a single algorithm, invest resources in compiling a comprehensive set of species presences, and explicitly evaluate uncertainty in SDM predictions at leading range margins.
Cuthbertson, Carmen C; Kucharska-Newton, Anna; Faurot, Keturah R; Stürmer, Til; Jonsson Funk, Michele; Palta, Priya; Windham, B Gwen; Thai, Sydney; Lund, Jennifer L
2018-07-01
Frailty is a geriatric syndrome characterized by weakness and weight loss and is associated with adverse health outcomes. It is often an unmeasured confounder in pharmacoepidemiologic and comparative effectiveness studies using administrative claims data. Among the Atherosclerosis Risk in Communities (ARIC) Study Visit 5 participants (2011-2013; n = 3,146), we conducted a validation study to compare a Medicare claims-based algorithm of dependency in activities of daily living (or dependency) developed as a proxy for frailty with a reference standard measure of phenotypic frailty. We applied the algorithm to the ARIC participants' claims data to generate a predicted probability of dependency. Using the claims-based algorithm, we estimated the C-statistic for predicting phenotypic frailty. We further categorized participants by their predicted probability of dependency (<5%, 5% to <20%, and ≥20%) and estimated associations with difficulties in physical abilities, falls, and mortality. The claims-based algorithm showed good discrimination of phenotypic frailty (C-statistic = 0.71; 95% confidence interval [CI] = 0.67, 0.74). Participants classified with a high predicted probability of dependency (≥20%) had higher prevalence of falls and difficulty in physical ability, and a greater risk of 1-year all-cause mortality (hazard ratio = 5.7 [95% CI = 2.5, 13]) than participants classified with a low predicted probability (<5%). Sensitivity and specificity varied across predicted probability of dependency thresholds. The Medicare claims-based algorithm showed good discrimination of phenotypic frailty and high predictive ability with adverse health outcomes. This algorithm can be used in future Medicare claims analyses to reduce confounding by frailty and improve study validity.
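The discrimination measure reported above (C-statistic = 0.71) is the probability that a randomly chosen case receives a higher predicted probability of dependency than a randomly chosen non-case. A minimal, generic computation of this quantity (illustrative only, not the study's code):

```python
def c_statistic(predicted_probs, outcomes):
    """C-statistic (area under the ROC curve): the fraction of
    case/non-case pairs in which the case received the higher
    predicted probability, counting ties as one half."""
    cases = [p for p, o in zip(predicted_probs, outcomes) if o == 1]
    controls = [p for p, o in zip(predicted_probs, outcomes) if o == 0]
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))
```

A value of 0.5 indicates no discrimination and 1.0 indicates perfect separation of frail and non-frail participants, so 0.71 represents good but imperfect discrimination.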
Training the Recurrent neural network by the Fuzzy Min-Max algorithm for fault prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zemouri, Ryad; Racoceanu, Daniel; Zerhouni, Noureddine
2009-03-05
In this paper, we present a training technique for a Recurrent Radial Basis Function (RRBF) neural network for fault prediction. We use the Fuzzy Min-Max technique to initialize the k centers of the RRBF neural network. The k-means algorithm is then applied to calculate the centers that minimize the mean square error of the prediction task. The performance of the k-means algorithm is thus boosted by the Fuzzy Min-Max initialization.
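The center-placement step described here, refining k initial centers so they minimize within-cluster squared error and then using them as Gaussian RBF centers, can be sketched as below. This is a generic illustration: random seeding stands in for the Fuzzy Min-Max initialization, and the recurrent part of the RRBF network is not reproduced.

```python
import numpy as np

def kmeans_centers(X, k, n_iter=50, seed=0):
    """Plain k-means: refine k initial centers to minimize
    within-cluster squared error (used here to place RBF centers)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each sample to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def rbf_design_matrix(X, centers, width=1.0):
    """Gaussian RBF activations for each sample/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))
```

The design matrix produced by `rbf_design_matrix` would then feed the linear output layer of the network; the widths and the recurrent connections are tuning choices outside this sketch.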
Methods for predicting properties and tailoring salt solutions for industrial processes
NASA Technical Reports Server (NTRS)
Ally, Moonis R.
1993-01-01
An algorithm developed at Oak Ridge National Laboratory accurately and quickly predicts thermodynamic properties of concentrated aqueous salt solutions. This algorithm is much simpler and much faster than other modeling schemes and is unique because it can predict solution behavior at very high concentrations and under varying conditions. Typical industrial applications of this algorithm would be in manufacture of inorganic chemicals by crystallization, thermal storage, refrigeration and cooling, extraction of metals, emissions controls, etc.
LaPeyre, Megan K.; Rybovich, Molly; Hall, Steven G.; La Peyre, Jerome F.
2016-01-01
Changes in the timing and interaction of seasonal high temperatures and low salinities as predicted by climate change models could dramatically alter oyster population dynamics. Little is known explicitly about how low salinity and high temperature combinations affect spat (<25mm), seed (25-75mm), and market (>75mm) oyster growth and mortality. Using field and laboratory studies, this project quantified the combined effects of extremely low salinities (<5) and high temperatures (>30°C) on growth and survival of spat, seed, and market-sized oysters. In 2012 and 2013, hatchery-produced oysters were placed in open and closed cages at three sites in Breton Sound, LA, along a salinity gradient that typically ranged from 5 to 20. Growth and mortality were recorded monthly. Regardless of size class, oysters at the lowest salinity site (annual mean = 4.8) experienced significantly higher mortality and lower growth than oysters located in higher salinity sites (annual means = 11.1 and 13.0, respectively); furthermore, all oysters in open cages at the two higher salinity sites experienced higher mortality than in closed cages, likely due to predation. To explicitly examine oyster responses to extreme low salinity and high temperature combinations, a series of laboratory studies were conducted. Oysters were placed in 18 tanks in a fully crossed temperature (25°C, 32°C) by salinity (1, 5, and 15) study with three replicates, and repeated at least twice for each oyster size class. Regardless of temperature, seed and market oysters held in low salinity tanks (salinity 1) experienced 100% mortality within 7 days. In contrast, at salinity 5, temperature significantly affected mortality; oysters in all size classes experienced greater than 50% mortality at 32°C and less than 40% mortality at 25°C. At the highest salinity tested (15), only market-sized oysters held at 32°C experienced significant mortality (>60%).
These studies demonstrate that high water temperatures (>30°C) and low salinities (<5) negatively impact oyster growth and survival differentially and that high temperatures alone may negatively impact market-sized oysters. It is critical to understand the potential impacts of climate and anthropogenic changes on oyster resources to better adapt and manage for long-term sustainability.
NASA Technical Reports Server (NTRS)
Hunter, H. E.; Amato, R. A.
1972-01-01
The results are presented of the application of Avco Data Analysis and Prediction Techniques (ADAPT) to derivation of new algorithms for the prediction of future sunspot activity. The ADAPT derived algorithms show a factor of 2 to 3 reduction in the expected 2-sigma errors in the estimates of the 81-day running average of the Zurich sunspot numbers. The report presents: (1) the best estimates for sunspot cycles 20 and 21, (2) a comparison of the ADAPT performance with conventional techniques, and (3) specific approaches to further reduction in the errors of estimated sunspot activity and to recovery of earlier sunspot historical data. The ADAPT programs are used both to derive regression algorithms for prediction of the entire 11-year sunspot cycle from the preceding two cycles and to derive extrapolation algorithms for extrapolating a given sunspot cycle based on any available portion of the cycle.
Prediction of monthly rainfall in Victoria, Australia: Clusterwise linear regression approach
NASA Astrophysics Data System (ADS)
Bagirov, Adil M.; Mahmood, Arshad; Barton, Andrew
2017-05-01
This paper develops the Clusterwise Linear Regression (CLR) technique for prediction of monthly rainfall. The CLR is a combination of clustering and regression techniques. It is formulated as an optimization problem and an incremental algorithm is designed to solve it. The algorithm is applied to predict monthly rainfall in Victoria, Australia using rainfall data with five input meteorological variables over the period of 1889-2014 from eight geographically diverse weather stations. The prediction performance of the CLR method is evaluated by comparing observed and predicted rainfall values using four measures of forecast accuracy. The proposed method is also compared with the CLR using the maximum likelihood framework by the expectation-maximization algorithm, multiple linear regression, artificial neural networks and the support vector machines for regression models using computational results. The results demonstrate that the proposed algorithm outperforms other methods in most locations.
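The clusterwise idea, fitting one linear model per cluster while assigning each observation to the cluster whose model predicts it best, can be sketched with a simple alternating procedure. This is an illustrative simplification, not the incremental optimization algorithm the paper formulates:

```python
import numpy as np

def clusterwise_linear_regression(X, y, k=2, n_iter=30, seed=0):
    """Minimal CLR sketch: alternately (1) fit a least-squares model per
    cluster and (2) reassign each sample to the cluster whose model
    predicts it with the smallest squared residual."""
    rng = np.random.default_rng(seed)
    A = np.column_stack([X, np.ones(len(X))])      # design matrix with intercept
    labels = rng.integers(0, k, size=len(X))
    coefs = np.zeros((k, A.shape[1]))
    for _ in range(n_iter):
        for j in range(k):
            mask = labels == j
            if mask.sum() >= A.shape[1]:           # refit only non-degenerate clusters
                coefs[j], *_ = np.linalg.lstsq(A[mask], y[mask], rcond=None)
        # squared residual of every sample under every cluster's model
        resid = (A @ coefs.T - y[:, None]) ** 2
        labels = resid.argmin(axis=1)
    return coefs, labels
```

For rainfall prediction, `X` would hold the meteorological input variables at a station and each cluster would capture a distinct linear rainfall regime; like k-means, this alternating scheme only finds a local optimum, which is the difficulty the paper's incremental algorithm addresses.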
Kianmajd, Babak; Carter, David; Soshi, Masakazu
2016-10-01
Robotic total hip arthroplasty is a procedure in which milling operations are performed on the femur to remove material for the insertion of a prosthetic implant. The robot performs the milling operation by following a sequential list of tool motions, also known as a toolpath, generated by computer-aided manufacturing (CAM) software. The purpose of this paper is to explain a new toolpath force prediction algorithm that predicts cutting forces, improving the quality and safety of surgical systems. With a custom macro developed in the CAM system's native application programming interface, cutting contact patch volume was extracted from CAM simulations. A time-domain cutting force model was then developed through the use of a cutting force prediction algorithm. The algorithm was then validated by machining a hip canal in simulated bone using a CNC machine. Average cutting forces were measured during machining using a dynamometer and compared to the values predicted from CAM simulation data using the proposed method. The results showed the predicted forces matched the measured forces in both magnitude and overall pattern shape. However, due to inconsistent motion control, the time duration of the forces was slightly distorted. Nevertheless, the algorithm effectively predicted the forces throughout an entire hip canal procedure. This method provides a fast and easy technique for predicting cutting forces during orthopedic milling by utilizing data within a CAM software.
Huang, Y; Song, Y; Li, G; Drake, P L; Zheng, W; Li, Z; Zhou, D
2015-11-01
The abundance and distribution of species can be ascribed to both environmental heterogeneity and stress tolerance, with the latter measure sometimes associated with phenotypic plasticity. Although phenotypic plasticity varies predictably in response to common forms of stress, we lack a mechanistic understanding of the response of species to high saline-sodic soils. We compared the phenotypic plasticity of three pairs of high and low saline-sodic tolerant congeners from the families Poaceae (Leymus chinensis versus L. secalinus), Fabaceae (Lespedeza davurica versus L. bicolor) and Asteraceae (Artemisia mongolica versus A. sieversiana) in a controlled pot experiment in the Songnen grassland, China. The low tolerant species, L. secalinus and A. sieversiana exhibited higher plasticity in response to soil salinity and sodicity than their paired congeners. Highly tolerant species, L. chinensis and A. mongolica, had higher values for several important morphological traits, such as shoot length and total biomass under the high saline-sodic soil treatment than their paired congeners. In contrast, congeners from the family Fabaceae, L. davurica and L. bicolor, did not exhibit significantly different plasticity in response to soil salinity and sodicity. All species held a constant reproductive effort in response to saline-sodic soil stress. The different responses between low and high tolerant species offer an explanation for the distribution patterns of these species in the Songnen grassland. Highly tolerant species showed less morphological plasticity over a range of saline-sodic conditions than their paired congeners, which may manifest as an inability to compete with co-occurring species in locations where saline-sodic soils are absent. © 2015 German Botanical Society and The Royal Botanical Society of the Netherlands.
Effects of flooding, salinity and herbivory on coastal plant communities, Louisiana, United States
Gough, L.; Grace, J.B.
1998-01-01
Flooding and salinity stress are predicted to increase in coastal Louisiana as relative sea level rise (RSLR) continues in the Gulf of Mexico region. Although wetland plant species are adapted to these stressors, questions persist as to how marshes may respond to changed abiotic variables caused by RSLR, and how herbivory by native and non-native mammals may affect this response. The effects of altered flooding and salinity on coastal marsh communities were examined in two field experiments that simultaneously manipulated herbivore pressure. Marsh sods subjected to increased or decreased flooding (by lowering or raising sods, respectively), and increased or decreased salinity (by reciprocally transplanting sods between a brackish and fresh marsh), were monitored inside and outside mammalian herbivore exclosures for three growing seasons. Increased flooding stress reduced species numbers and biomass; alleviating flooding stress did not significantly alter species numbers while community biomass increased. Increased salinity reduced species numbers and biomass, more so if herbivores were present. Decreasing salinity had an unexpected effect: herbivores selectively consumed plants transplanted from the higher-salinity site. In plots protected from herbivory, decreased salinity had little effect on species numbers or biomass, but community composition changed. Overall, herbivore pressure further reduced species richness and biomass under conditions of increased flooding and increased salinity, supporting other findings that coastal marsh species can tolerate increasingly stressful conditions unless another factor, e.g., herbivory, is also present. Also, species dropped out of more stressful treatments much faster than they were added when stresses were alleviated, likely due to restrictions on dispersal. The rate at which plant communities will shift as a result of changed abiotic variables will determine if marshes remain viable when subjected to RSLR.
Global salinity predictors of western United States precipitation
NASA Astrophysics Data System (ADS)
Liu, T.; Schmitt, R. W.; Li, L.
2016-12-01
Moisture transport from the excess of evaporation over precipitation in the global ocean drives terrestrial precipitation patterns. Sea surface salinity (SSS) is sensitive to changes in ocean evaporation and precipitation, and therefore, to changes in the global water cycle. We use the Met Office Hadley Centre EN4.2.0 SSS dataset to search for teleconnections between autumn-lead seasonal salinity signals and winter precipitation over the western United States. NOAA CPC Unified observational US precipitation in winter months is extracted from bounding boxes over the northwest and southwest and averaged. Lead autumn (SON) SSS in ocean areas that are relatively highly correlated with winter (DJF) terrestrial precipitation is filtered by a size threshold, and each remaining area is treated as an individual predictor. After removing linear trends from the response and explanatory variables and accounting for multicollinearity, we use best subsets regression and the Bayesian information criterion (BIC) to objectively select the best model to predict terrestrial precipitation using SSS and SST predictors. The combination of autumn SSS and SST predictors can skillfully predict western US winter terrestrial precipitation (R2 = 0.51 for the US Northwest and R2 = 0.70 for the US Southwest). In both cases, SSS is a better predictor than SST. Thus, incorporating SSS can greatly enhance the accuracy of existing precipitation prediction frameworks that use SST-based climate indices and, by extension, improve watershed management.
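The selection step, exhaustively scoring predictor subsets by BIC and keeping the lowest-BIC model, can be sketched with generic ordinary-least-squares code (illustrative only, not the authors' implementation):

```python
import numpy as np
from itertools import combinations

def best_subset_bic(X, y, names):
    """Exhaustive best-subsets OLS scored by BIC under a Gaussian
    likelihood; returns the predictor names of the lowest-BIC model."""
    n = len(y)
    best = (np.inf, ())
    for r in range(1, X.shape[1] + 1):
        for idx in combinations(range(X.shape[1]), r):
            A = np.column_stack([X[:, idx], np.ones(n)])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = ((y - A @ beta) ** 2).sum()
            k = len(idx) + 1                       # coefficients incl. intercept
            bic = n * np.log(rss / n) + k * np.log(n)
            if bic < best[0]:
                best = (bic, idx)
    return [names[i] for i in best[1]], best[0]
```

In the study's setting, the columns of `X` would be the detrended SSS and SST predictors and `y` the detrended regional winter precipitation; BIC's log(n) penalty per coefficient is what keeps the selected model small despite the many candidate ocean areas.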
Aggressive desert goby males also court more, independent of the physiological demands of salinity.
Lehtonen, Topi K; Svensson, P Andreas; Wong, Bob B M
2018-06-19
Both between- and within-individual variation in behaviour can be important in determining mating opportunities and reproductive outcomes. Such behavioural variability can be induced by environmental conditions, especially if individuals vary in their tolerance levels or resource allocation patterns. We tested the effects of exposure to different salinity levels on male investment into two important components of mating success-intrasexual aggression and intersexual courtship-in a fish with a resource defence mating system, the desert goby, Chlamydogobius eremius. We found that males that were more aggressive to rivals also exhibited higher rates of courtship displays towards females. Contrary to predictions, this positive relationship, and the consistency of the two behaviours, were not affected by the salinity treatment, despite the physiological costs that high salinity imposes on the species. Moreover, over the entire data-set, there was only a marginally non-significant tendency for males to show higher levels of aggression and courtship in low, than high, salinity. The positive correlation between male aggression and courtship, independent of the physiological demands of the environment, suggests that males are not inclined to make contrasting resource investments into these two key reproductive behaviours. Instead, in this relatively euryhaline freshwater species, typical investment into current reproductive behaviours can occur under a range of different salinity conditions.
Kim, Yoon-Chang; Cramer, Jeffrey A; Booksh, Karl S
2011-10-21
A combination surface plasmon resonance (SPR) and conductivity sensor array was developed and implemented to demonstrate the ability to differentiate among changes in dissolved organic carbon (DOC) and salinity in coastal water. The array is capable of achieving sufficient spatial and temporal data density to better understand the cycling and fate of terrestrial DOC in coastal areas. DOC is the second largest source of bioreactive carbon in the environment and plays a key role in mediating microbial activity and generation of atmospheric CO2. In the coastal areas, salinity is also an important property in many applications, such as leak detection for landfill liners, saltwater intrusion to drinking water, marine environment monitoring, and seasonal climate prediction. Conductivity sensors are the industry standard for determining salinity in ocean systems. However, both conductivity and refractive index sensors, such as SPR spectroscopy based sensors, respond to salinity and DOC levels. To demonstrate the capability of the SPR sensor and a conductivity sensor to collect complementary data useful in discrimination of salinity and DOC in coastal zone water, conductivity, SPR, and temperature data were collected during passage from the Juan de Fuca ridge area returning to the University of Washington docks.
Effects of saline drinking water on early gosling development
Stolley, D.S.; Bissonette, J.A.; Kadlec, J.A.; Coster, D.
1999-01-01
Relatively high levels of saline drinking water may adversely affect the growth, development, and survival of young waterfowl. Saline drinking water was suspect in the low survival rate of Canada goose (Branta canadensis) goslings at Fish Springs National Wildlife Refuge (FSNWR) in western Utah. Hence, we investigated the effects of saline drinking water on the survival and growth of captive, wild-strain goslings from day 1-28 following hatch. We compared survival and growth (as measured by body mass, wing length, and culmen length) between a control group on tap water with a mean specific conductivity of 650 µS/cm, and 2 saline water treatments: (1) intermediate level (12,000 µS/cm), and (2) high level (18,000 µS/cm). Gosling mortality occurred only in the 18,000 µS/cm treatment group (33%; n = 9). Slopes of regressions of mean body mass, wing length, and culmen length on age were different from each other (P < 0.05), except for culmen length for the intermediate and high treatment levels. We predict that free-ranging wild goslings will experience mortality at even lower salinity levels than captive goslings because of the combined effects of depressed growth and environmental stresses, including hot desert temperatures and variable food quality over summer.
Zimmerman, C.E.
2005-01-01
Analysis of otolith strontium (Sr) or strontium-to-calcium (Sr:Ca) ratios provides a powerful tool to reconstruct the chronology of migration among salinity environments for diadromous salmonids. Although use of this method has been validated by examination of known individuals and translocation experiments, it has never been validated under controlled experimental conditions. In this study, incorporation of otolith Sr was tested across a range of salinities and resulting levels of ambient Sr and Ca concentrations in juvenile chinook salmon (Oncorhynchus tshawytscha), coho salmon (Oncorhynchus kisutch), sockeye salmon (Oncorhynchus nerka), rainbow trout (Oncorhynchus mykiss), and Arctic char (Salvelinus alpinus). Experimental water was mixed, using stream water and seawater as end members, to create experimental salinities of 0.1, 6.3, 12.7, 18.6, 25.5, and 33.0 psu. Otolith Sr and Sr:Ca ratios were significantly related to salinity for all species (r2 range: 0.80-0.91) but provide only enough predictive resolution to discriminate among fresh water, brackish water, and saltwater residency. These results validate the use of otolith Sr:Ca ratios to broadly discriminate salinity histories encountered by salmonids but highlight the need for further research concerning the influence of osmoregulation and physiological changes associated with smolting on otolith microchemistry.
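The predictive use of such a Sr:Ca-salinity relation, broad discrimination among freshwater, brackish, and saltwater residency, amounts to a linear fit followed by inverse prediction and binning. In the sketch below the fitted coefficients and the class thresholds are hypothetical illustrations, not the study's measured values:

```python
import numpy as np

def fit_srca_salinity(salinity, sr_ca):
    """Least-squares line sr_ca = a + b * salinity."""
    b, a = np.polyfit(salinity, sr_ca, 1)          # slope, intercept
    return a, b

def classify_history(sr_ca, a, b, fresh_max=0.5, brackish_max=25.0):
    """Invert the fitted line and bin the estimated salinity into the
    three broad residency classes (thresholds hypothetical)."""
    est_salinity = (sr_ca - a) / b
    if est_salinity < fresh_max:
        return "freshwater"
    if est_salinity < brackish_max:
        return "brackish"
    return "saltwater"
```

The coarse three-way binning reflects the abstract's point: the r2 values (0.80-0.91) support broad residency classes, not fine-grained salinity reconstruction.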
Maximizing the value of pressure data in saline aquifer characterization
NASA Astrophysics Data System (ADS)
Yoon, Seonkyoo; Williams, John R.; Juanes, Ruben; Kang, Peter K.
2017-11-01
The injection and storage of freshwater in saline aquifers for the purpose of managed aquifer recharge is an important technology that can help ensure sustainable water resources. As a result of the density difference between the injected freshwater and ambient saline groundwater, the pressure field is coupled to the spatial salinity distribution, and therefore experiences transient changes. The effect of variable density can be quantified by the mixed convection ratio, which is a ratio between the strength of two convection processes: free convection due to the density differences and forced convection due to hydraulic gradients. We combine a density-dependent flow and transport simulator with an ensemble Kalman filter (EnKF) to analyze the effects of freshwater injection rates on the value-of-information of transient pressure data for saline aquifer characterization. The EnKF is applied to sequentially estimate heterogeneous aquifer permeability fields using real-time pressure data. The performance of the permeability estimation is analyzed in terms of the accuracy and the uncertainty of the estimated permeability fields as well as the predictability of breakthrough curve arrival times in a realistic push-pull setting. This study demonstrates that injecting fluids at a rate that balances the two characteristic convections can maximize the value of pressure data for saline aquifer characterization.
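The estimation machinery described here, sequentially conditioning an ensemble of permeability fields on real-time pressure data, rests on the stochastic EnKF analysis step. A generic sketch follows, with the density-dependent flow simulator abstracted into an arbitrary observation operator (this is textbook EnKF, not the authors' implementation):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_err_std, seed=0):
    """One stochastic EnKF analysis step: update an ensemble of state
    vectors (e.g. log-permeability fields) with observed data (e.g.
    pressures) via the sample Kalman gain."""
    rng = np.random.default_rng(seed)
    n_ens = ensemble.shape[0]
    H = np.array([obs_op(m) for m in ensemble])    # predicted observations
    X = ensemble - ensemble.mean(axis=0)           # state anomalies
    Y = H - H.mean(axis=0)                         # observation anomalies
    C_xy = X.T @ Y / (n_ens - 1)                   # state-obs covariance
    C_yy = Y.T @ Y / (n_ens - 1) + obs_err_std**2 * np.eye(len(obs))
    K = C_xy @ np.linalg.inv(C_yy)                 # sample Kalman gain
    # perturb observations so the analysis ensemble keeps correct spread
    obs_pert = obs + rng.normal(0, obs_err_std, size=(n_ens, len(obs)))
    return ensemble + (obs_pert - H) @ K.T
```

In the paper's setting `obs_op` is the density-dependent flow simulation mapping a permeability field to pressures at monitoring points; the value-of-information analysis then asks how much each pressure record shrinks the posterior ensemble spread under different injection rates.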
Comparing Binaural Pre-processing Strategies I: Instrumental Evaluation.
Baumgärtel, Regina M; Krawczyk-Becker, Martin; Marquardt, Daniel; Völker, Christoph; Hu, Hongmei; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Ernst, Stephan M A; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias
2015-12-30
In a collaborative research project, several monaural and binaural noise reduction algorithms have been comprehensively evaluated. In this article, eight selected noise reduction algorithms were assessed using instrumental measures, with a focus on the instrumental evaluation of speech intelligibility. Four distinct, reverberant scenarios were created to reflect everyday listening situations: a stationary speech-shaped noise, a multitalker babble noise, a single interfering talker, and a realistic cafeteria noise. Three instrumental measures were employed to assess predicted speech intelligibility and predicted sound quality: the intelligibility-weighted signal-to-noise ratio, the short-time objective intelligibility measure, and the perceptual evaluation of speech quality. The results show substantial improvements in predicted speech intelligibility as well as sound quality for the proposed algorithms. The evaluated coherence-based noise reduction algorithm was able to provide improvements in predicted audio signal quality. For the tested single-channel noise reduction algorithm, improvements in intelligibility-weighted signal-to-noise ratio were observed in all but the nonstationary cafeteria ambient noise scenario. Binaural minimum variance distortionless response beamforming algorithms performed particularly well in all noise scenarios. © The Author(s) 2015.
Prediction of dynamical systems by symbolic regression
NASA Astrophysics Data System (ADS)
Quade, Markus; Abel, Markus; Shafi, Kamran; Niven, Robert K.; Noack, Bernd R.
2016-07-01
We study the modeling and prediction of dynamical systems based on conventional models derived from measurements. Such algorithms are highly desirable in situations where the underlying dynamics are hard to model from physical principles or simplified models need to be found. We focus on symbolic regression methods as a part of machine learning. These algorithms are capable of learning an analytically tractable model from data, a highly valuable property. Symbolic regression methods can be considered as generalized regression methods. We investigate two particular algorithms, the so-called fast function extraction which is a generalized linear regression algorithm, and genetic programming which is a very general method. Both are able to combine functions in a certain way such that a good model for the prediction of the temporal evolution of a dynamical system can be identified. We illustrate the algorithms by finding a prediction for the evolution of a harmonic oscillator based on measurements, by detecting an arriving front in an excitable system, and as a real-world application, the prediction of solar power production based on energy production observations at a given site together with the weather forecast.
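The first of the two algorithms, fast function extraction, is at heart a generalized linear regression over a library of candidate basis functions. A crude sketch of that idea is shown below, with simple least squares plus coefficient thresholding standing in for FFX's pathwise regularized fits (illustrative only):

```python
import numpy as np

def fast_function_extraction(x, y, names_and_feats):
    """FFX-flavoured sketch: linear regression over a library of
    candidate basis functions, keeping only terms whose coefficients
    survive a simple magnitude threshold."""
    names = [n for n, _ in names_and_feats]
    Phi = np.column_stack([f(x) for _, f in names_and_feats])
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    keep = np.abs(coef) > 1e-2                     # crude sparsification
    model = {n: c for n, c, k in zip(names, coef, keep) if k}
    feats = dict(names_and_feats)
    predict = lambda t: sum(c * feats[n](t) for n, c in model.items())
    return model, predict
```

Because the surviving terms are named basis functions with explicit coefficients, the result is an analytically tractable model, the property the abstract highlights; genetic programming reaches the same goal by evolving expression trees instead of selecting from a fixed library.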
Conrads, Paul; Greenfield, James M.
2010-01-01
The Savannah River originates at the confluence of the Seneca and Tugaloo Rivers, near Hartwell, Ga., and forms the State boundary between South Carolina and Georgia. The J. Strom Thurmond Dam and Lake, located 187 miles upstream from the coast, is responsible for most of the flow regulation that affects the Savannah River from Augusta to the coast. The Savannah Harbor experiences semi-diurnal tides of two high and two low tides in a 24.8-hour period, with pronounced differences in tidal range between neap and spring tides occurring on 14-day and 28-day lunar cycles. The Savannah National Wildlife Refuge is located in the Savannah River Estuary. The tidal freshwater marsh is an essential part of the 28,000-acre refuge and is home to a diverse variety of wildlife and plant communities. The Southeastern U.S. experienced severe drought conditions in 2008, and if the conditions had persisted in Georgia and South Carolina, Thurmond Lake could have reached an emergency operation level where outflow from the lake is equal to the inflow to the lake. To decrease the effect of the reduced releases on downstream resources, a stepped approach was proposed to reduce the flow in 500 cubic feet per second (ft3/s) increments. Reduced flows from 3,600 ft3/s to 3,100 ft3/s and 2,600 ft3/s were simulated with two previously developed models of the Lower Savannah River Estuary to evaluate the potential effects on salinity intrusion. The end of the previous drought (2002) was selected as the baseline condition for the model simulations. Salinity intrusion coincided with the spring tides of the 28-day lunar cycle. The results show differences between the two models' simulations of how salinity responds to the decreased flows.
The Model-to-Marsh Decision Support System (M2MDSS) salinity response shows a large increase in the magnitude (> 6.0 practical salinity units, psu) and duration (3-4 days) of the salinity intrusion, with extended periods (21 days) of tidal freshwater remaining in the system. The Environmental Fluid Dynamic Code (EFDC) model predicts increases in the magnitude of the salinity intrusion, but only to 2-3 psu, with an intrusion duration greater than a week. A potential mitigation of the increased salinity intrusion predicted by the M2MDSS would be to time pulses of increased flow to reduce the magnitude of the intrusion. Seven-day streamflow pulses of 4,500 ft3/s were inserted into the constant 3,100 ft3/s streamflow condition. The streamflow pulses substantially decreased the magnitude and duration of the salinity intrusion. The result of the streamflow pulse scenario demonstrates how alternative release patterns from Lake Thurmond could be used to mitigate potential salinity changes in the Lower Savannah River Estuary.
Code-based Diagnostic Algorithms for Idiopathic Pulmonary Fibrosis. Case Validation and Improvement.
Ley, Brett; Urbania, Thomas; Husson, Gail; Vittinghoff, Eric; Brush, David R; Eisner, Mark D; Iribarren, Carlos; Collard, Harold R
2017-06-01
Population-based studies of idiopathic pulmonary fibrosis (IPF) in the United States have been limited by reliance on diagnostic code-based algorithms that lack clinical validation. To validate a well-accepted International Classification of Diseases, Ninth Revision, code-based algorithm for IPF using patient-level information and to develop a modified algorithm for IPF with enhanced predictive value. The traditional IPF algorithm was used to identify potential cases of IPF in the Kaiser Permanente Northern California adult population from 2000 to 2014. Incidence and prevalence were determined overall and by age, sex, and race/ethnicity. A validation subset of cases (n = 150) underwent expert medical record and chest computed tomography review. A modified IPF algorithm was then derived and validated to optimize positive predictive value. From 2000 to 2014, the traditional IPF algorithm identified 2,608 cases among 5,389,627 at-risk adults in the Kaiser Permanente Northern California population. Annual incidence was 6.8/100,000 person-years (95% confidence interval [CI], 6.1-7.7) and was higher in patients with older age, male sex, and white race. The positive predictive value of the IPF algorithm was only 42.2% (95% CI, 30.6 to 54.6%); sensitivity was 55.6% (95% CI, 21.2 to 86.3%). The corrected incidence was estimated at 5.6/100,000 person-years (95% CI, 2.6-10.3). A modified IPF algorithm had improved positive predictive value but reduced sensitivity compared with the traditional algorithm. A well-accepted International Classification of Diseases, Ninth Revision, code-based IPF algorithm performs poorly, falsely classifying many non-IPF cases as IPF and missing a substantial proportion of IPF cases. A modification of the IPF algorithm may be useful for future population-based studies of IPF.
A range-based predictive localization algorithm for WSID networks
NASA Astrophysics Data System (ADS)
Liu, Yuan; Chen, Junjie; Li, Gang
2017-11-01
Most studies on localization algorithms are conducted on sensor networks with densely distributed nodes. However, non-localizable problems are prone to occur in networks with sparsely distributed sensor nodes. To solve this problem, a range-based predictive localization algorithm (RPLA) is proposed in this paper for wireless sensor networks integrated with RFID (WSID networks). A Gaussian mixture model is established to predict the trajectory of a mobile target. Then, the received signal strength indication is used to reduce the candidate area of the target location, based on the approximate point-in-triangulation test (APIT) algorithm. In addition, collaborative localization schemes are introduced to locate the target in non-localizable situations. Simulation results verify that the RPLA achieves accurate localization in networks with sparsely distributed sensor nodes. The localization accuracy of the RPLA is 48.7% higher than that of the APIT algorithm, 16.8% higher than that of the single-Gaussian-model-based algorithm and 10.5% higher than that of the Kalman-filtering-based algorithm.
Reeder, Jens; Giegerich, Robert
2004-01-01
Background: The general problem of RNA secondary structure prediction under the widely used thermodynamic model is known to be NP-complete when the structures considered include arbitrary pseudoknots. For restricted classes of pseudoknots, several polynomial-time algorithms have been designed, where the O(n^6)-time and O(n^4)-space algorithm by Rivas and Eddy is currently the best available program. Results: We introduce the class of canonical simple recursive pseudoknots and present an algorithm that requires O(n^4) time and O(n^2) space to predict the energetically optimal structure of an RNA sequence, possibly containing such pseudoknots. Evaluation against a large collection of known pseudoknotted structures shows the adequacy of the canonization approach and our algorithm. Conclusions: RNA pseudoknots of medium size can now be predicted reliably as well as efficiently by the new algorithm. PMID:15294028
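For context, the polynomial-time algorithms discussed here are dynamic programs over subsequences of the RNA. A minimal pseudoknot-free illustration of that DP pattern is the classic Nussinov-style base-pair maximization below; it is not the authors' O(n^4) pseudoknot algorithm, only the recurrence shape such algorithms extend:

```python
# Nussinov-style dynamic program: maximize base pairs in a pseudoknot-free
# structure. Illustrates the cubic-time / quadratic-space DP pattern that
# pseudoknot algorithms extend (this sketch itself handles no pseudoknots).
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def max_pairs(seq: str, min_loop: int = 3) -> int:
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # subsequence length minus 1
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                  # case: i unpaired
            for k in range(i + min_loop + 1, j + 1):
                if (seq[i], seq[k]) in PAIRS:    # case: i pairs with k
                    left = dp[i + 1][k - 1]
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + left + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(max_pairs("GGGAAAUCC"))  # → 3
```

Energy minimization replaces the pair count with thermodynamic terms; pseudoknot classes add further DP tables indexed by pairs of subsequences, which is where the higher exponents come from.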
Wang, Jie-Sheng; Han, Shuang
2015-01-01
For predicting the key technology indicators (concentrate grade and tailings recovery rate) of the flotation process, a feed-forward neural network (FNN) based soft-sensor model optimized by a hybrid algorithm combining the particle swarm optimization (PSO) algorithm and the gravitational search algorithm (GSA) is proposed. Although GSA has good optimization capability, it converges slowly and easily falls into local optima. In this paper, the velocity and position vectors of GSA are therefore adjusted by the PSO algorithm in order to improve its convergence speed and prediction accuracy. Finally, the proposed hybrid algorithm is adopted to optimize the parameters of the FNN soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy for the concentrate grade and tailings recovery rate, meeting the online soft-sensor requirements of real-time control in the flotation process. PMID:26583034
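The PSO velocity and position updates that the hybrid injects into GSA have the standard inertia/cognitive/social form. The sketch below runs plain PSO on a toy sphere function with hypothetical parameter values, not the FNN training problem:

```python
import numpy as np

# Minimal particle swarm optimization on the sphere function f(x) = sum(x^2).
# The paper's hybrid feeds these velocity terms into GSA's updates; this
# sketch shows the PSO part only, with hypothetical parameter values.
rng = np.random.default_rng(1)

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update: inertia + cognitive (pbest) + social (gbest) pulls.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

best_x, best_f = pso(lambda x: float(np.sum(x * x)))
print(best_f)  # near 0
```

In the hybrid, the (pbest - x) and (gbest - x) pulls are added to GSA's gravity-driven acceleration, which is what speeds up convergence and helps escape local optima.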
Model Predictive Control Based Motion Drive Algorithm for a Driving Simulator
NASA Astrophysics Data System (ADS)
Rehmatullah, Faizan
In this research, we develop a model predictive control based motion drive algorithm for the driving simulator at the Toronto Rehabilitation Institute. Motion drive algorithms exploit the limitations of the human vestibular system to create a perception of motion within the constrained workspace of a simulator. In the absence of visual cues, the human perception system is unable to distinguish between acceleration and the force of gravity. The motion drive algorithm determines control inputs to displace the simulator platform and, through the resulting inertial forces and angular rates, creates the perception of motion. By using model predictive control, we can optimize the use of the simulator workspace for every maneuver while reproducing the perceived motion of the simulated vehicle. Because it can handle nonlinear constraints, model predictive control also allows us to incorporate workspace limitations directly.
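Receding-horizon control of this kind can be sketched on a one-dimensional toy platform: at each tick, search over short control sequences, penalize deviation from the reference acceleration and workspace violations, apply the first move, and re-plan. The dynamics, workspace limit, weights, and candidate-control set below are illustrative assumptions, not the Toronto Rehabilitation Institute simulator:

```python
import itertools
import numpy as np

# Receding-horizon sketch of motion cueing: a 1-D platform (double integrator)
# tries to reproduce a reference acceleration while staying inside a +/-0.5 m
# workspace. Horizon, limits, and weights are hypothetical.
DT, HORIZON = 0.1, 4
U_SET = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])   # candidate accelerations, m/s^2

def step(state, u):
    p, v = state
    return np.array([p + v * DT, v + u * DT])

def mpc_control(state, ref_acc):
    best_u, best_cost = 0.0, float("inf")
    for seq in itertools.product(U_SET, repeat=HORIZON):
        s, cost = state.copy(), 0.0
        for u in seq:
            s = step(s, u)
            cost += (u - ref_acc) ** 2                       # perception-tracking error
            cost += 1e6 * max(0.0, abs(s[0]) - 0.5) ** 2     # workspace penalty
        if cost < best_cost:
            best_u, best_cost = seq[0], cost
    return best_u                            # apply first move, re-plan next tick

state = np.array([0.0, 0.0])
u = mpc_control(state, ref_acc=1.0)
print(u)
```

A real implementation solves a continuous constrained optimization instead of enumerating a discrete control set, and adds tilt-coordination terms that substitute sustained acceleration with gravity.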
Chaves, Francisco A.; Lee, Alvin H.; Nayak, Jennifer; Richards, Katherine A.; Sant, Andrea J.
2012-01-01
The ability to track CD4 T cells elicited in response to pathogen infection or vaccination is critical because of the role these cells play in protective immunity. Coupled with advances in genome sequencing of pathogenic organisms, there is considerable appeal for implementation of computer-based algorithms to predict peptides that bind to the class II molecules, forming the complex recognized by CD4 T cells. Despite recent progress in this area, there is a paucity of data regarding their success in identifying actual pathogen-derived epitopes. In this study, we sought to rigorously evaluate the performance of multiple web-available algorithms by comparing their predictions and our results using purely empirical methods for epitope discovery in influenza that utilized overlapping peptides and cytokine Elispots, for three independent class II molecules. We analyzed the data in different ways, trying to anticipate how an investigator might use these computational tools for epitope discovery. We come to the conclusion that currently available algorithms can indeed facilitate epitope discovery, but all shared a high degree of false positive and false negative predictions. Therefore, efficiencies were low. We also found dramatic disparities among algorithms and between predicted IC50 values and true dissociation rates of peptide:MHC class II complexes. We suggest that improved success of predictive algorithms will depend less on changes in computational methods or increased data sets and more on changes in parameters used to “train” the algorithms that factor in elements of T cell repertoire and peptide acquisition by class II molecules. PMID:22467652
NASA Astrophysics Data System (ADS)
Linard, J.; Leib, K.; Colorado Water Science Center
2010-12-01
Elevated levels of salinity and dissolved selenium can detrimentally affect water quality for anthropogenic and natural uses. In areas such as the lower Gunnison Basin of western Colorado, salinity and selenium are such a concern that control projects are implemented to limit their mobilization. To prioritize the locations in which control projects are implemented, multi-parameter regression models were developed to identify subbasins in the lower Gunnison River Basin that were most likely to have elevated salinity and dissolved selenium levels. The drainage area is about 5,900 mi2 and is underlain by Cretaceous marine shale, which is the most common source of salinity and dissolved selenium. To characterize the complex hydrologic and chemical processes governing constituent mobilization, geospatial variables representing 70 different environmental characteristics were correlated to mean seasonal (irrigation and nonirrigation season) salinity and selenium yields estimated at 154 sampling sites. The variables generally represented characteristics of the physical basin, precipitation, soil, geology, land use, and irrigation water delivery systems. Irrigation and nonirrigation seasons were selected because of documented effects of irrigation on constituent mobilization. Following a stepwise approach, combinations of the geospatial variables were used to develop four multi-parameter regression models. These models predicted salinity and selenium yield, within a 95-percent confidence range, at individual points in the lower Gunnison Basin for irrigation and nonirrigation seasons. The corresponding subbasins were ranked according to their potential to yield salinity and selenium, and the rankings were used to prioritize areas that would most benefit from control projects.
Increased resistance to a generalist herbivore in a salinity-stressed non-halophytic plant.
Renault, Sylvie; Wolfe, Scott; Markham, John; Avila-Sakar, Germán
2016-01-01
Plants often grow under the combined stress of several factors. Salinity and herbivory, separately, can severely hinder plant growth and reproduction, but the combined effects of both factors are still not clearly understood. Salinity is known to reduce plant tissue nitrogen content and growth rates. Since herbivores prefer tissues with high N content, and biochemical pathways leading to resistance are commonly elicited by salt stress, we hypothesized that plants growing in saline conditions would have enhanced resistance against herbivores. The non-halophyte Brassica juncea and the generalist herbivore Trichoplusia ni were used to test the prediction that plants subjected to salinity stress would be both more resistant and more tolerant to herbivory than those growing without salt stress. Plants were grown under different NaCl levels and either exposed to herbivores, followed by removal of half of their leaves, or left intact. Plants were left to grow and reproduce until senescence. Tissue quality was assessed, seeds were counted, and the biomass of different organs was measured. Plants exposed to salinity grew less and had reduced tissue nitrogen, protein, and chlorophyll content, although proline levels increased. Specific leaf area, leaf water content, transpiration, and root:shoot ratio remained unaffected. Plants growing under saline conditions had greater constitutive resistance than unstressed plants. However, induced resistance and tolerance were not affected by salinity. These results support the hypothesis that plants growing under salt stress are better defended against herbivores, although in B. juncea this may be mostly through resistance and less through tolerance. Published by Oxford University Press on behalf of the Annals of Botany Company.
Bellón, Juan Ángel; de Dios Luna, Juan; King, Michael; Nazareth, Irwin; Motrico, Emma; GildeGómez-Barragán, María Josefa; Torres-González, Francisco; Montón-Franco, Carmen; Sánchez-Celaya, Marta; Díaz-Barreiros, Miguel Ángel; Vicens, Catalina; Moreno-Peral, Patricia
2017-04-01
Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Forty-one risk factors were measured and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was new occurrence of HAD during the study, as measured by the AUDIT. From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The 'predictAL-10' risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and interaction AUDIT-C*Age. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and specificity of 0.80. Excluding childhood sexual abuse from the model (the 'predictAL-9'), the c-index was 0.880 (95% CI = 0.847 to 0.913), sensitivity 0.79, and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for primary prevention of hazardous alcohol drinking. © British Journal of General Practice 2017.
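The c-index reported for predictAL-10 (0.886) is the probability that a randomly chosen participant who went on to hazardous drinking receives a higher risk score than a randomly chosen participant who did not, with ties counted as half. A minimal computation on toy scores (hypothetical data, not the study's):

```python
# c-index (concordance): fraction of case/non-case pairs in which the case
# has the higher risk score; ties count 0.5. Scores below are made up.
def c_index(case_scores, control_scores):
    pairs = concordant = 0.0
    for c in case_scores:
        for k in control_scores:
            pairs += 1
            if c > k:
                concordant += 1
            elif c == k:
                concordant += 0.5
    return concordant / pairs

print(c_index([0.9, 0.7, 0.4], [0.3, 0.7, 0.1]))  # 7.5 / 9 ≈ 0.83
```

The reported sensitivity/specificity pair (0.83/0.80) then corresponds to one particular cutoff on the same scores.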
Predicting missing links and identifying spurious links via likelihood analysis
NASA Astrophysics Data System (ADS)
Pan, Liming; Zhou, Tao; Lü, Linyuan; Hu, Chin-Kun
2016-03-01
Real network data are often incomplete and noisy, which motivates both link prediction algorithms and spurious link identification algorithms. Thus far, a general method for transforming network organizing mechanisms into link prediction algorithms has been lacking. Here we use an algorithmic framework in which a network’s probability is calculated according to a predefined structural Hamiltonian that takes into account the network organizing principles, and a non-observed link is scored by the conditional probability of adding the link to the observed network. Extensive numerical simulations show that the proposed algorithm has remarkably higher accuracy than the state-of-the-art methods in uncovering missing links and identifying spurious links in many complex biological and social networks. Such a method also finds applications in exploring underlying network evolutionary mechanisms.
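The scoring idea can be illustrated with a toy structural Hamiltonian that rewards closed triangles, so a candidate link's conditional probability grows with the number of triangles it would close (its common-neighbour count). This Hamiltonian is a stand-in chosen for illustration, not the paper's:

```python
import itertools

# Toy likelihood scoring: take H(G) = -(# of triangles), so P(G) ∝ exp(-H).
# A non-observed link (u, v) is scored by the Hamiltonian drop on adding it,
# which here equals the number of common neighbours of u and v.
edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4)}
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

nodes = sorted(adj)
scores = {}
for u, v in itertools.combinations(nodes, 2):
    if v in adj[u]:
        continue                       # link already observed
    delta_h = -len(adj[u] & adj[v])    # triangles the new edge would close
    scores[(u, v)] = -delta_h          # higher score = more likely missing link

print(max(scores, key=scores.get))  # → (0, 3)
```

Spurious-link identification runs the same logic in reverse: an observed edge whose removal barely changes the Hamiltonian fits the organizing principle poorly and is suspect.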
Traffic Noise Ground Attenuation Algorithm Evaluation
NASA Astrophysics Data System (ADS)
Herman, Lloyd Allen
The Federal Highway Administration traffic noise prediction program, STAMINA 2.0, was evaluated for its accuracy. In addition, the ground attenuation algorithm used in the Ontario ORNAMENT method was evaluated to determine its potential to improve these predictions. Field measurements of sound levels were made at 41 sites on I-440 in Nashville, Tennessee, both to study noise barrier effectiveness and to evaluate STAMINA 2.0 and the performance of the ORNAMENT ground attenuation algorithm. The measurement sites, which contain large variations in terrain, included several cross sections. Further, all sites contain some type of barrier, natural or constructed, which could more fully expose the strengths and weaknesses of the ground attenuation algorithms. The noise barrier evaluation was accomplished in accordance with the American National Standard Methods for Determination of Insertion Loss of Outdoor Noise Barriers, which resulted in an evaluation of this standard as well. The entire 7.2-mile length of I-440 was modeled using STAMINA 2.0. A multiple-run procedure was developed to emulate the results that would be obtained if the ORNAMENT algorithm were incorporated into STAMINA 2.0. Finally, the predicted noise levels based on STAMINA 2.0 and on STAMINA with the ORNAMENT ground attenuation algorithm were compared with each other and with the field measurements. It was found that STAMINA 2.0 overpredicted noise levels by an average of over 2 dB for the receivers on I-440, whereas STAMINA with the ORNAMENT ground attenuation algorithm overpredicted noise levels by an average of less than 0.5 dB. The mean errors for the two predictions were found to be statistically different from each other, and the mean error for the prediction with the ORNAMENT ground attenuation algorithm was not found to be statistically different from zero.
The STAMINA 2.0 program predicts little, if any, ground attenuation for receivers at typical first-row distances from highways where noise barriers are used. The ORNAMENT ground attenuation algorithm, which recognizes and better compensates for the presence of obstacles in the propagation path of a sound wave, predicted significant amounts of ground attenuation for most sites.
González-Recio, O; Jiménez-Montero, J A; Alenda, R
2013-01-01
In the next few years, with the advent of high-density single nucleotide polymorphism (SNP) arrays and genome sequencing, genomic evaluation methods will need to deal with a large number of genetic variants and an increasing sample size. The boosting algorithm is a machine-learning technique that may alleviate the drawbacks of dealing with such large data sets. This algorithm combines different predictors in a sequential manner with some shrinkage on them; each predictor is applied consecutively to the residuals from the committee formed by the previous ones to form a final prediction based on a subset of covariates. Here, a detailed description is provided and examples using a toy data set are included. A modification of the algorithm called "random boosting" was proposed to increase predictive ability and decrease computation time of genome-assisted evaluation in large data sets. Random boosting uses a random selection of markers to add a subsequent weak learner to the predictive model. These modifications were applied to a real data set composed of 1,797 bulls genotyped for 39,714 SNP. Deregressed proofs of 4 yield traits and 1 type trait from January 2009 routine evaluations were used as dependent variables. A 2-fold cross-validation scenario was implemented. Sires born before 2005 were used as a training sample (1,576 and 1,562 for production and type traits, respectively), whereas younger sires were used as a testing sample to evaluate predictive ability of the algorithm on yet-to-be-observed phenotypes. Comparison with the original algorithm was provided. The predictive ability of the algorithm was measured as Pearson correlations between observed and predicted responses. Further, estimated bias was computed as the average difference between observed and predicted phenotypes. 
The results showed that the modification of the original boosting algorithm could be run in 1% of the time used by the original algorithm, with negligible differences in accuracy and bias. This modification may be used to speed up genome-assisted evaluation in large data sets such as those obtained from consortia. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
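The random boosting modification, fitting each weak learner to the current residuals using only a random subset of markers and adding it with shrinkage, can be sketched as follows. The simulated data, the least-squares weak learner, and all parameter values are illustrative assumptions, not the dairy-cattle setup:

```python
import numpy as np

# Sketch of "random boosting": each round, fit the current residuals on a
# random subset of markers and add the fit with shrinkage. The weak learner
# here is plain least squares; real dimensions (39,714 SNP) differ.
rng = np.random.default_rng(2)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]                       # only 3 causal markers
y = X @ beta_true + 0.1 * rng.standard_normal(n)

shrinkage, n_rounds, subset = 0.3, 100, 10
pred = np.zeros(n)
for _ in range(n_rounds):
    cols = rng.choice(p, size=subset, replace=False)   # random marker subset
    r = y - pred                                       # current residuals
    coef, *_ = np.linalg.lstsq(X[:, cols], r, rcond=None)
    pred += shrinkage * X[:, cols] @ coef              # shrunken weak learner

corr = np.corrcoef(pred, y)[0, 1]                      # predictive ability
print(round(corr, 3))
```

Restricting each round to a marker subset is what cuts the runtime: each weak-learner fit touches only `subset` columns instead of all p.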
A numerical solution of Duffing's equations including the prediction of jump phenomena
NASA Technical Reports Server (NTRS)
Moyer, E. T., Jr.; Ghasghai-Abdi, E.
1987-01-01
Numerical methodology for the solution of Duffing's differential equation is presented. Algorithms for the prediction of multiple equilibrium solutions and jump phenomena are developed. In addition, a filtering algorithm for producing steady state solutions is presented. The problem of a rigidly clamped circular plate subjected to cosinusoidal pressure loading is solved using the developed algorithms (the plate is assumed to be in the geometrically nonlinear range). The results accurately predict regions of solution multiplicity and jump phenomena.
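A standard way to integrate Duffing's equation numerically is a fourth-order Runge-Kutta scheme; sweeping the forcing amplitude or frequency in such a loop is how multiple equilibria and jump phenomena are typically traced. The parameter values below are illustrative, not those of the clamped-plate problem:

```python
import math

# RK4 integration of the Duffing equation x'' + d*x' + a*x + b*x^3 = F cos(w t).
# Parameters are illustrative placeholders.
def duffing_rhs(t, x, v, d=0.2, a=1.0, b=1.0, F=0.3, w=1.2):
    return v, F * math.cos(w * t) - d * v - a * x - b * x ** 3

def rk4(t, x, v, h):
    k1x, k1v = duffing_rhs(t, x, v)
    k2x, k2v = duffing_rhs(t + h / 2, x + h / 2 * k1x, v + h / 2 * k1v)
    k3x, k3v = duffing_rhs(t + h / 2, x + h / 2 * k2x, v + h / 2 * k2v)
    k4x, k4v = duffing_rhs(t + h, x + h * k3x, v + h * k3v)
    x += h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
    v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return x, v

# Integrate well past the transient to reach a steady-state oscillation.
t, x, v, h = 0.0, 0.0, 0.0, 0.01
for _ in range(100_000):
    x, v = rk4(t, x, v, h)
    t += h
print(round(x, 4))
```

Repeating the run while slowly increasing and then decreasing F (or w) and recording the steady-state amplitude exposes the hysteresis loop characteristic of the jump phenomenon.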
Weiss, Jeremy C; Page, David; Peissig, Peggy L; Natarajan, Sriraam; McCarty, Catherine
2013-01-01
Electronic health records (EHRs) are an emerging relational domain with large potential to improve clinical outcomes. We apply two statistical relational learning (SRL) algorithms to the task of predicting primary myocardial infarction. We show that one SRL algorithm, relational functional gradient boosting, outperforms propositional learners particularly in the medically-relevant high recall region. We observe that both SRL algorithms predict outcomes better than their propositional analogs and suggest how our methods can augment current epidemiological practices. PMID:25360347
Rational Exploitation and Utilizing of Groundwater in Jiangsu Coastal Area
NASA Astrophysics Data System (ADS)
Kang, B.; Lin, X.
2017-12-01
Jiangsu coastal area is located on the southeast coast of China and is a new industrial base and an important coastal and land resources development zone of China. In areas with intense human exploitation activities, regional groundwater evolution is strongly affected by human activities. In order to fundamentally solve the environmental geological problems caused by groundwater exploitation, the forming conditions of the regional groundwater hydrodynamic field must be determined, along with the impact of human activities on hydrodynamic-field evolution and on hydrogeochemical evolution. Based on these results, scientific management and reasonable exploitation of the regional groundwater resources can be supported. Taking the coastal area of Jiangsu as the research area, we investigated and analyzed the regional hydrogeological conditions. A numerical simulation model of groundwater flow was established using hydraulic, chemical, and isotopic methods, taking into account the conditions of water flow and the influence of the hydrodynamic field on the hydrochemical field. We predict the evolution of regional groundwater dynamics under the influence of human activities and climate change, and evaluate the influence of groundwater dynamic-field evolution on the environmental geological problems caused by groundwater exploitation under various conditions. We reach the following conclusions. Three optimal groundwater exploitation schemes were established, with groundwater salinization taken as the primary control condition. A substitution (surrogate) model of groundwater exploitation and water-level changes was built with the BP network method; a genetic algorithm was then used to solve for the optimal solution. The three schemes were submitted to the local water resource management authority. The first scheme addresses the groundwater salinization problem; the second scheme focuses on dual water supply; the third scheme concerns emergency water supply. This is the first time an environmental problem has been taken as a water management objective in this coastal area.
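The surrogate-plus-genetic-algorithm scheme, a cheap model standing in for the groundwater simulator inside a GA search, can be sketched as follows. The drawdown function, the limit, and all GA settings are hypothetical placeholders for the BP-network surrogate and the real management constraints:

```python
import random

# Sketch of surrogate-assisted GA optimization: a stand-in "surrogate model"
# maps a pumping rate to predicted drawdown (in reality a trained BP network),
# and a GA searches for the largest rate whose drawdown stays below a limit.
random.seed(3)
LIMIT = 5.0                        # allowed drawdown (m), hypothetical

def surrogate_drawdown(q):         # placeholder for the BP-network surrogate
    return 0.8 * q + 0.02 * q * q

def fitness(q):
    penalty = 1000.0 * max(0.0, surrogate_drawdown(q) - LIMIT)
    return q - penalty             # maximize pumping, penalize violations

pop = [random.uniform(0, 10) for _ in range(30)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                               # elitist selection
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2 + random.gauss(0, 0.1)   # crossover + mutation
        children.append(min(10.0, max(0.0, child)))
    pop = parents + children

best = max(pop, key=fitness)
print(round(best, 2))
```

The surrogate is what makes the GA affordable: each fitness evaluation is a function call instead of a full groundwater-flow simulation.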
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Taiping; Yang, Zhaoqing; Khangaonkar, Tarang
2010-04-22
In this study, a hydrodynamic model based on the unstructured-grid finite volume coastal ocean model (FVCOM) was developed for Bellingham Bay, Washington. The model simulates water surface elevation, velocity, temperature, and salinity in a three-dimensional domain that covers the entire Bellingham Bay and adjacent water bodies, including Lummi Bay, Samish Bay, Padilla Bay, and Rosario Strait. The model was developed using Pacific Northwest National Laboratory’s high-resolution Puget Sound and Northwest Straits circulation and transport model. A sub-model grid for Bellingham Bay and adjacent coastal waters was extracted from the Puget Sound model and refined in Bellingham Bay using bathymetric light detection and ranging (lidar) and river channel cross-section data. The model uses tides, river inflows, and meteorological inputs to predict water surface elevations, currents, salinity, and temperature. A tidal open boundary condition was specified using standard National Oceanic and Atmospheric Administration (NOAA) predictions. Temperature and salinity open boundary conditions were specified based on observed data. Meteorological forcing (wind, solar radiation, and net surface heat flux) was obtained from NOAA observations and National Center for Environmental Prediction North American Regional Analysis outputs. The model was run in parallel on 48 cores using a time step of 2.5 seconds; it took 18 hours of CPU time to complete 26 days of simulation. The model was calibrated with oceanographic field data for the period 6/1/2009 to 6/26/2009. These data were collected specifically for the purpose of model development and calibration. They include time series of water-surface elevation, currents, temperature, and salinity, as well as temperature and salinity profiles during instrument deployment and retrieval. Comparisons between model predictions and field observations show an overall reasonable agreement in both temporal and spatial scales.
Root mean square errors for the surface elevation, velocity, temperature, and salinity time series are 0.11 m, 0.10 m/s, 1.28 °C, and 1.91 ppt, respectively. The model was able to reproduce the salinity and temperature stratification inside Bellingham Bay. Wetting and drying processes in the tidal flats of Bellingham Bay, Samish Bay, and Padilla Bay were also successfully simulated. Both model results and observed data indicated that water surface elevations inside Bellingham Bay are highly correlated with tides. Circulation inside the bay is weak and complex and is affected by various forcing mechanisms, including tides, winds, freshwater inflows, and other local factors. The Bellingham Bay model solution was successfully linked to the NOAA oil spill trajectory simulation model, the General NOAA Operational Modeling Environment (GNOME). Overall, the Bellingham Bay model has been calibrated reasonably well and can be used to provide detailed hydrodynamic information in the bay and adjacent water bodies. While there is room for further improvement as more data become available, the calibrated hydrodynamic model provides useful hydrodynamic information in Bellingham Bay and can be used to support sediment transport and water quality modeling, as well as to assist in the design of nearshore restoration scenarios.
On the Balancing of the SMOS Ocean Salinity Retrieval Cost Function
NASA Astrophysics Data System (ADS)
Sabia, R.; Camps, A.; Portabella, M.; Talone, M.; Ballabrera, J.; Gourrion, J.; Gabarró, C.; Aretxabaleta, A. L.; Font, J.
2009-04-01
The Soil Moisture and Ocean Salinity (SMOS) mission will be launched in mid-2009 to provide synoptic sea surface salinity (SSS) measurements with good temporal resolution [1]. To obtain a proper estimation of the SSS fields derived from the multi-angular brightness temperatures (TB) measured by the Microwave Imaging Radiometer using Aperture Synthesis (MIRAS) sensor, a comprehensive inversion procedure has been defined [2]. Nevertheless, several salinity retrieval issues remain critical, namely: 1) scene-dependent biases in the simulated TBs; 2) the definition of the L-band forward geophysical model function; 3) auxiliary data uncertainties; 4) the constraints in the cost function (inversion), especially the salinity term; and 5) adequate spatio-temporal averaging. These issues will have to be properly addressed in order to meet the proposed accuracy requirement of the mission: a demanding 0.1 psu (practical salinity units) after averaging over 30-day, 2°x2° spatio-temporal boxes. The salinity retrieval cost function minimizes the difference between the multi-angular measured SMOS TBs (so far, simulated) and the modeled TBs, weighted by the corresponding radiometric noise of the measurements. Furthermore, because the minimization problem is both non-linear and ill-posed, background reference terms are needed to nudge the solution and ensure convergence [3]. Constraining terms in SSS, sea surface temperature (SST), and wind speed are considered with their respective uncertainties. Moreover, whether SSS constraints have to be included as part of the retrieval procedure is still a matter of debate. On the one hand, neglecting background reference information on SSS might prevent retrieving salinity with the prescribed accuracy, or at least within a reasonable error.
Conversely, including constraints in SSS, relying for instance on climatology, may force the retrieved value to be too close to the reference prior values, thus producing spurious retrievals. The impact of the different auxiliary salinity uncertainties on the accuracy of the retrieval was studied in [4]. It was shown that, using physically consistent salinity field uncertainties of less than 0.5 psu (either the standard deviation of the considered SSS field or the standard deviation of the misfit between the original and the auxiliary SSS field), the SSS term turns out to be too constraining. A half-way solution could be envisaged by using empirical weights (regularization factors) to smooth the overall influence of the SSS term while still using the auxiliary fields with their corresponding physically sound uncertainties. This operation should be performed for the SST and wind speed terms as well. The need for a comprehensive balancing of the different terms included in the cost function is also stressed by recent studies [5], which point out that even the observational term (TBs) will need to be properly weighted by an effective ratio, taking into account the specific correlation patterns existing in the MIRAS measurements. Simulated data generated with the SMOS End-to-end Performance Simulator (SEPS), in its full mode, including the measured antenna patterns for each antenna and all the instrument errors, are used in this study. The salinity retrieval process and the SSS maps (for each satellite overpass) are produced with the UPC SMOS Level 2 Processor Simulator (SMOS-L2PS). The relative weight of each of the terms included in the cost function (observational and background terms) is assessed in different cost function configurations. Regularization factors are introduced to ensure that the SMOS information content is fully exploited. Preliminary results on the cost function balancing will be shown at the conference.
References: [1] Font, J., G. Lagerloef, D. Le Vine, A. Camps, and O.Z. Zanife, The Determination of Surface Salinity with the European SMOS Space Mission, IEEE Trans. Geosci. Remote Sens., 42(10), 2196-2205, 2004. [2] Zine, S., J. Boutin, J. Font, N. Reul, P. Waldteufel, C. Gabarró, J. Tenerelli, F. Petitcolin, J.L. Vergely, M. Talone, and S. Delwart, Overview of the SMOS Sea Surface Salinity Prototype Processor, IEEE Trans. Geosci. Remote Sens., 46(3), 621-645, 2008. [3] Gabarró, C., M. Portabella, M. Talone, and J. Font, Analysis of the SMOS Ocean Salinity Inversion Algorithm, Proceedings of the International Geoscience and Remote Sensing Symposium (IGARSS), Barcelona, Spain, 971-974, 2007. [4] Sabia, R., Sea Surface Salinity Retrieval Error Budget within the ESA Soil Moisture and Ocean Salinity Mission, Ph.D. Dissertation, Barcelona, Spain, October 2008. [5] Talone, M., A. Camps, C. Gabarró, R. Sabia, J. Gourrion, M. Vall-llossera, B. Mourre, and J. Font, Contributions to the Improvement of the SMOS Level 2 Retrieval Algorithm: Optimization of the Cost Function, Proceedings of the International Geoscience and Remote Sensing Symposium (IGARSS), Boston, Massachusetts, USA, 2008.
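In code, the balanced cost function described above can be sketched as follows (a minimal illustration; the forward model `tb_model`, all parameter names, and the regularization factors `lam_*` are placeholders, not the operational SMOS-L2PS implementation):

```python
def sss_cost(sss, sst, wspd,
             tb_meas, tb_model, sigma_tb,
             sss0, sst0, wspd0,
             sig_sss, sig_sst, sig_w,
             lam_sss=1.0, lam_sst=1.0, lam_w=1.0):
    """Regularized retrieval cost: radiometric-noise-weighted misfit of the
    multi-angular TBs plus background (reference) terms in SSS, SST and wind."""
    # observational term: one residual per incidence angle k
    j_obs = sum(((m - tb_model(sss, sst, wspd, k)) / s) ** 2
                for k, (m, s) in enumerate(zip(tb_meas, sigma_tb)))
    # background terms, each scaled by an empirical regularization factor
    j_bg = (lam_sss * ((sss - sss0) / sig_sss) ** 2
            + lam_sst * ((sst - sst0) / sig_sst) ** 2
            + lam_w   * ((wspd - wspd0) / sig_w) ** 2)
    return j_obs + j_bg
```

Lowering a `lam_*` factor relaxes the corresponding background term, which is exactly the kind of empirical re-weighting discussed above for the over-constraining SSS term.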
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zawisza, I; Yan, H; Yin, F
Purpose: To assure that tumor motion is within the radiation field during high-dose and high-precision radiosurgery, real-time imaging and surrogate monitoring are employed. These methods are useful in providing real-time tumor/surrogate motion, but no future information is available. In order to anticipate future tumor/surrogate motion and track target location precisely, an algorithm was developed and investigated for estimating surrogate motion multiple steps ahead. Methods: The study utilized a one-dimensional surrogate motion signal divided into three components: (a) a training component containing the primary data, from the first frame to the beginning of the input subsequence; (b) an input subsequence component of the surrogate signal used as input to the prediction algorithm; (c) an output subsequence component, the remaining signal used as the known output of the prediction algorithm for validation. The prediction algorithm consists of three major steps: (1) extracting subsequences from the training component which best match the input subsequence according to a given criterion; (2) calculating weighting factors from these best-matched subsequences; (3) collecting the parts following the matched subsequences and combining them with the assigned weighting factors to form the output. The prediction algorithm was examined for several patients, and its performance was assessed based on the correlation between prediction and known output. Results: Respiratory motion data were collected for 20 patients using the RPM system. The output subsequence was the last 50 samples (∼2 seconds) of a surrogate signal, and the input subsequence was the 100 frames (∼3 seconds) prior to the output subsequence. Based on the analysis of the correlation coefficient between predicted and known output subsequences, the average correlation is 0.9644±0.0394 and 0.9789±0.0239 for the equal-weighting and relative-weighting strategies, respectively. 
Conclusion: Preliminary results indicate that the prediction algorithm is effective in estimating surrogate motion multiple steps in advance. The relative-weighting method shows better prediction accuracy than the equal-weighting method. More parameters of this algorithm are under investigation.
Ganapathiraju, Madhavi K; Orii, Naoki
2013-08-30
Advances in biotechnology have created "big-data" situations in molecular and cellular biology. Several sophisticated algorithms have been developed that process big data to generate hundreds of biomedical hypotheses (or predictions). The bottleneck to translating this large number of biological hypotheses is that each of them needs to be studied experimentally to interpret its functional significance. Even when the predictions are estimated to be very accurate, from a biologist's perspective, the choice of which of these predictions to study further is based on factors like the availability of reagents and resources and the possibility of formulating some reasonable hypothesis about its biological relevance. When viewed from a global perspective, say that of a federal funding agency, ideally the choice of which prediction should be studied would be based on which of them can make the most translational impact. We propose that algorithms be developed to identify which of the computationally generated hypotheses have potential for high translational impact; this way, funding agencies and the scientific community can invest resources and drive the research based on a global view of biomedical impact without being deterred by a local view of feasibility. In short, data-analytic algorithms analyze big data and generate hypotheses; in contrast, the proposed inference-analytic algorithms analyze these hypotheses and rank them by predicted biological impact. We demonstrate this through the development of an algorithm to predict the biomedical impact of protein-protein interactions (PPIs), which is estimated by the number of future publications that cite the paper that originally reported the PPI. This position paper describes a new computational problem that is relevant in the era of big data and discusses the challenges that exist in studying this problem, highlighting the need for the scientific community to engage in this line of research. 
The proposed class of algorithms, namely inference-analytic algorithms, is necessary to ensure that resources are invested in translating those computational outcomes that promise maximum biological impact. Application of this concept to predict biomedical impact of PPIs illustrates not only the concept, but also the challenges in designing these algorithms.
Chastek, Benjamin J; Oleen-Burkey, Merrikay; Lopez-Bresnahan, Maria V
2010-01-01
Relapse is a common measure of disease activity in relapsing-remitting multiple sclerosis (MS). The objective of this study was to test the content validity of an operational algorithm for detecting relapse in claims data. A claims-based relapse detection algorithm was tested by comparing its detection rate over a 1-year period with relapses identified based on medical chart review. According to the algorithm, MS patients in a US healthcare claims database who had either (1) a primary claim for MS during hospitalization or (2) a corticosteroid claim following an MS-related outpatient visit were designated as having a relapse. Patient charts were examined for explicit indication of relapse or care suggestive of relapse. Positive and negative predictive values were calculated. Medical charts were reviewed for 300 MS patients, half of whom had a relapse according to the algorithm. The claims-based criteria correctly classified 67.3% of patients with relapses (positive predictive value) and 70.0% of patients without relapses (negative predictive value; kappa = 0.373, p < 0.001). Alternative algorithms did not improve on the predictive value of the operational algorithm. Limitations of the algorithm include lack of differentiation between relapsing-remitting MS and other types, and the fact that it does not incorporate measures of function and disability. The claims-based algorithm appeared to successfully detect moderate-to-severe MS relapse. This validated definition can be applied to future claims-based MS studies.
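As an illustration, the two-part claims rule and the predictive-value calculation described above can be sketched like this (field names such as `ms_primary_inpatient` are hypothetical stand-ins for the actual claims codes):

```python
def detect_relapse(patient):
    """Claims rule: primary MS claim during hospitalization, OR a
    corticosteroid claim following an MS-related outpatient visit."""
    return (patient.get("ms_primary_inpatient", False)
            or (patient.get("ms_outpatient_visit", False)
                and patient.get("steroid_after_visit", False)))

def predictive_values(patients):
    """PPV and NPV of the claims rule against chart-review truth."""
    tp = fp = tn = fn = 0
    for p in patients:
        pred, actual = detect_relapse(p), p["chart_relapse"]
        if pred and actual:
            tp += 1
        elif pred and not actual:
            fp += 1
        elif not pred and actual:
            fn += 1
        else:
            tn += 1
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return ppv, npv
```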
Preciat Gonzalez, German A.; El Assal, Lemmer R. P.; Noronha, Alberto; ...
2017-06-14
The mechanism of each chemical reaction in a metabolic network can be represented as a set of atom mappings, each of which relates an atom in a substrate metabolite to an atom of the same element in a product metabolite. Genome-scale metabolic network reconstructions typically represent biochemistry at the level of reaction stoichiometry. However, a more detailed representation at the underlying level of atom mappings opens the possibility for a broader range of biological, biomedical and biotechnological applications than with stoichiometry alone. Complete manual acquisition of atom mapping data for a genome-scale metabolic network is a laborious process. However, many algorithms exist to predict atom mappings. How do their predictions compare to each other and to manually curated atom mappings? For more than four thousand metabolic reactions in the latest human metabolic reconstruction, Recon 3D, we compared the atom mappings predicted by six atom mapping algorithms. We also compared these predictions to those obtained by manual curation of atom mappings for over five hundred reactions distributed among all top-level Enzyme Commission number classes. Five of the evaluated algorithms had similarly high prediction accuracy of over 91% when compared to manually curated atom-mapped reactions. On average, the accuracy of the prediction was highest for reactions catalysed by oxidoreductases and lowest for reactions catalysed by ligases. In addition to prediction accuracy, the algorithms were evaluated on their accessibility, their advanced features, such as the ability to identify equivalent atoms, and their ability to map hydrogen atoms. Beyond prediction accuracy, we found that software accessibility and advanced features were fundamental to the selection of an atom mapping algorithm in practice.
Preciat Gonzalez, German A; El Assal, Lemmer R P; Noronha, Alberto; Thiele, Ines; Haraldsdóttir, Hulda S; Fleming, Ronan M T
2017-06-14
The mechanism of each chemical reaction in a metabolic network can be represented as a set of atom mappings, each of which relates an atom in a substrate metabolite to an atom of the same element in a product metabolite. Genome-scale metabolic network reconstructions typically represent biochemistry at the level of reaction stoichiometry. However, a more detailed representation at the underlying level of atom mappings opens the possibility for a broader range of biological, biomedical and biotechnological applications than with stoichiometry alone. Complete manual acquisition of atom mapping data for a genome-scale metabolic network is a laborious process. However, many algorithms exist to predict atom mappings. How do their predictions compare to each other and to manually curated atom mappings? For more than four thousand metabolic reactions in the latest human metabolic reconstruction, Recon 3D, we compared the atom mappings predicted by six atom mapping algorithms. We also compared these predictions to those obtained by manual curation of atom mappings for over five hundred reactions distributed among all top-level Enzyme Commission number classes. Five of the evaluated algorithms had similarly high prediction accuracy of over 91% when compared to manually curated atom-mapped reactions. On average, the accuracy of the prediction was highest for reactions catalysed by oxidoreductases and lowest for reactions catalysed by ligases. In addition to prediction accuracy, the algorithms were evaluated on their accessibility, their advanced features, such as the ability to identify equivalent atoms, and their ability to map hydrogen atoms. Beyond prediction accuracy, we found that software accessibility and advanced features were fundamental to the selection of an atom mapping algorithm in practice.
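The headline comparison, per-EC-class agreement between predicted and curated mappings, reduces to a simple tally once each reaction's atom mapping is represented as a set of substrate-to-product atom pairs (a minimal sketch; the study's actual mapping representation and scoring are richer):

```python
from collections import defaultdict

def accuracy_by_ec(predicted, curated, ec_class):
    """predicted/curated: dict rxn_id -> set of (substrate_atom, product_atom)
    pairs; ec_class: dict rxn_id -> top-level EC class (1-6).
    Returns the fraction of exactly matching mappings per EC class."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for rxn, cur in curated.items():
        ec = ec_class[rxn]
        totals[ec] += 1
        if predicted.get(rxn) == cur:
            hits[ec] += 1
    return {ec: hits[ec] / totals[ec] for ec in totals}
```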
The Current Status of Unsteady CFD Approaches for Aerodynamic Flow Control
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Singer, Bart A.; Yamaleev, Nail; Vatsa, Veer N.; Viken, Sally A.; Atkins, Harold L.
2002-01-01
An overview of the current status of time dependent algorithms is presented. Special attention is given to algorithms used to predict fluid actuator flows, as well as other active and passive flow control devices. Capabilities for the next decade are predicted, and principal impediments to the progress of time-dependent algorithms are identified.
A Sensor Dynamic Measurement Error Prediction Model Based on NAPSO-SVM.
Jiang, Minlan; Jiang, Lan; Jiang, Dingde; Li, Fei; Song, Houbing
2018-01-15
Dynamic measurement error correction is an effective way to improve sensor precision. Dynamic measurement error prediction is an important part of error correction, and support vector machines (SVMs) are often used for predicting the dynamic measurement errors of sensors. Traditionally, the SVM parameters were always set manually, which cannot ensure the model's performance. In this paper, an SVM method based on an improved particle swarm optimization algorithm (NAPSO) is proposed to predict the dynamic measurement errors of sensors. Natural selection and simulated annealing are added to the PSO to improve its ability to escape local optima. To verify the performance of NAPSO-SVM, three algorithms are selected to optimize the SVM's parameters: the particle swarm optimization algorithm (PSO), the improved PSO algorithm (NAPSO), and the glowworm swarm optimization (GSO). The dynamic measurement error data of two sensors are used as the test data. The root mean squared error and mean absolute percentage error are employed to evaluate the prediction models' performances. The experimental results show that, among the three tested algorithms, the NAPSO-SVM method achieves better prediction precision and smaller prediction errors, and is an effective method for predicting the dynamic measurement errors of sensors.
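A minimal sketch of the NAPSO idea, standard PSO augmented with an annealing-style acceptance step and a natural-selection step, is shown below. The objective stands in for the SVM validation error, and all coefficients (inertia schedule, acceleration constants, cooling rate) are illustrative assumptions rather than the paper's settings:

```python
import math
import random

def napso(objective, bounds, n_particles=20, iters=60, seed=1):
    """PSO with natural selection (worst half re-seeded from the best half's
    personal bests) and annealing-style acceptance of worsened personal bests."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pval = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    temp = 1.0
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters              # inertia decays over time
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 2.0 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            v = objective(pos[i])
            # annealing: occasionally accept a worse personal best
            if v < pval[i] or rng.random() < math.exp(-(v - pval[i]) / max(temp, 1e-9)):
                pbest[i], pval[i] = pos[i][:], v
            if v < gval:
                gbest, gval = pos[i][:], v
        # natural selection: worst half restarts from the best half's positions
        order = sorted(range(n_particles), key=lambda i: pval[i])
        half = n_particles // 2
        for bad, good in zip(order[half:], order[:half]):
            pos[bad] = pbest[good][:]
        temp *= 0.9                            # cooling schedule
    return gbest, gval
```

In the paper's setting, `objective` would evaluate the SVM (e.g., cross-validation error as a function of the penalty and kernel parameters); here any function over a box-constrained space works.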
BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.
White, B J; Amrine, D E; Larson, R L
2018-04-14
Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that are currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
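The partitioning and model-comparison steps of the predictive analytic process described above can be sketched as follows (a toy illustration; the split fractions and candidate classifiers are arbitrary assumptions):

```python
import random

def partition(records, f_train=0.6, f_valid=0.2, seed=7):
    """Split records into training, validation, and naive test sets."""
    rng = random.Random(seed)
    recs = records[:]
    rng.shuffle(recs)
    n = len(recs)
    a, b = int(n * f_train), int(n * (f_train + f_valid))
    return recs[:a], recs[a:b], recs[b:]

def accuracy(classifier, records):
    """Fraction of (x, y) records the classifier labels correctly."""
    return sum(classifier(x) == y for x, y in records) / len(records)

def select_model(classifiers, valid):
    """Compare candidate classifiers on the validation set; the winner is
    then scored once on the held-out naive test set."""
    return max(classifiers, key=lambda c: accuracy(c, valid))
```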
NASA Astrophysics Data System (ADS)
Phillis, C. C.; Ostrach, D. J.; Weber, P. K.; Ingram, B. L.; Zinkl, J. G.
2005-12-01
Habitat use has been shown to be an important factor in the bioaccumulation of contaminants in striped bass (Morone saxatilis). This study explores techniques to determine migration in striped bass as part of a larger study investigating maternal transfer of xenobiotics to progeny in the San Francisco Estuary. The timing of movement of fish between salt and fresh water can easily be determined using a number of chemical markers in otoliths. Determining movement within estuaries, however, is a more difficult problem because mesohaline geochemical signatures approach the marine end member at very low salinities. Two tracers were used to reconstruct the migration history of striped bass in the San Francisco Estuary: Sr/Ca (measured by electron microprobe and LA-ICP-MS) and the Sr isotope ratio (measured by LA-MC-ICP-MS). Both tracers can be used to map the salinity the fish is exposed to at the time of otolith increment deposition. Salinity, in turn, is mapped to location within the San Francisco Estuary based on monthly salinity surveys. The two methods have their respective benefits: Sr/Ca can be measured with higher spatial resolution (<10 microns), while Sr isotope ratios are not modulated by metabolism. Sr isotope measurements were made to check the Sr/Ca results. In the San Francisco Estuary, low-87Sr/86Sr (0.706189) river water mixes with high-87Sr/86Sr (0.709168) marine water such that the ratio reaches 80% of the marine signal (0.7085) when the salinity is only 5% seawater (1.8 ppt), and 95% of the marine signal (0.7090) at salinities of 20% seawater (6.6 ppt) (Ingram and Sloan, 1992). This salinity model should map directly to the otolith because there is no biological fractionation of Sr isotopes. The Sr/Ca otolith and salinity models predict a similar response. For both models, calculated otolith salinity is mapped to location within the San Francisco Estuary based on monthly salinity surveys. 
When previously published salinity models are used, the otolith Sr/Ca and Sr isotope results are offset from one another. These results suggest that a new Sr/Ca salinity model must be developed for this population of striped bass.
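The quoted mixing behavior can be reproduced with a two-endmember model in which the isotope ratio mixes in proportion to the strontium contributed by each water mass. The seawater-to-river Sr concentration ratio `k` below is not stated in the abstract; a value near 76 is inferred here because it reproduces both quoted points (80% of the marine signal at 5% seawater, 95% at 20%):

```python
def sr_ratio(x_sea, r_river=0.706189, r_marine=0.709168, k=76.0):
    """87Sr/86Sr of a river/seawater mix. x_sea is the seawater *water*
    fraction; k is the assumed seawater:river Sr concentration ratio.
    The isotope ratio mixes in proportion to the Sr each endmember supplies,
    which is why the marine signal dominates at very low salinities."""
    f_sr = k * x_sea / (k * x_sea + (1.0 - x_sea))  # marine fraction of total Sr
    return r_river + f_sr * (r_marine - r_river)
```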
Seasonal pattern of anthropogenic salinization in temperate forested headwater streams.
Timpano, Anthony J; Zipper, Carl E; Soucek, David J; Schoenholtz, Stephen H
2018-04-15
Salinization of freshwaters by human activities is of growing concern globally. Consequences of salt pollution include adverse effects to aquatic biodiversity, ecosystem function, human health, and ecosystem services. In headwater streams of the temperate forests of eastern USA, elevated specific conductance (SC), a surrogate measurement for the major dissolved ions composing salinity, has been linked to decreased diversity of aquatic insects. However, such linkages have typically been based on limited numbers of SC measurements that do not quantify intra-annual variation. Effective management of salinization requires tools to accurately monitor and predict salinity while accounting for temporal variability. Toward that end, high-frequency SC data were collected within the central Appalachian coalfield over 4 years at 25 forested headwater streams spanning a gradient of salinity. A sinusoidal periodic function was used to model the annual cycle of SC, averaged across years and streams. The resultant model revealed that, on average, salinity deviated approximately ±20% from annual mean levels across all years and streams, with minimum SC occurring in late winter and peak SC occurring in late summer. The pattern was evident in headwater streams influenced by surface coal mining, unmined headwater reference streams with low salinity, and larger-order salinized rivers draining the study area. The pattern was strongly responsive to varying seasonal dilution as driven by catchment evapotranspiration, an effect that was amplified slightly in unmined catchments with greater relative forest cover. Evaluation of alternative sampling intervals indicated that discrete sampling can approximate the model performance afforded by high-frequency data but model error increases rapidly as discrete sampling intervals exceed 30 days. 
This study demonstrates that intra-annual variation of salinity in temperate forested headwater streams of Appalachia USA follows a natural seasonal pattern, driven by interactive influences on water quantity and quality of climate, geology, and terrestrial vegetation. Because climatic and vegetation dynamics vary annually in a seasonal, cyclic manner, a periodic function can be used to fit a sinusoidal model to the salinity pattern. The model framework used here is broadly applicable in systems with streamflow-dependent chronic salinity stress.
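A sinusoidal periodic function of the kind used here can be fitted by ordinary least squares after rewriting the phase-shifted sinusoid as a sine/cosine pair (a minimal sketch, assuming a 365-day period and daily time units; the study's actual fitting procedure may differ):

```python
import math

def fit_annual_cycle(days, sc):
    """Least-squares fit of SC(t) = c + a*sin(wt) + b*cos(wt), w = 2*pi/365.
    Returns (c, a, b); the amplitude relative to c gives the fractional
    seasonal swing (about +/-20% in the study's streams)."""
    w = 2 * math.pi / 365.0
    rows = [[1.0, math.sin(w * t), math.cos(w * t)] for t in days]
    # normal equations A^T A x = A^T y for the 3 coefficients
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    aty = [sum(r[i] * y for r, y in zip(rows, sc)) for i in range(3)]
    # Gauss-Jordan elimination on the 3x3 augmented system
    m = [ata[i] + [aty[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return tuple(m[i][3] / m[i][i] for i in range(3))
```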
Predictive Caching Using the TDAG Algorithm
NASA Technical Reports Server (NTRS)
Laird, Philip; Saul, Ronald
1992-01-01
We describe how the TDAG algorithm for learning to predict symbol sequences can be used to design a predictive cache store. A model of a two-level mass storage system is developed and used to calculate the performance of the cache under various conditions. Experimental simulations provide good confirmation of the model.
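The TDAG algorithm itself is not reproduced here, but the flavor of sequence-learning-driven prefetching can be illustrated with a simplified variable-depth context predictor (a hypothetical stand-in for illustration, not Laird and Saul's algorithm):

```python
from collections import defaultdict

class PrefixPredictor:
    """Simplified stand-in for a TDAG-style sequence learner: counts the
    continuations of recent access contexts up to a maximum depth and
    predicts the most frequent next symbol, e.g. the block to prefetch."""

    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        self.counts = defaultdict(lambda: defaultdict(int))
        self.history = []

    def observe(self, symbol):
        # credit this symbol as the continuation of every recent context
        for d in range(1, self.max_depth + 1):
            if len(self.history) >= d:
                ctx = tuple(self.history[-d:])
                self.counts[ctx][symbol] += 1
        self.history.append(symbol)

    def predict(self):
        # prefer the longest context for which statistics exist
        for d in range(self.max_depth, 0, -1):
            ctx = tuple(self.history[-d:])
            if ctx in self.counts:
                nxt = self.counts[ctx]
                return max(nxt, key=nxt.get)
        return None
```

A cache controller would call `observe` on each block access and prefetch whatever `predict` returns into the faster level of the two-level store.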
Predictive Model of Linear Antimicrobial Peptides Active against Gram-Negative Bacteria.
Vishnepolsky, Boris; Gabrielian, Andrei; Rosenthal, Alex; Hurt, Darrell E; Tartakovsky, Michael; Managadze, Grigol; Grigolava, Maya; Makhatadze, George I; Pirtskhalava, Malak
2018-05-29
Antimicrobial peptides (AMPs) have been identified as a potential new class of anti-infectives for drug development. Many computational methods attempt to predict AMPs. Most of them can only predict whether a peptide will show any antimicrobial potency, but to the best of our knowledge, there are no tools which can predict antimicrobial potency against particular strains. Here we present a predictive model of linear AMPs active against particular Gram-negative strains, relying on a semi-supervised machine-learning approach with a density-based clustering algorithm. The algorithm can distinguish peptides active against a particular strain from peptides which may also be active, but not against the considered strain. The available AMP prediction tools cannot carry out this task. The prediction tool based on the algorithm suggested herein is available at https://dbaasp.org.
An underwater turbulence degraded image restoration algorithm
NASA Astrophysics Data System (ADS)
Furhad, Md. Hasan; Tahtali, Murat; Lambert, Andrew
2017-09-01
Underwater turbulence occurs due to random fluctuations of temperature and salinity in the water. These fluctuations are responsible for variations in water density, refractive index and attenuation, which impose random geometric distortions, spatio-temporally varying blur, limited range visibility and limited contrast on the acquired images. Several restoration techniques have been developed to address this problem, such as image registration based, lucky region based and centroid-based image restoration algorithms. Although these methods demonstrate good results in removing turbulence effects, they require computationally intensive image registration and impose high CPU and memory loads. Thus, in this paper, a simple patch-based dictionary learning algorithm is proposed to restore the image while avoiding the costly image registration step. Dictionary learning is a machine learning technique which builds a dictionary of non-zero atoms derived from the sparse representation of an image or signal. The image is divided into several patches, and the sharp patches are detected among them. Next, dictionary learning is performed on these patches to estimate the restored image. Finally, an image deconvolution algorithm is applied to the estimated restored image to remove residual noise.
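The "sharp patch detection" step can be illustrated with a common sharpness proxy, the variance of a discrete Laplacian response (an assumption for illustration; the paper does not specify its sharpness measure). Patches are plain 2-D lists of intensities:

```python
def sharpness(patch):
    """Variance of the discrete Laplacian over interior pixels: a common
    proxy for patch sharpness when selecting 'lucky' (least-blurred) patches."""
    h, w = len(patch), len(patch[0])
    vals = []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            # 4-neighbor Laplacian: strong response at edges, zero on flat areas
            lap = (patch[i - 1][j] + patch[i + 1][j] + patch[i][j - 1]
                   + patch[i][j + 1] - 4 * patch[i][j])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def sharpest(patches):
    """Pick the sharpest candidate for a given patch location."""
    return max(patches, key=sharpness)
```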
Gibbons, Taylor C; Metzger, David C H; Healy, Timothy M; Schulte, Patricia M
2017-05-01
Phenotypic plasticity is thought to facilitate the colonization of novel environments and shape the direction of evolution in colonizing populations. However, the relative prevalence of various predicted patterns of changes in phenotypic plasticity following colonization remains unclear. Here, we use a whole-transcriptome approach to characterize patterns of gene expression plasticity in the gills of a freshwater-adapted and a saltwater-adapted ecotype of threespine stickleback (Gasterosteus aculeatus) exposed to a range of salinities. The response of the gill transcriptome to environmental salinity had a large shared component common to both ecotypes (2159 genes) with significant enrichment of genes involved in transmembrane ion transport and the restructuring of the gill epithelium. This transcriptional response to freshwater acclimation is induced at salinities below two parts per thousand. There was also differentiation in gene expression patterns between ecotypes (2515 genes), particularly in processes important for changes in the gill structure and permeability. Only 508 genes that differed between ecotypes also responded to salinity and no specific processes were enriched among this gene set, and an even smaller number (87 genes) showed evidence of changes in the extent of the response to salinity acclimation between ecotypes. No pattern of relative expression dominated among these genes, suggesting that neither gains nor losses of plasticity dominated the changes in expression patterns between the ecotypes. These data demonstrate that multiple patterns of changes in gene expression plasticity can occur following colonization of novel habitats.
Rotational magneto-acousto-electric tomography (MAET): theory and experimental validation
Kunyansky, L; Ingram, C P; Witte, R S
2017-01-01
We present a novel two-dimensional (2D) MAET scanner, with a rotating object of interest and two fixed pairs of electrodes. Such an acquisition scheme, with our novel reconstruction techniques, recovers the boundaries of the regions of constant conductivity uniformly well, regardless of their orientation. We also present a general image reconstruction algorithm for 2D MAET in a circular chamber with point-like electrodes immersed in the saline surrounding the object. An alternative linearized reconstruction procedure is developed, suitable for recovering the material interfaces (boundaries) when a non-ideal piezoelectric transducer is used for acoustic excitation. The operation of the scanner and the linearized reconstruction algorithm is demonstrated using several phantoms made of high-contrast materials and a biological sample. PMID:28323633
Community detection in complex networks using link prediction
NASA Astrophysics Data System (ADS)
Cheng, Hui-Min; Ning, Yi-Zi; Yin, Zhao; Yan, Chao; Liu, Xin; Zhang, Zhong-Yuan
2018-01-01
Community detection and link prediction are both of great significance in network analysis, which provide very valuable insights into topological structures of the network from different perspectives. In this paper, we propose a novel community detection algorithm with inclusion of link prediction, motivated by the question whether link prediction can be devoted to improving the accuracy of community partition. For link prediction, we propose two novel indices to compute the similarity between each pair of nodes, one of which aims to add missing links, and the other tries to remove spurious edges. Extensive experiments are conducted on benchmark data sets, and the results of our proposed algorithm are compared with two classes of baselines. In conclusion, our proposed algorithm is competitive, revealing that link prediction does improve the precision of community detection.
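The add-missing/remove-spurious idea can be sketched with the classical common-neighbors index feeding a simple label-propagation step (an illustrative stand-in; the paper proposes its own two similarity indices and its own detection procedure):

```python
import random
from collections import defaultdict

def common_neighbors(adj, u, v):
    """Classical link-prediction score: number of shared neighbors."""
    return len(adj[u] & adj[v])

def add_predicted_links(adj, top_k=2):
    """Add the top_k most similar unconnected pairs as predicted missing
    links before running community detection."""
    nodes = sorted(adj)
    scored = [(common_neighbors(adj, u, v), u, v)
              for i, u in enumerate(nodes) for v in nodes[i + 1:]
              if v not in adj[u]]
    for s, u, v in sorted(scored, reverse=True)[:top_k]:
        if s > 0:
            adj[u].add(v)
            adj[v].add(u)

def label_propagation(adj, rounds=10, seed=3):
    """Asynchronous label propagation; ties broken toward the smallest label."""
    rng = random.Random(seed)
    labels = {n: n for n in adj}
    nodes = sorted(adj)
    for _ in range(rounds):
        rng.shuffle(nodes)
        for n in nodes:
            if not adj[n]:
                continue
            tally = defaultdict(int)
            for m in adj[n]:
                tally[labels[m]] += 1
            labels[n] = max(sorted(tally), key=tally.get)
    return labels
```

Removing spurious edges would work the same way in reverse: score existing edges and delete the lowest-similarity ones before the detection step.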
Arbuthnot, Mary; Mooney, David P
2017-01-01
It is crucial to identify cervical spine injuries while minimizing ionizing radiation. This study analyzes the sensitivity and negative predictive value of a pediatric cervical spine clearance algorithm. We performed a retrospective review of all children <21 years old who were admitted following blunt trauma and underwent cervical spine clearance utilizing our institution's cervical spine clearance algorithm over a 10-year period. Age, gender, International Classification of Diseases 9th Edition diagnosis codes, presence or absence of a cervical collar on arrival, Injury Severity Score, and type of cervical spine imaging obtained were extracted from the trauma registry and electronic medical record. Descriptive statistics were used, and the sensitivity and negative predictive value of the algorithm were calculated. Approximately 125,000 children were evaluated in the Emergency Department and 11,331 were admitted. Of the admitted children, 1023 patients arrived in a cervical collar without advanced cervical spine imaging and were evaluated using the cervical spine clearance algorithm. Algorithm sensitivity was 94.4% and the negative predictive value was 99.9%. There was one missed injury, a spinous process tip fracture in a teenager maintained in a collar. Our algorithm was associated with a low missed injury rate and low CT utilization rate, even in children <3 years old. Level of evidence: IV.
Predicting the vertical structure of tidal current and salinity in San Francisco Bay, California
Ford, Michael; Wang, Jia; Cheng, Ralph T.
1990-01-01
A two-dimensional laterally averaged numerical estuarine model is developed to study the vertical variations of tidal hydrodynamic properties in the central/north part of San Francisco Bay, California. Tidal stage data, current meter measurements, and conductivity, temperature, and depth profiling data in San Francisco Bay are used for comparison with model predictions. An extensive review of the literature is conducted to assess the success and failure of previous similar investigations and to establish a strategy for development of the present model. A σ plane transformation is used in the vertical dimension to alleviate problems associated with fixed grid model applications in the bay, where the tidal range can be as much as 20–25% of the total water depth. Model predictions of tidal stage and velocity compare favorably with the available field data, and prototype salinity stratification is qualitatively reproduced. Conclusions from this study as well as future model applications and research needs are discussed.
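The σ transformation referred to above maps the moving free surface onto a fixed computational interval, so the vertical grid stretches and compresses with the tide (important here, where the tidal range can reach 20-25% of the total depth). One common convention, used for this sketch, is σ = (z − η)/(H + η), with σ = 0 at the surface z = η and σ = −1 at the bed z = −H; the paper's exact convention may differ:

```python
def sigma_levels(eta, depth, n_levels=10):
    """Physical z-coordinates of uniformly spaced sigma levels.
    eta: free-surface elevation; depth: still-water depth H.
    sigma = (z - eta) / (H + eta) runs from 0 (surface) to -1 (bed)."""
    total = depth + eta                                   # instantaneous depth
    sigmas = [-k / (n_levels - 1) for k in range(n_levels)]  # 0 .. -1
    return [eta + s * total for s in sigmas]
```

At high tide (eta > 0) the same levels span a deeper column; at low tide they compress, so no fixed-grid cells dry out or appear.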
Fang, Li-Gang; Chen, Shui-Sen; Li, Dong; Li, Hong-Li
2009-01-01
Spectra, salinity, total suspended solids (TSS, in mg/L) and colored dissolved organic matter (CDOM, ag(400) at 400 nm) sampled at stations at 44 different locations on December 18, 19 and 21, 2006, were measured and analyzed. The studied field covered a large variety of optically different waters: the absorption coefficient of CDOM [ag(400), in m-1] varied between 0.488 and 1.41 m-1, and the TSS concentrations varied between 7.0 and 241.1 mg/L. In order to detect the salinity of the Pearl River Estuary, we analyzed the spectral properties of TSS and CDOM, and the relationships between field water reflectance spectra and water constituent concentrations, based on synchronous in-situ and satellite hyper-spectral image analysis. A good positive linear correlation was found between the in-situ reflectance ratio R680/R527 and TSS concentration (R2 = 0.65) over the salinity range 1.74-22.12. However, the results also showed that the absorption coefficient of CDOM was not tightly correlated with reflectance. In addition, we observed two significant relationships (R2 > 0.77): one between TSS concentration and surface salinity, and the other between the absorption coefficient of CDOM and surface salinity. Finally, we developed a novel method to map the surface salinity distribution of estuarine waters from calibrated EO-1 Hyperion reflectance data in the Pearl River Estuary, i.e. channels with high salinity and shoals with low salinity. The EO-1 Hyperion-derived surface salinity and TSS concentrations were validated using in-situ data collected on December 21, 2006, synchronously with the EO-1 Hyperion satellite imagery acquisition. The results showed that the semi-empirical relationships are capable of predicting salinity from EO-1 Hyperion imagery in the Pearl River Estuary (RMSE < 2‰). PMID:22389623
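The semi-empirical band-ratio approach reduces to an ordinary least-squares line (e.g., TSS against the R680/R527 ratio, then salinity against TSS or CDOM) plus an RMSE check against the in-situ validation data, sketched here (the fitted coefficients are illustrative, not the paper's):

```python
def linear_fit(x, y):
    """Ordinary least squares for y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def rmse(pred, obs):
    """Root mean squared error used to validate satellite-derived values
    against synchronous in-situ measurements."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(pred)) ** 0.5
```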
Transport of water and ions in partially water-saturated porous media. Part 2. Filtration effects
NASA Astrophysics Data System (ADS)
Revil, A.
2017-05-01
A new set of constitutive equations describing the transport of ions and water through charged porous media, including the effect of ion filtration, is applied to the problem of reverse osmosis and salt diffusion. Starting with the constitutive equations derived in Paper 1, I first determine specific formulas for the osmotic coefficient and effective diffusion coefficient of a binary symmetric 1:1 salt (such as KCl or NaCl) as a function of a dimensionless number Θ corresponding to the ratio between the cation exchange capacity (CEC) and the salinity. The modeling is first carried out with the Donnan model used to describe the concentrations of the charge carriers in the pore water phase. Then a new model is developed in the thin double layer approximation to determine these concentrations. These models provide explicit relationships between the concentrations of the ionic species in the pore space, those in a neutral reservoir in local equilibrium with the pore space, and the CEC. The cases of reverse osmosis and the diffusion coefficient are analyzed in detail for saturated and partially saturated porous materials. Comparisons are made with experimental data from the literature obtained on bentonite. The model correctly predicts the influence of salinity (including membrane behavior at high salinities), porosity, cation type (K+ versus Na+), and water saturation on the osmotic coefficient. It also correctly predicts the dependence of the diffusion coefficient of the salt on salinity.
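The Donnan step can be sketched with the classic textbook result for a 1:1 salt in a negatively charged medium: electroneutrality in the pore water gives C+ = C- + Qv (with Qv the volumetric charge density, proportional to the CEC), and equilibrium with the neutral reservoir of salinity Cf gives C+ C- = Cf². This is the standard Donnan equilibrium, not the paper's specific formulas; the ratio Qv/Cf plays the role of the dimensionless parameter Θ described above.

```python
import math

# Donnan equilibrium sketch for a binary symmetric 1:1 salt in a negatively
# charged porous medium. Q_v: volumetric charge density (~ CEC); C_f:
# salinity of the neutral reservoir. Standard textbook result, illustrative.

def donnan_concentrations(C_f, Q_v):
    """Pore-water counter-ion (C+) and co-ion (C-) concentrations."""
    C_minus = -Q_v / 2.0 + math.sqrt((Q_v / 2.0) ** 2 + C_f ** 2)  # co-ion
    C_plus = C_minus + Q_v                                          # counter-ion
    return C_plus, C_minus
```

In the dilute-charge limit (Qv → 0) both concentrations reduce to Cf, while at high Θ = Qv/Cf the co-ion is excluded, which is the membrane behavior mentioned in the abstract.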
A material political economy: Automated Trading Desk and price prediction in high-frequency trading.
MacKenzie, Donald
2017-04-01
This article contains the first detailed historical study of one of the new high-frequency trading (HFT) firms that have transformed many of the world's financial markets. The study, of Automated Trading Desk (ATD), one of the earliest and most important such firms, focuses on how ATD's algorithms predicted share price changes. The article argues that political-economic struggles are integral to the existence of some of the 'pockets' of predictable structure in the otherwise random movements of prices, to the availability of the data that allow algorithms to identify these pockets, and to the capacity of algorithms to use these predictions to trade profitably. The article also examines the role of HFT algorithms such as ATD's in the epochal, fiercely contested shift in US share trading from 'fixed-role' markets towards 'all-to-all' markets.
Local-search based prediction of medical image registration error
NASA Astrophysics Data System (ADS)
Saygili, Görkem
2018-03-01
Medical image registration is a crucial task in many different medical imaging applications. Hence, a considerable amount of work has been published recently that aims to predict the error in a registration without any human effort. If available, these error predictions can be fed back to the registration algorithm to further improve its performance. Recent methods generally start by extracting image-based and deformation-based features, then apply feature pooling and finally train a Random Forest (RF) regressor to predict the real registration error. Image-based features can be calculated after a single registration but provide limited accuracy, whereas deformation-based features, such as the variation of the deformation vector field, may require up to 20 registrations, which is considerably time-consuming. This paper proposes to use features extracted from a local search algorithm as image-based features to estimate the error of a registration. The proposed method uses a local search algorithm to find corresponding voxels between registered image pairs and, based on the amount of shift and stereo confidence measures, densely predicts the registration error in millimetres using an RF regressor. Compared to other algorithms in the literature, the proposed algorithm does not require multiple registrations, can be efficiently implemented on a Graphics Processing Unit (GPU), and still provides highly accurate error predictions in the presence of large registration errors. Experimental results with real registrations on a public dataset indicate a substantially high accuracy achieved by using features from the local search algorithm.
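The core local-search feature can be illustrated in one dimension: for each voxel of the "registered" signal, search a small window in the reference for the best sum-of-squared-differences match; the magnitude of the best shift is the image-based feature that would be fed to the RF regressor. This is a toy 1-D sketch; the paper works on 3-D volumes and adds stereo confidence measures.

```python
import numpy as np

# 1-D toy version of the local-search feature: per-voxel best-matching shift
# within a search radius, using SSD over a small patch. Illustrative only.

def local_search_shifts(ref, moved, radius=3, patch=2):
    n = len(ref)
    shifts = np.zeros(n, dtype=int)
    for i in range(patch + radius, n - patch - radius):
        best, best_s = np.inf, 0
        for s in range(-radius, radius + 1):
            ssd = np.sum((ref[i - patch:i + patch + 1]
                          - moved[i + s - patch:i + s + patch + 1]) ** 2)
            if ssd < best:
                best, best_s = ssd, s
        shifts[i] = best_s
    return shifts
```

Large local shifts signal large residual misalignment, which is exactly the information the regressor exploits to predict the error in millimetres.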
NASA Technical Reports Server (NTRS)
Joyce, T. M.; Dunworth, J. A.; Schubert, D. M.; Stalcup, M. C.; Barbour, R. L.
1988-01-01
The degree to which Acoustic-Doppler Current Profiler (ADCP) and expendable bathythermograph (XBT) data can provide quantitative measurements of the velocity structure and transport of the Gulf Stream is addressed. An algorithm is used to generate salinity from temperature and depth using a historical temperature/salinity relation for the NW Atlantic. Results have been simulated using CTD data by comparing real and pseudo salinity files. Errors are typically less than 2 dynamic cm for the upper 800 m out of a total signal of 80 cm (across the Gulf Stream). When combined with ADCP data for a near-surface reference velocity, transport errors in isopycnal layers are less than about 1 Sv (10 to the 6th power cu m/s), as is the difference in total transport for the upper 800 m between real and pseudo data. The method is capable of measuring the real variability of the Gulf Stream and, when combined with altimeter data, can provide estimates of the geoid slope with oceanic errors of a few parts in 10 to the 8th power over horizontal scales of 500 km.
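The pseudo-salinity step can be sketched as a simple lookup: given only temperature (and depth) from an XBT, salinity is interpolated from a climatological T/S relation. The table below is a made-up stand-in for the historical NW Atlantic relation, purely for illustration.

```python
import numpy as np

# Pseudo-salinity sketch: interpolate salinity from a climatological T/S
# curve. The table values are hypothetical, not the NW Atlantic climatology.
ts_temperature = np.array([2.0, 6.0, 10.0, 14.0, 18.0])    # deg C (hypothetical)
ts_salinity    = np.array([34.9, 35.0, 35.3, 35.8, 36.2])  # psu  (hypothetical)

def pseudo_salinity(temp):
    """Look up salinity for a measured temperature via linear interpolation."""
    return np.interp(temp, ts_temperature, ts_salinity)
```

Comparing dynamic heights computed from such pseudo-salinity profiles against those from full CTD salinity is what yields the ~2 dynamic cm error estimate quoted in the abstract.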
New Dielectric Measurement Data to Determine the Permittivity of Seawater at 1.413 GHz
NASA Technical Reports Server (NTRS)
Lang, R.; Zhou, Y.; Utku, C.; Levine, D.
2012-01-01
This paper describes the new measurements - made in 2010-2011 - of the dielectric constant of seawater at 1.413 GHz using a resonant cavity technique. The purpose of these measurements is to develop an accurate relationship concerning the dependence of the dielectric constant of seawater on temperature and salinity for use by the Aquarius inversion algorithm. Aquarius is a NASA/CONAE satellite mission launched in June of 2011 with the primary mission of measuring global sea surface salinity with a 1.413 GHz radiometer to an accuracy of 0.2 psu. A brass microwave cavity resonant at 1.413 GHz has been used to measure the dielectric constant of seawater. The seawater is introduced into the cavity through a capillary glass tube having an inner diameter of 0.1 mm. The change of resonant frequency and the cavity Q value are used to determine the real and imaginary parts of the dielectric constant of seawater. Measurements are automated with Visual Basic software developed at the George Washington University. In this paper, new results from measurements made since September 2010 will be presented for salinities of 30, 35 and 38 psu with a temperature range of 0 C to 35 C in intervals of 5 C. These measurements are more accurate than earlier measurements made in 2008. The new results will be compared to the Klein-Swift (KS) and Meissner-Wentz (MW) model functions. The importance of an accurate model function will be illustrated by using these model functions to invert the Aquarius brightness temperature to retrieve the salinity values. The salinity values will be compared to co-located in situ data collected by Argo buoys.
Choosing the appropriate forecasting model for predictive parameter control.
Aleti, Aldeida; Moser, Irene; Meedeniya, Indika; Grunske, Lars
2014-01-01
All commonly used stochastic optimisation algorithms have to be parameterised to perform effectively. Adaptive parameter control (APC) is an effective method used for this purpose. APC repeatedly adjusts parameter values during the optimisation process for optimal algorithm performance. The assignment of parameter values for a given iteration is based on previously measured performance. In recent research, time series prediction has been proposed as a method of projecting the probabilities to use for parameter value selection. In this work, we examine the suitability of a variety of prediction methods for the projection of future parameter performance based on previous data. All considered prediction methods have assumptions the time series data has to conform to for the prediction method to provide accurate projections. Looking specifically at parameters of evolutionary algorithms (EAs), we find that all standard EA parameters with the exception of population size conform largely to the assumptions made by the considered prediction methods. Evaluating the performance of these prediction methods, we find that linear regression provides the best results by a very small and statistically insignificant margin. Regardless of the prediction method, predictive parameter control outperforms state of the art parameter control methods when the performance data adheres to the assumptions made by the prediction method. When a parameter's performance data does not adhere to the assumptions made by the forecasting method, the use of prediction does not have a notable adverse impact on the algorithm's performance.
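The time-series forecasting step can be sketched as fitting a linear trend to each parameter value's recent performance history and extrapolating one step ahead; parameter values are then selected with probability proportional to the forecasts. The window size and the proportional-selection rule here are assumptions for illustration, not the paper's exact protocol.

```python
import numpy as np

# Predictive parameter control sketch: linear-regression forecast of the next
# performance value of each parameter setting, then proportional selection.

def forecast_next(history, window=5):
    """Fit a line to the last `window` observations and extrapolate one step."""
    y = np.asarray(history[-window:], dtype=float)
    x = np.arange(len(y))
    slope, intercept = np.polyfit(x, y, 1)
    return slope * len(y) + intercept

def selection_probabilities(histories):
    """Probability of selecting each parameter value, proportional to forecast."""
    f = np.array([max(forecast_next(h), 1e-9) for h in histories])
    return f / f.sum()
```

A parameter value whose measured performance is trending upward receives a higher selection probability in the next iteration, which is the essence of predictive parameter control.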
Increased Accuracy in the Measurement of the Dielectric Constant of Seawater at 1.413 GHz
NASA Technical Reports Server (NTRS)
Zhou, Y.; Lang, R.; Drego, C.; Utku, C.; LeVine, D.
2012-01-01
This paper describes the latest results of measurements of the dielectric constant at 1.413 GHz using a resonant cavity technique. The purpose of these measurements is to develop an accurate relationship for the dependence of the dielectric constant of sea water on temperature and salinity, which is needed by the Aquarius inversion algorithm to retrieve salinity. Aquarius is the major instrument on the Aquarius/SAC-D observatory, a NASA/CONAE satellite mission launched in June of 2011 with the primary mission of measuring global sea surface salinity to an accuracy of 0.2 psu. Aquarius measures salinity with a 1.413 GHz radiometer and uses a scatterometer to compensate for the effects of surface roughness. The core of the seawater dielectric constant measurement system is a brass microwave cavity that is resonant at 1.413 GHz. The seawater is introduced into the cavity through a capillary glass tube having an inner diameter of 0.1 mm. The change of resonance frequency and the cavity Q value are used to determine the real and imaginary parts of the dielectric constant of the seawater introduced into the thin tube. Measurements are automated with the help of software developed at the George Washington University. In this talk, new results from measurements made since September 2010 will be presented for salinities of 30, 35 and 38 psu over a temperature range of 0 C to 35 C in intervals of 5 C. These measurements are more accurate than earlier measurements made in 2008 because of a new method for measuring the calibration constant using methanol. In addition, the variance of repeated seawater measurements has been reduced by letting the system stabilize overnight between temperature changes. The new results are compared to the Klein-Swift and Meissner-Wentz model functions. The importance of an accurate model function will be illustrated by using these model functions to invert the Aquarius brightness temperature to get the salinity values. The salinity values will be compared to co-located in situ data collected by Argo buoys.
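The cavity-perturbation inversion used in both dielectric-measurement abstracts can be sketched as follows: for a small sample, the real part of the permittivity follows the resonant-frequency shift and the imaginary part follows the change in 1/Q. The calibration constants A and B would be determined with a reference liquid (the abstract mentions methanol); the values below are invented for illustration and are not the GWU system's constants.

```python
# Cavity-perturbation sketch. A, B: calibration constants obtained from a
# reference liquid (hypothetical values here, for illustration only).
A, B = 5.0e3, 2.5e3

def permittivity(f_empty, f_sample, q_empty, q_sample):
    """Real/imaginary permittivity from resonance shift and Q change."""
    eps_real = 1.0 + A * (f_empty - f_sample) / f_sample   # frequency shift
    eps_imag = B * (1.0 / q_sample - 1.0 / q_empty)        # loss from 1/Q change
    return eps_real, eps_imag
```

Repeating the measurement over the 0-35 C, 30-38 psu grid then yields the model function (permittivity versus temperature and salinity) that the Aquarius inversion needs.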
NASA Technical Reports Server (NTRS)
Hess, Ronald A.
1990-01-01
A collection of technical papers are presented that cover modeling pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers presented are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.
Dugan, Hilary A; Bartlett, Sarah L; Burke, Samantha M; Doubek, Jonathan P; Krivak-Tetley, Flora E; Skaff, Nicholas K; Summers, Jamie C; Farrell, Kaitlin J; McCullough, Ian M; Morales-Williams, Ana M; Roberts, Derek C; Ouyang, Zutao; Scordo, Facundo; Hanson, Paul C; Weathers, Kathleen C
2017-04-25
The highest densities of lakes on Earth are in north temperate ecosystems, where increasing urbanization and associated chloride runoff can salinize freshwaters and threaten lake water quality and the many ecosystem services lakes provide. However, the extent to which lake salinity may be changing at broad spatial scales remains unknown, leading us to first identify spatial patterns and then investigate the drivers of these patterns. Significant decadal trends in lake salinization were identified using a dataset of long-term chloride concentrations from 371 North American lakes. Landscape and climate metrics calculated for each site demonstrated that impervious land cover was a strong predictor of chloride trends in Northeast and Midwest North American lakes. As little as 1% impervious land cover surrounding a lake increased the likelihood of long-term salinization. Considering that 27% of large lakes in the United States have >1% impervious land cover around their perimeters, the potential for steady and long-term salinization of these aquatic systems is high. This study predicts that many lakes will exceed the aquatic life threshold criterion for chronic chloride exposure (230 mg L -1 ), stipulated by the US Environmental Protection Agency (EPA), in the next 50 y if current trends continue.
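A robust way to estimate the long-term chloride trends described above is a Theil-Sen slope (the median of all pairwise slopes), which resists outliers in noisy lake time series. This is a generic trend estimator, not necessarily the paper's exact trend test.

```python
import statistics

# Theil-Sen trend sketch: median of all pairwise slopes of a chloride series.
# Robust to occasional outlying concentration measurements.

def theil_sen_slope(years, conc):
    """Median pairwise slope (e.g. mg Cl per litre per year)."""
    slopes = [(conc[j] - conc[i]) / (years[j] - years[i])
              for i in range(len(years)) for j in range(i + 1, len(years))]
    return statistics.median(slopes)
```

Applied to each of the 371 lakes, positive slopes flag the salinizing systems whose drivers (e.g. impervious land cover) are then examined.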
NASA Technical Reports Server (NTRS)
Troccoli, Alberto; Rienecker, Michele M.; Keppenne, Christian L.; Johnson, Gregory C.
2003-01-01
The NASA Seasonal-to-Interannual Prediction Project (NSIPP) has developed an ocean data assimilation system to initialize the quasi-isopycnal ocean model used in our experimental coupled-model forecast system. Initial tests of the system have focused on the assimilation of temperature profiles in an optimal interpolation framework. It is now recognized that correction of temperature only often introduces spurious water masses. The resulting density distribution can be statically unstable and can also have a detrimental impact on the velocity distribution. Several simple schemes have been developed to try to correct these deficiencies. Here the salinity field is corrected by using a scheme which assumes that the temperature-salinity relationship of the model background is preserved during the assimilation. The scheme was first introduced for a z-level model by Troccoli and Haines (1999). A large set of subsurface observations of salinity and temperature is used to cross-validate two data assimilation experiments run for the 6-year period 1993-1998. In these two experiments only subsurface temperature observations are used, but in one case the salinity field is also updated whenever temperature observations are available.
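The T-S-preserving correction can be sketched as converting a temperature analysis increment into a salinity increment using the local slope dS/dT of the background profile. This is a simplified scalar version of a Troccoli-Haines-type scheme; real systems apply it level by level through the water column.

```python
# Salinity-increment sketch: preserve the background T-S relationship by
# scaling the temperature increment with the local background dS/dT.
# Simplified two-level version, for illustration.

def salinity_increment(dT, t_background, s_background):
    """dS = (dS/dT) * dT, with dS/dT from two adjacent background levels."""
    (t1, t2), (s1, s2) = t_background, s_background
    dSdT = (s2 - s1) / (t2 - t1)
    return dSdT * dT
```

Because salinity moves along the background T-S curve rather than staying fixed, the analyzed water mass keeps a physically plausible density, avoiding the spurious water masses mentioned above.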
Space shuttle propulsion parameter estimation using optimal estimation techniques
NASA Technical Reports Server (NTRS)
1983-01-01
The first twelve system state variables are presented with the necessary mathematical developments for incorporating them into the filter/smoother algorithm. Other state variables, i.e., aerodynamic coefficients can be easily incorporated into the estimation algorithm, representing uncertain parameters, but for initial checkout purposes are treated as known quantities. An approach for incorporating the NASA propulsion predictive model results into the optimal estimation algorithm was identified. This approach utilizes numerical derivatives and nominal predictions within the algorithm with global iterations of the algorithm. The iterative process is terminated when the quality of the estimates provided no longer significantly improves.
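The numerical-derivative step mentioned above can be sketched as a central-difference Jacobian of the predictive model about the nominal prediction, one parameter at a time. The model function and step size below are generic placeholders, not the NASA propulsion model.

```python
# Central-difference Jacobian sketch: linearize a model f: R^n -> R^m about x
# for use inside a filter/smoother. Generic, illustrative implementation.

def numerical_jacobian(f, x, h=1e-6):
    """Return the m x n Jacobian of f at x via central differences."""
    n = len(x)
    cols = []
    for i in range(n):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        fp, fm = f(xp), f(xm)
        cols.append([(a - b) / (2 * h) for a, b in zip(fp, fm)])
    # transpose the per-parameter columns into Jacobian rows
    return [list(row) for row in zip(*cols)]
```

Re-evaluating this Jacobian at each global iteration, as the abstract describes, lets the estimator refine the linearization until the estimates stop improving.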
Development of an Evolutionary Algorithm for the ab Initio Discovery of Two-Dimensional Materials
NASA Astrophysics Data System (ADS)
Revard, Benjamin Charles
Crystal structure prediction is an important first step on the path toward computational materials design. Increasingly robust methods have become available in recent years for computing many materials properties, but because properties are largely a function of crystal structure, the structure must be known before these methods can be brought to bear. In addition, structure prediction is particularly useful for identifying low-energy structures of subperiodic materials, such as two-dimensional (2D) materials, which may adopt unexpected structures that differ from those of the corresponding bulk phases. Evolutionary algorithms, which are heuristics for global optimization inspired by biological evolution, have proven to be a fruitful approach for tackling the problem of crystal structure prediction. This thesis describes the development of an improved evolutionary algorithm for structure prediction and several applications of the algorithm to predict the structures of novel low-energy 2D materials. The first part of this thesis contains an overview of evolutionary algorithms for crystal structure prediction and presents our implementation, including details of extending the algorithm to search for clusters, wires, and 2D materials, improvements to efficiency when running in parallel, improved composition space sampling, and the ability to search for partial phase diagrams. We then present several applications of the evolutionary algorithm to 2D systems, including InP, the C-Si and Sn-S phase diagrams, and several group-IV dioxides. This thesis makes use of the Cornell graduate school's "papers" option. Chapters 1 and 3 correspond to the first-author publications of Refs. [131] and [132], respectively, and chapter 2 will soon be submitted as a first-author publication. The material in chapter 4 is taken from Ref. [144], in which I share joint first-authorship. In this case I have included only my own contributions.
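The evolutionary-algorithm loop described above can be sketched with a toy real-valued analogue: tournament selection, Gaussian mutation, and elitism drive a population toward low "energy". In the real algorithm the candidates are crystal structures (clusters, wires, 2-D sheets) and the energy comes from ab initio calculations; everything below is a minimal illustrative stand-in.

```python
import random

# Minimal evolutionary-algorithm sketch (toy stand-in for structure search).

def energy(x):                        # toy landscape with minimum at x = 0
    return sum(v * v for v in x)

def evolve(dim=3, pop_size=20, generations=80, sigma=0.3):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=energy)
    for _ in range(generations):
        # tournament selection: each parent is the best of 3 random candidates
        parents = [min(random.sample(pop, 3), key=energy) for _ in range(pop_size)]
        # Gaussian mutation plays the role of structure-perturbation moves
        pop = [[v + random.gauss(0, sigma) for v in p] for p in parents]
        pop.append(best[:])           # elitism: never lose the best candidate
        best = min(pop, key=energy)
    return best
```

Elitism guarantees the best energy never increases, which mirrors how a structure-search EA keeps its lowest-energy structures across generations.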
NASA Astrophysics Data System (ADS)
Lin, Mingpei; Xu, Ming; Fu, Xiaoyu
2017-05-01
Currently, a tremendous amount of space debris in Earth's orbit imperils operational spacecraft. It is essential to undertake risk assessments of collisions and predict dangerous encounters in space. However, collision predictions for an enormous amount of space debris give rise to large-scale computations. In this paper, a parallel algorithm is established on the Compute Unified Device Architecture (CUDA) platform of NVIDIA Corporation for collision prediction. According to the parallel structure of NVIDIA graphics processors, a block decomposition strategy is adopted in the algorithm. Space debris is divided into batches, and the computation and data transfer operations of adjacent batches overlap. As a consequence, the latency to access shared memory during the entire computing process is significantly reduced, and a higher computing speed is reached. Theoretically, a simulation of collision prediction for space debris of any amount and for any time span can be executed. To verify this algorithm, a simulation example including 1382 pieces of debris, whose operational time scales vary from 1 min to 3 days, is conducted on Tesla C2075 of NVIDIA. The simulation results demonstrate that with the same computational accuracy as that of a CPU, the computing speed of the parallel algorithm on a GPU is 30 times that on a CPU. Based on this algorithm, collision prediction of over 150 Chinese spacecraft for a time span of 3 days can be completed in less than 3 h on a single computer, which meets the timeliness requirement of the initial screening task. Furthermore, the algorithm can be adapted for multiple tasks, including particle filtration, constellation design, and Monte-Carlo simulation of an orbital computation.
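The block-decomposition idea can be sketched on a CPU: debris states are processed in batches, each screened against the spacecraft for approaches below a threshold distance. On the GPU the batches would additionally overlap host-device transfers with computation via streams; this NumPy version only illustrates the batching itself.

```python
import numpy as np

# Batched close-approach screening sketch (CPU stand-in for the CUDA version).

def screen_batches(debris_pos, sc_pos, threshold_km, batch=256):
    """Return indices of debris objects within threshold_km of the spacecraft."""
    hits = []
    for start in range(0, len(debris_pos), batch):
        block = debris_pos[start:start + batch]          # one batch of positions
        d = np.linalg.norm(block - sc_pos, axis=1)       # distances, km
        hits.extend(start + i for i in np.nonzero(d < threshold_km)[0])
    return hits
```

Repeating this screen at each propagated time step over the 3-day window gives the initial-screening candidate list that the abstract says must be produced within hours.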
Deep learning improves prediction of CRISPR-Cpf1 guide RNA activity.
Kim, Hui Kwon; Min, Seonwoo; Song, Myungjae; Jung, Soobin; Choi, Jae Woo; Kim, Younggwang; Lee, Sangeun; Yoon, Sungroh; Kim, Hyongbum Henry
2018-03-01
We present two algorithms to predict the activity of AsCpf1 guide RNAs. Indel frequencies for 15,000 target sequences were used in a deep-learning framework based on a convolutional neural network to train Seq-deepCpf1. We then incorporated chromatin accessibility information to create the better-performing DeepCpf1 algorithm for cell lines for which such information is available and show that both algorithms outperform previous machine learning algorithms on our own and published data sets.
A statistical analysis of RNA folding algorithms through thermodynamic parameter perturbation.
Layton, D M; Bundschuh, R
2005-01-01
Computational RNA secondary structure prediction is rather well established. However, such prediction algorithms always depend on a large number of experimentally measured parameters. Here, we study how sensitive structure prediction algorithms are to changes in these parameters. We find that for parameter changes on the order of the experimental error with which these parameters have been determined, 30% of the structure is falsely predicted, whereas the ground-state structure is preserved under parameter perturbation in only 5% of all cases. We establish that base-pairing probabilities calculated in a thermal ensemble are a viable, although not perfect, measure of the reliability of the prediction of individual structure elements. A new measure of stability using parameter perturbation is proposed, and its limitations are discussed.
Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.
1999-01-01
Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, correspondingly, when estimated with a normalized product measure of empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reverse faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier Science B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Yueh, Simon H.
2004-01-01
Active and passive microwave remote sensing techniques have been investigated for the remote sensing of ocean surface wind and salinity. We revised an ocean surface spectrum using the CMOD-5 geophysical model function (GMF) for the European Remote Sensing (ERS) C-band scatterometer and the Ku-band GMF for the NASA SeaWinds scatterometer. The predictions of microwave brightness temperatures from this model agree well with satellite, aircraft and tower-based microwave radiometer data. This suggests that the impact of surface roughness on microwave brightness temperatures and radar scattering coefficients of sea surfaces can be consistently characterized by a roughness spectrum, providing physical basis for using combined active and passive remote sensing techniques for ocean surface wind and salinity remote sensing.
Analysis of algorithms for predicting canopy fuel
Katharine L. Gray; Elizabeth Reinhardt
2003-01-01
We compared observed canopy fuel characteristics with those predicted by existing biomass algorithms. We specifically examined the accuracy of the biomass equations developed by Brown (1978). We used destructively sampled data obtained at 5 different study areas. We compared predicted and observed quantities of foliage and crown biomass for individual trees in our study...
Algorithm for Lossless Compression of Calibrated Hyperspectral Imagery
NASA Technical Reports Server (NTRS)
Kiely, Aaron B.; Klimesh, Matthew A.
2010-01-01
A two-stage predictive method was developed for lossless compression of calibrated hyperspectral imagery. The first prediction stage uses a conventional linear predictor intended to exploit spatial and/or spectral dependencies in the data. The compressor tabulates counts of the past values of the difference between this initial prediction and the actual sample value. To form the ultimate predicted value, in the second stage, these counts are combined with an adaptively updated weight function intended to capture information about data regularities introduced by the calibration process. Finally, prediction residuals are losslessly encoded using adaptive arithmetic coding. Algorithms of this type are commonly tested on a readily available collection of images from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) hyperspectral imager. On the standard calibrated AVIRIS hyperspectral images that are most widely used for compression benchmarking, the new compressor provides more than 0.5 bits/sample improvement over the previous best compression results. The algorithm has been implemented in Mathematica. The compression algorithm was demonstrated as beneficial on 12-bit calibrated AVIRIS images.
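The two-stage idea can be illustrated with a toy scalar version: a first-stage linear predictor (here simply the previous sample) is refined by a second stage that tracks the running mean of past stage-1 residuals and subtracts it, capturing a systematic bias such as one introduced by calibration. The real compressor uses count tables, an adaptively weighted combination, and adaptive arithmetic coding of the residuals; none of that is reproduced here.

```python
# Two-stage prediction sketch: stage 1 = previous-sample predictor,
# stage 2 = running-mean bias correction of stage-1 residuals. Toy version.

def two_stage_residuals(samples):
    residuals, bias, n = [], 0.0, 0
    prev = samples[0]
    for x in samples[1:]:
        stage1 = prev                  # first-stage linear prediction
        stage2 = stage1 + bias         # second-stage, bias-corrected prediction
        residuals.append(x - stage2)   # residual that would be entropy-coded
        n += 1
        bias += ((x - stage1) - bias) / n   # running mean of stage-1 residuals
        prev = x
    return residuals
```

For data with a systematic offset between consecutive samples, the second stage drives the residuals toward zero, and smaller residuals are exactly what makes the subsequent entropy coding effective.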
Field-Scale Modeling of Local Capillary Trapping During CO2 Injection into a Saline Aquifer
NASA Astrophysics Data System (ADS)
Ren, B.; Lake, L. W.; Bryant, S. L.
2015-12-01
Local capillary trapping (LCT) is small-scale (10^-2 to 10^1 m) CO2 trapping caused by capillary pressure heterogeneity. The benefit of LCT for CO2 sequestration is that the saturation of stored CO2 is larger than the residual gas saturation, yet this CO2 is not susceptible to leakage through failed seals. Quantifying the extent of local capillary trapping is therefore valuable in the design and risk assessment of geologic storage projects. Modeling local capillary trapping is computationally expensive and may even be intractable using a conventional reservoir simulator. In this paper, we propose a novel method to model local capillary trapping by combining geologic criteria and connectivity analysis. The connectivity analysis, originally developed for characterizing well-to-reservoir connectivity, is adapted to this problem by means of a newly defined edge weight property between neighboring grid blocks, which accounts for multiphase flow properties, injection rate, and gravity. The connectivity is then estimated with a shortest path algorithm to predict the CO2 migration behavior and plume shape during injection. A geologic criteria algorithm is developed to estimate the potential local capillary traps based only on the entry capillary pressure field, which is correlated to a geostatistical realization of the permeability field. The extended connectivity analysis shows a good match to the CO2 plume computed by the full-physics simulation. We then incorporate it into the geologic algorithm to quantify the amount of LCT structures identified within the entry capillary pressure field that can be filled during CO2 injection. Several simulations are conducted in reservoirs with different levels of heterogeneity (measured by the Dykstra-Parsons coefficient) under various injection scenarios. We find that there exists a threshold Dykstra-Parsons coefficient below which a low injection rate gives rise to more LCT, whereas a higher injection rate increases LCT in heterogeneous reservoirs. Both the geologic algorithm and the connectivity analysis are very fast; therefore, the integrated methodology can be used as a quick tool to estimate local capillary trapping. It can also be used as a complement to full-physics simulation to evaluate safe storage capacity.
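The shortest-path connectivity step can be sketched with generic Dijkstra on a grid graph: edge weights derived from entry capillary pressure make low-pressure paths "cheap", so minimum-cost paths from the injector approximate where the CO2 plume migrates. The paper's edge weight also folds in injection rate and gravity, which this sketch omits.

```python
import heapq

# Dijkstra sketch for connectivity analysis. edges: {node: [(neighbor, w), ...]}
# where w is an edge weight built from entry capillary pressure (illustrative).

def shortest_costs(edges, source):
    """Minimum path cost from `source` to every reachable node."""
    cost = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        c, u = heapq.heappop(heap)
        if c > cost.get(u, float("inf")):
            continue                        # stale heap entry
        for v, w in edges.get(u, []):
            nc = c + w
            if nc < cost.get(v, float("inf")):
                cost[v] = nc
                heapq.heappush(heap, (nc, v))
    return cost
```

Grid blocks with low path cost from the injector are reached by the plume; intersecting that set with the geologically identified trap candidates gives the fillable LCT volume.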
Algorithm aversion: people erroneously avoid algorithms after seeing them err.
Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade
2015-02-01
Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.
Rani R, Hannah Jessie; Victoire T, Aruldoss Albert
2018-01-01
This paper presents an integrated hybrid optimization algorithm for training the radial basis function neural network (RBF NN). Training of neural networks is still a challenging exercise in the machine learning domain. Traditional training algorithms in general suffer from local optima and premature convergence, which makes them ineffective when applied to datasets with diverse features. Training algorithms based on evolutionary computation are becoming popular due to their robustness in overcoming the drawbacks of traditional algorithms. Accordingly, this paper proposes a hybrid training procedure with a differential search (DS) algorithm functionally integrated with particle swarm optimization (PSO). To surmount local trapping of the search procedure, a new population initialization scheme is proposed using a Logistic chaotic sequence, which enhances the population diversity and aids the search capability. To demonstrate the effectiveness of the proposed RBF hybrid training algorithm, experimental analysis on 7 publicly available benchmark datasets is performed. Subsequently, experiments were conducted on a practical application case for wind speed prediction to demonstrate the superiority of the proposed RBF training algorithm in terms of prediction accuracy. PMID:29768463
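The Logistic chaotic initialization can be sketched directly: the map x_{k+1} = 4 x_k (1 − x_k) generates a chaotic sequence in (0, 1) that is used to seed the population, the diversity idea described in the abstract. Scaling the sequence to specific parameter bounds is an added assumption here.

```python
# Logistic-map population initialization sketch. x0 in (0, 1); values are
# scaled from (0, 1) to the search bounds [lo, hi] (scaling is an assumption).

def logistic_population(pop_size, dim, x0=0.7, lo=-1.0, hi=1.0):
    pop, x = [], x0
    for _ in range(pop_size):
        individual = []
        for _ in range(dim):
            x = 4.0 * x * (1.0 - x)            # chaotic Logistic map step
            individual.append(lo + (hi - lo) * x)
        pop.append(individual)
    return pop
```

The resulting individuals spread over the search box without clustering, which is the diversity benefit the hybrid DS-PSO trainer relies on to escape local optima.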
Bellón, Juan Ángel; de Dios Luna, Juan; King, Michael; Nazareth, Irwin; Motrico, Emma; GildeGómez-Barragán, María Josefa; Torres-González, Francisco; Montón-Franco, Carmen; Sánchez-Celaya, Marta; Díaz-Barreiros, Miguel Ángel; Vicens, Catalina; Moreno-Peral, Patricia
2017-01-01
Background Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. Aim To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Design and setting Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Method Forty-one risk factors were measured and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was new occurrence of HAD during the study, as measured by the AUDIT. Results From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The ‘predictAL-10’ risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and interaction AUDIT-C*Age. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and specificity of 0.80. Excluding childhood sexual abuse from the model (the ‘predictAL-9’), the c-index was 0.880 (95% CI = 0.847 to 0.913), sensitivity 0.79, and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. Conclusion The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for primary prevention of hazardous alcohol drinking. PMID:28360074
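For a binary outcome, the c-index reported above is equivalent to the area under the ROC curve: the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A minimal sketch with hypothetical scores (not the study's data):

```python
import numpy as np

def c_index(scores, outcomes):
    """Concordance index: fraction of case/non-case pairs in which the
    case is scored higher; ties count half."""
    pos = scores[outcomes == 1]
    neg = scores[outcomes == 0]
    diff = pos[:, None] - neg[None, :]   # all case-vs-non-case pairs
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

scores = np.array([0.9, 0.8, 0.3, 0.2])
outcomes = np.array([1, 1, 0, 0])
# perfectly ranked scores give a c-index of 1.0
print(c_index(scores, outcomes))  # → 1.0
```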
The Ship Movement Trajectory Prediction Algorithm Using Navigational Data Fusion.
Borkowski, Piotr
2017-06-20
It is essential for the marine navigator conducting maneuvers of his ship at sea to know the future positions of his own ship and of target ships over a specific time span in order to solve collision situations effectively. This article presents a ship movement trajectory prediction algorithm that, through data fusion, takes into account measurements of the ship's current position from a number of doubled autonomous devices, which increases the reliability and accuracy of the prediction. The algorithm has been implemented in NAVDEC, a navigation decision support system, and is used in practice on board ships.
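The abstract does not give NAVDEC's fusion equations; as a generic sketch of the idea — combining doubled position sensors by inverse-variance weighting and extrapolating the fused position — consider the following (all values hypothetical):

```python
import numpy as np

def fuse_positions(measurements, variances):
    """Inverse-variance weighted fusion of redundant position fixes."""
    w = 1.0 / np.asarray(variances)
    return (w @ np.asarray(measurements)) / w.sum()

def predict_position(pos, velocity, dt):
    """Constant-velocity extrapolation of the fused position."""
    return pos + velocity * dt

# two doubled sensors report slightly different fixes (x, y in metres)
fused = fuse_positions([[10.0, 20.0], [10.2, 19.8]], [1.0, 1.0])
future = predict_position(fused, velocity=np.array([0.5, -0.1]), dt=60.0)
```

A production system would use a full state estimator (e.g. a Kalman filter) rather than this two-step sketch.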
XtalOpt version r9: An open-source evolutionary algorithm for crystal structure prediction
Falls, Zackary; Lonie, David C.; Avery, Patrick; ...
2015-10-23
This is a new version of XtalOpt, an evolutionary algorithm for crystal structure prediction, available for download from the CPC library or the XtalOpt website, http://xtalopt.github.io. XtalOpt is published under the GNU Public License (GPL), an open-source license recognized by the Open Source Initiative. The new version, detailed here, incorporates many bug fixes and new features; XtalOpt predicts the crystal structure of a system from its stoichiometry alone, using evolutionary algorithms.
A recurrence-weighted prediction algorithm for musical analysis
NASA Astrophysics Data System (ADS)
Colucci, Renato; Leguizamon Cucunuba, Juan Sebastián; Lloyd, Simon
2018-03-01
Forecasting the future behaviour of a system using past data is an important topic. In this article we apply nonlinear time series analysis in the context of music, and present new algorithms for extending a sample of music while maintaining characteristics similar to the original piece. Using ideas from ergodic theory, we adapt the classical prediction method of Lorenz analogues to take recurrence times into account, and demonstrate with examples how the new algorithm can produce predictions with a high degree of similarity to the original sample.
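A minimal sketch of the classical Lorenz method of analogues that the paper builds on (uniform weighting here; the paper's contribution is to reweight analogues by their recurrence times, which this sketch omits):

```python
import numpy as np

def analogue_predict(series, m=3, k=1):
    """Predict the next value of a series via Lorenz analogues.

    Embeds the series in windows of length m, finds the k past windows
    closest to the most recent one, and averages their successors.
    """
    x = np.asarray(series, float)
    target = x[-m:]
    # candidate windows whose successor is known
    windows = np.array([x[i:i + m] for i in range(len(x) - m)])
    dists = np.linalg.norm(windows - target, axis=1)
    nearest = np.argsort(dists)[:k]
    return x[nearest + m].mean()

# for a periodic series the analogues of the last window predict the cycle
print(analogue_predict([0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2], m=3, k=2))  # → 3.0
```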
Neural network-based run-to-run controller using exposure and resist thickness adjustment
NASA Astrophysics Data System (ADS)
Geary, Shane; Barry, Ronan
2003-06-01
This paper describes the development of a run-to-run control algorithm using a feedforward neural network, trained using the backpropagation training method. The algorithm is used to predict the critical dimension of the next lot using previous lot information. It is compared to a common prediction algorithm - the exponentially weighted moving average (EWMA) and is shown to give superior prediction performance in simulations. The manufacturing implementation of the final neural network showed significantly improved process capability when compared to the case where no run-to-run control was utilised.
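For reference, the EWMA baseline the neural network is compared against can be written in a few lines (the smoothing weight lam below is arbitrary; the paper's tuning is not given):

```python
def ewma_predict(history, lam=0.3):
    """Exponentially weighted moving average predictor used as the
    run-to-run baseline: each new lot measurement y updates the
    estimate as  est = lam * y + (1 - lam) * est."""
    est = history[0]
    for y in history[1:]:
        est = lam * y + (1 - lam) * est
    return est

# the estimate tracks a step change in CD measurements only gradually
print(ewma_predict([100, 100, 110, 110], lam=0.5))  # → 107.5
```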
An O(n^5) algorithm for MFE prediction of kissing hairpins and 4-chains in nucleic acids.
Chen, Ho-Lin; Condon, Anne; Jabbari, Hosna
2009-06-01
Efficient methods for prediction of minimum free energy (MFE) nucleic acid secondary structures are widely used, both to better understand the structure and function of biological RNAs and to design novel nano-structures. Here, we present a new algorithm for MFE secondary structure prediction, which significantly expands the class of structures that can be handled in O(n^5) time. Our algorithm can handle H-type pseudoknotted structures, kissing hairpins, and chains of four overlapping stems, as well as nested substructures of these types.
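The O(n^5) algorithm itself is too involved to sketch here, but the flavor of dynamic programming over subsequences that underlies MFE prediction can be seen in the classical Nussinov base-pair maximization — a much simpler O(n^3) relative that handles only pseudoknot-free structures and counts pairs instead of minimizing free energy:

```python
def max_pairs(seq, min_loop=3):
    """Nussinov-style DP: maximum number of complementary base pairs
    in an RNA sequence, with hairpin loops of at least min_loop bases."""
    pair = {("A", "U"), ("U", "A"), ("G", "C"),
            ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # subsequence length - 1
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # j left unpaired
            for k in range(i, j - min_loop):     # try pairing k with j
                if (seq[k], seq[j]) in pair:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1]

print(max_pairs("GGGAAACCC"))  # → 3 (a three-pair hairpin)
```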
NASA Astrophysics Data System (ADS)
Cha, J.; Ryu, J.; Lee, M.; Song, C.; Cho, Y.; Schumacher, P.; Mah, M.; Kim, D.
Conjunction prediction is one of the critical operations in space situational awareness (SSA). For geospace objects, common algorithms for conjunction prediction are usually based on all-pairwise checks, spatial hashing, or kd-trees. The computational load is usually reduced through filters, but these leave a good chance of missing potential collisions between space objects. We present a novel algorithm that guarantees no missed conjunctions and efficiently answers a variety of spatial queries, including pairwise conjunction prediction. The algorithm takes only O(k log N) time for N objects in the worst case to answer conjunction queries, where k is a constant linear in the prediction time span. The proposed algorithm, named DVD-COOP (Dynamic Voronoi Diagram-based Conjunctive Orbital Object Predictor), is based on the dynamic Voronoi diagram of moving spherical balls in 3D space. The algorithm has a preprocessing stage consisting of two steps: the construction of an initial Voronoi diagram (taking O(N) time on average) and the construction of a priority queue for the events of topology changes in the Voronoi diagram (taking O(N log N) time in the worst case). The scalability of the proposed algorithm is also discussed. We hope that the proposed Voronoi approach will change the computational paradigm in spatial reasoning among space objects.
Yang, Xiaoxia; Wang, Jia; Sun, Jun; Liu, Rong
2015-01-01
Protein-nucleic acid interactions are central to various fundamental biological processes. Automated methods capable of reliably identifying DNA- and RNA-binding residues in protein sequence are assuming ever-increasing importance. The majority of current algorithms rely on feature-based prediction, but their accuracy remains to be further improved. Here we propose a sequence-based hybrid algorithm SNBRFinder (Sequence-based Nucleic acid-Binding Residue Finder) by merging a feature predictor SNBRFinderF and a template predictor SNBRFinderT. SNBRFinderF was established using the support vector machine whose inputs include sequence profile and other complementary sequence descriptors, while SNBRFinderT was implemented with the sequence alignment algorithm based on profile hidden Markov models to capture the weakly homologous template of query sequence. Experimental results show that SNBRFinderF was clearly superior to the commonly used sequence profile-based predictor and SNBRFinderT can achieve comparable performance to the structure-based template methods. Leveraging the complementary relationship between these two predictors, SNBRFinder reasonably improved the performance of both DNA- and RNA-binding residue predictions. More importantly, the sequence-based hybrid prediction reached competitive performance relative to our previous structure-based counterpart. Our extensive and stringent comparisons show that SNBRFinder has obvious advantages over the existing sequence-based prediction algorithms. The value of our algorithm is highlighted by establishing an easy-to-use web server that is freely accessible at http://ibi.hzau.edu.cn/SNBRFinder.
RNA design using simulated SHAPE data.
Lotfi, Mohadeseh; Zare-Mirakabad, Fatemeh; Montaseri, Soheila
2018-05-03
It has long been established that in addition to being involved in protein translation, RNA plays essential roles in numerous other cellular processes, including gene regulation and DNA replication. Such roles are known to be dictated by higher-order structures of RNA molecules. It is therefore of prime importance to find an RNA sequence that can fold to acquire a particular function that is desirable for use in pharmaceuticals and basic research. The challenge of finding an RNA sequence for a given structure is known as the RNA design problem. Although there are several algorithms to solve this problem, they mainly consider hard constraints, such as minimum free energy, to evaluate the predicted sequences. Recently, SHAPE data has emerged as a new soft constraint for RNA secondary structure prediction. To take advantage of this new experimental constraint, we report here a new method for accurate design of RNA sequences based on their secondary structures using SHAPE data as pseudo-free energy. We then compare our algorithm with four others: INFO-RNA, ERD, MODENA and RNAifold 2.0. Our algorithm precisely predicts 26 out of 29 new sequences for the structures extracted from the Rfam dataset, while the other four algorithms predict no more than 22 out of 29. The proposed algorithm is comparable to the above algorithms on RNA-SSD datasets, where they can predict up to 33 appropriate sequences for RNA secondary structures out of 34.
Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch
2017-06-06
An important aspect of chemoinformatics and material-informatics is the usage of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets from noise. RANSAC could be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate for the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
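The paper applies RANSAC to QSAR modeling; the core consensus idea can be sketched on a toy 1-D regression. The data, tolerance, and iteration count below are hypothetical, and the real workflow adds descriptor selection and applicability-domain estimation:

```python
import numpy as np

def ransac_line(x, y, n_iter=200, tol=1.0, seed=0):
    """Minimal RANSAC for a 1-D linear model y = a*x + b.

    Repeatedly fits a line through two random points, keeps the model
    with the most inliers (|residual| <= tol), then refits by least
    squares on the inlier consensus set."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    best_mask = None
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        mask = np.abs(y - (a * x + b)) <= tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    a, b = np.polyfit(x[best_mask], y[best_mask], 1)
    return a, b, best_mask

# a clean line y = 2x with one gross outlier at index 4 (made-up data)
a, b, inliers = ransac_line([0, 1, 2, 3, 4], [0, 2, 4, 6, 40])
```

RANSAC recovers the slope 2 and flags the last point as an outlier, which a plain least-squares fit would not.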
Goudarzi, Shidrokh; Haslina Hassan, Wan; Abdalla Hashim, Aisha-Hassan; Soleymani, Seyed Ahmad; Anisi, Mohammad Hossein; Zakaria, Omar M.
2016-01-01
This study aims to design a vertical handover prediction method to minimize unnecessary handovers for a mobile node (MN) during the vertical handover process. This relies on a novel method for the prediction of a received signal strength indicator (RSSI) referred to as IRBF-FFA, which is designed by utilizing the imperialist competition algorithm (ICA) to train the radial basis function (RBF), and by hybridizing with the firefly algorithm (FFA) to predict the optimal solution. The prediction accuracy of the proposed IRBF–FFA model was validated by comparing it to support vector machines (SVMs) and multilayer perceptron (MLP) models. In order to assess the model’s performance, we measured the coefficient of determination (R2), correlation coefficient (r), root mean square error (RMSE) and mean absolute percentage error (MAPE). The achieved results indicate that the IRBF–FFA model provides more precise predictions compared to different ANNs, namely, support vector machines (SVMs) and multilayer perceptron (MLP). The performance of the proposed model is analyzed through simulated and real-time RSSI measurements. The results also suggest that the IRBF–FFA model can be applied as an efficient technique for the accurate prediction of vertical handover. PMID:27438600
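The figures of merit used above (R², RMSE, MAPE) are standard; for reference, with hypothetical RSSI values in dBm (not the study's data):

```python
import math

def regression_metrics(y_true, y_pred):
    """R^2, RMSE and MAPE (in %) as used to compare the RSSI predictors."""
    n = len(y_true)
    mean = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    mape = 100.0 / n * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred))
    return r2, rmse, mape

r2, rmse, mape = regression_metrics([-60.0, -70.0, -80.0],
                                    [-61.0, -69.0, -80.0])
```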
Mild hypothermia alters midazolam pharmacokinetics in normal healthy volunteers.
Hostler, David; Zhou, Jiangquan; Tortorici, Michael A; Bies, Robert R; Rittenberger, Jon C; Empey, Philip E; Kochanek, Patrick M; Callaway, Clifton W; Poloyac, Samuel M
2010-05-01
The clinical use of therapeutic hypothermia has been rapidly expanding due to evidence of neuroprotection. However, the effect of hypothermia on specific pathways of drug elimination in humans is relatively unknown. To gain insight into the potential effects of hypothermia on drug metabolism and disposition, we evaluated the pharmacokinetics of midazolam as a probe for CYP3A4/5 activity during mild hypothermia in human volunteers. A second objective of this work was to determine whether benzodiazepines and magnesium administered intravenously would facilitate the induction of hypothermia. Subjects were enrolled in a randomized crossover study, which included two mild hypothermia groups (4 degrees C saline infusions and 4 degrees C saline + magnesium) and two normothermia groups (37 degrees C saline infusions and 37 degrees C saline + magnesium). The lowest temperatures achieved in the 4 degrees C saline + magnesium and 4 degrees C saline infusions were 35.4 +/- 0.4 and 35.8 +/- 0.3 degrees C, respectively. A significant decrease in the formation clearance of the major metabolite 1'-hydroxymidazolam was observed during the 4 degrees C saline + magnesium compared with that in the 37 degrees C saline group (p < 0.05). Population pharmacokinetic modeling identified a significant relationship between temperature and clearance and intercompartmental clearance for midazolam. This model predicted that midazolam clearance decreases 11.1% for each degree Celsius reduction in core temperature from 36.5 degrees C. Midazolam with magnesium facilitated the induction of hypothermia, but shivering was minimally suppressed. These data provided proof of concept that even mild and short-duration changes in body temperature significantly affect midazolam metabolism. Future studies in patients who receive lower levels and a longer duration of hypothermia are warranted.
Yaish, Mahmoud W; Al-Harrasi, Ibtisam; Alansari, Aliya S; Al-Yahyai, Rashid; Glick, Bernard R
2016-09-01
Date palms are able to grow under diverse abiotic stress conditions, including in saline soils, where microbial communities may help in the plant's salinity tolerance. These communities, able to produce specific growth-promoting substances, can enhance date palm growth in a saline environment; however, they are poorly defined. In the work reported here, the date palm endophytic bacterial and fungal communities were identified using the pyrosequencing method, and the microbial differential abundance in the root upon exposure to salinity stress was estimated. Approximately 150,061 reads were produced from the analysis of six ribosomal DNA libraries, which were prepared from endophytic microorganisms colonizing date palm root tissues. DNA sequence analysis of these libraries predicted the presence of a variety of bacterial and fungal endophytic species, some known and others unknown. The community compositions of 30% of the bacterial and 8% of the fungal species were significantly (p ≤ 0.05) altered in response to salinity stress. Differential enrichment analysis showed that microbial diversity, as indicated by the Chao, Shannon and Simpson indices, was slightly reduced; however, the overall microbial community structures were not significantly affected by salinity. This may reflect a buffering effect by the host plant on the internal environments that these communities colonize. Some of the endophytes identified in this study were strains previously isolated from saline and marine environments, suggesting possible interactions with the plant that are favorable to salinity tolerance in date palm. [Int Microbiol 19(3):143-155 (2016)]. Copyright© by the Spanish Society for Microbiology and Institute for Catalan Studies.
NASA Astrophysics Data System (ADS)
Qu, Yonghua; Jiao, Siong; Lin, Xudong
2008-10-01
Hetao Irrigation District, located in Inner Mongolia, is one of the three largest irrigated areas in China. In this irrigated agricultural region, far more effort has been put into irrigation than into drainage, and as a result much of the salt dissolved in the water has been deposited in the surface soil. Soil salinity has therefore become a chief factor causing land degradation in the district. Remote sensing is an efficient way to map salinity at the regional scale: in remote sensing terms, the soil spectrum is one of the most important indicators of soil salinity status. In past decades many efforts have been made to reveal the spectral characteristics of salinized soil, for example with traditional statistical regression methods. However, when hyper-spectral reflectance data are considered, traditional regression methods cannot handle the high dimensionality, because hyper-spectral data have too many spectral bands. In this paper, a partial least squares regression (PLSR) model was established based on a statistical analysis of soil salinity and hyper-spectral reflectance. The dataset was built from field soil samples collected in the Hetao irrigation region from the end of July to the beginning of August. Independent validation, using data not included in the calibration model, shows that the proposed model can predict the main soil properties, such as total ion content (S%) and pH, with determination coefficients (R2) of 0.728 and 0.715, respectively. The ratios of prediction to deviation (RPD) of these predictions are larger than 1.6, which indicates that the calibrated PLSR model can be used as a tool to retrieve soil salinity with accurate results.
When the PLSR model's regression coefficients were aggregated according to the wavelengths of the visible (blue, green, red) and near-infrared bands of the Landsat Thematic Mapper (TM) sensor, significant response values were observed, which indicates that the proposed method can also be used to analyze remotely sensed data from spaceborne platforms.
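As a sketch of the PLSR idea, here is a minimal single-response NIPALS PLS1 (an illustration only; the study's actual preprocessing, component count, and validation are not reproduced, and the toy data below are made up):

```python
import numpy as np

def pls1_fit(X, y, n_comp=1):
    """Minimal PLS1 (NIPALS): project X onto n_comp latent components
    that maximize covariance with y, then regress y on the scores.
    Returns coefficients b and intercept b0 so predictions are X @ b + b0."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    x_mean, y_mean = X.mean(0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        nrm = np.linalg.norm(w)
        if nrm < 1e-12:            # y residual exhausted
            break
        w /= nrm
        t = Xc @ w                 # scores
        tt = t @ t
        p = Xc.T @ t / tt          # X loadings
        qk = yc @ t / tt           # y loading
        Xc = Xc - np.outer(t, p)   # deflate
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)
    return b, y_mean - x_mean @ b

X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]]
y = [1.0, 2.0, 3.0, 0.0]          # exactly y = x1 + 2*x2
b, b0 = pls1_fit(X, y, n_comp=1)
```

On this toy data one latent component suffices and the fit recovers the exact coefficients [1, 2].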
Wang, Chongyang; Huang, Yong; Zhang, Zuotao; Wang, Hui
2018-04-25
With the close relationship between saline environments and industry, polycyclic aromatic hydrocarbons (PAHs) accumulate in saline/hypersaline environments. Therefore, PAH degradation by halotolerant/halophilic bacteria has received increasing attention. In this study, the metabolic pathway of phenanthrene degradation by the halophilic consortium CY-1 was first studied; it showed a single upstream pathway initiated by dioxygenation at the C1 and C2 positions, and several downstream pathways, including the catechol, gentisic acid and protocatechuic acid pathways. The effects of salinity on community structure and on the expression of catabolic genes were further studied by a combination of high-throughput sequencing, a catabolic gene clone library and real-time PCR. Pure cultures were also isolated from consortium CY-1 to investigate the contributions made by different microbes in the PAH-degrading process. Marinobacter is the dominant genus and contributed to the upstream degradation of phenanthrene, especially at high salt content. The genus Halomonas made a great contribution to transforming intermediates in the subsequent degradation of catechol by using catechol 1,2-dioxygenase (C12O). Other microbes were predicted to be mediating bacteria able to utilize intermediates via different downstream pathways. Salinity was found to have negative effects on both the microbial diversity and the activity of consortium CY-1, and the consortium showed a high degree of functional redundancy in saline environments.
Estimation of the barrier layer thickness in the Indian Ocean using Aquarius Salinity
NASA Astrophysics Data System (ADS)
Felton, Clifford S.; Subrahmanyam, Bulusu; Murty, V. S. N.; Shriver, Jay F.
2014-07-01
Monthly barrier layer thickness (BLT) estimates are derived from satellite measurements using a multilinear regression model (MRM) within the Indian Ocean. Sea surface salinity (SSS) from the recently launched Soil Moisture and Ocean Salinity (SMOS) and Aquarius SAC-D salinity missions are utilized to estimate the BLT. The MRM relates BLT to sea surface salinity (SSS), sea surface temperature (SST), and sea surface height anomalies (SSHA). Three regions where the BLT variability is most rigorous are selected to evaluate the performance of the MRM for 2012; the Southeast Arabian Sea (SEAS), Bay of Bengal (BoB), and Eastern Equatorial Indian Ocean (EEIO). The MRM derived BLT estimates are compared to gridded Argo and Hybrid Coordinate Ocean Model (HYCOM) BLTs. It is shown that different mechanisms are important for sustaining the BLT variability in each of the selected regions. Sensitivity tests show that SSS is the primary driver of the BLT within the MRM. Results suggest that salinity measurements obtained from Aquarius and SMOS can be useful for tracking and predicting the BLT in the Indian Ocean. Largest MRM errors occur along coastlines and near islands where land contamination skews the satellite SSS retrievals. The BLT evolution during 2012, as well as the advantages and disadvantages of the current model are discussed. BLT estimations using HYCOM simulations display large errors that are related to model layer structure and the selected BLT methodology.
Strang, Barbara; Murphy, Kyla; Seal, Shane; Cin, Arianna Dal
2013-01-01
There is a lack of literature examining the dosimetric implications of irradiating breast implants and expanders with internal ports inserted at the time of mastectomy. The objective was to determine whether the presence of a breast expander with an internal port, or of a saline or silicone implant, affects dose uniformity across the breast when irradiated with various photon and electron energies. A tissue-equivalent torso phantom with an overlying tissue expander, saline implant or silicone implant was irradiated using tangential fields with 6 MV and 18 MV photons and 9 MeV and 12 MeV electrons. All dose measurements were performed using thermoluminescent dosimeters (TLDs), arranged around the port and the perimeter of the expander, saline implant or silicone implant. Comparisons of the measured radiation doses, and between the expected and measured doses from the TLDs on each prosthesis, were performed, and data were analyzed using two-tailed t tests. There were no differences in TLD measurements between the expander and the saline implant for any energy modality, nor between the expected and actual measurements for the saline implant. Higher than anticipated measurements were recorded for a significant number of TLD positions around the silicone implants. In this laboratory/ex vivo study, radiation doses around saline implants or expanders with an internal port were unaltered, whereas dose recordings for silicone implants were higher than predicted.
Middleton, Beth A.; McKee, Karen L.
2012-01-01
Higher atmospheric concentrations of CO2 can offset the negative effects of flooding or salinity on plant species, but previous studies have focused on mature, rather than regenerating vegetation. This study examined how interacting environments of CO2, water regime, and salinity affect seed germination and seedling biomass of floating freshwater marshes in the Mississippi River Delta, which are dominated by C3 grasses, sedges, and forbs. Germination density and seedling growth of the dominant species depended on multifactor interactions of CO2 (385 and 720 μl l-1) with flooding (drained, +8-cm depth, +8-cm depth-gradual) and salinity (0, 6% seawater) levels. Of the three factors tested, salinity was the most important determinant of seedling response patterns. Species richness (total = 19) was insensitive to CO2. Our findings suggest that for freshwater marsh communities, seedling response to CO2 is species-specific and secondary to salinity and flooding effects. Elevated CO2 did not ameliorate flooding or salinity stress. Consequently, climate-related changes in sea level or human-caused alterations in hydrology may override atmospheric CO2 concentrations in driving shifts in this plant community. The results of this study suggest caution in making extrapolations from species-specific responses to community-level predictions without detailed attention to the nuances of multifactor responses.
Rodríguez-Hernández, María del Carmen; Moreno, Diego A; Carvajal, Micaela; Martínez-Ballesta, María del Carmen
2014-12-01
Climatic change predicts elevated salinity in soils as well as increased carbon dioxide [CO2] in the atmosphere. The present study aims to determine the effect of combined salinity and elevated [CO2] on sulfur (S) metabolism and S-derived phytochemicals in green and purple broccoli (cv. Naxos and cv. Viola, respectively). Elevated [CO2] involved the amelioration of salt stress, especially in cv. Viola, where a lower biomass reduction by salinity was accompanied by higher sodium (Na(+)) and chloride (Cl(-)) compartmentation in the vacuole. Moreover, salinity and elevated [CO2] affected the mineral and glucosinolate contents and the activity of biosynthetic enzymes of S-derived compounds and the degradative enzyme of glucosinolate metabolism, myrosinase, as well as the related amino acids and the antioxidant glutathione (GSH). In cv. Naxos, elevated [CO2] may trigger the antioxidant response to saline stress by means of increased GSH concentration. Also, in cv. Naxos, indolic glucosinolates were more influenced by the NaCl×CO2 interaction, whereas in cv. Viola the aliphatic glucosinolates were significantly increased by these conditions. Salinity and elevated [CO2] enhanced S cellular partitioning and metabolism, affecting the myrosinase-glucosinolate system. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.
A pore-scale numerical method for simulating low-salinity waterflooding in porous media
NASA Astrophysics Data System (ADS)
Jiang, F.; Yang, J.; Tsuji, T.
2017-12-01
Low-salinity (LS) water injection has been attracting attention in recent years as a practical oil recovery technique because of its low cost and high efficiency. Many researchers have conducted laboratory experiments and observed significant benefits compared to conventional high-salinity (HS) waterflooding; however, the fundamental mechanisms remain poorly understood. Different mechanisms, such as fines migration and wettability alteration, have been proposed to explain this low-salinity effect. Here, we focus on investigating the effect of wettability alteration on recovery efficiency. For this purpose, we propose a pore-scale numerical method to quantitatively evaluate the impact of salinity concentration on sweep efficiency. We first developed the pore-scale model by coupling a convection-diffusion model, for tracking the concentration change, with a lattice Boltzmann model for two-phase flow behavior, assuming that a reduction of water salinity leads to localised wettability alteration. The model is then validated by simulating the contact angle change of an oil droplet attached to a clay substrate. Finally, the method was applied to a real rock geometry extracted from micro-CT images of Berea sandstone. The results indicate that the initial wettability state of the system and the extent of wettability alteration are important in predicting the improvement of oil recovery due to LS brine injection. This work was supported by JSPS KAKENHI Grant Number 16K18331.
Muhlestein, Whitney E; Akagi, Dallin S; Kallos, Justiss A; Morone, Peter J; Weaver, Kyle D; Thompson, Reid C; Chambless, Lola B
2018-04-01
Objective Machine learning (ML) algorithms are powerful tools for predicting patient outcomes. This study pilots a novel approach to algorithm selection and model creation using prediction of discharge disposition following meningioma resection as a proof of concept. Materials and Methods A diversity of ML algorithms were trained on a single-institution database of meningioma patients to predict discharge disposition. Algorithms were ranked by predictive power and top performers were combined to create an ensemble model. The final ensemble was internally validated on never-before-seen data to demonstrate generalizability. The predictive power of the ensemble was compared with a logistic regression. Further analyses were performed to identify how important variables impact the ensemble. Results Our ensemble model predicted disposition significantly better than a logistic regression (area under the curve of 0.78 and 0.71, respectively, p = 0.01). Tumor size, presentation at the emergency department, body mass index, convexity location, and preoperative motor deficit most strongly influence the model, though the independent impact of individual variables is nuanced. Conclusion Using a novel ML technique, we built a guided ML ensemble model that predicts discharge destination following meningioma resection with greater predictive power than a logistic regression, and that provides greater clinical insight than a univariate analysis. These techniques can be extended to predict many other patient outcomes of interest.
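The rank-and-combine step described above can be sketched as follows (the member models, validation scores, and probabilities below are hypothetical; the study's actual algorithms are not specified here):

```python
import numpy as np

def build_ensemble(member_probs, member_scores, top_k=2):
    """Rank candidate models by validation score, keep the top_k,
    and average their predicted probabilities (soft voting)."""
    order = np.argsort(member_scores)[::-1][:top_k]
    return np.mean([member_probs[i] for i in order], axis=0)

# four hypothetical models' discharge-disposition probabilities
# for three patients, and each model's validation AUC
probs = np.array([[0.2, 0.7, 0.9],
                  [0.3, 0.8, 0.8],
                  [0.1, 0.6, 0.7],
                  [0.5, 0.5, 0.5]])
scores = np.array([0.60, 0.75, 0.70, 0.55])
ensemble = build_ensemble(probs, scores, top_k=2)
```

Here the two best-scoring models (rows 1 and 2) are averaged; the weakest models are discarded.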
Webb, Samuel J; Hanser, Thierry; Howlin, Brendan; Krause, Paul; Vessey, Jonathan D
2014-03-25
A new algorithm has been developed to enable the interpretation of black box models. The developed algorithm is agnostic to the learning algorithm and open to all structure-based descriptors such as fragments, keys and hashed fingerprints. The algorithm has provided meaningful interpretation of Ames mutagenicity predictions from both random forest and support vector machine models built on a variety of structural fingerprints. A fragmentation algorithm is utilised to investigate the model's behaviour on specific substructures present in the query. An output is formulated summarising causes of activation and deactivation. The algorithm is able to identify multiple causes of activation or deactivation, in addition to identifying localised deactivations where the prediction for the query is active overall. No loss in performance is seen as there is no change in the prediction; the interpretation is produced directly from the model's behaviour for the specific query. Models have been built using multiple learning algorithms, including support vector machine and random forest. The models were built on public Ames mutagenicity data, and a variety of fingerprint descriptors were used. These models produced good performance in both internal and external validation, with accuracies around 82%. The models were used to evaluate the interpretation algorithm. The interpretation revealed links that correspond closely with understood mechanisms for Ames mutagenicity. This methodology allows for greater utilisation of the predictions made by black box models and can expedite further study based on the output of a (quantitative) structure-activity model. Additionally, the algorithm could be utilised for chemical dataset investigation and knowledge extraction/human SAR development.
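The core interpretation loop can be sketched as follows: re-query the black box with each substructure removed, and report the fragments whose removal flips an active prediction. The stand-in "model" and fragment names below are hypothetical, not an actual Ames mutagenicity model or the authors' fragmentation scheme.

```python
# Model-agnostic interpretation sketch: probe the black box on reduced
# fragment sets and attribute activation to fragments whose removal
# flips the prediction. Fragment names are illustrative placeholders.

def interpret(model, fragments):
    """Return the baseline prediction and fragments that cause activation."""
    baseline = model(fragments)
    causes = []
    for f in fragments:
        reduced = [g for g in fragments if g != f]
        if baseline == "active" and model(reduced) == "inactive":
            causes.append(f)  # removing f deactivates the query
    return baseline, causes

# Toy black box: active iff a nitro-like fragment is present.
toy_model = lambda frags: "active" if "nitroarene" in frags else "inactive"
baseline, causes = interpret(toy_model, ["benzene", "nitroarene", "methyl"])
```

Because the model itself is never modified, the interpretation carries no performance cost, mirroring the abstract's point that the prediction is unchanged.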
An unsupervised classification scheme for improving predictions of prokaryotic TIS.
Tech, Maike; Meinicke, Peter
2006-03-09
Although it is not difficult for state-of-the-art gene finders to identify coding regions in prokaryotic genomes, exact prediction of the corresponding translation initiation sites (TIS) is still a challenging problem. Recently a number of post-processing tools have been proposed for improving the annotation of prokaryotic TIS. However, inherent difficulties of these approaches arise from the considerable variation of TIS characteristics across different species. Therefore prior assumptions about the properties of prokaryotic gene starts may cause suboptimal predictions for newly sequenced genomes with TIS signals differing from those of well-investigated genomes. We introduce a clustering algorithm for completely unsupervised scoring of potential TIS, based on positionally smoothed probability matrices. The algorithm requires an initial gene prediction and the genomic sequence of the organism to perform the reannotation. As compared with other methods for improving predictions of gene starts in bacterial genomes, our approach is not based on any specific assumptions about prokaryotic TIS. Despite the generality of the underlying algorithm, the prediction rate of our method is competitive on experimentally verified test data from E. coli and B. subtilis. Regarding genomes with high G+C content, in contrast to some previously proposed methods, our algorithm also provides good performance on P. aeruginosa, B. pseudomallei and R. solanacearum. On reliable test data we showed that our method provides good results in post-processing the predictions of the widely-used program GLIMMER. The underlying clustering algorithm is robust with respect to variations in the initial TIS annotation and does not require specific assumptions about prokaryotic gene starts. These features are particularly useful on genomes with high G+C content. The algorithm has been implemented in the tool "TICO" (TIs COrrector) which is publicly available from our web site.
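A minimal illustration of scoring candidate TIS windows with a positional probability matrix is given below. The sequences are invented Shine-Dalgarno-plus-start windows, and the simple pseudocount is only a stand-in for the positional smoothing used by TICO.

```python
# Score candidate TIS windows against a position-wise log-probability
# matrix built from a cluster of putative start sites. Sequences and
# the pseudocount smoothing are illustrative, not the TICO model.
from math import log

def pwm(seqs, alphabet="ACGT", pseudo=0.5):
    """Column-wise log-probabilities with pseudocounts."""
    length = len(seqs[0])
    matrix = []
    for i in range(length):
        col = [s[i] for s in seqs]
        total = len(col) + pseudo * len(alphabet)
        matrix.append({a: log((col.count(a) + pseudo) / total) for a in alphabet})
    return matrix

def score(matrix, window):
    """Log-likelihood of a candidate window under the matrix."""
    return sum(matrix[i][base] for i, base in enumerate(window))

# Hypothetical SD-motif + ATG windows from an initial gene prediction.
starts = ["AGGAGGATG", "AGGAGCATG", "AAGAGGATG"]
m = pwm(starts)
```

Windows resembling the cluster consensus score higher than unrelated sequence, which is the basis for re-ranking candidate start sites.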
Tighe, Patrick J.; Harle, Christopher A.; Hurley, Robert W.; Aytug, Haldun; Boezaart, Andre P.; Fillingim, Roger B.
2015-01-01
Background Given their ability to process highly dimensional datasets with hundreds of variables, machine learning algorithms may offer one solution to the vexing challenge of predicting postoperative pain. Methods Here, we report on the application of machine learning algorithms to predict postoperative pain outcomes in a retrospective cohort of 8071 surgical patients using 796 clinical variables. Five algorithms were compared in terms of their ability to forecast moderate to severe postoperative pain: Least Absolute Shrinkage and Selection Operator (LASSO), gradient-boosted decision tree, support vector machine, neural network, and k-nearest neighbor, with logistic regression included for baseline comparison. Results In forecasting moderate to severe postoperative pain for postoperative day (POD) 1, the LASSO algorithm, using all 796 variables, had the highest accuracy with an area under the receiver-operating curve (ROC) of 0.704. Next, the gradient-boosted decision tree had an ROC of 0.665 and the k-nearest neighbor algorithm had an ROC of 0.643. For POD 3, the LASSO algorithm, using all variables, again had the highest accuracy, with an ROC of 0.727. Logistic regression had a lower ROC of 0.5 for predicting pain outcomes on POD 1 and 3. Conclusions Machine learning algorithms, when combined with complex and heterogeneous data from electronic medical record systems, can forecast acute postoperative pain outcomes with accuracies similar to methods that rely only on variables specifically collected for pain outcome prediction. PMID:26031220
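One of the compared baselines, k-nearest neighbour, reduces to a few lines. The two-feature toy "patients" below are hypothetical; the study's models consumed up to 796 clinical variables per patient.

```python
# Minimal k-nearest-neighbour forecaster of the kind compared in the
# study. Records are (feature_vector, label) pairs with invented values.

def knn_predict(train, query, k=3):
    """Majority vote of the k training records closest to the query."""
    dist2 = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda rec: dist2(rec[0], query))[:k]
    votes = sum(label for _, label in nearest)
    return 1 if 2 * votes > k else 0

# Hypothetical patients: (features, moderate-to-severe pain on POD 1?).
cohort = [((0.0, 0.0), 0), ((0.0, 1.0), 0), ((1.0, 0.0), 0),
          ((5.0, 5.0), 1), ((5.0, 6.0), 1), ((6.0, 5.0), 1)]
```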
Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng
2015-01-01
Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because sufficient performance indices and adequate overall performance scores are lacking. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort into constructing it first. To save researchers' time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
NASA Astrophysics Data System (ADS)
Markov, Yu. G.; Mikhailov, M. V.; Pochukaev, V. N.
2012-07-01
An analysis of perturbing factors influencing the motion of a navigation satellite (NS) is carried out, and the degree of influence of each factor on the GLONASS orbit is estimated. It is found that the fundamental components of the Earth's rotation parameters (ERP) are a substantial factor, commensurate with the maximum perturbations. Algorithms for the calculation of orbital perturbations caused by these parameters are given; these algorithms can be implemented in a consumer's equipment. The daily prediction of NS coordinates is performed on the basis of real GLONASS satellite ephemerides transmitted to a consumer, using the developed prediction algorithms that take the ERP into account. The accuracy obtained for the daily prediction of GLONASS ephemerides is tens of times better than that of the daily prediction performed with the algorithms recommended in the interface control documents.
Parry, J Preston; Riche, Daniel; Aldred, Justin; Isaacs, John; Lutz, Elizabeth; Butler, Vicki; Shwayder, James
To determine whether air bubbles infused into saline during flexible office hysteroscopy can accurately predict tubal patency. Diagnostic accuracy study (Canadian Task Force classification II-1). An academic hospital. Women undergoing office hysteroscopy and ultrasound. Air infusion into saline during office hysteroscopy. The primary outcome measures were whether air bubbles traverse the ostia at hysteroscopy, whether there is patency at abdominal surgery, and the rate of cul-de-sac (CDS) fluid accumulation from office hysteroscopy. Four hundred thirty-five patients underwent office hysteroscopy with air infusion, 89 of whom also had abdominal surgery. Depending on interpretation, sensitivity to tubal occlusion was 98.3% to 100%, and specificity was 83.7% at standard chromopertubation pressures; when proximal patency was observed at hysteroscopy, whole tubal patency was observed through chromopertubation 95.3% to 100% of the time for patients with surgical data. Changes in CDS fluid volume from before to after office hysteroscopy were also used as an indirect proxy for tubal patency. Patients with risk factors for occlusion such as known or suspected tubal disease, known or suspected adhesions, and sonographic identification of adhesions through the sliding sign were all less likely to demonstrate a change in CDS fluid volume after hysteroscopy than women without these risk factors (p < .0001). Bilateral dispersion of air bubbles during hysteroscopy better predicted shifts in CDS volume than these risk factors and demonstrated shifts comparable with bilateral patency at laparoscopy (p < .001). Air-infused saline at office hysteroscopy can accurately assess tubal patency. Additionally, bilateral patency identified through office hysteroscopy may predict bilateral patency at surgery better than several commonly used historic and sonographic variables. Published by Elsevier Inc.
A Sensor Dynamic Measurement Error Prediction Model Based on NAPSO-SVM
Jiang, Minlan; Jiang, Lan; Jiang, Dingde; Li, Fei
2018-01-01
Dynamic measurement error correction is an effective way to improve sensor precision. Dynamic measurement error prediction is an important part of error correction, and the support vector machine (SVM) is often used for predicting the dynamic measurement errors of sensors. Traditionally, the SVM parameters were set manually, which cannot ensure the model's performance. In this paper, an SVM method based on an improved particle swarm optimization algorithm (NAPSO) is proposed to predict the dynamic measurement errors of sensors. Natural selection and simulated annealing are added to the PSO to improve its ability to escape local optima. To verify the performance of NAPSO-SVM, three algorithms are used to optimize the SVM's parameters: the particle swarm optimization algorithm (PSO), the improved PSO algorithm (NAPSO), and the glowworm swarm optimization (GSO). The dynamic measurement error data of two sensors are used as the test data. The root mean squared error and mean absolute percentage error are employed to evaluate the prediction models' performances. The experimental results show that, among the three tested algorithms, the NAPSO-SVM method has better prediction precision and smaller prediction errors, and is an effective method for predicting the dynamic measurement errors of sensors. PMID:29342942
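A minimal particle swarm optimizer of the kind used to tune the SVM parameters can be sketched as below. The objective is a stand-in convex error surface rather than an actual SVM cross-validation, and the swarm constants are common textbook values, not the paper's NAPSO settings (no natural-selection or annealing step is included).

```python
import random

# Minimal PSO over a two-dimensional hyperparameter space (think C, gamma).
# The "validation error" surface is a hypothetical quadratic with its
# minimum at C = 1.0, gamma = 0.1; a real use would plug in CV error.

def pso(objective, bounds, n_particles=10, iters=50, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i][:]
    return gbest

err = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 0.1) ** 2  # stand-in error surface
best = pso(err, bounds=[(0.01, 10.0), (0.001, 1.0)])
```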
Effects of environmental change on plant species density: Comparing predictions with experiments
Gough, L.; Grace, J.B.
1999-01-01
Ideally, general ecological relationships may be used to predict responses of natural communities to environmental change, but few attempts have been made to determine the reliability of predictions based on descriptive data. Using a previously published structural equation model (SEM) of descriptive data from a coastal marsh landscape, we compared these predictions against observed changes in plant species density resulting from field experiments (manipulations of soil fertility, flooding, salinity, and mammalian herbivory) in two areas within the same marsh. In general, observed experimental responses were fairly consistent with predictions. The largest discrepancy occurred when sods were transplanted from high- to low-salinity sites and herbivores selectively consumed a particularly palatable plant species in the transplanted sods. Individual plot responses to some treatments were predicted more accurately than others. Individual fertilized plot responses were not consistent with predictions (P > 0.05), nor were fenced plots (herbivore exclosures; R2 = 0.15) compared to unfenced plots (R2 = 0.53). For the remaining treatments, predictions reasonably matched responses (R2 = 0.63). We constructed an SEM for the experimental data; it explained 60% of the variance in species density and showed that fencing and fertilization led to decreases in species density that were not predicted from treatment effects on community biomass or observed disturbance levels. These treatments may have affected the ratio of live to dead biomass, and competitive exclusion likely decreased species density in fenced and fertilized plots. We conclude that experimental validation is required to determine the predictive value of comparative relationships derived from descriptive data.
Guerra, Alexandra; Leite, Nuno; Marques, João Carlos; Ford, Alex T; Martins, Irene
2014-01-01
Understanding the environmental parameters that constrain the distribution of a species at its latitudinal extremes is critical for predicting how ecosystems react to climate change. Our first aim was to predict the variation in the amphipod populations of Echinogammarus marinus at the southernmost limit of its distribution under global warming scenarios. Our second aim was to test whether sex-ratio fluctuations - a mechanism frequently displayed by amphipods - respond to the variations in populations under altered climate conditions. To achieve these aims, scenarios were run with a validated model of E. marinus populations. Simulations were divided into: phase I - simulation of the effect of climate change on amphipod populations, and phase II - simulation of the effect of climate change on populations with male and female proportions. In both phases, temperature (T), salinity (S) and temperature and salinity (T-S) were tested. Results showed that E. marinus populations are highly sensitive to increases in temperature (>2 °C), which have adverse effects on amphipod recruitment and growth. Results from the climate change scenarios coupled with the sex-ratio fluctuations depended largely on the degree of female bias within the population. A temperature increase of 2 °C had less impact on female-biased populations, particularly when conjugated with increases in salinity. Male-biased populations were highly sensitive to any variation in temperature and/or salinity; these populations exhibited a long-term decline in density. Simulations in which temperature increased by more than 4 °C led to a continuous decline in the E. marinus population. According to this work, E. marinus populations at their southernmost limit are vulnerable to global warming. We anticipate that, in Europe, temperature increases of 2 °C will drive a northward withdrawal of about 5° of latitude for amphipod populations located at the southernmost geographical borders. 
This effect is discussed in relation to the distribution of E. marinus along the Atlantic coast. © 2013 Elsevier B.V. All rights reserved.
Nenna, Vanessa; Herckenrather, Daan; Knight, Rosemary; Odlum, Nick; McPhee, Darcy
2013-01-01
Developing effective resource management strategies to limit or prevent saltwater intrusion as a result of increasing demands on coastal groundwater resources requires reliable information about the geologic structure and hydrologic state of an aquifer system. A common strategy for acquiring such information is to drill sentinel wells near the coast to monitor changes in water salinity with time. However, installation and operation of sentinel wells is costly and provides limited spatial coverage. We studied the use of noninvasive electromagnetic (EM) geophysical methods as an alternative to installation of monitoring wells for characterizing coastal aquifers. We tested the feasibility of using EM methods at a field site in northern California to identify the potential for and/or presence of hydraulic communication between an unconfined saline aquifer and a confined freshwater aquifer. One-dimensional soundings were acquired using the time-domain electromagnetic (TDEM) and audiomagnetotelluric (AMT) methods. We compared inverted resistivity models of TDEM and AMT data obtained from several inversion algorithms. We found that multiple interpretations of inverted models can be supported by the same data set, but that there were consistencies between all data sets and inversion algorithms. Results from all collected data sets suggested that EM methods are capable of reliably identifying a saltwater-saturated zone in the unconfined aquifer. Geophysical data indicated that the impermeable clay between aquifers may be more continuous than is supported by current models.
LMI-Based Generation of Feedback Laws for a Robust Model Predictive Control Algorithm
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Carson, John M., III
2007-01-01
This technical note provides a mathematical proof of Corollary 1 from the paper 'A Nonlinear Model Predictive Control Algorithm with Proven Robustness and Resolvability' that appeared in the 2006 Proceedings of the American Control Conference. The proof was omitted for brevity in that publication. The paper was based on algorithms developed for the FY2005 R&TD (Research and Technology Development) project for Small-body Guidance, Navigation, and Control [2]. The framework established by the Corollary is for a robustly stabilizing MPC (model predictive control) algorithm for uncertain nonlinear systems that guarantees the resolvability of the associated finite-horizon optimal control problem in a receding-horizon implementation. Additional details of the framework are available in the publication.
NASA Technical Reports Server (NTRS)
Hunter, H. E.
1972-01-01
The Avco Data Analysis and Prediction Techniques (ADAPT) were employed to determine laws capable of detecting failures in a heat plant up to three days in advance of the occurrence of the failure. The projected performance of algorithms yielded a detection probability of 90% with false alarm rates of the order of 1 per year for a sample rate of 1 per day with each detection, followed by 3 hourly samplings. This performance was verified on 173 independent test cases. The program also demonstrated diagnostic algorithms and the ability to predict the time of failure to approximately plus or minus 8 hours up to three days in advance of the failure. The ADAPT programs produce simple algorithms which have a unique possibility of a relatively low cost updating procedure. The algorithms were implemented on general purpose computers at Kennedy Space Flight Center and tested against current data.
Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping
2011-04-01
In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
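A naive illustration of the interference BCC-PLS targets is given below: a low-order polynomial baseline superimposed on the analyte signal. Here the baseline is estimated by ordinary least squares and subtracted before calibration; BCC-PLS instead embeds the constraint into the PLS weight selection, and all numbers below are synthetic.

```python
# Remove a linear baseline from a synthetic "spectrum" by least squares.
# This is the separate-baseline-correction step whose uncertainty BCC-PLS
# is designed to avoid; values are invented for illustration.

def fit_linear_baseline(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

xs = list(range(10))
signal = [1.0 if x == 5 else 0.0 for x in xs]                # analyte band
spectrum = [0.5 * x + 2.0 + s for x, s in zip(xs, signal)]   # + linear baseline
a, b = fit_linear_baseline(xs, spectrum)
corrected = [y - (a * x + b) for x, y in zip(xs, spectrum)]  # baseline removed
```

After subtraction the analyte band dominates the corrected trace, although the peak itself slightly biases the fitted baseline, which is exactly the uncertainty the combined algorithm addresses.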
NASA Technical Reports Server (NTRS)
Nyangweso, Emmanuel; Bole, Brian
2014-01-01
Successful prediction and management of battery life using prognostic algorithms through ground and flight tests is important for performance evaluation of electrical systems. This paper details the design of test beds suitable for replicating loading profiles that would be encountered in deployed electrical systems. The test bed data will be used to develop and validate prognostic algorithms for predicting battery discharge time and battery failure time. Online battery prognostic algorithms will enable health management strategies. The platform used for algorithm demonstration is the EDGE 540T electric unmanned aerial vehicle (UAV). The fully designed test beds developed and detailed in this paper can be used to conduct battery life tests by controlling current and recording voltage and temperature to develop a model that makes a prediction of end-of-charge and end-of-life of the system based on rapid state of health (SOH) assessment.
Johansson, Daniel; Pereyra, Ricardo T; Rafajlović, Marina; Johannesson, Kerstin
2017-04-05
Establishing populations in ecologically marginal habitats may require substantial phenotypic changes that come about through phenotypic plasticity, local adaptation, or both. West-Eberhard's "plasticity-first" model suggests that plasticity allows for rapid colonisation of a new environment, followed by directional selection that develops local adaptation. Two predictions from this model are that (i) individuals of the original population have high enough plasticity to survive and reproduce in the marginal environment, and (ii) individuals of the marginal population show evidence of local adaptation. Individuals of the macroalga Fucus vesiculosus from the North Sea colonised the hyposaline (≥2-3‰) Baltic Sea less than 8000 years ago. The colonisation involved a switch from fully sexual to facultative asexual recruitment with release of adventitious branches that grow rhizoids and attach to the substratum. To test the predictions from the plasticity-first model we reciprocally transplanted F. vesiculosus from the original population (ambient salinity 24‰) and from the marginal population inside the Baltic Sea (ambient salinity 4‰). We also transplanted individuals of the Baltic endemic sister species F. radicans from 4 to 24‰. We assessed the degree of plasticity and local adaptation in growth and reproductive traits after 6 months by comparing the performance of individuals in 4 and 24‰. Branches of all individuals survived the 6 months period in both salinities, but grew better in their native salinity. Baltic Sea individuals more frequently developed asexual traits while North Sea individuals initiated formation of receptacles for sexual reproduction. Marine individuals of F. vesiculosus are highly plastic with respect to salinity and North Sea populations can survive the extreme hyposaline conditions of the Baltic Sea without selective mortality. 
Plasticity alone would thus allow for an initial establishment of this species inside the postglacial Baltic Sea at salinities where reproduction remains functional. Since establishment, the Baltic Sea populations have evolved adaptations to extreme hyposaline waters and have in addition evolved asexual recruitment that, however, tends to impede local adaptation. Overall, our results support the "plasticity-first" model for the initial colonisation of the Baltic Sea by Fucus vesiculosus.
Algorithms and the Future of Music Education: A Response to Shuler
ERIC Educational Resources Information Center
Thibeault, Matthew D.
2014-01-01
This article is a response to Shuler's 2001 article predicting the future of music education. The respondent assesses Shuler's predictions, finding that many have come true but critiquing Shuler's overall positive assessment. The respondent then goes on to make one prediction about the future of music education: that algorithms will…
Automatic intraaortic balloon pump timing using an intrabeat dicrotic notch prediction algorithm.
Schreuder, Jan J; Castiglioni, Alessandro; Donelli, Andrea; Maisano, Francesco; Jansen, Jos R C; Hanania, Ramzi; Hanlon, Pat; Bovelander, Jan; Alfieri, Ottavio
2005-03-01
The efficacy of intraaortic balloon counterpulsation (IABP) during arrhythmic episodes is questionable. A novel algorithm for intrabeat prediction of the dicrotic notch was used for real-time IABP inflation timing control. A windkessel model algorithm was used to calculate real-time aortic flow from aortic pressure. The dicrotic notch was predicted using a percentage of the calculated peak flow. Automatic inflation timing was set at the intrabeat predicted dicrotic notch and was combined with automatic IAB deflation. Prophylactic IABP was applied in 27 patients with low ejection fraction (< 35%) undergoing cardiac surgery. Analysis of IABP at a 1:4 ratio revealed that IAB inflation occurred at a mean of 0.6 +/- 5 ms from the dicrotic notch. Accurate automatic timing at a 1:1 assist ratio was achieved in all patients. Seventeen patients had episodes of severe arrhythmia; the novel IABP inflation algorithm accurately assisted 318 of 320 arrhythmic beats at a 1:1 ratio. The novel real-time intrabeat IABP inflation timing algorithm performed accurately in all patients during both regular rhythms and severe arrhythmia, allowing fully automatic intrabeat IABP timing.
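A sketch of the intrabeat approach described above: estimate aortic flow from pressure with a two-element windkessel (Q = C·dP/dt + P/R), then flag the predicted dicrotic notch when flow falls below a fixed fraction of its peak. The waveform, R, C and the threshold fraction are illustrative values, not the clinical algorithm's parameters.

```python
# Windkessel-based intrabeat notch prediction sketch. Flow is computed
# sample-by-sample from pressure; the notch is flagged when flow drops
# below a set fraction of peak flow. All constants are illustrative.

def predict_notch(pressure, dt=0.01, R=1.0, C=1.5, frac=0.3):
    """Return the sample index of the predicted dicrotic notch, or None."""
    flow = [C * (pressure[i + 1] - pressure[i]) / dt + pressure[i] / R
            for i in range(len(pressure) - 1)]
    peak = max(flow)
    for i in range(flow.index(peak), len(flow)):
        if flow[i] < frac * peak:
            return i
    return None

# Toy systolic upstroke followed by decay (mmHg, sampled every 10 ms).
pressure = [80, 95, 110, 120, 118, 112, 104, 96, 92, 90]
notch = predict_notch(pressure)
```

Because the criterion fires within the current beat, timing does not depend on the previous beat's interval, which is why the approach remains usable during arrhythmia.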
Mearelli, Filippo; Fiotti, Nicola; Giansante, Carlo; Casarsa, Chiara; Orso, Daniele; De Helmersen, Marco; Altamura, Nicola; Ruscio, Maurizio; Castello, Luigi Mario; Colonetti, Efrem; Marino, Rossella; Barbati, Giulia; Bregnocchi, Andrea; Ronco, Claudio; Lupia, Enrico; Montrucchio, Giuseppe; Muiesan, Maria Lorenza; Di Somma, Salvatore; Avanzi, Gian Carlo; Biolo, Gianni
2018-05-07
To derive and validate a predictive algorithm integrating a nomogram-based prediction of the pretest probability of infection with a panel of serum biomarkers, which could robustly differentiate sepsis/septic shock from noninfectious systemic inflammatory response syndrome. Multicenter prospective study. At emergency department admission in five University hospitals. Nine hundred forty-seven adults in the inception cohort and 185 adults in the validation cohort. None. A nomogram including age, Sequential Organ Failure Assessment score, recent antimicrobial therapy, hyperthermia, leukocytosis, and high C-reactive protein values was built from data on 716 infected patients and 120 patients with noninfectious systemic inflammatory response syndrome to predict the pretest probability of infection. Then, the best combination of procalcitonin, soluble phospholipase A2 group IIA, presepsin, soluble interleukin-2 receptor α, and soluble triggering receptor expressed on myeloid cells-1 was applied to categorize patients as "likely" or "unlikely" to be infected. The predictive algorithm required only procalcitonin, backed up with soluble phospholipase A2 group IIA determined in 29% of the patients, to rule out sepsis/septic shock with a negative predictive value of 93%. In a validation cohort of 158 patients, the predictive algorithm reached a negative predictive value of 100%, requiring biomarker measurements in 18% of the population. We have developed and validated a high-performing, reproducible, and parsimonious algorithm to assist emergency department physicians in distinguishing sepsis/septic shock from noninfectious systemic inflammatory response syndrome.
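The two-step screen can be sketched as a nomogram-style pretest probability followed by a procalcitonin (PCT) rule-out for low-probability patients. All coefficients, cutoffs and patient values below are placeholders, not the published nomogram or the study's validated thresholds.

```python
from math import exp

# Two-step sepsis screen sketch: hypothetical logistic "nomogram" score,
# then a biomarker back-up before ruling out infection. Every number here
# is an illustrative assumption.

def pretest_probability(age, sofa, crp, on_antimicrobials):
    """Placeholder logistic score; the real nomogram's weights differ."""
    z = -3.0 + 0.02 * age + 0.4 * sofa + 0.01 * crp + 0.8 * on_antimicrobials
    return 1.0 / (1.0 + exp(-z))

def classify(patient, prob_cutoff=0.3, pct_cutoff=0.5):
    p = pretest_probability(patient["age"], patient["sofa"],
                            patient["crp"], patient["abx"])
    if p >= prob_cutoff:
        return "likely infected"
    # Low pretest probability: back up with a biomarker before ruling out.
    return "likely infected" if patient["pct"] >= pct_cutoff else "unlikely infected"

low_risk  = {"age": 40, "sofa": 1, "crp": 10.0,  "abx": 0, "pct": 0.1}
high_risk = {"age": 70, "sofa": 8, "crp": 200.0, "abx": 1, "pct": 4.0}
```

The parsimony the abstract describes comes from the gating: biomarkers are only measured for the low-probability subgroup.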
CPU-GPU hybrid accelerating the Zuker algorithm for RNA secondary structure prediction applications.
Lei, Guoqing; Dou, Yong; Wan, Wen; Xia, Fei; Li, Rongchun; Ma, Meng; Zou, Dan
2012-01-01
Prediction of ribonucleic acid (RNA) secondary structure remains one of the most important research areas in bioinformatics. The Zuker algorithm is one of the most popular free energy minimization methods for RNA secondary structure prediction. Thus far, few studies have been reported on accelerating the Zuker algorithm on general-purpose processors or on extra accelerators such as Field Programmable Gate Arrays (FPGAs) and Graphics Processing Units (GPUs). To the best of our knowledge, no implementation combines both CPUs and extra accelerators, such as GPUs, to accelerate Zuker algorithm applications. In this paper, a CPU-GPU hybrid computing system that accelerates Zuker algorithm applications for RNA secondary structure prediction is proposed. The computing tasks are allocated between the CPU and the GPU for cooperative parallel execution. Performance differences between the CPU and the GPU are considered in the task-allocation scheme to obtain workload balance. To improve the hybrid system's performance, the Zuker algorithm is optimally implemented with methods specialized for the CPU and GPU architectures. The experimental results show a speedup of 15.93× over an optimized multi-core SIMD CPU implementation and a performance advantage of 16% over an optimized GPU implementation. More than 14% of the sequences are executed on the CPU in the hybrid system. The system combining CPU and GPU to accelerate the Zuker algorithm is shown to be promising and can be applied to other bioinformatics applications.
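A toy version of the task-allocation idea: assign each sequence to whichever device would finish it sooner, given a cubic (Zuker-like) work estimate and a measured GPU/CPU speed ratio. Both the cost model and the 16× ratio below are illustrative assumptions, not the paper's scheme.

```python
# Greedy CPU/GPU allocation sketch: longest sequences first, each placed
# on the device with the earlier estimated finish time. Cost model and
# speed ratio are illustrative.

def allocate(lengths, gpu_speedup=16.0):
    """Return per-sequence device assignment plus accumulated device loads."""
    cpu_t, gpu_t, plan = 0.0, 0.0, {}
    for i, n in sorted(enumerate(lengths), key=lambda t: -t[1]):
        cost = float(n) ** 3                  # O(n^3) dynamic-programming work
        if cpu_t + cost < gpu_t + cost / gpu_speedup:
            cpu_t += cost
            plan[i] = "cpu"
        else:
            gpu_t += cost / gpu_speedup
            plan[i] = "gpu"
    return plan, cpu_t, gpu_t

plan, cpu_time, gpu_time = allocate([100] * 17)
```

With 17 equal-length sequences and a 16× ratio, one sequence lands on the CPU, echoing the abstract's observation that a modest share of sequences runs on the CPU while the devices stay balanced.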
High-speed prediction of crystal structures for organic molecules
NASA Astrophysics Data System (ADS)
Obata, Shigeaki; Goto, Hitoshi
2015-02-01
We developed a master-worker type parallel algorithm that allocates crystal structure optimization tasks to distributed compute nodes, in order to improve the performance of crystal structure prediction simulations. Performance experiments were conducted on the TUT-ADSIM supercomputer system (HITACHI HA8000-tc/HT210). The experimental results show that our parallel algorithm achieved speed-ups of 214 and 179 times using 256 processor cores on crystal structure optimizations in predictions of crystal structures for 3-aza-bicyclo(3.3.1)nonane-2,4-dione and 2-diazo-3,5-cyclohexadiene-1-one, respectively. We expect that this parallel algorithm can reduce the computational cost of any crystal structure prediction.
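The master-worker pattern above can be sketched with a thread pool: a master hands independent optimization tasks to workers as they become free. The `optimize` function here is a stand-in for a real crystal-structure optimization, and the pool size is arbitrary.

```python
# Minimal master-worker sketch. 'optimize' is a placeholder for one
# crystal structure optimization task.

from concurrent.futures import ThreadPoolExecutor

def optimize(structure_id):
    """Stand-in: pretend to minimize an energy for this trial structure."""
    return structure_id, -1.0 * structure_id     # (trial id, mock "energy")

def master(task_ids, n_workers=4):
    # Dynamic scheduling: each worker pulls the next task as soon as it
    # finishes one, so uneven task durations do not idle the pool.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return dict(pool.map(optimize, task_ids))
```

Dynamic pull-based scheduling is what lets this scale to hundreds of cores: no static partition of the trial structures has to guess in advance how long each optimization will take.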
Ultrasonic prediction of term birth weight in Hispanic women. Accuracy in an outpatient clinic.
Nahum, Gerard G; Pham, Krystle Q; McHugh, John P
2003-01-01
To investigate the accuracy of ultrasonic fetal biometric algorithms for estimating term fetal weight. Ultrasonographic fetal biometric assessments were made in 74 Hispanic women who delivered at 37-42 weeks of gestation. Measurements were taken of the fetal biparietal diameter, head circumference, abdominal circumference and femur length. Twenty-seven standard fetal biometric algorithms were assessed for their accuracy in predicting fetal weight. Results were compared to those obtained by merely guessing the mean term birth weight in each case. The correlation between ultrasonically predicted and actual birth weights ranged from 0.52 to 0.79. The different ultrasonic algorithms estimated fetal weight to within +/- 8.6-15.0% (+/- 295-520 g) of actual birth weight as compared with +/- 13.6% (+/- 449 g) for guessing the mean birth weight in each case (mean +/- SD). The mean absolute prediction errors for 17 of the ultrasonic equations (63%) were superior to those obtained by guessing the mean birth weight by 3.2-5.0% (96-154 g) (P < .05). Fourteen algorithms (52%) were more accurate for predicting fetal weight to within +/- 15%, and 20 algorithms (74%) were more accurate for predicting fetal weight to within +/- 10% of actual birth weight than simply guessing the mean birth weight (P < .05). Ten ultrasonic equations (37%) showed significant utility for predicting fetal weight > 4,000 g (likelihood ratio > 5.0). Term fetal weight predictions using the majority of sonographic fetal biometric equations are more accurate, by up to 154 g and 5%, than simply guessing the population-specific mean birth weight.
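The comparison reported above (algorithm error versus simply guessing the mean birth weight) boils down to mean absolute percentage error. A minimal sketch, using made-up illustrative weights rather than study data:

```python
# Mean absolute percentage error of an estimator vs. the "guess the
# mean" baseline. The weights below are ILLUSTRATIVE, not study data.

def mean_abs_pct_error(predicted, actual):
    return sum(abs(p - a) / a for p, a in zip(predicted, actual)) / len(actual) * 100

actual    = [3100, 3400, 3650, 3900, 4200]             # grams (illustrative)
estimated = [3000, 3500, 3600, 4100, 4000]             # from some algorithm
baseline  = [sum(actual) / len(actual)] * len(actual)  # guess the mean

algo_err = mean_abs_pct_error(estimated, actual)
base_err = mean_abs_pct_error(baseline, actual)
```

An ultrasonic algorithm "beats" the baseline exactly when its mean absolute percentage error is lower, which is the criterion the study applies to each of its 27 equations.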
A utility/cost analysis of breast cancer risk prediction algorithms
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.
2016-03-01
Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
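The expected-cost bookkeeping above can be sketched directly: a risk algorithm with hit rate h and false-alarm rate f routes women to the enhanced screen, everyone else gets the standard screen, and each path incurs scan and outcome costs. All costs, prevalences, and operating points below are hypothetical, not the published model's values.

```python
# Expected cost per woman screened, under the independence assumption
# stated in the abstract. All numbers are HYPOTHETICAL placeholders.

def expected_cost(h, f, prev=0.005,
                  c_std=100, c_enh=400,            # scan costs
                  c_fn=50000, c_fp=1000,           # outcome costs
                  sens_std=0.80, spec_std=0.95,
                  sens_enh=0.95, spec_enh=0.90):
    cost = 0.0
    # women with cancer, routed to enhanced vs standard screening
    cost += prev * h * (c_enh + (1 - sens_enh) * c_fn)
    cost += prev * (1 - h) * (c_std + (1 - sens_std) * c_fn)
    # healthy women, routed to enhanced vs standard screening
    cost += (1 - prev) * f * (c_enh + (1 - spec_enh) * c_fp)
    cost += (1 - prev) * (1 - f) * (c_std + (1 - spec_std) * c_fp)
    return cost
```

Since the cost is linear in both h and f, setting it to a constant and solving for h as a function of f yields a straight line, which is why the iso-cost contours described above are lines whose slope depends only on the screening systems' costs and operating points.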
Hu, Chen; Steingrimsson, Jon Arni
2018-01-01
A crucial component of making individualized treatment decisions is to accurately predict each patient's disease risk. In clinical oncology, disease risks are often measured through time-to-event data, such as overall survival and progression/recurrence-free survival, and are often subject to censoring. Risk prediction models based on recursive partitioning methods are becoming increasingly popular largely due to their ability to handle nonlinear relationships, higher-order interactions, and/or high-dimensional covariates. The most popular recursive partitioning methods are versions of the Classification and Regression Tree (CART) algorithm, which builds a simple interpretable tree structured model. With the aim of increasing prediction accuracy, the random forest algorithm averages multiple CART trees, creating a flexible risk prediction model. Risk prediction models used in clinical oncology commonly use both traditional demographic and tumor pathological factors as well as high-dimensional genetic markers and treatment parameters from multimodality treatments. In this article, we describe the most commonly used extensions of the CART and random forest algorithms to right-censored outcomes. We focus on how they differ from the methods for noncensored outcomes, and how the different splitting rules and methods for cost-complexity pruning impact these algorithms. We demonstrate these algorithms by analyzing a randomized Phase III clinical trial of breast cancer. We also conduct Monte Carlo simulations to compare the prediction accuracy of survival forests with more commonly used regression models under various scenarios. These simulation studies aim to evaluate how sensitive the prediction accuracy is to the underlying model specifications, the choice of tuning parameters, and the degrees of missing covariates.
Grötzinger, Stefan W.; Alam, Intikhab; Ba Alawi, Wail; Bajic, Vladimir B.; Stingl, Ulrich; Eppinger, Jörg
2014-01-01
Reliable functional annotation of genomic data is the key step in the discovery of novel enzymes. Intrinsic sequencing data quality problems of single amplified genomes (SAGs) and poor homology of the genomes of novel extremophiles pose significant challenges for attributing functions to the coding sequences identified. The anoxic deep-sea brine pools of the Red Sea are a promising source of novel enzymes with unique evolutionary adaptations. Sequencing data from Red Sea brine pool cultures and SAGs are annotated and stored in the Integrated Data Warehouse of Microbial Genomes (INDIGO) data warehouse. Low sequence homology of annotated genes (no similarity for 35% of these genes) may translate into false positives when searching for specific functions. The Profile and Pattern Matching (PPM) strategy described here was developed to eliminate false positive annotations of enzyme function before progressing to labor-intensive hyper-saline gene expression and characterization. It utilizes InterPro-derived Gene Ontology (GO) terms (which represent enzyme function profiles) and annotated relevant PROSITE IDs (which are linked to an amino acid consensus pattern). The PPM algorithm was tested on 15 protein families, which were selected based on scientific and commercial potential. An initial list of 2577 enzyme commission (E.C.) numbers was translated into 171 GO terms and 49 consensus patterns. A subset of INDIGO sequences consisting of 58 SAGs from six different taxa of bacteria and archaea was selected from six different brine pool environments. Those SAGs code for 74,516 genes, which were independently scanned for the GO terms (profile filter) and PROSITE IDs (pattern filter). Following stringent reliability filtering, the non-redundant hits (106 profile hits and 147 pattern hits) are classified as reliable if at least two relevant descriptors (GO terms and/or consensus patterns) are present.
Scripts for annotation, as well as for the PPM algorithm, are available through the INDIGO website. PMID:24778629
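The reliability rule described above (a hit is kept only if at least two relevant descriptors support it) reduces to a set-intersection count. The GO terms and PROSITE identifiers below are placeholders, not the study's curated lists.

```python
# Sketch of the PPM 'two relevant descriptors' reliability filter.
# Descriptor identifiers are PLACEHOLDERS for illustration.

def reliable_hits(gene_descriptors, relevant_go, relevant_patterns):
    """Keep genes supported by >= 2 relevant GO terms and/or patterns."""
    relevant = relevant_go | relevant_patterns
    return {gene for gene, descs in gene_descriptors.items()
            if len(descs & relevant) >= 2}
```

Requiring two independent lines of evidence (a function profile and/or a consensus pattern) is what lets the filter discard single-descriptor false positives arising from low sequence homology.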
Estimation of State Transition Probabilities: A Neural Network Model
NASA Astrophysics Data System (ADS)
Saito, Hiroshi; Takiyama, Ken; Okada, Masato
2015-12-01
Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
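The proposed learning rule can be sketched as a delta rule driven by the state prediction error: after each observed transition from a state, the predicted probability of every successor is nudged toward the observed one-hot outcome. This is a generic sketch of that idea; the published model's exact form may differ.

```python
# Delta-rule estimation of P(next_state | s) from state prediction errors.
# Because the per-state increments sum to lr * (1 - sum(probs)), a
# normalized estimate stays normalized after every update.

import random

def update(probs, observed_next, lr=0.1):
    """probs: dict next_state -> estimated P(next_state | s)."""
    for s_next in probs:
        target = 1.0 if s_next == observed_next else 0.0
        probs[s_next] += lr * (target - probs[s_next])   # prediction error
    return probs

# With enough samples the estimates approach the true transition
# probability (here 0.8 for "A"), given an appropriately small rate.
random.seed(0)
p = {"A": 0.5, "B": 0.5}
for _ in range(5000):
    p = update(p, "A" if random.random() < 0.8 else "B", lr=0.01)
```

The learning rate trades off convergence speed against the variance of the final estimate, matching the abstract's point that an appropriate rate is needed to learn the probability completely.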
Predicting CO2-H2O Interfacial Tension Using COSMO-RS.
Silvestri, A; Stipp, S L S; Andersson, M P
2017-02-14
Knowledge about the interaction between fluids and solids and the interfacial tension (IFT) that results is important for predicting behavior and properties in industrial systems and in nature, such as in rock formations before, during, and after CO2 injection for long-term storage. Many authors have studied the effect of the environmental variables on the IFT in the CO2-H2O system. However, experimental measurements above CO2 supercritical conditions are scarce and sometimes contradictory. Molecular modeling is a valuable tool for complementing experimental IFT determination, and it can help us interpret results and gain insight under conditions where experiments are difficult or impossible. Here, we report predictions for CO2-water interfacial tension performed using density functional theory (DFT) combined with the COSMO-RS implicit solvent model. We predicted the IFT dependence as a function of pressure (0-50 MPa), temperature (273-383 K), and salinity (0-5 M NaCl). The results agree well with literature data, within the estimated uncertainty for experiments and for molecular dynamics (MD) simulations, suggesting that the model can be used as a fast alternative to time-consuming computational approaches for predicting the CO2-water IFT over a range of pressures, temperatures, and salinities.
Sardella, Brian A; Kültz, Dietmar
2014-01-01
The green sturgeon (Acipenser medirostris) is an anadromous species with a distinct population segment in the San Francisco Bay-Sacramento River Delta that is currently listed as threatened. Although this species is able to tolerate salinity challenges as soon as 6 mo posthatch, its ability to deal with unpredictable salinity fluctuations remains unknown. Global climate change is predicted to result in large freshwater (FW) flushes through the estuary during winter and greater tidal influence during the summer. We exposed green sturgeon acclimated to 15 (EST) or 24 (BAY) g/L salinity to a rapid FW influx, where salinity was reduced to 0 g/L in 3 h in order to simulate the effect of the "winter" scenario. Both groups survived, enduring a 10% plasma osmolality reduction after 3 h. BAY-acclimated sturgeon upregulated both Na(+), K(+)-ATPase (NKA) activity and caspase 3/7 activity, but no changes were observed in the EST-acclimated fish. In addition, we exposed FW-acclimated sturgeon to a dual 12-h salinity fluctuation cycle (0-24-0 g/L) in order to simulate the effect of greater tidal influence. At 6 h, the sturgeon showed a significant increase in plasma osmolality, and branchial NKA and caspase 3/7 activities were increased, indicating an acclimation response. There was no acclimation at 18 h, and plasma osmolality was higher than the peak observed at 6 h. The second fluctuation elicited an upregulation of the stress proteins ubiquitin and heat shock 70-kDa protein (HSP 70). Sturgeon can acclimate to changes in salinity; however, salinity fluctuations resulted in substantial cellular stress.
Bartsch, Georg; Mitra, Anirban P; Mitra, Sheetal A; Almal, Arpit A; Steven, Kenneth E; Skinner, Donald G; Fry, David W; Lenehan, Peter F; Worzel, William P; Cote, Richard J
2016-02-01
Due to the high recurrence risk of nonmuscle invasive urothelial carcinoma it is crucial to distinguish patients at high risk from those with indolent disease. In this study we used a machine learning algorithm to identify the genes in patients with nonmuscle invasive urothelial carcinoma at initial presentation that were most predictive of recurrence. We used the genes in a molecular signature to predict recurrence risk within 5 years after transurethral resection of bladder tumor. Whole genome profiling was performed on 112 frozen nonmuscle invasive urothelial carcinoma specimens obtained at first presentation on Human WG-6 BeadChips (Illumina®). A genetic programming algorithm was applied to evolve classifier mathematical models for outcome prediction. Cross-validation based resampling and gene use frequencies were used to identify the most prognostic genes, which were combined into rules used in a voting algorithm to predict the sample target class. Key genes were validated by quantitative polymerase chain reaction. The classifier set included 21 genes that predicted recurrence. Quantitative polymerase chain reaction was done for these genes in a subset of 100 patients. A 5-gene combined rule incorporating a voting algorithm yielded 77% sensitivity and 85% specificity to predict recurrence in the training set, and 69% and 62%, respectively, in the test set. A singular 3-gene rule was constructed that predicted recurrence with 80% sensitivity and 90% specificity in the training set, and 71% and 67%, respectively, in the test set. Using primary nonmuscle invasive urothelial carcinoma from initial occurrences genetic programming identified transcripts in reproducible fashion, which were predictive of recurrence. These findings could potentially impact nonmuscle invasive urothelial carcinoma management. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
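The voting scheme described above (each learned gene-expression rule casts a vote, and the majority decides the class) can be sketched as follows. The rules, gene names, and thresholds here are invented placeholders, not the published signature.

```python
# Majority-vote classifier over gene-expression rules, in the spirit of
# the genetic-programming voting algorithm described in the abstract.
# Gene names and thresholds are HYPOTHETICAL.

def predict_recurrence(expr, rules):
    """expr: dict gene -> expression level; rules: list of predicates."""
    votes = sum(1 for rule in rules if rule(expr))
    return votes > len(rules) / 2          # simple majority

rules = [
    lambda e: e["GENE1"] > 2.0,            # placeholder rule 1
    lambda e: e["GENE2"] < 0.5,            # placeholder rule 2
    lambda e: e["GENE3"] > 1.0,            # placeholder rule 3
]
```

Combining several weak rules by vote is what gives the published classifier its robustness: no single gene threshold has to be decisive on its own.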
Network-based ranking methods for prediction of novel disease associated microRNAs.
Le, Duc-Hau
2015-10-01
Many studies have shown roles of microRNAs in human disease, and a number of computational methods have been proposed to predict such associations by ranking candidate microRNAs according to their relevance to a disease. Among them, machine learning-based methods usually have a limitation in specifying non-disease microRNAs as negative training samples. Meanwhile, network-based methods are becoming dominant since they well exploit a "disease module" principle in microRNA functional similarity networks. Among these, the random walk with restart (RWR) algorithm-based method is currently state-of-the-art. The use of this algorithm was inspired by its success in predicting disease genes, because the "disease module" principle also exists in protein interaction networks. Besides, many algorithms designed for webpage ranking have been successfully applied to ranking disease candidate genes because web networks share topological properties with protein interaction networks. However, these algorithms have not yet been utilized for disease microRNA prediction. We constructed microRNA functional similarity networks based on shared targets of microRNAs, and then we integrated them with a microRNA functional synergistic network, which was recently identified. After analyzing topological properties of these networks, in addition to RWR, we assessed the performance of (i) PRINCE (PRIoritizatioN and Complex Elucidation), which was proposed for disease gene prediction; (ii) PageRank with Priors (PRP) and K-Step Markov (KSM), which were used for studying web networks; and (iii) a neighborhood-based algorithm. Analyses of topological properties showed that all microRNA functional similarity networks are small-world and scale-free. The performance of each algorithm was assessed based on average AUC values on 35 disease phenotypes and average rankings of newly discovered disease microRNAs. As a result, the performance on the integrated network was better than that on individual ones.
In addition, the performance of PRINCE, PRP and KSM was comparable with that of RWR, whereas it was worst for the neighborhood-based algorithm. Moreover, all the algorithms were stable with respect to parameter changes. Finally, using the integrated network, we predicted six novel miRNAs (i.e., hsa-miR-101, hsa-miR-181d, hsa-miR-192, hsa-miR-423-3p, hsa-miR-484 and hsa-miR-98) associated with breast cancer. Network-based ranking algorithms, which were successfully applied either to disease gene prediction or to studying social/web networks, can also be used effectively for disease microRNA prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
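The RWR baseline the study builds on fits in a few lines: seed nodes (known disease microRNAs) receive restart mass, and the walk diffuses relevance over the similarity network until it converges. The toy network and node names below are illustrative, not from the paper.

```python
# Random walk with restart (RWR) by power iteration on a weighted,
# undirected similarity network given as adjacency dictionaries.

def rwr(adj, seeds, restart=0.5, tol=1e-10):
    """adj: node -> {neighbor: weight}; seeds: known disease nodes."""
    deg = {n: sum(adj[n].values()) for n in adj}
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in adj}
    p = dict(p0)
    while True:
        new = {n: restart * p0[n] for n in adj}      # restart mass
        for m in adj:                                # diffuse p[m] outward
            for n, w in adj[m].items():
                new[n] += (1 - restart) * p[m] * w / deg[m]
        delta = max(abs(new[n] - p[n]) for n in adj)
        p = new
        if delta < tol:
            return p     # steady-state relevance scores used for ranking
```

On a chain network seeded at one end, the steady-state scores decay with distance from the seed, which is exactly the "disease module" behavior the ranking exploits.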
Single-step methods for predicting orbital motion considering its periodic components
NASA Astrophysics Data System (ADS)
Lavrov, K. N.
1989-01-01
Modern numerical methods for integrating ordinary differential equations can provide accurate and universal solutions to celestial mechanics problems. The implicit single-sequence algorithms of Everhart and multiple-step computational schemes that use a priori information on periodic components can be combined to construct implicit single-sequence algorithms sharing their advantages. The construction and analysis of the properties of such algorithms are studied, utilizing trigonometric approximation of the solutions of differential equations containing periodic components. The algorithms require 10 percent more machine memory than the Everhart algorithms but are twice as fast, and they yield short-term predictions valid for five to ten orbits with good accuracy, five to six times faster than algorithms based on other methods.
Ozaslan, Cumali; Farooq, Shahid; Onen, Huseyin; Bukun, Bekir; Ozcan, Selcuk; Gunal, Hikmet
2016-01-01
Invasive plants are recognized for their impressive abilities to withstand adverse environmental conditions; however, not all invaders express similar abilities. Therefore, the survival, growth, nutrient uptake, and fecundity of two co-occurring, invasive Physalis species were tested under water and salinity stresses and on different soil textures in the current study. Five water stress levels (100, 75, 50, 25, and 12.5% pot water content), four soil salinity levels (0, 3, 6, and 12 dS m-1), and four soil textures (67% clay, 50% clay, silt clay loam, and sandy loam) were included in three pot experiments. Both weeds survived under all levels of water stress except 12.5% water content and on all soil types; however, they behaved differently under increasing salinity. The weeds responded similarly to salinity up to 3 dS m-1, whereas P. philadelphica survived longer than P. angulata under the remaining salinity regimes. Water and salinity stress hampered the growth and fecundity of both weeds, while soil texture had only a slight effect. Both weeds preferred clay-textured soils for growth and nutrient uptake; however, the interactive effect of weeds and soil textures was non-significant. P. angulata accumulated more K and Na, while P. philadelphica accrued more Ca and Mg and maintained a better K/Na ratio. Under salinity stress, P. angulata accumulated more Na and P, while P. philadelphica accrued more K and Mg and maintained a higher K/Na ratio. Collectively, the highest nutrient accumulation was observed under stress-free conditions and on clay-textured soils. P. philadelphica exhibited higher reproductive output than P. angulata under all experimental conditions. It is predicted that P. philadelphica will be more problematic under optimal water supply and high salinity, while P. angulata is better adapted to water-limited environments.
The results indicate that both weeds have considerable potential to further expand their ranges in semi-arid regions of Turkey.
NASA Astrophysics Data System (ADS)
Lekakis, E. H.; Antonopoulos, V. Z.
2015-11-01
Simulation models can be important tools for analyzing and managing irrigation, soil salinization or crop production problems. In this study a mathematical model that describes the water movement and mass transport of individual ions (Ca2+, Mg2+ and Na+) and overall soil salinity by means of the soil solution electrical conductivity, is used. The mass transport equations of Ca2+, Mg2+ and Na+ have been incorporated as part of the integrated model WANISIM and the soil salinity was computed as the sum of individual ions. The model was calibrated and validated against field data, collected during a three year experiment in plots of maize, irrigated with three different irrigation water qualities, at Thessaloniki area in Northern Greece. The model was also used to evaluate salinization and sodification hazards by the use of irrigation water with increasing electrical conductivity of 0.8, 3.2 and 6.4 dS m-1, while maintaining a ratio of Ca2+:Mg2+:Na+ equal to 3:3:2. The qualitative and quantitative procedures for results evaluation showed that there was good agreement between the simulated and measured values of the water content, overall salinity and the concentration of individual soluble cations, at two soil layers (0-35 and 35-75 cm). Nutrient uptake was also taken into account. Locally available irrigation water (ECiw = 0.8 dS m-1) did not cause soil salinization or sodification. On the other hand, irrigation water with ECiw equal to 3.2 and 6.4 dS m-1 caused severe soil salinization, but not sodification. The rainfall water during the winter seasons was not sufficient to leach salts below the soil profile of 110 cm. The modified version of model WANISIM is able to predict the effects of irrigation with saline waters on soil and plant growth and it is suitable for irrigation management in areas with scarce and low quality water resources.
Céspedes, V; Pallarés, S; Arribas, P; Millán, A; Velasco, J
2013-10-01
Water salinity and ionic composition are among the main environmental variables that constrain the fundamental niches of aquatic species, and accordingly, physiological tolerance to these factors constitutes a crucial part of the evolution, ecology, and biogeography of these organisms. The present study experimentally estimated the fundamental saline and anionic niches of adults of two pairs of congeneric saline beetle species that differ in habitat preference (lotic and lentic) in order to test the habitat constraint hypothesis. Osmotic and anionic realised niches were also estimated based on the field occurrences of adult beetle species using Outlying Mean Index analysis and their relationship with experimental tolerances. In the laboratory, all of the studied species showed a threshold response to increased salinity, displaying high survival times when exposed to low and intermediate conductivity levels. These results suggest that these species are not strictly halophilic, but that they are able to regulate both hyperosmotically and hypoosmotically. Anionic water composition had a significant effect on salinity tolerance at conductivity levels near their upper tolerance limits, with decreased species survival at elevated sulphate concentrations. Species occupying lentic habitats demonstrated higher salinity tolerance than their lotic congeners, in agreement with the habitat constraint hypothesis. As expected, realised salinity niches were narrower than fundamental niches and corresponded to conditions near the upper tolerance limits of the species. These species are uncommon in freshwater and low-conductivity habitats despite the fact that such conditions might be physiologically suitable for the adult life stage. Other factors, such as biotic interactions, could prevent their establishment at low salinities. Differences in the realised anionic niches of congeneric species could be partially explained by the varying habitat availability in the study area. 
Combining the experimental estimation of fundamental niches with realised field data niche estimates is a powerful method for understanding the main factors constraining species' distribution at multiple scales, which is a key issue when predicting species' ability to cope with global change. Copyright © 2013 Elsevier Ltd. All rights reserved.
An algorithm for direct causal learning of influences on patient outcomes.
Rathnam, Chandramouli; Lee, Sanghoon; Jiang, Xia
2017-01-01
This study aims at developing and introducing a new algorithm, called direct causal learner (DCL), for learning the direct causal influences of a single target. We applied it to both simulated and real clinical and genome wide association study (GWAS) datasets and compared its performance to classic causal learning algorithms. The DCL algorithm learns the causes of a single target from passive data using Bayesian-scoring, instead of using independence checks, and a novel deletion algorithm. We generate 14,400 simulated datasets and measure the number of datasets for which DCL correctly and partially predicts the direct causes. We then compare its performance with the constraint-based path consistency (PC) and conservative PC (CPC) algorithms, the Bayesian-score based fast greedy search (FGS) algorithm, and the partial ancestral graphs algorithm fast causal inference (FCI). In addition, we extend our comparison of all five algorithms to both a real GWAS dataset and real breast cancer datasets over various time-points in order to observe how effective they are at predicting the causal influences of Alzheimer's disease and breast cancer survival. DCL consistently outperforms FGS, PC, CPC, and FCI in discovering the parents of the target for the datasets simulated using a simple network. Overall, DCL predicts significantly more datasets correctly (McNemar's test significance: p<0.0001) than any of the other algorithms for these network types. For example, when assessing overall performance (simple and complex network results combined), DCL correctly predicts approximately 1400 more datasets than the top FGS method, 1600 more datasets than the top CPC method, 4500 more datasets than the top PC method, and 5600 more datasets than the top FCI method. 
Although FGS did correctly predict more datasets than DCL for the complex networks, and DCL correctly predicted only a few more datasets than CPC for these networks, there is no significant difference in performance between these three algorithms for this network type. However, when we use a more continuous measure of accuracy, we find that all the DCL methods are able to better partially predict more direct causes than FGS and CPC for the complex networks. In addition, DCL consistently had faster runtimes than the other algorithms. In the application to the real datasets, DCL identified rs6784615, located on the NISCH gene, and rs10824310, located on the PRKG1 gene, as direct causes of late onset Alzheimer's disease (LOAD) development. In addition, DCL identified ER category as a direct predictor of breast cancer mortality within 5 years, and HER2 status as a direct predictor of 10-year breast cancer mortality. These predictors have been identified in previous studies to have a direct causal relationship with their respective phenotypes, supporting the predictive power of DCL. When the other algorithms discovered predictors from the real datasets, these predictors were either also found by DCL or could not be supported by previous studies. Our results show that DCL outperforms FGS, PC, CPC, and FCI in almost every case, demonstrating its potential to advance causal learning. Furthermore, our DCL algorithm effectively identifies direct causes in the LOAD and Metabric GWAS datasets, which indicates its potential for clinical applications. Copyright © 2016 Elsevier B.V. All rights reserved.
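DCL's Bayesian-scoring approach to learning the direct causes of a single target can be illustrated, though not reproduced (the published algorithm also includes a deletion step and uses its own score), by a greedy, BIC-scored parent search over binary data:

```python
# Toy score-based search for the parents of one target variable.
# Greedy forward selection with a BIC score; a SKETCH, not DCL itself.

import math
from collections import Counter

def bic_score(data, target, parents):
    """BIC of a binary target given a candidate parent set (rows are dicts)."""
    n = len(data)
    counts = Counter((tuple(r[p] for p in parents), r[target]) for r in data)
    ll = 0.0
    for (cfg, _), c in counts.items():
        total = counts[(cfg, 0)] + counts[(cfg, 1)]
        ll += c * math.log(c / total)            # cell log-likelihood
    return ll - 0.5 * (2 ** len(parents)) * math.log(n)   # complexity penalty

def greedy_parents(data, target, candidates):
    """Add the candidate that most improves the score until none helps."""
    parents = []
    best = bic_score(data, target, parents)
    while True:
        scored = [(bic_score(data, target, parents + [c]), c)
                  for c in candidates if c not in parents]
        if not scored:
            return parents
        top_score, top_c = max(scored)
        if top_score <= best:
            return parents
        best, parents = top_score, parents + [top_c]
```

Scoring parent sets directly, rather than running independence tests, is the design choice the abstract credits for DCL's robustness; the BIC penalty here plays the role of guarding against spuriously added parents.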
Soil salt content estimation in the Yellow River delta with satellite hyperspectral data
Weng, Yongling; Gong, Peng; Zhu, Zhi-Liang
2008-01-01
Soil salinization is one of the most common land degradation processes and is a severe environmental hazard. The primary objective of this study is to investigate the potential of predicting salt content in soils with hyperspectral data acquired with EO-1 Hyperion. Both partial least-squares regression (PLSR) and conventional multiple linear regression (MLR), such as stepwise regression (SWR), were tested as the prediction model. PLSR is commonly used to overcome the problem caused by high-dimensional and correlated predictors. Chemical analysis of 95 samples collected from the top layer of soils in the Yellow River delta area shows that salt content was high on average, and the dominant chemicals in the saline soil were NaCl and MgCl2. Multivariate models were established between soil salt content and hyperspectral data. Our results indicate that the PLSR technique with laboratory spectral data has a strong prediction capacity. Spectral bands at 1487-1527, 1971-1991, 2032-2092, and 2163-2355 nm possessed large absolute values of regression coefficients, with the largest coefficient at 2203 nm. We obtained a root mean squared error (RMSE) for calibration (with 61 samples) of RMSEC = 0.753 (R2 = 0.893) and a root mean squared error for validation (with 30 samples) of RMSEV = 0.574. The prediction model was applied on a pixel-by-pixel basis to a Hyperion reflectance image to yield a quantitative surface distribution map of soil salt content. The result was validated successfully from 38 sampling points. We obtained an RMSE estimate of 1.037 (R2 = 0.784) for the soil salt content map derived by the PLSR model. The salinity map derived from the SWR model shows that the predicted value is higher than the true value. These results demonstrate that the PLSR method is a more suitable technique than stepwise regression for quantitative estimation of soil salt content in a large area. © 2008 CASI.
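The PLSR technique the study favors can be sketched compactly with the classic PLS1 (NIPALS) recursion: each latent component is the direction of maximum covariance with the response, and both blocks are deflated before the next component is extracted. This is a bare sketch; a real hyperspectral application would also select the number of components by cross-validation.

```python
# Minimal PLS1 (NIPALS) regression sketch in NumPy.

import numpy as np

def pls1_fit(X, y, n_comp):
    """Fit PLS1; returns the regression vector and centering constants."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xk.T @ yk                   # direction of max covariance with y
        nrm = np.linalg.norm(w)
        if nrm < 1e-12:                 # y residual fully explained
            break
        w /= nrm
        t = Xk @ w                      # score vector
        tt = t @ t
        p = Xk.T @ t / tt               # X loading
        qk = (yk @ t) / tt              # y loading
        Xk = Xk - np.outer(t, p)        # deflate both blocks
        yk = yk - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # regression vector in X space
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean
```

Because each component is built from the covariance with y, PLSR stays stable when the predictors are many and strongly correlated, which is exactly the situation with adjacent hyperspectral bands that defeats ordinary stepwise regression.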
Bever, Aaron J.; MacWilliams, Michael L.; Herbold, Bruce; Brown, Larry R.; Feyrer, Frederick V.
2016-01-01
Long-term fish sampling data from the San Francisco Estuary were combined with detailed three-dimensional hydrodynamic modeling to investigate the relationship between historical fish catch and hydrodynamic complexity. Delta Smelt catch data at 45 stations from the Fall Midwater Trawl (FMWT) survey in the vicinity of Suisun Bay were used to develop a quantitative catch-based station index. This index was used to rank stations based on historical Delta Smelt catch. The correlations between historical Delta Smelt catch and 35 quantitative metrics of environmental complexity were evaluated at each station. Eight metrics of environmental conditions were derived from FMWT data and 27 metrics were derived from model predictions at each FMWT station. To relate the station index to conceptual models of Delta Smelt habitat, the metrics were used to predict the station ranking based on the quantified environmental conditions. Salinity, current speed, and turbidity metrics were used to predict the relative ranking of each station for Delta Smelt catch. Including a measure of the current speed at each station improved predictions of the historical ranking for Delta Smelt catch relative to similar predictions made using only salinity and turbidity. Current speed was also found to be a better predictor of historical Delta Smelt catch than water depth. The quantitative approach developed using the FMWT data was validated using the Delta Smelt catch data from the San Francisco Bay Study. Complexity metrics in Suisun Bay were evaluated during 2010 and 2011. This analysis indicated that a key to historical Delta Smelt catch is the overlap of low salinity, low maximum velocity, and low Secchi depth regions. This overlap occurred in Suisun Bay during 2011, and may have contributed to higher Delta Smelt abundance in 2011 than in 2010, when the favorable ranges of the metrics did not overlap in Suisun Bay.
Applied Distributed Model Predictive Control for Energy Efficient Buildings and Ramp Metering
NASA Astrophysics Data System (ADS)
Koehler, Sarah Muraoka
Industrial large-scale control problems present an interesting algorithmic design challenge. A number of controllers must cooperate in real-time on a network of embedded hardware with limited computing power in order to maximize system efficiency while respecting constraints and despite communication delays. Model predictive control (MPC) can automatically synthesize a centralized controller which optimizes an objective function subject to a system model, constraints, and predictions of disturbance. Unfortunately, the computations required by model predictive controllers for large-scale systems often restrict its industrial implementation to medium-scale, slow processes. Distributed model predictive control (DMPC) enters the picture as a way to decentralize a large-scale model predictive control problem. The main idea of DMPC is to split the computations required by the MPC problem amongst distributed processors that can compute in parallel and communicate iteratively to find a solution. Popular proposed solutions include distributed optimization algorithms such as dual decomposition and the alternating direction method of multipliers (ADMM). However, these algorithms ignore two practical challenges: substantial communication delays present in control systems and also problem non-convexity. This thesis presents two novel and practically effective DMPC algorithms. The first DMPC algorithm is based on a primal-dual active-set method which achieves fast convergence, making it suitable for large-scale control applications with large communication delays across the network. In particular, this algorithm is suited for MPC problems with a quadratic cost, linear dynamics, forecasted demand, and box constraints. We measure the performance of this algorithm and show that it significantly outperforms both dual decomposition and ADMM in the presence of communication delay.
The second DMPC algorithm is based on an inexact interior point method which is suited for nonlinear optimization problems. The parallel computation of the algorithm exploits iterative linear algebra methods for the main linear algebra computations in the algorithm. We show that the splitting of the algorithm is flexible and can thus be applied to various distributed platform configurations. The two proposed algorithms are applied to two main energy and transportation control problems. The first application is energy efficient building control. Buildings represent 40% of energy consumption in the United States, so improving their energy efficiency is significant. The goal is to minimize energy consumption subject to the physics of the building (e.g. heat transfer laws), the constraints of the actuators as well as the desired operating constraints (thermal comfort of the occupants), and heat load on the system. In this thesis, we describe the control systems used in practice for forced-air building systems. We discuss "Trim and Respond", a distributed control algorithm used in practice, and show that it performs similarly to a one-step explicit DMPC algorithm. Then, we apply the novel distributed primal-dual active-set method and provide extensive numerical results for the building MPC problem. The second main application is the control of ramp metering signals to optimize traffic flow through a freeway system. This application is particularly important since urban congestion has more than doubled in the past few decades. The ramp metering problem is to maximize freeway throughput subject to freeway dynamics (derived from mass conservation), actuation constraints, freeway capacity constraints, and predicted traffic demand. In this thesis, we develop a hybrid model predictive controller for ramp metering that is guaranteed to be persistently feasible and stable.
This contrasts with previous work on MPC for ramp metering, where such guarantees are absent. We apply a smoothing method to the hybrid model predictive controller and apply the inexact interior point method to the resulting nonlinear non-convex ramp metering problem.
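Of the distributed optimization baselines the thesis compares against, consensus ADMM is the simplest to sketch. The toy below (two agents, one scalar decision, hypothetical quadratic costs) shows the local-solve / average / dual-update cycle that the iterative communication rounds implement; the thesis's own primal-dual active-set and inexact interior point methods are substantially more involved.

```python
# Two agents each hold a private quadratic cost (x - c_i)^2 and must agree on
# a shared decision variable. Consensus ADMM alternates local minimizations,
# consensus averaging, and dual updates.
c1, c2 = 1.0, 3.0          # private cost centers (hypothetical)
rho = 1.0                  # ADMM penalty parameter
x1 = x2 = z = u1 = u2 = 0.0
for _ in range(200):
    # Local step: argmin_x (x - c)^2 + (rho/2)(x - z + u)^2, in closed form.
    x1 = (2.0 * c1 + rho * (z - u1)) / (2.0 + rho)
    x2 = (2.0 * c2 + rho * (z - u2)) / (2.0 + rho)
    # Consensus step: z is the average of the agents' shifted views.
    z = 0.5 * ((x1 + u1) + (x2 + u2))
    # Dual step: each agent accumulates its disagreement with the consensus.
    u1 += x1 - z
    u2 += x2 - z
# z approaches the centralized optimum (c1 + c2) / 2 = 2.0
```

Each loop iteration corresponds to one communication round between the processors, which is why round counts (and hence per-round latency) dominate performance when network delays are large.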
Calibration of Passive Microwave Polarimeters that Use Hybrid Coupler-Based Correlators
NASA Technical Reports Server (NTRS)
Piepmeier, J. R.
2003-01-01
Four calibration algorithms are studied for microwave polarimeters that use hybrid coupler-based correlators: 1) conventional two-look calibration using hot and cold sources, 2) three looks using combinations of hot and cold sources, 3) two-look calibration with a correlated source, and 4) four-look calibration combining methods 2 and 3. The systematic errors are found to depend on the polarimeter component parameters and the accuracy of the calibration noise temperatures. A case study radiometer in four different remote sensing scenarios was considered in light of these results. Applications for ocean surface salinity, ocean surface winds, and soil moisture were found to be sensitive to different systematic errors. Finally, a standard uncertainty analysis was performed on the four-look calibration algorithm, which was found to be most sensitive to the correlated calibration source.
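A sketch of method 1, the conventional two-look calibration: two reference targets with known noise temperatures give two linear equations in the receiver's gain and offset. The counts and temperatures below are hypothetical.

```python
def two_look_calibration(counts_hot, counts_cold, t_hot, t_cold):
    """Solve for radiometer gain and offset from two known reference looks."""
    gain = (t_hot - t_cold) / (counts_hot - counts_cold)
    offset = t_hot - gain * counts_hot
    return gain, offset

# Hypothetical detector counts and reference noise temperatures (kelvin)
gain, offset = two_look_calibration(12000.0, 4000.0, t_hot=300.0, t_cold=80.0)

# Convert a scene measurement (counts) to an antenna temperature (K)
t_scene = gain * 7000.0 + offset
```

The three- and four-look schemes in the paper add extra looks precisely because this two-point solve cannot separate errors in the assumed calibration noise temperatures from errors in the polarimeter components.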
NASA Astrophysics Data System (ADS)
Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi
2018-06-01
Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools in only a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. To date, several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed. However, these empirical relations can only be used in limited cases. The use of intelligent systems and optimization algorithms is an inexpensive, fast and efficient approach to predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS from conventional well logs in two field data examples, a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated using each of the employed metaheuristic approaches with observed VS and also with values predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms are more efficient and reliable than the empirical correlation for predicting VS.
The results also demonstrate that, in both the sandstone and carbonate case studies, the performance of the artificial bee colony algorithm in VS prediction is slightly better than that of the other two employed approaches.
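For context, the Greenberg–Castagna approach builds on Castagna's published mudrock line, VS = 0.8621 VP − 1.1724 (km/s), for water-saturated clastics. The sketch below compares that fixed empirical line against a locally calibrated least-squares fit on hypothetical well-log pairs; the metaheuristics in the paper tune richer linear and nonlinear models in the same spirit.

```python
def castagna_vs(vp):
    """Castagna's mudrock line for water-saturated clastics (km/s)."""
    return 0.8621 * vp - 1.1724

# Hypothetical (VP, VS) well-log pairs in km/s
pairs = [(3.0, 1.43), (3.5, 1.83), (4.0, 2.29), (4.5, 2.71), (5.0, 3.12)]
vp = [p[0] for p in pairs]
vs = [p[1] for p in pairs]

# Calibrate a local linear VP-VS relation by ordinary least squares
n = len(vp)
mvp, mvs = sum(vp) / n, sum(vs) / n
slope = sum((a - mvp) * (b - mvs) for a, b in zip(vp, vs)) / \
        sum((a - mvp) ** 2 for a in vp)
intercept = mvs - slope * mvp

def rmse(pred):
    return (sum((p - o) ** 2 for p, o in zip(pred, vs)) / n) ** 0.5

rmse_local = rmse([intercept + slope * v for v in vp])      # calibrated fit
rmse_castagna = rmse([castagna_vs(v) for v in vp])          # fixed relation
```

On its own training data the calibrated line can never do worse than the fixed relation; the paper's point is that tuned models also generalize better than the generic correlation in these formations.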
Wearable physiological sensors and real-time algorithms for detection of acute mountain sickness.
Muza, Stephen R
2018-03-01
This is a minireview of potential wearable physiological sensors and algorithms (processes and equations) for detection of acute mountain sickness (AMS). Given the emerging status of this effort, the focus of the review is on the current clinical assessment of AMS, known risk factors (environmental, demographic, and physiological), and the current understanding of AMS pathophysiology. Studies that have examined a range of physiological variables to develop AMS prediction and/or detection algorithms are reviewed to provide insight and potential technological roadmaps for future development of real-time physiological sensors and algorithms to detect AMS. Given the lack of signs and the nonspecific symptoms associated with AMS, development of wearable physiological sensors and embedded algorithms to predict in the near term or detect established AMS will be challenging. Prior work using SpO2, HR, or HRV has not provided the sensitivity and specificity needed for useful application to predict or detect AMS. Whereas most prior studies used spot checks, wearable systems that continuously measure SpO2 and HR are now commercially available. Applying other statistical modeling approaches, such as general linear and logistic mixed models or time series analysis, to these continuously measured variables is the most promising approach for developing algorithms that are sensitive and specific for physiological prediction or detection of AMS.
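The kind of detection algorithm the review anticipates can be sketched as a logistic model over continuously measured SpO2 and heart rate. The coefficients below are hypothetical placeholders, not fitted values from any study.

```python
import math

def ams_risk(spo2, hr, b0=12.0, b_spo2=-0.18, b_hr=0.03):
    """Logistic AMS risk from SpO2 (%) and heart rate (bpm).

    All coefficients are hypothetical illustrations: in practice they would
    come from fitting a logistic mixed model to continuous wearable data.
    """
    z = b0 + b_spo2 * spo2 + b_hr * hr
    return 1.0 / (1.0 + math.exp(-z))

high = ams_risk(80.0, 100.0)   # hypoxemic and tachycardic
low = ams_risk(96.0, 65.0)     # typical resting values near sea level
```

A mixed-model formulation would additionally carry per-subject random effects, which is exactly what spot-check studies could not exploit.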
NASA Astrophysics Data System (ADS)
Deo, Ravinesh C.; Şahin, Mehmet
2015-02-01
The prediction of future drought is an effective mitigation tool for assessing the adverse consequences of drought events on vital water resources, agriculture, ecosystems and hydrology. Data-driven model predictions using machine learning algorithms are promising for these purposes as they require less development time and minimal inputs, and are less complex than dynamic or physical models. This paper evaluates a computationally simple, fast and efficient non-linear algorithm known as the extreme learning machine (ELM) for the prediction of the Effective Drought Index (EDI) in eastern Australia, using input data from 1957-2008 for training and predicting monthly EDI over the period 2009-2011. The predictive variables for the ELM model were the rainfall and mean, minimum and maximum air temperatures, supplemented by large-scale climate mode indices of interest as regression covariates, namely the Southern Oscillation Index, Pacific Decadal Oscillation, Southern Annular Mode and the Indian Ocean Dipole moment. To demonstrate the effectiveness of the proposed data-driven model, a performance comparison in terms of prediction capability and learning speed was conducted between the proposed ELM algorithm and a conventional artificial neural network (ANN) algorithm trained with Levenberg-Marquardt back propagation. The prediction metrics confirmed an excellent performance of the ELM over the ANN model for the overall test sites, yielding Mean Absolute Errors, Root-Mean Square Errors, Coefficients of Determination and Willmott's Indices of Agreement of 0.277, 0.008, 0.892 and 0.93 (for ELM) and 0.602, 0.172, 0.578 and 0.92 (for ANN). Moreover, the ELM model executed with a learning speed 32 times faster and a training speed 6.1 times faster than the ANN model. An improvement in the prediction capability of drought duration and severity by the ELM model was achieved.
Based on these results, we conclude that of the two machine learning algorithms tested, the ELM was the more accurate and expeditious tool for prediction of drought and its related properties.
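The core of ELM is what makes it so much faster than back-propagation: the hidden-layer weights are drawn at random and never trained, so fitting reduces to a single linear least-squares solve for the output weights. A minimal sketch on a toy 1-D signal (not the drought data), assuming sigmoid hidden units and a small ridge term for numerical stability:

```python
import math
import random

def train_elm(X, y, n_hidden=12, seed=7, ridge=1e-8):
    """Extreme learning machine: random hidden layer, least-squares output."""
    rng = random.Random(seed)
    d = len(X[0])
    W = [[rng.uniform(-1.0, 1.0) for _ in range(d)] for _ in range(n_hidden)]
    b = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden)]

    def hidden(x):
        h = [1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(ws, x)) + bi)))
             for ws, bi in zip(W, b)]
        return h + [1.0]                       # constant unit acts as a bias

    H = [hidden(x) for x in X]
    m = n_hidden + 1
    # Output weights solve the regularized normal equations (H^T H) beta = H^T y
    A = [[sum(H[k][i] * H[k][j] for k in range(len(H))) +
          (ridge if i == j else 0.0) for j in range(m)] for i in range(m)]
    rhs = [sum(H[k][i] * y[k] for k in range(len(H))) for i in range(m)]
    for i in range(m):                         # Gaussian elimination, pivoting
        p = max(range(i, m), key=lambda r: abs(A[r][i]))
        A[i], A[p], rhs[i], rhs[p] = A[p], A[i], rhs[p], rhs[i]
        for r in range(i + 1, m):
            f = A[r][i] / A[i][i]
            for c in range(i, m):
                A[r][c] -= f * A[i][c]
            rhs[r] -= f * rhs[i]
    beta = [0.0] * m
    for i in range(m - 1, -1, -1):
        beta[i] = (rhs[i] - sum(A[i][c] * beta[c]
                                for c in range(i + 1, m))) / A[i][i]
    return lambda x: sum(w * h for w, h in zip(beta, hidden(x)))

# Fit a toy 1-D signal; training is one linear solve, no backpropagation.
xs = [[i * 0.1] for i in range(40)]
ys = [math.sin(x[0]) for x in xs]
model = train_elm(xs, ys)
fit_rmse = (sum((model(x) - t) ** 2 for x, t in zip(xs, ys)) / len(xs)) ** 0.5
```

The paper's reported speedups follow from this structure: the one-shot solve replaces the many Levenberg-Marquardt iterations an ANN needs.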
TargetSpy: a supervised machine learning approach for microRNA target prediction.
Sturm, Martin; Hackenberg, Michael; Langenberger, David; Frishman, Dmitrij
2010-05-28
Virtually all currently available microRNA target site prediction algorithms require the presence of a (conserved) seed match to the 5' end of the microRNA. Recently, however, it has been shown that this requirement might be too stringent, leading to a substantial number of missed target sites. We developed TargetSpy, a novel computational approach for predicting target sites regardless of the presence of a seed match. It is based on machine learning and automatic feature selection using a wide spectrum of compositional, structural, and base pairing features covering current biological knowledge. Our model does not rely on evolutionary conservation, which allows the detection of species-specific interactions and makes TargetSpy suitable for analyzing unconserved genomic sequences. In order to allow for an unbiased comparison of TargetSpy to other methods, we classified all algorithms into three groups: I) no seed match requirement, II) seed match requirement, and III) conserved seed match requirement. TargetSpy predictions for classes II and III are generated by appropriate postfiltering. On a human dataset revealing fold-change in protein production for five selected microRNAs, our method shows superior performance in all classes. In Drosophila melanogaster, not only are our class II and III predictions on par with other algorithms, but notably the class I (no-seed) predictions are only marginally less accurate. We estimate that TargetSpy predicts between 26 and 112 functional target sites without a seed match per microRNA that are missed by all other currently available algorithms. Only a few algorithms can predict target sites without demanding a seed match, and TargetSpy demonstrates a substantial improvement in prediction accuracy in that class. Furthermore, when conservation and the presence of a seed match are required, the performance is comparable with state-of-the-art algorithms.
TargetSpy was trained on mouse and performs well in human and Drosophila, suggesting that it may be applicable to a broad range of species. Moreover, we have demonstrated that the application of machine learning techniques in combination with upcoming deep sequencing data results in a powerful microRNA target site prediction tool http://www.targetspy.org. PMID:20509939
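The seed-match requirement that TargetSpy relaxes can be made concrete: a canonical site is a perfect Watson-Crick complement of miRNA nucleotides 2-8 (counted from the 5' end) in the target sequence. A minimal RNA-alphabet check, using let-7a and a made-up target fragment:

```python
def reverse_complement(rna):
    """Reverse complement in the RNA alphabet."""
    pair = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(pair[b] for b in reversed(rna))

def has_seed_match(mirna, utr):
    """True if the canonical seed (miRNA nucleotides 2-8, 1-based from the
    5' end) has a perfect complementary site in the target sequence."""
    seed = mirna[1:8]
    return reverse_complement(seed) in utr

let7 = "UGAGGUAGUAGGUUGUAUAGUU"   # human let-7a, a well-characterized miRNA
site = "AAACUACCUCAAA"            # invented fragment containing a seed site
```

Class II/III methods filter candidates through a test like this; TargetSpy's class I mode scores sites on compositional, structural, and pairing features instead of demanding that this check pass.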
Serious injury prediction algorithm based on large-scale data and under-triage control.
Nishimoto, Tetsuya; Mukaigawa, Kosuke; Tominaga, Shigeru; Lubbe, Nils; Kiuchi, Toru; Motomura, Tomokazu; Matsumoto, Hisashi
2017-01-01
The present study was undertaken to construct an algorithm for an advanced automatic collision notification system based on national traffic accident data compiled by the Japanese police. While US research into the development of a serious-injury prediction algorithm is based on a logistic regression algorithm using the National Automotive Sampling System/Crashworthiness Data System, the present injury prediction algorithm was based on comprehensive police data covering all accidents that occurred across Japan. The particular focus of this research is to improve the rescue of injured vehicle occupants in traffic accidents, and the present algorithm assumes the use of onboard event data recorder data from which risk factors such as pseudo delta-V, vehicle impact location, seatbelt use or non-use, involvement in a single-impact or multiple-impact crash, and the occupant's age can be derived. As a result, a simple algorithm suited to onboard vehicle installation was constructed from a sample of half of the available police data. The other half of the police data was applied to validation testing of this new algorithm using receiver operating characteristic analysis. An additional validation was conducted using in-depth investigation of accident injuries in collaboration with prospective host emergency care institutes. The validated algorithm, named the TOYOTA-Nihon University algorithm, proved to be as useful as the US URGENCY and other existing algorithms. Furthermore, an under-triage control analysis found that the present algorithm could achieve an under-triage rate of less than 10% by setting a threshold of 8.3%. Copyright © 2016 Elsevier Ltd. All rights reserved.
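The shape of such a logistic injury-risk algorithm, together with the under-triage check, can be sketched as follows. All coefficients and cases below are hypothetical illustrations, not the published TOYOTA-Nihon University model.

```python
import math

def serious_injury_risk(delta_v, belted, age,
                        b0=-6.0, b_dv=0.12, b_belt=-1.0, b_age=0.03):
    """Logistic probability of serious injury from crash-recorder inputs.

    Coefficients are hypothetical placeholders for illustration only.
    """
    z = b0 + b_dv * delta_v + b_belt * (1 if belted else 0) + b_age * age
    return 1.0 / (1.0 + math.exp(-z))

# Notify emergency services when predicted risk exceeds the threshold;
# 8.3% is the cutoff the study reports for keeping under-triage below 10%.
THRESHOLD = 0.083

def under_triage_rate(cases):
    """Fraction of truly serious cases the threshold fails to flag."""
    serious = [c for c in cases if c["serious"]]
    missed = [c for c in serious
              if serious_injury_risk(c["dv"], c["belted"], c["age"]) < THRESHOLD]
    return len(missed) / len(serious)

cases = [  # invented validation records
    {"dv": 65, "belted": False, "age": 45, "serious": True},
    {"dv": 55, "belted": True,  "age": 70, "serious": True},
    {"dv": 15, "belted": True,  "age": 30, "serious": False},
    {"dv": 20, "belted": True,  "age": 25, "serious": True},   # a hard miss
]
```

Sweeping the threshold and re-computing this rate against the false-positive rate is exactly the receiver operating characteristic analysis the study used on its held-out half of the police data.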
Real coded genetic algorithm for fuzzy time series prediction
NASA Astrophysics Data System (ADS)
Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.
2017-10-01
Genetic algorithms (GA) form a subset of evolutionary computing, a rapidly growing area of artificial intelligence (AI). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw tooth GA and differential evolution GA. This research article presents a real coded GA for predicting enrollments of the University of Alabama. The University of Alabama enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with other published works and found satisfactory, indicating that real coded GAs are fast and accurate.
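A minimal real-coded GA, the kind used here to tune fuzzy interval boundaries, operates directly on floating-point genes with arithmetic crossover and Gaussian mutation. The sketch below optimizes a stand-in scalar objective rather than the paper's enrollment-interval error.

```python
import random

def real_coded_ga(fitness, bounds, pop_size=30, generations=100,
                  sigma=0.3, seed=0):
    """Minimal real-coded GA: tournament selection, arithmetic (blend)
    crossover, Gaussian mutation, and one-individual elitism."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        children = [pop[0]]                           # elitism: keep the best
        while len(children) < pop_size:
            a = min(rng.sample(pop, 3), key=fitness)  # tournament of 3
            b = min(rng.sample(pop, 3), key=fitness)
            w = rng.random()
            child = w * a + (1.0 - w) * b             # arithmetic crossover
            child += rng.gauss(0.0, sigma)            # Gaussian mutation
            children.append(min(max(child, lo), hi))  # clip to bounds
        pop = children
    return min(pop, key=fitness)

# Toy objective standing in for the fuzzy-interval tuning error
best = real_coded_ga(lambda x: (x - 2.0) ** 2, bounds=(-10.0, 10.0))
```

Because genes stay real-valued, no binary encoding/decoding step is needed, which is the usual argument for real coding over binary GA on continuous parameters such as interval boundaries.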
A Novel Admixture-Based Pharmacogenetic Approach to Refine Warfarin Dosing in Caribbean Hispanics.
Duconge, Jorge; Ramos, Alga S; Claudio-Campos, Karla; Rivera-Miranda, Giselle; Bermúdez-Bosch, Luis; Renta, Jessicca Y; Cadilla, Carmen L; Cruz, Iadelisse; Feliu, Juan F; Vergara, Cunegundo; Ruaño, Gualberto
2016-01-01
This study is aimed at developing a novel admixture-adjusted pharmacogenomic approach to individually refine warfarin dosing in Caribbean Hispanic patients. A multiple linear regression analysis of effective warfarin doses versus relevant genotypes, admixture, clinical and demographic factors was performed in 255 patients and further validated externally in another cohort of 55 individuals. The admixture-adjusted, genotype-guided warfarin dosing refinement algorithm developed in Caribbean Hispanics showed better predictability (R2 = 0.70, MAE = 0.72 mg/day) than a clinical algorithm that excluded genotypes and admixture (R2 = 0.60, MAE = 0.99 mg/day), and outperformed two prior pharmacogenetic algorithms in predicting effective dose in this population. For patients at the highest risk of adverse events, 45.5% of the dose predictions using the developed pharmacogenetic model resulted in ideal dose as compared with only 29% when using the clinical non-genetic algorithm (p < 0.001). The admixture-driven pharmacogenetic algorithm predicted 58% of warfarin dose variance when externally validated in 55 individuals from an independent validation cohort (MAE = 0.89 mg/day, 24% mean bias). Results supported our rationale to incorporate individuals' genotypes and unique admixture metrics into pharmacogenetic refinement models in order to increase predictability when expanding them to admixed populations like Caribbean Hispanics. ClinicalTrials.gov NCT01318057.
Arnulf, Jan Ketil; Larsen, Kai Rune; Martinsen, Øyvind Lund; Bong, Chih How
2014-01-01
Some disciplines in the social sciences rely heavily on collecting survey responses to detect empirical relationships among variables. We explored whether these relationships were a priori predictable from the semantic properties of the survey items, using language processing algorithms which are now available as new research methods. Language processing algorithms were used to calculate the semantic similarity among all items in state-of-the-art surveys from Organisational Behaviour research. These surveys covered areas such as transformational leadership, work motivation and work outcomes. This information was used to explain and predict the response patterns from real subjects. Semantic algorithms explained 60–86% of the variance in the response patterns and allowed remarkably precise prediction of survey responses from humans, except in a personality test. Even the relationships between independent and their purported dependent variables were accurately predicted. This raises concern about the empirical nature of data collected through some surveys if results are already given a priori through the way subjects are being asked. Survey response patterns seem heavily determined by semantics. Language algorithms may suggest these prior to administering a survey. This study suggests that semantic algorithms are becoming new tools for the social sciences, opening perspectives on survey responses that prevalent psychometric theory cannot explain. PMID:25184672
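The underlying operation, scoring pairs of survey items by semantic similarity, can be caricatured with a bag-of-words cosine measure. The paper's language processing algorithms are far richer, but the geometry is the same; the example items below are invented.

```python
import math
from collections import Counter

def semantic_similarity(item_a, item_b):
    """Cosine similarity of bag-of-words vectors for two survey items.

    A crude stand-in for the paper's language processing algorithms,
    shown only to illustrate the pairwise-similarity computation.
    """
    va = Counter(item_a.lower().split())
    vb = Counter(item_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * \
           math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

close = semantic_similarity("my leader inspires me", "my leader motivates me")
far = semantic_similarity("my leader inspires me", "i intend to quit this job")
```

The study's claim is that a matrix of such item-item similarities, computed before any subject answers, already predicts most of the correlation structure later observed in the human responses.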
NASA Astrophysics Data System (ADS)
Guruprasad, R.; Behera, B. K.
2015-10-01
Quantitative prediction of fabric mechanical properties is an essential requirement for design engineering of textile and apparel products. In this work, the possibility of predicting the bending rigidity of cotton woven fabrics has been explored with the application of an Artificial Neural Network (ANN) and two hybrid methodologies, namely neuro-genetic modeling and Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling. For this purpose, a set of cotton woven grey fabrics was desized, scoured and relaxed. The fabrics were then conditioned and tested for bending properties. With the database thus created, a neural network model was first developed using back propagation as the learning algorithm. The second model was developed by applying a hybrid learning strategy, in which a genetic algorithm was first used to optimize the number of neurons and the connection weights of the neural network. The GA-optimized network structure was then further trained using the back propagation algorithm. In the third model, an ANFIS modeling approach was attempted to map the input-output data. The prediction performances of the models were compared and a sensitivity analysis was reported. The results show that the predictions by the neuro-genetic and ANFIS models were better than those of the back propagation neural network model.
Impact of glider data assimilation on the Monterey Bay model
NASA Astrophysics Data System (ADS)
Shulman, Igor; Rowley, Clark; Anderson, Stephanie; DeRada, Sergio; Kindle, John; Martin, Paul; Doyle, James; Cummings, James; Ramp, Steve; Chavez, Francisco; Fratantoni, David; Davis, Russ
2009-02-01
Glider observations were essential components of the observational program in the Autonomous Ocean Sampling Network (AOSN-II) experiment in the Monterey Bay area during summer of 2003. This paper is focused on the impact of the assimilation of glider temperature and salinity observations on the Navy Coastal Ocean Model (NCOM) predictions of surface and subsurface properties. The modeling system consists of an implementation of the NCOM model using a curvilinear, orthogonal grid with 1-4 km resolution, with finest resolution around the bay. The model receives open boundary conditions from a regional (9 km resolution) NCOM implementation for the California Current System, and surface fluxes from the Coupled Ocean-Atmosphere Mesoscale Prediction System (COAMPS) atmospheric model at 3 km resolution. The data assimilation component of the system is a version of the Navy Coupled Ocean Data Assimilation (NCODA) system, which is used for assimilation of the glider data into the NCOM model of the Monterey Bay area. The NCODA is a fully 3D multivariate optimum interpolation system that produces simultaneous analyses of temperature, salinity, geopotential, and vector velocity. Assimilation of glider data improves the surface temperature at the mooring locations for the NCOM model hindcast and nowcasts, and for the short-range (1-1.5 days) forecasts. It is shown that it is critical to have accurate atmospheric forcing for more extended forecasts. Assimilation of glider data provided better agreement with independent observations (for example, with aircraft measured SSTs) of the model-predicted and observed spatial distributions of surface temperature and salinity. Mooring observations of subsurface temperature and salinity show sharp changes in the thermocline and halocline depths during transitions from upwelling to relaxation and vice versa. The non-assimilative run also shows these transitions in subsurface temperature; but they are not as well defined. 
For salinity, the non-assimilative run differs significantly from the observations. The glider data assimilating run, however, shows results comparable with the observed thermocline and halocline depths during upwelling and relaxation events in the Monterey Bay area. It is also shown that during wind relaxation, the data assimilative run has a higher complex correlation of subsurface velocity with the observations than the non-assimilative run.
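The essence of the multivariate optimum interpolation in NCODA is visible in the one-point scalar case: the analysis is the model background corrected toward the observation, weighted by the relative error variances. The values below are hypothetical.

```python
def oi_analysis(background, observation, var_bg, var_obs):
    """Scalar optimal-interpolation update.

    NCODA is a full 3D multivariate system (temperature, salinity,
    geopotential, velocity); this is only the one-point essence.
    """
    gain = var_bg / (var_bg + var_obs)              # Kalman-style weight
    analysis = background + gain * (observation - background)
    var_analysis = (1.0 - gain) * var_bg            # reduced uncertainty
    return analysis, var_analysis

# A model background of 15.0 C corrected toward a glider observation of 14.0 C
analysis, var_a = oi_analysis(15.0, 14.0, var_bg=1.0, var_obs=0.25)
```

When the background error variance dominates (as for the model's drifting salinity), the gain approaches one and the analysis follows the glider data closely, which is the behavior described for the halocline.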
Regional Hydrogeochemistry of a Modern Coastal Mixing Zone
NASA Astrophysics Data System (ADS)
Wicks, Carol M.; Herman, Janet S.
1996-02-01
In west central Florida, groundwater samples were collected along flow paths in the unconfined upper Floridan aquifer that cross the inland, freshwater recharge area and the coastal discharge area. A groundwater flow and solute transport model was used to evaluate groundwater flow and mixing of fresh and saline groundwater along a cross section of the unconfined upper Floridan aquifer. Results show that between 8% and 15% of the fresh and 30-31% of the saline groundwater penetrates to the depth in the flow system where contact with and dissolution of gypsum is likely. The deeply circulating fresh and saline groundwater returns to the near-surface environment discharging CaSO4-rich water to the coastal area where it mixes with fresh CaHCO3 groundwater, resulting in a prediction of calcite precipitation in the modern mixing zone.
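The standard hydrogeochemical bookkeeping behind such mixing-zone inferences uses a conservative tracer (typically chloride) to compute the mixing fraction, then compares measured concentrations against the pure-mixing prediction; a shortfall in dissolved Ca, for instance, is consistent with calcite precipitation. The endmember values below are hypothetical.

```python
def fresh_fraction(cl_sample, cl_fresh, cl_saline):
    """Fraction of fresh water in a two-endmember mix, from a conservative
    tracer such as chloride (all concentrations in mg/L)."""
    return (cl_saline - cl_sample) / (cl_saline - cl_fresh)

def conservative_mix(f_fresh, c_fresh, c_saline):
    """Concentration expected from mixing alone; a measured value below this
    suggests removal by reaction (e.g., calcite precipitation)."""
    return f_fresh * c_fresh + (1.0 - f_fresh) * c_saline

# Hypothetical endmembers: dilute recharge water vs. seawater-like groundwater
f = fresh_fraction(cl_sample=1900.0, cl_fresh=10.0, cl_saline=19000.0)
expected_ca = conservative_mix(f, c_fresh=60.0, c_saline=400.0)
```

The study's transport model plays the role of this mass balance in two dimensions, tracking where deeply circulated CaSO4-rich water re-enters and mixes with the fresh CaHCO3 water near the coast.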
Arnold, David T; Rowen, Donna; Versteegh, Matthijs M; Morley, Anna; Hooper, Clare E; Maskell, Nicholas A
2015-01-23
In order to estimate utilities for cancer studies where the EQ-5D was not used, the EORTC QLQ-C30 can be mapped onto the EQ-5D using existing mapping algorithms. Several mapping algorithms exist for this transformation; however, they tend to lose accuracy for patients in poor health states. The aim of this study was to test all existing mapping algorithms of the QLQ-C30 onto the EQ-5D in a dataset of patients with malignant pleural mesothelioma, an invariably fatal malignancy for which no previous mapping estimation has been published. Health related quality of life (HRQoL) data in which both the EQ-5D and QLQ-C30 were administered simultaneously were obtained from the UK-based prospective observational SWAMP (South West Area Mesothelioma and Pemetrexed) trial. In the original trial, 73 patients with pleural mesothelioma were offered palliative chemotherapy and their HRQoL was assessed across five time points. These data were used to test the nine available mapping algorithms found in the literature, comparing predicted against observed EQ-5D values. The ability of the algorithms to predict the mean, minimise error and detect clinically significant differences was assessed. The dataset had a total of 250 observations across the five time points. The linear regression mapping algorithms tested generally performed poorly, over-estimating predicted compared with observed EQ-5D values, especially when the observed EQ-5D was below 0.5. The best performing algorithm used a response mapping method and predicted the mean EQ-5D accurately, with an average root mean squared error of 0.17 (standard deviation: 0.22). This algorithm reliably discriminated between clinically distinct subgroups seen in the primary dataset. This study tested mapping algorithms in a population with poor health states, where they have previously been shown to perform poorly. Further research into EQ-5D estimation should be directed at response mapping methods given their superior performance in this study.
Mondal, Tapan Kumar; Ganie, Showkat Ahmad; Debnath, Ananda Bhusan
2015-01-01
Oryza coarctata, a halophyte and wild relative of rice, grows normally in saline water. MicroRNAs (miRNAs) are non-coding RNAs that play pivotal roles in every domain of life, including the stress response. There are very few reports on the discovery of salt-responsive miRNAs from halophytes. In this study, two small RNA libraries, one each from control and salt-treated (450 mM NaCl for 24 h) leaves of O. coarctata, were sequenced, which yielded 338 known and 95 novel miRNAs. Additionally, we used publicly available transcriptomics data of O. coarctata, which led to the discovery of an additional 48 conserved miRNAs along with their pre-miRNA sequences through in silico analysis. In total, 36 known and 7 novel miRNAs were up-regulated, whereas 12 known and 7 novel miRNAs were down-regulated under salinity stress. Further, 233 and 154 target genes were predicted for the 48 known and 14 novel differentially regulated miRNAs, respectively. With the help of gene ontology analysis, these targets were found to be involved in several important biological processes that could contribute to salinity tolerance. The relative expression trends of the majority of the miRNAs as detected by real-time PCR and as predicted by Illumina sequencing were coherent. Additionally, the expression of most target genes was negatively correlated with that of their corresponding miRNAs. Thus, the present study provides an account of the miRNA-target networking involved in the salinity adaptation of O. coarctata.
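The up/down-regulation calls between the control and salt-treated libraries reduce to a log2 fold-change threshold. A minimal sketch, assuming pseudocounted, library-normalized read counts and an illustrative cutoff of one (the study's actual pipeline and thresholds are not specified here):

```python
import numpy as np

def diff_expressed(counts_ctrl, counts_salt, lfc_cut=1.0):
    """Flag miRNAs as up- or down-regulated by log2 fold change between
    control and salt-treated libraries. A pseudocount of 1 avoids division
    by zero for miRNAs absent from one library."""
    lfc = np.log2((counts_salt + 1) / (counts_ctrl + 1))
    return lfc >= lfc_cut, lfc <= -lfc_cut
```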
Arrhythmia Evaluation in Wearable ECG Devices
Sadrawi, Muammar; Lin, Chien-Hung; Hsieh, Yita; Kuo, Chia-Chun; Chien, Jen Chien; Haraikawa, Koichi; Abbod, Maysam F.; Shieh, Jiann-Shing
2017-01-01
This study evaluates four databases from PhysioNet: the American Heart Association database (AHADB), the Creighton University Ventricular Tachyarrhythmia database (CUDB), the MIT-BIH Arrhythmia database (MITDB), and the MIT-BIH Noise Stress Test database (NSTDB). ANSI/AAMI EC57:2012 is used to evaluate the algorithms for supraventricular ectopic beats (SVEB), ventricular ectopic beats (VEB), atrial fibrillation (AF), and ventricular fibrillation (VF) via sensitivity, positive predictivity, and false positive rate. Sample entropy, the fast Fourier transform (FFT), and a multilayer perceptron neural network trained with backpropagation are selected for the integrated detection algorithms. The SVEB results show some improvement over a previous study that also used ANSI/AAMI EC57. Furthermore, the gross VEB sensitivity and positive predictivity exceed 80%, except for the positive predictivity on the NSTDB. For the gross AF evaluation on the MITDB, the results show very good classification, excluding the episode sensitivity. For the gross VF evaluation, the episode sensitivity and positive predictivity for the AHADB, MITDB, and CUDB exceed 80%, except for the MITDB episode positive predictivity, which is 75%. These results show that the proposed integrated SVEB, VEB, AF, and VF detection algorithm classifies accurately according to ANSI/AAMI EC57:2012. In conclusion, the proposed integrated detection algorithm achieves good accuracy in comparison with previous studies. More advanced algorithms and hardware devices should be developed in future work for arrhythmia detection and evaluation. PMID:29068369
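The beat-level statistics named above reduce to simple ratios over confusion-matrix counts. A sketch, assuming true/false positive and negative counts have already been tallied according to the EC57 matching rules:

```python
def beat_metrics(tp, fp, fn, tn):
    """Beat-level statistics as used in ANSI/AAMI EC57 evaluations:
    sensitivity (Se), positive predictivity (+P), and false positive rate."""
    sensitivity = tp / (tp + fn)
    positive_predictivity = tp / (tp + fp)
    false_positive_rate = fp / (fp + tn)
    return sensitivity, positive_predictivity, false_positive_rate
```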
Conrads, Paul; Roehl, Edwin A.; Daamen, Ruby C.; Kitchens, Wiley M.
2006-01-01
The Savannah Harbor is one of the busiest ports on the East Coast of the United States and is located downstream from the Savannah National Wildlife Refuge, which is one of the Nation's largest freshwater tidal marshes. The Georgia Ports Authority and the U.S. Army Corps of Engineers funded hydrodynamic and ecological studies to evaluate the potential effects of a proposed deepening of Savannah Harbor as part of the Environmental Impact Statement. These studies included a three-dimensional (3D) model of the Savannah River estuary system, which was developed to simulate changes in water levels and salinity in the system in response to geometry changes as a result of the deepening of Savannah Harbor, and a marsh-succession model that predicts plant distribution in the tidal marshes in response to changes in the water-level and salinity conditions in the marsh. Beginning in May 2001, the U.S. Geological Survey entered into cooperative agreements with the Georgia Ports Authority to develop empirical models to simulate the water level and salinity of the rivers and tidal marshes in the vicinity of the Savannah National Wildlife Refuge and to link the 3D hydrodynamic river-estuary model and the marsh-succession model. For the development of these models, many different databases were created that describe the complexity and behaviors of the estuary. The U.S. Geological Survey has maintained a network of continuous streamflow, water-level, and specific-conductance (field measurement to compute salinity) river gages in the study area since the 1980s and a network of water-level and salinity marsh gages in the study area since 1999. The Georgia Ports Authority collected water-level and salinity data during summer 1997 and 1999 and collected continuous water-level and salinity data in the marsh and connecting tidal creeks from 1999 to 2002.
Most of the databases comprise time series that differ by variable type, periods of record, measurement frequency, location, and reliability. Understanding freshwater inflows, tidal water levels, and specific conductance in the rivers and marshes is critical to enhancing the predictive capabilities of a successful marsh succession model. Data-mining techniques, including artificial neural network (ANN) models, were applied to address various needs of the ecology study and to integrate the riverine predictions from the 3D model to the marsh-succession model. ANN models were developed to simulate riverine water levels and specific conductance in the vicinity of the tidal marshes for the full range of historical conditions using data from the river gaging networks. ANN models were also developed to simulate the marsh water levels and pore-water salinities using data from the marsh gaging networks. Using the marsh ANN models, the continuous marsh network was hindcasted to be concurrent with the long-term riverine network. The hindcasted data allow ecologists to compute hydrologic parameters, such as hydroperiods and exposure frequency, to help analyze historical vegetation data. To integrate the 3D hydrodynamic model, the marsh-succession model, and various time-series databases, a decision support system (DSS) was developed to support the various needs of regulatory and scientific stakeholders. The DSS required the development of a spreadsheet application that integrates the database, 3D hydrodynamic model output, and ANN riverine and marsh models into a single package that is easy to use and can be readily disseminated. The DSS allows users to evaluate water-level and salinity response for different hydrologic conditions. Savannah River streamflows can be controlled by the user as constant flow, a percentage of historical flows, a percentile daily flow hydrograph, or as a user-specified hydrograph.
The DSS can also use output from the 3D model at stream gages near the Savannah National Wildlife Refuge to simulate the effects in the tidal marshes. The DSS is distributed with a two-dimensional (
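The ANN modeling step above can be sketched with a small regression network predicting specific conductance from streamflow and tidal water level. The data below are synthetic stand-ins for the USGS gage records, and the network size, scaling, and split are illustrative choices, not the study's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
flow = rng.uniform(150, 700, 500)          # streamflow, synthetic
tide = np.sin(np.linspace(0, 60, 500))     # tidal water level, synthetic
sc = 2000 - 2.2 * flow + 400 * tide + rng.normal(0, 40, 500)  # conductance

# standardize inputs and target so the default solver converges reliably
X = np.column_stack([(flow - flow.mean()) / flow.std(), tide])
y = (sc - sc.mean()) / sc.std()

ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
ann.fit(X[:400], y[:400])                  # train on the first 400 records
score = ann.score(X[400:], y[400:])        # holdout R^2
```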
ERIC Educational Resources Information Center
Strecht, Pedro; Cruz, Luís; Soares, Carlos; Mendes-Moreira, João; Abreu, Rui
2015-01-01
Predicting the success or failure of a student in a course or program is a problem that has recently been addressed using data mining techniques. In this paper we evaluate some of the most popular classification and regression algorithms on this problem. We address two problems: prediction of approval/failure and prediction of grade. The former is…
The Design and Implementation of a Read Prediction Buffer
1992-12-01
Graph regularized nonnegative matrix factorization for temporal link prediction in dynamic networks
NASA Astrophysics Data System (ADS)
Ma, Xiaoke; Sun, Penggang; Wang, Yu
2018-04-01
Many networks derived from society and nature are temporal and incomplete. The temporal link prediction problem is to predict links at time T + 1 given a temporal network observed from time 1 to T, which is essential to many applications. Current algorithms predict temporal links either by collapsing the dynamic networks or by collapsing features derived from each network, and are criticized for ignoring the connections among slices. To overcome this issue, we propose a novel graph regularized nonnegative matrix factorization algorithm (GrNMF) for the temporal link prediction problem that does not collapse the dynamic networks. To obtain the features of each network from 1 to T, GrNMF factorizes the matrix associated with each network while treating the remaining networks as regularization, which provides a better way to characterize the topological information of temporal links. The GrNMF algorithm then collapses the feature matrices to predict temporal links. Compared with state-of-the-art methods, the proposed algorithm exhibits significantly improved accuracy by avoiding the collapse of temporal networks. Experimental results on a number of artificial and real temporal networks illustrate that the proposed method is not only more accurate but also more robust than state-of-the-art approaches.
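The idea of factorizing each slice while regularizing toward neighboring slices can be sketched with multiplicative NMF updates plus a temporal penalty. This is a simplified stand-in for GrNMF (the paper's actual regularizer and collapse step differ); the penalty here pulls each slice's basis matrix toward the previous slice's:

```python
import numpy as np

def temporal_nmf(slices, rank, lam=0.1, iters=200, seed=0):
    """Factorize each adjacency slice A_t ~ W_t @ H_t with a penalty
    lam*||W_t - W_{t-1}||^2 linking consecutive slices, then collapse the
    per-slice factors (here by simple averaging) to score future links."""
    rng = np.random.default_rng(seed)
    n = slices[0].shape[0]
    W_prev, factors = None, []
    for A in slices:
        W = rng.random((n, rank)) + 1e-6
        H = rng.random((rank, n)) + 1e-6
        for _ in range(iters):
            # standard multiplicative updates with a temporal penalty on W
            H *= (W.T @ A) / (W.T @ (W @ H) + 1e-9)
            num = A @ H.T
            den = W @ (H @ H.T) + 1e-9
            if W_prev is not None:
                num = num + lam * W_prev
                den = den + lam * W
            W *= num / den
        factors.append((W, H))
        W_prev = W
    W_avg = np.mean([W for W, _ in factors], axis=0)
    H_avg = np.mean([H for _, H in factors], axis=0)
    return W_avg @ H_avg   # nonnegative link scores for time T + 1
```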
NASA Astrophysics Data System (ADS)
Cheng, Jun-Hu; Jin, Huali; Liu, Zhiwei
2018-01-01
The feasibility of developing a multispectral imaging method using important wavelengths selected from hyperspectral images by genetic algorithm (GA), successive projection algorithm (SPA) and regression coefficient (RC) methods for modeling and predicting protein content in peanut kernels was investigated for the first time. A partial least squares regression (PLSR) calibration model was established between the spectral data at the selected optimal wavelengths and the reference protein content, which ranged from 23.46% to 28.43%. The RC-PLSR model established using eight key wavelengths (1153, 1567, 1972, 2143, 2288, 2339, 2389 and 2446 nm) showed the best predictive results, with a coefficient of determination of prediction (R2P) of 0.901, a root mean square error of prediction (RMSEP) of 0.108, and a residual predictive deviation (RPD) of 2.32. Based on the best model and image processing algorithms, distribution maps of protein content were generated. The overall results of this study indicated that developing a rapid and online multispectral imaging system using the feature wavelengths and PLSR analysis is feasible for determining the protein content in peanut kernels.
A Stochastic Framework for Evaluating Seizure Prediction Algorithms Using Hidden Markov Models
Wong, Stephen; Gardner, Andrew B.; Krieger, Abba M.; Litt, Brian
2007-01-01
Responsive, implantable stimulation devices to treat epilepsy are now in clinical trials. New evidence suggests that these devices may be more effective when they deliver therapy before seizure onset. Despite years of effort, prospective seizure prediction, which could improve device performance, remains elusive. In large part, this is explained by a lack of agreement on a statistical framework for modeling seizure generation and on a method for validating algorithm performance. We present a novel stochastic framework based on a three-state hidden Markov model (HMM) (representing interictal, preictal, and seizure states) with the feature that periods of increased seizure probability can transition back to the interictal state. This notion reflects clinical experience and may enhance interpretation of published seizure prediction studies. Our model accommodates clipped EEG segments and formalizes intuitive notions regarding statistical validation. We derive equations for type I and type II errors as a function of the number of seizures, the duration of interictal data, and the prediction horizon length, and we demonstrate the model’s utility with a novel seizure detection algorithm that appears to predict seizure onset. We propose this framework as a vital tool for designing and validating prediction algorithms and for facilitating collaborative research in this area. PMID:17021032
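The three-state chain with its distinguishing preictal-to-interictal "relapse" transition can be written down directly. The transition probabilities below are illustrative placeholders, not values fitted in the paper:

```python
import numpy as np

# States: 0 = interictal, 1 = preictal, 2 = seizure. Row i gives the
# transition probabilities out of state i; note that row 1 allows a return
# to the interictal state, the model's key feature.
P = np.array([
    [0.98, 0.02, 0.00],
    [0.10, 0.85, 0.05],
    [0.50, 0.00, 0.50],
])

def simulate(P, steps, seed=0):
    """Sample a state path from the Markov chain, starting interictal."""
    rng = np.random.default_rng(seed)
    state, path = 0, []
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
        path.append(state)
    return path

path = simulate(P, 5000)
```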
A Third Approach to Gene Prediction Suggests Thousands of Additional Human Transcribed Regions
Glusman, Gustavo; Qin, Shizhen; El-Gewely, M. Raafat; Siegel, Andrew F; Roach, Jared C; Hood, Leroy; Smit, Arian F. A
2006-01-01
The identification and characterization of the complete ensemble of genes is a main goal of deciphering the digital information stored in the human genome. Many algorithms for computational gene prediction have been described, ultimately derived from two basic concepts: (1) modeling gene structure and (2) recognizing sequence similarity. Successful hybrid methods combining these two concepts have also been developed. We present a third orthogonal approach to gene prediction, based on detecting the genomic signatures of transcription, accumulated over evolutionary time. We discuss four algorithms based on this third concept: Greens and CHOWDER, which quantify mutational strand biases caused by transcription-coupled DNA repair, and ROAST and PASTA, which are based on strand-specific selection against polyadenylation signals. We combined these algorithms into an integrated method called FEAST, which we used to predict the location and orientation of thousands of putative transcription units not overlapping known genes. Many of the newly predicted transcriptional units do not appear to code for proteins. The new algorithms are particularly apt at detecting genes with long introns and lacking sequence conservation. They therefore complement existing gene prediction methods and will help identify functional transcripts within many apparent “genomic deserts.” PMID:16543943
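The polyadenylation-signal idea behind ROAST and PASTA can be illustrated with a toy strand comparison; real implementations model hexamer statistics over long genomic windows rather than raw counts:

```python
def strand_pas_counts(seq):
    """Count the canonical polyadenylation signal AATAAA on the given strand
    and on its reverse complement. Transcribed regions tend to be selected
    against such signals on the sense strand, which is the bias the
    algorithms exploit, in vastly simplified form here."""
    comp = str.maketrans("ACGT", "TGCA")
    reverse_complement = seq.translate(comp)[::-1]
    return seq.count("AATAAA"), reverse_complement.count("AATAAA")
```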
A novel acenocoumarol pharmacogenomic dosing algorithm for the Greek population of EU-PACT trial.
Ragia, Georgia; Kolovou, Vana; Kolovou, Genovefa; Konstantinides, Stavros; Maltezos, Efstratios; Tavridou, Anna; Tziakas, Dimitrios; Maitland-van der Zee, Anke H; Manolopoulos, Vangelis G
2017-01-01
To generate and validate a pharmacogenomic-guided (PG) dosing algorithm for acenocoumarol in the Greek population, and to compare its performance with other PG algorithms developed for the Greek population. A total of 140 Greek participants in the EU-PACT trial for acenocoumarol (a randomized clinical trial that prospectively compared the effect of a PG dosing algorithm with that of a clinical dosing algorithm on the percentage of time within the INR therapeutic range) who reached a stable acenocoumarol dose were included in the study. CYP2C9 and VKORC1 genotypes, age and weight affected acenocoumarol dose and predicted 53.9% of its variability. The EU-PACT PG algorithm overestimated acenocoumarol dose across all CYP2C9/VKORC1 functional phenotype bins (predicted vs stable dose in normal responders 2.31 vs 2.00 mg/day, p = 0.028; in sensitive responders 1.72 vs 1.50 mg/day, p = 0.003; in highly sensitive responders 1.39 vs 1.00 mg/day, p = 0.029). The PG algorithm previously developed for the Greek population overestimated the dose in normal responders (2.51 vs 2.00 mg/day, p < 0.001). An ethnic-specific dosing algorithm is suggested for better prediction of acenocoumarol dosage requirements in patients of Greek origin.
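How such an algorithm is typically derived can be sketched as an ordinary least squares fit of stable dose on genotype, age and weight, with R-squared as the "variability explained". Every number below is synthetic and the coefficients are invented for illustration; none are the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 140
X = np.column_stack([
    np.ones(n),
    rng.integers(0, 3, n),      # CYP2C9 variant-allele count (synthetic)
    rng.integers(0, 3, n),      # VKORC1 genotype code (synthetic)
    rng.normal(65, 12, n),      # age, years (synthetic)
    rng.normal(78, 14, n),      # weight, kg (synthetic)
])
# hypothetical "true" coefficients plus noise generate a stable dose
dose = X @ np.array([3.0, -0.4, -0.5, -0.01, 0.01]) + rng.normal(0, 0.3, n)

beta, *_ = np.linalg.lstsq(X, dose, rcond=None)   # fit the dosing algorithm
pred = X @ beta
r2 = 1 - ((dose - pred) ** 2).sum() / ((dose - dose.mean()) ** 2).sum()
```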
Heat Flux and Fluid Flow in the Terrebonne Basin, Northern Gulf of Mexico
NASA Astrophysics Data System (ADS)
Meazell, K.; Flemings, P. B.
2016-12-01
We use a three-dimensional seismic survey to map the gas hydrate stability zone within a mid-slope salt-withdrawal minibasin in the northern Gulf of Mexico and identify anomalous regions within the basin where fluids may modify the hydrate stability zone. A discontinuous bottom-simulating reflector (BSR) marks the base of the hydrate stability zone and suggests an average geothermal gradient of 18.1 °C/km based on the calculated temperature at the BSR assuming seawater salinity, hydrostatic pressure, and a seafloor temperature of 4 °C. When compared to our model of the predicted base of gas hydrate stability assuming a basin-wide geothermal gradient of 18.1 °C/km, two anomalies are found where the BSR is observed significantly shallower than expected. The southern anomaly has a lateral influence of 1500 m from the salt, and a maximum shoaling of 800 m. This anomaly is likely the result of increased salinity or heat from a rising salt diapir along the flank of the basin. A local geothermal gradient of 67.31 °C/km or a salinity of 17.5 wt % can explain the observed position of the BSR at the southern anomaly. The northern anomaly is associated with active cold seep vents. In this area, the pluming BSR is crescent shaped, which we interpret as the result of warm and/or salty fluids migrating up through a fault. This anomaly has a lateral influence of 1500 m, and a maximum shoaling of 600 m above the predicted base of gas hydrate stability. A local geothermal gradient of 35.45 °C/km or a salinity of 14.7 wt % is required to adjust the position of the BSR to that observed at the northern anomaly. Active fluid migration suggests a combination of both heat and salinity is responsible for the altered position of the BSR.
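The geothermal-gradient arithmetic used above is linear in depth below seafloor. A quick check, where the BSR depth is an assumed example value rather than one from the survey:

```python
def temp_at_depth(z_m, gradient_c_per_km, t_seafloor_c=4.0):
    """Temperature (deg C) at depth z_m below seafloor for a linear
    geothermal gradient, given a seafloor temperature of 4 deg C."""
    return t_seafloor_c + gradient_c_per_km * z_m / 1000.0

# e.g. a hypothetical BSR 500 m below seafloor under the basin-wide
# gradient of 18.1 deg C/km
t_bsr = temp_at_depth(500, 18.1)
```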
Whittington, James C. R.; Bogacz, Rafal
2017-01-01
To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons. Several models have been proposed that approximate the backpropagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the backpropagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output. PMID:28333583
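The mechanism described above can be sketched for a single hidden layer: node activities relax to minimize prediction errors (with the output clamped to the target), then each weight changes by a purely local Hebbian rule, postsynaptic error times presynaptic activity. Sizes, learning rates, and iteration counts below are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(4, 3))    # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(2, 4))    # hidden -> output weights
x, y = rng.random(3), rng.random(2)        # one training pair

def train_step(x, y, W1, W2, infer_steps=50, lr_x=0.1, lr_w=0.05):
    u = W1 @ x
    h = np.tanh(u)                          # start hidden nodes at prediction
    for _ in range(infer_steps):            # inference: relax activities
        e1 = h - np.tanh(u)                 # hidden-layer prediction error
        e2 = y - W2 @ h                     # output error (output clamped)
        h = h + lr_x * (-e1 + W2.T @ e2)    # descend the energy wrt h
    e1 = h - np.tanh(u)
    e2 = y - W2 @ h
    # local Hebbian updates: error times (derivative-weighted) presynaptic input
    W2 = W2 + lr_w * np.outer(e2, h)
    W1 = W1 + lr_w * np.outer(e1 * (1 - np.tanh(u) ** 2), x)
    return W1, W2

initial_err = np.linalg.norm(y - W2 @ np.tanh(W1 @ x))
for _ in range(200):
    W1, W2 = train_step(x, y, W1, W2)
final_err = np.linalg.norm(y - W2 @ np.tanh(W1 @ x))
```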
Chen, Zhiru; Hong, Wenxue
2016-02-01
Considering the low prediction accuracy for positive samples and the poor overall classification caused by the unbalanced sample data of microRNA (miRNA) targets, we propose a support vector machine (SVM)-integration of under-sampling and weight (IUSM) algorithm, an under-sampling method based on ensemble learning. The algorithm adopts SVM as the learning algorithm and AdaBoost as the integration framework, and embeds clustering-based under-sampling into the iterative process, aiming at reducing the degree of unbalanced distribution between positive and negative samples. Meanwhile, in the process of adaptive weight adjustment of the samples, the SVM-IUSM algorithm eliminates abnormal negative samples with a robust sample-weight smoothing mechanism so as to avoid over-learning. Finally, an integrated miRNA target classifier is obtained by combining multiple weak classifiers through a voting mechanism. Experiments revealed that, compared with other algorithms on unbalanced datasets, the SVM-IUSM could not only improve the accuracy on positive targets and the overall classification effect, but also enhance the generalization ability of the miRNA target classifier.
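The clustering-based under-sampling step can be sketched with a small k-means pass over the majority (negative) class, keeping one representative per cluster. This is a generic stand-in for that step, not the paper's exact procedure, and it omits the AdaBoost wrapper:

```python
import numpy as np

def cluster_undersample(X_neg, n_keep, iters=20, seed=0):
    """Run k-means with n_keep clusters on the majority class and keep the
    sample nearest each center, shrinking the class while preserving its
    spatial coverage."""
    rng = np.random.default_rng(seed)
    centers = X_neg[rng.choice(len(X_neg), n_keep, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X_neg[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_keep):
            pts = X_neg[labels == k]
            if len(pts):
                centers[k] = pts.mean(axis=0)   # update cluster center
    d = np.linalg.norm(X_neg[:, None] - centers[None], axis=2)
    keep = np.unique(d.argmin(axis=0))          # nearest sample per center
    return X_neg[keep]
```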
Sun, Yongliang; Xu, Yubin; Li, Cheng; Ma, Lin
2013-11-13
A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results.
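The fingerprint-matching step reduces to normalized cross correlation between the on-line RSS vector and each reference point's stored fingerprint. A minimal sketch of the basic NCC the paper builds on (FNCC computes the same quantity more efficiently and additionally exploits reference-point RSS variations):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two RSS vectors."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def locate(online_rss, fingerprints, positions):
    """Return the position of the reference point whose stored fingerprint
    correlates best with the on-line RSS sample."""
    scores = [ncc(online_rss, fp) for fp in fingerprints]
    return positions[int(np.argmax(scores))]
```

For example, with fingerprints from two reference points over three access points, an on-line sample close to the first fingerprint resolves to the first position.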