On the validity of cosmological Fisher matrix forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolz, Laura; Kilbinger, Martin; Weller, Jochen
2012-09-01
We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w0 and wa. For purely geometrical probes, and especially when marginalising over wa, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecast errors changing only at the 5% level. We then explore non-linear transformations resulting in physically motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
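Both methods ultimately quote marginalized 1-sigma errors; in the Gaussian (Fisher) approximation these follow directly from the inverse Fisher matrix. A minimal numpy sketch of that step, using an invented 3x3 Fisher matrix for (Omega_m, w0, wa) rather than the paper's actual survey matrices:

```python
import numpy as np

# Illustrative Fisher matrix for parameters (Omega_m, w0, wa); the values are made up.
F = np.array([[4.0e4, 1.2e3, 3.0e2],
              [1.2e3, 8.0e1, 1.5e1],
              [3.0e2, 1.5e1, 4.0e0]])

cov = np.linalg.inv(F)                    # Gaussian approximation to the parameter covariance
marginalized = np.sqrt(np.diag(cov))      # 1-sigma errors, marginalized over the other parameters
conditional = 1.0 / np.sqrt(np.diag(F))   # errors with the other parameters held fixed

print("marginalized:", marginalized)
print("conditional: ", conditional)
```

The comparison reported above amounts to checking whether these Gaussian numbers reproduce the credible intervals obtained from an MCMC run on the full likelihood.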
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of breaking this degeneracy with weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
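To illustrate the Gaussianization step only, here is a hedged one-dimensional toy using scipy's Box-Cox transform on skewed posterior-like samples; the paper itself works with multi-parameter transformations and analytic Fisher matrices, so this is just a sketch of the idea with invented sample values:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(1)
# Toy "posterior" samples of a positive parameter with a strongly skewed distribution.
samples = rng.lognormal(mean=0.0, sigma=0.6, size=20000)

# Fit the Box-Cox parameter that renders the samples as close to Gaussian as possible.
transformed, lam = stats.boxcox(samples)

# In the transformed space a Gaussian (Fisher-matrix-like) description is adequate.
mu, sigma = transformed.mean(), transformed.std()
print(f"Box-Cox lambda = {lam:.3f}; Gaussian fit: mu = {mu:.3f}, sigma = {sigma:.3f}")

# Intervals predicted in the Gaussianized space map back through the inverse
# transform, recovering the skewed shape in the original parameter space.
interval = inv_boxcox(np.array([mu - sigma, mu + sigma]), lam)
print("1-sigma interval mapped back to the original parameter:", interval)
```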
Fisher Matrix Preloaded — FISHER4CAST
NASA Astrophysics Data System (ADS)
Bassett, Bruce A.; Fantaye, Yabebal; Hlozek, Renée; Kotze, Jacques
The Fisher Matrix is the backbone of modern cosmological forecasting. We describe the Fisher4Cast software: a general-purpose, easy-to-use Fisher Matrix framework. It is open source, rigorously designed and tested, and includes a Graphical User Interface (GUI) with automated LaTeX file creation capability and point-and-click Fisher ellipse generation. Fisher4Cast was designed for ease of extension and, although written in Matlab, is easily portable to open-source alternatives such as Octave and Scilab. Here we use Fisher4Cast to present new 3D and 4D visualizations of the forecasting landscape and to investigate the effects of growth and curvature on future cosmological surveys. Early releases have been available since mid-2008. The current release of the code is Version 2.2, which is described here. For ease of reference a Quick Start guide and the code used to produce the figures in this paper are included, in the hope that they will be useful to the cosmology and wider scientific communities.
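Fisher4Cast itself is a Matlab package, but the Fisher ellipse it draws for a parameter pair is straightforward to reproduce. The following Python sketch uses an invented 2x2 Fisher matrix for (w0, wa) purely for illustration; it is not Fisher4Cast code:

```python
import numpy as np
import matplotlib.pyplot as plt

F = np.array([[120.0, -45.0],
              [-45.0,  30.0]])            # illustrative 2x2 Fisher matrix for (w0, wa)
cov = np.linalg.inv(F)

# 68% ellipse: eigendecompose the covariance and scale by Delta-chi^2 = 2.30,
# the value enclosing 68% of the probability for two jointly estimated parameters.
vals, vecs = np.linalg.eigh(cov)
t = np.linspace(0.0, 2.0 * np.pi, 200)
circle = np.stack([np.cos(t), np.sin(t)])
ellipse = vecs @ (np.sqrt(2.30 * vals)[:, None] * circle)

plt.plot(ellipse[0], ellipse[1])
plt.xlabel("$\\Delta w_0$")
plt.ylabel("$\\Delta w_a$")
plt.title("68% Fisher ellipse (illustrative)")
plt.show()
```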
Simulating the effect of non-linear mode coupling in cosmological parameter estimation
NASA Astrophysics Data System (ADS)
Kiessling, A.; Taylor, A. N.; Heavens, A. F.
2011-09-01
Fisher Information Matrix methods are commonly used in cosmology to estimate the accuracy with which cosmological parameters can be measured by a given experiment and to optimize the design of experiments. However, the standard approach usually assumes both data and parameter estimates are Gaussian-distributed. Further, for survey forecasts and optimization it is usually assumed that the power-spectrum covariance matrix is diagonal in Fourier space. However, in the low-redshift Universe, non-linear mode coupling will tend to correlate small-scale power, moving information from lower to higher order moments of the field. This movement of information will change the predictions of cosmological parameter accuracy. In this paper we quantify this loss of information by comparing naïve Gaussian Fisher matrix forecasts with a maximum likelihood parameter estimation analysis of a suite of mock weak lensing catalogues derived from N-body simulations, based on the SUNGLASS pipeline, for a 2D and tomographic shear analysis of a Euclid-like survey. In both cases, we find that the 68 per cent confidence area of the Ωm-σ8 plane increases by a factor of 5. However, the marginal errors increase by just 20-40 per cent. We propose a new method to model the effects of non-linear shear-power mode coupling in the Fisher matrix by approximating the shear-power distribution as a multivariate Gaussian with a covariance matrix derived from the mock weak lensing survey. We find that this approximation can reproduce the 68 per cent confidence regions of the full maximum likelihood analysis in the Ωm-σ8 plane to high accuracy for both 2D and tomographic weak lensing surveys. Finally, we perform a multiparameter analysis of Ωm, σ8, h, ns, w0 and wa to compare the Gaussian and non-linear mode-coupled Fisher matrix contours. The 6D volume of the 1σ error contours for the non-linear Fisher analysis is a factor of 3 larger than for the Gaussian case, and the shape of the 68 per cent confidence volume is modified. We propose that future Fisher matrix estimates of cosmological parameter accuracies should include mode-coupling effects.
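The proposed remedy, treating the shear power spectrum as Gaussian-distributed with a covariance measured from the mock surveys, amounts in practice to evaluating the standard Gaussian Fisher matrix with a non-diagonal data covariance. A schematic sketch with placeholder derivatives and a placeholder mock covariance (the real calculation uses the SUNGLASS mocks and the survey's actual power-spectrum derivatives):

```python
import numpy as np

def gaussian_fisher(derivs, cov):
    """Fisher matrix F_ij = (d mu / d theta_i) . C^-1 . (d mu / d theta_j)
    for a Gaussian likelihood with a parameter-independent data covariance.
    derivs: (n_params, n_data) array of power-spectrum derivatives.
    cov:    (n_data, n_data) data covariance; diagonal for a Gaussian field,
            non-diagonal when mode coupling is included (e.g. measured from mocks)."""
    cinv = np.linalg.inv(cov)
    return derivs @ cinv @ derivs.T

rng = np.random.default_rng(0)
n_data, n_params = 20, 2
derivs = rng.normal(size=(n_params, n_data))      # placeholder dC_ell/dtheta
A = rng.normal(size=(n_data, 5 * n_data))
cov_mock = A @ A.T / (5 * n_data)                 # placeholder mock-derived covariance

F = gaussian_fisher(derivs, cov_mock)
print("marginalized errors:", np.sqrt(np.diag(np.linalg.inv(F))))
```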
Cosmology with the largest galaxy cluster surveys: going beyond Fisher matrix forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khedekar, Satej; Majumdar, Subhabrata, E-mail: satej@mpa-garching.mpg.de, E-mail: subha@tifr.res.in
2013-02-01
We make the first detailed MCMC likelihood study of cosmological constraints that are expected from some of the largest, ongoing and proposed, cluster surveys in different wave-bands and compare the estimates to the prevalent Fisher matrix forecasts. Mock catalogs of cluster counts expected from the surveys (eROSITA, WFXT, RCS2, DES and Planck), along with a mock dataset of follow-up mass calibrations, are analyzed for this purpose. A fair agreement between MCMC and Fisher results is found only in the case of minimal models. However, for many cases, the marginalized constraints obtained from Fisher and MCMC methods can differ by 30-100%. The discrepancy can be alarmingly large for a time-dependent dark energy equation of state, w(a); the Fisher methods are seen to underestimate the constraints by as much as a factor of 4-5. Typically, Fisher estimates become more and more inappropriate as we move away from ΛCDM to constant-w and then varying-w dark energy cosmologies. Fisher analysis also predicts incorrect parameter degeneracies. There are noticeable offsets in the likelihood contours obtained from Fisher methods, caused by an asymmetry in the posterior likelihood distribution as seen through an MCMC analysis. From the point of view of mass-calibration uncertainties, a high value of unknown scatter about the mean mass-observable relation, and its redshift dependence, is seen to have large degeneracies with the cosmological parameters σ8 and w(a) and can degrade the cosmological constraints considerably. We find that the addition of mass-calibrated cluster datasets can improve dark energy and σ8 constraints by factors of 2-3 over what can be obtained from CMB+SNe+BAO alone. Finally, we show that a joint analysis of datasets from two (or more) different cluster surveys would significantly tighten cosmological constraints from clusters alone. Since details of future cluster surveys are still being planned, we emphasize that optimal survey design should be done using MCMC analysis rather than Fisher forecasting.
A fresh approach to forecasting in astroparticle physics and dark matter searches
NASA Astrophysics Data System (ADS)
Edwards, Thomas D. P.; Weniger, Christoph
2018-02-01
We present a toolbox of new techniques and concepts for the efficient forecasting of experimental sensitivities. These are applicable to a large range of scenarios in (astro-)particle physics, and based on the Fisher information formalism. Fisher information provides an answer to the question 'what is the maximum extractable information from a given observation?'. It is a common tool for the forecasting of experimental sensitivities in many branches of science, but rarely used in astroparticle physics or searches for particle dark matter. After briefly reviewing the Fisher information matrix of general Poisson likelihoods, we propose very compact expressions for estimating expected exclusion and discovery limits ('equivalent counts method'). We demonstrate by comparison with Monte Carlo results that they remain surprisingly accurate even deep in the Poisson regime. We show how correlated background systematics can be efficiently accounted for by a treatment based on Gaussian random fields. Finally, we introduce the novel concept of Fisher information flux. It can be thought of as a generalization of the commonly used signal-to-noise ratio, while accounting for the non-local properties and saturation effects of background and instrumental uncertainties. It is a powerful and flexible tool ready to be used as core concept for informed strategy development in astroparticle physics and searches for particle dark matter.
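For independent bins with expected counts mu_k(theta), the Poisson Fisher matrix takes the simple closed form I_ij = sum_k (d mu_k / d theta_i)(d mu_k / d theta_j) / mu_k. A minimal sketch of that basic case with a toy signal-plus-background model (the templates and parameter values are invented, and the paper's treatment of correlated background systematics is not included):

```python
import numpy as np

def poisson_fisher(mu, dmu):
    """Fisher matrix of a binned Poisson likelihood.
    mu:  (n_bins,) expected counts.
    dmu: (n_params, n_bins) derivatives of the expected counts."""
    return (dmu / mu) @ dmu.T

# Toy model: mu_k = S * s_k + B * b_k with parameters theta = (S, B).
bins = np.arange(10)
s = np.exp(-0.5 * bins)                 # signal template
b = np.ones_like(bins, dtype=float)     # flat background template
S, B = 3.0, 50.0
mu = S * s + B * b
dmu = np.vstack([s, b])                 # d mu / dS and d mu / dB

F = poisson_fisher(mu, dmu)
sigma = np.sqrt(np.diag(np.linalg.inv(F)))
print("1-sigma errors on (S, B):", sigma)
```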
NASA Astrophysics Data System (ADS)
Wittman, David M.; Benson, Bryant
2018-06-01
Weak lensing analyses use the image (the intensity field) of a distant galaxy to infer gravitational effects on that line of sight. What if we analyze the velocity field instead? We show that lensing imprints much more information onto a highly ordered velocity field, such as that of a rotating disk galaxy, than onto an intensity field. This is because shuffling intensity pixels yields a post-lensed image quite similar to an unlensed galaxy with a different orientation, a problem known as "shape noise." We show that velocity field analysis can eliminate shape noise and yield much more precise lensing constraints. Furthermore, convergence as well as shear can be constrained using the same target, and there is no need to assume the weak lensing limit of small convergence. We present Fisher matrix forecasts of the precision achievable with this method. Velocity field observations are expensive, so we derive guidelines for choosing suitable targets by exploring how precision varies with source parameters such as inclination angle and redshift. Finally, we present simulations that support our Fisher matrix forecasts.
The impact of non-Gaussianity upon cosmological forecasts
NASA Astrophysics Data System (ADS)
Repp, A.; Szapudi, I.; Carron, J.; Wolk, M.
2015-12-01
The primary science driver for 3D galaxy surveys is their potential to constrain cosmological parameters. Forecasts of these surveys' effectiveness typically assume Gaussian statistics for the underlying matter density, despite the fact that the actual distribution is decidedly non-Gaussian. To quantify the effect of this assumption, we employ an analytic expression for the power spectrum covariance matrix to calculate the Fisher information for Baryon Acoustic Oscillation (BAO)-type model surveys. We find that for typical number densities, at kmax = 0.5 h Mpc^-1, Gaussian assumptions significantly overestimate the information on all parameters considered, in some cases by up to an order of magnitude. However, after marginalizing over a six-parameter set, the form of the covariance matrix (dictated by N-body simulations) causes the majority of the effect to shift to the 'amplitude-like' parameters, leaving the others virtually unaffected. We find that Gaussian assumptions at such wavenumbers can underestimate the dark energy parameter errors by well over 50 per cent, producing dark energy figures of merit almost three times too large. Thus, for 3D galaxy surveys probing the non-linear regime, proper consideration of non-Gaussian effects is essential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Qing-Guo; Wang, Sai; Zhao, Wen, E-mail: huangqg@itp.ac.cn, E-mail: wangsai@itp.ac.cn, E-mail: wzhao7@ustc.edu.cn
2015-10-01
By taking into account the contamination of foreground radiation, we employ the Fisher matrix to forecast the future sensitivity to the tilt of the power spectrum of primordial tensor perturbations for several ground-based (AdvACT, CLASS, Keck/BICEP3, Simons Array, SPT-3G), balloon-borne (EBEX, Spider) and satellite (CMBPol, COrE, LiteBIRD) B-mode polarization experiments. For the fiducial model nt = 0, our results show that the satellite experiments give good sensitivity to the tensor tilt nt, at the level σ(nt) ≲ 0.1 for r ≳ 2×10^-3, while the ground-based and balloon-borne experiments give worse sensitivity. Considering the BICEP2/Keck Array and Planck (BKP) constraint on the tensor-to-scalar ratio r, we see that it is impossible for these experiments to test the consistency relation nt = -r/8 of the canonical single-field slow-roll inflation models.
NASA Astrophysics Data System (ADS)
Brodie, Stephanie; Hobday, Alistair J.; Smith, James A.; Spillman, Claire M.; Hartog, Jason R.; Everett, Jason D.; Taylor, Matthew D.; Gray, Charles A.; Suthers, Iain M.
2017-06-01
Seasonal forecasting of environmental conditions and marine species distribution has been used as a decision support tool in commercial and aquaculture fisheries. These tools may also be applicable to species targeted by the recreational fisheries sector, a sector that is increasing its use of marine resources, and making important economic and social contributions to coastal communities around the world. Here, a seasonal forecast of the habitat and density of dolphinfish (Coryphaena hippurus), based on sea surface temperatures, was developed for the east coast of New South Wales (NSW), Australia. Two prototype forecast products were created: geographic spatial forecasts of dolphinfish habitat and a latitudinal summary identifying the location of fish density peaks. The less detailed latitudinal summary was created to limit the resolution of habitat information to prevent potential resource over-exploitation by fishers in the absence of total catch controls. The forecast dolphinfish habitat model was accurate at the start of the annual dolphinfish migration in NSW (December), but other months (January-May) showed poor performance due to spatial and temporal variability in the catch data used in model validation. Habitat forecasts for December were useful up to five months ahead, with performance decreasing as forecasts were made further into the future. The continued development and sound application of seasonal forecasts will help fishery industries cope with future uncertainty and promote dynamic and sustainable marine resource management.
Using Fisher Information Criteria for Chemical Sensor Selection via Convex Optimization Methods
2016-11-16
...proceeds from the determinant of the inverse Fisher information matrix, which is proportional to the global error volume. If a practitioner has a suitable... design of statistical estimators (i.e. sensors), as their respective inverses act as lower bounds to the (co)variances of the subject estimator, a property...
Siedlecki, Samantha A.; Kaplan, Isaac C.; Hermann, Albert J.; Nguyen, Thanh Tam; Bond, Nicholas A.; Newton, Jan A.; Williams, Gregory D.; Peterson, William T.; Alin, Simone R.; Feely, Richard A.
2016-01-01
Resource managers at the state, federal, and tribal levels make decisions on a weekly to quarterly basis, and fishers operate on a similar timeframe. To determine the potential of a support tool for these efforts, a seasonal forecast system is experimented with here. JISAO’s Seasonal Coastal Ocean Prediction of the Ecosystem (J-SCOPE) features dynamical downscaling of regional ocean conditions in Washington and Oregon waters using a combination of a high-resolution regional model with biogeochemistry and forecasts from NOAA’s Climate Forecast System (CFS). Model performance and predictability were examined for sea surface temperature (SST), bottom temperature, bottom oxygen, pH, and aragonite saturation state through model hindcasts, reforecast, and forecast comparisons with observations. Results indicate J-SCOPE forecasts have measurable skill on seasonal timescales. Experiments suggest that seasonal forecasting of ocean conditions important for fisheries is possible with the right combination of components. Those components include regional predictability on seasonal timescales of the physical environment from a large-scale model, a high-resolution regional model with biogeochemistry that simulates seasonal conditions in hindcasts, a relationship with local stakeholders, and a real-time observational network. Multiple efforts and approaches in different regions would advance knowledge to provide additional tools to fishers and other stakeholders. PMID:27273473
Higgs-dilaton cosmology: An inflation-dark-energy connection and forecasts for future galaxy surveys
NASA Astrophysics Data System (ADS)
Casas, Santiago; Pauly, Martin; Rubio, Javier
2018-02-01
The Higgs-dilaton model is a scale-invariant extension of the Standard Model nonminimally coupled to gravity and containing just one additional degree of freedom on top of the Standard Model particle content. This minimalistic scenario predicts a set of measurable consistency relations between the inflationary observables and the dark-energy equation-of-state parameter. We present an alternative derivation of these consistency relations that highlights the connections and differences with the α-attractor scenario. We study how far these constraints allow one to distinguish the Higgs-dilaton model from ΛCDM and wCDM cosmologies. To this end we first analyze existing data sets using a Markov chain Monte Carlo approach. Second, we perform forecasts for future galaxy surveys using a Fisher matrix approach, both for galaxy clustering and weak lensing probes. Assuming that the best fit values in the different models remain comparable to the present ones, we show that both Euclid- and SKA2-like missions will be able to discriminate a Higgs-dilaton cosmology from ΛCDM and wCDM.
Measuring the lensing potential with tomographic galaxy number counts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montanari, Francesco; Durrer, Ruth, E-mail: francesco.montanari@unige.ch, E-mail: ruth.durrer@unige.ch
2015-10-01
We investigate how the lensing potential can be measured tomographically with future galaxy surveys using their number counts. Such a measurement is an independent test of the standard ΛCDM framework and can be used to discern modified theories of gravity. We perform a Fisher matrix forecast based on galaxy angular-redshift power spectra, assuming specifications consistent with future photometric Euclid-like surveys and spectroscopic SKA-like surveys. For the Euclid-like survey we derive a fitting formula for the magnification bias. Our analysis suggests that the cross correlation between different redshift bins is very sensitive to the lensing potential such that the survey can measure the amplitude of the lensing potential at the same level of precision as other standard ΛCDM cosmological parameters.
Fitting and forecasting coupled dark energy in the non-linear regime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casas, Santiago; Amendola, Luca; Pettorino, Valeria
2016-01-01
We consider cosmological models in which dark matter feels a fifth force mediated by the dark energy scalar field, also known as coupled dark energy. Our interest resides in estimating forecasts for future surveys like Euclid when we take into account non-linear effects, relying on new fitting functions that reproduce the non-linear matter power spectrum obtained from N-body simulations. We obtain fitting functions for models in which the dark matter-dark energy coupling is constant. Their validity is demonstrated for all available simulations in the redshift range z = 0-1.6 and wave modes below k = 1 h/Mpc. These fitting formulas can be used to test the predictions of the model in the non-linear regime without the need for additional computing-intensive N-body simulations. We then use these fitting functions to perform forecasts on the constraining power that future galaxy-redshift surveys like Euclid will have on the coupling parameter, using the Fisher matrix method for galaxy clustering (GC) and weak lensing (WL). We find that by using information in the non-linear power spectrum, and combining the GC and WL probes, we can constrain the dark matter-dark energy coupling constant squared, β², with precision smaller than 4% and all other cosmological parameters to better than 1%, which is a considerable improvement of more than an order of magnitude compared to corresponding linear power spectrum forecasts with the same survey specifications.
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key metho...
Simple expression for the quantum Fisher information matrix
NASA Astrophysics Data System (ADS)
Šafránek, Dominik
2018-04-01
The quantum Fisher information matrix (QFIM) is a cornerstone of modern quantum metrology and quantum information geometry. Apart from optimal estimation, it finds applications in the description of quantum speed limits, quantum criticality, quantum phase transitions, coherence, entanglement, and irreversibility. We derive a surprisingly simple formula for this quantity which, unlike the previously known general expression, does not require diagonalization of the density matrix, and is provably at least as efficient. With a minor modification, this formula can be used to compute the QFIM for any finite-dimensional density matrix. Because of its simplicity, it could also shed more light on quantum information geometry in general.
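For context, the "previously known general expression" is the textbook spectral-decomposition formula, which does require diagonalizing the density matrix. A hedged numerical sketch of that standard formula (not the paper's new expression), applied to a toy Bloch-sphere qubit:

```python
import numpy as np

def qfim_spectral(rho, drho):
    """QFIM via the eigendecomposition of rho, using the standard formula
    F_ij = 2 * sum_{k,l: p_k+p_l>0} Re(<k|d_i rho|l><l|d_j rho|k>) / (p_k + p_l).
    rho:  (d, d) density matrix; drho: list of (d, d) parameter derivatives of rho."""
    p, U = np.linalg.eigh(rho)
    d = [U.conj().T @ D @ U for D in drho]   # derivatives in the eigenbasis of rho
    n = len(drho)
    F = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            val = 0.0
            for k in range(len(p)):
                for l in range(len(p)):
                    denom = p[k] + p[l]
                    if denom > 1e-12:
                        val += 2.0 * np.real(d[i][k, l] * d[j][l, k]) / denom
            F[i, j] = val
    return F

# Toy example: a qubit with Bloch radius r, parameterized by angles (theta, phi).
r, theta, phi = 0.8, 0.7, 0.3
sx = np.array([[0, 1], [1, 0]], complex)
sy = np.array([[0, -1j], [1j, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)
def rho_of(th, ph):
    n = np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph), np.cos(th)])
    return 0.5 * (np.eye(2) + r * (n[0] * sx + n[1] * sy + n[2] * sz))

eps = 1e-6
rho = rho_of(theta, phi)
drho = [(rho_of(theta + eps, phi) - rho_of(theta - eps, phi)) / (2 * eps),
        (rho_of(theta, phi + eps) - rho_of(theta, phi - eps)) / (2 * eps)]
print(qfim_spectral(rho, drho))
```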
NASA Astrophysics Data System (ADS)
Heavens, A. F.; Seikel, M.; Nord, B. D.; Aich, M.; Bouffanais, Y.; Bassett, B. A.; Hobson, M. P.
2014-12-01
The Fisher Information Matrix formalism (Fisher 1935) is extended to cases where the data are divided into two parts (X, Y), where the expectation value of Y depends on X according to some theoretical model, and X and Y both have errors with arbitrary covariance. In the simplest case, (X, Y) represent data pairs of abscissa and ordinate, in which case the analysis deals with the case of data pairs with errors in both coordinates, but X can be any measured quantities on which Y depends. The analysis applies for arbitrary covariance, provided all errors are Gaussian, and provided the errors in X are small, both in comparison with the scale over which the expected signal Y changes, and with the width of the prior distribution. This generalizes the Fisher Matrix approach, which normally only considers errors in the `ordinate' Y. In this work, we include errors in X by marginalizing over latent variables, effectively employing a Bayesian hierarchical model, and deriving the Fisher Matrix for this more general case. The methods here also extend to likelihood surfaces which are not Gaussian in the parameter space, and so techniques such as DALI (Derivative Approximation for Likelihoods) can be generalized straightforwardly to include arbitrary Gaussian data error covariances. For simple mock data and theoretical models, we compare to Markov Chain Monte Carlo experiments, illustrating the method with cosmological supernova data. We also include the new method in the FISHER4CAST software.
Kumaraswamy autoregressive moving average models for double bounded environmental data
NASA Astrophysics Data System (ADS)
Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme
2017-12-01
In this paper we introduce the Kumaraswamy autoregressive moving average (KARMA) models, a dynamic class of models for time series taking values in the double bounded interval (a,b) and following the Kumaraswamy distribution. The Kumaraswamy family of distributions is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and conditional Fisher information matrix. An application to environmental real data is presented and discussed.
Constraints on the dark matter and dark energy interactions from weak lensing bispectrum tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
An, Rui; Feng, Chang; Wang, Bin, E-mail: an_rui@sjtu.edu.cn, E-mail: chang.feng@uci.edu, E-mail: wang_b@sjtu.edu.cn
We estimate uncertainties of cosmological parameters for phenomenological interacting dark energy models using the weak lensing convergence power spectrum and bispectrum. We focus on the bispectrum tomography and examine how well the weak lensing bispectrum with tomography can constrain the interactions between dark sectors, as well as other cosmological parameters. Employing the Fisher matrix analysis, we forecast parameter uncertainties derived from weak lensing bispectra with a two-bin tomography and place upper bounds on the strength of the interactions between the dark sectors. The cosmic shear will be measured from upcoming weak lensing surveys with high sensitivity, enabling us to use the higher order correlation functions of weak lensing to constrain the interaction between dark sectors, potentially providing more stringent results when combined with other observations.
Plantet, C; Meimon, S; Conan, J-M; Fusco, T
2015-11-02
Exoplanet direct imaging with large ground-based telescopes requires eXtreme Adaptive Optics that couples high-order adaptive optics and coronagraphy. A key element of such systems is the high-order wavefront sensor. We study here several high-order wavefront sensing approaches, and more precisely compare their sensitivity to noise. Three techniques are considered: the classical Shack-Hartmann sensor, the pyramid sensor and the recently proposed LIFTed Shack-Hartmann sensor. They are compared in a unified framework based on precise diffractive models and on the Fisher information matrix, which conveys the information present in the data whatever the estimation method. The diagonal elements of the inverse of the Fisher information matrix, which we use as a figure of merit, are similar to noise propagation coefficients. With these diagonal elements, the so-called "Fisher coefficients", we show that the LIFTed Shack-Hartmann and pyramid sensors outperform the classical Shack-Hartmann sensor. In the photon-noise regime, the LIFTed Shack-Hartmann and modulated pyramid sensors obtain a similar overall noise propagation. The LIFTed Shack-Hartmann sensor, however, provides attractive noise properties on high orders.
NASA Astrophysics Data System (ADS)
Widyaningsih, Purnami; Retno Sari Saputro, Dewi; Nugrahani Putri, Aulia
2017-06-01
The GWOLR model combines the geographically weighted regression (GWR) and ordinal logistic regression (OLR) models. Its parameters are estimated by maximum likelihood. Such parameter estimation, however, yields a difficult-to-solve system of nonlinear equations, so a numerical approximation approach is required. The iterative approximation approach generally uses the Newton-Raphson (NR) method. The NR method has a disadvantage: its Hessian matrix of second derivatives must be recomputed at each iteration, and the iteration does not always converge. To address this, the NR method is modified by replacing its Hessian matrix with the Fisher information matrix, a procedure termed Fisher scoring (FS). The present research seeks to determine the GWOLR model parameter estimates using the Fisher scoring method and to apply the estimation to data on the level of vulnerability to Dengue Hemorrhagic Fever (DHF) in Semarang. The research concludes that health facilities give the greatest contribution to the probability of the number of DHF sufferers in both villages. Based on the number of sufferers, the IR category of DHF in both villages can be determined.
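Fisher scoring replaces the observed Hessian in the Newton-Raphson update with the expected (Fisher) information: beta <- beta + I(beta)^-1 U(beta). A hedged sketch for ordinary logistic regression, where the update takes the familiar iteratively reweighted least-squares form; this illustrates the algorithm only, not the paper's GWOLR model, and the simulated data are invented:

```python
import numpy as np

def fisher_scoring_logistic(X, y, n_iter=25, tol=1e-8):
    """Fit a logistic regression by Fisher scoring:
    beta <- beta + I(beta)^-1 U(beta), with score U = X^T (y - p)
    and Fisher information I = X^T W X, where W = diag(p * (1 - p))."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        score = X.T @ (y - p)
        W = p * (1.0 - p)
        info = X.T @ (X * W[:, None])
        step = np.linalg.solve(info, score)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta, info   # inverting `info` estimates the covariance of beta

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
true_beta = np.array([-0.5, 1.0, 2.0])
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
beta_hat, info = fisher_scoring_logistic(X, y)
print("estimate:", beta_hat)
```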
Statistical classification techniques for engineering and climatic data samples
NASA Technical Reports Server (NTRS)
Temple, E. C.; Shipman, J. R.
1981-01-01
Fisher's sample linear discriminant function is modified through an appropriate alteration of the common sample variance-covariance matrix. The alteration consists of adding nonnegative values to the eigenvalues of the sample variance-covariance matrix. The desired result of this modification is to increase the number of correct classifications by the new linear discriminant function over Fisher's function. This study is limited to the two-group discriminant problem.
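A hedged sketch of the eigenvalue-shift idea: compute the pooled sample covariance, add a nonnegative amount to its eigenvalues, and form the discriminant direction from the modified matrix. The single uniform shift used here (equivalent to adding a multiple of the identity) and all data values are illustrative assumptions, not the report's choices:

```python
import numpy as np

def shifted_lda_direction(X1, X2, shift=0.1):
    """Fisher linear discriminant direction with the pooled sample covariance
    regularized by adding `shift` to each of its eigenvalues (S + shift * I)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S = ((n1 - 1) * np.cov(X1, rowvar=False) + (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    vals, vecs = np.linalg.eigh(S)
    S_mod = vecs @ np.diag(vals + shift) @ vecs.T   # shifted eigenvalues
    return np.linalg.solve(S_mod, m1 - m2)

rng = np.random.default_rng(7)
X1 = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=40)
X2 = rng.multivariate_normal([1, 1], [[1.0, 0.8], [0.8, 1.0]], size=40)
w = shifted_lda_direction(X1, X2)
# Classify a new point by comparing w . x against the midpoint threshold.
threshold = w @ (X1.mean(axis=0) + X2.mean(axis=0)) / 2
print("direction:", w, "threshold:", threshold)
```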
Quantifying lost information due to covariance matrix estimation in parameter inference
NASA Astrophysics Data System (ADS)
Sellentin, Elena; Heavens, Alan F.
2017-02-01
Parameter inference with an estimated covariance matrix systematically loses information due to the remaining uncertainty of the covariance matrix. Here, we quantify this loss of precision and develop a framework to hypothetically restore it, which allows one to judge how far a given analysis is from the ideal case of a known covariance matrix. We point out that it is insufficient to estimate this loss by debiasing the Fisher matrix as previously done, due to a fundamental inequality that describes how biases arise in non-linear functions. We therefore develop direct estimators for parameter credibility contours and the figure of merit, finding that significantly fewer simulations than previously thought are sufficient to reach satisfactory precision. We apply our results to DES Science Verification weak lensing data, detecting a 10 per cent loss of information that increases their credibility contours. No significant loss of information is found for KiDS. For a Euclid-like survey with about 10 nuisance parameters, we find that 2900 simulations are sufficient to limit the systematically lost information to 1 per cent, with an additional uncertainty of about 2 per cent. Without any nuisance parameters, 1900 simulations are sufficient to lose only 1 per cent of information. We further derive estimators for all quantities needed for forecasting with estimated covariance matrices. Our formalism allows one to determine the sweet spot between running sophisticated simulations to reduce the number of nuisance parameters and running as many fast simulations as possible.
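The conventional "debiasing" referred to here is the Hartlap correction: the inverse of a covariance estimated from n_sims simulations is rescaled by (n_sims - n_data - 2)/(n_sims - 1) before it enters the Fisher matrix. A sketch of that conventional step, which the abstract argues is not by itself enough to quantify the lost information; all numbers are illustrative:

```python
import numpy as np

def hartlap_corrected_fisher(derivs, sims):
    """Fisher matrix using a covariance estimated from simulations, with the
    Hartlap factor correcting the bias of the inverted sample covariance.
    derivs: (n_params, n_data) model derivatives.
    sims:   (n_sims, n_data) simulated data vectors."""
    n_sims, n_data = sims.shape
    cov_hat = np.cov(sims, rowvar=False)
    hartlap = (n_sims - n_data - 2) / (n_sims - 1)   # requires n_sims > n_data + 2
    cinv = hartlap * np.linalg.inv(cov_hat)
    return derivs @ cinv @ derivs.T

rng = np.random.default_rng(42)
n_data, n_sims = 10, 200
true_cov = np.diag(np.linspace(1.0, 2.0, n_data))
sims = rng.multivariate_normal(np.zeros(n_data), true_cov, size=n_sims)
derivs = rng.normal(size=(2, n_data))
F = hartlap_corrected_fisher(derivs, sims)
print("debiased marginalized errors:", np.sqrt(np.diag(np.linalg.inv(F))))
```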
Constraints on a scale-dependent bias from galaxy clustering
NASA Astrophysics Data System (ADS)
Amendola, L.; Menegoni, E.; Di Porto, C.; Corsi, M.; Branchini, E.
2017-01-01
We forecast the future constraints on scale-dependent parametrizations of galaxy bias and their impact on the estimate of cosmological parameters from the power spectrum of galaxies measured in a spectroscopic redshift survey. For the latter we assume a wide survey at relatively large redshifts, similar to the planned Euclid survey, as the baseline for future experiments. To assess the impact of the bias we perform a Fisher matrix analysis, and we adopt two different parametrizations of scale-dependent bias. The fiducial models for galaxy bias are calibrated using mock catalogs of Hα-emitting galaxies mimicking the expected properties of the objects that will be targeted by the Euclid survey. In our analysis we have obtained two main results. First of all, allowing for a scale-dependent bias does not significantly increase the errors on the other cosmological parameters apart from the rms amplitude of density fluctuations, σ8, and the growth index γ, whose uncertainties increase by a factor of up to 2, depending on the bias model adopted. Second, we find that the accuracy in the linear bias parameter b0 can be estimated to within 1%-2% at various redshifts regardless of the fiducial model. The nonlinear bias parameters have significantly large errors that depend on the model adopted. Despite this, in the more realistic scenarios departures from the simple linear bias prescription can be detected with a ~2σ significance at each redshift explored. Finally, we use the Fisher matrix formalism to assess the impact of assuming an incorrect bias model and find that the systematic errors induced on the cosmological parameters are similar to or even larger than the statistical ones.
Searching for modified growth patterns with tomographic surveys
NASA Astrophysics Data System (ADS)
Zhao, Gong-Bo; Pogosian, Levon; Silvestri, Alessandra; Zylberberg, Joel
2009-04-01
In alternative theories of gravity, designed to produce cosmic acceleration at the current epoch, the growth of large scale structure can be modified. We study the potential of upcoming and future tomographic surveys such as Dark Energy Survey (DES) and Large Synoptic Survey Telescope (LSST), with the aid of cosmic microwave background (CMB) and supernovae data, to detect departures from the growth of cosmic structure expected within general relativity. We employ parametric forms to quantify the potential time- and scale-dependent variation of the effective gravitational constant and the differences between the two Newtonian potentials. We then apply the Fisher matrix technique to forecast the errors on the modified growth parameters from galaxy clustering, weak lensing, CMB, and their cross correlations across multiple photometric redshift bins. We find that even with conservative assumptions about the data, DES will produce nontrivial constraints on modified growth and that LSST will do significantly better.
A Kalman filter for a two-dimensional shallow-water model
NASA Technical Reports Server (NTRS)
Parrish, D. F.; Cohn, S. E.
1985-01-01
A two-dimensional Kalman filter is described for data assimilation for making weather forecasts. The filter is regarded as superior to the optimal interpolation method because the filter determines the forecast error covariance matrix exactly instead of using an approximation. A generalized time step is defined which includes expressions for one time step of the forecast model, the error covariance matrix, the gain matrix, and the evolution of the covariance matrix. Subsequent time steps are achieved by quantifying the forecast variables or employing a linear extrapolation from a current variable set, assuming the forecast dynamics are linear. Calculations for the evolution of the error covariance matrix are banded, i.e., are performed only with the elements significantly different from zero. Experimental results are provided from an application of the filter to a shallow-water simulation covering a 6000 x 6000 km grid.
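The generalized time step described above is the standard Kalman forecast/analysis cycle; a minimal, hedged linear sketch (a toy persistence model with invented covariances, not the shallow-water system of the paper):

```python
import numpy as np

def kalman_step(x, P, M, Q, y, H, R):
    """One forecast/analysis cycle of the Kalman filter.
    x, P: current state estimate and its error covariance.
    M, Q: linear forecast model and model-error covariance.
    y, H, R: observations, observation operator, observation-error covariance."""
    # Forecast step: propagate the state and the error covariance.
    x_f = M @ x
    P_f = M @ P @ M.T + Q
    # Analysis step: compute the gain matrix and update with the observations.
    K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)
    x_a = x_f + K @ (y - H @ x_f)
    P_a = (np.eye(len(x)) - K @ H) @ P_f
    return x_a, P_a

n = 4
M = np.eye(n)                      # placeholder dynamics (persistence forecast)
Q = 0.01 * np.eye(n)
H = np.eye(n)[:2]                  # observe only the first two state variables
R = 0.1 * np.eye(2)
x, P = np.zeros(n), np.eye(n)
x, P = kalman_step(x, P, M, Q, np.array([1.0, 0.5]), H, R)
print(x, np.diag(P))
```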
Construction cost forecast model : model documentation and technical notes.
DOT National Transportation Integrated Search
2013-05-01
Construction cost indices are generally estimated with Laspeyres, Paasche, or Fisher indices that allow changes in the quantities of construction bid items, as well as changes in price, to change the cost indices of those items. These cost indices...
Zimmer, Christoph
2016-01-01
Computational modeling is a key technique for analyzing models in systems biology. There are well established methods for the estimation of the kinetic parameters in models of ordinary differential equations (ODE). Experimental design techniques aim at devising experiments that maximize the information encoded in the data. For ODE models there are well established approaches for experimental design and even software tools. However, data from single cell experiments on signaling pathways in systems biology often shows intrinsic stochastic effects, prompting the development of specialized methods. While simulation methods have been developed for decades and parameter estimation has been targeted in recent years, only very few articles focus on experimental design for stochastic models. The Fisher information matrix is the central measure for experimental design as it evaluates the information an experiment provides for parameter estimation. This article suggests an approach to calculate a Fisher information matrix for models containing intrinsic stochasticity and high nonlinearity. The approach makes use of a recently suggested multiple shooting for stochastic systems (MSS) objective function. The Fisher information matrix is calculated by evaluating pseudo data with the MSS technique. The performance of the approach is evaluated with simulation studies on an Immigration-Death, a Lotka-Volterra, and a Calcium oscillation model. The Calcium oscillation model is a particularly appropriate case study as it contains the challenges inherent to signaling pathways: high nonlinearity, intrinsic stochasticity, a qualitatively different behavior from an ODE solution, and partial observability. The computational speed of the MSS approach for the Fisher information matrix allows for an application in realistic size models.
Detecting higher spin fields through statistical anisotropy in the CMB and galaxy power spectra
NASA Astrophysics Data System (ADS)
Bartolo, Nicola; Kehagias, Alex; Liguori, Michele; Riotto, Antonio; Shiraishi, Maresuke; Tansella, Vittorio
2018-01-01
Primordial inflation may represent the most powerful collider to test high-energy physics models. In this paper we study the impact on the inflationary power spectrum of the comoving curvature perturbation in the specific model where massive higher spin fields are rendered effectively massless during a de Sitter epoch through suitable couplings to the inflaton field. In particular, we show that such fields with spin s induce a distinctive statistical anisotropic signal on the power spectrum, in such a way that not only the usual g_2M statistical anisotropy coefficients, but also higher-order ones (i.e., g_4M, g_6M, …, g_(2s-2)M and g_(2s)M) are nonvanishing. We examine their imprints in the cosmic microwave background and galaxy power spectra. Our Fisher matrix forecasts indicate that the detectability of g_LM depends very weakly on L: all coefficients could be detected in the near future if their magnitudes are bigger than about 10^-3.
Testing the equivalence principle on cosmological scales
NASA Astrophysics Data System (ADS)
Bonvin, Camille; Fleury, Pierre
2018-05-01
The equivalence principle, which is one of the main pillars of general relativity, is very well tested in the Solar system; however, its validity is more uncertain on cosmological scales, or when dark matter is concerned. This article shows that relativistic effects in the large-scale structure can be used to directly test whether dark matter satisfies Euler's equation, i.e. whether its free fall is characterised by geodesic motion, just like baryons and light. After proposing a general parametrisation for deviations from Euler's equation, we perform Fisher-matrix forecasts for future surveys like DESI and the SKA, and show that such deviations can be constrained with a precision of order 10%. Deviations from Euler's equation cannot be tested directly with standard methods like redshift-space distortions and gravitational lensing, since these observables are not sensitive to the time component of the metric. Our analysis therefore shows that relativistic effects bring new and complementary constraints to alternative theories of gravity.
Galaxy bispectrum from massive spinning particles
NASA Astrophysics Data System (ADS)
Moradinezhad Dizgah, Azadeh; Lee, Hayden; Muñoz, Julian B.; Dvorkin, Cora
2018-05-01
Massive spinning particles, if present during inflation, lead to a distinctive bispectrum of primordial perturbations, the shape and amplitude of which depend on the masses and spins of the extra particles. This signal, in turn, leaves an imprint in the statistical distribution of galaxies; in particular, as a non-vanishing galaxy bispectrum, which can be used to probe the masses and spins of these particles. In this paper, we present for the first time a new theoretical template for the bispectrum generated by massive spinning particles, valid for a general triangle configuration. We then proceed to perform a Fisher-matrix forecast to assess the potential of two next-generation spectroscopic galaxy surveys, EUCLID and DESI, to constrain the primordial non-Gaussianity sourced by these extra particles. We model the galaxy bispectrum using tree-level perturbation theory, accounting for redshift-space distortions and the Alcock-Paczynski effect, and forecast constraints on the primordial non-Gaussianity parameters marginalizing over all relevant biases and cosmological parameters. Our results suggest that these surveys would potentially be sensitive to any primordial non-Gaussianity with an amplitude larger than fNL≈ 1, for massive particles with spins 2, 3, and 4. Interestingly, if non-Gaussianities are present at that level, these surveys will be able to infer the masses of these spinning particles to within tens of percent. If detected, this would provide a very clear window into the particle content of our Universe during inflation.
ERIC Educational Resources Information Center
Collazo, Andres; And Others
Since a great number of variables influence future educational outcomes, forecasting possible trends is a complex task. One such model, the cross-impact matrix, has been developed. The use of this matrix in forecasting future values of social indicators of educational outcomes is described. Variables associated with educational outcomes are used…
Parameter estimation by decoherence in the double-slit experiment
NASA Astrophysics Data System (ADS)
Matsumura, Akira; Ikeda, Taishi; Kukita, Shingo
2018-06-01
We discuss a parameter estimation problem using quantum decoherence in the double-slit interferometer. We consider a particle coupled to a massive scalar field after the particle passes through the double slit, and solve the dynamics non-perturbatively in the coupling using the WKB approximation. This allows us to analyze an estimation problem that cannot be treated by the master-equation approach used in quantum-probe research. In this model, the scalar field reduces the interference fringes of the particle, and the fringe pattern depends on the field mass and coupling. To evaluate the contrast and the estimation precision obtained from the pattern, we introduce the interferometric visibility and the Fisher information matrix of the field mass and coupling. For the fringe pattern observed on the distant screen, we derive a simple relation between the visibility and the Fisher matrix. Also, focusing on the estimation precision of the mass, we find that the Fisher information characterizes the wave-particle duality in the double-slit interferometer.
The Trapped Radiation Handbook. Change 4,
1977-01-04
Cosmology with galaxy cluster phase spaces
NASA Astrophysics Data System (ADS)
Stark, Alejo; Miller, Christopher J.; Huterer, Dragan
2017-07-01
We present a novel approach to constrain accelerating cosmologies with galaxy cluster phase spaces. With the Fisher matrix formalism we forecast constraints on the cosmological parameters that describe the cosmological expansion history. We find that our probe has the potential of providing constraints comparable to, or even stronger than, those from other cosmological probes. More specifically, with 1000 (100) clusters uniformly distributed in the redshift range 0 ≤ z ≤ 0.8, after applying a conservative 80% mass scatter prior on each cluster and marginalizing over all other parameters, we forecast 1σ constraints on the dark energy equation of state w and matter density parameter ΩM of σw = 0.138 (0.431) and σΩM = 0.007 (0.025) in a flat universe. Assuming 40% mass scatter and adding a prior on the Hubble constant we can achieve a constraint on the Chevallier-Polarski-Linder parametrization of the dark energy equation of state parameters w0 and wa with 100 clusters in the same redshift range: σw0 = 0.191 and σwa = 2.712. Dropping the assumption of flatness and assuming w = -1 we also attain competitive constraints on the matter and dark energy density parameters: σΩM = 0.101 and σΩΛ = 0.197 for 100 clusters uniformly distributed in the range 0 ≤ z ≤ 0.8 after applying a prior on the Hubble constant. We also discuss various observational strategies for tightening constraints in both the near and far future.
NASA Astrophysics Data System (ADS)
Merkel, Philipp M.; Schäfer, Björn Malte
2017-08-01
Recently, it has been shown that cross-correlating cosmic microwave background (CMB) lensing and three-dimensional (3D) cosmic shear allows one to considerably tighten cosmological parameter constraints. We investigate whether similar improvement can be achieved in a conventional tomographic setup. We present Fisher parameter forecasts for a Euclid-like galaxy survey in combination with different ongoing and forthcoming CMB experiments. In contrast to a fully 3D analysis, we find only marginal improvement. Assuming Planck-like CMB data, we show that including the full covariance of the combined CMB and cosmic shear data improves the dark energy figure of merit (FOM) by only 3 per cent. The marginalized error on the sum of neutrino masses is reduced at the same level. For a next-generation CMB satellite mission such as Prism, the predicted improvement of the dark energy FOM amounts to approximately 25 per cent. Furthermore, we show that the small improvement is contrasted by an increased bias in the dark energy parameters when the intrinsic alignment of galaxies is not correctly accounted for in the full covariance matrix.
Deconstructing the neutrino mass constraint from galaxy redshift surveys
NASA Astrophysics Data System (ADS)
Boyle, Aoife; Komatsu, Eiichiro
2018-03-01
The total mass of neutrinos can be constrained in a number of ways using galaxy redshift surveys. Massive neutrinos modify the expansion rate of the Universe, which can be measured using baryon acoustic oscillations (BAOs) or the Alcock-Paczynski (AP) test. Massive neutrinos also change the structure growth rate and the amplitude of the matter power spectrum, which can be measured using redshift-space distortions (RSD). We use the Fisher matrix formalism to disentangle these information sources, to provide projected neutrino mass constraints from each of these probes alone and to determine how sensitive each is to the assumed cosmological model. We isolate the distinctive effect of neutrino free-streaming on the matter power spectrum and structure growth rate as a signal unique to massive neutrinos that can provide the most robust constraints, which are relatively insensitive to extensions to the cosmological model beyond ΛCDM . We also provide forecasted constraints using all of the information contained in the observed galaxy power spectrum combined, and show that these maximally optimistic constraints are primarily limited by the accuracy to which the optical depth of the cosmic microwave background, τ, is known.
Semisupervised kernel marginal Fisher analysis for face recognition.
Wang, Ziqiang; Sun, Xia; Sun, Lijun; Huang, Yuchun
2013-01-01
Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To effectively cope with this problem, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labeled and unlabeled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it can successfully avoid the singularity problem by not calculating the matrix inverse. In addition, in order to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold adaptive nonparameter kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of our proposed algorithm.
Explicit formula for the Holevo bound for two-parameter qubit-state estimation problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Jun, E-mail: junsuzuki@uec.ac.jp
The main contribution of this paper is to derive an explicit expression for the fundamental precision bound, the Holevo bound, for estimating any two-parameter family of qubit mixed states in terms of quantum versions of Fisher information. The obtained formula depends solely on the symmetric logarithmic derivative (SLD) Fisher information, the right logarithmic derivative (RLD) Fisher information, and a given weight matrix. This result immediately provides necessary and sufficient conditions for the following two important classes of quantum statistical models: those for which the Holevo bound coincides with the SLD Cramér-Rao bound, and those for which it coincides with the RLD Cramér-Rao bound. One of the important results of this paper is that a general model other than these two special cases exhibits an unexpected property: the structure of the Holevo bound changes smoothly when the weight matrix varies. In particular, it always coincides with the RLD Cramér-Rao bound for a certain choice of the weight matrix. Several examples illustrate these findings.
Forecasting the short-term passenger flow on high-speed railway with neural networks.
Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing
2014-01-01
Short-term passenger flow forecasting is an important component of transportation systems. The forecasting results can be applied to support transportation system operation and management, such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. There are three steps in the forecasting method. First, the numbers of passengers who arrive at or depart from each station are obtained from historical passenger flow data, which take the form of OD matrices in this paper. Second, the numbers of passengers arriving at or departing from each station in the short term are forecast with a neural network. Finally, the short-term OD matrices are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on high-speed railway.
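As a rough illustration of the third step, the sketch below recovers an OD matrix from forecast station-level totals by iterative proportional fitting. This is one common choice of OD-estimation method, not necessarily the estimator used in the paper, and the station totals and seed matrix are invented placeholders standing in for the neural-network forecasts and historical data.

```python
import numpy as np

def ipf_od_matrix(seed, departures, arrivals, iters=100):
    """Estimate an OD matrix whose row sums match forecast departures and
    whose column sums match forecast arrivals, by iterative proportional
    fitting (one common OD-estimation choice; the paper's estimator may differ)."""
    od = seed.astype(float).copy()
    for _ in range(iters):
        od *= departures / od.sum(axis=1, keepdims=True)   # match row totals
        od *= arrivals / od.sum(axis=0, keepdims=True)      # match column totals
    return od

# Stand-ins for the per-station forecasts produced in step 2 of the abstract.
departures = np.array([[120.0], [80.0], [60.0]])    # forecast departures per station
arrivals = np.array([[90.0, 100.0, 70.0]])           # forecast arrivals per station
seed = np.array([[0, 50, 70], [40, 0, 40], [50, 10, 0]])  # historical OD matrix as seed
od_forecast = ipf_od_matrix(seed, departures, arrivals)
```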
A new method for determining the optimal lagged ensemble
DelSole, T.; Tippett, M. K.; Pegion, K.
2017-01-01
We propose a general methodology for determining the lagged ensemble that minimizes the mean square forecast error. The MSE of a lagged ensemble is shown to depend only on a quantity called the cross-lead error covariance matrix, which can be estimated from a short hindcast data set and parameterized in terms of analytic functions of time. The resulting parameterization allows the skill of forecasts to be evaluated for an arbitrary ensemble size and initialization frequency. Remarkably, the parameterization also can estimate the MSE of a burst ensemble simply by taking the limit of an infinitely small interval between initialization times. This methodology is applied to forecasts of the Madden-Julian Oscillation (MJO) from version 2 of the Climate Forecast System (CFSv2). For leads greater than a week, little improvement is found in MJO forecast skill when lagged ensembles longer than 5 days or initializations more frequent than 4 times per day are used. We find that if initialization is too infrequent, important structures of the lagged error covariance matrix are lost. Lastly, we demonstrate that the forecast error at leads ≥10 days can be reduced by optimally weighting the lagged ensemble members. The weights are shown to depend only on the cross-lead error covariance matrix. While the methodology developed here is applied to CFSv2, the technique can be easily adapted to other forecast systems. PMID:28580050
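The optimal weights mentioned at the end of the abstract follow from a standard constrained minimum-variance argument: minimizing w^T C w subject to the weights summing to one gives w proportional to C^{-1} 1, where C is the cross-lead error covariance matrix. The sketch below illustrates this with a synthetic C; the covariance values are invented, not CFSv2 estimates.

```python
import numpy as np

def optimal_lag_weights(C):
    """Weights minimizing w^T C w subject to sum(w) = 1, where C is the
    cross-lead error covariance matrix of the lagged ensemble members."""
    ones = np.ones(C.shape[0])
    w = np.linalg.solve(C, ones)
    return w / w.sum()

# Synthetic cross-lead error covariance: older initializations are noisier
# and errors at nearby lags are correlated.
lags = np.arange(5)
var = 1.0 + 0.15 * lags
C = np.sqrt(np.outer(var, var)) * 0.8 ** np.abs(np.subtract.outer(lags, lags))

w = optimal_lag_weights(C)
mse_weighted = w @ C @ w                              # MSE of the weighted mean
mse_equal = (np.ones(5) / 5) @ C @ (np.ones(5) / 5)   # MSE of the plain lagged mean
```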
Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation
Peter, Adrian M.; Rangarajan, Anand
2010-01-01
Shape matching plays a prominent role in the comparison of similar structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and deformation. The theoretical foundation is drawn from information geometry wherein information matrices are used to establish intrinsic distances between parametric densities. When a parameterized probability density function is used to represent a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that given two shapes parameterized by Gaussian mixture models (GMMs), the well-known Fisher information matrix of the mixture model is also a Riemannian metric (actually, the Fisher-Rao Riemannian metric) and can therefore be used for computing shape geodesics. The Fisher-Rao metric has the advantage of being an intrinsic metric and invariant to reparameterization. The geodesic—computed using this metric—establishes an intrinsic deformation between the shapes, thus unifying both shape representation and deformation. A fundamental drawback of the Fisher-Rao metric is that it is not available in closed form for the GMM. Consequently, shape comparisons are computationally very expensive. To address this, we develop a new Riemannian metric based on generalized ϕ-entropy measures. In sharp contrast to the Fisher-Rao metric, the new metric is available in closed form. Geodesic computations using the new metric are considerably more efficient. We validate the performance and discriminative capabilities of these new information geometry-based metrics by pairwise matching of corpus callosum shapes. We also study the deformations of fish shapes that have various topological properties. A comprehensive comparative analysis is also provided using other landmark-based distances, including the Hausdorff distance, the Procrustes metric, landmark-based diffeomorphisms, and the bending energies of the thin-plate (TPS) and Wendland splines. PMID:19110497
Forecasting extinction risk with nonstationary matrix models.
Gotelli, Nicholas J; Ellison, Aaron M
2006-02-01
Matrix population growth models are standard tools for forecasting population change and for managing rare species, but they are less useful for predicting extinction risk in the face of changing environmental conditions. Deterministic models provide point estimates of lambda, the finite rate of increase, as well as measures of matrix sensitivity and elasticity. Stationary matrix models can be used to estimate extinction risk in a variable environment, but they assume that the matrix elements are randomly sampled from a stationary (i.e., non-changing) distribution. Here we outline a method for using nonstationary matrix models to construct realistic forecasts of population fluctuation in changing environments. Our method requires three pieces of data: (1) field estimates of transition matrix elements, (2) experimental data on the demographic responses of populations to altered environmental conditions, and (3) forecasting data on environmental drivers. These three pieces of data are combined to generate a series of sequential transition matrices that emulate a pattern of long-term change in environmental drivers. Realistic estimates of population persistence and extinction risk can be derived from stochastic permutations of such a model. We illustrate the steps of this analysis with data from two populations of Sarracenia purpurea growing in northern New England. Sarracenia purpurea is a perennial carnivorous plant that is potentially at risk of local extinction because of increased nitrogen deposition. Long-term monitoring records or models of environmental change can be used to generate time series of driver variables under different scenarios of changing environments. Both manipulative and natural experiments can be used to construct a linking function that describes how matrix parameters change as a function of the environmental driver. This synthetic modeling approach provides quantitative estimates of extinction probability that have an explicit mechanistic basis.
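A minimal sketch of the nonstationary approach described above, assuming a hypothetical two-stage life cycle and an invented linking function between an environmental driver and adult survival. The coefficients, driver trend and quasi-extinction threshold are placeholders, not estimates for Sarracenia purpurea.

```python
import numpy as np

rng = np.random.default_rng(0)

def transition_matrix(driver):
    """Toy linking function: a two-stage (juvenile/adult) matrix whose adult
    survival declines with an environmental driver (e.g. nitrogen deposition).
    The functional form and coefficients are placeholders, not fitted values."""
    fecundity = 1.2
    juv_survival = 0.3
    adult_survival = max(0.85 - 0.05 * driver, 0.0)
    return np.array([[0.0, fecundity],
                     [juv_survival, adult_survival]])

def extinction_risk(driver_series, n0, n_reps=1000, threshold=1.0):
    """Project the population through a sequence of driver-dependent matrices
    (with small random perturbations) and estimate quasi-extinction risk."""
    extinct = 0
    for _ in range(n_reps):
        n = np.array(n0, dtype=float)
        for d in driver_series:
            A = transition_matrix(d + rng.normal(0.0, 0.2))  # environmental noise
            n = A @ n
            if n.sum() < threshold:
                extinct += 1
                break
    return extinct / n_reps

driver_series = np.linspace(0.0, 3.0, 30)   # hypothetical 30-year driver trend
risk = extinction_risk(driver_series, n0=[50, 20])
```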
NASA Astrophysics Data System (ADS)
Orlove, Benjamin S.; Broad, Kenneth; Petty, Aaron M.
2004-11-01
This article analyzes the use of climate forecasts among members of the Peruvian fishing sector during the 1997/98 El Niño event. It focuses on the effect of the time of hearing a forecast on the socioeconomic responses to the forecast. Findings are based on data collected from a survey of 596 persons in five ports spanning the length of the Peruvian coast. Respondents include commercial and artisanal fishers, plant workers, managers, and firm owners. These data fill an important gap in the literature on the use of forecasts. Though modelers have discussed the effects of the timing of the dissemination and reception of forecasts, along with other factors, on acting on a forecast once it has been heard, few researchers have gathered empirical evidence on these topics. The 1997/98 El Niño event was covered extensively by the media throughout Peru, affording the opportunity to study the effect of hearing forecasts on actions taken by members of a population directly impacted by ENSO events. Findings of this study examine the relationships among 1) socioeconomic variables, including geographic factors, age, education, income level, organizational ties, and media access; 2) time of hearing the forecast; and 3) actions taken in response to the forecast. Socioeconomic variables have a strong effect on the time of hearing the forecast and the actions taken in response to the forecast; however, time of hearing does not have an independent effect on taking action. The article discusses the implications of these findings for the application of forecasts. A supplement to this article is available online (dx.doi.org/10.1175/BAMS-85-11-Orlove)
Lifting primordial non-Gaussianity above the noise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Welling, Yvette; Woude, Drian van der; Pajer, Enrico, E-mail: welling@strw.leidenuniv.nl, E-mail: D.C.vanderWoude@uu.nl, E-mail: enrico.pajer@gmail.com
2016-08-01
Primordial non-Gaussianity (PNG) in Large Scale Structures is obfuscated by the many additional sources of non-linearity. Within the Effective Field Theory approach to Standard Perturbation Theory, we show that matter non-linearities in the bispectrum can be modeled sufficiently well to strengthen current bounds with near-future surveys, such as Euclid. We find that the EFT corrections are crucial to this improvement in sensitivity. Yet, our understanding of non-linearities is still insufficient to reach important theoretical benchmarks for equilateral PNG, while, for local PNG, our forecast is more optimistic. We consistently account for the theoretical error intrinsic to the perturbative approach and discuss the details of its implementation in Fisher forecasts.
Du, Tianchuan; Liao, Li; Wu, Cathy H; Sun, Bilin
2016-11-01
Protein-protein interactions play essential roles in many biological processes. Acquiring knowledge of the residue-residue contact information of two interacting proteins is not only helpful in annotating functions for proteins, but also critical for structure-based drug design. The prediction of the protein residue-residue contact matrix of the interfacial regions is challenging. In this work, we introduced deep learning techniques (specifically, stacked autoencoders) to build deep neural network models to tackle the residue-residue contact prediction problem. In tandem with interaction profile Hidden Markov Models, which were first used to extract Fisher score features from protein sequences, stacked autoencoders were deployed to extract and learn hidden abstract features. The deep learning model showed significant improvement over the traditional machine learning model, Support Vector Machines (SVM), with the overall accuracy increased by 15%, from 65.40% to 80.82%. We showed that the stacked autoencoders could extract, out of the Fisher score features, novel features that can be utilized by deep neural networks and other classifiers to enhance learning. It is further shown that deep neural networks have significant advantages over SVM in making use of the newly extracted features. Copyright © 2016. Published by Elsevier Inc.
2015-03-01
General covariance intersection; Σ1, measurement 1's covariance matrix; I(X), Fisher information matrix; g, confidence region; L, lower ... The information in this chapter discusses the motivation and background of the geolocation algorithm and the scope of the applications for this research. The ... algorithm is able to produce the best description of an object given the information from a set of measurements. Determining a position requires the use of a ...
Liu, Fengchen; Porco, Travis C; Amza, Abdou; Kadri, Boubacar; Nassirou, Baido; West, Sheila K; Bailey, Robin L; Keenan, Jeremy D; Solomon, Anthony W; Emerson, Paul M; Gambhir, Manoj; Lietman, Thomas M
2015-08-01
Trachoma programs rely on guidelines made in large part using expert opinion of what will happen with and without intervention. Large community-randomized trials offer an opportunity to actually compare forecasting methods in a masked fashion. The Program for the Rapid Elimination of Trachoma trials estimated longitudinal prevalence of ocular chlamydial infection from 24 communities treated annually with mass azithromycin. Given antibiotic coverage and biannual assessments from baseline through 30 months, forecasts of the prevalence of infection in each of the 24 communities at 36 months were made by three methods: the sum of 15 experts' opinion, statistical regression of the square-root-transformed prevalence, and a stochastic hidden Markov model of infection transmission (Susceptible-Infectious-Susceptible, or SIS model). All forecasters were masked to the 36-month results and to the other forecasts. Forecasts of the 24 communities were scored by the likelihood of the observed results and compared using Wilcoxon's signed-rank statistic. Regression and SIS hidden Markov models had significantly better likelihood than community expert opinion (p = 0.004 and p = 0.01, respectively). All forecasts scored better when perturbed to decrease Fisher's information. Each individual expert's forecast was poorer than the sum of experts. Regression and SIS models performed significantly better than expert opinion, although all forecasts were overly confident. Further model refinements may score better, although would need to be tested and compared in new masked studies. Construction of guidelines that rely on forecasting future prevalence could consider use of mathematical and statistical models. Clinicaltrials.gov NCT00792922.
Landscape-scale habitat selection by fishers translocated to the Olympic Peninsula of Washington
Lewis, Jeffrey C.; Jenkins, Kurt J.; Happe, Patricia J.; Manson, David J.; McCalmon, Marc
2016-01-01
The fisher was extirpated from much of the Pacific Northwestern United States during the mid- to late-1900s and is now proposed for federal listing as a threatened species in all or part of its west coast range. Following the translocation of 90 fishers from central British Columbia, Canada, to the Olympic Peninsula of Washington State from 2008 to 2010, we investigated the landscape-scale habitat selection of reintroduced fishers across a broad range of forest ages and disturbance histories, providing the first information on habitat relationships of newly reintroduced fishers in coastal coniferous forests in the Pacific Northwest. We developed 17 a priori models to evaluate several habitat-selection hypotheses based on premises of habitat models used to forecast habitat suitability for the reintroduced population. Further, we hypothesized that female fishers, because of their smaller body size than males, greater vulnerability to predation, and specific reproductive requirements, would be more selective than males for mid- to late-seral forest communities, where complex forest structural elements provide secure foraging, resting, and denning sites. We assessed 11 forest structure and landscape characteristics within the home range core-areas used by 19 females and 12 males and within randomly placed pseudo core areas that represented available habitats. We used case-controlled logistic regression to compare the characteristics of used and pseudo core areas and to assess selection by male and female fishers. Females were more selective of core area placement than males. Fifteen of 19 females (79%) and 5 of 12 males (42%) selected core areas within federal lands that encompassed primarily forests with an overstory of mid-sized or large trees. Male fishers exhibited only weak selection for core areas dominated by forests with an overstory of small trees, primarily on land managed for timber production or at high elevations. The amount of natural open area best distinguished the use of core areas between males and females, with females using substantially less natural open area than males. Although sex-specific selection has been suspected for fishers, we identified factors that distinguish the selection of core areas by females from those of males, information which will be valuable to managers planning reintroductions or providing suitable habitat to promote fisher recovery in the Pacific Northwest.
Constraints on inflation with LSS surveys: features in the primordial power spectrum
NASA Astrophysics Data System (ADS)
Palma, Gonzalo A.; Sapone, Domenico; Sypsas, Spyros
2018-06-01
We analyse the efficiency of future large scale structure surveys to unveil the presence of scale dependent features in the primordial spectrum—resulting from cosmic inflation—imprinted in the distribution of galaxies. Features may appear as a consequence of non-trivial dynamics during cosmic inflation, in which one or more background quantities experienced small but rapid deviations from their characteristic slow-roll evolution. We consider two families of features: localised features and oscillatory extended features. To characterise them we employ various possible templates parametrising their scale dependence and provide forecasts on the constraints on these parametrisations for LSST like surveys. We perform a Fisher matrix analysis for three observables: cosmic microwave background (CMB), galaxy clustering and weak lensing. We find that the combined data set of these observables will be able to limit the presence of features down to levels that are more restrictive than current constraints coming from CMB observations only. In particular, we address the possibility of gaining information on currently known deviations from scale invariance inferred from CMB data, such as the feature appearing at the l ~ 20 multipole (which is the main contribution to the low-l deficit) and another one around l ~ 800.
Optimal weighting in fNL constraints from large scale structure in an idealised case
NASA Astrophysics Data System (ADS)
Slosar, Anže
2009-03-01
We consider the problem of optimal weighting of tracers of structure for the purpose of constraining the non-Gaussianity parameter fNL. We work within the Fisher matrix formalism expanded around fiducial model with fNL = 0 and make several simplifying assumptions. By slicing a general sample into infinitely many samples with different biases, we derive the analytic expression for the relevant Fisher matrix element. We next consider weighting schemes that construct two effective samples from a single sample of tracers with a continuously varying bias. We show that a particularly simple ansatz for weighting functions can recover all information about fNL in the initial sample that is recoverable using a given bias observable and that simple division into two equal samples is considerably suboptimal when sampling of modes is good, but only marginally suboptimal in the limit where Poisson errors dominate.
Ability of matrix models to explain the past and predict the future of plant populations.
McEachern, Kathryn; Crone, Elizabeth E.; Ellis, Martha M.; Morris, William F.; Stanley, Amanda; Bell, Timothy; Bierzychudek, Paulette; Ehrlen, Johan; Kaye, Thomas N.; Knight, Tiffany M.; Lesica, Peter; Oostermeijer, Gerard; Quintana-Ascencio, Pedro F.; Ticktin, Tamara; Valverde, Teresa; Williams, Jennifer I.; Doak, Daniel F.; Ganesan, Rengaian; Thorpe, Andrea S.; Menges, Eric S.
2013-01-01
Uncertainty associated with ecological forecasts has long been recognized, but forecast accuracy is rarely quantified. We evaluated how well data on 82 populations of 20 species of plants spanning 3 continents explained and predicted plant population dynamics. We parameterized stage-based matrix models with demographic data from individually marked plants and determined how well these models forecast population sizes observed at least 5 years into the future. Simple demographic models forecasted population dynamics poorly; only 40% of observed population sizes fell within our forecasts' 95% confidence limits. However, these models explained population dynamics during the years in which data were collected; observed changes in population size during the data-collection period were strongly positively correlated with population growth rate. Thus, these models are at least a sound way to quantify population status. Poor forecasts were not associated with the number of individual plants or years of data. We tested whether vital rates were density dependent and found both positive and negative density dependence. However, density dependence was not associated with forecast error. Forecast error was significantly associated with environmental differences between the data collection and forecast periods. To forecast population fates, more detailed models, such as those that project how environments are likely to change and how these changes will affect population dynamics, may be needed. Such detailed models are not always feasible. Thus, it may be wiser to make risk-averse decisions than to expect precise forecasts from models.
Schrempf, Dominik; Hobolth, Asger
2017-04-01
Recently, Burden and Tang (2016) provided an analytical expression for the stationary distribution of the multivariate neutral Wright-Fisher model with low mutation rates. In this paper we present a simple, alternative derivation that illustrates the approximation. Our proof is based on the discrete multivariate boundary mutation model which has three key ingredients. First, the decoupled Moran model is used to describe genetic drift. Second, low mutation rates are assumed by limiting mutations to monomorphic states. Third, the mutation rate matrix is separated into a time-reversible part and a flux part, as suggested by Burden and Tang (2016). An application of our result to data from several great apes reveals that the assumption of stationarity may be inadequate or that other evolutionary forces like selection or biased gene conversion are acting. Furthermore we find that the model with a reversible mutation rate matrix provides a reasonably good fit to the data compared to the one with a non-reversible mutation rate matrix. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
Combining Correlation Matrices: Simulation Analysis of Improved Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2007-01-01
The originally proposed multivariate meta-analysis approach for correlation matrices--analyze Pearson correlations, with each study's observed correlations replacing their population counterparts in its conditional-covariance matrix--performs poorly. Two refinements are considered: Analyze Fisher Z-transformed correlations, and substitute better…
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreido, Vsevolod
2017-04-01
Ensemble hydrological forecasting describes the uncertainty caused by the variability of meteorological conditions in the river basin over the forecast lead time. At the same time, in snowmelt-dependent river basins another significant source of uncertainty relates to the variability of the initial conditions of the basin (snow water equivalent, soil moisture content, etc.) prior to the forecast issue. Accurate long-term hydrological forecasts are most crucial for large water management systems, such as the Cheboksary reservoir (catchment area 374,000 sq. km) located on the Middle Volga river in Russia. Accurate forecasts of water inflow volume, maximum discharge and other flow characteristics are of great value for this basin, especially before the beginning of the spring freshet season, which lasts here from April to June. The semi-distributed hydrological model ECOMAG was used to develop long-term ensemble forecasts of daily water inflow into the Cheboksary reservoir. To describe the variability of the meteorological conditions and to construct an ensemble of possible weather scenarios for the forecast lead time, two approaches were applied. The first utilizes 50 weather scenarios observed in previous years (similar to the ensemble streamflow prediction (ESP) procedure); the second uses 1000 synthetic scenarios simulated by a stochastic weather generator. We investigated the evolution of forecast uncertainty reduction, expressed as forecast efficiency, over consecutive forecast issue dates and lead times. We analysed the Nash-Sutcliffe efficiency of inflow hindcasts for the period 1982 to 2016, issued from 1 March at 15-day intervals for lead times of 1 to 6 months. This resulted in a forecast-efficiency matrix of issue date versus lead time that allows the predictability of the basin to be identified. The matrix was constructed separately for the observed and synthetic weather ensembles.
Can agent based models effectively reduce fisheries management implementation uncertainty?
NASA Astrophysics Data System (ADS)
Drexler, M.
2016-02-01
Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
Measuring the Microlensing Parallax from Various Space Observatories
NASA Astrophysics Data System (ADS)
Bachelet, E.; Hinse, T. C.; Street, R.
2018-05-01
A few observational methods allow the measurement of the mass and distance of the lens star for a microlensing event. A first estimate can be obtained by measuring the microlensing parallax effect produced either by the motion of the Earth (annual parallax) or by the contemporaneous observation of the lensing event from two (or more) observatories (space or terrestrial parallax) sufficiently separated from each other. Further developing ideas originally outlined by Gould as well as Mogavero & Beaulieu, we review the possibility of systematically measuring the microlensing parallax using a telescope based on the lunar surface and other space-based observing platforms, including the upcoming WFIRST space telescope. We first generalize the Fisher matrix formulation and present results demonstrating the advantage for each observing scenario. We conclude by outlining the limitations of the Fisher matrix analysis when subjected to a practical data modeling process. By considering a lunar-based parallax observation, we find that parameter correlations introduce a significant loss in detection efficiency of the probed lunar parallax effect.
CMB bispectrum, trispectrum, non-Gaussianity, and the Cramer-Rao bound
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamionkowski, Marc; Smith, Tristan L.; Heavens, Alan
Minimum-variance estimators for the parameter f{sub nl} that quantifies local-model non-Gaussianity can be constructed from the cosmic microwave background (CMB) bispectrum (three-point function) and also from the trispectrum (four-point function). Some have suggested that a comparison between the estimates of f{sub nl} from the bispectrum and trispectrum allows a consistency test for the model. But others argue that the saturation of the Cramér-Rao bound--which gives a lower limit to the variance of an estimator--by the bispectrum estimator implies that no further information on f{sub nl} can be obtained from the trispectrum. Here, we elaborate the nature of the correlation between the bispectrum and trispectrum estimators for f{sub nl}. We show that the two estimators become statistically independent in the limit of a large number of CMB pixels, and thus that the trispectrum estimator does indeed provide additional information on f{sub nl} beyond that obtained from the bispectrum. We explain how this conclusion is consistent with the Cramér-Rao bound. Our discussion of the Cramér-Rao bound may be of interest to those doing Fisher-matrix parameter-estimation forecasts or data analysis in other areas of physics as well.
Optimized detection of shear peaks in weak lensing maps
NASA Astrophysics Data System (ADS)
Marian, Laura; Smith, Robert E.; Hilbert, Stefan; Schneider, Peter
2012-06-01
We present a new method to extract cosmological constraints from weak lensing (WL) peak counts, which we denote as ‘the hierarchical algorithm’. The idea of this method is to combine information from WL maps sequentially smoothed with a series of filters of different size, from the largest down to the smallest, thus increasing the cosmological sensitivity of the resulting peak function. We compare the cosmological constraints resulting from the peak abundance measured in this way and the abundance obtained by using a filter of fixed size, which is the standard practice in WL peak studies. For this purpose, we employ a large set of WL maps generated by ray tracing through N-body simulations, and the Fisher matrix formalism. We find that if low signal-to-noise ratio (?) peaks are included in the analysis (?), the hierarchical method yields constraints significantly better than the single-sized filtering. For a large future survey such as Euclid or the Large Synoptic Survey Telescope, combined with information from a cosmic microwave background experiment like Planck, the results for the hierarchical (single-sized) method are Δns = 0.0039 (0.004), ΔΩm = 0.002 (0.0045), Δσ8 = 0.003 (0.006) and Δw = 0.019 (0.0525). This forecast is conservative, as we assume no knowledge of the redshifts of the lenses, and consider a single broad bin for the redshifts of the sources. If only peaks with ? are considered, then there is little difference between the results of the two methods. We also examine the statistical properties of the hierarchical peak function: its covariance matrix has off-diagonal terms for bins with ? and aperture mass of M < 3 × 10^14 h^-1 M⊙, the higher bins being largely uncorrelated and therefore well described by a Poisson distribution.
Socio-Political Forecasting: Who Needs It?
ERIC Educational Resources Information Center
Burnett, D. Jack
1978-01-01
Socio-political forecasting, a new dimension to university planning that can provide universities time to prepare for the impact of social and political changes, is examined. The four elements in the process are scenarios of the future, the probability/diffusion matrix, the profile of significant value-system changes, and integration and…
Meta-Analytic Structural Equation Modeling: A Two-Stage Approach
ERIC Educational Resources Information Center
Cheung, Mike W. L.; Chan, Wai
2005-01-01
To synthesize studies that use structural equation modeling (SEM), researchers usually use Pearson correlations (univariate r), Fisher z scores (univariate z), or generalized least squares (GLS) to combine the correlation matrices. The pooled correlation matrix is then analyzed by the use of SEM. Questionable inferences may occur for these ad hoc…
Sensor management in RADAR/IRST track fusion
NASA Astrophysics Data System (ADS)
Hu, Shi-qiang; Jing, Zhong-liang
2004-07-01
In this paper, a novel radar management strategy suitable for RADAR/IRST track fusion, based on the Fisher Information Matrix (FIM) and a fuzzy stochastic decision approach, is put forward. First, an optimal schedule of radar measurements is obtained by maximizing the determinant of the Fisher information matrix of the radar and IRST measurements, managed by an expert system. Then, a "pseudo sensor" is introduced to predict possible target positions with a polynomial method based on the radar and IRST measurements; the "pseudo sensor" model estimates the target position even when the radar is turned off. Finally, based on the tracking performance and the state of target maneuver, fuzzy stochastic decision is used to adjust the optimal radar schedule and to retune the parameters of the "pseudo sensor" model. The experimental results indicate that the algorithm not only limits radar activity effectively but also maintains the tracking accuracy of the active/passive system, eliminating the drawback of traditional radar management methods in which radar activity is fixed and not easy to control or protect.
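A minimal sketch of the determinant-maximization idea (a D-optimal scheduling criterion) is given below. The greedy selection, the two-dimensional state and the candidate information matrices are illustrative assumptions; the sketch does not reproduce the paper's expert system, fuzzy decision logic or "pseudo sensor".

```python
import numpy as np

def greedy_schedule(candidate_fims, prior_fim, budget):
    """Greedily pick measurements whose addition maximizes det(FIM),
    a common D-optimal scheduling criterion. Only the information-
    maximization step of the abstract is sketched here."""
    total = prior_fim.copy()
    chosen = []
    remaining = list(range(len(candidate_fims)))
    for _ in range(budget):
        best = max(remaining,
                   key=lambda i: np.linalg.det(total + candidate_fims[i]))
        total += candidate_fims[best]
        chosen.append(best)
        remaining.remove(best)
    return chosen, total

rng = np.random.default_rng(1)
# Synthetic per-measurement Fisher information matrices (radar- and IRST-like).
candidates = [np.diag(rng.uniform(0.1, 2.0, size=2)) for _ in range(6)]
prior = 0.1 * np.eye(2)
schedule, fim = greedy_schedule(candidates, prior, budget=3)
```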
Comparative test on several forms of background error covariance in 3DVar
NASA Astrophysics Data System (ADS)
Shao, Aimei
2013-04-01
The background error covariance matrix (hereinafter referred to as the B matrix) plays an important role in the three-dimensional variational (3DVar) data assimilation method. However, it is difficult to obtain the B matrix accurately because the true atmospheric state is unknown. Therefore, several methods have been developed to estimate it (e.g. the NMC method, the innovation analysis method, recursive filters, and ensemble methods such as the EnKF). Prior to further development and application of these methods, the behaviour in 3DVar of the B matrices they produce is worth studying and evaluating. For this reason, NCEP reanalysis and forecast data are used to test the effectiveness of several B matrices with the VAF method (Huang, 1999). Here the NCEP analysis is treated as the truth, so the forecast error is known. Data from 2006 to 2007 are used as the samples to estimate the B matrix, and data from 2008 are used to verify the assimilation effects. The 48-h and 24-h forecasts valid at the same time are used to estimate the B matrix with the NMC method. The B matrix can be represented by a correlation part (a non-diagonal matrix) and a variance part (a diagonal matrix of variances). In numerous 3DVar systems a Gaussian filter function is used as an approximation for the variation of the correlation coefficients with distance. On the basis of this assumption, the following forms of the B matrix are designed and tested with VAF in comparative experiments: (1) the error variance and the characteristic lengths are fixed and set to their mean values averaged over the analysis domain; (2) as in (1), but the mean characteristic lengths are reduced to 50 percent of the original for height and to 60 percent for temperature; (3) as in (2), but the error variance, calculated directly from the historical data, is space-dependent; (4) the error variance and the characteristic lengths are all calculated directly from the historical data; (5) the B matrix is estimated directly from the historical data; (6) as in (5), but with a localization step; (7) the B matrix is estimated by the NMC method but the error variance is reduced by a factor of 1.7 so that it is close to the value calculated from the true forecast error samples; (8) as in (7), but with the localization of (6). Experimental results with the different B matrices show that, for the Gaussian-type B matrix, the characteristic lengths calculated from the true error samples do not yield good analyses, whereas reduced characteristic lengths (about half of the original) do. If the B matrix estimated directly from the historical data is used in 3DVar, the assimilation does not reach its best performance; better results are obtained when the reduced characteristic length and localization are applied. Even so, this offers no obvious advantage over a Gaussian-type B matrix with the optimal characteristic length. This implies that the Gaussian-type B matrix, widely used in operational 3DVar systems, can produce a good analysis with appropriate characteristic lengths; the crucial problem is how to determine them. (This work is supported by the National Natural Science Foundation of China (41275102, 40875063) and the Fundamental Research Funds for the Central Universities (lzujbky-2010-9).)
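For illustration, a Gaussian-type B matrix of the kind compared above can be built as the product of a diagonal standard-deviation matrix and a Gaussian correlation matrix; the grid, error variances and characteristic lengths below are invented placeholders, not values estimated from the NCEP data.

```python
import numpy as np

def gaussian_b_matrix(grid, error_std, length_scale):
    """Gaussian-type background error covariance B = D^{1/2} C D^{1/2},
    where C has correlations exp(-d^2 / (2 L^2)) and D holds the error
    variances. Grid spacing, variances and L are illustrative only."""
    d = np.abs(np.subtract.outer(grid, grid))      # pairwise distances
    C = np.exp(-d**2 / (2.0 * length_scale**2))    # Gaussian correlation part
    D_sqrt = np.diag(error_std)                    # standard-deviation part
    return D_sqrt @ C @ D_sqrt

grid = np.linspace(0.0, 1000.0, 50)                # 1-D grid in km
error_std = np.full(grid.size, 1.5)                # assumed background error std dev
B_full = gaussian_b_matrix(grid, error_std, length_scale=300.0)
B_reduced = gaussian_b_matrix(grid, error_std, length_scale=150.0)  # ~50% length
```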
Low-dimensional Representation of Error Covariance
NASA Technical Reports Server (NTRS)
Tippett, Michael K.; Cohn, Stephen E.; Todling, Ricardo; Marchesin, Dan
2000-01-01
Ensemble and reduced-rank approaches to prediction and assimilation rely on low-dimensional approximations of the estimation error covariances. Here stability properties of the forecast/analysis cycle for linear, time-independent systems are used to identify factors that cause the steady-state analysis error covariance to admit a low-dimensional representation. A useful measure of forecast/analysis cycle stability is the bound matrix, a function of the dynamics, observation operator and assimilation method. Upper and lower estimates for the steady-state analysis error covariance matrix eigenvalues are derived from the bound matrix. The estimates generalize to time-dependent systems. If much of the steady-state analysis error variance is due to a few dominant modes, the leading eigenvectors of the bound matrix approximate those of the steady-state analysis error covariance matrix. The analytical results are illustrated in two numerical examples where the Kalman filter is carried to steady state. The first example uses the dynamics of a generalized advection equation exhibiting nonmodal transient growth. Failure to observe growing modes leads to increased steady-state analysis error variances. Leading eigenvectors of the steady-state analysis error covariance matrix are well approximated by leading eigenvectors of the bound matrix. The second example uses the dynamics of a damped baroclinic wave model. The leading eigenvectors of a lowest-order approximation of the bound matrix are shown to approximate well the leading eigenvectors of the steady-state analysis error covariance matrix.
Chao, Jerry; Ward, E. Sally; Ober, Raimund J.
2012-01-01
The high quantum efficiency of the charge-coupled device (CCD) has rendered it the imaging technology of choice in diverse applications. However, under extremely low light conditions where few photons are detected from the imaged object, the CCD becomes unsuitable as its readout noise can easily overwhelm the weak signal. An intended solution to this problem is the electron-multiplying charge-coupled device (EMCCD), which stochastically amplifies the acquired signal to drown out the readout noise. Here, we develop the theory for calculating the Fisher information content of the amplified signal, which is modeled as the output of a branching process. Specifically, Fisher information expressions are obtained for a general and a geometric model of amplification, as well as for two approximations of the amplified signal. All expressions pertain to the important scenario of a Poisson-distributed initial signal, which is characteristic of physical processes such as photon detection. To facilitate the investigation of different data models, a “noise coefficient” is introduced which allows the analysis and comparison of Fisher information via a scalar quantity. We apply our results to the problem of estimating the location of a point source from its image, as observed through an optical microscope and detected by an EMCCD. PMID:23049166
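As a rough illustration of why amplification degrades information, the sketch below compares the Fisher information of an ideal Poisson photon count with a Gaussian approximation in which electron multiplication inflates the variance by an excess-noise factor (a value near 2 is often quoted for high-gain EM registers). This simple ratio is only in the spirit of the paper's noise coefficient and does not reproduce its branching-process models.

```python
def poisson_fisher(nu, dnu_dtheta):
    """Fisher information for a Poisson-distributed photon count with
    parameter-dependent mean nu(theta): I = (dnu/dtheta)^2 / nu."""
    return dnu_dtheta**2 / nu

def em_approx_fisher(nu, dnu_dtheta, excess_noise=2.0):
    """Rough Gaussian approximation in which electron multiplication inflates
    the output variance by an excess-noise factor; this is an illustrative
    simplification, not the amplification models analyzed in the paper."""
    return dnu_dtheta**2 / (excess_noise * nu)

theta = 1.0
nu = 5.0 * theta        # toy mean-count model, nu(theta) = 5 * theta
dnu = 5.0               # d(nu)/d(theta) for the toy model
I_ideal = poisson_fisher(nu, dnu)
I_em = em_approx_fisher(nu, dnu)
noise_coefficient = I_em / I_ideal   # scalar comparison in the spirit of the abstract
```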
NASA Astrophysics Data System (ADS)
Pillepich, Annalisa; Porciani, Cristiano; Reiprich, Thomas H.
2012-05-01
Starting in late 2013, the eRosita telescope will survey the X-ray sky with unprecedented sensitivity. Assuming a detection limit of 50 photons in the (0.5-2.0) keV energy band with a typical exposure time of 1.6 ks, we predict that eRosita will detect ~9.3 × 10^4 clusters of galaxies more massive than 5 × 10^13 h^-1 M⊙ with the currently planned all-sky survey. Their median redshift will be z ≃ 0.35. We perform a Fisher-matrix analysis to forecast the constraining power of eRosita on the Λ cold dark matter (ΛCDM) cosmology and, simultaneously, on the X-ray scaling relations for galaxy clusters. Special attention is devoted to the possibility of detecting primordial non-Gaussianity. We consider two experimental probes: the number counts and the angular clustering of a photon-count limited sample of clusters. We discuss how the cluster sample should be split to optimize the analysis and we show that redshift information on the individual clusters is vital to break the strong degeneracies among the model parameters. For example, performing a 'tomographic' analysis based on photometric-redshift estimates and combining one- and two-point statistics will give marginal 1σ errors of Δσ8 ≃ 0.036 and ΔΩm ≃ 0.012 without priors, and improve the current estimates of the slope of the luminosity-mass relation by a factor of 3. Regarding primordial non-Gaussianity, eRosita clusters alone will give ΔfNL ≃ 9, 36 and 144 for the local, orthogonal and equilateral model, respectively. Measuring redshifts with spectroscopic accuracy would further tighten the constraints by nearly 40 per cent (barring fNL, which displays smaller improvements). Finally, combining eRosita data with the analysis of temperature anisotropies in the cosmic microwave background by the Planck satellite should give sensational constraints on both the cosmology and the properties of the intracluster medium.
Cosmological study with galaxy clusters detected by the Sunyaev-Zel'dovich effect
NASA Astrophysics Data System (ADS)
Mak, Suet-Ying
In this work, we present various studies to forecast the power of galaxy clusters detected by the Sunyaev-Zel'dovich (SZ) effect in constraining cosmological models. The SZ effect is regarded as one of the new and promising techniques to identify and study cluster physics. With the latest data released in recent years from SZ telescopes, it is essential to explore their potential in providing cosmological information and to investigate their relative strengths with respect to galaxy cluster data from X-ray and optical surveys, as well as other cosmological probes such as the Cosmic Microwave Background (CMB). One of the topics regards resolving the debate on the existence of an anomalous large-scale bulk flow as measured from the kinetic SZ signal of galaxy clusters in the WMAP CMB data. We predict that if such a measurement is made with the latest CMB data from the Planck satellite, the sensitivity will be improved by a factor of >5 and thus be able to provide an independent view of its existence. As it turns out, the Planck data, when analysed with the technique developed in this work, find that the observed bulk flow amplitude is consistent with that expected from ΛCDM, in clear contradiction to the previous claim of a significant bulk flow detection in the WMAP data. We also forecast the capability of ongoing and future cluster surveys identified through the thermal SZ (tSZ) effect in constraining three extensions to the ΛCDM model: the modified gravity f(R) model, primordial non-Gaussianity of the density perturbations, and the presence of massive neutrinos. We do so by employing their effects on the cluster number count and power spectrum and using a Fisher matrix analysis to estimate the errors on the model parameters. We find that SZ cluster surveys can provide vital complementary information to that expected from non-cluster probes. Our results therefore give confidence for pursuing these extended cosmological models with SZ clusters.
Feature Extraction of Electronic Nose Signals Using QPSO-Based Multiple KFDA Signal Processing
Wen, Tailai; Huang, Daoyu; Lu, Kun; Deng, Changjian; Zeng, Tanyue; Yu, Song; He, Zhiyi
2018-01-01
The aim of this research was to enhance the classification accuracy of an electronic nose (E-nose) in different detecting applications. During the learning process of the E-nose to predict the types of different odors, the prediction accuracy was not quite satisfying because the raw features extracted from sensors’ responses were regarded as the input of a classifier without any feature extraction processing. Therefore, in order to obtain more useful information and improve the E-nose’s classification accuracy, in this paper, a Weighted Kernels Fisher Discriminant Analysis (WKFDA) combined with Quantum-behaved Particle Swarm Optimization (QPSO), i.e., QWKFDA, was presented to reprocess the original feature matrix. In addition, we have also compared the proposed method with quite a few previously existing ones including Principal Component Analysis (PCA), Locality Preserving Projections (LPP), Fisher Discriminant Analysis (FDA) and Kernels Fisher Discriminant Analysis (KFDA). Experimental results proved that QWKFDA is an effective feature extraction method for E-nose in predicting the types of wound infection and inflammable gases, which shared much higher classification accuracy than those of the contrast methods. PMID:29382146
SIGAR Special Inspector General for Afghanistan Reconstruction
2017-10-30
Daniel Fisher, Economic and Social Development Subject Matter Expert; Emmett Schneider, Funding Subject Matter Expert; Clark Irwin, Lead Writer/Editor ... governance, economic development, peace and reconciliation, and security issues. Each working group has a matrix of benchmarks—subject to change—to chart ... the U.S. efforts to build the Afghan security forces, improve governance, facilitate economic and social development, and combat the sale and ...
Teacher Education: The Application of Fisher's LSD Matrix in the Evaluation of Preservice Teachers.
ERIC Educational Resources Information Center
Stolworthy, Reed L.
The degrees of variance among three groups of evaluators relative to their assessments of the teaching competencies of preservice teacher education students were studied. Subjects included groups of 23 and 32 undergraduates who were certified to teach by the teacher preparation program at Washburn University in Topeka (Kansas) in 1987 and in 1988,…
Xu, Dong; Yan, Shuicheng; Tao, Dacheng; Lin, Stephen; Zhang, Hong-Jiang
2007-11-01
Dimensionality reduction algorithms, which aim to select a small set of efficient and discriminant features, have attracted great attention for human gait recognition and content-based image retrieval (CBIR). In this paper, we present extensions of our recently proposed marginal Fisher analysis (MFA) to address these problems. For human gait recognition, we first present a direct application of MFA, then inspired by recent advances in matrix and tensor-based dimensionality reduction algorithms, we present matrix-based MFA for directly handling 2-D input in the form of gray-level averaged images. For CBIR, we deal with the relevance feedback problem by extending MFA to marginal biased analysis, in which within-class compactness is characterized only by the distances between each positive sample and its neighboring positive samples. In addition, we present a new technique to acquire a direct optimal solution for MFA without resorting to objective function modification as done in many previous algorithms. We conduct comprehensive experiments on the USF HumanID gait database and the Corel image retrieval database. Experimental results demonstrate that MFA and its extensions outperform related algorithms in both applications.
Empirical seasonal forecasts of the NAO
NASA Astrophysics Data System (ADS)
Sanchezgomez, E.; Ortizbevia, M.
2003-04-01
We present here seasonal forecasts of the North Atlantic Oscillation (NAO) issued from ocean predictors with an empirical procedure. The singular value decomposition (SVD) of the cross-correlation matrix between the predictor and predictand fields, at the lag used for the forecast lead, is at the core of the empirical model. The main predictor field is sea surface temperature anomalies, although sea ice cover anomalies are also used. Forecasts are issued in probabilistic form. The model is an improvement over a previous version (1), in which sea level pressure anomalies were first forecast and the NAO index was then built from the forecast field. Both the correlation skill between the forecast and observed fields and the number of forecasts that hit the correct NAO sign are used to assess the forecast performance; the values are usually above those found for forecasts issued assuming persistence. For certain seasons and/or leads, the skill is above the 0.7 usefulness threshold. References: (1) SanchezGomez, E. and Ortiz Bevia, M., 2002, Estimacion de la evolucion pluviometrica de la Espana Seca atendiendo a diversos pronosticos empiricos de la NAO, in 'El Agua y el Clima', Publicaciones de la AEC, Serie A, N 3, pp 63-73, Palma de Mallorca, Spain.
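A minimal sketch of the SVD step at the core of such empirical models, applied to synthetic anomaly fields standing in for the SST (predictor) and predictand data; building the actual probabilistic NAO forecast from the expansion coefficients is omitted.

```python
import numpy as np

def mca_forecast_modes(predictor, predictand, n_modes=3):
    """SVD of the cross-covariance matrix between a predictor field (e.g. SST
    anomalies) and a predictand field at the chosen lag, as in maximum
    covariance analysis. Returns the leading coupled patterns; the fields
    here are synthetic stand-ins for the anomalies used in the abstract."""
    X = predictor - predictor.mean(axis=0)     # (time, space_x) anomalies
    Y = predictand - predictand.mean(axis=0)   # (time, space_y) anomalies
    C = X.T @ Y / (X.shape[0] - 1)             # cross-covariance matrix
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return U[:, :n_modes], s[:n_modes], Vt[:n_modes].T

rng = np.random.default_rng(2)
sst = rng.normal(size=(40, 100))   # 40 seasons x 100 SST grid points (synthetic)
slp = rng.normal(size=(40, 60))    # 40 seasons x 60 predictand grid points (synthetic)
P, s, Q = mca_forecast_modes(sst, slp)
# A new SST anomaly map would be projected onto P to obtain predictor expansion
# coefficients, from which the predictand (and the NAO index) is regressed.
```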
Methods of Technological Forecasting,
1977-05-01
Trend Extrapolation; Progress Curve; Analogy; Trend Correlation; Substitution Analysis or Substitution Growth Curves; Envelope Curve; Advances in the State of the Art; Technological Mapping; Contextual Mapping; Matrix Input-Output Analysis; Mathematical Models; Simulation Models; Dynamic Modelling. CHAPTER IV ... Generation; Interaction between Needs and Possibilities; Map of the Technological Future; Cross-Impact Matrix; Discovery Matrix; Morphological Analysis.
NASA Astrophysics Data System (ADS)
Mishra, Abhilash; Hirata, Christopher M.
2018-05-01
In the first paper of this series, we showed that the CMB quadrupole at high redshifts results in a small circular polarization of the emitted 21 cm radiation. In this paper we forecast the sensitivity of future radio experiments to measure the CMB quadrupole during the era of first cosmic light (z ~ 20). The tomographic measurement of 21 cm circular polarization allows us to construct a 3D remote quadrupole field. Measuring the B-mode component of this remote quadrupole field can be used to put bounds on the tensor-to-scalar ratio r. We make Fisher forecasts for a future Fast Fourier Transform Telescope (FFTT), consisting of an array of dipole antennas in a compact grid configuration, as a function of array size and observation time. We find that an FFTT with a side length of 100 km can achieve σ(r) ~ 4 × 10^-3 after ten years of observation and with a sky coverage f_sky ~ 0.7. The forecasts depend on the evolution of the Lyman-α flux in the pre-reionization era, which remains observationally unconstrained. Finally, we calculate the typical orders of magnitude of circular polarization foregrounds and comment on their mitigation strategies. We conclude that detection of primordial gravitational waves with 21 cm observations is in principle possible, so long as the primordial magnetic field amplitude is small, but would require a very futuristic experiment with corresponding advances in calibration and foreground suppression techniques.
Accelerated Tumor Cell Death by Angiogenic Modifiers
2004-08-01
factors; extracellular matrix; 3-D cell culture; cancer metastasis. Running title: Tumor-Stroma Interaction. Abbreviations: BSP, bone sialoprotein; ECM, ... such as osteocalcin (OC), bone sialoprotein (BSP), osteopontin (OPN), osteonectin (ON or SPARC), osteoprotegerin (OPG), PTHrP, M-CSF, RANK and ... Waltregny, D., Bellahcene, A., Van Riet, I., Fisher, L. W., Young, M., Fernandez, P., et al. Prognostic value of bone sialoprotein expression in ...
A General Exponential Framework for Dimensionality Reduction.
Wang, Su-Jing; Yan, Shuicheng; Yang, Jian; Zhou, Chun-Guang; Fu, Xiaolan
2014-02-01
As a general framework, Laplacian embedding, based on a pairwise similarity matrix, infers low-dimensional representations from high-dimensional data. However, it generally suffers from three issues: 1) algorithmic performance is sensitive to the neighborhood size; 2) the algorithm encounters the well-known small sample size (SSS) problem; and 3) the algorithm de-emphasizes small-distance pairs. To address these issues, we propose exponential embedding using the matrix exponential and provide a general framework for dimensionality reduction. In this framework, the matrix exponential can be roughly interpreted as a random walk over the feature similarity matrix, and thus is more robust. The positive definite property of the matrix exponential deals with the SSS problem. The decay behavior of exponential embedding is more effective at emphasizing small-distance pairs. Under this framework, we apply the matrix exponential to extend many popular Laplacian embedding algorithms, e.g., locality preserving projections, unsupervised discriminant projections, and marginal Fisher analysis. Experiments conducted on synthesized data, UCI data sets, and the Georgia Tech face database show that the proposed framework can well address the issues mentioned above.
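A minimal sketch of the matrix-exponential idea (not the authors' specific extensions of LPP, UDP or marginal Fisher analysis): replace the similarity matrix by its exponential, which is always positive definite, and embed with its leading eigenvectors.

```python
import numpy as np
from scipy.linalg import expm

def exponential_embedding(S, n_components=2):
    """Exponential embedding sketch for a symmetric pairwise similarity
    matrix S (n x n): the matrix exponential expm(S) is positive definite,
    which sidesteps the small-sample-size singularity."""
    E = expm(S)                                   # matrix exponential of the similarity matrix
    w, V = np.linalg.eigh(E)                      # eigen-decomposition (E is symmetric)
    idx = np.argsort(w)[::-1][:n_components]      # keep the largest eigenvalues
    return V[:, idx] * np.sqrt(w[idx])            # low-dimensional coordinates
```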
NASA Astrophysics Data System (ADS)
Kozel, Tomas; Stary, Milos
2017-12-01
The main advantage of stochastic forecasting is the fan of possible values that a deterministic forecast cannot provide; the future development of a random process is described better by stochastic than by deterministic forecasting, and discharge in a measurement profile can be treated as a random process. This article describes the construction and application of a forecasting model for a managed large open water reservoir with a supply function. The model is based on neural networks (NN) and zone models, which forecast values of average monthly flow from input values of average monthly flow, the learned neural network and random numbers. Part of the data is sorted into one moving zone, created around the last measured average monthly flow, and the correlation matrix is assembled only from data belonging to that zone. The model was compiled for forecasts of 1 to 12 months, using 2 to 11 backward monthly flows (NN inputs) for model construction. The data were freed of asymmetry with the Box-Cox rule (Box and Cox, 1964), with the value r found by optimization, and were then transformed to a standard normal distribution. The data have a monthly step and the forecast is not recurring. A 90-year real flow series was used to compile the model: the first 75 years were used for calibration (the input-output relationship matrix) and the last 15 years only for validation. Model outputs were compared with the real flow series. For the comparison between the real flow series (100% successful forecast) and the forecasts, both were applied to the management of an artificial reservoir: the course of reservoir management using a genetic algorithm (GE) with the real flow series was compared with a fuzzy model (Fuzzy) driven by forecasts from the moving-zone model. During the evaluation the best zone size was sought; the results show that the highest number of inputs did not give the best results and that the ideal zone size lies in the interval from 25 to 35, where the course of management was almost the same for all values in that interval. The resulting course of management was compared with the course obtained using GE with the real flow series. The comparison showed that the fuzzy model with forecasted values was able to manage the main malfunctions, and the artificial disorders introduced by the model were found to be essential once the values of water volume during management were evaluated. The forecasting model in combination with the fuzzy model provides very good results for the management of a water reservoir with a storage function and can be recommended for this purpose.
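The Box-Cox step mentioned above can be reproduced with standard SciPy calls; the sketch below uses a synthetic lognormal series as a stand-in for the monthly flows and is only meant to show the transform, standardisation and back-transform.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

flows = np.random.lognormal(mean=3.0, sigma=0.6, size=900)   # stand-in for 75 years of monthly flows

# Remove asymmetry with the Box-Cox rule; the exponent (r in the abstract,
# lambda in SciPy) is chosen here by maximum likelihood rather than explicit optimization.
transformed, lam = boxcox(flows)

# Standardise to an approximately standard normal distribution.
mu, sd = transformed.mean(), transformed.std()
z = (transformed - mu) / sd

# Forecasts issued in z-space are mapped back to flows with the inverse transform.
flows_back = inv_boxcox(z * sd + mu, lam)
```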
Natural learning in NLDA networks.
González, Ana; Dorronsoro, José R
2007-07-01
Non Linear Discriminant Analysis (NLDA) networks combine a standard Multilayer Perceptron (MLP) transfer function with the minimization of a Fisher analysis criterion. In this work we define natural-like gradients for NLDA network training. Instead of a more principled approach, which would require the definition of an appropriate Riemannian structure on the NLDA weight space, we follow a simpler procedure based on the observation that the gradient of the NLDA criterion function J can be written as the expectation ∇J(W) = E[Z(X,W)] of a certain random vector Z, and then defining I = E[Z(X,W) Z(X,W)^T] as the Fisher information matrix in this case. This definition of I formally coincides with that of the information matrix for the MLP or other square error functions; the NLDA criterion J, however, does not have this structure. Although very simple, the proposed approach shows much faster convergence than standard gradient descent, even when its higher computational cost per iteration is taken into account. While the faster convergence of natural MLP batch training can also be explained in terms of its relationship with the Gauss-Newton minimization method, this is not the case for NLDA training, as we show analytically and numerically that the Hessian and information matrices are different.
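A schematic of the update this suggests, assuming the per-sample gradient vectors Z(X, W) are already available as rows of an array; this is only an illustration of the ∇J = E[Z], I = E[Z Z^T] construction, not the NLDA training code.

```python
import numpy as np

def natural_like_step(Z, w, lr=0.1, eps=1e-6):
    """One natural-like gradient step: estimate grad J = E[Z] and the
    information matrix I = E[Z Z^T] from per-sample gradients Z (rows),
    then move along I^{-1} grad J."""
    g = Z.mean(axis=0)                        # gradient estimate E[Z]
    I = (Z.T @ Z) / Z.shape[0]                # information-matrix estimate E[Z Z^T]
    I += eps * np.eye(I.shape[0])             # small ridge for numerical stability
    return w - lr * np.linalg.solve(I, g)
```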
Minimum Fisher regularization of image reconstruction for infrared imaging bolometer on HL-2A
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, J. M.; Liu, Y.; Li, W.
2013-09-15
An infrared imaging bolometer diagnostic has been developed recently for the HL-2A tokamak to measure the temporal and spatial distribution of plasma radiation. The three-dimensional tomography, reduced to a two-dimensional problem by the assumption of toroidal symmetry of the plasma radiation, has been performed. A three-dimensional geometry matrix is calculated with the one-dimensional pencil beam approximation. The solid angles viewed by the detector elements are taken into account in defining the chord brightness. The local plasma emission is then obtained by inverting the measured brightness with the minimum Fisher regularization method. A typical HL-2A plasma radiation model was chosen to optimize a regularization parameter based on the criterion of generalized cross validation. Finally, this method was applied to HL-2A experiments, demonstrating the plasma radiated power density distribution in limiter and divertor discharges.
Noisy metrology: a saturable lower bound on quantum Fisher information
NASA Astrophysics Data System (ADS)
Yousefjani, R.; Salimi, S.; Khorashad, A. S.
2017-06-01
In order to provide a guaranteed precision and a more accurate judgement about the true value of the Cramér-Rao bound and its scaling behavior, an upper bound on the precision of estimation (equivalently, a lower bound on the quantum Fisher information) is introduced. Unlike bounds previously introduced in the literature, this upper bound is saturable and yields a practical prescription for estimating the parameter by preparing the optimal initial state and optimal measurement. The bound is based on the underlying dynamics, and its calculation is straightforward, requiring only the matrix representation of the quantum maps responsible for encoding the parameter. This allows us to apply the bound to open quantum systems whose dynamics are described by either semigroup or non-semigroup maps. The reliability and efficiency of the method in predicting the ultimate precision limit are demonstrated by three main examples.
A Critical Look at the Cross Impact Matrix Method. A Research Report.
ERIC Educational Resources Information Center
Folk, Michael
This paper explains some of the problems with the Cross-Impact Matrix (CIM) and their importance to its application. The CIM is a research method designed to serve as a heuristic device to enhance a person's ability to think about the future and as an analytical device to be used by planners to help in actually forecasting future occurrences.…
NASA Astrophysics Data System (ADS)
Zhao, W.; Baskaran, D.; Grishchuk, L. P.
2009-10-01
The relic gravitational waves are the cleanest probe of the violent times in the very early history of the Universe. They are expected to leave signatures in the observed cosmic microwave background anisotropies. We significantly improved our previous analysis [W. Zhao, D. Baskaran, and L. P. Grishchuk, Phys. Rev. D 79, 023002 (2009), doi:10.1103/PhysRevD.79.023002] of the 5-year WMAP TT and TE data at lower multipoles ℓ. This more general analysis returned essentially the same maximum likelihood result (unfortunately, surrounded by large remaining uncertainties): the relic gravitational waves are present and they are responsible for approximately 20% of the temperature quadrupole. We identify and discuss the reasons why the contribution of gravitational waves can be overlooked in a data analysis. One reason is a misleading reliance on data from very high multipoles ℓ; another is a too narrow understanding of the problem as a search for B modes of polarization, rather than the detection of relic gravitational waves with the help of all correlation functions. Our analysis of WMAP5 data has led to the identification of a whole family of models characterized by relatively high values of the likelihood function. Using the Fisher matrix formalism we formulated forecasts for the Planck mission in the context of this family of models. We explore in detail various "optimistic," "pessimistic," and "dream case" scenarios. We show that in some circumstances the B-mode detection may be very inconclusive, at the level of signal-to-noise ratio S/N=1.75, whereas a smarter data analysis can reveal the same gravitational wave signal at S/N=6.48. The final result is encouraging. Even under unfavorable conditions in terms of instrumental noises and foregrounds, the relic gravitational waves, if they are characterized by the maximum likelihood parameters that we found from WMAP5 data, will be detected by Planck at the level S/N=3.65.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marney, Luke C.; Siegler, William C.; Parsons, Brendon A.
Two-dimensional (2D) gas chromatography coupled with time-of-flight mass spectrometry (GC × GC – TOFMS) is a highly capable instrumental platform that produces complex and information-rich multi-dimensional chemical data. The complex data can be overwhelming, especially when many samples (of various sample classes) are analyzed with multiple injections for each sample. Thus, the data must be analyzed in such a way as to extract the most meaningful information. The pixel-based and peak table-based algorithmic use of Fisher ratios has been used successfully in the past to reduce the multi-dimensional data down to those chemical compounds that are changing between classes relative to those that are not (i.e., chemical feature selection). We report on the initial development of a computationally fast novel tile-based Fisher-ratio software that addresses challenges due to 2D retention time misalignment without explicitly aligning the data, which is a problem for both pixel-based and peak table-based methods. Concurrently, the tile-based Fisher-ratio software maximizes the sensitivity contrast of true positives against a background of potential false positives and noise. To study this software, eight compounds, plus one internal standard, were spiked into diesel at various concentrations. The tile-based F-ratio software was able to discover all spiked analytes, within the complex diesel sample matrix with thousands of potential false positives, in each possible concentration comparison, even at the lowest absolute spiked analyte concentration ratio of 1.06.
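The two-class Fisher ratio at the heart of such feature selection is simple to state; the sketch below computes it for a single tile or feature and is not the authors' tile-based software.

```python
import numpy as np

def fisher_ratio(class_a, class_b):
    """Fisher ratio for one feature/tile: squared difference of class means
    over the summed within-class variances. High values flag signals that
    change between sample classes relative to those that do not."""
    num = (np.mean(class_a) - np.mean(class_b)) ** 2
    den = np.var(class_a, ddof=1) + np.var(class_b, ddof=1)
    return num / den

# Applied tile by tile across the 2D retention-time plane, the ratio ranks
# candidate analytes; the tiling itself (not shown) absorbs small misalignments.
```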
Evaluation of CMAQ and CAMx Ensemble Air Quality Forecasts during the 2015 MAPS-Seoul Field Campaign
NASA Astrophysics Data System (ADS)
Kim, E.; Kim, S.; Bae, C.; Kim, H. C.; Kim, B. U.
2015-12-01
The performance of air quality forecasts during the 2015 MAPS-Seoul Field Campaign was evaluated. A forecast system has been operated to support the campaign's daily aircraft route decisions for airborne measurements observing long-range transported plumes. We utilized two real-time ensemble systems, based on the Weather Research and Forecasting (WRF)-Sparse Matrix Operator Kernel Emissions (SMOKE)-Comprehensive Air quality Model with extensions (CAMx) modeling framework and the WRF-SMOKE-Community Multiscale Air Quality (CMAQ) framework over northeastern Asia, to simulate PM10 concentrations. The Global Forecast System (GFS) from the National Centers for Environmental Prediction (NCEP) was used to provide meteorological inputs for the forecasts. For an additional set of retrospective simulations, the ERA-Interim reanalysis from the European Centre for Medium-Range Weather Forecasts (ECMWF) was also utilized to assess forecast uncertainties arising from the meteorological data used. The Model Inter-Comparison Study for Asia (MICS-Asia) and National Institute of Environmental Research (NIER) Clean Air Policy Support System (CAPSS) emission inventories are used for foreign and domestic emissions, respectively. In this study, we evaluate the CMAQ and CAMx model performance during the campaign by comparing the results to the airborne and surface measurements. Contributions of foreign and domestic emissions are estimated using a brute force method. Analyses of model performance and emissions will be utilized to improve air quality forecasts for the upcoming KORUS-AQ field campaign planned in 2016.
NASA Astrophysics Data System (ADS)
Loutas, T. H.; Bourikas, A.
2017-12-01
We revisit the optimal sensor placement problem for engineering structures, with an emphasis on in-plane dynamic strain measurements and on modal identification as well as vibration-based damage detection for structural health monitoring purposes. The approach utilized is based on the maximization of a norm of the Fisher Information Matrix built with numerically obtained mode shapes of the structure, while at the same time prohibiting the sensorization of neighboring degrees of freedom as well as those carrying similar information, in order to obtain satisfactory coverage. A new convergence criterion for the Fisher Information Matrix (FIM) norm is proposed in order to deal with the issue of choosing an appropriate sensor redundancy threshold, a concept recently introduced but whose choice has not been further investigated. The sensor configurations obtained via a forward sequential placement algorithm are sub-optimal in terms of FIM norm values, but the selected sensors are not allowed to be placed at neighboring degrees of freedom, thus providing a better coverage of the structure and a subsequent better identification of the experimental mode shapes. The issue of how service-induced damage affects the initially nominated optimal sensor configuration is also investigated and reported. The numerical model of a composite sandwich panel serves as a representative aerospace structure upon which our investigations are based.
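A forward sequential placement of this kind can be sketched as a greedy loop that maximizes a norm of the FIM (the log-determinant here) while banning neighbouring degrees of freedom; the function below is a simplified illustration under those assumptions, not the authors' algorithm.

```python
import numpy as np

def forward_sensor_placement(Phi, n_sensors, excluded_neighbors=None):
    """Greedy forward selection of sensor DOFs from a mode-shape matrix Phi
    (n_dof x n_modes): at each step add the row that most increases the
    log-determinant of the FIM Phi_s^T Phi_s; excluded_neighbors maps a DOF
    to the DOFs banned once it has been selected."""
    excluded_neighbors = excluded_neighbors or {}
    n_dof, n_modes = Phi.shape
    ridge = 1e-9 * np.eye(n_modes)             # keeps the FIM invertible early on
    selected, banned = [], set()
    for _ in range(n_sensors):
        best_dof, best_score = None, -np.inf
        for dof in range(n_dof):
            if dof in selected or dof in banned:
                continue
            rows = Phi[selected + [dof]]
            score = np.linalg.slogdet(rows.T @ rows + ridge)[1]
            if score > best_score:
                best_dof, best_score = dof, score
        if best_dof is None:
            break                              # no admissible DOF left
        selected.append(best_dof)
        banned.update(excluded_neighbors.get(best_dof, ()))
    return selected
```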
Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago
2017-05-15
We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σz ≥ 0.02(1 + z) we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σz ≥ 0.02(1 + z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.
Optimized clustering estimators for BAO measurements accounting for significant redshift uncertainty
NASA Astrophysics Data System (ADS)
Ross, Ashley J.; Banik, Nilanjan; Avila, Santiago; Percival, Will J.; Dodelson, Scott; Garcia-Bellido, Juan; Crocce, Martin; Elvin-Poole, Jack; Giannantonio, Tommaso; Manera, Marc; Sevilla-Noarbe, Ignacio
2017-12-01
We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the baryon acoustic oscillation (BAO) information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line of sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty σz ≥ 0.02(1 + z), we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for σz ≥ 0.02(1 + z). For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations (combining two separate sets) of galaxy simulations mimicking the Dark Energy Survey Year 1 (DES Y1) sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.
Cold dark energy constraints from the abundance of galaxy clusters
Heneka, Caroline; Rapetti, David; Cataneo, Matteo; ...
2017-10-05
We constrain cold dark energy of negligible sound speed using galaxy cluster abundance observations. In contrast to standard quasi-homogeneous dark energy, negligible sound speed implies clustering of the dark energy fluid at all scales, allowing us to measure the effects of dark energy perturbations at cluster scales. We compare those models and set the stage for using non-linear information from semi-analytical modelling in cluster growth data analyses. For this, we recalibrate the halo mass function with non-linear characteristic quantities, the spherical collapse threshold and virial overdensity, that account for model- and redshift-dependent behaviours, as well as an additional mass contribution for cold dark energy. In this paper, we present the first constraints from this cold dark matter plus cold dark energy mass function using our cluster abundance likelihood, which self-consistently accounts for selection effects, covariances and systematic uncertainties. We combine cluster growth data with cosmic microwave background, supernovae Ia and baryon acoustic oscillation data, and find a shift between cold versus quasi-homogeneous dark energy of up to 1σ. We make a Fisher matrix forecast of constraints attainable with cluster growth data from the ongoing Dark Energy Survey (DES). For DES, we predict ~50 per cent tighter constraints on (Ωm, w) for cold dark energy versus wCDM models, with the same free parameters. Overall, we show that cluster abundance analyses are sensitive to cold dark energy, an alternative, viable model that should be routinely investigated alongside the standard dark energy scenario.
NASA Astrophysics Data System (ADS)
Masato, Giacomo; Cavany, Sean; Charlton-Perez, Andrew; Dacre, Helen; Bone, Angie; Carmicheal, Katie; Murray, Virginia; Danker, Rutger; Neal, Rob; Sarran, Christophe
2015-04-01
The health forecasting alert system for cold weather and heatwaves currently in use in the Cold Weather and Heatwave plans for England is based on 5 alert levels, with levels 2 and 3 dependent on a forecast or actual single temperature action trigger. Epidemiological evidence indicates that for both heat and cold, the impact on human health is gradual, with worsening impact for more extreme temperatures. The 60% risk of heat and cold forecasts used by the alerts is a rather crude probabilistic measure, which could be substantially improved using state-of-the-art forecasting techniques. In this study a prototype of a new health forecasting alert system is developed, which is aligned to the approach used in the Met Office's (MO) National Severe Weather Warning Service (NSWWS). This is in order to improve the information available to responders in the health and social care system by linking temperatures more directly to risks of mortality, and to develop a system more coherent with other weather alerts. The prototype is compared to the current system in the Cold Weather and Heatwave plans via a case-study approach to verify its potential advantages and shortcomings. The prototype health forecasting alert system introduces an "impact vs likelihood matrix" for the health impacts of hot and cold temperatures which is similar to those used operationally for other weather hazards as part of the NSWWS. The impact axis of this matrix is based on existing epidemiological evidence, which shows an increasing relative risk of death at extremes of outdoor temperature beyond a threshold which can be identified epidemiologically. The likelihood axis is based on a probability measure associated with the temperature forecast. The new method is tested for two case studies (one during summer 2013, one during winter 2013), and compared to the performance of the current alert system. The prototype shows some clear improvements over the current alert system. It allows for a much greater degree of flexibility, provides more detailed regional information about the health risks associated with periods of extreme temperatures, and is more coherent with other weather alerts, which may make it easier for front-line responders to use. It will require validation and engagement with stakeholders before it can be considered for use.
Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron; Chen, Yi; Ryser, Elliot; Carter, Mark
2016-01-01
The Thermo Scientific™ SureTect™ Listeria species assay is a new real-time PCR assay for the detection of all species of Listeria in food and environmental samples. The assay was originally certified as Performance Tested Methods(SM) (PTM) 071304 in 2013. This report details the method modification study undertaken to extend the performance claims of the assay for matrixes of raw ground turkey, raw ground pork, bagged lettuce, raw pork sausages, pasteurized 2% fat milk, raw cod, pasteurized brie cheese, and ice cream. The method modification study was conducted using the AOAC Research Institute (RI) PTM program to validate the SureTect PCR assay in comparison to the reference method detailed in ISO 11290-1:1996 including amendment 1:2004. All matrixes were tested by Thermo Fisher Scientific (Basingstoke, United Kingdom). In addition, three matrixes (raw cod, bagged lettuce, and pasteurized brie cheese) were analyzed independently as part of the AOAC RI-controlled independent laboratory study by the University of Guelph, Canada. Using probability of detection statistical analysis, there was no significant difference in the performance between the SureTect assay and the International Organization for Standardization reference method for any of the matrixes analyzed in this study.
Detection of LSB+/-1 steganography based on co-occurrence matrix and bit plane clipping
NASA Astrophysics Data System (ADS)
Abolghasemi, Mojtaba; Aghaeinia, Hassan; Faez, Karim; Mehrabi, Mohammad Ali
2010-01-01
Spatial LSB+/-1 steganography changes the smooth characteristics between adjoining pixels of the raw image. We present a novel steganalysis method for LSB+/-1 steganography based on feature vectors derived from the co-occurrence matrix in the spatial domain. We investigate how LSB+/-1 steganography affects the bit planes of an image and show that it changes more of its least significant bit (LSB) planes. The co-occurrence matrix is derived from an image in which some of the most significant bit planes are clipped. By this preprocessing, in addition to reducing the dimensions of the feature vector, the effects of embedding are also preserved. We compute the co-occurrence matrix in different directions and with different dependencies and use the elements of the resulting co-occurrence matrix as features. This method is sensitive to the data embedding process. We use a Fisher linear discriminant (FLD) classifier and test our algorithm on different databases and embedding rates. We compare our scheme with current LSB+/-1 steganalysis methods. It is shown that the proposed scheme outperforms the state-of-the-art methods in detecting the LSB+/-1 steganographic method for grayscale images.
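A compact illustration of the feature construction described above (bit-plane clipping followed by a co-occurrence matrix), under the assumption of a single horizontal offset; the FLD classification step is only referenced in a comment.

```python
import numpy as np

def cooccurrence_features(img, keep_bits=4):
    """Clip the most significant bit planes (keep the lowest `keep_bits` bits)
    and build a horizontal grey-level co-occurrence matrix whose normalised
    entries serve as the steganalysis feature vector."""
    clipped = img.astype(np.int64) & ((1 << keep_bits) - 1)   # bit-plane clipping
    levels = 1 << keep_bits
    left, right = clipped[:, :-1].ravel(), clipped[:, 1:].ravel()
    C = np.zeros((levels, levels))
    np.add.at(C, (left, right), 1)                            # count adjacent-pixel pairs
    return (C / C.sum()).ravel()

# Feature vectors from cover and stego images are then separated with a Fisher
# linear discriminant (e.g. scikit-learn's LinearDiscriminantAnalysis).
```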
Identity Recognition Algorithm Using Improved Gabor Feature Selection of Gait Energy Image
NASA Astrophysics Data System (ADS)
Chao, LIANG; Ling-yao, JIA; Dong-cheng, SHI
2017-01-01
This paper describes an effective gait recognition approach based on Gabor features of the gait energy image. In this paper, the kernel Fisher analysis combined with the kernel matrix is proposed to select dominant features. A nearest neighbor classifier based on whitened cosine distance is used to discriminate different gait patterns. The proposed approach is tested on the CASIA and USF gait databases. The results show that our approach outperforms other state-of-the-art gait recognition approaches in terms of recognition accuracy and robustness.
DOT National Transportation Integrated Search
2014-05-01
Travel demand forecasting models are used to predict future traffic volumes to evaluate : roadway improvement alternatives. Each of the metropolitan planning organizations (MPO) in : Alabama maintains a travel demand model to support planning efforts...
From Cycle Rooted Spanning Forests to the Critical Ising Model: an Explicit Construction
NASA Astrophysics Data System (ADS)
de Tilière, Béatrice
2013-04-01
Fisher established an explicit correspondence between the 2-dimensional Ising model defined on a graph G and the dimer model defined on a decorated version 𝒢 of this graph (Fisher in J Math Phys 7:1776-1781, 1966). In this paper we explicitly relate the dimer model associated to the critical Ising model and critical cycle rooted spanning forests (CRSFs). This relation is established through characteristic polynomials, whose definition only depends on the respective fundamental domains, and which encode the combinatorics of the model. We first show a matrix-tree type theorem establishing that the dimer characteristic polynomial counts CRSFs of the decorated fundamental domain 𝒢_1. Our main result consists in explicitly constructing CRSFs of 𝒢_1 counted by the dimer characteristic polynomial, from CRSFs of G_1, where edges are assigned Kenyon's critical weight function (Kenyon in Invent Math 150(2):409-439, 2002); thus proving a relation on the level of configurations between two well known 2-dimensional critical models.
NASA Astrophysics Data System (ADS)
Van Steenbergen, N.; Willems, P.
2012-04-01
Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric, data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in newly forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation actions. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with their advantages and disadvantages for a specific audience. A useful method to communicate the uncertainty of flood forecasts is probabilistic flood mapping. These maps give a representation of the probability of flooding of a certain area, based on the uncertainty assessment of the flood forecasts. By using this type of map, water managers can focus their attention on the areas with the highest flood probability. The larger public can also consult these maps for information on the probability of flooding at their specific location, so that they can take pro-active measures to reduce personal damage. The method of quantifying the uncertainty was implemented in the operational flood forecasting system for the navigable rivers in the Flanders region of Belgium. The method has shown clear benefits during the floods of the last two years.
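The 'three-dimensional error' matrix amounts to binning historical residuals by forecasted water level and lead time and storing percentiles per bin; the sketch below does exactly that, with illustrative bin edges and percentile choices.

```python
import numpy as np

def build_error_matrix(levels, lead_times, residuals, level_bins, lead_bins,
                       percentiles=(5, 25, 50, 75, 95)):
    """Split historical forecast residuals into classes of forecasted water
    level and lead time, and store the requested residual percentiles per class."""
    levels, lead_times, residuals = map(np.asarray, (levels, lead_times, residuals))
    n_lvl, n_lead = len(level_bins) - 1, len(lead_bins) - 1
    err = np.full((n_lvl, n_lead, len(percentiles)), np.nan)
    lvl_idx = np.digitize(levels, level_bins) - 1
    lead_idx = np.digitize(lead_times, lead_bins) - 1
    for i in range(n_lvl):
        for j in range(n_lead):
            r = residuals[(lvl_idx == i) & (lead_idx == j)]
            if r.size:
                err[i, j] = np.percentile(r, percentiles)
    return err   # interpolate in this matrix to attach uncertainty to new forecasts
```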
Fast Kalman Filter for Random Walk Forecast model
NASA Astrophysics Data System (ADS)
Saibaba, A.; Kitanidis, P. K.
2013-12-01
Kalman filtering is a fundamental tool in statistical time series analysis for understanding the dynamics of large systems for which limited, noisy observations are available. However, standard implementations of the Kalman filter are prohibitive because they require O(N^2) memory and O(N^3) computational cost, where N is the dimension of the state variable. In this work, we focus our attention on the random walk forecast model, which assumes the state transition matrix to be the identity matrix. This model is frequently adopted when the data are acquired at a timescale that is faster than the dynamics of the state variables and there is considerable uncertainty as to the physics governing the state evolution. We derive an efficient representation for the a priori and a posteriori estimate covariance matrices as a weighted sum of two contributions - the process noise covariance matrix and a low-rank term which contains eigenvectors from a generalized eigenvalue problem that combines information from the noise covariance matrix and the data. We describe an efficient algorithm to update the weights of the above terms and to compute the eigenmodes of the generalized eigenvalue problem (GEP). The resulting algorithm for the Kalman filter with the random walk forecast model scales as O(N) or O(N log N), both in memory and computational cost. This opens up the possibility of real-time adaptive experimental design and optimal control in systems of much larger dimension than was previously feasible. For a small number of measurements (~300-400), this procedure can be made numerically exact. However, as the number of measurements increases, for several choices of measurement operators and noise covariance matrices, the spectrum of the GEP decays rapidly and we are justified in retaining only the dominant eigenmodes. We discuss tradeoffs between accuracy and computational cost. The resulting algorithms are applied to an example application from ray-based travel time tomography.
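For reference, the dense textbook Kalman step with the random-walk forecast model (identity state transition) looks as follows; this is the O(N^3) baseline that the low-rank representation described above is designed to avoid, not the paper's fast algorithm.

```python
import numpy as np

def random_walk_kalman_step(x, P, y, H, Q, R):
    """One dense Kalman filter step with identity dynamics: the forecast only
    inflates the covariance by the process noise Q, then a standard analysis
    update assimilates the observation y = H x + noise (covariance R)."""
    x_f, P_f = x, P + Q                                  # forecast step
    S = H @ P_f @ H.T + R                                # innovation covariance
    K = np.linalg.solve(S, H @ P_f).T                    # Kalman gain P_f H^T S^{-1}
    x_a = x_f + K @ (y - H @ x_f)                        # analysis mean
    P_a = (np.eye(P.shape[0]) - K @ H) @ P_f             # analysis covariance
    return x_a, P_a
```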
Effect of oil palm empty fruit bunches fibers reinforced polymer recycled
NASA Astrophysics Data System (ADS)
Hermawan, B.; Nikmatin, S.; Sudaryanto; Alatas, H.; Sukaryo, S. G.
2017-07-01
The aim of this research is to process oil palm empty fruit bunches (OPEFB) into fibers of various sizes to be used as filler in a recycled acrylonitrile butadiene styrene (ABS) polymer matrix. Molecular analysis and mechanical tests were carried out to understand the influence of fiber size on the capability of the material to withstand external deformation. A single-screw extruder formed biocomposite granules, followed by injection moulding to shape test pieces. Maleic anhydride was added as a coupling agent between filler and matrix. Filler concentrations of 10 and 20% were used for each fiber size, with a constant additive content. Two kinds of glass fiber (10%) were used as comparators. To analyze the results of the mechanical tests, Fisher's least significant difference (LSD) within the ANOVA method was applied (with α = 0.05).
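Fisher's LSD test referred to above compares pairs of treatment means against a single threshold built from the ANOVA error term; the helper below computes that threshold under the usual assumptions (the numbers in the final comment are hypothetical).

```python
import numpy as np
from scipy import stats

def fisher_lsd(mse, df_error, n_i, n_j, alpha=0.05):
    """Fisher's least significant difference: two group means differ at level
    alpha if their absolute difference exceeds this value. mse and df_error
    come from the ANOVA table; n_i and n_j are the group sizes."""
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df_error)
    return t_crit * np.sqrt(mse * (1.0 / n_i + 1.0 / n_j))

# e.g. fisher_lsd(mse=2.3, df_error=12, n_i=5, n_j=5) for alpha = 0.05
```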
Mizutani, Eiji; Demmel, James W
2003-01-01
This paper briefly introduces our numerical linear algebra approaches for solving structured nonlinear least squares problems arising from 'multiple-output' neural-network (NN) models. Our algorithms feature trust-region regularization, and exploit sparsity of either the 'block-angular' residual Jacobian matrix or the 'block-arrow' Gauss-Newton Hessian (or Fisher information matrix in statistical sense) depending on problem scale so as to render a large class of NN-learning algorithms 'efficient' in both memory and operation costs. Using a relatively large real-world nonlinear regression application, we shall explain algorithmic strengths and weaknesses, analyzing simulation results obtained by both direct and iterative trust-region algorithms with two distinct NN models: 'multilayer perceptrons' (MLP) and 'complementary mixtures of MLP-experts' (or neuro-fuzzy modular networks).
Cosmological measurements with general relativistic galaxy correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raccanelli, Alvise; Montanari, Francesco; Durrer, Ruth
We investigate the cosmological dependence and the constraining power of large-scale galaxy correlations, including all redshift-distortion, wide-angle, lensing and gravitational potential effects on linear scales. We analyze the cosmological information present in the lensing convergence and in the gravitational potential terms describing the so-called "relativistic effects", and we find that, while smaller than the information contained in intrinsic galaxy clustering, it is not negligible. We investigate how neglecting them biases cosmological measurements performed by future spectroscopic and photometric large-scale surveys such as SKA and Euclid. We perform a Fisher analysis using the CLASS code, modified to include scale-dependent galaxy bias and redshift-dependent magnification and evolution bias. Our results show that neglecting relativistic terms, especially the lensing convergence, introduces an error in the forecasted precision in measuring cosmological parameters of the order of a few tens of percent, in particular when measuring the matter content of the Universe and primordial non-Gaussianity parameters. The analysis suggests a possible substantial systematic error in cosmological parameter constraints. Therefore, we argue that radial correlations and integrated relativistic terms need to be taken into account when forecasting the constraining power of future large-scale galaxy number count surveys.
Diffusion Forecasting Model with Basis Functions from QR-Decomposition
NASA Astrophysics Data System (ADS)
Harlim, John; Yang, Haizhao
2018-06-01
Diffusion forecasting is a nonparametric approach that provably solves the Fokker-Planck PDE corresponding to an Itô diffusion without knowing the underlying equation. The key idea of this method is to approximate the solution of the Fokker-Planck equation with a discrete representation of the shift (Koopman) operator on a set of basis functions generated via the diffusion maps algorithm. While the choice of these basis functions is provably optimal under appropriate conditions, computing them is quite expensive since it requires the eigendecomposition of an N × N diffusion matrix, where N denotes the data size and could be very large. For large-scale forecasting problems, only a few leading eigenvectors are computationally achievable. To overcome this computational bottleneck, a new set of basis functions constructed by orthonormalizing selected columns of the diffusion matrix together with its leading eigenvectors is proposed. This computation can be carried out efficiently via the unpivoted Householder QR factorization. The efficiency and effectiveness of the proposed algorithm are shown in both deterministically chaotic and stochastic dynamical systems; in the former case, the superiority of the proposed basis functions over eigenvectors alone is significant, while in the latter case forecasting accuracy is improved relative to using only a small number of eigenvectors. Supporting arguments are provided on three- and six-dimensional chaotic ODEs, a three-dimensional SDE that mimics turbulent systems, and also on the two spatial modes associated with the boreal winter Madden-Julian Oscillation obtained from applying the Nonlinear Laplacian Spectral Analysis on the measured Outgoing Longwave Radiation.
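The basis construction sketched below follows the idea as stated: stack a few leading eigenvectors with selected columns of the diffusion matrix and orthonormalize them with a Householder QR factorization (NumPy's qr); the inputs are assumed to be precomputed.

```python
import numpy as np

def qr_basis(K, eigvecs, column_idx):
    """Orthonormal basis from the leading eigenvectors of the diffusion matrix K
    together with selected columns of K, via unpivoted (Householder) QR."""
    B = np.hstack([eigvecs, K[:, column_idx]])   # eigenvectors + chosen columns
    Q, _ = np.linalg.qr(B, mode='reduced')       # columns of Q span the new basis
    return Q
```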
DEVELOPMENT AND EVALUATION OF PM 2.5 SOURCE APPORTIONMENT METHODOLOGIES
The receptor model called Positive Matrix Factorization (PMF) has been extensively used to apportion sources of ambient fine particulate matter (PM2.5), but the accuracy of source apportionment results currently remains unknown. In addition, air quality forecast model...
Forecast on Water Locking Damage of Low Permeable Reservoir with Quantum Neural Network
NASA Astrophysics Data System (ADS)
Zhao, Jingyuan; Sun, Yuxue; Feng, Fuping; Zhao, Fulei; Sui, Dianjie; Xu, Jianjun
2018-01-01
It is of great importance in oil-gas reservoir protection to forecast water locking damage, the greatest damage to low-permeability reservoirs, in a timely and correct manner. An analysis is conducted of the production mechanism and the various factors influencing water locking damage. On this basis a quantum neuron is constructed following the information processing manner of a biological neuron and the principle of the quantum neural algorithm; a quantum neural network model forecasting the water locking of the reservoir is then established, and related software is developed to forecast water locking damage of gas reservoirs. This method overcomes the defects of grey correlation analysis, which requires an evaluation matrix and complicated operations. According to practice in the Longxi Area of the Daqing Oilfield, this method is characterized by fast operation, few system parameters and a high accuracy rate (the overall coincidence rate may reach 90%), which can provide reliable support for protection techniques for low-permeability reservoirs.
Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain
Dai, Yonghui; Han, Dongmei; Dai, Weihui
2014-01-01
The stock index reflects the fluctuation of the stock market. For a long time, there has been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market due to the influence of many factors such as the economic situation, policy changes, and emergency events. Therefore, approaches based on adaptive modeling and conditional probability transfer have attracted new attention from researchers. This paper presents a new forecasting method combining an improved back-propagation (BP) neural network with a Markov chain, as well as its modeling and computing technology. The method includes initial forecasting by the improved BP neural network, division of Markov state regions, computation of the state transition probability matrix, and adjustment of the prediction. Results of the empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market. PMID:24782659
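The Markov-chain part of such a hybrid scheme boils down to estimating a state transition matrix from the sequence of error states of the initial network forecasts; the sketch below shows that estimation, with the adjustment step only indicated in a comment.

```python
import numpy as np

def transition_matrix(states, n_states):
    """Estimate the Markov state-transition probability matrix from the
    sequence of (discretised) forecast-error states."""
    P = np.zeros((n_states, n_states))
    for s, t in zip(states[:-1], states[1:]):
        P[s, t] += 1.0
    row_sums = P.sum(axis=1, keepdims=True)
    return np.divide(P, row_sums, out=np.zeros_like(P), where=row_sums > 0)

# The adjusted index forecast shifts the raw network output toward the centre of
# the most probable next error state, i.e. P[current_state].argmax().
```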
NASA Astrophysics Data System (ADS)
Chaudhuri, S.; Das, D.; Goswami, S.; Das, S. K.
2016-11-01
All India summer monsoon rainfall (AISMR) characteristics play a vital role in the policy planning and national economy of the country. In view of the significant impact of the monsoon system on regional as well as global climate systems, accurate prediction of summer monsoon rainfall has become a challenge. The objective of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) for long-range forecasting of AISMR. NCEP/NCAR reanalysis data of temperature and of zonal and meridional wind at different pressure levels have been taken to construct the input matrix of ANFIS. The membership of the input parameters for AISMR as high, medium or low is estimated with a trapezoidal membership function. The fuzzified standardized input parameters and the de-fuzzified target output are trained with artificial neural network models. The ANFIS forecast of AISMR is compared with non-hybrid multi-layer perceptron (MLP), radial basis function network (RBFN) and multiple linear regression (MLR) models. The forecast error analyses of the models reveal that ANFIS provides the best forecast of AISMR, with a minimum prediction error of 0.076, whereas the errors with the MLP, RBFN and MLR models are 0.22, 0.18 and 0.73, respectively. During validation against observations, ANFIS shows its advantage over the comparative models. The performance of the ANFIS model is verified through different statistical skill scores, which also confirm the aptitude of ANFIS in forecasting AISMR. The forecast skill of ANFIS is also observed to be better than that of the Climate Forecast System version 2. The real-time forecast with ANFIS indicates the possibility of a deficit AISMR (65-75 cm) in the year 2015.
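The trapezoidal fuzzification mentioned above is a one-line membership function; the sketch below evaluates it for a "medium" class whose break points are purely illustrative.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership (assumes a < b <= c < d): 0 below a, rising on
    [a, b], 1 on [b, c], falling on [c, d], 0 above d."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / (b - a), 0.0, 1.0)
    fall = np.clip((d - x) / (d - c), 0.0, 1.0)
    return np.minimum(rise, fall)

# Membership of standardized predictor values in a hypothetical "medium" class
print(trapezoid([-1.0, 0.0, 1.5], a=-1.5, b=-0.5, c=0.5, d=1.5))
```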
Mid-Term Probabilistic Forecast of Oil Spill Trajectories
NASA Astrophysics Data System (ADS)
Castanedo, S.; Abascal, A. J.; Cardenas, M.; Medina, R.; Guanche, Y.; Mendez, F. J.; Camus, P.
2012-12-01
There is increasing concern about the threat posed by oil spills to the coastal environment. This is reflected in the promulgation of various national and international standards, among which are those that require companies whose activities involve oil spill risk to have oil pollution emergency plans or similar arrangements for responding promptly and effectively to oil pollution incidents. Operational oceanography systems (OOS) that provide decision makers with oil spill trajectory forecasting have demonstrated their usefulness in recent accidents (Castanedo et al., 2006). In recent years, many national and regional OOS have been set up focusing on short-term oil spill forecasts (up to 5 days). However, recent accidental marine oil spills (Prestige in Spain, Deepwater Horizon in the Gulf of Mexico) have revealed the importance of having longer prediction horizons (up to 15 days) in regional-scale areas. In this work, we have developed a methodology to provide probabilistic oil spill forecasts based on numerical modelling and statistical methods. The main components of this approach are: (1) use of high-resolution, long-term (1948-2009) historical hourly databases of wind, wind-induced currents and astronomical tide currents obtained using state-of-the-art numerical models; (2) classification of representative wind field patterns (n=100) using clustering techniques based on PCA and K-means algorithms (Camus et al., 2011); (3) determination of the cluster occurrence probability and the stochastic matrix (matrix of transition probabilities, or Markov matrix) p_ij, the probability of moving from cluster "i" to cluster "j" in one time step; (4) the initial state for mid-term simulations is obtained from available wind forecasts using a nearest-neighbour analog method; (5) 15-day stochastic Markov chain simulations (m=1000) are launched; (6) the corresponding oil spill trajectories are computed with the TESEO Lagrangian transport model (Abascal et al., 2009); (7) probability maps are delivered using a user-friendly web app. The application of the method to the Gulf of Biscay (northern Spain) shows the ability of this approach. References: Abascal, A.J., Castanedo, S., Mendez, F.J., Medina, R., Losada, I.J., 2009. Calibration of a Lagrangian transport model using drifting buoys deployed during the Prestige oil spill. J. Coast. Res. 25 (1), 80-90. Camus, P., Méndez, F.J., Medina, R., 2011. Analysis of clustering and selection algorithms for the study of multivariate wave climate. Coastal Engineering, doi:10.1016/j.coastaleng.2011.02.003. Castanedo, S., Medina, R., Losada, I.J., Vidal, C., Méndez, F.J., Osorio, A., Juanes, J.A., Puente, A., 2006. The Prestige oil spill in Cantabria (Bay of Biscay). Part I: operational forecasting system for quick response, risk assessment and protection of natural resources. J. Coast. Res. 22 (6), 1474-1489.
High-redshift post-reionization cosmology with 21cm intensity mapping
NASA Astrophysics Data System (ADS)
Obuljen, Andrej; Castorina, Emanuele; Villaescusa-Navarro, Francisco; Viel, Matteo
2018-05-01
We investigate the possibility of performing cosmological studies in the redshift range 2.5
NASA Astrophysics Data System (ADS)
Mukherjee, Suvodip; Khatri, Rishi; Wandelt, Benjamin D.
2018-04-01
We revisit the cosmological constraints on resonant and non-resonant conversion of photons to axions in cosmological magnetic fields. We find that the constraints on photon-axion coupling and primordial magnetic fields are much weaker than previously claimed for low-mass axion-like particles with masses m_a ≲ 5 × 10⁻¹³ eV. In particular, we find that the axion mass range 10⁻¹⁴ eV ≤ m_a ≤ 5 × 10⁻¹³ eV is not excluded by the CMB data, contrary to previous claims. We also examine photon-axion conversion in the Galactic magnetic fields. Resonant conversion in the large-scale coherent Galactic magnetic field results in 100% polarized anisotropic spectral distortions of the CMB for the mass range 10⁻¹³ eV ≲ m_a ≲ 10⁻¹¹ eV. The polarization pattern traces the transverse-to-line-of-sight component of the Galactic magnetic field, while both the anisotropy in the Galactic magnetic field and the electron distribution imprint a characteristic anisotropy pattern in the spectral distortion. Our results apply to scalar as well as pseudoscalar particles. For conversion to scalar particles, the polarization is rotated by 90°, allowing us to distinguish them from the pseudoscalars. For m_a ≲ 10⁻¹⁴ eV we have non-resonant conversion in the small-scale turbulent magnetic field of the Galaxy, resulting in anisotropic but unpolarized spectral distortion in the CMB. These unique signatures are potential discriminants against isotropic and non-polarized signals such as the primary CMB and μ and y distortions, with the anisotropic nature making them accessible to experiments with only relative calibration like Planck, LiteBIRD, and CoRE. We forecast for PIXIE as well as for these experiments using the Fisher matrix formalism.
Hybrid Inflation: Multi-field Dynamics and Cosmological Constraints
NASA Astrophysics Data System (ADS)
Clesse, Sébastien
2011-09-01
The dynamics of hybrid models is usually approximated by the evolution of a scalar field slowly rolling along a nearly flat valley. Inflation ends with a waterfall phase, due to a tachyonic instability, and this final phase is usually assumed to be nearly instantaneous. In this thesis, we go beyond these approximations and analyze the exact two-field dynamics of hybrid models. Several effects are put in evidence: 1) possible slow-roll violations along the valley imply the non-existence of inflation at small field values; provided super-Planckian fields, the scalar spectrum of the original model is red, in agreement with observations. 2) The initial field values are not fine-tuned along the valley but also occupy a considerable part of the field space exterior to it; they form a structure with fractal boundaries. Using Bayesian methods, their distribution in the whole parameter space is studied and natural bounds on the potential parameters are derived. 3) For the original model, inflation is found to continue for more than 60 e-folds along waterfall trajectories in some part of the parameter space. The scalar power spectrum of adiabatic perturbations is modified and is generically red, possibly in agreement with CMB observations, and topological defects are conveniently stretched outside the observable Universe. 4) The analysis of the initial conditions is extended to the case of a closed Universe, in which the initial singularity is replaced by a classical bounce. In the third part of the thesis, we study how the present CMB constraints on the cosmological parameters could be improved with the observation of the 21cm cosmic background by future giant radio telescopes. Forecasts are determined for a characteristic Fast Fourier Transform Telescope, using both Fisher matrix and MCMC methods.
2014-03-27
TOMOGRAPHIC IMAGING IN 3D WIRELESS SENSOR NETWORKS. Thea S. Danella, B.S.E.E., Captain, USAF. Approved: //signed// Richard K. Martin, PhD (Chairman) //signed// ... have every one of them in my life. I want to also thank my advisor, Dr. Richard K. Martin, and fellow student, Mr. Jason Pennington. They were ... of the Fisher Information Matrix (FIM) J, and as such are the lower bounds on the Normalized Mean Squared Error (NMSE)R for pixel p. In [49], Martin et ...
Saleem, Muhammad; Sharif, Kashif; Fahmi, Aliya
2018-04-27
Applications of the Pareto distribution are common in reliability, survival and financial studies. In this paper, a Pareto mixture distribution is considered to model a heterogeneous population comprising two subgroups. Each of the two subgroups is characterized by the same functional form with unknown, distinct shape and scale parameters. Bayes estimators have been derived using flat and conjugate priors under a squared error loss function. Standard errors have also been derived for the Bayes estimators. An interesting feature of this study is the derivation of the components of the Fisher information matrix.
Discriminative Projection Selection Based Face Image Hashing
NASA Astrophysics Data System (ADS)
Karabat, Cagatay; Erdogan, Hakan
Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
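The Fisher criterion used for the row selection can be written, per projection direction, as the squared distance between class means over the summed class variances; the sketch below scores each row of a random projection in that spirit, with array shapes that are assumptions of this example.

```python
import numpy as np

def fisher_scores(user_proj, others_proj):
    """Per-row Fisher criterion: separation of the genuine user's projected
    features from everyone else's, divided by the within-class scatter.
    Inputs have shape (n_samples, n_projection_rows)."""
    mu1, mu2 = user_proj.mean(axis=0), others_proj.mean(axis=0)
    v1, v2 = user_proj.var(axis=0), others_proj.var(axis=0)
    return (mu1 - mu2) ** 2 / (v1 + v2 + 1e-12)

# np.argsort(-fisher_scores(...)) gives the user-dependent ordering of the
# random-projection rows; the most discriminative ones are kept.
```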
Improving operational plume forecasts
NASA Astrophysics Data System (ADS)
Balcerak, Ernie
2012-04-01
Forecasting how plumes of particles, such as radioactive particles from a nuclear disaster, will be transported and dispersed in the atmosphere is an important but computationally challenging task. During the Fukushima nuclear disaster in Japan, operational plume forecasts were produced each day, but as the emissions continued, previous emissions were not included in the simulations used for forecasts because it became impractical to rerun the simulations each day from the beginning of the accident. Draxler and Rolph examine whether it is possible to improve plume simulation speed and flexibility as conditions and input data change. The authors use a method known as a transfer coefficient matrix approach that allows them to simulate many radionuclides using only a few generic species for the computation. Their simulations work faster by dividing the computation into separate independent segments in such a way that the most computationally time consuming pieces of the calculation need to be done only once. This makes it possible to provide real-time operational plume forecasts by continuously updating the previous simulations as new data become available. They tested their method using data from the Fukushima incident to show that it performed well. (Journal of Geophysical Research-Atmospheres, doi:10.1029/2011JD017205, 2012)
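The transfer coefficient matrix idea reduces, once the unit-emission runs are stored, to a matrix-vector product; the toy numbers below are hypothetical and only illustrate why updated emissions do not require rerunning the transport model.

```python
import numpy as np

def plume_concentrations(tcm, emissions):
    """Concentrations at receptors as a linear combination of precomputed
    unit-emission dispersion runs: tcm is an (n_receptors x n_source_periods)
    array of dilution factors, emissions the source term for each period."""
    return tcm @ np.asarray(emissions, dtype=float)

# Hypothetical example: 3 receptors, 2 emission periods
tcm = np.array([[1e-9, 5e-10],
                [2e-9, 0.0],
                [0.0, 3e-9]])
print(plume_concentrations(tcm, [1.0e12, 5.0e11]))
```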
Masato, Giacomo; Bone, Angie; Charlton-Perez, Andrew; Cavany, Sean; Neal, Robert; Dankers, Rutger; Dacre, Helen; Carmichael, Katie; Murray, Virginia
2015-01-01
In this study a prototype of a new health forecasting alert system is developed, which is aligned to the approach used in the Met Office's (MO) National Severe Weather Warning Service (NSWWS). This is in order to improve information available to responders in the health and social care system by linking temperatures more directly to risks of mortality, and developing a system more coherent with other weather alerts. The prototype is compared to the current system in the Cold Weather and Heatwave plans via a case-study approach to verify its potential advantages and shortcomings. The prototype health forecasting alert system introduces an "impact vs likelihood matrix" for the health impacts of hot and cold temperatures which is similar to those used operationally for other weather hazards as part of the NSWWS. The impact axis of this matrix is based on existing epidemiological evidence, which shows an increasing relative risk of death at extremes of outdoor temperature beyond a threshold which can be identified epidemiologically. The likelihood axis is based on a probability measure associated with the temperature forecast. The new method is tested for two case studies (one during summer 2013, one during winter 2013), and compared to the performance of the current alert system. The prototype shows some clear improvements over the current alert system. It allows for a much greater degree of flexibility, provides more detailed regional information about the health risks associated with periods of extreme temperatures, and is more coherent with other weather alerts which may make it easier for front line responders to use. It will require validation and engagement with stakeholders before it can be considered for use.
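A toy illustration of such an impact vs likelihood lookup; the thresholds, category labels and alert colours here are assumptions, not the operational NSWWS values.

```python
import numpy as np

IMPACT_LEVELS = ["very low", "low", "medium", "high"]   # from epidemiological risk of death
LIKELIHOOD_BINS = [0.2, 0.4, 0.6, 1.01]                 # forecast probability thresholds (assumed)

# Alert matrix indexed as [impact, likelihood]; entries follow a generic
# green/yellow/amber/red convention for illustration only.
ALERT = np.array([
    ["green",  "green",  "green",  "yellow"],
    ["green",  "yellow", "yellow", "amber"],
    ["yellow", "yellow", "amber",  "amber"],
    ["yellow", "amber",  "amber",  "red"],
])

def alert_level(impact, prob_exceeding_threshold):
    """Map an impact category and a temperature-forecast probability to an alert colour."""
    i = IMPACT_LEVELS.index(impact)
    j = int(np.searchsorted(LIKELIHOOD_BINS, prob_exceeding_threshold, side="right"))
    j = min(j, len(LIKELIHOOD_BINS) - 1)
    return ALERT[i, j]

print(alert_level("high", 0.5))   # -> 'amber' under these assumed thresholds
```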
Forecasting the mortality rates using Lee-Carter model and Heligman-Pollard model
NASA Astrophysics Data System (ADS)
Ibrahim, R. I.; Ngataman, N.; Abrisam, W. N. A. Wan Mohd
2017-09-01
Improvement in life expectancies has driven further declines in mortality. The sustained reduction in mortality rates and its systematic underestimation have been attracting significant interest from researchers in recent years because of their potential impact on population size and structure, social security systems, and (from an actuarial perspective) the life insurance and pensions industry worldwide. Among forecasting methods, the Lee-Carter model has been widely accepted by the actuarial community and the Heligman-Pollard model has been widely used by researchers in modelling and forecasting future mortality; this paper therefore focuses on these two models. The main objective is to investigate how accurately the two models perform on Malaysian data. Since the models involve nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 8.0 (MATLAB 8.0) software is used to estimate their parameters. An Autoregressive Integrated Moving Average (ARIMA) procedure is applied to forecast the parameters of both models, and the forecasted mortality rates are obtained from these forecasted parameters. To investigate the accuracy of the estimation, the forecast results are compared against actual mortality data. The results indicate that both models perform better for the male population. However, for the elderly female population, the Heligman-Pollard model tends to underestimate the mortality rates while the Lee-Carter model tends to overestimate them.
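For reference, a minimal sketch of the Lee-Carter step on synthetic data: the decomposition log m_{x,t} = a_x + b_x k_t is fitted by SVD, and the time index k_t is extrapolated with a random walk with drift as a simple stand-in for the full ARIMA procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
ages, years = 20, 40
log_m = -6.0 + 0.08 * np.arange(ages)[:, None] - 0.02 * np.arange(years)[None, :] \
        + 0.05 * rng.standard_normal((ages, years))        # synthetic log death rates

# Lee-Carter: log m_{x,t} = a_x + b_x * k_t
a = log_m.mean(axis=1)                                      # age pattern
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                                 # usual normalisation: sum(b) = 1
k = s[0] * Vt[0] * U[:, 0].sum()                            # time index

# Forecast k_t with a random walk with drift (a simple stand-in for ARIMA).
drift = np.diff(k).mean()
k_future = k[-1] + drift * np.arange(1, 11)
log_m_future = a[:, None] + b[:, None] * k_future[None, :]
print(log_m_future.shape)                                   # 10-year-ahead log rates by age
```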
Gene Ranking of RNA-Seq Data via Discriminant Non-Negative Matrix Factorization.
Jia, Zhilong; Zhang, Xiang; Guan, Naiyang; Bo, Xiaochen; Barnes, Michael R; Luo, Zhigang
2015-01-01
RNA-sequencing is rapidly becoming the method of choice for studying the full complexity of transcriptomes; however, with increasing dimensionality, accurate gene ranking is becoming increasingly challenging. This paper proposes an accurate and sensitive gene ranking method that implements discriminant non-negative matrix factorization (DNMF) for RNA-seq data. To the best of our knowledge, this is the first work to explore the utility of DNMF for gene ranking. Incorporating Fisher's discriminant criterion and setting the reduced dimension to two, DNMF learns two factors to approximate the original gene expression data, abstracting the up-regulated or down-regulated metagene by using the sample label information. The first factor holds all the genes' weights on the two metagenes as an additive combination of all genes, while the second factor represents the expression values of the two metagenes. In the gene ranking stage, all genes are ranked in descending order according to the differential values of the metagene weights. Leveraging the nature of NMF and Fisher's criterion, DNMF can robustly boost gene ranking performance. The area-under-the-curve analysis of differential expression on two benchmarking tests of four RNA-seq data sets with similar phenotypes showed that the proposed DNMF-based gene ranking method outperforms other widely used methods. Moreover, Gene Set Enrichment Analysis also showed that DNMF outperforms the others. DNMF is also computationally efficient, substantially outperforming all other benchmarked methods. Consequently, we suggest DNMF is an effective method for the analysis of differential gene expression and gene ranking for RNA-seq data.
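A simplified sketch of the ranking stage, using plain (unsupervised) NMF from scikit-learn with two components as a stand-in for the supervised DNMF described above; genes are ranked by the difference of their two metagene weights, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
n_genes, n_samples = 500, 20
X = rng.gamma(2.0, 1.0, size=(n_genes, n_samples))          # synthetic expression matrix
X[:25, 10:] *= 4.0                                          # first 25 genes up-regulated in group 2

# Rank-2 factorisation X ~ W H: W holds each gene's weights on the two metagenes,
# H holds the metagene expression across samples.
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)
H = model.components_

# Rank genes by the differential metagene weight (descending), as in the DNMF ranking stage.
score = np.abs(W[:, 0] - W[:, 1])
ranking = np.argsort(score)[::-1]
print(ranking[:10])                                         # top-ranked genes
```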
Strömberg, Eric A; Nyberg, Joakim; Hooker, Andrew C
2016-12-01
With the increasing popularity of optimal design in drug development it is important to understand how the approximations and implementations of the Fisher information matrix (FIM) affect the resulting optimal designs. The aim of this work was to investigate the impact on design performance when using two common approximations to the population model and the full or block-diagonal FIM implementations for optimization of sampling points. Sampling schedules for two example experiments based on population models were optimized using the FO and FOCE approximations and the full and block-diagonal FIM implementations. The number of support points was compared between the designs for each example experiment. The performance of these designs based on simulation/estimations was investigated by computing bias of the parameters as well as through the use of an empirical D-criterion confidence interval. Simulations were performed when the design was computed with the true parameter values as well as with misspecified parameter values. The FOCE approximation and the Full FIM implementation yielded designs with more support points and less clustering of sample points than designs optimized with the FO approximation and the block-diagonal implementation. The D-criterion confidence intervals showed no performance differences between the full and block diagonal FIM optimal designs when assuming true parameter values. However, the FO approximated block-reduced FIM designs had higher bias than the other designs. When assuming parameter misspecification in the design evaluation, the FO Full FIM optimal design was superior to the FO block-diagonal FIM design in both of the examples.
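An illustrative sketch, with a synthetic Fisher information matrix, of how the block-diagonal implementation (dropping the fixed-effect/variance-parameter cross blocks) changes the D-criterion and the implied parameter uncertainties; it does not reproduce the FO/FOCE population computations themselves.

```python
import numpy as np

rng = np.random.default_rng(5)
p_fixed, p_var = 3, 2                                    # fixed effects and variance parameters
p = p_fixed + p_var

# A synthetic positive-definite "full" FIM with non-zero cross blocks.
A = rng.standard_normal((p, p))
FIM_full = A @ A.T + p * np.eye(p)

# Block-diagonal implementation: drop the fixed-effect / variance-parameter cross block.
FIM_block = FIM_full.copy()
FIM_block[:p_fixed, p_fixed:] = 0.0
FIM_block[p_fixed:, :p_fixed] = 0.0

def d_criterion(F):
    """Normalised D-criterion det(F)^(1/p); larger is better."""
    return np.linalg.det(F) ** (1.0 / F.shape[0])

print(d_criterion(FIM_full), d_criterion(FIM_block))
# Expected (Cramer-Rao style) standard errors of the fixed effects also differ:
print(np.sqrt(np.diag(np.linalg.inv(FIM_full)))[:p_fixed])
print(np.sqrt(np.diag(np.linalg.inv(FIM_block)))[:p_fixed])
```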
System learning approach to assess sustainability and ...
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach lie in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting of projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production variables in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable to modeling different scenarios in other dynamic systems. Indicators are indispensable for tracking conditions in human and natural systems; however, available data are sometimes far out of date and limit the ability to gauge system status. Techniques such as regression and simulation are not sufficient, because system characteristics have to be modeled in ways that risk oversimplifying complex dynamics. This work presents a methodology combining the power of an Artificial Neural Network and Information Theory to capture patterns in a real dyna…
NASA Astrophysics Data System (ADS)
Darnius, O.; Sitorus, S.
2018-03-01
The objective of this study was to determine the planting calendar pattern of three types of crops, namely palawija, rice, and banana, based on rainfall in Deli Serdang Regency. In the first stage, we forecasted rainfall using time series analysis and obtained an appropriate ARIMA (1,0,0)(1,1,1)12 model. Based on the forecast results, we designed a planting calendar pattern for the three types of crop. Furthermore, the probability of success for crops following the planting calendar pattern was calculated using a Markov process, by discretizing the continuous rainfall data into three categories, namely Below Normal (BN), Normal (N), and Above Normal (AN), to form the probability transition matrix. Finally, the combination of the rainfall forecasting model and the Markov process was used to determine the planting calendar pattern and the probability of success for the three crops. This research used rainfall data for Deli Serdang Regency taken from the office of BMKG (Meteorology, Climatology and Geophysics Agency), Sampali Medan, Indonesia.
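A minimal sketch of the Markov step under assumed tercile thresholds: monthly rainfall is discretized into BN/N/AN and an empirical transition matrix is estimated from the categorized series.

```python
import numpy as np

CATEGORIES = ["BN", "N", "AN"]                      # Below Normal, Normal, Above Normal

def categorize(rainfall, low, high):
    """Discretize monthly rainfall into BN (< low), N, AN (> high)."""
    return np.digitize(rainfall, [low, high])       # 0 = BN, 1 = N, 2 = AN

def transition_matrix(states, k=3):
    """Row-stochastic matrix of empirical transition probabilities between categories."""
    counts = np.zeros((k, k))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(6)
rainfall = rng.gamma(4.0, 50.0, size=240)           # 20 years of synthetic monthly rainfall (mm)
low, high = np.percentile(rainfall, [33, 67])       # assumed climatological terciles
P = transition_matrix(categorize(rainfall, low, high))
print(np.round(P, 2))                               # e.g. P[1, 2] = Pr(AN next month | N this month)
```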
Barba, Lida; Rodríguez, Nibaldo; Montt, Cecilia
2014-01-01
Two smoothing strategies, combined with autoregressive integrated moving average (ARIMA) and autoregressive neural network (ANN) models, are presented to improve the forecasting of time series. The forecasting strategy is implemented in two stages. In the first stage the time series is smoothed using either 3-point moving average smoothing or singular value decomposition of the Hankel matrix (HSVD). In the second stage, an ARIMA model and two ANNs are used for one-step-ahead time series forecasting. The coefficients of the first ANN are estimated through the particle swarm optimization (PSO) learning algorithm, while the coefficients of the second ANN are estimated with the resilient backpropagation (RPROP) learning algorithm. The proposed models are evaluated using a weekly time series of traffic accidents in the Valparaíso region of Chile, from 2003 to 2012. The best result is given by the combination HSVD-ARIMA, with a MAPE of 0.26%, followed by MA-ARIMA with a MAPE of 1.12%; the worst result is given by the MA-ANN based on PSO, with a MAPE of 15.51%.
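A minimal sketch of the HSVD smoothing stage: the series is embedded in a Hankel matrix, a low-rank SVD approximation is taken, and anti-diagonal averaging maps it back to a smoothed series; the window length and rank here are assumptions, and the data are synthetic.

```python
import numpy as np

def hsvd_smooth(x, window, rank=1):
    """Smooth a series by low-rank SVD of its Hankel (trajectory) matrix plus diagonal averaging."""
    n = len(x)
    k = n - window + 1
    H = np.column_stack([x[i:i + window] for i in range(k)])     # Hankel matrix, shape (window, k)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # low-rank approximation
    # Anti-diagonal (Hankel) averaging back to a series.
    smooth = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        smooth[j:j + window] += H_low[:, j]
        counts[j:j + window] += 1
    return smooth / counts

rng = np.random.default_rng(7)
t = np.arange(520)                                               # ~10 years of weekly data
series = 50 + 10 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 4, t.size)
low_freq = hsvd_smooth(series, window=52, rank=2)                # low-frequency component
high_freq = series - low_freq                                    # residual / high-frequency part
print(low_freq.shape, high_freq.std() < series.std())
```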
A Novel Multilevel-SVD Method to Improve Multistep Ahead Forecasting in Traffic Accidents Domain.
Barba, Lida; Rodríguez, Nibaldo
2017-01-01
A novel method is proposed for decomposing a nonstationary time series into components of low and high frequency. The method is based on Multilevel Singular Value Decomposition (MSVD) of a Hankel matrix. The decomposition is used to improve the forecasting accuracy of Multiple Input Multiple Output (MIMO) linear and nonlinear models. Three time series from the traffic accidents domain are used; they represent the number of persons injured in traffic accidents in Santiago, Chile. The data were continuously collected by the Chilean Police and were sampled weekly from 2000:1 to 2014:12. The performance of MSVD is compared with the decomposition into components of low and high frequency of a commonly accepted method based on the Stationary Wavelet Transform (SWT). SWT in conjunction with an autoregressive model (SWT + MIMO-AR) and SWT in conjunction with an autoregressive neural network (SWT + MIMO-ANN) were evaluated. The empirical results show that the best accuracy was achieved by the forecasting model based on the proposed decomposition method MSVD, in comparison with the forecasting models based on SWT.
Assimilation of NUCAPS Retrieved Profiles in GSI for Unique Forecasting Applications
NASA Technical Reports Server (NTRS)
Berndt, Emily Beth; Zavodsky, Bradley; Srikishen, Jayanthi; Blankenship, Clay
2015-01-01
Hyperspectral IR profiles can be assimilated in GSI as a separate observation type (distinct from radiosondes) with only changes to tables in the fix directory. Assimilation of the profiles produces changes to the analysis fields, as evidenced by: innovations larger than +/-2.0 K, which indicate where individual profiles impact the final temperature analysis; an updated temperature analysis that is colder behind the cold front and warmer in the warm sector; and an updated moisture analysis that is modified mostly in the low levels and tends to be drier than the original model background. Analysis of the model output shows that differences relative to 13-km RAP analyses are smaller when profiles are assimilated with NUCAPS errors, and that CAPE is under-forecast when assimilating NUCAPS profiles, which could be problematic for severe weather forecasting. Refining the assimilation technique to incorporate an error covariance matrix and creating a separate GSI module to assimilate satellite profiles may improve results.
NASA Technical Reports Server (NTRS)
Hoffman, Ross N.; Nehrkorn, Thomas; Grassotti, Christopher
1997-01-01
We proposed a novel characterization of errors for numerical weather predictions. In its simplest form we decompose the error into a part attributable to phase errors and a remainder. The phase error is represented in the same fashion as a velocity field and is required to vary slowly and smoothly with position. A general distortion representation allows for the displacement and amplification or bias correction of forecast anomalies. Characterizing and decomposing forecast error in this way has two important applications, which we term the assessment application and the objective analysis application. For the assessment application, our approach results in new objective measures of forecast skill which are more in line with subjective measures of forecast skill and which are useful in validating models and diagnosing their shortcomings. With regard to the objective analysis application, meteorological analysis schemes balance forecast error and observational error to obtain an optimal analysis. Presently, representations of the error covariance matrix used to measure the forecast error are severely limited. For the objective analysis application our approach will improve analyses by providing a more realistic measure of the forecast error. We expect, a priori, that our approach should greatly improve the utility of remotely sensed data which have relatively high horizontal resolution, but which are indirectly related to the conventional atmospheric variables. In this project, we are initially focusing on the assessment application, restricted to a realistic but univariate 2-dimensional situation. Specifically, we study the forecast errors of the sea level pressure (SLP) and 500 hPa geopotential height fields for forecasts of the short and medium range. Since the forecasts are generated by the GEOS (Goddard Earth Observing System) data assimilation system with and without ERS 1 scatterometer data, these preliminary studies serve several purposes. They (1) provide a testbed for the use of the distortion representation of forecast errors, (2) act as one means of validating the GEOS data assimilation system and (3) help to describe the impact of the ERS 1 scatterometer data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Yong-Seon; Institute of Cosmology and Gravitation, University of Portsmouth, Dennis Sciama Building, Portsmouth, PO1 3FX; Zhao Gongbo
We explore the complementarity of weak lensing and galaxy peculiar velocity measurements to better constrain modifications to General Relativity. We find no evidence for deviations from General Relativity on cosmological scales from a combination of peculiar velocity measurements (for Luminous Red Galaxies in the Sloan Digital Sky Survey) with weak lensing measurements (from the Canada-France-Hawaii Telescope Legacy Survey). We provide a Fisher error forecast for a Euclid-like space-based survey including both lensing and peculiar velocity measurements and show that the expected constraints on modified gravity will be at least an order of magnitude better than with present data, i.e. we will obtain ≈5% errors on the modified gravity parametrization described here. We also present a model-independent method for constraining modified gravity parameters using tomographic peculiar velocity information, and apply this methodology to the present data set.
Forecasting the mortality rates of Malaysian population using Heligman-Pollard model
NASA Astrophysics Data System (ADS)
Ibrahim, Rose Irnawaty; Mohd, Razak; Ngataman, Nuraini; Abrisam, Wan Nur Azifah Wan Mohd
2017-08-01
Actuaries, demographers and other professionals have always been aware of the critical importance of mortality forecasting, due to the declining trend of mortality and continuous increases in life expectancy. The Heligman-Pollard model was introduced in 1980 and has been widely used by researchers in modelling and forecasting future mortality. This paper aims to estimate an eight-parameter model based on Heligman and Pollard's law of mortality. Since the model involves nonlinear equations that are difficult to solve explicitly, the Matrix Laboratory Version 7.0 (MATLAB 7.0) software is used to estimate the parameters. The Statistical Package for the Social Sciences (SPSS) is applied to forecast all the parameters with an Autoregressive Integrated Moving Average (ARIMA) procedure. Empirical data sets for the Malaysian population over the period 1981 to 2015, for both genders, are considered, with 1981 to 2010 used as the training set and 2011 to 2015 as the testing set. To investigate the accuracy of the estimation, the forecast results are compared against actual mortality rates. The results show that the Heligman-Pollard model fits the male population well at all ages, while it appears to underestimate the mortality rates of the female population at older ages.
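For reference, a sketch of fitting the eight-parameter Heligman-Pollard law q_x/(1-q_x) = A^((x+B)^C) + D·exp(-E(ln x - ln F)²) + G·H^x by nonlinear least squares; the "observed" rates and starting values below are synthetic assumptions, not Malaysian data.

```python
import numpy as np
from scipy.optimize import least_squares

def hp_ratio(x, A, B, C, D, E, F, G, H):
    """Heligman-Pollard law: q_x / (1 - q_x) as the sum of child, hump and senescent terms."""
    child = A ** ((x + B) ** C)
    hump = D * np.exp(-E * (np.log(x) - np.log(F)) ** 2)
    senescent = G * H ** x
    return child + hump + senescent

def hp_qx(x, params):
    r = hp_ratio(x, *params)
    return r / (1.0 + r)

# Illustrative target rates generated from assumed "true" parameters plus noise;
# in practice q_x would come from national life tables.
ages = np.arange(1, 90)
true = [5e-4, 0.02, 0.10, 1e-3, 10.0, 20.0, 5e-5, 1.10]
rng = np.random.default_rng(8)
q_obs = hp_qx(ages, true) * np.exp(0.05 * rng.standard_normal(ages.size))

def residuals(log_params):
    # Work on log-parameters to keep all eight parameters positive during the fit.
    return np.log(hp_qx(ages, np.exp(log_params))) - np.log(q_obs)

fit = least_squares(residuals, x0=np.log(true) + 0.2, method="lm")
print(np.exp(fit.x))        # recovered A..H
```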
Quantum nonunital dynamics of spin-bath-assisted Fisher information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Xiang, E-mail: haoxiang-edu198126@163.com; Wu, Yinzhong
2016-04-15
The nonunital non-Markovian dynamics of qubits immersed in a spin bath is studied without any Markovian approximation. The environmental effects on the precision of quantum parameter estimation are taken into account. The time-dependent transfer matrix and inhomogeneity vector are obtained for the description of the open dynamical process. The dynamical behaviour of one qubit coupled to a spin bath is described geometrically by the Bloch vector. It is found that the nonunital non-Markovian effects can improve the precision of quantum parameter estimation. This result contributes to environment-assisted quantum information theory.
A product Pearson-type VII density distribution
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2008-01-01
The Pearson-type VII distributions (containing the Student's t distributions) are becoming increasingly prominent and are being considered as competitors to the normal distribution. Motivated by real examples in decision sciences, Bayesian statistics, probability theory and physics, a new Pearson-type VII distribution is introduced by taking the product of two Pearson-type VII pdfs. Various structural properties of this distribution are derived, including its cdf, moments, mean deviation about the mean, mean deviation about the median, entropy, asymptotic distribution of the extreme order statistics, maximum likelihood estimates and the Fisher information matrix. Finally, an application to a Bayesian testing problem is illustrated.
Analysis on the hot spot and trend of the foreign assembly building research
NASA Astrophysics Data System (ADS)
Bi, Xiaoqing; Luo, Yanbing
2017-03-01
First, the paper analyzes the research front of assembly (prefabricated) building over the past 15 years. The study mainly adopts co-word analysis: a co-word matrix is constructed, converted into a correlation matrix and then into a dissimilarity matrix, and on this basis factor analysis, cluster analysis and multidimensional scaling analysis are used to map the structure of the prefabricated construction field. Finally, the results of the analysis are discussed; the current foci of foreign prefabricated-construction research are summarized into seven aspects: embankment construction, wood construction, bridge construction, crane layout, PCM walls and glass systems, neural-network-based testing, and energy saving and recycling; and the future development trend of the field is forecast.
Fishers' knowledge about fish trophic interactions in the southeastern Brazilian coast.
Ramires, Milena; Clauzet, Mariana; Barrella, Walter; Rotundo, Matheus M; Silvano, Renato Am; Begossi, Alpina
2015-03-05
Data derived from studies of fishers' local ecological knowledge (LEK) can be invaluable to the proposal of new studies and more appropriate management strategies. This study analyzed fishers' LEK about the trophic relationships of fishes on the southeastern Brazilian coast, comparing fishers' LEK with scientific knowledge to provide new hypotheses. Initial contacts with fishers were made through informal visits to their residences, to explain the research goals, meet the fishers and their families, check the number of resident fishers and ask for the fishers' consent to participate in the research. After this initial contact, fishers were selected for interviews using the technique of snowball sampling. Fishers indicated by others, and who met the criteria for inclusion in the research, were interviewed using a semi-structured standard questionnaire. Twenty-six artisanal fishers from three communities of Ilhabela (Jabaquara, Fome and Serraria) were interviewed. The interviewed fishers showed detailed knowledge of the trophic interactions of the studied coastal fishes: they mentioned 17 food items for these fishes, as well as six fishes and three mammals as fish predators. The most frequently mentioned food items were small fish, shrimps and crabs, while the most frequently mentioned predators were large reef fishes. Fishers also mentioned some predators, such as sea otters, that have not been reported in the biological literature and are poorly known. The LEK of the studied fishers showed a high degree of concordance with the scientific literature regarding fish diet. This study demonstrates the value of fishers' LEK for improving fisheries research and management, as well as the need to increase collaboration among managers, biologists and fishers.
Human-model hybrid Korean air quality forecasting system.
Chang, Lim-Seok; Cho, Ara; Park, Hyunju; Nam, Kipyo; Kim, Deokrae; Hong, Ji-Hyoung; Song, Chang-Keun
2016-09-01
The Korean national air quality forecasting system, consisting of the Weather Research and Forecasting model, the Sparse Matrix Operator Kernel Emissions, and the Community Modeling and Analysis (CMAQ) system, commenced on August 31, 2013, with particulate matter (PM) and ozone as target pollutants. Factors contributing to PM forecasting accuracy include the CMAQ inputs of meteorological fields and emissions, the forecasters' capacity, and inherent CMAQ limits. Four numerical experiments were conducted, including two global meteorological inputs from the Global Forecast System (GFS) and the Unified Model (UM), two emission inventories from the Model Intercomparison Study Asia (MICS-Asia) and the Intercontinental Chemical Transport Experiment (INTEX-B) for Northeast Asia with the Clean Air Policy Support System (CAPSS) for South Korea, and data assimilation of the Monitoring Atmospheric Composition and Climate (MACC) product. Significant PM underpredictions were found with both emission inventories for PM mass and major components (sulfate and organic carbon). CMAQ predicts PM2.5 much better than PM10 (NMB of PM2.5: -20~-25%, PM10: -43~-47%). Forecaster errors usually occurred on the day after a high PM event: once CMAQ fails to predict a high PM event, forecasters are likely to dismiss the model predictions on the following day, even when they turn out to be correct. The best combination of CMAQ inputs is the set of UM global meteorological fields with MICS-Asia and CAPSS 2010 emissions, with an NMB of -12.3%, an RMSE of 16.6 μg/m³ and an R² of 0.68. By using MACC data as initial and boundary conditions, the performance of CMAQ would be improved, especially in the case of undefined coarse emissions. A variety of methods, such as ensembles and data assimilation, are considered to further improve the accuracy of air quality forecasting, especially for high PM events, so that their accuracy becomes comparable to that for all cases. The growing use of the air quality forecast has led the public to strongly demand improvements in the accuracy of the national forecasts. In this study, we investigated the problems in the current forecasting system as well as various alternatives to solve them. Such efforts to improve the accuracy of the forecast are expected to contribute to the protection of public health by increasing the availability of the forecast system.
NASA Astrophysics Data System (ADS)
Rivière, G.; Hua, B. L.
2004-10-01
A new perturbation initialization method is used to quantify error growth due to inaccuracies of the forecast model initial conditions in a quasigeostrophic box ocean model describing a wind-driven double-gyre circulation. This method is based on recent analytical results on the Lagrangian alignment dynamics of the perturbation velocity vector in quasigeostrophic flows. More specifically, it consists in initializing a unique perturbation from the sole knowledge of the control flow properties at the initial time of the forecast, whose velocity vector orientation satisfies a Lagrangian equilibrium criterion. This alignment-based initialization method is hereafter denoted the AI method. In terms of the spatial distribution of the errors, the AI error forecast compares favorably with the mean error obtained with a Monte Carlo ensemble prediction. It is shown that the AI forecast is on average as efficient as the error forecast initialized with the leading singular vector for the palenstrophy norm, and significantly more efficient than those for the total energy and enstrophy norms. Furthermore, a more precise examination shows that the AI forecast is systematically relevant for all control flows, whereas the palenstrophy singular vector forecast sometimes leads to very good scores and sometimes to very bad ones. A principal component analysis at the final time of the forecast shows that the AI mode spatial structure is comparable to that of the first eigenvector of the error covariance matrix for a "bred mode" ensemble. Furthermore, the kinetic energy of the AI mode grows at the same constant rate as that of the "bred modes" from the initial time to the final time of the forecast and is therefore characterized by a sustained phase of error growth. In this sense, the AI mode, based on the Lagrangian dynamics of the perturbation velocity orientation, provides a rationale for the "bred mode" behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler
This paper outlines a data-driven, distributionally robust approach to solve chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; in particular, the mean and covariance matrix of the forecast errors are updated online and leveraged to enforce voltage regulation with a predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
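A minimal sketch of the Chebyshev (Cantelli) tightening used in such distributionally robust reformulations: for any error distribution with the tracked mean and covariance, the chance constraint is replaced by a deterministic margin; the sensitivity vector, voltage limit and error history below are assumptions.

```python
import numpy as np

def drcc_margin(a, mu, Sigma, eps):
    """Distributionally robust (Chebyshev/Cantelli) tightening so that Pr(a'e exceeds the margin) <= eps
    for any error distribution with the given mean and covariance."""
    std = np.sqrt(a @ Sigma @ a)
    return a @ mu + np.sqrt((1.0 - eps) / eps) * std

# Assumed toy setting: a linearized voltage constraint v(x) + a'e <= v_max,
# where e is the load/PV forecast error with an empirically tracked mean and covariance.
rng = np.random.default_rng(9)
errors = 0.01 * rng.standard_normal((500, 4))          # online history of forecast errors
mu, Sigma = errors.mean(axis=0), np.cov(errors, rowvar=False)

a = np.array([0.3, 0.2, 0.4, 0.1])                     # sensitivity of the voltage to each error
v_max, eps = 1.05, 0.05
margin = drcc_margin(a, mu, Sigma, eps)

# The deterministic surrogate constraint becomes v(x) <= v_max - margin.
print(f"tightened voltage limit: {v_max - margin:.4f} p.u.")
```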
Accelerated Sensitivity Analysis in High-Dimensional Stochastic Reaction Networks
Arampatzis, Georgios; Katsoulakis, Markos A.; Pantazis, Yannis
2015-01-01
Existing sensitivity analysis approaches are not able to handle efficiently stochastic reaction networks with a large number of parameters and species, which are typical in the modeling and simulation of complex biochemical phenomena. In this paper, a two-step strategy for parametric sensitivity analysis for such systems is proposed, exploiting advantages and synergies between two recently proposed sensitivity analysis methodologies for stochastic dynamics. The first method performs sensitivity analysis of the stochastic dynamics by means of the Fisher Information Matrix on the underlying distribution of the trajectories; the second method is a reduced-variance, finite-difference, gradient-type sensitivity approach relying on stochastic coupling techniques for variance reduction. Here we demonstrate that these two methods can be combined and deployed together by means of a new sensitivity bound which incorporates the variance of the quantity of interest as well as the Fisher Information Matrix estimated from the first method. The first step of the proposed strategy labels sensitivities using the bound and screens out the insensitive parameters in a controlled manner. In the second step of the proposed strategy, a finite-difference method is applied only for the sensitivity estimation of the (potentially) sensitive parameters that have not been screened out in the first step. Results on an epidermal growth factor network with fifty parameters and on a protein homeostasis with eighty parameters demonstrate that the proposed strategy is able to quickly discover and discard the insensitive parameters and in the remaining potentially sensitive parameters it accurately estimates the sensitivities. The new sensitivity strategy can be several times faster than current state-of-the-art approaches that test all parameters, especially in “sloppy” systems. In particular, the computational acceleration is quantified by the ratio between the total number of parameters over the number of the sensitive parameters. PMID:26161544
Takahama, A; Rôças, I N; Faustino, I S P; Alves, F R F; Azevedo, R S; Gomes, C C; Araújo-Filho, W R; Siqueira, J F
2018-07-01
To evaluate the association between the presence of selected bacterial species/groups in the apical root canal and expression of mediators of soft and bone tissue destruction in apical periodontitis lesions. Relationships between bacteria and some other features of apical periodontitis were also investigated. Seventeen freshly extracted teeth with pulp necrosis and apical periodontitis were included. The apical root segment was sectioned and cryopulverized; DNA was extracted and evaluated for the presence of 9 bacterial species/groups using real-time polymerase chain reaction. Lesions were processed for histopathological and immunohistochemical analyses, which targeted matrix metalloproteinase-2 (MMP-2) and -9 (MMP-9), receptor activator of NFκB (RANK), RANK ligand (RANKL) and osteoprotegerin (OPG). Associations of the target bacteria with expression of these mediators, presence of symptoms, lesion size and histopathological diagnosis were evaluated. Data were analysed using the chi-square, Fisher's exact, Mann-Whitney and Pearson tests. P values lower than 0.05 were considered significant. All pulverized apical root samples were positive for bacteria. The most prevalent taxa were Actinobacteria (53%), Streptococcus species (35%), Fusobacterium species and Parvimonas micra (18%). The target mediators exhibited a high mean expression in the lesions (MMP-2: 82%; MMP-9: 73%; RANK: 78%; RANKL; 81%; OPG; 83%). Mean RANKL:OPG ratio was significantly higher in granulomas than cysts (P < 0.05, Mann-Whitney test). Actinobacteria were associated with granulomas, higher MMP-2 expression, lower OPG expression, and higher RANKL:OPG ratio (P < 0.05 for all, Fisher's exact test or Mann-Whitney test). No other significant associations were found. Actinobacteria may play an important role in the active phase of soft and bone tissue destruction in apical periodontitis. © 2018 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Ogungbenro, Kayode; Aarons, Leon
2011-08-01
In the recent years, interest in the application of experimental design theory to population pharmacokinetic (PK) and pharmacodynamic (PD) experiments has increased. The aim is to improve the efficiency and the precision with which parameters are estimated during data analysis and sometimes to increase the power and reduce the sample size required for hypothesis testing. The population Fisher information matrix (PFIM) has been described for uniresponse and multiresponse population PK experiments for design evaluation and optimisation. Despite these developments and availability of tools for optimal design of population PK and PD experiments much of the effort has been focused on repeated continuous variable measurements with less work being done on repeated discrete type measurements. Discrete data arise mainly in PDs e.g. ordinal, nominal, dichotomous or count measurements. This paper implements expressions for the PFIM for repeated ordinal, dichotomous and count measurements based on analysis by a mixed-effects modelling technique. Three simulation studies were used to investigate the performance of the expressions. Example 1 is based on repeated dichotomous measurements, Example 2 is based on repeated count measurements and Example 3 is based on repeated ordinal measurements. Data simulated in MATLAB were analysed using NONMEM (Laplace method) and the glmmML package in R (Laplace and adaptive Gauss-Hermite quadrature methods). The results obtained for Examples 1 and 2 showed good agreement between the relative standard errors obtained using the PFIM and simulations. The results obtained for Example 3 showed the importance of sampling at the most informative time points. Implementation of these expressions will provide the opportunity for efficient design of population PD experiments that involve discrete type data through design evaluation and optimisation.
Riviere, Marie-Karelle; Ueckert, Sebastian; Mentré, France
2016-10-01
Non-linear mixed effect models (NLMEMs) are widely used for the analysis of longitudinal data. To design these studies, optimal design based on the expected Fisher information matrix (FIM) can be used instead of performing time-consuming clinical trial simulations. In recent years, estimation algorithms for NLMEMs have transitioned from linearization toward more exact higher-order methods. Optimal design, on the other hand, has mainly relied on first-order (FO) linearization to calculate the FIM. Although efficient in general, FO cannot be applied to complex non-linear models and with difficulty in studies with discrete data. We propose an approach to evaluate the expected FIM in NLMEMs for both discrete and continuous outcomes. We used Markov Chain Monte Carlo (MCMC) to integrate the derivatives of the log-likelihood over the random effects, and Monte Carlo to evaluate its expectation w.r.t. the observations. Our method was implemented in R using Stan, which efficiently draws MCMC samples and calculates partial derivatives of the log-likelihood. Evaluated on several examples, our approach showed good performance with relative standard errors (RSEs) close to those obtained by simulations. We studied the influence of the number of MC and MCMC samples and computed the uncertainty of the FIM evaluation. We also compared our approach to Adaptive Gaussian Quadrature, Laplace approximation, and FO. Our method is available in R-package MIXFIM and can be used to evaluate the FIM, its determinant with confidence intervals (CIs), and RSEs with CIs. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Fisher information and Rényi dimensions: A thermodynamical formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Godó, B.; Nagy, Á.
The relation between the Fisher information and Rényi dimensions is established: the Fisher information can be expressed as a linear combination of the first and second derivatives of the Rényi dimensions with respect to the Rényi parameter β. The Rényi parameter β is the parameter of the Fisher information. A thermodynamical description based on the Fisher information with β being the inverse temperature is introduced for chaotic systems. The link between the Fisher information and the heat capacity is emphasized, and the Fisher heat capacity is introduced.
Analysis of the Fisher solution
NASA Astrophysics Data System (ADS)
Abdolrahimi, Shohreh; Shoom, Andrey A.
2010-01-01
We study the d-dimensional Fisher solution which represents a static, spherically symmetric, asymptotically flat spacetime with a massless scalar field. The solution has two parameters, the mass M and the “scalar charge” Σ. The Fisher solution has a naked curvature singularity which divides the spacetime manifold into two disconnected parts. The part which is asymptotically flat we call the Fisher spacetime, and another part we call the Fisher universe. The d-dimensional Schwarzschild-Tangherlini solution and the Fisher solution belong to the same theory and are dual to each other. The duality transformation acting in the parameter space (M,Σ) maps the exterior region of the Schwarzschild-Tangherlini black hole into the Fisher spacetime which has a naked timelike singularity, and interior region of the black hole into the Fisher universe, which is an anisotropic expanding-contracting universe and which has two spacelike singularities representing its “big bang” and “big crunch.” The big bang singularity and the singularity of the Fisher spacetime are radially weak in the sense that a 1-dimensional object moving along a timelike radial geodesic can arrive to the singularities intact. At the vicinity of the singularity the Fisher spacetime of nonzero mass has a region where its Misner-Sharp energy is negative. The Fisher universe has a marginally trapped surface corresponding to the state of its maximal expansion in the angular directions. These results and derived relations between geometric quantities of the Fisher spacetime, the Fisher universe, and the Schwarzschild-Tangherlini black hole may suggest that the massless scalar field transforms the black hole event horizon into the naked radially weak disjoint singularities of the Fisher spacetime and the Fisher universe which are “dual to the horizon.”
Future Scenarios in Communications. [Student's Guide.] Preparing for Tomorrow's World.
ERIC Educational Resources Information Center
Iozzi, Louis A.; And Others
The purpose of this module is to introduce students (grades 7-8) to the concept of change and factors influencing change. The module is composed of two major sections. Section 1 examines the development of the telephone system in the United States and introduces four futures forecasting techniques (Delphi probe, cross-impact matrix, trend…
A Global Estimate of the Number of Coral Reef Fishers.
Teh, Louise S L; Teh, Lydia C L; Sumaila, U Rashid
2013-01-01
Overfishing threatens coral reefs worldwide, yet there is no reliable estimate on the number of reef fishers globally. We address this data gap by quantifying the number of reef fishers on a global scale, using two approaches - the first estimates reef fishers as a proportion of the total number of marine fishers in a country, based on the ratio of reef-related to total marine fish landed values. The second estimates reef fishers as a function of coral reef area, rural coastal population, and fishing pressure. In total, we find that there are 6 million reef fishers in 99 reef countries and territories worldwide, of which at least 25% are reef gleaners. Our estimates are an improvement over most existing fisher population statistics, which tend to omit accounting for gleaners and reef fishers. Our results suggest that slightly over a quarter of the world's small-scale fishers fish on coral reefs, and half of all coral reef fishers are in Southeast Asia. Coral reefs evidently support the socio-economic well-being of numerous coastal communities. By quantifying the number of people who are employed as reef fishers, we provide decision-makers with an important input into planning for sustainable coral reef fisheries at the appropriate scale.
Effective theory of dark energy at redshift survey scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleyzes, Jérôme; Mancarella, Michele; Vernizzi, Filippo
2016-02-01
We explore the phenomenological consequences of general late-time modifications of gravity in the quasi-static approximation, in the case where cold dark matter is non-minimally coupled to the gravitational sector. Assuming spectroscopic and photometric surveys with configuration parameters similar to those of the Euclid mission, we derive constraints on our effective description from three observables: the galaxy power spectrum in redshift space, the tomographic weak-lensing shear power spectrum and the correlation spectrum between the integrated Sachs-Wolfe effect and the galaxy distribution. In particular, with ΛCDM as fiducial model and a specific choice for the time dependence of our effective functions, we perform a Fisher matrix analysis and find that the unmarginalized 68% CL errors on the parameters describing the modifications of gravity are of order σ ∼ 10⁻²–10⁻³. We also consider two other fiducial models. A non-minimal coupling of CDM enhances the effects of modified gravity and reduces the above statistical errors accordingly. In all cases, we find that the parameters are highly degenerate, which prevents the inversion of the Fisher matrices. Some of these degeneracies can be broken by combining all three observational probes.
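For readers less familiar with the mechanics behind such statements, a toy sketch of a Fisher forecast: unmarginalized errors come from the diagonal of F, marginalized errors from the diagonal of F⁻¹, and independent probes combine by adding their Fisher matrices, which is how degeneracies get broken; the matrices below are illustrative numbers only.

```python
import numpy as np

# Toy 2-parameter Fisher matrices for two "probes"; values are illustrative only.
F_probe1 = np.array([[4.0e4, 1.99e4],
                     [1.99e4, 1.0e4]])          # nearly degenerate direction
F_probe2 = np.array([[1.0e4, -0.5e4],
                     [-0.5e4, 2.0e4]])          # different degeneracy direction

def errors(F):
    unmarginalized = 1.0 / np.sqrt(np.diag(F))          # all other parameters held fixed
    marginalized = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized over the others
    return unmarginalized, marginalized

print(errors(F_probe1))                # marginalized errors blow up for the degenerate probe
print(errors(F_probe1 + F_probe2))     # independent probes combine by adding Fisher matrices
```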
Application Of Multi-grid Method On China Seas' Temperature Forecast
NASA Astrophysics Data System (ADS)
Li, W.; Xie, Y.; He, Z.; Liu, K.; Han, G.; Ma, J.; Li, D.
2006-12-01
Correlation scales have been used in the traditional scheme of three-dimensional variational (3D-Var) data assimilation to estimate the background error covariance for numerical forecasts and reanalyses of the atmosphere and ocean for decades. However, this scheme still has some drawbacks. First, the correlation scales are difficult to determine accurately. Second, the positive definiteness of the first-guess error covariance matrix cannot be guaranteed unless the correlation scales are sufficiently small. Xie et al. (2005) indicated that a traditional 3D-Var only corrects errors at certain wavelengths and that its accuracy depends on the accuracy of the first-guess covariance; in general, short-wavelength errors cannot be well corrected until long-wavelength errors are corrected, and an inaccurate first-guess covariance may mistakenly interpret long-wave errors as short-wave ones and result in an erroneous analysis. For the purpose of quickly minimizing the errors of long and short waves successively, a new 3D-Var data assimilation scheme, called the multi-grid data assimilation scheme, is proposed in this paper. By assimilating shipboard SST and temperature profile data into a numerical model of the China Seas, we applied this scheme in a two-month data assimilation and forecast experiment with favorable results. Compared with the traditional 3D-Var scheme, the new scheme has higher forecast accuracy and a lower forecast root-mean-square (RMS) error. Furthermore, the scheme was applied to assimilate shipboard SST, AVHRR Pathfinder Version 5.0 SST and temperature profiles at the same time, and a ten-month forecast experiment on the sea temperature of the China Seas was carried out, in which successful forecast results were obtained. In particular, the new scheme demonstrated great numerical efficiency in these analyses.
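A one-dimensional toy sketch of the sequential coarse-to-fine idea (not the operational scheme): the observed increment is first fitted on a coarse control grid, correcting long waves, and the residual is then fitted on progressively finer grids.

```python
import numpy as np

def hat_basis(grid, x):
    """Piecewise-linear (hat-function) basis on `grid`, evaluated at points `x`."""
    B = np.zeros((len(x), len(grid)))
    for j in range(len(grid)):
        e_j = np.zeros(len(grid)); e_j[j] = 1.0
        B[:, j] = np.interp(x, grid, e_j)
    return B

rng = np.random.default_rng(10)
x_obs = np.sort(rng.uniform(0, 1, 200))
truth = np.sin(2 * np.pi * x_obs) + 0.3 * np.sin(12 * np.pi * x_obs)
innovations = truth + 0.05 * rng.standard_normal(x_obs.size)   # observation-minus-background increments

# Coarse-to-fine sequential corrections: long waves first, then the residual short waves.
analysis = np.zeros_like(x_obs)
for n_nodes in (5, 17, 65):
    B = hat_basis(np.linspace(0, 1, n_nodes), x_obs)
    coef, *_ = np.linalg.lstsq(B, innovations - analysis, rcond=None)
    analysis += B @ coef

print(float(np.sqrt(np.mean((analysis - truth) ** 2))))        # RMS error of the final analysis
```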
Hybrid vs Adaptive Ensemble Kalman Filtering for Storm Surge Forecasting
NASA Astrophysics Data System (ADS)
Altaf, M. U.; Raboudi, N.; Gharamti, M. E.; Dawson, C.; McCabe, M. F.; Hoteit, I.
2014-12-01
Recent storm surge events due to Hurricanes in the Gulf of Mexico have motivated the efforts to accurately forecast water levels. Toward this goal, a parallel architecture has been implemented based on a high resolution storm surge model, ADCIRC. However the accuracy of the model notably depends on the quality and the recentness of the input data (mainly winds and bathymetry), model parameters (e.g. wind and bottom drag coefficients), and the resolution of the model grid. Given all these uncertainties in the system, the challenge is to build an efficient prediction system capable of providing accurate forecasts enough ahead of time for the authorities to evacuate the areas at risk. We have developed an ensemble-based data assimilation system to frequently assimilate available data into the ADCIRC model in order to improve the accuracy of the model. In this contribution we study and analyze the performances of different ensemble Kalman filter methodologies for efficient short-range storm surge forecasting, the aim being to produce the most accurate forecasts at the lowest possible computing time. Using Hurricane Ike meteorological data to force the ADCIRC model over a domain including the Gulf of Mexico coastline, we implement and compare the forecasts of the standard EnKF, the hybrid EnKF and an adaptive EnKF. The last two schemes have been introduced as efficient tools for enhancing the behavior of the EnKF when implemented with small ensembles by exploiting information from a static background covariance matrix. Covariance inflation and localization are implemented in all these filters. Our results suggest that both the hybrid and the adaptive approach provide significantly better forecasts than those resulting from the standard EnKF, even when implemented with much smaller ensembles.
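A minimal sketch of the hybrid background covariance, P = αB + (1-α)P_ens, feeding a standard Kalman gain; the grid size, ensemble size, static covariance and blending weight below are assumptions.

```python
import numpy as np

def kalman_gain(P, H, R):
    """Standard Kalman gain for background covariance P, observation operator H, obs-error covariance R."""
    S = H @ P @ H.T + R
    return P @ H.T @ np.linalg.inv(S)

rng = np.random.default_rng(11)
n, n_ens, n_obs = 40, 10, 8

ensemble = rng.standard_normal((n, n_ens))             # small forecast ensemble (state x members)
anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
P_ens = anomalies @ anomalies.T / (n_ens - 1)          # rank-deficient sample covariance

dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B_static = np.exp(-dist / 5.0)                         # climatological (static) background covariance
alpha = 0.5                                            # blending weight of the hybrid scheme
P_hybrid = alpha * B_static + (1.0 - alpha) * P_ens    # hybrid background covariance

obs_idx = np.arange(0, n, n // n_obs)                  # observe every 5th grid point
H = np.zeros((n_obs, n)); H[np.arange(n_obs), obs_idx] = 1.0
R = 0.1 * np.eye(n_obs)
K = kalman_gain(P_hybrid, H, R)
print(K.shape)                                         # gain used to update the ensemble mean
```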
Historical harvest and incidental capture of fishers in California
Jeffrey C. Lewis; William J. Zielinski
1996-01-01
Recent petitions to list the fisher (Martes pennanti) under the Endangered Species Act have brought attention to fisher conservation. Although commercial trapping of fishers in California ended in 1946, summarizing the commercial harvest data can provide a historical perspective to fisher conservation and may indicate the prevalence of incidental...
Application of Fisher Information to Complex Dynamic Systems
Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...
Application of Fisher Information to Complex Dynamic Systems (Tucson)
Fisher information was developed by the statistician Ronald Fisher as a measure of the information obtainable from data being used to fit a related parameter. Starting from the work of Ronald Fisher [1] and B. Roy Frieden [2], we have developed Fisher information as a measure of order ...
Probing primordial features with next-generation photometric and radio surveys
NASA Astrophysics Data System (ADS)
Ballardini, M.; Finelli, F.; Maartens, R.; Moscardini, L.
2018-04-01
We investigate the possibility of using future photometric and radio surveys to constrain the power spectrum of primordial fluctuations that is predicted by inflationary models with a violation of the slow-roll phase. We forecast constraints with a Fisher analysis on the amplitude of the parametrized features on ultra-large scales, in order to assess whether these could be distinguishable over the cosmic variance. We find that the next generation of photometric and radio surveys has the potential to test these models at a sensitivity better than current CMB experiments and that the synergy between galaxy and CMB observations is able to constrain models with many extra parameters. In particular, an SKA continuum survey with a huge sky coverage and a flux threshold of a few μJy could confirm the presence of a new phase in the early Universe at more than 3σ.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-19
... (Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application... INFORMATION: Title: Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application, VA Forms 10-0408 and 10-0408a. OMB Control Number: 2900-0630. Type of Review...
Discriminant projective non-negative matrix factorization.
Guan, Naiyang; Zhang, Xiang; Luo, Zhigang; Tao, Dacheng; Yang, Xuejun
2013-01-01
Projective non-negative matrix factorization (PNMF) projects high-dimensional non-negative examples X onto a lower-dimensional subspace spanned by a non-negative basis W and considers Wᵀ X as their coefficients, i.e., X ≈ W Wᵀ X. Since PNMF learns the natural parts-based representation W of X, it has been widely used in many fields such as pattern recognition and computer vision. However, PNMF does not perform well in classification tasks because it completely ignores the label information of the dataset. This paper proposes a Discriminant PNMF method (DPNMF) to overcome this deficiency. In particular, DPNMF applies Fisher's criterion to PNMF in order to utilize the label information. Similar to PNMF, DPNMF learns a single non-negative basis matrix and needs less computational burden than NMF. In contrast to PNMF, DPNMF maximizes the distance between the centers of any two classes of examples while minimizing the distance between any two examples of the same class in the lower-dimensional subspace, and thus has more discriminant power. We develop a multiplicative update rule to solve DPNMF and prove its convergence. Experimental results on four popular face image datasets confirm its effectiveness compared with representative NMF and PNMF algorithms.
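A naive sketch of the (unsupervised) PNMF objective min ||X - W Wᵀ X||² solved by projected gradient descent; the published PNMF and DPNMF algorithms use multiplicative updates and, for DPNMF, Fisher terms, so this is only an illustration of the factorization being discussed, on synthetic data.

```python
import numpy as np

def pnmf(X, rank, iters=1000, lr=1e-2, seed=0):
    """Naive projected-gradient sketch of the PNMF objective min ||X - W W^T X||_F^2 with W >= 0.
    (Published PNMF/DPNMF algorithms use multiplicative update rules; this is only an illustration.)"""
    rng = np.random.default_rng(seed)
    W = np.abs(rng.standard_normal((X.shape[0], rank))) * 0.1
    A = X @ X.T                                   # sufficient statistic of the objective
    for _ in range(iters):
        # Gradient of ||X - W W^T X||_F^2 w.r.t. W: -4 A W + 2 A W (W^T W) + 2 W (W^T A W).
        grad = -4.0 * A @ W + 2.0 * A @ W @ (W.T @ W) + 2.0 * W @ (W.T @ A @ W)
        W = np.maximum(W - lr * grad / np.linalg.norm(grad), 0.0)   # normalized step, project to W >= 0
    return W

rng = np.random.default_rng(12)
X = np.abs(rng.standard_normal((30, 200)))        # non-negative data: 30 features, 200 examples
W = pnmf(X, rank=5)
rel_err = np.linalg.norm(X - W @ (W.T @ X)) / np.linalg.norm(X)
print(W.shape, round(float(rel_err), 3))
```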
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Le; Yu, Yu; Zhang, Pengjie, E-mail: lezhang@sjtu.edu.cn
Photo-z error is one of the major sources of systematics degrading the accuracy of weak-lensing cosmological inferences. Zhang et al. proposed a self-calibration method combining galaxy–galaxy correlations and galaxy–shear correlations between different photo-z bins. Fisher matrix analysis shows that it can determine the rate of photo-z outliers at a level of 0.01%–1% using photometric data alone, without relying on any prior knowledge. In this paper, we develop a new algorithm to implement this method by solving a constrained nonlinear optimization problem arising in the self-calibration process. Based on the techniques of fixed-point iteration and non-negative matrix factorization, the proposed algorithm can efficiently and robustly reconstruct the scattering probabilities between the true-z and photo-z bins. The algorithm has been tested extensively by applying it to mock data from simulated stage IV weak-lensing projects. We find that the algorithm provides a successful recovery of the scatter rates at the level of 0.01%–1%, and of the true mean redshifts of photo-z bins at the level of 0.001, which may satisfy the requirements of future lensing surveys.
Discriminant Projective Non-Negative Matrix Factorization
Guan, Naiyang; Zhang, Xiang; Luo, Zhigang; Tao, Dacheng; Yang, Xuejun
2013-01-01
Projective non-negative matrix factorization (PNMF) projects high-dimensional non-negative examples X onto a lower-dimensional subspace spanned by a non-negative basis W and considers W^T X as their coefficients, i.e., X ≈ WW^T X. Since PNMF learns the natural parts-based representation W of X, it has been widely used in many fields such as pattern recognition and computer vision. However, PNMF does not perform well in classification tasks because it completely ignores the label information of the dataset. This paper proposes a Discriminant PNMF method (DPNMF) to overcome this deficiency. In particular, DPNMF applies Fisher's criterion to PNMF to exploit the label information. Like PNMF, DPNMF learns a single non-negative basis matrix and incurs a lower computational burden than NMF. In contrast to PNMF, DPNMF maximizes the distance between the centers of any two classes of examples while minimizing the distance between any two examples of the same class in the lower-dimensional subspace, and thus has more discriminant power. We develop a multiplicative update rule to solve DPNMF and prove its convergence. Experimental results on four popular face image datasets confirm its effectiveness compared with representative NMF and PNMF algorithms. PMID:24376680
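The multiplicative update derived in the paper is specific to DPNMF; as a point of reference only, the sketch below implements the standard Lee–Seung multiplicative updates for plain NMF (X ≈ WH), which illustrate the style of update that PNMF and DPNMF build on.

```python
import numpy as np

def nmf(X, rank, iters=200, eps=1e-9):
    """Standard multiplicative-update NMF (Lee & Seung), X ~= W @ H.
    Illustrative only; the DPNMF rule additionally folds in Fisher's
    criterion and the projective constraint X ~= W W^T X."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update coefficients
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update basis
    return W, H

X = np.abs(np.random.default_rng(1).random((100, 40)))  # non-negative data
W, H = nmf(X, rank=5)
print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))    # relative fit error
```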
Parameter estimation accuracies of Galactic binaries with eLISA
NASA Astrophysics Data System (ADS)
Błaut, Arkadiusz
2018-09-01
We study the parameter estimation accuracy of nearly monochromatic sources of gravitational waves with future eLISA-like detectors. eLISA will be capable of observing millions of such signals generated by orbiting pairs of compact objects (white dwarfs, neutron stars or black holes) and of resolving and estimating parameters of several thousand of them, providing crucial information regarding their orbital dynamics, formation rates and evolutionary paths. Using the Fisher matrix analysis we compare the accuracies of the estimated parameters for different mission designs defined by the GOAT advisory team established to assess the scientific capabilities and the technological issues of the eLISA-like missions.
Geometry of Theory Space and RG Flows
NASA Astrophysics Data System (ADS)
Kar, Sayan
The space of couplings of a given theory is the arena of interest in this article. Equipped with a metric ansatz akin to the Fisher information matrix in the space of parameters in statistics (similar metrics in physics are the Zamolodchikov metric or the O'Connor-Stephens metric), we investigate the geometry of theory space through a study of specific examples. We then look into renormalisation group flows in theory space and attempt to characterise such flows via their isotropic expansion, rotation and shear. Consequences arising from the evolution equation for the isotropic expansion are discussed. We conclude by pointing out generalisations and posing some open questions.
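For reference, the Fisher information metric taken as the ansatz above is, for a probability model p(x|θ) with parameter-space coordinates θ^i (standard definition; the second equality holds under the usual regularity conditions):

```latex
g_{ij}(\theta) \;=\; \mathbb{E}\!\left[\,\partial_i \log p(x\,|\,\theta)\;\partial_j \log p(x\,|\,\theta)\,\right]
             \;=\; -\,\mathbb{E}\!\left[\,\partial_i \partial_j \log p(x\,|\,\theta)\,\right].
```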
Sample-space-based feature extraction and class preserving projection for gene expression data.
Wang, Wenjun
2013-01-01
In order to overcome the problems of high computational complexity and serious matrix singularity in feature extraction using Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) on high-dimensional data, sample-space-based feature extraction is presented, which transfers the computation of feature extraction from gene space to sample space by representing the optimal transformation vector as a weighted sum of samples. The technique is used in the implementation of PCA, LDA and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and experimental results on gene expression data demonstrate the effectiveness of the method.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... (Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application... information technology. Title: Regulation on Application for Fisher Houses and Other Temporary Lodging and VHA Fisher House Application, VA Forms 10-0408 and 10-0408a. OMB Control Number: 2900-0630. Type of Review...
Approximate Joint Diagonalization and Geometric Mean of Symmetric Positive Definite Matrices
Congedo, Marco; Afsari, Bijan; Barachant, Alexandre; Moakher, Maher
2015-01-01
We explore the connection between two problems that have arisen independently in signal processing and related fields: the estimation of the geometric mean of a set of symmetric positive definite (SPD) matrices and their approximate joint diagonalization (AJD). Today there is considerable interest in estimating the geometric mean of an SPD matrix set in the manifold of SPD matrices endowed with the Fisher information metric. The resulting mean has several important invariance properties and has proven very useful in diverse engineering applications such as biomedical and image data processing. While for two SPD matrices the mean has an algebraic closed-form solution, for a set of more than two SPD matrices it can only be estimated by iterative algorithms. However, none of the existing iterative algorithms simultaneously offer fast convergence, low computational complexity per iteration and a guarantee of convergence. For this reason, other definitions of the geometric mean based on symmetric divergence measures, such as the Bhattacharyya divergence, have recently been considered. The resulting means, although possibly useful in practice, do not satisfy all desirable invariance properties. In this paper we consider geometric means of covariance matrices estimated on high-dimensional time series, assuming that the data are generated according to an instantaneous mixing model, which is very common in signal processing. We show that in these circumstances we can approximate the Fisher information geometric mean by employing an efficient AJD algorithm. Our approximation is in general much closer to the Fisher information geometric mean than its competitors and verifies many invariance properties. Furthermore, convergence is guaranteed, the computational complexity is low and the convergence rate is quadratic. The accuracy of this new geometric mean approximation is demonstrated by means of simulations. PMID:25919667
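The two-matrix closed form mentioned above is the geodesic midpoint under the Fisher (affine-invariant) metric; for SPD matrices A and B the standard expressions, quoted here for reference, are

```latex
A \,\#\, B \;=\; A^{1/2}\!\left(A^{-1/2} B\, A^{-1/2}\right)^{1/2} A^{1/2},
\qquad
\delta^2(A,B) \;=\; \left\lVert \log\!\left(A^{-1/2} B\, A^{-1/2}\right) \right\rVert_F^{2},
```

while the mean of more than two matrices is defined only implicitly as the minimizer of the summed squared distances and must be computed iteratively, which is where the AJD approximation enters.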
NASA Astrophysics Data System (ADS)
Bozhalkina, Yana; Timofeeva, Galina
2016-12-01
A mathematical model of a loan portfolio in the form of a controlled Markov chain with discrete time is considered. It is assumed that the coefficients of the migration matrix depend on corrective actions and external factors. Corrective actions include the process of receiving applications and interaction with existing solvent and insolvent clients. External factors are macroeconomic indicators, such as inflation and unemployment rates, exchange rates, consumer price indices, etc. Changes in corrective actions adjust the intensity of transitions in the migration matrix. A mathematical model for forecasting the credit portfolio structure that takes into account the cumulative impact of internal and external changes is obtained.
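A minimal sketch of the forecasting mechanics described above, using a hypothetical three-state rating scheme and made-up transition probabilities; the paper's dependence of the migration matrix on controls and macro factors is reduced here to a single scaling parameter.

```python
import numpy as np

# hypothetical states: 0 = performing, 1 = delinquent, 2 = default (absorbing)
P_base = np.array([[0.90, 0.08, 0.02],
                   [0.30, 0.50, 0.20],
                   [0.00, 0.00, 1.00]])

def adjust(P, intensity):
    """Toy stand-in for corrective actions / external factors: scale the
    off-diagonal transition intensities and renormalize each row."""
    Q = P * intensity
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, 1.0 - Q.sum(axis=1))
    return Q

x = np.array([0.85, 0.10, 0.05])   # current portfolio structure (shares per state)
P = adjust(P_base, intensity=0.9)
for t in range(12):                # 12-step structure forecast: x_{t+1} = x_t P
    x = x @ P
print(x)
```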
Cable Television: End of a Dream. The Network Project Notebook Number Eight.
ERIC Educational Resources Information Center
Columbia Univ., New York, NY. Network Project.
The Notebook is divided into two parts. The first half reprints the transcript of a radio documentary on cable television, one in a series of five MATRIX radio programs produced by the Network Project in 1974. It includes discussions of planning for the new technology and of its present control by corporate conglomerates, and forecasts a…
Observation-Based Dissipation and Input Terms for Spectral Wave Models, with End-User Testing
2014-09-30
scale influence of the Great Barrier Reef matrix on wave attenuation, Coral Reefs [published, refereed]; Ghantous, M., and A.V. Babanin, 2014: One... Observation-Based Dissipation and Input Terms for Spectral Wave Models... functions, based on advanced understanding of the physics of air-sea interactions, wave breaking and swell attenuation, in wave-forecast models. OBJECTIVES: The
Quantum chi-squared and goodness of fit testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Temme, Kristan; Verstraete, Frank
2015-01-15
A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are very much related to each other, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness-of-fit test for i.i.d. quantum states is developed and a max-min characterization of the optimal measurement is introduced. We find the quantum measurement which leads to both the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness-of-fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related, and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness-of-fit test.
Uncertainties in extracted parameters of a Gaussian emission line profile with continuum background.
Minin, Serge; Kamalabadi, Farzad
2009-12-20
We derive analytical equations for the uncertainties in parameters extracted by nonlinear least-squares fitting of a Gaussian emission function with an unknown continuum background component in the presence of additive white Gaussian noise. The derivation is based on the inversion of the full curvature matrix (equivalent to the Fisher information matrix) of the least-squares error, χ², in a four-variable fitting parameter space. The derived uncertainty formulas (equivalent to Cramér-Rao error bounds) are found to be in good agreement with the numerically computed uncertainties from a large ensemble of simulated measurements. The derived formulas can be used for estimating the minimum achievable errors for a given signal-to-noise ratio and for investigating some aspects of measurement setup trade-offs and optimization. While the intended application is Fabry-Perot spectroscopy for wind and temperature measurements in the upper atmosphere, the derivation is generic and applicable to other spectroscopy problems with a Gaussian line shape.
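A numerical counterpart of the derivation, under assumed parameter values: build the curvature (Fisher) matrix from the model Jacobian of a Gaussian line plus constant background in white Gaussian noise, then invert it for Cramér-Rao lower bounds.

```python
import numpy as np

def gauss_line(x, A, x0, w, b):
    """Gaussian line of amplitude A, center x0, width w on constant background b."""
    return A * np.exp(-0.5 * ((x - x0) / w) ** 2) + b

x = np.linspace(-5, 5, 201)
theta = np.array([10.0, 0.0, 1.0, 2.0])   # assumed A, x0, w, b
sigma = 0.5                                # white-noise standard deviation

# Jacobian of the model w.r.t. the four parameters (finite differences)
J = np.empty((x.size, theta.size))
for i in range(theta.size):
    dp = np.zeros_like(theta); dp[i] = 1e-6 * max(abs(theta[i]), 1.0)
    J[:, i] = (gauss_line(x, *(theta + dp)) - gauss_line(x, *(theta - dp))) / (2 * dp[i])

F = J.T @ J / sigma**2                     # Fisher (curvature) matrix
crb = np.sqrt(np.diag(np.linalg.inv(F)))   # Cramér-Rao bounds (1-sigma)
print(crb)
```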
Metric Tests for Curvature from Weak Lensing and Baryon Acoustic Oscillations
NASA Astrophysics Data System (ADS)
Bernstein, G.
2006-02-01
We describe a practical measurement of the curvature of the universe which, unlike current constraints, relies purely on the properties of the Robertson-Walker metric rather than any assumed model for the dynamics and content of the universe. The observable quantity is the cross-correlation between foreground mass and gravitational shear of background galaxies, which depends on the angular diameter distances d_A(z_l), d_A(z_s), and d_A(z_l, z_s) on the degenerate triangle formed by observer, source, and lens. In a flat universe, d_A(z_l, z_s) = d_A(z_s) - d_A(z_l), but in curved universes an additional term ~Ω_k appears and alters the lensing observables even if d_A(z) is fixed. We describe a method whereby weak-lensing data can be used to solve simultaneously for d_A and the curvature. This method is completely insensitive to the equation of state of the contents of the universe, or amendments to general relativity that alter the gravitational deflection of light or the growth of structure. The curvature estimate is also independent of biases in the photometric redshift scale. This measurement is shown to be subject to a degeneracy among d_A, Ω_k, and the galaxy bias factors that may be broken by using the same imaging data to measure the angular scale of baryon acoustic oscillations. Simplified estimates of the accuracy attainable by this method indicate that ambitious weak-lensing + baryon-oscillation surveys would measure Ω_k to an accuracy ~0.04 f_sky^(-1/2) (σ_lnz/0.04)^(1/2), where σ_lnz is the photometric redshift error. The Fisher-matrix formalism developed here is also useful for predicting bounds on curvature and other characteristics of parametric dark energy models. We forecast some representative error levels and compare ours to other analyses of the weak-lensing cross-correlation method. We find both curvature and parametric constraints to be surprisingly insensitive to systematic shear calibration errors.
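The extra curvature term referred to above follows from the Robertson-Walker metric. Writing D for the comoving angular-diameter distance, a standard identity (quoted for reference, not taken from the paper; the angular diameter distances above are d_A = D/(1+z)) is

```latex
D(z_l, z_s) \;=\; D(z_s)\sqrt{1 + \Omega_k \tfrac{H_0^2}{c^2}\, D(z_l)^2}
            \;-\; D(z_l)\sqrt{1 + \Omega_k \tfrac{H_0^2}{c^2}\, D(z_s)^2},
```

which reduces to D(z_s) - D(z_l) when Ω_k = 0.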
2016-08-01
ice have catastrophic effects on facilities, infrastructure, and military testing and training. Permafrost temperature, thickness, and geographic... treeline) and fire severity (~0 to ~100% SOL consumption), they provide an excellent suite of sites to test and quantify the effects of fire severity... stages... Table 6.1. Variables included in explanatory matrix for black spruce dominance. Table 6.2. Mixed effect
2016-08-01
catastrophic effects on facilities, infrastructure, and military testing and training. Permafrost temperature, thickness, and geographic continuity... and fire severity (~0 to ~100% SOL consumption), they provide an excellent suite of sites to test and quantify the effects of fire severity on plant... Table 6.1. Variables included in explanatory matrix for black spruce dominance. Table 6.2. Mixed effect model
Green, David S; Matthews, Sean M; Swiers, Robert C; Callas, Richard L; Scott Yaeger, J; Farber, Stuart L; Schwartz, Michael K; Powell, Roger A
2018-05-01
Determining how species coexist is critical for understanding functional diversity, niche partitioning and interspecific interactions. Identifying the direct and indirect interactions among sympatric carnivores that enable their coexistence is particularly important to elucidate because they are integral for maintaining ecosystem function. We studied the effects of removing nine fishers (Pekania pennanti) on their population dynamics and used this perturbation to elucidate the interspecific interactions among fishers, grey foxes (Urocyon cinereoargenteus) and ringtails (Bassariscus astutus). Grey foxes (family: Canidae) are likely to compete with fishers due to their similar body sizes and dietary overlap, and ringtails (family: Procyonidae), like fishers, are semi-arboreal species of conservation concern. We used spatial capture-recapture to estimate fisher population numbers, and dynamic occupancy models that incorporated interspecific interactions to investigate the effects members of these species had on the colonization and persistence of each other's site occupancy. The fisher population showed no change in density for up to 3 years following the removal of fishers for translocations. In contrast, fisher site occupancy decreased in the years immediately following the translocations. During this same time period, site occupancy by grey foxes increased and remained elevated through the end of the study. We found a complicated hierarchy among fishers, foxes and ringtails. Fishers affected grey fox site persistence negatively but had a positive effect on their colonization. Foxes had a positive effect on ringtail site colonization. Thus, fishers were the dominant small carnivore where present and negatively affected foxes directly and ringtails indirectly. Coexistence among the small carnivores we studied appears to reflect dynamic spatial partitioning. Conservation and management efforts should investigate how intraguild interactions may influence the recolonization of carnivores to previously occupied landscapes. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.
Multiple Factors Affect Socioeconomics and Wellbeing of Artisanal Sea Cucumber Fishers.
Purcell, Steven W; Ngaluafe, Poasi; Foale, Simon J; Cocks, Nicole; Cullis, Brian R; Lalavanua, Watisoni
2016-01-01
Small-scale fisheries are important to livelihoods and subsistence seafood consumption of millions of fishers. Sea cucumbers are fished worldwide for export to Asia, yet few studies have assessed factors affecting socioeconomics and wellbeing among fishers. We interviewed 476 men and women sea cucumber fishers at multiple villages within multiple locations in Fiji, Kiribati, Tonga and New Caledonia using structured questionnaires. Low rates of subsistence consumption confirmed a primary role of sea cucumbers in income security. Prices of sea cucumbers sold by fishers varied greatly among countries, depending on the species. Gender variation in landing prices could be due to women catching smaller sea cucumbers or because some traders take advantage of them. Dissatisfaction with fishery income was common (44% of fishers), especially for i-Kiribati fishers, male fishers, and fishers experiencing difficulty selling their catch, but was uncorrelated with sale prices. Income dissatisfaction worsened with age. The number of livelihood activities averaged 2.2-2.5 across countries, and varied significantly among locations. Sea cucumbers were often a primary source of income to fishers, especially in Tonga. Other common livelihood activities were fishing other marine resources, copra production in Kiribati, agriculture in Fiji, and salaried jobs in New Caledonia. Fishing other coastal and coral reef resources was the most common fall-back livelihood option if fishers were forced to exit the fishery. Our data highlight large disparities in subsistence consumption, gender-related price equity, and livelihood diversity among parallel artisanal fisheries. Improvement of supply chains in dispersed small-scale fisheries appears as a critical need for enhancing income and wellbeing of fishers. Strong evidence for co-dependence among small-scale fisheries, through fall-back livelihood preferences of fishers, suggests that resource managers must mitigate concomitant effects on other fisheries when considering fishery closures. That is likely to depend on livelihood diversification programs to take pressure off co-dependent fisheries.
Multiple Factors Affect Socioeconomics and Wellbeing of Artisanal Sea Cucumber Fishers
Ngaluafe, Poasi; Foale, Simon J.; Cocks, Nicole; Cullis, Brian R.; Lalavanua, Watisoni
2016-01-01
Small-scale fisheries are important to livelihoods and subsistence seafood consumption of millions of fishers. Sea cucumbers are fished worldwide for export to Asia, yet few studies have assessed factors affecting socioeconomics and wellbeing among fishers. We interviewed 476 men and women sea cucumber fishers at multiple villages within multiple locations in Fiji, Kiribati, Tonga and New Caledonia using structured questionnaires. Low rates of subsistence consumption confirmed a primary role of sea cucumbers in income security. Prices of sea cucumbers sold by fishers varied greatly among countries, depending on the species. Gender variation in landing prices could be due to women catching smaller sea cucumbers or because some traders take advantage of them. Dissatisfaction with fishery income was common (44% of fishers), especially for i-Kiribati fishers, male fishers, and fishers experiencing difficulty selling their catch, but was uncorrelated with sale prices. Income dissatisfaction worsened with age. The number of livelihood activities averaged 2.2–2.5 across countries, and varied significantly among locations. Sea cucumbers were often a primary source of income to fishers, especially in Tonga. Other common livelihood activities were fishing other marine resources, copra production in Kiribati, agriculture in Fiji, and salaried jobs in New Caledonia. Fishing other coastal and coral reef resources was the most common fall-back livelihood option if fishers were forced to exit the fishery. Our data highlight large disparities in subsistence consumption, gender-related price equity, and livelihood diversity among parallel artisanal fisheries. Improvement of supply chains in dispersed small-scale fisheries appears as a critical need for enhancing income and wellbeing of fishers. Strong evidence for co-dependence among small-scale fisheries, through fall-back livelihood preferences of fishers, suggests that resource managers must mitigate concomitant effects on other fisheries when considering fishery closures. That is likely to depend on livelihood diversification programs to take pressure off co-dependent fisheries. PMID:27930649
Sir Ronald A. Fisher and the International Biometric Society.
Billard, Lynne
2014-06-01
The year 2012 marks the 50th anniversary of the death of Sir Ronald A. Fisher, one of the two Fathers of Statistics and a Founder of the International Biometric Society (the "Society"). To celebrate the extraordinary genius of Fisher and the far-sighted vision of Fisher and Chester Bliss in organizing and promoting the formation of the Society, this article looks at the origins and growth of the Society, some of the key players and events, and especially the roles played by Fisher himself as the First President. A fresh look at Fisher the man, rather than the scientific genius, is also presented. © 2014, The International Biometric Society.
Distribution-Agnostic Stochastic Optimal Power Flow for Distribution Grids: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Kyri; Dall'Anese, Emiliano; Summers, Tyler
2016-09-01
This paper outlines a data-driven, distributionally robust approach to solving chance-constrained AC optimal power flow problems in distribution networks. Uncertain forecasts for loads and power generated by photovoltaic (PV) systems are considered, with the goal of minimizing PV curtailment while meeting power flow and voltage regulation constraints. A data-driven approach is utilized to develop a distributionally robust conservative convex approximation of the chance constraints; in particular, the mean and covariance matrix of the forecast errors are updated online and leveraged to enforce voltage regulation with a predetermined probability via Chebyshev-based bounds. By combining an accurate linear approximation of the AC power flow equations with the distributionally robust chance-constraint reformulation, the resulting optimization problem becomes convex and computationally tractable.
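Distributionally robust chance constraints of the kind described above are commonly enforced through a one-sided Chebyshev (Cantelli) bound. In generic form (not necessarily the exact constraint used in the paper), for a forecast error ξ with mean μ and covariance Σ entering a voltage expression a^T ξ:

```latex
\Pr\!\left(a^{\top}\xi \le b\right) \ge 1-\epsilon
\quad\Longleftarrow\quad
a^{\top}\mu \;+\; \sqrt{\tfrac{1-\epsilon}{\epsilon}}\;\sqrt{a^{\top}\Sigma\, a} \;\le\; b,
```

a conservative condition that is second-order-cone representable, hence convex, when a and b depend affinely on the decision variables.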
Cairoli, Andrea; Piovani, Duccio; Jensen, Henrik Jeldtoft
2014-12-31
We propose a new procedure to monitor and forecast the onset of transitions in high-dimensional complex systems. We illustrate our procedure with an application to the tangled-nature model of evolutionary ecology. The quasistable configurations of the full stochastic dynamics are taken as input for a stability analysis by means of the deterministic mean-field equations. Numerical analysis of the high-dimensional stability matrix allows us to identify unstable directions associated with eigenvalues having a positive real part. The overlap of the instantaneous configuration vector of the full stochastic system with the eigenvectors of the unstable directions of the deterministic mean-field approximation is found to be a good early warning of the transitions occurring intermittently.
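A sketch of the early-warning signal described above for a generic mean-field system dx/dt = f(x): compute the Jacobian (stability matrix) at the current quasistable configuration, extract eigenvectors whose eigenvalues have positive real part, and monitor the overlap of the instantaneous state with those unstable directions. The two-species dynamics below are made up for illustration and are not the tangled-nature equations.

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x (the stability matrix)."""
    n = x.size
    J = np.empty((n, n))
    fx = f(x)
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        J[:, j] = (f(x + dx) - fx) / eps
    return J

# made-up mean-field dynamics (stand-in for the model's deterministic equations)
f = lambda x: np.array([x[0] * (1.0 - x[0]) - 0.5 * x[0] * x[1],
                        x[1] * (0.8 - x[1]) - 0.4 * x[0] * x[1]])

x_now = np.array([0.2, 0.9])                  # instantaneous configuration
w, V = np.linalg.eig(jacobian(f, x_now))
unstable = V[:, w.real > 0]                   # eigenvectors of unstable directions
overlap = np.abs(unstable.T.conj() @ (x_now / np.linalg.norm(x_now)))
print(w, overlap)                             # large overlap -> early warning
```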
Critical behavior of dissipative two-dimensional spin lattices
NASA Astrophysics Data System (ADS)
Rota, R.; Storme, F.; Bartolo, N.; Fazio, R.; Ciuti, C.
2017-04-01
We explore critical properties of two-dimensional lattices of spins interacting via an anisotropic Heisenberg Hamiltonian that are subject to incoherent spin flips. We determine the steady-state solution of the master equation for the density matrix via the corner-space renormalization method. We investigate the finite-size scaling and critical exponent of the magnetic linear susceptibility associated with a dissipative ferromagnetic transition. We show that the von Neumann entropy increases across the critical point, revealing a strongly mixed character of the ferromagnetic phase. Entanglement is witnessed by the quantum Fisher information, which exhibits a critical behavior at the transition point, showing that quantum correlations play a crucial role in the transition.
Optimal Measurements for Simultaneous Quantum Estimation of Multiple Phases
NASA Astrophysics Data System (ADS)
Pezzè, Luca; Ciampini, Mario A.; Spagnolo, Nicolò; Humphreys, Peter C.; Datta, Animesh; Walmsley, Ian A.; Barbieri, Marco; Sciarrino, Fabio; Smerzi, Augusto
2017-09-01
A quantum theory of multiphase estimation is crucial for quantum-enhanced sensing and imaging and may link quantum metrology to more complex quantum computation and communication protocols. In this Letter, we tackle one of the key difficulties of multiphase estimation: obtaining a measurement which saturates the fundamental sensitivity bounds. We derive necessary and sufficient conditions for projective measurements acting on pure states to saturate the ultimate theoretical bound on precision given by the quantum Fisher information matrix. We apply our theory to the specific example of interferometric phase estimation using photon number measurements, a convenient choice in the laboratory. Our results thus introduce concepts and methods relevant to the future theoretical and experimental development of multiparameter estimation.
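The fundamental bound being saturated is the multiparameter quantum Cramér-Rao bound: for an unbiased estimator of the phase vector φ from ν independent repetitions,

```latex
\mathrm{Cov}(\hat{\boldsymbol{\varphi}}) \;\succeq\; \frac{1}{\nu}\, F_Q^{-1}(\boldsymbol{\varphi}),
```

where F_Q is the quantum Fisher information matrix; unlike the single-parameter case, a measurement saturating this matrix inequality need not exist in general, which is why conditions for saturation are derived.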
Face recognition based on two-dimensional discriminant sparse preserving projection
NASA Astrophysics Data System (ADS)
Zhang, Dawei; Zhu, Shanan
2018-04-01
In this paper, a supervised dimensionality reduction algorithm named two-dimensional discriminant sparse preserving projection (2DDSPP) is proposed for face recognition. In order to accurately model the manifold structure of the data, 2DDSPP constructs within-class and between-class affinity graphs by solving constrained least squares (LS) and l1-norm minimization problems, respectively. Operating directly on image matrices, 2DDSPP integrates graph embedding (GE) with the Fisher criterion. The obtained projection subspace preserves the within-class neighborhood geometry of samples while keeping samples from different classes apart. Experimental results on the PIE and AR face databases show that 2DDSPP can achieve better recognition performance.
NASA Astrophysics Data System (ADS)
Prasad, Ramendra; Deo, Ravinesh C.; Li, Yan; Maraseni, Tek
2017-11-01
Forecasting streamflow is vital for strategically planning, utilizing and redistributing water resources. In this paper, a wavelet-hybrid artificial neural network (ANN) model integrated with an iterative input selection (IIS) algorithm (IIS-W-ANN) is evaluated for its statistical precision in forecasting monthly streamflow and benchmarked against the M5 Tree model. To develop the hybrid IIS-W-ANN model, a global predictor matrix is constructed for three local hydrological sites (Richmond, Gwydir, and Darling River) in Australia's agricultural (Murray-Darling) Basin. Model inputs, comprising statistically significant lagged combinations of streamflow water levels, are supplemented by meteorological data (i.e., precipitation, maximum and minimum temperature, mean solar radiation, vapor pressure and evaporation) as potential model inputs. To establish robust forecasting models, the iterative input selection (IIS) algorithm is applied to screen the best data from the predictor matrix and is integrated with the non-decimated maximum overlap discrete wavelet transform (MODWT) applied to the IIS-selected variables. This resolves the frequencies contained in the predictor data while constructing the wavelet-hybrid (i.e., IIS-W-ANN and IIS-W-M5 Tree) models. The forecasting ability of IIS-W-ANN is evaluated via the correlation coefficient (r), Willmott's Index (WI), Nash-Sutcliffe Efficiency (ENS), root-mean-square error (RMSE), and mean absolute error (MAE), including the percentage RMSE and MAE. While the ANN models outperform the M5 Tree models at all hydrological sites, the IIS variable selector was efficient in determining the appropriate predictors, as shown by the better performance of the IIS-coupled (ANN and M5 Tree) models relative to the models without IIS. When the IIS-coupled models are integrated with MODWT, the wavelet-hybrid IIS-W-ANN and IIS-W-M5 Tree models attain significantly more accurate performance relative to their standalone counterparts. Importantly, the accuracy of the IIS-W-ANN model outweighs that of IIS-ANN, as evidenced by a larger r and WI (by 7.5% and 3.8%, respectively) and a lower RMSE (by 21.3%). In comparison to the IIS-W-M5 Tree model, the IIS-W-ANN model yielded larger values of WI = 0.936-0.979 and ENS = 0.770-0.920. Correspondingly, the errors (RMSE and MAE) ranged from 0.162-0.487 m and 0.139-0.390 m, respectively, with relative errors RRMSE = (15.65-21.00)% and MAPE = (14.79-20.78)%. A distinct geographic signature is evident, with the most and least accurately forecasted streamflow attained for the Gwydir and Darling River, respectively. Conclusively, this study advocates the efficacy of iterative input selection, allowing the proper screening of model predictors, and its subsequent integration with MODWT, resulting in enhanced performance of the models applied in streamflow forecasting.
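The skill scores quoted above have standard textbook definitions, which are assumed here rather than taken from the paper; a small sketch computing r, RMSE, MAE, Nash-Sutcliffe efficiency (ENS) and Willmott's index (WI) for an observed/forecast pair:

```python
import numpy as np

def skill(obs, sim):
    """Standard forecast-verification metrics for observed (obs) vs simulated (sim)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    r    = np.corrcoef(obs, sim)[0, 1]                                # correlation
    rmse = np.sqrt(np.mean(err ** 2))                                 # RMSE
    mae  = np.mean(np.abs(err))                                       # MAE
    ens  = 1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)     # Nash-Sutcliffe
    wi   = 1 - np.sum(err ** 2) / np.sum((np.abs(sim - obs.mean())
                                          + np.abs(obs - obs.mean())) ** 2)  # Willmott
    return dict(r=r, RMSE=rmse, MAE=mae, ENS=ens, WI=wi)

print(skill(obs=[1.2, 0.8, 1.5, 2.0], sim=[1.1, 0.9, 1.7, 1.8]))
```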
Score Matrix for HWBI Forecast Model
2000-2010 annual state-scale service and domain scores used to support the approach for forecasting EPA's Human Well-Being Index. A modeling approach was developed based on relationship-function equations derived from selected economic, social and ecosystem final goods and services scores and the calculated human well-being index and related domain scores. These data are being used in a secondary capacity. The foundational data and scoring techniques were originally described in: a) U.S. EPA. 2012. Indicators and Methods for Constructing a U.S. Human Well-being Index (HWBI) for Ecosystem Services Research. Report. EPA/600/R-12/023. pp. 121; and b) U.S. EPA. 2014. Indicators and Methods for Evaluating Economic, Ecosystem and Social Services Provisioning. Report. EPA/600/R-14/184. pp. 174. Smith, L. M., Harwell, L. C., Summers, J. K., Smith, H. M., Wade, C. M., Straub, K. R. and J.L. Case (2014). This dataset is associated with the following publication: Summers, K., L. Harwell, and L. Smith. A Model For Change: An Approach for Forecasting Well-Being From Service-Based Decisions. ECOLOGICAL INDICATORS. Elsevier Science Ltd, New York, NY, USA, 69: 295-309, (2016).
NASA Astrophysics Data System (ADS)
Zhang, Hongqin; Tian, Xiangjun
2018-04-01
Ensemble-based data assimilation methods often use the so-called localization scheme to improve the representation of the ensemble background error covariance (Be). Extensive research has been undertaken to reduce the computational cost of these methods by using localized ensemble samples to localize Be by means of a direct decomposition of the local correlation matrix C. However, the computational cost of directly decomposing the local correlation matrix C is still extremely high due to its high dimension. In this paper, we propose an efficient local correlation matrix decomposition approach based on the concept of alternating directions. This approach is intended to avoid direct decomposition of the correlation matrix. Instead, we first decompose the correlation matrix into 1-D correlation matrices in the three coordinate directions, then construct their empirical orthogonal function decompositions at low resolution. This procedure is followed by a 1-D spline interpolation process to transform the above decompositions to the high-resolution grid. Finally, an efficient correlation matrix decomposition is achieved by computing the corresponding Kronecker product. We conducted a series of comparison experiments to illustrate the validity and accuracy of the proposed local correlation matrix decomposition approach. The effectiveness of the proposed correlation matrix decomposition approach and its efficient localization implementation in the nonlinear least-squares four-dimensional variational assimilation method are further demonstrated by several groups of numerical experiments based on the Advanced Research Weather Research and Forecasting model.
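A sketch of the core idea, with made-up grid sizes and Gaussian correlation functions: build 1-D correlation matrices along the three coordinate directions, take their truncated eigen (EOF) decompositions, and assemble the separable approximation of the full 3-D correlation matrix via Kronecker products, so that the full matrix is never decomposed directly.

```python
import numpy as np

def corr_1d(n, length):
    """1-D Gaussian correlation matrix on a unit grid with a given length scale."""
    i = np.arange(n)
    return np.exp(-0.5 * ((i[:, None] - i[None, :]) / length) ** 2)

def truncated_eof(C, k):
    """Leading-k eigenpairs (empirical orthogonal functions) of a symmetric matrix."""
    w, V = np.linalg.eigh(C)
    return w[-k:], V[:, -k:]

nx, ny, nz, k = 12, 10, 8, 4
wx, Vx = truncated_eof(corr_1d(nx, 3.0), k)
wy, Vy = truncated_eof(corr_1d(ny, 3.0), k)
wz, Vz = truncated_eof(corr_1d(nz, 2.0), k)

# Kronecker assembly: for the separable approximation C ~ Cx kron Cy kron Cz,
# eigenvalues multiply and eigenvectors are Kronecker products of the 1-D modes,
# so only leading modes are ever formed, never the full 3-D matrix.
mode = np.kron(np.kron(Vx[:, -1], Vy[:, -1]), Vz[:, -1])   # leading 3-D mode
lam  = wx[-1] * wy[-1] * wz[-1]                            # its eigenvalue
print(mode.shape, lam)
```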
Identifying alternate pathways for climate change to impact inland recreational fishers
Hunt, Len M.; Fenichel, Eli P.; Fulton, David C.; Mendelsohn, Robert; Smith, Jordan W.; Tunney, Tyler D.; Lynch, Abigail J.; Paukert, Craig P.; Whitney, James E.
2016-01-01
Fisheries and human dimensions literature suggests that climate change influences inland recreational fishers in North America through three major pathways. The most widely recognized pathway suggests that climate change impacts habitat and fish populations (e.g., water temperature impacting fish survival) and cascades to impact fishers. Climate change also impacts recreational fishers by influencing environmental conditions that directly affect fishers (e.g., increased temperatures in northern climates resulting in extended open water fishing seasons and increased fishing effort). The final pathway occurs from climate change mitigation and adaptation efforts (e.g., refined energy policies result in higher fuel costs, making distant trips more expensive). To address limitations of past research (e.g., assessing climate change impacts for only one pathway at a time and not accounting for climate variability, extreme weather events, or heterogeneity among fishers), we encourage researchers to refocus their efforts to understand and document climate change impacts to inland fishers.
A differentiable reformulation for E-optimal design of experiments in nonlinear dynamic biosystems.
Telen, Dries; Van Riet, Nick; Logist, Flip; Van Impe, Jan
2015-06-01
Informative experiments are highly valuable for estimating parameters in nonlinear dynamic bioprocesses. Techniques for optimal experiment design ensure the systematic design of such informative experiments. The E-criterion, which can be used as the objective function in optimal experiment design, requires maximizing the smallest eigenvalue of the Fisher information matrix. However, one problem with the minimal-eigenvalue function is that it can be nondifferentiable. In addition, no closed-form expression exists for the computation of the eigenvalues of a matrix larger than 4 by 4. As eigenvalues are normally computed with iterative methods, state-of-the-art optimal control solvers are not able to exploit automatic differentiation to compute the derivatives with respect to the decision variables. In the current paper a reformulation strategy from the field of convex optimization is suggested to circumvent these difficulties. This reformulation requires the inclusion of a matrix inequality constraint involving positive semidefiniteness. In this paper, this positive semidefiniteness constraint is imposed via Sylvester's criterion. As a result, the maximization of the minimum-eigenvalue function can be formulated in standard optimal control solvers through the addition of nonlinear constraints. The presented methodology is successfully illustrated with a case study from the field of predictive microbiology. Copyright © 2015. Published by Elsevier Inc.
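The reformulation can be stated generically as follows (a paraphrase of the standard trick rather than a quotation from the paper): instead of maximizing the nonsmooth function λ_min(F), introduce a scalar t and impose a matrix inequality,

```latex
\max_{d,\;t}\; t
\qquad \text{s.t.} \qquad
F(d) - t\, I \;\succeq\; 0,
```

with the semidefiniteness of F(d) - tI enforced, as in the paper, through Sylvester-type conditions on its leading principal minors, which are smooth nonlinear constraints that standard optimal control solvers can differentiate.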
Current distribution of the fisher, Martes pennanti, in California
William J. Zielinski; Thomas E. Kucera; Reginald H. Barrett
1995-01-01
We describe the 1989-1994 distribution of the fisher, Martes pennanti, in California based on results of detection surveys that used either sooted track-plates or cameras. Fishers were detected in two regions of the state: the northwest and the southern Sierra Nevada. Despite considerable survey effort, neither fisher tracks nor photographs were...
Ray S. Vinkey; Michael K. Schwartz; Kevin S. McKelvey; Kerry R. Foresman; Kristine L. Pilgrim; Brian J. Giddings; Eric C. Lofroth
2006-01-01
Fishers (Martes pennanti) were purportedly extirpated from Montana by 1930 and extant populations are assumed to be descended from translocated fishers. To determine the lineage of fisher populations, we sequenced 2 regions of the mitochondrial DNA genome from 207 tissue samples from British Columbia, Minnesota, Wisconsin, and Montana. In...
Uneven adaptive capacity among fishers in a sea of change
Fuller, Emma; Crona, Beatrice I.
2017-01-01
Fishers worldwide operate in an environment of uncertainty and constant change. Their ability to manage risk associated with such uncertainty and subsequently adapt to change is largely a function of individual circumstances, including their access to different fisheries. However, explicit attention to the heterogeneity of fishers’ connections to fisheries at the level of the individual has been largely ignored. We illustrate the ubiquitous nature of these connections by constructing a typology of commercial fishers in the state of Maine based on the different fisheries that fishers rely on to sustain their livelihoods and find that there are over 600 combinations. We evaluate the adaptive potential of each strategy, using a set of attributes identified by fisheries experts in the state, and find that only 12% of fishers can be classified as being well positioned to adapt in the face of changing socioeconomic and ecological conditions. Sensitivity to the uneven and heterogeneous capacity of fishers to manage risk and adapt to change is critical to devising effective management strategies that broadly support fishers. This will require greater attention to the social-ecological connectivity of fishers across different jurisdictions. PMID:28604775
DOE Office of Scientific and Technical Information (OSTI.GOV)
Momtaz, Salim; Gladstone, William
2008-02-15
In its effort to resolve the conflict between commercial and recreational fishers, the New South Wales (NSW) government (NSW Fisheries) banned commercial fishing in the estuarine waters. NSW Fisheries conducted a number of studies and held meetings with the affected communities, including commercial fishers, prior to the implementation of the ban. To investigate how community consultation played a role in the decision-making process, especially as perceived by the commercial fishers, and to determine the actual social impacts of the ban on commercial fishers, in-depth interviews were conducted with the commercial fishers. This research reveals that despite NSW Fisheries' consultations with commercial fishers prior to the closure, the latter were confused about various vital aspects of the decision. It further reveals that the commercial fishers faced a number of significant changes as a result of this decision. We argue that a better decision-making process and outcome would have been possible through a meaningful consultation with the commercial fishers and a social impact assessment.
Manzan, Maíra Fontes; Lopes, Priscila F M
2015-01-01
Fishers' local ecological knowledge (LEK) is an additional tool to obtain information about cetaceans, regarding their local particularities, fishing interactions, and behavior. However, this knowledge could vary in depth of detail according to the level of interaction that fishers have with a specific species. This study investigated differences in small-scale fishers' LEK regarding the estuarine dolphin (Sotalia guianensis) in three Brazilian northeast coastal communities where fishing is practiced in estuarine lagoons and/or coastal waters and where dolphin-watching tourism varies from incipient to important. The fishers (N = 116) were asked about general characteristics of S. guianensis and their interactions with this dolphin during fishing activities. Compared to lagoon fishers, coastal fishers showed greater knowledge about the species but had more negative interactions with the dolphin during fishing activities. Coastal fishing not only offered the opportunity for fishers to observe a wider variety of the dolphin's behavior, but also implied direct contact with the dolphins, as they are bycaught in coastal gillnets. Besides complementing information that could be used for the management of cetaceans, this study shows that the type of environment most used by fishers also affects the accuracy of the information they provide. When designing studies to gather information on species and/or populations with the support of fishers, special consideration should be given to local particularities such as gear and habitats used within the fishing community.
Matching Fishers’ Knowledge and Landing Data to Overcome Data Missing in Small-Scale Fisheries
Damasio, Ludmila de Melo Alves; Lopes, Priscila F. M.; Guariento, Rafael D.; Carvalho, Adriana R.
2015-01-01
Background: In small-scale fisheries, information provided by fishers has been useful to complement current and past gaps in knowledge on species and the environment. Methodology: Through interviews, 82 fishers from the largest fishing communities on the north and south borders of a Brazilian northeastern coastal state provided estimates of the catch per unit effort (CPUE) and ranks of species abundance of their main target fishes for three time points: the current year (2013 at the time of the research), 10, and 20 years past. This information was contrasted with other available data sources: scientific sampling of fish landings (2013), governmental statistics (2003), and information provided by expert fishers (1993), respectively. Principal Findings: Fishers were more accurate when reporting information about their maximum CPUE for 2013, but except for three species, which they estimated accurately, fishers overestimated their mean CPUE per species. Fishers were also accurate at establishing ranks of abundance of their main target species for all periods. Fishers' beliefs that fish abundance has not changed over the last 10 years (2003–2013) were corroborated by governmental and scientific landing data. Conclusions: The comparison between official and formal landing records and fishers' perceptions revealed that fishers are accurate when reporting maximum CPUE, but not when reporting mean CPUE. Moreover, fishers are less precise the less common a species is in their catches, suggesting that they could provide better information for management purposes on their current target species. PMID:26176538
Olympic Fisher Reintroduction Project: Progress report 2008-2011
Jeffrey C. Lewis,; Patti J. Happe,; Jenkins, Kurt J.; Manson, David J.
2012-01-01
This progress report summarizes the final year of activities of Phase I of the Olympic fisher restoration project. The intent of the Olympic fisher reintroduction project is to reestablish a self-sustaining population of fishers on the Olympic Peninsula. To achieve this goal, the Olympic fisher reintroduction project released 90 fishers within Olympic National Park from 2008 to 2010. The reintroduction of fishers to the Olympic Peninsula was designed as an adaptive management project, including the monitoring of released fishers as a means to (1) evaluate reintroduction success, (2) investigate key biological and ecological traits of fishers, and (3) inform future reintroduction, monitoring, and research efforts. This report summarizes reintroduction activities and preliminary research and monitoring results completed through December 2011. The report is non-interpretational in nature. Although we report the status of movement, survival, and home range components of the research, we have not completed final analyses and interpretation of research results. Much of the data collected during the monitoring and research project will be analyzed and interpreted in the doctoral dissertation being developed by Jeff Lewis; the completion of this dissertation is anticipated prior to April 2013. We anticipate that this work, and analyses of other data collected during the project, will result in several peer-reviewed scientific publications in ecological and conservation journals, which collectively will comprise the final reporting of work summarized here. These publications will include papers addressing post-release movements, survival, resource selection, food habits, and age determination of fishers.
Whales, dolphins or fishes? The ethnotaxonomy of cetaceans in São Sebastião, Brazil
Souza, Shirley P; Begossi, Alpina
2007-01-01
The local knowledge of human populations about the natural world has been addressed through ethnobiological studies, especially concerning resources uses and their management. Several criteria, such as morphology, ecology, behavior, utility and salience, have been used by local communities to classify plants and animals. Studies regarding fishers' knowledge on cetaceans in the world, especially in Brazil, began in the last decade. Our objective is to investigate the folk classification by fishers concerning cetaceans, and the contribution of fishers' local knowledge to the conservation of that group. In particular, we aim to record fishers' knowledge in relation to cetaceans, with emphasis on folk taxonomy. The studied area is São Sebastião, located in the southeastern coast of Brazil, where 70 fishers from 14 communities were selected according to their fishing experience and interviewed through questionnaires about classification, nomenclature and ecological aspects of local cetaceans' species. Our results indicated that most fishers classified cetaceans as belonging to the life-form 'fish'. Fishers' citations for the nomenclature of the 11 biological species (10 biological genera), resulted in 14 folk species (3 generic names). Fishers' taxonomy was influenced mostly by the phenotypic and cultural salience of the studied cetaceans. Cultural transmission, vertical and horizontal, was intimately linked to fishers' classification process. The most salient species, therefore well recognized and named, were those most often caught by gillnets, in addition to the biggest ones and those most exposed by media, through TV programs, which were watched and mentioned by fishers. Our results showed that fishers' ecological knowledge could be a valuable contribution to cetaceans' conservation, helping to determine areas and periods for their protection, indicating priority topics for research and participating in alternative management related to the gillnet fisheries. PMID:17311681
NASA Astrophysics Data System (ADS)
Campbell, M. S.; Ashley, M.; De Groot, J.; Rodwell, L.
2016-02-01
As an emerging industry, Marine Renewable Energy (MRE) is expected to play a major contributory role if the UK is to successfully reach its desired target of renewable energy production by 2020. However, due to the competing objectives and priorities of MRE and other industries, for example fisheries, and in the delivery of conservation measures, the demand for space within our marine landscape is increasing, and interactions are inevitable. Semi-structured interviews were conducted with forty fishers across the UK to elicit further information on the challenges, barriers to progress and priority issues these fishers face in relation to MRE development. The questionnaire also included a fisher assessment of the mitigation agenda developed by de Groot et al. (2014) under the Natural Environment Research Council Marine Renewable Energy Knowledge Exchange Programme (NERC MREKEP). Qualitative data were extracted and analysed using the text analysis software NVivo8. Fishers identified barriers to progress; in order of importance the themes included policy, consultation, trust, lack of knowledge, true representation of all fishers, science vs. fisher observation mismatch, and timescales. Priority issues identified, in order of importance, were displacement or loss of access, cable disturbance, timing of installation/repairs, effects on the seabed and, specifically, offshore wind farm (OWF) siting. The consultation process caused discontent among all fishers interviewed. In relation to working towards a collaborative mitigation agenda, fishers highlighted issues of trust in relation to trans-boundary management, data management and the consultation process. At all stages of the research, the reported importance of gathering fishers' knowledge (FK) was high. Fishers underlined the importance of this data source in assessing the impacts of MRE on sectors of the UK fleet. Thus, although at an early stage of development, an initial framework for the collection and application of FK is presented and further work on data needs highlighted.
Fishers' knowledge on the coast of Brazil.
Begossi, Alpina; Salivonchyk, Svetlana; Lopes, Priscila F M; Silvano, Renato A M
2016-06-01
Although fishers' knowledge has recently been considered in management programmes, there is still a need to establish a better understanding of fishers' perceptions and cognition. Fishers can provide novel information on the biology and ecology of species, which can potentially be used in the management of fisheries. The knowledge fishers have, and how they classify nature, is empirically based. It is common, for example, to observe that fishers' taxonomy is often represented by the generic level, one of the hierarchical categories of folk classification that is somewhat analogous to the Linnean genus, as it groups organisms of a higher rank than the folk species. In this study we compiled the knowledge fishers have of local fish, such as their folk names, diet and habitat. Five coastal communities widely distributed along the Brazilian coast were studied: two from the northeast (Porto Sauípe and Itacimirim, in Bahia State, n of interviewees = 34), two from the southeast (Itaipu at Niterói and Copacabana at Rio de Janeiro, Rio de Janeiro State, n = 35) and one from the south coast (Pântano do Sul, in Santa Catarina State, n = 23). Fish pictures were randomly ordered and the same order was presented to all interviewees (n = 92), who were then asked about the species' name and classification and its habitat and diet preferences. Fishers make clusters of fish species, usually hierarchically; fishers of the coast of Brazil use mostly primary lexemes (generic names) to name fish; and fishers did not differentiate between scientific species, since the same folk generic name included two different scientific species. Fishers provide information on species for which there is scarce or no information on diet and habitat, such as Rhinobatos percellens (chola guitarfish, arraia viola or cação viola), Sphoeroides dorsalis (marbled puffer, baiacu), Mycteroperca acutirostris (comb grouper, badejo) and Dasyatis guttata (longnose stingray, arraia, arraia manteiga). Fishers' knowledge of fish diet and fish habitat can be strategic for management, since their knowledge concentrates on the fishery target species, which are the ones under higher fishing pressure. Moreover, fishers were shown to have knowledge of species still poorly known to science.
Hallwass, Gustavo; Lopes, Priscila F M; Juras, Anastácio A; Silvano, Renato A M
2013-10-15
Identifying the factors that influence the amount of fish caught, and thus the fishers' income, is important for proposing or improving management plans. Some of the factors influencing fishing rewards may be related to fishers' behavior, which is driven by economic motivations. Therefore, management rules that have less of an impact on fishers' income could achieve better acceptance and compliance from fishers. We analyzed the relative influence of environmental and socioeconomic factors on fish catches (biomass) in fishing communities of a large tropical river. We then used the results from this analysis to propose alternative management scenarios in which we predicted potential fishers' compliance (high, moderate and low) based on the extent to which management proposals would affect fish catches and fishers' income. We used a General Linear Model (GLM) to analyze the influence of environmental (fishing community, season and habitat) and socioeconomic factors (number of fishers in the crew, time spent fishing, fishing gear used, type of canoe, distance traveled to fishing grounds) on fish catches (dependent variable) in 572 fishing trips by small-scale fishers in the Lower Tocantins River, Brazilian Amazon. According to the GLM, all factors together accounted for 43% of the variation in the biomass of fish caught. Fishers' behaviors linked to fishing effort, such as time spent fishing (42% of the total explained by the GLM), distance traveled to the fishing ground (12%) and number of fishers (10%), were all positively related to the biomass of fish caught and explained most of the variation in it. The environmental factor of fishing habitat accounted for 10% of the variation in fish caught. These results, when applied to management scenarios, indicated that some combinations of management measures, such as selecting lakes as no-take areas, restrictions on the use of gillnets (especially during the high-water season) and individual quotas larger than fishers' usual catches, would most likely have less impact on fishers' income. The proposed scenarios help to identify feasible management options which could promote the conservation of fish, potentially achieving higher fishers' compliance. Copyright © 2013 Elsevier Ltd. All rights reserved.
Mclean, Elizabeth L; Forrester, Graham E
2018-04-01
We tested whether fishers' local ecological knowledge (LEK) of two fish life-history parameters, size at maturity (SAM) and maximum body size (MS), was comparable to scientific estimates (SEK) of the same parameters, and whether LEK influenced fishers' perceptions of sustainability. Local ecological knowledge was documented for 82 fishers from a small-scale fishery in Samaná Bay, Dominican Republic, whereas SEK was compiled from the scientific literature. Size-at-maturity estimates derived from LEK and SEK overlapped for most of the 15 commonly harvested species (10 of 15). In contrast, fishers' maximum-size estimates were usually lower than (eight species), or overlapped with (five species), the scientific estimates. Fishers' size-based estimates of catch composition indicate a greater potential for overfishing than estimates based on SEK. Fishers' estimates of size at capture relative to size at maturity suggest routine inclusion of juveniles in the catch (9 of 15 species), and fishers' estimates suggest that harvested fish are substantially smaller than the maximum body size for most species (11 of 15 species). Scientific estimates also suggest that harvested fish are generally smaller than the maximum body size (13 of 15), but suggest that the catch is dominated by adults for most species (9 of 15 species), and that juveniles are present in the catch for fewer species (6 of 15). Most Samaná fishers characterized the current state of their fishery as poor (73%) and as having changed for the worse over the past 20 yr (60%). Fishers stated that concern about overfishing, catching small fish, and catching immature fish contributed to these perceptions, indicating a possible influence of catch-size composition on their perceptions. Future work should test this link more explicitly because we found no evidence that the minority of fishers with more positive perceptions of their fishery reported systematically different estimates of catch-size composition than those with the more negative majority view. Although fishers' and scientific estimates of size-at-maturity and maximum-size parameters sometimes differed, the fact that fishers make routine quantitative assessments of maturity and body size suggests potential for future collaborative monitoring efforts to generate estimates usable by scientists and meaningful to fishers. © 2017 by the Ecological Society of America.
Happe, Patricia J.; Jenkins, Kurt J.; Kay, Thomas J.; Pilgrim, Kristie; Schwartz, Michael K; Lewis, Jeffrey C.; Aubry, Keith B.
2016-01-01
With the translocation and release of 90 fishers (Pekania pennanti) from British Columbia to Olympic National Park during 2008–2010, the National Park Service (NPS) and Washington Department of Fish and Wildlife (WDFW) accomplished the first phase of fisher restoration in Washington State. Beginning in 2013, we initiated a new research project to determine the current status of fishers on Washington’s Olympic Peninsula 3–8 years after the releases and evaluate the short-term success of the restoration program. Objectives of the study are to determine the current distribution of fishers and the proportion of the recovery area that is currently occupied by fishers, determine several genetic characteristics of the reintroduced population, and determine reproductive success of the founding animals through genetic studies. During 2015, we continued working with a broad coalition of cooperating agencies, tribes, and nongovernmental organizations (NGO) to collect data on fisher distribution and genetics using noninvasive sampling methods. The primary sampling frame consisted of 157 24-km^2 hexagons (hexes) distributed across all major land ownerships within the Olympic Peninsula target survey area. In 2014 we expanded the study by adding 58 more hexes to an expanded study area in response to incidental fisher observations outside of the target area obtained in 2013; 49 hexes were added south and 9 to the east of the target area. During 2015, Federal, State, Tribal and NGO biologists and volunteers established three motion-sensing camera stations, paired with hair snaring devices, in 87 hexes; 75 in the targeted area and 12 in the expansion areas. Each paired camera/hair station was left in place for approximately 6 weeks, with three checks at 2-week intervals. We documented fisher presence in 7 of the 87 hexagons. Four fishers were identified through microsatellite DNA analyses. The 4 identified fishers included 1 of the original founding population of 90 and 3 new recruits to the population. Three additional fishers were detected with cameras but not DNA; consequently, their identities were unknown. All fisher detections were in the target area. Additionally, we identified 46 other species of wildlife at the baited camera stations. We also obtained 4 additional confirmed records of fishers in the study area through photographs provided by the public and incidental live capture. During 2016, we plan to resample 69 hexagons sampled in the target area in 2014 and 12 new hexes in the expansion area. In addition, we plan to sample non-selected hexes in-between hexes where we had a cluster of fishers in 2014, to provide better understanding of occupancy patterns and the minimum number of individuals in an area where fishers appear to be concentrating.
Fisher classifier and its probability of error estimation
NASA Technical Reports Server (NTRS)
Chittineni, C. B.
1979-01-01
Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
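As background for the abstract above, the sketch below gives a brute-force illustration of the two-class Fisher classifier and a leave-one-out error estimate; it does not reproduce the paper's computationally efficient leave-one-out expressions or its multiclass generalization, and the data are synthetic.

```python
# Illustrative two-class Fisher discriminant with a brute-force leave-one-out error
# estimate (a generic sketch of the classical method, not the paper's derivation).
import numpy as np

def fisher_direction(X0, X1):
    """Fisher's direction w = Sw^{-1} (m1 - m0) for two classes."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) + np.cov(X1, rowvar=False) * (len(X1) - 1)
    return np.linalg.solve(Sw, m1 - m0)

def loo_error(X, y):
    """Leave-one-out error of the Fisher classifier with a projected-midpoint threshold."""
    errors = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        Xtr, ytr = X[mask], y[mask]
        w = fisher_direction(Xtr[ytr == 0], Xtr[ytr == 1])
        thr = 0.5 * (Xtr[ytr == 0].mean(axis=0) + Xtr[ytr == 1].mean(axis=0)) @ w
        pred = int(X[i] @ w > thr)
        errors += pred != y[i]
    return errors / len(y)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)), rng.normal(1.5, 1.0, (50, 3))])
y = np.repeat([0, 1], 50)
print(loo_error(X, y))
```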
Effects of snow on fisher and marten distributions in Idaho
Nathan Albrecht; C. Heusser; M. Schwartz; J. Sauder; R. Vinkey
2013-01-01
Studies have suggested that deep snow may limit fisher (Martes pennanti) distribution, and that fisher populations may in turn limit marten (Martes americana) distribution. We tested these hypotheses in the Northern Rocky Mountains of Idaho, a region which differs from previous study areas in its climate and relative fisher and marten abundance, but in which very...
Lingafelter, Steven W.
2011-01-01
Abstract Three species of Caribbomerus Vitali are newly recorded for the Dominican Republic: Caribbomerus decoratus (Zayas), Caribbomerus elongatus (Fisher), and Caribbomerus asperatus (Fisher). The first two also represent first records for Hispaniola. Caribbomerus elongatus (Fisher) is redescribed based on additional material, including the first known males. Caribbomerus similis (Fisher) is newly recorded for Dominica. A key to the species of the genus from the West Indies is provided. PMID:21594096
Lingafelter, Steven W
2011-03-11
Three species of Caribbomerus Vitali are newly recorded for the Dominican Republic: Caribbomerus decoratus (Zayas), Caribbomerus elongatus (Fisher), and Caribbomerus asperatus (Fisher). The first two also represent first records for Hispaniola. Caribbomerus elongatus (Fisher) is redescribed based on additional material, including the first known males. Caribbomerus similis (Fisher) is newly recorded for Dominica. A key to the species of the genus from the West Indies is provided.
Influence of conservative corrections on parameter estimation for extreme-mass-ratio inspirals
NASA Astrophysics Data System (ADS)
Huerta, E. A.; Gair, Jonathan R.
2009-04-01
We present an improved numerical kludge waveform model for circular, equatorial extreme-mass-ratio inspirals (EMRIs). The model is based on true Kerr geodesics, augmented by radiative self-force corrections derived from perturbative calculations, and in this paper for the first time we include conservative self-force corrections that we derive by comparison to post-Newtonian results. We present results of a Monte Carlo simulation of parameter estimation errors computed using the Fisher matrix and also assess the theoretical errors that would arise from omitting the conservative correction terms we include here. We present results for three different types of system, namely, the inspirals of black holes, neutron stars, or white dwarfs into a supermassive black hole (SMBH). The analysis shows that for a typical source (a 10 M⊙ compact object captured by a 10^6 M⊙ SMBH at a signal-to-noise ratio of 30) we expect to determine the two masses to within a fractional error of ~10^-4, measure the spin parameter q to ~10^-4.5, and determine the location of the source on the sky and the spin orientation to within 10^-3 steradians. We show that, for this kludge model, omitting the conservative corrections leads to a small error over much of the parameter space, i.e., the ratio R of the theoretical model error to the Fisher matrix error is R<1 for all ten parameters in the model. For the few systems with larger errors typically R<3 and hence the conservative corrections can be marginally ignored. In addition, we use our model and first-order self-force results for Schwarzschild black holes to estimate the error that arises from omitting the second-order radiative piece of the self-force. This indicates that it may not be necessary to go beyond first order to recover accurate parameter estimates.
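The Fisher-matrix step described above can be illustrated generically: for a signal model h(θ) in white Gaussian noise of standard deviation σ, F_ij = Σ_k (∂h_k/∂θ_i)(∂h_k/∂θ_j)/σ^2, and the forecast 1σ errors are the square roots of the diagonal of F^-1. The sketch below uses a toy chirp-like signal, not the kludge EMRI waveform or the LISA-like noise model of the paper.

```python
# Generic Fisher-matrix error forecast for a parameterized signal in white Gaussian
# noise (a stand-in for the waveform/noise model used in the paper).
import numpy as np

def fisher_matrix(model, theta, sigma, eps=1e-6):
    """F_ij = sum_k dh_k/dtheta_i dh_k/dtheta_j / sigma^2, via central differences."""
    theta = np.asarray(theta, dtype=float)
    derivs = []
    for i in range(len(theta)):
        dp, dm = theta.copy(), theta.copy()
        dp[i] += eps
        dm[i] -= eps
        derivs.append((model(dp) - model(dm)) / (2 * eps))
    D = np.vstack(derivs)                 # shape (n_params, n_samples)
    return D @ D.T / sigma**2

# toy "waveform": a sinusoid with amplitude, frequency and frequency-drift parameters
t = np.linspace(0.0, 10.0, 2000)
model = lambda p: p[0] * np.sin(2 * np.pi * (p[1] + p[2] * t) * t)
F = fisher_matrix(model, theta=[1.0, 1.5, 0.01], sigma=0.1)
errors = np.sqrt(np.diag(np.linalg.inv(F)))   # forecast 1-sigma parameter errors
print(errors)
```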
High-precision Predictions for the Acoustic Scale in the Nonlinear Regime
NASA Astrophysics Data System (ADS)
Seo, Hee-Jong; Eckel, Jonathan; Eisenstein, Daniel J.; Mehta, Kushal; Metchnik, Marc; Padmanabhan, Nikhil; Pinto, Phillip; Takahashi, Ryuichi; White, Martin; Xu, Xiaoying
2010-09-01
We measure shifts of the acoustic scale due to nonlinear growth and redshift distortions to a high precision using a very large volume of high-force-resolution simulations. We compare results from various sets of simulations that differ in their force, volume, and mass resolution. We find a consistency within 1.5σ for shift values from different simulations and derive a shift α(z) - 1 = (0.300 ± 0.015)% [D(z)/D(0)]^2 using our fiducial set. We find a strong correlation with a non-unity slope between shifts in real space and in redshift space and a weak correlation between the initial redshift and low redshift. Density-field reconstruction not only removes the mean shifts and reduces errors on the mean, but also tightens the correlations. After reconstruction, we recover a slope of near unity for the correlation between the real and redshift space and restore a strong correlation between the initial and the low redshifts. We derive propagators and mode-coupling terms from our N-body simulations and compare with the Zel'dovich approximation and the shifts measured from the χ^2 fitting, respectively. We interpret the propagator and the mode-coupling term of a nonlinear density field in the context of an average and a dispersion of its complex Fourier coefficients relative to those of the linear density field; from these two terms, we derive a signal-to-noise ratio of the acoustic peak measurement. We attempt to improve our reconstruction method by implementing 2LPT and iterative operations, but we obtain little improvement. The Fisher matrix estimates of uncertainty in the acoustic scale are tested using 5000 h^-3 Gpc^3 of cosmological Particle-Mesh simulations from Takahashi et al. At an expected sample variance level of 1%, the agreement between the Fisher matrix estimates based on Seo and Eisenstein and the N-body results is better than 10%.
NASA Astrophysics Data System (ADS)
Bui-Thanh, T.; Girolami, M.
2014-11-01
We consider the Riemann manifold Hamiltonian Monte Carlo (RMHMC) method for solving statistical inverse problems governed by partial differential equations (PDEs). The Bayesian framework is employed to cast the inverse problem into the task of statistical inference whose solution is the posterior distribution in infinite dimensional parameter space conditional upon observation data and Gaussian prior measure. We discretize both the likelihood and the prior using the H1-conforming finite element method together with a matrix transfer technique. The power of the RMHMC method is that it exploits the geometric structure induced by the PDE constraints of the underlying inverse problem. Consequently, each RMHMC posterior sample is almost uncorrelated/independent from the others providing statistically efficient Markov chain simulation. However this statistical efficiency comes at a computational cost. This motivates us to consider computationally more efficient strategies for RMHMC. At the heart of our construction is the fact that for Gaussian error structures the Fisher information matrix coincides with the Gauss-Newton Hessian. We exploit this fact in considering a computationally simplified RMHMC method combining state-of-the-art adjoint techniques and the superiority of the RMHMC method. Specifically, we first form the Gauss-Newton Hessian at the maximum a posteriori point and then use it as a fixed constant metric tensor throughout RMHMC simulation. This eliminates the need for the computationally costly differential geometric Christoffel symbols, which in turn greatly reduces computational effort at a corresponding loss of sampling efficiency. We further reduce the cost of forming the Fisher information matrix by using a low rank approximation via a randomized singular value decomposition technique. This is efficient since a small number of Hessian-vector products are required. The Hessian-vector product in turn requires only two extra PDE solves using the adjoint technique. Various numerical results up to 1025 parameters are presented to demonstrate the ability of the RMHMC method in exploring the geometric structure of the problem to propose (almost) uncorrelated/independent samples that are far away from each other, and yet the acceptance rate is almost unity. The results also suggest that for the PDE models considered the proposed fixed metric RMHMC can attain almost as high a quality performance as the original RMHMC, i.e. generating (almost) uncorrelated/independent samples, while being two orders of magnitude less computationally expensive.
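A minimal sketch of the randomized low-rank step is given below, assuming only that a Hessian-vector product is available. A synthetic dense matrix stands in for the Gauss-Newton Hessian; in the PDE setting of the paper each product would instead come from a pair of adjoint solves, and this symmetric randomized eigendecomposition is only one of several equivalent randomized-SVD variants.

```python
# Sketch of a randomized low-rank approximation of a symmetric (Gauss-Newton)
# Hessian using only Hessian-vector products.
import numpy as np

def randomized_eig(hvp, n, rank, oversample=10, rng=None):
    """Approximate the leading eigenpairs of a symmetric operator given v -> H v."""
    rng = np.random.default_rng(rng)
    Omega = rng.standard_normal((n, rank + oversample))
    Y = np.column_stack([hvp(Omega[:, j]) for j in range(Omega.shape[1])])
    Q, _ = np.linalg.qr(Y)                       # orthonormal basis for the range of H
    T = np.column_stack([hvp(Q[:, j]) for j in range(Q.shape[1])])
    B = Q.T @ T                                  # small projected matrix Q^T H Q
    evals, V = np.linalg.eigh(B)
    idx = np.argsort(evals)[::-1][:rank]
    return evals[idx], Q @ V[:, idx]

# synthetic symmetric positive semidefinite "Hessian" with rapidly decaying spectrum
n, rng = 500, np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
H = U @ np.diag(1.0 / (1 + np.arange(n))**2) @ U.T
lam, W = randomized_eig(lambda v: H @ v, n, rank=20)
print(lam[:5])                                   # close to 1, 1/4, 1/9, ...
```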
NASA Astrophysics Data System (ADS)
Perekhodtseva, E. V.
2009-09-01
Development of a successful method for forecasting storm winds, including squalls and tornadoes, and heavy rainfall, which often result in human and material losses, would allow proper measures to be taken to protect people and prevent the destruction of buildings. A successful forecast made well in advance (from 12 to 48 hours) makes it possible to reduce these losses. Until recently, prediction of these phenomena was a very difficult problem for forecasters. The existing graphical and calculation methods still depend on the subjective decisions of an operator. At present there is no hydrodynamic model in Russia for forecasting maximal precipitation and wind velocities V > 25 m/s, so the main tools of objective forecasting are statistical methods that use the dependence of these phenomena on a number of atmospheric parameters (predictors). Statistical decision rules for the alternative and probabilistic forecasts of these events were obtained in accordance with the concept of "perfect prognosis", using data from objective analysis. For this purpose, training samples of cases with and without storm winds and heavy rainfall were compiled automatically, including the values of forty physically substantiated potential predictors. An empirical statistical method was then applied that involves diagonalization of the mean correlation matrix R of the predictors and extraction of diagonal blocks of strongly correlated predictors. In this way the most informative predictors for these phenomena were selected without losing information. The statistical decision rules U(X) for diagnosis and prognosis of these phenomena were calculated for the chosen informative vector of predictors. We used the Mahalanobis distance criterion and the Vapnik-Chervonenkis minimum-entropy criterion for predictor selection. The successful development of hydrodynamic models for short-term forecasting and the improvement of 36-48 h forecasts of pressure, temperature and other parameters allowed us to use the prognostic fields of those models to calculate the discriminant functions and the probabilities P of dangerous wind at the nodes of a 150x150 km grid, and thus to obtain fully automated forecasts. To convert to an alternative (yes/no) forecast, the author proposes empirical threshold values specified for this phenomenon and a 36-hour lead time. According to the Pirsey-Obukhov criterion (T), the skill of these automated statistical forecasts of squalls and tornadoes 36-48 hours ahead, and of heavy rainfall in the warm season, over the territory of Italy, Spain and the Balkan countries is T = 1 - a - b = 0.54-0.78 in the author's experiments. Many examples of very successful forecasts of summer storm winds and heavy rainfall over Italy and Spain are presented in this report. The same decision rules were also applied to forecasts of these phenomena during the cold period of this year. This winter, heavy snowfall and storm winds were observed very often in Spain and Italy, and our forecasts were successful.
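The skill measure quoted above, T = 1 - a - b, reads as a Peirce-type (Hanssen-Kuipers) score if a is taken to be the miss rate and b the false-alarm rate of the alternative (yes/no) forecast; that reading is an assumption on our part, and the sketch below simply evaluates such a score from a 2x2 contingency table with invented counts.

```python
# Hedged reading of the abstract's success criterion T = 1 - a - b: with a as the
# missed-event rate and b as the false-alarm rate, T equals the Peirce
# (Hanssen-Kuipers) skill score of a yes/no forecast.
def peirce_score(hits, misses, false_alarms, correct_negatives):
    a = misses / (hits + misses)                           # missed-event rate
    b = false_alarms / (false_alarms + correct_negatives)  # false-alarm rate
    return 1.0 - a - b

# illustrative contingency table (not from the paper)
print(peirce_score(hits=40, misses=10, false_alarms=25, correct_negatives=125))  # ~0.63
```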
Tolentino-Zondervan, Frazen; Berentsen, Paul; Bush, Simon R; Digal, Larry; Oude Lansink, Alfons
2016-01-01
This study identifies the capabilities needed by small-scale fishers to participate in Fishery Improvement Projects (FIPs) for yellowfin tuna in the Philippines. The current literature provides little empirical evidence on how different models, or types of FIPs, influence the participation of fishers in their programs and the degree which FIPs are able to foster improvements in fishing practices. To address this literature gap, two different FIPs are empirically analysed, each with different approaches for fostering improvement. The first is the non-governmental organisation-led Partnership Programme Towards Sustainable Tuna, which adopts a bottom-up or development oriented FIP model. The second is the private-led Artesmar FIP, which adopts a top-down or market-oriented FIP approach. The data were obtained from 350 fishers surveyed and were analysed using two separate models run in succession, taking into consideration full, partial, and non-participation in the two FIPs. The results demonstrate that different types of capabilities are required in order to participate in different FIP models. Individual firm capabilities are more important for fishers participation in market-oriented FIPs, which use direct economic incentives to encourage improvements in fisher practices. Collective capabilities are more important for fishers to participate in development-oriented FIPs, which drive improvement by supporting fishers, fisher associations, and governments to move towards market requirements.
Canine Distemper in an isolated population of fishers (Martes pennanti) from California
Stefan m. Keller; Mourad Gabriel; Karen A. Terio; Edward J. Dubovi; Elizabeth Van Wormer; Rick Sweitzer; Reginald Barret; Craig Thompson; Kathryn Purcell; Linda Munson
2012-01-01
Four fishers (Martes pennanti) from an insular population in the southern Sierra Nevada Mountains, California, USA died as a consequence of an infection with canine distemper virus (CDV) in 2009. Three fishers were found in close temporal and spatial relationship; the fourth fisher died 4 mo later at a 70 km distance from the initial group. Gross...
Fisher information and Rényi entropies in dynamical systems.
Godó, B; Nagy, Á
2017-07-01
The link between the Fisher information and Rényi entropies is explored. The relationship rests on a thermodynamical formalism built on a Fisher information that carries a parameter, β, which is interpreted as the inverse temperature. The Fisher heat capacity is defined and found to be sensitive to changes of higher order than the analogous quantity in the conventional formulation.
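For reference, the standard parameter-free forms of the two quantities linked above are

\[
I[\rho] = \int \frac{|\nabla \rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\, d\mathbf{r},
\qquad
R_{\alpha}[\rho] = \frac{1}{1-\alpha}\,\ln \int \rho^{\alpha}(\mathbf{r})\, d\mathbf{r}, \quad \alpha > 0,\ \alpha \neq 1,
\]

with R_α recovering the Shannon entropy as α → 1. These are the usual density-functional definitions given here only as background; the β-parameterized versions used in the abstract above differ in detail.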
ERIC Educational Resources Information Center
Pollnac, Richard B.; Kotowicz, Dawn
2012-01-01
The paper examines job satisfaction among fishers in a tsunami-impacted area on the Andaman coast of Thailand. Following the tsunami, many predicted that fishers would be reluctant to resume their fishing activities. Observations in the fishing communities, however, indicated that as soon as fishers obtained replacements for equipment damaged by…
Michael K. Schwartz
2007-01-01
Until recently it was assumed that fishers (Martes pennanti) in the Rocky Mountains all were descended from reintroduced stocks. However, a recent study reported that mitochondrial DNA (cytochrome-b and control region) haplotypes of fishers found only in west-central Montana are likely derived from a relic population of fishers that escaped harvests conducted in the...
Salehi, Nooshin; Choi, Eric D; Garrison, Roger C
2017-01-16
BACKGROUND Miller Fisher Syndrome is characterized by the clinical triad of ophthalmoplegia, ataxia, and areflexia, and is considered to be a variant of Guillain-Barre Syndrome. Miller Fisher Syndrome is observed in approximately 1-5% of all Guillain-Barre cases in Western countries. Patients with Miller Fisher Syndrome usually have good recovery without residual deficits. Venous thromboembolism is a common complication of Guillain-Barre Syndrome and has also been reported in Miller Fisher Syndrome, but it has generally been reported in the presence of at least one prothrombotic risk factor such as immobility. A direct correlation between venous thromboembolism and Miller Fisher Syndrome or Guillain-Barre Syndrome has not been previously described. CASE REPORT We report the case of a 32-year-old Hispanic male who presented with acute, severe thromboembolic disease and concurrently demonstrated characteristic clinical features of Miller Fisher Syndrome including ophthalmoplegia, ataxia, and areflexia. Past medical and family history were negative for thromboembolic disease, and subsequent hypercoagulability workup was unremarkable. During the course of hospitalization, the patient also developed angioedema. CONCLUSIONS We describe a possible association between Miller Fisher Syndrome, thromboembolic disease, and angioedema.
NASA Astrophysics Data System (ADS)
Faggiani Dias, D.; Subramanian, A. C.; Zanna, L.; Miller, A. J.
2017-12-01
Sea surface temperature (SST) in the Pacific sector is well known to vary on time scales from seasonal to decadal, and the ability to predict these SST fluctuations has many societal and economic benefits. Therefore, we use a suite of statistical linear inverse models (LIMs) to understand the remote and local SST variability that influences SST predictions over the North Pacific region and to further improve our understanding of how the long-observed SST record can help better guide multi-model ensemble forecasts. Observed monthly SST anomalies in the Pacific sector (between 15°S and 60°N) are used to construct different regional LIMs for seasonal to decadal prediction. The forecast skills of the LIMs are compared to those from two operational forecast systems in the North American Multi-Model Ensemble (NMME), revealing that the LIM has better skill in the Northeastern Pacific than the NMME models. The LIM is also found to have forecast skill for SST in the Tropical Pacific comparable to that of the NMME models. This skill, however, is highly dependent on the initialization month, with forecasts initialized during the summer having better skill than those initialized during the winter. The forecast skill with the LIM is also influenced by the verification period utilized to make the predictions, likely due to the changing character of El Niño in the 20th century. The North Pacific seems to be a source of predictability for the Tropics on seasonal to interannual time scales, while the Tropics act to worsen the skill for the forecast in the North Pacific. The data were also bandpassed into seasonal, interannual and decadal time scales to identify the relationships between time scales using the structure of the propagator matrix. For the decadal component, this coupling occurs the other way around: the Tropics seem to be a source of predictability for the Extratropics, but the Extratropics do not improve the predictability for the Tropics. These results indicate the importance of temporal scale interactions in improving predictability on decadal timescales. Hence, we show that LIMs are not only useful as benchmarks for estimates of statistical skill, but also for isolating contributions to the forecast skills from different timescales, spatial scales or even model components.
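The core LIM construction referred to above can be sketched in a few lines: the lag-τ propagator is estimated as G(τ) = C(τ) C(0)^-1 from the anomaly covariances and then iterated to forecast. The state vector and data below are synthetic placeholders, not the Pacific SST configuration used in the abstract.

```python
# Minimal linear inverse model (LIM) sketch: estimate the propagator
# G(tau) = C(tau) C(0)^{-1} from an anomaly time series and iterate it to forecast.
import numpy as np

def fit_lim(X, lag):
    """X: (time, state) anomaly matrix. Returns the lag-tau propagator G."""
    X0, Xtau = X[:-lag], X[lag:]
    C0 = X0.T @ X0 / len(X0)          # zero-lag covariance
    Ctau = Xtau.T @ X0 / len(X0)      # lag-tau covariance
    return Ctau @ np.linalg.inv(C0)

def lim_forecast(G, x0, steps):
    """Forecast steps*tau ahead by repeated application of the propagator."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = G @ x
    return x

# synthetic red-noise anomalies standing in for (EOF-truncated) SST anomalies
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.05], [-0.05, 0.8]])
X = np.zeros((2000, 2))
for t in range(1, 2000):
    X[t] = A @ X[t - 1] + rng.standard_normal(2)

G = fit_lim(X, lag=1)
print(G)                              # should be close to the true dynamics A
print(lim_forecast(G, X[-1], steps=6))
```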
NASA Astrophysics Data System (ADS)
Shaw, Jeremy A.; Daescu, Dacian N.
2017-08-01
This article presents the mathematical framework to evaluate the sensitivity of a forecast error aspect to the input parameters of a weak-constraint four-dimensional variational data assimilation system (w4D-Var DAS), extending the established theory from strong-constraint 4D-Var. Emphasis is placed on the derivation of the equations for evaluating the forecast sensitivity to parameters in the DAS representation of the model error statistics, including bias, standard deviation, and correlation structure. A novel adjoint-based procedure for adaptive tuning of the specified model error covariance matrix is introduced. Results from numerical convergence tests establish the validity of the model error sensitivity equations. Preliminary experiments providing a proof-of-concept are performed using the Lorenz multi-scale model to illustrate the theoretical concepts and potential benefits for practical applications.
Mengele, Karin; Napieralski, Rudolf; Magdolen, Viktor; Reuning, Ute; Gkazepis, Apostolos; Sweep, Fred; Brünner, Nils; Foekens, John; Harbeck, Nadia; Schmitt, Manfred
2010-10-01
In cancer, the serine protease urokinase-type plasminogen activator, its inhibitor (plasminogen activator inhibitor type-1) and the receptor (CD87), among other proteolytic factors, are involved in tumor cell dissemination and turnover of the extracellular matrix. Unsurprisingly, a battery of very uniform data, amassed since the end of the 1990s, has put these members of the plasminogen activation system into the forefront of prognostic/predictive cancer biomarkers relevant to predict the clinical course of cancer patients and their response to cancer therapy. The present review focuses on the molecular characteristics of the disease forecast biomarkers urokinase-type plasminogen activator and plasminogen activator inhibitor type-1, and techniques to quantitatively assess these cancer biomarkers, in the context of potential clinical application and personalized disease management.
R. A. Fisher and his advocacy of randomization.
Hall, Nancy S
2007-01-01
The requirement of randomization in experimental design was first stated by R. A. Fisher, statistician and geneticist, in 1925 in his book Statistical Methods for Research Workers. Earlier designs were systematic and involved the judgment of the experimenter; this led to possible bias and inaccurate interpretation of the data. Fisher's dictum was that randomization eliminates bias and permits a valid test of significance. Randomization in experimenting had been used by Charles Sanders Peirce in 1885 but the practice was not continued. Fisher developed his concepts of randomizing as he considered the mathematics of small samples, in discussions with "Student," William Sealy Gosset. Fisher published extensively. His principles of experimental design were spread worldwide by the many "voluntary workers" who came from other institutions to Rothamsted Agricultural Station in England to learn Fisher's methods.
MIXREG: a computer program for mixed-effects regression analysis with autocorrelated errors.
Hedeker, D; Gibbons, R D
1996-05-01
MIXREG is a program that provides estimates for a mixed-effects regression model (MRM) for normally-distributed response data including autocorrelated errors. This model can be used for analysis of unbalanced longitudinal data, where individuals may be measured at a different number of timepoints, or even at different timepoints. Autocorrelated errors of a general form or following an AR(1), MA(1), or ARMA(1,1) form are allowable. This model can also be used for analysis of clustered data, where the mixed-effects model assumes data within clusters are dependent. The degree of dependency is estimated jointly with estimates of the usual model parameters, thus adjusting for clustering. MIXREG uses maximum marginal likelihood estimation, utilizing both the EM algorithm and a Fisher-scoring solution. For the scoring solution, the covariance matrix of the random effects is expressed in its Gaussian decomposition, and the diagonal matrix reparameterized using the exponential transformation. Estimation of the individual random effects is accomplished using an empirical Bayes approach. Examples illustrating usage and features of MIXREG are provided.
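MIXREG itself is a standalone program; as a rough present-day analogue, the sketch below fits a mixed-effects regression with a random intercept and slope per subject in Python with statsmodels. The column names are hypothetical, and the AR(1)/MA(1)/ARMA(1,1) autocorrelated-error structures that MIXREG supports are not represented in this sketch.

```python
# Analogous illustration (not MIXREG itself): mixed-effects regression for
# unbalanced longitudinal data with a random intercept and slope per subject.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("longitudinal.csv")        # hypothetical columns: subject, time, y

model = sm.MixedLM.from_formula(
    "y ~ time",                             # fixed effects
    groups="subject",                       # clustering / repeated measures
    re_formula="~time",                     # random intercept and slope
    data=df,
)
result = model.fit(reml=True)               # (restricted) maximum likelihood
print(result.summary())
# empirical Bayes estimates of the random effects for the first subject
print(next(iter(result.random_effects.items())))
```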
Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation
NASA Astrophysics Data System (ADS)
Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah
2018-03-01
To support the continued development of wireless geolocation as a key future technology, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound is crucially important for evaluating whether the energy-efficient RSS-based factor graph technique is effective, and it opens the way to further innovation of the technique. The CRLB is derived in this paper using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation results show that the derived CRLB is the tightest bound, its root mean squared error (RMSE) curve lying below the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB serves as the lower bound for the energy-efficient RSS-based factor graph wireless geolocation technique.
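A generic version of the FIM-to-CRLB step is sketched below for a 2-D log-distance path-loss model with i.i.d. log-normal shadowing. It illustrates how such a bound is assembled from the FIM, but it is not the paper's factor-graph-specific derivation, and the anchor layout and parameters are illustrative choices.

```python
# Generic CRLB sketch for 2-D RSS-based positioning under a log-distance path-loss
# model P_i = P0 - 10 n log10(d_i) + w_i, w_i ~ N(0, sigma_db^2).
import numpy as np

def rss_crlb(anchors, target, path_loss_n, sigma_db):
    """Root CRLB (position RMSE lower bound, in the units of the coordinates)."""
    anchors = np.asarray(anchors, dtype=float)
    diff = target - anchors                      # vectors anchor -> target
    d2 = np.sum(diff**2, axis=1)                 # squared distances
    k = 10.0 * path_loss_n / (sigma_db * np.log(10.0))
    # FIM = k^2 * sum_i (diff_i diff_i^T) / d_i^4
    F = k**2 * (diff.T * (1.0 / d2**2)) @ diff
    return np.sqrt(np.trace(np.linalg.inv(F)))

anchors = [(0, 0), (100, 0), (0, 100), (100, 100)]   # hypothetical node layout (m)
print(rss_crlb(anchors, target=np.array([30.0, 60.0]), path_loss_n=3.0, sigma_db=4.0))
```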
Injectable CMC/PEI gel as an in vivo scaffold for demineralized bone matrix.
Kim, Kyung Sook; Kang, Yun Mi; Lee, Ju Young; Kim, E Sle; Kim, Chun Ho; Min, Byoung Hyun; Lee, Hai Bang; Kim, Jae Ho; Kim, Moon Suk
2009-01-01
A number of materials have been considered as sources of grafts to repair bone defects. Here, we examined the possibility of creating in situ-forming gels from sodium carboxymethylcellulose (CMC) and poly(ethyleneimine) (PEI) for use as an in vivo carrier of demineralized bone matrix (DBM). The interaction between anionic CMC and cationic PEI was examined by evaluating phase transition behavior and viscosity of CMC solutions containing 0-30 wt% PEI. CMC solutions containing 10 wt% PEI exhibited a sol-to-gel phase transition at temperatures greater than 35 degrees C. The phase transition is caused by electrostatic crosslinking of the CMC/PEI solution to form a gel with a three-dimensional network structure. In situ-formed gel implants were successfully fabricated in vivo by simple subcutaneous injection of the CMC/PEI (90/10) solution (with and without DBM) into Fisher rats. The resulting in situ-formed implant maintained its shape for 28 days in vitro and in vivo. Our results show that in situ-forming CMC/PEI gels can serve as a DBM carrier that can be delivered with a minimally invasive procedure.
An approximate stationary solution for multi-allele neutral diffusion with low mutation rates.
Burden, Conrad J; Tang, Yurong
2016-12-01
We address the problem of determining the stationary distribution of the multi-allelic, neutral-evolution Wright-Fisher model in the diffusion limit. A full solution to this problem for an arbitrary K×K mutation rate matrix involves solving for the stationary solution of a forward Kolmogorov equation over a (K-1)-dimensional simplex, and remains intractable. In most practical situations mutation rates are slow on the scale of the diffusion limit and the solution is heavily concentrated on the corners and edges of the simplex. In this paper we present a practical approximate solution for slow mutation rates in the form of a set of line densities along the edges of the simplex. The method of solution relies on parameterising the general non-reversible rate matrix as the sum of a reversible part and a set of (K-1)(K-2)/2 independent terms corresponding to fluxes of probability along closed paths around faces of the simplex. The solution is potentially a first step in estimating non-reversible evolutionary rate matrices from observed allele frequency spectra. Copyright © 2016 Elsevier Inc. All rights reserved.
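A quick way to see the concentration on the corners and edges of the simplex is direct simulation of the discrete Wright-Fisher model with slow mutation, as in the hedged sketch below; the population size, mutation rate and near-fixation threshold are arbitrary illustrative choices, and this is not the paper's line-density approximation.

```python
# Illustrative discrete Wright-Fisher simulation with K alleles and slow mutation,
# used only to visualize how the allele-frequency distribution concentrates near
# the corners and edges of the simplex.
import numpy as np

def wright_fisher(N, Q, generations, rng=None):
    """N haploid individuals, K x K per-generation mutation-probability matrix Q."""
    rng = np.random.default_rng(rng)
    K = Q.shape[0]
    x = np.full(K, 1.0 / K)                     # start at the centre of the simplex
    freqs = np.empty((generations, K))
    for t in range(generations):
        x = x @ Q                               # mutation
        x = rng.multinomial(N, x) / N           # genetic drift: multinomial resampling
        freqs[t] = x
    return freqs

K, N, mu = 3, 200, 1e-4
Q = np.full((K, K), mu)
np.fill_diagonal(Q, 1.0 - (K - 1) * mu)
f = wright_fisher(N, Q, generations=200_000, rng=0)
# fraction of generations in which the population is nearly fixed for one allele
print(np.mean(f.max(axis=1) > 0.95))
```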
76 FR 55138 - Post Office Closing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-06
... the closing of the Fishers Landing, New York post office has been filed. It identifies preliminary... Postal Service's determination to close the Fishers Landing post office in Fishers Landing, New York. The...
Michael K. Schwartz; Nicholas J. DeCesare; Benjamin S. Jimenez; Jeffrey P. Copeland; Wayne E. Melquist
2013-01-01
The fisher (Pekania pennanti; formerly known as Martes pennanti) is a North American endemic mustelid with a geographic distribution that spans much of the boreal forests of North America. In the Northern Rocky Mountains (NRM), fishers have been the focus of Endangered Species Act (ESA) listing decisions. Habitat studies of West Coast fishers in California have...
Supplying osteogenesis to dead bone using an osteogenic matrix cell sheet.
Uchihara, Yoshinobu; Akahane, Manabu; Okuda, Akinori; Shimizu, Takamasa; Masuda, Keisuke; Kira, Tsutomu; Kawate, Kenji; Tanaka, Yasuhito
2018-02-22
To evaluate whether osteogenic matrix cell sheets can supply osteogenesis to dead bone. Femur bone fragments (5 mm in length) were obtained from Fisher 344 rats and irradiated by a single exposure of 60 Gy to produce bones that were no longer viable. Osteogenic matrix cell sheets were created from rat bone marrow-derived stromal cells (BMSCs). After wrapping the dead bone with an osteogenic matrix cell sheet, it was subcutaneously transplanted into the back of a rat and harvested after 4 weeks. Bone formation around the dead bone was evaluated by X-ray imaging and histology. Alkaline phosphatase (ALP) and osteocalcin (OC) mRNA expression levels were measured to confirm osteogenesis of the transplanted bone. The contribution of donor cells to bone formation was assessed using the Sry gene and PKH26. After the cell sheet was transplanted together with dead bone, X-ray images showed abundant calcification around the dead bone. In contrast, no newly formed bone was seen in samples that were transplanted without the cell sheet. Histological sections also showed newly formed bone around dead bone in samples transplanted with the cell sheet, whereas many empty lacunae and no newly formed bone were observed in samples transplanted without the cell sheet. ALP and OC mRNA expression levels were significantly higher in dead bones transplanted with cell sheets than in those without a cell sheet (P < 0.01). Sry gene expression and cells derived from cell sheets labeled with PKH26 were detected in samples transplanted with a cell sheet, indicating survival of donor cells after transplantation. Our study indicates that osteogenic matrix cell sheet transplantation can supply osteogenesis to dead bone. Copyright © 2018. Published by Elsevier B.V.
Gabriel, Mourad W.; Woods, Leslie W.; Poppenga, Robert; Sweitzer, Rick A.; Thompson, Craig; Matthews, Sean M.; Higley, J. Mark; Keller, Stefan M.; Purcell, Kathryn; Barrett, Reginald H.; Wengert, Greta M.; Sacks, Benjamin N.; Clifford, Deana L.
2012-01-01
Anticoagulant rodenticide (AR) poisoning has emerged as a significant concern for conservation and management of non-target wildlife. The purpose for these toxicants is to suppress pest populations in agricultural or urban settings. The potential of direct and indirect exposures and illicit use of ARs on public and community forest lands has recently raised concern for fishers (Martes pennanti), a candidate for listing under the federal Endangered Species Act in the Pacific states. In an investigation of threats to fisher population persistence in the two isolated California populations, we assessed the magnitude of this previously undocumented threat to fishers: we tested 58 carcasses for the presence and quantity of ARs, conducted spatial analysis of exposed fishers in an effort to identify potential point sources of AR, and identified fishers that died directly due to AR poisoning. We found 46 of 58 (79%) fishers exposed to an AR, with 96% of those individuals having been exposed to one or more second-generation AR compounds. No spatial clustering of AR exposure was detected and the spatial distribution of exposure suggests that AR contamination is widespread within the fisher’s range in California, which encompasses mostly public forest and park lands. Additionally, we diagnosed four fisher deaths, including a lactating female, that were directly attributed to AR toxicosis and documented the first neonatal or milk transfer of an AR to an altricial fisher kit. These ARs, some of which are acutely toxic, pose both a direct mortality or fitness risk to fishers, and a significant indirect risk to these isolated populations. Future research should be directed towards investigating risks to the prey populations fishers depend on, exposure in other rare forest carnivores, and potential AR point sources such as illegal marijuana cultivation in the range of fishers on California public lands. PMID:22808110
Popular media records reveal multi-decadal trends in recreational fishing catch rates.
Thurstan, Ruth H; Game, Edward; Pandolfi, John M
2017-01-01
Despite threats to human wellbeing from ecological degradation, public engagement with this issue remains at low levels. However, studies have shown that crafting messages to resonate with people's personal experiences can enhance engagement. Recreational fishing is one of the principal ways in which people interact with aquatic environments, but long-term data from this perspective are considered rare. We uncovered 852 popular media records of recreational fishing for an Australian estuary across a 140-year period. Using information contained in these articles we analysed the species composition of recreational catches over time and constructed two distinct time series of catch and effort (n fish fisher^-1 trip^-1; kg fish fisher^-1 trip^-1) for recreational fishing trips and fishing club competitions (mean n and kg fish caught across all competitors, and n and kg fish caught by the competition winner). Reported species composition remained similar over time. Catch rates reported from recreational fishing trips (1900-1998) displayed a significant decline, averaging 32.5 fish fisher^-1 trip^-1 prior to 1960, and 18.8 fish fisher^-1 trip^-1 post-1960. Mean n fish fisher^-1 competition^-1 (1913-1983) also significantly declined, but best n fish fisher^-1 competition^-1 (1925-1980) displayed no significant change, averaging 31.2 fish fisher^-1 competition^-1 over the time series. Mean and best kg fish fisher^-1 competition^-1 trends also displayed no significant change, averaging 4.2 and 9.9 kg fisher^-1 competition^-1, respectively. These variable trends suggest that while some fishers experienced diminishing returns in this region over the last few decades, the most skilled inshore fishers were able to maintain their catch rates, highlighting the difficulties inherent in crafting conservation messages that will resonate with all sections of a community. Despite these challenges, this research demonstrates that popular media sources can provide multiple long-term trends at spatial scales, in units and via a recreational experience that many people can relate to.
Suffice, Pauline; Asselin, Hugo; Imbeau, Louis; Cheveau, Marianne; Drapeau, Pierre
2017-09-07
Monitoring of fur-bearing species populations is relatively rare due to their low densities. In addition to catch data, trappers' experience provides information on the ecology and status of the harvested species. Fisher (Pekania pennanti) and American marten (Martes americana) are mustelids that are sensitive to forest management and therefore considered to be ecological indicators of forest health. Fisher populations have increased in eastern North America since the early 2000s and this could have resulted in a northeastern extension of the species' range and increased overlap with marten's range. Moreover, habitats of both species are subject to natural and anthropogenic disturbances. The objective of this study was to document the knowledge held by local trappers in the northern area of sympatry between fisher and marten to identify factors that could explain variation in populations of the two species and interactions between them. Forty-one semi-directed interviews with Indigenous and non-Indigenous trappers in the Abitibi-Témiscamingue region of western Quebec (Canada), at the northern limit of the overlapping ranges of the two mustelid species. Trappers highlighted the lack of exclusivity of marten and fisher to coniferous forests, although marten is more closely associated with them than is fisher. Fisher apparently also takes advantage of open environments, including agroforestry systems. Moreover, climate change increases the frequency of freeze-thaw events that cause the formation of an ice crust on the snow surface, which favors fisher movements. The fisher was identified as a competitor and even a predator of the marten. Furthermore, the fisher is less affected than the marten by forest management, and it also seems to benefit from climate change to a greater extent.
Fisher information in a quantum-critical environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Zhe; Ma Jian; Lu Xiaoming
2010-08-15
We consider a process of parameter estimation in a spin-j system surrounded by a quantum-critical spin chain. Quantum Fisher information lies at the heart of the estimation task. We employ an Ising spin chain in a transverse field, which exhibits a quantum phase transition, as the environment. The Fisher information decays with time almost monotonically when the environment reaches the critical point. By choosing a fixed time or taking the time average, one can see that the quantum Fisher information exhibits a sudden drop at the critical point. Different initial states of the environment are considered. The phenomenon that the quantum Fisher information, namely, the precision of estimation, changes dramatically can be used to detect the quantum criticality of the environment. We also introduce a general method to obtain the maximal Fisher information for a given state.
Quantum Fisher information of the Greenberg-Horne-Zeilinger state in decoherence channels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma Jian; Huang Yixiao; Wang Xiaoguang
2011-08-15
Quantum Fisher information of a parameter characterizes the sensitivity of the state with respect to changes of the parameter. In this article, we study the quantum Fisher information of a state with respect to SU(2) rotations under three decoherence channels: the amplitude-damping, phase-damping, and depolarizing channels. The initial state is chosen to be a Greenberg-Horne-Zeilinger state of which the phase sensitivity can achieve the Heisenberg limit. By using the Kraus operator representation, the quantum Fisher information is obtained analytically. We observe the decay and sudden change of the quantum Fisher information in all three channels.
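For context, the ideal noise-free benchmark behind the Heisenberg-limit statement above is the standard quantum Fisher information of a pure state under a phase rotation,

\[
F_{Q}\big[|\psi_{\phi}\rangle\big] = 4\Big(\langle \partial_{\phi}\psi_{\phi} | \partial_{\phi}\psi_{\phi}\rangle - \big|\langle \psi_{\phi} | \partial_{\phi}\psi_{\phi}\rangle\big|^{2}\Big),
\]

which for an N-qubit Greenberg-Horne-Zeilinger state rotated by \( e^{-i\phi J_{z}} \) gives

\[
F_{Q} = N^{2}, \qquad \Delta\phi \ \ge\ \frac{1}{\sqrt{F_{Q}}} = \frac{1}{N}.
\]

The channel-specific results summarized in the abstract describe how the three decoherence channels pull F_Q below this ideal value.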
An early "Atkins' Diet": RA Fisher analyses a medical "experiment".
Senn, Stephen
2006-04-01
A study on vitamin absorption which RA Fisher analysed for WRG Atkins and co-authored with him is critically examined. The historical background as well as correspondence between Atkins and Fisher is presented.
Can reliable values of Young's modulus be deduced from Fisher's (1971) spinning lens measurements?
Burd, H J; Wilde, G S; Judge, S J
2006-04-01
The current textbook view of the causes of presbyopia rests very largely on a series of experiments reported by R.F. Fisher some three decades ago, and in particular on the values of lens Young's modulus inferred from the deformation caused by spinning excised lenses about their optical axis (Fisher 1971). We studied the extent to which inferred values of Young's modulus are influenced by assumptions inherent in the mathematical procedures used by Fisher to interpret the test, and we investigated several alternative interpretation methods. The results suggest that modelling assumptions inherent in Fisher's original method may have led to systematic errors in the determination of the Young's modulus of the cortex and nucleus. Fisher's conclusion that the cortex is stiffer than the nucleus, particularly in middle age, may be an artefact associated with these systematic errors. Moreover, none of the models we explored are able to account for Fisher's claim that the removal of the capsule has only a modest effect on the deformations induced in the spinning lens.
de Souza, Moysés Loiola Ponte; Vieira, Ana Cláudia C; Andrade, Gustavo; Quinino, Saul; de Fátima Leal Griz, Maria; Azevedo-Filho, Hildo R C
2015-08-01
To associate the presence of language deficits with varying scores of the Fisher grading scale in patients with subarachnoid hemorrhage in the period preceding the treatment of aneurysm in the anterior circulation, as well as to compare the scores of this scale, identifying the grades more associated with the decline of language. Database analysis of 185 preoperative evaluations of language, through the Montreal Toulouse Protocol Alpha version and verbal fluency through CERAD battery, of patients from "Hospital da Restauração" with aneurysmal subarachnoid hemorrhage, divided according to the Fisher grading scale (Fisher I, II, III, or IV) and compared with a control group of individuals considered normal. The various scores of the Fisher grading scale have different levels of language deficits, more pronounced as the amount of blood increases. Fisher III and IV scores are most associated with the decline of language. Our study made it possible to obtain information not yet available in the literature, by correlating the various scores of the Fisher grading scale with language yet in the period preceding treatment. Copyright © 2015 Elsevier Inc. All rights reserved.
Fisher information of accelerated two-qubit systems
NASA Astrophysics Data System (ADS)
Metwally, N.
2018-02-01
In this paper, Fisher information for an accelerated system initially prepared in the X-state is discussed. An analytical solution, which consists of three parts: classical, the average over all pure states and a mixture of pure states, is derived for the general state and for the Werner state. It is shown that the Unruh acceleration has a depleting effect on the Fisher information. This depletion depends on the degree of entanglement of the initial state settings. For the X-state, over some intervals of the Unruh acceleration, the Fisher information remains constant, irrespective of the Unruh acceleration. In general, the possibility of estimating the state’s parameters decreases as the acceleration increases. However, the precision of estimation can be maximized for certain values of the Unruh acceleration. We also investigate the contribution of the different parts of the Fisher information to the dynamics of the total Fisher information.
More than Anecdotes: Fishers' Ecological Knowledge Can Fill Gaps for Ecosystem Modeling.
Bevilacqua, Ana Helena V; Carvalho, Adriana R; Angelini, Ronaldo; Christensen, Villy
2016-01-01
Ecosystem modeling applied to fisheries remains hampered by a lack of local information. Fishers' knowledge could fill this gap, improving participation in and the management of fisheries. The same fishing area was modeled using two approaches: based on fishers' knowledge and based on scientific information. For the former, the data was collected by interviews through the Delphi methodology, and for the latter, the data was gathered from the literature. Agreement between the attributes generated by the fishers' knowledge model and scientific model is discussed and explored, aiming to improve data availability, the ecosystem model, and fisheries management. The ecosystem attributes produced from the fishers' knowledge model were consistent with the ecosystem attributes produced by the scientific model, and elaborated using only the scientific data from literature. This study provides evidence that fishers' knowledge may suitably complement scientific data, and may improve the modeling tools for the research and management of fisheries.
Moving horizon estimation for assimilating H-SAF remote sensing data into the HBV hydrological model
NASA Astrophysics Data System (ADS)
Montero, Rodolfo Alvarado; Schwanenberg, Dirk; Krahe, Peter; Lisniak, Dmytro; Sensoy, Aynur; Sorman, A. Arda; Akkol, Bulut
2016-06-01
Remote sensing information has been extensively developed over the past few years, including spatially distributed data for hydrological applications at high resolution. The implementation of these products in operational flow forecasting systems is still an active field of research, wherein data assimilation plays a vital role in the improvement of the initial conditions of streamflow forecasts. We present a novel implementation of a variational method based on Moving Horizon Estimation (MHE), in application to the conceptual rainfall-runoff model HBV, to simultaneously assimilate remotely sensed snow covered area (SCA), snow water equivalent (SWE), soil moisture (SM) and in situ measurements of streamflow data using large assimilation windows of up to one year. This innovative application of the MHE approach allows precipitation, temperature, soil moisture, and the upper- and lower-zone water storages of the conceptual model to be updated simultaneously within the assimilation window, without an explicit formulation of error covariance matrices, and it enables a highly flexible formulation of distance metrics for the agreement of simulated and observed variables. The framework is tested in two data-dense sites in Germany and one data-sparse environment in Turkey. Results show a potential improvement of the lead time performance of streamflow forecasts by using perfect time series of state variables generated by the simulation of the conceptual rainfall-runoff model itself. The framework is also tested using new operational data products from the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) of EUMETSAT. This study is the first application of H-SAF products to hydrological forecasting systems and it verifies their added value. Results from assimilating H-SAF observations lead to a slight reduction of the streamflow forecast skill in all three cases compared to the assimilation of streamflow data only. On the other hand, the forecast skill of soil moisture shows a significant improvement.
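The variational idea can be caricatured with a toy one-bucket model: over the assimilation window, adjust the initial storage and per-step forcing multipliers to trade off the streamflow mismatch against a penalty on the size of the updates. The sketch below is schematic and is not the authors' HBV/MHE implementation; the model, weights and data are invented.

```python
# Toy moving-horizon-style variational estimation for a one-bucket runoff model.
import numpy as np
from scipy.optimize import minimize

def simulate(s0, precip, k=0.1):
    """Linear-reservoir toy model: storage s, discharge q = k * s."""
    s, q = s0, []
    for p in precip:
        s = s + p - k * s
        q.append(k * s)
    return np.array(q)

def mhe_cost(z, precip, q_obs, sigma_q=0.5, sigma_m=0.2):
    s0, mult = z[0], z[1:]
    q_sim = simulate(s0, mult * precip)
    return (np.sum(((q_sim - q_obs) / sigma_q) ** 2)      # observation mismatch
            + np.sum(((mult - 1.0) / sigma_m) ** 2))      # penalty on forcing updates

rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 2.0, size=30)                     # synthetic window of forcing
q_obs = simulate(40.0, 1.1 * precip) + rng.normal(0, 0.3, 30)

z0 = np.concatenate([[20.0], np.ones(30)])                # first guess: s0 and multipliers
res = minimize(mhe_cost, z0, args=(precip, q_obs), method="L-BFGS-B")
print(res.x[0], res.x[1:4])                               # updated initial storage, first multipliers
```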
Bettoli, Phillip William; Casto-Yerty, M.; Scholten, G.D.; Heist, Edward J.
2009-01-01
We quantified the bycatch of pallid sturgeon Scaphirhynchus albus in Tennessee's shovelnose sturgeon (Scaphirhynchus platorynchus) fishery by accompanying commercial fishers and monitoring their catch on five dates in spring 2007. Fishers were free to keep or discard any sturgeon they collected in their gillnets and trotlines and we were afforded the opportunity to collect meristic and morphometric data and tissue samples from discarded and harvested specimens. Fishers removed 327 live sturgeon from their gear in our presence, of which 93 were harvested; we also obtained the carcasses of 20 sturgeon that a fisher harvested out of our sight while we were on the water with another fisher. Two of the 113 harvested sturgeon were confirmed pallid sturgeon based on microsatellite DNA analyses. Additionally, fishers gave us five, live pallid sturgeon that they had removed from their gear. If the incidental harvest rate of pallid sturgeon (1.8% of all sturgeon harvested) was similar in the previous two commercial seasons, at least 169 adult pallid sturgeon were harvested by commercial fishers in the Tennessee waters of the Mississippi River in 2005-2007. If fishers altered their behavior because of our presence (i.e. if they were more conservative in what they harvested), the pallid sturgeon take was probably higher when they fished unaccompanied by observers. While retrieving a gill net set the previous day, a fisher we were accompanying retrieved a gillnet lost 2 days earlier; this ghost net caught 53 sturgeon whereby one fish was harvested but most fish were dead, including one confirmed pallid sturgeon.
An evaluation of parturition indices in fishers
Frost, H.C.; York, E.C.; Krohn, W.B.; Elowe, K.D.; Decker, T.A.; Powell, S.M.; Fuller, T.K.
1999-01-01
Fishers (Martes pennanti) are important forest carnivores and furbearers that are susceptible to overharvest. Traditional indices used to monitor fisher populations typically overestimate litter size and proportion of females that give birth. We evaluated the usefulness of 2 indices of reproduction to determine proportion of female fishers that gave birth in a particular year. We used female fishers of known age and reproductive histories to compare appearance of placental scars with incidence of pregnancy and litter size. Microscopic observation of freshly removed reproductive tracts correctly identified pregnant fishers and correctly estimated litter size in 3 of 4 instances, but gross observation of placental scars failed to correctly identify pregnant fishers and litter size. Microscopic observations of reproductive tracts in carcasses that were not fresh also failed to identify pregnant animals and litter size. We evaluated mean sizes of anterior nipples to see if different reproductive classes could be distinguished. Mean anterior nipple size of captive and wild fishers correctly identified current-year breeders from nonbreeders. Former breeders were misclassified in 4 of 13 instances. Presence of placental scars accurately predicted parturition in a small sample size of fishers, but absence of placental scars did not signify that a female did not give birth. In addition to enabling the estimation of parturition rates in live animals more accurately than traditional indices, mean anterior nipple size also provided an estimate of the percentage of adult females that successfully raised young. Though using mean anterior nipple size to index reproductive success looks promising, additional data are needed to evaluate effects of using dried, stretched pelts on nipple size for management purposes.
A Machine Learning Framework to Forecast Wave Conditions
NASA Astrophysics Data System (ADS)
Zhang, Y.; James, S. C.; O'Donncha, F.
2017-12-01
Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution. The nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were less than those for the SWAN model-verification exercise where those simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds. This solution has obvious applications to wave-energy generation as accurate wave conditions can be forecasted with over a three-order-of-magnitude reduction in computational expense. The low computational cost (and by association low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing" where a device could forecast its own 48-hour energy production.
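The "mapping matrix" idea can be illustrated with ordinary least squares: fit a linear map from forcing features to the simulated wave-height field, then forecast new fields with a single matrix multiplication. Everything below (the feature count, the 3,104-point grid size echoed from the abstract, the synthetic data) is a placeholder for the actual SWAN training set and is not the authors' algorithm.

```python
# Sketch of a linear surrogate: learn a mapping matrix from forcing features to a
# gridded wave-height field, then forecast with one matrix multiplication.
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_features, n_grid = 2000, 12, 3104        # e.g. boundary waves, winds, currents
X = rng.standard_normal((n_runs, n_features))      # stand-in for forcing inputs per run
M_true = rng.standard_normal((n_features, n_grid))
H = X @ M_true + 0.05 * rng.standard_normal((n_runs, n_grid))   # "SWAN" wave heights

# training: mapping matrix via least squares (with an intercept column)
Xa = np.column_stack([np.ones(n_runs), X])
M, *_ = np.linalg.lstsq(Xa, H, rcond=None)

# forecasting: one matrix multiplication per new forcing vector
x_new = np.concatenate([[1.0], rng.standard_normal(n_features)])
h_forecast = x_new @ M                              # 3,104 wave heights in one shot
print(h_forecast.shape)
```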
NASA Astrophysics Data System (ADS)
Perekhodtseva, Elvira V.
2010-05-01
Development of a successful method for forecasting storm winds, including squalls and tornadoes, which often result in human and material losses, would allow proper measures to be taken against the destruction of buildings and to protect people. A successful forecast made well in advance (from 12 to 48 hours) makes it possible to reduce the losses. Until recently, prediction of these phenomena was a very difficult problem for forecasters. The existing graphical and calculation methods still depend on the subjective decision of an operator. At present there is no hydrodynamic model in Russia for forecasting maximal wind velocities V > 25 m/s, hence the main tools of objective forecasting are statistical methods that use the dependence of the phenomena on a number of atmospheric parameters (predictors). A statistical decision rule for the alternative and probabilistic forecast of these events was obtained in accordance with the "perfect prognosis" concept using objective-analysis data. For this purpose, training samples for the presence and absence of storm wind and rainfall were assembled automatically, containing the values of forty physically substantiated potential predictors. An empirical statistical method was then used that involves diagonalization of the mean correlation matrix R of the predictors and extraction of diagonal blocks of strongly correlated predictors. In this way the most informative predictors for these phenomena were selected without losing information. The statistical decision rules U(X) for diagnosis and prognosis of the phenomena were calculated for the chosen informative vector-predictor. The Mahalanobis distance criterion and the Vapnik-Chervonenkis minimum-entropy criterion were used for predictor selection. Successful development of hydrodynamic models for short-term forecasting and the improvement of 36-48 h forecasts of pressure, temperature and other parameters allowed us to use the prognostic fields of those models to calculate the discriminant functions at the nodes of a 75x75 km grid, together with the probabilities P of dangerous wind, and thus to obtain fully automated forecasts. To apply the alternative forecast to the European part of Russia and to Europe, the author proposes empirical threshold values specified for this phenomenon and a lead time of 36 hours. According to the Pirsey-Obukhov criterion (T), the skill of this hydrometeorological-statistical method for forecasting storm winds and tornadoes 36-48 hours ahead in the warm season over the European part of Russia and Siberia is T = 1 - a - b = 0.54-0.78, based on independent and author-conducted experiments during 2004-2009. Many examples of very successful forecasts for the territory of Europe and Russia are presented in this report. The same decision rules were also applied to forecasting these phenomena during the cold period of 2009-2010. In the first months of 2010, many cases of storm wind with heavy snowfall were observed, and were successfully forecast, over the territory of France, Italy and Germany.
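A minimal Python sketch of a Mahalanobis-distance discriminant and the skill score quoted above; this is not the author's method, the predictor values and class separations are synthetic, and reading T = 1 - a - b as one minus the miss rate minus the false-alarm rate is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical training samples: rows are cases, columns are selected predictors.
X_event = rng.normal(loc=1.0, size=(200, 5))     # storm-wind cases
X_none  = rng.normal(loc=0.0, size=(400, 5))     # no-storm cases

mu1, mu0 = X_event.mean(axis=0), X_none.mean(axis=0)
S = np.cov(np.vstack([X_event - mu1, X_none - mu0]).T)   # pooled-style covariance of predictors
S_inv = np.linalg.inv(S)

def discriminant(x):
    """Discriminant U(x): positive values favour the storm-wind class."""
    d1 = (x - mu1) @ S_inv @ (x - mu1)   # squared Mahalanobis distance to the event class
    d0 = (x - mu0) @ S_inv @ (x - mu0)   # squared Mahalanobis distance to the no-event class
    return d0 - d1

# Skill score T = 1 - a - b, with a and b taken here as miss and false-alarm rates.
pred_event = np.array([discriminant(x) > 0 for x in np.vstack([X_event, X_none])])
truth = np.r_[np.ones(len(X_event), bool), np.zeros(len(X_none), bool)]
miss_rate = 1 - (pred_event & truth).sum() / truth.sum()
false_alarm = (pred_event & ~truth).sum() / (~truth).sum()
print("T =", round(1 - miss_rate - false_alarm, 2))
```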
Treating Sample Covariances for Use in Strongly Coupled Atmosphere-Ocean Data Assimilation
NASA Astrophysics Data System (ADS)
Smith, Polly J.; Lawless, Amos S.; Nichols, Nancy K.
2018-01-01
Strongly coupled data assimilation requires cross-domain forecast error covariances; information from ensembles can be used, but limited sampling means that ensemble derived error covariances are routinely rank deficient and/or ill-conditioned and marred by noise. Thus, they require modification before they can be incorporated into a standard assimilation framework. Here we compare methods for improving the rank and conditioning of multivariate sample error covariance matrices for coupled atmosphere-ocean data assimilation. The first method, reconditioning, alters the matrix eigenvalues directly; this preserves the correlation structures but does not remove sampling noise. We show that it is better to recondition the correlation matrix rather than the covariance matrix as this prevents small but dynamically important modes from being lost. The second method, model state-space localization via the Schur product, effectively removes sample noise but can dampen small cross-correlation signals. A combination that exploits the merits of each is found to offer an effective alternative.
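A brief Python sketch of the two approaches compared above, under simplifying assumptions: an eigenvalue-floor variant of reconditioning applied to the sample correlation matrix, and a hypothetical Gaussian localization matrix for the Schur product. Neither the target condition number nor the localization length comes from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vars, n_ens = 40, 10                      # more variables than members -> rank-deficient sample covariance
ens = rng.normal(size=(n_ens, n_vars))
P = np.cov(ens, rowvar=False)               # sample covariance (rank <= n_ens - 1)

# Method 1: recondition the correlation matrix (simplified eigenvalue-floor variant).
d = np.sqrt(np.diag(P))
C = P / np.outer(d, d)                      # correlation matrix
w, V = np.linalg.eigh(C)
kappa_target = 100.0                        # hypothetical target condition number
w_min = w.max() / kappa_target
C_recon = (V * np.maximum(w, w_min)) @ V.T  # raise small eigenvalues, keep correlation structure
P_recon = C_recon * np.outer(d, d)          # map back to a covariance

# Method 2: Schur (element-wise) product localization with a Gaussian taper.
dist = np.abs(np.subtract.outer(np.arange(n_vars), np.arange(n_vars)))
L = np.exp(-(dist / 10.0) ** 2)             # hypothetical localization matrix
P_loc = P * L                               # damps spurious long-range sample correlations

for name, M in [("raw", P), ("reconditioned", P_recon), ("localized", P_loc)]:
    ev = np.linalg.eigvalsh(M)
    print("%-13s min eig %+.2e  max eig %.2e" % (name, ev.min(), ev.max()))
```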
Prioritization Methodology for Chemical Replacement
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1993-01-01
This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
NASA Astrophysics Data System (ADS)
Sari Rochman, E. M.; Rachmad, A.; Syakur, M. A.; Suzanti, I. O.
2018-01-01
Community Health Centers (Puskesmas) are health service institutions that provide individual health services for outpatient, inpatient and emergency care. The outpatient service comprises several polyclinics, including Ear, Nose, and Throat (ENT), Eye, Dental, Pediatric, and internal medicine clinics. The dental polyclinic provides dental and oral health services directed to the community. At present, the management team of the dental polyclinic often has difficulties preparing and planning to serve the expected number of patients, because the clinic does not have enough workers with the right qualifications. The purpose of this study is to build a system for forecasting patient visits, predicting how many patients will come so that the resources provided match the needs of the Puskesmas. In the extreme learning machine (ELM) method, input weights and biases are initially set at random, and the final output weights are obtained with a generalized inverse of the hidden-layer output matrix, which maps each input to the hidden layer; as a result, ELM has a fast learning speed. In our experiments, the ELM method predicted the number of patient visits with an RMSE of 0.0426.
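A minimal Python sketch of an extreme learning machine of the kind described above; the lagged monthly visit series, hidden-layer size and tanh activation are assumptions for illustration, since the study's Puskesmas data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical monthly patient-visit series standing in for the clinic's records.
visits = 200 + 30 * np.sin(np.arange(60) / 6.0) + rng.normal(scale=10, size=60)

# Build lagged inputs: predict this month's visits from the previous 3 months.
lag = 3
X = np.column_stack([visits[i:len(visits) - lag + i] for i in range(lag)])
y = visits[lag:]

# Extreme Learning Machine: random input weights/biases, analytic output weights.
n_hidden = 20
W = rng.normal(size=(lag, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                 # hidden-layer output matrix
beta = np.linalg.pinv(H) @ y           # output weights via the Moore-Penrose generalized inverse

y_hat = H @ beta
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
print("training RMSE:", round(float(rmse), 3))
```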
Popular media records reveal multi-decadal trends in recreational fishing catch rates
Game, Edward; Pandolfi, John M.
2017-01-01
Despite threats to human wellbeing from ecological degradation, public engagement with this issue remains at low levels. However, studies have shown that crafting messages to resonate with people’s personal experiences can enhance engagement. Recreational fishing is one of the principal ways in which people interact with aquatic environments, but long-term data from this perspective are considered rare. We uncovered 852 popular media records of recreational fishing for an Australian estuary across a 140-year period. Using information contained in these articles we analysed the species composition of recreational catches over time and constructed two distinct time series of catch and effort (n fish fisher-1 trip-1; kg fish fisher-1 trip-1) for recreational fishing trips and fishing club competitions (mean n and kg fish caught across all competitors, and n and kg fish caught by the competition winner). Reported species composition remained similar over time. Catch rates reported from recreational fishing trips (1900–1998) displayed a significant decline, averaging 32.5 fish fisher-1 trip-1 prior to 1960, and 18.8 fish fisher-1 trip-1 post-1960. Mean n fish fisher-1 competition-1 (1913–1983) also significantly declined, but best n fish fisher-1 competition-1 (1925–1980) displayed no significant change, averaging 31.2 fish fisher-1 competition-1 over the time series. Mean and best kg fish fisher-1 competition-1 trends also displayed no significant change, averaging 4.2 and 9.9 kg fisher-1 competition-1, respectively. These variable trends suggest that while some fishers experienced diminishing returns in this region over the last few decades, the most skilled inshore fishers were able to maintain their catch rates, highlighting the difficulties inherent in crafting conservation messages that will resonate with all sections of a community. Despite these challenges, this research demonstrates that popular media sources can provide multiple long-term trends at spatial scales, in units and via a recreational experience that many people can relate to. PMID:28777809
Hazardous Waste Cleanup: Fisher Scientific Chemical Division in Fair Lawn, New Jersey
Fisher Scientific Chemical Division occupies a 10-acre site at 1 Reagent Lane in the Fair Lawn Industrial Park, New Jersey. Since 1955, Fisher has formulated, distilled, repackaged and distributed high-purity, laboratory-grade reagents and solvents.
Statistical Correction of Air Temperature Forecasts for City and Road Weather Applications
NASA Astrophysics Data System (ADS)
Mahura, Alexander; Petersen, Claus; Sass, Bent; Gilet, Nicolas
2014-05-01
A method for statistical correction of air and road-surface temperature forecasts was developed based on analysis of long-term time series of meteorological observations and forecasts (from the HIgh Resolution Limited Area Model and the Road Conditions Model; 3 km horizontal resolution). It was tested for May-Aug 2012 and Oct 2012 - Mar 2013, respectively. The method relies mostly on forecasted meteorological parameters, with minimal inclusion of observations (covering only a pre-history period). Although the first-iteration correction takes relevant temperature observations into account, the further adjustment of air and road temperature forecasts is based purely on forecasted meteorological parameters. The method is model independent, i.e. it can be applied for temperature correction with other types of models having different horizontal resolutions. It is relatively fast because the correction coefficients are found by solving the matrix system with the singular value decomposition method (a minimal sketch of such a fit is given after this record). Moreover, there is always a possibility of additional improvement through extra tuning of the temperature forecasts for some locations (stations), in particular where the MAEs are generally higher than elsewhere (see Gilet et al., 2014). For city weather applications, a new operational procedure for statistical correction of air temperature forecasts has been elaborated and implemented for the HIRLAM-SKA model runs at 00, 06, 12, and 18 UTC, covering forecast lengths up to 48 hours. The procedure includes segments for extraction of observations and forecast data, assigning these to forecast lengths, statistical correction of temperature, one- and multi-day statistical evaluation of model performance, decision-making on applying corrections by station, interpolation, visualisation and storage/backup. Pre-operational air temperature correction runs have been performed for mainland Denmark since mid-April 2013 and have shown good results. Tests also showed that the CPU time required for the operational procedure is relatively short (less than 15 minutes, much of which is spent on interpolation). They also showed that, in order to start correcting forecasts, a long pre-historical dataset (containing forecasts and observations) is not needed; a couple of weeks is sufficient when a new observational station is included and added to the forecast point. For the road weather application, the statistical correction of road-surface temperature forecasts (for the RWM system's daily hourly runs covering forecast lengths up to 5 hours ahead) was also operationalized for the Danish road network (about 400 road stations) and has been running in test mode since Sep 2013. The method can also be applied to correction of dew-point temperature and wind speed (as part of observations/forecasts at synoptic stations), since both of these meteorological parameters are part of the proposed system of equations. Evaluation of the method's performance for improving wind speed forecasts is planned as well, along with possibilities for improving wind direction (which is more complex because of the multi-modal distribution of such data).
The method worked for the entire domain of mainland Denmark (tested for 60 synoptic and 395 road stations), and hence it can be applied to any geographical point within this domain, for example through interpolation to about 100 cities' locations (for the Danish national byvejr forecasts). Moreover, we expect that the same method can be used in other geographical areas; evaluation for other domains (with a focus on Greenland and the Nordic countries) is planned. In addition, a similar approach might also be tested for statistical correction of concentrations of chemical species, but this will require additional elaboration and evaluation.
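A minimal Python sketch of the kind of SVD-based least-squares fit of correction coefficients mentioned in the record above; the predictors (raw 2 m temperature, wind speed, cloud fraction) and the synthetic "observations" are assumptions for illustration, not the HIRLAM/RWM setup.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cases = 500
# Hypothetical predictor matrix: raw forecast values plus an intercept column.
A = np.column_stack([
    rng.normal(5, 8, n_cases),     # forecast 2 m temperature (deg C)
    rng.gamma(2, 2, n_cases),      # forecast wind speed (m/s)
    rng.uniform(0, 1, n_cases),    # forecast cloud fraction
    np.ones(n_cases),              # intercept
])
truth = A @ np.array([0.9, -0.2, 1.5, 0.8]) + rng.normal(0, 1.2, n_cases)  # synthetic "observed" T2m

# Fit correction coefficients by least squares; np.linalg.lstsq solves this via SVD.
coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
corrected = A @ coef

raw_mae = np.mean(np.abs(A[:, 0] - truth))
cor_mae = np.mean(np.abs(corrected - truth))
print("MAE raw forecast: %.2f  corrected: %.2f" % (raw_mae, cor_mae))
```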
Astronaut Anna Fisher practices control of the RMS in a trainer
NASA Technical Reports Server (NTRS)
1984-01-01
Astronaut Anna Lee Fisher, mission specialist for 51-A, practices control of the remote manipulator system (RMS) at a special trainer at JSC. Dr. Fisher is pictured in the manipulator development facility (MDF) of JSC's Shuttle mockup and integration laboratory.
Fish Consumption Patterns and Mercury Advisory Knowledge Among Fishers in the Haw River Basin.
Johnston, Jill E; Hoffman, Kate; Wing, Steve; Lowman, Amy
2016-01-01
Fish consumption has numerous health benefits, with fish providing a source of protein as well as omega-3 fatty acids. However, some fish also contain contaminants that can impair human health. In North Carolina, the Department of Health and Human Services has issued fish consumption advisories due to methylmercury contamination in fish. Little is known about local fishers' consumption patterns and advisory adherence in North Carolina. We surveyed a consecutive sample of 50 fishers (74.6% positive response rate) who reported eating fish caught from the Haw River Basin or Jordan Lake. They provided information on demographic characteristics, species caught, and the frequency of local fish consumption. Additionally, fishers provided information on their knowledge of fish consumption advisories and the impact of those advisories on their fishing and fish consumption patterns. The majority of participants were male (n = 44) and reported living in central North Carolina. Catfish, crappie, sunfish, and large-mouth bass were consumed more frequently than other species of fish. Of the fishers surveyed, 8 reported eating more than 1 fish meal high in mercury per week, which exceeds the North Carolina advisory recommendation. Most participants (n = 32) had no knowledge of local fish advisories, and only 4 fishers reported that advisories impacted their fishing practices. We sampled 50 fishers at 11 locations. There is no enumeration of the dynamic population of fishers and no way to assess the representativeness of this sample. Additional outreach is needed to make local fishers aware of fish consumption advisories and the potential health impacts of eating high-mercury fish, which may also contain other persistent and bioaccumulative toxins. ©2016 by the North Carolina Institute of Medicine and The Duke Endowment. All rights reserved.
Fishers' knowledge and seahorse conservation in Brazil
Rosa, Ierecê ML; Alves, Rômulo RN; Bonifácio, Kallyne M; Mourão, José S; Osório, Frederico M; Oliveira, Tacyana PR; Nottingham, Mara C
2005-01-01
From a conservationist perspective, seahorses are threatened fishes. Concomitantly, from a socioeconomic perspective, they represent a source of income to many fishing communities in developing countries. An integration between these two views requires, among other things, the recognition that seahorse fishers have knowledge and abilities that can assist the implementation of conservation strategies and of management plans for seahorses and their habitats. This paper documents the knowledge held by Brazilian fishers on the biology and ecology of the longsnout seahorse Hippocampus reidi. Its aims were to explore collaborative approaches to seahorse conservation and management in Brazil; to assess fishers' perception of seahorse biology and ecology, in the context of evaluating potential management options; to increase fishers' involvement with seahorse conservation in Brazil. Data were obtained through questionnaires and interviews made during field surveys conducted in fishing villages located in the States of Piauí, Ceará, Paraíba, Maranhão, Pernambuco and Pará. We consider the following aspects as positive for the conservation of seahorses and their habitats in Brazil: fishers were willing to dialogue with researchers; although captures and/or trade of brooding seahorses occurred, most interviewees recognized the importance of reproduction to the maintenance of seahorses in the wild (and therefore of their source of income), and expressed concern over population declines; fishers associated the presence of a ventral pouch with reproduction in seahorses (regardless of them knowing which sex bears the pouch), and this may facilitate the construction of collaborative management options designed to eliminate captures of brooding specimens; fishers recognized microhabitats of importance to the maintenance of seahorse wild populations; fishers who kept seahorses in captivity tended to recognize the conditions as poor, and as being a cause of seahorse mortality. PMID:16336660
Fisher Information, Entropy, and the Second and Third Laws of Thermodynamics
We propose Fisher Information as a new calculable thermodynamic property that can be shown to follow the Second and the Third Laws of Thermodynamics. Fisher Information is, however, qualitatively different from entropy and potentially possesses a great deal more structure. Hence...
Official portrait of Astronaut Anna L. Fisher
NASA Technical Reports Server (NTRS)
1985-01-01
Official portrait of Astronaut Anna L. Fisher. Fisher is posing with her helmet on the table in front of her and the American flag appears over the opposite shoulder (34357); Posing with an empty table in front of her and the American flag behind her (34358).
Fisher information and asymptotic normality in system identification for quantum Markov chains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guta, Madalin
2011-06-15
This paper deals with the problem of estimating the coupling constant {theta} of a mixing quantum Markov chain. For a repeated measurement on the chain's output we show that the outcomes' time average has an asymptotically normal (Gaussian) distribution, and we give the explicit expressions of its mean and variance. In particular, we obtain a simple estimator of {theta} whose classical Fisher information can be optimized over different choices of measured observables. We then show that the quantum state of the output together with the system is itself asymptotically Gaussian and compute its quantum Fisher information, which sets an absolute bound to the estimation error. The classical and quantum Fisher information are compared in a simple example. In the vicinity of {theta}=0 we find that the quantum Fisher information has a quadratic rather than linear scaling in output size, and asymptotically the Fisher information is localized in the system, while the output is independent of the parameter.
Sensory nerves are frequently involved in the spectrum of Fisher syndrome.
Shahrizaila, Nortina; Goh, Khean J; Kokubun, Norito; Tan, Ai H; Tan, Cheng Y; Yuki, Nobuhiro
2014-04-01
Differing patterns of neurophysiological abnormalities have been reported in patients with Fisher syndrome. Fisher syndrome is rare, and few series have incorporated prospective serial studies to define the natural history of nerve conduction studies in Guillain-Barré syndrome. In an ongoing prospective study of Guillain-Barré syndrome patients, patients who presented with Fisher syndrome and its spectrum of illness were assessed through serial neurological examinations, nerve conduction studies, and serological testing of IgG against gangliosides and ganglioside complexes. Of the 36 Guillain-Barré syndrome patients identified within 2 years, 17 had features of Fisher syndrome. Serial nerve conduction studies detected significant abnormalities in sensory nerve action potential amplitude in 94% of patients, associated with 2 patterns of recovery: non-demyelinating reversible distal conduction failure and axonal regeneration. Similar changes were seen in motor nerves of 5 patients. Patients with the Fisher syndrome spectrum of illness have significant sensory involvement, which may only be evident with serial neurophysiological studies. Copyright © 2013 Wiley Periodicals, Inc.
Engaging recreational fishers in management and conservation: global case studies.
Granek, E F; Madin, E M P; Brown, M A; Figueira, W; Cameron, D S; Hogan, Z; Kristianson, G; de Villiers, P; Williams, J E; Post, J; Zahn, S; Arlinghaus, R
2008-10-01
Globally, the number of recreational fishers is sizeable and increasing in many countries. Associated with this trend is the potential for negative impacts on fish stocks through exploitation or management measures such as stocking and introduction of non-native fishes. Nevertheless, recreational fishers can be instrumental in successful fisheries conservation through active involvement in, or initiation of, conservation projects to reduce both direct and external stressors contributing to fishery declines. Understanding fishers' concerns for sustained access to the resource and developing methods for their meaningful participation can have positive impacts on conservation efforts. We examined a suite of case studies that demonstrate successful involvement of recreational fishers in conservation and management activities that span developed and developing countries, temperate and tropical regions, marine and freshwater systems, and open- and closed-access fisheries. To illustrate potential benefits and challenges of involving recreational fishers in fisheries management and conservation, we examined the socioeconomic and ecological contexts of each case study. We devised a conceptual framework for the engagement of recreational fishers that targets particular types of involvement (enforcement, advocacy, conservation, management design [type and location], research, and monitoring) on the basis of degree of stakeholder stewardship, scale of the fishery, and source of impacts (internal or external). These activities can be enhanced by incorporating local knowledge and traditions, taking advantage of leadership and regional networks, and creating collaborations among various stakeholder groups, scientists, and agencies to maximize the probability of recreational fisher involvement and project success.
Peper, Steven T; Peper, Randall L; Mitcheltree, Denise H; Kollias, George V; Brooks, Robert P; Stevens, Sadie S; Serfass, Thomas L
2016-12-01
Canine distemper virus (CDV) infects families in the order Carnivora. As a preventive measure, vaccinations against CDV are frequently given to mustelids in captive environments. Our objectives were to compare the utility of two modified-live virus canine distemper vaccines (MLV CDVs), Fervac-D® (no longer manufactured) and Galaxy-D® (now manufactured by MSD Animal Health as part of a multivalent vaccine), in developing an immune response in wild-caught fishers. The Pennsylvania Fisher Reintroduction Project (PFRP) used 14 wild-caught fishers during one year of the project to evaluate the utility of vaccination against CDV as part of any reintroduction project. Fishers were injected subcutaneously in the nape of the neck with their designated vaccine. Fervac-D® did not effectively stimulate development of a serologic antibody response, whereas Galaxy-D® produced adequate seroconversion or rise of titer levels to suggest that the general use of MLV CDV may be suitable in fishers pending further studies. We recommend that future studies be conducted evaluating the use of currently produced vaccines in fishers. Future research should also focus on the number of days required between administration of primary and booster vaccines to achieve a sufficient immune response. If only primary doses are required, then hard-release reintroduction projects for fishers could be recommended. If primary and booster vaccines are required, then soft-release reintroduction projects should be recommended that include captive management periods, allowing for the appropriate vaccination intervals needed to maximize the probability of protection against CDV.
76 FR 4092 - National Saltwater Angler Registry Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
..., Florida and Louisiana as exempted States for anglers, spear fishers and for-hire fishing vessels. NMFS has... fishers. DATES: The designation of the States as exempted States is effective on January 24, 2011. [[Page... exempted States only for anglers and spear fishers: Massachusetts, Maryland, and Virginia. Massachusetts...
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Vocational Education.
This document is a curriculum framework for a program in commercial fishing to be taught in Florida secondary and postsecondary institutions. This outline covers the major concepts/content of the program, which is designed to prepare students for employment in occupations with titles such as net fishers, pot fishers, line fishers, shrimp boat…
Code of Federal Regulations, 2010 CFR
2010-07-01
... 60.2 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) FISHER HOUSES... House which is a housing facility that is located at or near a VA health care facility, that is... donated to VA by the Zachary and Elizabeth M. Fisher Armed Services Foundation or Fisher House Foundation...
Chapter 4: Fishers and American martens
K.L. Purcell; C.M. Thompson; W.J. Zielinski
2012-01-01
Fishers (Martes pennanti) and American martens (M. americana) are carnivorous mustelids associated with late-successional forests. The distributions of both species have decreased in the Sierra Nevada and southern Cascade region (Zielinski et al. 2005). Fishers occur primarily in lower elevation (3,500 to 7,000 ft) (1067 to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 60.2 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) FISHER HOUSES... House which is a housing facility that is located at or near a VA health care facility, that is... donated to VA by the Zachary and Elizabeth M. Fisher Armed Services Foundation or Fisher House Foundation...
Happe, Patricia J.; Jenkins, Kurt J.; Kay, Thomas J.; Pilgrim, Kristy L.; Schwartz, Michael K.; Lewis, Jeffrey C.; Aubry, Keith B.
2015-01-01
With the translocation and release of 90 fishers (Pekania pennanti) from British Columbia to Olympic National Park during 2008–2010, the National Park Service and Washington Department of Fish and Wildlife accomplished the first phase of fisher restoration in Washington State. Beginning in 2013, we initiated a new research project to determine the current status of fishers on Washington’s Olympic Peninsula 3–5 years after the releases and evaluate the short-term success of the restoration program. Objectives of the study are to determine the current distribution of fishers and proportion of the recovery area that is currently occupied by fishers, determine several genetic characteristics of the reintroduced population, and determine reproductive success of the founding animals through genetic studies. During 2014, we continued working with a broad coalition of cooperating agencies, tribes, and nongovernmental organizations (NGO) to collect data on fisher distribution and genetics using noninvasive sampling methods. The primary sampling frame consisted of 157 24-square-kilometer hexagons (hexes) distributed across all major land ownerships within the Olympic Peninsula target survey area. In 2014 we expanded the study by adding 58 more hexes to an expanded study area in response to incidental fisher observations outside of the target area obtained in 2013; 49 hexes were added south and 9 to the east of the target area. During 2014, federal, state, tribal and NGO biologists and volunteers established three baited motion-sensing camera stations, paired with hair snaring devices, in 80 hexes; 69 in the targeted area 11 in the expansion areas. Each paired camera/hair station was left in place for approximately 6 weeks, with three checks on 2-week intervals. We documented fisher presence in 5 of the 80 hexagons, and identified 5 different fishers through a combination of microsatellite DNA analyses and camera detections. All fisher detections were in the target area. These 5 individuals included 2 of the original founding population of 90, 1 of the 2 rescued and rehabilitated kits that were released in 2010, and 1 new recruit to the population (1 individual was not identified). Additionally, we identified more than 40 other species of wildlife at the baited camera stations. We also obtained eight incidental fisher observations through photographs and carcass retrieval. During 2015, we plan to sample 75 hexagons in the target area and 12 in the expansion area. We plan to sample all unsampled accessible hexes in the target area (26 hexes), and re-sample accessible hexes sampled in 2013 (49 hexes).
Maximum-Likelihood Methods for Processing Signals From Gamma-Ray Detectors
Barrett, Harrison H.; Hunter, William C. J.; Miller, Brian William; Moore, Stephen K.; Chen, Yichun; Furenlid, Lars R.
2009-01-01
In any gamma-ray detector, each event produces electrical signals on one or more circuit elements. From these signals, we may wish to determine the presence of an interaction; whether multiple interactions occurred; the spatial coordinates in two or three dimensions of at least the primary interaction; or the total energy deposited in that interaction. We may also want to compute listmode probabilities for tomographic reconstruction. Maximum-likelihood methods provide a rigorous and in some senses optimal approach to extracting this information, and the associated Fisher information matrix provides a way of quantifying and optimizing the information conveyed by the detector. This paper will review the principles of likelihood methods as applied to gamma-ray detectors and illustrate their power with recent results from the Center for Gamma-ray Imaging. PMID:20107527
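The following Python sketch illustrates, in one dimension and under assumed Gaussian noise, the two ingredients named above: a maximum-likelihood position estimate from detector signals and the Fisher information that bounds its precision. The light-sharing model, detector positions and noise level are hypothetical, not the Center's detector models.

```python
import numpy as np

rng = np.random.default_rng(5)
det_pos = np.array([-1.0, 0.0, 1.0])                # three detector centres (arbitrary units)
sigma_noise = 0.05

def mean_response(x):
    """Hypothetical light-sharing model: Gaussian fall-off of mean signal with distance from x."""
    return np.exp(-0.5 * ((det_pos - x) / 0.7) ** 2)

# Simulate one event at x_true and estimate x by maximum likelihood
# (for independent Gaussian noise this is a least-squares fit over a grid).
x_true = 0.3
signal = mean_response(x_true) + rng.normal(0, sigma_noise, det_pos.size)
grid = np.linspace(-1.5, 1.5, 3001)
nll = [np.sum((signal - mean_response(x)) ** 2) for x in grid]
x_ml = grid[int(np.argmin(nll))]

# Fisher information for independent Gaussian noise: I(x) = sum_i (dm_i/dx)^2 / sigma^2.
eps = 1e-4
dm = (mean_response(x_true + eps) - mean_response(x_true - eps)) / (2 * eps)
fisher_info = np.sum(dm ** 2) / sigma_noise ** 2
print("ML estimate: %.3f  Cramer-Rao bound on std: %.3f" % (x_ml, fisher_info ** -0.5))
```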
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott
1995-01-01
This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
2014-09-29
In response to this finding, AMC is initiating a Depot Material Requirements Planning (MRP) Integrated Process Team (IPT) from which one objective...methodologies for DOF reviews and corrective actions by AMC and its component organizations. The target completion date for the Depot MRP IPT is June...implemented a matrix for MRP SOW where the aviation programs were updated in Production LMP 1QFY14. Army Materiel Command (cont'd) Management
An evaluation of a weaning index for wild fishers (Pekania [Martes] pennanti) in California
Sean M. Matthews; J. Mark Higley; John T. Finn; Kerry M. Rennie; Craig M. Thompson; Kathryn L. Purcell; Rick A. Sweitzer; Sandra L. Haire; Paul R. Sievert; Todd K. Fuller
2013-01-01
Conservation concern for fishers (Pekania [Martes] pennanti) in the Pacific states has highlighted a need to develop cost-effective methods of monitoring reproduction in extant and reintroduced fisher populations. We evaluated the efficacy of nipple size as a predictive index of weaning success for females...
Roger A. Powell; William J. Zielinski
1994-01-01
The fisher (Martes pennanti) is a medium-size mammalian carnivore and the largest member of the genus Martes (Anderson 1970) of the family Mustelidae in the order Carnivora. The genus Martes includes five or six other extant species. The fisher has the general body build of a stocky weasel and is long, thin, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... DEPARTMENT OF JUSTICE Drug Enforcement Administration Importer of Controlled Substances; Notice of Application; Fisher Clinical Services, Inc. Pursuant to Title 21 Code of Federal Regulations (CFR) 1301.34 (a), this is notice that on June 21, 2013, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown...
Tunnel Vision in Population Research: A Case in Point.
ERIC Educational Resources Information Center
Serron, Luis A.
This paper takes a critical look at a demographic study of Mexico, namely, Tad Fisher's "Mexico: The Problem of People". Fisher's study relies heavily on population growth as a poverty generating factor. The paper states that Fisher's study exhibited perceptual narrowness by not considering the Marxist interpretation that capitalist…
33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Fishers Island Sound, Stonington, Conn. 110.50a Section 110.50a Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound...
Determining the gender of American martens and fishers at track plate stations
Keith M. Slauson; Richard L. Truex; William J. Zielinski
2008-01-01
Determining the gender of American martens (Martes americana) and fishers (M. pennanti) from track plate stations would significantly augment the information currently gathered from this simple and inexpensive survey method. We used track-plate impressions collected from captured individual martens and fishers of known gender to...
Home range characteristics of fishers in California
W. J. Zielinski; R. L. Truex; G. A. Schmidt; F. V. Schlexer; K. N. Schmidt; R. H. Barrett
2004-01-01
The fisher (Martes pennanti) is a forest mustelid that historically occurred in California from the mixed conifer forests of the north coast, east to the southern Cascades, and south throughout the Sierra Nevada. Today fishers in California occur only in 2 disjunct populations in the northwestern mountains and the...
Pilpel, Avital
2007-09-01
This paper is concerned with the role of rational belief change theory in the philosophical understanding of experimental error. Today, philosophers seek insight about error in the investigation of specific experiments, rather than in general theories. Nevertheless, rational belief change theory adds to our understanding of just such cases: R. A. Fisher's criticism of Mendel's experiments being a case in point. After an historical introduction, the main part of this paper investigates Fisher's paper from the point of view of rational belief change theory: which changes of belief about Mendel's experiment Fisher goes through, and with what justification. It leads to surprising insights about what Fisher had done right and wrong and, more generally, about the limits of statistical methods in detecting error.
NASA Astrophysics Data System (ADS)
Yuerlita; Perret, Sylvain Roger; Shivakoti, Ganesh P.
2013-07-01
Technical and socio-economic characteristics are known to determine different types of fishers and their livelihood strategies. Faced with declining fish and water resources, small-scale fisheries engage in transformations of livelihood and fishing practices. The paper is an attempt to understand these changes and their socio-economic patterns in the case of Singkarak Lake in West Sumatra, Indonesia. Based on the hypothesis that riparian communities have diverse, complex yet structured and dynamic livelihood systems, the paper's main objective is to study, document and model the actual diversity in livelihood, practices and performance of inland small-scale fisheries along Singkarak Lake, to picture how households have adapted to the situation, and to propose an updated, workable model (typology) of these households for policy. Principal component analysis and cluster analysis were used to develop a typology of fishing households. The results show that small-scale fishers can be classified into different types characterized by distinct livelihood strategies. Three household types are identified, namely "farming fishers" households (type I, 30 %), "fishing farmers" households (type II, 30 %), and "mainly fishers" households (type III, 40 %). There are significant differences among these groups in the number of boats owned, annual fishing income, agricultural income and farming experience. Type I consists of farming fishers, well equipped, with high fishing costs and income, yet with the lowest return on fishing assets. They are also landowners with farming income, showing the lowest return on land capital. Type II includes poor fishing farmers, landowners with higher farming income; they show the highest return on land assets. They have less fishing equipment, lower costs and lower income. Type III (mainly fishers) consists of poorer, younger fishers with the highest return on fishing assets and on fishing costs. They have little land, low farming income, and diversified livelihood sources. The nature of their livelihood strategies is discussed for each identified group. This helps to understand the complexity and diversity of small-scale fishers, particularly in the study area, which is still poorly known. The paper concludes with policy implications and possible management initiatives for an environmentally prudent policy aimed at improving fishers' livelihoods.
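A short Python sketch of the PCA-plus-clustering workflow described above, assuming scikit-learn is available; the synthetic household table and the choice of k = 3 clusters are placeholders for the survey data, which are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Hypothetical household survey table:
# columns = boats owned, fishing income, farm income, farming experience, land area.
n = 90
X = np.column_stack([
    rng.poisson(1.5, n),
    rng.gamma(2.0, 500.0, n),
    rng.gamma(2.0, 400.0, n),
    rng.uniform(0, 30, n),
    rng.gamma(1.5, 0.3, n),
])

# Standardize, reduce with PCA, then cluster the principal-component scores.
Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
for k in range(3):
    print("type", k + 1, "n =", int((labels == k).sum()),
          "mean fishing income =", round(float(X[labels == k, 1].mean()), 1))
```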
Work environment and health in the fishing fleet: results from a survey amongst Norwegian fishers.
Sønvisen, Signe Annie; Thorvaldsen, Trine; Holmen, Ingunn M; Øren, Anita
Fishery is an important industry in Norway. Compared to other industries, the number of occupational accidents is high. Fishers are exposed to a range of unfavourable working conditions, but there is limited research-based knowledge about the interaction between working conditions and health. The aim of the article is to study fishers' 1) work-related exposures and health complaints, 2) sickness absence, 3) subjective perception of health status, and 4) level of job satisfaction. Data was gathered through a telephone survey. The survey included questions about exposure, health complaints, health status and job satisfaction. Methods for analysis were descriptive statistics and relative risk (RR). A total of 830 full-time fishers were interviewed. Coastal fishers are more exposed to climatic (RR = 1.546, 95% confidence interval [CI] 1.311-1.823), ergonomic (RR = 1.539, 95% CI 1.293-1.833) and processing (RR = 2.119, 95% CI 1.847-2.431) factors, compared to other groups of fishers. Coastal fishers are also more likely to experience musculoskeletal problems (RR = 1.623, 95% CI 1.139-2.314), sickness absence (RR = 1.337, 95% CI 1.081-1.655) and to perceive their own health as poor (RR = 2.155, 95% CI 1.119-4.152). Purse seine fishers are less exposed to climatic (RR = 0.777, 95% CI 0.633-0.953), ergonomic (RR = 0.617, 95% CI 0.487-0.783) and processing (RR = 0.292, 95% CI 0.221-0.385) factors and are less likely to experience sickness absence (RR = 0.635, 95% CI 0.479-0.840). In terms of job satisfaction, 99% of our respondents enjoy their work. Norwegian fishers have a high degree of job satisfaction and overall good health. Challenges regarding health complaints and exposures in the working environment were identified. This may be helpful for the industry, showing where measures should be implemented to prevent exposure, illness and sickness absence. Findings may also serve as a basis for future intervention studies aimed at promoting healthy working environments for fishers, especially how to improve vessels and develop user-friendly technology to reduce the risk of injuries and strain.
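A small Python helper showing how a relative risk and its 95% confidence interval of the kind reported above can be computed from a 2x2 table using the standard log-RR approximation; the counts in the example are hypothetical, not the survey's data.

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and 95% CI for a 2x2 table:
       a = exposed with outcome, b = exposed without,
       c = unexposed with outcome, d = unexposed without."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))  # SE of log(RR)
    lo, hi = rr * math.exp(-z * se_log), rr * math.exp(z * se_log)
    return rr, lo, hi

# Hypothetical counts only: 60/300 exposed fishers with the complaint vs 30/330 unexposed.
print([round(v, 3) for v in relative_risk(60, 240, 30, 300)])
```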
NASA Astrophysics Data System (ADS)
Marzolino, Ugo; Prosen, Tomaž
2017-09-01
We investigated quantum critical behaviors in the nonequilibrium steady state of a XXZ spin chain with boundary Markovian noise using Fisher information. The latter represents the distance between two infinitesimally close states, and its superextensive size scaling witnesses a critical behavior due to a phase transition since all the interaction terms are extensive. Perturbatively, in the noise strength, we found superextensive Fisher information at anisotropy |Δ |⩽1 and irrational arccosΔ/π irrespective of the order of two noncommuting limits, i.e., the thermodynamic limit and the limit of sending arccosΔ/π to an irrational number via a sequence of rational approximants. From this result we argue the existence of a nonequilibrium quantum phase transition with a critical phase |Δ |⩽1 . From the nonsuperextensivity of the Fisher information of reduced states, we infer that this nonequilibrium quantum phase transition does not have local order parameters but has nonlocal ones, at least at |Δ |=1 . In the nonperturbative regime for the noise strength, we numerically computed the reduced Fisher information which lower bounds the full-state Fisher information and is superextensive only at |Δ |=1 . From the latter result, we derived local order parameters at |Δ |=1 in the nonperturbative case. The existence of critical behavior witnessed by the Fisher information in the phase |Δ |<1 is still an open problem. The Fisher information also represents the best sensitivity for any estimation of the control parameter, in our case the anisotropy Δ , and its superextensivity implies enhanced estimation precision which is also highly robust in the presence of a critical phase.
Lindkvist, Emilie; Basurto, Xavier; Schlüter, Maja
2017-01-01
Small-scale fisheries (SSFs) in developing countries are expected to play a significant role in poverty alleviation and enhancing food security in the decades to come. To realize this expectation, a better understanding of their informal self-governance arrangements is critical for developing policies that can improve fishers' livelihoods and lead to sustainable ecosystem stewardship. The goal of this paper is to develop a more nuanced understanding of micro-level factors-such as fishers' characteristics and behavior-to explain observed differences in self-governance arrangements in Northwest Mexico. We focus on two ubiquitous forms of self-governance: hierarchical non-cooperative arrangements between fishers and fishbuyers, such as patron-client relationships (PCs), versus more cooperative arrangements amongst fishers, such as fishing cooperatives (co-ops). We developed an agent-based model of an archetypical SSF that captures key hypotheses from in-depth fieldwork in Northwest Mexico of fishers' day-to-day fishing and trading. Results from our model indicate that high diversity in fishers' reliability, and low initial trust between co-op members, makes co-ops' establishment difficult. PCs cope better with this kind of diversity because, in contrast to co-ops, they have more flexibility in choosing whom to work with. However, once co-ops establish, they cope better with seasonal variability in fish abundance and provide long-term security for the fishers. We argue that existing levels of trust and diversity among fishers matter for different self-governance arrangements to establish and persist, and should therefore be taken into account when developing better, targeted policies for improved SSFs governance.
Probing primordial non-Gaussianity via iSW measurements with SKA continuum surveys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raccanelli, Alvise; Doré, Olivier, E-mail: alvise@jhu.edu, E-mail: olivier.dore@caltech.edu; Bacon, David J.
The Planck CMB experiment has delivered the best constraints so far on primordial non-Gaussianity, ruling out early-Universe models of inflation that generate large non-Gaussianity. Although small improvements in the CMB constraints are expected, the next frontier of precision will come from future large-scale surveys of the galaxy distribution. The advantage of such surveys is that they can measure many more modes than the CMB—in particular, forthcoming radio surveys with the Square Kilometre Array will cover huge volumes. Radio continuum surveys deliver the largest volumes, but with the disadvantage of no redshift information. In order to mitigate this, we use two additional observables. First, the integrated Sachs-Wolfe effect—the cross-correlation of the radio number counts with the CMB temperature anisotropies—helps to reduce systematics on the large scales that are sensitive to non-Gaussianity. Second, optical data allows for cross-identification in order to gain some redshift information. We show that, while the single redshift bin case can provide a σ(f{sub NL}) ∼ 20, and is therefore not competitive with current and future constraints on non-Gaussianity, a tomographic analysis could improve the constraints by an order of magnitude, even with only two redshift bins. A huge improvement is provided by the addition of high-redshift sources, so having cross-ID for high-z galaxies and an even higher-z radio tail is key to enabling very precise measurements of f{sub NL}. We use Fisher matrix forecasts to predict the constraining power in the case of no redshift information and the case where cross-ID allows a tomographic analysis, and we show that the constraints do not improve much with 3 or more bins. Our results show that SKA continuum surveys could provide constraints competitive with CMB and forthcoming optical surveys, potentially allowing a measurement of σ(f{sub NL}) ∼ 1 to be made. Moreover, these measurements would act as a useful check of results obtained with other probes at other redshift ranges with other methods.
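A toy Python sketch of a Fisher matrix forecast of the sort used above: a hypothetical two-parameter model (fNL and a bias amplitude b) with assumed Gaussian band-power errors, from which the marginalized sigma(fNL) is read off the inverse Fisher matrix. The model, error bars and fiducial values are illustrative assumptions, not the paper's survey specifications.

```python
import numpy as np

ell = np.arange(10, 200, 10, dtype=float)   # toy multipole bins

def model_cl(fnl, b):
    # Hypothetical scale-dependent-bias toy: fNL boosts the largest scales as 1/ell^2.
    return (b ** 2) * (1.0 + 20.0 * fnl / ell ** 2) * 1e-6 / ell

theta0 = np.array([0.0, 1.5])                # fiducial (fNL, b)
sigma_cl = 0.05 * model_cl(*theta0)          # assumed 5% Gaussian error per bin

def derivs(i, h=1e-4):
    """Central finite-difference derivative of the model with respect to parameter i."""
    tp, tm = theta0.copy(), theta0.copy()
    tp[i] += h
    tm[i] -= h
    return (model_cl(*tp) - model_cl(*tm)) / (2 * h)

# Fisher matrix F_ij = sum_bins dC/dtheta_i * dC/dtheta_j / sigma^2.
F = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        F[i, j] = np.sum(derivs(i) * derivs(j) / sigma_cl ** 2)

cov = np.linalg.inv(F)                       # marginalized parameter covariance
print("marginalized sigma(fNL) = %.2f" % np.sqrt(cov[0, 0]))
```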
Moreno-Salinas, David; Pascoal, Antonio; Aranda, Joaquin
2013-08-12
In this paper, we address the problem of determining the optimal geometric configuration of an acoustic sensor network that will maximize the angle-related information available for underwater target positioning. In the set-up adopted, a set of autonomous vehicles carries a network of acoustic units that measure the elevation and azimuth angles between a target and each of the receivers on board the vehicles. It is assumed that the angle measurements are corrupted by white Gaussian noise, the variance of which is distance-dependent. Using tools from estimation theory, the problem is converted into that of minimizing, by proper choice of the sensor positions, the trace of the inverse of the Fisher Information Matrix (also called the Cramer-Rao Bound matrix) to determine the sensor configuration that yields the minimum possible covariance of any unbiased target estimator. It is shown that the optimal configuration of the sensors depends explicitly on the intensity of the measurement noise, the constraints imposed on the sensor configuration, the target depth and the probabilistic distribution that defines the prior uncertainty in the target position. Simulation examples illustrate the key results derived.
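A minimal Python sketch of the optimality criterion described above, the trace of the inverse Fisher information matrix, for a simplified azimuth-only, two-sensor, fixed-noise case; the geometry, noise level and 2-D reduction are assumptions rather than the paper's full distance-dependent formulation.

```python
import numpy as np

def bearing_fim(sensors, target, sigma=np.deg2rad(2.0)):
    """Fisher information matrix for the target's (x, y) from azimuth-only measurements
       with independent Gaussian bearing noise of standard deviation sigma."""
    F = np.zeros((2, 2))
    for s in sensors:
        dx, dy = target[0] - s[0], target[1] - s[1]
        r2 = dx * dx + dy * dy
        grad = np.array([-dy, dx]) / r2          # gradient of atan2 bearing w.r.t. target position
        F += np.outer(grad, grad) / sigma ** 2
    return F

target = np.array([0.0, 0.0])
near_collinear = [np.array([100.0, 50.0]), np.array([190.0, 100.0])]  # nearly aligned bearings
spread         = [np.array([100.0, 0.0]), np.array([0.0, 100.0])]     # orthogonal bearings

# A-optimality comparison: smaller trace(F^-1) means a tighter Cramer-Rao bound.
for name, sensors in [("near-collinear", near_collinear), ("spread", spread)]:
    F = bearing_fim(sensors, target)
    print(name, "trace(F^-1) = %.1f m^2" % np.trace(np.linalg.inv(F)))
```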
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... Application; Fisher Clinical Services, Inc. Pursuant to Title 21 Code of Federal Regulations 1301.34 (a), this is notice that on October 16, 2012, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown... import the listed controlled substance for analytical research and clinical trials. The import of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... Registration; Fisher Clinical Services, Inc. By Notice dated November 1, 2012, and published in the Federal Register on November 9, 2012, 77 FR 67396, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown... plans to import the listed controlled substance to conduct clinical trials. No comments or objections...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-21
... Registration; Fisher Clinical Services,Inc. By Notice dated September 20, 2012, and published in the Federal Register on October 2, 2012, 77 FR 60143, Fisher Clinical Services, Inc., 7554 Schantz Road, Allentown... company plans to import the listed substances for analytical research and clinical trials. No comments or...
Linking a Conceptual Framework on Systems Thinking with Experiential Knowledge
ERIC Educational Resources Information Center
Garavito-Bermúdez, Diana; Lundholm, Cecilia; Crona, Beatrice
2016-01-01
This paper addresses a systemic approach for the study of fishers' ecological knowledge in order to describe fishers' ways of knowing and dealing with complexity in ecosystems, and discusses how knowledge is generated through, e.g. apprenticeship, experiential knowledge, and testing of hypotheses. The description and analysis of fishers'…
Using forest inventory data to assess fisher resting habitat suitability in California.
William J. Zielinski; Richard L. Truex; Jeffrey R. Dunk; Tom Gaman
2006-01-01
The fisher (Martes pennanti) is a forest-dwelling carnivore whose current distribution and association with late-seral forest conditions make it vulnerable to stand-altering human activities or natural disturbances. Fishers select a variety of structures for daily resting bouts. These habitat elements, together with foraging and reproductive (denning) habitat,...
Job Satisfaction among Fishers in the Dominican Republic
ERIC Educational Resources Information Center
Ruiz, Victor
2012-01-01
This paper reflects on the results of a job satisfaction study of small-scale fishers in the Dominican Republic. The survey results suggest that, although fishers are generally satisfied with their occupations, they also have serious concerns. These concerns include anxieties about the level of earnings, the condition of marine resources and the…
This paper describes the theory, data, and methodology necessary for using Fisher information to assess the sustainability of the San Luis Basin (SLB) regional system over time. Fisher information was originally developed as a measure of the information content in data and is an ...
33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line on...
33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line on...
33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line on...
33 CFR 110.50a - Fishers Island Sound, Stonington, Conn.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Fishers Island Sound, Stonington... SECURITY ANCHORAGES ANCHORAGE REGULATIONS Special Anchorage Areas § 110.50a Fishers Island Sound, Stonington, Conn. An area on the east side of Mason Island bounded as follows: Beginning at the shore line on...
Bavinck, Maarten; de Klerk, Leo; van der Plaat, Felice; Ravesteijn, Jorik; Angel, Dominique; Arendsen, Hendrik; van Dijk, Tom; de Hoog, Iris; van Koolwijk, Ant; Tuijtel, Stijn; Zuurendonk, Benjamin
2015-07-01
The tsunami that struck the coasts of India on 26 December 2004 resulted in the large-scale destruction of fisher habitations. The post-tsunami rehabilitation effort in Tamil Nadu was directed towards relocating fisher settlements in the interior. This paper discusses the outcomes of a study on the social effects of relocation in a sample of nine communities along the Coromandel Coast. It concludes that, although the participation of fishing communities in house design and in allocation procedures has been limited, many fisher households are satisfied with the quality of the facilities. The distance of the new settlements to the shore, however, is regarded as an impediment to engaging in the fishing profession, and many fishers are actually moving back to their old locations. This raises questions as to the direction of coastal zone policy in India, as well as to the weight accorded to safety (and other coastal development interests) vis-à-vis the livelihood needs of fishers. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.
The invisibility of fisheries in the process of hydropower development across the Amazon.
Doria, Carolina Rodrigues da Costa; Athayde, Simone; Marques, Elineide E; Lima, Maria Alice Leite; Dutka-Gianelli, Jynessa; Ruffino, Mauro Luis; Kaplan, David; Freitas, Carlos E C; Isaac, Victoria N
2018-05-01
We analyze the invisibility of fisheries and inadequacy of fishers' participation in the process of hydropower development in the Amazon, focusing on gaps between legally mandated and actual outcomes. Using Ostrom's institutional design principles for assessing common-pool resource management, we selected five case studies from Brazilian Amazonian watersheds to conduct an exploratory comparative case-study analysis. We identify similar problems across basins, including deficiencies in the dam licensing process; critical data gaps; inadequate stakeholder participation; violation of human rights; neglect of fishers' knowledge; lack of organization and representation by fishers' groups; and lack of governmental structure and capacity to manage dam construction activities or support fishers after dam construction. Fishers have generally been marginalized or excluded from decision-making regarding planning, construction, mitigation, compensation, and monitoring of the social-ecological impacts of hydroelectric dams. Addressing these deficiencies will require concerted investments and efforts by dam developers, government agencies and civil society, and the promotion of inter-sectorial dialogue and cross-scale participatory planning and decision-making that includes fishers and their associations.
Development of a hybrid model to predict construction and demolition waste: China as a case study.
Song, Yiliao; Wang, Yong; Liu, Feng; Zhang, Yixin
2017-01-01
Construction and demolition waste (C&DW) is currently a worldwide issue, and the situation is the worst in China due to a rapid increase in the construction industry and the short life span of China's buildings. To create an opportunity out of this problem, comprehensive prevention measures and effective management strategies are urgently needed. One major gap in the waste management literature is the lack of estimates of future C&DW generation. Therefore, this paper presents a forecasting procedure for C&DW in China that can forecast the quantity of each component of such waste. The proposed approach is based on a GM-SVR model, which improves the forecasting effectiveness of the gray model (GM) by adjusting the residual series with a support vector regression (SVR) method, together with a transition matrix that estimates the discharge of each component of the C&DW. Through the proposed method, future C&DW volumes are forecast and analyzed, including their potential components and their distribution across Chinese provinces. In addition, the model-testing process provides mathematical evidence that the proposed model is an effective way to supply policy makers with information on future C&DW. Copyright © 2016 Elsevier Ltd. All rights reserved.
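The GM-SVR idea can be sketched as follows; this is a minimal illustration that assumes the classic GM(1,1) grey model with an SVR correction fitted to its residuals on a hypothetical waste series, and it omits the transition matrix used for the component breakdown.

```python
import numpy as np
from sklearn.svm import SVR

def gm11_fit_predict(x, n_ahead=1):
    """Classic GM(1,1) grey model: fit to series x and extend it n_ahead steps."""
    n = len(x)
    x1 = np.cumsum(x)                              # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])                   # background values
    B = np.column_stack([-z, np.ones(n - 1)])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    k = np.arange(n + n_ahead)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
    return np.diff(x1_hat, prepend=0.0)            # back to the original series scale

# Hypothetical annual C&DW series (arbitrary units), only to exercise the code.
waste = np.array([120., 135., 151., 170., 188., 210., 236., 260.])
gm_fit = gm11_fit_predict(waste, n_ahead=1)

# SVR correction of the GM residuals, driven here simply by the time index.
resid = waste - gm_fit[:len(waste)]
t = np.arange(len(waste)).reshape(-1, 1)
svr = SVR(kernel="rbf", C=10.0, gamma="scale").fit(t, resid)
next_resid = svr.predict(np.array([[len(waste)]]))[0]

forecast = gm_fit[len(waste)] + next_resid
print("GM-SVR one-step-ahead C&DW forecast:", round(forecast, 1))
```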
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
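The residual-autocorrelation check at the heart of the paper can be conveyed with the ordinary Durbin-Watson statistic applied to the residuals of a small neural regressor; the generalized statistic and the influence matrix derived in the paper are not reproduced here, and the data below are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Synthetic regression problem with mildly autocorrelated noise.
X = rng.uniform(-1, 1, size=(400, 3))
noise = np.convolve(rng.normal(size=401), [0.6, 0.4])[:400]   # induces serial correlation
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + noise

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X, y)
resid = y - net.predict(X)

# Ordinary Durbin-Watson statistic: values near 2 indicate no first-order
# autocorrelation; values well below 2 indicate positive autocorrelation,
# a symptom of misspecification targeted by the paper's generalized test.
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)
print("Durbin-Watson statistic:", round(dw, 3))
```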
Pediatric Miller Fisher Syndrome Complicating an Epstein-Barr Virus Infection.
Communal, Céline; Filleron, Anne; Baron-Joly, Sandrine; Salet, Randa; Tran, Tu-Anh
2016-10-01
Miller Fisher syndrome, a variant of Guillain-Barré syndrome, is an acute inflammatory demyelinating polyradiculoneuropathy that may occur weeks after a bacterial or viral infection. Campylobacter jejuni and Haemophilus influenzae are frequently reported etiological agents. We describe a boy with Miller Fisher syndrome following Epstein-Barr virus primary infectious mononucleosis. He presented with bilateral dysfunction of several cranial nerves and hyporeflexia of the limbs but without ataxia. Miller Fisher syndrome was confirmed by the presence of anti-GQ1b antibodies in a blood sample. Epstein-Barr virus was identified by polymerase chain reaction and serology. Epstein-Barr virus should be considered a causative agent of Miller Fisher syndrome. The physiopathology of this condition may involve cross-reactive T-cells against Epstein-Barr virus antigens and gangliosides. Copyright © 2016 Elsevier Inc. All rights reserved.
Redistribution of benefits but not detection in a fisheries bycatch-reduction management initiative.
McClanahan, T R; Kosgei, J K
2018-02-01
Reducing the capture of small fish, discarded fish, and bycatch is a primary concern of fisheries managers who propose to maintain high yields, species diversity, and ecosystem functions. Modified fishing gear is one of the primary ways to reduce bycatch and capture of small fish. The outcomes of gear modification may depend on competition among fishers using other similar resources and other gears in the same fishing grounds and the subsequent adoption or abandonment of modified gears by fishers. We evaluated adoption of modified gear, catch size, catch per unit effort (CPUE), yield, and fisher incomes in a coral reef fishery in which a 3-cm escape gap was introduced into traditional traps. There were 26.1 (SD 4.9) fishers who used the experimental landing sites and 228 (SD 15.7) fishers who used the control landing sites annually over 7 years. The size of fish increased by 10.6% in the modified traps, but the catch of smaller fish increased by 11.2% among the other gears. There was no change in the overall CPUE, yields, or per area incomes; rather, yield benefits were redistributed in favor of the unmodified gears. For example, estimated incomes of fishers who adopted the modified traps remained unchanged but increased for net and spear fishers. Fishers using escape-gap traps had a high proportion of income from larger fish, which may have led to a perception of benefits, high status, and no abandonment of the modified traps. The commensal rather than competitive outcome may explain the continued use of escape-gap traps 3 years after their introduction. Trap fishers showed an interest in negotiating other management improvements, such as increased mesh sizes for nets, which could ultimately catalyze community-level decisions and restrictions that could increase their profits. © 2017 Society for Conservation Biology.
Resources and estuarine health: Perceptions of elected officials and recreational fishers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, J.; Sanchez, J.; McMahon, M.
1999-10-29
It is important to understand the perceptions of user groups regarding both the health of their estuaries and environmental problems requiring management. Recreational fishers were interviewed to determine the perceptions of one of the traditional user groups of Barnegat Bay (New Jersey), and elected officials were interviewed to determine if the people charged with making decisions about environmental issues in the bay held similar perceptions. Although relative ratings were similar, there were significant differences in perceptions of the severity of environmental problems, and for the most part, public officials thought the problems were more severe than did the fishers. Personal watercraft (often called Jet Skis) were rated as the most severe problem, followed by chemical pollution, junk, over fishing, street runoff, and boat oil. Small boats, sailboats, wind surfers, and foraging birds were not considered environmental problems by either elected officials or fishermen. The disconnect between the perceptions of the recreational fishers and those of the locally elected public officials suggests that officials may be hearing from some of the more vocal people about problems, rather than from the typical fishers. Both groups felt there were decreases in some of the resources in the bay; over 50% felt the number of fish and crabs had declined, the size of fish and crabs had declined, and the number of turtles had declined. Among recreational fishers, there were almost no differences in perceptions of the severity of environmental problems or in changes in the bay. The problems that were rated the most severe were personal watercraft and over fishing by commercial fishers. Recreational fishers ranked sailboats, wind surfers, and fishing by birds as posing no problem for the bay. Most fishers felt there had been recent major changes in Barnegat Bay, with there now being fewer and smaller fish, fewer and smaller crabs, and fewer turtles. The results suggest that the views of a wide range of coastal users should be considered when making environmental health decisions.
Wright-Fisher diffusion bridges.
Griffiths, Robert C; Jenkins, Paul A; Spanò, Dario
2017-10-06
The trajectory of the frequency of an allele which begins at x at time 0 and is known to have frequency z at time T can be modelled by the bridge process of the Wright-Fisher diffusion. Bridges when x=z=0 are particularly interesting because they model the trajectory of the frequency of an allele which appears at a time, then is lost by random drift or mutation after a time T. The coalescent genealogy back in time of a population in a neutral Wright-Fisher diffusion process is well understood. In this paper we obtain a new interpretation of the coalescent genealogy of the population in a bridge from a time t∈(0,T). In a bridge with allele frequencies of 0 at times 0 and T the coalescence structure is that the population coalesces in two directions from t to 0 and t to T such that there is just one lineage of the allele under consideration at times 0 and T. The genealogy in Wright-Fisher diffusion bridges with selection is more complex than in the neutral model, but still with the property of the population branching and coalescing in two directions from time t∈(0,T). The density of the frequency of an allele at time t is expressed in a way that shows coalescence in the two directions. A new algorithm for exact simulation of a neutral Wright-Fisher bridge is derived. This follows from knowing the density of the frequency in a bridge and exact simulation from the Wright-Fisher diffusion. The genealogy of the neutral Wright-Fisher bridge is also modelled by branching Pólya urns, extending a representation in a Wright-Fisher diffusion. This is a new very interesting representation that relates Wright-Fisher bridges to classical urn models in a Bayesian setting. Copyright © 2017 Elsevier Inc. All rights reserved.
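For intuition only, the sketch below simulates the neutral Wright-Fisher diffusion dX = sqrt(X(1-X)) dW by Euler-Maruyama and approximates a bridge by naive rejection on the endpoint; it is not the exact bridge simulation algorithm derived in the paper, and the endpoint frequencies used are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

def wf_paths(x0, T, n_steps, n_paths):
    """Euler-Maruyama paths of the neutral Wright-Fisher diffusion dX = sqrt(X(1-X)) dW."""
    dt = T / n_steps
    x = np.full((n_paths, n_steps + 1), float(x0))
    for k in range(n_steps):
        diff = np.sqrt(np.clip(x[:, k] * (1.0 - x[:, k]), 0.0, None))
        x[:, k + 1] = np.clip(x[:, k] + diff * np.sqrt(dt) * rng.normal(size=n_paths), 0.0, 1.0)
    return x

def crude_bridge(x0, z, T, n_steps=100, n_paths=5000, tol=0.05):
    """Naive rejection sampler: keep forward paths whose endpoint lies within tol of z.
    (The paper derives an exact bridge simulator; this is only an illustrative stand-in.)"""
    paths = wf_paths(x0, T, n_steps, n_paths)
    accepted = paths[np.abs(paths[:, -1] - z) < tol]
    if len(accepted) == 0:
        raise RuntimeError("no accepted path; relax tol or increase n_paths")
    return accepted

bridges = crude_bridge(x0=0.2, z=0.5, T=1.0)
print(f"accepted {len(bridges)} of 5000 paths;",
      "mean mid-point frequency:", round(bridges[:, 50].mean(), 3))
```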
Price elasticity matrix of demand in power system considering demand response programs
NASA Astrophysics Data System (ADS)
Qu, Xinyao; Hui, Hongxun; Yang, Shengchun; Li, Yaping; Ding, Yi
2018-02-01
Increasing renewable energy generation has brought more intermittency and volatility to the electric power system. Demand-side resources can improve the consumption of renewable energy through demand response (DR), which has become one of the important means of improving the reliability of the power system. In price-based DR, analysing the sensitivity of customers' power demand to changing electricity prices is pivotal for setting reasonable prices and forecasting power-system loads. This paper studies the price elasticity matrix of demand (PEMD). An improved PEMD model is proposed based on elasticity effect weights, which can unify rigid loads and flexible loads. Moreover, the structure of the PEMD, which is determined by price policies and load types, and the method for calculating the PEMD are also proposed. Several cases are studied to demonstrate the effectiveness of this method.
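The role of a price elasticity matrix of demand can be conveyed by a minimal sketch in which relative demand changes follow dD/D = E dp/p across three pricing periods; the matrix entries, loads and prices below are hypothetical, and the model is far simpler than the weighted PEMD formulation proposed in the paper.

```python
import numpy as np

# Hypothetical 3-period (peak / flat / valley) price elasticity matrix of demand.
# Diagonal entries are self-elasticities (demand falls when that period's price rises);
# off-diagonal entries are cross-elasticities (load shifts towards cheaper periods).
E = np.array([[-0.30,  0.10,  0.08],
              [ 0.09, -0.25,  0.07],
              [ 0.06,  0.05, -0.20]])

base_load  = np.array([900., 700., 500.])    # MW in peak, flat, valley
base_price = np.array([0.60, 0.40, 0.25])    # $/kWh
new_price  = np.array([0.75, 0.40, 0.20])    # peak price raised, valley price lowered

rel_dp = (new_price - base_price) / base_price
new_load = base_load * (1.0 + E @ rel_dp)    # dD/D = E * dp/p
print("relative price change:", np.round(rel_dp, 3))
print("responsive load (MW): ", np.round(new_load, 1))
```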
Statistical significance test for transition matrices of atmospheric Markov chains
NASA Technical Reports Server (NTRS)
Vautard, Robert; Mo, Kingtse C.; Ghil, Michael
1990-01-01
Low-frequency variability of large-scale atmospheric dynamics can be represented schematically by a Markov chain of multiple flow regimes. This Markov chain contains useful information for the long-range forecaster, provided that the statistical significance of the associated transition matrix can be reliably tested. Monte Carlo simulation yields a very reliable significance test for the elements of this matrix. The results of this test agree with previously used empirical formulae when each cluster of maps identified as a distinct flow regime is sufficiently large and when they all contain a comparable number of maps. Monte Carlo simulation provides a more reliable way to test the statistical significance of transitions to and from small clusters. It can determine the most likely transitions, as well as the most unlikely ones, with a prescribed level of statistical significance.
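A minimal sketch of the Monte Carlo idea, with a synthetic three-regime sequence standing in for the atmospheric data: estimate the transition matrix and compare each element with its null distribution obtained by shuffling the sequence, which destroys the temporal structure.

```python
import numpy as np

rng = np.random.default_rng(3)

def transition_matrix(seq, k):
    """Row-normalized transition frequency matrix of a k-state sequence."""
    C = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        C[a, b] += 1
    return C / np.maximum(C.sum(axis=1, keepdims=True), 1)

# Synthetic 3-regime sequence with persistence (regimes tend to repeat).
k, n = 3, 600
seq = [0]
for _ in range(n - 1):
    seq.append(seq[-1] if rng.random() < 0.6 else int(rng.integers(k)))
seq = np.array(seq)

P_obs = transition_matrix(seq, k)

# Null distribution: shuffled regime labels, i.e. no temporal dependence.
n_mc = 500
null = np.array([transition_matrix(rng.permutation(seq), k) for _ in range(n_mc)])
p_high = (null >= P_obs).mean(axis=0)     # one-sided p-value for "unusually frequent"
print("observed transition matrix:\n", np.round(P_obs, 2))
print("Monte Carlo p-values:\n", np.round(p_high, 3))
```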
Fuin, Niccolo; Pedemonte, Stefano; Arridge, Simon; Ourselin, Sebastien; Hutton, Brian F
2014-03-01
System designs in single photon emission tomography (SPECT) can be evaluated based on the fundamental trade-off between bias and variance that can be achieved in the reconstruction of emission tomograms. This trade off can be derived analytically using the Cramer-Rao type bounds, which imply the calculation and the inversion of the Fisher information matrix (FIM). The inverse of the FIM expresses the uncertainty associated to the tomogram, enabling the comparison of system designs. However, computing, storing and inverting the FIM is not practical with 3-D imaging systems. In order to tackle the problem of the computational load in calculating the inverse of the FIM, a method based on the calculation of the local impulse response and the variance, in a single point, from a single row of the FIM, has been previously proposed for system design. However this approximation (circulant approximation) does not capture the global interdependence between the variables in shift-variant systems such as SPECT, and cannot account e.g., for data truncation or missing data. Our new formulation relies on subsampling the FIM. The FIM is calculated over a subset of voxels arranged in a grid that covers the whole volume. Every element of the FIM at the grid points is calculated exactly, accounting for the acquisition geometry and for the object. This new formulation reduces the computational complexity in estimating the uncertainty, but nevertheless accounts for the global interdependence between the variables, enabling the exploration of design spaces hindered by the circulant approximation. The graphics processing unit accelerated implementation of the algorithm reduces further the computation times, making the algorithm a good candidate for real-time optimization of adaptive imaging systems. This paper describes the subsampled FIM formulation and implementation details. The advantages and limitations of the new approximation are explored, in comparison with the circulant approximation, in the context of design optimization of a parallel-hole collimator SPECT system and of an adaptive imaging system (similar to the commercially available D-SPECT).
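The subsampling idea can be sketched for a toy linear Poisson emission model, where the FIM element is F_ij = sum_k A_ki A_kj / (A x)_k and only the entries at grid voxels are formed; the system matrix, its sparsity and the activity distribution below are hypothetical and ignore the collimator and detector physics of a real SPECT system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear Poisson emission model y ~ Poisson(A @ x):
# FIM_ij = sum_k A[k, i] * A[k, j] / (A @ x)[k].
n_vox, n_bins = 400, 900                        # hypothetical sizes
A = rng.random((n_bins, n_vox)) * (rng.random((n_bins, n_vox)) < 0.02)  # sparse system matrix
x = np.ones(n_vox)                              # assumed activity distribution
ybar = A @ x + 1e-9                             # expected counts per detector bin

grid = np.arange(0, n_vox, 10)                  # subsample: every 10th voxel on a grid
W = A[:, grid] / ybar[:, None]                  # weighted columns at the grid points
F_sub = A[:, grid].T @ W                        # exact FIM restricted to the grid voxels
cov_sub = np.linalg.inv(F_sub + 1e-6 * np.eye(len(grid)))
print("subsampled FIM shape:", F_sub.shape,
      "-> variance estimate at first grid voxel:", round(cov_sub[0, 0], 4))
```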
INFORMATION-THEORETIC INEQUALITIES ON UNIMODULAR LIE GROUPS
Chirikjian, Gregory S.
2010-01-01
Classical inequalities used in information theory such as those of de Bruijn, Fisher, Cramér, Rao, and Kullback carry over in a natural way from Euclidean space to unimodular Lie groups. These are groups that possess an integration measure that is simultaneously invariant under left and right shifts. All commutative groups are unimodular. And even in noncommutative cases unimodular Lie groups share many of the useful features of Euclidean space. The rotation and Euclidean motion groups, which are perhaps the most relevant Lie groups to problems in geometric mechanics, are unimodular, as are the unitary groups that play important roles in quantum computing. The extension of core information theoretic inequalities defined in the setting of Euclidean space to this broad class of Lie groups is potentially relevant to a number of problems relating to information gathering in mobile robotics, satellite attitude control, tomographic image reconstruction, biomolecular structure determination, and quantum information theory. In this paper, several definitions are extended from the Euclidean setting to that of Lie groups (including entropy and the Fisher information matrix), and inequalities analogous to those in classical information theory are derived and stated in the form of fifteen small theorems. In all such inequalities, addition of random variables is replaced with the group product, and the appropriate generalization of convolution of probability densities is employed. An example from the field of robotics demonstrates how several of these results can be applied to quantify the amount of information gained by pooling different sensory inputs. PMID:21113416
An information-theoretic approach to designing the plane spacing for multifocal plane microscopy
Tahmasbi, Amir; Ram, Sripad; Chao, Jerry; Abraham, Anish V.; Ward, E. Sally; Ober, Raimund J.
2015-01-01
Multifocal plane microscopy (MUM) is a 3D imaging modality which enables the localization and tracking of single molecules at high spatial and temporal resolution by simultaneously imaging distinct focal planes within the sample. MUM overcomes the depth discrimination problem of conventional microscopy and allows high accuracy localization of a single molecule in 3D along the z-axis. An important question in the design of MUM experiments concerns the appropriate number of focal planes and their spacings to achieve the best possible 3D localization accuracy along the z-axis. Ideally, it is desired to obtain a 3D localization accuracy that is uniform over a large depth and has small numerical values, which guarantee that the single molecule is continuously detectable. Here, we address this concern by developing a plane spacing design strategy based on the Fisher information. In particular, we analyze the Fisher information matrix for the 3D localization problem along the z-axis and propose spacing scenarios termed the strong coupling and the weak coupling spacings, which provide appropriate 3D localization accuracies. Using these spacing scenarios, we investigate the detectability of the single molecule along the z-axis and study the effect of changing the number of focal planes on the 3D localization accuracy. We further review a software module we recently introduced, the MUMDesignTool, that helps to design the plane spacings for a MUM setup. PMID:26113764
Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron
2014-01-01
The Thermo Scientific SureTect Listeria species Assay is a new real-time PCR assay for the detection of all species of Listeria in food and environmental samples. This validation study was conducted using the AOAC Research Institute (RI) Performance Tested Methods program to validate the SureTect Listeria species Assay in comparison to the reference method detailed in International Organization for Standardization 11290-1:1996 including amendment 1:2004 in a variety of foods plus plastic and stainless steel. The food matrixes validated were smoked salmon, processed cheese, fresh bagged spinach, cantaloupe, cooked prawns, cooked sliced turkey meat, cooked sliced ham, salami, pork frankfurters, and raw ground beef. All matrixes were tested by Thermo Fisher Scientific, Microbiology Division, Basingstoke, UK. In addition, three matrixes (pork frankfurters, fresh bagged spinach, and stainless steel surface samples) were analyzed independently as part of the AOAC-RI-controlled independent laboratory study by the University of Guelph, Canada. Using probability of detection statistical analysis, a significant difference in favour of the SureTect assay was demonstrated between the SureTect and reference method for high level spiked samples of pork frankfurters, smoked salmon, cooked prawns, stainless steel, and low-spiked samples of salami. For all other matrixes, no significant difference was seen between the two methods during the study. Inclusivity testing was conducted with 68 different isolates of Listeria species, all of which were detected by the SureTect Listeria species Assay. None of the 33 exclusivity isolates were detected by the SureTect Listeria species Assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside of the recommended parameters open to variation, which demonstrated that the assay gave reliable performance. Accelerated stability testing was additionally conducted, validating the assay shelf life.
Evaluation of the Thermo Scientific™ SureTect™ Listeria species Assay.
Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko
2014-03-01
The Thermo Scientific™ SureTect™ Listeria species Assay is a new real-time PCR assay for the detection of all species of Listeria in food and environmental samples. This validation study was conducted using the AOAC Research Institute (RI) Performance Tested MethodsSM program to validate the SureTect Listeria species Assay in comparison to the reference method detailed in International Organization for Standardization 11290-1:1996 including amendment 1:2004 in a variety of foods plus plastic and stainless steel. The food matrixes validated were smoked salmon, processed cheese, fresh bagged spinach, cantaloupe, cooked prawns, cooked sliced turkey meat, cooked sliced ham, salami, pork frankfurters, and raw ground beef. All matrixes were tested by Thermo Fisher Scientific, Microbiology Division, Basingstoke, UK. In addition, three matrixes (pork frankfurters, fresh bagged spinach, and stainless steel surface samples) were analyzed independently as part of the AOAC-RI-controlled independent laboratory study by the University of Guelph, Canada. Using probability of detection statistical analysis, a significant difference in favour of the SureTect assay was demonstrated between the SureTect and reference method for high level spiked samples of pork frankfurters, smoked salmon, cooked prawns, stainless steel, and low-spiked samples of salami. For all other matrixes, no significant difference was seen between the two methods during the study. Inclusivity testing was conducted with 68 different isolates of Listeria species, all of which were detected by the SureTect Listeria species Assay. None of the 33 exclusivity isolates were detected by the SureTect Listeria species Assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside of the recommended parameters open to variation, which demonstrated that the assay gave reliable performance. Accelerated stability testing was additionally conducted, validating the assay shelf life.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited. A... motion or tied up, a wharf or other structure, works under construction, plant engaged in river and...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Yazoo Diversion Canal, Vicksburg, Miss., from its mouth at Kleinston Landing to Fisher Street; navigation. 162.85 Section 162.85... mouth at Kleinston Landing to Fisher Street; navigation. (a) Speed. Excessive speeding is prohibited. A...
Job Satisfaction in the Shrimp Trawl Fisheries of Vietnam
ERIC Educational Resources Information Center
Sinh, Le Xuan
2012-01-01
This paper investigates the job satisfaction of small-scale shrimp trawl fishers in the vicinity of Camau National Park in southern Vietnam. The research sample consisted of 77 fishers who belong to a growing population of shrimp fishers in the region. The results suggest that 60% would change their fishing metier, 78% would leave fishing for…
Job Satisfaction in the Marine and Estuarine Fisheries of Guinea-Bissau
ERIC Educational Resources Information Center
Fernandes, Raul Mendes
2012-01-01
This paper examines aspects of job satisfaction among small-scale fishers in Guinea-Bissau, West Africa. The willingness of fishers to change metier or occupation is a central aspect of study, and gains relevance from the global degradation of marine environments. The author concludes that small-scale fishers are generally satisfied with the…
Bronwyn W. Williams; Jonathan H. Gilbert; Patrick A. Zollner
2007-01-01
Management of mustelid species such as fishers and martens requires an understanding of the history of local populations. This is particularly true in areas where populations were extirpated and restored through reintroduction efforts. During the late 19th and 20th centuries, fishers (Martes pennanti) and American martens (Martes americana...
Fish Consumption Patterns and Mercury Advisory Knowledge Among Fishers in the Haw River Basin
Johnston, Jill E.; Hoffman, Kate; Wing, Steve; Lowman, Amy
2016-01-01
BACKGROUND Fish consumption has numerous health benefits, with fish providing a source of protein as well as omega-3 fatty acids. However, some fish also contain contaminants that can impair human health. In North Carolina, the Department of Health and Human Services has issued fish consumption advisories due to methylmercury contamination in fish. Little is known about local fishers’ consumption patterns and advisory adherence in North Carolina. METHODS We surveyed a consecutive sample of 50 fishers (74.6% positive response rate) who reported eating fish caught from the Haw River Basin or Jordan Lake. They provided information on demographic characteristics, species caught, and the frequency of local fish consumption. Additionally, fishers provided information on their knowledge of fish consumption advisories and the impact of those advisories on their fishing and fish consumption patterns. RESULTS The majority of participants were male (n = 44) and reported living in central North Carolina. Catfish, crappie, sunfish, and large-mouth bass were consumed more frequently than other species of fish. Of the fishers surveyed, 8 reported eating more than 1 fish meal high in mercury per week, which exceeds the North Carolina advisory recommendation. Most participants (n = 32) had no knowledge of local fish advisories, and only 4 fishers reported that advisories impacted their fishing practices. LIMITATIONS We sampled 50 fishers at 11 locations. There is no enumeration of the dynamic population of fishers and no way to assess the representativeness of this sample. CONCLUSIONS Additional outreach is needed to make local fishers aware of fish consumption advisories and the potential health impacts of eating high-mercury fish, which may also contain other persistent and bioaccumulative toxins. PMID:26763238
Pavlowich, Tyler; Kapuscinski, Anne R
2017-01-01
Social and ecological systems come together during the act of fishing. However, we often lack a deep understanding of the fishing process, despite its importance for understanding and managing fisheries. A quantitative, mechanistic understanding of the opportunities fishers encounter, the constraints they face, and how they make decisions within the context of opportunities and constraints will enhance the design of fisheries management strategies to meet linked ecological and social objectives and will improve scientific capacity to predict impacts of different strategies. We examined the case of spearfishing in a Caribbean coral reef fishery. We mounted cameras on fishers' spearguns to observe the fish they encountered, what limited their ability to catch fish, and how they made decisions about which fish to target. We observed spearfishers who dove with and without the assistance of compressed air, and compared the fishing process of each method using content analysis of videos and decision models of fishers' targeting selections. Compressor divers encountered more fish, took less time to catch each fish, and had a higher rate of successful pursuits. We also analyzed differences among taxa in this multispecies fishery, because some taxa are known to be ecologically or economically more valuable than others. Parrotfish are ecologically indispensable for healthy coral reefs, and they were encountered and captured more frequently than any other taxon. Fishers made decisions about which fish to target based on a fish's market value, proximity to the fisher, and taxon. The information uncovered on fishers' opportunities, constraints, and decision making has implications for managing this fishery and others. Moreover, it demonstrates the value of pursuing an improved understanding of the fishing process from the perspective of the fishers.
Tabery, James; Sarkar, Sahotra
2015-01-01
From 1930 to 1937 Lancelot Hogben FRS occupied the Chair of Social Biology at the London School of Economics and Political Science. According to standard histories of this appointment, he and R. A. Fisher FRS both applied for the position, but Hogben was selected over Fisher. The episode has received attention in large part because of the later prominence of the two figures involved. The surviving archival records, however, tell a remarkably different story. Neither Fisher nor Hogben was ever an official candidate for the chair. Indeed, Fisher seems not to have applied for the position at all, and Hogben was approached only behind the scenes of the official search. The purpose of this paper is to correct and complete the history of this episode. PMID:26665489
Sequential Data Assimilation for Seismicity: a Proof of Concept
NASA Astrophysics Data System (ADS)
van Dinther, Ylona; Fichtner, Andreas; Kuensch, Hansruedi
2016-04-01
Our probabilistic forecasting ability and physical understanding of earthquakes are significantly hampered by limited indications of the current and evolving state of stress and strength on faults. This information is typically thought to be beyond our resolution capabilities based on surface data. We show that the state of stress and strength is actually obtainable for settings with one dominant fault. State variables and their uncertainties are obtained using Ensemble Kalman Filtering, a sequential data assimilation technique extensively developed for weather forecasting purposes. Through the least-squares solution of Bayes' theorem, data with errors are for the first time assimilated to update a partial-differential-equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves the Navier-Stokes equations with a rate-dependent friction coefficient (van Dinther et al., JGR, 2013). To prove the concept of this weather-earthquake forecasting bridge we perform a perfect model test. Synthetic numerical data from a single analogue borehole are assimilated into 20 ensemble models over 14 cycles of analogue earthquakes. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength of the unobserved fault is typically already available once data from a single, shallow borehole are assimilated over part of a seismic cycle. This is possible because the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward propagation step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next analogue earthquake. With assimilation at constant intervals, the system's forecasting ability turns out to be beyond expectations: 5 analogue events are forecasted approximately accurately, 5 had indications slightly earlier, 3 were identified only during propagation, and 1 was missed. Otherwise, predominantly quiet interseismic periods were forecasted, except on 3 occasions where smaller events triggered prolonged probabilities until the larger event that followed slightly later. Besides temporal forecasting, we also observe some magnitude forecasting skill for 59% of the events, while the sizes of the other events were underestimated. This new framework thus has the potential, in the long term, to assist in improving our probabilistic hazard assessment.
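A generic stochastic Ensemble Kalman Filter analysis step (perturbed observations) conveys the mechanism by which the sampled covariance lets surface observations update unobserved fault states; this is a minimal sketch with a synthetic linear observation operator, not the authors' coupling to the visco-elasto-plastic seismic cycle model.

```python
import numpy as np

rng = np.random.default_rng(11)

def enkf_update(X, y, H, R):
    """Stochastic EnKF analysis step (perturbed observations).
    X: (n_state, n_ens) forecast ensemble; y: observations; H: obs operator; R: obs error cov."""
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)
    P_hh = HA @ HA.T / (n_ens - 1) + R                # innovation covariance
    P_xh = A @ HA.T / (n_ens - 1)                     # state-observation cross covariance
    K = P_xh @ np.linalg.inv(P_hh)                    # Kalman gain from the ensemble
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - HX)                           # analysis ensemble

# Toy example: 4 unobserved "fault" states and 2 observed "surface" states,
# correlated through the ensemble so that surface data constrain the fault.
n_state, n_obs, n_ens = 6, 2, 40
truth = rng.normal(size=n_state)
L = 0.4 * rng.normal(size=(n_state, n_state)) + np.eye(n_state)
X_f = truth[:, None] + L @ rng.normal(size=(n_state, n_ens))
H = np.zeros((n_obs, n_state)); H[0, 4] = 1.0; H[1, 5] = 1.0   # only "surface" states observed
R = 0.05 * np.eye(n_obs)
y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)

X_a = enkf_update(X_f, y, H, R)
print("unobserved-state spread, forecast:", X_f[:4].std(axis=1).round(2))
print("unobserved-state spread, analysis:", X_a[:4].std(axis=1).round(2))
```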
Not all that glitters is RMT in the forecasting of risk of portfolios in the Brazilian stock market
NASA Astrophysics Data System (ADS)
Sandoval, Leonidas; Bortoluzzo, Adriana Bruscato; Venezuela, Maria Kelly
2014-09-01
Using stocks of the Brazilian stock exchange (BM&F-Bovespa), we build portfolios of stocks based on Markowitz's theory and test the predicted and realized risks. This is done using the correlation matrices between stocks, and also using Random Matrix Theory in order to clean such correlation matrices from noise. We also calculate correlation matrices using a regression model in order to remove the effect of common market movements and their cleaned versions using Random Matrix Theory. This is done for years of both low and high volatility of the Brazilian stock market, from 2004 to 2012. The results show that the use of regression to subtract the market effect on returns greatly increases the accuracy of the prediction of risk, and that, although the cleaning of the correlation matrix often leads to portfolios that better predict risks, in periods of high volatility of the market this procedure may fail to do so. The results may be used in the assessment of the true risks when one builds a portfolio of stocks during periods of crisis.
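The standard RMT cleaning step can be sketched as follows: eigenvalues of the sample correlation matrix below the Marchenko-Pastur upper edge (1 + sqrt(N/T))^2 are treated as noise and flattened before building a minimum-variance Markowitz portfolio. The single-factor returns are simulated, and the regression-based removal of the market mode studied in the paper is not included.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical returns: N stocks over T days with one common "market" factor.
N, T = 50, 500
market = rng.normal(scale=0.01, size=T)
returns = 0.8 * market[:, None] + rng.normal(scale=0.02, size=(T, N))

std = returns.std(axis=0)
C = np.corrcoef(returns, rowvar=False)

# Marchenko-Pastur upper edge for a pure-noise correlation matrix.
q = N / T
lam_max = (1 + np.sqrt(q)) ** 2

w, V = np.linalg.eigh(C)
noise = w < lam_max
w_clean = w.copy()
w_clean[noise] = w[noise].mean()                 # flatten the noise band, preserve the trace
C_clean = V @ np.diag(w_clean) @ V.T
np.fill_diagonal(C_clean, 1.0)

# Minimum-variance Markowitz weights from the cleaned covariance matrix.
cov_clean = C_clean * np.outer(std, std)
ones = np.ones(N)
w_mv = np.linalg.solve(cov_clean, ones)
w_mv /= w_mv.sum()
predicted_vol = np.sqrt(w_mv @ cov_clean @ w_mv)
print(f"{noise.sum()} of {N} eigenvalues in the noise band;",
      "predicted daily portfolio volatility:", round(predicted_vol, 4))
```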
Case study 6.1: DNA survey for fisher in northern Idaho
Samuel Cushman; Kevin McKelvey; Michael Schwartz
2008-01-01
Unique haplotypes indicating the presence of a residual native population of fisher were found in central Idaho (Vinkey et al. 2006). Fishers had been detected previously using camera sets in the Selkirk Mountains just south of the Canadian border, but their population status and genetic composition were unknown. The purpose of the study was to provide a comprehensive...
Bill Zielinski; Fredrick V. Schlexer
2015-01-01
Resting habitat used by fishers (Pekania pennanti) has been relatively well studied, but little is known about the persistence of their resting structures over time. We selected for reexamination 73 of 195 resting structures used by fishers in northwestern California and compared their condition on the date they were found with their...
Fisher research in the US Rocky Mountains: A critical overview
Michael Schwartz; J. Sauder
2013-01-01
In this talk we review the recent fisher research and monitoring efforts that have occurred throughout Idaho and Montana in past 2 decades. We begin this talk with a summary of the habitat relationship work that has examined fisher habitat use at multiple scales. These have largely been conducted using radio and satellite telemetry, although a new, joint effort to use...
Meta-analyses of habitat selection by fishers at resting sites in the Pacific coastal region
Keith B. Aubry; Catherine M. Raley; Steven W. Buskirk; William J. Zielinski; Michael K. Schwartz; Richard T. Golightly; Kathryn L. Purcell; Richard D. Weir; J. Scott Yaeger
2013-01-01
The fisher (Pekania pennanti) is a species of conservation concern throughout the Pacific coastal region in North America. A number of radiotelemetry studies of habitat selection by fishers at resting sites have been conducted in this region, but the applicability of observed patterns beyond the boundaries of each study area is unknown. Broadly...
Resting structures and resting habitat of fishers in the southern Sierra Nevada, California
Kathryn L. Purcell; Amie K. Mazzoni; Sylvia R. Mori; Brian B. Boroski
2009-01-01
The fisher (Martes pennanti) is a forest mustelid endemic to North America that has experienced range reductions in Pacific states that have led to their listing under the Endangered Species Act as warranted but precluded by higher priorities. The viability of the southern Sierra Nevada fisher population is of particular concern due to its reduced...
Ecology and management of Martes on private timberlands in north coastal California
Keith A. Hamm; Lowell V. Diller; David W. Lamphear; Desiree A. Early
2012-01-01
Green Diamond Resource Company has conducted periodic studies of fishers on its California timberlands since 1994. A graduate study in 1994 and 1995 used track plates to investigate the distribution and habitat associations of fishers. Fishers were detected at 65 percent of the survey segments during both years combined but marten were not detected. A repeated track...
Patricia J. Happe; Kurt J. Jenkins; Michael K. Schwartz; Jeffrey C. Lewis; Keith B. Aubry
2014-01-01
With the translocation and release of 90 fishers [Pekania pennanti (formerly Martes pennanti)] from British Columbia to Olympic National Park during 2008-2010, the National Park Service and Washington Department of Fish and Wildlife accomplished the first phase of fisher restoration in Washington State. Beginning in 2013, we initiated a new research project to...
ERIC Educational Resources Information Center
Wilburn, Catherine; Feeney, Aidan
2008-01-01
In a recently published study, Sloutsky and Fisher [Sloutsky, V. M., & Fisher, A.V. (2004a). When development and learning decrease memory: Evidence against category-based induction in children. "Psychological Science", 15, 553-558; Sloutsky, V. M., & Fisher, A. V. (2004b). Induction and categorization in young children: A similarity-based model.…
Thermodynamical transcription of density functional theory with minimum Fisher information
NASA Astrophysics Data System (ADS)
Nagy, Á.
2018-03-01
Ghosh, Berkowitz and Parr designed a thermodynamical transcription of the ground-state density functional theory and introduced a local temperature that varies from point to point. The theory, however, is not unique because the kinetic energy density is not uniquely defined. Here we derive the expression of the phase-space Fisher information in the GBP theory taking the inverse temperature as the Fisher parameter. It is proved that this Fisher information takes its minimum for the case of constant temperature. This result is consistent with the recently proven theorem that the phase-space Shannon information entropy attains its maximum at constant temperature.
R. A. Fisher: a faith fit for eugenics.
Moore, James
2007-03-01
In discussions of 'religion-and-science', faith is usually emphasized more than works, scientists' beliefs more than their deeds. By reversing the priority, a lingering puzzle in the life of Ronald Aylmer Fisher (1890-1962), statistician, eugenicist and founder of the neo-Darwinian synthesis, can be solved. Scholars have struggled to find coherence in Fisher's simultaneous commitment to Darwinism, Anglican Christianity and eugenics. The problem is addressed by asking what practical mode of faith or faithful mode of practice lent unity to his life? Families, it is argued, with their myriad practical, emotional and intellectual challenges, rendered a mathematically-based eugenic Darwinian Christianity not just possible for Fisher, but vital.
Dangers, delights, and destiny on the sea: fishers along the East coast of north sumatra, indonesia.
Markkanen, Pia
2005-01-01
This article describes a collaborative project between the International Labour Organization's International Programme on the Elimination of Child Labour (IPEC) and the Lowell Center for Sustainable Production, in identifying work hazards of fishers along the east coast of North Sumatra, Indonesia, in July 2004. The study employed qualitative investigation techniques: participant observations at fishing villages and harbors; and interviews with local fishers and skippers. Fishers work long hours in life-threatening conditions, often with low pay. It would be synergistic to incorporate fishing safety and health policies and advocacy efforts into reconstruction undertakings of fisheries devastated by the 2004 tsunami.
Characterizing nonclassical correlations via local quantum Fisher information
NASA Astrophysics Data System (ADS)
Kim, Sunho; Li, Longsuo; Kumar, Asutosh; Wu, Junde
2018-03-01
We define two ways of quantifying the quantum correlations based on quantum Fisher information (QFI) in order to study the quantum correlations as a resource in quantum metrology. By investigating the hierarchy of measurement-induced Fisher information introduced in Lu et al. [X. M. Lu, S. Luo, and C. H. Oh, Phys. Rev. A 86, 022342 (2012), 10.1103/PhysRevA.86.022342], we show that the presence of quantum correlation can be confirmed by the difference of the Fisher information induced by the measurements of two hierarchies. In particular, the quantitative quantum correlations based on QFI coincide with the geometric discord for pure quantum states.
Subject-based feature extraction by using fisher WPD-CSP in brain-computer interfaces.
Yang, Banghua; Li, Huarong; Wang, Qian; Zhang, Yunyuan
2016-06-01
Feature extraction of the electroencephalogram (EEG) plays a vital role in brain-computer interfaces (BCIs). In recent years, common spatial pattern (CSP) has been proven to be an effective feature extraction method. However, traditional CSP has the disadvantages of requiring many input channels and lacking frequency information. To remedy these defects, wavelet packet decomposition (WPD) and CSP are combined to extract effective features. But the WPD-CSP method gives little consideration to extracting features that are fitted to the specific subject. A subject-based feature extraction method using fisher WPD-CSP is therefore proposed in this paper. The idea of the proposed method is to adapt fisher WPD-CSP to each subject separately. It mainly includes the following six steps: (1) original EEG signals from all channels are decomposed into a series of sub-bands using WPD; (2) average power values of the obtained sub-bands are computed; (3) the sub-bands with larger values of Fisher distance, computed from the average power, are selected for that particular subject; (4) each selected sub-band is reconstructed and regarded as a new EEG channel; (5) all new EEG channels are used as input to CSP and a six-dimensional feature vector is obtained by CSP, which forms the subject-based feature extraction model; (6) a probabilistic neural network (PNN) is used as the classifier and the classification accuracy is obtained. Data from six subjects were processed by the subject-based fisher WPD-CSP, the non-subject-based fisher WPD-CSP, and WPD-CSP. Compared with non-subject-based fisher WPD-CSP and WPD-CSP, the results show that the proposed method yields better performance (sensitivity: 88.7±0.9%, and specificity: 91±1%), with the classification accuracy of subject-based fisher WPD-CSP increased by 6-12% and 14%, respectively. The proposed subject-based fisher WPD-CSP method not only remedies the disadvantages of CSP through WPD but also discards unhelpful sub-bands for each subject, so that the fewer remaining sub-bands retain better separability in terms of Fisher distance, leading to higher classification accuracy than the WPD-CSP method. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
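Step (3), the subject-specific sub-band selection by Fisher distance, can be sketched as below; the per-trial average-power features are synthetic stand-ins for the WPD outputs, and the CSP and PNN stages are not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

def fisher_score(feature, labels):
    """Fisher distance of a per-trial feature between two classes: (m1 - m2)^2 / (v1 + v2)."""
    a, b = feature[labels == 0], feature[labels == 1]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

# Synthetic average-power features: 80 trials x 16 WPD sub-bands for one subject.
n_trials, n_bands = 80, 16
labels = rng.integers(0, 2, size=n_trials)
power = rng.gamma(shape=2.0, scale=1.0, size=(n_trials, n_bands))
power[:, 3] += 1.5 * labels          # pretend sub-bands 3 and 7 are class-discriminative
power[:, 7] += 1.0 * labels

scores = np.array([fisher_score(power[:, j], labels) for j in range(n_bands)])
selected = np.argsort(scores)[::-1][:4]          # keep the sub-bands with the largest scores
print("Fisher scores:", np.round(scores, 2))
print("sub-bands selected for this subject:", sorted(selected.tolist()))
```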
NASA Astrophysics Data System (ADS)
Li, J.; Wang, P.; Han, H.; Schmit, T. J.
2014-12-01
JPSS and GOES-R observations play an important role in numerical weather prediction (NWP). However, how best to represent the information from satellite observations, and how to bring value-added information from these data, including both radiances and derived products, into regional NWP models, still requires investigation. In order to enhance the applications of JPSS and GOES-R data in regional NWP for high-impact weather forecasts, scientists from the Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin-Madison have recently developed a near-real-time regional Satellite Data Assimilation system for Tropical storm forecasts (SDAT) (http://cimss.ssec.wisc.edu/sdat). The system consists of the community Gridpoint Statistical Interpolation (GSI) assimilation system and the advanced Weather Research and Forecasting (WRF) model. In addition to assimilating GOES, AMSUA/AMSUB, HIRS, MHS, ATMS (Suomi-NPP), AIRS and IASI radiances, SDAT is also able to assimilate satellite-derived products such as hyperspectral IR retrieved temperature and moisture profiles, total precipitable water (TPW), GOES Sounder (and future GOES-R) layer precipitable water (LPW) and GOES Imager atmospheric motion vector (AMV) products. Real-time forecasted GOES infrared (IR) images simulated from SDAT output are also part of the system for applications and forecast evaluations. To set up the system parameters, a series of experiments has been carried out to test the impacts of different initialization schemes, including different background error matrices, different NCEP global model data sets, and different WRF model horizontal resolutions. Using SDAT as a research testbed, studies have been conducted on the impacts of different satellite data, as well as on different techniques for handling clouds in radiance assimilation. Since the fall of 2013, the SDAT system has been running in near real time. Results from historical cases and from 2014 hurricane-season cases will be compared with the operational GFS and HWRF and presented at the meeting.
Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models
NASA Astrophysics Data System (ADS)
Hernandez, F.; Liang, X.
2017-12-01
Probabilistic forecasts in the geosciences offer invaluable information by allowing to estimate the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their applicability impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods—like some variants of (ensemble) Kalman filters and of particle filters—and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation, and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
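The flavour of the second strategy can be sketched with a rank-k factorisation of the background covariance, P = U U^T + d I, combined with a Woodbury-identity solve of the Kalman-type innovation system; all sizes and values are hypothetical, and this is not the authors' learning and belief-propagation algorithm.

```python
import numpy as np

rng = np.random.default_rng(8)

n, k = 2000, 30                    # many state variables, low-rank representation
U = rng.normal(size=(n, k)) / np.sqrt(k)
d = 0.1                            # small diagonal "nugget" so that P = U U^T + d I is full rank

# Storing P explicitly costs O(n^2) floats; the factors cost only O(n k).
print("full covariance floats:", n * n, " vs. low-rank floats:", n * k + 1)

# Solve (H P H^T + R)^{-1} v via the Woodbury identity, without ever forming the
# n x n covariance matrix P explicitly.
m = 200
H = rng.normal(size=(m, n)) / np.sqrt(n)
R = 0.05 * np.eye(m)
v = rng.normal(size=m)

HU = H @ U                                      # m x k
S = d * (H @ H.T) + R                           # the part not captured by the low-rank factor
S_inv_v = np.linalg.solve(S, v)
inner = np.eye(k) + HU.T @ np.linalg.solve(S, HU)
woodbury = S_inv_v - np.linalg.solve(S, HU @ np.linalg.solve(inner, HU.T @ S_inv_v))

# Check against the direct dense computation (feasible only at this toy size).
P = U @ U.T + d * np.eye(n)
direct = np.linalg.solve(H @ P @ H.T + R, v)
print("max deviation from the dense solve:", np.abs(woodbury - direct).max())
```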
Klaus, Aumayr; Fathi, Osmen; Tatjana, Traub-Weidinger; Bruno, Niederle; Oskar, Koperek
2018-04-01
Follicular thyroid carcinomas (FTCs) are the second most common malignant neoplasia of the thyroid and in general their prognosis is quite favorable. However, the occurrence of metastases or non-responsiveness to radioiodine therapy worsens the prognosis considerably. We immunohistochemically evaluated the expression of the hypoxia-associated protein hypoxia-inducible factor 1α (HIF-1α), the stroma-remodeling marker Tenascin C, as well as markers of the epithelial-mesenchymal transition (EMT), namely E-cadherin and slug, in a series of 59 sporadic FTCs. In addition, various clinicopathologic parameters were assessed, such as TNM staging, age, and tumor size, as well as tumor characteristics such as desmoplasia, necrosis, and calcification. Overexpression of HIF-1α was seen in 29 of 59 tumors (49.2%), including 21 (35.6%) FTCs with strong expression in tumor cell groups. HIF-1α correlated significantly with metastasis (p < 0.001; Mann-Whitney U test), degree of desmoplasia (p = 0.042, Kruskal-Wallis test), tenascin C expression (p = 0.042, Kruskal-Wallis test), calcification (p < 0.025, Kruskal-Wallis test), necrosis (p = 0.002), age (p = 0.011, Kruskal-Wallis test) and tumor stage UICC (p = 0.022, Kruskal-Wallis test). Furthermore, metastasis was associated with the degree of desmoplasia (p = 0.014; Fisher's exact test), calcification (p = 0.008, Fisher's exact test), necrosis (p = 0.042, Fisher's exact test), tumor size (p = 0.015, Mann-Whitney U test), and age (p = 0.001, Mann-Whitney U test). In a Cox proportional hazards model, only metastasis remained as an independent risk factor for overall survival (hazard rate: 10.2 [95% CI, 2.19 to 47.26]; p = 0.003). Our data suggest that HIF-1α plays a critical role in the remodeling of the extracellular matrix as well as in the metastasizing process of follicular thyroid carcinoma, and that hypoxia-associated and -regulated proteins may be considered potential targets for personalized medicine.
Jiao, Pengfei; Cai, Fei; Feng, Yiding; Wang, Wenjun
2017-08-21
Link prediction aims at forecasting the latent or unobserved edges in complex networks and has a wide range of real-world applications. Almost all existing methods and models take advantage of only one level of organization of the network, and thus lose important information hidden in its other organizations. In this paper, we propose a link prediction framework, called NMF 3 here, which makes the best use of the structure of networks at different levels of organization based on nonnegative matrix factorization. We first map the observed network into another space by kernel functions, which yields the different-order organizations. Then we combine the adjacency matrix of the network with one of these other organizations, which gives the objective function of our link prediction framework based on nonnegative matrix factorization. Third, we derive an iterative algorithm to optimize the objective function, which converges to a local optimum, and we propose a fast optimization strategy for large networks. Lastly, we test the proposed framework with two kernel functions on a series of real-world networks under different training-set sizes, and the experimental results show the feasibility, effectiveness, and competitiveness of the proposed framework.
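A minimal sketch of NMF-based link prediction on a single organization of the network (multiplicative updates, candidate edges scored by the reconstruction W H) conveys the basic mechanism; it omits the kernel mapping and the combined objective of NMF 3, and the toy two-community network and hidden edges are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def nmf(A, k=2, n_iter=500, eps=1e-9):
    """Basic multiplicative-update NMF: A ~ W @ H with nonnegative factors."""
    n, m = A.shape
    W, H = rng.random((n, k)), rng.random((k, m))
    for _ in range(n_iter):
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy undirected network with two communities; hide a few edges as the "test set".
n = 20
A = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        same = (i < n // 2) == (j < n // 2)
        if rng.random() < (0.5 if same else 0.05):
            A[i, j] = A[j, i] = 1.0
hidden = [(0, 3), (12, 15)]
for i, j in hidden:                     # make sure the edges to be hidden actually exist
    A[i, j] = A[j, i] = 1.0

A_train = A.copy()
for i, j in hidden:
    A_train[i, j] = A_train[j, i] = 0.0

W, H = nmf(A_train, k=2)
scores = W @ H
for i, j in hidden:
    print(f"score for hidden edge ({i},{j}):", round(scores[i, j], 3))
print("mean score over true non-edges:", round(scores[A == 0].mean(), 3))
```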
Quantum Jeffreys prior for displaced squeezed thermal states
NASA Astrophysics Data System (ADS)
Kwek, L. C.; Oh, C. H.; Wang, Xiang-Bin
1999-09-01
It is known that, by extending the equivalence of the Fisher information matrix to its quantum version, the Bures metric, the quantum Jeffreys prior can be determined from the volume element of the Bures metric. We compute the Bures metric for the displaced squeezed thermal state and analyse the quantum Jeffreys prior and its marginal probability distributions. To normalize the marginal probability density function, it is necessary to provide a range of values of the squeezing parameter or the inverse temperature. We find that if the range of the squeezing parameter is kept narrow, there are significant differences in the marginal probability density functions in terms of the squeezing parameters for the displaced and undisplaced situations. However, these differences disappear as the range increases. Furthermore, marginal probability density functions against temperature are very different in the two cases.
Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics.
Arampatzis, Georgios; Katsoulakis, Markos A; Rey-Bellet, Luc
2016-03-14
We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
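As a schematic reminder of the likelihood-ratio (score-function) idea behind such estimators (generic form only, not the paper's covariance formulation for stochastic dynamics):

\[
\partial_{\theta}\,\mathbb{E}_{\theta}[f(X)]
= \mathbb{E}_{\theta}\!\big[f(X)\,\partial_{\theta}\log L_{\theta}(X)\big]
= \mathbb{E}_{\theta}\!\big[(f(X)-\mathbb{E}_{\theta}[f(X)])\,\partial_{\theta}\log L_{\theta}(X)\big],
\]

since \(\mathbb{E}_{\theta}[\partial_{\theta}\log L_{\theta}] = 0\). The centered form typically has lower variance, and the covariance matrix of the score, \(\mathrm{Cov}(\partial_{\theta}\log L_{\theta})\), is the Fisher information matrix that appears as a submatrix in the covariance formulation mentioned above.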
The beta Burr type X distribution properties with application.
Merovci, Faton; Khaleel, Mundher Abdullah; Ibrahim, Noor Akma; Shitan, Mahendran
2016-01-01
We develop a new continuous distribution called the beta-Burr type X distribution that extends the Burr type X distribution. We provide a comprehensive mathematical treatment of this distribution. Furthermore, various structural properties of the new distribution are derived, including the moment generating function and the rth moment, thus generalizing some results in the literature. We also obtain expressions for the density, moment generating function and rth moment of the order statistics. We use maximum likelihood estimation to estimate the parameters. Additionally, the asymptotic confidence intervals for the parameters are derived from the Fisher information matrix. Finally, a simulation study is carried out under varying sample sizes to assess the performance of this model. An illustration with a real dataset indicates that this new distribution can serve as a good alternative model for positive real data in many areas.
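The estimation recipe in this abstract (maximum likelihood followed by Wald-type intervals from the inverse Fisher information) can be sketched in Python as below. The beta-Burr type X density itself is not reproduced, so a two-parameter Weibull is used as a hypothetical stand-in; all function names and numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min, norm

def neg_log_lik(params, x):
    # stand-in likelihood: two-parameter Weibull, not the beta-Burr type X pdf
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(weibull_min.logpdf(x, c=shape, scale=scale))

def fit_with_wald_ci(x, alpha=0.05):
    res = minimize(neg_log_lik, x0=np.array([1.0, np.mean(x)]),
                   args=(x,), method="Nelder-Mead")
    theta = res.x
    # observed Fisher information = Hessian of the negative log-likelihood,
    # approximated here by finite differences
    h = 1e-4 * np.maximum(np.abs(theta), 1.0)
    H = np.zeros((2, 2))
    for i in range(2):
        for j in range(2):
            ei, ej = np.zeros(2), np.zeros(2)
            ei[i], ej[j] = h[i], h[j]
            H[i, j] = (neg_log_lik(theta + ei + ej, x)
                       - neg_log_lik(theta + ei, x)
                       - neg_log_lik(theta + ej, x)
                       + neg_log_lik(theta, x)) / (h[i] * h[j])
    cov = np.linalg.inv(H)                      # asymptotic covariance matrix
    z = norm.ppf(1 - alpha / 2)
    se = np.sqrt(np.diag(cov))
    return theta, np.column_stack([theta - z * se, theta + z * se])

x = weibull_min.rvs(c=1.7, scale=2.0, size=400, random_state=1)
print(fit_with_wald_ci(x))
```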
Quantum Theory of Three-Dimensional Superresolution Using Rotating-PSF Imagery
NASA Astrophysics Data System (ADS)
Prasad, S.; Yu, Z.
The inverse of the quantum Fisher information (QFI) matrix (and extensions thereof) provides the ultimate lower bound on the variance of any unbiased estimation of a parameter from statistical data, whether of intrinsically quantum mechanical or classical character. We calculate the QFI for Poisson-shot-noise-limited imagery using the rotating PSF that can localize and resolve point sources fully in all three dimensions. We also propose an experimental approach based on the use of a computer-generated hologram and projective measurements to realize the QFI-limited variance for the problem of super-resolving a closely spaced pair of point sources at a highly reduced photon cost. The paper presents a preliminary analysis of the quantum-limited three-dimensional (3D) pair optical super-resolution (OSR) problem with potential applications to astronomical imaging and 3D space-debris localization.
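For context, the bound referred to here is the quantum Cramér-Rao bound (standard form; the rotating-PSF QFI itself is not reproduced):

\[
\mathrm{Cov}(\hat{\boldsymbol{\theta}}) \;\succeq\; \frac{1}{N}\,F_{Q}(\boldsymbol{\theta})^{-1},
\qquad
[F_{Q}]_{ij} = \mathrm{Re}\,\mathrm{Tr}\!\left[\rho_{\theta}\,L_{i}L_{j}\right],
\qquad
\partial_{i}\rho_{\theta} = \tfrac{1}{2}\left(L_{i}\rho_{\theta} + \rho_{\theta}L_{i}\right),
\]

with \(N\) the number of independently detected photons and \(L_{i}\) the symmetric logarithmic derivatives.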
Jasper, Randall; Stewart, Barbara A; Knight, Andrew
2017-08-01
Issue addressed: Recreational fishing, particularly rock fishing, can be dangerous; 30 fatalities were recorded in Western Australia from 2002-2014. This study investigates differences in behaviours and attitudes towards safety among fishers at a fishing fatality 'black spot' in Australia. Methods: A total of 236 fishers were surveyed at Salmon Holes, Western Australia in 2015. Fishers were grouped by country of origin and significant differences among groups for behaviours and attitudes towards personal safety were identified. Results: Of fishers surveyed, 53% were born in Asia. These fishers self-assessed as poorer swimmers (F = 23.27, P < 0.001), yet were more likely to have fished from rocks (χ² = 20.94, P < 0.001). They were less likely to go close to the water to get a snagged line (χ² = 15.44, P < 0.001) or to drink alcohol while fishing (χ² = 8.63, P < 0.001), and were more likely to agree that they would drown if swept into the sea (χ² = 9.49, P < 0.001). Although most respondents agreed that wearing a life jacket made fishing safer, 78% 'never' wore a life jacket while fishing. Conclusions: Some fishers who were poor swimmers and were aware of the dangers of rock fishing still chose to fish from rocks. So what?: Our results support the proposal that the wearing of life jackets should be promoted, if not made mandatory, while water safety education campaigns should be continued and should target vulnerable communities.
Rapidly shifting environmental baselines among fishers of the Gulf of California
Sáenz-Arroyo, Andrea; Roberts, Callum M; Torre, Jorge; Cariño-Olvera, Micheline; Enríquez-Andrade, Roberto R
2005-01-01
Shifting environmental baselines are inter-generational changes in perception of the state of the environment. As one generation replaces another, people's perceptions of what is natural change even to the extent that they no longer believe historical anecdotes of past abundance or size of species. Although widely accepted, this phenomenon has yet to be quantitatively tested. Here we survey three generations of fishers from Mexico's Gulf of California (N=108), where fish populations have declined steeply over the last 60 years, to investigate how far and fast their environmental baselines are shifting. Compared to young fishers, old fishers named five times as many species and four times as many fishing sites as once being abundant/productive but now depleted (Kruskal–Wallis tests, both p<0.001) with no evidence of a slowdown in rates of loss experienced by younger compared to older generations (Kruskal–Wallis test, n.s. in both cases). Old fishers caught up to 25 times as many Gulf grouper Mycteroperca jordani as young fishers on their best ever fishing day (regression r2=0.62, p<0.001). Despite times of plentiful large fish still being within living memory, few young fishers appreciated that large species had ever been common or nearshore sites productive. Such rapid shifts in perception of what is natural help explain why society is tolerant of the creeping loss of biodiversity. They imply a large educational hurdle in efforts to reset expectations and targets for conservation. PMID:16191603
Freyhult, Eva; Edvardsson, Sverker; Tamas, Ivica; Moulton, Vincent; Poole, Anthony M
2008-07-21
The H/ACA family of small nucleolar RNAs (snoRNAs) plays a central role in guiding the pseudouridylation of ribosomal RNA (rRNA). In an effort to systematically identify the complete set of rRNA-modifying H/ACA snoRNAs from the genome sequence of the budding yeast, Saccharomyces cerevisiae, we developed a program - Fisher - and previously presented several candidate snoRNAs based on our analysis [1]. In this report, we provide a brief update of this work, which was aborted after the publication of experimentally-identified snoRNAs [2] identical to candidates we had identified bioinformatically using Fisher. Our motivation for revisiting this work is, first, to report on the status of the candidate snoRNAs described in [1], and secondly, to report that a modified version of Fisher together with the available multiple yeast genome sequences was able to correctly identify several H/ACA snoRNAs for modification sites not identified by the snoGPS program [3]. While we are no longer developing Fisher, we briefly consider the merits of the Fisher algorithm relative to snoGPS, which may be of use for workers considering pursuing a similar search strategy for the identification of small RNAs. The modified source code for Fisher is made available as supplementary material. Our results confirm the validity of using minimum free energy (MFE) secondary structure prediction to guide comparative genomic screening for RNA families with few sequence constraints.
NASA Astrophysics Data System (ADS)
Jamaluddin, Fadhilah; Rahim, Rahela Abdul
2015-12-01
Markov chains have been in use since 1913 for studying the flow of data over consecutive years and for forecasting. An important feature of a Markov chain model is obtaining an accurate Transition Probability Matrix (TPM). However, obtaining a suitable TPM is hard, especially in long-term modelling, due to the unavailability of data. This paper aims to enhance the classical Markov chain by introducing an Exponential Smoothing technique for developing an appropriate TPM.
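A minimal Python sketch of the idea summarized above: estimate a yearly empirical TPM from an observed state sequence and stabilize it with simple exponential smoothing. The blending rule, smoothing weight, and toy data are assumptions for illustration, not the paper's procedure.

```python
import numpy as np

def empirical_tpm(states, n_states):
    """Count one-step transitions in an observed state sequence and
    row-normalize to a transition probability matrix."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        C[a, b] += 1
    rows = C.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0
    return C / rows

def smoothed_tpm(yearly_sequences, n_states, alpha=0.3):
    """Blend yearly TPM estimates with simple exponential smoothing,
    P_t = alpha * P_hat_t + (1 - alpha) * P_{t-1}; a sketch of stabilizing
    the TPM when data are scarce, not the paper's exact scheme."""
    P = None
    for seq in yearly_sequences:
        P_hat = empirical_tpm(seq, n_states)
        P = P_hat if P is None else alpha * P_hat + (1 - alpha) * P
    return P

# toy usage: three years of observations on a 3-state chain
years = [[0, 1, 1, 2, 0, 1], [1, 2, 2, 0, 1, 1], [2, 1, 0, 0, 1, 2]]
print(smoothed_tpm(years, n_states=3))
```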
ERIC Educational Resources Information Center
Smith, Susan
2012-01-01
The homepage of the Project on Fair Representation (POFR) features a smiling photo of Abigail Fisher, the young White woman at the center of "Fisher v. the University of Texas," which could end race as a criterion in university admissions. Edward Blum, founder of POFR, a conservative advocacy group, connected Fisher with Wiley Rein LLP,…
W.J. Zielinski; J. R. Dunk; J. S. Yaeger; D. W. LaPlante
2010-01-01
The fisher is warranted for protection under the Endangered Species Act in the western United States and, as such, it is especially important that conservation and management actions are based on sound scientific information. We developed a landscape-scale suitability model for interior northern California to predict the probability of detecting fishers and to identify...
Fisher information for two gamma frailty bivariate Weibull models.
Bjarnason, H; Hougaard, P
2000-03-01
The asymptotic properties of frailty models for multivariate survival data are not well understood. To study this aspect, the Fisher information is derived in the standard bivariate gamma frailty model, where the survival distribution is of Weibull form conditional on the frailty. For comparison, the Fisher information is also derived in the bivariate gamma frailty model, where the marginal distribution is of Weibull form.
Rick A. Sweitzer; Viorel D. Popescu; Reginald H. Barrett; Kathryn L. Purcell; Craig M. Thompson
2015-01-01
In the west coast region of the United States, fishers (Pekania pennanti) exist in 2 remnant populations - 1 in northern California and 1 in the southern Sierra Nevada, California - and 3 reintroduced populations (western Washington, southern Oregon, and northeastern California). The West Coast Distinct Population Segment of fishers encompassing all of...
Araújo, Tiago Fernando Souza de; Lange, Marcos; Zétola, Viviane H; Massaro, Ayrton; Teive, Hélio A G
2017-10-01
Charles Miller Fisher is considered the father of modern vascular neurology and one of the giants of neurology in the 20th century. This historical review emphasizes Prof. Fisher's magnificent contribution to vascular neurology and celebrates the 65th anniversary of the publication of his groundbreaking study, "Transient Monocular Blindness Associated with Hemiplegia."
Here today, here tomorrow: Managing forests for fisher habitat in the Northern Rockies
Sue Miller; Michael Schwartz; Lucretia E. Olson
2016-01-01
The fisher is a unique member of the weasel family and a sensitive species in the northern Rockies. They were almost extirpated by trapping in the early twentieth century, but these animals (a mix between a native and introduced population) now inhabit a swath of mesic coniferous forests in Idaho and Montana. Forest managers need information on fisher distribution and...
Survival of fishers in the southern Sierra Nevada region of California
Richard A. Sweitzer; Craig M. Thompson; Rebecca E. Green; Reginald H. Barrett; Kathryn L. Purcell
2015-01-01
Fishers in the western United States were recently proposed for listing under the U.S. Endangered Species Act because of concerns for loss of suitable habitat and evidence of a diversity of mortality risks that reduce survival. One of 2 remnant populations of fishers in California is in the southern Sierra Nevada region, where we studied them at 2 research sites in the...
ERIC Educational Resources Information Center
Nguyen, David H. K.
2014-01-01
Using race as a factor in admissions policies was contested in "Fisher v. University of Texas at Austin." Although the U.S. Supreme Court firmly held in "Grutter v. Bollinger" that race can be considered among many factors in admitting students, the recent decision in "Fisher" has posed many questions and challenges…
Srivastava, Ayush; Srivastava, Anurag; Pandey, Ravindra M
2017-10-01
Randomized controlled trials have become the most respected scientific tool for measuring the effectiveness of a medical therapy. The design, conduct and analysis of randomized controlled trials were developed by Sir Ronald A. Fisher, a mathematician in Great Britain. Fisher propounded that the process of randomization would equally distribute all the known and even unknown covariates across the two or more comparison groups, so that any difference observed could be ascribed to the treatment effect. Today, we observe that in many situations this prediction of Fisher does not hold true; hence, adaptive randomization schedules have been designed to adjust for major imbalances in important covariates. The present essay unravels some weaknesses inherent in the Fisherian concept of the randomized controlled trial.
Revisited Fisher's equation in a new outlook: A fractional derivative approach
NASA Astrophysics Data System (ADS)
Alquran, Marwan; Al-Khaled, Kamel; Sardar, Tridip; Chattopadhyay, Joydev
2015-11-01
The well-known Fisher equation with a fractional derivative is considered to provide some characteristics of memory embedded in the system. The modified model is analyzed both analytically and numerically. A comparatively new technique, the residual power series method, is used for finding approximate solutions of the modified Fisher model. A new technique combining Sinc-collocation and the finite difference method is used for the numerical study. The abundance of the bird species Phalacrocorax carbo is considered as a test bed to validate the model outcome using estimated parameters. We conjecture that the non-diffusive and diffusive fractional Fisher equations represent the same dynamics when the memory index lies in the interval α ∈ (0.8384, 0.9986). We also observe that when the value of the memory index is close to zero, the solutions bifurcate and produce a wave-like pattern. We conclude that the survivability of the species increases for a long-range memory index. These findings parallel Fisher's original observation, with the population spreading in a fashion similar to that of advantageous genes.
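For orientation, the classical Fisher equation and one common time-fractional generalization (with a Caputo derivative of order α) are shown below; the exact fractional formulation and normalization used in the paper are not reproduced here.

\[
\frac{\partial u}{\partial t} = D\,\frac{\partial^{2} u}{\partial x^{2}} + r\,u(1-u),
\qquad
\frac{\partial^{\alpha} u}{\partial t^{\alpha}} = D\,\frac{\partial^{2} u}{\partial x^{2}} + r\,u(1-u),
\quad 0 < \alpha \le 1 .
\]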
D'Lima, Coralie; Marsh, Helene; Hamann, Mark; Sinha, Anindya; Arthur, Rohan
2014-09-01
In human-dominated landscapes, interactions with and perceptions towards wildlife are influenced by multidimensional drivers. Understanding these drivers could prove useful for wildlife conservation. We surveyed the attitudes and perceptions of fishers towards threatened Irrawaddy dolphins (Orcaella brevirostris) at Chilika Lagoon, India. To validate the drivers of fisher perceptions, we: (1) observed dolphin foraging behavior at stake nets, and (2) compared catch per unit effort (CPUE) and catch income of fishers from stake nets in the presence and absence of foraging dolphins. We found that fishers were mostly positive towards dolphins, believing that dolphins augmented their fish catch, and using culture to express their perceptions. Foraging dolphins were observed spending half their time at stake nets and were associated with significantly higher catch income and CPUE of mullet (Liza sp.), a locally preferred food fish species. Wildlife conservation efforts should use the multidimensional drivers of human-wildlife interactions to involve local stakeholders in management.
Tucker, Jody M.; Schwartz, Michael K.; Truex, Richard L.; Pilgrim, Kristine L.; Allendorf, Fred W.
2012-01-01
Establishing if species contractions were the result of natural phenomena or human induced landscape changes is essential for managing natural populations. Fishers (Martes pennanti) in California occur in two geographically and genetically isolated populations in the northwestern mountains and southern Sierra Nevada. Their isolation is hypothesized to have resulted from a decline in abundance and distribution associated with European settlement in the 1800s. However, there is little evidence to establish that fisher occupied the area between the two extant populations at that time. We analyzed 10 microsatellite loci from 275 contemporary and 21 historical fisher samples (1880–1920) to evaluate the demographic history of fisher in California. We did not find any evidence of a recent (post-European) bottleneck in the northwestern population. In the southern Sierra Nevada, genetic subdivision within the population strongly influenced bottleneck tests. After accounting for genetic subdivision, we found a bottleneck signal only in the northern and central portions of the southern Sierra Nevada, indicating that the southernmost tip of these mountains may have acted as a refugium for fisher during the anthropogenic changes of the late 19th and early 20th centuries. Using a coalescent-based Bayesian analysis, we detected a 90% decline in effective population size and dated the time of decline to over a thousand years ago. We hypothesize that fisher distribution in California contracted to the two current population areas pre-European settlement, and that portions of the southern Sierra Nevada subsequently experienced another more recent bottleneck post-European settlement. PMID:23300783
Strain-related differences after experimental traumatic brain injury in rats.
Reid, Wendy Murdock; Rolfe, Andrew; Register, David; Levasseur, Joseph E; Churn, Severn B; Sun, Dong
2010-07-01
The present study directly compares the effects of experimental brain injury in two commonly used rat strains: Fisher 344 and Sprague-Dawley. We previously found that Fisher rats have a higher mortality rate and more frequent seizure attacks at the same injury level than Sprague-Dawley rats. Although strain differences in rats are commonly accepted as contributing to variability among studies, there is a paucity of literature addressing strain influence in experimental neurotrauma. Therefore this study compares outcome measures in two rat strains following lateral fluid percussion injury. Fisher 344 and Sprague-Dawley rats were monitored for changes in physiological measurements, intracranial pressure, and electroencephalographic activity. We further analyzed neuronal degeneration and cell death in the injured brain using Fluoro-Jade-B (FJB) histochemistry and caspase-3 immunostaining. Behavioral studies using the beam walk and Morris water maze were conducted to characterize strain differences in both motor and cognitive functional recovery following injury. We found that Fisher rats had significantly higher intracranial pressure, prolonged seizure activity, increased FJB-positive staining in the injured cortex and thalamus, and increased caspase-3 expression than Sprague-Dawley rats. On average, Fisher rats displayed a greater amount of total recording time in seizure activity and had longer ictal durations. The Fisher rats also had increased motor deficits, correlating with the above results. In spite of these results, Fisher rats performed better on cognitive tests following injury. The results demonstrate that different rat strains respond to injury differently, and thus in preclinical neurotrauma studies strain influence is an important consideration when evaluating outcomes.
Forecasting surface water flooding hazard and impact in real-time
NASA Astrophysics Data System (ADS)
Cole, Steven J.; Moore, Robert J.; Wells, Steven C.
2016-04-01
Across the world, there is increasing demand for more robust and timely forecast and alert information on Surface Water Flooding (SWF). Within a UK context, the government Pitt Review into the Summer 2007 floods provided recommendations and impetus to improve the understanding of SWF risk for both off-line design and real-time forecasting and warning. Ongoing development and trial of an end-to-end real-time SWF system is being progressed through the recently formed Natural Hazards Partnership (NHP) with delivery to the Flood Forecasting Centre (FFC) providing coverage over England & Wales. The NHP is a unique forum that aims to deliver coordinated assessments, research and advice on natural hazards for governments and resilience communities across the UK. Within the NHP, a real-time Hazard Impact Model (HIM) framework has been developed that includes SWF as one of three hazards chosen for initial trialling. The trial SWF HIM system uses dynamic gridded surface-runoff estimates from the Grid-to-Grid (G2G) hydrological model to estimate the SWF hazard. National datasets on population, infrastructure, property and transport are available to assess impact severity for a given rarity of SWF hazard. Whilst the SWF hazard footprint is calculated in real-time using 1, 3 and 6 hour accumulations of G2G surface runoff on a 1 km grid, it has been possible to associate these with the effective rainfall design profiles (at 250m resolution) used as input to a detailed flood inundation model (JFlow+) run offline to produce hazard information resolved to 2m resolution. This information is contained in the updated Flood Map for Surface Water (uFMfSW) held by the Environment Agency. The national impact datasets can then be used with the uFMfSW SWF hazard dataset to assess impacts at this scale and severity levels of potential impact assigned at 1km and for aggregated county areas in real-time. The impact component is being led by the Health and Safety Laboratory (HSL) within the NHP. Flood Guidance within the FFC employs the national Flood Risk Matrix, which categorises potential impacts into minimal, minor, significant and severe, and Likelihood, into very low, low, medium and high classes, and the matrix entries then define the Overall Flood Risk as very low, low, medium and high. Likelihood is quantified by running G2G with Met Office ensemble rainfall inputs that in turn allows a probability to be assigned to the SWF hazard and associated impact. This overall procedure is being trialled and refined off-line by CEH and HSL using case study data, and at the same time implemented as a pre-operational test system at the Met Office for evaluation by FFC (a joint Environment Agency and Met Office centre for flood forecasting) in 2016.
Masato, Giacomo; Bone, Angie; Charlton-Perez, Andrew; Cavany, Sean; Neal, Robert; Dankers, Rutger; Dacre, Helen; Carmichael, Katie; Murray, Virginia
2015-01-01
Objectives: In this study a prototype of a new health forecasting alert system is developed, which is aligned to the approach used in the Met Office's (MO) National Severe Weather Warning Service (NSWWS). This is in order to improve information available to responders in the health and social care system by linking temperatures more directly to risks of mortality, and developing a system more coherent with other weather alerts. The prototype is compared to the current system in the Cold Weather and Heatwave plans via a case-study approach to verify its potential advantages and shortcomings. Method: The prototype health forecasting alert system introduces an “impact vs likelihood matrix” for the health impacts of hot and cold temperatures which is similar to those used operationally for other weather hazards as part of the NSWWS. The impact axis of this matrix is based on existing epidemiological evidence, which shows an increasing relative risk of death at extremes of outdoor temperature beyond a threshold which can be identified epidemiologically. The likelihood axis is based on a probability measure associated with the temperature forecast. The new method is tested for two case studies (one during summer 2013, one during winter 2013), and compared to the performance of the current alert system. Conclusions: The prototype shows some clear improvements over the current alert system. It allows for a much greater degree of flexibility, provides more detailed regional information about the health risks associated with periods of extreme temperatures, and is more coherent with other weather alerts, which may make it easier for front line responders to use. It will require validation and engagement with stakeholders before it can be considered for use. PMID:26431427
Coll, Marta; Carreras, Marta; Ciércoles, Cristina; Cornax, Maria-José; Gorelli, Giulia; Morote, Elvira; Sáez, Raquel
2014-01-01
Background: The expansion of fishing activities has intensively transformed marine ecosystems worldwide. However, available time series do not frequently cover historical periods. Methodology: Fishers' perceptions were used to complement data and characterise changes in fishing activity and exploited ecosystems in the Spanish Mediterranean Sea and Gulf of Cadiz. Fishers' interviews were conducted in 27 fishing harbours of the area, and included 64 fishers from ages between 20 to >70 years old to capture the experiences and memories of various generations. Results are discussed in comparison with available independent information using stock assessments and international convention lists. Principal Findings: According to fishers, fishing activity substantially evolved in the area with time, expanding towards deeper grounds and towards areas more distant from the coast. The maximum amount of catch ever caught and the weight of the largest species ever captured inversely declined with time. Fishers (70%) cited specific fishing grounds where depletion occurred. They documented ecological changes of marine biodiversity during the last half of the century: 94% reported the decline of commercially important fish and invertebrates and 61% listed species that could have been extirpated, with frequent mentions of cartilaginous fish. Declines and extirpations were in line with available quantitative evaluations from stock assessments and international conventions, and were likely linked to fishing impacts. Conversely, half of interviewed fishers claimed that several species had proliferated, such as cephalopods, jellyfish, and small-sized fish. These changes were likely related to trophic cascades due to fishing and due to climate change effects. The species composition of depletions, local extinctions and proliferations showed differences by region, suggesting that regional dynamics are important when analysing biodiversity changes. Conclusions/Significance: Using fishers' perceptions, fishing and ecological changes in the study area were documented. The recovery of local ecological knowledge provides valuable information complementing quantitative monitoring and evaluation surveys. PMID:24465644
Greenberg, L; Cultice, J M
1997-01-01
OBJECTIVE: The Health Resources and Services Administration's Bureau of Health Professions developed a demographic utilization-based model of physician specialty requirements to explore the consequences of a broad range of scenarios pertaining to the nation's health care delivery system on need for physicians. DATA SOURCE/STUDY SETTING: The model uses selected data primarily from the National Center for Health Statistics, the American Medical Association, and the U.S. Bureau of Census. Forecasts are national estimates. STUDY DESIGN: Current (1989) utilization rates for ambulatory and inpatient medical specialty services were obtained for the population according to age, gender, race/ethnicity, and insurance status. These rates are used to estimate specialty-specific total service utilization expressed in patient care minutes for future populations and converted to physician requirements by applying per-physician productivity estimates. DATA COLLECTION/EXTRACTION METHODS: Secondary data were analyzed and put into matrices for use in the mainframe computer-based model. Several missing data points, e.g., for HMO-enrolled populations, were extrapolated from available data by the project's contractor. PRINCIPAL FINDINGS: The authors contend that the Bureau's demographic utilization model represents improvements over other data-driven methodologies that rely on staffing ratios and similar supply-determined bases for estimating requirements. The model's distinct utility rests in offering national-level physician specialty requirements forecasts. PMID:9018213
Entanglement evaluation with atomic Fisher information
NASA Astrophysics Data System (ADS)
Obada, A.-S. F.; Abdel-Khalek, S.
2010-02-01
In this paper, the concept of atomic Fisher information (AFI) is introduced. The marginal distributions of the AFI are defined. This quantity is used as a parameter of entanglement and compared with the linear and atomic Wehrl entropies of the two-level atom. The evolution of the atomic Fisher information and the atomic Wehrl entropy for the pure-state (dissipation-free) case of the Jaynes-Cummings model is analyzed. We demonstrate the connections between these measures.
Mayo, Oliver
2014-06-01
R. A. Fisher spent much of his final 3 years of life in Adelaide. It was a congenial place to live and work, and he was much in demand as a speaker, in Australia and overseas. It was, however, a difficult time for him because of the sustained criticism of fiducial inference from the early 1950s onwards. The article discusses some of Fisher's work on inference from an Adelaide perspective. It also considers some of the successes arising from this time, in the statistics of field experimentation and in evolutionary genetics. A few personal recollections of Fisher as houseguest are provided. This article is the text of an article presented on August 31, 2012 at the 26th International Biometric Conference, Kobe, Japan. © 2014, The International Biometric Society.
NASA Technical Reports Server (NTRS)
Lyster, Peter M.; Guo, J.; Clune, T.; Larson, J. W.; Atlas, Robert (Technical Monitor)
2001-01-01
The computational complexity of algorithms for Four Dimensional Data Assimilation (4DDA) at NASA's Data Assimilation Office (DAO) is discussed. In 4DDA, observations are assimilated with the output of a dynamical model to generate best-estimates of the states of the system. It is thus a mapping problem, whereby scattered observations are converted into regular accurate maps of wind, temperature, moisture and other variables. The DAO is developing and using 4DDA algorithms that provide these datasets, or analyses, in support of Earth System Science research. Two large-scale algorithms are discussed. The first approach, the Goddard Earth Observing System Data Assimilation System (GEOS DAS), uses an atmospheric general circulation model (GCM) and an observation-space based analysis system, the Physical-space Statistical Analysis System (PSAS). GEOS DAS is very similar to global meteorological weather forecasting data assimilation systems, but is used at NASA for climate research. Systems of this size typically run at between 1 and 20 gigaflop/s. The second approach, the Kalman filter, uses a more consistent algorithm to determine the forecast error covariance matrix than does GEOS DAS. For atmospheric assimilation, the gridded dynamical fields typically have more than 10^6 variables; therefore the full error covariance matrix may be in excess of a teraword. For the Kalman filter this problem can easily scale to petaflop/s proportions. We discuss the computational complexity of GEOS DAS and our implementation of the Kalman filter. We also discuss and quantify some of the technical issues and limitations in developing efficient, in terms of wall clock time, and scalable parallel implementations of the algorithms.
Building unbiased estimators from non-gaussian likelihoods with application to shear estimation
Madhavacheril, Mathew S.; McDonald, Patrick; Sehgal, Neelima; ...
2015-01-15
We develop a general framework for generating estimators of a given quantity which are unbiased to a given order in the difference between the true value of the underlying quantity and the fiducial position in theory space around which we expand the likelihood. We apply this formalism to rederive the optimal quadratic estimator and show how the replacement of the second derivative matrix with the Fisher matrix is a generic way of creating an unbiased estimator (assuming choice of the fiducial model is independent of data). Next we apply the approach to estimation of shear lensing, closely following the work of Bernstein and Armstrong (2014). Our first-order estimator reduces to their estimator in the limit of zero shear, but it also naturally allows for the case of non-constant shear and the easy calculation of correlation functions or power spectra using standard methods. Both our first-order estimator and Bernstein and Armstrong's estimator exhibit a bias which is quadratic in true shear. Our third-order estimator is, at least in the realm of the toy problem of Bernstein and Armstrong, unbiased to 0.1% in relative shear errors Δg/g for shears up to |g| = 0.2.
Aggregation of Environmental Model Data for Decision Support
NASA Astrophysics Data System (ADS)
Alpert, J. C.
2013-12-01
Weather forecasts and warnings must be prepared and then delivered so as to reach their intended audience in good time to enable effective decision-making. An effort to mitigate these difficulties was studied at a Workshop, 'Sustaining National Meteorological Services - Strengthening WMO Regional and Global Centers', convened in June 2013 by the World Bank, WMO and the US National Weather Service (NWS). The skill and accuracy of atmospheric forecasts from deterministic models have increased and there are now ensembles of such models that improve decisions to protect life, property and commerce. The NWS production of numerical weather prediction products results in model output from global and high resolution regional ensemble forecasts. Ensembles are constructed by changing the initial conditions to make a 'cloud' of forecasts that attempt to span the space of possible atmospheric realizations, which can quantify not only the most likely forecast, but also the uncertainty. This has led to an unprecedented increase in data production and information content from higher resolution, multi-model output and secondary calculations. One difficulty is to obtain the needed subset of data required to estimate the probability of events, and report the information. Calibration is also needed to reliably estimate the probability of events, along with honing of threshold adjustments to reduce false alarms for decision makers. To meet the future needs of the ever-broadening user community and address these issues on a national and international basis, the weather service implemented the NOAA Operational Model Archive and Distribution System (NOMADS). NOMADS provides real-time and retrospective format independent access to climate, ocean and weather model data and delivers high availability content services as part of NOAA's official real time data dissemination at its new NCWCP web operations center. An important aspect of the server's abilities is to aggregate the matrix of model output, offering access to probability and calibrating information for real time decision making. The aggregation content server reports over ensemble component and forecast time in addition to the other data dimensions of vertical layer and position for each variable. The unpacking, organization and reading of many binary packed files is accomplished most efficiently on the server, while weather element event probability calculations, the thresholds for more accurate decision support, or display remain for the client. Our goal is to reduce uncertainty for variables of interest, e.g., variables of agricultural importance. The weather service operational GFS model ensemble and short range ensemble forecasts can make skillful probability forecasts to alert users if and when their selected weather events will occur. How this framework operates and how it can be implemented using existing NOMADS content services and applications is described.
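The ensemble event-probability step described above can be sketched in a few lines of Python: count the fraction of ensemble members exceeding a threshold at each grid point and apply a decision threshold. The array shapes, synthetic precipitation field, and thresholds are illustrative; this is not NOMADS server code.

```python
import numpy as np

def event_probability(ensemble, threshold):
    """ensemble: array of shape (n_members, ny, nx); returns the fraction of
    members exceeding the threshold at each grid point."""
    exceed = ensemble > threshold
    return exceed.mean(axis=0)

# toy usage with a fake 20-member precipitation field
rng = np.random.default_rng(42)
precip = rng.gamma(shape=2.0, scale=3.0, size=(20, 4, 5))
prob = event_probability(precip, threshold=10.0)   # e.g. 10 mm in 6 h
alert = prob >= 0.4                                 # decision threshold for alerting
print(np.round(prob, 2))
print(alert)
```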
Gasbarra, Dario; Pajevic, Sinisa; Basser, Peter J
2017-01-01
Tensor-valued and matrix-valued measurements of different physical properties are increasingly available in material sciences and medical imaging applications. The eigenvalues and eigenvectors of such multivariate data provide novel and unique information, but at the cost of requiring a more complex statistical analysis. In this work we derive the distributions of eigenvalues and eigenvectors in the special but important case of m×m symmetric random matrices, D , observed with isotropic matrix-variate Gaussian noise. The properties of these distributions depend strongly on the symmetries of the mean tensor/matrix, D̄ . When D̄ has repeated eigenvalues, the eigenvalues of D are not asymptotically Gaussian, and repulsion is observed between the eigenvalues corresponding to the same D̄ eigenspaces. We apply these results to diffusion tensor imaging (DTI), with m = 3, addressing an important problem of detecting the symmetries of the diffusion tensor, and seeking an experimental design that could potentially yield an isotropic Gaussian distribution. In the 3-dimensional case, when the mean tensor is spherically symmetric and the noise is Gaussian and isotropic, the asymptotic distribution of the first three eigenvalue central moment statistics is simple and can be used to test for isotropy. In order to apply such tests, we use quadrature rules of order t ≥ 4 with constant weights on the unit sphere to design a DTI-experiment with the property that isotropy of the underlying true tensor implies isotropy of the Fisher information. We also explain the potential implications of the methods using simulated DTI data with a Rician noise model.
GeneFisher-P: variations of GeneFisher as processes in Bio-jETI
Lamprecht, Anna-Lena; Margaria, Tiziana; Steffen, Bernhard; Sczyrba, Alexander; Hartmeier, Sven; Giegerich, Robert
2008-01-01
Background: PCR primer design is an everyday, but not trivial task requiring state-of-the-art software. We describe the popular tool GeneFisher and explain its recent restructuring using workflow techniques. We apply a service-oriented approach to model and implement GeneFisher-P, a process-based version of the GeneFisher web application, as a part of the Bio-jETI platform for service modeling and execution. We show how to introduce a flexible process layer to meet the growing demand for improved user-friendliness and flexibility. Results: Within Bio-jETI, we model the process using the jABC framework, a mature model-driven, service-oriented process definition platform. We encapsulate remote legacy tools and integrate web services using jETI, an extension of the jABC for seamless integration of remote resources as basic services, ready to be used in the process. Some of the basic services used by GeneFisher are in fact already provided as individual web services at BiBiServ and can be directly accessed. Others are legacy programs, and are made available to Bio-jETI via the jETI technology. The full power of service-based process orientation is required when more bioinformatics tools, available as web services or via jETI, lead to easy extensions or variations of the basic process. This concerns for instance variations of data retrieval or alignment tools as provided by the European Bioinformatics Institute (EBI). Conclusions: The resulting service- and process-oriented GeneFisher-P demonstrates how basic services from heterogeneous sources can be easily orchestrated in the Bio-jETI platform and lead to a flexible family of specialized processes tailored to specific tasks. PMID:18460174
Hallwass, Gustavo; Lopes, Priscila F; Juras, Anastácio A; Silvano, Renato A M
2013-03-01
The long-term impacts of large hydroelectric dams on small-scale fisheries in tropical rivers are poorly known. A promising way to investigate such impacts is to compare and integrate the local ecological knowledge (LEK) of resource users with biological data for the same region. We analyzed the accuracy of fishers' LEK to investigate fisheries dynamics and environmental changes in the Lower Tocantins River (Brazilian Amazon) downstream from a large dam. We estimated fishers' LEK through interviews with 300 fishers in nine villages and collected data on 601 fish landings in five of these villages, 22 years after the dam's establishment (2006-2008). We compared these two databases with each other and with data on fish landings from before the dam's establishment (1981) gathered from the literature. The data obtained based on the fishers' LEK (interviews) and from fisheries agreed regarding the primary fish species caught, the most commonly used type of fishing gear (gill nets) and even the most often used gill net mesh sizes but disagreed regarding seasonal fish abundance. According to the interviewed fishers, the primary environmental changes that occurred after the impoundment were an overall decrease in fish abundance, an increase in the abundance of some fish species and, possibly, the local extinction of a commercial fish species (Semaprochilodus brama). These changes were corroborated by comparing fish landings sampled before and 22 years after the impoundment, which indicated changes in the composition of fish landings and a decrease in the total annual fish production. Our results reinforce the hypothesis that large dams may adversely affect small-scale fisheries downstream and establish a feasible approach for applying fishers' LEK to fisheries management, especially in regions with a low research capacity.
A Volumetric Method for Titrimetric Analysis of Hydrogen Peroxide
1985-05-06
...fairly satisfactory indicator. Sulfuric acid solutions of cerium are stable over long periods of time, unlike the less stable nitric and hydrochloric acid... Fisher number D141-5. 5. Sulfuric acid, concentrated (95-98 percent). For example, Fisher number A 300-212. 6. O-Phosphoric acid, 85 percent. For example, Fisher number A 242-500. 7. 5 N sulfuric acid (Reference 11): Slowly pour 75 mL of concentrated sulfuric acid (95-98 percent) into approximately 200
Liu, Ta-Kang; Wang, Yu-Cheng; Chuang, Laurence Zsu-Hsin; Chen, Chih-How
2016-01-01
The abundance of the eastern Taiwan Strait (ETS) population of the Chinese white dolphin (Sousa chinensis) has been estimated to be less than 100 individuals. It is categorized as critically endangered in the IUCN Red List of Threatened Species. Thus, immediate measures of conservation should be taken to protect it from extinction. Currently, the Taiwanese government plans to designate its habitat as a Major Wildlife Habitat (MWH), a type of marine protected area (MPA) for conservation of wildlife species. Although the designation allows the current exploitation to continue, it may cause conflicts among multiple stakeholders with competing interests. This study explores the attitudes and opinions of the stakeholders in order to better manage the MPA. It employs a semi-structured interview and a questionnaire survey of local fishers. Results from the interviews indicated that the subsistence of fishers remains a major problem. It was found that stakeholders have different perceptions of the fishers' attitude towards conservation and also thought that fishery-related law enforcement could be difficult. The quantitative survey showed that fishers are generally positive towards the conservation of the Chinese white dolphin but are less willing to participate in the planning process. Most fishers considered temporary fishing closures feasible for conservation. The results of this study provide recommendations for future efforts towards the goal of better conservation of this endangered species.
Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations
NASA Astrophysics Data System (ADS)
Given, Gabriel; Grin, Daniel
2018-01-01
Ultra-light axions (ULAs) are a type of dark matter or dark energy candidate (depending on the mass) that are predicted to have a mass between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately the data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher matrix analysis to determine the sensitivity of LSST observations to axion parameters.
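A heavily simplified sketch of the Fisher-matrix step described here is given below, with a toy two-parameter spectrum standing in for the axionCAMB-derived convergence power spectrum; the model function, parameters, Gaussian covariance, and sky fraction are all illustrative assumptions rather than the author's code.

```python
import numpy as np

def cl_model(params, ells):
    # toy convergence power spectrum: amplitude and tilt, not ULA physics
    amp, tilt = params
    return amp * 1e-9 * (ells / 100.0) ** tilt

def fisher_matrix(params, ells, fsky=0.5, step=1e-3):
    params = np.asarray(params, float)
    cl = cl_model(params, ells)
    # Gaussian covariance per multipole: 2 C_ell^2 / ((2l + 1) fsky)
    var = 2.0 * cl**2 / ((2 * ells + 1) * fsky)
    derivs = []
    for i in range(len(params)):
        dp = np.zeros_like(params)
        dp[i] = step * max(abs(params[i]), 1.0)
        derivs.append((cl_model(params + dp, ells)
                       - cl_model(params - dp, ells)) / (2 * dp[i]))
    F = np.zeros((len(params), len(params)))
    for i in range(len(params)):
        for j in range(len(params)):
            F[i, j] = np.sum(derivs[i] * derivs[j] / var)
    return F

ells = np.arange(30, 2000)
F = fisher_matrix([1.0, -1.0], ells)
print(np.sqrt(np.diag(np.linalg.inv(F))))   # forecast 1-sigma parameter errors
```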
Probability distribution of the entanglement across a cut at an infinite-randomness fixed point
NASA Astrophysics Data System (ADS)
Devakul, Trithep; Majumdar, Satya N.; Huse, David A.
2017-03-01
We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ∼ L^{-ψ(k)}, where k ≡ S/ln[L/L0], the large deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution on numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.
Operator mixing in the ɛ -expansion: Scheme and evanescent-operator independence
NASA Astrophysics Data System (ADS)
Di Pietro, Lorenzo; Stamou, Emmanuel
2018-03-01
We consider theories with fermionic degrees of freedom that have a fixed point of Wilson-Fisher type in noninteger dimension d = 4 - 2ɛ. Due to the presence of evanescent operators, i.e., operators that vanish in integer dimensions, these theories contain families of infinitely many operators that can mix with each other under renormalization. We clarify the dependence of the corresponding anomalous-dimension matrix on the choice of renormalization scheme beyond leading order in the ɛ-expansion. In standard choices of scheme, we find that eigenvalues at the fixed point cannot be extracted from a finite-dimensional block. We illustrate in examples a truncation approach to compute the eigenvalues. These are observable scaling dimensions, and, indeed, we find that the dependence on the choice of scheme cancels. As an application, we obtain the IR scaling dimension of four-fermion operators in QED in d = 4 - 2ɛ at order O(ɛ^2).
Model-based design of experiments for cellular processes.
Chakrabarty, Ankush; Buzzard, Gregery T; Rundell, Ann E
2013-01-01
Model-based design of experiments (MBDOE) assists in the planning of highly effective and efficient experiments. Although the foundations of this field are well-established, the application of these techniques to understand cellular processes is a fertile and rapidly advancing area as the community seeks to understand ever more complex cellular processes and systems. This review discusses the MBDOE paradigm along with applications and challenges within the context of cellular processes and systems. It also provides a brief tutorial on Fisher information matrix (FIM)-based and Bayesian experiment design methods along with an overview of existing software packages and computational advances that support MBDOE application and adoption within the Systems Biology community. As cell-based products and biologics progress into the commercial sector, it is anticipated that MBDOE will become an essential practice for design, quality control, and production. Copyright © 2013 Wiley Periodicals, Inc.
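To make the FIM-based design idea concrete, the following sketch scores candidate sampling schedules for a hypothetical two-parameter decay model by the determinant of the Fisher information matrix (D-optimality). The model, noise level, and candidate schedules are illustrative assumptions, not taken from the review or any specific software package.

```python
import numpy as np

def sensitivities(model, theta, t_pts, h=1e-6):
    """Central-difference sensitivities dy/dtheta_j at each time point."""
    theta = np.asarray(theta, dtype=float)
    S = np.zeros((len(t_pts), theta.size))
    for j in range(theta.size):
        up, dn = theta.copy(), theta.copy()
        up[j] += h
        dn[j] -= h
        S[:, j] = (model(up, t_pts) - model(dn, t_pts)) / (2 * h)
    return S

def fim(S, sigma=0.05):
    """FIM for independent Gaussian measurement noise of standard deviation sigma."""
    return S.T @ S / sigma**2

# Hypothetical two-parameter decay model standing in for a cellular process
model = lambda th, t: th[0] * np.exp(-th[1] * t)
theta = np.array([1.0, 0.3])
candidates = [np.linspace(0, 5, 6), np.linspace(0, 20, 6), np.linspace(5, 10, 6)]
# D-optimal design: pick the candidate sampling schedule maximising det(FIM)
scores = [np.linalg.det(fim(sensitivities(model, theta, t))) for t in candidates]
best = candidates[int(np.argmax(scores))]
```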
MIXOR: a computer program for mixed-effects ordinal regression analysis.
Hedeker, D; Gibbons, R D
1996-03-01
MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
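MIXOR's estimation relies on a Fisher-scoring solution; the sketch below shows the same iteration for a much simpler case, plain binary logistic regression, where Fisher scoring coincides with iteratively reweighted least squares. The data and convergence settings are illustrative, and the mixed-effects ordinal machinery (random effects, thresholds, Cholesky factor) is deliberately omitted.

```python
import numpy as np

def fisher_scoring_logistic(X, y, n_iter=25, tol=1e-8):
    """Fisher scoring (IRLS) for plain binary logistic regression.

    Update: beta <- beta + I(beta)^{-1} * score, where the expected
    information is I = X^T W X with W = diag(p * (1 - p)).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        score = X.T @ (y - p)
        info = (X * (p * (1 - p))[:, None]).T @ X
        step = np.linalg.solve(info, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta, np.linalg.inv(info)   # estimates and their covariance

# Toy data with known coefficients (0.5, 1, -2)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
y = (rng.random(500) < 1 / (1 + np.exp(-(0.5 + X[:, 1] - 2 * X[:, 2])))).astype(float)
beta_hat, cov = fisher_scoring_logistic(X, y)
```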
NASA Technical Reports Server (NTRS)
Littenberg, T. B.; Larson, S. L.; Nelemans, G.; Cornish, N. J.
2012-01-01
Space-based gravitational wave interferometers are sensitive to the galactic population of ultracompact binaries. An important subset of the ultracompact binary population are those stars that can be individually resolved by both gravitational wave interferometers and electromagnetic telescopes. The aim of this paper is to quantify the multimessenger potential of space-based interferometers with arm-lengths between 1 and 5 Gm. The Fisher information matrix is used to estimate the number of binaries from a model of the Milky Way which are localized on the sky by the gravitational wave detector to within 1 and 10 deg^2 and bright enough to be detected by a magnitude-limited survey. We find, depending on the choice of GW detector characteristics, limiting magnitude and observing strategy, that up to several hundred gravitational wave sources could be detected in electromagnetic follow-up observations.
O'Connell's process as a vicious Brownian motion.
Katori, Makoto
2011-12-01
Vicious Brownian motion is a diffusion scaling limit of Fisher's vicious walk model, which is a system of Brownian particles in one dimension such that if two motions meet they kill each other. We consider the vicious Brownian motions conditioned never to collide with each other and call it noncolliding Brownian motion. This conditional diffusion process is equivalent to the eigenvalue process of the Hermitian-matrix-valued Brownian motion studied by Dyson [J. Math. Phys. 3, 1191 (1962)]. Recently, O'Connell [Ann. Probab. (to be published)] introduced a generalization of the noncolliding Brownian motion by using the eigenfunctions (the Whittaker functions) of the quantum Toda lattice in order to analyze a directed polymer model in 1 + 1 dimensions. We consider a system of one-dimensional Brownian motions with a long-ranged killing term as a generalization of the vicious Brownian motion and construct the O'Connell process as a conditional process of the killing Brownian motions to survive forever.
Mapping the Milky Way Galaxy with LISA
NASA Technical Reports Server (NTRS)
McKinnon, Jose A.; Littenberg, Tyson
2012-01-01
Gravitational wave detectors in the mHz band (such as the Laser Interferometer Space Antenna, or LISA) will observe thousands of compact binaries in the galaxy which can be used to better understand the structure of the Milky Way. To test the effectiveness of LISA in measuring the distribution of the galaxy, we simulated the Close White Dwarf Binary (CWDB) gravitational wave sky using different models for the Milky Way. To do so, we have developed a galaxy density distribution modeling code based on the Markov Chain Monte Carlo method. The code uses different distributions to construct realizations of the galaxy. We then use the Fisher Information Matrix to estimate the variance and covariance of the recovered parameters for each detected CWDB. This is the first step toward characterizing the capabilities of space-based gravitational wave detectors to constrain models for galactic structure, such as the size and orientation of the bar in the center of the Milky Way.
LISA verification binaries with updated distances from Gaia Data Release 2
NASA Astrophysics Data System (ADS)
Kupfer, T.; Korol, V.; Shah, S.; Nelemans, G.; Marsh, T. R.; Ramsay, G.; Groot, P. J.; Steeghs, D. T. H.; Rossi, E. M.
2018-06-01
Ultracompact binaries with orbital periods less than a few hours will dominate the gravitational wave signal in the mHz regime. Until recently, 10 systems were expected to have a predicted gravitational wave signal strong enough to be detectable by the Laser Interferometer Space Antenna (LISA), the so-called `verification binaries'. System parameters, including distances, are needed to provide an accurate prediction of the expected gravitational wave strength to be measured by LISA. Using parallaxes from Gaia Data Release 2 we calculate signal-to-noise ratios (SNR) for ≈50 verification binary candidates. We find that 11 binaries reach an SNR ≥ 20, two further binaries reach an SNR ≥ 5, and three more systems are expected to have an SNR ≈ 5 after four years of integration with LISA. For these 16 systems we present predictions of the gravitational wave amplitude (A) and the parameter uncertainties on the amplitude (A) and inclination (ι) from a Fisher information matrix analysis.
Correlated Topic Vector for Scene Classification.
Wei, Pengxu; Qin, Fei; Wan, Fang; Zhu, Yi; Jiao, Jianbin; Ye, Qixiang
2017-07-01
Scene images usually involve semantic correlations, particularly when considering large-scale image data sets. This paper proposes a novel generative image representation, the correlated topic vector, to model such semantic correlations. Derived from the correlated topic model, the correlated topic vector is intended to naturally exploit the correlations among topics, which are seldom considered in conventional feature encoding, e.g., the Fisher vector, but do exist in scene images. It is expected that the involvement of correlations can increase the discriminative capability of the learned generative model and consequently improve the recognition accuracy. Incorporated with the Fisher kernel method, the correlated topic vector inherits the advantages of the Fisher vector. The contributions to the topics of visual words are further exploited by incorporating the Fisher kernel framework to indicate the differences among scenes. Combined with deep convolutional neural network (CNN) features and a Gibbs sampling solution, the correlated topic vector shows great potential when processing large-scale and complex scene image data sets. Experiments on two scene image data sets demonstrate that the correlated topic vector significantly improves on the deep CNN features and outperforms existing Fisher kernel-based features.
Tang, Bing H
2009-10-01
This review article discusses and analyzes the background and findings of the Fisher-Mendel controversy in genetics, elucidating the scientific argument and intellectual integrity involved, their importance in a fair society, and the lessons to be learned from the decline of the West. At the outset, the kernel of the Mendel-Fisher controversy is dissected and identified. The review then documents an organizational restructuring that never reached a happy synchronization in the years following 1933, when Fisher succeeded Karl Pearson both as the Francis Galton Professor of Eugenics and as head of the Galton Laboratory at University College London. The academic style of eugenics in the late 19th and early 20th centuries in the UK is then introduced, and Fisher's ideology at that time, together with its effects on the human value system and policy-making at that juncture, is portrayed. A bioethical assessment is provided. Lessons from history, the emergence of the Eastern phenomenon, and the decline of Western power are outlined.
Coherence, quantum Fisher information, superradiance, and entanglement as interconvertible resources
NASA Astrophysics Data System (ADS)
Tan, Kok Chuan; Choi, Seongjeon; Kwon, Hyukjoon; Jeong, Hyunseok
2018-05-01
We demonstrate that quantum Fisher information and superradiance can be formulated as coherence measures in accordance with the resource theory of coherence, thus establishing a direct link between metrological resources, superradiance, and coherence. The arguments are generalized to show that coherence may be considered as the underlying fundamental resource for any functional of state that is first of all faithful, and second, concave or linear. It is also shown that quantum Fisher information and the superradiant quantity are in fact antithetical resources in the sense that if coherence were directed to saturate one quantity, then it must come at the expense of the other. Finally, a key result of the paper is to demonstrate that coherence, quantum Fisher information, superradiant quantity, and entanglement are mutually interconvertible resources under incoherent operations.
Carbonell, Eliseu
2014-03-01
Much has been written in recent years about the crisis in fisheries caused by the critical reduction in catches and about the strategies developed by local communities of fishers in response. The aim of this article is to demonstrate that the use of maritime heritage can also be considered part of these strategies. Like fishers elsewhere, Catalan small-scale fishers face severe threats to their professional survival. Recently some of them have become involved in activities related to maritime heritage as a strategy to draw the attention of policy makers and the general public to their problems, a strategy not without clear contradictions. But beyond these contradictions, the article points out the opportunities that the use of maritime heritage offers to fishers in Catalonia as well as elsewhere.
Low Titers of Canine Distemper Virus Antibody in Wild Fishers (Martes pennanti) in the Eastern USA.
Peper, Steven T; Peper, Randall L; Mitcheltree, Denise H; Kollias, George V; Brooks, Robert P; Stevens, Sadie S; Serfass, Thomas L
2016-01-01
Canine distemper virus (CDV) infects species in the order Carnivora. Members of the family Mustelidae are among the species most susceptible to CDV and have a high mortality rate after infection. Assessing an animal's pathogen or disease load prior to any reintroduction project is important to help protect the animal being reintroduced, as well as the wildlife and livestock in the area of relocation. We screened 58 fishers for CDV antibody prior to their release into Pennsylvania, US, as part of a reintroduction program. Five of the 58 (9%) fishers had a weak-positive reaction for CDV antibody at a dilution of 1:16. None of the fishers exhibited any clinical sign of canine distemper while being held prior to release.
Probabilistic Predictions of PM2.5 Using a Novel Ensemble Design for the NAQFC
NASA Astrophysics Data System (ADS)
Kumar, R.; Lee, J. A.; Delle Monache, L.; Alessandrini, S.; Lee, P.
2017-12-01
Poor air quality (AQ) in the U.S. is estimated to cause about 60,000 premature deaths, with costs of $100B-$150B annually. To reduce such losses, the National AQ Forecasting Capability (NAQFC) at the National Oceanic and Atmospheric Administration (NOAA) produces forecasts of ozone, particulate matter less than 2.5 μm in diameter (PM2.5), and other pollutants so that advance notice and warning can be issued to help individuals and communities limit exposure and reduce air pollution-caused health problems. The current NAQFC, based on the U.S. Environmental Protection Agency Community Multi-scale AQ (CMAQ) modeling system, provides only deterministic AQ forecasts and does not quantify the uncertainty associated with the predictions, which could be large due to the chaotic nature of the atmosphere and nonlinearity in atmospheric chemistry. This project aims to take NAQFC a step further in the direction of probabilistic AQ prediction by exploring and quantifying the potential value of ensemble predictions of PM2.5, perturbing three key aspects of PM2.5 modeling: the meteorology, the emissions, and the CMAQ secondary organic aerosol formulation. This presentation focuses on the impact of meteorological variability, which is represented by three members of NOAA's Short-Range Ensemble Forecast (SREF) system that were down-selected by hierarchical cluster analysis. These three SREF members provide the physics configurations and initial/boundary conditions for the Weather Research and Forecasting (WRF) model runs that generate required output variables for driving CMAQ that are missing in operational SREF output. We conducted WRF runs for Jan, Apr, Jul, and Oct 2016 to capture seasonal changes in meteorology. Estimated emissions of trace gases and aerosols via the Sparse Matrix Operator Kernel (SMOKE) system were developed using the WRF output. WRF and SMOKE output drive a 3-member CMAQ mini-ensemble of once-daily, 48-h PM2.5 forecasts for the same four months. The CMAQ mini-ensemble is evaluated against both observations and the current operational deterministic NAQFC products, and analyzed to assess the impact of meteorological biases on PM2.5 variability. Quantification of the PM2.5 prediction uncertainty will prove a key factor in supporting cost-effective decision-making while protecting public health.
NASA Astrophysics Data System (ADS)
Wilcox, C.; Ford, J.
2016-12-01
Crimes involving fishers impose significant costs on fisheries, managers and national governments. These crimes also lead to unsustainable harvesting practices, as they undermine both knowledge of the status of fisheries stocks and limits on their harvesting. One of the greatest contributors to fisheries crimes globally is the transfer of fish catch among vessels, otherwise known as transshipment. While legal transshipment provides economic advantages to vessels by increasing their efficiency, illegal transshipment can allow them to avoid regulations, catch prohibited species, and fish with impunity in prohibited locations such as the waters of foreign countries. Despite the presence of a number of monitoring technologies for tracking fishing vessels, transshipment is frequently done clandestinely. Here we present a statistical model for transshipment in a Southeast Asian tuna fishery. We utilize both spatial and temporal information on vessel movement patterns in a statistical model to infer unobserved transshipment events among vessels. We provide a risk analysis framework for forecasting likely transshipment events, based on our analysis of vessel movement patterns. The tools we present are widely applicable to a variety of fisheries and types of tracking data, allowing managers to more effectively screen the large volume of data that tracking systems create and quickly identify suspicious behavior.
Phase space gradient of dissipated work and information: A role of relative Fisher information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamano, Takuya, E-mail: yamano@amy.hi-ho.ne.jp
2013-11-15
We show that an information theoretic distance measured by the relative Fisher information between canonical equilibrium phase densities corresponding to forward and backward processes is intimately related to the gradient of the dissipated work in phase space. We present a universal constraint on it via the logarithmic Sobolev inequality. Furthermore, we point out that a possible expression of the lower bound indicates a deep connection in terms of the relative entropy and the Fisher information of the canonical distributions.
Rényi-Fisher entropy product as a marker of topological phase transitions
NASA Astrophysics Data System (ADS)
Bolívar, J. C.; Nagy, Ágnes; Romera, Elvira
2018-05-01
The combined Rényi-Fisher entropy product of electrons plus holes displays a minimum at the charge neutrality points. The Stam-Rényi difference and the Stam-Rényi uncertainty product of the electrons plus holes show maxima at the charge neutrality points. Topological quantum numbers capable of detecting the topological insulator and band insulator phases are defined. Upper and lower bounds for the position and momentum space Rényi-Fisher entropy products are derived.
Preparation of Radiochemical-Labeled Compounds for the U.S. Army Drug Development Program
1992-04-20
(Report text garbled in extraction; recoverable details follow.) The work-up comprised (a) treatment with hydrochloric acid, (b) extraction with ether, (c) basification with potassium carbonate, and (d) extraction with ether, after which the crude product was isolated. Listed reagents include methylene chloride (Fisher), basic alumina (Woelm), and hydrochloric acid (Fisher). Syntheses of [4-14C]WR-242511 and [6-14C]ertelinic acid were completed; a total of 31 mCi of [4-14C]WR-238605 was prepared with a specific activity of 21 mCi.
Conservation biology. Galápagos station survives latest attack by fishers.
Ferber, D
2000-12-15
Researchers at the Darwin Research Station are attempting to put the pieces back together after a festering dispute over fishing quotas turned violent between 13 and 17 November. The fuse that set off the most recent conflagration was an annual 50-ton limit on spiny lobsters that local fishers reached barely halfway into the 4-month season. Unruly bands of fishers laid siege to the station and the park service, blocked roads and offices, tore down the island's telephone antenna, and destroyed research records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... penalties of section 106 of the Act. (e) For fishers operating in Category I or II fisheries, failure to.... (f) For fishers operating in Category III fisheries, failure to report all incidental injuries and...
75 FR 61424 - Endangered Species; File No. 15596
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-05
... North Carolina Aquarium at Fort Fisher, North Carolina Department of Environment and Natural Resources... (50 CFR 222-226). The North Carolina Aquarium at Fort Fisher is requesting a permit to continue...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barack, Leor; Cutler, Curt; Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California 91109
Inspirals of stellar-mass compact objects (COs) into ~10^6 M_⊙ black holes are especially interesting sources of gravitational waves for the planned Laser Interferometer Space Antenna (LISA). The orbits of these extreme-mass-ratio inspirals (EMRIs) are highly relativistic, displaying extreme versions of both perihelion precession and Lense-Thirring precession of the orbital plane. We investigate the question of whether the emitted waveforms can be used to strongly constrain the geometry of the central massive object, and in essence check that it corresponds to a Kerr black hole (BH). For a Kerr BH, all multipole moments of the spacetime have a simple, unique relation to M and S, the BH mass and spin; in particular, the spacetime's mass quadrupole moment Q is given by Q = -S^2/M. Here we treat Q as an additional parameter, independent of S and M, and ask how well observation can constrain its difference from the Kerr value. This was already estimated by Ryan, but for the simplified case of circular, equatorial orbits, and Ryan also neglected the signal modulations arising from the motion of the LISA satellites. We consider generic orbits and include the modulations due to the satellite motions. For this analysis, we use a family of approximate (basically post-Newtonian) waveforms, which represent the full parameter space of EMRI sources, and which exhibit the main qualitative features of true, general relativistic waveforms. We extend this parameter space to include (in an approximate manner) an arbitrary value of Q, and then construct the Fisher information matrix for the extended parameter space. By inverting the Fisher matrix, we estimate how accurately Q could be extracted from LISA observations of EMRIs. For 1 yr of coherent data from the inspiral of a 10 M_⊙ black hole into rotating black holes of masses 10^5.5 M_⊙, 10^6 M_⊙, or 10^6.5 M_⊙, we find Δ(Q/M^3) ≈ 10^-4, 10^-3, or 10^-2, respectively (assuming a total signal-to-noise ratio of 100, typical of the brightest detectable EMRIs). These results depend only weakly on the eccentricity of the inspiral orbit or the spin of the central object.
75 FR 72793 - National Saltwater Angler Registry Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-26
... fifteen dollars ($15.00) for registration of anglers, spear fishers and for-hire fishing vessels to... anglers, spear fishers and for-hire fishing vessels will be fifteen dollars ($15.00). All persons...
Random matrix theory filters in portfolio optimisation: A stability and risk assessment
NASA Astrophysics Data System (ADS)
Daly, J.; Crane, M.; Ruskin, H. J.
2008-07-01
Random matrix theory (RMT) filters, applied to covariance matrices of financial returns, have recently been shown to offer improvements to the optimisation of stock portfolios. This paper studies the effect of three RMT filters on the realised portfolio risk, and on the stability of the filtered covariance matrix, using bootstrap analysis and out-of-sample testing. We propose an extension to an existing RMT filter (based on Krzanowski stability), which is observed to reduce risk and increase stability, when compared to other RMT filters tested. We also study a scheme for filtering the covariance matrix directly, as opposed to the standard method of filtering correlation, where the latter is found to lower the realised risk, on average, by up to 6.7%. We consider both equally and exponentially weighted covariance matrices in our analysis, and observe that the overall best method out-of-sample was that of the exponentially weighted covariance, with our Krzanowski stability-based filter applied to the correlation matrix. We also find that the optimal out-of-sample decay factors, for both filtered and unfiltered forecasts, were higher than those suggested by Riskmetrics [J.P. Morgan, Reuters, Riskmetrics technical document, Technical Report, 1996. http://www.riskmetrics.com/techdoc.html], with those for the latter approaching a value of α=1. In conclusion, RMT filtering reduced the realised risk, on average, and in the majority of cases when tested out-of-sample, but increased the realised risk on a marked number of individual days, in some cases more than doubling it.
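For readers unfamiliar with RMT filtering, the sketch below applies the most common variant, eigenvalue clipping against the Marchenko-Pastur bulk, to a correlation matrix of toy returns and then builds minimum-variance weights from it. This is a generic illustration, not the Krzanowski-stability-based extension proposed in the paper; the return series, dimensions, and clipping rule are assumptions.

```python
import numpy as np

def rmt_clip_correlation(returns):
    """Filter a correlation matrix by flattening eigenvalues that fall
    inside the Marchenko-Pastur bulk predicted for pure noise."""
    T, N = returns.shape
    C = np.corrcoef(returns, rowvar=False)
    lam_max = (1.0 + np.sqrt(N / T)) ** 2          # MP upper edge (unit variance)
    w, V = np.linalg.eigh(C)
    noise = w < lam_max
    w_filtered = w.copy()
    if noise.any():
        w_filtered[noise] = w[noise].mean()        # flatten noise band, preserve trace
    C_f = V @ np.diag(w_filtered) @ V.T
    d = np.sqrt(np.diag(C_f))
    return C_f / np.outer(d, d)                    # re-normalise to unit diagonal

# Usage: minimum-variance weights from the filtered correlation matrix
rng = np.random.default_rng(1)
R = rng.normal(scale=0.01, size=(500, 50))         # toy daily returns
C = rmt_clip_correlation(R)
ones = np.ones(C.shape[0])
w = np.linalg.solve(C, ones)
w /= w.sum()
```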
Macroscopic response to microscopic intrinsic noise in three-dimensional Fisher fronts.
Nesic, S; Cuerno, R; Moro, E
2014-10-31
We study the dynamics of three-dimensional Fisher fronts in the presence of density fluctuations. To this end we simulate the Fisher equation subject to stochastic internal noise, and study how the front moves and roughens as a function of the number of particles in the system, N. Our results suggest that the macroscopic behavior of the system is driven by the microscopic dynamics at its leading edge where number fluctuations are dominated by rare events. Contrary to naive expectations, the strength of front fluctuations decays extremely slowly as 1/logN, inducing large-scale fluctuations which we find belong to the one-dimensional Kardar-Parisi-Zhang universality class of kinetically rough interfaces. Hence, we find that there is no weak-noise regime for Fisher fronts, even for realistic numbers of particles in macroscopic systems.
On Stabilizing the Variance of Dynamic Functional Brain Connectivity Time Series.
Thompson, William Hedley; Fransson, Peter
2016-12-01
Assessment of dynamic functional brain connectivity based on functional magnetic resonance imaging (fMRI) data is an increasingly popular strategy to investigate temporal dynamics of the brain's large-scale network architecture. Current practice when deriving connectivity estimates over time is to use the Fisher transformation, which aims to stabilize the variance of correlation values that fluctuate around varying true correlation values. It is, however, unclear how well the stabilization of signal variance performed by the Fisher transformation works for each connectivity time series, when the true correlation is assumed to be fluctuating. This is of importance because many subsequent analyses either assume or perform better when the time series have stable variance or adhere to an approximate Gaussian distribution. In this article, using simulations and analysis of resting-state fMRI data, we analyze the effect of applying different variance stabilization strategies on connectivity time series. We focus our investigation on the Fisher transformation, the Box-Cox (BC) transformation and an approach that combines both transformations. Our results show that, if the intention of stabilizing the variance is to use metrics on the time series, where stable variance or a Gaussian distribution is desired (e.g., clustering), the Fisher transformation is not optimal and may even skew connectivity time series away from being Gaussian. Furthermore, we show that the suboptimal performance of the Fisher transformation can be substantially improved by including an additional BC transformation after the dynamic functional connectivity time series has been Fisher transformed.
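A minimal sketch of the combined strategy discussed above: compute a sliding-window correlation series, apply the Fisher transformation (arctanh), and then fit a Box-Cox transformation to the result. The synthetic signals, window length, and the positivity shift applied before the Box-Cox step are simplifying assumptions of this illustration, not the authors' fMRI pipeline.

```python
import numpy as np
from scipy import stats

def sliding_corr(x, y, win=30):
    """Sliding-window Pearson correlation between two BOLD-like series."""
    return np.array([np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
                     for i in range(len(x) - win)])

rng = np.random.default_rng(2)
x, y = rng.normal(size=1000), rng.normal(size=1000)
r_t = sliding_corr(x, y)

z_t = np.arctanh(r_t)                  # Fisher transformation of the correlation series
shifted = z_t - z_t.min() + 1e-3       # Box-Cox needs positive input (simplification here)
z_bc, lam = stats.boxcox(shifted)      # additional Box-Cox step, lambda fitted by MLE
```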
Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation
NASA Astrophysics Data System (ADS)
Fichtner, A.; van Dinther, Y.; Kuensch, H. R.
2017-12-01
Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, is significantly hampered by limited indications on the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation - extensively developed for weather forecasting - to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept we perform a perfect model test in a simplified subduction zone setup, where we assimilate synthetic noised data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible, because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This thus provides distinct added value with respect to using observations or numerical models separately. Although several challenges for applications to a natural setting remain, these first results indicate the large potential of data assimilation techniques for probabilistic seismic hazard assessment and other challenges in dynamic solid earth systems.
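The assimilation step described above can be illustrated with a generic perturbed-observation Ensemble Kalman Filter update; the sketch below is not the authors' seismic-cycle model, and the state size, observation operator, and error covariances are arbitrary placeholders.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Perturbed-observation EnKF analysis step.

    X : (n_state, n_ens) prior ensemble
    y : (n_obs,) observation vector
    H : (n_obs, n_state) observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                   # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                    # sampled error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R) # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)                   # analysis ensemble

# Toy usage: 150 members, 3-variable state, observe only the first variable
rng = np.random.default_rng(3)
X = rng.normal(size=(3, 150))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([0.5]), H, R, rng)
```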
Evaluation of the Thermo Scientific™ SureTect™ Salmonella species Assay.
Cloke, Jonathan; Clark, Dorn; Radcliff, Roy; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko
2014-03-01
The Thermo Scientific™ SureTect™ Salmonella species Assay is a new real-time PCR assay for the detection of Salmonellae in food and environmental samples. This validation study was conducted using the AOAC Research Institute (RI) Performance Tested Methods℠ program to validate the SureTect Salmonella species Assay in comparison to the reference method detailed in International Organization for Standardization 6579:2002 in a variety of food matrixes, namely, raw ground beef, raw chicken breast, raw ground pork, fresh bagged lettuce, pork frankfurters, nonfat dried milk powder, cooked peeled shrimp, pasteurized liquid whole egg, ready-to-eat meal containing beef, and stainless steel surface samples. With the exception of liquid whole egg and fresh bagged lettuce, which were tested in-house, all matrixes were tested by Marshfield Food Safety, Marshfield, WI, on behalf of Thermo Fisher Scientific. In addition, three matrixes (pork frankfurters, lettuce, and stainless steel surface samples) were analyzed independently as part of the AOAC-RI-controlled laboratory study by the University of Guelph, Canada. No significant difference by probability of detection or McNemar's Chi-squared statistical analysis was found between the candidate and reference methods for any of the food matrixes or environmental surface samples tested during the validation study. Inclusivity and exclusivity testing was conducted with 117 and 36 isolates, respectively, which demonstrated that the SureTect Salmonella species Assay was able to detect all the major groups of Salmonella enterica subspecies enterica (e.g., Typhimurium) and the less common subspecies of S. enterica (e.g., arizonae) and the rarely encountered S. bongori. None of the exclusivity isolates analyzed were detected by the SureTect Salmonella species Assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside of the recommended parameters open to variation (enrichment time and temperature, and lysis temperature), which demonstrated that the assay gave reliable performance. Accelerated stability testing was additionally conducted, validating the assay shelf life.
NASA Astrophysics Data System (ADS)
Jiang, Guo-Qing; Xu, Jing; Wei, Jun
2018-04-01
Two algorithms based on machine-learning neural networks are proposed, the shallow learning (S-L) and deep learning (D-L) algorithms, which can potentially be used in atmosphere-only typhoon forecast models to provide flow-dependent typhoon-induced sea surface temperature cooling (SSTC) for improving typhoon predictions. The major challenge for existing SSTC algorithms in forecast models is how to accurately predict the SSTC induced by an upcoming typhoon, which requires information not only from historical data but, more importantly, also from the target typhoon itself. The S-L algorithm consists of a single layer of neurons with mixed atmospheric and oceanic factors. Such a structure is found to be unable to represent the physical typhoon-ocean interaction correctly. It tends to produce an unstable SSTC distribution, for which any perturbations may lead to changes in both SSTC pattern and strength. The D-L algorithm extends the neural network to a 4 × 5 neuron matrix with atmospheric and oceanic factors separated in different layers of neurons, so that the machine learning can determine the roles of atmospheric and oceanic factors in shaping the SSTC. It therefore produces a stable crescent-shaped SSTC distribution, with its large-scale pattern determined mainly by atmospheric factors (e.g., winds) and small-scale features by oceanic factors (e.g., eddies). Sensitivity experiments reveal that the D-L algorithm reduces maximum wind intensity errors by 60-70% in four case study simulations, compared to the corresponding atmosphere-only model runs.
NASA Astrophysics Data System (ADS)
Durazo, Juan A.; Kostelich, Eric J.; Mahalov, Alex
2017-09-01
We propose a targeted observation strategy, based on the influence matrix diagnostic, that optimally selects where additional observations may be placed to improve ionospheric forecasts. This strategy is applied in data assimilation observing system experiments, where synthetic electron density vertical profiles, which represent those of Constellation Observing System for Meteorology, Ionosphere, and Climate/Formosa satellite 3, are assimilated into the Thermosphere-Ionosphere-Electrodynamics General Circulation Model using the local ensemble transform Kalman filter during the 26 September 2011 geomagnetic storm. During each analysis step, the observation vector is augmented with five synthetic vertical profiles optimally placed to target electron density errors, using our targeted observation strategy. Forecast improvement due to assimilation of augmented vertical profiles is measured with the root-mean-square error (RMSE) of analyzed electron density, averaged over 600 km regions centered around the augmented vertical profile locations. Assimilating vertical profiles with targeted locations yields about 60%-80% reduction in electron density RMSE, compared to a 15% average reduction when assimilating randomly placed vertical profiles. Assimilating vertical profiles whose locations target the zonal component of neutral winds (Un) yields on average a 25% RMSE reduction in Un estimates, compared to a 2% average improvement obtained with randomly placed vertical profiles. These results demonstrate that our targeted strategy can improve data assimilation efforts during extreme events by detecting regions where additional observations would provide the largest benefit to the forecast.
NASA Astrophysics Data System (ADS)
Cafaro, Carlo; Alsing, Paul M.
2018-04-01
The relevance of the concept of Fisher information is increasing in both statistical physics and quantum computing. From a statistical mechanical standpoint, the application of Fisher information in the kinetic theory of gases is characterized by its decrease along the solutions of the Boltzmann equation for Maxwellian molecules in the two-dimensional case. From a quantum mechanical standpoint, the output state in Grover's quantum search algorithm follows a geodesic path obtained from the Fubini-Study metric on the manifold of Hilbert-space rays. Additionally, Grover's algorithm is specified by constant Fisher information. In this paper, we present an information geometric characterization of the oscillatory or monotonic behavior of statistically parametrized squared probability amplitudes originating from special functional forms of the Fisher information function: constant, exponential decay, and power-law decay. Furthermore, for each case, we compute both the computational speed and the availability loss of the corresponding physical processes by exploiting a convenient Riemannian geometrization of useful thermodynamical concepts. Finally, we briefly comment on the possibility of using the proposed methods of information geometry to help identify a suitable trade-off between speed and thermodynamic efficiency in quantum search algorithms.
Threatened fish and fishers along the Brazilian Atlantic Forest Coast.
Begossi, Alpina; Salivonchyk, Svetlana; Hallwass, Gustavo; Hanazaki, Natalia; Lopes, Priscila F M; Silvano, Renato A M
2017-12-01
Small-scale fisheries of the Brazilian Atlantic Forest Coast (BAFC) depend on fish resources for food and income. Thus, if the catch diminishes or if fish species that are a target for fishers are overexploited or impacted, this could affect fishers' livelihoods. The exclusion of threatened fish species from the catch is believed to be a threat to small-scale fisheries, which is likely to be the case along the BAFC. Many fish species are currently listed as threatened or vulnerable, whereas there is not enough biological information available to determine the status of the majority of the other species. Failure to protect the BAFC biodiversity might negatively impact fishers' income and the regional economy of local small-scale fisheries. We collected data from 1986 to 2009 through 347 interviews and 24-h food recall surveys at seven southeastern coastal sites of the Atlantic Forest. We show that important species of consumed fish are currently threatened: of the 65 species mentioned by fishers as the most consumed fishes, 33% are decreasing and 54% have an unknown status. Thus, biological and ecological data for BAFC marine species are urgently needed, along with co-management, to promote fish conservation.
Impact of catch shares on diversification of fishers' income and risk.
Holland, Daniel S; Speir, Cameron; Agar, Juan; Crosson, Scott; DePiper, Geret; Kasperski, Stephen; Kitts, Andrew W; Perruso, Larry
2017-08-29
Many fishers diversify their income by participating in multiple fisheries, which has been shown to significantly reduce year-to-year variation in income. The ability of fishers to diversify has become increasingly constrained in the last few decades, and catch share programs could further reduce diversification as a result of consolidation. This could increase income variation and thus financial risk. However, catch shares can also offer fishers opportunities to enter or increase participation in catch share fisheries by purchasing or leasing quota. Thus, the net effect on diversification is uncertain. We tested whether diversification and variation in fishing revenues changed after implementation of catch shares for 6,782 vessels in 13 US fisheries that account for 20% of US landings revenue. For each of these fisheries, we tested whether diversification levels, trends, and variation in fishing revenues changed after implementation of catch shares, both for fishers that remained in the catch share fishery and for those that exited but remained active in other fisheries. We found that diversification for both groups was nearly always reduced. However, in most cases, we found no significant change in interannual variation of revenues, and, where changes were significant, variation decreased nearly as often as it increased.
Hafdahl, Adam R; Williams, Michelle A
2009-03-01
In 2 Monte Carlo studies of fixed- and random-effects meta-analysis for correlations, A. P. Field (2001) ostensibly evaluated Hedges-Olkin-Vevea Fisher-z and Schmidt-Hunter Pearson-r estimators and tests in 120 conditions. Some authors have cited those results as evidence not to meta-analyze Fisher-z correlations, especially with heterogeneous correlation parameters. The present attempt to replicate Field's simulations included comparisons with analytic values as well as results for efficiency and confidence-interval coverage. Field's results under homogeneity were mostly replicable, but those under heterogeneity were not: The latter exhibited up to over .17 more bias than ours and, for tests of the mean correlation and homogeneity, respectively, nonnull rejection rates up to .60 lower and .65 higher. Changes to Field's observations and conclusions are recommended, and practical guidance is offered regarding simulation evidence and choices among methods. Most cautions about poor performance of Fisher-z methods are largely unfounded, especially with a more appropriate z-to-r transformation. The Appendix gives a computer program for obtaining Pearson-r moments from a normal Fisher-z distribution, which is used to demonstrate distortion due to direct z-to-r transformation of a mean Fisher-z correlation.
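To make the methods being compared concrete, here is a minimal sketch of a Hedges-Olkin-style fixed-effect meta-analysis of correlations via Fisher z, together with a numerically integrated z-to-r back-transformation of the kind the Appendix program addresses. The toy correlations, sample sizes, and the illustrative between-study standard deviation are assumptions, not Field's simulation conditions.

```python
import numpy as np

def fisherz_fixed_effect(r, n):
    """Fixed-effect meta-analysis of correlations via Fisher z."""
    z = np.arctanh(r)
    w = n - 3.0                         # inverse of Var(z) ~ 1/(n - 3)
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    return z_bar, se

def mean_r_from_normal_z(mu_z, sd_z, n_grid=2001):
    """E[r] when z ~ Normal(mu_z, sd_z): integrate tanh(z) against the
    normal density instead of naively back-transforming the mean z."""
    z = np.linspace(mu_z - 8 * sd_z, mu_z + 8 * sd_z, n_grid)
    pdf = np.exp(-0.5 * ((z - mu_z) / sd_z) ** 2) / (sd_z * np.sqrt(2 * np.pi))
    return np.trapz(np.tanh(z) * pdf, z)

r = np.array([0.30, 0.45, 0.25, 0.55])
n = np.array([50, 80, 120, 60])
z_bar, se = fisherz_fixed_effect(r, n)
naive_r = np.tanh(z_bar)                       # direct z-to-r back-transform
adjusted_r = mean_r_from_normal_z(z_bar, 0.2)  # distortion under illustrative heterogeneity
```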
Information and complexity measures in the interface of a metal and a superconductor
NASA Astrophysics Data System (ADS)
Moustakidis, Ch. C.; Panos, C. P.
2018-06-01
Fisher information, Shannon information entropy and statistical complexity are calculated for the interface of a normal metal and a superconductor, as a function of temperature, for several materials. The order parameter Ψ(r) derived from the Ginzburg-Landau theory is used as an input, together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for the information and complexity measures, so that Tc is directly related in a simple way to disorder and complexity. An analytical relation is found between the Fisher information and the energy profile of superconductivity, i.e., the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between Shannon and Fisher information, i.e., a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures such as the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work that used a different model and found q ≃ 1.005.
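As a numerical illustration of the quantities involved, the sketch below evaluates the Shannon entropy and Fisher information of a probability density built from the textbook Ginzburg-Landau interface profile ψ(x) ∝ tanh(x/(√2 ξ)). The finite interval, normalisation choice, and parameter values are assumptions of this toy calculation; the paper itself works with analytical expressions.

```python
import numpy as np

def info_measures(xi=1.0, L=20.0, n=20001):
    """Shannon entropy and Fisher information of the normalised density
    built from the GL interface profile psi(x) ~ tanh(x / (sqrt(2) xi))."""
    x = np.linspace(1e-6, L, n)
    psi2 = np.tanh(x / (np.sqrt(2) * xi)) ** 2
    rho = psi2 / np.trapz(psi2, x)               # normalise to a probability density
    shannon = -np.trapz(rho * np.log(rho), x)
    drho = np.gradient(rho, x)
    fisher = np.trapz(drho ** 2 / rho, x)
    return shannon, fisher

for xi in (0.5, 1.0, 2.0):                       # coherence length in arbitrary units
    print(xi, info_measures(xi=xi))
```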
Numerical method based on the lattice Boltzmann model for the Fisher equation.
Yan, Guangwu; Zhang, Jianying; Dong, Yinfeng
2008-06-01
In this paper, a lattice Boltzmann model for the Fisher equation is proposed. First, the Chapman-Enskog expansion and the multiscale time expansion are used to describe the higher-order moments of the equilibrium distribution functions and a series of partial differential equations on different time scales. Second, the modified partial differential equation of the Fisher equation, including the higher-order truncation error, is obtained. Third, a comparison between numerical results of the lattice Boltzmann model and the exact solution is given. The numerical results agree well with the classical ones.
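For orientation, here is a minimal D1Q2 lattice Boltzmann solver for the 1D Fisher equation u_t = D u_xx + r u(1 - u). It is a generic BGK scheme with the reaction term added as a simple source, not the specific higher-order model analyzed in the paper; the parameters, initial condition, and crude Dirichlet boundaries are assumptions of the sketch.

```python
import numpy as np

def fisher_lbm(D=0.1, r=0.05, nx=400, nt=2000):
    """Minimal D1Q2 lattice Boltzmann solver for the 1D Fisher equation
    u_t = D u_xx + r u (1 - u), in lattice units (dx = dt = 1)."""
    tau = D + 0.5                        # BGK relaxation time: D = (tau - 1/2) c^2 dt
    u = np.where(np.arange(nx) < nx // 10, 1.0, 0.0)   # step initial condition
    f = np.vstack([0.5 * u, 0.5 * u])    # populations moving right (+1) and left (-1)
    for _ in range(nt):
        u = f.sum(axis=0)
        feq = 0.5 * u                    # equilibrium shared equally by both populations
        src = 0.5 * r * u * (1.0 - u)    # reaction term added as a weighted source
        f = f - (f - feq) / tau + src
        f[0] = np.roll(f[0], 1)          # stream right-movers
        f[1] = np.roll(f[1], -1)         # stream left-movers
        f[:, 0], f[:, -1] = 0.5, 0.0     # crude Dirichlet boundaries: u=1 left, u=0 right
    return f.sum(axis=0)

u_final = fisher_lbm()
# The front should advance at roughly the classical Fisher speed 2 * sqrt(D * r).
```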
Simple proof of the concavity of the entropy power with respect to Gaussian noise
NASA Technical Reports Server (NTRS)
Dembo, Amir
1989-01-01
A very simple proof of M. H. Costa's result, that the entropy power of X_t = X + N(0, tI) is concave in t, is derived as an immediate consequence of an inequality concerning Fisher information. This relationship between Fisher information and entropy is found to be useful for proving the central limit theorem. Thus, one who seeks new entropy inequalities should try first to find new inequalities about Fisher information, or at least to exploit the existing ones in new ways.
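For reference, the standard definitions and identities underlying this argument can be written out as follows (a sketch of the usual statements, with notation chosen here rather than taken from the paper):

```latex
% Entropy power of an R^n-valued random vector X with differential entropy h(X)
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}

% de Bruijn's identity, with Z ~ N(0, I_n) independent of X and J the Fisher information
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right)

% Blachman-Stam Fisher information inequality for independent X, Y
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}

% Costa's theorem: with X_t = X + N(0, tI), the entropy power N(X_t) is concave in t
\frac{d^2}{dt^2}\, N(X_t) \;\le\; 0
```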
Occupational injuries and diseases among commercial fishers in Finland 1996-2015.
Kaustell, Kim O; Mattila, Tiina E A; Rautiainen, Risto H
2016-01-01
Commercial fishing is recognised as one of the most hazardous professions worldwide. In Finland, commercial fishing has some special characteristics, including fishing on ice during frozen waters and pluriactivity of the fisher family to gain additional income. The goal of this study was to describe injury characteristics among commercial fishers in Finland during the years 1996-2015. With this information, we wish to promote the creation of effective safety campaigns and interventions. The data for this study were acquired from the Farmers' Social Insurance Institution, which handles the mandatory pension and occupational injury insurance of Finnish commercial fishers. Descriptive statistics were used to categorise and analyse the data, which comprised the anonymized insurance history of 1954 insured fishers and reports on 1135 compensated injuries, 11 fatalities, and 53 occupational disease cases. The results show that the injury rate of Finnish commercial fishers is high. Forty per cent of the fishing-related injuries occurred aboard or when entering or leaving the vessel, while 37% happened ashore and 11% on sea or lake ice. The most common type of incident is preceded by a slip, trip, or sway followed by a fall to a lower level. The injuries result in a median disability length of 21 days. An elevated risk was found for Finnish-speaking (vs. Swedish-speaking) fishers, as well as for male fishers. The occupational diseases of the studied population were for the most part the result of manual, repetitive and/or physically straining work, e.g., hauling in fishing equipment. Due to small numbers and lack of case data, it is not possible to analyse further the 11 fatalities, which were all drownings. Based on our findings, injury prevention should be targeted, besides preventing fatalities due to drowning, at mitigating the risks of slips, trips, and falls both aboard and ashore.
Bretz, Julia S; Von Dincklage, Falk; Woitzik, Johannes; Winkler, Maren K L; Major, Sebastian; Dreier, Jens P; Bohner, Georg; Scheel, Michael
2017-09-01
Despite its high prevalence among patients with aneurysmal subarachnoid hemorrhage (aSAH) and its high risk of delayed cerebral ischemia (DCI), the Fisher grade 3 category remains a poorly studied subgroup. The aim of this cohort study was to investigate the prognostic value of the Hijdra sum scoring system for functional outcome in patients with Fisher grade 3 aSAH, in order to improve risk stratification within this Fisher category. Initial CT scans of 72 prospectively enrolled patients with Fisher grade 3 aSAH were analyzed, and the cisternal, ventricular, and total amounts of blood were graded according to the Hijdra scale. Additionally, space-occupying subarachnoid blood clots were assessed. Outcome was evaluated after 6 months. Within the subgroup of Fisher grade 3 aSAH patients, those with an unfavorable outcome showed a significantly larger cisternal Hijdra sum score (HSS: 21.1 ± 5.2) than patients with a favorable outcome (HSS: 17.6 ± 5.9; p = 0.009). However, both the amount of ventricular blood (p = 0.165) and space-occupying blood clots (p = 0.206) appeared to have no prognostic relevance. After adjusting for the patient's age, gender, tobacco use, clinical status at admission, and presence of intracerebral hemorrhage, the cisternal and total HSS remained the only independent parameters included in multivariate logistic regression models to predict functional outcome (p < 0.01). The cisternal Hijdra score is fairly easy to obtain, and the present study indicates that it has additional predictive value for functional outcome within the Fisher 3 category. We suggest that the Hijdra scale is a practically useful prognostic instrument for risk evaluation after aSAH and should be applied more often in the clinical setting.
Lewis, Jeffrey C.; Powell, Roger A.; Zielinski, William J.
2012-01-01
Translocations are frequently used to restore extirpated carnivore populations. Understanding the factors that influence translocation success is important because carnivore translocations can be time consuming, expensive, and controversial. Using population viability software, we modeled reintroductions of the fisher, a candidate for endangered or threatened status in the Pacific states of the US. Our model predicts that the most important factor influencing successful re-establishment of a fisher population is the number of adult females reintroduced (provided some males are also released). Data from 38 translocations of fishers in North America, including 30 reintroductions, 5 augmentations and 3 introductions, show that the number of females released was, indeed, a good predictor of success but that the number of males released, geographic region and proximity of the source population to the release site were also important predictors. The contradiction between model and data regarding males may relate to the assumption in the model that all males are equally good breeders. We hypothesize that many males may need to be released to insure a sufficient number of good breeders are included, probably large males. Seventy-seven percent of reintroductions with known outcomes (success or failure) succeeded; all 5 augmentations succeeded; but none of the 3 introductions succeeded. Reintroductions were instrumental in reestablishing fisher populations within their historical range and expanding the range from its most-contracted state (43% of the historical range) to its current state (68% of the historical range). To increase the likelihood of translocation success, we recommend that managers: 1) release as many fishers as possible, 2) release more females than males (55–60% females) when possible, 3) release as many adults as possible, especially large males, 4) release fishers from a nearby source population, 5) conduct a formal feasibility assessment, and 6) develop a comprehensive implementation plan that includes an active monitoring program. PMID:22479336
Yates, Katherine L; Schoeman, David S
2013-01-01
Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process in a transparent, quantitative way.
A Discussion with Suzanne Fisher Staples: The Author as Writer and Cultural Observer.
ERIC Educational Resources Information Center
Sawyer, Walter E.; Sawyer, Jean C.
1993-01-01
Presents an interview with Suzanne Fisher Staples, author of the children's novel, "Shabanu, Daughter of the Wind." Discusses Staples' creative writing process, background, and the writer's role as cultural observer. (HB)
The Milky Way, the Local Group & the IR Tully-Fisher Diagram
NASA Technical Reports Server (NTRS)
Malhotra, S.; Spergel, D.; Rhoads, J.; Li, J.
1996-01-01
Using the near infrared fluxes of local group galaxies derived from Cosmic Background Explorer/Diffuse Infrared Background Experiment band maps and published Cepheid distances, we construct Tully-Fisher diagrams for the Local Group.
Transit light curves with finite integration time: Fisher information analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Ellen M.; Rogers, Leslie A.
2014-10-10
Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
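The binned-versus-instantaneous comparison above can be reproduced schematically with a brute-force Fisher matrix built from numerical derivatives of a toy light-curve model. This is a sketch, not the paper's analytic results: the smooth tanh-ramp "transit", its parameter values, the noise level and the cadences are all illustrative assumptions.

```python
import numpy as np

def transit(t, t0, depth, T, tau):
    """Smooth toy transit: depth, total duration T, ingress timescale tau (days)."""
    return 1.0 - 0.5 * depth * (np.tanh((t - t0 + T/2) / tau)
                                - np.tanh((t - t0 - T/2) / tau))

def binned(t, cadence, params, n_sub=25):
    """Average the model over each finite integration of length `cadence`."""
    offs = ((np.arange(n_sub) + 0.5) / n_sub - 0.5) * cadence
    return np.mean([transit(t + o, *params) for o in offs], axis=0)

def fisher(t, cadence, params, sigma, rel_step=1e-5):
    """F_ij = sum_k dm_k/dtheta_i dm_k/dtheta_j / sigma^2 (white noise)."""
    grads = []
    for i, p in enumerate(params):
        h = rel_step * max(abs(p), 1e-3)
        hi, lo = list(params), list(params)
        hi[i] += h; lo[i] -= h
        grads.append((binned(t, cadence, hi) - binned(t, cadence, lo)) / (2 * h))
    G = np.vstack(grads)
    return G @ G.T / sigma**2

params = (0.0, 1e-3, 0.12, 0.01)             # t0 [d], depth, duration [d], ingress [d]
sigma_1min = 1e-4                            # per-sample noise at 1-min cadence
for cad_min in (1.0, 30.0):
    cad = cad_min / (24 * 60)
    t = np.arange(-0.2, 0.2, cad)
    sigma = sigma_1min / np.sqrt(cad_min)    # same total S/N (no read noise)
    errs = np.sqrt(np.diag(np.linalg.inv(fisher(t, cad, params, sigma))))
    print(f"{cad_min:4.0f}-min cadence: sigma(t0)={errs[0]:.1e} d, sigma(depth)={errs[1]:.1e}")
```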
Color model and method for video fire flame and smoke detection using Fisher linear discriminant
NASA Astrophysics Data System (ADS)
Wei, Yuan; Jie, Li; Jun, Fang; Yongming, Zhang
2013-02-01
Video fire detection plays an increasingly important role in everyday life, but recent research is often based on the traditional RGB color model for analysing flames, which may not be the optimal color space for fire recognition. The situation is worse for smoke, which is frequently analysed using gray-scale rather than color images. We clarify the importance of color information for fire detection and present a fire discriminant color (FDC) model for flame or smoke recognition based on color images. The FDC model aims to unify fire color image representation and the fire recognition task in one framework. Using the between-class and within-class scatter matrices of the Fisher linear discriminant, the proposed model seeks a color-space-transform matrix and a discriminant projection basis vector that maximize the ratio of these two scatter matrices. First, an iterative basic algorithm is designed to obtain a one-component color space transformed from RGB. Then, a general algorithm is derived to generate a three-component color space for further improvement. Moreover, we propose a method for video fire detection based on the models using the kNN classifier. To evaluate recognition performance, we create a database of flame, smoke, and non-fire images for training and testing. The test experiments show that the proposed model achieves a flame verification rate of 97.5% at a false alarm rate (FAR) of 1.06% (ROC I) and a smoke verification rate of 91.5% at a FAR of 1.2% (ROC II), and extensive fire video experiments demonstrate that our method reaches a high accuracy for fire recognition.
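As a point of reference, the classical two-class Fisher linear discriminant that underlies the FDC model can be sketched in a few lines. This is a generic illustration, not the authors' iterative color-space-transform algorithm: the "fire" and "non-fire" RGB samples are random placeholders, and the projection maximizes the between-class to within-class scatter ratio, which for two classes reduces to w proportional to Sw^-1 (mu1 - mu0).

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder RGB samples: "fire" pixels (reddish) vs "non-fire" pixels.
fire = rng.normal([200, 90, 40], 25, size=(500, 3))
non_fire = rng.normal([90, 110, 120], 40, size=(500, 3))

def fisher_direction(a, b):
    """Two-class Fisher LDA: direction maximizing between/within-class scatter."""
    mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
    Sw = np.cov(a, rowvar=False) * (len(a) - 1) + np.cov(b, rowvar=False) * (len(b) - 1)
    w = np.linalg.solve(Sw, mu_a - mu_b)            # w ~ Sw^-1 (mu_a - mu_b)
    return w / np.linalg.norm(w)

w = fisher_direction(fire, non_fire)
threshold = 0.5 * (fire @ w + non_fire @ w).mean()  # crude midpoint threshold
accuracy = ((fire @ w > threshold).mean() + (non_fire @ w < threshold).mean()) / 2
print("projection vector:", np.round(w, 3), " training accuracy:", round(accuracy, 3))
```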
Fisher information and Cramér-Rao lower bound for experimental design in parallel imaging.
Bouhrara, Mustapha; Spencer, Richard G
2018-06-01
The Cramér-Rao lower bound (CRLB) is widely used in the design of magnetic resonance (MR) experiments for parameter estimation. Previous work has considered only Gaussian or Rician noise distributions in this calculation. However, the noise distribution for multi-coil acquisitions, such as in parallel imaging, obeys the noncentral χ-distribution under many circumstances. The purpose of this paper is to present the CRLB calculation for parameter estimation from multi-coil acquisitions. We perform explicit calculations of Fisher matrix elements and the associated CRLB for noise distributions following the noncentral χ-distribution. The special case of diffusion kurtosis is examined as an important example. For comparison with analytic results, Monte Carlo (MC) simulations were conducted to evaluate experimental minimum standard deviations (SDs) in the estimation of diffusion kurtosis model parameters. Results were obtained for a range of signal-to-noise ratios (SNRs), and for both the conventional case of Gaussian noise distribution and noncentral χ-distribution with different numbers of coils, m. At low-to-moderate SNR, the noncentral χ-distribution deviates substantially from the Gaussian distribution. Our results indicate that this departure is more pronounced for larger values of m. As expected, the minimum SDs (i.e., CRLB) in derived diffusion kurtosis model parameters assuming a noncentral χ-distribution provided a closer match to the MC simulations as compared to the Gaussian results. Estimates of minimum variance for parameter estimation and experimental design provided by the CRLB must account for the noncentral χ-distribution of noise in multi-coil acquisitions, especially in the low-to-moderate SNR regime. Magn Reson Med 79:3249-3255, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
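The CRLB workflow described above can be illustrated generically: build the Fisher matrix from signal derivatives, invert it, and read off minimum standard deviations. The sketch below uses the standard kurtosis signal representation S(b) = S0 exp(-b D + b^2 D^2 K / 6) but assumes simple additive Gaussian noise rather than the noncentral χ-distribution treated in the paper; the b-values and parameter values are illustrative.

```python
import numpy as np

def dki_signal(b, s0, d, k):
    """Kurtosis signal representation: S = S0 * exp(-b*D + b^2 * D^2 * K / 6)."""
    return s0 * np.exp(-b * d + (b ** 2) * (d ** 2) * k / 6.0)

def crlb(b, params, sigma, rel_step=1e-6):
    """CRLB under additive Gaussian noise of SD sigma: sqrt(diag(F^-1)),
    with F_ij = sum_b dS/dtheta_i * dS/dtheta_j / sigma^2."""
    grads = []
    for i, p in enumerate(params):
        h = rel_step * abs(p)
        hi, lo = list(params), list(params)
        hi[i] += h; lo[i] -= h
        grads.append((dki_signal(b, *hi) - dki_signal(b, *lo)) / (2 * h))
    F = np.vstack(grads) @ np.vstack(grads).T / sigma**2
    return np.sqrt(np.diag(np.linalg.inv(F)))

b = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])   # b-values in ms/um^2 (illustrative)
params = (1.0, 1.0, 1.0)                       # S0, D [um^2/ms], K (illustrative)
for snr in (20, 50, 100):
    sigma = params[0] / snr
    print(f"SNR {snr:3d}: CRLB(S0, D, K) =", np.round(crlb(b, params, sigma), 4))
```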
Fisher statistics for analysis of diffusion tensor directional information.
Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P
2012-04-30
A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
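For unit direction vectors, the descriptive Fisher statistics referred to above reduce to a few standard formulas: the mean direction is the normalized vector resultant, the concentration parameter is approximately kappa = (N - 1)/(N - R) for large kappa, and the 95% confidence cone follows Fisher's classical expression. The sketch below applies them to random placeholder vectors standing in for ROI principal diffusion directions; it is not the authors' DTI pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholder "principal diffusion directions": unit vectors scattered about +z.
v = rng.normal([0, 0, 1], 0.15, size=(40, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

def fisher_stats(vectors):
    """Mean direction, concentration and 95% cone for Fisher-distributed unit vectors."""
    n = len(vectors)
    resultant = vectors.sum(axis=0)
    R = np.linalg.norm(resultant)                 # resultant length
    mean_dir = resultant / R
    kappa = (n - 1) / (n - R)                     # large-kappa approximation
    # 95% confidence cone half-angle (degrees), classical Fisher-statistics formula
    alpha95 = np.degrees(np.arccos(1 - (n - R) / R * ((1 / 0.05) ** (1 / (n - 1)) - 1)))
    return mean_dir, kappa, alpha95

mean_dir, kappa, alpha95 = fisher_stats(v)
print("mean direction:", np.round(mean_dir, 3))
print("concentration kappa ~", round(kappa, 1), " alpha95 ~", round(alpha95, 1), "deg")
```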
Not just a fisherman's wife: Women's contribution to health and wellbeing in commercial fishing.
Kilpatrick, Sue; King, Tanya J; Willis, Karen
2015-04-01
To explore the role of women in fishing industry organisations and communities in promoting best-practice health behaviours among fishers in Australia. This paper reports aspects of research that examined how the fishing industry can best support the physical health and mental well-being of fishers. The study employed a mixed-methods, multisite case study approach, with data gathered from face-to-face and phone interactions at two sites in Victoria and one in Western Australia. Participants were 31 male fishers (including commercial licence owners, skippers and deckhands), three female family members, three fishing association representatives, one local government representative, two health care providers, and three regional health planning and funding bodies. Often unrecognised, women associated with the fishing industry are integral to the promotion of good health for fishers. They are key to identifying health issues (particularly mental health issues) and proposing community-based health and well-being strategies. They often do so by incorporating health information and activities into 'soft entry points': informal, non-health-service mechanisms through which fishers can access health information and health services. While not working at the industry coalface, women have a stake, and are key players, in the commercial fishing industry. Their knowledge of, and credibility within, fishing enterprises makes them valuable sources of information about health issues facing the industry and effective strategies to address them. This expertise should be applied in conjunction with industry associations and health providers to achieve better health outcomes for fishers and their families. © 2015 National Rural Health Alliance Inc.
Fisher Sand & Gravel New Mexico, Inc. General Air Quality Permit: Related Documents
Documents related to the Fisher Sand & Gravel – New Mexico, Inc., Grey Mesa Gravel Pit General Air Quality Permit for New or Modified Minor Source Stone Quarrying, Crushing, and Screening Facilities in Indian Country.
Non-linear matter power spectrum covariance matrix errors and cosmological parameter uncertainties
NASA Astrophysics Data System (ADS)
Blot, L.; Corasaniti, P. S.; Amendola, L.; Kitching, T. D.
2016-06-01
The covariance of the matter power spectrum is a key element of the analysis of galaxy clustering data. Independent realizations of observational measurements can be used to sample the covariance; nevertheless, statistical sampling errors will propagate into the cosmological parameter inference, potentially limiting the capabilities of the upcoming generation of galaxy surveys. The impact of these errors as a function of the number of realizations has been previously evaluated for Gaussian distributed data. However, non-linearities in the late-time clustering of matter cause departures from Gaussian statistics. Here, we address the impact of non-Gaussian errors on the sample covariance and precision matrix errors using a large ensemble of N-body simulations. In the range of modes where finite volume effects are negligible (0.1 ≲ k [h Mpc^-1] ≲ 1.2), we find deviations of the variance of the sample covariance with respect to Gaussian predictions above ~10 per cent at k > 0.3 h Mpc^-1. Over the entire range these reduce to about ~5 per cent for the precision matrix. Finally, we perform a Fisher analysis to estimate the effect of covariance errors on the cosmological parameter constraints. In particular, assuming Euclid-like survey characteristics we find that a number of independent realizations larger than 5000 is necessary to reduce the contribution of sampling errors to the cosmological parameter uncertainties to the sub-percent level. We also show that restricting the analysis to large scales k ≲ 0.2 h Mpc^-1 results in a considerable loss in constraining power, while using the linear covariance to include smaller scales leads to an underestimation of the errors on the cosmological parameters.
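The propagation of covariance sampling noise can be illustrated with a small synthetic experiment: draw n Gaussian realizations of a p-bin "power spectrum", form the sample covariance, and invert it. This is a sketch, not the N-body analysis of the paper; the toy covariance is made up, and the precision matrix is debiased with the standard Hartlap factor (n - p - 2)/(n - 1), a well-known correction for Gaussian data assumed here rather than taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

p = 20                                            # number of power-spectrum bins (toy)
true_cov = 0.2 * np.ones((p, p)) + np.eye(p)      # correlated "band powers"
true_prec = np.linalg.inv(true_cov)

def precision_estimate(n_real):
    """Sample covariance from n_real Gaussian realizations, inverted and debiased."""
    data = rng.multivariate_normal(np.zeros(p), true_cov, size=n_real)
    cov_hat = np.cov(data, rowvar=False)
    hartlap = (n_real - p - 2) / (n_real - 1)     # standard debiasing factor
    return hartlap * np.linalg.inv(cov_hat)

for n in (50, 200, 1000, 5000):
    err = [np.abs(precision_estimate(n) - true_prec).mean() for _ in range(20)]
    print(f"n = {n:5d} realizations: mean |precision-matrix error| = {np.mean(err):.4f}")
```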
Evaluation of the Thermo Scientific SureTect Listeria monocytogenes Assay.
Cloke, Jonathan; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron; Evans, Katharine; Crabtree, David; Hughes, Annette; Hopper, Craig; Simpson, Helen; Withey, Sophie; Oleksiuk, Milena; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko
2014-01-01
The Thermo Scientific SureTect Listeria monocytogenes Assay is a new real-time PCR assay for the detection of Listeria monocytogenes in food and environmental samples. This assay was validated using the AOAC Research Institute (AOAC-RI) Performance Tested Methods program in comparison to the reference method detailed in International Organization for Standardization 11290-1:1996, including Amendment 1:2004 with the following foods and food contact surfaces: smoked salmon, processed cheese, fresh bagged spinach, fresh cantaloupe, cooked prawns (chilled product), cooked sliced turkey meat (chilled product), ice cream, pork frankfurters, salami, ground raw beef meat (12% fat), plastic, and stainless steel. All matrixes were tested by Thermo Fisher Scientific, Microbiology Division, Basingstoke, UK. In addition, three matrixes (pork frankfurters, bagged lettuce, and stainless steel) were analyzed independently as part of the AOAC-RI controlled laboratory study by the University of Guelph, Canada. Using probability of detection (POD) statistical analysis, a significant difference was demonstrated between the candidate and reference methods for salami, cooked sliced turkey and ice cream in favor of the SureTect assay. For all other matrixes, no significant difference by POD was seen between the two methods during the study. Inclusivity and exclusivity testing was also conducted with 53 and 30 isolates, respectively, which demonstrated that the SureTect assay was able to detect all serotypes of L. monocytogenes. None of the exclusivity isolates analyzed were detected by the SureTect assay. Ruggedness testing was conducted to evaluate the performance of the assay with specific method deviations outside the recommended parameters open to variation, i.e., enrichment time and temperature and lysis temperature, which demonstrated that the assay gave reliable performance. Accelerated stability testing was also conducted, validating the assay shelf life.
Combining cluster number counts and galaxy clustering
NASA Astrophysics Data System (ADS)
Lacasa, Fabien; Rosenfeld, Rogerio
2016-08-01
The abundance of clusters and the clustering of galaxies are two of the important cosmological probes for current and future large scale surveys of galaxies, such as the Dark Energy Survey. In order to combine them one has to account for the fact that they are not independent quantities, since they probe the same density field. It is important to develop a good understanding of their correlation in order to extract parameter constraints. We present a detailed modelling of the joint covariance matrix between cluster number counts and the galaxy angular power spectrum. We employ the framework of the halo model complemented by a Halo Occupation Distribution model (HOD). We demonstrate the importance of accounting for non-Gaussianity to produce accurate covariance predictions. Indeed, we show that the non-Gaussian covariance becomes dominant at small scales, low redshifts or high cluster masses. We discuss in particular the case of the super-sample covariance (SSC), including the effects of galaxy shot-noise, halo second order bias and non-local bias. We demonstrate that the SSC obeys mathematical inequalities and positivity. Using the joint covariance matrix and a Fisher matrix methodology, we examine the prospects of combining these two probes to constrain cosmological and HOD parameters. We find that the combination indeed results in noticeably better constraints, with improvements of order 20% on cosmological parameters compared to the best single probe, and even greater improvement on HOD parameters, with a reduction of error bars by a factor of 1.4-4.8. This happens in particular because the cross-covariance introduces a synergy between the probes on small scales. We conclude that accounting for non-Gaussian effects is required for the joint analysis of these observables in galaxy surveys.
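Why the cross-covariance matters when combining probes can be seen in a toy Gaussian Fisher calculation: for a joint data vector with covariance C and parameter derivatives dd/dtheta, F = (dd/dtheta)^T C^-1 (dd/dtheta), and the forecast errors change when the cross block between the two probes is switched on. All derivatives and covariance entries below are invented placeholders, not the halo-model predictions of the paper.

```python
import numpy as np

# Toy joint data vector: two "cluster count" bins plus two "galaxy clustering" bins,
# responding to two cosmological/HOD-like parameters. All numbers are placeholders.
ddata_dtheta = np.array([[1.0,  0.6],
                         [0.8,  0.5],
                         [0.9, -0.4],
                         [0.7, -0.6]])

auto = np.diag([0.05, 0.05, 0.02, 0.02])     # block-diagonal (single-probe) covariance
cross = np.zeros((4, 4))
cross[:2, 2:] = 0.01                         # counts x clustering cross-covariance
cross[2:, :2] = 0.01

def forecast_errors(cov):
    """Marginalized 1-sigma errors from the Gaussian Fisher matrix."""
    fisher = ddata_dtheta.T @ np.linalg.inv(cov) @ ddata_dtheta
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

print("ignoring cross-covariance:", np.round(forecast_errors(auto), 3))
print("with cross-covariance:    ", np.round(forecast_errors(auto + cross), 3))
```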
Application of Bred Vectors To Data Assimilation
NASA Astrophysics Data System (ADS)
Corazza, M.; Kalnay, E.; Patil, Dj
We introduced a statistic, the BV-dimension, to measure the effective local finite-time dimensionality of the atmosphere. We show that this dimension is often quite low, and suggest that this finding has important implications for data assimilation and the accuracy of weather forecasting (Patil et al, 2001). The original database for this study was the forecasts of the NCEP global ensemble forecasting system. The initial differences between the control forecast and the perturbed forecasts are called bred vectors. The control and perturbed initial conditions valid at time t = n*dt are evolved using the forecast model until time t = (n+1)*dt. The differences between the perturbed and the control forecasts are scaled down to their initial amplitude, and constitute the bred vectors valid at (n+1)*dt. Their growth rate is typically about 1.5/day. The bred vectors are similar by construction to leading Lyapunov vectors except that they have small but finite amplitude, and they are valid at finite times. The original NCEP ensemble data set has 5 independent bred vectors. We define a local bred vector at each grid point by choosing the 5 by 5 grid points centered at the grid point (a region of about 1100 km by 1100 km), and using the north-south and east-west velocity components at the 500 mb pressure level to form a 50-dimensional column vector. Since we have k=5 global bred vectors, we also have k local bred vectors at each grid point. We estimate the effective dimensionality of the subspace spanned by the local bred vectors by performing a singular value decomposition (EOF analysis). The k local bred vector columns form a 50 x k matrix M. The singular values s(i) of M measure the extent to which the k column unit vectors making up the matrix M point in the direction of v(i). We define the bred vector dimension as BVDIM = [Sum_i s(i)]^2 / [Sum_i s(i)^2]. For example, if 4 out of the 5 unit column vectors lie along v(1) and one lies along v(2), the singular values are [sqrt(4), 1, 0, 0, 0] and the BV-dimension is (2+1)^2/(2^2+1^2) = 1.8, less than 2 because one direction is more dominant than the other in representing the original data. The results (Patil et al, 2001) show that there are large regions where the bred vectors span a subspace of substantially lower dimension than that of the full space. These low dimensionality regions are dominant in the baroclinic extratropics, typically have a lifetime of 3-7 days, have a well-defined horizontal and vertical structure that spans most of the atmosphere, and tend to move eastward. New results with a large number of ensemble members confirm these results and indicate that the low dimensionality regions are quite robust, and depend only on the verification time (i.e., the underlying flow). Corazza et al (2001) have performed experiments with a data assimilation system based on a quasi-geostrophic model and simulated observations (Morss, 1999; Hamill et al, 2000). A 3D-variational data assimilation scheme for a quasi-geostrophic channel model is used to study the structure of the background error and its relationship to the corresponding bred vectors. The "true" evolution of the model atmosphere is defined by an integration of the model and "rawinsonde observations" are simulated by randomly perturbing the true state at fixed locations. It is found that after 3-5 days the bred vectors develop well-organized structures which are very similar for the two different norms considered in this paper (potential vorticity norm and streamfunction norm).
The results show that the bred vectors do indeed represent well the characteristics of the data assimilation forecast errors, and that the subspace of bred vectors contains most of the forecast error, except in areas where the forecast errors are small. For example, the angle between the 6-hr forecast error and the subspace spanned by 10 bred vectors is less than 10° over 90% of the domain, indicating a pattern correlation of more than 98.5% between the forecast error and its projection onto the bred vector subspace. The presence of low-dimensional regions in the perturbations of the basic flow has important implications for data assimilation. At any given time, there is a difference between the true atmospheric state and the model forecast. Assuming that model errors are not the dominant source of errors, in a region of low BV-dimensionality the difference between the true state and the forecast should lie substantially in the low-dimensional unstable subspace of the few bred vectors that contribute most strongly to the low BV-dimension. This information should yield a substantial improvement in the forecast: the data assimilation algorithm should correct the model state by moving it closer to the observations along the unstable subspace, since this is where the true state most likely lies. Preliminary experiments have been conducted with the quasi-geostrophic data assimilation system testing whether it is possible to add "errors of the day" based on bred vectors to the standard (constant) 3D-Var background error covariance in order to capture these important errors. The results are extremely encouraging, indicating a significant reduction (about 40%) in the analysis errors at a very low computational cost. References: Corazza, M., E. Kalnay, DJ Patil, R. Morss, M. Cai, I. Szunyogh, BR Hunt, E. Ott and JA Yorke, 2001: Use of the breeding technique to estimate the structure of the analysis "errors of the day". Submitted to Nonlinear Processes in Geophysics. Hamill, T.M., Snyder, C., and Morss, R.E., 2000: A Comparison of Probabilistic Forecasts from Bred, Singular-Vector and Perturbed Observation Ensembles, Mon. Wea. Rev., 128, 1835-1851. Kalnay, E., and Z. Toth, 1994: Removing growing errors in the analysis cycle. Preprints of the Tenth Conference on Numerical Weather Prediction, Amer. Meteor. Soc., 1994, 212-215. Morss, R. E., 1999: Adaptive observations: Idealized sampling strategies for improving numerical weather prediction. PhD thesis, Massachusetts Institute of Technology, 225pp. Patil, D. J. S., B. R. Hunt, E. Kalnay, J. A. Yorke, and E. Ott, 2001: Local Low Dimensionality of Atmospheric Dynamics. Phys. Rev. Lett., 86, 5878.
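The BV-dimension defined above is a one-line computation once the local bred-vector matrix is assembled. The sketch below first reproduces the worked example (singular values [2, 1, 0, 0, 0] give (2+1)^2/(2^2+1^2) = 1.8) and then applies the same formula to a random matrix standing in for the 50 x k matrix M of 500 mb wind components.

```python
import numpy as np

def bv_dimension(M):
    """BV-dimension: (sum of singular values)^2 / (sum of squared singular values)."""
    s = np.linalg.svd(M, compute_uv=False)
    return s.sum() ** 2 / (s ** 2).sum()

# Worked example from the text: singular values [2, 1, 0, 0, 0] -> 9/5 = 1.8
print(bv_dimension(np.diag([2.0, 1.0, 0.0, 0.0, 0.0])))

# Placeholder local bred-vector matrix: 50 components (5x5 grid, u and v at 500 mb)
# by k = 5 bred vectors; random numbers stand in for real ensemble output.
rng = np.random.default_rng(4)
M = rng.standard_normal((50, 5))
print("BV-dimension of random 50x5 matrix:", round(bv_dimension(M), 2))
```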
EXERGY AND FISHER INFORMATION AS ECOLOGICAL INDEXES
Ecological indices are used to provide summary information about a particular aspect of ecosystem behavior. Many such indices have been proposed and here we investigate two: exergy and Fisher Information. Exergy, a thermodynamically based index, is a measure of maximum amount o...
Who's Qualified? Seeing Race in Color-Blind Times: Lessons from Fisher v. University of Texas
ERIC Educational Resources Information Center
Donnor, Jamel K.
2015-01-01
Using Howard Winant's racial dualism theory, this chapter explains how race was discursively operationalized in the recent U.S. Supreme Court higher education antiracial diversity case Fisher v. University of Texas at Austin.
Using Fisher information to track stability in multivariate systems
With the current proliferation of data, the proficient use of statistical and mining techniques offer substantial benefits to capture useful information from any dataset. As numerous approaches make use of information theory concepts, here, we discuss how Fisher information (FI...
FISHER INFORMATION AND ECOSYSTEM REGIME CHANGES
Following Fisher’s work, we propose two different expressions for the Fisher Information along with Shannon Information as a means of detecting and assessing shifts between alternative ecosystem regimes. Regime shifts are a consequence of bifurcations in the dynamics of an ecosys...
A new discriminative kernel from probabilistic models.
Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert
2002-10-01
Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
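For orientation, the Fisher kernel construction that the TOP kernel builds on can be sketched for a toy product-of-Bernoullis model: the Fisher score is the gradient of the log-likelihood with respect to the model parameters, and the kernel is an inner product of scores. This generic illustration approximates the Fisher information matrix by the identity, a common simplification; it is not the TOP kernel of the paper, and the model and data are placeholders.

```python
import numpy as np

# Toy generative model: independent Bernoulli features with parameters theta.
theta = np.array([0.8, 0.3, 0.5, 0.6])

def fisher_score(x, theta):
    """Gradient of log p(x|theta) = sum_j [x_j log theta_j + (1-x_j) log(1-theta_j)]."""
    return x / theta - (1 - x) / (1 - theta)

def fisher_kernel(x, y, theta):
    """Inner product of Fisher scores (information matrix approximated by identity)."""
    return fisher_score(x, theta) @ fisher_score(y, theta)

x1 = np.array([1, 0, 1, 1], dtype=float)
x2 = np.array([1, 0, 0, 1], dtype=float)
x3 = np.array([0, 1, 0, 0], dtype=float)

print("K(x1, x2) =", round(fisher_kernel(x1, x2, theta), 2))
print("K(x1, x3) =", round(fisher_kernel(x1, x3, theta), 2))
```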
1981-06-01
normality and several types of nonnormality. Overall the rank transformation procedure seems to be the best. The Fisher's LSD multiple comparisons procedure...the rank transformation procedure appears to maintain power better than Fisher's LSD or the randomization procedures. The conclusion of this study...best. The Fisher's LSD multiple comparisons procedure in the one-way and two-way layouts is compared with a randomization procedure and with the same
Recurrent Miller Fisher syndrome.
Madhavan, S; Geetha; Bhargavan, P V
2004-07-01
Miller Fisher syndrome (MFS) is a variant of Guillain-Barré syndrome characterized by the triad of ophthalmoplegia, ataxia and areflexia. Recurrences are exceptional with Miller Fisher syndrome. We report a case with two episodes of MFS within two years. Initially he presented with partial ophthalmoplegia and ataxia. The second episode was characterized by the full-blown presentation of ataxia, areflexia and ophthalmoplegia. CSF analysis was typical during both episodes. Nerve conduction velocity studies were essentially within normal limits. MRI of the brain was within normal limits. He responded to symptomatic measures initially, and to steroids in the second episode. We report the case due to its rarity.
Heavy-tailed fractional Pearson diffusions.
Leonenko, N N; Papić, I; Sikorskii, A; Šuvak, N
2017-11-01
We define heavy-tailed fractional reciprocal gamma and Fisher-Snedecor diffusions by a non-Markovian time change in the corresponding Pearson diffusions. Pearson diffusions are governed by the backward Kolmogorov equations with space-varying polynomial coefficients and are widely used in applications. The corresponding fractional reciprocal gamma and Fisher-Snedecor diffusions are governed by the fractional backward Kolmogorov equations and have heavy-tailed marginal distributions in the steady state. We derive the explicit expressions for the transition densities of the fractional reciprocal gamma and Fisher-Snedecor diffusions and strong solutions of the associated Cauchy problems for the fractional backward Kolmogorov equation.
Ganesan, Sornam; Subbiah, Vasantha N.; Michael, Jothi Clara J.
2015-01-01
Objective: To identify the factors associated with cervical pre-malignant lesions among married fisher women residing in the coastal areas of Sadras, Tamil Nadu. Methods: The study was conducted in five fishermen communities under Sadras, a coastal area in Tamil Nadu, India, among 250 married fisher women residing in the area. A quantitative descriptive approach with a cross-sectional study design was used. Data were collected using a structured interview schedule to identify the associated factors, and a Pap smear test was performed to identify pre-malignant cervical lesions among the married fisher women. Data were analyzed using descriptive and inferential statistics. Results: Among the 250 women, six (2.4%) presented with pre-cancerous lesions: atypical squamous cells of undetermined significance (ASCUS) in five (2%) and mild dysplasia in one (0.4%). A majority of the women, 178 (71.2%), had abnormal cervical findings. Statistical analysis showed a significant association with risk factors such as advanced age, lack of education, low socioeconomic status, using tobacco, multiparity, premarital sex, extramarital relationship, using cloth as a sanitary napkin, etc. Conclusion: The study findings clearly show the increased vulnerability of the fisher women to cervical cancer, as they had many risk factors contributing to the same. PMID:27981091
Confidence intervals for correlations when data are not normal.
Bishara, Anthony J; Hittner, James B
2017-02-01
With nonnormal data, the typical confidence interval of the correlation (Fisher z') may be inaccurate. The literature has been unclear as to which of several alternative methods should be used instead, and how extreme a violation of normality is needed to justify an alternative. Through Monte Carlo simulation, 11 confidence interval methods were compared, including Fisher z', two Spearman rank-order methods, the Box-Cox transformation, rank-based inverse normal (RIN) transformation, and various bootstrap methods. Nonnormality often distorted the Fisher z' confidence interval; for example, it could lead to a nominal 95% confidence interval with actual coverage as low as 68%. Increasing the sample size sometimes worsened this problem. Inaccurate Fisher z' intervals could be predicted by a sample kurtosis of at least 2, an absolute sample skewness of at least 1, or significant violations of normality hypothesis tests. Only the Spearman rank-order and RIN transformation methods were universally robust to nonnormality. Among the bootstrap methods, an observed imposed bootstrap came closest to accurate coverage, though it often resulted in an overly long interval. The results suggest that sample nonnormality can justify avoidance of the Fisher z' interval in favor of a more robust alternative. R code for the relevant methods is provided in supplementary materials.
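The Fisher z' interval that the study scrutinizes is a standard textbook construction: z' = arctanh(r) with approximate standard error 1/sqrt(n - 3), back-transformed with tanh. The sketch below computes it alongside a simple percentile bootstrap on skewed placeholder data; it illustrates the interval itself, not the simulation design or R code of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Placeholder nonnormal data: correlated normals, exponentiated to induce skew.
n = 60
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)
x, y = np.exp(x), np.exp(y)                       # heavy right skew

def fisher_z_ci(x, y, conf=0.95):
    """Fisher z' interval: arctanh(r) +/- z * 1/sqrt(n-3), back-transformed."""
    r = np.corrcoef(x, y)[0, 1]
    z, se = np.arctanh(r), 1 / np.sqrt(len(x) - 3)
    half = 1.959963984540054 * se                 # ~ Phi^-1(0.975) for a 95% interval
    return np.tanh(z - half), np.tanh(z + half)

def bootstrap_ci(x, y, conf=0.95, n_boot=5000):
    """Percentile bootstrap interval for Pearson's r."""
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    rs = np.array([np.corrcoef(x[i], y[i])[0, 1] for i in idx])
    return np.quantile(rs, [(1 - conf) / 2, (1 + conf) / 2])

print("Fisher z' 95% CI :", np.round(fisher_z_ci(x, y), 3))
print("Bootstrap 95% CI :", np.round(bootstrap_ci(x, y), 3))
```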
Fishing effort and catch composition of urban market and rural villages in Brazilian Amazon.
Hallwass, Gustavo; Lopes, Priscila Fabiana; Juras, Anastacio Afonso; Silvano, Renato Azevedo Matias
2011-02-01
The management of small-scale freshwater fisheries in Amazon has been based usually on surveys of urban markets, while fisheries of rural villages have gone unnoticed. We compared the fishing characteristics (catch, effort and selectivity) between an urban market and five small villages in the Lower Tocantins River (Brazilian Amazon), downstream from a large reservoir. We recorded 86 and 601 fish landings in the urban market and villages, respectively, using the same methodology. The urban fishers showed higher catch per unit of effort, higher amount of ice (related to a higher fishing effort, as ice is used to store fish catches) and larger crew size per fishing trip, but village fishers had a higher estimated annual fish production. Conversely, urban and village fishers used similar fishing gear (gillnets) and the main fish species caught were the same. However, village fishers showed more diverse strategies regarding gear, habitats and fish caught. Therefore, although it underestimated the total amount of fish caught in the Lower Tocantins River region, the data from the urban market could be a reliable indicator of main fish species exploited and fishing gear used by village fishers. Monitoring and management should consider the differences and similarities between urban and rural fisheries, in Amazon and in other tropical regions.
Underwood, D; Makar, R R; Gidwani, A L; Najfi, S M; Neilly, P; Gilliland, R
2010-03-01
This study compared the efficacy and patient acceptability of two methods of bowel preparation for flexible sigmoidoscopy. Patients attending for outpatient flexible sigmoidoscopy were prospectively randomized to receive one Fleet ready-to-use enema or 2 x 4 g glycerin suppositories, 2 h preprocedure. Patient and endoscopist questionnaires were used to compare the outcomes. From November 2000 to August 2001, 203 (male = 95; female = 108) patients were randomized. Patient data available for 163 patients (enema = 93; suppository = 70) revealed: ease of use (enema = 52; suppository = 25; P < 0.02, Fisher's exact); assistance required (enema = 19; suppository = 3; P < 0.005, Fisher's exact); grade of effectiveness (enema = 83; suppository = 44; P < 0.0001, Fisher's exact), and whether patients wished to try another preparation in future (enema = 16; suppository = 24; P = 0.016, Fisher's exact). Endoscopist data available for 151 patients (enema = 76; suppository = 75) revealed: average depth of insertion (enema = 53.6 +/- 11.6 cm; suppository = 46.3 +/- 13.7 cm; P < 0.001, Student's t test); acceptable (excellent + good) quality of preparation [enema = 60 (78.9%); suppository = 34 (45.3%); P < 0.0001, Fisher's exact]. Bowel preparation for flexible sigmoidoscopy using a single Fleet enema is acceptable to patients and more effective than glycerin suppositories.
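The pairwise comparisons above are Fisher's exact tests on 2 x 2 contingency tables. As an illustration, the 'ease of use' comparison can be reproduced under the assumption (not stated explicitly in the abstract) that the remaining patients in each arm did not rate the preparation as easy to use.

```python
from scipy.stats import fisher_exact

# 'Ease of use' reconstructed as a 2x2 table (assumption: patients not counted
# as reporting ease of use did not rate the preparation as easy).
#                 easy   not easy
table = [[52, 93 - 52],     # Fleet enema arm (n = 93)
         [25, 70 - 25]]     # glycerin suppository arm (n = 70)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, two-sided Fisher's exact p = {p_value:.4f}")
```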
NASA Astrophysics Data System (ADS)
Abitbol, Maximilian H.; Chluba, Jens; Hill, J. Colin; Johnson, Bradley R.
2017-10-01
Measurements of cosmic microwave background (CMB) spectral distortions have profound implications for our understanding of physical processes taking place over a vast window in cosmological history. Foreground contamination is unavoidable in such measurements and detailed signal-foreground separation will be necessary to extract cosmological science. In this paper, we present Markov chain Monte Carlo based spectral distortion detection forecasts in the presence of Galactic and extragalactic foregrounds for a range of possible experimental configurations, focusing on the Primordial Inflation Explorer (PIXIE) as a fiducial concept. We consider modifications to the baseline PIXIE mission (operating ≃ 12 months in distortion mode), searching for optimal configurations using a Fisher approach. Using only spectral information, we forecast an extended PIXIE mission to detect the expected average non-relativistic and relativistic thermal Sunyaev-Zeldovich distortions at high significance (194σ and 11σ, respectively), even in the presence of foregrounds. The ΛCDM Silk damping μ-type distortion is not detected without additional modifications of the instrument or external data. Galactic synchrotron radiation is the most problematic source of contamination in this respect, an issue that could be mitigated by combining PIXIE data with future ground-based observations at low frequencies (ν ≲ 15-30 GHz). Assuming moderate external information on the synchrotron spectrum, we project an upper limit of |μ| < 3.6 × 10-7 (95 per cent c.l.), slightly more than one order of magnitude above the fiducial ΛCDM signal from the damping of small-scale primordial fluctuations, but a factor of ≃250 improvement over the current upper limit from COBE/Far Infrared Absolute Spectrophotometer. This limit could be further reduced to |μ| < 9.4 × 10-8 (95 per cent c.l.) with more optimistic assumptions about extra low-frequency information and would rule out many alternative inflation models and provide new constraints on decaying particle scenarios.
Forecasting client transitions in British Columbia's Long-Term Care Program.
Lane, D; Uyeno, D; Stark, A; Gutman, G; McCashin, B
1987-01-01
This article presents a model for the annual transitions of clients through various home and facility placements in a long-term care program. The model, an application of Markov chain analysis, is developed, tested, and applied to over 9,000 clients (N = 9,483) in British Columbia's Long Term Care Program (LTC) over the period 1978-1983. Results show that the model gives accurate forecasts of the progress of groups of clients from state to state in the long-term care system from time of admission until eventual death. Statistical methods are used to test the modeling hypothesis that clients' year-over-year transitions occur in constant proportions from state to state within the long-term care system. Tests are carried out by examining actual year-over-year transitions of each year's new admission cohort (1978-1983). Various subsets of the available data are analyzed and, after accounting for clear differences among annual cohorts, the most acceptable model of the actual client transition data occurred when clients were separated into male and female groups, i.e., the transition behavior of each group is describable by a different Markov model. To validate the model, we develop model estimates for the numbers of existing clients in each state of the long-term care system for the period (1981-1983) for which actual data are available. When these estimates are compared with the actual data, total weighted absolute deviations do not exceed 10 percent of actuals. Finally, we use the properties of the Markov chain probability transition matrix and simulation methods to develop three-year forecasts with prediction intervals for the distribution of the existing total clients into each state of the system. The tests, forecasts, and Markov model supplemental information are contained in a mechanized procedure suitable for a microcomputer. The procedure provides a powerful, efficient tool for decision makers planning facilities and services in response to the needs of long-term care clients. PMID:3121537
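The mechanics of such a Markov-chain forecast are simply repeated multiplication of a cohort's state-distribution vector by an annual transition matrix. The states and transition probabilities below are hypothetical placeholders, not the British Columbia estimates.

```python
import numpy as np

# Hypothetical annual transition matrix; rows sum to 1.
# States: 0 = home care, 1 = facility care, 2 = discharged, 3 = deceased (absorbing).
P = np.array([[0.70, 0.15, 0.05, 0.10],
              [0.05, 0.70, 0.02, 0.23],
              [0.00, 0.00, 1.00, 0.00],
              [0.00, 0.00, 0.00, 1.00]])

cohort = np.array([800.0, 200.0, 0.0, 0.0])    # new admissions by initial placement

state = cohort.copy()
for year in range(1, 4):                       # three-year forecast
    state = state @ P                          # propagate the cohort one year
    print(f"year {year}: " + ", ".join(f"{x:6.1f}" for x in state))
```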
Interpreting Gas Production Decline Curves By Combining Geometry and Topology
NASA Astrophysics Data System (ADS)
Ewing, R. P.; Hu, Q.
2014-12-01
Shale gas production forms an increasing fraction of domestic US energy supplies, but individual gas production wells show steep production declines. Better understanding of this production decline would allow better economic forecasting; better understanding of the reasons behind the decline would allow better production management. Yet despite these incentives, production decline curves remain poorly understood, and current analyses range from Arps' purely empirical equation to new sophisticated approaches requiring multiple unavailable parameters. Models often fail to capture salient features: for example, in log-log space many wells decline with an exponent markedly different from the -0.5 expected from diffusion, and often show a transition from one decline mode to another. We propose a new approach based on the assumption that the rate-limiting step is gas movement from the matrix to the induced fracture network. The matrix is represented as an assemblage of equivalent spheres (geometry), with low matrix pore connectivity (topology) that results in a distance-dependent accessible porosity profile given by percolation theory. The basic theory has just 2 parameters: the sphere size distribution (geometry), and the crossover distance (topology) that characterizes the porosity distribution. The theory is readily extended to include e.g. alternative geometries and bi-modal size distributions. Comparisons with historical data are promising.
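Since Arps' empirical equation is the baseline the authors argue against, a minimal sketch of fitting its hyperbolic form q(t) = qi / (1 + b Di t)^(1/b) to synthetic monthly production data is shown below; the parameter values, noise level and fitting choices are generic assumptions, not taken from the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Synthetic "observed" production: a true decline plus multiplicative noise.
rng = np.random.default_rng(6)
t = np.arange(1, 61)                            # months on production
q_obs = arps(t, qi=900.0, di=0.15, b=0.9) * rng.lognormal(0.0, 0.05, size=t.size)

popt, _ = curve_fit(arps, t, q_obs, p0=[800.0, 0.1, 0.5],
                    bounds=([1, 1e-4, 0.01], [1e5, 5.0, 2.0]))
qi_fit, di_fit, b_fit = popt
print(f"fitted qi = {qi_fit:.0f}, Di = {di_fit:.3f} /month, b = {b_fit:.2f}")

# Late-time log-log slope of the fitted decline (diffusion would give ~ -0.5).
slope = np.polyfit(np.log(t[24:]), np.log(arps(t[24:], *popt)), 1)[0]
print(f"late-time log-log slope ~ {slope:.2f}")
```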
A review and forecast of engine system research at the Army Propulsion Directorate
NASA Technical Reports Server (NTRS)
Bobula, George A.
1989-01-01
An account is given of the development status and achievements to date of the U.S. Army Propulsion Directorate's Small Turbine Engine Research (STER) programs, which are experimental investigations of the physics of entire engine systems from the viewpoints of component interactions and/or system dynamics. STER efforts are oriented toward the evaluation of complete turboshaft engine advanced concepts and are conducted at the ECRL-2 indoor, sea-level engine test facility. Attention is given to the results obtained by STER experiments concerned with IR-suppressing engine exhausts, a ceramic turbine-blade shroud, an active shaft-vibration control system, and a ceramic-matrix combustor liner.
Computer simulation of low-temperature composites sintering processes for additive technologies
NASA Astrophysics Data System (ADS)
Tovpinets, A. O.; Leytsin, V. N.; Dmitrieva, M. A.
2017-12-01
This work investigates the impact of the characteristics of the raw mixture components on structure formation in low-temperature composites during sintering. The results show that determining the structure of the initial compacts obtained after thermal destruction of the polymer binder makes it possible to quantify the concentrations of the main components and of the refractory crystalline product of thermal destruction. Accounting for the distribution of this refractory product allows the forecast of thermal stresses in the matrix of the sintered composite to be refined. The presented results can be considered a basis for optimizing the initial compositions of multilayer low-temperature composites obtained by additive technologies.
FISHER INFORMATION AND DYNAMIC REGIME CHANGES IN ECOLOGICAL SYSTEMS
Abstract for the 3rd Conference of the International Society for Ecological Informatics
Audrey L. Mayer, Christopher W. Pawlowski, and Heriberto Cabezas
The sustainable nature of particular dynamic...
A CFO's Perspective on the Quality Revolution.
ERIC Educational Resources Information Center
Norton, Alan J.
1994-01-01
The chief financial officer (CFO) of St. John Fisher College (New York) analyzes the costs associated with the implementation of quality management at St. John Fisher and outlines one way to determine whether the investment is yielding an acceptable internal rate of return. (DB)
On Fisher Information and Thermodynamics
Fisher information is a measure of the information obtainable by an observer from the observation of reality. However, information is obtainable only when there are patterns or features to observe, and these only exist when there is order. For example, a system in perfect disor...
77 FR 64816 - National Human Genome Research Institute; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... sign language interpretation or other reasonable accommodations, should notify the Contact Person... relevance. Place: National Human Genome Research Institute, 5635 Fishers Lane, Terrace Level Conference Room... Genome Research Institute, 5635 Fishers Lane, Terrace Level Conference Room, Rockville, MD 20892. Contact...
FISHER INFORMATION AS A METRIC FOR SUSTAINABLE REGIMES
The important question in sustainability is not whether the world is sustainable, but whether a humanly acceptable regime of the world is sustainable. We propose Fisher Information as a metric for the sustainability of dynamic regimes in complex systems. The quantity now known ...
Book review: Biology and conservation of martens, sables, and fishers: A new synthesis
Jenkins, Kurt J.
2013-01-01
Review info: Biology and conservation of martens, sables, and fishers: A new synthesis. Edited by K.B. Aubry, W.J. Zielinski, M.G. Raphael, G. Proulx, and S.W. Buskirk, 2012. ISBN: 978-08014, 580pp.
Assessing Fishers' Support of Striped Bass Management Strategies.
Murphy, Robert D; Scyphers, Steven B; Grabowski, Jonathan H
2015-01-01
Incorporating the perspectives and insights of stakeholders is an essential component of ecosystem-based fisheries management, such that policy strategies should account for the diverse interests of various groups of anglers to enhance their efficacy. Here we assessed fishing stakeholders' perceptions on the management of Atlantic striped bass (Morone saxatilis) and receptiveness to potential future regulations using an online survey of recreational and commercial fishers in Massachusetts and Connecticut (USA). Our results indicate that most fishers harbored adequate to positive perceptions of current striped bass management policies when asked to grade their state's management regime. Yet, subtle differences in perceptions existed between recreational and commercial fishers, as well as across individuals with differing levels of fishing experience, resource dependency, and tournament participation. Recreational fishers in both states were generally supportive or neutral towards potential management actions including slot limits (71%) and mandated circle hooks to reduce mortality of released fish (74%), but less supportive of reduced recreational bag limits (51%). Although commercial anglers were typically less supportive of management changes than their recreational counterparts, the majority were still supportive of slot limits (54%) and mandated use of circle hooks (56%). Our study suggests that both recreational and commercial fishers are generally supportive of additional management strategies aimed at sustaining healthy striped bass populations and agree on a variety of strategies. However, both stakeholder groups were less supportive of harvest reductions, which is the most direct measure of reducing mortality available to fisheries managers. By revealing factors that influence stakeholders' support or willingness to comply with management strategies, studies such as ours can help managers identify potential stakeholder support for or conflicts that may result from regulation changes.
Effectiveness of scat detection dogs for detecting forest carnivores
Long, Robert A.; Donovan, T.M.; MacKay, Paula; Zielinski, William J.; Buzas, Jeffrey S.
2007-01-01
We assessed the detection and accuracy rates of detection dogs trained to locate scats from free-ranging black bears (Ursus americanus), fishers (Martes pennanti), and bobcats (Lynx rufus). During the summers of 2003-2004, 5 detection teams located 1,565 scats (747 putative black bear, 665 putative fisher, and 153 putative bobcat) at 168 survey sites throughout Vermont, USA. Of 347 scats genetically analyzed for species identification, 179 (51.6%) yielded a positive identification, 131 (37.8%) failed to yield DNA information, and 37 (10.7%) yielded DNA but provided no species confirmation. For 70 survey sites where confirmation of a putative target species' scat was not possible, we assessed the probability that ≥1 of the scats collected at the site was deposited by the target species (probability of correct identification; PID). Based on species confirmations or PID values, we detected bears at 57.1% (96) of sites, fishers at 61.3% (103) of sites, and bobcats at 12.5% (21) of sites. We estimated that the mean probability of detecting the target species (when present) during a single visit to a site was 0.86 for black bears, 0.95 for fishers, and 0.40 for bobcats. The probability of detecting black bears was largely unaffected by site- or visit-specific covariates, but the probability of detecting fishers varied by detection team. We found little or no effect of topographic ruggedness, vegetation density, or local weather (e.g., temp, humidity) on detection probability for fishers or black bears (data were insufficient for bobcat analyses). Detection dogs were highly effective at locating scats from forest carnivores and provided an efficient and accurate method for collecting detection-nondetection data on multiple species.
Chou, Berry Yun-Hua; Liao, Chung-Min; Lin, Ming-Chao; Cheng, Hsu-Hui
2006-05-01
This paper presents a toxicokinetic/toxicodynamic analysis to appraise arsenic (As) bioaccumulation in farmed juvenile milkfish Chanos chanos in a blackfoot disease (BFD)-endemic area of Taiwan; probabilistic incremental lifetime cancer risk (ILCR) and hazard quotient (HQ) models are also employed to assess the range of exposures for the fishers and non-fishers who eat the contaminated fish. We conducted a 7-day exposure experiment to obtain toxicokinetic parameters, whereas a simple critical body burden toxicity model was verified with LC50(t) data obtained from a 7-day acute toxicity bioassay. The acute toxicity bioassay indicates that the 96-h LC50 for juvenile milkfish exposed to As is 7.29 (95% CI: 3.10-10.47) mg l^-1. Our risk analysis for milkfish reared in the BFD-endemic area indicates a low likelihood that survival is being affected by waterborne As. Human risk analysis demonstrates that 90th-percentile exposure ILCRs for fishers in the BFD-endemic area are of the order of 10^-3, indicating a high potential carcinogenic risk, whereas there is no significant cancer risk for non-fishers (ILCRs around 10^-5). All predicted 90th percentiles of HQ are less than 1 for non-fishers, yet larger than 10 for fishers, indicating larger contributions from farmed milkfish consumption. Sensitivity analysis indicates that, to increase the accuracy of the results, efforts should focus on a better definition of the probability distributions for milkfish daily consumption rate and As level in milkfish. Here we show that theoretical human health risks for consuming As-contaminated milkfish in the BFD-endemic area are alarming under conservative conditions based on a probabilistic risk assessment model.
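As a generic illustration of how probabilistic ILCR and HQ values of this kind are computed (standard intake-based screening formulas with made-up lognormal inputs; not the paper's toxicokinetic model or its site-specific exposure data), a Monte Carlo calculation looks like the sketch below. The slope factor and reference dose are the commonly cited IRIS values for inorganic arsenic, used here only as placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000                                        # Monte Carlo draws

# Made-up lognormal exposure inputs (placeholders, not the paper's data):
conc = rng.lognormal(np.log(0.5), 0.4, n)          # As in fish, mg/kg wet weight
intake = rng.lognormal(np.log(0.06), 0.5, n)       # fish consumption, kg/day
bw = rng.normal(60.0, 8.0, n)                      # body weight, kg

add = conc * intake / bw                           # average daily dose, mg/kg/day

SF = 1.5     # oral cancer slope factor for inorganic As, (mg/kg/day)^-1 (IRIS value)
RfD = 3e-4   # oral reference dose for inorganic As, mg/kg/day (IRIS value)

ilcr = add * SF                                    # incremental lifetime cancer risk
hq = add / RfD                                     # hazard quotient

print("90th percentile ILCR:", f"{np.percentile(ilcr, 90):.1e}")
print("90th percentile HQ  :", f"{np.percentile(hq, 90):.1f}")
```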
77 FR 15650 - Fisher House and Other Temporary Lodging
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
... proposed rule generally reflects current VA policy and practice, and conforms to industry standards and... individuals from the medical care environment, Fisher House lodging is available only to accompanying... have an insignificant impact on small entities involved in the lodging industry. However, any effect...
Quantum Fisher information on its own is not a valid measure of the coherence
NASA Astrophysics Data System (ADS)
Kwon, Hyukjoon; Tan, Kok Chuan; Choi, Seongjeon; Jeong, Hyunseok
2018-06-01
We show that contrary to the claim in Feng and Wei (2017), the quantum Fisher information itself is not a valid measure of the coherence based on the resource theory because it can increase via an incoherent operation.
75 FR 51828 - National Human Genome Research Institute; Notice of Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... space available. Individuals who plan to attend and need special assistance, such as sign language.... Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level Conference Center, Bethesda, MD.../or proposals. Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level Conference...
78 FR 20932 - National Institute on Alcohol Abuse and Alcoholism; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-08
... sign language interpretation or other reasonable accommodations, should notify the Contact Person... applications. Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level Conference Rooms, Bethesda... council. Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level Conference Rooms...
77 FR 8268 - National Human Genome Research Institute; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Human Genome... applications. Place: National Human Genome Research Institute, 5635 Fisher's Lane, Room 4076, Rockville, MD..., CIDR, National Human Genome Research Institute, National Institutes of Health, 5635 Fishers Lane, Suite...
78 FR 21382 - National Human Genome Research Institute; Notice of Closed Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-10
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Human Genome... applications. Place: National Human Genome Research Institute, Suite 4076, 5635 Fisher's Lane, Bethesda, MD..., National Human Genome Research Institute, National Institutes of Health, 5635 Fishers Lane, Suite 4075...
38 CFR 60.8 - Lodging availability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Lodging availability. 60.8 Section 60.8 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) FISHER HOUSES AND OTHER TEMPORARY LODGING § 60.8 Lodging availability. Fisher Houses are available solely...
Roger A. Powell; Steven W. Buskirk; William J. Zielinski
2003-01-01
The genus Martes is circumboreal in distribution, with extensions into southern (M. gwatkinsii) and southeast Asia as far as 7°S latitude (M. flavigula; Anderson 1970). The fisher (subgenus Pekania) is endemic to the New World and restricted to mesic coniferous forest of the boreal zone and its...
38 CFR 60.8 - Lodging availability.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Lodging availability. 60.8 Section 60.8 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) FISHER HOUSES AND OTHER TEMPORARY LODGING § 60.8 Lodging availability. Fisher Houses are available solely...
Detection and Assessment of Ecosystem Regime Shifts from Fisher Information
Ecosystem regime shifts, which are long-term system reorganizations, have profound implications for sustainability. There is a great need for indicators of regime shifts, particularly methods that are applicable to data from real systems. We have developed a form of Fisher info...
A COMPARISON OF MERCURY IN MINK AND FISHER IN RHODE ISLAND
Comparison of total mercury concentrations and nitrogen and carbon stable isotope values in muscle tissue and stomach contents of mink (Mustela vison) and fisher (Martes pennanti) from Rhode Island in 2000- 2003 showed results which appeared to reflect dietary differences betwee...
Hazardous Waste Cleanup: Fisher Scientific in Bridgewater, New Jersey
The Fisher Scientific Packaging Facility is an operating facility located on approximately 58 acres in Bridgewater, New Jersey. The site is bounded to the north by Route 202. Most of the frontage on Route 202 is retail/commercial, but there are still small
Fisher equation for anisotropic diffusion: simulating South American human dispersals.
Martino, Luis A; Osella, Ana; Dorso, Claudio; Lanata, José L
2007-09-01
The Fisher equation is commonly used to model population dynamics. It describes reaction-diffusion processes, accounting for both population growth and a diffusion mechanism. Some results have been reported on modelling human dispersal, always assuming isotropic diffusion. Nevertheless, it is well known that dispersal depends not only on the characteristics of the habitats where individuals are, but also on the properties of the places where they intend to move, so isotropic approaches cannot adequately reproduce the evolution of the wave of advance of populations. Solutions to the Fisher equation are difficult to obtain for complex geometries, especially when anisotropy has to be considered, and few studies have been conducted in this direction. With this scope in mind, we present in this paper a solution of the Fisher equation that incorporates anisotropy. We apply a finite-difference method using the Crank-Nicolson approximation and analyse the results as a function of the characteristic parameters. Finally, this methodology is applied to model South American human dispersal.
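As a point of reference for the isotropic baseline that the authors generalize, here is a minimal one-dimensional Fisher-KPP sketch, u_t = D u_xx + r u(1 − u), integrated with a Crank-Nicolson treatment of the diffusion term and an explicit reaction term. The grid, parameters and zero-flux boundaries are illustrative assumptions; the paper's anisotropic two-dimensional scheme is not reproduced here.

# Minimal 1D Fisher-KPP sketch: u_t = D u_xx + r u (1 - u).
# Diffusion handled with a Crank-Nicolson step, reaction treated explicitly.
# Isotropic toy illustration only.
import numpy as np

D, r = 1.0, 1.0
L, N, T, dt = 100.0, 401, 30.0, 0.05
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]
alpha = D * dt / dx**2

# Tridiagonal Laplacian with zero-flux (Neumann) boundaries.
lap = (np.diag(-2.0 * np.ones(N)) +
       np.diag(np.ones(N - 1), 1) +
       np.diag(np.ones(N - 1), -1))
lap[0, 1] = lap[-1, -2] = 2.0

A = np.eye(N) - 0.5 * alpha * lap   # implicit half of the diffusion step
B = np.eye(N) + 0.5 * alpha * lap   # explicit half of the diffusion step

u = np.where(x < 10.0, 1.0, 0.0)    # initial population front
for _ in range(int(T / dt)):
    rhs = B @ u + dt * r * u * (1.0 - u)
    u = np.linalg.solve(A, rhs)

# The front should advance at roughly the classical speed c = 2*sqrt(r*D).
front = x[np.argmin(np.abs(u - 0.5))]
print(f"front position after t={T}: {front:.1f} (asymptotic estimate ~ {2*np.sqrt(r*D)*T + 10:.1f})")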
Galaxy luminosity function and Tully-Fisher relation: reconciled through rotation-curve studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaneo, Andrea; Salucci, Paolo; Papastergis, Emmanouil, E-mail: andrea.cattaneo@oamp.fr, E-mail: salucci@sissa.it, E-mail: papastergis@astro.cornell.edu
2014-03-10
The relation between galaxy luminosity L and halo virial velocity v {sub vir} required to fit the galaxy luminosity function differs from the observed Tully-Fisher relation between L and disk speed v {sub rot}. Because of this, the problem of reproducing the galaxy luminosity function and the Tully-Fisher relation simultaneously has plagued semianalytic models since their inception. Here we study the relation between v {sub rot} and v {sub vir} by fitting observational average rotation curves of disk galaxies binned in luminosity. We show that the v {sub rot}-v {sub vir} relation that we obtain in this way can fully account for this seeming inconsistency. Therefore, the reconciliation of the luminosity function with the Tully-Fisher relation rests on the complex dependence of v {sub rot} on v {sub vir}, which arises because the ratio of stellar mass to dark matter mass is a strong function of halo mass.
Optimal adaptive control for quantum metrology with time-dependent Hamiltonians.
Pang, Shengshi; Jordan, Andrew N
2017-03-09
Quantum metrology has been studied for a wide range of systems with time-independent Hamiltonians. For systems with time-dependent Hamiltonians, however, due to the complexity of dynamics, little has been known about quantum metrology. Here we investigate quantum metrology with time-dependent Hamiltonians to bridge this gap. We obtain the optimal quantum Fisher information for parameters in time-dependent Hamiltonians, and show proper Hamiltonian control is generally necessary to optimize the Fisher information. We derive the optimal Hamiltonian control, which is generally adaptive, and the measurement scheme to attain the optimal Fisher information. In a minimal example of a qubit in a rotating magnetic field, we find a surprising result that the fundamental limit of T² time scaling of quantum Fisher information can be broken with time-dependent Hamiltonians, which reaches T⁴ in estimating the rotation frequency of the field. We conclude by considering level crossings in the derivatives of the Hamiltonians, and point out additional control is necessary for that case.
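A hedged numerical sketch of the quantity being optimized: the quantum Fisher information (QFI) for estimating the rotation frequency ω of a qubit driven by a rotating field, computed here without any control by evolving the state and differentiating it with respect to ω. The Hamiltonian form, field strength and time grid are assumptions for illustration, not the paper's exact setup.

# Hedged sketch: QFI for estimating the rotation frequency omega of a qubit
# in a rotating field, with no Hamiltonian control. Parameters are illustrative.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def evolve(omega, B, T, steps=4000):
    """Trotterized evolution of |0> under H(t) = (B/2)(cos(wt) sz + sin(wt) sx) (assumed form)."""
    dt = T / steps
    psi = np.array([1.0, 0.0], dtype=complex)
    for k in range(steps):
        t = (k + 0.5) * dt
        H = 0.5 * B * (np.cos(omega * t) * sz + np.sin(omega * t) * sx)
        w, V = np.linalg.eigh(H)                         # small-step propagator exp(-i H dt)
        psi = V @ np.diag(np.exp(-1j * w * dt)) @ V.conj().T @ psi
    return psi

def qfi(omega, B, T, eps=1e-6):
    """QFI of the final pure state: F = 4(<dpsi|dpsi> - |<psi|dpsi>|^2)."""
    psi = evolve(omega, B, T)
    dpsi = (evolve(omega + eps, B, T) - evolve(omega - eps, B, T)) / (2 * eps)
    return float(4 * (np.vdot(dpsi, dpsi).real - abs(np.vdot(psi, dpsi))**2))

B, omega = 1.0, 1.0
for T in (2.0, 4.0, 8.0):
    print(f"T={T:4.1f}  QFI={qfi(omega, B, T):.3f}")
# Without the adaptive control derived in the paper, the QFI falls short of the
# T^4 scaling that optimal Hamiltonian control can attain.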
Quantum demultiplexer of quantum parameter-estimation information in quantum networks
NASA Astrophysics Data System (ADS)
Xie, Yanqing; Huang, Yumeng; Wu, Yinzhong; Hao, Xiang
2018-05-01
The quantum demultiplexer is constructed from a series of unitary operators and multipartite entangled states. It is used to broadcast information from an input node to multiple output nodes in quantum networks. A scheme for communicating phase-estimation information across the network is put forward, with the demultiplexer subjected to amplitude-damping noise. Generalized partial measurements can be applied in the protocol to protect the transfer efficiency against environmental noise. We find that there are optimal coherent states which can be prepared to enhance the transmission of phase estimation. The dynamics of the state fidelity and the quantum Fisher information are investigated to evaluate the feasibility of the network communication. While the state fidelity deteriorates rapidly, the quantum Fisher information can be enhanced to a maximum value and then decreases slowly. The memory effect of the environment induces oscillations of the fidelity and the quantum Fisher information. Adjusting the strength of the partial measurements helps to increase the quantum Fisher information.
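Since the protocol's figure of merit is the quantum Fisher information of a noisy (mixed) state, a hedged sketch follows that evaluates the standard spectral formula for a phase-encoded qubit sent through an amplitude-damping channel. The probe state, channel and parameter values are illustrative assumptions and do not reproduce the multipartite demultiplexer protocol.

# Hedged sketch: QFI for phase estimation on a single qubit under amplitude
# damping, via F_Q = sum_{i,j} 2 |<i| d_theta rho |j>|^2 / (lambda_i + lambda_j).
# State, channel and parameters are illustrative assumptions only.
import numpy as np

def amplitude_damp(rho, gamma):
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

def rho_theta(theta, gamma):
    plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)   # probe |+>
    psi = np.diag([1.0, np.exp(1j * theta)]) @ plus           # phase encoding
    return amplitude_damp(np.outer(psi, psi.conj()), gamma)

def qfi(theta, gamma, eps=1e-6):
    rho = rho_theta(theta, gamma)
    drho = (rho_theta(theta + eps, gamma) - rho_theta(theta - eps, gamma)) / (2 * eps)
    lam, vec = np.linalg.eigh(rho)
    F = 0.0
    for i in range(2):
        for j in range(2):
            denom = lam[i] + lam[j]
            if denom > 1e-12:                                 # skip the null subspace
                F += 2 * abs(vec[:, i].conj() @ drho @ vec[:, j])**2 / denom
    return F

for gamma in (0.0, 0.2, 0.5):
    print(f"gamma={gamma:.1f}  F_Q={qfi(0.7, gamma):.3f}")
# gamma = 0 recovers F_Q = 1 for a |+> probe; damping reduces the attainable precision.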
Social networks and environmental outcomes.
Barnes, Michele L; Lynham, John; Kalberg, Kolter; Leung, PingSun
2016-06-07
Social networks can profoundly affect human behavior, which is the primary force driving environmental change. However, empirical evidence linking microlevel social interactions to large-scale environmental outcomes has remained scarce. Here, we leverage comprehensive data on information-sharing networks among large-scale commercial tuna fishers to examine how social networks relate to shark bycatch, a global environmental issue. We demonstrate that the tendency for fishers to primarily share information within their ethnic group creates segregated networks that are strongly correlated with shark bycatch. However, some fishers share information across ethnic lines, and examinations of their bycatch rates show that network contacts are more strongly related to fishing behaviors than ethnicity. Our findings indicate that social networks are tied to actions that can directly impact marine ecosystems, and that biases toward within-group ties may impede the diffusion of sustainable behaviors. Importantly, our analysis suggests that enhanced communication channels across segregated fisher groups could have prevented the incidental catch of over 46,000 sharks between 2008 and 2012 in a single commercial fishery.
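As a hedged illustration of how such within-group bias can be quantified, the sketch below computes Krackhardt's E-I index (external minus internal ties over total ties) on a small synthetic network. The node groups, tie probabilities and network size are invented for illustration and are unrelated to the study's fisher data.

# Hedged sketch: a simple homophily measure (Krackhardt E-I index) for an
# information-sharing network, on synthetic toy data.
import random

random.seed(1)
n = 40
group = ["A" if i < n // 2 else "B" for i in range(n)]

# generate mostly within-group ties plus a few bridging ties
edges = set()
while len(edges) < 150:
    u, v = random.sample(range(n), 2)
    same = group[u] == group[v]
    if random.random() < (0.9 if same else 0.15):
        edges.add((min(u, v), max(u, v)))

internal = sum(group[u] == group[v] for u, v in edges)
external = len(edges) - internal
ei_index = (external - internal) / (external + internal)
print(f"internal ties: {internal}, external ties: {external}, E-I index: {ei_index:+.2f}")
# An E-I index near -1 indicates strongly segregated (within-group) information
# sharing; values near +1 indicate mostly cross-group ties.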
Subsurface characterization with localized ensemble Kalman filter employing adaptive thresholding
NASA Astrophysics Data System (ADS)
Delijani, Ebrahim Biniaz; Pishvaie, Mahmoud Reza; Boozarjomehry, Ramin Bozorgmehry
2014-07-01
The ensemble Kalman filter (EnKF), a Monte Carlo sequential data assimilation method, has emerged as a promising tool for subsurface media characterization during the past decade. Because of the high computational cost of large ensembles, EnKF is limited to small ensemble sizes in practice. This leads to spurious correlations in the covariance structure, which in turn cause incorrect updates or divergence of the updated realizations. In this paper, a universal/adaptive thresholding method is presented to remove and/or mitigate the spurious-correlation problem in the forecast covariance matrix. The method is then extended to regularize the Kalman gain directly. Four different thresholding functions are considered for thresholding the forecast covariance and gain matrices: hard, soft, lasso and Smoothly Clipped Absolute Deviation (SCAD) functions. Three benchmarks are used to evaluate the performance of these methods: a small 1D linear model and two 2D water-flooding cases (in petroleum reservoirs) with different levels of heterogeneity and nonlinearity. Besides adaptive thresholding, standard distance-dependent localization and the bootstrap Kalman gain are also implemented for comparison. Each setup is assessed with different ensemble sets to investigate the sensitivity of each method to ensemble size. The results indicate that thresholding the forecast covariance yields more reliable performance than thresholding the Kalman gain. Among the thresholding functions, SCAD is the most robust for both covariance and gain estimation. Our analyses emphasize that not all assimilation cycles require thresholding and that it should be applied judiciously during the early assimilation cycles. The proposed adaptive thresholding scheme outperforms the other methods for subsurface characterization of the underlying benchmarks.
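A minimal sketch of one perturbed-observation EnKF analysis step in which the off-diagonal entries of the sample forecast covariance are soft-thresholded before the Kalman gain is formed. The linear toy problem, the observation operator and the fixed threshold tau are illustrative assumptions; the paper chooses the threshold adaptively and also studies hard, lasso and SCAD variants.

# Hedged sketch: one EnKF analysis step with soft thresholding of the sample
# forecast covariance before forming the Kalman gain.
import numpy as np

rng = np.random.default_rng(0)
nx, nobs, ne = 50, 10, 20                    # state size, observations, ensemble size

def soft_threshold(M, tau):
    """Shrink off-diagonal entries toward zero; keep the diagonal intact."""
    S = np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)
    np.fill_diagonal(S, np.diag(M))
    return S

# synthetic truth, forecast ensemble and observations (illustrative)
truth = np.sin(np.linspace(0, 2 * np.pi, nx))
X_f = truth[:, None] + rng.normal(0, 0.5, (nx, ne))   # forecast ensemble
obs_idx = np.arange(0, nx, nx // nobs)                # observe every 5th state variable
H = np.zeros((nobs, nx)); H[np.arange(nobs), obs_idx] = 1.0
R = 0.1 * np.eye(nobs)
y = H @ truth + rng.multivariate_normal(np.zeros(nobs), R)

# sample forecast covariance, thresholded to suppress spurious correlations
A = X_f - X_f.mean(axis=1, keepdims=True)
P_f = soft_threshold(A @ A.T / (ne - 1), tau=0.05)

K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)      # Kalman gain
Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(nobs), R, ne).T
X_a = X_f + K @ (Y_pert - H @ X_f)                    # analysis ensemble

print("forecast RMSE:", np.sqrt(np.mean((X_f.mean(1) - truth) ** 2)))
print("analysis RMSE:", np.sqrt(np.mean((X_a.mean(1) - truth) ** 2)))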
Astronaut Anna Fisher practices control of the RMS in a trainer
1984-08-21
S84-40162 (21 Aug. 1984) --- Astronaut Anna L. Fisher controls the Remote Manipulator System (RMS) arm from inside the "orbiter" as part of her training program in the Johnson Space Center's Shuttle Mock-up and Integration Laboratory. Dr. Fisher, one of three mission specialists for mission 51-A, is inside the cabin portion of a trainer called the Manipulator Development Facility (MDF). She is able to operate the arm in conjunction with an air bearing floor and to log a great deal of rehearsal time for her flight, on which the retrieval of a low-orbiting communications satellite is planned. Photo credit: NASA
Rosales, Alirio
2017-04-01
Theories are composed of multiple interacting components. I argue that some theories have narratives as essential components, and that narratives function as integrative devices for the mathematical components of theories. Narratives represent complex processes unfolding in time as a sequence of stages, and hold the mathematical elements together as pieces in the investigation of a given process. I present two case studies from population genetics: R. A. Fisher's "mass selection" theory, and Sewall Wright's shifting balance theory. I apply my analysis to an early episode of the "R. A. Fisher - Sewall Wright controversy."
Maniac talk - Dr. Richard R. Fisher, Director, Heliophysics Division (Emeritus), NASA HQ
2016-05-25
Dr. Richard R. Fisher: "As in the case of learning how to perform in any specialized context, I found there were a number of issues I was neither taught nor did I learn from life experience. Over the course of a 50-year career that transitioned from ground-based to space-based, I came to understand that there are specific tools and values that proved vital. Using my own journey, I shall summarize a few of the more useful, to identify and make available things and ideas that helped me with my time with NASA." Dr. Richard R. Fisher, Director, Heliophysics Division (Emeritus), NASA HQ
Colman, Steven M.
1983-01-01
Apparently, several pulses of salt flowed into the diapir between about 2-3 and 0.25 Myr ago, and the diapir may still be active. The rising salt diapir impeded the flow of ancestral Fisher Creek, causing deposition of more than 125 m of basin-fill sediments, and eventually diverted the creek down Cottonwood graben to the Dolores River about 0.25 Myr ago. Onion Creek has eroded headward from the Colorado River, through both the diapir and the basin-fill sediments, and is about to capture Fisher Creek, restoring the original drainage course. -from Author
Michael Fisher at King's College London
NASA Astrophysics Data System (ADS)
Domb, Cyril
1991-09-01
Michael Fisher spent the first 16 years of his academic life in the Physics Department of King's College, London, starting as an undergraduate and ending as a full professor. A survey is undertaken of his activities and achievements during the various periods of this phase of his career.
Comment on ‘On the realisation of quantum Fisher information’
NASA Astrophysics Data System (ADS)
Olendski, O.
2017-05-01
It is shown that the calculation of the momentum Fisher information of the quasi-one-dimensional hydrogen atom recently presented by Saha et al (2017 Eur. J. Phys. 38 025103) is wrong. A correct derivation is provided and its didactic advantages and scientific significance are highlighted.
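The quantity at issue is the Fisher information of a one-dimensional probability density, I = ∫ (dρ/dx)² / ρ dx. As a hedged, generic illustration (not a reproduction of the quasi-one-dimensional hydrogen calculation), the sketch below evaluates this integral numerically and checks it against a Gaussian density, for which the analytic value is 1/σ².

# Hedged sketch: numerical Fisher information I = ∫ (dρ/dx)^2 / ρ dx of a 1D
# probability density, checked against the Gaussian case where I = 1/σ².
import numpy as np

def fisher_information(x, rho):
    dx = x[1] - x[0]
    rho = rho / (rho.sum() * dx)           # normalize on the uniform grid
    drho = np.gradient(rho, dx)
    mask = rho > 1e-12                     # avoid dividing by ~0 in the tails
    return float(np.sum(drho[mask] ** 2 / rho[mask]) * dx)

sigma = 0.7
x = np.linspace(-10, 10, 20001)
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
print(fisher_information(x, rho), "vs analytic", 1 / sigma**2)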
REGIME CHANGES IN ECOLOGICAL SYSTEMS: AN INFORMATION THEORY APPROACH
We present our efforts at characterizing regime changes in ecological systems using Information Theory. We derive an expression for Fisher Information based on sampling of the system trajectory as it evolves in state space. The Fisher Information index as we have derived it captures the characte...
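One widely used working form of this trajectory-based Fisher Information (in the Cabezas/Fath line of work) is FI ≈ (1/T) ∫ s''(t)² / s'(t)⁴ dt, where s'(t) is the speed of the system point along its trajectory through state space and s''(t) its time derivative. The sketch below, with a synthetic two-regime trajectory and an arbitrary window length, illustrates that form only; it is not a reproduction of the derivation referred to in the abstract.

# Hedged sketch: windowed Fisher Information index of a sampled state-space
# trajectory, FI ≈ (1/T) ∫ s''² / s'⁴ dt. Data and window length are assumptions.
import numpy as np

def fisher_index(X, dt):
    """X: trajectory samples of shape (n_times, n_vars) taken at interval dt."""
    v = np.gradient(X, dt, axis=0)               # velocities dx_i/dt
    speed = np.linalg.norm(v, axis=1)            # s'(t), tangential speed
    accel = np.gradient(speed, dt)               # s''(t)
    T = dt * (len(X) - 1)
    integrand = accel**2 / np.maximum(speed, 1e-9)**4
    return float(np.sum(integrand) * dt / T)

dt = 0.1
t = np.arange(0.0, 200.0, dt)
half = len(t) // 2
regime1 = np.column_stack([np.sin(t[:half]), 0.3 * np.cos(t[:half])])        # eccentric orbit
regime2 = np.column_stack([2.0 * np.sin(t[half:]), 2.0 * np.cos(t[half:])])  # wider circular orbit
X = np.vstack([regime1, regime2])

window = int(20.0 / dt)                          # 20-time-unit windows
for start in range(0, len(t) - window + 1, window):
    fi = fisher_index(X[start:start + window], dt)
    print(f"t = {t[start]:6.1f} .. {t[start + window - 1]:6.1f}   FI = {fi:10.4f}")
# A sustained shift in the FI level across windows is read as a change in the
# system's dynamic order, i.e. a candidate regime shift.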
Resting habitat selection by fishers in California
William J. Zielinski; Richard L. Truex; Gregory A. Schmidt; Fredrick V. Schlexer; Kristin N. Schmidt; Reginald H. Barrett
2004-01-01
We studied the resting habitat ecology of fishers (Martes pennanti) in 2 disjunct populations in California, USA: the northwestern coastal mountains (hereafter, Coastal) and the southern Sierra Nevada (hereafter, Sierra). We described resting structures and compared features surrounding resting structures (the resting site) with those at randomly...
ERIC Educational Resources Information Center
Fisher, William P., Jr.; Choi, Ellie; Fisher, William P.; Stenner, A. Jackson; Horabin, Ivan; Wright, Benjamin D.
1998-01-01
Comments on measurement aspects are presented in discussions of (1) methodology and morality (W. P. Fisher); (2) Rasch measurement (E. Choi); (3) novel wisdom of the Rasch approach (W. P. Fisher); (4) development of construct definition and calibration (A. J. Stenner and I. Horabin); and (5) origin of dimensions (B. D. Wright). (SLD)
78 FR 47715 - National Human Genome Research Institute; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... space available. Individuals who plan to attend and need special assistance, such as sign language...: National Institutes of Health, Terrace Level Conference Room, 5635 Fishers Lane, Rockville, MD 20892. Open... Institutes of Health, Terrace Level Conference Room, 5635 Fishers Lane, Rockville, MD 20892. Closed...
75 FR 46951 - National Human Genome Research Institute; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... space available. Individuals who plan to attend and need special assistance, such as sign language... relevance. Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level Conference Room, Bethesda... applications and/or proposals. Place: National Institutes of Health, 5635 Fishers Lane, Terrace Level...