Sample records for adaptive sampling approach

  1. Spatial adaptive sampling in multiscale simulation

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, Bertrand; Barros, Kipton; Cieren, Emmanuel; Elango, Venmugil; Junghans, Christoph; Lookman, Turab; Mohd-Yusof, Jamaludin; Pavel, Robert S.; Rivera, Axel Y.; Roehm, Dominic; McPherson, Allen L.; Germann, Timothy C.

    2014-07-01

    In a common approach to multiscale simulation, an incomplete set of macroscale equations must be supplemented with constitutive data provided by fine-scale simulation. Collecting statistics from these fine-scale simulations typically dominates the computational cost. We reduce this cost by interpolating the results of fine-scale simulation over the spatial domain of the macro-solver. Unlike previous adaptive sampling strategies, we do not interpolate on the potentially very high dimensional space of inputs to the fine-scale simulation. Our approach is local in space and time, avoids the need for a central database, and is designed to parallelize well on large computer clusters. To demonstrate our method, we simulate one-dimensional elastodynamic shock propagation using the Heterogeneous Multiscale Method (HMM); we find that spatial adaptive sampling requires only ≈ 50 × N^0.14 fine-scale simulations to reconstruct the stress field at all N grid points. Related multiscale approaches, such as Equation Free methods, may also benefit from spatial adaptive sampling.
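
    To make the strategy concrete, here is a minimal sketch in the spirit of the abstract (not the authors' code): a hypothetical `fine_scale_stress` model stands in for the expensive micro-scale simulation, and fine-scale calls are made only where a midpoint interpolation check fails.

    ```python
    # Minimal sketch of spatial adaptive sampling; `fine_scale_stress` is a
    # stand-in for a costly micro-scale solver, memoized so evaluations can
    # be counted honestly.
    import numpy as np

    def fine_scale_stress(strain):
        return np.tanh(3.0 * strain) + 0.1 * strain   # toy constitutive law

    strain = np.sin(np.linspace(0.0, 2.0 * np.pi, 1025)) ** 3
    x = np.arange(strain.size)
    cache = {}

    def fine(i):
        if i not in cache:
            cache[i] = fine_scale_stress(strain[i])
        return cache[i]

    sampled = {0, strain.size - 1}                    # anchor the domain ends
    for _ in range(20):                               # refinement rounds
        idx = sorted(sampled)
        field = np.interp(x, idx, [fine(i) for i in idx])
        # refine wherever a midpoint check shows interpolation error > tol
        new = {(a + b) // 2 for a, b in zip(idx[:-1], idx[1:])
               if (a + b) // 2 != a
               and abs(fine((a + b) // 2) - field[(a + b) // 2]) > 1e-3}
        if not new:
            break
        sampled |= new

    print(f"fine-scale evaluations: {len(cache)} of {strain.size} grid points")
    ```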

  2. Introducing sampling entropy in repository based adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Zheng, Han; Zhang, Yingkai

    2009-12-01

    Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, a widely pursued objective in developing efficient simulation methods is to achieve uniform sampling among the thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator of uniform sampling as well as of the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one-dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two-dimensional free energy surfaces for the alanine dipeptide in the gas phase as well as in water.
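
    The sampling-entropy idea can be illustrated in a few lines. This toy sketch (not the RBAUS implementation) computes SE = -Σ p_i ln p_i over M bins of a reaction coordinate, normalized by its maximum ln M, so a value near 1 signals uniform sampling:

    ```python
    # Toy illustration of sampling entropy as a uniformity indicator:
    # SE -> 1 only when the reaction coordinate is sampled near-uniformly.
    import numpy as np

    def sampling_entropy(samples, m_bins, lo, hi):
        counts, _ = np.histogram(samples, bins=m_bins, range=(lo, hi))
        p = counts / counts.sum()
        p = p[p > 0]                                  # convention: 0 ln 0 = 0
        return -np.sum(p * np.log(p)) / np.log(m_bins)

    rng = np.random.default_rng(0)
    uniform = rng.uniform(0.0, 1.0, 10_000)           # well-mixed sampling
    trapped = rng.normal(0.3, 0.05, 10_000)           # stuck in one basin
    print(f"SE(uniform) = {sampling_entropy(uniform, 50, 0.0, 1.0):.3f}")
    print(f"SE(trapped) = {sampling_entropy(trapped, 50, 0.0, 1.0):.3f}")
    ```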

  3. Two-stage sequential sampling: A neighborhood-free adaptive sampling procedure

    USGS Publications Warehouse

    Salehi, M.; Smith, D.R.

    2005-01-01

    Designing an efficient sampling scheme for a rare and clustered population is a challenging area of research. Adaptive cluster sampling, which has been shown to be viable for such a population, is based on sampling a neighborhood of units around a unit that meets a specified condition. However, the edge units produced by sampling neighborhoods have proven to limit the efficiency and applicability of adaptive cluster sampling. We propose a sampling design that is adaptive in the sense that the final sample depends on observed values, but it avoids the use of neighborhoods and the sampling of edge units. Unbiased estimators of population total and its variance are derived using Murthy's estimator. The modified two-stage sampling design is easy to implement and can be applied to a wider range of populations than adaptive cluster sampling. We evaluate the proposed sampling design by simulating sampling of two real biological populations and an artificial population for which the variable of interest took the value either 0 or 1 (e.g., indicating presence and absence of a rare event). We show that the proposed sampling design is more efficient than conventional sampling in nearly all cases. The approach used to derive estimators (Murthy's estimator) opens the door for unbiased estimators to be found for similar sequential sampling designs. © 2005 American Statistical Association and the International Biometric Society.

  4. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on “one-time” release of a static dataset and do not adequately address the increasing need of releasing series of dynamic datasets in real time. A straightforward application of existing histogram methods on each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and correlations or overlapping users between the snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
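
    A rough sketch of the fixed-threshold logic behind DSFT follows. Privacy-budget accounting and sensitivity calibration are deliberately simplified here, and all parameter values are illustrative assumptions rather than the paper's settings.

    ```python
    # Sketch of a DSFT-style release rule: privately test whether the new
    # snapshot has drifted enough from the last release, and only then
    # spend budget on a fresh Laplace-noised histogram.
    import numpy as np

    rng = np.random.default_rng(1)

    def dsft_release(snapshots, bins, eps_test, eps_pub, threshold):
        released, last, fresh = [], None, 0
        for snap in snapshots:
            hist = np.histogram(snap, bins=bins)[0].astype(float)
            publish = last is None
            if not publish:
                dist = np.abs(hist - ref).mean()      # drift vs. last release
                publish = dist + rng.laplace(0.0, 1.0 / eps_test) > threshold
            if publish:                               # fresh noisy release
                ref = hist
                last = hist + rng.laplace(0.0, 1.0 / eps_pub, hist.size)
                fresh += 1
            released.append(last)                     # else re-publish old one
        return released, fresh

    snaps = [rng.normal(0.1 * k, 1.0, 2000) for k in range(10)]
    _, fresh = dsft_release(snaps, np.linspace(-4, 6, 41), 0.1, 0.5, 5.0)
    print(f"fresh releases: {fresh} of {len(snaps)} snapshots")
    ```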

  5. A novel multi-scale adaptive sampling-based approach for energy saving in leak detection for WSN-based water pipelines

    NASA Astrophysics Data System (ADS)

    Saqib, Najam us; Faizan Mysorewala, Muhammad; Cheded, Lahouari

    2017-12-01

    In this paper, we propose a novel monitoring strategy for a wireless sensor network (WSN)-based water pipeline system. Our strategy uses a multi-pronged approach to reduce energy consumption, based on the use of two types of vibration sensors as well as pressure sensors, all having different energy levels, and on a hierarchical adaptive sampling mechanism that determines the sampling frequency. The sampling rate of the sensors is adjusted according to the bandwidth of the vibration signal being monitored, using a wavelet-based adaptive thresholding scheme that calculates the new sampling frequency for the following cycle. In this multimodal sensing scheme, a duty-cycling approach is used for all sensors to reduce the number of sampling instances, such that the high-energy, high-precision (HE-HP) vibration sensors have low duty cycles and the low-energy, low-precision (LE-LP) vibration sensors have high duty cycles. The low-duty-cycle (HE-HP) vibration sensor adjusts the sampling frequency of the high-duty-cycle (LE-LP) vibration sensor. The simulated test bed considered here consists of a water pipeline network that uses pressure and vibration sensors, the latter with different energy consumptions and precision levels, at various locations in the network; this is especially useful for energy conservation during extended monitoring. It is shown that the novel features of our proposed scheme achieve a significant reduction in energy consumption while the leak is effectively detected by the sensor node closest to it. Finally, both the total energy consumed by monitoring and the time needed for a WSN node to detect the leak are computed, demonstrating the superiority of our proposed hierarchical adaptive sampling algorithm over a non-adaptive sampling approach.

  6. Adaptive web sampling.

    PubMed

    Thompson, Steven K

    2006-12-01

    A flexible class of adaptive sampling designs is introduced for sampling in network and spatial settings. In the designs, selections are made sequentially with a mixture distribution based on an active set that changes as the sampling progresses, using network or spatial relationships as well as sample values. The new designs have certain advantages compared with previously existing adaptive and link-tracing designs, including control over sample sizes and over the proportion of effort allocated to adaptive selections. Efficient inference involves averaging over sample paths consistent with the minimal sufficient statistic. A Markov chain resampling method makes the inference computationally feasible. The designs are evaluated in network and spatial settings using two empirical populations: a hidden human population at high risk for HIV/AIDS and an unevenly distributed bird population.

  7. A Surrogate-based Adaptive Sampling Approach for History Matching and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Zhang, Dongxiao; Lin, Guang

    A critical procedure in reservoir simulations is history matching (or data assimilation in a broader sense), which calibrates model parameters such that the simulation results are consistent with field measurements, and hence improves the credibility of the predictions given by the simulations. Often there exist non-unique combinations of parameter values that all yield simulation results matching the measurements. For such ill-posed history matching problems, Bayes' theorem provides a theoretical foundation to represent different solutions and to quantify the uncertainty with the posterior PDF. Lacking an analytical solution in most situations, the posterior PDF may be characterized with a sample of realizations, each representing a possible scenario. A novel sampling algorithm is presented here for the Bayesian solutions to history matching problems. We aim to deal with two commonly encountered issues: 1) as a result of the nonlinear input-output relationship in a reservoir model, the posterior distribution could be in a complex form, such as multimodal, which violates the Gaussian assumption required by most of the commonly used data assimilation approaches; 2) a typical sampling method requires intensive model evaluations and hence may cause unaffordable computational cost. In the developed algorithm, we use a Gaussian mixture model as the proposal distribution in the sampling process, which is simple but also flexible enough to approximate non-Gaussian distributions and is particularly efficient when the posterior is multimodal. Also, a Gaussian process is utilized as a surrogate model to speed up the sampling process. Furthermore, an iterative scheme of adaptive surrogate refinement and re-sampling ensures sampling accuracy while keeping the computational cost at a minimum level. The developed approach is demonstrated with an illustrative example and shows its capability in handling the above-mentioned issues.

  8. Adaptive Sampling-Based Information Collection for Wireless Body Area Networks.

    PubMed

    Xu, Xiaobin; Zhao, Fang; Wang, Wendong; Tian, Hui

    2016-08-31

    To collect important health information, WBAN applications typically sense data at a high frequency. However, the limited quality of the wireless link caps the frequency at which sensed data can be uploaded. To reduce the upload frequency, most existing WBAN data collection approaches collect data with a tolerable error. These approaches can guarantee the precision of the collected data, but they cannot ensure that the upload frequency stays within the cap. Some traditional sampling-based approaches can control the upload frequency directly; however, they usually lose a great deal of information. Since the core task of WBAN applications is to collect health information, this paper aims to collect the most informative data possible under the upload-frequency limit. The importance of sensed data is defined according to information theory for the first time. Information-aware adaptive sampling is proposed to collect uniformly distributed data. We then propose Adaptive Sampling-based Information Collection (ASIC), which consists of two algorithms: an adaptive sampling probability algorithm that computes sampling probabilities for different sensed values, and a multiple uniform sampling algorithm that provides uniform samples of values in different intervals. Experiments based on a real dataset show that the proposed approach performs better in terms of data coverage and information quantity. The parameter analysis identifies the optimal parameter settings, and the discussion explains the underlying reason for the high performance of the proposed approach.
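
    The flavor of information-aware sampling can be conveyed with a hedged sketch (this is an illustration of the concept, not the published ASIC algorithms): rare readings carry more self-information, -log2 f(v), so they are kept with higher probability while an assumed upload budget is respected on average.

    ```python
    # Information-weighted down-sampling of a sensor stream: keep rare
    # (high self-information) readings with higher probability.
    import numpy as np

    rng = np.random.default_rng(2)
    readings = rng.normal(70.0, 5.0, 10_000)          # e.g. heart-rate stream
    budget = 1_000                                    # allowed uploads

    counts, edges = np.histogram(readings, bins=30)
    f = counts / counts.sum()                         # empirical frequencies
    info = -np.log2(np.clip(f, 1e-12, None))          # bits per bin
    which = np.digitize(readings, edges[1:-1])        # bin index per reading

    p_keep = info[which]
    p_keep *= budget / p_keep.sum()                   # scale to the budget
    keep = rng.random(readings.size) < np.clip(p_keep, 0.0, 1.0)
    print(f"uploaded {keep.sum()} of {readings.size} sensed values")
    ```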

  9. Adaptive Sampling-Based Information Collection for Wireless Body Area Networks

    PubMed Central

    Xu, Xiaobin; Zhao, Fang; Wang, Wendong; Tian, Hui

    2016-01-01

    To collect important health information, WBAN applications typically sense data at a high frequency. However, the limited quality of the wireless link caps the frequency at which sensed data can be uploaded. To reduce the upload frequency, most existing WBAN data collection approaches collect data with a tolerable error. These approaches can guarantee the precision of the collected data, but they cannot ensure that the upload frequency stays within the cap. Some traditional sampling-based approaches can control the upload frequency directly; however, they usually lose a great deal of information. Since the core task of WBAN applications is to collect health information, this paper aims to collect the most informative data possible under the upload-frequency limit. The importance of sensed data is defined according to information theory for the first time. Information-aware adaptive sampling is proposed to collect uniformly distributed data. We then propose Adaptive Sampling-based Information Collection (ASIC), which consists of two algorithms: an adaptive sampling probability algorithm that computes sampling probabilities for different sensed values, and a multiple uniform sampling algorithm that provides uniform samples of values in different intervals. Experiments based on a real dataset show that the proposed approach performs better in terms of data coverage and information quantity. The parameter analysis identifies the optimal parameter settings, and the discussion explains the underlying reason for the high performance of the proposed approach. PMID:27589758

  10. Adaptive sampling in behavioral surveys.

    PubMed

    Thompson, S K

    1997-01-01

    Studies of populations such as drug users encounter difficulties because the members of the populations are rare, hidden, or hard to reach. Conventionally designed large-scale surveys detect relatively few members of the populations so that estimates of population characteristics have high uncertainty. Ethnographic studies, on the other hand, reach suitable numbers of individuals only through the use of link-tracing, chain referral, or snowball sampling procedures that often leave the investigators unable to make inferences from their sample to the hidden population as a whole. In adaptive sampling, the procedure for selecting people or other units to be in the sample depends on variables of interest observed during the survey, so the design adapts to the population as encountered. For example, when self-reported drug use is found among members of the sample, sampling effort may be increased in nearby areas. Types of adaptive sampling designs include ordinary sequential sampling, adaptive allocation in stratified sampling, adaptive cluster sampling, and optimal model-based designs. Graph sampling refers to situations with nodes (for example, people) connected by edges (such as social links or geographic proximity). An initial sample of nodes or edges is selected and edges are subsequently followed to bring other nodes into the sample. Graph sampling designs include network sampling, snowball sampling, link-tracing, chain referral, and adaptive cluster sampling. A graph sampling design is adaptive if the decision to include linked nodes depends on variables of interest observed on nodes already in the sample. Adjustment methods for nonsampling errors such as imperfect detection of drug users in the sample apply to adaptive as well as conventional designs.

  11. A Novel Approach to Adaptive Flow Separation Control

    DTIC Science & Technology

    2016-09-03

    This effort considers control of flow separation over a NACA-0025 airfoil using microjet actuators and develops Adaptive Sampling Based Model Predictive Control (Adaptive SBMPC), a novel approach to Nonlinear Model Predictive Control that applies the Minimal Resource Allocation Network. (Final report covering 1 May 2013 to 30 April 2016; distribution unlimited.)

  12. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS land cover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  13. Adaptive sampling strategies with high-throughput molecular dynamics

    NASA Astrophysics Data System (ADS)

    Clementi, Cecilia

    Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free energy barriers) do not usually allow adequate sampling of the relevant regions of their configurational space by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the systems' configurational space, without using any a priori information on the system's global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high dimensional dynamical systems, and optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.
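
    One simple embodiment of such a strategy, sketched here under stated assumptions rather than taken from the authors' pipeline, is count-based re-seeding: discretize a collective variable, count visits across all short trajectories, and start the next batch from the least-visited occupied states.

    ```python
    # Adaptive ensemble seeding on a toy double-well landscape: effort is
    # redirected to the least-sampled occupied bins of the 1-D collective
    # variable after every batch of short trajectories.
    import numpy as np

    rng = np.random.default_rng(3)
    edges = np.linspace(-2.0, 2.0, 41)                # 40 bins on the 1-D CV

    def short_trajectory(x0, steps=300, dt=0.01, beta=3.0):
        xs, x = np.empty(steps), x0
        for i in range(steps):                        # overdamped dynamics on
            force = -4.0 * x * (x * x - 1.0)          # U(x) = (x^2 - 1)^2
            x += dt * force + np.sqrt(2.0 * dt / beta) * rng.normal()
            xs[i] = x
        return xs

    visits = np.zeros(edges.size - 1)
    seeds = [-1.0] * 8                                # all start in one well
    for _ in range(5):                                # adaptive rounds
        frames = np.concatenate([short_trajectory(s) for s in seeds])
        visits += np.histogram(frames, bins=edges)[0]
        occupied = np.flatnonzero(visits > 0)
        least = occupied[np.argsort(visits[occupied])[:8]]
        seeds = 0.5 * (edges[least] + edges[least + 1])
    print("bins visited:", np.count_nonzero(visits), "of", visits.size)
    ```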

  14. Adaptive enhanced sampling by force-biasing using neural networks

    NASA Astrophysics Data System (ADS)

    Guo, Ashley Z.; Sevgen, Emre; Sidky, Hythem; Whitmer, Jonathan K.; Hubbell, Jeffrey A.; de Pablo, Juan J.

    2018-04-01

    A machine learning assisted method is presented for molecular simulation of systems with rugged free energy landscapes. The method is general and can be combined with other advanced sampling techniques. In the particular implementation proposed here, it is illustrated in the context of an adaptive biasing force approach where, rather than relying on discrete force estimates, one can resort to a self-regularizing artificial neural network to generate continuous, estimated generalized forces. By doing so, the proposed approach addresses several shortcomings common to adaptive biasing force and other algorithms. Specifically, the neural network enables (1) smooth estimates of generalized forces in sparsely sampled regions, (2) force estimates in previously unexplored regions, and (3) continuous force estimates with which to bias the simulation, as opposed to biases generated at specific points of a discrete grid. The usefulness of the method is illustrated with three different examples, chosen to highlight the wide range of applicability of the underlying concepts. In all three cases, the new method is found to enhance considerably the underlying traditional adaptive biasing force approach. The method is also found to provide improvements over previous implementations of neural network assisted algorithms.
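
    The core regression step can be caricatured in plain numpy. This toy sketch (not the authors' implementation, which couples to a sampling engine) fits a small network to noisy, binned mean-force estimates to obtain a smooth, continuous force along a collective variable xi; the true force sin(xi) and all hyperparameters are assumptions for the example.

    ```python
    # Fit a tiny neural network to noisy force samples so the resulting
    # bias force is smooth and defined between (and beyond) the bins.
    import numpy as np

    rng = np.random.default_rng(4)
    xi = np.linspace(-np.pi, np.pi, 60)[:, None]      # collective variable
    noisy_force = np.sin(xi) + rng.normal(0.0, 0.3, xi.shape)

    W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
    W2 = rng.normal(0.0, 0.1, (32, 1)); b2 = np.zeros(1)

    def forward(x):
        h = np.tanh(x @ W1 + b1)
        return h, h @ W2 + b2

    for _ in range(5000):                             # plain gradient descent
        h, pred = forward(xi)
        err = (pred - noisy_force) / len(xi)
        gW2, gb2 = h.T @ err, err.sum(0)
        gh = err @ W2.T * (1.0 - h * h)               # backprop through tanh
        gW1, gb1 = xi.T @ gh, gh.sum(0)
        for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
            p -= 0.05 * g

    _, smooth = forward(xi)                           # continuous force estimate
    print("rms deviation from true force:",
          float(np.sqrt(np.mean((smooth - np.sin(xi)) ** 2))))
    ```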

  15. Strategies for informed sample size reduction in adaptive controlled clinical trials

    NASA Astrophysics Data System (ADS)

    Arandjelović, Ognjen

    2017-12-01

    Clinical trial adaptation refers to any adjustment of the trial protocol after the onset of the trial. The main goal is to make the process of introducing new medical interventions to patients more efficient. The principal challenge, which is an outstanding research problem, is to be found in the question of how adaptation should be performed so as to minimize the chance of distorting the outcome of the trial. In this paper, we propose a novel method for achieving this. Unlike most of the previously published work, our approach focuses on trial adaptation by sample size adjustment, i.e. by reducing the number of trial participants in a statistically informed manner. Our key idea is to select the sample subset for removal in a manner which minimizes the associated loss of information. We formalize this notion and describe three algorithms which approach the problem in different ways, respectively, using (i) repeated random draws, (ii) a genetic algorithm, and (iii) what we term pair-wise sample compatibilities. Experiments on simulated data demonstrate the effectiveness of all three approaches, with a consistently superior performance exhibited by the pair-wise sample compatibilities-based method.

  16. Adaptive Sampling of Time Series During Remote Exploration

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches.
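
    The decision rule can be sketched compactly: fit a GP to the data collected so far and sample next where the posterior variance (a proxy for information gain) is largest. The kernel and times below are illustrative assumptions, not the flight implementation; note that for a GP the posterior variance depends only on where samples were taken, not on the measured values.

    ```python
    # GP-guided scheduling: pick the next measurement time with the
    # largest predictive variance under an RBF kernel.
    import numpy as np

    def rbf(a, b, ell=2.0, sigma=1.0):
        return sigma ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    t_obs = np.array([0.0, 1.0, 2.0, 6.0, 7.0])       # past sample times
    t_cand = np.linspace(7.5, 15.0, 76)               # feasible next times

    K = rbf(t_obs, t_obs) + 1e-4 * np.eye(t_obs.size) # noisy Gram matrix
    Ks = rbf(t_cand, t_obs)
    var = rbf(t_cand, t_cand).diagonal() - np.einsum(
        "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))     # posterior variance
    print("next sample at t =", t_cand[np.argmax(var)])
    ```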

  17. Adaptive sampling in research on risk-related behaviors.

    PubMed

    Thompson, Steven K; Collins, Linda M

    2002-11-01

    This article introduces adaptive sampling designs to substance use researchers. Adaptive sampling is particularly useful when the population of interest is rare, unevenly distributed, hidden, or hard to reach. Examples of such populations are injection drug users, individuals at high risk for HIV/AIDS, and young adolescents who are nicotine dependent. In conventional sampling, the sampling design is based entirely on a priori information, and is fixed before the study begins. By contrast, in adaptive sampling, the sampling design adapts based on observations made during the survey; for example, drug users may be asked to refer other drug users to the researcher. In the present article several adaptive sampling designs are discussed. Link-tracing designs such as snowball sampling, random walk methods, and network sampling are described, along with adaptive allocation and adaptive cluster sampling. It is stressed that special estimation procedures taking the sampling design into account are needed when adaptive sampling has been used. These procedures yield estimates that are considerably better than conventional estimates. For rare and clustered populations adaptive designs can give substantial gains in efficiency over conventional designs, and for hidden populations link-tracing and other adaptive procedures may provide the only practical way to obtain a sample large enough for the study objectives.

  18. Adaptive Peer Sampling with Newscast

    NASA Astrophysics Data System (ADS)

    Tölgyesi, Norbert; Jelasity, Márk

    The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates which can result in very poor random sampling especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters, namely—without involving failure detectors—nodes passively monitor local protocol events using them as feedback for a local control loop for self-tuning the protocol parameters. The proposed solution is evaluated by simulation experiments.

  19. Distributed database kriging for adaptive sampling (D²KAS)

    DOE PAGES

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; ...

    2015-03-18

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters.
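
    The lookup-or-simulate decision can be illustrated with a one-dimensional toy (the real system's Redis database and locality-aware hashing are omitted here): krige the flux from nearby stored results and fall back to a fresh fine-scale run only when the kriging variance exceeds a tolerance.

    ```python
    # Kriging-backed database: accept the prediction when its variance is
    # small, otherwise run the stand-in micro solver and grow the database.
    import numpy as np

    def kernel(a, b, ell=0.3):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    def expensive_md_flux(x):                         # stand-in micro solver
        return np.sin(4.0 * x)

    db_x = np.array([0.10, 0.15, 0.40, 0.80])         # inputs already stored
    db_f = expensive_md_flux(db_x)
    tol, md_calls = 0.02, 0

    for x in np.linspace(0.0, 1.0, 21):               # macro-solver queries
        K = kernel(db_x, db_x) + 1e-9 * np.eye(db_x.size)
        k = kernel(db_x, np.array([x]))[:, 0]
        w = np.linalg.solve(K, k)
        pred, var = w @ db_f, 1.0 - k @ w             # kriging mean/variance
        if var > tol:                                 # too uncertain: simulate
            md_calls += 1
            db_x = np.append(db_x, x)
            db_f = np.append(db_f, expensive_md_flux(x))
    print(f"fine-scale runs: {md_calls} of 21 queries, db size {db_x.size}")
    ```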

  20. Using continuous in-situ measurements to adaptively trigger urban storm water samples

    NASA Astrophysics Data System (ADS)

    Wong, B. P.; Kerkez, B.

    2015-12-01

    Until cost-effective in-situ sensors are available for biological parameters, nutrients, and metals, automated samplers will continue to be the primary source of reliable water quality measurements. Given limited sample bottles, however, autosamplers often obscure insights on nutrient sources and biogeochemical processes that would otherwise be captured using a continuous sampling approach. To that end, we evaluate the efficacy of a novel method for measuring first-flush nutrient dynamics in flashy, urban watersheds. Our approach reduces the number of samples required to capture water quality dynamics by leveraging an internet-connected sensor node, which is equipped with a suite of continuous in-situ sensors and an automated sampler. To capture both the initial baseflow and storm concentrations, a cloud-hosted adaptive algorithm analyzes the high-resolution sensor data along with local weather forecasts to optimize a sampling schedule. The method was tested in a highly developed urban catchment in Ann Arbor, Michigan, and collected samples of nitrate, phosphorus, and suspended solids throughout several storm events. Results indicate that the watershed does not exhibit first-flush dynamics, a behavior that would have been obscured when using a non-adaptive sampling approach.

  21. Quality based approach for adaptive face recognition

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Recent advances in biometric technology have pushed towards more robust and reliable systems. We aim to build systems that have low recognition errors and are less affected by variation in recording conditions. Recognition errors are often attributed to the usage of low quality biometric samples. Hence, there is a need to develop new intelligent techniques and strategies to automatically measure/quantify the quality of biometric image samples and if necessary restore image quality according to the need of the intended application. In this paper, we present no-reference image quality measures in the spatial domain that have impact on face recognition. The first is called symmetrical adaptive local quality index (SALQI) and the second is called middle halve (MH). Also, an adaptive strategy has been developed to select the best way to restore the image quality, called symmetrical adaptive histogram equalization (SAHE). The main benefits of using quality measures for adaptive strategy are: (1) avoidance of excessive unnecessary enhancement procedures that may cause undesired artifacts, and (2) reduced computational complexity which is essential for real time applications. We test the success of the proposed measures and adaptive approach for a wavelet-based face recognition system that uses the nearest neighborhood classifier. We shall demonstrate noticeable improvements in the performance of adaptive face recognition system over the corresponding non-adaptive scheme.

  22. Low bit-rate image compression via adaptive down-sampling and constrained least squares upconversion.

    PubMed

    Wu, Xiaolin; Zhang, Xiangjun; Wang, Xiaohan

    2009-03-01

    Recently, many researchers have started to challenge a long-standing practice of digital photography, oversampling followed by compression, and to pursue more intelligent sparse sampling techniques. In this paper, we propose a practical approach of uniform down-sampling in image space that nevertheless makes the sampling adaptive through spatially varying, directional low-pass prefiltering. The resulting down-sampled prefiltered image remains a conventional square sample grid and thus can be compressed and transmitted without any change to current image coding standards and systems. The decoder first decompresses the low-resolution image and then upconverts it to the original resolution in a constrained least squares restoration process, using a 2-D piecewise autoregressive model and the knowledge of the directional low-pass prefiltering. The proposed compression approach of collaborative adaptive down-sampling and upconversion (CADU) outperforms JPEG 2000 in PSNR at low to medium bit rates and achieves superior visual quality as well. The superior low bit-rate performance of the CADU approach suggests that oversampling not only wastes hardware resources and energy but can also be counterproductive to image quality under a tight bit budget.

  23. An adaptive signal-processing approach to online adaptive tutoring.

    PubMed

    Bergeron, Bryan; Cline, Andrew

    2011-01-01

    Conventional intelligent or adaptive tutoring online systems rely on domain-specific models of learner behavior based on rules, deep domain knowledge, and other resource-intensive methods. We have developed and studied a domain-independent methodology of adaptive tutoring based on domain-independent signal-processing approaches that obviate the need for the construction of explicit expert and student models. A key advantage of our method over conventional approaches is a lower barrier to entry for educators who want to develop adaptive online learning materials.

  24. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
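
    A compact illustration of the adaptive Gaussian-mixture proposal is the following population-Monte-Carlo-style loop; the polynomial chaos surrogate is omitted, and a cheap analytic bimodal target stands in for the forward model's posterior.

    ```python
    # Adaptive importance sampling with a 2-component Gaussian mixture
    # proposal: draw, weight by target/proposal, refit mixture moments.
    import numpy as np

    rng = np.random.default_rng(5)
    target = lambda x: (np.exp(-0.5 * (x + 2.0) ** 2)
                        + 0.5 * np.exp(-0.5 * ((x - 3.0) / 0.7) ** 2))

    mu, sig = np.array([-1.0, 1.0]), np.array([3.0, 3.0])
    wgt = np.array([0.5, 0.5])
    for _ in range(30):
        comp = rng.choice(2, size=2000, p=wgt)
        x = rng.normal(mu[comp], sig[comp])           # draw from the mixture
        dens = [wgt[k] * np.exp(-0.5 * ((x - mu[k]) / sig[k]) ** 2) / sig[k]
                for k in range(2)]                    # 1/sqrt(2*pi) cancels
        iw = target(x) / (dens[0] + dens[1])          # importance weights
        r = np.stack(dens); r /= r.sum(0)             # responsibilities
        for k in range(2):                            # weighted moment re-fit
            wk = iw * r[k]
            mu[k] = np.average(x, weights=wk)
            sig[k] = max(np.sqrt(np.average((x - mu[k]) ** 2, weights=wk)), 0.1)
            wgt[k] = wk.sum()
        wgt /= wgt.sum()
    print("modes found near:", np.round(mu, 2), "weights:", np.round(wgt, 2))
    ```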

  25. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang, E-mail: guanglin@purdue.edu

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.

  26. Implementation of time-efficient adaptive sampling function design for improved undersampled MRI reconstruction

    NASA Astrophysics Data System (ADS)

    Choi, Jinhyeok; Kim, Hyeonjin

    2016-12-01

    To improve the efficacy of undersampled MRI, a method of designing adaptive sampling functions is proposed that is simple to implement on an MR scanner and yet effectively improves the performance of the sampling functions. An approximation of the energy distribution of an image (E-map) is estimated from highly undersampled k-space data acquired in a prescan and efficiently recycled in the main scan. An adaptive probability density function (PDF) is generated by combining the E-map with a modeled PDF. A set of candidate sampling functions are then prepared from the adaptive PDF, among which the one with maximum energy is selected as the final sampling function. To validate its computational efficiency, the proposed method was implemented on an MR scanner, and its robust performance in Fourier-transform (FT) MRI and compressed sensing (CS) MRI was tested by simulations and in a cherry tomato. The proposed method consistently outperforms the conventional modeled PDF approach for undersampling ratios of 0.2 or higher in both FT-MRI and CS-MRI. To fully benefit from undersampled MRI, it is preferable that the design of adaptive sampling functions be performed online immediately before the main scan. In this way, the proposed method may further improve the efficacy of the undersampled MRI.
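
    The construction can be sketched as follows, with a synthetic prescan standing in for real k-space data and all parameter values being illustrative assumptions rather than the authors' settings.

    ```python
    # Build an adaptive sampling PDF by blending a crude energy map from a
    # (here synthetic) prescan with a modeled variable-density PDF, then
    # draw an undersampling mask from it.
    import numpy as np

    rng = np.random.default_rng(6)
    N, R = 256, 0.25                                  # phase encodes, ratio

    k = np.abs(np.arange(N) - N // 2)
    prescan = np.abs(rng.normal(0.0, 1.0, N)) * np.exp(-k / 20.0)
    e_map = prescan / prescan.sum()                   # crude energy estimate

    modeled = (1.0 - k / k.max()) ** 4                # modeled density
    modeled /= modeled.sum()

    pdf = 0.5 * e_map + 0.5 * modeled                 # adaptive PDF
    mask = rng.random(N) < np.minimum(pdf * N * R, 1.0)
    print(f"sampling {mask.sum()} of {N} lines (target {int(N * R)})")
    ```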

  27. Adaptive Cluster Sampling for Forest Inventories

    Treesearch

    Francis A. Roesch

    1993-01-01

    Adaptive cluster sampling is shown to be a viable alternative for sampling forests when there are rare characteristics of the forest trees which are of interest and occur on clustered trees. The ideas of recent work in Thompson (1990) have been extended to the case in which the initial sample is selected with unequal probabilities. An example is given in which the...

  28. Adaptive Sampling approach to environmental site characterization at Joliet Army Ammunition Plant: Phase 2 demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bujewski, G.E.; Johnson, R.L.

    1996-04-01

    Adaptive sampling programs provide real opportunities to save considerable time and money when characterizing hazardous waste sites. This Strategic Environmental Research and Development Program (SERDP) project demonstrated two decision-support technologies, SitePlanner™ and Plume™, that can facilitate the design and deployment of an adaptive sampling program. A demonstration took place at Joliet Army Ammunition Plant (JAAP), and was unique in that it was tightly coupled with ongoing Army characterization work at the facility, with close scrutiny by both state and federal regulators. The demonstration was conducted in partnership with the Army Environmental Center's (AEC) Installation Restoration Program and AEC's Technology Development Program. AEC supported researchers from Tufts University who demonstrated innovative field analytical techniques for the analysis of TNT and DNT. SitePlanner™ is an object-oriented database specifically designed for site characterization that provides an effective way to compile, integrate, manage and display site characterization data as it is being generated. Plume™ uses a combination of Bayesian analysis and geostatistics to provide technical staff with the ability to quantitatively merge soft and hard information for an estimate of the extent of contamination. Plume™ provides an estimate of contamination extent, measures the uncertainty associated with the estimate, determines the value of additional sampling, and locates additional samples so that their value is maximized.

  29. ADAPTIVE METHODS FOR STOCHASTIC DIFFERENTIAL EQUATIONS VIA NATURAL EMBEDDINGS AND REJECTION SAMPLING WITH MEMORY.

    PubMed

    Rackauckas, Christopher; Nie, Qing

    2017-01-01

    Adaptive time-stepping with high-order embedded Runge-Kutta pairs and rejection sampling provides efficient approaches for solving differential equations. While many such methods exist for solving deterministic systems, little progress has been made for stochastic variants. One challenge in developing adaptive methods for stochastic differential equations (SDEs) is the construction of embedded schemes with direct error estimates. We present a new class of embedded stochastic Runge-Kutta (SRK) methods with strong order 1.5 which have a natural embedding of strong order 1.0 methods. This allows for the derivation of an error estimate which requires no additional function evaluations. Next we derive a general method to reject the time steps without losing information about the future Brownian path termed Rejection Sampling with Memory (RSwM). This method utilizes a stack data structure to do rejection sampling, costing only a few floating point calculations. We show numerically that the methods generate statistically-correct and tolerance-controlled solutions. Lastly, we show that this form of adaptivity can be applied to systems of equations, and demonstrate that it solves a stiff biological model 12.28x faster than common fixed timestep algorithms. Our approach only requires the solution to a bridging problem and thus lends itself to natural generalizations beyond SDEs.
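
    A heavily simplified sketch of the rejection-with-memory idea is shown below: the paper's embedded SRK pair of strong orders 1.5/1.0 is replaced here by an Euler-Maruyama step with a step-halving error proxy, and step-size growth logic is omitted. The point that survives is that a rejected step splits its Brownian increment with a Brownian bridge and pushes the pieces onto a stack, so no part of the path is ever thrown away or redrawn.

    ```python
    # Adaptive SDE stepping with rejection that keeps the Brownian path:
    # rejected increments are bridged at their midpoint and stacked.
    import numpy as np

    rng = np.random.default_rng(7)
    f = lambda u: -10.0 * u                           # stiff linear drift
    g = lambda u: 0.5                                 # constant diffusion

    u, t, h, tol, accepted = 1.0, 0.0, 0.1, 1e-3, 0
    stack = []                                        # saved (dt, dW) pieces
    while t < 1.0 - 1e-12:
        if stack:
            dt, dW = stack.pop()                      # reuse stored increment
        else:
            dt = min(h, 1.0 - t)
            dW = rng.normal(0.0, np.sqrt(dt))
        coarse = u + f(u) * dt + g(u) * dW
        dW1 = 0.5 * dW + 0.5 * np.sqrt(dt) * rng.normal()  # bridge midpoint
        mid = u + f(u) * dt / 2 + g(u) * dW1
        fine = mid + f(mid) * dt / 2 + g(mid) * (dW - dW1)
        if abs(fine - coarse) > tol and dt > 1e-8:    # reject: split, keep path
            stack.append((dt / 2, dW - dW1))
            stack.append((dt / 2, dW1))
        else:                                         # accept the finer value
            u, t, accepted = fine, t + dt, accepted + 1
    print(f"reached t = {t:.2f} in {accepted} accepted steps")
    ```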

  30. ADAPTIVE METHODS FOR STOCHASTIC DIFFERENTIAL EQUATIONS VIA NATURAL EMBEDDINGS AND REJECTION SAMPLING WITH MEMORY

    PubMed Central

    Rackauckas, Christopher

    2017-01-01

    Adaptive time-stepping with high-order embedded Runge-Kutta pairs and rejection sampling provides efficient approaches for solving differential equations. While many such methods exist for solving deterministic systems, little progress has been made for stochastic variants. One challenge in developing adaptive methods for stochastic differential equations (SDEs) is the construction of embedded schemes with direct error estimates. We present a new class of embedded stochastic Runge-Kutta (SRK) methods with strong order 1.5 which have a natural embedding of strong order 1.0 methods. This allows for the derivation of an error estimate which requires no additional function evaluations. Next we derive a general method to reject the time steps without losing information about the future Brownian path termed Rejection Sampling with Memory (RSwM). This method utilizes a stack data structure to do rejection sampling, costing only a few floating point calculations. We show numerically that the methods generate statistically-correct and tolerance-controlled solutions. Lastly, we show that this form of adaptivity can be applied to systems of equations, and demonstrate that it solves a stiff biological model 12.28x faster than common fixed timestep algorithms. Our approach only requires the solution to a bridging problem and thus lends itself to natural generalizations beyond SDEs. PMID:29527134

  31. Behavioral and Neural Adaptation in Approach Behavior.

    PubMed

    Wang, Shuo; Falvello, Virginia; Porter, Jenny; Said, Christopher P; Todorov, Alexander

    2018-06-01

    People often make approachability decisions based on perceived facial trustworthiness. However, it remains unclear how people learn trustworthiness from a population of faces and whether this learning influences their approachability decisions. Here we investigated the neural underpinning of approach behavior and tested two important hypotheses: whether the amygdala adapts to different trustworthiness ranges and whether the amygdala is modulated by task instructions and evaluative goals. We showed that participants adapted to the stimulus range of perceived trustworthiness when making approach decisions and that these decisions were further modulated by the social context. The right amygdala showed both linear response and quadratic response to trustworthiness level, as observed in prior studies. Notably, the amygdala's response to trustworthiness was not modulated by stimulus range or social context, a possible neural dynamic adaptation. Together, our data have revealed a robust behavioral adaptation to different trustworthiness ranges as well as a neural substrate underlying approach behavior based on perceived facial trustworthiness.

  32. Application of adaptive cluster sampling to low-density populations of freshwater mussels

    USGS Publications Warehouse

    Smith, D.R.; Villella, R.F.; Lemarie, D.P.

    2003-01-01

    Freshwater mussels appear to be promising candidates for adaptive cluster sampling because they are benthic macroinvertebrates that cluster spatially and are frequently found at low densities. We applied adaptive cluster sampling to estimate density of freshwater mussels at 24 sites along the Cacapon River, WV, where a preliminary timed search indicated that mussels were present at low density. Adaptive cluster sampling increased yield of individual mussels and detection of uncommon species; however, it did not improve precision of density estimates. Because finding uncommon species, collecting individuals of those species, and estimating their densities are important conservation activities, additional research is warranted on application of adaptive cluster sampling to freshwater mussels. However, at this time we do not recommend routine application of adaptive cluster sampling to freshwater mussel populations. The ultimate, and currently unanswered, question is how to tell when adaptive cluster sampling should be used, i.e., when is a population sufficiently rare and clustered for adaptive cluster sampling to be efficient and practical? A cost-effective procedure needs to be developed to identify biological populations for which adaptive cluster sampling is appropriate.

  33. Adaptive Sampling for Urban Air Quality through Participatory Sensing

    PubMed Central

    Zeng, Yuanyuan; Xiang, Kai

    2017-01-01

    Air pollution is one of the major problems of the modern world. The popularization and powerful functions of smartphone applications enable people to participate in urban sensing and better understand the air problems around them. Data sampling is one of the most important factors affecting sensing performance. In this paper, we propose an Adaptive Sampling Scheme for Urban Air Quality (AS-air) through participatory sensing. First, we propose to find pattern rules of air quality from the historical data contributed by participants, based on the Apriori algorithm. Using these rules, we predict the upcoming air quality and use the prediction to accelerate the learning process that chooses and adapts the sampling parameter based on Q-learning. The evaluation results show that AS-air provides an energy-efficient sampling strategy, which adapts to the varying outdoor air environment with good sampling efficiency. PMID:29099766
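
    A toy version of the Q-learning component (with assumed reward shaping and synthetic data, not the paper's setup) adapts the sampling interval to the current air-quality bucket:

    ```python
    # Q-learning over sampling intervals: state = AQI bucket, action =
    # next sampling gap, reward = energy saved by sleeping minus the
    # change we slept through.
    import numpy as np

    rng = np.random.default_rng(8)
    aqi = np.empty(20_000); aqi[0] = 80.0
    for i in range(aqi.size - 1):                     # dirtier air = jumpier
        vol = 0.5 + 4.0 * aqi[i] / 300.0
        aqi[i + 1] = np.clip(aqi[i] + rng.normal(0.0, vol), 0.0, 300.0)

    gaps = np.array([1, 5, 15])                       # candidate intervals (min)
    Q = np.zeros((6, gaps.size))                      # 6 AQI buckets x 3 actions
    alpha, gamma, eps = 0.1, 0.9, 0.1
    bucket = lambda v: min(int(v // 50), 5)

    t = 0
    while t < aqi.size - gaps.max() - 1:
        s = bucket(aqi[t])
        a = rng.integers(gaps.size) if rng.random() < eps else int(np.argmax(Q[s]))
        dt = gaps[a]
        reward = 0.1 * dt - 0.2 * abs(aqi[t + dt] - aqi[t])
        Q[s, a] += alpha * (reward + gamma * Q[bucket(aqi[t + dt])].max() - Q[s, a])
        t += dt
    print("learned interval per AQI bucket:", gaps[np.argmax(Q, axis=1)])
    ```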

  34. The Limits to Adaptation: A Systems Approach

    EPA Science Inventory

    The ability to adapt to climate change is delineated by capacity thresholds, after which climate damages begin to overwhelm the adaptation response. Such thresholds depend upon physical properties (natural processes and engineering...

  35. The influence of approach-avoidance motivational orientation on conflict adaptation.

    PubMed

    Hengstler, Maikel; Holland, Rob W; van Steenbergen, Henk; van Knippenberg, Ad

    2014-06-01

    To deal effectively with a continuously changing environment, our cognitive system adaptively regulates resource allocation. Earlier findings showed that an avoidance orientation (induced by arm extension), relative to an approach orientation (induced by arm flexion), enhanced sustained cognitive control. In avoidance conditions, performance on a cognitive control task was enhanced, as indicated by a reduced congruency effect, relative to approach conditions. Extending these findings, in the present behavioral studies we investigated dynamic adaptations in cognitive control-that is, conflict adaptation. We proposed that an avoidance state recruits more resources in response to conflicting signals, and thereby increases conflict adaptation. Conversely, in an approach state, conflict processing diminishes, which consequently weakens conflict adaptation. As predicted, approach versus avoidance arm movements affected both behavioral congruency effects and conflict adaptation: As compared to approach, avoidance movements elicited reduced congruency effects and increased conflict adaptation. These results are discussed in line with a possible underlying neuropsychological model.

  36. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  37. Temporally Adaptive Sampling: A Case Study in Rare Species Survey Design with Marbled Salamanders (Ambystoma opacum)

    PubMed Central

    Charney, Noah D.; Kubel, Jacob E.; Eiseman, Charles S.

    2015-01-01

    Improving detection rates for elusive species with clumped distributions is often accomplished through adaptive sampling designs. This approach can be extended to include species with temporally variable detection probabilities. By concentrating survey effort in years when the focal species are most abundant or visible, overall detection rates can be improved. This requires either long-term monitoring at a few locations where the species are known to occur or models capable of predicting population trends using climatic and demographic data. For marbled salamanders (Ambystoma opacum) in Massachusetts, we demonstrate that annual variation in detection probability of larvae is regionally correlated. In our data, the difference in survey success between years was far more important than the difference among the three survey methods we employed: diurnal surveys, nocturnal surveys, and dipnet surveys. Based on these data, we simulate future surveys to locate unknown populations under a temporally adaptive sampling framework. In the simulations, when pond dynamics are correlated over the focal region, the temporally adaptive design improved mean survey success by as much as 26% over a non-adaptive sampling design. Employing a temporally adaptive strategy costs very little, is simple, and has the potential to substantially improve the efficient use of scarce conservation funds. PMID:25799224

  38. An evaluation of inferential procedures for adaptive clinical trial designs with pre-specified rules for modifying the sample size.

    PubMed

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2014-09-01

    Many papers have introduced adaptive clinical trial methods that allow modifications to the sample size based on interim estimates of treatment effect. There has been extensive commentary on type I error control and efficiency considerations, but little research on estimation after an adaptive hypothesis test. We evaluate the reliability and precision of different inferential procedures in the presence of an adaptive design with pre-specified rules for modifying the sampling plan. We extend group sequential orderings of the outcome space based on the stage at stopping, likelihood ratio statistic, and sample mean to the adaptive setting in order to compute median-unbiased point estimates, exact confidence intervals, and P-values uniformly distributed under the null hypothesis. The likelihood ratio ordering is found to yield shorter confidence intervals on average and higher probabilities of P-values below important thresholds than alternative approaches. The bias-adjusted mean demonstrates the lowest mean squared error among candidate point estimates. A conditional error-based approach in the literature has the benefit of being the only method that accommodates unplanned adaptations. We compare the performance of this and other methods in order to quantify the cost of failing to plan ahead in settings where adaptations could realistically be pre-specified at the design stage. We find the cost to be meaningful for all designs and treatment effects considered, and to be substantial for designs frequently proposed in the literature. © 2014, The International Biometric Society.

  39. Adaptive sampling of AEM transients

    NASA Astrophysics Data System (ADS)

    Di Massa, Domenico; Florio, Giovanni; Viezzoli, Andrea

    2016-02-01

    This paper focuses on the sampling of the electromagnetic transient as acquired by airborne time-domain electromagnetic (TDEM) systems. Typically, the sampling of the electromagnetic transient is done using a fixed number of gates whose width grows logarithmically (log-gating). Log-gating has two main benefits: improving the signal-to-noise (S/N) ratio at late times, when the electromagnetic signal has amplitudes equal to or lower than the natural background noise, and ensuring good resolution at early times. However, as a result of fixed time gates, conventional log-gating does not take into account geological variations in the surveyed area, nor the possibly varying characteristics of the measured signal. We show, using synthetic models, how a different, flexible sampling scheme can increase the resolution of resistivity models. We propose a new sampling method, which adapts the gating on the basis of slope variations in the electromagnetic (EM) transient. The use of such an alternative sampling scheme aims to obtain more accurate inverse models by extracting the geoelectrical information from the measured data in an optimal way.
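
    A simple way to mimic slope-adaptive gating on synthetic data (an illustration of the idea, not the authors' algorithm) is to place gate boundaries so that each gate spans an equal share of the accumulated slope change of the log-log transient, which concentrates gates where the decay changes character.

    ```python
    # Slope-adaptive gating of a toy decaying transient: gates are packed
    # densely where the log-log slope is changing fastest.
    import numpy as np

    t = np.geomspace(1e-5, 1e-2, 2048)                # dense pre-sampling
    sig = np.exp(-t / 1e-4) + 1e-3 * np.exp(-t / 1e-3)

    slope = np.gradient(np.log(sig), np.log(t))       # local log-log slope
    interest = np.abs(np.gradient(slope)) + 1e-12     # where slope changes
    cdf = np.cumsum(interest) / interest.sum()

    n_gates = 25                                      # equal slope change/gate
    bounds = np.interp(np.linspace(0.0, 1.0, n_gates + 1), cdf, t)
    print(f"narrowest gate {np.diff(bounds).min():.2e} s, "
          f"widest {np.diff(bounds).max():.2e} s")
    ```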

  20. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
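
    The allocation idea can be sketched as follows (a toy illustration under assumed budgets and a hypothetical `condition` threshold; it omits the design's unbiased Murthy-type estimators):

```python
import numpy as np

rng = np.random.default_rng(1)

def adaptive_two_stage(pop, n1_primary, m1, extra_budget, condition=1):
    """Toy allocation for an adaptive two-stage sequential design: survey
    m1 secondary units in each selected primary unit, then direct the
    remaining budget toward primary units whose first-stage counts meet
    the condition (an illustrative sketch only)."""
    N, M = pop.shape                        # primary x secondary units
    primaries = rng.choice(N, n1_primary, replace=False)
    first = {i: rng.choice(M, m1, replace=False) for i in primaries}
    counts = {i: int(pop[i, idx].sum()) for i, idx in first.items()}
    # adaptive step: extra effort proportional to qualifying counts
    hot = {i: c for i, c in counts.items() if c >= condition}
    total = sum(hot.values())
    alloc = {i: round(extra_budget * c / total) for i, c in hot.items()} if total else {}
    return counts, alloc

# rare, clustered population: most primary units are empty
pop = np.zeros((50, 20), dtype=int)
pop[rng.choice(50, 5, replace=False)] = rng.poisson(3, size=(5, 20))
counts, alloc = adaptive_two_stage(pop, n1_primary=15, m1=5, extra_budget=30)
print(counts); print(alloc)
```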

  1. The Formative Method for Adapting Psychotherapy (FMAP): A community-based developmental approach to culturally adapting therapy

    PubMed Central

    Hwang, Wei-Chin

    2010-01-01

    How do we culturally adapt psychotherapy for ethnic minorities? Although there has been growing interest in doing so, few therapy adaptation frameworks have been developed. The majority of these frameworks take a top-down theoretical approach to adapting psychotherapy. The purpose of this paper is to introduce a community-based developmental approach to modifying psychotherapy for ethnic minorities. The Formative Method for Adapting Psychotherapy (FMAP) is a bottom-up approach that involves collaborating with consumers to generate and support ideas for therapy adaptation. It involves five phases that target developing, testing, and reformulating therapy modifications. These phases include: (a) generating knowledge and collaborating with stakeholders, (b) integrating generated information with theory and empirical and clinical knowledge, (c) reviewing the initial culturally adapted clinical intervention with stakeholders and revising the culturally adapted intervention, (d) testing the culturally adapted intervention, and (e) finalizing the culturally adapted intervention. Application of the FMAP is illustrated using examples from a study adapting psychotherapy for Chinese Americans, but the method can also be readily applied to modify therapy for other ethnic groups. PMID:20625458

  2. Irregular and adaptive sampling for automatic geophysic measure systems

    NASA Astrophysics Data System (ADS)

    Avagnina, Davide; Lo Presti, Letizia; Mulassano, Paolo

    2000-07-01

    In this paper a sampling method, based on an irregular and adaptive strategy, is described. It can be used as an automatic guide for rovers designed to explore terrestrial and planetary environments. Starting from the hypothesis that an exploratory vehicle is equipped with a payload able to acquire measurements of quantities of interest, the method detects objects of interest from the measured points and realizes an adaptive sampling, while only coarsely describing the uninteresting background.

  3. Flight Approach to Adaptive Control Research

    NASA Technical Reports Server (NTRS)

    Pavlock, Kate Maureen; Less, James L.; Larson, David Nils

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on a full-scale F-18 testbed. The testbed served as a full-scale vehicle to test and validate adaptive flight control research addressing technical challenges involved with reducing risk to enable safe flight in the presence of adverse conditions such as structural damage or control surface failures. This paper describes the research interface architecture, risk mitigations, flight test approach and lessons learned of adaptive controls research.

  4. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted to Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently hones in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors, which may be unavoidable in the extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
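
    A minimal sketch of the Differential Evolution Monte Carlo stage follows (the Adaptive Metropolis stage and the fault-tolerance machinery are omitted; the toy Gaussian posterior is ours):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(x):                        # toy target: standard Gaussian posterior
    return -0.5 * np.sum(x ** 2)

def demc_sweep(chains, eps=1e-4):
    """One Differential Evolution MC sweep: each chain proposes a jump
    along the difference of two randomly chosen other chains, giving
    scale-adapted proposals without a hand-tuned proposal covariance."""
    n, d = chains.shape
    gamma = 2.38 / np.sqrt(2 * d)       # standard DE-MC step factor
    for i in range(n):
        a, b = rng.choice([j for j in range(n) if j != i], size=2, replace=False)
        prop = chains[i] + gamma * (chains[a] - chains[b]) \
               + eps * rng.standard_normal(d)
        if np.log(rng.random()) < log_post(prop) - log_post(chains[i]):
            chains[i] = prop            # Metropolis accept/reject

chains = 5.0 * rng.standard_normal((40, 5))   # large, loosely coupled ensemble
for _ in range(2000):
    demc_sweep(chains)
print(chains.mean(axis=0).round(2), chains.std(axis=0).round(2))  # ~0 and ~1
```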

  5. Adaptive Oceanographic Sampling in a Coastal Environment Using Autonomous Gliding Vehicles

    DTIC Science & Technology

    2003-08-01

    cost autonomous vehicles with near-global range and modular sensor payload. Particular emphasis is placed on the development of adaptive sampling ... environment. Secondary objectives include continued development of adaptive sampling strategies suitable for large fleets of slow-moving autonomous vehicles, and development and implementation of new oceanographic sensors and sampling methodologies. The main task completed was a complete redesign of

  6. A modular approach to adaptive structures.

    PubMed

    Pagitz, Markus; Pagitz, Manuel; Hühne, Christian

    2014-10-07

    A remarkable property of nastic, shape-changing plants is their complete fusion between actuators and structure. This is achieved by combining a large number of cells whose geometry, internal pressures and material properties are optimized for a given set of target shapes and stiffness requirements. An advantage of such a fusion is that the cell walls are prestressed by the cell pressures, which increases the overall structural stiffness while decreasing the weight. Inspired by the nastic movement of plants, Pagitz et al (2012 Bioinspir. Biomim. 7) published a novel concept for pressure-actuated cellular structures. This article extends previous work by introducing a modular approach to adaptive structures. An algorithm that breaks down any continuous target shapes into a small number of standardized modules is presented. Furthermore, it is shown how cytoskeletons within each cell enhance the properties of adaptive modules. An adaptive passenger seat and an aircraft's leading and trailing edges are used to demonstrate the potential of a modular approach.

  7. Flight Test Approach to Adaptive Control Research

    NASA Technical Reports Server (NTRS)

    Pavlock, Kate Maureen; Less, James L.; Larson, David Nils

    2011-01-01

    The National Aeronautics and Space Administration's Dryden Flight Research Center completed flight testing of adaptive controls research on a full-scale F-18 testbed. The validation of adaptive controls has the potential to enhance safety in the presence of adverse conditions such as structural damage or control surface failures. This paper describes the research interface architecture, risk mitigations, flight test approach and lessons learned of adaptive controls research.

  8. Advances in adaptive control theory: Gradient- and derivative-free approaches

    NASA Astrophysics Data System (ADS)

    Yucelen, Tansel

    In this dissertation, we present new approaches to improve standard designs in adaptive control theory, and novel adaptive control architectures. We first present a novel Kalman filter based approach for approximately enforcing a linear constraint in standard adaptive control design. One application is that this leads to alternative forms for well known modification terms such as e-modification. In addition, it leads to smaller tracking errors without incurring significant oscillations in the system response and without requiring high modification gain. We derive alternative forms of e- and adaptive loop recovery (ALR-) modifications. Next, we show how to use Kalman filter optimization to derive a novel adaptation law. This results in an optimization-based time-varying adaptation gain that reduces the need for adaptation gain tuning. A second major contribution of this dissertation is the development of a novel derivative-free, delayed weight update law for adaptive control. The assumption of constant unknown ideal weights is relaxed to the existence of time-varying weights, such that fast and possibly discontinuous variation in weights are allowed. This approach is particularly advantageous for applications to systems that can undergo a sudden change in dynamics, such as might be due to reconfiguration, deployment of a payload, docking, or structural damage, and for rejection of external disturbance processes. As a third and final contribution, we develop a novel approach for extending all the methods developed in this dissertation to the case of output feedback. The approach is developed only for the case of derivative-free adaptive control, and the extension of the other approaches developed previously for the state feedback case to output feedback is left as a future research topic. The proposed approaches of this dissertation are illustrated in both simulation and flight test.

  9. Adaptive approaches to biosecurity governance.

    PubMed

    Cook, David C; Liu, Shuang; Murphy, Brendan; Lonsdale, W Mark

    2010-09-01

    This article discusses institutional changes that may facilitate an adaptive approach to biosecurity risk management where governance is viewed as a multidisciplinary, interactive experiment acknowledging uncertainty. Using the principles of adaptive governance, evolved from institutional theory, we explore how the concepts of lateral information flows, incentive alignment, and policy experimentation might shape Australia's invasive species defense mechanisms. We suggest design principles for biosecurity policies emphasizing overlapping complementary response capabilities and the sharing of invasive species risks via a polycentric system of governance. © 2010 Society for Risk Analysis

  10. Passive and active adaptive management: Approaches and an example

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    Adaptive management is a framework for resource conservation that promotes iterative, learning-based decision making. Yet there remains considerable confusion about what adaptive management entails, and how to actually make resource decisions adaptively. A key but somewhat ambiguous distinction in adaptive management is between active and passive forms of adaptive decision making. The objective of this paper is to illustrate some approaches to active and passive adaptive management with a simple example involving the drawdown of water impoundments on a wildlife refuge. The approaches are illustrated for the drawdown example, and contrasted in terms of objectives, costs, and potential learning rates. Some key challenges to the actual practice of adaptive management are discussed, and tradeoffs between implementation costs and long-term benefits are highlighted. © 2010 Elsevier Ltd.

  11. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    PubMed Central

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion (STED) microscopy is one of the far-field optical microscopy techniques that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted due to aberrations of optical systems and inhomogeneity of specimens' optical properties, resulting in a compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super-resolution effect no matter how much depletion power is applied to the specimen. Previously, several adaptive optics approaches have been explored to compensate for aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured by using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve the spatial resolution in imaging thick samples. PMID:29400356

  12. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    NASA Astrophysics Data System (ADS)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-01

    Molecular dynamics simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules, but they are limited by the time scale barrier. That is, we may not be able to obtain properties efficiently because we need to run simulations of microseconds or longer using femtosecond time steps. To overcome this time scale barrier, we can use the weighted ensemble (WE) method, a powerful enhanced sampling method that efficiently samples thermodynamic and kinetic properties. However, the WE method requires an appropriate partitioning of phase space into discrete macrostates, which can be problematic when we have a high-dimensional collective space or when little is known a priori about the molecular system. Hence, we developed a new WE-based method, called the "Concurrent Adaptive Sampling (CAS) algorithm," to tackle these issues. The CAS algorithm is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and adaptive macrostates to enhance the sampling in the high-dimensional space. This is especially useful for systems in which we do not know what the right reaction coordinates are, in which case we can use many collective variables to sample conformations and pathways. In addition, a clustering technique based on the committor function is used to accelerate sampling the slowest process in the molecular system. In this paper, we introduce the new method and show results from two-dimensional models and bio-molecules, specifically penta-alanine and a triazine trimer.
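
    The weighted-ensemble core that CAS builds on can be sketched in a few lines (fixed bins and a double-well toy problem; CAS's adaptive macrostates, many collective variables, and committor-based clustering are beyond this sketch):

```python
import numpy as np

rng = np.random.default_rng(2)

def propagate(x, n_steps=10, dt=0.01):
    """Overdamped Langevin dynamics in the double well U(x) = (x^2 - 1)^2."""
    for _ in range(n_steps):
        x = x - 4 * x * (x * x - 1) * dt + np.sqrt(2 * dt) * rng.standard_normal()
    return x

def we_iteration(walkers, edges, target=4):
    """One weighted-ensemble iteration: propagate every walker, bin by
    position, then split/merge inside each bin back to `target` walkers
    while conserving total probability weight."""
    walkers = [(propagate(x), w) for x, w in walkers]
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        binw = sorted(((x, w) for x, w in walkers if lo <= x < hi),
                      key=lambda p: p[1])
        if not binw:
            continue
        while len(binw) > target:            # merge the two lightest walkers
            (x1, w1), (x2, w2) = binw[0], binw[1]
            keep = x1 if rng.random() < w1 / (w1 + w2) else x2
            binw = sorted([(keep, w1 + w2)] + binw[2:], key=lambda p: p[1])
        while len(binw) < target:            # split the heaviest walker
            x, w = binw.pop()
            binw = sorted(binw + [(x, w / 2), (x, w / 2)], key=lambda p: p[1])
        out += binw
    return out

edges = [-np.inf, -1.0, -0.5, 0.0, 0.5, 1.0, np.inf]
walkers = [(-1.0, 1.0 / 8)] * 8              # all probability starts in the left well
for _ in range(300):
    walkers = we_iteration(walkers, edges)
print(sum(w for x, w in walkers if x > 0))   # weight that has crossed the barrier
```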

  13. Robust online tracking via adaptive samples selection with saliency detection

    NASA Astrophysics Data System (ADS)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has shown to be successful in tracking previously unknown objects. However, two important factors lead to the drift problem in online tracking: one is how to select correctly labeled samples even when the target locations are inaccurate, and the other is how to handle confusors which have features similar to the target. In this article, we propose a robust online tracking algorithm with adaptive sample selection based on saliency detection to overcome the drift problem. To avoid degrading the classifiers with misaligned samples, we introduce a saliency detection method into our tracking problem. Saliency maps and the strong classifiers are combined to extract the most reliable positive samples. Our approach employs a simple yet effective saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using random patches as negative samples, we propose a reasonable selection criterion in which both saliency confidence and similarity are considered, with the benefit that confusors in the surrounding background are incorporated into the classifier update process before drift occurs. The tracking task is formulated as binary classification via an online boosting framework. Experimental results on several challenging video sequences demonstrate the accuracy and stability of our tracker.
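
    The spectral-residual saliency computation the abstract refers to is compact enough to sketch (a generic Hou-Zhang-style implementation, not the authors' exact pipeline):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spectral_residual_saliency(img, size=3):
    """Saliency via spectral residual analysis: subtract the locally
    smoothed log-amplitude spectrum, keep the phase, and transform back."""
    f = np.fft.fft2(img)
    log_amp = np.log(np.abs(f) + 1e-12)
    phase = np.angle(f)
    residual = log_amp - uniform_filter(log_amp, size)   # remove the smooth trend
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return uniform_filter(sal, size)                     # smoothed saliency map

img = np.zeros((64, 64)); img[20:30, 35:45] = 1.0        # one salient block
sal = spectral_residual_saliency(img)
print(np.unravel_index(sal.argmax(), sal.shape))         # peaks near the block
```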

  14. On adaptive robustness approach to Anti-Jam signal processing

    NASA Astrophysics Data System (ADS)

    Poberezhskiy, Y. S.; Poberezhskiy, G. Y.

    An effective approach to exploiting statistical differences between desired and jamming signals, named adaptive robustness, is proposed and analyzed in this paper. It combines conventional Bayesian, adaptive, and robust approaches that are complementary to each other. This combination strengthens the advantages and mitigates the drawbacks of the conventional approaches. Adaptive robustness is equally applicable to both jammers and their victim systems. The capabilities required for realization of adaptive robustness in jammers and victim systems are determined. The employment of a specific nonlinear robust algorithm for anti-jam (AJ) processing is described and analyzed. Its effectiveness in practical situations has been proven analytically and confirmed by simulation. Since adaptive robustness can be used by both sides in electronic warfare, it is more advantageous for the fastest and most intelligent side. Many results obtained and discussed in this paper are also applicable to commercial applications such as communications in unregulated or poorly regulated frequency ranges and systems with cognitive capabilities.

  15. An efficient adaptive sampling strategy for global surrogate modeling with applications in multiphase flow simulation

    NASA Astrophysics Data System (ADS)

    Mo, S.; Lu, D.; Shi, X.; Zhang, G.; Ye, M.; Wu, J.

    2016-12-01

    Surrogate models have shown remarkable computational efficiency in hydrological simulations involving design space exploration, sensitivity analysis, uncertainty quantification, etc. The central task in constructing a global surrogate model is to achieve a prescribed approximation accuracy with as few original model executions as possible, which requires a good design strategy to optimize the distribution of data points in the parameter domains and an effective stopping criterion to automatically terminate the design process when the desired approximation accuracy is achieved. This study proposes a novel adaptive sampling strategy, which starts from a small number of initial samples and adaptively selects additional samples by balancing collection in unexplored regions and refinement in interesting areas. We define an efficient and effective evaluation metric based on a Taylor expansion to select the most promising potential samples from candidate points, and propose a robust stopping criterion based on the approximation accuracy at new points to guarantee the achievement of the desired accuracy. The numerical results of several benchmark analytical functions indicate that the proposed approach is more computationally efficient and robust than the widely used maximin distance design and two other well-known adaptive sampling strategies. The application to two complicated multiphase flow problems further demonstrates the efficiency and effectiveness of our method in constructing global surrogate models for high-dimensional and highly nonlinear problems. Acknowledgements: This work was financially supported by the National Natural Science Foundation of China grants No. 41030746 and 41172206.
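
    A generic version of such an adaptive sampling loop might look as follows (an illustrative sketch: `RBFInterpolator` stands in for the surrogate, and a finite-difference roughness proxy replaces the paper's Taylor-expansion metric and stopping criterion):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)

def f(x):                                         # stand-in for the expensive model
    return np.sin(6 * x[:, 0]) * np.cos(4 * x[:, 1])

def next_sample(X, y, n_cand=500, w_explore=0.5):
    """Score random candidates by distance to existing samples (exploration)
    plus local surrogate roughness (refinement) and return the best one."""
    cand = rng.random((n_cand, X.shape[1]))
    surro = RBFInterpolator(X, y)
    d = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
    g = np.zeros(n_cand)                          # finite-difference gradient proxy
    eps = 1e-3
    for j in range(X.shape[1]):
        e = np.zeros(X.shape[1]); e[j] = eps
        g += ((surro(cand + e) - surro(cand - e)) / (2 * eps)) ** 2
    g = np.sqrt(g)
    score = w_explore * d / d.max() + (1 - w_explore) * g / g.max()
    return cand[np.argmax(score)]

X = rng.random((10, 2)); y = f(X)                 # small initial design
for _ in range(20):                               # adaptively grow the design
    xn = next_sample(X, y)
    X = np.vstack([X, xn]); y = np.append(y, f(xn[None, :]))
print(X.shape)                                    # (30, 2)
```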

  16. Design of telehealth trials--introducing adaptive approaches.

    PubMed

    Law, Lisa M; Wason, James M S

    2014-12-01

    The field of telehealth and telemedicine is expanding as the need to improve the efficiency of health care becomes more pressing. The decision to implement a telehealth system is generally an expensive undertaking that impacts a large number of patients and other stakeholders. It is therefore extremely important that the decision is fully supported by accurate evaluation of telehealth interventions. Numerous reviews of telehealth have described the evidence base as inconsistent. In response, they call for larger, more rigorously controlled trials, and trials which go beyond evaluation of clinical effectiveness alone. The aim of this paper is to discuss various ways in which evaluation of telehealth could be improved by the use of adaptive trial designs. We discuss various adaptive design options, such as sample size reviews and changing the study hypothesis to address uncertain parameters, group sequential trials and multi-arm multi-stage trials to improve efficiency, and enrichment designs to maximise the chances of obtaining clear evidence about the telehealth intervention. There is potential to address the flaws discussed in the telehealth literature through the adoption of adaptive approaches to trial design. Such designs could lead to improvements in efficiency, allow the evaluation of multiple telehealth interventions in a cost-effective way, or accurately assess a range of endpoints that are important in the overall success of a telehealth programme. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Lateral Coherence and Mixing in the Coastal Ocean: Adaptive Sampling using Gliders

    DTIC Science & Technology

    2012-09-30

    Adaptive Sampling using Gliders R. Kipp Shearman, Jonathan D. Nash, James N. Moum, John A. Barth, College of Oceanic & Atmospheric Sciences, Oregon State ... persistent on O(3 day) timescales, so are ideally suited to be adaptively sampled by autonomous gliders that actively report both turbulent and ... plan to deploy 4 AUV gliders to perform intensive, adaptive surveys. Newly-enhanced to measure turbulent mixing, water-column currents and dye

  18. Lateral Coherence and Mixing in the Coastal Ocean: Adaptive Sampling using Gliders

    DTIC Science & Technology

    2011-09-30

    Coherence and Mixing in the Coastal Ocean: Adaptive Sampling using Gliders R. Kipp Shearman, Jonathan D. Nash, James N. Moum, John A. Barth, College of ... These structures evolve yet are often persistent on O(3 day) timescales, so are ideally suited to be adaptively sampled by autonomous gliders that ... processes driving lateral dispersion, we plan to deploy 4 AUV gliders to perform intensive, adaptive surveys. Newly-enhanced to measure turbulent mixing

  19. An Evidence-Based Public Health Approach to Climate Change Adaptation

    PubMed Central

    Eidson, Millicent; Tlumak, Jennifer E.; Raab, Kristin K.; Luber, George

    2014-01-01

    Background: Public health is committed to evidence-based practice, yet there has been minimal discussion of how to apply an evidence-based practice framework to climate change adaptation. Objectives: Our goal was to review the literature on evidence-based public health (EBPH), to determine whether it can be applied to climate change adaptation, and to consider how emphasizing evidence-based practice may influence research and practice decisions related to public health adaptation to climate change. Methods: We conducted a substantive review of EBPH, identified a consensus EBPH framework, and modified it to support an EBPH approach to climate change adaptation. We applied the framework to an example and considered implications for stakeholders. Discussion: A modified EBPH framework can accommodate the wide range of exposures, outcomes, and modes of inquiry associated with climate change adaptation and the variety of settings in which adaptation activities will be pursued. Several factors currently limit application of the framework, including a lack of higher-level evidence of intervention efficacy and a lack of guidelines for reporting climate change health impact projections. To enhance the evidence base, there must be increased attention to designing, evaluating, and reporting adaptation interventions; standardized health impact projection reporting; and increased attention to knowledge translation. This approach has implications for funders, researchers, journal editors, practitioners, and policy makers. Conclusions: The current approach to EBPH can, with modifications, support climate change adaptation activities, but there is little evidence regarding interventions and knowledge translation, and guidelines for projecting health impacts are lacking. Realizing the goal of an evidence-based approach will require systematic, coordinated efforts among various stakeholders. Citation: Hess JJ, Eidson M, Tlumak JE, Raab KK, Luber G. 2014. An evidence-based public health approach to climate change adaptation.

  20. Adapted random sampling patterns for accelerated MRI.

    PubMed

    Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf

    2011-02-01

    Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters, which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when optimization of the model parameter is carried out, and yields superior results to non-optimized model parameters. However, no random sampling pattern showed superior performance when compared to conventional Cartesian subsampling for the considered reconstruction strategy.
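
    The core idea, turning a reference power spectrum directly into a sampling density, can be sketched as follows (our simplified reading; the clipping and normalization details are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

def pattern_from_reference(ref, accel=4):
    """Variable-density random mask whose point density follows the power
    spectrum of a reference image; no model or parameter tuning involved."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(ref))) ** 2
    p = spec / spec.sum() * (ref.size / accel)    # aim for ~size/accel samples
    return rng.random(ref.shape) < np.clip(p, 0.0, 1.0)

ref = np.outer(np.hanning(128), np.hanning(128))  # stand-in reference image
mask = pattern_from_reference(ref, accel=4)
print(mask.mean())   # realized sampling fraction (< 1/accel when p saturates at 1)
```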

  1. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻ Å⁻²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).

  2. Adaptive approaches to licensing, health technology assessment, and introduction of drugs and devices.

    PubMed

    Husereau, Don; Henshall, Chris; Jivraj, Jamil

    2014-07-01

    Adaptive approaches to the introduction of drugs and medical devices involve the use of an evolving evidence base, rather than conventional single-point-in-time evaluations, as a proposed means to promote patient access to innovation, reduce clinical uncertainty, ensure effectiveness, and improve the health technology development process. This report summarizes a Health Technology Assessment International (HTAi) Policy Forum discussion, drawing on presentations from invited experts, discussions among attendees about real-world case examples, and a background paper. For adaptive approaches to be understood, accepted, and implemented, the Forum identified several key issues that must be addressed. These include the need to define the goals of and to set priorities for adaptive approaches; to examine evidence collection approaches; to clarify the roles and responsibilities of stakeholders; to understand the implications of adaptive approaches on current legal and ethical standards; to determine the costs of such approaches and how they will be met; and to identify differences in applying adaptive approaches to drugs versus medical devices. The Forum also explored the different implications of adaptive approaches for various stakeholders, including patients, regulators, HTA/coverage bodies, health systems, clinicians, and industry. A key outcome of the meeting was a clearer understanding of the opportunities and challenges adaptive approaches present. Furthermore, the Forum brought to light the critical importance of recognizing and including a full range of stakeholders as contributors to a shared decision-making model implicit in adaptive pathways in future discussions on, and implementation of, adaptive approaches.

  3. Career Adapt-Abilities Scale in a French-Speaking Swiss Sample: Psychometric Properties and Relationships to Personality and Work Engagement

    ERIC Educational Resources Information Center

    Rossier, Jerome; Zecca, Gregory; Stauffer, Sarah D.; Maggiori, Christian; Dauwalder, Jean-Pierre

    2012-01-01

    The aim of this study was to analyze the psychometric properties of the Career Adapt-Abilities Scale (CAAS) in a French-speaking Swiss sample and its relationship with personality dimensions and work engagement. The heterogeneous sample of 391 participants (mean age = 39.59, SD = 12.30) completed the CAAS-International and a short version…

  4. An evidence-based public health approach to climate change adaptation.

    PubMed

    Hess, Jeremy J; Eidson, Millicent; Tlumak, Jennifer E; Raab, Kristin K; Luber, George

    2014-11-01

    Public health is committed to evidence-based practice, yet there has been minimal discussion of how to apply an evidence-based practice framework to climate change adaptation. Our goal was to review the literature on evidence-based public health (EBPH), to determine whether it can be applied to climate change adaptation, and to consider how emphasizing evidence-based practice may influence research and practice decisions related to public health adaptation to climate change. We conducted a substantive review of EBPH, identified a consensus EBPH framework, and modified it to support an EBPH approach to climate change adaptation. We applied the framework to an example and considered implications for stakeholders. A modified EBPH framework can accommodate the wide range of exposures, outcomes, and modes of inquiry associated with climate change adaptation and the variety of settings in which adaptation activities will be pursued. Several factors currently limit application of the framework, including a lack of higher-level evidence of intervention efficacy and a lack of guidelines for reporting climate change health impact projections. To enhance the evidence base, there must be increased attention to designing, evaluating, and reporting adaptation interventions; standardized health impact projection reporting; and increased attention to knowledge translation. This approach has implications for funders, researchers, journal editors, practitioners, and policy makers. The current approach to EBPH can, with modifications, support climate change adaptation activities, but there is little evidence regarding interventions and knowledge translation, and guidelines for projecting health impacts are lacking. Realizing the goal of an evidence-based approach will require systematic, coordinated efforts among various stakeholders.

  5. Sampling-free Bayesian inversion with adaptive hierarchical tensor representations

    NASA Astrophysics Data System (ADS)

    Eigel, Martin; Marschall, Manuel; Schneider, Reinhold

    2018-03-01

    A sampling-free approach to Bayesian inversion with an explicit polynomial representation of the parameter densities is developed, based on an affine-parametric representation of a linear forward model. This becomes feasible due to the complete treatment in function spaces, which requires an efficient model reduction technique for numerical computations. The advocated perspective yields the crucial benefit that error bounds can be derived for all occurring approximations, leading to provable convergence subject to the discretization parameters. Moreover, it enables a fully adaptive a posteriori control with automatic problem-dependent adjustments of the employed discretizations. The method is discussed in the context of modern hierarchical tensor representations, which are used for the evaluation of a random PDE (the forward model) and the subsequent high-dimensional quadrature of the log-likelihood, alleviating the ‘curse of dimensionality’. Numerical experiments demonstrate the performance and confirm the theoretical results.

  6. Efficient estimation of abundance for patchily distributed populations via two-phase, adaptive sampling.

    USGS Publications Warehouse

    Conroy, M.J.; Runge, J.P.; Barker, R.J.; Schofield, M.R.; Fonnesbeck, C.J.

    2008-01-01

    Many organisms are patchily distributed, with some patches occupied at high density, others at lower densities, and others not occupied. Estimation of overall abundance can be difficult and is inefficient via intensive approaches such as capture-mark-recapture (CMR) or distance sampling. We propose a two-phase sampling scheme and model in a Bayesian framework to estimate abundance for patchily distributed populations. In the first phase, occupancy is estimated by binomial detection samples taken on all selected sites, where selection may be of all sites available, or a random sample of sites. Detection can be by visual surveys, detection of sign, physical captures, or other approach. At the second phase, if a detection threshold is achieved, CMR or other intensive sampling is conducted via standard procedures (grids or webs) to estimate abundance. Detection and CMR data are then used in a joint likelihood to model probability of detection in the occupancy sample via an abundance-detection model. CMR modeling is used to estimate abundance for the abundance-detection relationship, which in turn is used to predict abundance at the remaining sites, where only detection data are collected. We present a full Bayesian modeling treatment of this problem, in which posterior inference on abundance and other parameters (detection, capture probability) is obtained under a variety of assumptions about spatial and individual sources of heterogeneity. We apply the approach to abundance estimation for two species of voles (Microtus spp.) in Montana, USA. We also use a simulation study to evaluate the frequentist properties of our procedure given known patterns in abundance and detection among sites as well as design criteria. For most population characteristics and designs considered, bias and mean-square error (MSE) were low, and coverage of true parameter values by Bayesian credibility intervals was near nominal. Our two-phase, adaptive approach allows efficient estimation of abundance for patchily distributed populations.

  7. Maximum type 1 error rate inflation in multiarmed clinical trials with adaptive interim sample size modifications.

    PubMed

    Graf, Alexandra C; Bauer, Peter; Glimm, Ekkehard; Koenig, Franz

    2014-07-01

    Sample size modifications in the interim analyses of an adaptive design can inflate the type 1 error rate, if test statistics and critical boundaries are used in the final analysis as if no modification had been made. While this is already true for designs with an overall change of the sample size in a balanced treatment-control comparison, the inflation can be much larger if, in addition, a modification of allocation ratios is allowed as well. In this paper, we investigate adaptive designs with several treatment arms compared to a single common control group. Regarding modifications, we consider treatment arm selection as well as modifications of overall sample size and allocation ratios. The inflation is quantified for two approaches: a naive procedure that ignores not only all modifications, but also the multiplicity issue arising from the many-to-one comparison, and a Dunnett procedure that ignores modifications, but adjusts for the initially started multiple treatments. The maximum inflation of the type 1 error rate for such designs can be calculated by searching for the "worst case" scenarios, that is, sample size adaptation rules in the interim analysis that lead to the largest conditional type 1 error rate at any point of the sample space. To show the most extreme inflation, we initially assume unconstrained second-stage sample size modifications, leading to a large inflation of the type 1 error rate. Furthermore, we investigate the inflation when putting constraints on the second-stage sample sizes. It turns out that, for example, fixing the sample size of the control group leads to designs controlling the type 1 error rate. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Multi-species attributes as the condition for adaptive sampling of rare species using two-stage sequential sampling with an auxiliary variable

    USGS Publications Warehouse

    Panahbehagh, B.; Smith, D.R.; Salehi, M.M.; Hornbach, D.J.; Brown, D.J.; Chan, F.; Marinova, D.; Anderssen, R.S.

    2011-01-01

    Assessing populations of rare species is challenging because of the large effort required to locate patches of occupied habitat and achieve precise estimates of density and abundance. The presence of a rare species has been shown to be correlated with the presence or abundance of more common species. Thus, ecological community richness or abundance can be used to inform sampling of rare species. Adaptive sampling designs have been developed specifically for rare and clustered populations and have been applied to a wide range of rare species. However, adaptive sampling can be logistically challenging, in part because variation in final sample size introduces uncertainty in survey planning. Two-stage sequential sampling (TSS), a recently developed design, allows for adaptive sampling, but avoids edge units and has an upper bound on final sample size. In this paper we present an extension of two-stage sequential sampling that incorporates an auxiliary variable (TSSAV), such as community attributes, as the condition for adaptive sampling. We develop a set of simulations to approximate sampling of endangered freshwater mussels to evaluate the performance of the TSSAV design. The performance measures that we are interested in are efficiency and the probability of sampling a unit occupied by the rare species. Efficiency measures the precision of the population estimate from the TSSAV design relative to a standard design, such as simple random sampling (SRS). The simulations indicate that the density and distribution of the auxiliary population is the most important determinant of the performance of the TSSAV design. Of the design factors, such as sample size, the fraction of the primary units sampled was most important. For the best scenarios, the odds of sampling the rare species were approximately 1.5 times higher for TSSAV compared to SRS, and efficiency was as high as 2 (i.e., variance from TSSAV was half that of SRS). We have found that design performance, especially for adaptive

  9. Superresolution restoration of an image sequence: adaptive filtering approach.

    PubMed

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology suggests least squares (LS) estimators which adapt in time, based on adaptive filters, either least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space- and time-variant blurring and arbitrary motion, both assumed known. The proposed approach is shown to have relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
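
    The LMS building block the method rests on is easily sketched in one dimension (a standard system-identification demo, not the full space-time superresolution filter):

```python
import numpy as np

def lms_filter(x, d, n_taps=3, mu=0.05):
    """Least-mean-squares adaptive filter: the weights track a (possibly
    time-varying) linear relation between input x and desired output d."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # x[n], x[n-1], ..., most recent first
        y[n] = w @ u
        e = d[n] - y[n]                     # instantaneous error
        w += 2 * mu * e * u                 # stochastic gradient step
    return y, w

rng = np.random.default_rng(5)
x = rng.standard_normal(5000)
h = np.array([0.6, -0.3, 0.1])              # unknown blur to identify/track
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
_, w = lms_filter(x, d)
print(w)                                     # ≈ [0.6, -0.3, 0.1]
```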

  10. Stochastic, adaptive sampling of information by microvilli in fly photoreceptors.

    PubMed

    Song, Zhuoyi; Postma, Marten; Billings, Stephen A; Coca, Daniel; Hardie, Roger C; Juusola, Mikko

    2012-08-07

    In fly photoreceptors, light is focused onto a photosensitive waveguide, the rhabdomere, consisting of tens of thousands of microvilli. Each microvillus is capable of generating elementary responses, quantum bumps, in response to single photons using a stochastically operating phototransduction cascade. Whereas much is known about the cascade reactions, less is known about how the concerted action of the microvilli population encodes light changes into neural information and how the ultrastructure and biochemical machinery of photoreceptors of flies and other insects evolved in relation to the information sampling and processing they perform. We generated biophysically realistic fly photoreceptor models, which accurately simulate the encoding of visual information. By comparing stochastic simulations with single cell recordings from Drosophila photoreceptors, we show how adaptive sampling by 30,000 microvilli captures the temporal structure of natural contrast changes. Following each bump, individual microvilli are rendered briefly (~100-200 ms) refractory, thereby reducing quantum efficiency with increasing intensity. The refractory period opposes saturation, dynamically and stochastically adjusting availability of microvilli (bump production rate: sample rate), whereas intracellular calcium and voltage adapt bump amplitude and waveform (sample size). These adapting sampling principles result in robust encoding of natural light changes, which both approximates perceptual contrast constancy and enhances novel events under different light conditions, and predict information processing across a range of species with different visual ecologies. These results clarify why fly photoreceptors are structured the way they are and function as they do, linking sensory information to sensory evolution and revealing benefits of stochasticity for neural information processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Stochastic, Adaptive Sampling of Information by Microvilli in Fly Photoreceptors

    PubMed Central

    Song, Zhuoyi; Postma, Marten; Billings, Stephen A.; Coca, Daniel; Hardie, Roger C.; Juusola, Mikko

    2012-01-01

    Background: In fly photoreceptors, light is focused onto a photosensitive waveguide, the rhabdomere, consisting of tens of thousands of microvilli. Each microvillus is capable of generating elementary responses, quantum bumps, in response to single photons using a stochastically operating phototransduction cascade. Whereas much is known about the cascade reactions, less is known about how the concerted action of the microvilli population encodes light changes into neural information and how the ultrastructure and biochemical machinery of photoreceptors of flies and other insects evolved in relation to the information sampling and processing they perform. Results: We generated biophysically realistic fly photoreceptor models, which accurately simulate the encoding of visual information. By comparing stochastic simulations with single cell recordings from Drosophila photoreceptors, we show how adaptive sampling by 30,000 microvilli captures the temporal structure of natural contrast changes. Following each bump, individual microvilli are rendered briefly (∼100–200 ms) refractory, thereby reducing quantum efficiency with increasing intensity. The refractory period opposes saturation, dynamically and stochastically adjusting availability of microvilli (bump production rate: sample rate), whereas intracellular calcium and voltage adapt bump amplitude and waveform (sample size). These adapting sampling principles result in robust encoding of natural light changes, which both approximates perceptual contrast constancy and enhances novel events under different light conditions, and predict information processing across a range of species with different visual ecologies. Conclusions: These results clarify why fly photoreceptors are structured the way they are and function as they do, linking sensory information to sensory evolution and revealing benefits of stochasticity for neural information processing. PMID:22704990

  12. An Approach to Stable Gradient-Descent Adaptation of Higher Order Neural Units.

    PubMed

    Bukovsky, Ivo; Homma, Noriyasu

    2017-09-01

    Stability evaluation of the weight-update system of higher order neural units (HONUs) with polynomial aggregation of neural inputs (also known as classes of polynomial neural networks), for adaptation of both feedforward and recurrent HONUs by a gradient descent method, is introduced. The essential core of the approach is based on the spectral radius of the weight-update system, which allows stability to be monitored and maintained at every adaptation step individually. Assuring the stability of the weight-update system (at every single adaptation step) naturally results in the adaptation stability of the whole neural architecture that adapts to the target data. As an aside, the approach highlights the fact that the weight optimization of a HONU is a linear problem, so the proposed approach can be generally extended to any neural architecture that is linear in its adaptable parameters.
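
    For a unit that is linear in its weights, the per-step update map has a single critical eigenvalue, which makes the monitoring cheap; the sketch below (our construction, with an assumed fallback rule) illustrates the idea on a quadratic HONU:

```python
import numpy as np

rng = np.random.default_rng(8)

def monitored_update(w, x, target, mu=0.1):
    """Gradient-descent update of a unit linear in its weights. The update
    map w <- (I - mu x x^T) w + mu*target*x has one non-unit eigenvalue,
    1 - mu*||x||^2; if its magnitude reaches 1, the step is shrunk so the
    component along x stays contractive (a sketch of the stability rule)."""
    lam = 1.0 - mu * (x @ x)            # critical eigenvalue of the update map
    if abs(lam) >= 1.0:
        mu = 1.0 / (x @ x)              # normalized step: eigenvalue becomes 0
    e = target - w @ x                  # prediction error
    return w + mu * e * x

w = np.zeros(3)                          # quadratic HONU: x = [1, u, u^2]
for _ in range(3000):
    u = 3.0 * rng.standard_normal()      # occasionally large, destabilizing inputs
    x = np.array([1.0, u, u * u])
    w = monitored_update(w, x, target=1 + 2 * u - 0.5 * u * u)
print(w.round(3))                        # ≈ [1, 2, -0.5]
```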

  13. An unbiased adaptive sampling algorithm for the exploration of RNA mutational landscapes under evolutionary pressure.

    PubMed

    Waldispühl, Jérôme; Ponty, Yann

    2011-11-01

    The analysis of the relationship between sequences and structures (i.e., how mutations affect structures and, reciprocally, how structures influence mutations) is essential to decipher the principles driving molecular evolution, to infer the origins of genetic diseases, and to develop bioengineering applications such as the design of artificial molecules. Because their structures can be predicted from the sequence data only, RNA molecules provide a good framework to study this sequence-structure relationship. We recently introduced a suite of algorithms called RNAmutants which allows a complete exploration of RNA sequence-structure maps in polynomial time and space. Formally, RNAmutants takes an input sequence (or seed) to compute the Boltzmann-weighted ensembles of mutants with exactly k mutations, and samples mutations from these ensembles. However, this approach suffers from major limitations. Indeed, since the Boltzmann probabilities of the mutations depend on the free energy of the structures, RNAmutants has difficulty sampling mutant sequences with low G+C-contents. In this article, we introduce an unbiased adaptive sampling algorithm that enables RNAmutants to sample regions of the mutational landscape poorly covered by classical algorithms. We applied these methods to sample mutations with low G+C-contents. These adaptive sampling techniques can be easily adapted to explore other regions of the sequence and structural landscapes which are difficult to sample. Importantly, these algorithms come at a minimal computational cost. We demonstrate the insights offered by these techniques on studies of complete RNA sequence-structure maps of sizes up to 40 nucleotides. Our results indicate that the G+C-content has a strong influence on the size and shape of the evolutionary accessible sequence and structural spaces. In particular, we show that low G+C-contents favor the appearance of internal loops and thus possibly the synthesis of tertiary structure motifs. On

  14. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    NASA Astrophysics Data System (ADS)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

    Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented, directing sample collection in proportion to local information content, adequately capturing the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall, while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
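
    A one-dimensional sketch of curvature-plus-spacing driven sampling follows (a hypothetical `refine` routine; the method's variable-uncertainty term is omitted):

```python
import numpy as np

def refine(xs, f, n_new=30, alpha=0.7):
    """Insert samples one at a time where local curvature and sample
    spacing are largest; alpha trades curvature against space-filling."""
    xs = sorted(xs)
    for _ in range(n_new):
        x = np.array(xs); y = f(x)
        curv = np.abs(np.diff(y, 2))                 # curvature proxy (interior)
        curv = np.concatenate([[curv[0]], curv, [curv[-1]]])
        curv /= curv.max() + 1e-12
        gaps = np.diff(x)
        score = alpha * (curv[:-1] + curv[1:]) / 2 + (1 - alpha) * gaps / gaps.max()
        i = int(np.argmax(score * gaps))             # favour wide, active intervals
        xs.append((x[i] + x[i + 1]) / 2)             # bisect the winning interval
        xs.sort()
    return np.array(xs)

f = lambda x: np.tanh(20 * (x - 0.5))                # sharp front at x = 0.5
samples = refine(list(np.linspace(0, 1, 8)), f)
print(np.round(samples, 3))                          # points cluster near 0.5
```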

  15. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.

  16. A novel approach for SEMG signal classification with adaptive local binary patterns.

    PubMed

    Ertuğrul, Ömer Faruk; Kaya, Yılmaz; Tekin, Ramazan

    2016-07-01

    Feature extraction plays a major role in the pattern recognition process, and this paper presents a novel feature extraction approach, adaptive local binary pattern (aLBP). aLBP is built on the local binary pattern (LBP), an image processing method, and the one-dimensional local binary pattern (1D-LBP). In LBP, each pixel is compared with its neighbors. Similarly, in 1D-LBP, each data point in the series is compared with its neighbors. 1D-LBP extracts features based on local changes in the signal, so it has a high potential for medical applications: each action or abnormality recorded in SEMG signals has its own pattern, and via 1D-LBP these (hidden) patterns may be detected. However, the positions of the neighbors in 1D-LBP are fixed, depending only on the position of the data point in the series, and both LBP and 1D-LBP are very sensitive to noise. Therefore, their capacity for detecting hidden patterns is limited. To overcome these drawbacks, aLBP is proposed. In aLBP, the positions of the neighbors and their values can be assigned adaptively via down-sampling and smoothing coefficients, which considerably increases the potential to detect (hidden) patterns that may express an illness or an action. To validate the proposed feature extraction approach, two different datasets were employed. The accuracies achieved by the proposed approach were higher than those obtained with popular feature extraction approaches and those reported in the literature, indicating that the proposed method can be used to investigate SEMG signals. In summary, this work develops an adaptive feature extraction scheme that can be utilized for extracting features from local changes in different categories of time-varying signals.
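
    The underlying 1D-LBP feature is simple to state in code (a plain, fixed-neighbour implementation; aLBP's adaptive neighbour placement via down-sampling and smoothing is omitted):

```python
import numpy as np

def lbp_1d(signal, radius=3):
    """Classic 1D-LBP: compare each sample with `radius` neighbours on each
    side and encode the comparisons as a binary code; the histogram of the
    codes is the feature vector."""
    n = len(signal)
    weights = 2 ** np.arange(2 * radius)
    codes = np.empty(n - 2 * radius, dtype=np.int64)
    for i in range(radius, n - radius):
        neigh = np.concatenate([signal[i - radius:i], signal[i + 1:i + radius + 1]])
        codes[i - radius] = (neigh >= signal[i]) @ weights
    return np.bincount(codes, minlength=2 ** (2 * radius))

rng = np.random.default_rng(6)
semg = np.cumsum(rng.standard_normal(2000))   # stand-in for an SEMG trace
print(lbp_1d(semg, radius=3)[:16])            # first bins of the 64-bin histogram
```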

  17. Statistical analysis of hydrological response in urbanising catchments based on adaptive sampling using inter-amount times

    NASA Astrophysics Data System (ADS)

    ten Veldhuis, Marie-Claire; Schleiss, Marc

    2017-04-01

    In this study, we introduced an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are most ubiquitous in flow time series, IAT sampling gives relatively more weight to high flow values, as given flow amounts are accumulated in shorter times. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We will present results of statistical analyses across a range of subdaily to seasonal scales and will highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
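
    Computing IATs from a flow record amounts to inverting the cumulative-volume curve, as the sketch below shows (synthetic storm hydrograph; the function name is ours):

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Return the time needed to accumulate each successive fixed flow
    `amount`: short IATs during peaks, long IATs during low flow."""
    vol = np.concatenate([[0.0], np.cumsum(q[:-1] * np.diff(t))])  # cumulative volume
    targets = np.arange(amount, vol[-1], amount)
    crossings = np.interp(targets, vol, t)        # when each amount is reached
    return np.diff(np.concatenate([[t[0]], crossings]))

t = np.arange(0.0, 240.0)                         # hours
q = 1 + 9 * np.exp(-0.5 * ((t - 100) / 10) ** 2)  # baseflow plus one storm peak
iats = inter_amount_times(t, q, amount=50.0)
print(iats.round(1))                              # IATs shrink around the peak
```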

  18. A new approach to adaptive control of manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1987-01-01

    An approach in which the manipulator inverse is used as a feedforward controller is employed in the adaptive control of manipulators in order to achieve trajectory tracking by the joint angles. The desired trajectory is applied as an input to the feedforward controller, and the controller output is used as the driving torque for the manipulator. An adaptive algorithm obtained from MRAC theory is used to update the controller gains to cope with variations in the manipulator inverse due to changes of the operating point. An adaptive feedback controller and an auxiliary signal enhance closed-loop stability and achieve faster adaptation. Simulation results demonstrate the effectiveness of the proposed control scheme for different reference trajectories, and despite large variations in the payload.

  19. Approach for Using Learner Satisfaction to Evaluate the Learning Adaptation Policy

    ERIC Educational Resources Information Center

    Jeghal, Adil; Oughdir, Lahcen; Tairi, Hamid; Radouane, Abdelhay

    2016-01-01

    Learning adaptation is a very important phase in a learning situation in human learning environments. This paper presents the authors' approach used to evaluate the effectiveness of adaptive learning systems. This approach is based on the analysis of learner satisfaction notices collected by a questionnaire on a learning situation; to analyze…

  20. Adapting to the Digital Age: A Narrative Approach

    ERIC Educational Resources Information Center

    Cousins, Sarah; Bissar, Dounia

    2012-01-01

    The article adopts a narrative inquiry approach to foreground informal learning and exposes a collection of stories from tutors about how they adapted comfortably to the digital age. We were concerned that despite substantial evidence that bringing about changes in pedagogic practices can be difficult, there is a gap in convincing approaches to…

  1. SMI adaptive antenna arrays for weak interfering signals. [Sample Matrix Inversion

    NASA Technical Reports Server (NTRS)

    Gupta, Inder J.

    1986-01-01

    The performance of adaptive antenna arrays in the presence of weak interfering signals (below thermal noise) is studied. It is shown that a conventional adaptive antenna array sample matrix inversion (SMI) algorithm is unable to suppress such interfering signals. To overcome this problem, the SMI algorithm is modified. In the modified algorithm, the covariance matrix is redefined such that the effect of thermal noise on the weights of adaptive arrays is reduced. Thus, the weights are dictated by relatively weak signals. It is shown that the modified algorithm provides the desired interference protection.
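
    A hedged sketch of the idea in NumPy: conventional SMI inverts the sample covariance directly, while the modified variant (as assumed here) removes most of the thermal-noise contribution from the covariance diagonal so that weak interferers drive the weights. The partial subtraction factor is an assumption to keep the matrix well conditioned; this is not necessarily the paper's exact redefinition.

```python
import numpy as np

def smi_weights(snapshots, steering, noise_power=None, beta=0.9):
    # Conventional SMI: w = R^{-1} s with the sample covariance R.
    # Modified variant (assumed form): subtract most of the thermal-noise
    # term from the diagonal so weak interferers dominate the weights;
    # beta < 1 keeps R positive definite despite estimation error.
    K, N = snapshots.shape
    R = snapshots.conj().T @ snapshots / K
    if noise_power is not None:
        R = R - beta * noise_power * np.eye(N)
    return np.linalg.solve(R, steering)

rng = np.random.default_rng(1)
N, K = 8, 400
steer = np.ones(N, dtype=complex)                            # broadside look
a_int = np.exp(1j * np.pi * np.arange(N) * np.sin(0.4))      # interferer direction
noise = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
x = 0.3 * rng.standard_normal((K, 1)) * a_int + noise        # interferer below noise
w_conv = smi_weights(x, steer)                               # noise-dominated weights
w_mod = smi_weights(x, steer, noise_power=1.0)               # interferer-driven weights
```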

  2. Concept Based Approach for Adaptive Personalized Course Learning System

    ERIC Educational Resources Information Center

    Salahli, Mehmet Ali; Özdemir, Muzaffer; Yasar, Cumali

    2013-01-01

    One of the most important factors in improving the personalization of learning systems is endowing them with adaptive properties. The aim of an adaptive personalized learning system is to offer the most appropriate learning path and learning materials to learners by taking their profiles into account. In this paper, a new approach to…

  3. A Decentralized Adaptive Approach to Fault Tolerant Flight Control

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva; Nikulin, Vladimir; Heimes, Felix; Shormin, Victor

    2000-01-01

    This paper briefly reports some results of our study on the application of a decentralized adaptive control approach to a 6 DOF nonlinear aircraft model. The simulation results showed the potential of using this approach to achieve fault tolerant control. Based on this observation and some analysis, the paper proposes a multiple channel adaptive control scheme that makes use of the functionally redundant actuating and sensing capabilities in the model, and explains how to implement the scheme to tolerate actuator and sensor failures. The conditions under which the scheme is applicable are stated in the paper.

  4. Temporal sampling, resetting, and adaptation orchestrate gradient sensing in sperm

    PubMed Central

    Alvarez, Luis; Seifert, Reinhard; Gregor, Ingo; Jäckle, Oliver; Beyermann, Michael; Krause, Eberhard

    2012-01-01

    Sperm, navigating in a chemical gradient, are exposed to a periodic stream of chemoattractant molecules. The periodic stimulation entrains Ca2+ oscillations that control looping steering responses. It is not known how sperm sample chemoattractant molecules during periodic stimulation and adjust their sensitivity. We report that sea urchin sperm sampled molecules for 0.2–0.6 s before a Ca2+ response was produced. Additional molecules delivered during a Ca2+ response reset the cell by causing a pronounced Ca2+ drop that terminated the response; this reset was followed by a new Ca2+ rise. After stimulation, sperm adapted their sensitivity following the Weber–Fechner law. Taking into account the single-molecule sensitivity, we estimate that sperm can register a minimal gradient of 0.8 fM/µm and be attracted from as far away as 4.7 mm. Many microorganisms sense stimulus gradients along periodic paths to translate a spatial distribution of the stimulus into a temporal pattern of the cell response. Orchestration of temporal sampling, resetting, and adaptation might control gradient sensing in such organisms as well. PMID:22986497

  5. Temporal sampling, resetting, and adaptation orchestrate gradient sensing in sperm.

    PubMed

    Kashikar, Nachiket D; Alvarez, Luis; Seifert, Reinhard; Gregor, Ingo; Jäckle, Oliver; Beyermann, Michael; Krause, Eberhard; Kaupp, U Benjamin

    2012-09-17

    Sperm, navigating in a chemical gradient, are exposed to a periodic stream of chemoattractant molecules. The periodic stimulation entrains Ca(2+) oscillations that control looping steering responses. It is not known how sperm sample chemoattractant molecules during periodic stimulation and adjust their sensitivity. We report that sea urchin sperm sampled molecules for 0.2-0.6 s before a Ca(2+) response was produced. Additional molecules delivered during a Ca(2+) response reset the cell by causing a pronounced Ca(2+) drop that terminated the response; this reset was followed by a new Ca(2+) rise. After stimulation, sperm adapted their sensitivity following the Weber-Fechner law. Taking into account the single-molecule sensitivity, we estimate that sperm can register a minimal gradient of 0.8 fM/µm and be attracted from as far away as 4.7 mm. Many microorganisms sense stimulus gradients along periodic paths to translate a spatial distribution of the stimulus into a temporal pattern of the cell response. Orchestration of temporal sampling, resetting, and adaptation might control gradient sensing in such organisms as well.

  6. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    PubMed

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

    This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R-peak detection algorithm on the adaptively sampled ECG signals consumes only 45.6 μW and leads to 36% less overall system power consumption.
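
    A minimal software sketch of activity-driven adaptive sampling under assumed thresholds: keep the full rate while the signal changes quickly (QRS-like activity) and skip ahead when it is flat. The actual platform adapts in ADC hardware; this only illustrates the policy.

```python
import numpy as np

def adaptive_sample_indices(signal, slow_stride=8, thresh=0.05):
    # Keep full rate while the local change is large (QRS-like activity),
    # otherwise skip ahead, emulating an ADC that adapts its sampling rate
    # to the input signal's rate of change.
    kept, i = [], 0
    while i < len(signal) - 1:
        kept.append(i)
        i += 1 if abs(signal[i + 1] - signal[i]) > thresh else slow_stride
    return np.array(kept)

fs = 500                                       # Hz, assumed uniform input rate
t = np.arange(0, 2, 1 / fs)
ecg = np.where((t % 1.0) < 0.02, 1.0, 0.0)     # crude spike train as QRS stand-in
idx = adaptive_sample_indices(ecg)
saved = 1 - len(idx) / len(ecg)                # fraction of conversions avoided
```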

  7. Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy.

    PubMed

    Sadler, Georgia Robins; Lee, Hau-Chen; Lim, Rod Seung-Hwan; Fullerton, Judith

    2010-09-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author's program of research are provided to demonstrate how adaptations of snowball sampling can be used effectively in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more-vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or for research studies when the recruitment of a population-based sample is not essential.

  8. Surface sampling techniques for 3D object inspection

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong S.; Gerhardt, Lester A.

    1995-03-01

    While the uniform sampling method is quite popular for pointwise measurement of manufactured parts, this paper proposes three novel sampling strategies which emphasize 3D non-uniform inspection capability. They are: (a) adaptive sampling, (b) local adjustment sampling, and (c) finite element centroid sampling. The adaptive sampling strategy is based on a recursive surface subdivision process. Two different approaches are described for this adaptive sampling strategy: one uses triangle patches while the other uses rectangle patches. Several real-world objects were tested using these two algorithms. Preliminary results show that sample points are distributed more closely around edges, corners, and vertices, as desired for many classes of objects. Adaptive sampling using triangle patches is shown to generally perform better than both uniform sampling and adaptive sampling using rectangle patches. The local adjustment sampling strategy uses a set of predefined starting points and then finds the local optimum position of each nodal point. This method approximates the object by moving the points toward object edges and corners. In a hybrid approach, uniform and non-uniform point sets, first preprocessed by the adaptive sampling algorithm on a real-world object, were then tested using the local adjustment sampling method. The results show that initial point sets preprocessed by adaptive sampling with triangle patches are moved the least distance by the subsequently applied local adjustment method, again showing the superiority of this method. The finite element sampling technique samples the centroids of the surface triangle meshes produced from the finite element method. The performance of this algorithm was compared to that of adaptive sampling using triangular patches. Adaptive sampling with triangular patches was once again shown to be better on different classes of objects.
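
    The recursive idea behind the adaptive strategy can be sketched as follows for a height-field surface: subdivide a triangle whenever the surface deviates from the flat triangle by more than a tolerance, so samples concentrate near edges and corners. The flatness test and parameters are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def adaptive_triangle_samples(tri, surface, tol=1e-2, depth=5):
    # Subdivide while the surface height at the centroid deviates from the
    # flat-triangle interpolation by more than tol; otherwise emit a sample.
    a, b, c = tri
    centroid = (a + b + c) / 3.0
    plane_z = (surface(*a) + surface(*b) + surface(*c)) / 3.0
    if depth == 0 or abs(surface(*centroid) - plane_z) < tol:
        return [centroid]                      # flat enough: one sample
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
    pts = []
    for sub in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        pts += adaptive_triangle_samples(sub, surface, tol, depth - 1)
    return pts

surface = lambda x, y: np.tanh(10 * (x - 0.5))      # sharp "edge" at x = 0.5
tri = tuple(np.array(p, float) for p in [(0, 0), (1, 0), (0.5, 1)])
samples = adaptive_triangle_samples(tri, surface)   # denser near the edge
```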

  9. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with the high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the "Concurrent Adaptive Sampling (CAS) algorithm," has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer.

  10. Complex adaptive systems: A new approach for understanding health practices.

    PubMed

    Gomersall, Tim

    2018-06-22

    This article explores the potential of complex adaptive systems theory to inform behaviour change research. A complex adaptive system describes a collection of heterogeneous agents interacting within a particular context, adapting to each other's actions. In practical terms, this implies that behaviour change is 1) socially and culturally situated; 2) highly sensitive to small baseline differences in individuals, groups, and intervention components; and 3) determined by multiple components interacting "chaotically". Two approaches to studying complex adaptive systems are briefly reviewed. Agent-based modelling is a computer simulation technique that allows researchers to investigate "what if" questions in a virtual environment. Applied qualitative research techniques, on the other hand, offer a way to examine what happens when an intervention is pursued in real-time, and to identify the sorts of rules and assumptions governing social action. Although these represent very different approaches to complexity, there may be scope for mixing these methods - for example, by grounding models in insights derived from qualitative fieldwork. Finally, I will argue that the concept of complex adaptive systems offers one opportunity to gain a deepened understanding of health-related practices, and to examine the social psychological processes that produce health-promoting or damaging actions.
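
    A toy agent-based model in the spirit described: heterogeneous agents adopt a health behaviour under social influence, and small baseline differences can tip the system between outcomes. All rules and parameters are illustrative assumptions, not a model from the article.

```python
import numpy as np

rng = np.random.default_rng(42)
n, steps = 100, 200
threshold = rng.uniform(0.2, 0.8, n)     # heterogeneous susceptibility
adopted = rng.random(n) < 0.05           # a few initial adopters
for _ in range(steps):
    peer_pressure = adopted.mean()       # well-mixed social "context"
    # agents adopt when peer pressure exceeds their personal threshold,
    # and occasionally drop the behaviour at random
    adopted = adopted | (peer_pressure > threshold)
    adopted &= rng.random(n) > 0.01
print(f"final adoption rate: {adopted.mean():.2f}")
```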

  11. Robust, Causal, and Incremental Approaches to Investigating Linguistic Adaptation

    PubMed Central

    Roberts, Seán G.

    2018-01-01

    This paper discusses the maximum robustness approach for studying cases of adaptation in language. We live in an age where we have more data on more languages than ever before, and more data to link it with from other domains. This should make it easier to test hypotheses involving adaptation, and also to spot new patterns that might be explained by adaptation. However, there is not much discussion of the overall approach to research in this area. There are outstanding questions about how to formalize theories, what the criteria are for directing research and how to integrate results from different methods into a clear assessment of a hypothesis. This paper addresses some of those issues by suggesting an approach which is causal, incremental and robust. It illustrates the approach with reference to a recent claim that dry environments select against the use of precise contrasts in pitch. Study 1 replicates a previous analysis of the link between humidity and lexical tone with an alternative dataset and finds that it is not robust. Study 2 performs an analysis with a continuous measure of tone and finds no significant correlation. Study 3 addresses a more recent analysis of the link between humidity and vowel use and finds that it is robust, though the effect size is small and the robustness of the measurement of vowel use is low. Methodological robustness of the general theory is addressed by suggesting additional approaches including iterated learning, a historical case study, corpus studies, and studying individual speech. PMID:29515487

  12. Optimal Decision Stimuli for Risky Choice Experiments: An Adaptive Approach

    PubMed Central

    Cavagnaro, Daniel R.; Gonzalez, Richard; Myung, Jay I.; Pitt, Mark A.

    2014-01-01

    Collecting data to discriminate between models of risky choice requires careful selection of decision stimuli. Models of decision making aim to predict decisions across a wide range of possible stimuli, but practical limitations force experimenters to select only a handful of them for actual testing. Some stimuli are more diagnostic between models than others, so the choice of stimuli is critical. This paper provides the theoretical background and a methodological framework for adaptive selection of optimal stimuli for discriminating among models of risky choice. The approach, called Adaptive Design Optimization (ADO), adapts the stimulus in each experimental trial based on the results of the preceding trials. We demonstrate the validity of the approach with simulation studies aiming to discriminate Expected Utility, Weighted Expected Utility, Original Prospect Theory, and Cumulative Prospect Theory models. PMID:24532856
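
    A minimal sketch of the ADO intuition under simplifying assumptions: among candidate two-outcome gambles, pick the stimulus on which two competing models disagree most. Real ADO maximizes expected information gain over model posteriors trial by trial; plain predictive disagreement is used here for brevity, and both model forms are assumed.

```python
import numpy as np

def eu(p, x):                      # expected utility with concave utility (assumed)
    return p * np.sqrt(x)

def cpt(p, x, gamma=0.6):          # prospect-theory-style probability weighting
    w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)
    return w * np.sqrt(x)

# candidate stimuli: (probability, payoff) pairs for a two-outcome gamble
candidates = [(p, x) for p in np.linspace(0.05, 0.95, 10)
                     for x in (10.0, 50.0, 100.0)]
disagreement = [abs(eu(p, x) - cpt(p, x)) for p, x in candidates]
best = candidates[int(np.argmax(disagreement))]
print("most diagnostic gamble (p, payoff):", best)
```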

  13. Optimal Decision Stimuli for Risky Choice Experiments: An Adaptive Approach.

    PubMed

    Cavagnaro, Daniel R; Gonzalez, Richard; Myung, Jay I; Pitt, Mark A

    2013-02-01

    Collecting data to discriminate between models of risky choice requires careful selection of decision stimuli. Models of decision making aim to predict decisions across a wide range of possible stimuli, but practical limitations force experimenters to select only a handful of them for actual testing. Some stimuli are more diagnostic between models than others, so the choice of stimuli is critical. This paper provides the theoretical background and a methodological framework for adaptive selection of optimal stimuli for discriminating among models of risky choice. The approach, called Adaptive Design Optimization (ADO), adapts the stimulus in each experimental trial based on the results of the preceding trials. We demonstrate the validity of the approach with simulation studies aiming to discriminate Expected Utility, Weighted Expected Utility, Original Prospect Theory, and Cumulative Prospect Theory models.

  14. Adaptive Biasing Combined with Hamiltonian Replica Exchange to Improve Umbrella Sampling Free Energy Simulations.

    PubMed

    Zeller, Fabian; Zacharias, Martin

    2014-02-11

    The accurate calculation of potentials of mean force (PMFs) for ligand-receptor binding is one of the most important applications of molecular simulation techniques. Typically, the separation distance between ligand and receptor is chosen as a reaction coordinate along which a PMF can be calculated with the aid of umbrella sampling (US) techniques. In addition, restraints can be applied on the relative position and orientation of the partner molecules to reduce accessible phase space. An approach combining such phase space reduction with flattening of the free energy landscape and configurational exchanges has been developed, which significantly improves the convergence of PMF calculations in comparison with standard umbrella sampling. The free energy surface along the reaction coordinate is smoothed by iteratively adapting biasing potentials corresponding to previously calculated PMFs. Configurations are allowed to exchange between the umbrella simulation windows via the Hamiltonian replica exchange method. The application to a DNA molecule in complex with a minor groove binding ligand indicates significantly improved convergence and complete reversibility of the sampling along the pathway. The calculated binding free energy is in excellent agreement with experimental results. In contrast, the application of standard US resulted in large differences between PMFs calculated for association and dissociation pathways. The approach could be a useful alternative to standard US for computational studies on biomolecular recognition processes.
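
    The iterative-biasing component can be sketched on a 1D toy landscape: sample under the current bias, estimate the PMF from the unbiased histogram, and use the negated PMF as the next biasing potential to flatten the landscape. The Hamiltonian replica exchange component is omitted here, and Metropolis sampling stands in for the MD engine.

```python
import numpy as np

rng = np.random.default_rng(4)
kT = 1.0
U = lambda x: 4.0 * (x**2 - 1.0) ** 2            # toy double-well "PMF"
edges = np.linspace(-1.6, 1.6, 33)
bias = np.zeros(len(edges) - 1)                  # biasing potential per bin

def bin_of(x):
    return np.clip(np.searchsorted(edges, x) - 1, 0, len(bias) - 1)

x = -1.0
for cycle in range(5):                           # adaptation cycles
    counts = np.zeros_like(bias)
    for _ in range(20000):                       # biased Metropolis sampling
        xn = x + 0.2 * rng.standard_normal()
        if -1.6 < xn < 1.6:
            dE = (U(xn) + bias[bin_of(xn)]) - (U(x) + bias[bin_of(x)])
            if rng.random() < np.exp(-dE / kT):
                x = xn
        counts[bin_of(x)] += 1
    prob = np.maximum(counts / counts.sum(), 1e-12)
    pmf_est = -kT * np.log(prob) - bias          # unbias the histogram
    bias = -(pmf_est - pmf_est.min())            # negated PMF flattens next cycle
```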

  15. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    NASA Technical Reports Server (NTRS)

    Grace, Joseph M.; Verseux, Cyprien; Gentry, Diana; Moffet, Amy; Thayabaran, Ramanen; Wong, Nathan; Rothschild, Lynn

    2013-01-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of bacteria to the presence of a toxic metal, automatically adjusting the level of toxicity based on the

  16. An Approach to V&V of Embedded Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Liu, Yan; Yerramalla, Sampath; Fuller, Edgar; Cukic, Bojan; Gururajan, Srikaruth

    2004-01-01

    Rigorous Verification and Validation (V&V) techniques are essential for high assurance systems. Lately, the performance of some of these systems is enhanced by embedded adaptive components in order to cope with environmental changes. Although the ability to adapt is appealing, it actually poses a problem in terms of V&V. Since uncertainties induced by environmental changes have a significant impact on system behavior, the applicability of conventional V&V techniques is limited. In safety-critical applications such as flight control systems, the mechanisms of change must be observed, diagnosed, accommodated and well understood prior to deployment. In this paper, we propose a non-conventional V&V approach suitable for online adaptive systems. We apply our approach to an intelligent flight control system that employs a particular type of Neural Networks (NN) as the adaptive learning paradigm. The presented methodology consists of a novelty detection technique and online stability monitoring tools. The novelty detection technique is based on Support Vector Data Description, which detects novel (abnormal) data patterns. The online stability monitoring tools, based on Lyapunov's stability theory, detect unstable learning behavior in neural networks. Case studies based on a high fidelity simulator of NASA's Intelligent Flight Control System demonstrate a successful application of the presented V&V methodology.

  17. Adaptation, Acceptance and Adaptive Preferences in Health and Capability Well-Being Measurement Amongst Those Approaching End of Life.

    PubMed

    Coast, Joanna; Bailey, Cara; Orlando, Rosanna; Armour, Kathy; Perry, Rachel; Jones, Louise; Kinghorn, Philip

    2018-05-09

    Adaptive preferences occur when people subconsciously alter their views to account for the possibilities available to them. Adaptive preferences may be problematic where these views are used in resource allocation decisions because they may lead to underestimation of the true benefits of providing services. This research explored the nature and extent of both adaptation (changing to better suit the context) and adaptive preferences (altering preferences in response to restricted options) in individuals approaching the end of life (EoL). Qualitative data from 'think-aloud' interviews with 33 hospice patients, 22 close persons and 17 health professionals were used alongside their responses to three health/well-being measures for use in resource allocation decisions: EQ-5D-5L (health status); ICECAP-A (adult capability); and ICECAP-SCM (Supportive Care Measure; EoL capability). Constant comparative analysis combined a focus on both verbalised perceptions across the three groups and responses to the measures. Data collection took place between October 2012 and February 2014. Informants spoke clearly about how patients had adapted their lives in response to symptoms associated with their terminal condition. It was often seen as a positive choice to accept their state and adapt in this way but, at the same time, most patients were fully aware of the health and capability losses that they had faced. Self-assessments of health and capability generally appeared to reflect the pre-adaptation state, although there were exceptions. Despite adapting to their conditions, the reference group for individuals approaching EoL largely remained a healthy, capable population, and most did not show evidence of adaptive preferences.

  18. Universal digital high-resolution melt: a novel approach to broad-based profiling of heterogeneous biological samples.

    PubMed

    Fraley, Stephanie I; Hardick, Justin; Masek, Billie Jo; Athamanolap, Pornpat; Rothman, Richard E; Gaydos, Charlotte A; Carroll, Karen C; Wakefield, Teresa; Wang, Tza-Huei; Yang, Samuel

    2013-10-01

    Comprehensive profiling of nucleic acids in genetically heterogeneous samples is important for clinical and basic research applications. Universal digital high-resolution melt (U-dHRM) is a new approach to broad-based PCR diagnostics and profiling technologies that can overcome issues of poor sensitivity due to contaminating nucleic acids and poor specificity due to primer or probe hybridization inaccuracies for single nucleotide variations. The U-dHRM approach uses broad-based primers or ligated adapter sequences to universally amplify all nucleic acid molecules in a heterogeneous sample, which have been partitioned, as in digital PCR. Extensive assay optimization enables direct sequence identification by algorithm-based matching of melt curve shape and Tm to a database of known sequence-specific melt curves. We show that single-molecule detection and single nucleotide sensitivity is possible. The feasibility and utility of U-dHRM is demonstrated through detection of bacteria associated with polymicrobial blood infection and microRNAs (miRNAs) associated with host response to infection. U-dHRM using broad-based 16S rRNA gene primers demonstrates universal single cell detection of bacterial pathogens, even in the presence of larger amounts of contaminating bacteria; U-dHRM using universally adapted Lethal-7 miRNAs in a heterogeneous mixture showcases the single copy sensitivity and single nucleotide specificity of this approach.
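
    A rough sketch of the melt-curve matching step under assumed scoring: an unknown digital melt curve is identified by combining curve-shape correlation with the Tm difference against database references. The data, scoring weights, and Tm extraction are synthetic illustrations, not the paper's algorithm.

```python
import numpy as np

def tm_of(temps, curve):
    # Tm taken at the steepest fluorescence drop of the melt curve
    return temps[np.argmax(-np.gradient(curve))]

def match(temps, unknown, database, w_shape=1.0, w_tm=0.1):
    scores = {}
    for name, ref in database.items():
        shape = np.corrcoef(unknown, ref)[0, 1]          # curve-shape similarity
        dtm = abs(tm_of(temps, unknown) - tm_of(temps, ref))
        scores[name] = w_shape * shape - w_tm * dtm
    return max(scores, key=scores.get)

rng = np.random.default_rng(9)
temps = np.linspace(75, 95, 200)
sigmoid = lambda tm, k=1.5: 1 / (1 + np.exp((temps - tm) / k))
db = {"E. coli": sigmoid(86.0), "S. aureus": sigmoid(84.0)}   # synthetic references
unknown = sigmoid(86.1) + 0.01 * rng.standard_normal(200)
print(match(temps, unknown, db))
```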

  19. Evaluating Adaptive Governance Approaches to Sustainable Water Management in North-West Thailand

    NASA Astrophysics Data System (ADS)

    Clark, Julian R. A.; Semmahasak, Chutiwalanch

    2013-04-01

    Adaptive governance is advanced as a potent means of addressing institutional fit of natural resource systems with prevailing modes of political-administrative management. Its advocates also argue that it enhances participatory and learning opportunities for stakeholders over time. Yet an increasing number of studies demonstrate real difficulties in implementing adaptive governance 'solutions'. This paper builds on these debates by examining the introduction of adaptive governance to water management in Chiang Mai province, north-west Thailand. The paper considers, first, the limitations of current water governance modes at the provincial scale, and the rationale for implementation of an adaptive approach. The new approach is then critically examined, with its initial performance and likely future success evaluated by (i) analysis of water stakeholders' opinions of its first year of operation; and (ii) comparison of its governance attributes against recent empirical accounts of implementation difficulty and failure of adaptive governance of natural resource management more generally. The analysis confirms the potentially significant role that the new approach can play in brokering and resolving the underlying differences in stakeholder representation and knowledge construction at the heart of the prevailing water governance modes in north-west Thailand.

  20. Adaptation of the Athlete Burnout Questionnaire in a Spanish sample of athletes.

    PubMed

    Arce, Constantino; De Francisco, Cristina; Andrade, Elena; Seoane, Gloria; Raedeke, Thomas

    2012-11-01

    In this paper, we offer a general version of the Spanish adaptation of the Athlete Burnout Questionnaire (ABQ), designed to measure the burnout syndrome in athletes of different sports. In previous work, the Spanish version of the ABQ was administered to different samples of soccer players. Its psychometric properties were appropriate and similar to the findings for the original ABQ. The purpose of this study was to examine the generalization of the Spanish adaptation to other sports. We started from this adaptation, but included three alternative statements (one for each dimension of the questionnaire) and replaced the word "soccer" with the word "sport". An 18-item version was administered to a sample of 487 athletes aged between 13 and 29 years. Confirmatory factor analyses replicated the factor structure, but modification of two items was necessary in order to obtain a good overall fit of the model. The internal consistency and test-retest reliability of the questionnaire were satisfactory.

  1. Recruiting hard-to-reach United States population sub-groups via adaptations of snowball sampling strategy

    PubMed Central

    Sadler, Georgia Robins; Lee, Hau-Chen; Seung-Hwan Lim, Rod; Fullerton, Judith

    2011-01-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author’s program of research are provided to demonstrate how adaptations of snowball sampling can be effectively used in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or subjects for research studies when recruitment of a population based sample is not essential. PMID:20727089

  2. A Novel Approach for Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Chen, Ya-Chin; Juang, Jer-Nan

    1998-01-01

    Adaptive linear predictors have been used extensively in practice in a wide variety of forms. In the main, their theoretical development is based upon the assumption of stationarity of the signals involved, particularly with respect to the second order statistics. On this basis, the well-known normal equations can be formulated. If high-order statistical stationarity is assumed, then the equivalent normal equations involve high-order signal moments. In either case, the cross moments (second or higher) are needed. This renders the adaptive prediction procedure non-blind. A novel procedure for blind adaptive prediction has been proposed, and considerable implementation work has been carried out in our contributions over the past year. The approach is based upon a suitable interpretation of blind equalization methods that satisfy the constant modulus property and offers significant deviations from the standard prediction methods. These blind adaptive algorithms are derived by formulating Lagrange equivalents from mechanisms of constrained optimization. In this report, other new update algorithms are derived from the fundamental concepts of advanced system identification to carry out the proposed blind adaptive prediction. The results of the work can be extended to a number of control-related problems, such as disturbance identification. The basic principles are outlined in this report and differences from other existing methods are discussed. The applications implemented are in speech processing, such as coding and synthesis. Simulations are included to verify the novel modelling method.
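
    A minimal sketch of a constant-modulus-style blind adaptive filter (the classic CMA 2-2 update), which needs no reference signal, only the constant-modulus property of the source. Filter length and step size are assumptions, and this is not the report's exact algorithm.

```python
import numpy as np

def cma_equalize(x, n_taps=11, mu=1e-3, r2=1.0):
    # Blind adaptation: drive |y|^2 toward the constant modulus r2 by a
    # stochastic gradient step; no training sequence is required.
    w = np.zeros(n_taps, complex)
    w[n_taps // 2] = 1.0                       # center-spike initialization
    y = np.zeros(len(x), complex)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]              # regressor, most recent first
        y[n] = w.conj() @ u
        e = y[n] * (np.abs(y[n]) ** 2 - r2)    # CMA error term
        w -= mu * np.conj(e) * u               # gradient descent on (|y|^2 - r2)^2
    return y, w

# usage: QPSK through a mild channel, equalized blindly
rng = np.random.default_rng(7)
s = (rng.choice([1, -1], 5000) + 1j * rng.choice([1, -1], 5000)) / np.sqrt(2)
x = np.convolve(s, [1.0, 0.25 + 0.1j], mode="same")
y, w = cma_equalize(x)
```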

  3. Missile Defense: European Phased Adaptive Approach Acquisitions Face Synchronization, Transparency, and Accountability Challenges

    DTIC Science & Technology

    2010-12-21

    Report to the House of Representatives: Missile Defense: European Phased Adaptive Approach Acquisitions Face Synchronization, Transparency, and Accountability Challenges. The report found that DOD has not fully implemented a management process that synchronizes European Phased Adaptive Approach (EPAA) acquisition activities and ensures transparency and accountability.

  4. Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing

    NASA Astrophysics Data System (ADS)

    Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.

    2018-04-01

    We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance sampling technique. The importance function relies on the weights (namely the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly, yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free-energy, and the discrete valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and is a generalization of the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free-energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and analyze numerically its efficiency on a toy example.
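
    A toy sketch of the mechanism on a 1D bimodal target: sets of a partition are penalized as they are visited, so the walker escapes metastable states, and only a fraction of the learnt potential is applied, mirroring the partial-biasing idea. All parameters are illustrative, and this is not the paper's exact update rule.

```python
import numpy as np

rng = np.random.default_rng(0)
U = lambda x: 6.0 * (x**2 - 1.0) ** 2            # bimodal potential (kT = 1)
edges = np.linspace(-1.5, 1.5, 16)               # partition of the space
bias = np.zeros(len(edges) - 1)                  # learnt penalty per set
frac, gamma = 0.7, 0.01                          # bias fraction, update size

def set_of(x):
    return np.clip(np.searchsorted(edges, x) - 1, 0, len(bias) - 1)

x = -1.0
for _ in range(100_000):
    xn = x + 0.25 * rng.standard_normal()
    if -1.5 < xn < 1.5:
        # only a fraction `frac` of the learnt potential biases the target
        dE = (U(xn) + frac * bias[set_of(xn)]) - (U(x) + frac * bias[set_of(x)])
        if rng.random() < np.exp(-dE):
            x = xn
    bias[set_of(x)] += gamma                     # penalize the visited set
# free-energy differences are recovered (up to a constant) from the penalty
F_est = bias.max() - bias
```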

  5. Elucidating Microbial Adaptation Dynamics via Autonomous Exposure and Sampling

    NASA Astrophysics Data System (ADS)

    Grace, J. M.; Verseux, C.; Gentry, D.; Moffet, A.; Thayabaran, R.; Wong, N.; Rothschild, L.

    2013-12-01

    The adaptation of micro-organisms to their environments is a complex process of interaction between the pressures of the environment and of competition. Reducing this multifactorial process to environmental exposure in the laboratory is a common tool for elucidating individual mechanisms of evolution, such as mutation rates [Wielgoss et al., 2013]. Although such studies inform fundamental questions about the way adaptation and even speciation occur, they are often limited by labor-intensive manual techniques [Wassmann et al., 2010]. Current methods for controlled study of microbial adaptation limit the length of time, the depth of collected data, and the breadth of applied environmental conditions. Small idiosyncrasies in manual techniques can have large effects on outcomes; for example, there are significant variations in induced radiation resistances following similar repeated exposure protocols [Alcántara-Díaz et al., 2004; Goldman and Travisano, 2011]. We describe here a project under development to allow rapid cycling of multiple types of microbial environmental exposure. The system allows continuous autonomous monitoring and data collection of both single species and sampled communities, independently and concurrently providing multiple types of controlled environmental pressure (temperature, radiation, chemical presence or absence, and so on) to a microbial community in dynamic response to the ecosystem's current status. When combined with DNA sequencing and extraction, such a controlled environment can cast light on microbial functional development, population dynamics, inter- and intra-species competition, and microbe-environment interaction. The project's goal is to allow rapid, repeatable iteration of studies of both natural and artificial microbial adaptation. As an example, the same system can be used both to increase the pH of a wet soil aliquot over time while periodically sampling it for genetic activity analysis, or to repeatedly expose a culture of

  6. Monitoring California Hardwood Rangeland Resources: An Adaptive Approach

    Treesearch

    Raul Tuazon

    1991-01-01

    This paper describes monitoring hardwood rangelands in California within the context of an adaptive or anticipatory approach. A heuristic process of policy evolution under conditions of complexity and uncertainty is presented. Long-term, short-term and program effectiveness monitoring for hardwood rangelands are discussed relative to the process described. The...

  7. Towards a system level understanding of non-model organisms sampled from the environment: a network biology approach.

    PubMed

    Williams, Tim D; Turan, Nil; Diab, Amer M; Wu, Huifeng; Mackenzie, Carolynn; Bartie, Katie L; Hrydziuszko, Olga; Lyons, Brett P; Stentiford, Grant D; Herbert, John M; Abraham, Joseph K; Katsiadaki, Ioanna; Leaver, Michael J; Taggart, John B; George, Stephen G; Viant, Mark R; Chipman, Kevin J; Falciani, Francesco

    2011-08-01

    The acquisition and analysis of datasets including multi-level omics and physiology from non-model species, sampled from field populations, is a formidable challenge, which so far has prevented the application of systems biology approaches. If successful, these could contribute enormously to improving our understanding of how populations of living organisms adapt to environmental stressors relating to, for example, pollution and climate. Here we describe the first application of a network inference approach integrating transcriptional, metabolic and phenotypic information representative of wild populations of the European flounder fish, sampled at seven estuarine locations in northern Europe with different degrees and profiles of chemical contaminants. We identified network modules, whose activity was predictive of environmental exposure and represented a link between molecular and morphometric indices. These sub-networks represented both known and candidate novel adverse outcome pathways representative of several aspects of human liver pathophysiology such as liver hyperplasia, fibrosis, and hepatocellular carcinoma. At the molecular level these pathways were linked to TNF alpha, TGF beta, PDGF, AGT and VEGF signalling. More generally, this pioneering study has important implications as it can be applied to model molecular mechanisms of compensatory adaptation to a wide range of scenarios in wild populations.

  8. A Hybrid Approach for Supporting Adaptivity in E-Learning Environments

    ERIC Educational Resources Information Center

    Al-Omari, Mohammad; Carter, Jenny; Chiclana, Francisco

    2016-01-01

    Purpose: The purpose of this paper is to identify a framework to support adaptivity in e-learning environments. The framework reflects a novel hybrid approach incorporating the concept of the event-condition-action (ECA) model and intelligent agents. Moreover, a system prototype is developed reflecting the hybrid approach to supporting adaptivity…

  9. A New Multi-Agent Approach to Adaptive E-Education

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Cheng, Peng

    Improving customer satisfaction degree is important in e-Education. This paper describes a new approach to adaptive e-Education taking into account the full spectrum of Web service techniques and activities. It presents a multi-agents architecture based on artificial psychology techniques, which makes the e-Education process both adaptable and dynamic, and hence up-to-date. Knowledge base techniques are used to support the e-Education process, and artificial psychology techniques to deal with user psychology, which makes the e-Education system more effective and satisfying.

  10. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population

  11. The Self-Adapting Focused Review System. Probability sampling of medical records to monitor utilization and quality of care.

    PubMed

    Ash, A; Schwartz, M; Payne, S M; Restuccia, J D

    1990-11-01

    Medical record review is increasing in importance as the need to identify and monitor utilization and quality of care problems grows. To conserve resources, reviews are usually performed on a subset of cases. If judgment is used to identify subgroups for review, this raises the following questions: How should subgroups be determined, particularly since the locus of problems can change over time? What standard of comparison should be used in interpreting rates of problems found in subgroups? How can population problem rates be estimated from observed subgroup rates? How can one avoid the bias that arises because reviewers know that selected cases are suspected of having problems? How can changes in problem rates over time be interpreted when evaluating intervention programs? Simple random sampling, an alternative to subgroup review, overcomes the problems implied by these questions but is inefficient. The Self-Adapting Focused Review System (SAFRS), introduced and described here, provides an adaptive approach to record selection that is based upon model-weighted probability sampling. It retains the desirable inferential properties of random sampling while allowing reviews to be concentrated on cases currently thought most likely to be problematic. Model development and evaluation are illustrated using hospital data to predict inappropriate admissions.
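
    A hedged sketch of model-weighted probability sampling with unbiased estimation: records with higher predicted problem risk get higher inclusion probabilities, every record keeps a nonzero chance of selection, and a Horvitz-Thompson estimator recovers the population problem rate. The risk model and Poisson sampling design are assumptions, not SAFRS itself.

```python
import numpy as np

rng = np.random.default_rng(3)
n, n_sample = 10_000, 200
risk = rng.beta(2, 8, n)                     # model-predicted problem risk
p_select = n_sample * risk / risk.sum()      # inclusion probabilities
p_select = np.clip(p_select, 1e-4, 1.0)      # keep every record reachable
chosen = rng.random(n) < p_select            # Poisson sampling draw
problem = rng.random(n) < risk               # (unknown) true problem status
# Horvitz-Thompson estimate of the population problem rate: inverse-
# probability weighting undoes the focus on high-risk records
ht_rate = (problem[chosen] / p_select[chosen]).sum() / n
```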

  12. Adaptive control of turbulence intensity is accelerated by frugal flow sampling.

    PubMed

    Quinn, Daniel B; van Halder, Yous; Lentink, David

    2017-11-01

    The aerodynamic performance of vehicles and animals, as well as the productivity of turbines and energy harvesters, depends on the turbulence intensity of the incoming flow. Previous studies have pointed at the potential benefits of active closed-loop turbulence control. However, it is unclear what the minimal sensory and algorithmic requirements are for realizing this control. Here we show that very low-bandwidth anemometers record sufficient information for an adaptive control algorithm to converge quickly. Our online Newton-Raphson algorithm tunes the turbulence in a recirculating wind tunnel by taking readings from an anemometer in the test section. After starting at 9% turbulence intensity, the algorithm converges on values ranging from 10% to 45% in less than 12 iterations within 1% accuracy. By down-sampling our measurements, we show that very-low-bandwidth anemometers record sufficient information for convergence. Furthermore, down-sampling accelerates convergence by smoothing gradients in turbulence intensity. Our results explain why low-bandwidth anemometers in engineering and mechanoreceptors in biology may be sufficient for adaptive control of turbulence intensity. Finally, our analysis suggests that, if certain turbulent eddy sizes are more important to control than others, frugal adaptive control schemes can be particularly computationally effective for improving performance. © 2017 The Author(s).
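
    The control loop can be sketched as a damped Newton-Raphson iteration on a stand-in plant; the logistic response, noise level, and step clipping are assumptions, not the wind tunnel's actual characteristics.

```python
import numpy as np

rng = np.random.default_rng(5)

def measure(u):
    # Stand-in plant: measured turbulence intensity (%) vs. actuator command,
    # with anemometer noise.
    return 9.0 + 36.0 / (1.0 + np.exp(-3.0 * (u - 1.0))) + 0.1 * rng.standard_normal()

target, u, du = 30.0, 0.0, 0.05
for it in range(12):
    ti = measure(u)
    if abs(ti - target) < 0.3:                               # within ~1% of setpoint
        break
    slope = (measure(u + du) - measure(u - du)) / (2 * du)   # finite-difference gradient
    step = (ti - target) / slope                             # Newton-Raphson step
    u -= np.clip(step, -1.0, 1.0)                            # damped for robustness
```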

  13. Self-adaptive enhanced sampling in the energy and trajectory spaces: accelerated thermodynamics and kinetic calculations.

    PubMed

    Gao, Yi Qin

    2008-04-07

    Here, we introduce a simple self-adaptive computational method to enhance the sampling in energy, configuration, and trajectory spaces. The method makes use of two strategies. It first uses a non-Boltzmann distribution method to enhance the sampling in the phase space, in particular, in the configuration space. The application of this method leads to a broad energy distribution in a large energy range and a quickly converged sampling of molecular configurations. In the second stage of simulations, the configuration space of the system is divided into a number of small regions according to preselected collective coordinates. An enhanced sampling of reactive transition paths is then performed in a self-adaptive fashion to accelerate kinetics calculations.

  14. Analytical approach to an integrate-and-fire model with spike-triggered adaptation

    NASA Astrophysics Data System (ADS)

    Schwalger, Tilo; Lindner, Benjamin

    2015-12-01

    The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.
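
    A minimal simulation sketch of the model class analysed: a leaky integrate-and-fire neuron with a spike-triggered adaptation current and weak noise, integrated with the Euler-Maruyama method. Parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
dt, T = 0.01, 200.0                      # time step and duration (ms)
tau_m, tau_a = 10.0, 100.0               # membrane / adaptation time scales
mu, D, delta_a = 1.5, 0.01, 0.3          # drive, noise intensity, adaptation kick
v, a, spikes = 0.0, 0.0, []
for k in range(int(T / dt)):
    v += dt * (-v / tau_m + mu - a) + np.sqrt(2 * D * dt) * rng.standard_normal()
    a += dt * (-a / tau_a)               # adaptation decays between spikes
    if v >= 1.0:                         # threshold crossing
        spikes.append(k * dt)
        v = 0.0                          # reset membrane potential
        a += delta_a                     # spike-triggered adaptation increment
rate = len(spikes) / T                   # tonic firing rate (spikes per ms)
```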

  15. Conflicts of thermal adaptation and fever--a cybernetic approach based on physiological experiments.

    PubMed

    Werner, J; Beckmann, U

    1998-01-01

    Cold adaptation aims primarily at a better economy, i.e., preservation of energy, often at the cost of a lower mean body temperature during cold stress, whereas heat adaptation, whether achieved by exposure to a hot environment or by endogenous heat produced by muscle exercise, often brings about a higher efficiency of control, i.e., a lower mean body temperature during heat stress, at the cost of a higher water loss. While cold adaptation is beneficial in a cold environment, it may constitute a detrimental factor for exposure to a hot environment, mainly because of morphological adaptation. Heat adaptation may be maladaptive for cold exposure, mainly because of functional adaptation. Heat adaptation is clearly best suited to avoid higher body temperatures in fever, no matter which environmental conditions prevail. On the other hand, cold adaptation is detrimental for coping with fever in a hot environment. Yet, in the cold, preceding cold adaptation may, because of reduced metabolic heat production, result in a lower febrile increase of body temperature. Apparently controversial effects and results may be analyzed in the framework of a cybernetic approach to the main mechanisms of thermal adaptation and fever. Morphological adaptations alter the properties of the heat transfer characteristics of the body ("passive system"), whereas functional adaptation and fever concern the subsystems of control, namely sensors, integrative centers and effectors. In a closed control-loop the two types of adaptation have totally different consequences. It is shown that the experimental results are consistent with the predictions of such an approach.

  16. Remediation management of complex sites using an adaptive site management approach.

    PubMed

    Price, John; Spreng, Carl; Hawley, Elisabeth L; Deeb, Rula

    2017-12-15

    Complex sites require a disproportionate amount of resources for environmental remediation and long timeframes to achieve remediation objectives, due to their complex geologic conditions, hydrogeologic conditions, geochemical conditions, contaminant-related conditions, large scale of contamination, and/or non-technical challenges. A recent team of state and federal environmental regulators, federal agency representatives, industry experts, community stakeholders, and academia worked together as an Interstate Technology & Regulatory Council (ITRC) team to compile resources and create new guidance on the remediation management of complex sites. This article summarizes the ITRC team's recommended process for addressing complex sites through an adaptive site management approach. The team provided guidance for site managers and other stakeholders to evaluate site complexities and determine site remediation potential, i.e., whether an adaptive site management approach is warranted. Adaptive site management was described as a comprehensive, flexible approach to iteratively evaluate and adjust the remedial strategy in response to remedy performance. Key aspects of adaptive site management were described, including tools for revising and updating the conceptual site model (CSM), the importance of setting interim objectives to define short-term milestones on the journey to achieving site objectives, establishing a performance model and metrics to evaluate progress towards meeting interim objectives, and comparing actual with predicted progress during scheduled periodic evaluations, and establishing decision criteria for when and how to adapt/modify/revise the remedial strategy in response to remedy performance. Key findings will be published in an ITRC Technical and Regulatory guidance document in 2017 and free training webinars will be conducted. More information is available at www.itrc-web.org. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Suppose one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample into V equal-size sub-samples, and use this partitioning to define V splits into an estimation sample (one of the V sub-samples) and a corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference into problems that are being increasingly addressed by clever, yet ad hoc pattern finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
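
    A compact sketch of the sample-splitting construction under an assumed toy "algorithm": each parameter-generating sample picks the covariate most correlated with the outcome and defines a fold-specific target (a mean difference at its median split), which is then estimated on the held-out estimation sample and averaged over folds. The data-adaptive step is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, V = 1000, 5
X = rng.standard_normal((n, 3))
y = X[:, 0] + 0.5 * rng.standard_normal(n)
folds = np.array_split(rng.permutation(n), V)
estimates = []
for v in range(V):
    est_idx = folds[v]                                   # estimation sample
    gen_idx = np.concatenate([folds[u] for u in range(V) if u != v])
    # data-adaptive step on the parameter-generating sample:
    j = np.argmax([abs(np.corrcoef(X[gen_idx, k], y[gen_idx])[0, 1])
                   for k in range(X.shape[1])])
    # fold-specific target: mean outcome difference, split at median of X_j
    cut = np.median(X[gen_idx, j])
    hi = y[est_idx][X[est_idx, j] > cut].mean()
    lo = y[est_idx][X[est_idx, j] <= cut].mean()
    estimates.append(hi - lo)
psi_hat = float(np.mean(estimates))                      # data-adaptive parameter
```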

  18. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    PubMed

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.
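
    A rough sketch of a data-driven adaptive sampling rule of the kind described, with assumed thresholds and interval bounds: shorten the sampling interval when the monitored parameter (e.g. dissolved oxygen) changes quickly, lengthen it when readings are stable, to save battery. This is an illustration of the idea, not the DDASA rule itself.

```python
import numpy as np

def next_interval(history, dt, dt_min=60.0, dt_max=3600.0, thresh=0.05):
    # history: recent readings; dt: current interval (s); thresh is an
    # assumed rate-of-change threshold in units per minute.
    if len(history) < 2:
        return dt
    rate = abs(history[-1] - history[-2]) / dt       # observed rate of change
    if rate > thresh / 60.0:                         # fast change: sample more
        return max(dt / 2.0, dt_min)
    return min(dt * 2.0, dt_max)                     # stable: sample less

# usage with a synthetic dissolved-oxygen trace over one day
readings, dt, t = [8.0], 600.0, 0.0
while t < 24 * 3600:
    t += dt
    do = 8.0 + 1.5 * np.sin(2 * np.pi * t / (24 * 3600))   # daily DO cycle
    readings.append(do)
    dt = next_interval(readings, dt)
```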

  19. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    PubMed Central

    Shu, Tongxin; Xia, Min; Chen, Jiahong; de Silva, Clarence

    2017-01-01

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy. PMID:29113087

  20. Towards a System Level Understanding of Non-Model Organisms Sampled from the Environment: A Network Biology Approach

    PubMed Central

    Williams, Tim D.; Turan, Nil; Diab, Amer M.; Wu, Huifeng; Mackenzie, Carolynn; Bartie, Katie L.; Hrydziuszko, Olga; Lyons, Brett P.; Stentiford, Grant D.; Herbert, John M.; Abraham, Joseph K.; Katsiadaki, Ioanna; Leaver, Michael J.; Taggart, John B.; George, Stephen G.; Viant, Mark R.; Chipman, Kevin J.; Falciani, Francesco

    2011-01-01

    The acquisition and analysis of datasets including multi-level omics and physiology from non-model species, sampled from field populations, is a formidable challenge, which so far has prevented the application of systems biology approaches. If successful, these could contribute enormously to improving our understanding of how populations of living organisms adapt to environmental stressors relating to, for example, pollution and climate. Here we describe the first application of a network inference approach integrating transcriptional, metabolic and phenotypic information representative of wild populations of the European flounder fish, sampled at seven estuarine locations in northern Europe with different degrees and profiles of chemical contaminants. We identified network modules, whose activity was predictive of environmental exposure and represented a link between molecular and morphometric indices. These sub-networks represented both known and candidate novel adverse outcome pathways representative of several aspects of human liver pathophysiology such as liver hyperplasia, fibrosis, and hepatocellular carcinoma. At the molecular level these pathways were linked to TNF alpha, TGF beta, PDGF, AGT and VEGF signalling. More generally, this pioneering study has important implications as it can be applied to model molecular mechanisms of compensatory adaptation to a wide range of scenarios in wild populations. PMID:21901081

  1. Applying Bayesian Item Selection Approaches to Adaptive Tests Using Polytomous Items

    ERIC Educational Resources Information Center

    Penfield, Randall D.

    2006-01-01

    This study applied the maximum expected information (MEI) and the maximum posterior-weighted information (MPI) approaches of computer adaptive testing item selection to the case of a test using polytomous items following the partial credit model. The MEI and MPI approaches are described. A simulation study compared the efficiency of ability…

  2. A global sampling approach to designing and reengineering RNA secondary structures.

    PubMed

    Levin, Alex; Lis, Mieszko; Ponty, Yann; O'Donnell, Charles W; Devadas, Srinivas; Berger, Bonnie; Waldispühl, Jérôme

    2012-11-01

    The development of algorithms for designing artificial RNA sequences that fold into specific secondary structures has many potential biomedical and synthetic biology applications. To date, this problem remains computationally difficult, and current strategies to address it resort to heuristics and stochastic search techniques. The most popular methods consist of two steps: First a random seed sequence is generated; next, this seed is progressively modified (i.e. mutated) to adopt the desired folding properties. Although computationally inexpensive, this approach raises several questions such as (i) the influence of the seed; and (ii) the efficiency of single-path directed searches that may be affected by energy barriers in the mutational landscape. In this article, we present RNA-ensign, a novel paradigm for RNA design. Instead of taking a progressive adaptive walk driven by local search criteria, we use an efficient global sampling algorithm to examine large regions of the mutational landscape under structural and thermodynamical constraints until a solution is found. When considering the influence of the seeds and the target secondary structures, our results show that, compared to single-path directed searches, our approach is more robust, succeeds more often and generates more thermodynamically stable sequences. An ensemble approach to RNA design is thus well worth pursuing as a complement to existing approaches. RNA-ensign is available at http://csb.cs.mcgill.ca/RNAensign.
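
    The mechanics of the two search strategies can be contrasted with a toy objective. In the hedged sketch below, score() simply counts matches to a target string and stands in for the structural and thermodynamic evaluation that RNA-ensign actually performs; the seed handling and sampling loop are illustrative only.

      import random

      random.seed(1)
      ALPHABET = "ACGU"
      TARGET = "GCAUGCAUGCAUGCAU"  # proxy: "folds correctly" = matches this string

      def score(seq):
          return sum(a == b for a, b in zip(seq, TARGET))

      def adaptive_walk(seed, steps=300):
          """Single-path directed search: mutate one position, keep uphill moves."""
          seq, best = list(seed), score(seed)
          for _ in range(steps):
              i = random.randrange(len(seq))
              old, seq[i] = seq[i], random.choice(ALPHABET)
              s = score(seq)
              if s >= best:
                  best = s
              else:
                  seq[i] = old          # reject downhill moves
          return "".join(seq)

      def global_sampling(seed, n_samples=300):
          """Examine candidates at all mutation distances from the seed."""
          best = seed
          for _ in range(n_samples):
              k = random.randrange(1, len(seed) + 1)
              cand = list(seed)
              for i in random.sample(range(len(seed)), k):
                  cand[i] = random.choice(ALPHABET)
              if score(cand) > score(best):
                  best = "".join(cand)
          return best

      seed = "".join(random.choice(ALPHABET) for _ in TARGET)
      print(score(adaptive_walk(seed)), score(global_sampling(seed)))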

  3. A global sampling approach to designing and reengineering RNA secondary structures

    PubMed Central

    Levin, Alex; Lis, Mieszko; Ponty, Yann; O’Donnell, Charles W.; Devadas, Srinivas; Berger, Bonnie; Waldispühl, Jérôme

    2012-01-01

    The development of algorithms for designing artificial RNA sequences that fold into specific secondary structures has many potential biomedical and synthetic biology applications. To date, this problem remains computationally difficult, and current strategies to address it resort to heuristics and stochastic search techniques. The most popular methods consist of two steps: First a random seed sequence is generated; next, this seed is progressively modified (i.e. mutated) to adopt the desired folding properties. Although computationally inexpensive, this approach raises several questions such as (i) the influence of the seed; and (ii) the efficiency of single-path directed searches that may be affected by energy barriers in the mutational landscape. In this article, we present RNA-ensign, a novel paradigm for RNA design. Instead of taking a progressive adaptive walk driven by local search criteria, we use an efficient global sampling algorithm to examine large regions of the mutational landscape under structural and thermodynamical constraints until a solution is found. When considering the influence of the seeds and the target secondary structures, our results show that, compared to single-path directed searches, our approach is more robust, succeeds more often and generates more thermodynamically stable sequences. An ensemble approach to RNA design is thus well worth pursuing as a complement to existing approaches. RNA-ensign is available at http://csb.cs.mcgill.ca/RNAensign. PMID:22941632

  4. Taking a Broad Approach to Public Health Program Adaptation: Adapting a Family-Based Diabetes Education Program

    ERIC Educational Resources Information Center

    Reinschmidt, Kerstin M.; Teufel-Shone, Nicolette I.; Bradford, Gail; Drummond, Rebecca L.; Torres, Emma; Redondo, Floribella; Elenes, Jo Jean; Sanders, Alicia; Gastelum, Sylvia; Moore-Monroy, Martha; Barajas, Salvador; Fernandez, Lourdes; Alvidrez, Rosy; de Zapien, Jill Guernsey; Staten, Lisa K.

    2010-01-01

    Diabetes health disparities among Hispanic populations have been countered with federally funded health promotion and disease prevention programs. Dissemination has focused on program adaptation to local cultural contexts for greater acceptability and sustainability. Taking a broader approach and drawing on our experience in Mexican American…

  5. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
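
    A minimal sketch of a relevance-network-like store, under our own assumptions about the data structures: user feedback is recorded per (profile, query, reference) triple and generalized to similar queries by word overlap.

      from collections import defaultdict

      class RelevanceNetwork:
          def __init__(self):
              # (profile, query words, reference) -> accumulated relevance score
              self.relevance = defaultdict(float)

          def feedback(self, profile, query, reference, delta):
              self.relevance[(profile, frozenset(query.split()), reference)] += delta

          def suggest(self, profile, query, top_k=3):
              words = set(query.split())
              scores = defaultdict(float)
              for (prof, qwords, ref), s in self.relevance.items():
                  if prof != profile:
                      continue
                  overlap = len(words & qwords) / max(len(words | qwords), 1)
                  scores[ref] += overlap * s   # generalize to similar queries
              return sorted(scores, key=scores.get, reverse=True)[:top_k]

      net = RelevanceNetwork()
      net.feedback("pilot", "engine failure checklist", "doc:eng-021", +1.0)
      net.feedback("pilot", "engine fire checklist", "doc:eng-007", +1.0)
      print(net.suggest("pilot", "engine failure procedures"))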

  6. Modern control concepts in hydrology. [parameter identification in adaptive stochastic control approach

    NASA Technical Reports Server (NTRS)

    Duong, N.; Winn, C. B.; Johnson, G. R.

    1975-01-01

    Two approaches to an identification problem in hydrology are presented, based upon concepts from modern control and estimation theory. The first approach treats the identification of unknown parameters in a hydrologic system subject to noisy inputs as an adaptive linear stochastic control problem; the second approach alters the model equation to account for the random part in the inputs, and then uses a nonlinear estimation scheme to estimate the unknown parameters. Both approaches use state-space concepts. The identification schemes are sequential and adaptive and can handle either time-invariant or time-dependent parameters. They are used to identify parameters in the Prasad model of rainfall-runoff. The results obtained are encouraging and confirm the results from two previous studies; the first using numerical integration of the model equation along with a trial-and-error procedure, and the second using a quasi-linearization technique. The proposed approaches offer a systematic way of analyzing the rainfall-runoff process when the input data are imbedded in noise.
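
    The sequential, state-space flavour of such identification schemes can be illustrated with recursive least squares for a model that is linear in its unknown parameters. The Prasad rainfall-runoff model itself is nonlinear, so the following is only a generic stand-in; the forgetting factor lets the filter track time-dependent parameters.

      import numpy as np

      def rls(phi_rows, y, lam=0.99):
          """phi_rows: regressor vectors; y: observations; lam: forgetting factor."""
          n = phi_rows.shape[1]
          theta = np.zeros(n)           # parameter estimate, updated sequentially
          P = np.eye(n) * 1e3           # inverse information matrix
          for phi, yk in zip(phi_rows, y):
              k = P @ phi / (lam + phi @ P @ phi)
              theta = theta + k * (yk - phi @ theta)
              P = (P - np.outer(k, phi @ P)) / lam
          return theta

      rng = np.random.default_rng(0)
      phi = rng.normal(size=(500, 2))           # e.g. [rainfall, prior runoff]
      true_theta = np.array([0.6, 0.3])
      y = phi @ true_theta + 0.01 * rng.normal(size=500)
      print(rls(phi, y))                        # recovers approximately [0.6, 0.3]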

  7. Adaptive Role Playing Games: An Immersive Approach for Problem Based Learning

    ERIC Educational Resources Information Center

    Sancho, Pilar; Moreno-Ger, Pablo; Fuentes-Fernandez, Ruben; Fernandez-Manjon, Baltasar

    2009-01-01

    In this paper we present a general framework, called NUCLEO, for the application of socio-constructive educational approaches in higher education. The underlying pedagogical approach relies on an adaptation model in order to improve group dynamics, as this has been identified as one of the key features in the success of collaborative learning…

  8. The role of adaptive management as an operational approach for resource management agencies

    USGS Publications Warehouse

    Johnson, B.L.

    1999-01-01

    In making resource management decisions, agencies use a variety of approaches that involve different levels of political concern, historical precedence, data analyses, and evaluation. Traditional decision-making approaches have often failed to achieve objectives for complex problems in large systems, such as the Everglades or the Colorado River. I contend that adaptive management is the best approach available to agencies for addressing this type of complex problem, although its success has been limited thus far. Traditional decision-making approaches have been fairly successful at addressing relatively straightforward problems in small, replicated systems, such as management of trout in small streams or pulp production in forests. However, this success may be jeopardized as more users place increasing demands on these systems. Adaptive management has received little attention from agencies for addressing problems in small-scale systems, but I suggest that it may be a useful approach for creating a holistic view of common problems and developing guidelines that can then be used in simpler, more traditional approaches to management. Although adaptive management may be more expensive to initiate than traditional approaches, it may be less expensive in the long run if it leads to more effective management. The overall goal of adaptive management is not to maintain an optimal condition of the resource, but to develop an optimal management capacity. This is accomplished by maintaining ecological resilience that allows the system to react to inevitable stresses, and generating flexibility in institutions and stakeholders that allows managers to react when conditions change. The result is that, rather than managing for a single, optimal state, we manage within a range of acceptable outcomes while avoiding catastrophes and irreversible negative effects. Copyright © 1999 by The Resilience Alliance.

  9. Flexible binding simulation by a novel and improved version of virtual-system coupled adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Dasgupta, Bhaskar; Nakamura, Haruki; Higo, Junichi

    2016-10-01

    Virtual-system coupled adaptive umbrella sampling (VAUS) enhances sampling along a reaction coordinate by using a virtual degree of freedom. However, VAUS and regular adaptive umbrella sampling (AUS) methods are still computationally expensive. To decrease the computational burden further, improvements of VAUS for all-atom explicit solvent simulation are presented here. The improvements include probability distribution calculation by a Markov approximation; parameterization of biasing forces by iterative polynomial fitting; and force scaling. When applied to study Ala-pentapeptide dimerization in explicit solvent, these improvements showed an advantage over regular AUS. The improved VAUS makes larger biological systems amenable to simulation.

  10. Land-based approach to evaluate sustainable land management and adaptive capacity of ecosystems/lands

    NASA Astrophysics Data System (ADS)

    Kust, German; Andreeva, Olga

    2015-04-01

    A number of new concepts and paradigms have appeared during the last decades, such as sustainable land management (SLM), climate change (CC) adaptation, environmental services, ecosystem health, and others. These initiatives still lack a common scientific platform, although some agreement on terminology has been reached, schemes of links and feedback loops have been created, and some models have been developed. Nevertheless, in spite of these scientific achievements, land-related issues are still not the focus of CC adaptation and mitigation, which have not grown much beyond the "greenhouse gases" (GHG) concept and thus leave land degradation as the "forgotten side of climate change". A possible way to integrate the concepts of climate and desertification/land degradation is to treat the "GHG" approach as providing a global solution, and the "land" approach as providing a local solution that covers other "locally manifesting" issues of global importance (biodiversity conservation, food security, disasters and risks, etc.) and serves as a central concept among them. The SLM concept is a land-based approach, which includes the concepts of both the ecosystem-based approach (EbA) and the community-based approach (CbA). SLM can serve as an integral CC adaptation strategy, being based on the statement "the more healthy and resilient the system is, the less vulnerable and more adaptive it will be to any external changes and forces, including climate". The biggest scientific issue is the choice of methods to evaluate SLM and the results of SLM investments. We suggest using an approach based on the understanding of the balance or equilibrium of the land and nature components as the major sign of a sustainable system. From this point of view it is easier to understand the state of ecosystem stress, the extent of ecosystem "health", the range of adaptive capacity, the drivers of degradation and the nature of SLM, as well as extended land use and the concept of environmental land management as the improved SLM approach

  11. An information theoretic approach of designing sparse kernel adaptive filters.

    PubMed

    Liu, Weifeng; Park, Il; Principe, José C

    2009-12-01

    This paper discusses an information theoretic approach to designing sparse kernel adaptive filters. To determine which data are useful to learn and to remove redundant ones, a subjective information measure called surprise is introduced. Surprise captures the amount of information a datum contains that is transferable to a learning system. Based on this concept, we propose a systematic sparsification scheme, which can drastically reduce the time and space complexity without harming the performance of kernel adaptive filters. Nonlinear regression, short-term chaotic time-series prediction, and long-term time-series forecasting examples are presented.
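
    A hedged sketch of the idea follows: a kernel least-mean-squares filter whose dictionary admits only informative data. The admission rule used here (large prediction error plus distance from existing centers) is a simplified stand-in for the paper's information-theoretic surprise measure.

      import numpy as np

      class SparseKLMS:
          """KLMS with a growth criterion standing in for the surprise measure."""
          def __init__(self, step=0.5, gamma=2.0, err_thresh=0.05, dist_thresh=0.2):
              self.step, self.gamma = step, gamma
              self.err_thresh, self.dist_thresh = err_thresh, dist_thresh
              self.centers, self.alphas = [], []

          def _kernel(self, x, y):
              return np.exp(-self.gamma * np.sum((x - y) ** 2))

          def predict(self, x):
              return sum(a * self._kernel(c, x)
                         for c, a in zip(self.centers, self.alphas))

          def update(self, x, d):
              err = d - self.predict(x)
              dists = [np.linalg.norm(x - c) for c in self.centers]
              informative = abs(err) > self.err_thresh
              novel = (not dists) or min(dists) > self.dist_thresh
              if informative and novel:
                  self.centers.append(np.asarray(x, float))  # learn the datum
                  self.alphas.append(self.step * err)
              elif informative:
                  j = int(np.argmin(dists))                  # redundant datum:
                  self.alphas[j] += self.step * err          # refine, don't grow
              return err

      rng = np.random.default_rng(0)
      filt = SparseKLMS()
      for _ in range(2000):
          x = rng.uniform(-1, 1, size=1)
          filt.update(x, np.sin(3 * x[0]))
      print(len(filt.centers), "centers retained out of 2000 data")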

  12. Adaptive measurements of urban runoff quality

    NASA Astrophysics Data System (ADS)

    Wong, Brandon P.; Kerkez, Branko

    2016-11-01

    An approach to adaptively measure runoff water quality dynamics is introduced, focusing specifically on characterizing the timing and magnitude of urban pollutographs. Rather than relying on a static schedule or flow-weighted sampling, which can miss important water quality dynamics if parameterized inadequately, novel Internet-enabled sensor nodes are used to autonomously adapt their measurement frequency to real-time weather forecasts and hydrologic conditions. This dynamic approach has the potential to significantly improve the use of constrained experimental resources, such as automated grab samplers, which continue to provide a strong alternative to sampling water quality dynamics when in situ sensors are not available. Compared to conventional flow-weighted or time-weighted sampling schemes, which rely on preset thresholds, a major benefit of the approach is the ability to dynamically adapt to features of an underlying hydrologic signal. A 28 km² urban watershed was studied to characterize concentrations of total suspended solids (TSS) and total phosphorus. Water quality samples were autonomously triggered in response to features in the underlying hydrograph and real-time weather forecasts. The study watershed did not exhibit a strong first flush, and intraevent concentration variability was driven by flow acceleration, wherein the largest loadings of TSS and total phosphorus corresponded with the steepest rising limbs of the storm hydrograph. The scalability of the proposed method is discussed in the context of larger sensor network deployments, as well as the potential to improve control of urban water quality.
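
    An illustrative trigger rule in the spirit of this approach (thresholds invented for the example): fire the sampler on steep, accelerating rising limbs of the hydrograph, the feature the study found to drive TSS and total phosphorus loading.

      def should_sample(flows, dt_min=5.0, accel_thresh=0.02):
          """flows: last three discharge readings (m^3/s), spaced dt_min apart."""
          if len(flows) < 3:
              return False
          dq1 = (flows[-2] - flows[-3]) / dt_min
          dq2 = (flows[-1] - flows[-2]) / dt_min
          accel = (dq2 - dq1) / dt_min      # second difference = flow acceleration
          return dq2 > 0 and accel > accel_thresh

      print(should_sample([1.2, 1.5, 2.4]))  # steep, accelerating rise -> True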

  13. Adapting legume crops to climate change using genomic approaches.

    PubMed

    Mousavi-Derazmahalleh, Mahsa; Bayer, Philipp E; Hane, James K; Valliyodan, Babu; Nguyen, Henry T; Nelson, Matthew N; Erskine, William; Varshney, Rajeev K; Papa, Roberto; Edwards, David

    2018-03-30

    Our agricultural system, and hence food security, is threatened by a combination of factors, such as a growing population, the impacts of climate change, and the need for more sustainable development. Evolutionary adaptation may help some species to overcome environmental changes through new selection pressures driven by climate change. However, the success of evolutionary adaptation depends on various factors, one of which is the extent of genetic variation available within a species. Genomic approaches provide an exceptional opportunity to identify genetic variation that can be employed in crop improvement programs. In this review, we illustrate some of the routinely used genomics-based methods as well as recent breakthroughs, which facilitate assessment of genetic variation and discovery of adaptive genes in legumes. Although additional information is needed, the current utility of selection tools indicates a robust ability to utilize existing variation among legumes to address the challenges of climate uncertainty. © 2018 The Authors. Plant, Cell & Environment Published by John Wiley & Sons Ltd.

  14. Influence of wave-front sampling in adaptive optics retinal imaging

    PubMed Central

    Laslandes, Marie; Salas, Matthias; Hitzenberger, Christoph K.; Pircher, Michael

    2017-01-01

    A wide range of sampling densities of the wave-front has been used in retinal adaptive optics (AO) instruments, compared to the number of corrector elements. We developed a model in order to characterize the link between number of actuators, number of wave-front sampling points and AO correction performance. Based on available data from aberration measurements in the human eye, 1000 wave-fronts were generated for the simulations. The AO correction performance in the presence of these representative aberrations was simulated for different deformable mirror and Shack Hartmann wave-front sensor combinations. Predictions of the model were experimentally tested through in vivo measurements in 10 eyes including retinal imaging with an AO scanning laser ophthalmoscope. According to our study, a ratio between wavefront sampling points and actuator elements of 2 is sufficient to achieve high resolution in vivo images of photoreceptors. PMID:28271004

  15. Adaptive enhanced sampling with a path-variable for the simulation of protein folding and aggregation

    NASA Astrophysics Data System (ADS)

    Peter, Emanuel K.

    2017-12-01

    In this article, we present a novel adaptive enhanced sampling molecular dynamics (MD) method for the accelerated simulation of protein folding and aggregation. We introduce a path-variable L based on the un-biased momenta p and displacements dq for the definition of the bias s applied to the system and derive 3 algorithms: general adaptive bias MD, adaptive path-sampling, and a hybrid method which combines the first 2 methodologies. Through the analysis of the correlations between the bias and the un-biased gradient in the system, we find that the hybrid methodology leads to an improved force correlation and acceleration in the sampling of the phase space. We apply our method on SPC/E water, where we find a conservation of the average water structure. We then use our method to sample dialanine and the folding of TrpCage, where we find a good agreement with simulation data reported in the literature. Finally, we apply our methodologies on the initial stages of aggregation of a hexamer of Alzheimer's amyloid β fragment 25-35 (Aβ 25-35) and find that transitions within the hexameric aggregate are dominated by entropic barriers, while we speculate that especially the conformation entropy plays a major role in the formation of the fibril as a rate limiting factor.

  16. Adaptive enhanced sampling with a path-variable for the simulation of protein folding and aggregation.

    PubMed

    Peter, Emanuel K

    2017-12-07

    In this article, we present a novel adaptive enhanced sampling molecular dynamics (MD) method for the accelerated simulation of protein folding and aggregation. We introduce a path-variable L based on the un-biased momenta p and displacements dq for the definition of the bias s applied to the system and derive 3 algorithms: general adaptive bias MD, adaptive path-sampling, and a hybrid method which combines the first 2 methodologies. Through the analysis of the correlations between the bias and the un-biased gradient in the system, we find that the hybrid methodology leads to an improved force correlation and acceleration in the sampling of the phase space. We apply our method on SPC/E water, where we find a conservation of the average water structure. We then use our method to sample dialanine and the folding of TrpCage, where we find a good agreement with simulation data reported in the literature. Finally, we apply our methodologies on the initial stages of aggregation of a hexamer of Alzheimer's amyloid β fragment 25-35 (Aβ 25-35) and find that transitions within the hexameric aggregate are dominated by entropic barriers, while we speculate that especially the conformation entropy plays a major role in the formation of the fibril as a rate limiting factor.

  17. Practical guidelines for implementing adaptive optics in fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Wilding, Dean; Pozzi, Paolo; Soloviev, Oleg; Vdovin, Gleb; Verhaegen, Michel

    2018-02-01

    In life sciences, interest in the microscopic imaging of increasingly complex three dimensional samples, such as cell spheroids, zebrafish embryos, and in vivo applications in small animals, is growing quickly. Due to the increasing complexity of samples, more and more life scientists are considering the implementation of adaptive optics in their experimental setups. While several approaches to adaptive optics in microscopy have been reported, it is often difficult and confusing for the microscopist to choose from the array of techniques and equipment. In this poster presentation we offer a small guide to adaptive optics providing general guidelines for successful adaptive optics implementation.

  18. Self-organizing adaptive map: autonomous learning of curves and surfaces from point samples.

    PubMed

    Piastra, Marco

    2013-05-01

    Competitive Hebbian Learning (CHL) (Martinetz, 1993) is a simple and elegant method for estimating the topology of a manifold from point samples. The method has been adopted in a number of self-organizing networks described in the literature and has given rise to related studies in the fields of geometry and computational topology. Recent results from these fields have shown that a faithful reconstruction can be obtained using the CHL method only for curves and surfaces. Within these limitations, these findings constitute a basis for defining a CHL-based, growing self-organizing network that produces a faithful reconstruction of an input manifold. The SOAM (Self-Organizing Adaptive Map) algorithm adapts its local structure autonomously in such a way that it can match the features of the manifold being learned. The adaptation process is driven by the defects arising when the network structure is inadequate, which cause a growth in the density of units. Regions of the network undergo a phase transition and change their behavior whenever a simple, local condition of topological regularity is met. The phase transition is eventually completed across the entire structure and the adaptation process terminates. In specific conditions, the structure thus obtained is homeomorphic to the input manifold. During the adaptation process, the network also has the capability to focus on the acquisition of input point samples in critical regions, with a substantial increase in efficiency. The behavior of the network has been assessed experimentally with typical data sets for surface reconstruction, including suboptimal conditions, e.g. with undersampling and noise. Copyright © 2012 Elsevier Ltd. All rights reserved.
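
    The basic CHL step is easy to state in code: for each input sample, connect the two units nearest to it. The sketch below applies this to units placed on a circle, a 1-D manifold; the growing, phase-transition, and sample-focusing machinery of SOAM is not reproduced here.

      import numpy as np

      def chl_edges(units, samples):
          edges = set()
          for x in samples:
              d = np.linalg.norm(units - x, axis=1)
              i, j = np.argsort(d)[:2]          # the two nearest units compete
              edges.add((min(i, j), max(i, j))) # Hebbian rule: co-activated pair
          return edges

      rng = np.random.default_rng(0)
      theta = rng.uniform(0, 2 * np.pi, 30)
      units = np.c_[np.cos(theta), np.sin(theta)]       # units already on a circle
      phi = rng.uniform(0, 2 * np.pi, 2000)
      samples = np.c_[np.cos(phi), np.sin(phi)] + 0.01 * rng.normal(size=(2000, 2))
      print(len(chl_edges(units, samples)), "edges estimated for the 1-D manifold")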

  19. The diversity of gendered adaptation strategies to climate change of Indian farmers: A feminist intersectional approach.

    PubMed

    Ravera, Federica; Martín-López, Berta; Pascual, Unai; Drucker, Adam

    2016-12-01

    This paper examines climate change adaptation and gender issues through an application of a feminist intersectional approach. This approach permits the identification of diverse adaptation responses arising from the existence of multiple and fragmented dimensions of identity (including gender) that intersect with power relations to shape situation-specific interactions between farmers and ecosystems. Based on results from contrasting research cases in Bihar and Uttarakhand, India, this paper demonstrates, inter alia, that there are geographically determined gendered preferences and adoption strategies regarding adaptation options and that these are influenced by the socio-ecological context and institutional dynamics. Intersecting identities, such as caste, wealth, age and gender, influence decisions and reveal power dynamics and negotiation within the household and the community, as well as barriers to adaptation among groups. Overall, the findings suggest that a feminist intersectional approach does appear to be useful and worth further exploration in the context of climate change adaptation. In particular, future research could benefit from more emphasis on a nuanced analysis of the intra-gender differences that shape adaptive capacity to climate change.

  20. Solution-Adaptive Cartesian Cell Approach for Viscous and Inviscid Flows

    NASA Technical Reports Server (NTRS)

    Coirier, William J.; Powell, Kenneth G.

    1996-01-01

    A Cartesian cell-based approach for adaptively refined solutions of the Euler and Navier-Stokes equations in two dimensions is presented. Grids about geometrically complicated bodies are generated automatically, by the recursive subdivision of a single Cartesian cell encompassing the entire flow domain. Where the resulting cells intersect bodies, polygonal cut cells are created using modified polygon-clipping algorithms. The grid is stored in a binary tree data structure that provides a natural means of obtaining cell-to-cell connectivity and of carrying out solution-adaptive mesh refinement. The Euler and Navier-Stokes equations are solved on the resulting grids using a finite volume formulation. The convective terms are upwinded: a linear reconstruction of the primitive variables is performed, providing input states to an approximate Riemann solver for computing the fluxes between neighboring cells. The results of a study comparing the accuracy and positivity of two classes of cell-centered, viscous gradient reconstruction procedures are briefly summarized. Adaptively refined solutions of the Navier-Stokes equations are shown using the more robust of these gradient reconstruction procedures, where the results computed by the Cartesian approach are compared to theory, experiment, and other accepted computational results for a series of low and moderate Reynolds number flows.

  1. Incremental Validity of the Durand Adaptive Psychopathic Traits Questionnaire Above Self-Report Psychopathy Measures in Community Samples.

    PubMed

    Durand, Guillaume

    2018-05-03

    Although highly debated, the notion of the existence of an adaptive side to psychopathy is supported by some researchers. Currently, 2 instruments assessing psychopathic traits include an adaptive component, which might not cover the full spectrum of adaptive psychopathic traits. The Durand Adaptive Psychopathic Traits Questionnaire (DAPTQ; Durand, 2017 ) is a 41-item self-reported instrument assessing adaptive traits known to correlate with the psychopathic personality. In this study, I investigated in 2 samples (N = 263 and N = 262) the incremental validity of the DAPTQ over the Psychopathic Personality Inventory-Short Form (PPI-SF) and the Triarchic Psychopathy Measure (TriPM) using multiple criterion measures. Results showed that the DAPTQ significantly increased the predictive validity over the PPI-SF on 5 factors of the HEXACO. Additionally, the DAPTQ provided incremental validity over both the PPI-SF and the TriPM on measures of communication adaptability, perceived stress, and trait anxiety. Overall, these results support the validity of the DAPTQ in community samples. Directions for future studies to further validate the DAPTQ are discussed.

  2. Shape anomaly detection under strong measurement noise: An analytical approach to adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.

    2015-10-01

    We suggest an analytical approach to the adaptive thresholding in a shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observational shape hindered by strong measurement noise that depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is also confirmed by computer simulations and shows nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take into account variations in the noise level.
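
    The paper derives the null distribution analytically; the hedged sketch below approximates it by Monte Carlo instead, for a given noise level, and flags an observed shape whose similarity score falls below the chosen quantile.

      import numpy as np

      rng = np.random.default_rng(0)

      def cosine(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

      def adaptive_threshold(reference, noise_sigma, alpha=0.01, n_mc=5000):
          """Score below which a *normal* shape would fall with probability alpha."""
          sims = [cosine(reference,
                         reference + rng.normal(0, noise_sigma, reference.size))
                  for _ in range(n_mc)]
          return np.quantile(sims, alpha)

      t = np.linspace(0, 1, 200)
      reference = np.sin(2 * np.pi * t)      # stand-in for a normal ECG cycle
      thr = adaptive_threshold(reference, noise_sigma=0.5)
      observed = np.sin(2 * np.pi * t + 0.9) + rng.normal(0, 0.5, t.size)
      print(thr, cosine(reference, observed) < thr)  # True -> flagged as anomaly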

  3. An Adaptive Approach to Managing Knowledge Development in a Project-Based Learning Environment

    ERIC Educational Resources Information Center

    Tilchin, Oleg; Kittany, Mohamed

    2016-01-01

    In this paper we propose an adaptive approach to managing the development of students' knowledge in the comprehensive project-based learning (PBL) environment. Subject study is realized by two-stage PBL. It shapes adaptive knowledge management (KM) process and promotes the correct balance between personalized and collaborative learning. The…

  4. Approaching neuropsychological tasks through adaptive neurorobots

    NASA Astrophysics Data System (ADS)

    Gigliotta, Onofrio; Bartolomeo, Paolo; Miglino, Orazio

    2015-04-01

    Neuropsychological phenomena have mainly been modeled, in the mainstream approach, by attempting to reproduce their neural substrate, whereas sensory-motor contingencies have attracted less attention. In this work, we introduce a simulator based on the evolutionary robotics platform Evorobot* in order to set up in silico neuropsychological tasks. Moreover, in this study we trained artificial embodied neurorobotic agents equipped with a pan/tilt camera, provided with different neural and motor capabilities, to solve a well-known neuropsychological test: the cancellation task, in which an individual is asked to cancel target stimuli surrounded by distractors. Results showed that embodied agents provided with additional motor capabilities (a zooming/attentional actuator) outperformed simple pan/tilt agents, even those equipped with more complex neural controllers, and that the zooming ability is exploited to correctly categorise the presented stimuli. We conclude that since neural computational power alone cannot explain the (artificial) cognition which emerged throughout the adaptive process, this kind of modelling approach can be fruitful in neuropsychological modelling, where the importance of having a body is often neglected.

  5. Knowledge-light adaptation approaches in case-based reasoning for radiotherapy treatment planning.

    PubMed

    Petrovic, Sanja; Khussainova, Gulmira; Jagannathan, Rupa

    2016-03-01

    Radiotherapy treatment planning aims at delivering a sufficient radiation dose to cancerous tumour cells while sparing healthy organs in the tumour-surrounding area. It is a time-consuming trial-and-error process that requires the expertise of a group of medical experts including oncologists and medical physicists and can take from 2 to 3 h to a few days. Our objective is to improve the performance of our previously built case-based reasoning (CBR) system for brain tumour radiotherapy treatment planning. In this system, a treatment plan for a new patient is retrieved from a case base containing patient cases treated in the past and their treatment plans. However, this system does not perform any adaptation, which is needed to account for any difference between the new and retrieved cases. Generally, the adaptation phase is considered to be intrinsically knowledge-intensive and domain-dependent. Therefore, an adaptation often requires a large amount of domain-specific knowledge, which can be difficult to acquire and often is not readily available. In this study, we investigate approaches to adaptation that do not require much domain knowledge, referred to as knowledge-light adaptation. We developed two adaptation approaches: adaptation based on machine-learning tools and adaptation-guided retrieval. They were used to adapt the beam number and beam angles suggested in the retrieved case. Two machine-learning tools, neural networks and a naive Bayes classifier, were used in the adaptation to learn how the difference in attribute values between the retrieved and new cases affects the output of these two cases. The adaptation-guided retrieval takes into consideration not only the similarity between the new and retrieved cases, but also how to adapt the retrieved case. The research was carried out in collaboration with medical physicists at the Nottingham University Hospitals NHS Trust, City Hospital Campus, UK. All experiments were performed using real-world brain cancer

  6. A practical approach for translating climate change adaptation principles into forest management actions

    Treesearch

    Maria K. Janowiak; Christopher W. Swanston; Linda M. Nagel; Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; P. Danielle Shannon; Louis R. Iverson; Stephen N. Matthews; Anantha Prasad; Matthew P. Peters

    2014-01-01

    There is an ever-growing body of literature on forest management strategies for climate change adaptation; however, few frameworks have been presented for integrating these strategies with the real-world challenges of forest management. We have developed a structured approach for translating broad adaptation concepts into specific management actions and silvicultural...

  7. New multigrid approach for three-dimensional unstructured, adaptive grids

    NASA Technical Reports Server (NTRS)

    Parthasarathy, Vijayan; Kallinderis, Y.

    1994-01-01

    A new multigrid method with adaptive unstructured grids is presented. The three-dimensional Euler equations are solved on tetrahedral grids that are adaptively refined or coarsened locally. The multigrid method is employed to propagate the fine grid corrections more rapidly by redistributing the changes-in-time of the solution from the fine grid to the coarser grids to accelerate convergence. A new approach is employed that uses the parent cells of the fine grid cells in an adapted mesh to generate successively coarser levels of multigrid. This obviates the need for the generation of a sequence of independent, nonoverlapping grids as well as the relatively complicated operations that need to be performed to interpolate the solution and the residuals between the independent grids. The solver is an explicit, vertex-based, finite volume scheme that employs edge-based data structures and operations. Spatial discretization is of central-differencing type combined with special upwind-like smoothing operators. Application cases include adaptive solutions obtained with multigrid acceleration for supersonic and subsonic flow over a bump in a channel, as well as transonic flow around the ONERA M6 wing. Two levels of multigrid resulted in a reduction in the number of iterations by a factor of 5.

  8. A novel heterogeneous training sample selection method on space-time adaptive processing

    NASA Astrophysics Data System (ADS)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

    The ground target detection performance of space-time adaptive processing (STAP) decreases when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance so as to reject contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and the mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the largest values are preferentially selected to realize dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
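
    The screening idea can be sketched with the standard mean (average) Hausdorff distance between point sets; the STAP-specific projection onto the target's orthogonal subspace is omitted and the data are invented, so this only illustrates the similarity-ranking step.

      import numpy as np

      def mean_hausdorff(A, B):
          """Average Hausdorff distance between two point sets A and B."""
          d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
          return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

      def screen_samples(samples, keep):
          n = len(samples)
          avg = [np.mean([mean_hausdorff(samples[i], samples[j])
                          for j in range(n) if j != i]) for i in range(n)]
          return np.argsort(avg)[:keep]    # keep the mutually most similar ones

      rng = np.random.default_rng(0)
      clutter = [rng.normal(0, 1, (16, 2)) for _ in range(9)]
      contaminated = [rng.normal(5, 1, (16, 2))]       # target-like outlier sample
      print(screen_samples(clutter + contaminated, keep=8))  # index 9 excluded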

  9. An Enhanced Adaptive Management Approach for Remediation of Legacy Mercury in the South River

    PubMed Central

    Foran, Christy M.; Baker, Kelsie M.; Grosso, Nancy R.; Linkov, Igor

    2015-01-01

    Uncertainties about future conditions and the effects of chosen actions, as well as increasing resource scarcity, have been driving forces in the utilization of adaptive management strategies. However, many applications of adaptive management have been criticized for a number of shortcomings, including a limited ability to learn from actions and a lack of consideration of stakeholder objectives. To address these criticisms, we supplement existing adaptive management approaches with a decision-analytical approach that first informs the initial selection of management alternatives and then allows for periodic re-evaluation or phased implementation of management alternatives based on monitoring information and incorporation of stakeholder values. We describe the application of this enhanced adaptive management (EAM) framework to compare remedial alternatives for mercury in the South River, based on an understanding of the loading and behavior of mercury in the South River near Waynesboro, VA. The outcomes show that the ranking of remedial alternatives is influenced by uncertainty in the mercury loading model, by the relative importance placed on different criteria, and by cost estimates. The process itself demonstrates that a decision model can link project performance criteria, decision-maker preferences, environmental models, and short- and long-term monitoring information with management choices to help shape a remediation approach that provides useful information for adaptive, incremental implementation. PMID:25665032

  10. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions and can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short period of sampling burn-in time where unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the

  11. Adaptive sampling dual terahertz comb spectroscopy using dual free-running femtosecond lasers.

    PubMed

    Yasui, Takeshi; Ichikawa, Ryuji; Hsieh, Yi-Da; Hayashi, Kenta; Cahyadi, Harsono; Hindle, Francis; Sakaguchi, Yoshiyuki; Iwata, Tetsuo; Mizutani, Yasuhiro; Yamamoto, Hirotsugu; Minoshima, Kaoru; Inaba, Hajime

    2015-06-02

    Terahertz (THz) dual comb spectroscopy (DCS) is a promising method for high-accuracy, high-resolution, broadband THz spectroscopy because the mode-resolved THz comb spectrum includes both broadband THz radiation and narrow-line CW-THz radiation characteristics. In addition, all frequency modes of a THz comb can be phase-locked to a microwave frequency standard, providing excellent traceability. However, the need for stabilization of dual femtosecond lasers has often hindered its wide use. To overcome this limitation, here we have demonstrated adaptive-sampling THz-DCS, allowing the use of free-running femtosecond lasers. To correct the fluctuation of the time and frequency scales caused by the laser timing jitter, an adaptive sampling clock is generated by dual THz-comb-referenced spectrum analysers and is used for a timing clock signal in a data acquisition board. The results not only indicated the successful implementation of THz-DCS with free-running lasers but also showed that this configuration outperforms standard THz-DCS with stabilized lasers due to the slight jitter remaining in the stabilized lasers.

  12. Adaptive sampling dual terahertz comb spectroscopy using dual free-running femtosecond lasers

    PubMed Central

    Yasui, Takeshi; Ichikawa, Ryuji; Hsieh, Yi-Da; Hayashi, Kenta; Cahyadi, Harsono; Hindle, Francis; Sakaguchi, Yoshiyuki; Iwata, Tetsuo; Mizutani, Yasuhiro; Yamamoto, Hirotsugu; Minoshima, Kaoru; Inaba, Hajime

    2015-01-01

    Terahertz (THz) dual comb spectroscopy (DCS) is a promising method for high-accuracy, high-resolution, broadband THz spectroscopy because the mode-resolved THz comb spectrum includes both broadband THz radiation and narrow-line CW-THz radiation characteristics. In addition, all frequency modes of a THz comb can be phase-locked to a microwave frequency standard, providing excellent traceability. However, the need for stabilization of dual femtosecond lasers has often hindered its wide use. To overcome this limitation, here we have demonstrated adaptive-sampling THz-DCS, allowing the use of free-running femtosecond lasers. To correct the fluctuation of the time and frequency scales caused by the laser timing jitter, an adaptive sampling clock is generated by dual THz-comb-referenced spectrum analysers and is used for a timing clock signal in a data acquisition board. The results not only indicated the successful implementation of THz-DCS with free-running lasers but also showed that this configuration outperforms standard THz-DCS with stabilized lasers due to the slight jitter remaining in the stabilized lasers. PMID:26035687

  13. ASICs Approach for the Implementation of a Symmetric Triangular Fuzzy Coprocessor and Its Application to Adaptive Filtering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan

    1997-01-01

    This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of the membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55 µs with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an on-chip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squared adaptive algorithm for adjusting the knowledge rule-base.
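
    The storage trick is easy to illustrate: a symmetric triangular membership function is fully described by its two end-points, so membership values can be computed on the fly rather than stored as samples, and fuzzy singleton consequents reduce centroid defuzzification to a weighted average. The membership functions and rule outputs below are invented for the example.

      def tri_membership(x, left, right):
          """Symmetric triangular MF stored by its two end-points only."""
          if not left <= x <= right:
              return 0.0
          center, half = (left + right) / 2.0, (right - left) / 2.0
          return 1.0 - abs(x - center) / half

      def defuzzify(rule_firings):
          """Fuzzy singleton consequents: centroid reduces to a weighted mean."""
          num = sum(w * s for w, s in rule_firings)
          den = sum(w for w, _ in rule_firings)
          return num / den if den else 0.0

      w_cold = tri_membership(18.0, 10.0, 22.0)   # invented temperature MFs
      w_warm = tri_membership(18.0, 16.0, 28.0)
      print(defuzzify([(w_cold, 0.2), (w_warm, 0.7)]))  # singleton outputs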

  14. Optimal updating magnitude in adaptive flat-distribution sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Drake, Justin A.; Ma, Jianpeng; Pettitt, B. Montgomery

    2017-11-01

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
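
    A hedged sketch of single-bin flat-distribution updating with the inverse-time schedule, on a toy one-dimensional landscape: the bias is lowered at each visited bin by an updating magnitude that decays as n_bins/t, and at convergence it reproduces the energy profile up to an additive constant, flattening the visit histogram. The toy energy and schedule constants are our assumptions.

      import numpy as np

      rng = np.random.default_rng(0)
      n_bins, f0, t_total = 20, 0.5, 200000
      energy = ((np.arange(n_bins) - n_bins / 2) ** 2) / 20.0  # toy landscape
      bias, state = np.zeros(n_bins), 0
      for t in range(1, t_total + 1):
          prop = (state + rng.choice((-1, 1))) % n_bins
          # Metropolis step targeting exp(bias - energy); flat once bias ~ energy
          d = (energy[state] - bias[state]) - (energy[prop] - bias[prop])
          if np.log(rng.random()) < d:
              state = prop
          # Single-bin update with the inverse-time magnitude schedule
          bias[state] -= min(f0, n_bins / t)
      bias -= bias.min()
      print(np.round(bias[:5], 1))                  # ~ energy[:5], up to noise
      print(np.round(energy[:5] - energy.min(), 1))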

  15. Optimal updating magnitude in adaptive flat-distribution sampling.

    PubMed

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.

  16. An Adaptive Critic Approach to Reference Model Adaptation

    NASA Technical Reports Server (NTRS)

    Krishnakumar, K.; Limes, G.; Gundy-Burlet, K.; Bryant, D.

    2003-01-01

    Neural networks have been successfully used for implementing control architectures for different applications. In this work, we examine a neural network augmented adaptive critic as a Level 2 intelligent controller for a C-17 aircraft. This intelligent control architecture utilizes an adaptive critic to tune the parameters of a reference model, which is then used to define the angular rate command for a Level 1 intelligent controller. The present architecture is implemented on a high-fidelity non-linear model of a C-17 aircraft. The goal of this research is to improve the performance of the C-17 under degraded conditions such as control failures and battle damage. Pilot ratings using a motion based simulation facility are included in this paper. The benefits of using an adaptive critic are documented using time response comparisons for severe damage situations.

  17. Game-theoretic approach to joint transmitter adaptation and power control in wireless systems.

    PubMed

    Popescu, Dimitrie C; Rawat, Danda B; Popescu, Otilia; Saquib, Mohamad

    2010-06-01

    Game theory has emerged as a new mathematical tool in the analysis and design of wireless communication systems, being particularly useful in studying the interactions among adaptive transmitters that attempt to achieve specific objectives without cooperation. In this paper, we present a game-theoretic approach to the problem of joint transmitter adaptation and power control in wireless systems, where users' transmissions are subject to quality-of-service requirements specified in terms of target signal-to-interference-plus-noise ratios (SINRs) and nonideal vector channels between transmitters and receivers are explicitly considered. Our approach is based on application of separable games, which are a specific class of noncooperative games where the players' cost is a separable function of their strategic choices. We formally state a joint codeword and power adaptation game, which is separable, and we study its properties in terms of its subgames, namely, the codeword adaptation subgame and the power adaptation subgame. We investigate the necessary conditions for an optimal Nash equilibrium and show that this corresponds to an ensemble of user codewords and powers, which maximizes the sum capacity of the corresponding multiaccess vector channel model, and for which the specified target SINRs are achieved with minimum transmitted power.
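
    Only the power-adaptation subgame is sketched below, via the classic best-response iteration p_i <- (target_SINR / current_SINR) * p_i, which drives every user to its target SINR with minimum power when the targets are feasible; the gain matrix is invented, and the codeword-adaptation subgame is not shown.

      import numpy as np

      G = np.array([[1.00, 0.20, 0.10],   # G[i, j]: gain from user j to receiver i
                    [0.15, 1.00, 0.20],
                    [0.10, 0.25, 1.00]])
      noise, target = 0.01, 2.0
      p = np.ones(3)
      for _ in range(100):
          interference = G @ p - np.diag(G) * p + noise
          sinr = np.diag(G) * p / interference
          p = (target / sinr) * p     # each user's best response to the others
      print(np.round(p, 4))           # minimum powers achieving the target SINRs
      print(np.round(np.diag(G) * p / (G @ p - np.diag(G) * p + noise), 3))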

  18. Adaptation of a weighted regression approach to evaluate water quality trends in anestuary

    EPA Science Inventory

    To improve the description of long-term changes in water quality, a weighted regression approach developed to describe trends in pollutant transport in rivers was adapted to analyze a long-term water quality dataset from Tampa Bay, Florida. The weighted regression approach allows...

  19. AN OPTIMAL ADAPTIVE LOCAL GRID REFINEMENT APPROACH TO MODELING CONTAMINANT TRANSPORT

    EPA Science Inventory

    A Lagrangian-Eulerian method with an optimal adaptive local grid refinement is used to model contaminant transport equations. pplication of this approach to two bench-mark problems indicates that it completely resolves difficulties of peak clipping, numerical diffusion, and spuri...

  20. The adaptive, cut-cell Cartesian approach (warts and all)

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.

    1995-01-01

    Solution-adaptive methods based on cutting bodies out of Cartesian grids are gaining popularity now that the ways of circumventing the accuracy problems associated with small cut cells have been developed. Researchers are applying Cartesian-based schemes to a broad class of problems now, and, although there is still development work to be done, it is becoming clearer which problems are best suited to the approach (and which are not). The purpose of this paper is to give a candid assessment, based on applying Cartesian schemes to a variety of problems, of the strengths and weaknesses of the approach as it is currently implemented.

  1. Variable Neural Adaptive Robust Control: A Switched System Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Jianming; Hu, Jianghai; Zak, Stanislaw H.

    2015-05-01

    Variable neural adaptive robust control strategies are proposed for the output tracking control of a class of multi-input multi-output uncertain systems. The controllers incorporate a variable-structure radial basis function (RBF) network as the self-organizing approximator for unknown system dynamics. The variable-structure RBF network solves the problem of structure determination associated with fixed-structure RBF networks. It can determine the network structure on-line dynamically by adding or removing radial basis functions according to the tracking performance. The structure variation is taken into account in the stability analysis of the closed-loop system using a switched system approach with the aid of the piecewise quadratic Lyapunov function. The performance of the proposed variable neural adaptive robust controllers is illustrated with simulations.
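
    A resource-allocating-style sketch of the growth rule (our simplification): a new radial basis function is added when the tracking error is large and the input is far from existing centers, and otherwise the existing weights adapt by LMS. The paper's removal rule and Lyapunov-based stability analysis are omitted.

      import numpy as np

      class VarRBF:
          def __init__(self, width=0.3, err_add=0.2, dist_add=0.25, lr=0.2):
              self.width, self.err_add = width, err_add
              self.dist_add, self.lr = dist_add, lr
              self.centers, self.weights = [], []

          def _phi(self, x):
              return np.array([np.exp(-np.sum((x - c) ** 2) / self.width ** 2)
                               for c in self.centers])

          def predict(self, x):
              if not self.centers:
                  return 0.0
              return float(self._phi(x) @ np.array(self.weights))

          def update(self, x, d):
              e = d - self.predict(x)
              dists = [np.linalg.norm(x - c) for c in self.centers]
              if abs(e) > self.err_add and (not dists or min(dists) > self.dist_add):
                  self.centers.append(np.asarray(x, float))   # grow the structure
                  self.weights.append(e)
              elif self.centers:
                  phi = self._phi(x)
                  self.weights = list(np.array(self.weights) + self.lr * e * phi)

      net = VarRBF()
      rng = np.random.default_rng(0)
      for _ in range(3000):
          x = rng.uniform(-1, 1, size=1)
          net.update(x, np.sin(2 * np.pi * x[0]))
      print(len(net.centers), "basis functions allocated")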

  2. Adaptive single-pixel imaging with aggregated sampling and continuous differential measurements

    NASA Astrophysics Data System (ADS)

    Huo, Yaoran; He, Hongjie; Chen, Fan; Tai, Heng-Ming

    2018-06-01

    This paper proposes an adaptive compressive imaging technique with a single-pixel detector and a single arm. The aggregated sampling (AS) method enables reconstruction of the images at reduced resolutions, with the aim of lowering time and space consumption. The target image with a resolution up to 1024 × 1024 can be reconstructed successfully at a 20% sampling rate. The continuous differential measurement (CDM) method combined with a ratio factor of significant coefficient (RFSC) improves the imaging quality. Moreover, RFSC reduces the human intervention in parameter setting. This technique enhances the practicability of single-pixel imaging through lower time and space consumption, better imaging quality, and less human intervention.

  3. Adaptive windowing and windowless approaches to estimate dynamic functional brain connectivity

    NASA Astrophysics Data System (ADS)

    Yaesoubi, Maziar; Calhoun, Vince D.

    2017-08-01

    In this work, we discuss estimation of dynamic dependence of a multi-variate signal. Commonly used approaches are often based on a locality assumption (e.g. sliding-window) which can miss spontaneous changes due to blurring with local but unrelated changes. We discuss recent approaches to overcome this limitation including 1) a wavelet-space approach, essentially adapting the window to the underlying frequency content and 2) a sparse signal-representation which removes any locality assumption. The latter is especially useful when there is no prior knowledge of the validity of such assumption as in brain-analysis. Results on several large resting-fMRI data sets highlight the potential of these approaches.
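
    For reference, the locality-based baseline that the abstract critiques takes only a few lines: slide a fixed window along the multivariate time series and compute one correlation matrix per position. The sketch assumes a (time, regions) array and an arbitrary window length; the wavelet-space and sparse-representation alternatives are considerably more involved and are not shown.

```python
import numpy as np

def sliding_window_fc(ts, win=30):
    """Sliding-window dynamic functional connectivity.

    ts: (time, regions) array of time courses.
    Returns a (windows, regions, regions) stack of correlation matrices.
    """
    t = ts.shape[0]
    return np.stack([np.corrcoef(ts[s:s + win], rowvar=False)
                     for s in range(t - win + 1)])
```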

  4. Adaptive Water Sampling based on Unsupervised Clustering

    NASA Astrophysics Data System (ADS)

    Py, F.; Ryan, J.; Rajan, K.; Sherman, A.; Bird, L.; Fox, M.; Long, D.

    2007-12-01

    Autonomous Underwater Vehicles (AUVs) are widely used for oceanographic surveys, during which data is collected from a number of on-board sensors. Engineers and scientists at MBARI have extended this approach by developing a water sampler specially for the AUV, which can sample a specific patch of water at a specific time. The sampler, named the Gulper, captures 2 liters of seawater in less than 2 seconds on a 21" MBARI Odyssey AUV. Each sample chamber of the Gulper is filled with seawater through a one-way valve, which protrudes through the fairing of the AUV. This new kind of device raises a new problem: when to trigger the Gulper autonomously? For example, scientists interested in studying the mobilization and transport of shelf sediments would like to detect intermediate nepheloïd layers (INLs). To be able to detect this phenomenon we need to extract a model based on AUV sensors that can detect this feature in-situ. The formation of such a model is not obvious, as identification of this feature is generally based on data from multiple sensors. We have developed an unsupervised data clustering technique to extract the different features, which will then be used for on-board classification and triggering of the Gulper. We use a three-phase approach: 1) use data from past missions to learn the different classes of data from sensor inputs; the clustering algorithm will then extract the set of features that can be distinguished within this large data set. 2) Scientists on shore then identify these features and point out which correspond to those of interest (e.g. nepheloïd layer, upwelling material, etc.). 3) Embed the corresponding classifier into the AUV control system to indicate the most probable feature of the water depending on sensory input. The triggering algorithm looks at this result and triggers the Gulper if the classifier indicates that we are within the feature of interest with a predetermined threshold of confidence. We have deployed this method of online
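
    A minimal sketch of the three-phase pipeline, with a Gaussian mixture standing in for the unsupervised clustering step: learn clusters from archived sensor data, let scientists label the component of interest on shore, then trigger when on-board classification is confident enough. The file name, component count, labeled index, and confidence threshold are all hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Phase 1: learn feature classes from past-mission sensor vectors
# ("past_missions.npy" is a hypothetical archive, one row per sample).
past = np.load("past_missions.npy")
model = GaussianMixture(n_components=4, random_state=0).fit(past)

# Phase 2: on shore, scientists label the cluster matching the INL (assumed index).
INL_COMPONENT = 2

def should_trigger(sensor_vector, confidence=0.9):
    """Phase 3: fire the Gulper when the classifier places the current
    water mass in the feature of interest with enough confidence."""
    prob = model.predict_proba(np.atleast_2d(sensor_vector))[0, INL_COMPONENT]
    return prob >= confidence
```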

  5. A problem-oriented approach to understanding adaptation: lessons learnt from Alpine Shire, Victoria Australia.

    NASA Astrophysics Data System (ADS)

    Roman, Carolina

    2010-05-01

    Climate change is gaining attention as a significant strategic issue for localities that rely on their business sectors for economic viability. For businesses in the tourism sector, considerable research effort has sought to characterise vulnerability to the likely impacts of future climate change through scenarios or 'end-point' approaches (Kelly & Adger, 2000). Whilst useful, there are few demonstrable case studies that complement such work with a 'start-point' approach that seeks to explore contextual vulnerability (O'Brien et al., 2007). This broader approach is inclusive of climate change as a process operating within a biophysical system and allows recognition of the complex interactions that occur in the coupled human-environmental system. A problem-oriented and interdisciplinary approach was employed at Alpine Shire, in northeast Victoria, Australia, to explore the concept of contextual vulnerability and adaptability to stressors that include, but are not limited to, climatic change. Using a policy sciences approach, the objective was to identify factors that influence existing vulnerabilities and that might consequently act as barriers to effective adaptation for the Shire's business community involved in the tourism sector. Analyses of results suggest that many threats, including the effects of climate change, compete for the resources, strategy and direction of local tourism management bodies. Further analysis of conditioning factors revealed that many complex and interacting factors define the vulnerability and adaptive capacity of the Shire's tourism sector to the challenges of global change, which collectively have more immediate implications for policy and planning than long-term future climate change scenarios. An approximation of the common interest, i.e. enhancing capacity in business acumen amongst tourism operators, would facilitate adaptability and sustainability through the enhancement of social capital in this business community. Kelly, P

  6. The choice of sample size: a mixed Bayesian / frequentist approach.

    PubMed

    Pezeshk, Hamid; Nematollahi, Nader; Maroufy, Vahed; Gittins, John

    2009-04-01

    Sample size computations are largely based on frequentist or classical methods. In the Bayesian approach the prior information on the unknown parameters is taken into account. In this work we consider a fully Bayesian approach to the sample size determination problem which was introduced by Grundy et al. and developed by Lindley. This approach treats the problem as a decision problem and employs a utility function to find the optimal sample size of a trial. Furthermore, we assume that a regulatory authority, which is deciding on whether or not to grant a licence to a new treatment, uses a frequentist approach. We then find the optimal sample size for the trial by maximising the expected net benefit, which is the expected benefit of subsequent use of the new treatment minus the cost of the trial.
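
    A toy version of the decision-theoretic calculation, assuming the regulator applies a one-sided z-test and treating the treatment effect as known: choose the n that maximizes expected net benefit, i.e. the benefit of subsequent use weighted by the approval probability, minus the trial cost. The fully Bayesian formulation would instead integrate the approval probability over a prior on the effect; all numbers below are placeholders.

```python
import numpy as np
from scipy import stats

def approval_power(effect, sigma, n, alpha=0.025):
    """Probability that the regulator's one-sided z-test succeeds with n subjects."""
    z = stats.norm.ppf(1 - alpha)
    return 1 - stats.norm.cdf(z - effect * np.sqrt(n) / sigma)

def optimal_sample_size(effect, sigma, benefit, cost_per_subject):
    """n maximizing: benefit * P(licence granted) - cost of the trial."""
    return max(range(10, 5001),
               key=lambda n: benefit * approval_power(effect, sigma, n)
                             - cost_per_subject * n)

# e.g. optimal_sample_size(effect=0.3, sigma=1.0, benefit=1e6, cost_per_subject=500)
```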

  7. EXSPRT: An Expert Systems Approach to Computer-Based Adaptive Testing.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    Expert systems can be used to aid decision making. A computerized adaptive test (CAT) is one kind of expert system, although it is not commonly recognized as such. A new approach, termed EXSPRT, was devised that combines expert systems reasoning and sequential probability ratio test stopping rules. EXSPRT-R uses random selection of test items,…

  8. Iterative learning-based decentralized adaptive tracker for large-scale systems: a digital redesign approach.

    PubMed

    Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua

    2011-07-01

    In this paper, a digital redesign methodology of the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not be presented by the analytic reference model initially. To overcome the interference of each sub-system and simplify the controller design, the proposed model reference decentralized adaptive control scheme constructs a decoupled well-designed reference model first. Then, according to the well-designed model, this paper develops a digital decentralized adaptive tracker based on the optimal analog control and prediction-based digital redesign technique for the sampled-data large-scale coupling system. In order to enhance the tracking performance of the digital tracker at specified sampling instants, we apply the iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. Besides, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
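
    The trial-to-trial learning step at the core of the scheme is compact enough to sketch: a generic P-type ILC update, u_{k+1} = u_k + L * e_k, replayed against a toy plant. It omits the paper's decentralized digital redesign, prediction-based tracker, and evolutionary search for the learning gain; the plant and all gains are illustrative.

```python
import numpy as np

def run_ilc(plant, u, y_ref, trials=20, gain=0.5):
    """P-type iterative learning control: u_{k+1} = u_k + L * e_k.
    Each pass replays the whole trajectory and learns from its error."""
    for _ in range(trials):
        e = y_ref - plant(u)   # tracking error of the completed trial
        u = u + gain * e       # refine the stored input profile
    return u

def toy_plant(u, a=0.8, b=0.2):
    """First-order sampled-data lag, purely illustrative."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + b * u[k - 1]
    return y

# u_star = run_ilc(toy_plant, np.zeros(100), y_ref=np.ones(100))
```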

  9. Accelerated Adaptive Integration Method

    PubMed Central

    2015-01-01

    Conformational changes that occur upon ligand binding may be too slow to observe on the time scales routinely accessible using molecular dynamics simulations. The adaptive integration method (AIM) leverages the notion that when a ligand is either fully coupled or decoupled, according to λ, barrier heights may change, making some conformational transitions more accessible at certain λ values. AIM adaptively changes the value of λ in a single simulation so that conformations sampled at one value of λ seed the conformational space sampled at another λ value. Adapting the value of λ throughout a simulation, however, does not resolve issues in sampling when barriers remain high regardless of the λ value. In this work, we introduce a new method, called Accelerated AIM (AcclAIM), in which the potential energy function is flattened at intermediate values of λ, promoting the exploration of conformational space as the ligand is decoupled from its receptor. We show, with both a simple model system (Bromocyclohexane) and the more complex biomolecule Thrombin, that AcclAIM is a promising approach to overcome high barriers in the calculation of free energies, without the need for any statistical reweighting or additional processors. PMID:24780083

  10. A sub-sampled approach to extremely low-dose STEM

    DOE PAGES

    Stevens, A.; Luzi, L.; Yang, H.; ...

    2018-01-22

    The inpainting of deliberately and randomly sub-sampled images offers a potential means to image specimens at high resolution and under extremely low-dose conditions (≤ 1 e⁻/Å²) using a scanning transmission electron microscope. We show that deliberate sub-sampling acquires images at least an order of magnitude faster than conventional low-dose methods for an equivalent electron dose. More importantly, when adaptive sub-sampling is implemented to acquire the images, there is a significant increase in the resolution and sensitivity which accompanies the increase in imaging speed. Lastly, we demonstrate the potential of this method for beam-sensitive materials and in-situ observations by experimentally imaging the node distribution in a metal-organic framework.

  12. Study of a MEMS-based Shack-Hartmann wavefront sensor with adjustable pupil sampling for astronomical adaptive optics.

    PubMed

    Baranec, Christoph; Dekany, Richard

    2008-10-01

    We introduce a Shack-Hartmann wavefront sensor for adaptive optics that enables dynamic control of the spatial sampling of an incoming wavefront using a segmented mirror microelectromechanical systems (MEMS) device. Unlike a conventional lenslet array, subapertures are defined by either segments or groups of segments of a mirror array, with the ability to change spatial pupil sampling arbitrarily by redefining the segment grouping. Control over the spatial sampling of the wavefront allows for the minimization of wavefront reconstruction error for different guide source intensities and different atmospheric conditions, which in turn maximizes an adaptive optics system's delivered Strehl ratio. Requirements for the MEMS devices needed in this Shack-Hartmann wavefront sensor are also presented.

  13. Genetics of climate change adaptation.

    PubMed

    Franks, Steven J; Hoffmann, Ary A

    2012-01-01

    The rapid rate of current global climate change is having strong effects on many species and, at least in some cases, is driving evolution, particularly when changes in conditions alter patterns of selection. Climate change thus provides an opportunity for the study of the genetic basis of adaptation. Such studies include a variety of observational and experimental approaches, such as sampling across clines, artificial evolution experiments, and resurrection studies. These approaches can be combined with a number of techniques in genetics and genomics, including association and mapping analyses, genome scans, and transcription profiling. Recent research has revealed a number of candidate genes potentially involved in climate change adaptation and has also illustrated that genetic regulatory networks and epigenetic effects may be particularly relevant for evolution driven by climate change. Although genetic and genomic data are rapidly accumulating, we still have much to learn about the genetic architecture of climate change adaptation.

  14. Pupil-segmentation-based adaptive optics for microscopy

    NASA Astrophysics Data System (ADS)

    Ji, Na; Milkie, Daniel E.; Betzig, Eric

    2011-03-01

    Inhomogeneous optical properties of biological samples make it difficult to obtain diffraction-limited resolution in depth. Correcting sample-induced optical aberrations requires adaptive optics (AO). However, the direct wavefront-sensing approach commonly used in astronomy is not suitable for most biological samples because they strongly scatter light. We developed an image-based AO approach that is insensitive to sample scattering. By comparing images of the sample taken with different segments of the pupil illuminated, local tilt in the wavefront is measured from image shift. The aberrated wavefront is then obtained either by measuring the local phase directly using interference or with phase reconstruction algorithms similar to those used in astronomical AO. We implemented this pupil-segmentation-based approach in a two-photon fluorescence microscope and demonstrated that diffraction-limited resolution can be recovered from nonbiological and biological samples.
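
    The image-shift measurement underlying this approach can be sketched with standard phase correlation: compare the image formed through one illuminated pupil segment against a reference and read the shift, and hence the local wavefront tilt, off the correlation peak. This is a generic implementation of the idea, not the authors' code; both inputs are assumed to be same-shape 2-D arrays.

```python
import numpy as np

def segment_shift(img_ref, img_seg):
    """Integer-pixel image shift between reference and single-segment images
    via phase correlation; the shift is proportional to the local tilt."""
    f = np.fft.fft2(img_ref) * np.conj(np.fft.fft2(img_seg))
    corr = np.abs(np.fft.ifft2(f / (np.abs(f) + 1e-12)))   # phase correlation
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    shape = np.array(corr.shape)
    return (peak + shape // 2) % shape - shape // 2        # wrap to signed shift
```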

  15. Ensemble based adaptive over-sampling method for imbalanced data learning in computer aided detection of microaneurysm.

    PubMed

    Ren, Fulong; Cao, Peng; Li, Wei; Zhao, Dazhe; Zaiane, Osmar

    2017-01-01

    Diabetic retinopathy (DR) is a progressive disease, and its detection at an early stage is crucial for saving a patient's vision. An automated screening system for DR can help reduce the chances of complete blindness due to DR while lowering the workload on ophthalmologists. Among the earliest signs of DR are microaneurysms (MAs). However, current schemes for MA detection appear to report many false positives because detection algorithms have high sensitivity. Inevitably some non-MA structures are labeled as MAs in the initial MA identification step. This is a typical "class imbalance problem". Class-imbalanced data has detrimental effects on the performance of conventional classifiers. In this work, we propose an ensemble-based adaptive over-sampling algorithm for overcoming the class imbalance problem in the false positive reduction step, and we use Boosting, Bagging, and Random Subspace as the ensemble frameworks to improve microaneurysm detection. The proposed ensemble-based over-sampling methods combine the strengths of adaptive over-sampling and ensembles. The objective of this amalgamation is to reduce the induction bias introduced by imbalanced data and to enhance the generalization performance of extreme learning machines (ELM). Experimental results show that our ASOBoost method has higher area under the ROC curve (AUC) and G-mean values than many existing class imbalance learning methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
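
    The over-sampling ingredient can be illustrated with a minimal SMOTE-style synthesizer: new minority (MA) samples are interpolated between existing minority neighbours before training the ensemble. This generic sketch is a stand-in for the paper's adaptive over-sampling, not the ASOBoost algorithm itself, and k and the random seed are arbitrary.

```python
import numpy as np

def smote_like(X_min, n_new, k=5, rng=np.random.default_rng(0)):
    """Synthesize n_new minority samples by interpolating between each chosen
    minority point and one of its k nearest minority neighbours."""
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        j = rng.choice(np.argsort(d)[1:k + 1])        # skip the point itself
        out.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
    return np.array(out)
```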

  16. A comparison of adaptive sampling designs and binary spatial models: A simulation study using a census of Bromus inermis

    USGS Publications Warehouse

    Irvine, Kathryn M.; Thornton, Jamie; Backus, Vickie M.; Hohmann, Matthew G.; Lehnhoff, Erik A.; Maxwell, Bruce D.; Michels, Kurt; Rew, Lisa

    2013-01-01

    Commonly in environmental and ecological studies, species distribution data are recorded as presence or absence throughout a spatial domain of interest. Field based studies typically collect observations by sampling a subset of the spatial domain. We consider the effects of six different adaptive and two non-adaptive sampling designs and choice of three binary models on both predictions to unsampled locations and parameter estimation of the regression coefficients (species–environment relationships). Our simulation study is unique compared to others to date in that we virtually sample a true known spatial distribution of a nonindigenous plant species, Bromus inermis. The census of B. inermis provides a good example of a species distribution that is both sparsely (1.9 % prevalence) and patchily distributed. We find that modeling the spatial correlation using a random effect with an intrinsic Gaussian conditionally autoregressive prior distribution was equivalent or superior to Bayesian autologistic regression in terms of predicting to un-sampled areas when strip adaptive cluster sampling was used to survey B. inermis. However, inferences about the relationships between B. inermis presence and environmental predictors differed between the two spatial binary models. The strip adaptive cluster designs we investigate provided a significant advantage in terms of Markov chain Monte Carlo chain convergence when trying to model a sparsely distributed species across a large area. In general, there was little difference in the choice of neighborhood, although the adaptive king was preferred when transects were randomly placed throughout the spatial domain.

  17. Adapting to Uncertainty: Comparing Methodological Approaches to Climate Adaptation and Mitigation Policy

    NASA Astrophysics Data System (ADS)

    Huda, J.; Kauneckis, D. L.

    2013-12-01

    Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.

  18. A Direct Adaptive Control Approach in the Presence of Model Mismatch

    NASA Technical Reports Server (NTRS)

    Joshi, Suresh M.; Tao, Gang; Khong, Thuan

    2009-01-01

    This paper considers the problem of direct model reference adaptive control when the plant-model matching conditions are violated due to abnormal changes in the plant or incorrect knowledge of the plant's mathematical structure. The approach consists of direct adaptation of state feedback gains for state tracking, and simultaneous estimation of the plant-model mismatch. Because of the mismatch, the plant can no longer track the state of the original reference model, but may be able to track a new reference model that still provides satisfactory performance. The reference model is updated if the estimated plant-model mismatch exceeds a bound that is determined via robust stability and/or performance criteria. The resulting controller is a hybrid direct-indirect adaptive controller that offers asymptotic state tracking in the presence of plant-model mismatch as well as parameter deviations.

  19. Kinetic Boltzmann approach adapted for modeling highly ionized matter created by x-ray irradiation of a solid.

    PubMed

    Ziaja, Beata; Saxena, Vikrant; Son, Sang-Kil; Medvedev, Nikita; Barbrel, Benjamin; Woloncewicz, Bianca; Stransky, Michal

    2016-05-01

    We report on the kinetic Boltzmann approach adapted for simulations of highly ionized matter created from a solid by its x-ray irradiation. X rays can excite inner-shell electrons, which leads to the creation of deeply lying core holes. Their relaxation, especially in heavier elements, can take complicated paths, leading to a large number of active configurations. Their number can be so large that solving the set of respective evolution equations becomes computationally inefficient and another modeling approach should be used instead. To circumvent this complexity, the commonly used continuum models employ a superconfiguration scheme. Here, we propose an alternative approach which still uses "true" atomic configurations but limits their number by restricting the sample relaxation to the predominant relaxation paths. We test its reliability, performing respective calculations for a bulk material consisting of light atoms and comparing the results with a full calculation including all relaxation paths. Prospective application for heavy elements is discussed.

  20. Adaptive decision making in a dynamic environment: a test of a sequential sampling model of relative judgment.

    PubMed

    Vuckovic, Anita; Kwantes, Peter J; Neal, Andrew

    2013-09-01

    Research has identified a wide range of factors that influence performance in relative judgment tasks. However, the findings from this research have been inconsistent. Studies have varied with respect to the identification of causal variables and the perceptual and decision-making mechanisms underlying performance. Drawing on the ecological rationality approach, we present a theory of the judgment and decision-making processes involved in a relative judgment task that explains how people judge a stimulus and adapt their decision process to accommodate their own uncertainty associated with those judgments. Undergraduate participants performed a simulated air traffic control conflict detection task. Across two experiments, we systematically manipulated variables known to affect performance. In the first experiment, we manipulated the relative distances of aircraft to a common destination while holding aircraft speeds constant. In a follow-up experiment, we introduced a direct manipulation of relative speed. We then fit a sequential sampling model to the data, and used the best fitting parameters to infer the decision-making processes responsible for performance. Findings were consistent with the theory that people adapt to their own uncertainty by adjusting their criterion and the amount of time they take to collect evidence in order to make a more accurate decision. From a practical perspective, the paper demonstrates that one can use a sequential sampling model to understand performance in a dynamic environment, allowing one to make sense of and interpret complex patterns of empirical findings that would otherwise be difficult to interpret using standard statistical analyses. PsycINFO Database Record (c) 2013 APA, all rights reserved.
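
    A minimal version of such a model is easy to state: evidence accrues noisily toward one of two response criteria, and raising the criteria trades time for accuracy, which is exactly the adaptation the authors infer from their fits. The sketch below is a generic random-walk (diffusion) trial, not the specific model fitted in the paper; drift, criterion, and noise values are placeholders.

```python
import numpy as np

def relative_judgment_trial(drift, criterion, dt=0.01, noise=1.0,
                            rng=np.random.default_rng()):
    """Accumulate noisy evidence until a +/- criterion is crossed.
    Returns the choice and the decision time."""
    x, t = 0.0, 0.0
    while abs(x) < criterion:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("first closer" if x > 0 else "second closer", t)
```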

  1. Adaptation of a Weighted Regression Approach to Evaluate Water Quality Trends in an Estuary

    EPA Science Inventory

    To improve the description of long-term changes in water quality, we adapted a weighted regression approach to analyze a long-term water quality dataset from Tampa Bay, Florida. The weighted regression approach, originally developed to resolve pollutant transport trends in rivers...

  2. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, which is commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1. qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that
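
    The all-negatives confidence statement has a familiar textbook core, sketched below under strong simplifying assumptions: independent random samples, FNR = 0, and a fixed probability that any one sample strikes contamination. The report's actual hotspot and CJR formulas are more elaborate.

```python
import math

def n_for_clearance(confidence=0.95, hit_prob=0.01):
    """Samples needed so that all-negative results imply the stated confidence
    that the area is uncontaminated, assuming independence and FNR = 0:
    solve (1 - hit_prob) ** n <= 1 - confidence for n."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - hit_prob))

# n_for_clearance(0.95, 0.01) -> 299 samples
```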

  3. Adaptation strategies and approaches: Chapter 2

    Treesearch

    Patricia Butler; Chris Swanston; Maria Janowiak; Linda Parker; Matt St. Pierre; Leslie Brandt

    2012-01-01

    A wealth of information is available on climate change adaptation, but much of it is very broad and of limited use at the finer spatial scales most relevant to land managers. This chapter contains a "menu" of adaptation actions and provides land managers in northern Wisconsin with a range of options to help forest ecosystems adapt to climate change impacts....

  4. A multi-band environment-adaptive approach to noise suppression for cochlear implants.

    PubMed

    Saki, Fatemeh; Mirzahasanloo, Taher; Kehtarnavaz, Nasser

    2014-01-01

    This paper presents an improved environment-adaptive noise suppression solution for the cochlear implant speech processing pipeline. This improvement is achieved by using a multi-band data-driven approach in place of a previously developed single-band data-driven approach. Seven commonly encountered noisy environments of street, car, restaurant, mall, bus, pub and train are considered to quantify the improvement. The results obtained indicate about a 10% improvement in speech quality measures.

  5. Book review: Decision making in natural resource management: A structured adaptive approach

    USGS Publications Warehouse

    Fuller, Angela K.

    2014-01-01

    No abstract available. Book information: Decision Making in Natural Resource Management: A Structured Adaptive Approach. Michael J. Conroy and James T. Peterson, 2013. Wiley-Blackwell, Oxford, UK. 456 pp. $99.95 paperback. ISBN: 978-0-470-67174-0.

  6. Adaptive leadership and person-centered care: a new approach to solving problems.

    PubMed

    Corazzini, Kirsten N; Anderson, Ruth A

    2014-01-01

    Successfully transitioning to person-centered care in nursing homes requires a new approach to solving care issues. The adaptive leadership framework suggests that expert providers must support frontline caregivers in their efforts to develop high-quality, person-centered solutions.

  7. Calibration model maintenance in melamine resin production: Integrating drift detection, smart sample selection and model adaptation.

    PubMed

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Cernuda, Carlos; Reischer, Thomas; Kantner, Wolfgang; Pawliczek, Marcin; Brandstetter, Markus

    2018-07-12

    The physico-chemical properties of Melamine Formaldehyde (MF) based thermosets are largely influenced by the degree of polymerization (DP) in the underlying resin. On-line supervision of the turbidity point by means of vibrational spectroscopy has recently emerged as a promising technique to monitor the DP of MF resins. However, spectroscopic determination of the DP relies on chemometric models, which are usually sensitive to drifts caused by instrumental and/or sample-associated changes occurring over time. In order to detect the time point when drifts start causing prediction bias, we here explore a universal drift detector based on a faded version of the Page-Hinkley (PH) statistic, which we test in three data streams from an industrial MF resin production process. We employ committee disagreement (CD), computed as the variance of model predictions from an ensemble of partial least squares (PLS) models, as a measure for sample-wise prediction uncertainty and use the PH statistic to detect changes in this quantity. We further explore supervised and unsupervised strategies for (semi-)automatic model adaptation upon detection of a drift. For the former, manual reference measurements are requested whenever statistical thresholds on Hotelling's T² and/or Q-Residuals are violated. Models are subsequently re-calibrated using weighted partial least squares in order to increase the influence of newer samples, which increases the flexibility when adapting to new (drifted) states. Unsupervised model adaptation is carried out exploiting the dual antecedent-consequent structure of a recently developed fuzzy systems variant of PLS termed FLEXFIS-PLS. In particular, antecedent parts are updated while maintaining the internal structure of the local linear predictors (i.e. the consequents). We found improved drift detection capability of the CD compared to Hotelling's T² and Q-Residuals when used in combination with the proposed PH test. Furthermore, we found that active
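
    The drift-detection building block is small enough to sketch. Below is a standard (unfaded) Page-Hinkley test applied to a stream of committee-disagreement values; the paper uses a faded variant with forgetting, which this plain version omits, and delta and the alarm threshold are placeholders to be tuned.

```python
class PageHinkley:
    """Standard Page-Hinkley test for an upward mean shift in a stream."""

    def __init__(self, delta=0.005, threshold=50.0):
        self.delta, self.threshold = delta, threshold
        self.n, self.mean = 0, 0.0
        self.cum, self.cum_min = 0.0, 0.0

    def update(self, x):
        """Feed one observation (e.g. committee disagreement); True -> drift."""
        self.n += 1
        self.mean += (x - self.mean) / self.n          # running mean
        self.cum += x - self.mean - self.delta         # cumulative deviation
        self.cum_min = min(self.cum_min, self.cum)
        return self.cum - self.cum_min > self.threshold
```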

  8. An adaptive approach to the physical annealing strategy for simulated annealing

    NASA Astrophysics Data System (ADS)

    Hasegawa, M.

    2013-02-01

    A new and reasonable method for adaptive implementation of simulated annealing (SA) is studied on two types of random traveling salesman problems. The idea is based on the previous finding on the search characteristics of the threshold algorithms, that is, the primary role of the relaxation dynamics in their finite-time optimization process. It is shown that the effective temperature for optimization can be predicted from the system's behavior analogous to the stabilization phenomenon occurring in the heating process starting from a quenched solution. The subsequent slow cooling near the predicted point draws out the inherent optimizing ability of finite-time SA in more straightforward manner than the conventional adaptive approach.
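
    For orientation, the sketch below is a plain simulated annealing loop with geometric cooling from a starting temperature t_start. The paper's contribution is precisely an adaptive way to locate that effective temperature, by watching the stabilization that appears when re-heating a quenched solution; that predictive step is not implemented here, and all parameters are illustrative.

```python
import numpy as np

def anneal(cost, neighbor, x0, t_start, t_end=1e-3, steps=100_000,
           rng=np.random.default_rng(0)):
    """Metropolis-style SA with geometric cooling from t_start to t_end."""
    x, fx = x0, cost(x0)
    ratio = (t_end / t_start) ** (1.0 / steps)
    t = t_start
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if fy <= fx or rng.random() < np.exp(-(fy - fx) / t):
            x, fx = y, fy
        t *= ratio
    return x, fx
```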

  9. Adaptive Wing Camber Optimization: A Periodic Perturbation Approach

    NASA Technical Reports Server (NTRS)

    Espana, Martin; Gilyard, Glenn

    1994-01-01

    Available redundancy among aircraft control surfaces allows for effective wing camber modifications. As shown in the past, this fact can be used to improve aircraft performance. To date, however, algorithm developments for in-flight camber optimization have been limited. This paper presents a perturbational approach for cruise optimization through in-flight camber adaptation. The method uses, as a performance index, an indirect measurement of the instantaneous net thrust. As such, the actual performance improvement comes from the integrated effects of airframe and engine. The algorithm, whose design and robustness properties are discussed, is demonstrated on the NASA Dryden B-720 flight simulator.

  10. Phobos Sample Return: Next Approach

    NASA Astrophysics Data System (ADS)

    Zelenyi, Lev; Martynov, Maxim; Zakharov, Alexander; Korablev, Oleg; Ivanov, Alexey; Karabadzak, George

    The Martian moons still remain a mystery after numerous studies by Mars orbiting spacecraft. Their study covers three major topics related to (1) the Solar system in general (formation and evolution, origin of planetary satellites, origin and evolution of life); (2) small bodies (captured asteroid, or remnants of Mars formation, or reaccreted Mars ejecta); (3) Mars (formation and evolution of Mars; Mars ejecta at the satellites). As reviewed by Galimov [2010], most of the above questions require sample return from a Martian moon, while some (e.g. the characterization of the organic matter) could also be answered by in situ experiments. There is the possibility of obtaining a sample of Mars material by sampling Phobos: following Chappaz et al. [2012], a 200-g sample could contain 10⁻⁷ g of Mars surface material launched during the past 1 million years, or 5×10⁻⁵ g of Mars material launched during the past 10 million years, or 5×10¹⁰ individual particles from Mars, quantities suitable for accurate laboratory analyses. The studies of Phobos have been of high priority in the Russian program on planetary research for many years. The Phobos-88 mission consisted of two spacecraft (Phobos-1, Phobos-2) and aimed at an approach to Phobos at 50 m for remote studies, together with the release of small landers (long-living stations DAS). This mission implemented its program only incompletely: information was returned about the Martian environment and atmosphere. The next project, Phobos Sample Return (Phobos-Grunt), initially planned for the early 2000s, was delayed several times owing to budget difficulties; the spacecraft failed to leave NEO in 2011. The recovery of the science goals of this mission and the delivery of samples of Phobos to Earth remain of highest priority for the Russian scientific community. The next Phobos SR mission, named Boomerang, was postponed following the ExoMars cooperation, but is considered the next in the line of planetary exploration, suitable for launch around 2022. A

  11. Development and Climate Change: A Mainstreaming Approach for Assessing Economic, Social, and Environmental Impacts of Adaptation Measures

    NASA Astrophysics Data System (ADS)

    Halsnæs, Kirsten; Trærup, Sara

    2009-05-01

    The paper introduces the so-called climate change mainstreaming approach, where vulnerability and adaptation measures are assessed in the context of general development policy objectives. The approach is based on the application of a limited set of indicators. These indicators are selected as representatives of focal development policy objectives, and a stepwise approach for addressing climate change impacts, development linkages, and the economic, social and environmental dimensions related to vulnerability and adaptation is introduced. Within this context, three case studies illustrate how development policy indicators can be used in practice to assess climate change impacts and adaptation measures: a road project in flood-prone areas of Mozambique, rainwater harvesting in the agricultural sector in Tanzania, and malaria protection in Tanzania. The conclusions of the paper confirm that climate risks can be reduced at relatively low cost, but uncertainty remains about some of the wider development impacts of implementing climate change adaptation measures.

  13. Studying the Global Bifurcation Involving Wada Boundary Metamorphosis by a Method of Generalized Cell Mapping with Sampling-Adaptive Interpolation

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-Ming; Jiang, Jun; Hong, Ling; Tang, Dafeng

    In this paper, a new method of Generalized Cell Mapping with Sampling-Adaptive Interpolation (GCMSAI) is presented in order to enhance the efficiency of the computation of the one-step probability transition matrix of the Generalized Cell Mapping method (GCM). Integrations over one mapping step are replaced by sampling-adaptive interpolations of third order. An explicit formula for the interpolation error is derived, and a sampling-adaptive control switches integrations back on where needed to preserve the accuracy of GCMSAI computations. By applying the proposed method to a two-dimensional forced damped pendulum system, global bifurcations are investigated with observations of boundary metamorphoses, including full-to-partial and partial-to-partial, as well as the birth of a fully Wada boundary. Moreover, GCMSAI requires only one-thirtieth to one-fiftieth of the computational time of the previous GCM.

  14. In search of an adaptive social-ecological approach to understanding a tropical city

    Treesearch

    A.E. Lugo; C.M. Concepcion; L.E. Santiago-Acevedo; T.A. Munoz-Erickson; J.C. Verdejo Ortiz; R. Santiago-Bartolomei; J. Forero-Montana; C.J. Nytch; H. Manrique; W. Colon-Cortes

    2012-01-01

    This essay describes our effort to develop a practical approach to the integration of the social and ecological sciences in the context of a Latin-American city such as San Juan, Puerto Rico. We describe our adaptive social-ecological approach in the historical context of the developing paradigms of the Anthropocene, new integrative social and ecological sciences, and...

  15. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    USGS Publications Warehouse

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.

  16. [Auditory aftereffects with approaching and withdrawing sound sources: dependence on trajectory and domain of presentation of adapting stimuli].

    PubMed

    Malinina, E S; Andreeva, I G

    2013-01-01

    The perceptual peculiarities of approaching and withdrawing sound sources, and their influence on auditory aftereffects, were studied in the free field. The radial movement of the auditory adapting stimuli was imitated by two methods: (1) by oppositely directed simultaneous amplitude changes of wideband signals at two loudspeakers placed at 1.1 and 4.5 m from the listener; (2) by an increase or a decrease of the wideband noise amplitude of the impulses at one of the loudspeakers, whether close or distant. The radial auditory movement of test stimuli was imitated using the first method of imitating adapting-stimulus movement. Nine listeners estimated the direction of test stimuli movement without adaptation (control) and after adaptation. Adapting stimuli were stationary, slowly moving with a sound level variation of 2 dB, or rapidly moving with a variation of 12 dB. The percentage of "withdrawing" responses was used for psychometric curve construction. Three perceptual phenomena were found. The growing-louder effect was shown in control series without adaptation. The effect was characterized by a decrease in the number of "withdrawing" responses and overestimation of test stimuli as approaching. Position-dependent aftereffects were noticed after adaptation to the stationary and slowly moving sound stimuli. The aftereffect was manifested as an increase in the number of "withdrawing" responses and overestimation of test stimuli as withdrawing. The effect was reduced with increasing distance between the listener and the loudspeaker. Movement aftereffects were revealed after adaptation to the rapidly moving stimuli. Aftereffects were direction-dependent: the number of "withdrawal" responses increased after adaptation to approach, whereas after adaptation to withdrawal it decreased relative to control. The movement aftereffects were more pronounced when the movement of adapting stimuli was imitated by the first method. In this case the listener could determine the

  17. Effects of Calibration Sample Size and Item Bank Size on Ability Estimation in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Sahin, Alper; Weiss, David J.

    2015-01-01

    This study aimed to investigate the effects of calibration sample size and item bank size on examinee ability estimation in computerized adaptive testing (CAT). For this purpose, a 500-item bank pre-calibrated using the three-parameter logistic model with 10,000 examinees was simulated. Calibration samples of varying sizes (150, 250, 350, 500,…

  18. Adaptation and Validation of the Sexual Assertiveness Scale (SAS) in a Sample of Male Drug Users.

    PubMed

    Vallejo-Medina, Pablo; Sierra, Juan Carlos

    2015-04-21

    The aim of the present study was to adapt and validate the Sexual Assertiveness Scale (SAS) in a sample of male drug users. A sample of 326 male drug users and 322 non-clinical males was selected by cluster sampling and convenience sampling, respectively. Results showed that the scale had good psychometric properties and adequate internal consistency reliability (Initiation = .66, Refusal = .74 and STD-P = .79). An evaluation of the invariance showed strong factor equivalence between both samples. A high and moderate effect of Differential Item Functioning was only found in items 1 and 14 (Nagelkerke ΔR² = .076 and .037, respectively). We strongly recommend not using item 1 if the goal is to compare the scores of both groups, otherwise the comparison will be biased. Correlations obtained between the CSFQ-14 and the safe sex ratio and the SAS subscales were significant (CI = 95%) and indicated good concurrent validity. Scores of male drug users were similar to those of non-clinical males. Therefore, the adaptation of the SAS to drug users provides enough guarantees for reliable and valid use in both clinical practice and research, although care should be taken with item 1.

  19. Identification of novel serum peptide biomarkers for high-altitude adaptation: a comparative approach

    PubMed Central

    Yang, Juan; Li, Wenhua; Liu, Siyuan; Yuan, Dongya; Guo, Yijiao; Jia, Cheng; Song, Tusheng; Huang, Chen

    2016-01-01

    We aimed to identify serum biomarkers for screening individuals who could adapt to high-altitude hypoxia at sea level. HHA (high-altitude hypoxia acclimated; n = 48) and HHI (high-altitude hypoxia illness; n = 48) groups were distinguished at high altitude, and routine blood tests were performed for both groups at high altitude and at sea level. Serum biomarkers were identified by comparing serum peptidome profiling between HHI and HHA groups collected at sea level. Routine blood tests revealed that the concentrations of hemoglobin and red blood cells were significantly higher in HHI than in HHA at high altitude. Serum peptidome profiling revealed ten significantly differentially expressed peaks between HHA and HHI at sea level. Three potential serum peptide peaks (m/z values: 1061.91, 1088.33, 4057.63) were further sequence-identified as regions of the inter-α trypsin inhibitor heavy chain H4 fragment (ITIH4 347–356), regions of the inter-α trypsin inhibitor heavy chain H1 fragment (ITIH1 205–214), and isoform 1 of fibrinogen α chain precursor (FGA 588–624). Expression of their full proteins was also tested by ELISA in HHA and HHI samples collected at sea level. Our study provides a novel approach for identifying potential biomarkers for screening people at sea level who can adapt to high altitudes. PMID:27150491

  1. POF-Darts: Geometric adaptive sampling for probability of failure

    DOE PAGES

    Ebeida, Mohamed S.; Mitchell, Scott A.; Swiler, Laura P.; ...

    2016-06-18

    We introduce a novel technique, POF-Darts, to estimate the Probability Of Failure based on random disk-packing in the uncertain parameter space. POF-Darts uses hyperplane sampling to explore the unexplored part of the uncertain space. We use the function evaluation at a sample point to determine whether it belongs to failure or non-failure regions, and surround it with a protection sphere region to avoid clustering. We decompose the domain into Voronoi cells around the function evaluations as seeds and choose the radius of the protection sphere depending on the local Lipschitz continuity. As sampling proceeds, regions uncovered with spheres will shrink, improving the estimation accuracy. After exhausting the function evaluation budget, we build a surrogate model using the function evaluations associated with the sample points and estimate the probability of failure by exhaustive sampling of that surrogate. In comparison to other similar methods, our algorithm has the advantages of decoupling the sampling step from the surrogate construction one, the ability to reach target POF values with fewer samples, and the capability of estimating the number and locations of disconnected failure regions, not just the POF value. Furthermore, we present various examples to demonstrate the efficiency of our novel approach.
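
    The last step of the pipeline, estimating POF by exhaustively sampling the cheap surrogate, reduces to a one-line Monte Carlo estimate once the surrogate exists. In the sketch below the surrogate and input sampler are assumed callables and "failure" is taken to mean the response exceeding a limit; the disk-packing sampler and the surrogate construction themselves are not shown.

```python
import numpy as np

def pof_from_surrogate(surrogate, sample_inputs, limit, n=1_000_000,
                       rng=np.random.default_rng(0)):
    """Monte Carlo probability-of-failure estimate on a surrogate model.

    surrogate:     maps an (n, d) array of inputs to n responses (assumed).
    sample_inputs: draws (n, d) points from the uncertain parameters (assumed).
    """
    x = sample_inputs(n, rng)
    return float(np.mean(surrogate(x) > limit))   # failure fraction
```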

  2. Behavior Change Interventions to Improve the Health of Racial and Ethnic Minority Populations: A Tool Kit of Adaptation Approaches

    PubMed Central

    Davidson, Emma M; Liu, Jing Jing; Bhopal, Raj; White, Martin; Johnson, Mark RD; Netto, Gina; Wabnitz, Cecile; Sheikh, Aziz

    2013-01-01

    Context Adapting behavior change interventions to meet the needs of racial and ethnic minority populations has the potential to enhance their effectiveness in the target populations. But because there is little guidance on how best to undertake these adaptations, work in this field has proceeded without any firm foundations. In this article, we present our Tool Kit of Adaptation Approaches as a framework for policymakers, practitioners, and researchers interested in delivering behavior change interventions to ethnically diverse, underserved populations in the United Kingdom. Methods We undertook a mixed-method program of research on interventions for smoking cessation, increasing physical activity, and promoting healthy eating that had been adapted to improve salience and acceptability for African-, Chinese-, and South Asian–origin minority populations. This program included a systematic review (reported using PRISMA criteria), qualitative interviews, and a realist synthesis of data. Findings We compiled a richly informative data set of 161 publications and twenty-six interviews detailing the adaptation of behavior change interventions and the contexts in which they were undertaken. On the basis of these data, we developed our Tool Kit of Adaptation Approaches, which contains (1) a forty-six-item Typology of Adaptation Approaches; (2) a Pathway to Adaptation, which shows how to use the Typology to create a generic behavior change intervention; and (3) RESET, a decision tool that provides practical guidance on which adaptations to use in different contexts. Conclusions Our Tool Kit of Adaptation Approaches provides the first evidence-derived suite of materials to support the development, design, implementation, and reporting of health behavior change interventions for minority groups. The Tool Kit now needs prospective, empirical evaluation in a range of intervention and population settings. PMID:24320170

  3. The Colorado Climate Preparedness Project: A Systematic Approach to Assessing Efforts Supporting State-Level Adaptation

    NASA Astrophysics Data System (ADS)

    Klein, R.; Gordon, E.

    2010-12-01

    Scholars and policy analysts often contend that an effective climate adaptation strategy must entail "mainstreaming," or incorporating responses to possible climate impacts into existing planning and management decision frameworks. Such an approach, however, makes it difficult to assess the degree to which decisionmaking entities are engaging in adaptive activities that may or may not be explicitly framed around a changing climate. For example, a drought management plan may not explicitly address climate change, but the activities and strategies outlined in it may reduce vulnerabilities posed by a variable and changing climate. Consequently, to generate a strategic climate adaptation plan requires identifying the entire suite of activities that are implicitly linked to climate and may affect adaptive capacity within the system. Here we outline a novel, two-pronged approach, leveraging social science methods, to understanding adaptation throughout state government in Colorado. First, we conducted a series of interviews with key actors in state and federal government agencies, non-governmental organizations, universities, and other entities engaged in state issues. The purpose of these interviews was to elicit information about current activities that may affect the state’s adaptive capacity and to identify future climate-related needs across the state. Second, we have developed an interactive database cataloging organizations, products, projects, and people actively engaged in adaptive planning and policymaking that are relevant to the state of Colorado. The database includes a wiki interface, helping create a dynamic component that will enable frequent updating as climate-relevant information emerges. The results of this project are intended to paint a clear picture of sectors and agencies with higher and lower levels of adaptation awareness and to provide a roadmap for the next gubernatorial administration to pursue a more sophisticated climate adaptation agenda

  4. Physician behavioral adaptability: A model to outstrip a "one size fits all" approach.

    PubMed

    Carrard, Valérie; Schmid Mast, Marianne

    2015-10-01

    Based on a literature review, we propose a model of physician behavioral adaptability (PBA) with the goal of inspiring new research. PBA means that the physician adapts his or her behavior according to patients' different preferences. The PBA model shows how physicians infer patients' preferences and adapt their interaction behavior from one patient to the other. We claim that patients will benefit from better outcomes if their physicians show behavioral adaptability rather than a "one size fits all" approach. This literature review is based on a literature search of the PsycINFO® and MEDLINE® databases. The literature review and first results stemming from the authors' research support the validity and viability of parts of the PBA model. There is evidence suggesting that physicians are able to show behavioral flexibility when interacting with their different patients, that a match between patients' preferences and physician behavior is related to better consultation outcomes, and that physician behavioral adaptability is related to better consultation outcomes. Training of physicians' behavioral flexibility and their ability to infer patients' preferences can facilitate physician behavioral adaptability and positive patient outcomes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. A Hybrid Acoustic and Pronunciation Model Adaptation Approach for Non-native Speech Recognition

    NASA Astrophysics Data System (ADS)

    Oh, Yoo Rhee; Kim, Hong Kook

    In this paper, we propose a hybrid model adaptation approach in which pronunciation and acoustic models are adapted by incorporating the pronunciation and acoustic variabilities of non-native speech in order to improve the performance of non-native automatic speech recognition (ASR). Specifically, the proposed hybrid model adaptation can be performed at either the state-tying or triphone-modeling level, depending on the level at which acoustic model adaptation is performed. In both methods, we first analyze the pronunciation variant rules of non-native speakers and then classify each rule as either a pronunciation variant or an acoustic variant. The state-tying level hybrid method then adapts pronunciation models and acoustic models by accommodating the pronunciation variants in the pronunciation dictionary and by clustering the states of triphone acoustic models using the acoustic variants, respectively. On the other hand, the triphone-modeling level hybrid method initially adapts pronunciation models in the same way as in the state-tying level hybrid method; however, for the acoustic model adaptation, the triphone acoustic models are then re-estimated based on the adapted pronunciation models and the states of the re-estimated triphone acoustic models are clustered using the acoustic variants. Korean-spoken English speech recognition experiments show that ASR systems employing the state-tying and triphone-modeling level adaptation methods achieve relative reductions in average word error rate (WER) of 17.1% and 22.1% for non-native speech, respectively, compared to a baseline ASR system.

  6. One Adaptive Synchronization Approach for Fractional-Order Chaotic System with Fractional-Order 1 < q < 2

    PubMed Central

    Zhou, Ping; Bai, Rongji

    2014-01-01

    Based on a new stability result of equilibrium point in nonlinear fractional-order systems for fractional-order lying in 1 < q < 2, one adaptive synchronization approach is established. The adaptive synchronization for the fractional-order Lorenz chaotic system with fractional-order 1 < q < 2 is considered. Numerical simulations show the validity and feasibility of the proposed scheme. PMID:25247207

  7. Adaptive Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Fasnacht, Marc

    We develop adaptive Monte Carlo methods for the calculation of the free energy as a function of a parameter of interest. The methods presented are particularly well-suited for systems with complex energy landscapes, where standard sampling techniques have difficulties. The Adaptive Histogram Method uses a biasing potential derived from histograms recorded during the simulation to achieve uniform sampling in the parameter of interest. The Adaptive Integration Method directly calculates an estimate of the free energy from the average derivative of the Hamiltonian with respect to the parameter of interest and uses it as a biasing potential. We compare both methods to a state-of-the-art method, and demonstrate that they compare favorably for the calculation of potentials of mean force of dense Lennard-Jones fluids. We use the Adaptive Integration Method to calculate accurate potentials of mean force for different types of simple particles in a Lennard-Jones fluid. Our approach allows us to separate the contributions of the solvent to the potential of mean force from the effect of the direct interaction between the particles. With contributions of the solvent determined, we can find the potential of mean force directly for any other direct interaction without additional simulations. We also test the accuracy of the Adaptive Integration Method on a thermodynamic cycle, which allows us to perform a consistency check between potentials of mean force and chemical potentials calculated using the Adaptive Integration Method. The results demonstrate a high degree of consistency of the method.
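
    A minimal sketch (ours, not the thesis's code) of the Adaptive Histogram Method idea on a toy one-dimensional double-well landscape, in Python: the histogram recorded during the run is periodically folded into a biasing potential so that oversampled values of the parameter are penalized and sampling flattens.

        import numpy as np

        rng = np.random.default_rng(0)
        kT = 1.0
        nbins = 50
        edges = np.linspace(-2.0, 2.0, nbins + 1)

        def energy(x):                       # toy double-well landscape (our choice)
            return 5.0 * (x**2 - 1.0)**2

        def bin_of(x):                       # histogram bin of parameter value x
            return int(np.clip(np.searchsorted(edges, x) - 1, 0, nbins - 1))

        hist = np.zeros(nbins)
        bias = np.zeros(nbins)               # biasing potential on the parameter grid
        x = 0.0
        for step in range(200000):
            xn = x + rng.normal(0.0, 0.1)    # Metropolis move in the parameter
            if -2.0 < xn < 2.0:
                dU = (energy(xn) + bias[bin_of(xn)]) - (energy(x) + bias[bin_of(x)])
                if dU <= 0.0 or rng.random() < np.exp(-dU / kT):
                    x = xn
            hist[bin_of(x)] += 1
            if step % 5000 == 4999:          # fold histogram into the bias, then reset
                bias += kT * np.log(np.maximum(hist, 1.0) / hist.mean())
                hist[:] = 0.0

        free_energy = -bias                  # estimate, up to an additive constant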

  8. Adaptive Management

    EPA Science Inventory

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive managem...

  9. Methods for flexible sample-size design in clinical trials: Likelihood, weighted, dual test, and promising zone approaches.

    PubMed

    Shih, Weichung Joe; Li, Gang; Wang, Yining

    2016-03-01

    Sample size plays a crucial role in clinical trials. Flexible sample-size designs, as part of the more general category of adaptive designs that utilize interim data, have been a popular topic in recent years. In this paper, we give a comparative review of four related methods for such a design. The likelihood method uses the likelihood ratio test with an adjusted critical value. The weighted method adjusts the test statistic with given weights rather than the critical value. The dual test method requires both the likelihood ratio statistic and the weighted statistic to be greater than the unadjusted critical value. The promising zone approach uses the likelihood ratio statistic with the unadjusted critical value and other constraints. All four methods preserve the type-I error rate. In this paper, we explore their properties and compare their relationships and merits. We show that the sample size rules for the dual test are in conflict with the rules of the promising zone approach. We delineate what is necessary to specify in the study protocol to ensure the validity of the statistical procedure and what can be kept implicit in the protocol so that more flexibility can be attained for confirmatory phase III trials in meeting regulatory requirements. We also prove that under mild conditions, the likelihood ratio test still preserves the type-I error rate when the actual sample size is larger than the re-calculated one. Copyright © 2015 Elsevier Inc. All rights reserved.
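
    A hedged numerical sketch of the weighted method described above (a Cui-Hung-Wang-style two-stage combination; the weights and stage statistics below are invented for illustration): because the stage weights are fixed in advance, the type-I error rate is preserved even if the stage-2 sample size is re-estimated at the interim analysis.

        import numpy as np
        from scipy.stats import norm

        # Prespecified stage weights (w1^2 + w2^2 = 1); fixing them in advance is
        # what preserves the type-I error rate under sample-size re-estimation.
        w1, w2 = np.sqrt(0.5), np.sqrt(0.5)

        def weighted_z(z1, z2):
            """Combine independent stage-wise z-statistics with fixed weights."""
            return w1 * z1 + w2 * z2

        z1, z2 = 1.2, 1.8                    # illustrative stage-wise statistics
        z = weighted_z(z1, z2)
        p = 1.0 - norm.cdf(z)                # one-sided p, unadjusted critical value
        print(f"weighted Z = {z:.3f}, one-sided p = {p:.4f}")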

  10. An Efficient Adaptive Angle-Doppler Compensation Approach for Non-Sidelooking Airborne Radar STAP

    PubMed Central

    Shen, Mingwei; Yu, Jia; Wu, Di; Zhu, Daiyin

    2015-01-01

    In this study, the effects of non-sidelooking airborne radar clutter dispersion on space-time adaptive processing (STAP) are considered, and an efficient adaptive angle-Doppler compensation (EAADC) approach is proposed to improve the clutter suppression performance. In order to reduce the computational complexity, the reduced-dimension sparse reconstruction (RDSR) technique is introduced into the angle-Doppler spectrum estimation to extract the required parameters for compensating the clutter spectral center misalignment. Simulation results are presented to demonstrate the effectiveness of the proposed algorithm. PMID:26053755

  11. A locally p-adaptive approach for Large Eddy Simulation of compressible flows in a DG framework

    NASA Astrophysics Data System (ADS)

    Tugnoli, Matteo; Abbà, Antonella; Bonaventura, Luca; Restelli, Marco

    2017-11-01

    We investigate the possibility of reducing the computational burden of LES models by employing local polynomial degree adaptivity in the framework of a high-order DG method. A novel degree adaptation technique specifically designed to be effective for LES applications is proposed, and its effectiveness is compared to that of other criteria already employed in the literature. The resulting locally adaptive approach achieves significant reductions in the computational cost of representative LES computations.

  12. A Unified Nonlinear Adaptive Approach for Detection and Isolation of Engine Faults

    NASA Technical Reports Server (NTRS)

    Tang, Liang; DeCastro, Jonathan A.; Zhang, Xiaodong; Farfan-Ramos, Luis; Simon, Donald L.

    2010-01-01

    A challenging problem in aircraft engine health management (EHM) system development is to detect and isolate faults in system components (i.e., compressor, turbine), actuators, and sensors. Existing nonlinear EHM methods often deal with component faults, actuator faults, and sensor faults separately, which may potentially lead to incorrect diagnostic decisions and unnecessary maintenance. Therefore, it would be ideal to address sensor faults, actuator faults, and component faults under one unified framework. This paper presents a systematic and unified nonlinear adaptive framework for detecting and isolating sensor faults, actuator faults, and component faults for aircraft engines. The fault detection and isolation (FDI) architecture consists of a parallel bank of nonlinear adaptive estimators. Adaptive thresholds are appropriately designed such that, in the presence of a particular fault, all components of the residual generated by the adaptive estimator corresponding to the actual fault type remain below their thresholds. If the faults are sufficiently different, then at least one component of the residual generated by each remaining adaptive estimator should exceed its threshold. Therefore, based on the specific response of the residuals, sensor faults, actuator faults, and component faults can be isolated. The effectiveness of the approach was evaluated using the NASA C-MAPSS turbofan engine model, and simulation results are presented.
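
    A toy sketch of the isolation logic described above, with invented residual vectors and thresholds (the paper's residuals come from a bank of nonlinear adaptive estimators; only the thresholding and isolation step is shown here):

        import numpy as np

        # One adaptive estimator per hypothesized fault type: a fault type stays a
        # candidate only while every component of its residual is below threshold.
        residuals = {                        # illustrative residual vectors
            "sensor":    np.array([0.2, 0.4, 0.1]),
            "actuator":  np.array([1.6, 0.3, 0.2]),
            "component": np.array([0.9, 2.1, 0.5]),
        }
        thresholds = {                       # illustrative adaptive thresholds
            "sensor":    np.array([0.5, 0.5, 0.5]),
            "actuator":  np.array([0.6, 0.5, 0.5]),
            "component": np.array([1.0, 1.0, 1.0]),
        }

        candidates = [f for f, r in residuals.items()
                      if np.all(r < thresholds[f])]
        if len(candidates) == 1:
            print(f"isolated fault type: {candidates[0]}")
        else:
            print(f"not yet isolable; candidates: {candidates}")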

  13. Nuclear Ensemble Approach with Importance Sampling.

    PubMed

    Kossoski, Fábris; Barbatti, Mario

    2018-06-12

    We show that the importance sampling technique can effectively augment the range of problems where the nuclear ensemble approach can be applied. A sampling probability distribution function initially determines the collection of initial conditions for which calculations are performed, as usual. Then, results for a distinct target distribution are computed by introducing compensating importance sampling weights for each sampled point. This mapping between the two probability distributions can be performed whenever they are both explicitly constructed. Perhaps most notably, this procedure allows for the computation of temperature-dependent observables. As a test case, we investigated the UV absorption spectra of phenol, which has been shown to have a marked temperature dependence. Application of the proposed technique to a range that covers 500 K provides results that converge to those obtained with conventional sampling. We further show that an overall improved rate of convergence is obtained when sampling is performed at intermediate temperatures. The comparison between the calculated and the available measured cross sections is very satisfactory, as the main features of the spectra are correctly reproduced. As a second test case, one of Tully's classical models was revisited, and we show that the computation of dynamical observables also profits from the importance sampling technique. In summary, the strategy developed here can be employed to assess the role of temperature for any property calculated within the nuclear ensemble method, with the same computational cost as doing so for a single temperature.
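
    A minimal sketch of the reweighting step described above, with two explicit Gaussians standing in for the sampling and target distributions (e.g., ensembles at two temperatures); each sampled point receives the weight w = p_target / p_sample:

        import numpy as np

        rng = np.random.default_rng(1)

        def gauss(x, mu, sigma):                 # explicit probability density
            return np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2.0 * np.pi))

        sigma_sample, sigma_target = 1.0, 1.3    # e.g., widths at two temperatures
        x = rng.normal(0.0, sigma_sample, 50000) # initial conditions, sampled once

        w = gauss(x, 0.0, sigma_target) / gauss(x, 0.0, sigma_sample)
        observable = x**2                        # any per-sample property

        estimate = np.sum(w * observable) / np.sum(w)
        print(f"reweighted <x^2> = {estimate:.3f} (exact: {sigma_target**2:.3f})")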

  14. Adaptive eLearning modules for cytopathology education: A review and approach.

    PubMed

    Samulski, T Danielle; La, Teresa; Wu, Roseann I

    2016-11-01

    Clinical training imposes time and resource constraints on educators and learners, making it difficult to provide and absorb meaningful instruction. Additionally, innovative and personalized education has become an expectation of adult learners. Fortunately, the development of web-based educational tools provides a possible solution to these challenges. Within this review, we introduce the utility of adaptive eLearning platforms in pathology education. In addition to a review of the current literature, we provide the reader with a suggested approach for module creation, as well as a critical assessment of an available platform, based on our experience in creating adaptive eLearning modules for teaching basic concepts in gynecologic cytopathology. Diagn. Cytopathol. 2016;44:944-951. © 2016 Wiley Periodicals, Inc.

  15. Adaptive dynamic programming approach to experience-based systems identification and control.

    PubMed

    Lendaris, George G

    2009-01-01

    Humans have the ability to make use of experience while selecting their control actions for distinct and changing situations, and this process speeds up and becomes more effective as more experience is gained. In contrast, current technological implementations slow down as more knowledge is stored. A novel way of employing Approximate (or Adaptive) Dynamic Programming (ADP) is described that shifts the underlying Adaptive Critic type of Reinforcement Learning method "up a level", away from designing individual (optimal) controllers to that of developing on-line algorithms that efficiently and effectively select designs from a repository of existing controller solutions (perhaps previously developed via application of ADP methods). The resulting approach is called the Higher-Level Learning Algorithm. The approach and its rationale are described, and some examples of its application are given. The notions of context and context discernment are important to understanding the human abilities noted above. These are first defined in a manner appropriate to controls and system identification; then, as a foundation relating to the application arena, a historical view of the various phases in the development of the controls field is given, organized by how the notion of 'context' was, or was not, involved in each phase.

  16. Shear wave speed estimation by adaptive random sample consensus method.

    PubMed

    Lin, Haoming; Wang, Tianfu; Chen, Siping

    2014-01-01

    This paper describes a new method for shear wave velocity estimation that is capable of excluding outliers automatically without a preset threshold. The proposed method is an adaptive random sample consensus (ARANDSAC), and the metric used here is finding a certain percentage of inliers according to the closest-distance criterion. To evaluate the method, the simulation and phantom experiment results were compared with those of linear regression with all points (LRWAP) and the Radon sum transform (RS) method. The assessment reveals that the relative biases of the mean estimate are 20.00%, 4.67%, and 5.33% for LRWAP, ARANDSAC, and RS, respectively, for the simulation, and 23.53%, 4.08%, and 1.08% for the phantom experiment. The results suggest that the proposed ARANDSAC algorithm is accurate in shear wave speed estimation.
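
    A hedged sketch of the random-sample-consensus idea applied to a synthetic arrival-time fit (the data, the 70% inlier fraction, and the inlier refit are our illustrative choices, not the paper's exact algorithm): inliers are the fixed percentage of points closest to each candidate line, so no residual threshold has to be preset.

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic shear wave arrivals: distance = speed * time, noise plus outliers.
        true_speed = 2.0                              # m/s (illustrative)
        t = np.linspace(1.0e-3, 10.0e-3, 40)          # arrival times, s
        d = true_speed * t + rng.normal(0.0, 2.0e-4, t.size)
        d[::8] += rng.normal(0.0, 5.0e-3, d[::8].size)    # inject outliers

        keep = int(0.7 * t.size)                      # closest 70% count as inliers
        best_speed, best_err = None, np.inf
        for _ in range(500):
            i, j = rng.choice(t.size, 2, replace=False)
            slope = (d[j] - d[i]) / (t[j] - t[i])     # candidate speed from 2 points
            resid = np.abs(d - slope * t)
            err = np.sum(np.sort(resid)[:keep])       # score on the closest points
            if err < best_err:
                inliers = np.argsort(resid)[:keep]
                best_speed = np.polyfit(t[inliers], d[inliers], 1)[0]  # refit on inliers
                best_err = err
        print(f"estimated shear wave speed: {best_speed:.2f} m/s")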

  17. The Application of the Monte Carlo Approach to Cognitive Diagnostic Computerized Adaptive Testing With Content Constraints

    ERIC Educational Resources Information Center

    Mao, Xiuzhen; Xin, Tao

    2013-01-01

    The Monte Carlo approach which has previously been implemented in traditional computerized adaptive testing (CAT) is applied here to cognitive diagnostic CAT to test the ability of this approach to address multiple content constraints. The performance of the Monte Carlo approach is compared with the performance of the modified maximum global…

  18. Self-Learning Adaptive Umbrella Sampling Method for the Determination of Free Energy Landscapes in Multiple Dimensions

    PubMed Central

    Wojtas-Niziurski, Wojciech; Meng, Yilin; Roux, Benoit; Bernèche, Simon

    2013-01-01

    The potential of mean force describing conformational changes of biomolecules is a central quantity that determines the function of biomolecular systems. Calculating an energy landscape of a process that depends on three or more reaction coordinates can require substantial computational power, making some multidimensional calculations practically impossible. Here, we present an efficient automated umbrella sampling strategy for calculating multidimensional potentials of mean force. The method progressively learns by itself, through a feedback mechanism, which regions of a multidimensional space are worth exploring and automatically generates a set of umbrella sampling windows that is adapted to the system. The self-learning adaptive umbrella sampling method is first explained with illustrative examples based on simplified reduced model systems, and then applied to two non-trivial situations: the conformational equilibrium of the pentapeptide Met-enkephalin in solution and ion permeation in the KcsA potassium channel. With this method, it is demonstrated that a significantly smaller number of umbrella windows needs to be employed to characterize the free energy landscape over the most relevant regions without any loss in accuracy. PMID:23814508
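
    A schematic sketch of the feedback idea on a toy two-dimensional grid of candidate umbrella centers (the free energy surface, cutoff, and grid are invented): windows are opened outward from a seed window only where the estimated free energy stays below a cutoff, so high-energy regions are never tiled.

        import numpy as np

        def pmf(ix, iy):                     # toy free energy surface (ours)
            return 0.1 * ((ix - 10)**2 + (iy - 10)**2)

        cutoff = 8.0                         # explore only below this free energy
        n = 21
        sampled = np.zeros((n, n), dtype=bool)   # which windows have been run
        frontier = [(10, 10)]                # seed window, e.g. a known basin

        while frontier:
            ix, iy = frontier.pop()
            sampled[ix, iy] = True
            if pmf(ix, iy) >= cutoff:        # high free energy: sample, don't expand
                continue
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                jx, jy = ix + dx, iy + dy
                if 0 <= jx < n and 0 <= jy < n and not sampled[jx, jy]:
                    frontier.append((jx, jy))
        print(f"windows opened: {int(sampled.sum())} of {sampled.size}")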

  19. Using Bayesian Adaptive Trial Designs for Comparative Effectiveness Research: A Virtual Trial Execution.

    PubMed

    Luce, Bryan R; Connor, Jason T; Broglio, Kristine R; Mullins, C Daniel; Ishak, K Jack; Saunders, Elijah; Davis, Barry R

    2016-09-20

    Bayesian and adaptive clinical trial designs offer the potential for more efficient processes that result in smaller sample sizes and shorter trial durations than traditional designs. To explore the use and potential benefits of Bayesian adaptive clinical trial designs in comparative effectiveness research. Virtual execution of ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) as if it had been done according to a Bayesian adaptive trial design. Comparative effectiveness trial of antihypertensive medications. Patient data sampled from the more than 42 000 patients enrolled in ALLHAT with publicly available data. Number of patients randomly assigned between groups, trial duration, observed numbers of events, and overall trial results and conclusions. The Bayesian adaptive approach and original design yielded similar overall trial conclusions. The Bayesian adaptive trial randomly assigned more patients to the better-performing group and would probably have ended slightly earlier. This virtual trial execution required limited resampling of ALLHAT patients for inclusion in RE-ADAPT (REsearch in ADAptive methods for Pragmatic Trials). Involvement of a data monitoring committee and other trial logistics were not considered. In a comparative effectiveness research trial, Bayesian adaptive trial designs are a feasible approach and potentially generate earlier results and allocate more patients to better-performing groups. National Heart, Lung, and Blood Institute.

  20. Lead Tap Sampling Approaches: What Do They Tell You

    EPA Science Inventory

    There is no single, universally applicable sampling approach for lead in drinking water. The appropriate type of sampling is dictated by the question being asked. There is no reason when a customer asks to have their home water tested to see if it's "safe" that they s...

  1. Different Strokes for Different Folks? Contrasting Approaches to Cultural Adaptation of Parenting Interventions.

    PubMed

    Mejia, Anilena; Leijten, Patty; Lachman, Jamie M; Parra-Cardona, José Ruben

    2017-08-01

    Relevant achievements have been accomplished in prevention science with regard to disseminating efficacious parenting interventions among underserved populations. However, widespread disparities in availability of parenting services continue to negatively impact diverse populations in high-income countries (e.g., the USA) and low- and middle-income countries. As a result, a scholarly debate on cultural adaptation has evolved over the years. Specifically, some scholars have argued that in diverse cultural contexts, existing evidence-based parenting interventions should be delivered with strict fidelity to ensure effectiveness. Others have emphasized the need for cultural adaptations of interventions when disseminated among diverse populations. In this paper, we propose that discussions on cultural adaptation should be conceptualized as a "both-and," rather than an "either-or" process. To justify this stance, we describe three distinct parenting intervention projects to illustrate how cultural adaptation and efficacy of evidence-based interventions can be achieved using contrasting approaches and frameworks, depending on cultural preferences and available resources of local contexts. Further, we suggest the need to develop guidelines for consistent reporting of cultural adaptation procedures as a critical component of future investigations. This discussion is relevant for the broader public health field and prevention science.

  2. Evaluating performance of stormwater sampling approaches using a dynamic watershed model.

    PubMed

    Ackerman, Drew; Stein, Eric D; Ritter, Kerry J

    2011-09-01

    Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach of evaluating the accuracy of different stormwater monitoring methodologies using stormflows and constituent concentrations produced by a fully validated continuous simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual sampling" of 166 simulated storms of varying size, intensity and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation (MAD) of the relative error about the median (precision), and (3) the percentage of storms where sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods both in terms of accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated the storm EMCs, while volume-paced sampling overestimated them. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. The

  3. A review of plan library approaches in adaptive radiotherapy of bladder cancer.

    PubMed

    Collins, Shane D; Leech, Michelle M

    2018-05-01

    Large variations in the shape and size of the bladder volume are commonly observed in bladder cancer radiotherapy (RT). The clinical target volume (CTV) is therefore frequently inadequately treated and large isotropic margins are inappropriate in terms of dose to organs at risk (OAR); thereby making adaptive radiotherapy (ART) attractive for this tumour site. There are various methods of ART delivery; however, for bladder cancer, plan libraries are frequently used. A review of published studies on plan libraries for bladder cancer using four databases (Pubmed, Science Direct, Embase and Cochrane Library) was conducted. The endpoints selected were accuracy and feasibility of initiation of a plan library strategy into a RT department. Twenty-four articles were included in this review. The majority of studies reported improvement in accuracy, with 10 studies showing an improvement in planning target volume (PTV) and CTV coverage with plan libraries, some by up to 24%. Seventeen studies showed a dose reduction to OARs, particularly the small bowel V45Gy, V40Gy, V30Gy and V10Gy, and the rectal V30Gy. However, the occurrence of no suitable plan was reported in six studies, with three studies showing no significant difference between adaptive and non-adaptive strategies in terms of target coverage. In addition, inter-observer variability in plan selection appears to remain problematic. The additional resources, education and technology required for the initiation of plan library selection for bladder cancer may hinder its routine clinical implementation, with eight studies reporting increased treatment time. While there is a growing body of evidence in support of plan libraries for bladder RT, many studies differed in their delivery approach. The advent of the clinical use of the MRI-linear accelerator will provide RT departments with the opportunity to consider daily online adaptation for bladder cancer as an alternative to plan library approaches.

  4. Constraint satisfaction adaptive neural network and heuristics combined approaches for generalized job-shop scheduling.

    PubMed

    Yang, S; Wang, D

    2000-01-01

    This paper presents a constraint satisfaction adaptive neural network, together with several heuristics, to solve the generalized job-shop scheduling problem, one of the NP-complete constraint satisfaction problems. The proposed neural network can be easily constructed and can adaptively adjust its connection weights and unit biases based on the sequence and resource constraints of the job-shop scheduling problem during its processing. Several heuristics that can be combined with the neural network are also presented. In the combined approaches, the neural network is used to obtain feasible solutions, and the heuristic algorithms are used to improve the performance of the neural network and the quality of the obtained solutions. Simulations have shown that the proposed neural network and its combined approaches are efficient with respect to the quality of solutions and the solving speed.

  5. The importance of training strategy adaptation: a learner-oriented approach for improving older adults' memory and transfer.

    PubMed

    Bottiroli, Sara; Cavallini, Elena; Dunlosky, John; Vecchi, Tomaso; Hertzog, Christopher

    2013-09-01

    We investigated the benefits of strategy-adaptation training for promoting transfer effects. This learner-oriented approach--which directly encourages the learner to generalize strategic behavior to new tasks--helps older adults appraise new tasks and adapt trained strategies to them. In Experiment 1, older adults in a strategy-adaptation training group used 2 strategies (imagery and sentence generation) while practicing 2 tasks (list and associative learning); they were then instructed on how to do a simple task analysis to help them adapt the trained strategies for 2 different unpracticed tasks (place learning and text learning) that were discussed during training. Two additional criterion tasks (name-face associative learning and grocery-list learning) were never mentioned during training. Two other groups were included: A strategy training group (who received strategy training and transfer instructions but not strategy-adaptation training) and a waiting-list control group. Both training procedures enhanced older adults' performance on the trained tasks and those tasks that were discussed during training, but transfer was greatest after strategy-adaptation training. Experiment 2 found that strategy-adaptation training conducted via a manual that older adults used at home also promoted transfer. These findings demonstrate the importance of adopting a learner-oriented approach to promote transfer of strategy training. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  6. Context-aware adaptive spelling in motor imagery BCI

    NASA Astrophysics Data System (ADS)

    Perdikis, S.; Leeb, R.; Millán, J. d. R.

    2016-06-01

    Objective. This work presents the first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate the non-stationarity of brain signals and restore the subjects' performance; second, to guide naive users into BCI control, avoiding initial offline BCI calibration; and, third, to outperform regular unsupervised adaptation. Approach. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree’s language model to improve online expectation-maximization maximum-likelihood estimation. Main results. Our results verify the possibility to restore single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similarly to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. Significance. We demonstrate the possibility to circumvent supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.

  7. Resilient microorganisms in dust samples of the International Space Station-survival of the adaptation specialists.

    PubMed

    Mora, Maximilian; Perras, Alexandra; Alekhova, Tatiana A; Wink, Lisa; Krause, Robert; Aleksandrova, Alina; Novozhilova, Tatiana; Moissl-Eichinger, Christine

    2016-12-20

    The International Space Station (ISS) represents a unique biotope for the human crew but also for introduced microorganisms. Microbes experience selective pressures such as microgravity, desiccation, poor nutrient-availability due to cleaning, and an increased radiation level. We hypothesized that the microbial community inside the ISS is modified by adapting to these stresses. For this reason, we analyzed 8-12 year old dust samples from Russian ISS modules with a major focus on the long-term surviving portion of the microbial community. We consequently assessed the cultivable microbiota of these samples in order to analyze their extremotolerant potential against desiccation, heat-shock, and clinically relevant antibiotics. In addition, we studied the bacterial and archaeal communities from the stored Russian dust samples via molecular methods (next-generation sequencing, NGS) and compared our new data with previously derived information from the US American ISS dust microbiome. We cultivated and identified in total 85 bacterial, non-pathogenic isolates (17 different species) and 1 fungal isolate from the 8-12 year old dust samples collected in the Russian segment of the ISS. Most of these isolates exhibited robust resistance against heat-shock and clinically relevant antibiotics. Bacterial and archaeal 16S rRNA gene-targeted next-generation sequencing showed signatures of human-associated microorganisms (Corynebacterium, Staphylococcus, Coprococcus etc.), but also specifically adapted extremotolerant microorganisms. Besides bacteria, the detection of archaeal signatures in higher abundance was striking. Our findings reveal (i) the occurrence of living, hardy microorganisms in archived Russian ISS dust samples, (ii) a profound resistance capacity of ISS microorganisms against environmental stresses, and (iii) the presence of archaeal signatures on board. In addition, we found indications that the microbial community in the Russian segment dust

  8. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handle large samples in test of fit analysis have been developed. One strategy to handle the sample size problem may be to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and to compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of sample sizes down to the order of 5,000, the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit under-estimated using the adjusted sample size function. Although there are big differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
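
    A small numerical sketch of the two strategies compared above (the linear rescaling shown is one common variant of an adjustment function and an assumption on our part, as are all of the numbers):

        import numpy as np

        rng = np.random.default_rng(3)

        def adjusted_chi2(chi2, n_orig, n_new):
            # rescale the full-sample statistic to a nominal sample size
            return chi2 * (n_new - 1) / (n_orig - 1)

        n_orig, chi2_orig = 21000, 840.0     # illustrative full-sample values
        print(f"adjusted chi-square at n=5000: "
              f"{adjusted_chi2(chi2_orig, n_orig, 5000):.1f}")

        # The random sample approach would instead draw a subsample, e.g.
        #   idx = rng.choice(n_orig, 5000, replace=False)
        # and re-fit the measurement model on the selected rows.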

  9. Adaptive-numerical-bias metadynamics.

    PubMed

    Khanjari, Neda; Eslami, Hossein; Müller-Plathe, Florian

    2017-12-05

    A metadynamics scheme is presented in which the free energy surface is filled by progressively adding adaptive biasing potentials, obtained from the accumulated probability distribution of the collective variables. Instead of adding Gaussians of assigned height and width, as in the conventional metadynamics method, we add a more realistic adaptive biasing potential to the Hamiltonian of the system. The shape of the adaptive biasing potential is adjusted on the fly by sampling over the visited states. As the top of the barrier is approached, the biasing potentials become wider. This reduces the problem of trapping the system in niches, which is introduced by the addition of fixed-height Gaussians in metadynamics. Our results for the free energy profiles of three test systems show that this method is more accurate and converges more quickly than conventional metadynamics, and is quite comparable (in accuracy and convergence rate) with the well-tempered metadynamics method. © 2017 Wiley Periodicals, Inc.

  10. An adaptive Kalman filter approach for cardiorespiratory signal extraction and fusion of non-contacting sensors.

    PubMed

    Foussier, Jerome; Teichmann, Daniel; Jia, Jing; Misgeld, Berno; Leonhardt, Steffen

    2014-05-09

    Extracting cardiorespiratory signals from non-invasive and non-contacting sensor arrangements, i.e. magnetic induction sensors, is a challenging task. The respiratory and cardiac signals are mixed on top of a large and time-varying offset and are likely to be disturbed by measurement noise. Basic filtering techniques fail to extract relevant information for monitoring purposes. We present a real-time filtering system based on an adaptive Kalman filter approach that separates signal offsets, respiratory and heart signals from three different sensor channels. It continuously estimates respiration and heart rates, which are fed back into the system model to enhance performance. Sensor and system noise covariance matrices are automatically adapted to the aimed application, thus improving the signal separation capabilities. We apply the filtering to two different subjects with different heart rates and sensor properties and compare the results to the non-adaptive version of the same Kalman filter. Also, the performance, depending on the initialization of the filters, is analyzed using three different configurations ranging from best to worst case. Extracted data are compared with reference heart rates derived from a standard pulse-photoplethysmographic sensor and respiration rates from a flowmeter. In the worst case for one of the subjects the adaptive filter obtains mean errors (standard deviations) of -0.2 min(-1) (0.3 min(-1)) and -0.7 bpm (1.7 bpm) (compared to -0.2 min(-1) (0.4 min(-1)) and 42.0 bpm (6.1 bpm) for the non-adaptive filter) for respiration and heart rate, respectively. In bad conditions the heart rate is only correctly measurable when the Kalman matrices are adapted to the target sensor signals. Also, the reduced mean error between the extracted offset and the raw sensor signal shows that adapting the Kalman filter continuously improves the ability to separate the desired signals from the raw sensor data. The average total computational time needed
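
    A minimal one-dimensional sketch of the adaptive idea (the real filter separates offset, respiratory, and cardiac components across three channels; here only the on-line re-estimation of the measurement-noise variance from recent innovations is shown, with invented parameters):

        import numpy as np

        rng = np.random.default_rng(4)

        Q, R = 1e-4, 1.0                     # initial process / measurement noise
        x, P = 0.0, 1.0                      # state estimate and its variance
        innovations = []

        def kalman_step(z):
            """Random-walk Kalman filter with R adapted from recent innovations."""
            global x, P, R
            P_pred = P + Q                   # predict
            y = z - x                        # innovation
            innovations.append(y)
            if len(innovations) >= 50:       # adapt R: var(innovation) = P_pred + R
                R = max(np.var(innovations[-50:]) - P_pred, 1e-6)
            K = P_pred / (P_pred + R)        # gain
            x += K * y                       # update
            P = (1.0 - K) * P_pred
            return x

        signal = np.sin(np.linspace(0.0, 6.0 * np.pi, 600))
        for k, s in enumerate(signal):
            sd = 0.1 if k < 300 else 0.5     # sensor noise level changes mid-run
            kalman_step(s + rng.normal(0.0, sd))
        print(f"adapted measurement-noise variance R = {R:.3f}")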

  11. A new approach for designing self-organizing systems and application to adaptive control

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Zhang, Shi; Lin, Yueqing; Huang, Song

    1993-01-01

    There is tremendous interest in the design of intelligent machines capable of autonomous learning and skillful performance under complex environments. A major task in designing such systems is to make the system plastic and adaptive when presented with new and useful information and stable in response to irrelevant events. A great body of knowledge, based on neuro-physiological concepts, has evolved as a possible solution to this problem. Adaptive resonance theory (ART) is a classical example under this category. The system dynamics of an ART network is described by a set of differential equations with nonlinear functions. An approach for designing self-organizing networks characterized by nonlinear differential equations is proposed.

  12. Validity of the reduced-sample insulin modified frequently-sampled intravenous glucose tolerance test using the nonlinear regression approach.

    PubMed

    Sumner, Anne E; Luercio, Marcella F; Frempong, Barbara A; Ricks, Madia; Sen, Sabyasachi; Kushner, Harvey; Tulloch-Reid, Marshall K

    2009-02-01

    The disposition index, the product of the insulin sensitivity index (S(I)) and the acute insulin response to glucose, is linked in African Americans to chromosome 11q. This link was determined with S(I) calculated with the nonlinear regression approach to the minimal model and data from the reduced-sample insulin-modified frequently-sampled intravenous glucose tolerance test (Reduced-Sample-IM-FSIGT). However, the application of the nonlinear regression approach to calculate S(I) using data from the Reduced-Sample-IM-FSIGT has been challenged as being not only inaccurate but also having a high failure rate in insulin-resistant subjects. Our goal was to determine the accuracy and failure rate of the Reduced-Sample-IM-FSIGT using the nonlinear regression approach to the minimal model. With S(I) from the Full-Sample-IM-FSIGT considered the standard and using the nonlinear regression approach to the minimal model, we compared the agreement between S(I) from the Full- and Reduced-Sample-IM-FSIGT protocols. One hundred African Americans (body mass index, 31.3 +/- 7.6 kg/m(2) [mean +/- SD]; range, 19.0-56.9 kg/m(2)) had FSIGTs. Glucose (0.3 g/kg) was given at baseline. Insulin was infused from 20 to 25 minutes (total insulin dose, 0.02 U/kg). For the Full-Sample-IM-FSIGT, S(I) was calculated based on the glucose and insulin samples taken at -1, 1, 2, 3, 4, 5, 6, 7, 8, 10, 12, 14, 16, 19, 22, 23, 24, 25, 27, 30, 40, 50, 60, 70, 80, 90, 100, 120, 150, and 180 minutes. For the Reduced-Sample-IM-FSIGT, S(I) was calculated based on the time points that appear in bold. Agreement was determined by Spearman correlation, concordance, and the Bland-Altman method. In addition, for both protocols, the population was divided into tertiles of S(I). Insulin resistance was defined by the lowest tertile of S(I) from the Full-Sample-IM-FSIGT. The distribution of subjects across tertiles was compared by rank order and kappa statistic. We found that the rate of failure of resolution of S(I) by

  13. Free Energy Calculations using a Swarm-Enhanced Sampling Molecular Dynamics Approach.

    PubMed

    Burusco, Kepa K; Bruce, Neil J; Alibay, Irfan; Bryce, Richard A

    2015-10-26

    Free energy simulations are an established computational tool in modelling chemical change in the condensed phase. However, sampling of kinetically distinct substates remains a challenge to these approaches. As a route to addressing this, we link the methods of thermodynamic integration (TI) and swarm-enhanced sampling molecular dynamics (sesMD), where simulation replicas interact cooperatively to aid transitions over energy barriers. We illustrate the approach by using alchemical alkane transformations in solution, comparing them with the multiple independent trajectory TI (IT-TI) method. Free energy changes for transitions computed by using IT-TI grew increasingly inaccurate as the intramolecular barrier was heightened. By contrast, swarm-enhanced sampling TI (sesTI) calculations showed clear improvements in sampling efficiency, leading to more accurate computed free energy differences, even in the case of the highest barrier height. The sesTI approach, therefore, has potential in addressing chemical change in systems where conformations exist in slow exchange. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A Cluster-Based Dual-Adaptive Topology Control Approach in Wireless Sensor Networks.

    PubMed

    Gui, Jinsong; Zhou, Kai; Xiong, Naixue

    2016-09-25

    Multi-Input Multi-Output (MIMO) can improve wireless network performance. Sensors are usually single-antenna devices due to the high hardware complexity and cost, so several sensors are used to form a virtual MIMO array, which is a desirable approach to efficiently take advantage of MIMO gains. Also, in large Wireless Sensor Networks (WSNs), clustering can improve the network scalability, which is an effective topology control approach. The existing virtual MIMO-based clustering schemes neither fully explore the benefits of MIMO nor adaptively determine the clustering ranges. Also, the clustering mechanism needs to be further improved to extend the lifetime of the cluster structure. In this paper, we propose an improved clustering scheme for virtual MIMO-based topology construction (ICV-MIMO), which can adaptively determine not only the inter-cluster transmission modes but also the clustering ranges. Through the rational division of cluster head function and the optimization of cluster head selection criteria and the information exchange process, the ICV-MIMO scheme effectively reduces the network energy consumption and improves the lifetime of the cluster structure when compared with the existing typical virtual MIMO-based scheme. Moreover, the message overhead and time complexity are still in the same order of magnitude.

  15. The Limits to Adaptation: A Systems Approach

    EPA Science Inventory

    The ability to adapt to climate change is delineated by capacity thresholds, after which climate damages begin to overwhelm the adaptation response. Such thresholds depend upon physical properties (natural processes and engineering parameters), resource constraints (expressed th...

  16. Multidimensional Adaptation in MAS Organizations.

    PubMed

    Alberola, Juan M; Julian, Vicente; Garcia-Fornes, Ana

    2013-04-01

    Organization adaptation requires determining the consequences of applying changes not only in terms of the benefits provided but also measuring the adaptation costs as well as the impact that these changes have on all of the components of the organization. In this paper, we provide an approach for adaptation in multiagent systems based on a multidimensional transition deliberation mechanism (MTDM). This approach considers transitions in multiple dimensions and is aimed at obtaining the adaptation with the highest potential for improvement in utility based on the costs of adaptation. The approach provides an accurate measurement of the impact of the adaptation since it determines the organization that is to be transitioned to as well as the changes required to carry out this transition. We show an example of adaptation in a service provider network environment in order to demonstrate that the measurement of the adaptation consequences taken by the MTDM improves the organization performance more than the other approaches.

  17. Towards psychologically adaptive brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Myrden, A.; Chau, T.

    2016-12-01

    Objective. Brain-computer interface (BCI) performance is sensitive to short-term changes in psychological states such as fatigue, frustration, and attention. This paper explores the design of a BCI that can adapt to these short-term changes. Approach. Eleven able-bodied individuals participated in a study during which they used a mental task-based EEG-BCI to play a simple maze navigation game while self-reporting their perceived levels of fatigue, frustration, and attention. In an offline analysis, a regression algorithm was trained to predict changes in these states, yielding Pearson correlation coefficients in excess of 0.45 between the self-reported and predicted states. Two means of fusing the resultant mental state predictions with mental task classification were investigated. First, single-trial mental state predictions were used to predict correct classification by the BCI during each trial. Second, an adaptive BCI was designed that retrained a new classifier for each testing sample using only those training samples for which predicted mental state was similar to that predicted for the current testing sample. Main results. Mental state-based prediction of BCI reliability exceeded chance levels. The adaptive BCI exhibited significant, but practically modest, increases in classification accuracy for five of 11 participants and no significant difference for the remaining six despite a smaller average training set size. Significance. Collectively, these findings indicate that adaptation to psychological state may allow the design of more accurate BCIs.

  18. Comparing Stream DOC Fluxes from Sensor- and Sample-Based Approaches

    NASA Astrophysics Data System (ADS)

    Shanley, J. B.; Saraceno, J.; Aulenbach, B. T.; Mast, A.; Clow, D. W.; Hood, K.; Walker, J. F.; Murphy, S. F.; Torres-Sanchez, A.; Aiken, G.; McDowell, W. H.

    2015-12-01

    DOC transport by streamwater is a significant flux that does not consistently show up in ecosystem carbon budgets. In an effort to quantify stream DOC flux, we analyzed three to four years of high-frequency in situ fluorescing dissolved organic matter (FDOM) concentrations and turbidity measured by optical sensors at the five diverse forested and/or alpine headwater sites of the U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) program. FDOM serves as a proxy for DOC. We also took discrete samples over a range of hydrologic conditions, using both manual weekly and automated event-based sampling. After compensating FDOM for temperature effects and turbidity interference (which was successful even at the high-turbidity Luquillo, PR site), we evaluated the DOC-FDOM relation based on discrete sample DOC analyses matched to corrected FDOM at the time of sampling. FDOM was a moderately robust predictor of DOC, with r2 from 0.60 to more than 0.95 among sites. We then formed continuous DOC time series by two independent approaches: (1) DOC predicted from FDOM; and (2) the composite method, based on modeled DOC from regression on stream discharge, season, air temperature, and time, forcing the model to observations and adjusting modeled concentrations between observations by linearly-interpolated model residuals. DOC flux from each approach was then computed directly as concentration times discharge. DOC fluxes based on the sensor approach were consistently greater than those from the sample-based approach. At Loch Vale, CO (2.5 years) and Panola Mountain, GA (1 year), the difference was 5-17%. At Sleepers River, VT (3 years), preliminary differences were greater than 20%. The difference is driven by the highest events, but we are investigating these results further. We will also present comparisons from Luquillo, PR, and Allequash Creek, WI. The higher sensor-based DOC fluxes could result from their accuracy during hysteresis, which is difficult to model

  19. ESCAschool study: trial protocol of an adaptive treatment approach for school-age children with ADHD including two randomised trials.

    PubMed

    Döpfner, Manfred; Hautmann, Christopher; Dose, Christina; Banaschewski, Tobias; Becker, Katja; Brandeis, Daniel; Holtmann, Martin; Jans, Thomas; Jenkner, Carolin; Millenet, Sabina; Renner, Tobias; Romanos, Marcel; von Wirth, Elena

    2017-07-24

    The ESCAschool study addresses the treatment of school-age children with attention-deficit/hyperactivity disorder (ADHD) in a large multicentre trial. It aims to investigate three interrelated topics: (i) Clinical guidelines often recommend a stepped care approach, including different treatment strategies for children with mild to moderate and severe ADHD symptoms, respectively. However, this approach has not yet been empirically validated. (ii) Behavioural interventions and neurofeedback have been shown to be effective, but the superiority of combined treatment approaches such as medication plus behaviour therapy or medication plus neurofeedback compared to medication alone remains questionable. (iii) Growing evidence indicates that telephone-assisted self-help interventions are effective in the treatment of ADHD. However, larger randomised controlled trials (RCTs) are lacking. This report presents the ESCAschool trial protocol. In an adaptive treatment design, two RCTs and additional observational treatment arms are considered. The target sample size of ESCAschool is 521 children with ADHD. Based on their baseline ADHD symptom severity, the children will be assigned to one of two groups (mild to moderate symptom group and severe symptom group). The adaptive design includes two treatment phases (Step 1 and Step 2). According to clinical guidelines, different treatment protocols will be followed for the two severity groups. In the moderate group, the efficacy of telephone-assisted self-help for parents and teachers will be tested against waitlist control in Step 1 (RCT I). The severe group will receive pharmacotherapy combined with psychoeducation in Step 1. For both groups, treatment response will be determined after Step 1 treatment (no, partial or full response). In children of the severe group demonstrating partial response to medication, the efficacy of (1) counselling, (2) behaviour therapy and (3) neurofeedback will be tested in Step 2 (RCT II). All other

  20. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    PubMed

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
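
    A miniature sketch of weighted binary matrix sampling (the fitness function is a stand-in for cross-validated model error, and all parameters are illustrative): each row of the binary matrix selects a variable subspace, and the inclusion weights are updated from the best-performing sub-models so that the variable space shrinks step by step.

        import numpy as np

        rng = np.random.default_rng(5)

        n_vars, n_sub = 20, 200
        weights = np.full(n_vars, 0.5)            # initial inclusion probabilities

        def fitness(subset):                      # stand-in for CV error (lower = better)
            informative = set(range(5))           # pretend variables 0-4 matter
            hits = len(informative & set(np.flatnonzero(subset)))
            return -hits + 0.05 * subset.sum()    # reward hits, penalize model size

        for _ in range(10):
            B = rng.random((n_sub, n_vars)) < weights     # weighted binary matrix
            scores = np.array([fitness(row) for row in B])
            best = B[np.argsort(scores)[: n_sub // 10]]   # top 10% of sub-models
            weights = best.mean(axis=0)                   # new inclusion frequencies
        print("selected variables:", np.flatnonzero(weights > 0.5))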

  1. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges, perhaps most notably those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is affected by weather, equipment failure, and the addition or removal of stations; and while detection thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  2. An integrate-over-temperature approach for enhanced sampling.

    PubMed

    Gao, Yi Qin

    2008-02-14

    A simple method is introduced to achieve efficient random walking in the energy space in molecular dynamics simulations which thus enhances the sampling over a large energy range. The approach is closely related to multicanonical and replica exchange simulation methods in that it allows configurations of the system to be sampled in a wide energy range by making use of Boltzmann distribution functions at multiple temperatures. A biased potential is quickly generated using this method and is then used in accelerated molecular dynamics simulations.
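
    A sketch of the biased-potential construction under our own simplifying assumptions (uniform weights over an assumed temperature ladder; in practice the weights n_k would be iterated so that all temperatures contribute comparably): the Boltzmann factor at the simulation temperature is replaced by a sum over temperatures, U_eff(x) = -(1/beta0) ln sum_k n_k exp(-beta_k U(x)), and the bias U_eff - U is added to the Hamiltonian.

        import numpy as np
        from scipy.special import logsumexp

        beta0 = 1.0
        betas = 1.0 / np.linspace(1.0, 3.0, 8)   # assumed temperature ladder
        n_k = np.full(betas.size, 1.0 / betas.size)

        def effective_energy(U):
            """U_eff = -(1/beta0) * ln sum_k n_k exp(-beta_k * U), elementwise in U."""
            return -logsumexp(-np.outer(betas, U), b=n_k[:, None], axis=0) / beta0

        U = np.linspace(0.0, 20.0, 5)
        print("bias U_eff - U:", effective_energy(U) - U)   # flattens high barriers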

  3. A New Approach of Juvenile Age Estimation using Measurements of the Ilium and Multivariate Adaptive Regression Splines (MARS) Models for Better Age Prediction.

    PubMed

    Corron, Louise; Marchal, François; Condemi, Silvana; Chaumoître, Kathia; Adalian, Pascal

    2017-01-01

    Juvenile age estimation methods used in forensic anthropology generally lack methodological consistency and/or statistical validity. Considering this, a standardized approach using nonparametric Multivariate Adaptive Regression Splines (MARS) models was tested to predict age from iliac biometric variables of male and female juveniles from Marseilles, France, aged 0-12 years. Models using unidimensional (length and width) and bidimensional iliac data (module and surface) were constructed on a training sample of 176 individuals and validated on an independent test sample of 68 individuals. Results show that MARS prediction models using iliac width, module and surface give overall better and statistically valid age estimates. These models integrate punctual nonlinearities of the relationship between age and osteometric variables. By constructing valid prediction intervals whose size increases with age, MARS models take into account the normal increase of individual variability. MARS models can qualify as a practical and standardized approach for juvenile age estimation. © 2016 American Academy of Forensic Sciences.
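
    A hedged sketch of fitting a MARS age-prediction model in Python, assuming the open-source py-earth package is available; the "iliac width" data are simulated stand-ins with a mildly nonlinear growth curve, not the study's sample:

        import numpy as np
        from pyearth import Earth            # open-source MARS implementation

        rng = np.random.default_rng(6)

        # Simulated stand-in data: age (0-12 y) vs. one width-like iliac measurement.
        age = rng.uniform(0.0, 12.0, 176)
        width = 30.0 + 6.0 * age - 0.15 * age**2 + rng.normal(0.0, 2.0, age.size)

        model = Earth(max_degree=1)          # piecewise-linear basis with hinge knots
        model.fit(width.reshape(-1, 1), age)

        print(f"predicted age at width 55 mm: "
              f"{model.predict(np.array([[55.0]]))[0]:.1f} years")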

  4. Adaptive convex combination approach for the identification of improper quaternion processes.

    PubMed

    Ujang, Bukhari Che; Jahanchahi, Cyrus; Took, Clive Cheong; Mandic, Danilo P

    2014-01-01

    Data-adaptive optimal modeling and identification of real-world vector sensor data is provided by combining the fractional tap-length (FT) approach with model order selection in the quaternion domain. To account rigorously for the generality of such processes, both second-order circular (proper) and noncircular (improper), the proposed approach in this paper combines the FT length optimization with both the strictly linear quaternion least mean square (QLMS) and widely linear QLMS (WL-QLMS). A collaborative approach based on QLMS and WL-QLMS is shown to both identify the type of processes (proper or improper) and to track their optimal parameters in real time. Analysis shows that monitoring the evolution of the convex mixing parameter within the collaborative approach allows us to track the improperness in real time. Further insight into the properties of those algorithms is provided by establishing a relationship between the steady-state error and optimal model order. The approach is supported by simulations on model order selection and identification of both strictly linear and widely linear quaternion-valued systems, such as those routinely used in renewable energy (wind) and human-centered computing (biomechanics).
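
    A much-simplified, real-valued illustration of the convex-combination mechanism (the paper's quaternion algebra and widely linear model are omitted): two adaptive branches run in parallel, and a sigmoid-mapped mixing parameter lambda is adapted on the overall error, so monitoring lambda indicates which branch, and hence which model class, fits the data better.

    ```python
    # Real-valued stand-in for the quaternion-domain collaborative filter;
    # the two LMS branches below play the roles of the QLMS and WL-QLMS
    # branches, which in the paper differ in their (widely) linear models.
    import numpy as np

    def convex_combo_lms(x, d, L=8, mu=0.01, mu_a=0.5):
        w1 = np.zeros(L)
        w2 = np.zeros(L)
        a = 0.0                              # auxiliary variable, lambda = sigma(a)
        lam_hist = []
        for n in range(L, len(x)):
            u = x[n-L:n][::-1]
            y1, y2 = w1 @ u, w2 @ u
            lam = 1.0 / (1.0 + np.exp(-a))
            y = lam * y1 + (1 - lam) * y2
            e, e1, e2 = d[n] - y, d[n] - y1, d[n] - y2
            w1 += mu * e1 * u                # each branch adapts on its own error
            w2 += mu * e2 * u
            a += mu_a * e * (y1 - y2) * lam * (1 - lam)   # gradient step on lambda
            lam_hist.append(lam)
        return np.array(lam_hist)            # lambda near 1 or 0 reveals the fit
    ```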

  5. A New Approach to Interference Excision in Radio Astronomy: Real-Time Adaptive Cancellation

    NASA Astrophysics Data System (ADS)

    Barnbaum, Cecilia; Bradley, Richard F.

    1998-11-01

    Every year, an increasing amount of radio-frequency (RF) spectrum in the VHF, UHF, and microwave bands is being utilized to support new commercial and military ventures, and all have the potential to interfere with radio astronomy observations. Such services already cause problems for radio astronomy even at very remote observing sites, and the potential for this form of light pollution to grow is alarming. Preventive measures to eliminate interference through FCC legislation and ITU agreements can be effective; however, this approach is often inadequate, and interference excision at the receiver is necessary. Conventional techniques such as RF filters, RF shielding, and postprocessing of data have been only somewhat successful, and none has been sufficient. Adaptive interference cancellation is a real-time approach to interference excision that has not been used before in radio astronomy. We describe here, for the first time, adaptive interference cancellation in the context of radio astronomy instrumentation, and we present initial results for our prototype receiver. In the 1960s, analog adaptive interference cancelers were developed that obtained a high degree of cancellation in radio communications and radar problems. However, analog systems lack the dynamic range, noise performance, and versatility required by radio astronomy. The concept of digital adaptive interference cancellation was introduced in the mid-1960s as a way to reduce unwanted noise in low-frequency (audio) systems. Examples of such systems include the canceling of maternal ECG in fetal electrocardiography and the reduction of engine noise in the passenger compartments of automobiles. These audio-frequency applications require bandwidths of only a few tens of kilohertz. Only recently has high-speed digital filter technology made high dynamic range adaptive canceling possible in a bandwidth as large as a few megahertz, finally opening the door to application in radio astronomy. We have
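
    A minimal sketch of the canceller architecture the abstract describes: a reference channel that sees mostly the interferer feeds an LMS filter, whose output is subtracted from the primary (astronomy) channel; the residual approximates the astronomical signal. All signals and parameters are synthetic illustrations, not the prototype receiver's.

    ```python
    # Digital adaptive interference canceller (LMS) on synthetic data.
    import numpy as np

    rng = np.random.default_rng(1)
    N, L, mu = 20000, 16, 1e-3
    astro = 0.1 * rng.normal(size=N)                 # weak signal of interest
    rfi_ref = np.sin(2 * np.pi * 0.12 * np.arange(N)) + 0.05 * rng.normal(size=N)
    h = rng.normal(size=L)                           # unknown propagation channel
    primary = astro + np.convolve(rfi_ref, h, mode="full")[:N]

    w = np.zeros(L)
    clean = np.zeros(N)
    for n in range(L, N):
        u = rfi_ref[n-L+1:n+1][::-1]                 # reference tap vector
        e = primary[n] - w @ u                       # canceller output = residual
        w += mu * e * u                              # LMS weight update
        clean[n] = e                                 # ideally ~ astro[n]
    ```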

  6. Small RNA Library Preparation Method for Next-Generation Sequencing Using Chemical Modifications to Prevent Adapter Dimer Formation.

    PubMed

    Shore, Sabrina; Henderson, Jordana M; Lebedev, Alexandre; Salcedo, Michelle P; Zon, Gerald; McCaffrey, Anton P; Paul, Natasha; Hogrefe, Richard I

    2016-01-01

    For most sample types, the automation of RNA and DNA sample preparation workflows enables high throughput next-generation sequencing (NGS) library preparation. Greater adoption of small RNA (sRNA) sequencing has been hindered by high sample input requirements and inherent ligation side products formed during library preparation. These side products, known as adapter dimer, are very similar in size to the tagged library. Most sRNA library preparation strategies thus employ a gel purification step to isolate tagged library from adapter dimer contaminants. At very low sample inputs, adapter dimer side products dominate the reaction and limit the sensitivity of this technique. Here we address the need for improved specificity of sRNA library preparation workflows with a novel library preparation approach that uses modified adapters to suppress adapter dimer formation. This workflow allows for lower sample inputs and elimination of the gel purification step, which in turn allows for an automatable sRNA library preparation protocol.

  7. Mobile membrane introduction tandem mass spectrometry for on-the-fly measurements and adaptive sampling of VOCs around oil and gas projects in Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Krogh, E.; Gill, C.; Bell, R.; Davey, N.; Martinsen, M.; Thompson, A.; Simpson, I. J.; Blake, D. R.

    2012-12-01

    The release of hydrocarbons into the environment can have significant environmental and economic consequences. The movement of smaller, more portable mass spectrometers into the field can provide spatially and temporally resolved information for rapid detection, adaptive sampling and decision support. We have deployed a mobile platform membrane introduction mass spectrometer (MIMS) for the in-field simultaneous measurement of volatile and semi-volatile organic compounds. In this work, we report instrument and data handling advances that produce geographically referenced data in real-time, together with preliminary data in which these improvements have been combined with high precision ultra-trace VOC analysis to adaptively sample air plumes near oil and gas operations in Alberta, Canada. We have modified a commercially available ion-trap mass spectrometer (Griffin ICX 400) with an in-house temperature controlled capillary hollow fibre polydimethylsiloxane (PDMS) polymer membrane interface and in-line permeation tube flow cell for a continuously infused internal standard. The system is powered by 24 VDC for remote operations in a moving vehicle. Software modifications include the ability to run continuous, interlaced tandem mass spectrometry (MS/MS) experiments for multiple contaminants/internal standards. All data are time and location stamped with on-board GPS and meteorological data to facilitate spatial and temporal data mapping. Tandem MS/MS scans were employed to simultaneously monitor ten volatile and semi-volatile analytes, including benzene, toluene, ethylbenzene and xylene (BTEX), reduced sulfur compounds, halogenated organics and naphthalene. Quantification was achieved by calibrating against a continuously infused deuterated internal standard (toluene-d8). Time referenced MS/MS data were correlated with positional data and processed using Labview and Matlab to produce calibrated, geographical Google Earth data-visualizations that enable adaptive sampling protocols.

  8. Mouse EEG spike detection based on the adapted continuous wavelet transform

    NASA Astrophysics Data System (ADS)

    Tieng, Quang M.; Kharatishvili, Irina; Chen, Min; Reutens, David C.

    2016-04-01

    Objective. Electroencephalography (EEG) is an important tool in the diagnosis of epilepsy. Interictal spikes on EEG are used to monitor the development of epilepsy and the effects of drug therapy. EEG recordings are generally long and the data voluminous. Thus developing a sensitive and reliable automated algorithm for analyzing EEG data is necessary. Approach. A new algorithm for detecting and classifying interictal spikes in mouse EEG recordings is proposed, based on the adapted continuous wavelet transform (CWT). The construction of the adapted mother wavelet is founded on a template obtained from a sample comprising the first few minutes of an EEG data set. Main Result. The algorithm was tested with EEG data from a mouse model of epilepsy and experimental results showed that the algorithm could distinguish EEG spikes from other transient waveforms with a high degree of sensitivity and specificity. Significance. Differing from existing approaches, the proposed approach combines wavelet denoising, to isolate transient signals, with adapted CWT-based template matching, to detect true interictal spikes. Using the adapted wavelet constructed from a predefined template, the adapted CWT is calculated on small EEG segments to fit dynamical changes in the EEG recording.
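
    A hedged sketch of template-adapted wavelet matching in the spirit of the abstract: a mother wavelet is formed from an early-recording template (zero mean, unit energy), correlated with the EEG at a few dilations, and detections are taken where the strongest response exceeds a robust threshold. The paper's denoising and classification stages are omitted, and all parameters are assumptions.

    ```python
    # Template-adapted wavelet matching for spike detection (sketch only).
    import numpy as np
    from scipy.signal import fftconvolve, resample

    def adapted_cwt_detect(eeg, template, scales=(0.75, 1.0, 1.5), thresh=4.0):
        t = template - template.mean()
        t /= np.sqrt((t ** 2).sum())          # zero mean, unit energy
        response = np.zeros(len(eeg))
        for s in scales:
            w = resample(t, max(4, int(round(len(t) * s))))   # dilate template
            w -= w.mean()
            w /= np.sqrt((w ** 2).sum())
            r = np.abs(fftconvolve(eeg, w[::-1], mode="same"))
            response = np.maximum(response, r)   # best response over scales
        sigma = np.median(np.abs(response)) / 0.6745   # robust noise scale
        return np.flatnonzero(response > thresh * sigma)   # candidate spike indices
    ```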

  9. A neural algorithm for the non-uniform and adaptive sampling of biomedical data.

    PubMed

    Mesin, Luca

    2016-04-01

    Body sensors are finding increasing applications in self-monitoring for health care and in the remote surveillance of sensitive people. The physiological data to be sampled can be non-stationary, with bursts of high amplitude and frequency content providing most of the information. Such data could be sampled efficiently with a non-uniform schedule that increases the sampling rate only during activity bursts. A real-time, adaptive algorithm is proposed to select the sampling rate, in order to reduce the number of measured samples while still recording the main information. The algorithm is based on a neural network which predicts the subsequent samples and their uncertainties, requiring a measurement only when the risk of the prediction is larger than a selectable threshold. Four examples of application to biomedical data are discussed: electromyogram, electrocardiogram, electroencephalogram, and body acceleration. Sampling rates are reduced below the Nyquist limit while still preserving an accurate representation of the data and of their power spectral densities (PSD). For example, sampling at 60% of the Nyquist frequency, the percentage average rectified errors in estimating the signals are on the order of 10% and the PSD is represented fairly accurately up to the highest frequencies. The method outperforms both uniform sampling and compressive sensing applied to the same data. The discussed method makes it possible to go beyond the Nyquist limit while preserving the information content of non-stationary biomedical signals. It could find applications in body sensor networks to lower the number of wireless communications (saving sensor power) and to reduce memory occupation. Copyright © 2016 Elsevier Ltd. All rights reserved.
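
    A schematic version of the sample-only-when-risky idea, with a recursive-least-squares linear predictor standing in for the paper's neural network: the next sample is predicted together with a running error variance, and a real measurement is taken only when the predicted risk exceeds a threshold.

    ```python
    # Risk-driven non-uniform sampling with an RLS predictor (sketch).
    import numpy as np

    def adaptive_sampler(signal, order=4, risk_thresh=3.0, lam=0.99):
        w = np.zeros(order)
        P = np.eye(order) * 100.0              # RLS inverse correlation matrix
        err_var = 1.0                          # running prediction-error variance
        buf = list(signal[:order])             # first samples are always measured
        measured = list(range(order))
        for n in range(order, len(signal)):
            u = np.array(buf[-order:][::-1])
            pred = w @ u
            if np.sqrt(err_var) > risk_thresh:     # too uncertain: measure
                x = signal[n]
                measured.append(n)
                e = x - pred
                k = P @ u / (lam + u @ P @ u)      # RLS update on real measurements
                w += k * e
                P = (P - np.outer(k, u @ P)) / lam
                err_var = lam * err_var + (1 - lam) * e ** 2
            else:                                  # confident: use the prediction
                x = pred
                err_var /= lam                     # uncertainty grows while coasting
            buf.append(x)
        return measured                            # indices actually sampled
    ```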

  10. An adaptive multi-moment FVM approach for incompressible flows

    NASA Astrophysics Data System (ADS)

    Liu, Cheng; Hu, Changhong

    2018-04-01

    In this study, a multi-moment finite volume method (FVM) based on block-structured adaptive Cartesian mesh is proposed for simulating incompressible flows. A conservative interpolation scheme following the idea of the constrained interpolation profile (CIP) method is proposed for the prolongation operation of the newly created mesh. A sharp immersed boundary (IB) method is used to model the immersed rigid body. A moving least squares (MLS) interpolation approach is applied for reconstruction of the velocity field around the solid surface. An efficient method for discretization of Laplacian operators on adaptive meshes is proposed. Numerical simulations on several test cases are carried out for validation of the proposed method. For the case of viscous flow past an impulsively started cylinder (Re = 3000, 9500), the computed surface vorticity coincides with the result of the body-fitted method. For the case of a fast pitching NACA 0015 airfoil at moderate Reynolds numbers (Re = 10000, 45000), the predicted drag coefficient (CD) and lift coefficient (CL) agree well with other numerical or experimental results. For 2D and 3D simulations of viscous flow past a pitching plate with prescribed motions (Re = 5000, 40000), the predicted CD, CL and CM (moment coefficient) are in good agreement with those obtained by other numerical methods.

  11. Virtual-system-coupled adaptive umbrella sampling to compute free-energy landscape for flexible molecular docking.

    PubMed

    Higo, Junichi; Dasgupta, Bhaskar; Mashimo, Tadaaki; Kasahara, Kota; Fukunishi, Yoshifumi; Nakamura, Haruki

    2015-07-30

    A novel enhanced conformational sampling method, virtual-system-coupled adaptive umbrella sampling (V-AUS), was proposed to compute the 300-K free-energy landscape for flexible molecular docking, where a virtual degree of freedom was introduced to control the sampling. This degree of freedom interacts with the biomolecular system. V-AUS was applied to complex formation of two disordered amyloid-β (Aβ30-35) peptides in a periodic box filled with an explicit solvent. An interpeptide distance was defined as the reaction coordinate, along which sampling was enhanced. A uniform conformational distribution was obtained covering a wide interpeptide distance range, from the bound to unbound states. The 300-K free-energy landscape was characterized by thermodynamically stable basins of antiparallel and parallel β-sheet complexes and some other complex forms. Helices were frequently observed when the two peptides were in loose contact or fluctuated freely without interpeptide contacts. We observed that V-AUS converged to a uniform distribution more effectively than conventional AUS sampling did. © 2015 Wiley Periodicals, Inc.

  12. An Adaptive Nonlinear Aircraft Maneuvering Envelope Estimation Approach for Online Applications

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Lombaerts, Thomas Jan; Acosta, Diana; Wheeler, Kevin; Kaneshige, John

    2014-01-01

    A nonlinear aircraft model is presented and used to develop an overall unified robust and adaptive approach to passive trim and maneuverability envelope estimation with uncertainty quantification. The concept of time scale separation makes this method suitable for the online characterization of altered safe maneuvering limitations after impairment. The results can be used to provide pilot feedback and/or be combined with flight planning, trajectory generation, and guidance algorithms to help maintain safe aircraft operations in both nominal and off-nominal scenarios.

  13. Characterization of GM events by insert knowledge adapted re-sequencing approaches

    PubMed Central

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-01-01

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well. PMID:24088728

  14. Characterization of GM events by insert knowledge adapted re-sequencing approaches.

    PubMed

    Yang, Litao; Wang, Congmao; Holst-Jensen, Arne; Morisset, Dany; Lin, Yongjun; Zhang, Dabing

    2013-10-03

    Detection methods and data from the molecular characterization of genetically modified (GM) events are needed by stakeholders such as public risk assessors and regulators. Generally, the molecular characteristics of GM events are incompletely revealed by current approaches, which are biased towards detecting transformation-vector-derived sequences. GM events are classified based on available knowledge of the sequences of vectors and inserts (insert knowledge). Herein we present three insert knowledge-adapted approaches for the characterization of GM events (TT51-1 and T1c-19 rice as examples) based on paired-end re-sequencing, with the advantages of comprehensiveness, accuracy, and automation. The comprehensive molecular characteristics of the two rice events were revealed, with additional unintended insertions, compared with the results from PCR and Southern blotting. Comprehensive transgene characterization of TT51-1 and T1c-19 is shown to be independent of a priori knowledge of the insert and vector sequences when employing the developed approaches. This provides an opportunity to identify and characterize unknown GM events as well.

  15. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    PubMed

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to two concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
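
    A simplified, single-phase illustration of estimating a tissue-to-plasma AUC ratio with uncertainty from one-sample-per-subject data (the paper's 2-phase algorithm is not reproduced here): at each time point one observed subject is drawn at random, AUCs are formed by the trapezoidal rule, and the ratio distribution is accumulated over replications.

    ```python
    # Random-sampling AUC ratio from sparse paired data (illustrative only).
    import numpy as np

    def auc_ratio_samples(times, tissue_by_t, plasma_by_t, n_rep=2000, rng=None):
        """tissue_by_t / plasma_by_t: dict time -> array of single-subject values."""
        rng = np.random.default_rng(rng)
        times = np.asarray(times, dtype=float)

        def auc(c):                      # trapezoidal rule
            return np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(times))

        ratios = np.empty(n_rep)
        for r in range(n_rep):
            ct = np.array([rng.choice(tissue_by_t[t]) for t in times])
            cp = np.array([rng.choice(plasma_by_t[t]) for t in times])
            ratios[r] = auc(ct) / auc(cp)
        return ratios    # summarize with, e.g., median and a percentile interval
    ```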

  16. Prospective Mathematics Teachers' Perceptions on and Adaptation of Student-Centred Approach to Teaching

    ERIC Educational Resources Information Center

    Osmanoglu, Aslihan; Dincer, Emrah Oguzhan

    2018-01-01

    The aim of this study was to investigate prospective secondary mathematics teachers' perceptions on and adaptation of the student-centred approach to teaching. The study was conducted with 58 prospective secondary mathematics teachers who were graduates of mathematics departments in different universities' Science and Literature faculties.…

  17. Online participation in climate change adaptation: A case study of agricultural adaptation measures in Northern Italy.

    PubMed

    Bojovic, Dragana; Bonzanigo, Laura; Giupponi, Carlo; Maziotis, Alexandros

    2015-07-01

    The new EU strategy on adaptation to climate change suggests flexible and participatory approaches. Face-to-face contact, although it involves time-consuming procedures with a limited audience, has often been considered the most effective participatory approach. In recent years, however, there has been an increase in the visibility of different citizens' initiatives in the online world, which strengthens the possibility of greater citizen agency. This paper investigates whether the Internet can ensure efficient public participation with meaningful engagement in climate change adaptation. In elucidating issues regarding climate change adaptation, we developed an eParticipation framework to explore the adaptation capacity of agriculture to climate change in Northern Italy. Farmers were mobilised using a pre-existing online network. First, they took part in an online questionnaire revealing their perceptions of and reactions to the impacts of ongoing changes in agriculture. We used these results to suggest a portfolio of policy measures and to set evaluation criteria. Farmers then evaluated these policy options using a multi-criteria analysis tool with a simple user-friendly interface. Our results showed that eParticipation is efficient: it supports rapid data collection while involving a high number of participants. Moreover, we demonstrated that the digital divide is decreasingly an obstacle to using online spaces for public engagement. This research does not present eParticipation as a panacea. Rather, eParticipation was implemented alongside well-established participatory approaches to both validate the results and, consequently, communicate meaningful messages on local agricultural adaptation practices to regional decision-makers. Feedback from the regional decision-makers showed their interest in using eParticipation to improve communication with farmers in the future. We expect that, with further Internet proliferation, eParticipation may allow the inclusion of

  18. Where do adaptive shifts occur during invasion A multidisciplinary approach to unravel cold adaptation in a tropical ant species invading the Mediterranean zone

    USDA-ARS?s Scientific Manuscript database

    Although evolution is now recognized as improving the invasive success of populations, where and when key adaptation event(s) occur often remains unclear. Here we used a multidisciplinary approach to disentangle the eco-evolutionary scenario of invasion of a Mediterranean zone (i.e. Israel) by the t...

  19. An adaptive Kalman filter approach for cardiorespiratory signal extraction and fusion of non-contacting sensors

    PubMed Central

    2014-01-01

    Background Extracting cardiorespiratory signals from non-invasive and non-contacting sensor arrangements, i.e. magnetic induction sensors, is a challenging task. The respiratory and cardiac signals are mixed on top of a large and time-varying offset and are likely to be disturbed by measurement noise. Basic filtering techniques fail to extract relevant information for monitoring purposes. Methods We present a real-time filtering system based on an adaptive Kalman filter approach that separates signal offsets, respiratory and heart signals from three different sensor channels. It continuously estimates respiration and heart rates, which are fed back into the system model to enhance performance. Sensor and system noise covariance matrices are automatically adapted to the aimed application, thus improving the signal separation capabilities. We apply the filtering to two different subjects with different heart rates and sensor properties and compare the results to the non-adaptive version of the same Kalman filter. Also, the performance, depending on the initialization of the filters, is analyzed using three different configurations ranging from best to worst case. Results Extracted data are compared with reference heart rates derived from a standard pulse-photoplethysmographic sensor and respiration rates from a flowmeter. In the worst case for one of the subjects the adaptive filter obtains mean errors (standard deviations) of -0.2 min⁻¹ (0.3 min⁻¹) and -0.7 bpm (1.7 bpm) (compared to -0.2 min⁻¹ (0.4 min⁻¹) and 42.0 bpm (6.1 bpm) for the non-adaptive filter) for respiration and heart rate, respectively. In bad conditions the heart rate is only correctly measurable when the Kalman matrices are adapted to the target sensor signals. Also, the reduced mean error between the extracted offset and the raw sensor signal shows that adapting the Kalman filter continuously improves the ability to separate the desired signals from the raw sensor data. The average
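
    A heavily reduced sketch of the adaptive ingredient: a Kalman filter whose measurement-noise covariance is re-estimated online from the innovation sequence. The paper's multi-channel model of offset, respiratory, and cardiac components with rate feedback is collapsed here to a scalar random-walk state purely for clarity.

    ```python
    # Scalar Kalman filter with innovation-based adaptation of R (sketch).
    import numpy as np

    def adaptive_kalman(z, q=1e-4, r0=1.0, alpha=0.98):
        x, p, r = z[0], 1.0, r0
        out = np.empty(len(z))
        for k, zk in enumerate(z):
            p = p + q                      # predict (random-walk state)
            nu = zk - x                    # innovation
            s = p + r                      # innovation variance under current R
            g = p / s                      # Kalman gain
            x = x + g * nu                 # update state
            p = (1 - g) * p
            # Adapt R to match the running innovation power: E[nu^2] ~ P + R.
            r = alpha * r + (1 - alpha) * max(nu * nu - p, 1e-8)
            out[k] = x
        return out
    ```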

  20. An adaptive interpolation scheme for molecular potential energy surfaces

    NASA Astrophysics Data System (ADS)

    Kowalewski, Markus; Larsson, Elisabeth; Heryudono, Alfa

    2016-08-01

    The calculation of potential energy surfaces for quantum dynamics can be a time-consuming task, especially when a high level of theory for the electronic structure calculation is required. We propose an adaptive interpolation algorithm based on polyharmonic splines combined with a partition of unity approach. Adaptive node refinement greatly reduces the number of sample points by employing a local error estimate. The algorithm and its scaling behavior are evaluated for a model function in 2, 3, and 4 dimensions. The developed algorithm allows for a more rapid and reliable interpolation of a potential energy surface within a given accuracy compared to the non-adaptive version.
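
    A sketch of adaptive node refinement for interpolating an expensive surface, assuming SciPy's RBFInterpolator with a polyharmonic (thin-plate spline) kernel; the paper's partition-of-unity localization is omitted, expensive_pes is a placeholder for an electronic-structure calculation, and the local error indicator used here (disagreement between two kernels) is an illustrative assumption.

    ```python
    # Greedy adaptive node refinement with polyharmonic RBF interpolation.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def expensive_pes(x):                     # placeholder model function
        return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

    rng = np.random.default_rng(0)
    nodes = rng.uniform(-1, 1, (20, 2))       # initial coarse node set
    values = expensive_pes(nodes)
    for _ in range(15):                       # refinement loop
        interp = RBFInterpolator(nodes, values, kernel="thin_plate_spline")
        rough = RBFInterpolator(nodes, values, kernel="linear")
        cand = rng.uniform(-1, 1, (400, 2))   # candidate points
        err = np.abs(interp(cand) - rough(cand))   # cheap local error indicator
        worst = cand[np.argmax(err)][None, :]      # refine where estimates disagree
        nodes = np.vstack([nodes, worst])
        values = np.append(values, expensive_pes(worst))
    ```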

  1. A retrospective cross-sectional quantitative molecular approach in biological samples from patients with syphilis.

    PubMed

    Pinto, Miguel; Antelo, Minia; Ferreira, Rita; Azevedo, Jacinta; Santo, Irene; Borrego, Maria José; Gomes, João Paulo

    2017-03-01

    Syphilis is the sexually transmitted disease caused by Treponema pallidum, a pathogen highly adapted to the human host. As a multistage disease, syphilis presents distinct clinical manifestations that pose different implications for diagnosis. Nevertheless, the inherent factors leading to diverse disease progressions are still unknown. We aimed to assess the association between treponemal loads and dissimilar disease outcomes, to better understand syphilis. We retrospectively analyzed 309 DNA samples from distinct anatomic sites associated with particular syphilis manifestations. All samples had previously tested positive by a PCR-based diagnostic kit. An absolute quantitative real-time PCR procedure was used to precisely quantify the number of treponemal and human cells to determine T. pallidum loads in each sample. In general, lesion exudates presented the highest T. pallidum loads in contrast with blood-derived samples. Within the latter, a higher dispersion of T. pallidum quantities was observed for secondary syphilis. T. pallidum was detected in substantial amounts in 37 samples from seronegative individuals and in 13 cases considered as syphilis-treated. No association was found between treponemal loads and serological results or HIV status. This study suggests a scenario where syphilis may be characterized by: i) heterogeneous and high treponemal loads in primary syphilis, regardless of the anatomic site, reflecting dissimilar durations of chancre development and resolution; ii) high dispersion of bacterial concentrations in secondary syphilis, potentially suggesting replication capability of T. pallidum while in the bloodstream; and iii) bacterial evasiveness, either to the host immune system or antibiotic treatment, while remaining hidden in privileged niches. This work highlights the importance of using molecular approaches to study uncultivable human pathogens, such as T. pallidum, in the infection process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Knowledge-based nonuniform sampling in multidimensional NMR.

    PubMed

    Schuyler, Adam D; Maciejewski, Mark W; Arthanari, Haribabu; Hoch, Jeffrey C

    2011-07-01

    The full resolution afforded by high-field magnets is rarely realized in the indirect dimensions of multidimensional NMR experiments because of the time cost of uniformly sampling to long evolution times. Emerging methods utilizing nonuniform sampling (NUS) enable high resolution along indirect dimensions by sampling long evolution times without sampling at every multiple of the Nyquist sampling interval. While the earliest NUS approaches matched the decay of sampling density to the decay of the signal envelope, recent approaches based on coupled evolution times attempt to optimize sampling by choosing projection angles that increase the likelihood of resolving closely-spaced resonances. These approaches employ knowledge about chemical shifts to predict optimal projection angles, whereas prior applications of tailored sampling employed only knowledge of the decay rate. In this work we adapt the matched filter approach as a general strategy for knowledge-based nonuniform sampling that can exploit prior knowledge about chemical shifts and is not restricted to sampling projections. Based on several measures of performance, we find that exponentially weighted random sampling (envelope matched sampling) performs better than shift-based sampling (beat matched sampling). While shift-based sampling can yield small advantages in sensitivity, the gains are generally outweighed by diminished robustness. Our observation that more robust sampling schemes are only slightly less sensitive than schemes highly optimized using prior knowledge about chemical shifts has broad implications for any multidimensional NMR study employing NUS. The results derived from simulated data are demonstrated with a sample application to PfPMT, the phosphoethanolamine methyltransferase of the human malaria parasite Plasmodium falciparum.
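
    A small sketch of envelope-matched (exponentially weighted) random NUS schedule generation as characterized above: the sampling density along the indirect dimension decays like the expected signal envelope exp(-t/T2). Grid size, sample budget, and decay constant are illustrative.

    ```python
    # Envelope-matched random nonuniform sampling schedule (sketch).
    import numpy as np

    def envelope_matched_schedule(n_grid, n_samples, t2_points, rng=None):
        rng = np.random.default_rng(rng)
        t = np.arange(n_grid)
        p = np.exp(-t / t2_points)          # matched-filter weighting
        p /= p.sum()
        # Sample evolution times without replacement, always keeping t = 0.
        picks = rng.choice(n_grid, size=n_samples, replace=False, p=p)
        return np.union1d(picks, [0])

    schedule = envelope_matched_schedule(n_grid=256, n_samples=64, t2_points=80.0)
    ```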

  3. Adaptation of ICT Integration Approach Scale to Kosovo Culture: A Study of Validity and Reliability Analysis

    ERIC Educational Resources Information Center

    Kervan, Serdan; Tezci, Erdogan

    2018-01-01

    The aim of this study is to adapt ICT integration approach scale to Kosovo culture, which measures ICT integration approaches of university faculty to teaching and learning process. The scale developed in Turkish has been translated into Albanian to provide linguistic equivalence. The survey was given to a total of 303 instructors [161 (53.1%)…

  4. Developing Coastal Adaptation to Climate Change in the New York City Infrastructure-Shed: Process, Approach, Tools, and Strategies

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia; Solecki, William D.; Blake, Reginald; Bowman, Malcolm; Faris, Craig; Gornitz, Vivien; Horton, Radley; Jacob, Klaus; LeBlanc, Alice; Leichenko, Robin; hide

    2010-01-01

    While current rates of sea level rise and associated coastal flooding in the New York City region appear to be manageable by stakeholders responsible for communications, energy, transportation, and water infrastructure, projections for sea level rise and associated flooding in the future, especially those associated with rapid ice melt of the Greenland and West Antarctic Ice Sheets, may be beyond the range of current capacity because an extreme event might cause flooding and inundation beyond the planning and preparedness regimes. This paper describes the comprehensive process, approach, and tools developed by the New York City Panel on Climate Change (NPCC) in conjunction with the region's stakeholders who manage its critical infrastructure, much of which lies near the coast. It presents the adaptation approach and the sea-level rise and storm projections related to coastal risks developed through the stakeholder process. Climate change adaptation planning in New York City is characterized by a multi-jurisdictional stakeholder-scientist process, state-of-the-art scientific projections and mapping, and development of adaptation strategies based on a risk-management approach.

  5. A Cluster-Based Dual-Adaptive Topology Control Approach in Wireless Sensor Networks

    PubMed Central

    Gui, Jinsong; Zhou, Kai; Xiong, Naixue

    2016-01-01

    Multi-Input Multi-Output (MIMO) can improve wireless network performance. Sensors are usually single-antenna devices due to the high hardware complexity and cost, so several sensors are used to form a virtual MIMO array, which is a desirable approach to efficiently take advantage of MIMO gains. Also, in large Wireless Sensor Networks (WSNs), clustering, an effective topology control approach, can improve network scalability. The existing virtual MIMO-based clustering schemes either do not fully explore the benefits of MIMO or do not adaptively determine the clustering ranges. The clustering mechanism also needs to be further improved to extend the lifetime of the cluster structure. In this paper, we propose an improved clustering scheme for virtual MIMO-based topology construction (ICV-MIMO), which can adaptively determine not only the inter-cluster transmission modes but also the clustering ranges. Through the rational division of cluster head functions and the optimization of the cluster head selection criteria and information exchange process, the ICV-MIMO scheme effectively reduces the network energy consumption and improves the lifetime of the cluster structure when compared with the existing typical virtual MIMO-based scheme. Moreover, the message overhead and time complexity remain in the same order of magnitude. PMID:27681731

  6. High-throughput adaptive sampling for whole-slide histopathology image analysis (HASHI) via convolutional neural networks: Application to invasive breast cancer detection.

    PubMed

    Cruz-Roa, Angel; Gilmore, Hannah; Basavanhally, Ajay; Feldman, Michael; Ganesan, Shridar; Shih, Natalie; Tomaszewski, John; Madabhushi, Anant; González, Fabio

    2018-01-01

    Precise detection of invasive cancer on whole-slide images (WSI) is a critical first step in the digital pathology tasks of diagnosis and grading. Convolutional neural networks (CNN) are the most popular representation learning method for computer vision tasks and have been successfully applied in digital pathology, including tumor and mitosis detection. However, CNNs are typically only tenable with relatively small image sizes (200 × 200 pixels). Only recently have fully convolutional networks (FCN) become able to deal with larger image sizes (500 × 500 pixels) for semantic segmentation. Hence, the direct application of CNNs to WSI is not computationally feasible because, for a WSI, a CNN would require billions or trillions of parameters. To alleviate this issue, this paper presents a novel method, High-throughput Adaptive Sampling for whole-slide Histopathology Image analysis (HASHI), which involves: i) a new efficient adaptive sampling method based on probability gradient and quasi-Monte Carlo sampling, and ii) a powerful representation learning classifier based on CNNs. We applied HASHI to automated detection of invasive breast cancer on WSI. HASHI was trained and validated using three different data cohorts involving nearly 500 cases and then independently tested on 195 studies from The Cancer Genome Atlas. The results show that (1) the adaptive sampling method is an effective strategy to deal with WSI without compromising prediction accuracy, obtaining results comparable to dense sampling (∼6 million samples in 24 hours) with far fewer samples (∼2,000 samples in 1 minute), and (2) on an independent test dataset, HASHI is effective and robust to data from multiple sites, scanners, and platforms, achieving an average Dice coefficient of 76%.
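
    A conceptual sketch of the adaptive-sampling loop described above, with a stand-in classify_patch in place of the paper's CNN: quasi-Monte Carlo (Sobol) seeds, interpolation of the sampled probabilities, then new samples where the interpolated field changes fastest.

    ```python
    # HASHI-style adaptive sampling over a unit-square "slide" (sketch only).
    import numpy as np
    from scipy.stats import qmc
    from scipy.interpolate import griddata

    def classify_patch(xy):       # placeholder for a CNN invoked on WSI patches
        return 1.0 / (1.0 + np.exp(-10 * (xy[:, 0] - 0.5)))   # fake tumor boundary

    pts = qmc.Sobol(d=2, scramble=True).random(64)             # initial samples
    probs = classify_patch(pts)
    grid = np.mgrid[0:1:200j, 0:1:200j].reshape(2, -1).T
    for _ in range(5):
        field = griddata(pts, probs, grid, method="linear", fill_value=0.5)
        gy, gx = np.gradient(field.reshape(200, 200))
        mag = np.hypot(gx, gy).ravel()
        new = grid[np.argsort(mag)[-32:]]                      # steepest regions
        pts = np.vstack([pts, new])
        probs = np.concatenate([probs, classify_patch(new)])
    # `field` now approximates the dense probability map at a fraction of the cost.
    ```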

  7. High-throughput adaptive sampling for whole-slide histopathology image analysis (HASHI) via convolutional neural networks: Application to invasive breast cancer detection

    PubMed Central

    Gilmore, Hannah; Basavanhally, Ajay; Feldman, Michael; Ganesan, Shridar; Shih, Natalie; Tomaszewski, John; Madabhushi, Anant; González, Fabio

    2018-01-01

    Precise detection of invasive cancer on whole-slide images (WSI) is a critical first step in the digital pathology tasks of diagnosis and grading. Convolutional neural networks (CNN) are the most popular representation learning method for computer vision tasks and have been successfully applied in digital pathology, including tumor and mitosis detection. However, CNNs are typically only tenable with relatively small image sizes (200 × 200 pixels). Only recently have fully convolutional networks (FCN) become able to deal with larger image sizes (500 × 500 pixels) for semantic segmentation. Hence, the direct application of CNNs to WSI is not computationally feasible because, for a WSI, a CNN would require billions or trillions of parameters. To alleviate this issue, this paper presents a novel method, High-throughput Adaptive Sampling for whole-slide Histopathology Image analysis (HASHI), which involves: i) a new efficient adaptive sampling method based on probability gradient and quasi-Monte Carlo sampling, and ii) a powerful representation learning classifier based on CNNs. We applied HASHI to automated detection of invasive breast cancer on WSI. HASHI was trained and validated using three different data cohorts involving nearly 500 cases and then independently tested on 195 studies from The Cancer Genome Atlas. The results show that (1) the adaptive sampling method is an effective strategy to deal with WSI without compromising prediction accuracy, obtaining results comparable to dense sampling (∼6 million samples in 24 hours) with far fewer samples (∼2,000 samples in 1 minute), and (2) on an independent test dataset, HASHI is effective and robust to data from multiple sites, scanners, and platforms, achieving an average Dice coefficient of 76%. PMID:29795581

  8. Convergence of fractional adaptive systems using gradient approach.

    PubMed

    Gallegos, Javier A; Duarte-Mermoud, Manuel A

    2017-07-01

    Conditions for boundedness and convergence of the output error and the parameter error for various Caputo's fractional order adaptive schemes based on the steepest descent method are derived in this paper. To this aim, the concept of sufficiently exciting signals is introduced, characterized and related to the concept of persistently exciting signals used in the integer order case. An application is designed in adaptive indirect control of integer order systems using fractional equations to adjust parameters. This application is illustrated for a pole placement adaptive problem. Advantages of using fractional adjustment in control adaptive schemes are experimentally obtained. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Understanding adolescent type 1 diabetes self-management as an adaptive process: A grounded theory approach.

    PubMed

    Chilton, Roy; Pires-Yfantouda, Renata

    2015-01-01

    To develop a conceptual understanding of the process of adapting to the self-management of type 1 diabetes during adolescence. Participants were recruited from a National Health Service paediatric diabetes service within the south-west of England which runs six countywide diabetes clinics. Thirteen interviews were conducted using a social constructivist grounded theory approach. The findings illustrate how self-management can be understood in terms of a continuum-based framework, ranging from difficulties with, to successful, self-management. Adaptation within the continuum can further be understood in terms of specific transitional phases and process mechanisms, providing further depth to individuals' experiences of adaptation. This investigation provides a conceptual understanding of the complex issues adolescents encounter while adapting to and integrating a diabetes self-management regime into their lives. It provides an invaluable framework for exploring psychological mechanisms and contextualising them within a self-management continuum. Implications for healthcare professionals are discussed, and further research is proposed to examine whether the model could be applicable to other chronic illnesses.

  10. In vitro adaptation of Plasmodium falciparum reveal variations in cultivability.

    PubMed

    White, John; Mascarenhas, Anjali; Pereira, Ligia; Dash, Rashmi; Walke, Jayashri T; Gawas, Pooja; Sharma, Ambika; Manoharan, Suresh Kumar; Guler, Jennifer L; Maki, Jennifer N; Kumar, Ashwani; Mahanta, Jagadish; Valecha, Neena; Dubhashi, Nagesh; Vaz, Marina; Gomes, Edwin; Chery, Laura; Rathod, Pradipsinh K

    2016-01-22

    Culture-adapted Plasmodium falciparum parasites can offer deeper understanding of geographic variations in drug resistance, pathogenesis and immune evasion. To help ground population-based calculations and inferences from culture-adapted parasites, the complete range of parasites from a study area must be well represented in any collection. To this end, standardized adaptation methods and determinants of successful in vitro adaptation were sought. Venous blood was collected from 33 P. falciparum-infected individuals at Goa Medical College and Hospital (Bambolim, Goa, India). Culture variables such as whole blood versus washed blood, heat-inactivated plasma versus Albumax, and different starting haematocrit levels were tested on fresh blood samples from patients. In vitro adaptation was considered successful when two four-fold or greater increases in parasitaemia were observed within, at most, 33 days of attempted culture. Subsequently, parasites from the same patients, which were originally cryopreserved following blood draw, were retested for adaptability for 45 days using identical host red blood cells (RBCs) and culture media. At a new endemic area research site, ~65% of tested patient samples, with varied patient history and clinical presentation, were successfully culture-adapted immediately after blood collection. Cultures set up at 1% haematocrit and 0.5% Albumax adapted most rapidly, but no single test condition was uniformly fatal to culture adaptation. Success was not limited by low patient parasitaemia nor by patient age. Some parasites emerged even after significant delays in sample processing and even after initiation of treatment with anti-malarials. When 'day 0' cryopreserved samples were retested in parallel many months later using identical host RBCs and media, speed to adaptation appeared to be an intrinsic property of the parasites collected from individual patients. Culture adaptation of P. falciparum in a field setting is formally shown to be

  11. An Adaptive Intelligent Integrated Lighting Control Approach for High-Performance Office Buildings

    NASA Astrophysics Data System (ADS)

    Karizi, Nasim

    An acute and crucial societal problem is the energy consumed in existing commercial buildings. There are 1.5 million commercial buildings in the U.S., with only about 3% being built each year. Hence, existing buildings need to be properly operated and maintained for several decades. Application of integrated centralized control systems in buildings could lead to more than 50% energy savings. This research work demonstrates an innovative adaptive integrated lighting control approach which could achieve significant energy savings and increase indoor comfort in high performance office buildings. In the first phase of the study, a predictive algorithm was developed and validated through experiments in an actual test room. The objective was to regulate daylight on a specified work plane by controlling the blind slat angles. Furthermore, a sensor-based integrated adaptive lighting controller was designed in Simulink which included an innovative sensor optimization approach based on a genetic algorithm to minimize the number of sensors and efficiently place them in the office. The controller was designed based on simple integral controllers. The objective of the developed control algorithm was to improve the illuminance situation in the office by controlling the daylight and electrical lighting. To evaluate the performance of the system, the controller was applied to the experimental office model in Lee et al.'s 1998 research study. The results of the developed control approach indicate a significant improvement in the lighting situation and monthly electrical energy savings of 1-23% and 50-78% in the office model, compared to two static strategies in which the blinds were left open or closed throughout the year, respectively.

  12. The Portuguese adaptation of the Gudjonsson Suggestibility Scale (GSS1) in a sample of inmates.

    PubMed

    Pires, Rute; Silva, Danilo R; Ferreira, Ana Sousa

    2014-01-01

    This paper comprises two studies which address the validity of the Portuguese adaptation of the Gudjonsson Suggestibility Scale, GSS1. In study 1, the means and standard deviations for the suggestibility results of a sample of Portuguese inmates (N=40, mean age=37.5 years, SD=8.1) were compared to those of a sample of Icelandic inmates (Gudjonsson, 1997; Gudjonsson & Sigurdsson, 1996). The Portuguese inmates' results were in line with the original results. In study 2, the means and standard deviations for the suggestibility results of the sample of Portuguese inmates were compared to those of a general Portuguese population sample (N=57, mean age=36.1 years, SD=12.7). The forensic sample obtained significantly higher scores on suggestibility measures than the general population sample. ANOVA confirmed that the increased suggestibility in the inmate sample was due to the limited memory capacity of this latter group. Given that the results of both studies 1 and 2 are in keeping with the author's original results (Gudjonsson, 1997), this may be regarded as a confirmation of the validity of the Portuguese GSS1. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Russian Loanword Adaptation in Persian; Optimal Approach

    ERIC Educational Resources Information Center

    Kambuziya, Aliye Kord Zafaranlu; Hashemi, Eftekhar Sadat

    2011-01-01

    In this paper we analyzed some of the phonological rules of Russian loanword adaptation in Persian from the viewpoint of Optimality Theory (OT) (Prince & Smolensky, 1993/2004). It is the first study of phonological processes in Russian loanword adaptation in Persian. After gathering about 50 current Russian loanwords, we selected some of them to analyze. We…

  14. Sample preparation combined with electroanalysis to improve simultaneous determination of antibiotics in animal derived food samples.

    PubMed

    da Silva, Wesley Pereira; de Oliveira, Luiz Henrique; Santos, André Luiz Dos; Ferreira, Valdir Souza; Trindade, Magno Aparecido Gonçalves

    2018-06-01

    A procedure based on liquid-liquid extraction (LLE) and phase separation using magnetically stirred salt-induced high-temperature liquid-liquid extraction (PS-MSSI-HT-LLE) was developed to extract and pre-concentrate ciprofloxacin (CIPRO) and enrofloxacin (ENRO) from animal food samples before electroanalysis. First, simple LLE was used to extract the fluoroquinolones (FQs) from animal food samples, in which dilution was performed to reduce interference effects to below a tolerable threshold. Then, adapted PS-MSSI-HT-LLE protocols allowed re-extraction and further pre-concentration of the target analytes in the diluted acid samples for simultaneous electrochemical quantification at low concentration levels. To improve peak separation in simultaneous detection, a baseline-corrected second-order derivative approach was applied. These approaches allowed quantification of the target FQs from animal food samples spiked at levels of 0.80 to 2.00 µmol L⁻¹ in chicken meat, with recovery values always higher than 80.5%, as well as in milk samples spiked at 4.00 µmol L⁻¹, with recovery values close to 70.0%. Copyright © 2018 Elsevier Ltd. All rights reserved.
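
    An illustrative way to realize a smoothed second-order derivative for resolving two overlapping voltammetric peaks; a Savitzky-Golay filter is a common choice, though the paper's exact derivative and baseline-correction procedure may differ, and the peak shapes below are synthetic.

    ```python
    # Second-order derivative sharpening of overlapping peaks (illustrative).
    import numpy as np
    from scipy.signal import savgol_filter

    E = np.linspace(0.8, 1.4, 600)                      # potential axis, V
    def peak(e0, w, h):                                 # synthetic Gaussian peak
        return h * np.exp(-((E - e0) / w) ** 2)

    i_total = peak(1.05, 0.05, 1.0) + peak(1.12, 0.05, 0.8)   # two analytes overlap
    d2 = savgol_filter(i_total, window_length=51, polyorder=3, deriv=2)
    # The minima of d2 are better separated than the raw current maxima,
    # so each analyte can be quantified from its own derivative peak.
    ```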

  15. A comparison of two sampling approaches for assessing the urban forest canopy cover from aerial photography.

    Treesearch

    Ucar Zennure; Pete Bettinger; Krista Merry; Jacek Siry; J.M. Bowker

    2016-01-01

    Two different sampling approaches for estimating urban tree canopy cover were applied to two medium-sized cities in the United States, in conjunction with two freely available remotely sensed imagery products. A random point-based sampling approach, which involved 1000 sample points, was compared against a plot/grid sampling (cluster sampling) approach that involved a...

  16. Adaptive Critic Neural Network-Based Terminal Area Energy Management and Approach and Landing Guidance

    NASA Technical Reports Server (NTRS)

    Grantham, Katie

    2003-01-01

    Reusable Launch Vehicles (RLVs) have different mission requirements than the Space Shuttle, which is used for benchmark guidance design. Therefore, alternative Terminal Area Energy Management (TAEM) and Approach and Landing (A/L) Guidance schemes can be examined in the interest of cost reduction. A neural network based solution for a finite horizon trajectory optimization problem is presented in this paper. In this approach the optimal trajectory of the vehicle is produced by adaptive critic based neural networks, which were trained off-line to maintain a gradual glideslope.

  17. Total variation approach for adaptive nonuniformity correction in focal-plane arrays.

    PubMed

    Vera, Esteban; Meza, Pablo; Torres, Sergio

    2011-01-15

    In this Letter we propose an adaptive scene-based nonuniformity correction method for fixed-pattern noise removal in imaging arrays. It is based on the minimization of the total variation of the estimated irradiance, and the resulting function is optimized by an isotropic total variation approach making use of an alternating minimization strategy. The proposed method provides enhanced results when applied to a diverse set of real IR imagery, accurately estimating the nonuniformity parameters of each detector in the focal-plane array at a fast convergence rate, while also producing fewer ghosting artifacts.
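
    A toy sketch of the core principle: per-detector offsets are estimated by gradient descent on the (smoothed, isotropic) total variation of the corrected frames. The paper's alternating-minimization scheme and gain estimation are omitted; this only illustrates the TV objective.

    ```python
    # Estimate fixed-pattern offsets b by minimizing sum_y TV(y - b) (sketch).
    import numpy as np

    def tv_grad(u, eps=1e-3):
        """Gradient of smoothed isotropic TV(u), i.e. -div(grad u / |grad u|)."""
        gx = np.diff(u, axis=1, append=u[:, -1:])     # forward differences
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx**2 + gy**2 + eps**2)
        px, py = gx / mag, gy / mag
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        return -div

    def estimate_offsets(frames, n_iter=100, step=0.1):
        b = np.zeros_like(frames[0])
        for _ in range(n_iter):
            g = np.mean([-tv_grad(y - b) for y in frames], axis=0)
            b -= step * g                  # descend on the TV of corrected frames
        return b                           # fixed-pattern offset estimate
    ```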

  18. Neural adaptation accounts for the dynamic resizing of peripersonal space: evidence from a psychophysical-computational approach.

    PubMed

    Noel, Jean-Paul; Blanke, Olaf; Magosso, Elisa; Serino, Andrea

    2018-06-01

    Interactions between the body and the environment occur within the peripersonal space (PPS), the space immediately surrounding the body. The PPS is encoded by multisensory (audio-tactile, visual-tactile) neurons that possess receptive fields (RFs) anchored on the body and restricted in depth. The extension in depth of PPS neurons' RFs has been documented to change dynamically as a function of the velocity of incoming stimuli, but the underlying neural mechanisms are still unknown. Here, by integrating a psychophysical approach with neural network modeling, we propose a mechanistic explanation behind this inherent dynamic property of PPS. We psychophysically mapped the size of participant's peri-face and peri-trunk space as a function of the velocity of task-irrelevant approaching auditory stimuli. Findings indicated that the peri-trunk space was larger than the peri-face space, and, importantly, as for the neurophysiological delineation of RFs, both of these representations enlarged as the velocity of incoming sound increased. We propose a neural network model to mechanistically interpret these findings: the network includes reciprocal connections between unisensory areas and higher order multisensory neurons, and it implements neural adaptation to persistent stimulation as a mechanism sensitive to stimulus velocity. The network was capable of replicating the behavioral observations of PPS size remapping and of relating behavioral proxies of PPS size to neurophysiological measures of multisensory neurons' RF size. We propose that a biologically plausible neural adaptation mechanism embedded within the network encoding for PPS can be responsible for the dynamic alterations in PPS size as a function of the velocity of incoming stimuli. NEW & NOTEWORTHY Interactions between body and environment occur within the peripersonal space (PPS). PPS neurons are highly dynamic, adapting online as a function of body-object interactions. The mechanistic underpinning PPS dynamic

  19. Evolution of adaptation mechanisms: Adaptation energy, stress, and oscillating death.

    PubMed

    Gorban, Alexander N; Tyukina, Tatiana A; Smirnova, Elena V; Pokidysheva, Lyudmila I

    2016-09-21

    In 1938, Selye proposed the notion of adaptation energy and published 'Experimental evidence supporting the conception of adaptation energy.' Adaptation of an animal to different factors appears as the spending of one resource. Adaptation energy is a hypothetical extensive quantity spent for adaptation. This term causes much debate when one takes it literally, as a physical quantity, i.e. a sort of energy. The controversial points of view impede the systematic use of the notion of adaptation energy despite experimental evidence. Nevertheless, the response to many harmful factors often has a general non-specific form, and we suggest that the mechanisms of physiological adaptation admit a very general and nonspecific description. We aim to demonstrate that Selye's adaptation energy is the cornerstone of the top-down approach to modelling of non-specific adaptation processes. We analyze Selye's axioms of adaptation energy together with Goldstone's modifications and propose a series of models for interpretation of these axioms. Adaptation energy is considered as an internal coordinate on the 'dominant path' in the model of adaptation. The phenomena of 'oscillating death' and 'oscillating remission' are predicted on the basis of the dynamical models of adaptation. Natural selection plays a key role in the evolution of mechanisms of physiological adaptation. We use the fitness optimization approach to study the distribution of resources for neutralization of harmful factors during adaptation to a multifactor environment, and analyze the optimal strategies for different systems of factors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Efficiency of Adaptive Temperature-Based Replica Exchange for Sampling Large-Scale Protein Conformational Transitions.

    PubMed

    Zhang, Weihong; Chen, Jianhan

    2013-06-11

    Temperature-based replica exchange (RE) is now considered a principal technique for enhanced sampling of protein conformations. It is also recognized that existence of sharp cooperative transitions (such as protein folding/unfolding) can lead to temperature exchange bottlenecks and significantly reduce the sampling efficiency. Here, we revisit two adaptive temperature-based RE protocols, namely, exchange equalization (EE) and current maximization (CM), that were previously examined using atomistic simulations (Lee and Olson, J. Chem. Phys. 2011, 134, 24111). Both protocols aim to overcome exchange bottlenecks by adaptively adjusting the simulation temperatures, either to achieve uniform exchange rates (in EE) or to maximize temperature diffusion (CM). By designing a realistic yet computationally tractable coarse-grained protein model, one can sample many reversible folding/unfolding transitions using conventional constant temperature molecular dynamics (MD), standard REMD, EE-REMD, and CM-REMD. This allows rigorous evaluation of the sampling efficiency, by directly comparing the rates of folding/unfolding transitions and convergence of various thermodynamic properties of interest. The results demonstrate that both EE and CM can indeed enhance temperature diffusion compared to standard RE, by ∼3- and over 10-fold, respectively. Surprisingly, the rates of reversible folding/unfolding transitions are similar in all three RE protocols. The convergence rates of several key thermodynamic properties, including the folding stability and various 1D and 2D free energy surfaces, are also similar. Therefore, the efficiency of RE protocols does not appear to be limited by temperature diffusion, but by the inherent rates of spontaneous large-scale conformational rearrangements. This is particularly true considering that virtually all RE simulations of proteins in practice involve exchange attempt frequencies (∼ps⁻¹) that are several orders of magnitude faster than the
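
    A minimal sketch of temperature replica exchange plus the exchange-equalization idea: neighbor swaps are accepted with the standard Metropolis criterion, and temperatures are then nudged so that observed acceptance rates even out across neighbor pairs. The temperature-update rule is an illustrative assumption, not the protocol's exact formula.

    ```python
    # Replica-exchange swap criterion and an exchange-equalization step (sketch).
    import numpy as np

    def swap_accept(E_i, E_j, T_i, T_j, rng):
        """Metropolis acceptance for swapping configurations (k_B = 1 units)."""
        delta = (1.0 / T_i - 1.0 / T_j) * (E_j - E_i)
        return rng.random() < min(1.0, np.exp(-delta))

    def equalize_temperatures(temps, acc_rates, gain=0.1):
        """Shrink gaps with low acceptance, widen gaps with high acceptance."""
        temps = temps.copy()
        target = acc_rates.mean()
        for k in range(len(acc_rates)):        # pair k is (temps[k], temps[k+1])
            gap = temps[k + 1] - temps[k]
            gap *= 1.0 + gain * (acc_rates[k] - target)
            temps[k + 1] = temps[k] + gap
        return temps
    ```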

  1. Development of an Assistance Environment for Tutors Based on a Co-Adaptive Design Approach

    ERIC Educational Resources Information Center

    Lavoue, Elise; George, Sebastien; Prevot, Patrick

    2012-01-01

    In this article, we present a co-adaptive design approach named TE-Cap (Tutoring Experience Capitalisation) that we applied for the development of an assistance environment for tutors. Since tasks assigned to tutors in educational contexts are not well defined, we are developing an environment which responds to needs which are not precisely…

  2. Three Authentic Curriculum-Integration Approaches to Bird Adaptations That Incorporate Technology and Thinking Skills

    ERIC Educational Resources Information Center

    Rule, Audrey C.; Barrera, Manuel T., III

    2008-01-01

    Integration of subject areas with technology and thinking skills is a way to help teachers cope with today's overloaded curriculum and to help students see the connectedness of different curriculum areas. This study compares three authentic approaches to teaching a science unit on bird adaptations for habitat that integrate thinking skills and…

  3. Identifiability and Performance Analysis of Output Over-sampling Approach to Direct Closed-loop Identification

    NASA Astrophysics Data System (ADS)

    Sun, Lianming; Sano, Akira

    An output over-sampling based closed-loop identification algorithm is investigated in this paper. Some intrinsic properties of the continuous stochastic noise and of the plant input and output under over-sampling are analyzed and used to demonstrate identifiability in the over-sampling approach and to evaluate its identification performance. Furthermore, the selection of plant model order, the asymptotic variance of the estimated parameters, and the asymptotic variance of the frequency response of the estimated model are also explored. The analysis shows that the over-sampling approach guarantees identifiability and greatly improves the performance of closed-loop identification.

  4. An adaptive toolbox approach to the route to expertise in sport.

    PubMed

    de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in its integration of these separate factors into a comprehensive heuristic description of expertise. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.

  5. An adaptive toolbox approach to the route to expertise in sport

    PubMed Central

    de Oliveira, Rita F.; Lobinger, Babett H.; Raab, Markus

    2014-01-01

    Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in its integration of these separate factors into a comprehensive heuristic description of expertise. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise. PMID:25071673

  6. An adaptive interpolation scheme for molecular potential energy surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kowalewski, Markus, E-mail: mkowalew@uci.edu; Larsson, Elisabeth; Heryudono, Alfa

    The calculation of potential energy surfaces for quantum dynamics can be a time-consuming task, especially when a high level of theory for the electronic structure calculation is required. We propose an adaptive interpolation algorithm based on polyharmonic splines combined with a partition of unity approach. The adaptive node refinement allows the number of sample points to be greatly reduced by employing a local error estimate. The algorithm and its scaling behavior are evaluated for a model function in 2, 3, and 4 dimensions. The developed algorithm allows for a more rapid and reliable interpolation of a potential energy surface within a given accuracy compared to the non-adaptive version.
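
    The record describes adaptive node refinement driven by a local error estimate. A one-dimensional sketch of that idea, using SciPy's polyharmonic RBFInterpolator with a leave-one-out error estimate; the refinement rule and tolerance here are illustrative assumptions, not the paper's algorithm:

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        def refine(f, nodes, tol=1e-3, max_iter=10):
            # Greedy adaptive refinement: a leave-one-out error estimate
            # flags interval midpoints that need a new (expensive) sample.
            x = np.sort(np.asarray(nodes, dtype=float))
            y = f(x)
            for _ in range(max_iter):
                interp = RBFInterpolator(x[:, None], y, kernel='cubic')
                mids = 0.5 * (x[:-1] + x[1:])
                err = np.empty_like(mids)
                for k, m in enumerate(mids):
                    j = int(np.argmin(np.abs(x - m)))
                    loo = RBFInterpolator(np.delete(x, j)[:, None],
                                          np.delete(y, j), kernel='cubic')
                    err[k] = abs(interp([[m]])[0] - loo([[m]])[0])
                new = mids[err > tol]
                if new.size == 0:
                    break
                x = np.sort(np.concatenate([x, new]))
                y = f(x)  # in practice only the new points would be evaluated
            return x, y

        # Example with a cheap, vectorized stand-in for an expensive surface:
        x, y = refine(np.cos, np.linspace(0.0, 6.0, 5))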

  7. System health monitoring using multiple-model adaptive estimation techniques

    NASA Astrophysics Data System (ADS)

    Sifford, Stanley Ryan

    Monitoring system health for fault detection and diagnosis by tracking system parameters concurrently with state estimates is approached using a new multiple-model adaptive estimation (MMAE) method. This novel method is called GRid-based Adaptive Parameter Estimation (GRAPE). GRAPE expands existing MMAE methods by using new techniques to sample the parameter space, building on the hypothesis that sample models can be applied and resampled without relying on a predefined set of models. GRAPE is initially implemented in a linear framework using Kalman filter models. A more generalized GRAPE formulation is presented using extended Kalman filter (EKF) models to represent nonlinear systems. GRAPE can handle both time-invariant and time-varying systems as it is designed to track parameter changes. Two techniques are presented to generate parameter samples for the parallel filter models. The first approach is called selected grid-based stratification (SGBS). SGBS divides the parameter space into equally spaced strata. The second approach uses Latin Hypercube Sampling (LHS) to determine the parameter locations and minimize the total number of required models. LHS is particularly useful when the parameter dimensions grow: adding more parameters does not require the model count to increase for LHS. Each resample is independent of the prior sample set other than the location of the parameter estimate. SGBS and LHS can be used for both the initial sample and subsequent resamples. Furthermore, resamples are not required to use the same technique. Both techniques are demonstrated for both linear and nonlinear frameworks. The GRAPE framework further formalizes the parameter tracking process through a general approach for nonlinear systems. These additional methods allow GRAPE to either narrow the focus to converged values within a parameter range or expand the range in the appropriate direction to track the parameters outside the current parameter range boundary.
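
    Of the two sample-generation techniques named above, Latin Hypercube Sampling is readily available in SciPy. A minimal sketch of drawing and scaling a batch of parameter samples; the dimensions and bounds are hypothetical, chosen only for illustration:

        import numpy as np
        from scipy.stats import qmc

        # Draw 16 parameter samples in 3 dimensions with Latin Hypercube
        # Sampling; the sample count need not grow with dimensionality.
        sampler = qmc.LatinHypercube(d=3, seed=0)
        unit_samples = sampler.random(n=16)          # points in [0, 1)^3
        # Scale the unit-cube samples to physical parameter ranges.
        lower, upper = [0.1, 1.0, -5.0], [2.0, 10.0, 5.0]
        params = qmc.scale(unit_samples, lower, upper)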

  8. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    NASA Astrophysics Data System (ADS)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, one qualitative and one quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. A combination of climatic and socio-economic narratives was used to develop a large set of future scenarios. Climatic narratives were developed using structured expert elicitation with a group of experts on the Indian Summer Monsoon. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data, coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing…

  9. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample size to support such analysis. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) as a method to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements for the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of the ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.

  10. Defining adaptation measures collaboratively: A participatory approach in the Doñana socio-ecological system, Spain.

    PubMed

    De Stefano, Lucia; Hernández-Mora, Nuria; Iglesias, Ana; Sánchez, Berta

    2017-06-15

    The uncertainty associated with the definition of strategies for climate change adaptation poses a challenge that cannot be faced by science alone. We present a participatory experience where, instead of having science defining solutions and eliciting stakeholders' feedback, local actors actually drove the process. While principles and methods of the approach are easily adaptable to different local contexts, this paper shows the contribution of participatory dynamics to the design of adaptation measures in the biodiversity-rich socio-ecological region surrounding the Doñana wetlands (Southern Spain). During the process, stakeholders and scientists collaboratively designed a common scenario for the future in which to define and assess a portfolio of potential adaptation measures, and found a safe, informal space for open dialogue and information exchange. Through this dialogue, points of connection among local actors emerged around the need for more integrated, transparent design of adaptation measures; for strengthening local capacity; and for strategies to diversify economic activities in order to increase the resilience of the region. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. A Cartesian, cell-based approach for adaptively-refined solutions of the Euler and Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Coirier, William J.; Powell, Kenneth G.

    1994-01-01

    A Cartesian, cell-based approach for adaptively-refined solutions of the Euler and Navier-Stokes equations in two dimensions is developed and tested. Grids about geometrically complicated bodies are generated automatically, by recursive subdivision of a single Cartesian cell encompassing the entire flow domain. Where the resulting cells intersect bodies, N-sided 'cut' cells are created using polygon-clipping algorithms. The grid is stored in a binary-tree structure which provides a natural means of obtaining cell-to-cell connectivity and of carrying out solution-adaptive mesh refinement. The Euler and Navier-Stokes equations are solved on the resulting grids using a finite-volume formulation. The convective terms are upwinded: a gradient-limited, linear reconstruction of the primitive variables is performed, providing input states to an approximate Riemann solver for computing the fluxes between neighboring cells. The more robust of a series of viscous flux functions is used to provide the viscous fluxes at the cell interfaces. Adaptively-refined solutions of the Navier-Stokes equations using the Cartesian, cell-based approach are obtained and compared to theory, experiment, and other accepted computational results for a series of low and moderate Reynolds number flows.

  12. A Cartesian, cell-based approach for adaptively-refined solutions of the Euler and Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Coirier, William J.; Powell, Kenneth G.

    1995-01-01

    A Cartesian, cell-based approach for adaptively-refined solutions of the Euler and Navier-Stokes equations in two dimensions is developed and tested. Grids about geometrically complicated bodies are generated automatically, by recursive subdivision of a single Cartesian cell encompassing the entire flow domain. Where the resulting cells intersect bodies, N-sided 'cut' cells are created using polygon-clipping algorithms. The grid is stored in a binary-tree data structure which provides a natural means of obtaining cell-to-cell connectivity and of carrying out solution-adaptive mesh refinement. The Euler and Navier-Stokes equations are solved on the resulting grids using a finite-volume formulation. The convective terms are upwinded: A gradient-limited, linear reconstruction of the primitive variables is performed, providing input states to an approximate Riemann solver for computing the fluxes between neighboring cells. The more robust of a series of viscous flux functions is used to provide the viscous fluxes at the cell interfaces. Adaptively-refined solutions of the Navier-Stokes equations using the Cartesian, cell-based approach are obtained and compared to theory, experiment and other accepted computational results for a series of low and moderate Reynolds number flows.
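
    The recursive subdivision that both records describe can be illustrated with a two-dimensional quadtree analogue of the cell-based grid. This is a toy sketch, not the authors' binary-tree code: cells whose corners straddle a body boundary are split until a depth limit, mimicking the automatic grid generation around a 'cut' body.

        from dataclasses import dataclass, field

        @dataclass
        class Cell:
            x0: float
            y0: float
            x1: float
            y1: float
            children: list = field(default_factory=list)

        def is_cut(cell, inside):
            # A cell is 'cut' when its corners straddle the body boundary.
            corners = [(cell.x0, cell.y0), (cell.x1, cell.y0),
                       (cell.x0, cell.y1), (cell.x1, cell.y1)]
            flags = [inside(x, y) for x, y in corners]
            return any(flags) and not all(flags)

        def subdivide(cell, inside, depth, max_depth):
            # Recursively split cut cells into four children (quadtree).
            if depth == max_depth or not is_cut(cell, inside):
                return
            xm, ym = 0.5 * (cell.x0 + cell.x1), 0.5 * (cell.y0 + cell.y1)
            cell.children = [Cell(cell.x0, cell.y0, xm, ym),
                             Cell(xm, cell.y0, cell.x1, ym),
                             Cell(cell.x0, ym, xm, cell.y1),
                             Cell(xm, ym, cell.x1, cell.y1)]
            for child in cell.children:
                subdivide(child, inside, depth + 1, max_depth)

        # Refine around a unit circle inside a single root cell.
        root = Cell(-2.0, -2.0, 2.0, 2.0)
        subdivide(root, lambda x, y: x * x + y * y < 1.0, 0, max_depth=6)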

  13. Domain Adaptation for Pedestrian Detection Based on Prediction Consistency

    PubMed Central

    Huan-ling, Tang; Zhi-yong, An

    2014-01-01

    Pedestrian detection is an active area of research in computer vision. It remains a challenging problem in many applications where various factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source domain samples with scarce target domain samples to create a scene-specific pedestrian detector that performs as well as if rich target domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, which is derived from the prediction consistency with the source classifications, to selectively choose the source domain samples showing positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially when labeled data in the target scene are insufficient. PMID:25013850

  14. Multilattice sampling strategies for region of interest dynamic MRI.

    PubMed

    Rilling, Gabriel; Tao, Yuehui; Marshall, Ian; Davies, Mike E

    2013-08-01

    A multilattice sampling approach is proposed for dynamic MRI with Cartesian trajectories. It relies on the use of sampling patterns composed of several different lattices and exploits an image model where only some parts of the image are dynamic, whereas the rest is assumed static. Given the parameters of such an image model, the methodology followed for the design of a multilattice sampling pattern adapted to the model is described. The multilattice approach is compared to single-lattice sampling, as used by traditional acceleration methods such as UNFOLD (UNaliasing by Fourier-Encoding the Overlaps using the temporal Dimension) or k-t BLAST, and random sampling used by modern compressed sensing-based methods. On the considered image model, it allows more flexibility and higher accelerations than lattice sampling and better performance than random sampling. The method is illustrated on a phase-contrast carotid blood velocity mapping MR experiment. Combining the multilattice approach with the KEYHOLE technique allows up to 12× acceleration factors. Simulation and in vivo undersampling results validate the method. Compared to lattice and random sampling, multilattice sampling provides significant gains at high acceleration factors. © 2012 Wiley Periodicals, Inc.
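
    A k-t sampling pattern built from several Cartesian lattices can be sketched as the union of sheared lattice masks. The accelerations, shears, and offsets below are illustrative assumptions, not the paper's design procedure:

        import numpy as np

        def sheared_lattice(n_pe, n_t, R, shear, offset=0):
            # Boolean k-t mask for one sheared Cartesian lattice:
            # phase-encode line k is acquired at frame t when
            # (k + shear * t) % R == offset.
            k = np.arange(n_pe)[:, None]
            t = np.arange(n_t)[None, :]
            return (k + shear * t) % R == offset

        # Multilattice pattern: union of two lattices with different
        # accelerations (hypothetical parameters, for illustration only).
        mask = (sheared_lattice(128, 32, R=8, shear=3)
                | sheared_lattice(128, 32, R=16, shear=5, offset=4))
        print(f"net acceleration ≈ {mask.size / mask.sum():.1f}x")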

  15. Teacher and Student-Focused Approaches: Influence of Learning Approach and Self-Efficacy in a Psychology Postgraduate Sample

    ERIC Educational Resources Information Center

    Kaye, Linda K.; Brewer, Gayle

    2013-01-01

    The current study examined approaches to teaching in a postgraduate psychology sample. This included considering teaching-focused (information transfer) and student-focused (conceptual changes in understanding) approaches to teaching. Postgraduate teachers of psychology (N = 113) completed a questionnaire measuring their use of a teacher- or…

  16. Optimal Asset Distribution for Environmental Assessment and Forecasting Based on Observations, Adaptive Sampling, and Numerical Prediction

    DTIC Science & Technology

    2012-09-30

    and Forecasting Based on Observations, Adaptive Sampling, and Numerical Prediction. Steven R. Ramp, Soliton Ocean Services, Inc., 691 Country Club Drive, Monterey, CA 93924. The results show that the incoming shortwave radiation was the dominant term, even when averaged over the dark hours, which accounts…

  17. A new approach to importance sampling for the simulation of false alarms. [in radar systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1987-01-01

    In this paper, a modified importance sampling technique for improving the convergence of Importance Sampling is given. By using this approach to estimate low false alarm rates in radar simulations, the number of Monte Carlo runs can be reduced significantly. For one-dimensional exponential, Weibull, and Rayleigh distributions, a uniformly minimum variance unbiased estimator is obtained. For the Gaussian distribution, the estimator in this approach is uniformly better than that of the previously known Importance Sampling approach. For a cell averaging system, by combining this technique and group sampling, the reduction of Monte Carlo runs for a reference cell of 20 and a false alarm rate of 1E-6 is on the order of 170 compared to the previously known Importance Sampling approach.
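
    The core idea can be illustrated on a one-dimensional Gaussian cell: draw from a density shifted toward the threshold and re-weight each sample by the likelihood ratio, so rare false alarms are estimated from far fewer runs. This is a generic importance-sampling sketch, not the paper's modified estimator:

        import numpy as np

        def false_alarm_is(threshold, n=100_000, seed=0):
            # Importance-sampling estimate of P(X > threshold) for
            # X ~ N(0, 1), drawing from N(threshold, 1) so exceedances
            # are common, then re-weighting by the likelihood ratio
            # phi(x) / phi(x - threshold) = exp(-threshold*x + threshold^2/2).
            rng = np.random.default_rng(seed)
            x = rng.normal(loc=threshold, scale=1.0, size=n)
            w = np.exp(-threshold * x + 0.5 * threshold ** 2)
            return np.mean((x > threshold) * w)

        # A false alarm rate near 1E-6 would need ~1E8 plain Monte Carlo
        # runs for a stable estimate; the weighted estimator needs far fewer.
        print(false_alarm_is(4.75))   # roughly 1e-6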

  18. Improving the performances of autofocus based on adaptive retina-like sampling model

    NASA Astrophysics Data System (ADS)

    Hao, Qun; Xiao, Yuqing; Cao, Jie; Cheng, Yang; Sun, Ce

    2018-03-01

    An adaptive retina-like sampling model (ARSM) is proposed to balance autofocusing accuracy and efficiency. Based on the model, we carry out comparative experiments between the proposed method and the traditional method in terms of accuracy, the full width at half maximum (FWHM), and time consumption. Results show that the performances of our method are better than those of the traditional method. Meanwhile, typical autofocus functions, including sum-modified-Laplacian (SML), Laplacian (LAP), Midfrequency-DCT (MDCT) and Absolute Tenengrad (ATEN), are compared through comparative experiments. The smallest FWHM is obtained by the use of LAP, which is more suitable for evaluating accuracy than the other autofocus functions. The MDCT autofocus function is most suitable for evaluating real-time performance.
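
    Two of the compared autofocus functions are easy to state directly. Minimal NumPy versions of SML and a Laplacian-energy (LAP) measure, sketched from their usual textbook definitions rather than taken from the paper:

        import numpy as np

        def sum_modified_laplacian(img):
            # SML focus measure: |d2I/dx2| + |d2I/dy2| accumulated over
            # the image; a sharper focus gives a larger value.
            img = np.asarray(img, dtype=float)
            lx = np.abs(2 * img[1:-1, :] - img[:-2, :] - img[2:, :])
            ly = np.abs(2 * img[:, 1:-1] - img[:, :-2] - img[:, 2:])
            return lx.sum() + ly.sum()

        def laplacian_energy(img):
            # LAP focus measure: energy of the 4-neighbour Laplacian.
            img = np.asarray(img, dtype=float)
            lap = (4 * img[1:-1, 1:-1] - img[:-2, 1:-1] - img[2:, 1:-1]
                   - img[1:-1, :-2] - img[1:-1, 2:])
            return np.sum(lap ** 2)

    In an autofocus sweep, the in-focus position is taken as the argmax of such a measure over the focal stack.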

  19. Coherence-Gated Sensorless Adaptive Optics Multiphoton Retinal Imaging.

    PubMed

    Cua, Michelle; Wahl, Daniel J; Zhao, Yuan; Lee, Sujin; Bonora, Stefano; Zawadzki, Robert J; Jian, Yifan; Sarunic, Marinko V

    2016-09-07

    Multiphoton microscopy enables imaging deep into scattering tissues. The efficient generation of non-linear optical effects is related to both the pulse duration (typically on the order of femtoseconds) and the size of the focused spot. Aberrations introduced by refractive index inhomogeneity in the sample distort the wavefront and enlarge the focal spot, which reduces the multiphoton signal. Traditional approaches to adaptive optics wavefront correction are not effective in thick or multi-layered scattering media. In this report, we present sensorless adaptive optics (SAO) using low-coherence interferometric detection of the excitation light for depth-resolved aberration correction of two-photon excited fluorescence (TPEF) in biological tissue. We demonstrate coherence-gated SAO TPEF using a transmissive multi-actuator adaptive lens for in vivo imaging in a mouse retina. This configuration has significant potential for reducing the laser power required for adaptive optics multiphoton imaging, and for facilitating integration with existing systems.

  20. Adaptive Sampling in Autonomous Marine Sensor Networks

    DTIC Science & Technology

    2006-06-01

    Analog Processing Section: A high-performance preamplifier with low noise characteristics is vital to obtaining quality sonar data. The work was supported by research assistantships through the Generic Ocean Array Technology Sonar (GOATS) project, contracts N00014-97-1-0202 and N00014-05-G-0106. Table-of-contents fragments: Formation Behavior; An AUV Intelligent Sensor for Real-Time Adaptive Sensing; A Logical Sonar Sensor.

  1. Hybrid selection for sequencing pathogen genomes from clinical samples

    PubMed Central

    2011-01-01

    We have adapted a solution hybrid selection protocol to enrich pathogen DNA in clinical samples dominated by human genetic material. Using mock mixtures of human and Plasmodium falciparum malaria parasite DNA as well as clinical samples from infected patients, we demonstrate an average of approximately 40-fold enrichment of parasite DNA after hybrid selection. This approach will enable efficient genome sequencing of pathogens from clinical samples, as well as sequencing of endosymbiotic organisms such as Wolbachia that live inside diverse metazoan phyla. PMID:21835008

  2. Adaptive MANET multipath routing algorithm based on the simulated annealing approach.

    PubMed

    Kim, Sungwook

    2014-01-01

    A mobile ad hoc network represents a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics like temporary topology and absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed by employing a simulated annealing approach. The proposed metaheuristic approach can achieve greater and reciprocal advantages in hostile, dynamic real-world network situations, and is therefore a powerful method for finding an effective solution to the mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, over the existing schemes.
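
    A generic simulated-annealing skeleton of the kind such a routing scheme builds on; the cost and neighbour functions for route selection are left abstract, and this is not the paper's exact scheme:

        import math
        import random

        def simulated_annealing(initial_route, cost, neighbour,
                                t0=1.0, cooling=0.95, steps=1000):
            # `cost` scores a candidate route; `neighbour` perturbs one,
            # e.g. by swapping an intermediate relay node.
            route, best = initial_route, initial_route
            t = t0
            for _ in range(steps):
                cand = neighbour(route)
                delta = cost(cand) - cost(route)
                # Accept improvements always; accept worse routes with a
                # probability that shrinks as the temperature cools.
                if delta < 0 or random.random() < math.exp(-delta / t):
                    route = cand
                    if cost(route) < cost(best):
                        best = route
                t *= cooling  # geometric cooling schedule
            return best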

  3. Analyzing Hedges in Verbal Communication: An Adaptation-Based Approach

    ERIC Educational Resources Information Center

    Wang, Yuling

    2010-01-01

    Based on Adaptation Theory, the article analyzes the production process of hedges. The procedure consists of the continuous making of choices in linguistic forms and communicative strategies. These choices are made just for adaptation to the contextual correlates. Besides, the adaptation process is dynamic, intentional and bidirectional.

  4. Adaption of egg and larvae sampling techniques for lake sturgeon and broadcast spawning fishes in a deep river

    USGS Publications Warehouse

    Roseman, Edward F.; Kennedy, Gregory W.; Craig, Jaquelyn; Boase, James; Soper, Karen

    2011-01-01

    In this report we describe how we adapted two techniques for sampling lake sturgeon (Acipenser fulvescens) and other fish early life history stages to meet our research needs in the Detroit River, a deep, flowing Great Lakes connecting channel. First, we developed a buoy-less method for sampling fish eggs and spawning activity using egg mats deployed on the river bottom. The buoy-less method allowed us to fish gear in areas frequented by boaters and recreational anglers, thus eliminating surface obstructions that interfered with recreational and boating activities. The buoy-less method also reduced gear loss due to drift when masses of floating aquatic vegetation would accumulate on buoys and lines, increasing the drag on the gear and pulling it downstream. Second, we adapted a D-frame drift net system formerly employed in shallow streams to assess larval lake sturgeon dispersal for use in the deeper (>8 m) Detroit River using an anchor and buoy system.

  5. Adaption of egg and larvae sampling techniques for lake sturgeon and broadcast spawning fishes in a deep river

    USGS Publications Warehouse

    Roseman, E.F.; Boase, J.; Kennedy, G.; Craig, J.; Soper, K.

    2011-01-01

    In this report we describe how we adapted two techniques for sampling lake sturgeon (Acipenser fulvescens) and other fish early life history stages to meet our research needs in the Detroit River, a deep, flowing Great Lakes connecting channel. First, we developed a buoy-less method for sampling fish eggs and spawning activity using egg mats deployed on the river bottom. The buoy-less method allowed us to fish gear in areas frequented by boaters and recreational anglers, thus eliminating surface obstructions that interfered with recreational and boating activities. The buoy-less method also reduced gear loss due to drift when masses of floating aquatic vegetation would accumulate on buoys and lines, increasing the drag on the gear and pulling it downstream. Second, we adapted a D-frame drift net system formerly employed in shallow streams to assess larval lake sturgeon dispersal for use in the deeper (>8 m) Detroit River using an anchor and buoy system. © 2011 Blackwell Verlag, Berlin.

  6. Complexity Thinking in PE: Game-Centred Approaches, Games as Complex Adaptive Systems, and Ecological Values

    ERIC Educational Resources Information Center

    Storey, Brian; Butler, Joy

    2013-01-01

    Background: This article draws on the literature relating to game-centred approaches (GCAs), such as Teaching Games for Understanding, and dynamical systems views of motor learning to demonstrate a convergence of ideas around games as complex adaptive learning systems. This convergence is organized under the title "complexity thinking"…

  7. Adapting Evidence-Based Mental Health Treatments in Community Settings: Preliminary Results from a Partnership Approach

    ERIC Educational Resources Information Center

    Southam-Gerow, Michael A.; Hourigan, Shannon E.; Allin, Robert B., Jr.

    2009-01-01

    This article describes the application of a university-community partnership model to the problem of adapting evidence-based treatment approaches in a community mental health setting. Background on partnership research is presented, with consideration of methodological and practical issues related to this kind of research. Then, a rationale for…

  8. A New Approach to Teaching Biomechanics Through Active, Adaptive, and Experiential Learning.

    PubMed

    Singh, Anita

    2017-07-01

    Demand for biomedical engineers continues to rise to meet the needs of the healthcare industry. Current training of bioengineers follows the traditional and dominant model of theory-focused curricula. However, the unmet needs of the healthcare industry warrant newer skill sets in these engineers. Translational training strategies such as solving real-world problems through active, adaptive, and experiential learning hold promise. In this paper, we report our findings of adding a real-world 4-week problem-based learning unit into a biomechanics capstone course for engineering students. Surveys assessed student perceptions of the activity and learning experience. While students, across three cohorts, felt challenged to solve a real-world problem identified during the simulation lab visit, they felt more confident in utilizing knowledge learned in the biomechanics course and self-directed research. Instructor evaluations indicated that the active and experiential learning approach fostered the students' technical knowledge and lifelong learning skills while exposing them to the components of adaptive learning and innovation.

  9. Controlling aliased dynamics in motion systems? An identification for sampled-data control approach

    NASA Astrophysics Data System (ADS)

    Oomen, Tom

    2014-07-01

    Sampled-data control systems occasionally exhibit aliased resonance phenomena within the control bandwidth. The aim of this paper is to investigate these aliased dynamics, with application to a high-performance industrial nano-positioning machine. This necessitates a full sampled-data control design approach, since aliased dynamics endanger both the at-sample performance and the intersample behaviour. The proposed framework comprises both system identification and sampled-data control. In particular, the sampled-data control objective necessitates models that encompass the intersample behaviour, i.e., ideally continuous-time models. Application of the proposed approach to an industrial wafer stage system provides thorough insight and new control design guidelines for controlling aliased dynamics.
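
    Where an aliased resonance lands follows from standard sampling theory: a continuous-time resonance above the Nyquist frequency folds back into the baseband. A small sketch of that frequency-folding computation:

        def aliased_frequency(f, fs):
            # Frequency (Hz) at which a continuous-time resonance at f
            # appears after sampling at fs: fold f into [0, fs/2].
            f = f % fs
            return min(f, fs - f)

        # A 90 Hz resonance sampled at 100 Hz shows up at 10 Hz, i.e.
        # potentially well inside the control bandwidth.
        print(aliased_frequency(90.0, 100.0))   # 10.0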

  10. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-20

    ...] Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative Monitoring and... advising the public of our decision to implement a risk-based sampling approach for the inspection of... risk-based sampling and inspection approach will allow us to target high-risk plants for planting for...

  11. Establishing Interpretive Consistency When Mixing Approaches: Role of Sampling Designs in Evaluations

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.

    2013-01-01

    The goal of this chapter is to recommend quality criteria to guide evaluators' selections of sampling designs when mixing approaches. First, we contextualize our discussion of quality criteria and sampling designs by discussing the concept of interpretive consistency and how it impacts sampling decisions. Embedded in this discussion are…

  12. Rain sampling device

    DOEpatents

    Nelson, Danny A.; Tomich, Stanley D.; Glover, Donald W.; Allen, Errol V.; Hales, Jeremy M.; Dana, Marshall T.

    1991-01-01

    The present invention constitutes a rain sampling device adapted for independent operation at locations remote from the user which allows rainfall to be sampled in accordance with any schedule desired by the user. The rain sampling device includes a mechanism for directing wet precipitation into a chamber, a chamber for temporarily holding the precipitation during the process of collection, a valve mechanism for controllably releasing samples of said precipitation from said chamber, a means for distributing the samples released from the holding chamber into vessels adapted for permanently retaining these samples, and an electrical mechanism for regulating the operation of the device.

  13. Approach for Structurally Clearing an Adaptive Compliant Trailing Edge Flap for Flight

    NASA Technical Reports Server (NTRS)

    Miller, Eric J.; Lokos, William A.; Cruz, Josue; Crampton, Glen; Stephens, Craig A.; Kota, Sridhar; Ervin, Gregory; Flick, Pete

    2015-01-01

    The Adaptive Compliant Trailing Edge (ACTE) flap was flown on the NASA Gulfstream GIII test bed at the NASA Armstrong Flight Research Center. This smoothly curving flap replaced the existing Fowler flaps creating a seamless control surface. This compliant structure, developed by FlexSys Inc. in partnership with Air Force Research Laboratory, supported NASA objectives for airframe structural noise reduction, aerodynamic efficiency, and wing weight reduction through gust load alleviation. A thorough structures airworthiness approach was developed to move this project safely to flight.

  14. Dynamic experiment design regularization approach to adaptive imaging with array radar/SAR sensor systems.

    PubMed

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics of such a solution space. Next, the "model-free" variational analysis (VA)-based image enhancement approach and the "model-based" descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that combine kernel adaptive anisotropic windowing with projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of the DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations.

  15. The Development of the Alpha-Omega Completed Sentence Form (AOCSF): An Instrument to Aid in the Measurement, Identification, and Assessment of an Individual's "Adaptational Approach(es)" to the Stressful Event of Death and Related Issues.

    ERIC Educational Resources Information Center

    Klein, Ronald; And Others

    The Alpha Omega Completed Sentence Form (AOCSF) was developed to identify and measure a person's adaptational approaches to information concerning their own death or the possible death of a significant other. In contrast to the Kubler-Ross stage theory, the adaptational approach recognizes a person's capacity to assimilate new information which…

  16. Digital gene expression analysis with sample multiplexing and PCR duplicate detection: A straightforward protocol.

    PubMed

    Rozenberg, Andrey; Leese, Florian; Weiss, Linda C; Tollrian, Ralph

    2016-01-01

    Tag-Seq is a high-throughput approach used for discovering SNPs and characterizing gene expression. In comparison to RNA-Seq, Tag-Seq eases data processing and allows detection of rare mRNA species using only one tag per transcript molecule. However, reduced library complexity raises the issue of PCR duplicates, which distort gene expression levels. Here we present a novel Tag-Seq protocol that uses the least biased methods for RNA library preparation combined with a novel approach for joint PCR template and sample labeling. In our protocol, input RNA is fragmented by hydrolysis, and poly(A)-bearing RNAs are selected and directly ligated to mixed DNA-RNA P5 adapters. The P5 adapters contain i5 barcodes composed of sample-specific (moderately) degenerate base regions (mDBRs), which later allow detection of PCR duplicates. The P7 adapter is attached via reverse transcription with individual i7 barcodes added during the amplification step. The resulting libraries can be sequenced on an Illumina sequencer. After sample demultiplexing and PCR duplicate removal with a free software tool we designed, the data are ready for downstream analysis. Our protocol was tested on RNA samples from predator-induced and control Daphnia microcrustaceans.
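
    Given the joint labeling scheme described above, PCR duplicates can be collapsed by keying each read on its mDBR together with its transcript tag. A toy sketch of that dedup step; the free software tool mentioned in the record is not reproduced here, and read positions and qualities are omitted for brevity:

        def remove_pcr_duplicates(reads):
            # Reads sharing both the degenerate-barcode (mDBR) sequence
            # and the transcript tag are presumed PCR duplicates of the
            # same original molecule; keep each such pair only once.
            seen, unique = set(), []
            for mdbr, tag in reads:
                key = (mdbr, tag)
                if key not in seen:
                    seen.add(key)
                    unique.append((mdbr, tag))
            return unique

        reads = [("ACGT", "TTAGGCAA"), ("ACGT", "TTAGGCAA"), ("GGAT", "TTAGGCAA")]
        print(len(remove_pcr_duplicates(reads)))  # 2: one duplicate removed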

  17. Parallel Anisotropic Tetrahedral Adaptation

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Darmofal, David L.

    2008-01-01

    An adaptive method that robustly produces high aspect ratio tetrahedra to a general 3D metric specification without introducing hybrid semi-structured regions is presented. The elemental operators and higher-level logic are described together with their respective domain-decomposed parallelizations. An anisotropic tetrahedral grid adaptation scheme is demonstrated for 1000:1 stretching for a simple cube geometry. This form of adaptation is applicable to more complex domain boundaries via a cut-cell approach, as demonstrated by a parallel 3D supersonic simulation of a complex fighter aircraft. To avoid the assumptions and approximations required to form a metric to specify adaptation, an approach is introduced that directly evaluates interpolation error. The grid is adapted to reduce and equidistribute this interpolation error without the use of an intervening anisotropic metric. Direct interpolation error adaptation is illustrated for 1D and 3D domains.

  18. The Dynamic Interplay among EFL Learners' Ambiguity Tolerance, Adaptability, Cultural Intelligence, Learning Approach, and Language Achievement

    ERIC Educational Resources Information Center

    Alahdadi, Shadi; Ghanizadeh, Afsaneh

    2017-01-01

    A key objective of education is to prepare individuals to be fully-functioning learners. This entails developing the cognitive, metacognitive, motivational, cultural, and emotional competencies. The present study aimed to examine the interrelationships among adaptability, tolerance of ambiguity, cultural intelligence, learning approach, and…

  19. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors.

    PubMed

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-03-28

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.
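
    A toy energy-aware sampling rule in the spirit of EASA; the thresholds and scaling factors are illustrative assumptions, not the published algorithm:

        def adapt_sampling_interval(interval_s, battery_j, harvest_w,
                                    min_s=10.0, max_s=600.0, target_j=500.0):
            # Sample more often when the energy buffer plus harvesting
            # income is comfortable; back off when the node risks
            # depleting its budget.
            if battery_j > target_j and harvest_w > 0:
                interval_s *= 0.8        # energy surplus: speed up sampling
            elif battery_j < 0.5 * target_j:
                interval_s *= 1.5        # energy deficit: slow down
            return max(min_s, min(max_s, interval_s))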

  20. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    PubMed Central

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-01-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA. PMID:27043559

  1. Adaptive kernel function using line transect sampling

    NASA Astrophysics Data System (ADS)

    Albadareen, Baker; Ismail, Noriszura

    2018-04-01

    The estimation of f(0) is crucial in the line transect method, which is used for estimating population abundance in wildlife surveys. The classical kernel estimator of f(0) has a high negative bias. Our study proposes an adapted kernel function that is shown to be more efficient than the usual kernel estimator. A simulation study is conducted to compare the performance of the proposed estimators with that of the classical kernel estimators.
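
    The classical estimator that the proposed adaptation modifies is the boundary kernel estimate of f(0) with reflection at the transect line, f(0) ≈ (2 / (n h)) Σ K(x_i / h). A minimal sketch with a Gaussian kernel (the bandwidth h is an illustrative choice):

        import numpy as np

        def f0_kernel(distances, h):
            # Kernel estimate of f(0) from perpendicular detection
            # distances, reflecting mass at the transect line:
            # f(0) ≈ (2 / (n h)) * sum K(x_i / h), Gaussian K.
            x = np.asarray(distances, dtype=float)
            k = np.exp(-0.5 * (x / h) ** 2) / np.sqrt(2.0 * np.pi)
            return 2.0 * k.sum() / (x.size * h)

        # Half-normal detections (sigma = 1): the true f(0) equals
        # 2 * phi(0) = sqrt(2 / pi) ≈ 0.798.
        rng = np.random.default_rng(1)
        print(f0_kernel(np.abs(rng.normal(size=500)), h=0.4))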

  2. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    NASA Astrophysics Data System (ADS)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1,126 deaths and displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between governments, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed, including agent representatives for the administrative stakeholders of European Member States, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach, this study is a first contribution toward overcoming the limitations of traditional large-scale flood risk models, in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.

  3. Context-aware adaptive spelling in motor imagery BCI.

    PubMed

    Perdikis, S; Leeb, R; Millán, J D R

    2016-06-01

    This work presents a first motor imagery-based, adaptive brain-computer interface (BCI) speller, which is able to exploit application-derived context for improved, simultaneous classifier adaptation and spelling. Online spelling experiments with ten able-bodied users evaluate the ability of our scheme, first, to alleviate non-stationarity of brain signals and restore the subject's performance; second, to guide naive users into BCI control while avoiding initial offline BCI calibration; and, third, to outperform regular unsupervised adaptation. Our co-adaptive framework combines the BrainTree speller with smooth-batch linear discriminant analysis adaptation. The latter enjoys contextual assistance through BrainTree's language model to improve online expectation-maximization maximum-likelihood estimation. Our results verify the possibility of restoring single-sample classification and BCI command accuracy, as well as spelling speed for expert users. Most importantly, context-aware adaptation performs significantly better than its unsupervised equivalent and similarly to the supervised one. Although no significant differences are found with respect to the state-of-the-art PMean approach, the proposed algorithm is shown to be advantageous for 30% of the users. We demonstrate the possibility of circumventing supervised BCI recalibration, saving time without compromising the adaptation quality. On the other hand, we show that this type of classifier adaptation is not as efficient for BCI training purposes.

  4. Adaptive template generation for amyloid PET using a deep learning approach.

    PubMed

    Kang, Seung Kwan; Seo, Seongho; Shin, Seong A; Byun, Min Soo; Lee, Dong Young; Kim, Yu Kyeong; Lee, Dong Soo; Lee, Jae Sung

    2018-05-11

    Accurate spatial normalization (SN) of amyloid positron emission tomography (PET) images for Alzheimer's disease assessment without coregistered anatomical magnetic resonance imaging (MRI) of the same individual is technically challenging. In this study, we applied deep neural networks to generate individually adaptive PET templates for robust and accurate SN of amyloid PET without using matched 3D MR images. Using 681 pairs of simultaneously acquired 11C-PIB PET and T1-weighted 3D MRI scans of AD, MCI, and cognitively normal subjects, we trained and tested two deep neural networks [convolutional auto-encoder (CAE) and generative adversarial network (GAN)] that produce individually adaptive PET templates. More specifically, the networks were trained using 685,100 pieces of augmented data generated by rotating 527 randomly selected datasets and validated using 154 datasets. The input to the supervised neural networks was the 3D PET volume in native space and the label was the spatially normalized 3D PET image using the transformation parameters obtained from MRI-based SN. The proposed deep learning approach significantly enhanced the quantitative accuracy of MRI-less amyloid PET assessment by reducing the SN error observed when an average amyloid PET template is used. Given an input image, the trained deep neural networks rapidly provide individually adaptive 3D PET templates without any discontinuity between the slices (in 0.02 s). As the proposed method does not require 3D MRI for the SN of PET images, it has great potential for use in routine analysis of amyloid PET images in clinical practice and research. © 2018 Wiley Periodicals, Inc.

  5. Coherence-Gated Sensorless Adaptive Optics Multiphoton Retinal Imaging

    PubMed Central

    Cua, Michelle; Wahl, Daniel J.; Zhao, Yuan; Lee, Sujin; Bonora, Stefano; Zawadzki, Robert J.; Jian, Yifan; Sarunic, Marinko V.

    2016-01-01

    Multiphoton microscopy enables imaging deep into scattering tissues. The efficient generation of non-linear optical effects is related to both the pulse duration (typically on the order of femtoseconds) and the size of the focused spot. Aberrations introduced by refractive index inhomogeneity in the sample distort the wavefront and enlarge the focal spot, which reduces the multiphoton signal. Traditional approaches to adaptive optics wavefront correction are not effective in thick or multi-layered scattering media. In this report, we present sensorless adaptive optics (SAO) using low-coherence interferometric detection of the excitation light for depth-resolved aberration correction of two-photon excited fluorescence (TPEF) in biological tissue. We demonstrate coherence-gated SAO TPEF using a transmissive multi-actuator adaptive lens for in vivo imaging in a mouse retina. This configuration has significant potential for reducing the laser power required for adaptive optics multiphoton imaging, and for facilitating integration with existing systems. PMID:27599635

  6. Adaptive management

    USGS Publications Warehouse

    Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  7. Evaluation of PCR Approaches for Detection of Bartonella bacilliformis in Blood Samples.

    PubMed

    Gomes, Cláudia; Martinez-Puchol, Sandra; Pons, Maria J; Bazán, Jorge; Tinco, Carmen; del Valle, Juana; Ruiz, Joaquim

    2016-03-01

    The lack of an effective diagnostic tool for Carrion's disease leads to misdiagnosis, wrong treatments, and perpetuation of asymptomatic carriers living in endemic areas. Conventional PCR approaches have been reported as a diagnostic technique. However, the detection limit of these techniques is not clear, nor is their usefulness in low-bacteremia cases. The aim of this study was to evaluate the detection limit of 3 PCR approaches. We determined the detection limit of 3 different PCR approaches: Bartonella-specific 16S rRNA, fla, and its genes. We also evaluated the viability of dry blood spots to be used as a sample transport system. Our results show that the 16S rRNA PCR is the approach with the lowest detection limit, 5 CFU/μL, and thus the best diagnostic PCR tool studied. Dry blood spots diminish the sensitivity of the assay. Of the tested PCRs, the 16S rRNA PCR approach is the best for direct blood detection of acute cases of Carrion's disease. Although the use of dry blood spots makes the management of transport samples from rural areas easier, a slight decrease in sensitivity was observed. The usefulness of PCR for detecting low-bacteremic or asymptomatic carriers is doubtful, showing the need to search for new, more sensitive techniques.

  8. A New Trans-Disciplinary Approach to Regional Integrated Assessment of Climate Impact and Adaptation in Agricultural Systems (Invited)

    NASA Astrophysics Data System (ADS)

    Antle, J. M.; Valdivia, R. O.; Jones, J.; Rosenzweig, C.; Ruane, A. C.

    2013-12-01

    This presentation provides an overview of the new methods developed by researchers in the Agricultural Model Intercomparison and Improvement Project (AgMIP) for regional climate impact assessment and analysis of adaptation in agricultural systems. This approach represents a departure from approaches in the literature in several dimensions. First, the approach is based on the analysis of agricultural systems (not individual crops) and is inherently trans-disciplinary: it is based on a deep collaboration among a team of climate scientists, agricultural scientists and economists to design and implement regional integrated assessments of agricultural systems. Second, in contrast to previous approaches that have imposed future climate on models based on current socio-economic conditions, this approach combines bio-physical and economic models with a new type of pathway analysis (Representative Agricultural Pathways) to parameterize models consistent with a plausible future world in which climate change would be occurring. Third, adaptation packages for the agricultural systems in a region are designed by the research team with a level of detail that is useful to decision makers, such as research administrators and donors, who are making agricultural R&D investment decisions. The approach is illustrated with examples from AgMIP's projects currently being carried out in Africa and South Asia.

  9. The genomic architecture and association genetics of adaptive characters using a candidate SNP approach in boreal black spruce

    PubMed Central

    2013-01-01

    Background: The genomic architecture of adaptive traits remains poorly understood in non-model plants. Various approaches can be used to bridge this gap, including the mapping of quantitative trait loci (QTL) in pedigrees, and genetic association studies in non-structured populations. Here we present results on the genomic architecture of adaptive traits in black spruce, which is a widely distributed conifer of the North American boreal forest. As an alternative to the usual candidate gene approach, a candidate SNP approach was developed for association testing. Results: A genetic map containing 231 gene loci was used to identify QTL that were related to budset timing and to tree height assessed over multiple years and sites. Twenty-two unique genomic regions were identified, including 20 that were related to budset timing and 6 that were related to tree height. From results of outlier detection and bulk segregant analysis for adaptive traits using DNA pool sequencing of 434 genes, 52 candidate SNPs were identified and subsequently tested in genetic association studies for budset timing and tree height assessed over multiple years and sites. A total of 34 (65%) SNPs were significantly associated with budset timing, tree height, or both. Although the percentages of explained variance (PVE) by individual SNPs were small, several significant SNPs were shared between sites and among years. Conclusions: The sharing of genomic regions and significant SNPs between budset timing and tree height indicates pleiotropic effects. Significant QTLs and SNPs differed considerably among years, suggesting that different sets of genes for the same characters are involved at different stages in the tree's life history. The functional diversity of genes carrying significant SNPs and the low observed PVE further indicated that a large number of polymorphisms are involved in adaptive genetic variation. Accordingly, for undomesticated species such as black spruce with natural populations…

  10. An Integrated Systems Approach to Designing Climate Change Adaptation Policy in Water Resources

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Malano, H. M.; Davidson, B.; George, B.

    2014-12-01

    Climate change projections are characterised by large uncertainties, with rainfall variability being the key challenge in designing adaptation policies. Climate change adaptation in water resources shows all the typical characteristics of 'wicked' problems typified by cognitive uncertainty as new scientific knowledge becomes available, problem instability, knowledge imperfection and strategic uncertainty due to institutional changes that inevitably occur over time. Planning that is characterised by uncertainties and instability requires an approach that can accommodate flexibility and adaptive capacity for decision-making, including an ability to take corrective measures in the event that the scenarios and responses envisaged initially evolve differently at some future stage. We present an integrated, multidisciplinary and comprehensive framework designed to interface and inform science and decision making in the formulation of water resource management strategies to deal with climate change in the Musi Catchment of Andhra Pradesh, India. At the core of this framework is a dialogue between stakeholders, decision makers and scientists to define a set of plausible responses to an ensemble of climate change scenarios derived from global climate modelling. The modelling framework used to evaluate the resulting combination of climate scenarios and adaptation responses includes surface and groundwater assessment models (SWAT & MODFLOW) and a water allocation model (REALM) to determine the water security of each adaptation strategy. Three climate scenarios extracted from downscaled climate models were selected for evaluation together with four agreed responses: changing cropping patterns, increasing watershed development, changing the volume of groundwater extraction and improving irrigation efficiency. Water security in this context is represented by the combination of level of water availability and its associated security of supply for three economic activities (agriculture

  11. Rain sampling device

    DOEpatents

    Nelson, D.A.; Tomich, S.D.; Glover, D.W.; Allen, E.V.; Hales, J.M.; Dana, M.T.

    1991-05-14

    The present invention constitutes a rain sampling device adapted for independent operation at locations remote from the user which allows rainfall to be sampled in accordance with any schedule desired by the user. The rain sampling device includes a mechanism for directing wet precipitation into a chamber, a chamber for temporarily holding the precipitation during the process of collection, a valve mechanism for controllably releasing samples of the precipitation from the chamber, a means for distributing the samples released from the holding chamber into vessels adapted for permanently retaining these samples, and an electrical mechanism for regulating the operation of the device. 11 figures.

  12. Adaptation illustrations: Chapter 4

    Treesearch

    Maria Janowiak; Patricia Butler; Chris Swanston; Matt St. Pierre; Linda Parker

    2012-01-01

    In this chapter, we demonstrate how the Adaptation Workbook (Chapter 3) can be used with the Adaptation Strategies and Approaches (Chapter 2) to develop adaptation tactics for two real-world management issues. The two illustrations in this chapter are intended to provide helpful tips to managers completing the Adaptation Workbook, as well as to show how the anticipated...

  13. An Approach to Automated Fusion System Design and Adaptation

    PubMed Central

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-01-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762

  14. An Approach to Automated Fusion System Design and Adaptation.

    PubMed

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  15. Mujeres Fuertes y Corazones Saludables: adaptation of the StrongWomen - Healthy Hearts program for rural Latinas using an intervention mapping approach.

    PubMed

    Perry, Cynthia K; McCalmont, Jean C; Ward, Judy P; Menelas, Hannah-Dulya K; Jackson, Christie; De Witz, Jazmyne R; Solanki, Emma; Seguin, Rebecca A

    2017-12-28

    To describe our use of intervention mapping as a systematic method to adapt an evidence-based physical activity and nutrition program to reflect the needs of rural Latinas. An intervention mapping process involving six steps guided the adaptation of an evidence-based physical activity and nutrition program, using a community-based participatory research approach. We partnered with a community advisory board of rural Latinas throughout the adaptation process. A needs assessment and logic models were used to ascertain which program was the best fit for adaptation. Once identified, we collaborated with one of the developers of the original program (StrongWomen - Healthy Hearts) during the adaptation process. First, essential theoretical methods and program elements were identified, and additional elements were added or adapted. Next, we reviewed and made changes to reflect the community and cultural context of the practical applications, intervention strategies, program curriculum, materials, and participant information. Finally, we planned for the implementation and evaluation of the adapted program, Mujeres Fuertes y Corazones Saludables, within the context of the rural community. A pilot study will be conducted with overweight, sedentary, middle-aged, Spanish-speaking Latinas. Outcome measures will assess change in weight, physical fitness, physical activity, and nutrition behavior. The intervention mapping process was feasible and provided a systematic approach to balance fit and fidelity in the adaptation of an evidence-based program. Collaboration with community members ensured that the components of the curriculum that were adapted were culturally appropriate and relevant within the local community context.

  16. Adaptive Learning and Risk Taking

    ERIC Educational Resources Information Center

    Denrell, Jerker

    2007-01-01

    Humans and animals learn from experience by reducing the probability of sampling alternatives with poor past outcomes. Using simulations, J. G. March (1996) illustrated how such adaptive sampling could lead to risk-averse as well as risk-seeking behavior. In this article, the author develops a formal theory of how adaptive sampling influences risk…

  17. The redshift distribution of cosmological samples: a forward modeling approach

    NASA Astrophysics Data System (ADS)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam; Refregier, Alexandre; Bruderer, Claudio; Nicola, Andrina

    2017-08-01

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.
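
    A minimal sketch of the ABC-rejection idea behind such a forward-modeling pipeline may help fix the logic. Everything below is invented for illustration: a toy redshift simulator stands in for UFig, and the summary statistics, priors, and tolerance are assumptions, not the paper's choices.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_survey(params, n_gal=2000):
          """Toy stand-in for an image simulator: draw redshifts from a
          parametric population model (gamma-like n(z) with shape alpha,
          scale beta) and apply a crude selection that thins high-z objects."""
          alpha, beta = params
          z = rng.gamma(alpha, beta, size=n_gal)
          detected = rng.random(n_gal) < np.exp(-z / 2.0)   # toy selection cut
          return z[detected]

      def summary(z):
          """Summary statistics compared between data and simulation."""
          return np.array([z.mean(), z.std(), np.median(z)])

      # "Observed" catalogue generated with hidden true parameters.
      s_obs = summary(simulate_survey((2.0, 0.4)))

      # ABC rejection: keep parameter draws whose simulated summaries lie
      # within a tolerance of the observed ones.
      accepted = []
      for _ in range(5000):
          theta = (rng.uniform(0.5, 4.0), rng.uniform(0.1, 1.0))  # prior draw
          dist = np.linalg.norm(summary(simulate_survey(theta)) - s_obs)
          if dist < 0.06:
              accepted.append(theta)

      # Each accepted model implies an n(z); the ensemble gives its uncertainty.
      nz_samples = [simulate_survey(theta) for theta in accepted]
      print(f"accepted {len(accepted)} models")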

  18. A Framework Approach to Evaluate Cross-Cultural Adaptation of Public Engagement Strategies for Radioactive Waste Management - 13430

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hermann, Laura

    2013-07-01

    The complex interplay of politics, economics and culture undermines attempts to define universal best practices for public engagement in the management of nuclear materials. In the international context, communicators must rely on careful adaptation and creative execution to make standard communication techniques succeed in their local communities. Nuclear professionals need an approach to assess and adapt culturally specific public engagement strategies to meet the demands of their particular political, economic and social structures. Using participant interviews and public sources, the Potomac Communications Group reviewed country-specific examples of nuclear-related communication efforts to provide insight into a proposed approach. The review considered a spectrum of cultural dimensions related to diversity, authority, conformity, proximity and time. Comparisons help to identify cross-cultural influences of various public engagement tactics and to inform a framework for communicators. While not prescriptive in its application, the framework offers a way for communicators to assess the salience of outreach tactics in specific situations. The approach can guide communicators to evaluate and tailor engagement strategies to achieve localized public outreach goals. (authors)

  19. Providing Adaptivity in Moodle LMS Courses

    ERIC Educational Resources Information Center

    Despotovic-Zrakic, Marijana; Markovic, Aleksandar; Bogdanovic, Zorica; Barac, Dusan; Krco, Srdjan

    2012-01-01

    In this paper, we describe an approach to providing adaptivity in e-education courses. The primary goal of the paper is to enhance an existing e-education system, namely Moodle LMS, by developing a method for creating adaptive courses, and to compare its effectiveness with non-adaptive education approach. First, we defined the basic requirements…

  20. Construction of large signaling pathways using an adaptive perturbation approach with phosphoproteomic data.

    PubMed

    Melas, Ioannis N; Mitsos, Alexander; Messinis, Dimitris E; Weiss, Thomas S; Saez-Rodriguez, Julio; Alexopoulos, Leonidas G

    2012-04-01

    Construction of large and cell-specific signaling pathways is essential to understand information processing under normal and pathological conditions. On this front, gene-based approaches offer the advantage of large pathway exploration, whereas phosphoproteomic approaches offer a more reliable view of pathway activities but are applicable to small pathway sizes. In this paper, we demonstrate an experimentally adaptive approach to construct large signaling pathways from phosphoproteomic data within a 3-day time frame. Our approach, taking advantage of the fast turnaround time of the xMAP technology, is carried out in four steps: (i) screen optimal pathway inducers, (ii) select the responsive ones, (iii) combine them in a combinatorial fashion to construct a phosphoproteomic dataset, and (iv) optimize a reduced generic pathway via an Integer Linear Programming formulation. As a case study, we uncover novel players and their corresponding pathways in primary human hepatocytes by interrogating the signal transduction downstream of 81 receptors of interest and constructing a detailed model for the responsive part of the network comprising 177 species (of which 14 are measured) and 365 interactions.
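
    Step (iv) can be illustrated with a toy Integer Linear Programming model. The sketch below assumes the PuLP package and invents a four-edge network, stimulus, and binary readouts; it captures the flavor of fitting a reduced pathway to data, not the authors' actual formulation.

      # Toy ILP: choose a subset of candidate interactions so that predicted
      # binary node activities match the phosphoproteomic readouts, with a
      # small sparsity penalty per retained edge. All data here are invented.
      import pulp

      edges = [("TNFa", "IKK"), ("IKK", "NFkB"), ("EGF", "ERK"), ("ERK", "NFkB")]
      stimulated = {"TNFa"}                     # receptors activated in the assay
      measured = {"NFkB": 1, "ERK": 0}          # binary phospho readouts

      nodes = {n for e in edges for n in e}
      x = {e: pulp.LpVariable(f"x_{e[0]}_{e[1]}", cat="Binary") for e in edges}  # keep edge?
      y = {n: pulp.LpVariable(f"y_{n}", cat="Binary") for n in nodes}            # node active?
      z = {e: pulp.LpVariable(f"z_{e[0]}_{e[1]}", cat="Binary") for e in edges}  # edge carries signal?

      prob = pulp.LpProblem("pathway_fit", pulp.LpMinimize)
      # |y - m| is linear in y for a binary measurement m.
      fit = pulp.lpSum(m * (1 - y[n]) + (1 - m) * y[n] for n, m in measured.items())
      prob += fit + 0.01 * pulp.lpSum(x.values())

      for e in edges:
          prob += z[e] <= x[e]                  # signal flows only through kept edges...
          prob += z[e] <= y[e[0]]               # ...whose source node is active
      for n in nodes:
          incoming = [z[e] for e in edges if e[1] == n]
          if n in stimulated:
              prob += y[n] == 1                 # stimulated receptors are active
          elif incoming:
              prob += y[n] <= pulp.lpSum(incoming)
          else:
              prob += y[n] == 0                 # unstimulated root nodes stay off

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print("retained interactions:", [e for e in edges if x[e].value() == 1])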

  1. Dynamic Experiment Design Regularization Approach to Adaptive Imaging with Array Radar/SAR Sensor Systems

    PubMed Central

    Shkvarko, Yuriy; Tuxpan, José; Santos, Stewart

    2011-01-01

    We consider a problem of high-resolution array radar/SAR imaging formalized in terms of a nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) of the random wavefield scattered from a remotely sensed scene observed through a kernel signal formation operator and contaminated with random Gaussian noise. First, the Sobolev-type solution space is constructed to specify the class of consistent kernel SSP estimators with the reproducing kernel structures adapted to the metrics of that solution space. Next, the “model-free” variational analysis (VA)-based image enhancement approach and the “model-based” descriptive experiment design (DEED) regularization paradigm are unified into a new dynamic experiment design (DYED) regularization framework. Application of the proposed DYED framework to the adaptive array radar/SAR imaging problem leads to a class of two-level (DEED-VA) regularized SSP reconstruction techniques that aggregate the kernel adaptive anisotropic windowing with the projections onto convex sets to enforce the consistency and robustness of the overall iterative SSP estimators. We also show how the proposed DYED regularization method may be considered as a generalization of the MVDR, APES and other high-resolution nonparametric adaptive radar sensing techniques. A family of the DYED-related algorithms is constructed and their effectiveness is finally illustrated via numerical simulations. PMID:22163859

  2. Multi-rate cubature Kalman filter based data fusion method with residual compensation to adapt to sampling rate discrepancy in attitude measurement system.

    PubMed

    Guo, Xiaoting; Sun, Changku; Wang, Peng

    2017-08-01

    This paper investigates the multi-rate inertial and vision data fusion problem in nonlinear attitude measurement systems, where the sampling rate of the inertial sensor is much faster than that of the vision sensor. To fully exploit the high frequency inertial data and obtain favorable fusion results, a multi-rate CKF (Cubature Kalman Filter) algorithm with estimated residual compensation is proposed in order to adapt to the problem of sampling rate discrepancy. Between samples of the slow observation data, the observation noise can be regarded as infinite, so the Kalman gain is unknown and approaches zero; the residual is also unknown, and the filter's estimated state cannot be compensated. To obtain compensation at these moments, the state error and residual formulas are modified relative to the moments when observation data are available. A self-propagation equation of the state error is established to propagate this quantity from the moments with observations to the moments without them. In addition, a multiplicative adjustment factor is introduced as the Kalman gain, which acts on the residual. The filter's estimated state can then be compensated even when no visual observation data are available. The proposed method is tested and verified in a practical setup. Compared with a multi-rate CKF without residual compensation and a single-rate CKF, a significant improvement in attitude measurement is obtained by using the proposed multi-rate CKF with inter-sample residual compensation. The experimental results, with superior precision and reliability, show the effectiveness of the proposed method.
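
    The inter-sample compensation idea can be caricatured with a deliberately simplified linear, single-axis Kalman filter (the cubature machinery is omitted). The rates, noise levels, and the decaying adjustment factor gamma below are all assumptions for illustration.

      import numpy as np

      dt, R = 0.01, 20                       # inertial step; vision every 20 steps
      F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-rate model (angle, rate)
      H = np.array([[1.0, 0.0]])             # vision observes the angle only
      Q = 1e-6 * np.eye(2)                   # process noise covariance
      Rv = np.array([[1e-3]])                # vision measurement noise covariance

      rng = np.random.default_rng(1)
      truth = np.array([0.0, 0.1])
      x, P = np.zeros(2), np.eye(2)
      last_res, gamma = np.zeros(1), 0.2     # stored residual; adjustment factor

      for k in range(1, 500):
          truth = F @ truth
          x, P = F @ x, F @ P @ F.T + Q      # predict at the fast inertial rate
          if k % R == 0:                     # slow vision sample available
              zk = H @ truth + rng.normal(0.0, np.sqrt(Rv[0, 0]))
              S = H @ P @ H.T + Rv
              K = P @ H.T @ np.linalg.inv(S)
              last_res = zk - H @ x          # innovation (residual)
              x = x + K @ last_res
              P = (np.eye(2) - K @ H) @ P
          else:                              # inter-sample residual compensation
              x[0] += gamma * last_res[0]
              last_res = gamma * last_res    # decay the stored residual

      print("final angle error:", abs(x[0] - truth[0]))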

  3. Adaptive Metropolis Sampling with Product Distributions

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Lee, Chiu Fan

    2005-01-01

    The Metropolis-Hastings (MH) algorithm is a way to sample a provided target distribution pi(z). It works by repeatedly sampling a separate proposal distribution T(x, x') to generate a random walk {x(t)}. We consider a modification of the MH algorithm in which T is dynamically updated during the walk. The update at time t uses the samples {x(t') : t' < t} to estimate the product distribution that has the least Kullback-Leibler distance to pi. That estimate is the information-theoretically optimal mean-field approximation to pi. We demonstrate through computer experiments that our algorithm produces samples that are superior to those of the conventional MH algorithm.
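
    A minimal sketch of the scheme on a toy 2-D target: an independence proposal given by a product of Gaussians is refitted to the walk history at fixed intervals (for a Gaussian family, the KL-closest product distribution matches the marginal means and variances). The refit schedule and constants are invented, and the diminishing-adaptation conditions needed for strict correctness are glossed over.

      import numpy as np

      rng = np.random.default_rng(2)

      def log_pi(x):
          """Toy 2-D target: a banana-shaped log-density."""
          return -0.5 * (x[0] ** 2 + 5.0 * (x[1] - x[0] ** 2) ** 2)

      def log_q(x, mu, sig):
          """Log-density of the product (mean-field) Gaussian proposal."""
          return -0.5 * np.sum(((x - mu) / sig) ** 2 + 2 * np.log(sig))

      x = np.zeros(2)
      mu, sig = np.zeros(2), np.ones(2)
      chain = [x.copy()]
      for t in range(1, 20000):
          xp = mu + sig * rng.standard_normal(2)      # independence proposal
          # MH acceptance for an independence sampler: pi(x')q(x) / (pi(x)q(x')).
          log_a = log_pi(xp) - log_pi(x) + log_q(x, mu, sig) - log_q(xp, mu, sig)
          if np.log(rng.random()) < log_a:
              x = xp
          chain.append(x.copy())
          if t % 500 == 0:                            # refit the product proposal
              h = np.array(chain)
              # Product of marginals estimated from the walk history; the small
              # floor keeps the proposal from collapsing early on.
              mu, sig = h.mean(axis=0), h.std(axis=0) + 0.05

      print("posterior mean estimate:", np.array(chain)[10000:].mean(axis=0))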

  4. Real-time nutrient monitoring in rivers: adaptive sampling strategies, technological challenges and future directions

    NASA Astrophysics Data System (ADS)

    Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Bradley, Chris

    2016-04-01

    Excessive nutrient concentrations in river waters threaten aquatic ecosystem functioning and can pose substantial risks to human health. Robust monitoring strategies are therefore required to generate reliable estimates of river nutrient loads and to improve understanding of the catchment processes that drive spatiotemporal patterns in nutrient fluxes. Furthermore, these data are vital for prediction of future trends under changing environmental conditions and thus for the development of appropriate mitigation measures. In recent years, technological developments have led to an increase in the use of continuous in-situ nutrient analysers, which enable measurements at far higher temporal resolutions than can be achieved with discrete sampling and subsequent laboratory analysis. However, such instruments can be costly to run and difficult to maintain (e.g. due to high power consumption and memory requirements), leading to trade-offs between temporal and spatial monitoring resolutions. Here, we highlight how adaptive monitoring strategies, comprising a mixture of temporal sample frequencies controlled by one or more 'trigger variables' (e.g. river stage, turbidity, or nutrient concentration), can advance our understanding of catchment nutrient dynamics while simultaneously overcoming many of the practical and economic challenges encountered in typical in-situ river nutrient monitoring applications. We present examples of short-term variability in river nutrient dynamics, driven by complex catchment behaviour, which support our case for the development of monitoring systems that can adapt in real-time to rapid environmental changes. In addition, we discuss the advantages and disadvantages of current nutrient monitoring techniques and suggest new research directions based on emerging technologies, highlighting how these might improve: 1) monitoring strategies, and 2) understanding of linkages between catchment processes and river nutrient fluxes.
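
    At its core, such an adaptive strategy is a rule mapping one or more trigger variables to a sampling interval. The sketch below uses invented thresholds, rates, and stand-in sensor reads.

      import random

      BASE_INTERVAL_S = 3600         # hourly baseline sampling
      STORM_INTERVAL_S = 300         # 5-minute sampling during events

      def read_stage_m():
          return random.uniform(0.2, 2.5)    # stand-in for a real sensor driver

      def read_turbidity_ntu():
          return random.uniform(5.0, 80.0)   # stand-in for a real sensor driver

      def next_interval(stage_m, turbidity_ntu):
          """Shorten the sampling interval when either trigger variable fires."""
          if stage_m > 1.5 or turbidity_ntu > 50.0:
              return STORM_INTERVAL_S
          return BASE_INTERVAL_S

      schedule = []
      for _ in range(24):                    # one simulated day of decisions
          schedule.append(next_interval(read_stage_m(), read_turbidity_ntu()))
      print("sampling intervals (s):", schedule)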

  5. Identifying Immune Drivers of Gulf War Illness Using a Novel Daily Sampling Approach

    DTIC Science & Technology

    2017-10-01

    AWARD NUMBER: W81XWH-12-1-0557. TITLE: Identifying Immune Drivers of Gulf War Illness Using a Novel Daily Sampling Approach. INTRODUCTION: The major aim of this research project is to identify aspects of the immune system that are dysregulated in veterans with Gulf War Illness.

  6. Indirect estimation of signal-dependent noise with nonadaptive heterogeneous samples.

    PubMed

    Azzari, Lucio; Foi, Alessandro

    2014-08-01

    We consider the estimation of signal-dependent noise from a single image. Unlike conventional algorithms that build a scatterplot of local mean-variance pairs from either small or adaptively selected homogeneous data samples, our proposed approach relies on arbitrarily large patches of heterogeneous data extracted at random from the image. We demonstrate the feasibility of our approach through an extensive theoretical analysis based on mixture of Gaussian distributions. A prototype algorithm is also developed in order to validate the approach on simulated data as well as on real camera raw images.
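
    For orientation, the conventional scatterplot-style estimate that this paper moves beyond can be sketched in a few lines: sample patches, collect local (mean, variance) pairs, and fit the signal-dependent model var = a*signal + b. The image and noise parameters below are synthetic, and a smooth test image is used so that random patches are quasi-homogeneous.

      import numpy as np

      rng = np.random.default_rng(3)

      # Smooth synthetic image with signal-dependent noise, a=0.01, b=0.0004.
      signal = np.tile(np.linspace(0.1, 0.9, 256), (256, 1))
      noisy = signal + rng.normal(0.0, np.sqrt(0.01 * signal + 0.0004))

      means, variances = [], []
      for _ in range(5000):
          i, j = rng.integers(0, 256 - 8, size=2)
          p = noisy[i:i + 8, j:j + 8]        # smooth image -> quasi-homogeneous patch
          means.append(p.mean())
          variances.append(p.var(ddof=1))

      # Least-squares line through the (mean, variance) scatter recovers (a, b).
      A = np.vstack([means, np.ones(len(means))]).T
      a_hat, b_hat = np.linalg.lstsq(A, np.array(variances), rcond=None)[0]
      print(f"estimated a = {a_hat:.4f}, b = {b_hat:.5f}")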

  7. Beyond Risk and Protective Factors: An Adaptation-Based Approach to Resilience.

    PubMed

    Ellis, Bruce J; Bianchi, JeanMarie; Griskevicius, Vladas; Frankenhuis, Willem E

    2017-07-01

    How does repeated or chronic childhood adversity shape social and cognitive abilities? According to the prevailing deficit model, children from high-stress backgrounds are at risk for impairments in learning and behavior, and the intervention goal is to prevent, reduce, or repair the damage. Missing from this deficit approach is an attempt to leverage the unique strengths and abilities that develop in response to high-stress environments. Evolutionary-developmental models emphasize the coherent, functional changes that occur in response to stress over the life course. Research in birds, rodents, and humans suggests that developmental exposures to stress can improve forms of attention, perception, learning, memory, and problem solving that are ecologically relevant in harsh-unpredictable environments (as per the specialization hypothesis). Many of these skills and abilities, moreover, are primarily manifest in currently stressful contexts where they would provide the greatest fitness-relevant advantages (as per the sensitization hypothesis). This perspective supports an alternative adaptation-based approach to resilience that converges on a central question: "What are the attention, learning, memory, problem-solving, and decision-making strategies that are enhanced through exposures to childhood adversity?" At an applied level, this approach focuses on how we can work with, rather than against, these strengths to promote success in education, employment, and civic life.

  8. Deterministic and reliability based optimization of integrated thermal protection system composite panel using adaptive sampling techniques

    NASA Astrophysics Data System (ADS)

    Ravishankar, Bharani

    Conventional space vehicles have thermal protection systems (TPS) that provide protection to an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential, but complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS subjected to thermal and mechanical loads through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of the 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization was applicable only for panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel. Hence a single unit cell was used for the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points, which further demands computationally expensive finite element analyses; these were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately to find the optimum design. Instead of building global surrogate models using a large number of designs, the computational resources were directed towards target regions near constraint boundaries for accurate representation of constraints using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space. EGRA was applied to the response surface construction of
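
    A minimal sketch of boundary-targeted adaptive sampling in the spirit of EGRA, assuming scikit-learn's Gaussian process regressor, a cheap one-dimensional stand-in for the expensive constraint, and a simplified acquisition (smallest |mu|/sigma) rather than the full expected feasibility function.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def g(x):                       # stand-in for an expensive FE analysis
          return np.sin(3 * x) + 0.5 * x - 0.2

      rng = np.random.default_rng(4)
      X = rng.uniform(-2, 2, size=(6, 1))       # small initial design
      y = g(X).ravel()
      grid = np.linspace(-2, 2, 400).reshape(-1, 1)

      for it in range(10):
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                        normalize_y=True).fit(X, y)
          mu, sd = gp.predict(grid, return_std=True)
          score = np.abs(mu) / (sd + 1e-9)      # small = likely near g(x) = 0
          x_new = grid[np.argmin(score)]        # sample at the constraint boundary
          X = np.vstack([X, x_new])
          y = np.append(y, g(x_new))

      print("points added near the constraint boundary:", np.sort(X[6:].ravel()))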

  9. Social Daydreaming and Adjustment: An Experience-Sampling Study of Socio-Emotional Adaptation During a Life Transition

    PubMed Central

    Poerio, Giulia L.; Totterdell, Peter; Emerson, Lisa-Marie; Miles, Eleanor

    2016-01-01

    Estimates suggest that up to half of waking life is spent daydreaming; that is, engaged in thought that is independent of, and unrelated to, one’s current task. Emerging research indicates that daydreams are predominately social, suggesting that daydreams may serve socio-emotional functions. Here we explore the functional role of social daydreaming for socio-emotional adjustment during an important and stressful life transition (the transition to university) using experience-sampling with 103 participants over 28 days. Over time, social daydreams increased in their positive characteristics and positive emotional outcomes; specifically, participants reported that their daydreams made them feel more socially connected and less lonely, and that the content of their daydreams became less fanciful and involved higher quality relationships. These characteristics then predicted less loneliness at the end of the study, which, in turn, was associated with greater social adaptation to university. Feelings of connection resulting from social daydreams were also associated with less emotional inertia in participants who reported being less socially adapted to university. Findings indicate that social daydreaming is functional for promoting socio-emotional adjustment to an important life event. We highlight the need to consider the social content of stimulus-independent cognitions, their characteristics, and patterns of change, to specify how social thoughts enable socio-emotional adaptation. PMID:26834685

  10. Development and Testing of Harpoon-Based Approaches for Collecting Comet Samples

    NASA Technical Reports Server (NTRS)

    Purves, Lloyd (Compiler); Nuth, Joseph (Compiler); Amatucci, Edward (Compiler); Wegel, Donald; Smith, Walter; Church, Joseph; Leary, James; Kee, Lake; Hill, Stuart; Grebenstein, Markus; hide

    2017-01-01

    Comets, having bright tails visible to the unassisted human eye, are considered to have been known about since pre-historic times. In fact, 3,000-year-old written records of comet sightings have been identified. In comparison, asteroids, being so dim that telescopes are required for observation, were not discovered until 1801. Yet, despite their later discovery, a space mission returned the first samples of an asteroid in 2010 and two more asteroid sample return missions have already been launched. By contrast no comet sample return mission has ever been funded, despite the fact that comets in certain ways are far more scientifically interesting than asteroids. Why is this? The basic answer is the greater difficulty, and consequently higher cost, of a comet sample return mission. Comets typically are in highly elliptical heliocentric orbits which require much more time and propulsion for Space Craft (SC) to reach from Earth and then return to Earth as compared to many asteroids which are in Earth-like orbits. It is also harder for a SC to maneuver safely near a comet, given the generally longer communications distances and the challenge of navigating close to the comet when it is near perihelion, which turns out to be one of the most interesting times for a SC to get close to the comet surface due to the science value of better understanding the sublimation of volatiles near the comet surface. Other contributions to higher cost include the desire to get sample material from both the comet surface and a little below, to preserve the stratigraphy of the sample, and to return the sample in a storage state where it does not undergo undesirable alterations, such as aqueous alteration. In response to these challenges of comet sample return missions, the NASA Goddard Space Flight Center (GSFC) has worked for about a decade (2006 to this time) to develop and test approaches for comet sample return that would enable such a mission to be scientifically valuable, while having acceptably

  11. Urn models for response-adaptive randomized designs: a simulation study based on a non-adaptive randomized trial.

    PubMed

    Ghiglietti, Andrea; Scarale, Maria Giovanna; Miceli, Rosalba; Ieva, Francesca; Mariani, Luigi; Gavazzi, Cecilia; Paganoni, Anna Maria; Edefonti, Valeria

    2018-03-22

    Recently, response-adaptive designs have been proposed in randomized clinical trials to achieve ethical and/or cost advantages by using sequential accrual information collected during the trial to dynamically update the probabilities of treatment assignments. In this context, urn models, where the probability of assigning patients to treatments is interpreted as the proportion of balls of different colors available in a virtual urn, have been used as response-adaptive randomization rules. We propose the use of Randomly Reinforced Urn (RRU) models in a simulation study based on a published randomized clinical trial on the efficacy of home enteral nutrition in cancer patients after major gastrointestinal surgery. We compare results from the RRU design with those previously published for the non-adaptive approach. We also provide code written in R to implement the RRU design in practice. In detail, we simulate 10,000 trials based on the RRU model in three set-ups of different total sample sizes. We report information on the number of patients allocated to the inferior treatment and on the empirical power of the t-test for the treatment coefficient in the ANOVA model. We carry out a sensitivity analysis to assess the effect of different urn compositions. For each sample size, in approximately 75% of the simulation runs, the number of patients allocated to the inferior treatment by the RRU design is lower, as compared to the non-adaptive design. The empirical power of the t-test for the treatment effect is similar in the two designs.
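
    The paper supplies R code for the design; a minimal Python sketch of the underlying urn mechanism, with an invented reinforcement rule and success probabilities, looks like this.

      import numpy as np

      rng = np.random.default_rng(5)

      def rru_trial(n_patients, p_success=(0.7, 0.5), urn=(1.0, 1.0)):
          """Simulate one two-arm trial under a randomly reinforced urn:
          draw each assignment from the current urn composition, then
          reinforce the drawn colour when the assigned treatment succeeds.
          The reinforcement scheme and parameters are illustrative only."""
          balls = np.array(urn, dtype=float)
          assignments = []
          for _ in range(n_patients):
              k = 0 if rng.random() < balls[0] / balls.sum() else 1
              if rng.random() < p_success[k]:
                  balls[k] += 1.0            # reinforce the winning colour
              assignments.append(k)
          return np.array(assignments)

      arms = rru_trial(200)
      print("share assigned to the better arm:", np.mean(arms == 0))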

  12. The redshift distribution of cosmological samples: a forward modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herbel, Jörg; Kacprzak, Tomasz; Amara, Adam

    Determining the redshift distribution n(z) of galaxy samples is essential for several cosmological probes including weak lensing. For imaging surveys, this is usually done using photometric redshifts estimated on an object-by-object basis. We present a new approach for directly measuring the global n(z) of cosmological galaxy samples, including uncertainties, using forward modeling. Our method relies on image simulations produced using UFig (Ultra Fast Image Generator) and on ABC (Approximate Bayesian Computation) within the MCCL (Monte-Carlo Control Loops) framework. The galaxy population is modeled using parametric forms for the luminosity functions, spectral energy distributions, sizes and radial profiles of both blue and red galaxies. We apply exactly the same analysis to the real data and to the simulated images, which also include instrumental and observational effects. By adjusting the parameters of the simulations, we derive a set of acceptable models that are statistically consistent with the data. We then apply the same cuts to the simulations that were used to construct the target galaxy sample in the real data. The redshifts of the galaxies in the resulting simulated samples yield a set of n(z) distributions for the acceptable models. We demonstrate the method by determining n(z) for a cosmic shear-like galaxy sample from the 4-band Subaru Suprime-Cam data in the COSMOS field. We also complement this imaging data with a spectroscopic calibration sample from the VVDS survey. We compare our resulting posterior n(z) distributions to the one derived from photometric redshifts estimated using 36 photometric bands in COSMOS and find good agreement. This offers good prospects for applying our approach to current and future large imaging surveys.

  13. Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change

    NASA Astrophysics Data System (ADS)

    Girard, Corentin; Rinaudo, Jean-Daniel; Pulido-Velazquez, Manuel

    2016-04-01

    Adaptation to the multiple facets of global change challenges the conventional means of sustainably planning and managing water resources at the river basin scale. Numerous demand or supply management options are available, from which adaptation measures need to be selected in a context of high uncertainty of future conditions. Given the interdependency of water users, agreements need to be found at the local level to implement the most effective adaptation measures. Therefore, this work develops an approach combining economics and water resources engineering to select a cost-effective programme of adaptation measures in the context of climate change uncertainty, and to define an equitable allocation of the cost of the adaptation plan between the stakeholders involved. A framework is developed to integrate inputs from the two main approaches commonly used to plan for adaptation. The first, referred to as "top-down", consists of a modelling chain going from global greenhouse gases emission scenarios to local hydrological models used to assess the impact of climate change on water resources. Conversely, the second approach, called "bottom-up", starts from assessing vulnerability at the local level to then identify adaptation measures used to face an uncertain future. The methodological framework presented in this contribution relies on a combination of these two approaches to support the selection of adaptation measures at the local level. Outcomes from these two approaches are integrated to select a cost-effective combination of adaptation measures through a least-cost optimization model developed at the river basin scale. The performances of a programme of measures are assessed under different climate projections to identify cost-effective and least-regret adaptation measures. The issue of allocating the cost of the adaptation plan is considered through two complementary perspectives. The outcome of a negotiation process between the stakeholders is modelled through

  14. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  15. A control systems engineering approach for adaptive behavioral interventions: illustration with a fibromyalgia intervention.

    PubMed

    Deshpande, Sunil; Rivera, Daniel E; Younger, Jarred W; Nandola, Naresh N

    2014-09-01

    The term adaptive intervention has been used in behavioral medicine to describe operationalized and individually tailored strategies for prevention and treatment of chronic, relapsing disorders. Control systems engineering offers an attractive means for designing and implementing adaptive behavioral interventions that feature intensive measurement and frequent decision-making over time. This is illustrated in this paper for the case of a low-dose naltrexone treatment intervention for fibromyalgia. System identification methods from engineering are used to estimate dynamical models from daily diary reports completed by participants. These dynamical models then form part of a model predictive control algorithm which systematically decides on treatment dosages based on measurements obtained under real-life conditions involving noise, disturbances, and uncertainty. The effectiveness and implications of this approach for behavioral interventions (in general) and pain treatment (in particular) are demonstrated using informative simulations.
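
    A stripped-down sketch of the identification-then-control loop: fit a first-order ARX model to daily diary scores, then pick the dosage that minimizes predicted cost over a receding horizon by brute-force search. The coefficients, dose range, and cost weights are invented, and a real MPC formulation for this application is considerably richer.

      import numpy as np

      rng = np.random.default_rng(6)

      # --- Toy ARX identification from daily diary data:
      # pain(t) = a*pain(t-1) + b*dose(t-1) + c + noise.
      T = 120
      dose = rng.uniform(0.0, 4.5, T)                  # historical daily dosages
      pain = np.empty(T); pain[0] = 5.0
      for t in range(1, T):
          pain[t] = 0.8 * pain[t-1] - 0.15 * dose[t-1] + 1.0 + rng.normal(0, 0.1)

      X = np.column_stack([pain[:-1], dose[:-1], np.ones(T - 1)])
      a, b, c = np.linalg.lstsq(X, pain[1:], rcond=None)[0]

      # --- Receding-horizon decision: enumerate candidate constant dosages
      # and pick the one minimizing predicted pain plus a dosage penalty.
      def mpc_dose(p_now, horizon=7, lam=0.05):
          best_u, best_cost = 0.0, np.inf
          for u in np.linspace(0.0, 4.5, 10):
              p, cost = p_now, 0.0
              for _ in range(horizon):
                  p = a * p + b * u + c                # roll the model forward
                  cost += p + lam * u**2
              if cost < best_cost:
                  best_u, best_cost = u, cost
          return best_u

      print(f"identified a={a:.2f}, b={b:.2f}; next dose: {mpc_dose(6.0):.1f}")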

  16. An adaptive confidence limit for periodic non-steady conditions fault detection

    NASA Astrophysics Data System (ADS)

    Wang, Tianzhen; Wu, Hao; Ni, Mengqi; Zhang, Milu; Dong, Jingjing; Benbouzid, Mohamed El Hachemi; Hu, Xiong

    2016-05-01

    System monitoring has become a major concern in batch processes due to the fact that the failure rate in non-steady conditions is much higher than in steady ones. A series of approaches based on PCA have already solved problems such as data dimensionality reduction, multivariable decorrelation, and processing of non-changing signals. However, if the data follow a non-Gaussian distribution or the variables contain some signal changes, the above approaches are not applicable. To deal with these concerns and to enhance performance in multiperiod data processing, this paper proposes a fault detection method using an adaptive confidence limit (ACL) in periodic non-steady conditions. The proposed ACL method achieves four main enhancements: Longitudinal-Standardization converts non-Gaussian sampling data to Gaussian ones; the multiperiod PCA algorithm reduces dimensionality, removes correlation, and improves the monitoring accuracy; the adaptive confidence limit detects faults under non-steady conditions; and the fault sections determination procedure selects the appropriate parameter of the adaptive confidence limit. The result analysis clearly shows that the proposed ACL method is superior to other fault detection approaches under periodic non-steady conditions.
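
    A greatly simplified sketch of the multiperiod idea: fit a separate PCA model and an empirical T^2 limit for each operating phase, so the confidence limit adapts to the phase instead of being global. The data, phase split, and 99% empirical limit are illustrative, and the Longitudinal-Standardization step is not reproduced.

      import numpy as np

      rng = np.random.default_rng(9)

      def fit_phase(Xp, n_pc=2):
          """Per-phase PCA model with an empirical 99% T^2 limit."""
          mu, sd = Xp.mean(0), Xp.std(0) + 1e-9
          Z = (Xp - mu) / sd
          _, S, Vt = np.linalg.svd(Z, full_matrices=False)
          P, lam = Vt[:n_pc].T, (S[:n_pc] ** 2) / (len(Xp) - 1)
          t2 = np.sum(((Z @ P) ** 2) / lam, axis=1)
          return mu, sd, P, lam, np.percentile(t2, 99)

      def t2_score(x, model):
          mu, sd, P, lam, _ = model
          z = (x - mu) / sd
          return np.sum(((z @ P) ** 2) / lam)

      # Two operating phases with different normal behaviour.
      phase1 = rng.normal(0, 1, (300, 5))
      phase2 = rng.normal(0, 1, (300, 5)) * 3.0   # non-steady: higher variance
      models = [fit_phase(phase1), fit_phase(phase2)]

      x_new, phase = rng.normal(0, 1, 5) * 9.0, 1  # suspicious sample in phase 2
      m = models[phase]                            # phase-adaptive limit
      print("fault!" if t2_score(x_new, m) > m[4] else "normal")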

  17. Optimal flexible sample size design with robust power.

    PubMed

    Zhang, Lanju; Cui, Lu; Yang, Bo

    2016-08-30

    It is well recognized that sample size determination is challenging because of the uncertainty about the treatment effect size. Several remedies are available in the literature. Group sequential designs start with a sample size based on a conservative (smaller) effect size and allow early stopping at interim looks. Sample size re-estimation designs start with a sample size based on an optimistic (larger) effect size and allow a sample size increase if the observed effect size is smaller than planned. Different opinions favoring one type over the other exist. We propose an optimal approach using an appropriate optimality criterion to select the best design among all the candidate designs. Our results show that (1) for the same type of designs, for example, group sequential designs, there is room for significant improvement through our optimization approach; (2) optimal promising zone designs appear to have no advantages over optimal group sequential designs; and (3) optimal designs with sample size re-estimation deliver the best adaptive performance. We conclude that to deal with the challenge of sample size determination due to effect size uncertainty, an optimal approach can help to select the best design that provides the most robust power across the effect size range of interest. Copyright © 2016 John Wiley & Sons, Ltd.
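
    For concreteness, one simple re-estimation rule of the kind being compared can be sketched as follows (normal approximation, known variance, invented numbers; this is not the paper's optimality criterion). It assumes SciPy for the normal quantiles.

      from math import ceil
      from scipy.stats import norm

      alpha, power = 0.025, 0.9
      z = norm.ppf(1 - alpha) + norm.ppf(power)

      def n_per_arm(delta, sigma=1.0):
          """Standard normal-approximation sample size per arm."""
          return ceil(2 * (z * sigma / delta) ** 2)

      n_planned = n_per_arm(delta=0.5)          # optimistic planning effect
      delta_interim = 0.35                      # smaller effect seen at interim
      n_new = min(n_per_arm(delta_interim), 4 * n_planned)   # re-estimate, capped
      print(n_planned, n_new)                   # 85 -> 172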

  18. Particle systems for adaptive, isotropic meshing of CAD models

    PubMed Central

    Levine, Joshua A.; Whitaker, Ross T.

    2012-01-01

    We present a particle-based approach for generating adaptive triangular surface and tetrahedral volume meshes from computer-aided design models. Input shapes are treated as a collection of smooth, parametric surface patches that can meet non-smoothly on boundaries. Our approach uses a hierarchical sampling scheme that places particles on features in order of increasing dimensionality. These particles reach a good distribution by minimizing an energy computed in 3D world space, with movements occurring in the parametric space of each surface patch. Rather than using a pre-computed measure of feature size, our system automatically adapts to both curvature as well as a notion of topological separation. It also enforces a measure of smoothness on these constraints to construct a sizing field that acts as a proxy to piecewise-smooth feature size. We evaluate our technique with comparisons against other popular triangular meshing techniques for this domain. PMID:23162181

  19. Salmon and Sagebrush: The Shoshone-Bannock Tribes Collaborative Approach to Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Petersen, A.; Nasser, E.; Stone, D.; Krosby, M.; Whitley-Binder, L.; Morgan, H.; Rupp, D. E.; Dello, K.; Dalton, M. M.; Fox, M.; Rodgers, K.

    2017-12-01

    The Shoshone-Bannock Tribes reside in the Upper Snake River Watershed in southeast Idaho. Their lives and culture are intertwined with the lands where they live; lands which continue to sustain the Tribes' cultural, spiritual, dietary and economic needs. Climate change presents a new threat to the region, requiring innovative approaches to prepare for changes as well as to protect the natural resources within the region. As a critical first step in building climate resilience, the Tribes worked with Adaptation International, the University of Washington's Climate Impacts Group (CIG) and the Oregon Climate Change Research Institute (OCCRI) to complete a collaborative climate change vulnerability assessment and adaptation planning process. This presentation provides an overview of the collaborative process, shares the results of the project, and includes a 3-minute video presentation. The project started with the identification of 34 plant and animal species to focus the vulnerability assessment. OCCRI analyzed detailed downscaled climate projections for two key climate scenarios (RCP 4.5 and RCP 8.5) and timescales (2050s and 2080s). CIG then used NatureServe's Climate Change Vulnerability Index (CCVI) to develop initial relative vulnerability results for these species. A core team of Tribal staff members from various departments refined these results, drawing upon and integrating rich local and traditional knowledges of the natural environment and cultural resources. The adaptation planning phase of the project continued in a similar collaborative manner, with the project team identifying promising adaptation actions and working directly with Tribal staff to refine and customize these strategies. Tailoring the actions to the local context provides a framework for action that the Tribes can continue to build on in the future. By engaging in these efforts to identify vulnerable species and adaptation strategies and actions to minimize the negative effects of climate

  20. Approaches to evaluating climate change impacts on species: A guide to initiating the adaptation planning process

    Treesearch

    Erika L. Rowland; Jennifer E. Davison; Lisa J. Graumlich

    2011-01-01

    Assessing the impact of climate change on species and associated management objectives is a critical initial step for engaging in the adaptation planning process. Multiple approaches are available. While all possess limitations to their application associated with the uncertainties inherent in the data and models that inform their results, conducting and incorporating...

  1. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  2. Assessing metacognition of grade 2 and grade 4 students using an adaptation of multi-method interview approach during mathematics problem-solving

    NASA Astrophysics Data System (ADS)

    Kuzle, A.

    2018-06-01

    The important role that metacognition plays as a predictor for student mathematical learning and for mathematical problem-solving, has been extensively documented. But only recently has attention turned to primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present metacognitive framework during mathematics problem-solving, (2) to describe their multi-method interview approach developed to study student mathematical metacognition, and (3) to empirically evaluate the utility of their model and the adaptation of their approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed not only with regard to further development of the adapted multi-method interview approach, but also with regard to their theoretical and practical implications.

  3. An adaptive optics approach for laser beam correction in turbulence utilizing a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Ko, Jonathan; Wu, Chensheng; Davis, Christopher C.

    2015-09-01

    Adaptive optics has been widely used in the field of astronomy to correct for atmospheric turbulence while viewing images of celestial bodies. The slightly distorted incoming wavefronts are typically sensed with a Shack-Hartmann sensor and then corrected with a deformable mirror. Although this approach has proven to be effective for astronomical purposes, a new approach must be developed when correcting for the deep turbulence experienced in ground-to-ground optical systems. We propose the use of a modified plenoptic camera as a wavefront sensor capable of accurately representing an incoming wavefront that has been significantly distorted by strong turbulence conditions (C_n^2 < 10^-13 m^-2/3). An intelligent correction algorithm can then be developed to reconstruct the perturbed wavefront and use this information to drive a deformable mirror capable of correcting the major distortions. After the large distortions have been corrected, a secondary mode utilizing more traditional adaptive optics algorithms can take over to fine-tune the wavefront correction. This two-stage algorithm can find use in free space optical communication systems, in directed energy applications, as well as for image correction purposes.

  4. Adapting populations in space: clonal interference and genetic diversity

    NASA Astrophysics Data System (ADS)

    Weissman, Daniel; Barton, Nick

    Most species inhabit ranges much larger than the scales over which individuals interact. How does this spatial structure interact with adaptive evolution? We consider a simple model of a spatially-extended, adapting population and show that, while clonal interference severely limits the adaptation of purely asexual populations, even rare recombination is enough to allow adaptation at rates approaching those of well-mixed populations. We also find that the genetic hitchhiking produced by the adaptive alleles sweeping through the population has strange effects on the patterns of genetic diversity. In large spatial ranges, even low rates of adaptation cause all individuals in the population to rapidly trace their ancestry back to individuals living in a small region in the center of the range. The probability of fixation of an allele is thus strongly dependent on the allele's spatial location, with alleles from the center favored. Surprisingly, these effects are seen genome-wide (instead of being localized to the regions of the genome undergoing the sweeps). The spatial concentration of ancestry produces a power-law dependence of relatedness on distance, so that even individuals sampled far apart are likely to be fairly closely related, masking the underlying spatial structure.

  5. A Risk-based Model Predictive Control Approach to Adaptive Interventions in Behavioral Health

    PubMed Central

    Zafra-Cabeza, Ascensión; Rivera, Daniel E.; Collins, Linda M.; Ridao, Miguel A.; Camacho, Eduardo F.

    2010-01-01

    This paper examines how control engineering and risk management techniques can be applied in the field of behavioral health through their use in the design and implementation of adaptive behavioral interventions. Adaptive interventions are gaining increasing acceptance as a means to improve prevention and treatment of chronic, relapsing disorders, such as abuse of alcohol, tobacco, and other drugs, mental illness, and obesity. A risk-based Model Predictive Control (MPC) algorithm is developed for a hypothetical intervention inspired by Fast Track, a real-life program whose long-term goal is the prevention of conduct disorders in at-risk children. The MPC-based algorithm decides on the appropriate frequency of counselor home visits, mentoring sessions, and the availability of after-school recreation activities by relying on a model that includes identifiable risks, their costs, and the cost/benefit assessment of mitigating actions. MPC is particularly suited for the problem because of its constraint-handling capabilities, and its ability to scale to interventions involving multiple tailoring variables. By systematically accounting for risks and adapting treatment components over time, an MPC approach as described in this paper can increase intervention effectiveness and adherence while reducing waste, resulting in advantages over conventional fixed treatment. A series of simulations are conducted under varying conditions to demonstrate the effectiveness of the algorithm. PMID:21643450

  6. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    NASA Technical Reports Server (NTRS)

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.

  7. A scenario framework to explore the future migration and adaptation in deltas: A multi-scale and participatory approach

    NASA Astrophysics Data System (ADS)

    Kebede, Abiy S.; Nicholls, Robert J.; Allan, Andrew; Arto, Inaki; Cazcarro, Ignacio; Fernandes, Jose A.; Hill, Chris T.; Hutton, Craig W.; Kay, Susan; Lawn, Jon; Lazar, Attila N.; Whitehead, Paul W.

    2017-04-01

    Coastal deltas are home to over 500 million people globally, and they have been identified as one of the most vulnerable coastal environments during the 21st century. They are susceptible to multiple climatic (e.g., sea-level rise, storm surges, change in temperature and precipitation) and socio-economic (e.g., human-induced subsidence, population and urbanisation changes, GDP growth) drivers of change. These drivers also operate at multiple scales, ranging from local to global and short- to long-term. This highlights the complex challenges deltas face in terms of both their long-term sustainability and the well-being of their residents, as well as the health of the ecosystems that support the livelihoods of large (often very poor) populations under uncertain, changing conditions. A holistic understanding of these challenges and the potential impacts of future climate and socio-economic changes is central for devising robust adaptation policies. Scenario analysis has long been identified as a strategic management tool to explore future climate change and its impacts for supporting robust decision-making under uncertainty. This work presents the overall scenario framework, methodology, and processes adopted for the development of scenarios in the DECCMA* project. DECCMA is analysing the future of three deltas in South Asia and West Africa: (i) the Ganges-Brahmaputra-Meghna (GBM) delta (Bangladesh/India), (ii) the Mahanadi delta (India), and (iii) the Volta delta (Ghana). This includes comparisons between these three deltas. Hence, the scenario framework comprises a multi-scale hybrid approach, with six levels of scenario considerations: (i) global (climate change, e.g., sea-level rise, temperature change; and socio-economic assumptions, e.g., population and urbanisation changes, GDP growth); (ii) regional catchments (e.g., river flow modelling), (iii) regional seas (e.g., fisheries modelling), (iv) regional politics (e.g., transboundary disputes), (v) national (e.g., socio

  8. A Balanced Approach to Adaptive Probability Density Estimation.

    PubMed

    Kovacs, Julio A; Helmick, Cailee; Wriggers, Willy

    2017-01-01

    Our development of a Fast (Mutual) Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it on molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.
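
    A minimal sample-point adaptive kernel density estimate, in which each point's bandwidth scales with the distance to its k-th nearest neighbour, illustrates the kind of locally balanced smoothing BADE optimizes; the rule and constants here are assumptions, not the BADE algorithm.

      import numpy as np

      def adaptive_kde(data, grid, k=20):
          """Each data point gets its own Gaussian bandwidth, proportional to
          the distance to its k-th nearest neighbour, so smoothing widens
          where samples are sparse and sharpens where they are dense."""
          data = np.asarray(data, float)
          d = np.abs(data[:, None] - data[None, :])
          h = np.sort(d, axis=1)[:, k] + 1e-12       # per-point bandwidths
          u = (grid[:, None] - data[None, :]) / h[None, :]
          dens = np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h[None, :])
          return dens.mean(axis=1)

      rng = np.random.default_rng(7)
      # Very uneven sample: a sharp peak plus a broad, thin tail component.
      data = np.concatenate([rng.normal(0, 0.1, 900), rng.normal(4, 1.0, 100)])
      grid = np.linspace(-1, 8, 400)
      density = adaptive_kde(data, grid)
      print("integral:", float(density.sum() * (grid[1] - grid[0])))  # close to 1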

  9. Building the framework for climate change adaptation in the urban areas using participatory approach: the Czech Republic experience

    NASA Astrophysics Data System (ADS)

    Emmer, Adam; Hubatová, Marie; Lupač, Miroslav; Pondělíček, Michael; Šafařík, Miroslav; Šilhánková, Vladimíra; Vačkář, David

    2016-04-01

    The Czech Republic has experienced numerous extreme hydrometeorological / climatological events such as floods (significant ones in 1997, 2002, 2010, 2013), droughts (2013, 2015), heat waves (2015) and windstorms (2007) during past decades. These events are generally attributed to ongoing climate change and have caused losses of life and significant material damage (up to several percent of GDP in some years), especially in urban areas. To initiate the adaptation process of urban areas, the main objective was to prepare a framework for creating climate change adaptation strategies for individual cities, reflecting the physical-geographical and socioeconomic conditions of the Czech Republic. Three pilot cities (Hradec Králové, Žďár nad Sázavou, Dobruška) were used to optimize the entire procedure. Two sets of participatory seminars were organised in order to involve all key stakeholders (the city council, the department of the environment, the department of crisis management, the hydrometeorological institute, local experts, ...) in the process of creating the adaptation strategy from its early stage. Lessons learned for the framework related especially to its applicability at the local level, which is largely a matter of the understandability of the concept. Finally, this illustrative and widely applicable framework (the so-called 'road map to adaptation strategy') includes five steps: (i) analysis of existing strategies and plans at national, regional and local levels; (ii) analysing climate-change-related hazards and key vulnerabilities; (iii) identification of adaptation needs, evaluation of existing adaptation capacity and formulation of future adaptation priorities; (iv) identification of limits and barriers to adaptation (economic, environmental, ...); and (v) selection of specific types of adaptation measures reflecting the identified adaptation needs and formulated adaptation priorities. Keywords: climate change adaptation (CCA); urban areas; participatory approach

  10. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    PubMed Central

    Qin, Lei; Snoussi, Hichem; Abdallah, Fahed

    2014-01-01

    We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated during a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is prevented from contamination even in the case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences. PMID:24865883
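
    For orientation, a region covariance descriptor of the classic form (the covariance of per-pixel feature vectors over a region) can be computed as below. The fixed five-feature pool here is a simplification, since the paper's contribution is precisely to select the features adaptively.

      import numpy as np

      def region_covariance(img, x0, y0, w, h):
          # Covariance descriptor of a rectangular region, using a fixed
          # per-pixel feature pool (x, y, intensity, |Ix|, |Iy|); the paper's
          # method instead selects a compact feature subset adaptively.
          patch = img[y0:y0 + h, x0:x0 + w].astype(float)
          ys, xs = np.mgrid[0:h, 0:w]
          iy, ix = np.gradient(patch)
          feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                            np.abs(ix).ravel(), np.abs(iy).ravel()])
          return np.cov(feats)  # 5x5 symmetric positive semi-definite matrix

      img = np.random.default_rng(0).random((64, 64))
      C = region_covariance(img, 10, 10, 16, 16)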

  11. SU-F-J-97: A Joint Registration and Segmentation Approach for Large Bladder Deformations in Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Derksen, A; Koenig, L; Heldmann, S

    Purpose: To improve results of deformable image registration (DIR) in adaptive radiotherapy for large bladder deformations in CT/CBCT pelvis imaging. Methods: A variational multi-modal DIR algorithm is incorporated in a joint iterative scheme, alternating between segmentation-based bladder matching and registration. Using an initial DIR to propagate the bladder contour to the CBCT, in a segmentation step the contour is improved by discrete image gradient sampling along all surface normals and adapting the delineation to match the location of each maximum (with a search range of ±5/2 mm at the superior/inferior bladder side and a step size of 0.5 mm). An additional graph-cut-based constraint limits the maximum difference between neighboring points. This improved contour is utilized in a subsequent DIR with a surface matching constraint. By calculating a Euclidean distance map of the improved contour surface, the new constraint enforces the DIR to map each point of the original contour onto the improved contour. The resulting deformation is then used as a starting guess to compute a deformation update, which can again be used for the next segmentation step. The result is a dense deformation, able to capture much larger bladder deformations. The new method is evaluated on ten CT/CBCT male pelvis datasets, calculating Dice similarity coefficients (DSC) between the final propagated bladder contour and a manually delineated gold standard on the CBCT image. Results: Over all ten cases, an average DSC of 0.93±0.03 is achieved on the bladder. Compared with the initial DIR (0.88±0.05), the DSC is equal (2 cases) or improved (8 cases). Additionally, the DSC accuracy of the femoral bones (0.94±0.02) was not affected. Conclusion: The presented alternating segmentation/registration approach shows that the results of bladder DIR in the pelvis region can be greatly improved, especially for cases with large variations in bladder volume.

  12. A Simple and Robust Method for Partially Matched Samples Using the P-Values Pooling Approach

    PubMed Central

    Kuan, Pei Fen; Huang, Bo

    2013-01-01

    This paper focuses on statistical analyses in scenarios where some samples from the matched pairs design are missing, resulting in partially matched samples. Motivated by the idea of meta-analysis, we recast the partially matched samples as coming from two experimental designs, and propose a simple yet robust approach based on the weighted Z-test to integrate the p-values computed from these two designs. We show that the proposed approach achieves better operating characteristics in simulations and a case study, compared to existing methods for partially matched samples. PMID:23417968
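
    A weighted Z-test pooling of two one-sided p-values can be sketched as follows. The square-root-of-sample-size weights are one common choice and are an assumption here, not necessarily the weighting the authors recommend.

      import numpy as np
      from scipy.stats import norm

      def weighted_z_pool(p_paired, n_paired, p_unpaired, n_unpaired):
          # Map one-sided p-values to z-scores, combine with sqrt(n) weights
          # (a common choice; the paper discusses weighting), and map back.
          z1, z2 = norm.isf(p_paired), norm.isf(p_unpaired)
          w1, w2 = np.sqrt(n_paired), np.sqrt(n_unpaired)
          z = (w1 * z1 + w2 * z2) / np.sqrt(w1**2 + w2**2)
          return norm.sf(z)

      # p-value from 40 complete pairs pooled with one from 25 unmatched subjects
      print(weighted_z_pool(0.03, 40, 0.08, 25))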

  13. A Feedforward Adaptive Controller to Reduce the Imaging Time of Large-Sized Biological Samples with a SPM-Based Multiprobe Station

    PubMed Central

    Otero, Jorge; Guerrero, Hector; Gonzalez, Laura; Puig-Vidal, Manel

    2012-01-01

    The time required to image large samples is an important limiting factor in SPM-based systems. In multiprobe setups, especially when working with biological samples, this drawback can make it impossible to conduct certain experiments. In this work, we present a feedforward controller based on bang-bang and adaptive controls. The controls are based on the difference between the maximum speeds that can be used for imaging depending on the flatness of the sample zone. Topographic images of Escherichia coli bacteria samples were acquired using the implemented controllers. Results show that going faster in the flat zones, rather than using a constant scanning speed for the whole image, speeds up the imaging of large samples by up to a factor of 4. PMID:22368491

  14. The role of career adaptability and courage on life satisfaction in adolescence.

    PubMed

    Ginevra, Maria Cristina; Magnano, Paola; Lodi, Ernesto; Annovazzi, Chiara; Camussi, Elisabetta; Patrizi, Patrizia; Nota, Laura

    2018-01-01

    The present study aimed to extend understanding of the relationship between career adaptability, courage, and life satisfaction in a sample of Italian adolescents. It was hypothesized that courage partially mediates the relationship between career adaptability and life satisfaction. Specifically, 1202 Italian high school students aged 14 to 20 years (M = 16.87; SD = 1.47), of whom 600 (49.9%) were boys and 602 (50.1%) girls, were involved. Using a multigroup approach across gender, it was found that courage partially mediated the relationship between career adaptability and life satisfaction in both boys and girls. Results suggest the relevance of career interventions that promote career adaptability and courage for strengthening life satisfaction in adolescence. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  15. Domain adaptation via transfer component analysis.

    PubMed

    Pan, Sinno Jialin; Tsang, Ivor W; Kwok, James T; Yang, Qiang

    2011-02-01

    Domain adaptation allows knowledge from a source domain to be transferred to a different but related target domain. Intuitively, discovering a good feature representation across domains is crucial. In this paper, we first propose to find such a representation through a new learning method, transfer component analysis (TCA), for domain adaptation. TCA tries to learn some transfer components across domains in a reproducing kernel Hilbert space using the maximum mean discrepancy (MMD). In the subspace spanned by these transfer components, data properties are preserved and data distributions in different domains are close to each other. As a result, with the new representations in this subspace, we can apply standard machine learning methods to train classifiers or regression models in the source domain for use in the target domain. Furthermore, in order to uncover the knowledge hidden in the relations between the data labels from the source and target domains, we extend TCA to a semisupervised learning setting, which encodes label information into the learning of transfer components. We call this extension semisupervised TCA. The main contribution of our work is a novel dimensionality reduction framework for reducing the distance between domains in a latent space for domain adaptation. We propose both unsupervised and semisupervised feature extraction approaches, which can dramatically reduce the distance between domain distributions by projecting data onto the learned transfer components. Finally, our approach can handle large datasets and naturally leads to out-of-sample generalization. The effectiveness and efficiency of our approach are verified by experiments on five toy datasets and two real-world applications: cross-domain indoor WiFi localization and cross-domain text classification.
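
    TCA minimizes the MMD between the projected source and target samples. The quantity itself, for an RBF kernel, can be estimated as below; this is a plain biased estimate of the MMD, not TCA's kernel eigendecomposition, and the data and sigma are illustrative.

      import numpy as np

      def mmd2_rbf(X, Y, sigma=1.0):
          # Biased estimate of the squared MMD under an RBF kernel.
          def k(A, B):
              d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
              return np.exp(-d2 / (2.0 * sigma**2))
          return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

      rng = np.random.default_rng(1)
      src = rng.normal(0.0, 1.0, (200, 3))  # source-domain features
      tgt = rng.normal(0.5, 1.0, (200, 3))  # shifted target-domain features
      print(mmd2_rbf(src, tgt))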

  16. Sample-independent approach to normalize two-dimensional data for orthogonality evaluation using whole separation space scaling.

    PubMed

    Jáčová, Jaroslava; Gardlo, Alžběta; Friedecký, David; Adam, Tomáš; Dimandja, Jean-Marie D

    2017-08-18

    Orthogonality is a key parameter that is used to evaluate the separation power of chromatography-based two-dimensional systems. It is necessary to scale the separation data before the assessment of the orthogonality. Current scaling approaches are sample-dependent, and the extent of the retention space that is converted into a normalized retention space is set according to the retention times of the first and last analytes contained in a unique sample to elute. The presence or absence of a highly retained analyte in a sample can thus significantly influence the amount of information (in terms of the total amount of separation space) contained in the normalized retention space considered for the calculation of the orthogonality. We propose a Whole Separation Space Scaling (WOSEL) approach that accounts for the whole separation space delineated by the analytical method, and not the sample. This approach enables an orthogonality-based evaluation of the efficiency of the analytical system that is independent of the sample selected. The WOSEL method was compared to two currently used orthogonality approaches through the evaluation of in silico-generated chromatograms and real separations of human biofluids and petroleum samples. WOSEL exhibits sample-to-sample stability values of 3.8% on real samples, compared to 7.0% and 10.1% for the two other methods, respectively. Using real analyses, we also demonstrate that some previously developed approaches can provide misleading conclusions on the overall orthogonality of a two-dimensional chromatographic system. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Variational Approach to Enhanced Sampling and Free Energy Calculations

    NASA Astrophysics Data System (ADS)

    Valsson, Omar; Parrinello, Michele

    2014-08-01

    The ability of widely used sampling methods, such as molecular dynamics or Monte Carlo simulations, to explore complex free energy landscapes is severely hampered by the presence of kinetic bottlenecks. A large number of solutions have been proposed to alleviate this problem. Many are based on the introduction of a bias potential which is a function of a small number of collective variables. However, constructing such a bias is not simple. Here we introduce a functional of the bias potential and an associated variational principle. The bias that minimizes the functional relates in a simple way to the free energy surface. This variational principle can be turned into a practical, efficient, and flexible sampling method. A number of numerical examples are presented, including the determination of a three-dimensional free energy surface. We argue that, besides being numerically advantageous, our variational approach provides a convenient and novel standpoint for looking at the sampling problem.
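
    In the notation of the paper, with F(s) the free energy along the collective variables s, V(s) the bias, beta the inverse temperature, and p(s) a chosen target distribution, the functional takes the form

      \Omega[V] = \frac{1}{\beta}\,\log
          \frac{\int d\mathbf{s}\; e^{-\beta\,[F(\mathbf{s}) + V(\mathbf{s})]}}
               {\int d\mathbf{s}\; e^{-\beta F(\mathbf{s})}}
        \;+\; \int d\mathbf{s}\; p(\mathbf{s})\, V(\mathbf{s})

    Omega is convex and is minimized, up to an additive constant, by V(s) = -F(s) - (1/beta) log p(s), so the free energy surface can be read off directly from the optimal bias.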

  18. SpikeTemp: An Enhanced Rank-Order-Based Learning Approach for Spiking Neural Networks With Adaptive Structure.

    PubMed

    Wang, Jinling; Belatreche, Ammar; Maguire, Liam P; McGinnity, Thomas Martin

    2017-01-01

    This paper presents an enhanced rank-order-based learning algorithm, called SpikeTemp, for spiking neural networks (SNNs) with a dynamically adaptive structure. The trained feed-forward SNN consists of two layers of spiking neurons: 1) an encoding layer which temporally encodes real-valued features into spatio-temporal spike patterns and 2) an output layer of dynamically grown neurons which perform spatio-temporal classification. Both Gaussian receptive fields and square cosine population encoding schemes are employed to encode real-valued features into spatio-temporal spike patterns. Unlike the rank-order-based learning approach, SpikeTemp uses the precise times of the incoming spikes for adjusting the synaptic weights such that early spikes result in a large weight change and late spikes lead to a smaller weight change. This removes the need to rank all the incoming spikes and, thus, reduces the computational cost of SpikeTemp. The proposed SpikeTemp algorithm is demonstrated on several benchmark data sets and on an image recognition task. The results show that SpikeTemp can achieve better classification performance and is much faster than the existing rank-order-based learning approach. In addition, the number of output neurons is much smaller when the square cosine encoding scheme is employed. Furthermore, SpikeTemp is benchmarked against a selection of existing machine learning algorithms, and the results demonstrate the ability of SpikeTemp to classify different data sets after just one presentation of the training samples with comparable classification performance.
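
    The flavor of the time-weighted update can be illustrated with a toy rule in which earlier spikes produce exponentially larger weight changes. The exponential form and the constants are assumptions for illustration, not the exact update from the paper.

      import numpy as np

      def spiketemp_style_updates(spike_times, eta=0.1, tau=5.0):
          # Earlier input spikes produce larger weight changes, later ones
          # smaller, with no sorting/ranking of the spikes required.
          # eta and tau are illustrative, not values from the paper.
          t = np.asarray(spike_times, dtype=float)
          return eta * np.exp(-(t - t.min()) / tau)

      print(spiketemp_style_updates([1.0, 2.5, 9.0]))  # monotonically decreasing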

  19. Discriminative clustering on manifold for adaptive transductive classification.

    PubMed

    Zhang, Zhao; Jia, Lei; Zhang, Min; Li, Bing; Zhang, Li; Li, Fanzhang

    2017-10-01

    In this paper, we propose a novel adaptive transductive label propagation approach by joint discriminative clustering on manifolds for representing and classifying high-dimensional data. Our framework seamlessly combines unsupervised manifold learning, discriminative clustering, and adaptive classification into a unified model. Our method also incorporates adaptive graph weight construction with label propagation. Specifically, our method is capable of propagating label information using adaptive weights over low-dimensional manifold features, which is different from most existing studies that usually predict the labels and construct the weights in the original Euclidean space. For transductive classification by our formulation, we first perform joint discriminative K-means clustering and manifold learning to capture the low-dimensional nonlinear manifolds. Then, we construct adaptive weights over the learnt manifold features, where the adaptive weights are calculated through joint minimization of the reconstruction errors over features and soft labels so that the graph weights can be jointly optimal for data representation and classification. Using the adaptive weights, we can easily estimate the unknown labels of samples. After that, our method returns the updated weights for further updating the manifold features. Extensive simulations on image classification and segmentation show that our proposed algorithm can deliver state-of-the-art performance on several public datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Survey of adaptive image coding techniques

    NASA Technical Reports Server (NTRS)

    Habibi, A.

    1977-01-01

    The general problem of image data compression is discussed briefly with attention given to the use of Karhunen-Loeve transforms, suboptimal systems, and block quantization. A survey is then conducted encompassing the four categories of adaptive systems: (1) adaptive transform coding (adaptive sampling, adaptive quantization, etc.), (2) adaptive predictive coding (adaptive delta modulation, adaptive DPCM encoding, etc.), (3) adaptive cluster coding (blob algorithms and the multispectral cluster coding technique), and (4) adaptive entropy coding.

  1. The Parent Version of the Preschool Social Skills Rating System: Psychometric Analysis and Adaptation with a German Preschool Sample

    ERIC Educational Resources Information Center

    Hess, Markus; Scheithauer, Herbert; Kleiber, Dieter; Wille, Nora; Erhart, Michael; Ravens-Sieberer, Ulrike

    2014-01-01

    The Social Skills Rating System (SSRS) developed by Gresham and Elliott (1990) is a multirater, norm-referenced instrument measuring social skills and adaptive behavior in preschool children. The aims of the present study were (a) to test the factorial structure of the Parent Form of the SSRS for the first time with a German preschool sample (391…

  2. High-throughput sample adaptive offset hardware architecture for high-efficiency video coding

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Yan, Chang; Zhang, Jingzhi; Zhou, Xin

    2018-03-01

    A high-throughput hardware architecture for a sample adaptive offset (SAO) filter in the High Efficiency Video Coding (HEVC) standard is presented. First, an implementation-friendly and simplified bitrate estimation method for the rate-distortion cost calculation is proposed to reduce the computational complexity of the SAO mode decision. Then, a high-throughput VLSI architecture for SAO is presented based on the proposed bitrate estimation method. Furthermore, a multiparallel VLSI architecture for in-loop filters, which integrates both the deblocking filter and the SAO filter, is proposed. Six parallel strategies are applied in the proposed in-loop filter architecture to improve the system throughput and filtering speed. Experimental results show that the proposed in-loop filter architecture can achieve up to 48% higher throughput in comparison with prior work. The proposed architecture can reach a high operating clock frequency of 297 MHz with a TSMC 65-nm library and meets the real-time requirement of the in-loop filters for the 8 K × 4 K video format at 132 fps.
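
    As background, the SAO edge-offset classification that such hardware implements compares each sample with two neighbors along a signalled direction. A simplified one-dimensional software version, with illustrative offsets and 8-bit clipping, looks like this:

      import numpy as np

      def sao_edge_offset_1d(row, offsets):
          # Classify each sample by sign(c-a) + sign(c-b): -2 (valley),
          # -1/+1 (corners), +2 (peak); then add the per-category offset.
          # `offsets` stands in for the values signalled in the bitstream.
          a, c, b = row[:-2], row[1:-1], row[2:]
          s = np.sign(c - a) + np.sign(c - b)
          cat = np.select([s == -2, s == -1, s == 1, s == 2], [1, 2, 3, 4], default=0)
          out = row.astype(int)
          for k in (1, 2, 3, 4):
              out[1:-1][cat == k] += offsets[k - 1]
          return np.clip(out, 0, 255)

      print(sao_edge_offset_1d(np.array([10, 8, 12, 12, 15]), offsets=[2, 1, -1, -2]))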

  3. A decision analysis approach to climate adaptation: comparing multiple pathways for multi-decadal decision making

    NASA Astrophysics Data System (ADS)

    Lin, B. B.; Little, L.

    2013-12-01

    Policy planners around the world are required to consider the implications of adapting to climatic change across spatial contexts and decadal timeframes. However, local-level information for planning is often poorly defined, even though climate adaptation decisions are made at this scale. This is especially true when considering sea level rise and the coastal impacts of climate change. We present a simple approach using sea level rise simulations paired with adaptation scenarios to assess a range of adaptation options available to local councils dealing with issues of beach recession under present and future sea level rise and storm surge. Erosion and beach recession pose a large socioeconomic risk to coastal communities because of the loss of key coastal infrastructure. We examine the well-known adaptation technique of beach nourishment and assess various timings and amounts of beach nourishment at decadal time spans in relation to beach recession impacts. The objective was to identify an adaptation strategy that would allow for a low frequency of management interventions, the maintenance of beach width, and minimal variation in beach width over the 2010 to 2100 simulation period. 1000 replications of each adaptation option were run against the 90-year simulation in order to model the ability of each option to achieve the three key objectives. Three sets of adaptation scenarios were identified, and within each scenario a number of adaptation options were tested. The three scenarios were: 1) fixed periodic beach replenishment of specific amounts at 20- and 50-year intervals; 2) beach replenishment to the initial beach width based on trigger levels of recession (5 m, 10 m, 20 m); and 3) fixed-period beach replenishment of a variable amount at decadal intervals (every 10, 20, 30, 40, or 50 years). For each adaptation option, we show the effectiveness of each beach replenishment scenario to maintain beach width and consider the implications of more

  4. A knowledge representation approach using fuzzy cognitive maps for better navigation support in an adaptive learning system.

    PubMed

    Chrysafiadi, Konstantina; Virvou, Maria

    2013-12-01

    In this paper a knowledge representation approach for an adaptive and/or personalized tutoring system is presented. The domain knowledge should be represented in a realistic way that allows the adaptive and/or personalized tutoring system to deliver the learning material to each individual learner dynamically, taking into account her/his learning needs and different learning pace. To achieve this, the domain knowledge representation has to depict the possible increase or decrease of the learner's knowledge. Considering that the domain concepts that constitute the learning material are not independent from each other, the knowledge representation approach has to allow the system to recognize either the domain concepts that are already partly or completely known to a learner, or the domain concepts that s/he has forgotten, taking into account the learner's knowledge level of the related concepts. In other words, the system should be informed about the knowledge dependencies that exist among the domain concepts of the learning material, as well as the strength of impact of each domain concept on others. Fuzzy Cognitive Maps (FCMs) seem to be an ideal way to represent this kind of information graphically. The suggested knowledge representation approach has been implemented in an e-learning adaptive system for teaching computer programming. The particular system was used by the students of a postgraduate program in the field of Informatics at the University of Piraeus and was compared with a corresponding system in which the domain knowledge was represented using the most commonly used technique of a network of concepts. The results of the evaluation were very encouraging.
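
    An FCM over domain concepts updates each concept's activation from the weighted influence of the others. A minimal sketch with a sigmoid squashing function and a hypothetical three-concept map follows; the weights and constants are illustrative, not from the evaluated system.

      import numpy as np

      def fcm_step(A, W, lam=1.0):
          # One synchronous FCM update: each concept's new activation is a
          # sigmoid of the weighted influences of the others.
          # W[j, i] = strength of impact of concept j on concept i.
          return 1.0 / (1.0 + np.exp(-lam * (A @ W)))

      # Hypothetical 3-concept map: concept 0 reinforces 1, which reinforces 2.
      W = np.array([[0.0, 0.8, 0.0],
                    [0.0, 0.0, 0.6],
                    [0.0, 0.0, 0.0]])
      A = np.array([0.9, 0.2, 0.1])  # learner's current knowledge levels
      for _ in range(5):
          A = fcm_step(A, W)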

  5. Bayesian selective response-adaptive design using the historical control.

    PubMed

    Kim, Mi-Ok; Harun, Nusrat; Liu, Chunyan; Khoury, Jane C; Broderick, Joseph P

    2018-06-13

    High quality historical control data, if incorporated, may reduce sample size, trial cost, and duration. Too optimistic a use of the data, however, may result in bias under prior-data conflict. Motivated by well-publicized two-arm comparative trials in stroke, we propose a Bayesian design that both adaptively incorporates historical control data and selectively adapts the treatment allocation ratios within an ongoing trial in response to the relative treatment effects. The proposed design differs from existing designs that borrow from historical controls. As opposed to blindly reducing the number of subjects assigned to the control arm, this design does so adaptively, in response to the relative treatment effects, only if evaluation of the cumulated current trial data combined with the historical control suggests the superiority of the intervention arm. We used the effective historical sample size approach to quantify borrowed information on the control arm and modified the treatment allocation rules of the doubly adaptive biased coin design to incorporate the quantity. The modified allocation rules were then implemented under the Bayesian framework with commensurate priors addressing prior-data conflict. Trials were also more frequently concluded earlier in line with the underlying truth, reducing trial cost and duration, and yielded parameter estimates with smaller standard errors. © 2018 The Authors. Statistics in Medicine Published by John Wiley & Sons, Ltd.

  6. Utility of the Mayo-Portland adaptability inventory-4 for self-reported outcomes in a military sample with traumatic brain injury.

    PubMed

    Kean, Jacob; Malec, James F; Cooper, Douglas B; Bowles, Amy O

    2013-12-01

    To investigate the psychometric properties of the Mayo-Portland Adaptability Inventory-4 (MPAI-4) obtained by self-report in a large sample of active duty military personnel with traumatic brain injury (TBI). Consecutive cohort who completed the MPAI-4 as part of a larger battery of clinical outcome measures at the time of intake to an outpatient brain injury clinic. Medical center. Consecutively referred sample of active duty military personnel (N=404) who suffered predominantly mild (n=355), but also moderate (n=37) and severe (n=12), TBI. Not applicable. Main outcome measure: MPAI-4. Results: Initial factor analysis suggested 2 salient dimensions. In subsequent analysis, the ratio of the first and second eigenvalues (6.84:1) and parallel analysis indicated sufficient unidimensionality in 26 retained items. Iterative Rasch analysis resulted in the rescaling of the measure and the removal of 5 additional items for poor fit. The items of the final 21-item Mayo-Portland Adaptability Inventory-military were locally independent, demonstrated monotonically increasing responses, adequately fit the item response model, and permitted the identification of nearly 5 statistically distinct levels of disability in the study population. Slight mistargeting of the population resulted in the global outcome, as measured by the Mayo-Portland Adaptability Inventory-military, tending to be less reflective of very mild levels of disability. These data collected in a relatively large sample of active duty service members with TBI provide insight into the ability of patients to self-report functional impairment and the distinct effects of military deployment on outcome, providing important guidance for the meaningful measurement of outcome in this population. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. Omics Approaches for Identifying Physiological Adaptations to Genome Instability in Aging.

    PubMed

    Edifizi, Diletta; Schumacher, Björn

    2017-11-04

    DNA damage causally contributes to aging and age-related diseases. The declining functioning of tissues and organs during aging can lead to the increased risk of succumbing to aging-associated diseases. Congenital syndromes that are caused by heritable mutations in DNA repair pathways lead to cancer susceptibility and accelerated aging, thus underlining the importance of genome maintenance for withstanding aging. High-throughput mass-spectrometry-based approaches have recently contributed to identifying signalling response networks and gaining a more comprehensive understanding of the physiological adaptations occurring upon unrepaired DNA damage. The insulin-like signalling pathway has been implicated in a DNA damage response (DDR) network that includes epidermal growth factor (EGF)-, AMP-activated protein kinases (AMPK)- and the target of rapamycin (TOR)-like signalling pathways, which are known regulators of growth, metabolism, and stress responses. The same pathways, together with the autophagy-mediated proteostatic response and the decline in energy metabolism have also been found to be similarly regulated during natural aging, suggesting striking parallels in the physiological adaptation upon persistent DNA damage due to DNA repair defects and long-term low-level DNA damage accumulation occurring during natural aging. These insights will be an important starting point to study the interplay between signalling networks involved in progeroid syndromes that are caused by DNA repair deficiencies and to gain new understanding of the consequences of DNA damage in the aging process.

  8. A Novel Method of Adrenal Venous Sampling via an Antecubital Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Xiongjing, E-mail: jxj103@hotmail.com; Dong, Hui; Peng, Meng

    Purpose: Currently, almost all adrenal venous sampling (AVS) procedures are performed by femoral vein access. The purpose of this study was to establish the technique of AVS via an antecubital approach and evaluate its safety and feasibility. Materials and Methods: From January 2012 to June 2015, 194 consecutive patients diagnosed with primary aldosteronism underwent AVS via an antecubital approach without ACTH stimulation. Catheters used for bilateral adrenal cannulations were recorded. The success rate of bilateral adrenal sampling, operation time, fluoroscopy time, dosage of contrast, and incidence of complications were calculated. Results: A 5F MPA1 catheter was first used to attempt right adrenal cannulation in all patients. Cannulation of the right adrenal vein was successfully performed in 164 (84.5%) patients. The 5F JR5, Cobra2, and TIG catheters were the ultimate catheters for right adrenal cannulation in 16 (8.2%), 5 (2.6%), and 9 (4.6%) patients, respectively. For left adrenal cannulation, JR5 and Cobra2 catheters were used in 19 (9.8%) and 10 (5.2%) patients, respectively, while only TIG catheters were used in the remaining 165 (85.1%) patients. The rate of successful adrenal sampling on the right, left, and bilateral sides was 91.8%, 93.3%, and 87.6%, respectively. The mean operation time was 16.3 ± 4.3 minutes, the mean fluoroscopy time was 4.7 ± 1.3 minutes, and the mean contrast use was 14.3 ± 4.7 ml. The incidence of adrenal hematoma was 1.0%. Conclusions: This study showed that AVS via an antecubital approach was safe and feasible, with a high rate of successful sampling.

  9. Overcoming the matched-sample bottleneck: an orthogonal approach to integrate omic data.

    PubMed

    Nguyen, Tin; Diaz, Diana; Tagett, Rebecca; Draghici, Sorin

    2016-07-12

    MicroRNAs (miRNAs) are small non-coding RNA molecules whose primary function is to regulate the expression of gene products via hybridization to mRNA transcripts, resulting in suppression of translation or mRNA degradation. Although miRNAs have been implicated in complex diseases, including cancer, their impact on distinct biological pathways and phenotypes is largely unknown. Current integration approaches require sample-matched miRNA/mRNA datasets, resulting in limited applicability in practice. Since these approaches cannot integrate heterogeneous information available across independent experiments, they neither account for bias inherent in individual studies, nor do they benefit from increased sample size. Here we present a novel framework able to integrate miRNA and mRNA data (vertical data integration) available in independent studies (horizontal meta-analysis) allowing for a comprehensive analysis of the given phenotypes. To demonstrate the utility of our method, we conducted a meta-analysis of pancreatic and colorectal cancer, using 1,471 samples from 15 mRNA and 14 miRNA expression datasets. Our two-dimensional data integration approach greatly increases the power of statistical analysis and correctly identifies pathways known to be implicated in the phenotypes. The proposed framework is sufficiently general to integrate other types of data obtained from high-throughput assays.

  10. Adaptive Flight Control Research at NASA

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2008-01-01

    A broad overview of current adaptive flight control research efforts at NASA is presented, along with more detailed discussion of selected specific approaches. The stated objective of the Integrated Resilient Aircraft Control Project, one of NASA's Aviation Safety programs, is to advance the state of the art of adaptive controls as a design option providing enhanced stability and maneuverability margins for safe landing in the presence of adverse conditions such as actuator or sensor failures. Under this project, a number of adaptive control approaches are being pursued, including neural networks and multiple models. Validation of all the adaptive control approaches will use not only traditional methods such as simulation, wind tunnel testing, and manned flight tests, but will be augmented with recently developed capabilities in unmanned flight testing.

  11. New Approaches to Attenuated Hepatitis a Vaccine Development: Cloning and Sequencing of Cell-Culture Adapted Viral cDNA.

    DTIC Science & Technology

    1987-10-13


  12. Interactive Cumulative Burden Assessment: Engaging Stakeholders in an Adaptive, Participatory and Transdisciplinary Approach

    PubMed Central

    Shrestha, Rehana; van Maarseveen, Martin

    2018-01-01

    Cumulative burden assessment (CuBA) has the potential to inform planning and decision-making on health disparities related to multiple environmental burdens. However, scholars have raised concerns about the social complexity to be dealt with while conducting CuBA, suggesting that it should be addressed in an adaptive, participatory and transdisciplinary (APT) approach. APT calls for deliberation among stakeholders by engaging them in a process of social learning and knowledge co-production. We propose an interactive stakeholder-based approach that facilitates a science-based stakeholder dialogue as an interface for combining different knowledge domains and engendering social learning in CuBA processes. Our approach allows participants to interact with each other using a flexible and auditable CuBA model implemented within a shared workspace. In two workshops we explored the usefulness and practicality of the approach. Results show that stakeholders were enabled to deliberate on cumulative burdens collaboratively, to learn about the technical uncertainties and social challenges associated with CuBA, and to co-produce knowledge in a realm of both technical and societal challenges. The paper identifies potential benefits relevant for responding to social complexity in CuBA and further recommends exploration of how our approach can enable or constrain social learning and knowledge co-production in CuBA processes under various institutional, social and political contexts. PMID:29401676

  13. Adaptively biased molecular dynamics: An umbrella sampling method with a time-dependent potential

    NASA Astrophysics Data System (ADS)

    Babin, Volodymyr; Karpusenka, Vadzim; Moradi, Mahmoud; Roland, Christopher; Sagui, Celeste

    We discuss an adaptively biased molecular dynamics (ABMD) method for the computation of a free energy surface for a set of reaction coordinates. The ABMD method belongs to the general category of umbrella sampling methods with an evolving biasing potential. It is characterized by a small number of control parameters and an O(t) numerical cost with simulation time t. The method naturally allows for extensions based on multiple walkers and a replica-exchange mechanism. The workings of the method are illustrated with a number of examples, including sugar puckering, free energy landscapes for polymethionine and polyproline peptides, and a short β-turn peptide. ABMD has been implemented into the latest version (Case et al., AMBER 10; University of California: San Francisco, 2008) of the AMBER software package and is freely available to the simulation community.
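
    The evolving-bias idea can be illustrated with a one-dimensional toy in the same family: Gaussian kernels slowly accumulated where the system currently sits, on top of overdamped Langevin dynamics. This is an illustrative sketch, not the AMBER ABMD implementation, and all parameters are assumed.

      import numpy as np

      rng = np.random.default_rng(2)
      dF = lambda x: 4.0 * x * (x**2 - 1.0)   # toy double-well: F(x) = (x^2 - 1)^2
      centers, heights, width = [], [], 0.15

      def dbias(x):
          # Gradient of the accumulated Gaussian bias at x.
          if not centers:
              return 0.0
          c, h = np.array(centers), np.array(heights)
          g = h * np.exp(-(x - c)**2 / (2.0 * width**2))
          return float(np.sum(g * -(x - c) / width**2))

      x, dt, beta = -1.0, 1e-3, 4.0
      for step in range(50_000):
          # Overdamped Langevin dynamics on F plus the evolving bias.
          x += (-dF(x) - dbias(x)) * dt + np.sqrt(2.0 * dt / beta) * rng.normal()
          if step % 250 == 0:                 # slowly grow the bias where we are
              centers.append(x)
              heights.append(0.05)
      # After flooding, minus the accumulated bias approximates F up to a constant.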

  14. A two-stage Monte Carlo approach to the expression of uncertainty with finite sample sizes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowder, Stephen Vernon; Moyer, Robert D.

    2005-05-01

    Proposed supplement I to the GUM outlines a 'propagation of distributions' approach to deriving the distribution of a measurand for any non-linear function and for any set of random inputs. The supplement's proposed Monte Carlo approach assumes that the distributions of the random inputs are known exactly. This implies that the sample sizes are effectively infinite. In this case, the mean of the measurand can be determined precisely using a large number of Monte Carlo simulations. In practice, however, the distributions of the inputs will rarely be known exactly, but must be estimated using possibly small samples. If these approximated distributions are treated as exact, the uncertainty in estimating the mean is not properly taken into account. In this paper, we propose a two-stage Monte Carlo procedure that explicitly takes into account the finite sample sizes used to estimate parameters of the input distributions. We will illustrate the approach with a case study involving the efficiency of a thermistor mount power sensor. The performance of the proposed approach will be compared to the standard GUM approach for finite samples using simple non-linear measurement equations. We will investigate performance in terms of coverage probabilities of derived confidence intervals.
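
    A minimal sketch of such a two-stage scheme, assuming normal inputs and a hypothetical measurement equation P = V^2/R: the outer stage draws plausible input-distribution parameters consistent with the finite calibration samples (here only the means are randomized, a simplification), and the inner stage propagates the inputs through the model.

      import numpy as np

      rng = np.random.default_rng(3)
      measurand = lambda v, r: v**2 / r          # hypothetical model, P = V^2 / R

      v_obs = rng.normal(5.0, 0.05, size=8)      # small calibration samples:
      r_obs = rng.normal(100.0, 1.0, size=5)     # finite n is the whole point

      ys = []
      for _ in range(2_000):
          # Stage 1: draw plausible input means given the finite samples.
          v_mu = rng.normal(v_obs.mean(), v_obs.std(ddof=1) / np.sqrt(v_obs.size))
          r_mu = rng.normal(r_obs.mean(), r_obs.std(ddof=1) / np.sqrt(r_obs.size))
          # Stage 2: propagate the input distributions through the model.
          v = rng.normal(v_mu, v_obs.std(ddof=1), 500)
          r = rng.normal(r_mu, r_obs.std(ddof=1), 500)
          ys.append(measurand(v, r))

      lo, hi = np.percentile(np.concatenate(ys), [2.5, 97.5])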

  15. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    NASA Technical Reports Server (NTRS)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

    This paper presents a method for safe spacecraft autonomous maneuvering that applies robotic motion-planning techniques to spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion-planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision-avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.

  16. An Adaptive Prediction-Based Approach to Lossless Compression of Floating-Point Volume Data.

    PubMed

    Fout, N; Ma, Kwan-Liu

    2012-12-01

    In this work, we address the problem of lossless compression of scientific and medical floating-point volume data. We propose two prediction-based compression methods that share a common framework, which consists of a switched prediction scheme wherein the best predictor out of a preset group of linear predictors is selected. Such a scheme is able to adapt to different datasets as well as to varying statistics within the data. The first method, called APE (Adaptive Polynomial Encoder), uses a family of structured interpolating polynomials for prediction, while the second method, which we refer to as ACE (Adaptive Combined Encoder), combines predictors from previous work with the polynomial predictors to yield a more flexible, powerful encoder that is able to effectively decorrelate a wide range of data. In addition, in order to facilitate efficient visualization of compressed data, our scheme provides an option to partition floating-point values in such a way as to provide a progressive representation. We compare our two compressors to existing state-of-the-art lossless floating-point compressors for scientific data, with our data suite including both computer simulations and observational measurements. The results demonstrate that our polynomial predictor, APE, is comparable to previous approaches in terms of speed but achieves better compression rates on average. ACE, our combined predictor, while somewhat slower, is able to achieve the best compression rate on all datasets, with significantly better rates on most of the datasets.
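
    The switched-prediction core can be sketched with a two-predictor bank as below. APE/ACE use richer structured polynomial banks, and a real lossless float codec must form residuals on the bit representation so reconstruction is exact, which this sketch omits.

      import numpy as np

      def best_predictor_residuals(x):
          # Try each predictor in a small bank and keep the one with the
          # smallest residual energy; real codecs then entropy-code the
          # residuals (and work on the bit representation for losslessness).
          preds = [
              np.concatenate([x[:1], x[:-1]]),                # previous sample
              np.concatenate([x[:2], 2 * x[1:-1] - x[:-2]]),  # linear extrapolation
          ]
          resid = [x - p for p in preds]
          k = int(np.argmin([np.abs(r).sum() for r in resid]))
          return k, resid[k]

      x = np.cumsum(np.random.default_rng(4).normal(size=64))  # smooth-ish block
      k, r = best_predictor_residuals(x)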

  17. Formation tracker design of multiple mobile robots with wheel perturbations: adaptive output-feedback approach

    NASA Astrophysics Data System (ADS)

    Yoo, Sung Jin

    2016-11-01

    This paper presents a theoretical design approach for output-feedback formation tracking of multiple mobile robots under wheel perturbations. It is assumed that these perturbations are unknown and the linear and angular velocities of the robots are unmeasurable. First, adaptive state observers for estimating unmeasurable velocities of the robots are developed under the robots' kinematics and dynamics including wheel perturbation effects. Then, we derive a virtual-structure-based formation tracker scheme according to the observer dynamic surface design procedure. The main difficulty of the output-feedback control design is to manage the coupling problems between unmeasurable velocities and unknown wheel perturbation effects. These problems are avoided by using the adaptive technique and the function approximation property based on fuzzy logic systems. From the Lyapunov stability analysis, it is shown that point tracking errors of each robot and synchronisation errors for the desired formation converge to an adjustable neighbourhood of the origin, while all signals in the controlled closed-loop system are semiglobally uniformly ultimately bounded.

  18. On position/force tracking control problem of cooperative robot manipulators using adaptive fuzzy backstepping approach.

    PubMed

    Baigzadehnoe, Barmak; Rahmani, Zahra; Khosravi, Alireza; Rezaie, Behrooz

    2017-09-01

    In this paper, the position and force tracking control problem of a cooperative robot manipulator system handling a common rigid object with unknown dynamical models and unknown external disturbances is investigated. The universal approximation properties of fuzzy logic systems are employed to estimate the unknown system dynamics. On the other hand, by defining new state variables based on the integral and differential of position and orientation errors of the grasped object, the error system of the coordinated robot manipulators is constructed. Subsequently, by defining an appropriate change of coordinates and using the backstepping design strategy, an adaptive fuzzy backstepping position tracking control scheme is proposed for multi-robot manipulator systems. By utilizing the properties of internal forces, extra terms are also added to the control signals to address the force tracking problem. Moreover, it is shown that the proposed adaptive fuzzy backstepping position/force control approach ensures that all the signals of the closed-loop system are uniformly ultimately bounded and that tracking errors of both positions and forces converge to small desired values with proper selection of the design parameters. Finally, the theoretical achievements are tested on two three-link planar robot manipulators cooperatively handling a common object to illustrate the effectiveness of the proposed approach. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    PubMed

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  20. Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.

    PubMed

    Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel

    2018-06-05

    In the present work, we demonstrate a novel approach to improving the sensitivity of "out of lab" portable capillary electrophoresis measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorting the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. Contactless conductivity detection was used as a model for the development of the signal processing method and the demonstration of its impact on sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified: higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on the migration time of the analyte. Owing to the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for a sampling frequency of 4.6 Hz and up to 22 times for a sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
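
    A moving average whose window grows with migration time can be sketched as follows; the linear window-size law and its constants are illustrative assumptions, not the published calibration.

      import numpy as np

      def velocity_adaptive_smooth(signal, t, t0=10.0, w0=5, w_max=101):
          # Moving average whose (odd) window grows linearly with migration
          # time: early, fast analytes keep a short window, late ones a long one.
          out = np.empty(len(signal))
          for i, ti in enumerate(t):
              w = int(np.clip(w0 * ti / t0, w0, w_max)) | 1
              lo, hi = max(0, i - w // 2), min(len(signal), i + w // 2 + 1)
              out[i] = signal[lo:hi].mean()
          return out

      t = np.linspace(0.0, 120.0, 6000)                       # migration time axis
      trace = np.random.default_rng(5).normal(0.0, 1.0, t.size)
      smoothed = velocity_adaptive_smooth(trace, t)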

  1. Adaptive AFM scan speed control for high aspect ratio fast structure tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad, Ahmad; Schuh, Andreas; Rangelow, Ivo W.

    2014-10-15

    Improved imaging rates in Atomic Force Microscopes (AFM) are of high interest for disciplines such as life sciences and failure analysis of semiconductor wafers, where the sample topography shows high aspect ratios. Fast imaging is also necessary to cover a large surface under investigation in reasonable time. Since AFMs are composed of mechanical components, they are associated with comparably low resonance frequencies that undermine the effort to increase acquisition rates. In particular, high and steep structures are difficult to follow, which causes the cantilever to temporarily lose contact with, or crash into, the sample. Here, we report on a novel approach that does not affect the scanner dynamics, but adapts the lateral scanning speed of the scanner. The controller monitors the control error signal and, only when necessary, decreases the scan speed to allow the z-piezo more time to react to changes in the sample's topography. In this case, the overall imaging rate can be significantly increased, because a general scan speed trade-off decision is not needed and smooth areas are scanned fast. In contrast to methods trying to increase the z-piezo bandwidth, our method is a comparably simple approach that can be easily adapted to standard systems.
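
    The control idea reduces to a small rule evaluated along each scan line: run at full speed while the z-control error is small and slow down when it grows. The thresholds and the two-level (bang-bang style) law below are illustrative; the actual controller may scale the speed continuously.

      def adaptive_scan_speed(z_error, v_max=1.0, v_min=0.05, threshold=0.1):
          # Full speed over flat regions; slow down only while the z-control
          # error is large, giving the z-piezo time to settle.
          return v_min if abs(z_error) > threshold else v_max

      errors = [0.01, 0.02, 0.50, 0.30, 0.02]   # z-error samples along a scan line
      speeds = [adaptive_scan_speed(e) for e in errors]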

  2. A Sub-Sampling Approach for Data Acquisition in Gamma Ray Emission Tomography

    NASA Astrophysics Data System (ADS)

    Fysikopoulos, Eleftherios; Kopsinis, Yannis; Georgiou, Maria; Loudos, George

    2016-06-01

    State of the art data acquisition systems for small animal imaging gamma ray detectors often rely on free-running Analog to Digital Converters (ADCs) and high density Field Programmable Gate Array (FPGA) devices for digital signal processing. In this work, a sub-sampling acquisition approach, which exploits a priori information regarding the shape of the obtained detector pulses, is proposed. The output pulse shape depends on the response of the scintillation crystal, the photodetector's properties, and the amplifier/shaper operation. Using these known characteristics of the detector pulses prior to digitization, one can model the voltage pulse derived from the shaper (a low-pass filter, last in the front-end electronics chain) in order to reduce the required sampling rate of the ADCs. Pulse shape estimation is then feasible by fitting with a small number of measurements. In particular, the proposed sub-sampling acquisition approach relies on a bi-exponential modeling of the pulse shape. We show that the properties of the pulse that are relevant for Single Photon Emission Computed Tomography (SPECT) event detection (i.e., position and energy) can be calculated by collecting just a small fraction of the number of samples usually collected in data acquisition systems used so far. Compared to the standard digitization process, the proposed sub-sampling approach allows the use of free-running ADCs with the sampling rate reduced by a factor of 5. Two small detectors consisting of Cerium-doped Gadolinium Aluminum Gallium Garnet (Gd3Al2Ga3O12:Ce or GAGG:Ce) pixelated arrays (array elements: 2 × 2 × 5 mm3 and 1 × 1 × 10 mm3, respectively) coupled to a Position Sensitive Photomultiplier Tube (PSPMT) were used for experimental evaluation. The two detectors were used to obtain raw images and energy histograms under 140 keV and 661.7 keV irradiation, respectively. The sub-sampling acquisition technique (10 MHz sampling rate) was compared with a standard acquisition method (52 MHz sampling
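
    The reconstruction step can be sketched as a least-squares fit of a bi-exponential model to a handful of samples per pulse; the shaping constants, noise level, and use of scipy's curve_fit are illustrative assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      def pulse(t, A, tau_r, tau_f):
          # Bi-exponential shaper output: difference of two exponentials.
          return A * (np.exp(-t / tau_f) - np.exp(-t / tau_r))

      t_full = np.linspace(0.0, 10.0, 500)
      t_s = t_full[::50]                        # a handful of samples per pulse
      y_s = pulse(t_s, 1.0, 0.5, 3.0) \
            + np.random.default_rng(6).normal(0.0, 0.01, t_s.size)

      popt, _ = curve_fit(pulse, t_s, y_s, p0=[0.8, 0.4, 2.5])
      energy_proxy = popt[0]                    # fitted amplitude ~ deposited energy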

  3. Integrating evolutionary and functional approaches to infer adaptation at specific loci.

    PubMed

    Storz, Jay F; Wheat, Christopher W

    2010-09-01

    Inferences about adaptation at specific loci are often exclusively based on the static analysis of DNA sequence variation. Ideally, population-genetic evidence for positive selection serves as a stepping-off point for experimental studies to elucidate the functional significance of the putatively adaptive variation. We argue that inferences about adaptation at specific loci are best achieved by integrating the indirect, retrospective insights provided by population-genetic analyses with the more direct, mechanistic insights provided by functional experiments. Integrative studies of adaptive genetic variation may sometimes be motivated by experimental insights into molecular function, which then provide the impetus to perform population genetic tests to evaluate whether the functional variation is of adaptive significance. In other cases, studies may be initiated by genome scans of DNA variation to identify candidate loci for recent adaptation. Results of such analyses can then motivate experimental efforts to test whether the identified candidate loci do in fact contribute to functional variation in some fitness-related phenotype. Functional studies can provide corroborative evidence for positive selection at particular loci, and can potentially reveal specific molecular mechanisms of adaptation.

  4. Performance Monitoring and Assessment of Neuro-Adaptive Controllers for Aerospace Applications Using a Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Guenther, Kurt; Hodgkinson, John; Jacklin, Stephen; Richard, Michael; Schumann, Johann; Soares, Fola

    2005-01-01

    Modern exploration missions require modern control systems: control systems that can handle catastrophic changes in the system's behavior, compensate for slow deterioration in sustained operations, and support fast system identification. Adaptive controllers based upon neural networks have these capabilities, but they can only be used safely if proper verification and validation (V&V) can be done. In this paper we present our V&V approach and simulation results within NASA's Intelligent Flight Control Systems (IFCS).

  5. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  6. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    PubMed

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that, regardless of the approach utilized, species richness estimates obtained are dependent on the size of the analyzed clone libraries. We here propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (maximum-likelihood-based and rarefaction-curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. Species richness estimates obtained increased with library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
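
    The extrapolation step can be made concrete with a small curve-fitting sketch. The saturating model and the data points below are illustrative assumptions (only the final ML-based value of 15,009 at 13,001 clones echoes the abstract), not the authors' exact procedure.

    ```python
    # Fit a saturating curve to richness estimates from nested library subsets
    # and read off the asymptote as a sample-size-unbiased richness estimate.
    import numpy as np
    from scipy.optimize import curve_fit

    lib_sizes = np.array([1000, 2000, 4000, 8000, 13001])    # clones analyzed
    richness = np.array([6200, 8900, 11500, 13800, 15009])   # illustrative

    def saturating(n, s_max, k):
        return s_max * n / (k + n)   # approaches s_max as n -> infinity

    (s_max, k), _ = curve_fit(saturating, lib_sizes, richness, p0=(20000, 5000))
    print(f"extrapolated ('true') richness: {s_max:.0f}")
    ```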

  7. Adaptive sparse grid approach for the efficient simulation of pulsed eddy current testing inspections

    NASA Astrophysics Data System (ADS)

    Miorelli, Roberto; Reboud, Christophe

    2018-04-01

    Pulsed Eddy Current Testing (PECT) is a popular Nondestructive Testing (NDT) technique for applications such as corrosion monitoring in the oil and gas industry, or rivet inspection in the aeronautic area. Its particularity is to use a transient excitation, which allows more information to be retrieved from the piece than conventional harmonic ECT, in a simpler and cheaper way than multi-frequency ECT setups. Efficient modeling tools are, as usual, very useful for optimizing experimental sensors and devices or evaluating their performance, for instance. This paper proposes an efficient simulation of PECT signals based on standard time-harmonic solvers and use of an Adaptive Sparse Grid (ASG) algorithm. An adaptive sampling of the ECT signal spectrum is performed with this algorithm; the complete spectrum is then interpolated from this sparse representation, and PECT signals are finally synthesized by means of an inverse Fourier transform. Simulation results corresponding to existing industrial configurations are presented and the performance of the strategy is discussed by comparison to reference results.
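
    A toy version of this frequency-domain workflow might look as follows; the greedy 1-D refinement stands in for the sparse-grid algorithm, and the solver call is a placeholder, so this is a sketch of the idea rather than the paper's implementation.

    ```python
    # Call a time-harmonic solver at adaptively refined frequencies, interpolate
    # the full spectrum, and synthesize the transient signal by inverse FFT.
    import numpy as np

    def harmonic_solver(f):
        # Placeholder for a time-harmonic ECT solver evaluated at frequency f [Hz]
        return 1.0 / (1.0 + 1j * f / 50e3)

    n_bins, df = 1024, 100.0                   # spectrum grid (assumed)
    freqs = np.arange(n_bins) * df
    nodes = [0, n_bins // 2, n_bins - 1]       # initial coarse nodes
    for _ in range(30):                        # greedy adaptive refinement
        vals = np.array([harmonic_solver(freqs[i]) for i in nodes])
        interp = (np.interp(freqs, freqs[nodes], vals.real)
                  + 1j * np.interp(freqs, freqs[nodes], vals.imag))
        mids = [(a + b) // 2 for a, b in zip(nodes[:-1], nodes[1:]) if b - a > 1]
        if not mids:
            break
        errs = [abs(harmonic_solver(freqs[m]) - interp[m]) for m in mids]
        nodes = sorted(nodes + [mids[int(np.argmax(errs))]])

    pect_signal = np.fft.irfft(interp)         # transient PECT signal
    ```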

  8. A Climate Change Adaptation Strategy for Management of ...

    EPA Pesticide Factsheets

    Sea level rise is causing shoreline erosion, increased coastal flooding, and marsh vulnerability to the impact of storms. Coastal marshes provide flood abatement, carbon and nutrient sequestration, water quality maintenance, and habitat for fish, shellfish, and wildlife, including species of concern, such as the saltmarsh sparrow (Ammodramus caudacutus). We present a climate change adaptation strategy (CCAS) adopted by scientific, management, and policy stakeholders for managing coastal marshes and enhancing system resiliency. A common adaptive management approach previously used for restoration projects was modified to identify climate-related vulnerabilities and plan climate change adaptive actions. As an example of implementation of the CCAS, we describe the stakeholder plans and management actions the US Fish and Wildlife Service and partners developed to build coastal resiliency in the Narrow River Estuary, RI, in the aftermath of Superstorm Sandy. When possible, an experimental BACI (before-after, control-impact) design, described as pre- and post-sampling at the impact site and one or more control sites, was incorporated into the climate change adaptation and implementation plans. Specific climate change adaptive actions and monitoring plans are described and include shoreline stabilization, restoring marsh drainage, increasing marsh elevation, and enabling upland marsh migration. The CCAS provides a framework and methodology for successfully managing coa

  9. Identification of Hot Moments and Hot Spots for Real-Time Adaptive Control of Multi-scale Environmental Sensor Networks

    NASA Astrophysics Data System (ADS)

    Wietsma, T.; Minsker, B. S.

    2012-12-01

    Increased sensor throughput combined with decreasing hardware costs has led to a disruptive growth in data volume. This disruption, popularly termed "the data deluge," has placed new demands on cyberinfrastructure and information technology skills among researchers in many academic fields, including the environmental sciences. Adaptive sampling has been well established as an effective means of improving network resource efficiency (energy, bandwidth) without sacrificing sample set quality relative to traditional uniform sampling. However, using adaptive sampling for the explicit purpose of improving resolution over events -- situations displaying intermittent dynamics and unique hydrogeological signatures -- is relatively new. In this paper, we define hot spots and hot moments in terms of sensor signal activity as measured through discrete Fourier analysis. Following this frequency-based approach, we apply the Nyquist-Shannon sampling theorem, a fundamental contribution from signal processing that led to the field of information theory, for analysis of uni- and multivariate environmental signal data. In the scope of multi-scale environmental sensor networks, we present several sampling control algorithms, derived from the Nyquist-Shannon theorem, that operate at local (field sensor), regional (base station for aggregation of field sensor data), and global (Cloud-based, computationally intensive models) scales. Evaluated over soil moisture data, results indicate significantly greater sample density during precipitation events while reducing overall sample volume. Using these algorithms as indicators rather than control mechanisms, we also discuss opportunities for spatio-temporal modeling as a tool for planning/modifying sensor network deployments. [Figure captions: locally adaptive model based on the Nyquist-Shannon sampling theorem; Pareto frontiers for local, regional, and global models relative to uniform sampling, with objectives of (1) overall sampling efficiency and (2) sampling (truncated).]
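
    A minimal sketch of the local (field-sensor) control idea, under assumed thresholds: estimate the highest active frequency in a recent window of samples and set the next sampling rate to a safe multiple of it, per the Nyquist-Shannon theorem. The 1% power threshold and safety margin are assumptions for illustration.

    ```python
    # Window-based DFT drives the next sampling rate: rate >= 2 * f_max.
    import numpy as np

    def next_sampling_rate(window, fs, floor_hz=1 / 3600, margin=2.5):
        """Return an adapted sampling rate [Hz] for the coming interval."""
        spec = np.abs(np.fft.rfft(window - window.mean())) ** 2
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        active = freqs[spec > 0.01 * spec.max()]       # "hot" frequency band
        f_max = active.max() if active.size else floor_hz
        return max(margin * f_max, 2 * floor_hz)       # margin > 2 for safety

    # Example: 1 Hz soil-moisture record; an event adds fast dynamics
    t = np.arange(600)
    quiet = 0.3 + 0.001 * np.sin(2 * np.pi * t / 600)
    event = quiet + 0.05 * np.sin(2 * np.pi * 0.1 * t)
    print(next_sampling_rate(quiet, fs=1.0), next_sampling_rate(event, fs=1.0))
    ```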

  10. Combined AIE/EBE/GMRES approach to incompressible flows. [Adaptive Implicit-Explicit/Grouped Element-by-Element/Generalized Minimum Residuals

    NASA Technical Reports Server (NTRS)

    Liou, J.; Tezduyar, T. E.

    1990-01-01

    Adaptive implicit-explicit (AIE), grouped element-by-element (GEBE), and generalized minimum residuals (GMRES) solution techniques for incompressible flows are combined. In this approach, the GEBE and GMRES iteration methods are employed to solve the equation systems resulting from the implicitly treated elements, and therefore no direct solution effort is involved. The benchmarking results demonstrate that this approach can substantially reduce the CPU time and memory requirements in large-scale flow problems. Although the description of the concepts and the numerical demonstration are based on incompressible flows, the approach presented here is applicable to a larger class of problems in computational mechanics.

  11. Constraining Unsaturated Hydraulic Parameters Using the Latin Hypercube Sampling Method and Coupled Hydrogeophysical Approach

    NASA Astrophysics Data System (ADS)

    Farzamian, Mohammad; Monteiro Santos, Fernando A.; Khalil, Mohamed A.

    2017-12-01

    The coupled hydrogeophysical approach has proved to be a valuable tool for improving the use of geoelectrical data for hydrological model parameterization. In the coupled approach, hydrological parameters are directly inferred from geoelectrical measurements in a forward manner to eliminate the uncertainty connected to the independent inversion of electrical resistivity data. Several numerical studies have been conducted to demonstrate the advantages of a coupled approach; however, only a few attempts have been made to apply the coupled approach to actual field data. In this study, we developed a 1D coupled hydrogeophysical code to estimate the van Genuchten-Mualem model parameters, K_s, n, θ_r, and α, from time-lapse vertical electrical sounding data collected during a constant inflow infiltration experiment. van Genuchten-Mualem parameters were sampled using the Latin hypercube sampling method to provide full coverage of the range of each parameter from their distributions. By applying the coupled approach, vertical electrical sounding data were coupled to hydrological models inferred from van Genuchten-Mualem parameter samples to investigate the feasibility of constraining the hydrological model. The key approaches taken in the study are to (1) integrate electrical resistivity and hydrological data while avoiding data inversion, (2) estimate the total water mass recovery of electrical resistivity data and consider it in the evaluation of van Genuchten-Mualem parameters, and (3) correct the influence of subsurface temperature fluctuations during the infiltration experiment on electrical resistivity data. The results of the study revealed that the coupled hydrogeophysical approach can improve the value of geophysical measurements in hydrological model parameterization. However, the approach cannot overcome the technical limitations of the geoelectrical method associated with resolution and water mass recovery.
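
    The Latin hypercube step can be sketched with SciPy's quasi-Monte Carlo module; the parameter bounds below are illustrative assumptions, not the study's ranges.

    ```python
    # Draw van Genuchten-Mualem parameter sets with Latin hypercube sampling so
    # that each parameter's range is covered evenly.
    from scipy.stats import qmc

    bounds_lo = [1e-6, 1.1, 0.01, 0.5]  # K_s [m/s], n [-], theta_r [-], alpha [1/m]
    bounds_hi = [1e-4, 3.0, 0.10, 5.0]

    sampler = qmc.LatinHypercube(d=4, seed=0)
    unit = sampler.random(n=200)                    # 200 samples in [0, 1)^4
    params = qmc.scale(unit, bounds_lo, bounds_hi)  # map to physical ranges

    # Each row is one (K_s, n, theta_r, alpha) set fed to the coupled
    # hydrogeophysical forward model.
    print(params[:3])
    ```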

  12. Adapting Champion's Breast Cancer Fear Scale to colorectal cancer: psychometric testing in a sample of older Chinese adults.

    PubMed

    Leung, Doris Y P; Wong, Eliza M L; Chan, Carmen W H

    2014-06-01

    Colorectal cancer (CRC) is the most common type of cancer in both men and women, and older adults are more susceptible to this disease. Previous studies suggest that cancer fear may be a key predictor of participation in cancer screening. Yet there is a lack of validated tools measuring fear relating to CRC for the Chinese older adult population. This study aims to test the psychometric properties of the Chinese version of the Colorectal Cancer Fear Scale (CRCFS), adapted from Champion's Breast Cancer Fear Scale. The CRCFS was developed by altering the wording 'breast cancer' to 'colorectal cancer'. Interviewer-administered surveys were carried out with a convenience sample of 250 community-dwelling adults aged 60 years or older without a history of cancer. A subsample of 40 participants completed the scale again at one month. Confirmatory factor analysis revealed that the one-factor model provided excellent fits to the overall data and to two randomly split samples. Cronbach's alpha of the scale was 0.95 and test-retest reliability was 0.52. Positive and significant correlations of CRC fear with CRC-related susceptibility, severity, and barriers were observed. A non-linear relationship with benefits was found. The findings provide support for the psychometric properties of the Chinese version of Champion's Breast Cancer Fear Scale adapted to CRC in a sample of community-dwelling older Chinese adults. The scale provides a useful tool to assess CRC-related fear, which interventions should address in order to improve screening rates among older Chinese adults. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Making climate change tangible for strategic adaptation planning: The Climate Corridor Approach

    NASA Astrophysics Data System (ADS)

    Orlowsky, Boris; Calanca, Pierluigi; Ali, Irshad; Ali, Jawad; Elguera Hilares, Agustin; Huggel, Christian; Khan, Inamullah; Neukom, Raphael; Nizami, Arjumand; Qazi, Muhammad Abbas; Robledo, Carmenza; Rohrer, Mario; Salzmann, Nadine; Schmidt, Kaspar

    2017-04-01

    Climate change is a global phenomenon and difficult to grasp. Although its importance is generally acknowledged, impacts of (future) climate change on human activities are in many cases not taken into account explicitly, in particular when planning development projects. This is due to technical and conceptual challenges, missing financial and human resources, and competing priorities. Neglecting climate change can become problematic if a proposed activity requires specific climatological conditions under which it becomes feasible, a simple example being crop cultivation that needs certain temperature and precipitation ranges. Comparing such "climate corridors" to future climate projections provides an intuitive and low-cost yet quantitative means for assessing needs for, and viability of, adaptation activities under climate change - a "poor man's approach" to climate suitability analysis. A chief advantage of this approach is its modest demand on data. Three case studies from Pakistan, Peru and Tajikistan show that climate corridor analysis can deliver robust results and can be used to efficiently communicate risks and challenges of climate change to partners and stakeholders in the developing countries.
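
    A minimal sketch of the corridor test follows, with hypothetical corridor bounds for an unnamed crop and synthetic numbers standing in for climate model projections.

    ```python
    # Count how many projected years fall inside a viability "climate corridor".
    import numpy as np

    t_corridor = (12.0, 22.0)    # viable growing-season mean temperature [deg C]
    p_corridor = (350.0, 900.0)  # viable growing-season precipitation [mm]

    rng = np.random.default_rng(1)            # stand-in for model projections
    temps = rng.normal(18.0, 2.0, size=30)    # 30 projected years
    precip = rng.normal(500.0, 120.0, size=30)

    inside = ((temps >= t_corridor[0]) & (temps <= t_corridor[1]) &
              (precip >= p_corridor[0]) & (precip <= p_corridor[1]))
    print(f"{inside.mean():.0%} of projected years remain inside the corridor")
    ```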

  14. Group adaptation, formal darwinism and contextual analysis.

    PubMed

    Okasha, S; Paternotte, C

    2012-06-01

    We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659-671) develop a novel approach to this question, building on Grafen's 'formal Darwinism' project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection-optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams' famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis. © 2012 The Authors. Journal of Evolutionary Biology © 2012 European Society For Evolutionary Biology.

  15. Application of Adaptive Autopilot Designs for an Unmanned Aerial Vehicle

    NASA Technical Reports Server (NTRS)

    Shin, Yoonghyun; Calise, Anthony J.; Motter, Mark A.

    2005-01-01

    This paper summarizes the application of two adaptive approaches to autopilot design, and presents an evaluation and comparison of the two approaches in simulation for an unmanned aerial vehicle. One approach employs two-stage dynamic inversion and the other employs feedback dynamic inversion based on a command augmentation system. Both are augmented with neural network based adaptive elements. The approaches permit adaptation to both parametric uncertainty and unmodeled dynamics, and incorporate a method that permits adaptation during periods of control saturation. Simulation results for an FQM-117B radio controlled miniature aerial vehicle are presented to illustrate the performance of the neural network based adaptation.

  16. An Optimal Control Modification to Model-Reference Adaptive Control for Fast Adaptation

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Krishnakumar, Kalmanje; Boskovic, Jovan

    2008-01-01

    This paper presents a method that can achieve fast adaptation for a class of model-reference adaptive control. It is well known that standard model-reference adaptive control exhibits high-gain control behaviors when a large adaptive gain is used to achieve fast adaptation in order to reduce tracking error rapidly. High-gain control creates high-frequency oscillations that can excite unmodeled dynamics and can lead to instability. The fast adaptation approach is based on the minimization of the squares of the tracking error, which is formulated as an optimal control problem. The necessary condition of optimality is used to derive an adaptive law using the gradient method. This adaptive law is shown to result in uniform boundedness of the tracking error by means of Lyapunov's direct method. Furthermore, this adaptive law allows a large adaptive gain to be used without causing undesired high-gain control effects. The method is shown to be more robust than standard model-reference adaptive control. Simulations demonstrate the effectiveness of the proposed method.
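
    For context, here is a self-contained sketch of the baseline scalar model-reference adaptive control scheme that such modifications build on; the optimal control modification term itself is omitted, so this is not the paper's method, only the standard gradient law it improves upon.

    ```python
    # Baseline scalar MRAC (gradient law): plant xdot = a*x + u with unknown a;
    # control u = (a_m - a_hat)*x + b_m*r makes x track the reference model
    # x_m_dot = a_m*x_m + b_m*r once a_hat converges.
    a_true, a_m, b_m = 1.0, -2.0, 2.0       # unstable plant, stable reference
    gamma, dt, steps = 10.0, 1e-3, 10_000   # adaptive gain, Euler step, horizon
    x = x_m = a_hat = 0.0
    for k in range(steps):
        r = 1.0 if (k * dt) % 4 < 2 else -1.0     # square-wave command
        u = (a_m - a_hat) * x + b_m * r           # certainty-equivalence law
        e = x - x_m                               # tracking error
        a_hat += gamma * x * e * dt               # gradient adaptive law
        x += (a_true * x + u) * dt                # plant (Euler integration)
        x_m += (a_m * x_m + b_m * r) * dt         # reference model
    print(f"final |e| = {abs(x - x_m):.4f}, a_hat = {a_hat:.2f}")
    ```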

  17. Quantifying the Adaptive Cycle.

    PubMed

    Angeler, David G; Allen, Craig R; Garmestani, Ahjond S; Gunderson, Lance H; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994-2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  18. Quantifying the adaptive cycle

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Garmestani, Ahjond S.; Gunderson, Lance H.; Hjerne, Olle; Winder, Monika

    2015-01-01

    The adaptive cycle was proposed as a conceptual model to portray patterns of change in complex systems. Despite the model having potential for elucidating change across systems, it has been used mainly as a metaphor, describing system dynamics qualitatively. We use a quantitative approach for testing premises (reorganisation, conservatism, adaptation) in the adaptive cycle, using Baltic Sea phytoplankton communities as an example of such complex system dynamics. Phytoplankton organizes in recurring spring and summer blooms, a well-established paradigm in planktology and succession theory, with characteristic temporal trajectories during blooms that may be consistent with adaptive cycle phases. We used long-term (1994–2011) data and multivariate analysis of community structure to assess key components of the adaptive cycle. Specifically, we tested predictions about: reorganisation: spring and summer blooms comprise distinct community states; conservatism: community trajectories during individual adaptive cycles are conservative; and adaptation: phytoplankton species during blooms change in the long term. All predictions were supported by our analyses. Results suggest that traditional ecological paradigms such as phytoplankton successional models have potential for moving the adaptive cycle from a metaphor to a framework that can improve our understanding of how complex systems organize and reorganize following collapse. Quantifying reorganization, conservatism and adaptation provides opportunities to cope with the intricacies and uncertainties associated with fast ecological change, driven by shifting system controls. Ultimately, combining traditional ecological paradigms with heuristics of complex system dynamics using quantitative approaches may help refine ecological theory and improve our understanding of the resilience of ecosystems.

  19. Probabilistic co-adaptive brain-computer interfacing

    NASA Astrophysics Data System (ADS)

    Bryan, Matthew J.; Martin, Stefan A.; Cheung, Willy; Rao, Rajesh P. N.

    2013-12-01

    Objective. Brain-computer interfaces (BCIs) are confronted with two fundamental challenges: (a) the uncertainty associated with decoding noisy brain signals, and (b) the need for co-adaptation between the brain and the interface so as to cooperatively achieve a common goal in a task. We seek to mitigate these challenges. Approach. We introduce a new approach to brain-computer interfacing based on partially observable Markov decision processes (POMDPs). POMDPs provide a principled approach to handling uncertainty and achieving co-adaptation in the following manner: (1) Bayesian inference is used to compute posterior probability distributions (‘beliefs’) over brain and environment state, and (2) actions are selected based on entire belief distributions in order to maximize total expected reward; by employing methods from reinforcement learning, the POMDP’s reward function can be updated over time to allow for co-adaptive behaviour. Main results. We illustrate our approach using a simple non-invasive BCI which optimizes the speed-accuracy trade-off for individual subjects based on the signal-to-noise characteristics of their brain signals. We additionally demonstrate that the POMDP BCI can automatically detect changes in the user’s control strategy and can co-adaptively switch control strategies on-the-fly to maximize expected reward. Significance. Our results suggest that the framework of POMDPs offers a promising approach for designing BCIs that can handle uncertainty in neural signals and co-adapt with the user on an ongoing basis. The fact that the POMDP BCI maintains a probability distribution over the user’s brain state allows a much more powerful form of decision making than traditional BCI approaches, which have typically been based on the output of classifiers or regression techniques. Furthermore, the co-adaptation of the system allows the BCI to make online improvements to its behaviour, adjusting itself automatically to the user’s changing
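
    The Bayesian inference step at the core of the POMDP approach can be sketched for a toy two-state brain-state model; the transition and observation matrices below are illustrative, not fitted to any BCI data.

    ```python
    # POMDP belief update: b'(s') ∝ O(o | s') * sum_s T(s' | s, a) * b(s)
    import numpy as np

    T = np.array([[0.9, 0.1],   # T[s, s']: state transitions (action fixed here)
                  [0.2, 0.8]])
    O = np.array([[0.7, 0.3],   # O[s', o]: observation likelihoods
                  [0.4, 0.6]])

    def belief_update(b, obs):
        predicted = b @ T                  # prediction step
        updated = predicted * O[:, obs]    # weight by observation likelihood
        return updated / updated.sum()     # normalize to a distribution

    b = np.array([0.5, 0.5])
    for obs in [0, 0, 1, 0]:               # a short stream of decoded features
        b = belief_update(b, obs)
    print(b)
    ```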

  20. Adaptive capacity of geographical clusters: Complexity science and network theory approach

    NASA Astrophysics Data System (ADS)

    Albino, Vito; Carbonara, Nunzia; Giannoccaro, Ilaria

    This paper deals with the adaptive capacity of geographical clusters (GCs), a relevant topic in the literature. To address this topic, a GC is considered as a complex adaptive system (CAS). Three theoretical propositions concerning GC adaptive capacity are formulated by using complexity theory. First, we identify three main properties of CASs that affect the adaptive capacity, namely interconnectivity, heterogeneity, and the level of control, and define how the values of these properties influence the adaptive capacity. Then, we associate these properties with specific GC characteristics, thereby obtaining the key conditions that give GCs their adaptive capacity and thus assure their competitive advantage. To test these theoretical propositions, a case study on two real GCs is carried out. The considered GCs are modeled as networks where firms are nodes and inter-firm relationships are links. Heterogeneity, interconnectivity, and level of control are considered as network properties and thus measured by using the methods of network theory.

  1. Non-adaptive and adaptive hybrid approaches for enhancing water quality management

    NASA Astrophysics Data System (ADS)

    Kalwij, Ineke M.; Peralta, Richard C.

    2008-09-01

    Summary: Using optimization to help solve groundwater management problems cost-effectively is becoming increasingly important. Hybrid optimization approaches, which combine two or more optimization algorithms, will become valuable and common tools for addressing complex nonlinear hydrologic problems. Hybrid heuristic optimizers have capabilities far beyond those of a simple genetic algorithm (SGA), and are continuously improving. SGAs having only parent selection, crossover, and mutation are inefficient and rarely used for optimizing contaminant transport management. Even an advanced genetic algorithm (AGA) that includes elitism (to emphasize using the best strategies as parents) and healing (to help assure optimal strategy feasibility) is undesirably inefficient. Much more efficient than an AGA is the presented hybrid (AGCT), which adds comprehensive tabu search (TS) features to an AGA. TS mechanisms (TS probability, tabu list size, search coarseness and solution space size, and a TS threshold value) force the optimizer to search portions of the solution space that yield superior pumping strategies, and to avoid reproducing similar or inferior strategies. An AGCT characteristic is that TS control parameters are unchanging during optimization. However, TS parameter values that are ideal at the start of optimization can be undesirable when nearing assumed global optimality. The second presented hybrid, termed global converger (GC), is significantly better than the AGCT. GC includes AGCT plus feedback-driven auto-adaptive control that dynamically changes TS parameters during run-time. Before comparing AGCT and GC, we empirically derived scaled dimensionless TS control parameter guidelines by evaluating 50 sets of parameter values for a hypothetical optimization problem. For the hypothetical area, AGCT optimized both well locations and pumping rates. The parameters are useful starting values because using trial-and-error to identify an ideal combination of control
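
    A schematic sketch of the AGCT idea follows: an elitist genetic algorithm whose offspring are screened against a tabu list so near-duplicates of already-evaluated strategies are discarded. The cost function and all parameter values are stand-ins, not the authors' groundwater model or code.

    ```python
    # Elitist GA with a tabu screen on offspring (illustrative toy problem).
    import numpy as np

    rng = np.random.default_rng(0)

    def cost(x):                       # stand-in for the pumping-strategy model
        return np.sum((x - 0.3) ** 2)

    pop = rng.random((20, 5))          # 20 strategies, 5 decision variables
    tabu, radius = [], 0.05            # tabu list and similarity threshold
    for gen in range(60):
        pop = pop[np.argsort([cost(x) for x in pop])]
        elite, children, attempts = pop[:4], [], 0
        while len(children) < len(pop) - len(elite):
            p1, p2 = pop[rng.integers(10)], pop[rng.integers(10)]
            child = np.where(rng.random(5) < 0.5, p1, p2)          # crossover
            child = np.clip(child + rng.normal(0, 0.05, 5), 0, 1)  # mutation
            attempts += 1
            # tabu screen: discard strategies too close to ones already seen
            if attempts > 200 or all(np.linalg.norm(child - t) > radius
                                     for t in tabu):
                children.append(child)
                tabu.append(child.copy())
                tabu = tabu[-300:]     # bounded tabu list size
        pop = np.vstack([elite, children])
    print(min(cost(x) for x in pop))
    ```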

  2. Flexible automated approach for quantitative liquid handling of complex biological samples.

    PubMed

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrices. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  3. Omics Approaches for Identifying Physiological Adaptations to Genome Instability in Aging

    PubMed Central

    Edifizi, Diletta; Schumacher, Björn

    2017-01-01

    DNA damage causally contributes to aging and age-related diseases. The declining functioning of tissues and organs during aging can lead to the increased risk of succumbing to aging-associated diseases. Congenital syndromes that are caused by heritable mutations in DNA repair pathways lead to cancer susceptibility and accelerated aging, thus underlining the importance of genome maintenance for withstanding aging. High-throughput mass-spectrometry-based approaches have recently contributed to identifying signalling response networks and gaining a more comprehensive understanding of the physiological adaptations occurring upon unrepaired DNA damage. The insulin-like signalling pathway has been implicated in a DNA damage response (DDR) network that includes epidermal growth factor (EGF)-, AMP-activated protein kinases (AMPK)- and the target of rapamycin (TOR)-like signalling pathways, which are known regulators of growth, metabolism, and stress responses. The same pathways, together with the autophagy-mediated proteostatic response and the decline in energy metabolism have also been found to be similarly regulated during natural aging, suggesting striking parallels in the physiological adaptation upon persistent DNA damage due to DNA repair defects and long-term low-level DNA damage accumulation occurring during natural aging. These insights will be an important starting point to study the interplay between signalling networks involved in progeroid syndromes that are caused by DNA repair deficiencies and to gain new understanding of the consequences of DNA damage in the aging process. PMID:29113067

  4. QPSO-Based Adaptive DNA Computing Algorithm

    PubMed Central

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a new computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This new approach aims to perform the DNA computing algorithm with adaptive parameters towards the desired goal using quantum-behaved particle swarm optimization (QPSO). Contributions provided by the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the parameters of the DNA computing algorithm (population size, crossover rate, maximum number of operations, enzyme and virus mutation rate, and fitness function) are simultaneously tuned for the adaptive process, (2) the adaptive algorithm is performed using the QPSO algorithm for goal-driven progress, faster operation, and flexibility in data, and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate effective optimization, considerable convergence speed, and high accuracy according to the DNA computing algorithm. PMID:23935409
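
    The QPSO update that drives the adaptive tuning can be sketched as follows; the objective function and constants are stand-ins, and the DNA-computing layer being tuned is omitted.

    ```python
    # Quantum-behaved PSO: particles jump around a local attractor p with a
    # spread proportional to their distance from the mean-best position.
    import numpy as np

    rng = np.random.default_rng(42)

    def objective(x):                       # stand-in fitness to minimize
        return np.sum(x ** 2)

    n, d, beta = 30, 4, 0.75                # swarm size, dims, contraction
    x = rng.uniform(-5, 5, (n, d))
    pbest = x.copy()
    pf = np.array([objective(p) for p in pbest])
    for it in range(200):
        gbest = pbest[np.argmin(pf)]
        mbest = pbest.mean(axis=0)                   # mean-best position
        phi = rng.random((n, d))
        p = phi * pbest + (1 - phi) * gbest          # local attractor
        u = rng.random((n, d))
        sign = np.where(rng.random((n, d)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        f = np.array([objective(xi) for xi in x])
        better = f < pf
        pbest[better], pf[better] = x[better], f[better]
    print(pf.min())
    ```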

  5. Sample rotating turntable kit for infrared spectrometers

    DOEpatents

    Eckels, Joel Del [Livermore, CA]; Klunder, Gregory L [Oakland, CA]

    2008-03-04

    An infrared spectrometer sample rotating turntable kit has a rotatable sample cup containing the sample. The infrared spectrometer has an infrared spectrometer probe for analyzing the sample and the rotatable sample cup is adapted to receive the infrared spectrometer probe. A reflectance standard is located in the rotatable sample cup. A sleeve is positioned proximate the sample cup and adapted to receive the probe. A rotator rotates the rotatable sample cup. A battery is connected to the rotator.

  6. Adaptive control of robotic manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1987-01-01

    The author presents a novel approach to adaptive control of manipulators to achieve trajectory tracking by the joint angles. The central concept in this approach is the utilization of the manipulator inverse as a feedforward controller. The desired trajectory is applied as an input to the feedforward controller which behaves as the inverse of the manipulator at any operating point; the controller output is used as the driving torque for the manipulator. The controller gains are then updated by an adaptation algorithm derived from MRAC (model reference adaptive control) theory to cope with variations in the manipulator inverse due to changes of the operating point. An adaptive feedback controller and an auxiliary signal are also used to enhance closed-loop stability and to achieve faster adaptation. The proposed control scheme is computationally fast and does not require a priori knowledge of the complex dynamic model or the parameter values of the manipulator or the payload.

  7. RECRUITING FOR A LONGITUDINAL STUDY OF CHILDREN'S HEALTH USING A HOUSEHOLD-BASED PROBABILITY SAMPLING APPROACH

    EPA Science Inventory

    The sampling design for the National Children's Study (NCS) calls for a population-based, multi-stage, clustered household sampling approach (visit our website for more information on the NCS: www.nationalchildrensstudy.gov). The full sample is designed to be representative of ...

  8. Energetics, adaptation, and adaptability.

    PubMed

    Ulijaszek, Stanley J

    1996-01-01

    Energy capture and conversion are fundamental to human existence, and over the past three decades biological anthropologists have used a number of approaches which incorporate energetics measures in studies of human population biology. Human groups can vary enormously in their energy expenditure. This review considers evidence for genetic adaptation and presents models for physiological adaptability to reduced physiological energy availability and/or negative energy balance. In industrialized populations, different aspects of energy expenditure have been shown to have a genetic component, including basal metabolic rate, habitual physical activity level, mechanical efficiency of work performance, and thermic effect of food. Metabolic adaptation to low energy intakes has been demonstrated in populations in both developing and industrialized nations. Thyroid hormone-related effects on energy metabolic responses to low physiological energy availability are unified in a model, linking energetic adaptability in physical activity and maintenance metabolism. Negative energy balance has been shown to be associated with reduced reproductive function in women experiencing seasonal environments in some developing countries. Existing models relating negative energy balance to menstrual or ovulatory function are largely descriptive, and do not propose any physiological mechanisms for this phenomenon. A model is proposed whereby reduced physiological energy availability could influence ovulatory function via low serum levels of the amino acid aspartate and reduced sympathetic nervous system activity. © 1996 Wiley-Liss, Inc.

  9. Resilience through adaptation

    PubMed Central

    Ten Broeke, Guus A. K.; van Voorn, George A. K.; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover’s distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system. PMID:28196372
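
    For 1-D model outputs, the distribution comparison step reduces to a single SciPy call; the ABM outputs below are synthetic stand-ins for runs with and without agent adaptation.

    ```python
    # Quantify the distance between two output distributions with the
    # earth-mover's (Wasserstein) distance.
    import numpy as np
    from scipy.stats import wasserstein_distance

    rng = np.random.default_rng(3)
    resource_no_adapt = rng.normal(2.0, 1.0, size=500)  # illustrative output
    resource_adapt = rng.normal(3.5, 0.8, size=500)     # runs with adaptation

    print(wasserstein_distance(resource_no_adapt, resource_adapt))
    ```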

  10. Resilience through adaptation.

    PubMed

    Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.

  11. Sample processing approach for detection of ricin in surface samples.

    PubMed

    Kane, Staci; Shah, Sanjiv; Erler, Anne Marie; Alfaro, Teneile

    2017-12-01

    With several ricin contamination incidents reported over the past decade, rapid and accurate methods are needed for environmental sample analysis, especially after decontamination. A sample processing method was developed for common surface sampling devices to improve the limit of detection and avoid false negative/positive results for ricin analysis. Potential assay interferents from the sample matrix (bleach residue, sample material, wetting buffer), including reference dust, were tested using a Time-Resolved Fluorescence (TRF) immunoassay. Test results suggested that the sample matrix did not cause the elevated background fluorescence sometimes observed when analyzing post-bleach decontamination samples from ricin incidents. Furthermore, sample particulates (80 mg/mL Arizona Test Dust) did not enhance background fluorescence or interfere with ricin detection by TRF. These results suggested that high background fluorescence in this immunoassay could be due to labeled antibody quality and/or quantity issues. Centrifugal ultrafiltration devices were evaluated for ricin concentration as a part of sample processing. Up to 30-fold concentration of ricin was observed by the devices, which serve to remove soluble interferents and could function as the front-end sample processing step to other ricin analytical methods. The procedure has the potential to be used with a broader range of environmental sample types and with other potential interferences, and to be followed by other ricin analytical methods, although additional verification studies would be required. Published by Elsevier B.V.

  12. Assessment of 48 Stock markets using adaptive multifractal approach

    NASA Astrophysics Data System (ADS)

    Ferreira, Paulo; Dionísio, Andreia; Movahed, S. M. S.

    2017-11-01

    In this paper, stock market comovements are examined using cointegration, Granger causality tests, and nonlinear approaches in the context of mutual information and correlations. Since the underlying data sets are affected by non-stationarities and trends, we also apply Adaptive Multifractal Detrended Fluctuation Analysis (AMF-DFA) and Adaptive Multifractal Detrended Cross-Correlation Analysis (AMF-DXA). We find only 170 pairs of stock markets cointegrated, and according to the Granger causality and mutual information, we realize that the strongest relations lie between emerging markets, and between emerging and frontier markets. According to the scaling exponent given by AMF-DFA, h(q = 2) > 1, we find that all underlying data sets belong to non-stationary processes. According to the Efficient Market Hypothesis (EMH), only 8 markets are classified as uncorrelated processes at the 2σ confidence interval. 6 stock markets belong to the anti-correlated class, and the dominant part of markets has memory in the corresponding daily index prices from January 1995 to February 2014. New Zealand with H = 0.457 ± 0.004 and Jordan with H = 0.602 ± 0.006 are far from EMH. The nature of the cross-correlation exponents based on AMF-DXA is almost multifractal for all pairs of stock markets. The empirical relation H_xy ≤ [H_xx + H_yy]/2 is confirmed. The mentioned relation is also satisfied for q > 0, while for q < 0 there is a deviation from this relation, confirming that the behavior of markets for small fluctuations is affected by the contribution of the major pair. For larger fluctuations, the cross-correlation contains information from both local (internal) and global (external) conditions. The widths of the singularity spectrum for auto-correlation and cross-correlation are Δα_xx ∈ [0.304, 0.905] and Δα_xy ∈ [0.246, 1.178], respectively. The wide range of the singularity spectrum for cross-correlation confirms that the bilateral relation between stock markets is more complex. The value of σ_DCCA indicates that all
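
    For reference, a plain (non-adaptive) DFA sketch shows how a Hurst-type exponent h is read off the fluctuation function F(s) ~ s^h; the adaptive detrending that distinguishes AMF-DFA is not reproduced here.

    ```python
    # Detrended fluctuation analysis: profile, windowed linear detrend,
    # fluctuation function, and a log-log slope as the scaling exponent.
    import numpy as np

    def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
        y = np.cumsum(x - np.mean(x))             # profile of the series
        F = []
        for s in scales:
            n = len(y) // s
            segs = y[:n * s].reshape(n, s)
            t = np.arange(s)
            res = []
            for seg in segs:                      # detrend each window
                c = np.polyfit(t, seg, 1)
                res.append(np.mean((seg - np.polyval(c, t)) ** 2))
            F.append(np.sqrt(np.mean(res)))
        h, _ = np.polyfit(np.log(scales), np.log(F), 1)
        return h                                  # Hurst-type exponent

    rng = np.random.default_rng(7)
    print(dfa_exponent(rng.standard_normal(4096)))   # white noise -> h ~ 0.5
    ```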

  13. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
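
    The synchronizer's role can be illustrated with a toy calculation; the samples-per-revolution value is an assumption for illustration, not taken from the patent.

    ```python
    # Lock the ADC sampling rate to shaft speed so that every revolution
    # yields the same number of samples.
    SAMPLES_PER_REV = 256   # assumed design constant

    def sampling_control_signal(rotational_freq_hz: float) -> float:
        """Sampling frequency [Hz] derived from the rotational frequency."""
        return SAMPLES_PER_REV * rotational_freq_hz

    for rpm in (1500.0, 1800.0, 3600.0):
        f_rot = rpm / 60.0
        print(f"{rpm:6.0f} rpm -> sample at "
              f"{sampling_control_signal(f_rot):8.0f} Hz")
    ```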

  14. Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.

    PubMed

    Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel

    2017-06-01

    Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.

  15. Adaptive management: Chapter 1

    USGS Publications Warehouse

    Allen, Craig R.; Garmestani, Ahjond S.; Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  16. Astrobiology Objectives for Mars Sample Return

    NASA Astrophysics Data System (ADS)

    Meyer, M. A.

    2002-05-01

    Astrobiology is the study of life in the Universe, and a major objective is to understand the past, present, and future biologic potential of Mars. The current Mars Exploration Program encompasses a series of missions for reconnaissance and in-situ analyses to define in time and space the degree of habitability on Mars. Determining whether life ever existed on Mars is a more demanding question, as evidenced by controversies concerning the biogenicity of features in the Mars meteorite ALH84001 and in the earliest rocks on Earth. In-situ studies may find samples of extreme interest, but resolution of the life question most probably would require a sample returned to Earth. A selected sample from Mars has many advantages: state-of-the-art instruments, precision sample handling and processing, scrutiny by different investigators employing different techniques, and adaptation of approach to any surprises. It is with a returned sample from Mars that Astrobiology has the most to gain in determining whether life did, does, or could exist on Mars.

  17. L1 Adaptive Control Augmentation System with Application to the X-29 Lateral/Directional Dynamics: A Multi-Input Multi-Output Approach

    NASA Technical Reports Server (NTRS)

    Griffin, Brian Joseph; Burken, John J.; Xargay, Enric

    2010-01-01

    This paper presents an L(sub 1) adaptive control augmentation system design for multi-input multi-output nonlinear systems in the presence of unmatched uncertainties which may exhibit significant cross-coupling effects. A piecewise continuous adaptive law that explicitly compensates for dynamic cross-coupling is adopted and extended for applicability to multi-input multi-output systems. In addition, high-fidelity actuator models are explicitly incorporated into the L(sub 1) architecture to reduce uncertainties in the system. The L(sub 1) multi-input multi-output adaptive control architecture is applied to the X-29 lateral/directional dynamics and results are evaluated against a similar single-input single-output design approach.

  18. Ebola Virus Altered Innate and Adaptive Immune Response Signalling Pathways: Implications for Novel Therapeutic Approaches.

    PubMed

    Kumar, Anoop

    2016-01-01

    Ebola virus (EBOV) attracts attention for its impressive lethality, marked by a poor immune response and a high inflammatory reaction in patients. It causes a severe hemorrhagic fever with case fatality rates of up to 90%. The mechanism underlying this lethal outcome is poorly understood. In 2014, a major outbreak of Ebola virus spread amongst several African countries, including Sierra Leone and Guinea. Although infections occur frequently only in Central Africa, the virus has the potential to spread globally. Presently, no vaccine or treatment is available to counteract Ebola virus infections, due to a poor understanding of its interaction with the immune system. Accumulating evidence indicates that the virus actively alters both innate and adaptive immune responses and triggers harmful inflammatory responses. In the literature, some reports have shown that the alteration of immune signaling pathways could be due to the ability of EBOV to interfere with dendritic cells (DCs), which link innate and adaptive immune responses. On the other hand, some reports have demonstrated that the EBOV VP35 protein acts as an interferon antagonist. So, how the Ebola virus alters the innate and adaptive immune response signaling pathways is still an open question for researchers to explore. Thus, in this review, I try to summarize the mechanisms of the alteration of innate and adaptive immune response signaling pathways by Ebola virus, which will be helpful for designing effective drugs or vaccines against this lethal infection. Further, potential targets, current treatment, and novel therapeutic approaches have also been discussed.

  19. Eigenvector method for umbrella sampling enables error analysis

    PubMed Central

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-01-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912
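
    An EMUS-style sketch of the eigenproblem step follows, with Gaussian bias functions and synthetic window samples standing in for simulation data; the window weights emerge as the left eigenvector of a row-stochastic overlap matrix with eigenvalue 1.

    ```python
    # Eigenvector method for combining umbrella-sampling windows (sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    centers = np.linspace(-2, 2, 9)      # umbrella window centers
    sigma = 0.5

    def psi(x):                          # bias weights for all windows
        return np.exp(-0.5 * ((x - centers) / sigma) ** 2)

    # Synthetic samples from each biased window (stand-in for trajectories)
    samples = [rng.normal(c, sigma, size=2000) for c in centers]

    L = len(centers)
    F = np.zeros((L, L))                 # overlap matrix, rows sum to 1
    for i, xs in enumerate(samples):
        w = np.array([psi(x) for x in xs])                    # (n, L)
        F[i] = (w / w.sum(axis=1, keepdims=True)).mean(axis=0)

    vals, vecs = np.linalg.eig(F.T)      # left eigenvectors of F
    z = np.real(vecs[:, np.argmax(np.real(vals))])
    z = np.abs(z) / np.abs(z).sum()      # normalized window weights
    print(z)
    ```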

  20. Cultural adaptation in measuring common client characteristics with an urban Mainland Chinese sample.

    PubMed

    Song, Xiaoxia; Anderson, Timothy; Beutler, Larry E; Sun, Shijin; Wu, Guohong; Kimpara, Satoko

    2015-01-01

    This study aimed to develop a culturally adapted version of the Systematic Treatment Selection-Innerlife (STS) in China. A total of 300 nonclinical participants recruited from Mainland China and 240 nonclinical US participants were drawn from archival data. A Chinese version of the STS was developed using translation and back-translation procedures. After confirmatory factor analysis (CFA) of the original STS subscales failed on both samples, exploratory factor analysis (EFA) was used to assess whether a simple structure would emerge from the STS treatment items. Parallel analysis and the minimum average partial were used to determine the number of factors to retain. Three cross-cultural factors were found in this study: internalized distress, externalized distress, and interpersonal relations. This supports the view that, regardless of whether one is in the presumably different cultural contexts of the USA or China, psychological distress is expressed in a few basic channels of internalized distress, externalized distress, and interpersonal relations; different manifestations in different cultures are also discussed.

  1. A Multipronged, Adaptive Approach for the Recruitment of Diverse Community-Residing Elders with Memory Impairment: The MIND at Home Experience.

    PubMed

    Samus, Quincy M; Amjad, Halima; Johnston, Deirdre; Black, Betty S; Bartels, Stephen J; Lyketsos, Constantine G

    2015-07-01

    To provide a critical review of a multipronged recruitment approach used to identify, recruit, and enroll a diverse community-based sample of persons with memory disorders into an 18-month randomized, controlled dementia care coordination trial. Descriptive analysis of a recruitment approach comprising five strategies: community liaison ("gatekeepers") method, letters sent from trusted community organizations, display and distribution of study materials in the community, research registries, and general community outreach and engagement activities. Participants were 55 community organizations and 63 staff of community organizations in Baltimore, Maryland. Participant referral sources, eligibility, enrollment status, demographics, and loss to follow-up were tracked in a relational access database. In total, 1,275 referrals were received and 303 socioeconomically, cognitively, and racially diverse community-dwelling persons with cognitive disorders were enrolled. Most referrals came from letters sent from community organizations directly to clients on the study's behalf (39%) and referrals from community liaison organizations (29%). African American/black enrollees were most likely to come from community liaison organizations. A multipronged, adaptive approach led to the successful recruitment of diverse community-residing elders with memory impairment for an intervention trial. Key factors for success included using a range of evidence-supported outreach strategies, forming key strategic community partnerships, seeking regular stakeholder input through all research phases, and obtaining "buy-in" from community stakeholders by aligning study objectives with perceived unmet community needs. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  2. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST)

    PubMed Central

    Xu, Chonggang; Gertner, George

    2013-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to the estimation of partial variances contributed by the main effects of model parameters, but does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements. PMID:24143037
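
    The classical search-curve variant can be sketched as follows; the driver frequencies, harmonic count, and test function are illustrative choices, and the paper's other sampling schemes (simple random and random balance design) are not shown.

    ```python
    # Search-curve FAST: drive each input at its own frequency, then read the
    # first-order sensitivity from the Fourier power at that frequency.
    import numpy as np

    omega = np.array([11, 21, 27])     # driver frequencies (chosen so that
    M = 4                              # harmonics up to M do not collide)
    N = 2 * M * omega.max() + 1        # minimum number of samples
    s = 2 * np.pi * (np.arange(N) + 0.5) / N - np.pi

    # Search curve: fills [0, 1]^3 as s sweeps (-pi, pi]
    X = 0.5 + np.arcsin(np.sin(np.outer(omega, s))) / np.pi

    def model(x):                      # illustrative test function
        return x[0] + 2 * x[1] ** 2 + 0.5 * x[2]

    y = np.apply_along_axis(model, 0, X)
    A = lambda k: np.mean(y * np.cos(k * s))     # Fourier coefficients
    B = lambda k: np.mean(y * np.sin(k * s))
    total = np.sum([A(k) ** 2 + B(k) ** 2
                    for k in range(1, M * omega.max() + 1)])
    for i, w in enumerate(omega):
        part = np.sum([A(p * w) ** 2 + B(p * w) ** 2 for p in range(1, M + 1)])
        print(f"S_{i + 1} ~ {part / total:.2f}")
    ```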

  3. Understanding and comparisons of different sampling approaches for the Fourier Amplitudes Sensitivity Test (FAST).

    PubMed

    Xu, Chonggang; Gertner, George

    2011-01-01

    Fourier Amplitude Sensitivity Test (FAST) is one of the most popular uncertainty and sensitivity analysis techniques. It uses a periodic sampling approach and a Fourier transformation to decompose the variance of a model output into partial variances contributed by different model parameters. Until now, FAST analysis has mainly been confined to the estimation of partial variances contributed by the main effects of model parameters, but does not allow for those contributed by specific interactions among parameters. In this paper, we theoretically show that FAST analysis can be used to estimate partial variances contributed by both main effects and interaction effects of model parameters using different sampling approaches (i.e., traditional search-curve based sampling, simple random sampling and random balance design sampling). We also analytically calculate the potential errors and biases in the estimation of partial variances. Hypothesis tests are constructed to reduce the effect of sampling errors on the estimation of partial variances. Our results show that compared to simple random sampling and random balance design sampling, sensitivity indices (ratios of partial variances to variance of a specific model output) estimated by search-curve based sampling generally have higher precision but larger underestimations. Compared to simple random sampling, random balance design sampling generally provides higher estimation precision for partial variances contributed by the main effects of parameters. The theoretical derivation of partial variances contributed by higher-order interactions and the calculation of their corresponding estimation errors in different sampling schemes can help us better understand the FAST method and provide a fundamental basis for FAST applications and further improvements.

  4. Laboratory Approach to the Study of Elastic Anisotropy on Rock Samples

    NASA Astrophysics Data System (ADS)

    Pros, Z.; Lokajíček, T.; Klíma, K.

    The experimental approach (hardware and software) to the study of the elastic anisotropy of rocks on spherical samples under hydrostatic pressure up to 400 MPa is discussed. A substantial innovation of the existing measuring system and processing methods enabled us to make a detailed investigation and evaluation of the kinematic as well as dynamic parameters of elastic waves propagating through anisotropic media. The innovation is based on digital recording of the wave pattern with a high sampling density in both time and amplitude. Several options and results obtained with the innovated laboratory equipment are presented.

  5. Handwritten word preprocessing for database adaptation

    NASA Astrophysics Data System (ADS)

    Oprean, Cristina; Likforman-Sulem, Laurence; Mokbel, Chafic

    2013-01-01

    Handwriting recognition systems are typically trained using publicly available databases, where data have been collected in controlled conditions (image resolution, paper background, noise level,...). Since this is not often the case in real-world scenarios, classification performance can be affected when novel data is presented to the word recognition system. To overcome this problem, we present in this paper a new approach called database adaptation. It consists of processing one set (training or test) in order to adapt it to the other set (test or training, respectively). Specifically, two kinds of preprocessing, namely stroke thickness normalization and pixel intensity normalization, are considered. The advantage of such an approach is that the existing recognition system trained on controlled data can be re-used. We conduct several experiments with the Rimes 2011 word database and with a real-world database. We adapt either the test set or the training set. Results show that training set adaptation achieves better results than test set adaptation, at the cost of a second training stage on the adapted data. Data set adaptation increases accuracy by 2% to 3% in absolute value over no adaptation.

  6. Adapting to life: ocean biogeochemical modelling and adaptive remeshing

    NASA Astrophysics Data System (ADS)

    Hill, J.; Popova, E. E.; Ham, D. A.; Piggott, M. D.; Srokosz, M.

    2014-05-01

    An outstanding problem in biogeochemical modelling of the ocean is that many of the key processes occur intermittently at small scales, such as the sub-mesoscale, that are not well represented in global ocean models. This is partly due to their failure to resolve sub-mesoscale phenomena, which play a significant role in vertical nutrient supply. Simply increasing the resolution of the models may be an inefficient computational solution to this problem. An approach based on recent advances in adaptive mesh computational techniques may offer an alternative. Here the first steps in such an approach are described, using the example of a simple vertical column (quasi-1-D) ocean biogeochemical model. We present a novel method of simulating ocean biogeochemical behaviour on a vertically adaptive computational mesh, where the mesh changes in response to the biogeochemical and physical state of the system throughout the simulation. We show that the model reproduces the general physical and biological behaviour at three ocean stations (India, Papa and Bermuda) as compared to a high-resolution fixed mesh simulation and to observations. The use of an adaptive mesh does not increase the computational error, but reduces the number of mesh elements by a factor of 2-3. Unlike previous work the adaptivity metric used is flexible and we show that capturing the physical behaviour of the model is paramount to achieving a reasonable solution. Adding biological quantities to the adaptivity metric further refines the solution. We then show the potential of this method in two case studies where we change the adaptivity metric used to determine the varying mesh sizes in order to capture the dynamics of chlorophyll at Bermuda and sinking detritus at Papa. We therefore demonstrate that adaptive meshes may provide a suitable numerical technique for simulating seasonal or transient biogeochemical behaviour at high vertical resolution whilst minimising the number of elements in the mesh.

  7. Performance assessment of a pre-partitioned adaptive chemistry approach in large-eddy simulation of turbulent flames

    NASA Astrophysics Data System (ADS)

    Pepiot, Perrine; Liang, Youwen; Newale, Ashish; Pope, Stephen

    2016-11-01

    A pre-partitioned adaptive chemistry (PPAC) approach recently developed and validated in the simplified framework of a partially-stirred reactor is applied to the simulation of turbulent flames using a LES/particle PDF framework. The PPAC approach was shown to simultaneously provide significant savings in CPU and memory requirements, two major limiting factors in LES/particle PDF. The savings are achieved by providing each particle in the PDF method with a specialized reduced representation and kinetic model adjusted to its changing composition. Both representation and model are identified efficiently from a pre-determined list using a low-dimensional binary-tree search algorithm, thereby keeping the run-time overhead associated with the adaptive strategy to a minimum. The Sandia D flame is used as benchmark to quantify the performance of the PPAC algorithm in a turbulent combustion setting. In particular, the CPU and memory benefits, the distribution of the various representations throughout the computational domain, and the relationship between the user-defined error tolerances used to derive the reduced representations and models and the actual errors observed in LES/PDF are characterized. This material is based upon work supported by the U.S. Department of Energy Office of Science, Office of Basic Energy Sciences under Award Number DE-FG02-90ER14128.

  8. Adapting the capacities and vulnerabilities approach: a gender analysis tool.

    PubMed

    Birks, Lauren; Powell, Christopher; Hatfield, Jennifer

    2017-12-01

    Gender analysis methodology is increasingly being considered as essential to health research because 'women's social, economic and political status undermine their ability to protect and promote their own physical, emotional and mental health, including their effective use of health information and services' {World Health Organization [Gender Analysis in Health: a review of selected tools. 2003; www.who.int/gender/documents/en/Gender.pdf (20 February 2008, date last accessed)]}. By examining gendered roles, responsibilities and norms through the lens of gender analysis, we can develop an in-depth understanding of social power differentials, and be better able to address gender inequalities and inequities within institutions and between men and women. When conducting gender analysis, tools and frameworks may help to aid community engagement and to provide a framework to ensure that relevant gendered nuances are assessed. The capacities and vulnerabilities approach (CVA) is one such gender analysis framework that critically considers gender and its associated roles, responsibilities and power dynamics in a particular community and seeks to meet a social need of that particular community. Although the original intent of the CVA was to guide humanitarian intervention and disaster preparedness, we adapted this framework to a different context, which focuses on identifying and addressing emerging problems and social issues in a particular community or area that affect their specific needs, such as an infectious disease outbreak or difficulty accessing health information and resources. We provide an example of our CVA adaptation, which served to facilitate a better understanding of how health-related disparities affect Maasai women in a remote, resource-poor setting in Northern Tanzania. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Delivering organisational adaptation through legislative mechanisms: Evidence from the Adaptation Reporting Power (Climate Change Act 2008).

    PubMed

    Jude, S R; Drew, G H; Pollard, S J T; Rocks, S A; Jenkinson, K; Lamb, R

    2017-01-01

    There is increasing recognition that organisations, particularly in key infrastructure sectors, are potentially vulnerable to climate change and extreme weather events, and require organisational responses to ensure they are resilient and adaptive. However, detailed evidence of how adaptation is facilitated, implemented and reported, particularly through legislative mechanisms, is lacking. The United Kingdom Climate Change Act (2008) introduced the Adaptation Reporting Power, enabling the Government to direct so-called reporting authorities to report their climate change risks and adaptation plans. We describe the authors' unique role and experience supporting the Department for Environment, Food and Rural Affairs (Defra) during the Adaptation Reporting Power's first round. An evaluation framework, used to review the adaptation reports, is presented alongside evidence on how the process provides new insights into adaptation activities and triggered organisational change in 78% of reporting authorities, including the embedding of climate risk and adaptation issues. The role of legislative mechanisms and risk-based approaches in driving and delivering adaptation is discussed alongside future research needs, including the development of organisational maturity models to determine resilient and well-adapting organisations. The Adaptation Reporting Power process provides a basis for similar initiatives in other countries, although a clear engagement strategy to ensure buy-in to the process and research on its long-term legacy, including the potential merits of voluntary approaches, is required. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  10. Wind profiling for a coherent wind Doppler lidar by an auto-adaptive background subtraction approach.

    PubMed

    Wu, Yanwei; Guo, Pan; Chen, Siying; Chen, He; Zhang, Yinchao

    2017-04-01

    Auto-adaptive background subtraction (AABS) is proposed as a denoising method for data processing of the coherent Doppler lidar (CDL). The method is proposed specifically for a low-signal-to-noise-ratio regime, in which the drifting power spectral density of CDL data occurs. Unlike the periodogram maximum (PM) and adaptive iteratively reweighted penalized least squares (airPLS), the proposed method presents reliable peaks and is thus advantageous in identifying peak locations. According to the analysis results of simulated and actually measured data, the proposed method outperforms the airPLS method and the PM algorithm in the furthest detectable range. The proposed method improves the detection range by approximately 16.7% and 40% compared to the airPLS and PM methods, respectively. It also has smaller mean wind velocity and standard error values than the airPLS and PM methods. The AABS approach improves the quality of Doppler shift estimates and can be applied to obtain the whole wind profiling by the CDL.

  11. Adaptive Delta Management: cultural aspects of dealing with uncertainty

    NASA Astrophysics Data System (ADS)

    Timmermans, Jos; Haasnoot, Marjolijn; Hermans, Leon; Kwakkel, Jan

    2016-04-01

    Deltas are generally recognized as vulnerable to climate change and therefore a salient topic in adaptation science. Deltas are also highly dynamic systems viewed from physical (erosion, sedimentation, subsidence), social (demographic), economic (trade), infrastructure (transport, energy, metropolization) and cultural (multi-ethnic) perspectives. This multi-faceted dynamic character of delta areas warrants the emergence of a branch of applied adaptation science, Adaptive Delta Management, which explicitly focuses on climate adaptation of such highly dynamic and deeply uncertain systems. The application of Adaptive Delta Management in the Dutch Delta Program and its active international dissemination by Dutch professionals results in the rapid spread of Adaptive Delta Management to deltas worldwide. This global dissemination raises concerns among professionals in delta management about its applicability in deltas with cultural conditions and historical developments quite different from those found in the Netherlands and the United Kingdom, where the practices now labelled as Adaptive Delta Management first emerged. This research develops an approach and gives a first analysis of the interaction between the characteristics of different approaches in Adaptive Delta Management and their alignment with the cultural conditions encountered in various deltas globally. In this analysis, the different management theories underlying approaches to Adaptive Delta Management, as encountered in both scientific and professional publications, are first identified and characterized on three dimensions: orientation on today, orientation on the future, and decision making (Timmermans, 2015). The different underlying management theories encountered are policy analysis, strategic management, transition management, and adaptive management. These four management theories underlying different approaches in Adaptive Delta Management are connected to

  12. Accelerating the Convergence of Replica Exchange Simulations Using Gibbs Sampling and Adaptive Temperature Sets

    DOE PAGES

    Vogel, Thomas; Perez, Danny

    2015-08-28

    We recently introduced a novel replica-exchange scheme in which an individual replica can sample from states encountered by other replicas at any previous time by way of a global configuration database, enabling the fast propagation of relevant states through the whole ensemble of replicas. This mechanism depends on the knowledge of global thermodynamic functions which are measured during the simulation and not coupled to the heat bath temperatures driving the individual simulations. Therefore, this setup also allows for a continuous adaptation of the temperature set. In this paper, we will review the new scheme and demonstrate its capability. Furthermore, the method is particularly useful for the fast and reliable estimation of the microcanonical temperature T(U) or, equivalently, of the density of states g(U) over a wide range of energies.
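
    For orientation, the sketch below shows the standard Metropolis swap test that replica-exchange schemes such as this one generalize; the temperatures and energies are illustrative assumptions, and the paper's database-driven Gibbs sampling step is not reproduced here. Swapping the stored energies stands in for swapping the full configurations.

    import numpy as np

    rng = np.random.default_rng(0)

    def attempt_swap(energies, betas, i, j):
        """Metropolis acceptance for exchanging replicas i and j.

        Accept with probability min(1, exp((beta_i - beta_j)*(E_i - E_j))),
        which leaves the product of canonical distributions invariant.
        """
        delta = (betas[i] - betas[j]) * (energies[i] - energies[j])
        if np.log(rng.random()) < delta:
            energies[i], energies[j] = energies[j], energies[i]
            return True
        return False

    betas = 1.0 / np.array([1.0, 1.5, 2.2])   # inverse temperatures
    energies = np.array([-5.0, -4.2, -3.1])   # current replica energies
    print(attempt_swap(energies, betas, 0, 1))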

  13. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with an aim to improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated compared with Latin hypercube sampling (LHS) through analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty with a case study of flood forecasting uncertainty evaluation based on Xinanjiang model (XAJ) for Qing River reservoir, China. Results obtained demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) The former performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for the ɛ-NSGAII parameter sets; (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
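
    Since the study benchmarks against LHS within GLUE, the following minimal sketch generates a Latin hypercube sample and applies a toy GLUE-style behavioral filter. The likelihood score, threshold, and toy model are assumptions for illustration, not the XAJ setup.

    import numpy as np

    def latin_hypercube(n_samples, n_params, rng):
        """One stratified draw per stratum: sample within n equal strata of
        [0, 1] for each parameter, then shuffle strata independently."""
        u = (rng.random((n_samples, n_params))
             + np.arange(n_samples)[:, None]) / n_samples
        for k in range(n_params):
            rng.shuffle(u[:, k])
        return u

    rng = np.random.default_rng(1)
    samples = latin_hypercube(1000, 2, rng)

    # Toy GLUE step: keep "behavioral" parameter sets whose likelihood-like
    # score against a synthetic observation exceeds a chosen threshold.
    obs = 0.7
    scores = 1.0 - (samples[:, 0] * samples[:, 1] - obs) ** 2 / 0.25
    behavioral = samples[scores > 0.5]
    print(len(behavioral), "behavioral sets of", len(samples))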

  14. Adaptive trial designs: a review of barriers and opportunities

    PubMed Central

    2012-01-01

    Adaptive designs allow planned modifications based on data accumulating within a study. The promise of greater flexibility and efficiency stimulates increasing interest in adaptive designs from clinical, academic, and regulatory parties. When adaptive designs are used properly, efficiencies can include a smaller sample size, a more efficient treatment development process, and an increased chance of correctly answering the clinical question of interest. However, improper adaptations can lead to biased studies. A broad definition of adaptive designs allows for countless variations, which creates confusion as to the statistical validity and practical feasibility of many designs. Determining properties of a particular adaptive design requires careful consideration of the scientific context and statistical assumptions. We first review several adaptive designs that garner the most current interest. We focus on the design principles and research issues that lead to particular designs being appealing or unappealing in particular applications. We separately discuss exploratory and confirmatory stage designs in order to account for the differences in regulatory concerns. We include adaptive seamless designs, which combine stages in a unified approach. We also highlight a number of applied areas, such as comparative effectiveness research, that would benefit from the use of adaptive designs. Finally, we describe a number of current barriers and provide initial suggestions for overcoming them in order to promote wider use of appropriate adaptive designs. Given the breadth of the coverage all mathematical and most implementation details are omitted for the sake of brevity. However, the interested reader will find that we provide current references to focused reviews and original theoretical sources which lead to details of the current state of the art in theory and practice. PMID:22917111

  15. Adaptive measurement selection for progressive damage estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Wenfan; Kovvali, Narayan; Papandreou-Suppappola, Antonia; Chattopadhyay, Aditi; Peralta, Pedro

    2011-04-01

    Noise and interference in sensor measurements degrade the quality of data and have a negative impact on the performance of structural damage diagnosis systems. In this paper, a novel adaptive measurement screening approach is presented to automatically select the most informative measurements and use them intelligently for structural damage estimation. The method is implemented efficiently in a sequential Monte Carlo (SMC) setting using particle filtering. The noise suppression and improved damage estimation capability of the proposed method is demonstrated by an application to the problem of estimating progressive fatigue damage in an aluminum compact-tension (CT) sample using noisy PZT sensor measurements.
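
    The sketch below illustrates the sequential Monte Carlo backbone of such an estimator: particles carry hypotheses about a slowly growing damage state, are weighted by the likelihood of each noisy measurement, and are resampled when weights degenerate. The growth law, noise levels, and scalar measurement map are invented stand-ins, not the paper's PZT/CT setup, and the measurement-screening step is omitted.

    import numpy as np

    rng = np.random.default_rng(2)
    n_particles = 500
    particles = rng.normal(1.0, 0.1, n_particles)   # initial damage guesses
    weights = np.full(n_particles, 1.0 / n_particles)

    def step(particles, weights, measurement):
        # Predict: stochastic growth of the damage state.
        particles = particles * np.exp(rng.normal(0.02, 0.01, particles.size))
        # Update: weight particles by the likelihood of the noisy measurement.
        weights = weights * np.exp(-0.5 * ((measurement - particles) / 0.05) ** 2)
        weights /= weights.sum()
        # Resample when the effective sample size degenerates.
        if 1.0 / np.sum(weights ** 2) < particles.size / 2:
            idx = rng.choice(particles.size, particles.size, p=weights)
            particles = particles[idx]
            weights = np.full_like(weights, 1.0 / weights.size)
        return particles, weights

    true_size = 1.0
    for _ in range(20):
        true_size *= np.exp(0.02)
        z = true_size + rng.normal(0.0, 0.05)
        particles, weights = step(particles, weights, z)
    print("estimate:", np.sum(weights * particles), "truth:", true_size)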

  16. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also commented in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Beamspace fast fully adaptive brain source localization for limited data sequences

    NASA Astrophysics Data System (ADS)

    Ravan, Maryam

    2017-05-01

    In the electroencephalogram (EEG) or magnetoencephalogram (MEG) context, brain source localization methods that rely on estimating second order statistics often fail when the observations are taken over a short time interval, especially when the number of electrodes is large. To address this issue, in a previous study, we developed a multistage adaptive processing called the fast fully adaptive (FFA) approach that can significantly reduce the required sample support while still processing all available degrees of freedom (DOFs). This approach processes the observed data in stages through a decimation procedure. In this study, we introduce a new form of the FFA approach called beamspace FFA. We first divide the brain into smaller regions and transform the measured data from the source space to the beamspace in each region. The FFA approach is then applied to the beamspaced data of each region. The goal of this modification is to reduce sensitivity to correlations between sources in different brain regions. To demonstrate the performance of the beamspace FFA approach in the limited data scenario, simulation results with multiple deep and cortical sources as well as experimental results are compared with the regular FFA and the widely used FINE approaches. Both simulation and experimental results demonstrate that the beamspace FFA method can localize different types of multiple correlated brain sources at low signal-to-noise ratios more accurately with limited data.

  18. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems under event sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned in an aperiodic manner at the event sampled instants. After reviewing the NN approximation property with event sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event sampled context. The SE is viewed as a model and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
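
    The sketch below illustrates the basic event-triggered feedback pattern the paper builds on: the controller acts on the last transmitted state, and a new transmission occurs only when the gap to the current state exceeds a threshold. Here the trigger is a static relative threshold and the plant is a toy scalar system; the paper's adaptive, NN-weight-dependent trigger and dead-zone operator are not reproduced.

    def run(n_steps=50, sigma=0.3):
        x, x_sent, events = 1.0, 1.0, 0
        for _ in range(n_steps):
            u = -0.15 * x_sent                    # controller sees last sent state
            x = 0.95 * x + u                      # toy scalar plant
            if abs(x - x_sent) > sigma * abs(x):  # event-trigger condition
                x_sent = x                        # transmit and reset the error
                events += 1
        return x, events

    state, events = run()
    print(f"final state {state:.4f} using {events} transmissions in 50 steps")

    With these numbers the state decays while feedback is sent roughly every other step rather than at every step, which is the saving event triggering is meant to deliver.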

  19. Computer Adaptive Testing, Big Data and Algorithmic Approaches to Education

    ERIC Educational Resources Information Center

    Thompson, Greg

    2017-01-01

    This article critically considers the promise of computer adaptive testing (CAT) and digital data to provide better and quicker data that will improve the quality, efficiency and effectiveness of schooling. In particular, it uses the case of the Australian NAPLAN test that will become an online, adaptive test from 2016. The article argues that…

  20. A scale self-adapting segmentation approach and knowledge transfer for automatically updating land use/cover change databases using high spatial resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo

    2018-07-01

    Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially for coastal areas that connect the land and coast and that tend to change frequently. Many object-based change detection methods have been proposed, especially those combining historical LUCC with HSRI. However, the scale parameter(s) used to segment the serial temporal images, which directly determine the average object size, are hard to choose without expert intervention, and the samples transferred from historical LUCC also need expert intervention to avoid insufficient or wrong samples. With respect to choosing the scale parameter(s), a Scale Self-Adapting Segmentation (SSAS) approach, based on exponential sampling of the scale parameter and location of the local maximum of a weighted local variance, is proposed to address the scale selection problem when segmenting images constrained by LUCC for detecting changes. With respect to transferring samples, Knowledge Transfer (KT), in which a classifier trained on historical images with LUCC is applied to the classification of updated images, is also proposed. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images more effectively without the intervention of experts. (2) KT can also reach the maximum accuracy of sample transfer without expert intervention. The strategy SSAS + KT would be a good choice if the temporal historical image and LUCC match, and the historical image and updated image are obtained from the same source.
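
    The following minimal sketch conveys the scale-scanning idea only: candidate scales are sampled on an exponential grid, a weighted local variance is computed at each scale, and the scale where that curve changes fastest is kept. Gaussian smoothing stands in for a real segmenter, and the image, window size, and selection rule are toy assumptions, not the authors' SSAS implementation.

    import numpy as np
    from scipy.ndimage import gaussian_filter, uniform_filter

    rng = np.random.default_rng(5)
    image = rng.random((128, 128))
    image[32:96, 32:96] += 1.0                  # one coarse "land-cover" object

    def mean_local_variance(img, scale):
        # Gaussian smoothing stands in for segmentation at this scale.
        smooth = gaussian_filter(img, scale)
        mean = uniform_filter(smooth, size=9)
        return float(np.mean(uniform_filter(smooth ** 2, size=9) - mean ** 2))

    scales = 2.0 ** np.arange(7)                # exponential sampling: 1 ... 64
    wlv = np.array([mean_local_variance(image, s) for s in scales])
    drop = -np.diff(wlv) / wlv[:-1]             # relative drop between scales
    print("scale with fastest variance drop:", scales[1:][np.argmax(drop)])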

  1. Speededness and Adaptive Testing

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Xiong, Xinhui

    2013-01-01

    Two simple constraints on the item parameters in a response-time model are proposed to control the speededness of an adaptive test. As the constraints are additive, they can easily be included in the constraint set for a shadow-test approach (STA) to adaptive testing. Alternatively, a simple heuristic is presented to control speededness in plain…

  2. Pricing and reimbursement experiences and insights in the European Union and the United States: Lessons learned to approach adaptive payer pathways.

    PubMed

    Faulkner, S D; Lee, M; Qin, D; Morrell, L; Xoxi, E; Sammarco, A; Cammarata, S; Russo, P; Pani, L; Barker, R

    2016-12-01

    Earlier patient access to beneficial therapeutics that address unmet need is one of the main requirements of innovation in global healthcare systems already burdened by unsustainable budgets. "Adaptive pathways" encompass earlier cross-stakeholder engagement, regulatory tools, and iterative evidence generation through the life cycle of the medicinal product. A key enabler of earlier patient access is through more flexible and adaptive payer approaches to pricing and reimbursement that reflect the emerging evidence generated. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  3. A new adaptive multiple modelling approach for non-linear and non-stationary systems

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Gong, Yu; Hong, Xia

    2016-07-01

    This paper proposes a novel adaptive multiple modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models which are all linear. With data available in an online fashion, the performance of all candidate sub-models are monitored based on the most recent data window, and M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-model are adapted via the recursive least square (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error based on a recent data window, and apply the sum to one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step, the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is the best. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
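
    The recursive least squares update at the core of the sub-model adaptation is standard, and a minimal sketch of it follows. The forgetting factor and toy data stream are assumptions, and the sub-model selection and convex combination logic of the paper are omitted.

    import numpy as np

    class RLSModel:
        def __init__(self, n_features, forgetting=0.98):
            self.w = np.zeros(n_features)
            self.P = np.eye(n_features) * 1e3   # inverse correlation estimate
            self.lam = forgetting

        def update(self, x, y):
            """Standard exponentially weighted RLS step."""
            Px = self.P @ x
            k = Px / (self.lam + x @ Px)        # gain vector
            e = y - self.w @ x                  # a priori prediction error
            self.w += k * e
            self.P = (self.P - np.outer(k, Px)) / self.lam
            return e

    rng = np.random.default_rng(3)
    model = RLSModel(2)
    for _ in range(200):
        x = rng.normal(size=2)
        y = 1.5 * x[0] - 0.7 * x[1] + rng.normal(0.0, 0.01)
        model.update(x, y)
    print(model.w)   # should approach [1.5, -0.7]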

  4. 120nm resolution in thick samples with structured illumination and adaptive optics

    NASA Astrophysics Data System (ADS)

    Thomas, Benjamin; Sloan, Megan; Wolstenholme, Adrian J.; Kner, Peter

    2014-03-01

    Linear Structured Illumination Microscopy (SIM) provides a two-fold increase over the diffraction-limited resolution. SIM produces excellent images with 120nm resolution in tissue culture cells in two and three dimensions. For SIM to work correctly, the point spread function (PSF) and optical transfer function (OTF) must be known, and, ideally, should be unaberrated. When imaging through thick samples, aberrations will be introduced into the optical system which will reduce the peak intensity and increase the width of the PSF. This will lead to reduced resolution and artifacts in SIM images. Adaptive optics can be used to correct the optical wavefront, restoring the PSF to its unaberrated state, and AO has been used in several types of fluorescence microscopy. We demonstrate that AO can be used with SIM to achieve 120nm resolution through 25 μm of tissue by imaging through the full thickness of an adult C. elegans roundworm. The aberrations can be corrected over a 25μm × 45μm field of view with one wavefront correction setting, demonstrating that AO can be used effectively with widefield superresolution techniques.

  5. Self-Avoiding Walks Over Adaptive Triangular Grids

    NASA Technical Reports Server (NTRS)

    Heber, Gerd; Biswas, Rupak; Gao, Guang R.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Space-filling curves are a popular approach based on a geometric embedding for linearizing computational meshes. We present a new O(n log n) combinatorial algorithm for constructing a self-avoiding walk through a two-dimensional mesh containing n triangles. We show that for hierarchical adaptive meshes, the algorithm can be locally adapted and easily parallelized by taking advantage of the regularity of the refinement rules. The proposed approach should be very useful in the runtime partitioning and load balancing of adaptive unstructured grids.

  6. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    NASA Astrophysics Data System (ADS)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economic and efficient tools for exploring and sampling the ocean for data assimilation; however, there is an energy limitation to such vehicles, and thus effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore for the design of a set of field experiments to acquire useful data for model calibration and data assimilation. The design process of our experiments relied on the oceanography forecast, including the current speed, its gradient, and vorticity in a given region of interest for which permits for field experiments could be obtained and for time intervals that correspond to strong tidal currents. Based on these maps, resources available to our experimental team, including Autonomous Surface Craft (ASC), are allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and field experiments conducted in January 2009.

  7. Adaptive Optical System for Retina Imaging Approaches Clinic Applications

    NASA Astrophysics Data System (ADS)

    Ling, N.; Zhang, Y.; Rao, X.; Wang, C.; Hu, Y.; Jiang, W.; Jiang, C.

    We presented "A small adaptive optical system on table for human retinal imaging" at the 3rd Workshop on Adaptive Optics for Industry and Medicine. In that system, a small 19-element deformable mirror was used as the wavefront correction element, and high-resolution images of photoreceptors and capillaries of the human retina were obtained. In the past two years, a new adaptive optical system for human retina imaging has been developed on the basis of that system. The wavefront correction element is a newly developed 37-element deformable mirror, and several modifications have been adopted for ease of operation. Experiments for different imaging wavelengths and axial positions were conducted, and mosaic pictures of photoreceptors and capillaries were obtained. 100 normal and abnormal eyes of different ages have been inspected. To our knowledge, this is the first report of such detailed capillary distribution images, covering a ±3° by ±3° field around the fovea. Preliminary very early diagnosis experiments have been tried in the laboratory, and the system is planned to move to the hospital for clinical experiments.

  8. Cultural adaptation and psychometric properties of the family questionnaire in a Brazilian sample of relatives of schizophrenia outpatients.

    PubMed

    Zanetti, Ana C G; Wiedemann, Georg; Dantas, Rosana A S; Hayashida, Miyeko; de Azevedo-Marques, João M; Galera, Sueli A F

    2013-06-01

    To evaluate the internal reliability and validity of the Brazilian Portuguese version of the Family Questionnaire among families of schizophrenia outpatients. The main studies about the family environment of schizophrenia patients are related to the concept of Expressed Emotion. There is currently no instrument to evaluate this concept in Brazil that is easily applicable and comparable with studies from other countries. Methodological and cross-sectional research design. A convenience sample of 130 relatives of schizophrenia outpatients was selected. The translation and cultural adaptation of the instrument involved experts in mental health and experts in the German language and included back translation, semantic evaluation of items and pretesting of the instrument with 30 relatives of schizophrenia outpatients. The psychometric properties of the instrument were studied with another 100 relatives, which fulfilled the requirements for the Brazilian Portuguese version of the instrument. The psychometric properties of the instrument were assessed by construct validity (using principal component analysis, comparisons between distinct groups, and convergent validity with Antonovsky's Sense of Coherence Scale) and reliability (checking the internal consistency of its items and its test-retest reproducibility). The principal component analysis confirmed dimensionality patterns that were comparable between the original and adapted versions. Two domains of the instrument, critical comments and emotional over-involvement, had moderate and significant correlations, respectively, with Antonovsky's Sense of Coherence Scale, appropriate values of Cronbach's alpha, and strong and significant correlations, respectively, in test-retest reproducibility. We observed significant differences between distinct groups of parents in the category of emotional over-involvement. We conclude that the Portuguese-adapted version of the Family Questionnaire is valid and reliable for the

  9. Stable Direct Adaptive Control of Linear Infinite-dimensional Systems Using a Command Generator Tracker Approach

    NASA Technical Reports Server (NTRS)

    Balas, M. J.; Kaufman, H.; Wen, J.

    1985-01-01

    A command generator tracker approach to model following control of linear distributed parameter systems (DPS) whose dynamics are described on infinite-dimensional Hilbert spaces is presented. This method generates finite-dimensional controllers capable of exponentially stable tracking of the reference trajectories when certain ideal trajectories are known to exist for the open-loop DPS; we present conditions for the existence of these ideal trajectories. An adaptive version of this type of controller is also presented and shown to achieve (in some cases, asymptotically) stable finite-dimensional control of the infinite-dimensional DPS.

  10. Adaptive and Adaptable Automation Design: A Critical Review of the Literature and Recommendations for Future Research

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kaber, David B.

    2006-01-01

    This report presents a review of literature on approaches to adaptive and adaptable task/function allocation and adaptive interface technologies for effective human management of complex systems that are likely to be issues for the Next Generation Air Transportation System, and a focus of research under the Aviation Safety Program, Integrated Intelligent Flight Deck Project. Contemporary literature retrieved from an online database search is summarized and integrated. The major topics include the effects of delegation-type, adaptable automation on human performance, workload and situation awareness, the effectiveness of various automation invocation philosophies and strategies to function allocation in adaptive systems, and the role of user modeling in adaptive interface design and the performance implications of adaptive interface technology.

  11. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    PubMed

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
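
    A minimal sketch of an LQAS classification rule of the kind described follows: examine n children per commune and flag the commune as a priority if more than d show active trachoma. The sample size, decision threshold, and prevalence cut-offs are illustrative assumptions, not the values used in Viet Nam; the binomial tail probabilities quantify the two misclassification risks.

    from scipy.stats import binom

    n, d = 50, 7                 # sample size and decision threshold
    p_low, p_high = 0.10, 0.25   # "acceptable" vs "priority" prevalence

    # Probability of wrongly flagging a low-prevalence commune
    alpha = 1.0 - binom.cdf(d, n, p_low)
    # Probability of missing a high-prevalence commune
    beta = binom.cdf(d, n, p_high)
    print(f"flag if > {d}/{n} cases: alpha = {alpha:.3f}, beta = {beta:.3f}")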

  12. Comprehensive multiphase NMR spectroscopy: Basic experimental approaches to differentiate phases in heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Courtier-Murias, Denis; Farooq, Hashim; Masoom, Hussain; Botana, Adolfo; Soong, Ronald; Longstaffe, James G.; Simpson, Myrna J.; Maas, Werner E.; Fey, Michael; Andrew, Brian; Struppe, Jochem; Hutchins, Howard; Krishnamurthy, Sridevi; Kumar, Rajeev; Monette, Martine; Stronks, Henry J.; Hume, Alan; Simpson, André J.

    2012-04-01

    Heterogeneous samples, such as soils, sediments, plants, tissues, foods and organisms, often contain liquid-, gel- and solid-like phases and it is the synergism between these phases that determines their environmental and biological properties. Studying each phase separately can perturb the sample, removing important structural information such as chemical interactions at the gel-solid interface, kinetics across boundaries and conformation in the natural state. In order to overcome these limitations a Comprehensive Multiphase-Nuclear Magnetic Resonance (CMP-NMR) probe has been developed, and is introduced here, that permits all bonds in all phases to be studied and differentiated in whole unaltered natural samples. The CMP-NMR probe is built with high power circuitry, Magic Angle Spinning (MAS), is fitted with a lock channel, pulse field gradients, and is fully susceptibility matched. Consequently, this novel NMR probe has to cover all HR-MAS aspects without compromising power handling to permit the full range of solution-, gel- and solid-state experiments available today. Using this technology, both structures and interactions can be studied independently in each phase as well as transfer/interactions between phases within a heterogeneous sample. This paper outlines some basic experimental approaches using a model heterogeneous multiphase sample containing liquid-, gel- and solid-like components in water, yielding separate 1H and 13C spectra for the different phases. In addition, 19F performance is also addressed. To illustrate the capability of 19F NMR, soil samples containing two different contaminants are used, demonstrating a preliminary, but real-world application of this technology. This novel NMR approach possesses a great potential for the in situ study of natural samples in their native state.

  13. Adaptive management of natural resources-framework and issues

    USGS Publications Warehouse

    Williams, B.K.

    2011-01-01

    Adaptive management, an approach for simultaneously managing and learning about natural resources, has been around for several decades. Interest in adaptive decision making has grown steadily over that time, and by now many in natural resources conservation claim that adaptive management is the approach they use in meeting their resource management responsibilities. Yet there remains considerable ambiguity about what adaptive management actually is, and how it is to be implemented by practitioners. The objective of this paper is to present a framework and conditions for adaptive decision making, and discuss some important challenges in its application. Adaptive management is described as a two-phase process of deliberative and iterative phases, which are implemented sequentially over the timeframe of an application. Key elements, processes, and issues in adaptive decision making are highlighted in terms of this framework. Special emphasis is given to the question of geographic scale, the difficulties presented by non-stationarity, and organizational challenges in implementing adaptive management. © 2010.

  14. Statistical approaches to account for false-positive errors in environmental DNA samples.

    PubMed

    Lahoz-Monfort, José J; Guillera-Arroita, Gurutzeta; Tingley, Reid

    2016-05-01

    Environmental DNA (eDNA) sampling is prone to both false-positive and false-negative errors. We review statistical methods to account for such errors in the analysis of eDNA data and use simulations to compare the performance of different modelling approaches. Our simulations illustrate that even low false-positive rates can produce biased estimates of occupancy and detectability. We further show that removing or classifying single PCR detections in an ad hoc manner under the suspicion that such records represent false positives, as sometimes advocated in the eDNA literature, also results in biased estimation of occupancy, detectability and false-positive rates. We advocate alternative approaches to account for false-positive errors that rely on prior information, or the collection of ancillary detection data at a subset of sites using a sampling method that is not prone to false-positive errors. We illustrate the advantages of these approaches over ad hoc classifications of detections and provide practical advice and code for fitting these models in maximum likelihood and Bayesian frameworks. Given the severe bias induced by false-negative and false-positive errors, the methods presented here should be more routinely adopted in eDNA studies. © 2015 John Wiley & Sons Ltd.
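
    The two-state mixture such analyses build on can be written down and fitted directly. In the minimal sketch below, each site is surveyed K times; detections arise with probability p11 at occupied sites and with false-positive probability p10 at unoccupied ones, and the three parameters are recovered by maximum likelihood. The simulated data, starting values, and optimizer choice are assumptions; the priors and ancillary-data extensions advocated in the paper are omitted.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import binom

    rng = np.random.default_rng(4)
    psi_true, p11_true, p10_true, K = 0.6, 0.8, 0.05, 5
    occupied = rng.random(200) < psi_true
    y = np.where(occupied, rng.binomial(K, p11_true, 200),
                           rng.binomial(K, p10_true, 200))

    def neg_log_lik(theta):
        # Logit-scale parameters keep the optimizer unconstrained.
        psi, p11, p10 = 1.0 / (1.0 + np.exp(-np.asarray(theta)))
        lik = psi * binom.pmf(y, K, p11) + (1 - psi) * binom.pmf(y, K, p10)
        return -np.sum(np.log(lik))

    fit = minimize(neg_log_lik, x0=[0.0, 1.0, -2.0], method="Nelder-Mead")
    print(1.0 / (1.0 + np.exp(-fit.x)))   # roughly recovers (0.6, 0.8, 0.05)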

  15. Environmental impacts of climate change adaptation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enríquez-de-Salamanca, Álvaro, E-mail: aenriquez@draba.org; Díaz-Sierra, Rubén, E-mail: sierra@dfmf.uned.es; Martín-Aranda, Rosa M., E-mail: rmartin@ccia.uned.es

    Climate change adaptation reduces adverse effects of climate change but may also have undesirable environmental impacts. However, these impacts are as yet poorly defined and analysed in the existing literature. To address this knowledge gap, we reviewed the literature to unveil the relationship between climate change adaptation and environmental impact assessment, and the degree to which environmental impacts are included in climate change adaptation theory and practice. Our literature review showed that technical, social and economic perspectives on climate change adaptation receive much more attention than the environmental perspective. The scarce interest in the environmental impacts of adaptation may be attributed to (1) an excessive sectoral approach, with dominance of non-environmental perspectives, (2) greater interest in mitigation and direct climate change impacts rather than in adaptation impacts, (3) a tendency to consider adaptation as inherently good, and (4) subjective/preconceived notions on which measures are good or bad, without a comprehensive assessment. Environmental Assessment (EA) has a long established history as an effective tool to include environment into decision-making, although it does not yet guarantee a proper assessment of adaptation, because it is still possible to postpone or even circumvent the processes of assessing the impacts of climate adaptation. Our results suggest that there is a need to address adaptation proactively by including it in EA, to update current policy frameworks, and to demand robust and reliable evaluation of alternatives. Only through the full EA of adaptation measures can we improve our understanding of the primary and secondary impacts of adaptation to global environmental change. - Highlights: • Climate change adaptation may have undesirable environmental impacts. • The impacts of adaptation are as yet poorly analysed in the literature. • There is an excessive sectoral approach to adaptation

  16. Force-induced bone growth and adaptation: A system theoretical approach to understanding bone mechanotransduction

    NASA Astrophysics Data System (ADS)

    Maldonado, Solvey; Findeisen, Rolf

    2010-06-01

    The modeling, analysis, and design of treatment therapies for bone disorders based on the paradigm of force-induced bone growth and adaptation is a challenging task. Mathematical models provide, in comparison to clinical, medical and biological approaches, a structured alternative framework to understand the concurrent effects of the multiple factors involved in bone remodeling. To date, there are few mathematical models describing these complex interactions, and the resulting models are complex and difficult to analyze, due to the strong nonlinearities appearing in the equations, the wide range of variability of the states, and the uncertainties in parameters. In this work, we focus on analyzing the effects of changes in model structure and parameter/input variations on the overall steady-state behavior using systems theoretical methods. Based on a briefly reviewed existing model that describes force-induced bone adaptation, the main objective of this work is to analyze the stationary behavior and to identify plausible treatment targets for remodeling related bone disorders. Identifying plausible targets can help in the development of optimal treatments combining both physical activity and drug-medication. Such treatments help to improve/maintain/restore bone strength, which deteriorates under bone disorder conditions, such as estrogen deficiency.
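
    The kind of stationary analysis described can be illustrated on a deliberately simple surrogate: find the steady state of a small nonlinear system and check its local stability from the Jacobian eigenvalues. The two-state model below is a toy assumption for illustration only, not the bone-remodeling model the paper reviews.

    import numpy as np
    from scipy.optimize import fsolve

    # Toy model: x = bone density, y = resorption activity.
    def rhs(state, load=1.0):
        x, y = state
        dx = load / (1.0 + y) - 0.5 * x      # formation damped by resorption
        dy = x - y                           # resorption tracks density
        return np.array([dx, dy])

    steady = fsolve(rhs, x0=[1.0, 1.0])
    # Numerical Jacobian to check local stability of the steady state.
    eps, J = 1e-6, np.zeros((2, 2))
    for j in range(2):
        d = np.zeros(2)
        d[j] = eps
        J[:, j] = (rhs(steady + d) - rhs(steady - d)) / (2 * eps)
    print("steady state:", steady, "eigenvalues:", np.linalg.eigvals(J))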

  17. An adaptive CBPR approach to create weight management materials for a school-based health center intervention.

    PubMed

    Sussman, Andrew L; Montoya, Carolyn; Werder, Olaf; Davis, Sally; Wallerstein, Nina; Kong, Alberta S

    2013-01-01

    From our previous clinical work with overweight/obese youth, we identified the need for research to create an effective weight management intervention to address the growing prevalence of adolescent metabolic syndrome. Formative assessment through an adaptive community-based participatory research (CBPR) approach was conducted toward the development of a nutrition and physical activity DVD and a clinician toolkit for a school-based health center (SBHC) weight management intervention. We first conducted parent and adolescent interviews on views and experiences about obesity while convening a community advisory council (CAC) recruited from two participating urban New Mexico high schools. Thematic findings from the interviews were analyzed with the CAC to develop culturally and developmentally appropriate intervention materials. Themes from the parent and adolescent interviews included general barriers/challenges, factors influencing motivation, and change facilitators. The CAC and university-based research team reached consensus on the final content of nutrition and physical activity topics to produce a DVD and clinician toolkit through six monthly sessions. These materials, used in the SBHC intervention, resulted in a greater reduction in body mass index compared to adolescents receiving standard care. Formative assessment using an adaptive CBPR approach resulted in the creation of culturally and age appropriate weight reduction materials that were acceptable to study participants. This trial is registered with ClinicalTrials.gov NCT00841334.

  18. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a Complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  19. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    PubMed

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a Complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be effective for modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.
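
    The flavor of such an exploratory agent-based model can be conveyed in a few lines: workstation agents accrue energy use each tick, and a usage policy that powers down idle machines lowers the aggregate footprint. All numbers and the policy rule below are illustrative assumptions, not the architecture developed in the study.

    import random

    random.seed(6)

    class Workstation:
        def __init__(self):
            self.idle = False

        def tick(self, policy_on):
            self.idle = random.random() < 0.6       # stochastic usage pattern
            if self.idle and policy_on:
                return 5.0                           # powered-down idle draw
            return 40.0 if self.idle else 120.0      # idle vs active draw (W)

    def footprint(policy_on, n_agents=100, ticks=24):
        agents = [Workstation() for _ in range(n_agents)]
        return sum(a.tick(policy_on) for a in agents for _ in range(ticks))

    print("no policy  :", footprint(False))
    print("with policy:", footprint(True))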

  20. Computerized Adaptive Personality Testing: A Review and Illustration With the MMPI-2 Computerized Adaptive Version.

    ERIC Educational Resources Information Center

    Forbey, Johnathan D.; Ben-Porath, Yossef S.

    2007-01-01

    Computerized adaptive testing in personality assessment can improve efficiency by significantly reducing the number of items administered to answer an assessment question. Two approaches have been explored for adaptive testing in computerized personality assessment: item response theory and the countdown method. In this article, the authors…

  1. Adaptive sampling of information in perceptual decision-making.

    PubMed

    Cassey, Thomas C; Evens, David R; Bogacz, Rafal; Marshall, James A R; Ludwig, Casimir J H

    2013-01-01

    In many perceptual and cognitive decision-making problems, humans sample multiple noisy information sources serially, and integrate the sampled information to make an overall decision. We derive the optimal decision procedure for two-alternative choice tasks in which the different options are sampled one at a time, sources vary in the quality of the information they provide, and the available time is fixed. To maximize accuracy, the optimal observer allocates time to sampling different information sources in proportion to their noise levels. We tested human observers in a corresponding perceptual decision-making task. Observers compared the direction of two random dot motion patterns that were triggered only when fixated. Observers allocated more time to the noisier pattern, in a manner that correlated with their sensory uncertainty about the direction of the patterns. There were several differences between the optimal observer predictions and human behaviour. These differences point to a number of other factors, beyond the quality of the currently available sources of information, that influence the sampling strategy.
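
    The allocation rule can be checked numerically. If sampling source i for time t_i yields an estimate with variance sigma_i^2 / t_i, the variance of the estimated difference, sigma_1^2/t_1 + sigma_2^2/t_2 with t_1 + t_2 = T, is minimized when t_i is proportional to sigma_i. The sketch below verifies this for two assumed noise levels; the numbers are illustrative, not the experiment's parameters.

    import numpy as np

    sigma = np.array([1.0, 2.0])   # noise standard deviations of the sources
    T = 10.0                       # fixed total sampling time

    def diff_variance(t1):
        return sigma[0] ** 2 / t1 + sigma[1] ** 2 / (T - t1)

    grid = np.linspace(0.5, 9.5, 1001)
    best = grid[np.argmin([diff_variance(t) for t in grid])]
    print("numeric optimum t1 :", best)                       # about 3.33
    print("proportional rule  :", T * sigma[0] / sigma.sum())  # 10 * 1/3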

  2. Separation of high-resolution samples of overlapping latent fingerprints using relaxation labeling

    NASA Astrophysics Data System (ADS)

    Qian, Kun; Schott, Maik; Schöne, Werner; Hildebrandt, Mario

    2012-06-01

    The analysis of latent fingerprint patterns generally requires clearly recognizable friction ridge patterns. Currently, overlapping latent fingerprints pose a major problem for traditional crime scene investigation, because such fingerprints usually have very similar optical properties; consequently, distinguishing two or more overlapping fingerprints from each other is not trivial. While it is possible to employ chemical imaging to separate overlapping fingerprints, the corresponding methods require sophisticated acquisition procedures and are not compatible with conventional forensic fingerprint data. A separation technique based purely on the local orientation of the ridge patterns of overlapping fingerprints was proposed by Chen et al. and quantitatively evaluated using off-the-shelf fingerprint matching software, mostly with artificially composed overlapping fingerprint samples, a choice motivated by the scarce availability of authentic test samples. The work described in this paper adapts the approach of Chen et al. for application to authentic high-resolution fingerprint samples acquired by a contactless measurement device based on a Chromatic White Light (CWL) sensor. An evaluation of the work is given, with an analysis of all adapted parameters. Additionally, the separability requirement proposed by Chen et al. is evaluated for practical feasibility. Our results show promising tendencies for the application of this approach to high-resolution data, yet the separability requirement still poses a further challenge.
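
    For readers unfamiliar with relaxation labeling, the generic Rosenfeld-Hummel-Zucker update is sketched below in Python: each orientation element holds a probability over fingerprint labels, iteratively reweighted by neighbor compatibilities. This is the textbook form, not the exact update of Chen et al.

        import numpy as np

        def relaxation_labeling(P, R, n_iter=50):
            """P: (n_sites, n_labels) initial label probabilities.
            R: (n_sites, n_sites, n_labels, n_labels) compatibilities
            in [-1, 1]. Returns the relaxed probabilities."""
            for _ in range(n_iter):
                Q = np.einsum('ijlm,jm->il', R, P)       # support from neighbors
                P = np.clip(P * (1.0 + Q), 1e-12, None)  # reinforce compatible labels
                P /= P.sum(axis=1, keepdims=True)        # renormalize per site
            return P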

  3. Estimating Sampling Selection Bias in Human Genetics: A Phenomenological Approach

    PubMed Central

    Risso, Davide; Taglioli, Luca; De Iasio, Sergio; Gueresi, Paola; Alfani, Guido; Nelli, Sergio; Rossi, Paolo; Paoli, Giorgio; Tofanelli, Sergio

    2015-01-01

    This research is the first empirical attempt to calculate the various components of the hidden bias associated with the sampling strategies routinely used in human genetics, with special reference to surname-based strategies. We reconstructed surname distributions of 26 Italian communities with different demographic features across the last six centuries (years 1447–2001). The degree of overlap between "reference founding core" distributions and the distributions obtained from sampling the present-day communities by probabilistic and selective methods was quantified under different conditions and models. When taking into account only one individual per surname (low kinship model), the average discrepancy was 59.5%, with a peak of 84% by random sampling. When multiple individuals per surname were considered (high kinship model), the discrepancy decreased by 8–30% at the cost of a larger variance. Criteria aimed at maximizing locally-spread patrilineages and long-term residency appeared to be affected by recent gene flows much more than expected. Selection of the more frequent family names following low kinship criteria proved to be a suitable approach only for historically stable communities. In any other case true random sampling, despite its high variance, did not return more biased estimates than other selective methods. Our results indicate that the sampling of individuals bearing historically documented surnames (founders' method) should be applied, especially when studying the male-specific genome, to prevent an over-stratification of ancient and recent genetic components that heavily biases inferences and statistics. PMID:26452043

  4. Estimating Sampling Selection Bias in Human Genetics: A Phenomenological Approach.

    PubMed

    Risso, Davide; Taglioli, Luca; De Iasio, Sergio; Gueresi, Paola; Alfani, Guido; Nelli, Sergio; Rossi, Paolo; Paoli, Giorgio; Tofanelli, Sergio

    2015-01-01

    This research is the first empirical attempt to calculate the various components of the hidden bias associated with the sampling strategies routinely used in human genetics, with special reference to surname-based strategies. We reconstructed surname distributions of 26 Italian communities with different demographic features across the last six centuries (years 1447-2001). The degree of overlap between "reference founding core" distributions and the distributions obtained from sampling the present-day communities by probabilistic and selective methods was quantified under different conditions and models. When taking into account only one individual per surname (low kinship model), the average discrepancy was 59.5%, with a peak of 84% by random sampling. When multiple individuals per surname were considered (high kinship model), the discrepancy decreased by 8-30% at the cost of a larger variance. Criteria aimed at maximizing locally-spread patrilineages and long-term residency appeared to be affected by recent gene flows much more than expected. Selection of the more frequent family names following low kinship criteria proved to be a suitable approach only for historically stable communities. In any other case true random sampling, despite its high variance, did not return more biased estimates than other selective methods. Our results indicate that the sampling of individuals bearing historically documented surnames (founders' method) should be applied, especially when studying the male-specific genome, to prevent an over-stratification of ancient and recent genetic components that heavily biases inferences and statistics.
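
    The headline discrepancy figures can be reproduced in spirit with a simple overlap measure between surname distributions; the Python sketch below uses a total-variation-style percentage, which is an illustrative stand-in for the paper's metric, not its exact definition.

        from collections import Counter

        def discrepancy(reference, sample):
            """Percent non-overlap between a reference 'founding core'
            surname distribution and a sampled one."""
            ref, smp = Counter(reference), Counter(sample)
            n_ref, n_smp = sum(ref.values()), sum(smp.values())
            overlap = sum(min(ref[s] / n_ref, smp[s] / n_smp)
                          for s in set(ref) | set(smp))
            return 100.0 * (1.0 - overlap)

        print(discrepancy(["Rossi", "Rossi", "Bianchi"], ["Rossi", "Verdi"]))  # 50.0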

  5. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics.

    PubMed

    Feng, Shu; Gale, Michael J; Fay, Jonathan D; Faridi, Ambar; Titus, Hope E; Garg, Anupam K; Michaels, Keith V; Erker, Laura R; Peters, Dawn; Smith, Travis B; Pennesi, Mark E

    2015-09-01

    To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population.

  6. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose: To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods: Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results: Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions: We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414
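
    The 50 × 50-μm windowed count underlying all three protocols is straightforward; the Python sketch below shows the windowed density and the peak-density variant, assuming cone coordinates in micrometers as the input format (a minimal sketch, not the study's full pipeline).

        import numpy as np

        def cone_density(coords_um, center_um, win_um=50.0):
            """Cones per mm^2 inside a win_um x win_um window centered
            at center_um; coords_um holds (x, y) positions in um."""
            coords = np.asarray(coords_um, dtype=float)
            inside = np.all(np.abs(coords - np.asarray(center_um)) <= win_um / 2.0,
                            axis=1)
            return inside.sum() / (win_um / 1000.0) ** 2   # window area in mm^2

        def peak_density(coords_um, centers_um, win_um=50.0):
            # Peak-density method: maximum windowed density over candidates.
            return max(cone_density(coords_um, c, win_um) for c in centers_um)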

  7. An approach enabling adaptive FEC for OFDM in fiber-VLLC system

    NASA Astrophysics Data System (ADS)

    Wei, Yiran; He, Jing; Deng, Rui; Shi, Jin; Chen, Shenghai; Chen, Lin

    2017-12-01

    In this paper, we propose an orthogonal circulant matrix transform (OCT)-based adaptive frame-level forward error correction (FEC) scheme for a fiber-visible laser light communication (VLLC) system and demonstrate it experimentally with Reed-Solomon (RS) codes. In this method, no extra bits are spent on adaptation signaling apart from the training sequence (TS), which is simultaneously used for synchronization and channel estimation. Therefore, RS coding can be performed adaptively frame by frame, driven by feedback of the last received codeword error rate (CER) estimated from the TSs of the previous few OFDM frames. In addition, the experimental results show that over 20 km of standard single-mode fiber (SSMF) and 8 m of visible light transmission, the overhead of RS codewords is up to 14.12% lower than that of conventional adaptive subcarrier-RS-code based 16-QAM OFDM at a bit error rate (BER) of 10^-5.
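
    The frame-by-frame adaptation reduces to a lookup from fed-back CER to code strength; the Python sketch below shows the shape of such a rule, with thresholds and parity levels invented for illustration rather than taken from the experiment.

        def choose_rs_parity(cer, thresholds=(1e-4, 1e-3, 1e-2),
                             parities=(8, 16, 32, 64)):
            """Map the previous frames' codeword error rate (CER) to
            the number of RS parity symbols for the next frame."""
            for limit, parity in zip(thresholds, parities):
                if cer < limit:
                    return parity
            return parities[-1]

        print(choose_rs_parity(2e-5))   # clean channel -> light coding (8)
        print(choose_rs_parity(5e-3))   # noisy channel -> heavy coding (32)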

  8. Adaptive management of rangeland systems

    USGS Publications Warehouse

    Allen, Craig R.; Angeler, David G.; Fontaine, Joseph J.; Garmestani, Ahjond S.; Hart, Noelle M.; Pope, Kevin L.; Twidwell, Dirac

    2017-01-01

    Adaptive management is an approach to natural resource management that uses structured learning to reduce uncertainties for the improvement of management over time. The origins of adaptive management are linked to ideas of resilience theory and complex systems. Rangeland management is particularly well suited for the application of adaptive management, having sufficient controllability and reducible uncertainties. Adaptive management applies the tools of structured decision making and requires monitoring, evaluation, and adjustment of management. Adaptive governance, involving sharing of power and knowledge among relevant stakeholders, is often required to address conflict situations. Natural resource laws and regulations can present a barrier to adaptive management when requirements for legal certainty are met with environmental uncertainty. However, adaptive management is possible, as illustrated by two cases presented in this chapter. Despite challenges and limitations, when applied appropriately adaptive management leads to improved management through structured learning, and rangeland management is an area in which adaptive management shows promise and should be further explored.

  9. Adaptive output feedback control of flexible-joint robots using neural networks: dynamic surface design approach.

    PubMed

    Yoo, Sung Jin; Park, Jin Bae; Choi, Yoon Ho

    2008-10-01

    In this paper, we propose a new robust output feedback control approach for flexible-joint electrically driven (FJED) robots via the observer dynamic surface design technique. The proposed method requires only position measurements of the FJED robots. To estimate the link and actuator velocity information of the FJED robots with model uncertainties, we develop an adaptive observer using self-recurrent wavelet neural networks (SRWNNs). The SRWNNs are used to approximate model uncertainties in both robot (link) dynamics and actuator dynamics, and all their weights are trained online. Based on the designed observer, the link position tracking controller using the estimated states is derived from the dynamic surface design procedure. Therefore, the proposed controller can be designed more simply than an observer backstepping controller. From the Lyapunov stability analysis, it is shown that all signals in the closed-loop adaptive system are uniformly ultimately bounded. Finally, simulation results for a three-link FJED robot are presented to validate the position tracking performance and robustness of the proposed control system against payload uncertainties and external disturbances.

  10. Adaptation in Living Systems

    NASA Astrophysics Data System (ADS)

    Tu, Yuhai; Rappel, Wouter-Jan

    2018-03-01

    Adaptation refers to the biological phenomenon where living systems change their internal states in response to changes in their environments in order to maintain certain key functions critical for their survival and fitness. Adaptation is one of the most ubiquitous and arguably one of the most fundamental properties of living systems. It occurs throughout all biological scales, from adaptation of populations of species over evolutionary time to adaptation of a single cell to different environmental stresses during its life span. In this article, we review some of the recent progress made in understanding molecular mechanisms of cellular-level adaptation. We take the minimalist (or the physicist) approach and study the simplest systems that exhibit generic adaptive behaviors, namely chemotaxis in bacterial cells (Escherichia coli) and eukaryotic cells (Dictyostelium). We focus on understanding the basic biochemical interaction networks that are responsible for adaptation dynamics. By combining theoretical modeling with quantitative experimentation, we demonstrate universal features in adaptation as well as important differences in different cellular systems. Future work in extending the modeling framework to study adaptation in more complex systems such as sensory neurons is also discussed.
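
    A minimal model of the adaptive behavior discussed here is integral feedback, in which the output returns to its setpoint after any sustained input step; the Python sketch below is a generic two-variable version of that motif, not the specific E. coli or Dictyostelium network from the article.

        import numpy as np

        def simulate(u_step=1.0, y0=0.0, tau=5.0, dt=0.01, t_end=50.0):
            """Integral-feedback adaptation: a integrates the error
            (y - y0), driving the output y back to the setpoint y0."""
            n = int(t_end / dt)
            y, a = y0, 0.0
            trace = np.empty(n)
            for i in range(n):
                u = u_step if i * dt > 10.0 else 0.0   # input steps up at t = 10
                y += (u - a - (y - y0)) * dt           # output dynamics
                a += ((y - y0) / tau) * dt             # feedback integrates the error
                trace[i] = y
            return trace

        print(abs(simulate()[-1] - 0.0) < 0.05)   # True: output adapts back to y0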

  11. Delinquent-Victim Youth-Adapting a Trauma-Informed Approach for the Juvenile Justice System.

    PubMed

    Rapp, Lisa

    2016-01-01

    The connection between victimization and later delinquency is well established, and most youth involved with the juvenile justice system have at least one if not multiple victimizations in their history. Poly-victimized youth, or those presenting with complex trauma, require specialized assessment and services to prevent deleterious emotional, physical, and social life consequences. Empirical studies have provided information that can guide practitioners' work with these youth and families, yet many of the policies and practices of the juvenile justice system run counter to this model. Many youth-serving organizations are beginning to review their operations to better match a trauma-informed approach, and in this article the author highlights how a trauma-informed care model could be used to adapt the juvenile justice system.

  12. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach

    NASA Technical Reports Server (NTRS)

    Hixson, M.; Bauer, M. E.; Davis, B. J. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Evaluation of four sampling schemes involving different numbers of samples and different sizes of sampling units shows that the precision of the wheat estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to the size of the sampling unit.
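
    The reported trade-off (more, smaller segments yield higher precision on spatially clustered crops) can be reproduced on synthetic data; the Python sketch below builds a clustered "wheat" map and compares two schemes of equal total sampled area. All numbers are illustrative, not the Kansas data.

        import numpy as np

        rng = np.random.default_rng(0)
        blocks = (rng.random((20, 20)) < 0.3).astype(float)  # clustered fields
        county = np.kron(blocks, np.ones((20, 20)))          # 400 x 400 pixel map

        def estimate_sd(field, seg, n_segments, n_reps=400):
            """Std. dev. of the wheat-proportion estimate across
            replications of n_segments random seg x seg segments."""
            h, w = field.shape
            estimates = []
            for _ in range(n_reps):
                means = []
                for _ in range(n_segments):
                    r = int(rng.integers(0, h - seg))
                    c = int(rng.integers(0, w - seg))
                    means.append(field[r:r + seg, c:c + seg].mean())
                estimates.append(np.mean(means))
            return float(np.std(estimates))

        # Equal total area sampled: sixteen 10x10 segments vs one 40x40.
        print(estimate_sd(county, seg=10, n_segments=16))  # smaller spread
        print(estimate_sd(county, seg=40, n_segments=1))   # larger spread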

  13. Salient object detection: manifold-based similarity adaptation approach

    NASA Astrophysics Data System (ADS)

    Zhou, Jingbo; Ren, Yongfeng; Yan, Yunyang; Gao, Shangbing

    2014-11-01

    A saliency detection algorithm based on manifold-based similarity adaptation is proposed. The proposed algorithm is divided into three steps. First, we segment an input image into superpixels, which are represented as the nodes in a graph. Second, a new similarity measurement is used to build the weight matrix of the graph, which indicates the similarities between the nodes; it also captures the manifold structure of the image patches, so that the graph edges are determined in a data-adaptive manner in terms of both similarity and manifold structure. Third, we use a local reconstruction method as the diffusion process to obtain the saliency maps; the objective function is based on local reconstruction, with which the estimated weights capture the manifold structure. Experiments on four benchmark databases demonstrate the accuracy and robustness of the proposed method.
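
    The first two steps (superpixel nodes plus a similarity-weighted graph) can be made concrete with a standard Gaussian-weighted k-NN construction; the Python sketch below is that generic construction, not the paper's adaptive manifold-based measure.

        import numpy as np

        def similarity_graph(features, k=8, sigma=0.1):
            """Gaussian similarity weights over a k-NN graph of
            superpixel feature vectors; returns a symmetric matrix."""
            X = np.asarray(features, dtype=float)
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            W = np.exp(-d2 / (2.0 * sigma ** 2))
            np.fill_diagonal(W, 0.0)
            kth = np.sort(W, axis=1)[:, -k][:, None]  # k-th largest per row
            W = np.where(W >= kth, W, 0.0)            # keep k strongest edges
            return np.maximum(W, W.T)                 # symmetrize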

  14. Turbulent Output-Based Anisotropic Adaptation

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Carlson, Jan-Renee

    2010-01-01

    Controlling discretization error is a remaining challenge for computational fluid dynamics simulation. Grid adaptation is applied to reduce estimated discretization error in drag or pressure integral output functions. To enable application to turbulent flows at high Reynolds numbers, O(10^7), a hybrid approach is utilized that freezes the near-wall boundary layer grids and adapts the grid away from the no-slip boundaries. The hybrid approach is not applicable to problems with under-resolved initial boundary layer grids, but is a powerful technique for problems with important off-body anisotropic features. Supersonic nozzle plume, turbulent flat plate, and shock-boundary layer interaction examples are presented with comparisons to experimental measurements of pressure and velocity. Adapted grids are produced that resolve off-body features in locations that are not known a priori.

  15. Least-Squares Adaptive Control Using Chebyshev Orthogonal Polynomials

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Burken, John; Ishihara, Abraham

    2011-01-01

    This paper presents a new adaptive control approach using Chebyshev orthogonal polynomials as basis functions in a least-squares functional approximation. The use of orthogonal basis functions improves the function approximation significantly and enables better convergence of parameter estimates. Flight control simulations demonstrate the effectiveness of the proposed adaptive control approach.
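
    The core numerical step, a least-squares fit in a Chebyshev basis, is directly available in NumPy; the sketch below shows it on a stand-in target function (the flight-control use wraps this in an online parameter-update law, which is not reproduced here).

        import numpy as np
        from numpy.polynomial import chebyshev as C

        x = np.linspace(-1.0, 1.0, 200)
        y = np.exp(x) * np.sin(3.0 * x)       # stand-in for the unknown function

        coef = C.chebfit(x, y, deg=8)         # least-squares Chebyshev coefficients
        err = np.max(np.abs(C.chebval(x, coef) - y))
        print(f"max fit error: {err:.2e}")    # small: the basis converges quickly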

  16. [The post-colonial theoretical approach in cultural nursing research about adapting nursing care to non-Western populations].

    PubMed

    Racine, Louise

    2003-12-01

    Providing culturally competent care to non-Western populations remains a challenge in Western pluralist societies. Limitations related to the use of cultural theories to adapt nursing care have to be acknowledged: cultural theories erase the impact of the larger social context on non-Western populations' health. Health problems arising from social inequities have to be addressed if culturally adapted nursing interventions are to be designed. To this end, the post-colonial theoretical approach represents a promising avenue, since it is aimed at unmasking health problems intersecting with race, ethnicity, gender, and social class. Research endeavours are directed at integrating marginalized knowledge into nursing theorization. Likewise, post-colonialism, from which the concept of cultural safety is derived, is a means to enhance the quality of care offered by nurses from the dominant ethnic group to non-Western populations.

  17. Method and apparatus for telemetry adaptive bandwidth compression

    NASA Technical Reports Server (NTRS)

    Graham, Olin L.

    1987-01-01

    Methods and apparatus are provided for automatic and/or manual adaptive bandwidth compression of telemetry. An adaptive sampler samples a video signal from a scanning sensor and generates a sequence of sampled fields. Each field and range rate information from the sensor are then sequentially transmitted to and stored in a multiple and adaptive field storage means. The field storage means, in response to an automatic or manual control signal, transfers the stored sampled field signals to a video monitor in a form for sequential or simultaneous display of a desired number of stored signal fields. The sampling ratio of the adaptive sampler, the relative proportion of available communication bandwidth allocated respectively to transmitted data and video information, and the number of fields simultaneously displayed are manually or automatically adjustable in functional relationship to each other and to the detected range rate. In one embodiment, when relatively little or no scene motion is detected, the control signal maximizes the sampling ratio and causes simultaneous display of all stored fields, thus maximizing resolution and the bandwidth available for data transmission. When increased scene motion is detected, the control signal is adjusted accordingly to cause display of fewer fields. If greater resolution is desired, the control signal is adjusted to increase the sampling ratio.
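
    The control signal's trade-off (resolution and data bandwidth versus scene motion) can be summarized as a small decision rule; the Python sketch below uses invented breakpoints purely to show the shape of the mapping described above, not values from the patent.

        def select_compression(scene_motion, max_fields=8):
            """Map detected scene motion (normalized 0..1) to a sampling
            ratio and the number of simultaneously displayed fields."""
            if scene_motion < 0.1:    # static scene: maximize resolution
                return {"sampling_ratio": 1.0, "fields_displayed": max_fields}
            if scene_motion < 0.5:    # moderate motion: middle ground
                return {"sampling_ratio": 0.5, "fields_displayed": max_fields // 2}
            return {"sampling_ratio": 0.25, "fields_displayed": 1}  # fast motion

        print(select_compression(0.05))   # favors resolution and data bandwidth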

  18. Addressing uncertainty in adaptation planning for agriculture.

    PubMed

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differs with the level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  19. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differs with the level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  20. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    NASA Astrophysics Data System (ADS)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
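
    The key computational idea, spending function evaluations only where the integrand is strongly non-linear, has a familiar one-dimensional analogue in adaptive Simpson quadrature; the Python sketch below shows that analogue. It is a 1D stand-in only: the paper's adaptive sparse grids generalize the same local-refinement logic to high-dimensional parameter spaces.

        import math

        def adaptive_quad(f, a, b, tol=1e-8):
            """Adaptive Simpson rule: subdivide an interval only where
            the local error estimate exceeds its share of `tol`."""
            def simpson(a, b, fa, fm, fb):
                return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

            def recurse(a, b, fa, fm, fb, whole, tol):
                m = 0.5 * (a + b)
                lm, rm = 0.5 * (a + m), 0.5 * (m + b)
                flm, frm = f(lm), f(rm)
                left = simpson(a, m, fa, flm, fm)
                right = simpson(m, b, fm, frm, fb)
                if abs(left + right - whole) < 15.0 * tol:  # locally smooth: stop
                    return left + right
                return (recurse(a, m, fa, flm, fm, left, tol / 2) +
                        recurse(m, b, fm, frm, fb, right, tol / 2))

            m = 0.5 * (a + b)
            fa, fm, fb = f(a), f(m), f(b)
            return recurse(a, b, fa, fm, fb, simpson(a, b, fa, fm, fb), tol)

        print(adaptive_quad(math.exp, 0.0, 1.0))   # ~1.7182818 (= e - 1)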